WorldWideScience

Sample records for program bivariate analyses

  1. A dynamic bivariate Poisson model for analysing and forecasting match results in the English Premier League

    NARCIS (Netherlands)

    Koopman, S.J.; Lit, R.

    2015-01-01

    Summary: We develop a statistical model for the analysis and forecasting of football match results which assumes a bivariate Poisson distribution with intensity coefficients that change stochastically over time. The dynamic model is a novelty in the statistical time series analysis of match results.
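
    A minimal static sketch of the bivariate Poisson construction underlying this model, written in Python with purely illustrative intensity values (the paper's novelty, the stochastically time-varying intensities, is not reproduced here): home goals X = W1 + W3 and away goals Y = W2 + W3 share a common Poisson component W3, so Cov(X, Y) = lam3.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate_scores(lam1, lam2, lam3, n_matches):
          # Home goals X = W1 + W3, away goals Y = W2 + W3: the shared
          # component W3 makes the two scores positively correlated.
          w1 = rng.poisson(lam1, n_matches)
          w2 = rng.poisson(lam2, n_matches)
          w3 = rng.poisson(lam3, n_matches)
          return w1 + w3, w2 + w3

      home, away = simulate_scores(lam1=1.4, lam2=1.1, lam3=0.2, n_matches=100_000)
      print("mean home goals :", home.mean())               # about lam1 + lam3
      print("mean away goals :", away.mean())               # about lam2 + lam3
      print("score covariance:", np.cov(home, away)[0, 1])  # about lam3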

  2. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Science.gov (United States)

    Liu, Yao-Zhong; Pei, Yu-Fang; Liu, Jian-Feng; Yang, Fang; Guo, Yan; Zhang, Lei; Liu, Xiao-Gang; Yan, Han; Wang, Liang; Zhang, Yin-Ping; Levy, Shawn; Recker, Robert R; Deng, Hong-Wen

    2009-08-28

    Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits, making it difficult to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82x10(-7) and 1.47x10(-6), respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  3. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Full Text Available Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits, making it difficult to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82x10(-7) and 1.47x10(-6), respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  4. Bivariate discrete Linnik distribution

    Directory of Open Access Journals (Sweden)

    Davis Antony Mundassery

    2014-10-01

    Full Text Available Christoph and Schreiber (1998a) studied the discrete analogue of the positive Linnik distribution and obtained its characterizations using the survival function. In this paper, we introduce a bivariate form of the discrete Linnik distribution and study its distributional properties. Characterizations of the bivariate distribution are obtained using compounding schemes. Autoregressive processes whose marginals follow the bivariate discrete Linnik distribution are also developed.

  5. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2x2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  6. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  7. Mixed effects models with bivariate and univariate association parameters for longitudinal bivariate binary response data.

    Science.gov (United States)

    Ten Have, T R; Morabia, A

    1999-03-01

    When two binary responses are measured for each study subject across time, it may be of interest to model how the bivariate associations and marginal univariate risks involving the two responses change across time. To achieve such a goal, marginal models with bivariate log odds ratio and univariate logit components are extended to include random effects for all components. Specifically, separate normal random effects are specified on the log odds ratio scale for bivariate responses and on the logit scale for univariate responses. Assuming conditional independence given the random effects facilitates the modeling of bivariate associations across time with missing at random incomplete data. We fit the model to a dataset for which such structures are feasible: a longitudinal randomized trial of a cardiovascular educational program where the responses of interest are change in hypertension and hypercholesterolemia status. The proposed model is compared to a naive bivariate model that assumes independence between time points and univariate mixed effects logit models.

  8. Automating Ultrasonic Vocalization Analyses: The WAAVES Program

    Science.gov (United States)

    Reno, James M.; Marker, Bryan; Cormack, Lawrence K.; Schallert, Timothy; Duvauchelle, Christine L.

    2014-01-01

    Background Human emotion is a crucial component of drug abuse and addiction. Ultrasonic vocalizations (USVs) elicited by rodents are a highly translational animal model of emotion in drug abuse studies. A major roadblock to comprehensive use of USV data is the overwhelming burden to attain accurate USV assessment in a timely manner. One of the most accurate methods of analyzing USVs, human auditory detection with simultaneous spectrogram inspection, requires USV sound files to be played back at 4% of normal speed. New Method WAAVES (WAV-file Automated Analysis of Vocalizations Environment Specific) is an automated USV assessment program utilizing MATLAB's Signal and Image Processing Toolboxes in conjunction with a series of customized filters to separate USV calls from background noise, and accurately tabulate and categorize USVs as flat or frequency-modulated (FM) calls. In the current report, WAAVES functionality is demonstrated by USV analyses of cocaine self-administration data collected over 10 daily sessions. Results WAAVES counts are significantly correlated with human auditory counts (r(48)=0.9925; p<0.001). Statistical analyses used WAAVES output to examine individual differences in USV responses to cocaine, cocaine-associated cues, and relationships between USVs, cocaine intake, and locomotor activity. Comparison with Existing Method WAAVES output is highly accurate and provides tabulated data in approximately 0.4% of the time required when using human auditory detection methods. Conclusions The development of a customized USV analysis program such as WAAVES streamlines USV assessment and enhances the ability to utilize USVs as a tool to advance drug abuse research and ultimately develop effective treatments. PMID:23832016
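
    WAAVES itself is a MATLAB program; the Python sketch below only illustrates the general detection idea described above: compute a spectrogram, sum the energy in an assumed ultrasonic band, and flag time bins whose band energy exceeds a crude noise threshold as candidate calls. The sampling rate, band edges, and threshold are illustrative assumptions, not WAAVES's actual filter settings.

      import numpy as np
      from scipy.signal import spectrogram

      rng = np.random.default_rng(0)
      fs = 250_000                            # 250 kHz sampling rate (assumed)
      t = np.arange(0, 1.0, 1 / fs)
      noise = 0.05 * rng.standard_normal(t.size)
      call = np.sin(2 * np.pi * 55_000 * t) * (np.abs(t - 0.5) < 0.02)  # 55 kHz burst
      audio = noise + call

      f, times, sxx = spectrogram(audio, fs=fs, nperseg=2048)
      band = (f >= 30_000) & (f <= 90_000)    # typical rat USV band (assumed)
      band_energy = sxx[band].sum(axis=0)

      threshold = 10 * np.median(band_energy) # crude noise-floor estimate
      is_call = band_energy > threshold
      print("candidate call bins:", int(is_call.sum()), "of", is_call.size)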

  9. Some properties of a 5-parameter bivariate probability distribution

    Science.gov (United States)

    Tubbs, J. D.; Brewer, D. W.; Smith, O. E.

    1983-01-01

    A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.
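
    The five-parameter distribution itself is not reproduced here; the Python sketch below only illustrates the generic shared-component (trivariate reduction) trick that underlies many correlated bivariate gamma constructions: with independent gammas G0, G1, G2 on a common scale, X = G0 + G1 and Y = G0 + G2 are each gamma-distributed and positively correlated through G0. All shape values are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      a0, a1, a2, scale = 1.0, 2.0, 3.0, 1.0   # assumed shape/scale parameters

      g0 = rng.gamma(a0, scale, 200_000)       # shared component
      x = g0 + rng.gamma(a1, scale, 200_000)   # marginal shape a0 + a1
      y = g0 + rng.gamma(a2, scale, 200_000)   # marginal shape a0 + a2

      print("corr(X, Y):", np.corrcoef(x, y)[0, 1])
      # theoretical correlation: a0 / sqrt((a0 + a1) * (a0 + a2))
      print("theory    :", a0 / np.sqrt((a0 + a1) * (a0 + a2)))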

  10. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    Full Text Available In this paper we extend the concept of Value-at-Risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset taking into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (β-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided by using Italian stock market data.
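
    A hedged Python sketch of the paper's starting point, risk as the probability of a joint adverse event under a bivariate normal return distribution; the β-VaR measure itself is not implemented, and all means, volatilities, and the correlation below are made-up values.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      mu = np.array([0.0005, 0.0003])          # daily mean returns (assumed)
      sd = np.array([0.02, 0.015])             # daily volatilities (assumed)
      rho = 0.6                                # asset-market correlation (assumed)
      cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                      [rho*sd[0]*sd[1], sd[1]**2]])

      # univariate 5% VaR thresholds for each return
      q = mu + sd * norm.ppf(0.05)

      # joint probability that both returns breach their univariate VaR levels
      joint_tail = multivariate_normal(mean=mu, cov=cov).cdf(q)
      print("P(both below 5% VaR):", joint_tail)   # > 0.05**2 because rho > 0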

  11. Systematic derivation of correct variability-aware program analyses

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Dimovski, Aleksandar S.; Brabrand, Claus

    2015-01-01

    A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting single-program analyses to SPLs using abstract interpretation. Abstract interpretation provides a basis for lifting analyses and Galois connections. We prove that for analyses developed using our method, the soundness of lifting follows by construction. The resulting variational abstract interpretation is a conceptual framework for understanding, deriving, and validating static analyses for SPLs. Then we show how to derive the corresponding variational dataflow equations for an example static analysis, a constant propagation analysis. We also describe how to approximate variability by applying variability-aware abstractions to SPL analysis. Finally, we discuss how to efficiently implement our method.

  12. SPSS and SAS programs for generalizability theory analyses.

    Science.gov (United States)

    Mushquash, Christopher; O'Connor, Brian P

    2006-08-01

    The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy-to-use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
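
    The programs described in the article are for SPSS, SAS, and MATLAB; as a language-neutral illustration, the numpy sketch below performs the simplest one-facet (persons x items) G-study: variance components estimated from the crossed ANOVA mean squares, then the relative generalizability coefficient. The score matrix is random filler, not the Rosenberg data.

      import numpy as np

      rng = np.random.default_rng(1)
      scores = rng.normal(3.0, 1.0, size=(50, 10))   # 50 persons x 10 items (fake)
      n_p, n_i = scores.shape

      grand = scores.mean()
      person_means = scores.mean(axis=1)
      item_means = scores.mean(axis=0)

      ms_p = n_i * np.sum((person_means - grand) ** 2) / (n_p - 1)
      ms_i = n_p * np.sum((item_means - grand) ** 2) / (n_i - 1)
      resid = scores - person_means[:, None] - item_means[None, :] + grand
      ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))

      var_res = ms_res                       # residual variance component
      var_p = (ms_p - ms_res) / n_i          # person variance component
      var_i = (ms_i - ms_res) / n_p          # item variance component

      g_coef = var_p / (var_p + var_res / n_i)   # relative G coefficient
      print(f"var_p={var_p:.3f} var_i={var_i:.3f} var_res={var_res:.3f} G={g_coef:.3f}")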

  13. Incomplete Bivariate Fibonacci and Lucas p-Polynomials

    Directory of Open Access Journals (Sweden)

    Dursun Tasci

    2012-01-01

    Full Text Available We define the incomplete bivariate Fibonacci and Lucas p-polynomials. In the case x=1, y=1, we obtain the incomplete Fibonacci and Lucas p-numbers. If x=2, y=1, we have the incomplete Pell and Pell-Lucas p-numbers. On choosing x=1, y=2, we get the incomplete generalized Jacobsthal numbers and, besides, for p=1 the incomplete generalized Jacobsthal-Lucas numbers. In the case x=1, y=1, p=1, we have the incomplete Fibonacci and Lucas numbers. If x=1, y=1, p=1, k=⌊(n−1)/(p+1)⌋, we obtain the Fibonacci and Lucas numbers. Also the generating function and properties of the incomplete bivariate Fibonacci and Lucas p-polynomials are given.
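
    A small Python sketch of the complete (non-incomplete) bivariate Fibonacci p-polynomials via their recurrence F(n) = x*F(n-1) + y*F(n-p-1); the incomplete versions described above truncate the equivalent binomial sum at an index k. The initial values used here (F(0) = 0, F(m) = x**(m-1) for 1 <= m <= p) follow one common convention and should be checked against the paper.

      def bivariate_fibonacci_p(n, x, y, p=1):
          # Evaluate the bivariate Fibonacci p-polynomial F_{p,n}(x, y)
          # by iterating the recurrence from assumed initial values.
          f = [0] * (n + 1)
          for m in range(1, min(p, n) + 1):
              f[m] = x ** (m - 1)
          for m in range(p + 1, n + 1):
              f[m] = x * f[m - 1] + y * f[m - p - 1]
          return f[n]

      # sanity check: at p=1, x=1, y=1 we should recover the Fibonacci numbers
      print([bivariate_fibonacci_p(n, 1, 1) for n in range(1, 10)])
      # -> [1, 1, 2, 3, 5, 8, 13, 21, 34]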

  14. Exact Asymptotics of Bivariate Scale Mixture Distributions

    OpenAIRE

    Hashorva, Enkelejd

    2009-01-01

    Let (RU_1, RU_2) be a given bivariate scale mixture random vector, with R > 0 being independent of the bivariate random vector (U_1, U_2). In this paper we derive exact asymptotic expansions of the tail probability P{RU_1 > x, RU_2 > ax}, a \in (0,1], as x tends to infinity, assuming that R has a distribution function in the Gumbel max-domain of attraction and (U_1, U_2) has a specific tail behaviour around some absorbing point. As a special case of our results we retrieve the exact asymptotic behaviour ...

  15. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  16. Analyse

    DEFF Research Database (Denmark)

    Greve, Bent

    2007-01-01

    Analysis in Politiken of fringe benefits, based on the book Occupational Welfare - Winners and Losers, published by Edward Elgar.

  17. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.

  18. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models, there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.

  19. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators, which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values, where the expected values are determined from a specified parametric distribution. The model estimation is based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey were analyzed using these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models are compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  20. Almost opposite regression dependence in bivariate distributions

    OpenAIRE

    Siburg, Karl Friedrich; Stoimenov, Pavel A.

    2014-01-01

    Let X, Y be two continuous random variables. Investigating the regression dependence of Y on X, respectively of X on Y, we show that the two of them can have almost opposite behavior. Indeed, given any e > 0, we construct a bivariate random vector (X,Y) such that the respective regression dependence measures r2|1(X,Y), r1|2(X,Y) ∈ [0,1] introduced in Dette et al. (2013) satisfy r2|1(X,Y) = 1 as well as r1|2(X,Y) < e.

  1. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution, which allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg
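
    The paper replaces the scalar responses of the Nadaraya–Watson estimator with mean-constrained densities; the Python sketch below shows only the classical scalar baseline, m(x) = sum_i K_h(x - X_i) * Y_i / sum_i K_h(x - X_i), with a Gaussian kernel and an assumed bandwidth.

      import numpy as np

      rng = np.random.default_rng(5)
      x_obs = rng.uniform(0, 10, 400)
      y_obs = np.sin(x_obs) + rng.normal(0, 0.3, x_obs.size)

      def nadaraya_watson(x, x_obs, y_obs, h=0.5):
          # Gaussian kernel weights for every (evaluation point, observation) pair
          w = np.exp(-0.5 * ((x[:, None] - x_obs[None, :]) / h) ** 2)
          return (w * y_obs).sum(axis=1) / w.sum(axis=1)

      grid = np.linspace(0, 10, 5)
      print(np.round(nadaraya_watson(grid, x_obs, y_obs), 2))  # roughly sin(grid)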

  2. Analysing Student Programs in the PHP Intelligent Tutoring System

    Science.gov (United States)

    Weragama, Dinesha; Reye, Jim

    2014-01-01

    Programming is a subject that many beginning students find difficult. The PHP Intelligent Tutoring System (PHP ITS) has been designed with the aim of making it easier for novices to learn the PHP language in order to develop dynamic web pages. Programming requires practice. This makes it necessary to include practical exercises in any ITS that…

  3. On the Bivariate Kummer-Beta Type IV Distribution

    NARCIS (Netherlands)

    Jacobs, R.; Bekker, A.; Human, S.W.

    2012-01-01

    In this article, the non-central bivariate Kummer-beta Type IV distribution is introduced and derived via the Laplace transform of the non-central bivariate beta distribution by Gupta et al. (2011). We focus on and discuss the central bivariate Kummer-beta Type IV distribution; this distribution is

  4. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Full Text Available Rayleigh (1880) observed that sea waves follow no simple law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave length, wave-induced pitch, and wave and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the “significant waves” (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which it must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of ships. Despite this complexity, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the “random walk” frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with Rayleigh marginal distribution functions and discuss its fundamental properties.
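
    A quick Python check of the construction mentioned above: the radial length of two independent, equal-variance, zero-mean normal variates is Rayleigh distributed, with mean sigma*sqrt(pi/2).

      import numpy as np

      rng = np.random.default_rng(7)
      sigma = 2.0
      z1 = rng.normal(0.0, sigma, 1_000_000)
      z2 = rng.normal(0.0, sigma, 1_000_000)
      r = np.hypot(z1, z2)                      # Rayleigh(sigma) amplitudes

      print("simulated mean:", r.mean())
      print("theoretical   :", sigma * np.sqrt(np.pi / 2))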

  5. A User’s Guide to BISAM (BIvariate SAMple): The Bivariate Data Modeling Program.

    Science.gov (United States)

    1983-08-01

  6. Object-oriented fault tree evaluation program for quantitative analyses

    Science.gov (United States)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault free techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instrument (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.

  7. A note on finding peakedness in bivariate normal distribution using Mathematica

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2007-07-01

    Full Text Available Peakedness measures the concentration around the central value. A classical standard measure of peakedness is kurtosis, the degree of peakedness of a probability distribution. In view of the inconsistency of kurtosis in measuring the peakedness of a distribution, Horn (1983) proposed a measure of peakedness for symmetrically unimodal distributions. The objective of this paper is two-fold. First, Horn's method is extended to the bivariate normal distribution. Second, we show that the computer algebra system Mathematica can be an extremely useful tool for all sorts of computation related to the bivariate normal distribution. Mathematica programs are also provided.

  8. An assessment on the use of bivariate, multivariate and soft ...

    Indian Academy of Sciences (India)

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for collapse susceptibility modelling. Conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN) models representing the bivariate, multivariate and soft computing techniques ...

  9. Some bivariate distributions for modeling the strength properties of lumber

    Science.gov (United States)

    Richard A. Johnson; James W. Evans; David W. Green

    Accurate modeling of the joint stochastic nature of the strength properties of dimension lumber is essential to the determination of reliability-based design safety factors. This report reviews the major techniques for obtaining bivariate distributions and then discusses bivariate distributions whose marginal distributions suggest they might be useful for modeling the...

  10. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when (X1, X2) follows a general bivariate distribution. Such a distribution includes the bivariate compound Weibull, bivariate compound Gompertz, and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.
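
    A Monte Carlo sketch of R = P(X1 < X2) for a dependent pair, the situation treated analytically above. Dependence is injected here with a Gaussian copula over Weibull marginals; the copula choice and all parameter values are illustrative assumptions, not the paper's bivariate compound Weibull model.

      import numpy as np
      from scipy.stats import norm, weibull_min

      rng = np.random.default_rng(3)
      n, rho = 1_000_000, 0.5

      # correlated uniforms via a Gaussian copula
      z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
      u = norm.cdf(z)

      x1 = weibull_min.ppf(u[:, 0], c=1.5, scale=1.0)   # stress (assumed marginal)
      x2 = weibull_min.ppf(u[:, 1], c=2.0, scale=1.5)   # strength (assumed marginal)

      print("R = P(X1 < X2) ~", np.mean(x1 < x2))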

  11. Evaluation dam overtopping risk based on univariate and bivariate flood frequency analysis

    Science.gov (United States)

    Goodarzi, E.; Mirzaei, M.; Shui, L. T.; Ziaei, M.

    2011-11-01

    There is a growing tendency to assess the safety levels of existing dams based on risk and uncertainty analysis using mathematical and statistical methods. This research presents the application of risk and uncertainty analysis to dam overtopping based on univariate and bivariate flood frequency analyses, applying the Gumbel logistic distribution, for the Doroudzan earth-fill dam in the south of Iran. The bivariate frequency analysis resulted in six inflow hydrographs with a joint return period of 100 yr. The overtopping risks were computed for all of those hydrographs considering the quantile of flood peak discharge (in particular 100-yr), the initial depth of water in the reservoir, and the discharge coefficient of the spillway as uncertain variables. The maximum height of the water, the most important factor in the overtopping analysis, was evaluated using reservoir routing, and the Monte Carlo and Latin hypercube techniques were applied for uncertainty analysis. Finally, the results obtained using both univariate and bivariate frequency analyses were compared to show the significance of the bivariate analysis for dam overtopping.
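
    A short Python sketch of the joint "AND" return period under the Gumbel logistic dependence model named above, C(u, v) = exp{-[(-ln u)^m + (-ln v)^m]^(1/m)}; the dependence parameter m and the marginal non-exceedance levels are illustrative, not the Doroudzan values.

      import numpy as np

      def gumbel_logistic_copula(u, v, m):
          # joint non-exceedance probability under the Gumbel logistic model
          return np.exp(-((-np.log(u)) ** m + (-np.log(v)) ** m) ** (1.0 / m))

      u = v = 0.99           # marginal 100-yr non-exceedance probabilities
      m = 2.0                # assumed dependence parameter (m = 1 -> independence)

      p_and = 1.0 - u - v + gumbel_logistic_copula(u, v, m)  # P(X > x and Y > y)
      print("joint AND return period:", 1.0 / p_and, "years")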

  12. STUDI PERBANDINGAN ANTARA ALGORITMA BIVARIATE MARGINAL DISTRIBUTION DENGAN ALGORITMA GENETIKA

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

    Full Text Available The Bivariate Marginal Distribution Algorithm (BMDA) is an extension of the Estimation of Distribution Algorithm. This heuristic algorithm proposes a new approach to recombination, generating new individuals without the crossover and mutation operators of a genetic algorithm. Instead, BMDA uses pairwise dependencies between genes, discovered during the optimization process, to generate new individuals. In this research, the performance of a genetic algorithm with one-point crossover is compared with that of BMDA on the Onemax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For small Onemax instances, the genetic algorithm performs better, reaching the optimum in fewer iterations and less time; BMDA, however, achieves better optimization results on large Onemax instances. For the De Jong F2 function, the genetic algorithm outperforms BMDA in both the number of iterations and time. For the Traveling Salesman Problem, BMDA achieves better optimization results than the genetic algorithm.

  13. Copula bivariate probit models: with an application to medical expenditures.

    Science.gov (United States)

    Winkelmann, Rainer

    2012-12-01

    The bivariate probit model is frequently used for estimating the effect of an endogenous binary regressor (the 'treatment') on a binary health outcome variable. This paper discusses simple modifications that maintain the probit assumption for the marginal distributions while introducing non-normal dependence using copulas. In an application of the copula bivariate probit model to the effect of insurance status on the absence of ambulatory health care expenditure, a model based on the Frank copula outperforms the standard bivariate probit model. Copyright © 2011 John Wiley & Sons, Ltd.
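
    A minimal Python sketch of the paper's key modification: keep probit marginals but couple them through a Frank copula, so that P(y1 = 1, y2 = 1 | x) = C_theta(Φ(x'b1), Φ(x'b2)) and the remaining cell probabilities follow from the marginal constraints. All coefficients and theta below are made-up values, not estimates from the application.

      import numpy as np
      from scipy.stats import norm

      def frank_copula(u, v, theta):
          # Frank copula C_theta(u, v) for theta != 0
          num = np.expm1(-theta * u) * np.expm1(-theta * v)
          return -np.log1p(num / np.expm1(-theta)) / theta

      x = np.array([1.0, 0.5])        # intercept + one covariate (made up)
      b1 = np.array([0.2, 0.8])       # treatment-equation coefficients (made up)
      b2 = np.array([-0.3, 0.6])      # outcome-equation coefficients (made up)
      theta = 4.0                     # Frank dependence parameter (made up)

      p1, p2 = norm.cdf(x @ b1), norm.cdf(x @ b2)
      p11 = frank_copula(p1, p2, theta)
      p10 = p1 - p11                  # other cells follow by marginal constraints
      p01 = p2 - p11
      p00 = 1.0 - p1 - p2 + p11
      print([round(p, 4) for p in (p11, p10, p01, p00)])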

  14. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  15. Cost and cost threshold analyses for 12 innovative US HIV linkage and retention in care programs.

    Science.gov (United States)

    Jain, Kriti M; Maulsby, Catherine; Brantley, Meredith; Kim, Jeeyon Janet; Zulliger, Rose; Riordan, Maura; Charles, Vignetta; Holtgrave, David R

    2016-09-01

    Out of >1,000,000 people living with HIV in the USA, an estimated 60% were not adequately engaged in medical care in 2011. In response, AIDS United spearheaded 12 HIV linkage and retention in care programs. These programs were supported by the Social Innovation Fund, a White House initiative. Each program reflected the needs of its local population living with HIV. Economic analyses of such programs, such as cost and cost threshold analyses, provide important information for policy-makers and others allocating resources or planning programs. Implementation costs were examined from societal and payer perspectives. This paper presents the results of cost threshold analyses, which provide an estimated number of HIV transmissions that would have to be averted for each program to be considered cost-saving and cost-effective. The methods were adapted from the US Panel on Cost-effectiveness in Health and Medicine. Per-client program costs ranged from $1109.45 to $7602.54 from a societal perspective. The cost-saving thresholds ranged from 0.32 to 1.19 infections averted, and the cost-effectiveness thresholds ranged from 0.11 to 0.43 infections averted by the programs. These results suggest that such programs are a sound and efficient investment towards supporting goals set by US HIV policy-makers. Cost-utility data are pending.

  16. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Full Text Available Abstract This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML. The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular in the case when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  17. Bivariate functional principal components analysis: considerations for use with multivariate movement signatures in sports biomechanics.

    Science.gov (United States)

    Warmenhoven, John; Cobley, Stephen; Draper, Conny; Harrison, Andrew; Bargary, Norma; Smith, Richard

    2017-11-10

    Sporting performance is often investigated through graphical observation of key technical variables that are representative of whole movements. The presence of differences between athletes in such variables has led to terms such as movement signatures being used. These signatures can be multivariate (multiple time-series observed concurrently) and can also be composed of variables measured relative to different scales. Analytical techniques from areas of statistics such as Functional Data Analysis (FDA) present a practical alternative for analysing multivariate signatures. When applied to concurrent bivariate time-series, multivariate functional principal components analysis (referred to as bivariate fPCA or bfPCA in this paper) has demonstrated preliminary application in biomechanical contexts. Despite this, given the infancy of bfPCA in sports biomechanics, there are still necessary considerations for its use with non-conventional or complex bivariate structures. This paper focuses on the application of bfPCA to the force-angle graph in on-water rowing, which is a bivariate structure composed of variables with different units. A normalisation approach is proposed to investigate and standardise differences in variability between the two variables. The results of bfPCA applied to the non-normalised data and normalised data are then compared. Considerations and recommendations for the application of bfPCA in this context are also provided.

  18. Copula bivariate probit models: with an application to medical expenditures

    OpenAIRE

    Winkelmann, Rainer

    2011-01-01

    The bivariate probit model is frequently used for estimating the effect of an endogenous binary regressor (the "treatment") on a binary health outcome variable. This paper discusses simple modifications that maintain the probit assumption for the marginal distributions while introducing non-normal dependence using copulas. In an application of the copula bivariate probit model to the effect of insurance status on the absence of ambulatory health care expen- diture, a model based on the Frank ...

  19. Bivariate Genomic Footprinting Detects Changes in Transcription Factor Activity

    Directory of Open Access Journals (Sweden)

    Songjoon Baek

    2017-05-01

    Full Text Available In response to activating signals, transcription factors (TFs) bind DNA and regulate gene expression. TF binding can be measured by protection of the bound sequence from DNase digestion (i.e., footprint). Here, we report that 80% of TF binding motifs do not show a measurable footprint, partly because of a variable cleavage pattern within the motif sequence. To more faithfully portray the effect of TFs on chromatin, we developed an algorithm that captures two TF-dependent effects on chromatin accessibility: footprinting and motif-flanking accessibility. The algorithm, termed bivariate genomic footprinting (BaGFoot), efficiently detects TF activity. BaGFoot is robust to different accessibility assays (DNase-seq, ATAC-seq), all examined peak-calling programs, and a variety of cut bias correction approaches. BaGFoot reliably predicts TF binding and provides valuable information regarding the TFs affecting chromatin accessibility in various biological systems and following various biological events, including in cases where an absolute footprint cannot be determined.

  20. Bivariate Genomic Footprinting Detects Changes in Transcription Factor Activity.

    Science.gov (United States)

    Baek, Songjoon; Goldstein, Ido; Hager, Gordon L

    2017-05-23

    In response to activating signals, transcription factors (TFs) bind DNA and regulate gene expression. TF binding can be measured by protection of the bound sequence from DNase digestion (i.e., footprint). Here, we report that 80% of TF binding motifs do not show a measurable footprint, partly because of a variable cleavage pattern within the motif sequence. To more faithfully portray the effect of TFs on chromatin, we developed an algorithm that captures two TF-dependent effects on chromatin accessibility: footprinting and motif-flanking accessibility. The algorithm, termed bivariate genomic footprinting (BaGFoot), efficiently detects TF activity. BaGFoot is robust to different accessibility assays (DNase-seq, ATAC-seq), all examined peak-calling programs, and a variety of cut bias correction approaches. BaGFoot reliably predicts TF binding and provides valuable information regarding the TFs affecting chromatin accessibility in various biological systems and following various biological events, including in cases where an absolute footprint cannot be determined. Published by Elsevier Inc.

  1. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2018-01-01

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach, and has several attractive features compared to the existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, since the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. PMID:26303591
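
    A minimal Python sketch of the marginal modelling idea: each study's event count is beta-binomial, so one margin (say, sensitivity across studies) can be fitted by maximum likelihood with a closed-form likelihood and no link function. The counts are made up, and the composite-likelihood coupling of the two margins is omitted.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import betabinom

      events = np.array([45, 30, 60, 18, 52])    # true positives per study (fake)
      totals = np.array([50, 40, 75, 25, 60])    # diseased subjects per study (fake)

      def neg_loglik(params):
          a, b = np.exp(params)                  # optimize on the log scale
          return -betabinom.logpmf(events, totals, a, b).sum()

      fit = minimize(neg_loglik, x0=np.log([2.0, 2.0]), method="Nelder-Mead")
      a_hat, b_hat = np.exp(fit.x)
      print("mean sensitivity    :", a_hat / (a_hat + b_hat))
      print("overdispersion (a+b):", a_hat + b_hat)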

  2. Fitting statistical models in bivariate allometry.

    Science.gov (United States)

    Packard, Gary C; Birchard, Geoffrey F; Boardman, Thomas J

    2011-08-01

    Several attempts have been made in recent years to formulate a general explanation for what appear to be recurring patterns of allometric variation in morphology, physiology, and ecology of both plants and animals (e.g. the Metabolic Theory of Ecology, the Allometric Cascade, the Metabolic-Level Boundaries hypothesis). However, published estimates for parameters in allometric equations often are inaccurate, owing to undetected bias introduced by the traditional method for fitting lines to empirical data. The traditional method entails fitting a straight line to logarithmic transformations of the original data and then back-transforming the resulting equation to the arithmetic scale. Because of fundamental changes in distributions attending transformation of predictor and response variables, the traditional practice may cause influential outliers to go undetected, and it may result in an underparameterized model being fitted to the data. Also, substantial bias may be introduced by the insidious rotational distortion that accompanies regression analyses performed on logarithms. Consequently, the aforementioned patterns of allometric variation may be illusions, and the theoretical explanations may be wide of the mark. Problems attending the traditional procedure can be largely avoided in future research simply by performing preliminary analyses on arithmetic values and by validating fitted equations in the arithmetic domain. The goal of most allometric research is to characterize relationships between biological variables and body size, and this is done most effectively with data expressed in the units of measurement. Back-transforming from a straight line fitted to logarithms is not a generally reliable way to estimate an allometric equation in the original scale. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.
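
    A Python sketch of the comparison the review recommends: fit y = a*x**b both by OLS on logarithms (the traditional method) and by nonlinear least squares on the arithmetic scale, then compare the estimates. The data are synthetic, with additive arithmetic-scale noise, the situation in which log-OLS estimates are biased.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(11)
      x = rng.uniform(1, 100, 300)
      y = 2.0 * x ** 0.75 + rng.normal(0, 3.0, x.size)   # true a=2, b=0.75
      mask = y > 0                                       # logs need positive values
      x, y = x[mask], y[mask]

      # traditional approach: straight line through the log-log data
      b_log, loga = np.polyfit(np.log(x), np.log(y), 1)
      print("log-OLS:   a=%.3f b=%.3f" % (np.exp(loga), b_log))

      # arithmetic-scale approach: nonlinear least squares on the raw data
      (a_nls, b_nls), _ = curve_fit(lambda x, a, b: a * x ** b, x, y, p0=(1.0, 1.0))
      print("arith NLS: a=%.3f b=%.3f" % (a_nls, b_nls))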

  3. Secondary Data Analyses of Conclusions Drawn by the Program Implementers of a Positive Youth Development Program in Hong Kong

    Directory of Open Access Journals (Sweden)

    Andrew M. H. Siu

    2010-01-01

    Full Text Available The Tier 2 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) is designed for adolescents with significant psychosocial needs, and its various programs are designed and implemented by social workers (program implementers) for specific student groups in different schools. Using subjective outcome evaluation data collected from the program participants (Form C) at 207 schools, the program implementers were asked to aggregate data and write down five conclusions (n = 1,035) in their evaluation reports. The conclusions stated in the evaluation reports were further analyzed via secondary data analyses in this study. Results showed that the participants regarded the Tier 2 Program as a success and as effective in enhancing self-understanding, interpersonal skills, and self-management. They liked the experiential learning approach and activities that are novel, interesting, diversified, adventure-based, and outdoor in nature. They also liked instructors who were friendly, supportive, well-prepared, and able to bring challenges and give positive recognition. Most of the difficulties encountered in running the programs were related to time constraints, clashes with other activities, and motivation of participants. Consistent with the previous evaluation findings, the present study suggests that the Tier 2 Program was well received by the participants and that it was beneficial to the development of the program participants.

  4. Secondary data analyses of conclusions drawn by the program implementers of a positive youth development program in Hong Kong.

    Science.gov (United States)

    Siu, Andrew M H; Shek, Daniel T L

    2010-02-12

    The Tier 2 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) is designed for adolescents with significant psychosocial needs, and its various programs are designed and implemented by social workers (program implementers) for specific student groups in different schools. Using subjective outcome evaluation data collected from the program participants (Form C) at 207 schools, the program implementers were asked to aggregate data and write down five conclusions (n = 1,035) in their evaluation reports. The conclusions stated in the evaluation reports were further analyzed via secondary data analyses in this study. Results showed that the participants regarded the Tier 2 Program as a success and as effective in enhancing self-understanding, interpersonal skills, and self-management. They liked the experiential learning approach and activities that are novel, interesting, diversified, adventure-based, and outdoor in nature. They also liked instructors who were friendly, supportive, well-prepared, and able to bring challenges and give positive recognition. Most of the difficulties encountered in running the programs were related to time constraints, clashes with other activities, and motivation of participants. Consistent with the previous evaluation findings, the present study suggests that the Tier 2 Program was well received by the participants and that it was beneficial to the development of the program participants.

  5. Semiparametric analysis of two-level bivariate binary data.

    Science.gov (United States)

    Naskar, Malay; Das, Kalyan

    2006-12-01

    In medical studies, paired binary responses are often observed for each study subject over timepoints or clusters. A primary interest is to investigate how the bivariate association and marginal univariate risks are affected by repeated measurements on each subject. To achieve this we propose a very general class of semiparametric bivariate binary models. The subject-specific effects involved in the bivariate log odds ratio and the univariate logit components are assumed to follow a nonparametric Dirichlet process (DP). We propose a hybrid method to draw model-based inferences. In the framework of the proposed hybrid method, estimation of parameters is done by implementing the Monte Carlo expectation-maximization algorithm. The proposed methodology is illustrated through a study on the effectiveness of tibolone for reducing menopausal problems experienced by Indian women. A simulation study is also conducted to evaluate the efficiency of the new methodology.

  6. A bivariate cumulative probit regression model for ordered categorical data.

    Science.gov (United States)

    Kim, K

    1995-06-30

    This paper proposes a latent variable regression model for bivariate ordered categorical data and develops the necessary numerical procedure for parameter estimation. The proposed model is an extension of the standard bivariate probit model for dichotomous data to ordered categorical data with more than two categories for each margin. In addition, the proposed model allows for different covariates for the margins, which is characteristic of data from typical ophthalmological studies. It utilizes the stochastic ordering implicit in the data and the correlation coefficient of the bivariate normal distribution in expressing intra-subject dependency. Illustration of the proposed model uses data from the Wisconsin Epidemiologic Study of Diabetic Retinopathy for identifying risk factors for diabetic retinopathy among younger-onset diabetics. The proposed regression model also applies to other clinical or epidemiological studies that involve paired organs.

  7. Jacobi Fiber Surfaces for Bivariate Reeb Space Computation.

    Science.gov (United States)

    Tierny, Julien; Carr, Hamish

    2017-01-01

    This paper presents an efficient algorithm for the computation of the Reeb space of an input bivariate piecewise linear scalar function f defined on a tetrahedral mesh. By extending and generalizing algorithmic concepts from the univariate case to the bivariate one, we report the first practical, output-sensitive algorithm for the exact computation of such a Reeb space. The algorithm starts by identifying the Jacobi set of f, the bivariate analogs of critical points in the univariate case. Next, the Reeb space is computed by segmenting the input mesh along the new notion of Jacobi Fiber Surfaces, the bivariate analog of critical contours in the univariate case. We additionally present a simplification heuristic that enables the progressive coarsening of the Reeb space. Our algorithm is simple to implement and most of its computations can be trivially parallelized. We report performance numbers demonstrating orders of magnitude speedups over previous approaches, enabling for the first time the tractable computation of bivariate Reeb spaces in practice. Moreover, unlike range-based quantization approaches (such as the Joint Contour Net), our algorithm is parameter-free. We demonstrate the utility of our approach by using the Reeb space as a semi-automatic segmentation tool for bivariate data. In particular, we introduce continuous scatterplot peeling, a technique which enables the reduction of the cluttering in the continuous scatterplot, by interactively selecting the features of the Reeb space to project. We provide a VTK-based C++ implementation of our algorithm that can be used for reproduction purposes or for the development of new Reeb space based visualization techniques.

  8. Bivariate sub-Gaussian model for stock index returns

    Science.gov (United States)

    Jabłońska-Sabuka, Matylda; Teuerle, Marek; Wyłomańska, Agnieszka

    2017-11-01

    Financial time series are commonly modeled with methods assuming data normality. However, the real distribution can be nontrivial, also not having an explicitly formulated probability density function. In this work we introduce novel parameter estimation and high-powered distribution testing methods which do not rely on closed form densities, but use the characteristic functions for comparison. The approach applied to a pair of stock index returns demonstrates that such a bivariate vector can be a sample coming from a bivariate sub-Gaussian distribution. The methods presented here can be applied to any nontrivially distributed financial data, among others.

  9. Geometric visualization of parallel bivariate Pareto distribution surfaces

    Directory of Open Access Journals (Sweden)

    N.H. Abdel-All

    2016-04-01

    Full Text Available In the present paper, the differential-geometrical framework for parallel bivariate Pareto distribution surfaces (P, P̄) is given. Curvatures of a curve lying on (P, P̄) are interpreted in terms of the parameters of P. Geometrical and statistical interpretations of some results are introduced and plotted.

  10. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  11. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular bivariate function spline, built and algorithmically implemented in previous papers. The properties typical of this family of splines impact the field of computer graphics in particular that of the reverse engineering.

  12. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)


  13. Latent class bivariate model for the meta-analysis of diagnostic test accuracy studies.

    Science.gov (United States)

    Eusebi, Paolo; Reitsma, Johannes B; Vermunt, Jeroen K

    2014-07-11

    Several types of statistical methods are currently available for the meta-analysis of studies on diagnostic test accuracy. One of these methods is the Bivariate Model, which involves a simultaneous analysis of the sensitivity and specificity from a set of studies. In this paper, we review the characteristics of the Bivariate Model and demonstrate how it can be extended with a discrete latent variable. The resulting clustering of studies yields additional insight into the accuracy of the test of interest. A Latent Class Bivariate Model is proposed. This model captures the between-study variability in sensitivity and specificity by assuming that studies belong to one of a small number of latent classes. This yields both an easier-to-interpret and a more precise description of the heterogeneity between studies. Latent classes may not only differ with respect to the average sensitivity and specificity, but also with respect to the correlation between sensitivity and specificity. The Latent Class Bivariate Model identifies clusters of studies with their own estimates of sensitivity and specificity. Our simulation study demonstrated excellent parameter recovery and good performance of the model selection statistics typically used in latent class analysis. Application in a real data example on coronary artery disease showed that the inclusion of latent classes yields interesting additional information. Our proposed new meta-analysis method can lead to a better fit of the data set of interest, less biased estimates, and more reliable confidence intervals for sensitivities and specificities. Even more importantly, it may serve as an exploratory tool for subsequent sub-group meta-analyses.

  14. Application of the ASP3D Computer Program to Unsteady Aerodynamic and Aeroelastic Analyses

    Science.gov (United States)

    Batina, John T.

    2006-01-01

    A new computer program has been developed called ASP3D (Advanced Small Perturbation - 3D), which solves the small perturbation potential flow equation in an advanced form including mass-consistent surface and trailing wake boundary conditions, and entropy, vorticity, and viscous effects. The purpose of the program is for unsteady aerodynamic and aeroelastic analyses, especially in the nonlinear transonic flight regime. The program exploits the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The paper presents unsteady aerodynamic and aeroelastic applications of ASP3D to assess the time dependent capability and demonstrate various features of the code.

  15. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques (frequency ratio, weights-of-evidence, and evidential belief function models) are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and driven by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
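
    Of the three techniques, the frequency ratio is the simplest to reproduce outside ArcMAP. A minimal sketch in plain NumPy follows; the raster, class labels, and hazard mask are hypothetical, and this is not the BSM tool's actual code, which the abstract does not give.

```python
import numpy as np

def frequency_ratio(class_map, hazard_mask):
    """Frequency ratio per class of a categorical raster.

    FR(c) = (share of hazard cells falling in class c)
          / (share of all cells falling in class c).
    FR > 1 means the class is positively associated with the hazard.
    """
    total = class_map.size
    hazard_total = hazard_mask.sum()
    fr = {}
    for c in np.unique(class_map):
        in_class = class_map == c
        pct_hazard = hazard_mask[in_class].sum() / hazard_total
        pct_area = in_class.sum() / total
        fr[c] = pct_hazard / pct_area
    return fr

# Hypothetical 100x100 raster of 4 slope classes and observed landslide cells
rng = np.random.default_rng(1)
slope_class = rng.integers(1, 5, size=(100, 100))
landslides = rng.random((100, 100)) < 0.02 * slope_class  # steeper -> more slides
print(frequency_ratio(slope_class, landslides))
```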

  16. Introduction of the ASP3D Computer Program for Unsteady Aerodynamic and Aeroelastic Analyses

    Science.gov (United States)

    Batina, John T.

    2005-01-01

    A new computer program has been developed called ASP3D (Advanced Small Perturbation 3D), which solves the small perturbation potential flow equation in an advanced form including mass-consistent surface and trailing wake boundary conditions, and entropy, vorticity, and viscous effects. The purpose of the program is for unsteady aerodynamic and aeroelastic analyses, especially in the nonlinear transonic flight regime. The program exploits the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP3D code is the result of a decade of developmental work on improvements to the small perturbation formulation, performed while the author was employed as a Senior Research Scientist in the Configuration Aerodynamics Branch at the NASA Langley Research Center. The ASP3D code is a significant improvement to the state-of-the-art for transonic aeroelastic analyses over the CAP-TSD code (Computational Aeroelasticity Program Transonic Small Disturbance), which was developed principally by the author in the mid-1980s. The author is in a unique position as the developer of both computer programs to compare, contrast, and ultimately make conclusions regarding the underlying formulations and utility of each code. The paper describes the salient features of the ASP3D code including the rationale for improvements in comparison with CAP-TSD. Numerous results are presented to demonstrate the ASP3D capability. The general conclusion is that the new ASP3D capability is superior to the older CAP-TSD code because of the myriad improvements developed and incorporated.

  17. Bivariate modelling of clustered continuous and ordered categorical outcomes.

    Science.gov (United States)

    Catalano, P J

    1997-04-30

    Simultaneous observation of continuous and ordered categorical outcomes for each subject is common in biomedical research but multivariate analysis of the data is complicated by the multiple data types. Here we construct a model for the joint distribution of bivariate continuous and ordinal outcomes by applying the concept of latent variables to a multivariate normal distribution. The approach is then extended to allow for clustering of the bivariate outcomes. The model can be parameterized in a way that allows writing the joint distribution as a product of a standard random effects model for the continuous variable and a correlated cumulative probit model for the ordinal outcome. This factorization suggests convenient parameter estimation using estimating equations. Foetal weight and malformation data from a developmental toxicity experiment illustrate the results.

  18. Louisiana Barrier Island Comprehensive Monitoring (BICM) Program Summary Report: Data and Analyses 2006 through 2010

    Science.gov (United States)

    Kindinger, Jack G.; Buster, Noreen A.; Flocks, James G.; Bernier, Julie C.; Kulp, Mark A.

    2013-01-01

    The Barrier Island Comprehensive Monitoring (BICM) program was implemented under the Louisiana Coastal Area Science and Technology (LCA S&T) office as a component of the System Wide Assessment and Monitoring (SWAMP) program. The BICM project was developed by the State of Louisiana (Coastal Protection Restoration Authority [CPRA], formerly Department of Natural Resources [DNR]) to complement other Louisiana coastal monitoring programs such as the Coastwide Reference Monitoring System-Wetlands (CRMS-Wetlands) and was a collaborative research effort by CPRA, University of New Orleans (UNO), and the U.S. Geological Survey (USGS). The goal of the BICM program was to provide long-term data on the barrier islands of Louisiana that could be used to plan, design, evaluate, and maintain current and future barrier-island restoration projects. The BICM program used both historical and newly acquired (2006 to 2010) data to assess and monitor changes in the aerial and subaqueous extent of islands, habitat types, sediment texture and geotechnical properties, environmental processes, and vegetation composition. BICM datasets included aerial still and video photography (multiple time series) for shoreline positions, habitat mapping, and land loss; light detection and ranging (lidar) surveys for topographic elevations; single-beam and swath bathymetry; and sediment grab samples. Products produced using BICM data and analyses included (but were not limited to) storm-impact assessments, rate of shoreline and bathymetric change, shoreline-erosion and accretion maps, high-resolution elevation maps, coastal-shoreline and barrier-island habitat-classification maps, and coastal surficial-sediment characterization maps. Discussions in this report summarize the extensive data-collection efforts and present brief interpretive analyses for four coastal Louisiana geographic regions. In addition, several coastal-wide and topical themes were selected that integrate the data and analyses within a

  19. A bivariate ordered probit estimator with mixed effects

    OpenAIRE

    Buscha, Franz; Conte, Anna

    2010-01-01

    In this paper, we discuss the derivation and application of a bivariate ordered probit model with mixed effects. Our approach allows one to estimate the distribution of the effect (gamma) of an endogenous ordered variable on an ordered explanatory variable. By allowing gamma to vary over the population, our estimator offers a more flexible parametric setting to recover the causal effect of an endogenous variable in an ordered choice setting. We use Monte Carlo simulations to examine the perfo...

  20. Boston children's hospital community asthma initiative: Five-year cost analyses of a home visiting program.

    Science.gov (United States)

    Bhaumik, Urmi; Sommer, Susan J; Giller-Leinwohl, Judith; Norris, Kerri; Tsopelas, Lindsay; Nethersole, Shari; Woods, Elizabeth R

    2017-03-01

    To evaluate the costs and benefits of the Boston Children's Hospital Community Asthma Initiative (CAI) through reduction of Emergency Department (ED) visits and hospitalizations for the full pilot-phase program participants. A cost-benefit analysis was conducted using hospital administrative data to determine an adjusted Return on Investment (ROI) on all 268 patients enrolled in the CAI program during the 33-month pilot program phase of CAI intervention between October 1, 2005 and June 30, 2008, using a comparison group of 818 patients from a similar cohort in neighboring ZIP codes without CAI intervention. Cost data through June 30, 2013 were used to examine cost changes and calculate an adjusted ROI over a 5-year post-intervention period. CAI patients had a cost reduction greater than the comparison group of $1,216 in Year 1 (P = 0.001) and $1,320 in Year 2. Community-based asthma management programs can decrease the incidence of costly hospitalizations and ED visits from asthma. An ROI of greater than one, as found in this cost analysis, supports the business case for the provision of community-based asthma services as part of patient-centered medical homes and Accountable Care Organizations.

  1. Bivariate Lagrange interpolation at the Padua points: Computational aspects

    Science.gov (United States)

    Caliari, Marco; de Marchi, Stefano; Vianello, Marco

    2008-11-01

    The so-called "Padua points" give a simple, geometric and explicit construction of bivariate polynomial interpolation in the square. Moreover, the associated Lebesgue constant has minimal order of growth O(log^2 n). Here we show four families of Padua points for interpolation at any even or odd degree n, and we present a stable and efficient implementation of the corresponding Lagrange interpolation formula, based on the representation in a suitable orthogonal basis. We also discuss extension of (non-polynomial) Padua-like interpolation to other domains, such as triangles and ellipses; we give complexity and error estimates, and several numerical tests.
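
    Under one common indexing convention, the Padua points of degree n are the Chebyshev-Lobatto pairs with even index sum, and their count equals the dimension of the space of bivariate polynomials of total degree at most n. A small sketch, assuming that convention (other sources use the odd-sum parity):

```python
import numpy as np

def padua_points(n):
    """Padua points of degree n on [-1, 1]^2, under one common convention:
    Chebyshev-Lobatto abscissas with even index sum on the two axes."""
    return np.array([(np.cos(j * np.pi / n), np.cos(k * np.pi / (n + 1)))
                     for j in range(n + 1)
                     for k in range(n + 2)
                     if (j + k) % 2 == 0])

n = 8
P = padua_points(n)
# (n+1)(n+2)/2 points: the dimension of degree-n bivariate polynomials
assert len(P) == (n + 1) * (n + 2) // 2
print(P.shape)
```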

  2. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components for a reprocessing facility, there might be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty in obtaining an accident occurrence frequency. Firstly, basic methods for the component Monte Carlo simulation are introduced to obtain an accident occurrence frequency, and then the basic performance, such as precision, convergence, and parallelization of calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show another aspect of performance, and a proposal is made for introducing a new input-data format adapted to the component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real-scale plant, and new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation needed to obtain a strict solution of accident occurrence frequency with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate usage of the TITAN computer program. (author)
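
    The component-level idea with a grace time can be illustrated in a few lines of Monte Carlo. The sketch below is generic, not TITAN's model: the initiating-event rate, remedial-action rate, and grace time are hypothetical, and at most one initiating event per trial year is sampled.

```python
import numpy as np

# Illustrative component Monte Carlo with a grace time: an accident occurs
# only if the remedial action is not completed within the grace time after
# an initiating event. All rates are invented for the demonstration.
rng = np.random.default_rng(2)

LAMBDA_INIT = 1e-2   # initiating events per year
MU_REMEDY = 0.1      # remedial-action completion rate per hour (slow, on purpose)
GRACE_H = 8.0        # hours between initiator and the serious accident
YEARS = 1.0
N_TRIALS = 200_000

accidents = 0
for _ in range(N_TRIALS):
    t = rng.exponential(1.0 / LAMBDA_INIT)
    if t > YEARS:
        continue                      # no initiating event this year
    remedy = rng.exponential(1.0 / MU_REMEDY)
    if remedy > GRACE_H:              # operators failed to act in time
        accidents += 1

freq = accidents / (N_TRIALS * YEARS)
print(f"estimated accident frequency: {freq:.2e} per year")
```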

  3. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques (frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models) are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and driven by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  4. Perceived Social Support and Academic Achievement: Cross-Lagged Panel and Bivariate Growth Curve Analyses

    Science.gov (United States)

    Mackinnon, Sean P.

    2012-01-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help…

  5. Bivariate genetic analyses of stuttering and nonfluency in a large sample of 5-year old twins

    NARCIS (Netherlands)

    van Beijsterveldt, C.E.M.; Felsenfeld, S.; Boomsma, D.I.

    2010-01-01

    Purpose: Behavioral genetic studies of speech fluency have focused on participants who present with clinical stuttering. Knowledge about genetic influences on the development and regulation of normal speech fluency is limited. The primary aims of this study were to identify the heritability of

  6. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data are extracted from the production data so that model-based reliabilities can be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was, on the other hand, lighter than the single-step method.

  7. Awareness and acceptability of human papillomavirus vaccine: an application of the instrumental variables bivariate probit model.

    Science.gov (United States)

    Do, Young Kyung; Wong, Ker Yi

    2012-01-13

    Although lower uptake rates of the human papillomavirus (HPV) vaccine among socioeconomically disadvantaged populations have been documented, less is known about the relationships between awareness and acceptability, and other factors affecting HPV vaccine uptake. The current study aimed to estimate the potential effectiveness of increased HPV vaccine awareness on the acceptability of HPV vaccination in a nationally representative sample of women, using a methodology that controlled for potential non-random selection. This study used a population-based sample from the 2007 Health Information National Trends Survey, a cross-sectional study of the US population aged 18 years or older, and focused on the subsample of 742 women who have any female children under the age of 18 years in the household. An instrumental variables bivariate probit model was used to jointly estimate HPV vaccine awareness and acceptability. The proportion of HPV vaccine acceptability among the previously aware and non-aware groups was 58% and 47%, respectively. Results from the instrumental variables bivariate probit model showed that the estimated marginal effect of awareness on acceptability was 46 percentage points, an effect that was even greater than observed. Among populations who are not currently aware of the HPV vaccine, the potential impact of raising awareness on acceptability of HPV vaccination is substantial. This finding provides additional support to strengthening public health programs that increase awareness and policy efforts that address barriers to HPV vaccination.
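
    The estimator can be sketched as a recursive bivariate probit fitted by maximum likelihood, with the instrument entering only the awareness equation. The code below is a schematic of that setup on synthetic data; the variable names are hypothetical, and the actual study's survey design and covariates are omitted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# Schematic recursive bivariate probit with an instrument z, maximum
# likelihood on synthetic data. Not the study's code.
def neg_loglik(theta, x1, x2, y1, y2):
    k1 = x1.shape[1]
    b1, b2 = theta[:k1], theta[k1:-2]
    gamma, rho = theta[-2], np.tanh(theta[-1])   # tanh keeps rho in (-1, 1)
    mu1 = x1 @ b1
    mu2 = x2 @ b2 + gamma * y1                   # awareness -> acceptability
    q1, q2 = 2.0 * y1 - 1.0, 2.0 * y2 - 1.0
    ll = 0.0
    for m1, m2, s1, s2 in zip(mu1, mu2, q1, q2):
        cov = [[1.0, s1 * s2 * rho], [s1 * s2 * rho, 1.0]]
        p = multivariate_normal.cdf([s1 * m1, s2 * m2], mean=[0.0, 0.0], cov=cov)
        ll += np.log(max(p, 1e-300))
    return -ll

rng = np.random.default_rng(3)
n = 200
z = rng.normal(size=n)                           # instrument: awareness eq. only
age = rng.normal(size=n)
e = rng.multivariate_normal([0, 0], [[1.0, 0.4], [0.4, 1.0]], size=n)
y1 = (0.8 * z + 0.2 * age + e[:, 0] > 0).astype(float)   # awareness
y2 = (0.5 * y1 + 0.3 * age + e[:, 1] > 0).astype(float)  # acceptability
x1 = np.column_stack([np.ones(n), z, age])
x2 = np.column_stack([np.ones(n), age])

theta0 = np.zeros(x1.shape[1] + x2.shape[1] + 2)
fit = minimize(neg_loglik, theta0, args=(x1, x2, y1, y2), method="BFGS")
print("gamma (awareness effect):", fit.x[-2], " rho:", np.tanh(fit.x[-1]))
```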

  8. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.

  9. DESIGN OF BEZIER SPLINE SURFACES OVER BIVARIATE NETWORKS OF CURVES

    Directory of Open Access Journals (Sweden)

    A. P. Pobegailo

    2014-01-01

    The paper presents an approach to construct interpolating spline surfaces over a bivariate network of curves with rectangular patches. Patches of the interpolating spline surface are constructed by means of blending their boundaries with special polynomials. In order to ensure the necessary parametric continuity of the designed surface, polynomials of the corresponding degree must be used. The constructed interpolating spline surfaces have local shape control. If the surface frame is determined by means of Bezier curves, then the patches of the interpolating spline surface are Bezier surfaces. The presented approach to surface modeling can be used in such applications as computer graphics and geometric design.
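
    The construction builds on standard tensor-product Bezier machinery. As a minimal sketch of that machinery (not the paper's boundary-blending scheme itself), a patch can be evaluated by nested de Casteljau steps; the control net below is hypothetical.

```python
import numpy as np

def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = np.array(points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def bezier_patch(ctrl, u, v):
    """Tensor-product Bezier patch: evaluate each row curve at v, then at u."""
    iso = np.array([de_casteljau(row, v) for row in ctrl])
    return de_casteljau(iso, u)

# Hypothetical 4x4 grid of control points (a bicubic patch)
xs, ys = np.meshgrid(np.linspace(0, 3, 4), np.linspace(0, 3, 4), indexing="ij")
zs = np.sin(xs) * np.cos(ys)
ctrl = np.stack([xs, ys, zs], axis=-1)          # shape (4, 4, 3)
print(bezier_patch(ctrl, 0.5, 0.5))
```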

  10. Bivariate analysis of floods in climate impact assessments.

    Science.gov (United States)

    Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan

    2018-03-01

    Climate impact studies regarding floods usually focus on peak discharges and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes both at an annual and at a seasonal scale. These changes are not necessarily proportional which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
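
    The bivariate step can be sketched with a Gumbel copula fitted by inverting Kendall's tau, from which a joint "OR" return period follows. The sketch below uses synthetic annual maxima and omits the hydrograph-shape model; the Gumbel family is an assumed choice, since the abstract does not name one.

```python
import numpy as np
from scipy.stats import kendalltau

# Fit a Gumbel copula to annual flood peaks (Q) and volumes (V), then compute
# the joint "OR" return period T = 1 / P(Q > q or V > v). Synthetic data.
rng = np.random.default_rng(4)
n = 60
peaks = rng.gumbel(100, 20, n)
volumes = 0.8 * peaks + rng.gumbel(10, 8, n)        # dependent by construction

tau, _ = kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)                           # Gumbel copula: tau = 1 - 1/theta

def gumbel_copula(u, v, th):
    return np.exp(-(((-np.log(u)) ** th + (-np.log(v)) ** th) ** (1.0 / th)))

# Marginal non-exceedance probabilities at the design event (empirical here)
q, w = np.quantile(peaks, 0.9), np.quantile(volumes, 0.9)
u, v = np.mean(peaks <= q), np.mean(volumes <= w)

t_or = 1.0 / (1.0 - gumbel_copula(u, v, theta))     # in years, for annual maxima
print(f"theta = {theta:.2f}, joint OR return period: {t_or:.1f} years")
```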

  11. Effectiveness of a selective alcohol prevention program targeting personality risk factors: Results of interaction analyses.

    Science.gov (United States)

    Lammers, Jeroen; Goossens, Ferry; Conrod, Patricia; Engels, Rutger; Wiers, Reinout W; Kleinjan, Marloes

    2017-08-01

    To explore whether specific groups of adolescents (i.e., scoring high on personality risk traits, having a lower education level, or being male) benefit more from the Preventure intervention with regard to curbing their drinking behaviour. A clustered randomized controlled trial, with participants randomly assigned to a 2-session coping skills intervention or a control no-intervention condition. Fifteen secondary schools throughout The Netherlands; 7 schools in the intervention and 8 schools in the control condition. 699 adolescents aged 13-15; 343 allocated to the intervention and 356 to the control condition; with drinking experience and elevated scores in either negative thinking, anxiety sensitivity, impulsivity or sensation seeking. Differential effectiveness of the Preventure program was examined for the personality traits group, education level and gender on past-month binge drinking (main outcome), binge frequency, alcohol use, alcohol frequency and problem drinking, at 12 months post-intervention. Preventure is a selective school-based alcohol prevention programme targeting personality risk factors. The comparator was a no-intervention control. Intervention effects were moderated by the personality traits group and by education level. More specifically, significant intervention effects were found on reducing alcohol use within the anxiety sensitivity group (OR=2.14, CI=1.40, 3.29) and reducing binge drinking (OR=1.76, CI=1.38, 2.24) and binge drinking frequency (β=0.24, p=0.04) within the sensation seeking group at 12 months post-intervention. Also, lower educated young adolescents reduced binge drinking (OR=1.47, CI=1.14, 1.88), binge drinking frequency (β=0.25, p=0.04), alcohol use (OR=1.32, CI=1.06, 1.65) and alcohol use frequency (β=0.47, p=0.01), but not those in the higher education group. Post hoc latent-growth analyses revealed significant effects on the development of binge drinking (β=-0.19, p=0.02) and binge drinking frequency (β=-0.10, p=0

  12. A power study of bivariate LOD score analysis of a complex trait and fear/discomfort with strangers.

    Science.gov (United States)

    Ji, Fei; Lee, Dayoung; Mendell, Nancy Role

    2005-12-30

    Complex diseases are often reported along with disease-related traits (DRT). Sometimes investigators consider both disease and DRT phenotypes separately and sometimes they consider individuals as affected if they have either the disease or the DRT, or both. We propose instead to consider the joint distribution of the disease and the DRT and do a linkage analysis assuming a pleiotropic model. We evaluated our results through analysis of the simulated datasets provided by Genetic Analysis Workshop 14. We first conducted univariate linkage analysis of the simulated disease, Kofendrerd Personality Disorder and one of its simulated associated traits, phenotype b (fear/discomfort with strangers). Subsequently, we considered the bivariate phenotype, which combined the information on Kofendrerd Personality Disorder and fear/discomfort with strangers. We developed a program to perform bivariate linkage analysis using an extension to the Elston-Stewart peeling method of likelihood calculation. Using this program we considered the microsatellites within 30 cM of the gene pleiotropic for this simulated disease and DRT. Based on 100 simulations of 300 families we observed excellent power to detect linkage within 10 cM of the disease locus using the DRT and the bivariate trait.

  13. The bivariate statistical analysis of environmental (compositional) data.

    Science.gov (United States)

    Filzmoser, Peter; Hron, Karel; Reimann, Clemens

    2010-09-01

    Environmental sciences usually deal with compositional (closed) data. Whenever the concentration of chemical elements is measured, the data will be closed, i.e. the relevant information is contained in the ratios between the variables rather than in the data values reported for the variables. Data closure has severe consequences for statistical data analysis. Most classical statistical methods are based on the usual Euclidean geometry - compositional data, however, do not plot into Euclidean space because they have their own geometry which is not linear but curved in the Euclidean sense. This has severe consequences for bivariate statistical analysis: correlation coefficients computed in the traditional way are likely to be misleading, and the information contained in scatterplots must be used and interpreted differently from sets of non-compositional data. As a solution, the ilr transformation applied to a variable pair can be used to display the relationship and to compute a measure of stability. This paper discusses how this measure is related to the usual correlation coefficient and how it can be used and interpreted. Moreover, recommendations are provided for how the scatterplot can still be used, and which alternatives exist for displaying the relationship between two variables. Copyright 2010 Elsevier B.V. All rights reserved.
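
    For a two-part subcomposition, the ilr transformation reduces to a scaled log-ratio, z = ln(x/y)/sqrt(2), and association is then assessed on such coordinates rather than on the closed concentrations. A minimal sketch with invented closed data:

```python
import numpy as np

# The ilr coordinate of a two-part subcomposition is a scaled log-ratio;
# correlations on raw closed data can be spurious, so work on z instead.
def ilr_pair(x, y):
    return np.log(x / y) / np.sqrt(2.0)

# Hypothetical closed data: three element concentrations summing to 100%
rng = np.random.default_rng(5)
raw = rng.lognormal(mean=[0.0, 1.0, 2.0], sigma=0.3, size=(200, 3))
comp = 100 * raw / raw.sum(axis=1, keepdims=True)     # closure

r_raw = np.corrcoef(comp[:, 0], comp[:, 1])[0, 1]     # potentially misleading
z = ilr_pair(comp[:, 0], comp[:, 1])                  # closure-free coordinate
print(f"raw correlation: {r_raw:.2f}; ilr coordinate variance: {z.var():.3f}")
```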

  14. Bivariate Rainfall and Runoff Analysis Using Entropy and Copula Theories

    Directory of Open Access Journals (Sweden)

    Lan Zhang

    2012-09-01

    Multivariate hydrologic frequency analysis has been widely studied using: (1) commonly known joint distributions or copula functions with the assumption of univariate variables being independently identically distributed (I.I.D.) random variables; or (2) directly applying the entropy theory-based framework. However, for the I.I.D. univariate random variable assumption, the univariate variable may be considered as independently distributed, but it may not be identically distributed; and secondly, the commonly applied Pearson's coefficient of correlation (r) is not able to capture the nonlinear dependence structure that usually exists. Thus, this study attempts to combine the copula theory with the entropy theory for bivariate rainfall and runoff analysis. The entropy theory is applied to derive the univariate rainfall and runoff distributions. It permits the incorporation of given or known information, codified in the form of constraints, and results in a universal solution of univariate probability distributions. The copula theory is applied to determine the joint rainfall-runoff distribution. Application of the copula theory results in: (i) the detection of the nonlinear dependence between the correlated random variables rainfall and runoff, and (ii) capturing the tail dependence for risk analysis through the joint return period and conditional return period of rainfall and runoff. The methodology is validated using annual daily maximum rainfall and the corresponding daily runoff (discharge) data collected from watersheds near Riesel, Texas (small agricultural experimental watersheds) and the Cuyahoga River watershed, Ohio.

  15. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and then all the possible pairwise combinations among the 30-feature set are used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features among those and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as of the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h⁻¹.
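
    The feature construction can be sketched directly: Welch periodograms give relative power in the five bands per channel (30 features for 6 channels), and all pairwise combinations yield the 435-dimensional bivariate feature space. The sketch below uses random data and assumed sampling parameters, and stops before the SVM stage.

```python
import itertools
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs):
    """Relative spectral power in 5 bands per channel.
    eeg: array of shape (n_channels, n_samples)."""
    f, psd = welch(eeg, fs=fs, nperseg=fs * 4, axis=-1)
    feats = []
    for ch in range(eeg.shape[0]):
        total = np.trapz(psd[ch], f)
        for lo, hi in BANDS.values():
            sel = (f >= lo) & (f < hi)
            feats.append(np.trapz(psd[ch, sel], f[sel]) / total)
    return np.asarray(feats)

# Hypothetical 6-channel, 10 s epoch at an assumed 256 Hz sampling rate
rng = np.random.default_rng(6)
eeg = rng.normal(size=(6, 256 * 10))
feats = band_powers(eeg, fs=256)                      # 30 univariate features
pairs = list(itertools.combinations(range(len(feats)), 2))
bivariate = np.array([[feats[i], feats[j]] for i, j in pairs])
print(len(pairs), "pairwise features;", bivariate.shape)   # 435 pairs
```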

  16. An adjustment to improve the bivariate survivor function repaired NPMLE.

    Science.gov (United States)

    Moodie, F. Zoe; Prentice, Ross L.

    2005-09-01

    We recently proposed a representation of the bivariate survivor function as a mapping of the hazard function for truncated failure time variates. The representation led to a class of estimators that includes van der Laan's repaired nonparametric maximum likelihood estimator (NPMLE) as an important special case. We proposed a Greenwood-like variance estimator for the repaired NPMLE but found somewhat poor agreement between the empirical variance estimates and these analytic estimates for the sample sizes and bandwidths considered in our simulation study. The simulation results also confirmed those of others in showing slightly inferior performance for the repaired NPMLE compared to other competing estimators as well as a sensitivity to bandwidth choice in moderate sized samples. Despite its attractive asymptotic properties, the repaired NPMLE has drawbacks that hinder its practical application. This paper presents a modification of the repaired NPMLE that improves its performance in moderate sized samples and renders it less sensitive to the choice of bandwidth. Along with this modified estimator, more extensive simulation studies of the repaired NPMLE and Greenwood-like variance estimates are presented. The methods are then applied to a real data example.

  17. A Bivariate Markov Regime Switching GARCH Approach to Estimate Time Varying Minimum Variance Hedge Ratios

    OpenAIRE

    Hsiang-Tai Lee; Jonathan Yoder

    2005-01-01

    This paper develops a new bivariate Markov regime switching BEKK-GARCH (RS-BEKK-GARCH) model. The model is a state-dependent bivariate BEKK-GARCH model, and an extension of Gray's univariate generalized regime-switching (GRS) model to the bivariate case. To solve the path-dependency problem inherent in the bivariate regime switching BEKK-GARCH model, we propose a recombining method for the covariance term in the conditional variance-covariance matrix. The model is applied to estimate time-...

  18. Evaluation of the Tier 1 Program of Project P.A.T.H.S.: Secondary Data Analyses of Conclusions Drawn by the Program Implementers

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2008-01-01

    The Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) is a curricula-based positive youth development program. In the experimental implementation phase, 52 schools participated in the program. Based on subjective outcome evaluation data collected from the program participants (Form A) and program implementers (Form B) in each school, the program implementers were invited to write down five conclusions based on an integration of the evaluation findings (N = 52). The conclusions stated in the 52 evaluation reports were further analyzed via secondary data analyses in this paper. Results showed that most of the conclusions concerning perceptions of the Tier 1 Program, instructors, and effectiveness of the programs were positive in nature. There were also conclusions reflecting the respondents' appreciation of the program. Finally, responses on the difficulties encountered and suggestions for improvements were observed. In conjunction with the previous evaluation findings, the present study suggests that the Tier 1 Program was well received by the stakeholders and the program was beneficial to the development of the program participants.

  19. Hanford Environmental Monitoring Program schedule for samples, analyses, and measurements for calendar year 1985

    Energy Technology Data Exchange (ETDEWEB)

    Blumer, P.J.; Price, K.R.; Eddy, P.A.; Carlile, J.M.V.

    1984-12-01

    This report provides the CY 1985 schedule of data collection for the routine Hanford Surface Environmental Monitoring and Ground-Water Monitoring Programs at the Hanford Site. The purpose is to evaluate and report the levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5484.1. The routine sampling schedule provided herein does not include samples scheduled to be collected during FY 1985 in support of special studies, special contractor support programs, or for quality control purposes. In addition, the routine program outlined in this schedule is subject to modification during the year in response to changes in site operations, program requirements, or unusual sample results.

  20. Initial Analyses of Change Detection Capabilities and Data Redundancies in the Long Term Resource Monitoring Program

    National Research Council Canada - National Science Library

    Lubinski, Kenneth

    2001-01-01

    Evaluations of Long Term Resource Monitoring Program sampling designs for water quality, fish, aquatic vegetation, and macroinvertebrates were initiated in 1999 by analyzing data collected since 1992...

  1. A bivariate whole-genome linkage scan suggests several shared genomic regions for obesity and osteoporosis.

    Science.gov (United States)

    Tang, Zi-Hui; Xiao, Peng; Lei, Shu-Feng; Deng, Fei-Yan; Zhao, Lan-Juan; Deng, Hong-Yi; Tan, Li-Jun; Shen, Hui; Xiong, Dong-Hai; Recker, Robert R; Deng, Hong-Wen

    2007-07-01

    A genome-wide bivariate analysis was conducted for body fat mass (BFM) and bone mineral density (BMD) in a large Caucasian sample. We found several quantitative trait loci shared by BFM and BMD in the total sample and the gender-specific subgroups, disclosing quantitative trait loci with potential pleiotropy. BFM and BMD, as the respective measures for obesity and osteoporosis, are phenotypically and genetically correlated. However, specific genomic regions accounting for their genetic correlation are unknown. To identify systematically the shared genomic regions for BFM and BMD, we performed a bivariate whole-genome linkage scan in 4498 Caucasian individuals from 451 families for BFM and BMD at the hip, spine, and wrist, respectively. Linkage analyses were performed in the total sample and in the male and female subgroups, respectively. In the entire sample, suggestive linkages were detected at 7p22-p21 (LOD 2.69) for BFM and spine BMD, 6q27 (LOD 2.30) for BFM and hip BMD, and 11q13 (LOD 2.64) for BFM and wrist BMD. Male-specific suggestive linkages were found at 13q12 (LOD 3.23) for BFM and spine BMD and at 7q21 (LOD 2.59) for BFM and hip BMD. Female-specific suggestive LOD scores were 3.32 at 15q13 for BFM and spine BMD and 3.15 at 6p25-24 for BFM and wrist BMD. Several shared genomic regions for BFM and BMD were identified here. Our data may benefit further positional and functional studies aimed at eventually uncovering the complex mechanism underlying the shared genetic determination of obesity and osteoporosis.

  2. Meta-Analyses and Review of Research on Pull-Out Programs in Gifted Education.

    Science.gov (United States)

    Vaughn, Vicki L.; And Others

    1991-01-01

    A meta-analysis was conducted on nine experimental studies dealing with pull-out programs for gifted students in grades one through nine. Results indicate that pull-out programs in gifted education have significant positive effects on achievement, critical thinking, and creativity, but not on student self-concept. (Author/JDD)

  3. Bivariate C^1 quadratic finite elements and vertex splines

    Science.gov (United States)

    Chui, Charles K.; He, Tian Xiao

    1990-01-01

    Following work of Heindl and of Powell and Sabin, each triangle of an arbitrary (regular) triangulation $\Delta$ of a polygonal region $\Omega$ in $\mathbb{R}^2$ is subdivided into twelve triangles, using the three medians, yielding the refinement $\hat\Delta$ of $\Delta$, so that $C^1$ quadratic finite elements can be constructed. In this paper, we derive the Bezier nets of these elements in terms of the parameters that describe function and first partial derivative values at the vertices and values of the normal derivatives at the midpoints of the edges of $\Delta$. Consequently, bivariate $C^1$ quadratic (generalized) vertex splines on $\Delta$ have an explicit formulation. Here, a generalized vertex spline is one which is a piecewise polynomial on the refined grid partition $\hat\Delta$ and has support that contains at most one vertex of the original partition $\Delta$ in its interior. The collection of all $C^1$ quadratic generalized vertex splines on $\Delta$ so constructed is shown to form a basis of $S_2^1(\hat\Delta)$, the vector space of all functions in $C^1(\Omega)$ whose restrictions to each triangular cell of the partition $\hat\Delta$ are quadratic polynomials. A subspace with the basis given by appropriately chosen generalized vertex splines with exactly one vertex of $\Delta$ in the interior of their supports, that reproduces all quadratic polynomials, is identified, and hence has approximation order three. Quasi-interpolation formulas using this subspace are obtained. In addition, a constructive procedure that yields a locally supported basis of yet another subspace with dimension given by the number of vertices of $\Delta$, that has approximation order three, is given.

  4. Marine Ecosystems Analysis (MESA) Program, New York Bight Surficial Sediment Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Marine Ecosystems Analysis (MESA) Program, New York Bight Study was funded by NOAA and the Bureau of Land Management (BLM). The Atlas was a historical...

  5. Non-Constant Learning Rates in Retrospective Experience Curve Analyses and their Correlation to Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Max [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Smith, Sarah J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-16

    A key challenge for policy-makers and technology market forecasters is to estimate future technology costs and in particular the rate of cost reduction versus production volume. A related, critical question is what role should state and federal governments have in advancing energy efficient and renewable energy technologies? This work provides retrospective experience curves and learning rates for several energy-related technologies, each of which have a known history of federal and state deployment programs. We derive learning rates for eight technologies including energy efficient lighting technologies, stationary fuel cell systems, and residential solar photovoltaics, and provide an overview and timeline of historical deployment programs such as state and federal standards and state and national incentive programs for each technology. Piecewise linear regimes are observed in a range of technology experience curves, and public investments or deployment programs are found to be strongly correlated to an increase in learning rate across multiple technologies. A downward bend in the experience curve is found in 5 out of the 8 energy-related technologies presented here (electronic ballasts, magnetic ballasts, compact fluorescent lighting, general service fluorescent lighting, and the installed cost of solar PV). In each of the five downward-bending experience curves, we believe that an increase in the learning rate can be linked to deployment programs to some degree. This work sheds light on the endogenous versus exogenous contributions to technological innovation and highlights the impact of exogenous government sponsored deployment programs. This work can inform future policy investment direction and can shed light on market transformation and technology learning behavior.

  6. Light Water Reactor Sustainability Program Industry Application External Hazard Analyses Problem Statement

    Energy Technology Data Exchange (ETDEWEB)

    Szilard, Ronaldo Henriques [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kammerer, Annie [Annie Kammerer Consulting, Rye, NH (United States); Youngblood, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pope, Chad [Idaho State Univ., Pocatello, ID (United States)

    2015-07-01

    This report presents the problem statement for the Risk-Informed Margin Management Industry Application on external events; more specifically, combined-event, seismically induced external flooding analyses for a generic nuclear power plant with a generic site soil and generic power plant systems and structures. The focus of this report is to define the problem above, set up the analysis, describe the methods to be used and the tools to be applied to each problem, and present the data analysis and validation associated with the above.

  7. A simple program to measure and analyse tree rings using Excel, R and SigmaScan

    Science.gov (United States)

    Hietz, Peter

    2011-01-01

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of ring boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand using already available code.
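
    The post-measurement steps the macros automate (detrending and inter-series correlation) are easy to mimic in a few lines. The sketch below is a NumPy stand-in on invented data, not the published VBA code, and uses a simple polynomial trend where dendrochronologists often prefer splines or negative exponential curves.

```python
import numpy as np

# Detrend each ring-width series with a fitted curve, then correlate the
# resulting dimensionless indices across trees.
def detrend(widths, deg=2):
    years = np.arange(len(widths))
    trend = np.polyval(np.polyfit(years, widths, deg), years)
    return widths / trend                     # ring-width indices

rng = np.random.default_rng(7)
years = np.arange(80)
common = 1 + 0.3 * np.sin(years / 5)                       # shared climate signal
series = [np.exp(-years / 60) * common * rng.lognormal(0, 0.1, 80)
          for _ in range(4)]                               # 4 hypothetical trees

idx = np.array([detrend(s) for s in series])
# Inter-series correlation: mean correlation of each series with the others
corr = np.corrcoef(idx)
inter = (corr.sum(axis=1) - 1) / (len(idx) - 1)
print("inter-series correlations:", np.round(inter, 2))
```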

  8. A simple program to measure and analyse tree rings using Excel, R and SigmaScan.

    Science.gov (United States)

    Hietz, Peter

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of ring boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand using already available code.

  9. IMPROVEMENTS IN HANFORD TRANSURANIC (TRU) PROGRAM UTILIZING SYSTEMS MODELING AND ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    UYTIOCO EM

    2007-11-12

    Hanford's Transuranic (TRU) Program is responsible for certifying contact-handled (CH) TRU waste and shipping the certified waste to the Waste Isolation Pilot Plant (WIPP). Hanford's CH TRU waste includes material that is in retrievable storage as well as above ground storage, and newly generated waste. Certifying a typical container entails retrieving and then characterizing it (Real-Time Radiography, Non-Destructive Assay, and Head Space Gas Sampling), validating records (data review and reconciliation), and designating the container for a payload. The certified payload is then shipped to WIPP. Systems modeling and analysis techniques were applied to Hanford's TRU Program to help streamline the certification process and increase shipping rates.

  10. Scanning electron microscopic analyses of Ferrocyanide tank wastes for the Ferrocyanide safety program

    Energy Technology Data Exchange (ETDEWEB)

    Callaway, W.S.

    1995-09-01

    This is the Fiscal Year 1995 Annual Report on the progress of activities relating to the application of scanning electron microscopy (SEM) in addressing the Ferrocyanide Safety Issue associated with Hanford Site high-level radioactive waste tanks. The status of the FY 1995 activities directed towards establishing facilities capable of providing SEM-based micro-characterization of ferrocyanide tank wastes is described. A summary of key events in the SEM task over FY 1995 and target activities in FY 1996 is presented. A brief overview of the potential applications of computer-controlled SEM analytical data, in light of analyses of ferrocyanide simulants performed by an independent contractor, is also presented.

  11. Case Study Analyses of the Impact of Flipped Learning in Teaching Programming Robots

    Directory of Open Access Journals (Sweden)

    Majlinda Fetaji

    2016-11-01

    The focus of this research study was to investigate the benefits of the flipped learning pedagogy for student learning in classes teaching robot programming, and to assess whether it has any advantages over traditional teaching methods in computer science. Learners' attitudes, motivation, and effectiveness were assessed for the flipped classroom and compared with the traditional classroom. The research questions investigated are: "What kind of problems can we face when we have robotics classes with the traditional methods?" and "If we apply the flipped learning method, can we solve these problems?". In order to analyze all this, a case study experiment was carried out, and insights as well as recommendations are presented.

  12. SUBCHANFLOW: a thermal hydraulic sub-channel program to analyse fuel rod bundles and reactor cores

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, V.; Imke, U.; Ivanov, A. [Karlsruhe Institute of Technology, Institute of Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Gomez, R., E-mail: Victor.Sanchez@kit.ed [University of Applied Sciences Offenburg, Badstr. 24, 77652 Offenburg (Germany)

    2010-10-15

    The improvement of numerical analysis tools for the design and safety evaluation of reactor cores is a continuous effort in the nuclear community, not only to improve plant efficiency but also to demonstrate a high degree of safety. Investigations at the Institute of Neutron Physics and Reactor Technology of the Karlsruhe Institute of Technology (KIT) are focused on the further development and qualification of subchannel and system codes using experimental data. The majority of sub-channel codes in use, like Thesis, Bacchus, Cobra and Matra, were developed in the seventies and eighties. The programming style is rather obsolete and most of these codes work internally with British units instead of SI units. In the case of water, outdated steam tables are used. Both the trend to improve the efficiency of light water reactors (LWR) and the involvement of KIT in European projects related to the study of the technical feasibility of different fast reactor systems reinforced the need for the development and improvement of sub-channel codes, since they will play a key role in performing better designs as stand-alone tools or coupled to neutron-physics codes (deterministic or stochastic). Hence, KIT started the development of a new sub-channel code, SUBCHANFLOW, based on the Cobra family. SUBCHANFLOW is a modular code programmed in Fortran-95 with dynamic memory allocation using SI units. Different fluids like liquid metals and water are available as coolants. In addition, some models were improved or replaced by new ones. In this paper the structure, the physical models and the current validation status are presented and discussed. (Author)

  13. INLAND DISSOLVED SALT CHEMISTRY: STATISTICAL EVALUATION OF BIVARIATE AND TERNARY DIAGRAM MODELS FOR SURFACE AND SUBSURFACE WATERS

    Science.gov (United States)

    We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models e...

  14. A semiparametric bivariate probit model for joint modeling of outcomes in STEMI patients.

    Science.gov (United States)

    Ieva, Francesca; Marra, Giampiero; Paganoni, Anna Maria; Radice, Rosalba

    2014-01-01

    In this work we analyse the relationship between in-hospital mortality and a treatment effectiveness outcome in patients affected by ST-Elevation myocardial infarction. The main idea is to carry out a joint modeling of the two outcomes, applying a semiparametric bivariate probit model to data arising from a clinical registry called the STEMI Archive. A realistic quantification of the relationship between outcomes can be problematic for several reasons. First, latent factors associated with hospital organization can affect the treatment efficacy and/or interact with the patient's condition at admission time. Moreover, they can also directly influence the mortality outcome. Such factors can be hardly measurable. Thus, the use of classical estimation methods will clearly result in inconsistent or biased parameter estimates. Secondly, covariate-outcome relationships can exhibit nonlinear patterns. Provided that proper statistical methods for model fitting in such a framework are available, it is possible to employ a simultaneous estimation approach to account for unobservable confounders. Such a framework can also provide flexible covariate structures and model the whole conditional distribution of the response.

  15. Semiparametric Maximum Likelihood Estimation in Normal Transformation Models for Bivariate Survival Data

    Science.gov (United States)

    Li, Yi; Prentice, Ross L.; Lin, Xihong

    2008-01-01

    We consider a class of semiparametric normal transformation models for right censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation parameter of the semiparametric normal transformation model, which characterizes the bivariate dependence of bivariate survival outcomes. In addition, a simple positive-mass-redistribution algorithm can be used to implement the estimation procedures. Since the likelihood function involves infinite-dimensional parameters, the empirical process theory is utilized to study the asymptotic properties of the proposed estimators, which are shown to be consistent, asymptotically normal and semiparametric efficient. A simple estimator for the variance of the estimates is also derived. The finite sample performance is evaluated via extensive simulations.

  16. Semiparametric Maximum Likelihood Estimation in Normal Transformation Models for Bivariate Survival Data.

    Science.gov (United States)

    Li, Yi; Prentice, Ross L; Lin, Xihong

    2008-12-01

    We consider a class of semiparametric normal transformation models for right censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation parameter of the semiparametric normal transformation model, which characterizes the bivariate dependence of bivariate survival outcomes. In addition, a simple positive-mass-redistribution algorithm can be used to implement the estimation procedures. Since the likelihood function involves infinite-dimensional parameters, the empirical process theory is utilized to study the asymptotic properties of the proposed estimators, which are shown to be consistent, asymptotically normal and semiparametric efficient. A simple estimator for the variance of the estimates is also derived. The finite sample performance is evaluated via extensive simulations.
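
    The transformation idea can be illustrated on complete (uncensored) data: map each margin to normal scores and estimate the cross-correlation there. The sketch below does only that; the paper's semiparametric estimator for right-censored data is substantially more involved.

```python
import numpy as np
from scipy.stats import norm, rankdata

# Complete-data illustration of the normal transformation idea: rank-based
# normal scores per margin, then the correlation of the scores. This is not
# the paper's censored-data estimator.
rng = np.random.default_rng(8)
n = 400
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
t1 = np.exp(z[:, 0])                  # bivariate failure times with dependence
t2 = np.exp(0.5 * z[:, 1])

def normal_scores(t):
    return norm.ppf(rankdata(t) / (len(t) + 1))

rho_hat = np.corrcoef(normal_scores(t1), normal_scores(t2))[0, 1]
print(f"estimated normal-transformation correlation: {rho_hat:.2f}")  # near 0.6
```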

  17. Dose-Finding Based on Bivariate Efficacy-Toxicity Outcome Using Archimedean Copula: e78805

    National Research Council Canada - National Science Library

    Yuxi Tao; Junlin Liu; Zhihui Li; Jinguan Lin; Tao Lu; Fangrong Yan

    2013-01-01

    ... constructed with Archimedean Copula, and extend the continual reassessment method (CRM) to a bivariate trial design in which the optimal dose for phase III is based on both efficacy and toxicity...

  18. Semiparametric maximum likelihood estimation in normal transformation models for bivariate survival data

    OpenAIRE

    Yi Li; Ross L. Prentice; Xihong Lin

    2008-01-01

    We consider a class of semiparametric normal transformation models for right-censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation ...

  19. DNA Radiation Environments Program - Spring 1990 2-meter box experiments and analyses

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. (Oak Ridge National Lab., TN (United States)); Whitaker, S.Y. (Clark Atlanta Univ., GA (United States))

    1992-09-01

    This report summarizes the Spring 1990 2-m Box Experiments performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground, Maryland. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. MASH was developed as the Department of Defense and NATO code system for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the experiments, neutron and gamma-ray dose and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside the steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The measurements were performed by APRF, Bubble Technology Industries, the Defence Research Establishment Ottawa, the Établissement Technique Central de l'Armement, and Harry Diamond Laboratory. Calculations were carried out by the Oak Ridge National Laboratory and Science Applications International Corporation. The purpose of these experiments was to measure the neutron and gamma-ray dose as a function of detector location on the phantom for cases when the phantom was standing in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. DNA mandated that C/M values of ±20% define the acceptable limits for the comparison of the dose and reduction factor data and for qualifying the MASH code in replicating integral parameters.

  20. DNA Radiation Environments Program Spring 1991 2-meter box experiments and analyses

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. [Oak Ridge National Lab., TN (United States); Whitaker, S.Y. [Clark Atlanta Univ., GA (United States)

    1993-03-01

    This report summarizes the Spring 1991 2-m Box experiments that were performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. The MASH code system was developed for the Department of Defense and NATO for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the 2-m Box experiments, neutron and gamma-ray dose rates and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside a borated polyethylene lined steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The purpose of these experiments was to measure the neutron and gamma-ray dose rates as a function of detector location on the phantom for cases when the phantom was in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. The calculated and measured neutron and gamma-ray dose rates and reduction factors agreed on the average within the ±20% limits mandated by DNA and demonstrate the capability of the MASH code system in reproducing measured data in nominally shielded assemblies.

  3. Economic and policy instrument analyses in support of the scrap tire recycling program in Taiwan.

    Science.gov (United States)

    Chang, Ni-Bin

    2008-02-01

    Understanding the cost-effectiveness and the role of economic and policy instruments, such as the combined product tax-recycling subsidy scheme or a tradable permit, for scrap tire recycling has been of crucial importance in a market-oriented environmental management system. Promoting product (tire) stewardship on one hand and improving incentive-based recycling policy on the other hand requires a comprehensive analysis of the interfaces and interactions in the nexus of economic impacts, environmental management, environmental valuation, and cost-benefit analysis. This paper presents an assessment of the interfaces and interactions between the implementation of policy instruments and its associated economic evaluation for sustaining a scrap tire recycling program in Taiwan during the era of strong economic growth in the late 1990s. It begins with an introduction of the management of the co-evolution between technology metrics of scrap tire recycling and organizational changes for meeting the managerial goals island-wide during the 1990s. The database collected and used for such analysis covers 17 major tire recycling firms and 10 major tire manufacturers at that time. With estimates of scrap tire generation and the possible scale of subsidy with respect to the differing tire recycling technologies applied, the economic analysis eventually leads to identifying the associated levels of product tax with respect to various sizes of new tires. It particularly demonstrates a broad perspective of how an integrated econometric and engineering economic analysis can be conducted to assist in implementing policy instruments for scrap tire management. Research findings indicate that different subsidy settings for collection, processing, and end use of scrap tires should be configured to ameliorate the overall managerial effectiveness. Removing the existing boundaries between designated service districts could strengthen the competitiveness of the scrap tire recycling industry, helping to

  4. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization

    OpenAIRE

    Vink, Margaretha A.; Berkhof, Johannes; van de Kassteele, Jan; van Boven, Michiel; Bogaards, Johannes A

    2016-01-01

    Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for join...

  5. Impacts Analyses Supporting the National Environmental Policy Act Environmental Assessment for the Resumption of Transient Testing Program

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Annette L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brown, LLoyd C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Carathers, David C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Christensen, Boyd D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dahl, James J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miller, Mark L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farnum, Cathy Ottinger [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peterson, Steven [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sondrup, A. Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Subaiya, Peter V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wachs, Daniel M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Weiner, Ruth F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-02-01

    This document contains the analysis details and summary of analyses conducted to evaluate the environmental impacts for the Resumption of Transient Fuel and Materials Testing Program. It provides an assessment of the impacts for the two action alternatives being evaluated in the environmental assessment. These alternatives are (1) resumption of transient testing using the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory (INL) and (2) conducting transient testing using the Annular Core Research Reactor (ACRR) at Sandia National Laboratory in New Mexico (SNL/NM). Analyses are provided for radiologic emissions, other air emissions, soil contamination, and groundwater contamination that could occur (1) during normal operations, (2) as a result of accidents in one of the facilities, and (3) during transport. It does not include an assessment of the biotic, cultural resources, waste generation, or other impacts that could result from the resumption of transient testing. Analyses were conducted by technical professionals at INL and SNL/NM as noted throughout this report. The analyses are based on bounding radionuclide inventories, with the same inventories used for test materials by both alternatives and different inventories for the TREAT Reactor and ACRR. An upper value on the number of tests was assumed, with a test frequency determined by the realistic turn-around times required between experiments. The estimates provided for impacts during normal operations are based on historical emission rates and projected usage rates; therefore, they are bounding. Estimated doses for members of the public, collocated workers, and facility workers that could be incurred as a result of an accident are very conservative. They do not credit safety systems or administrative procedures (such as evacuation plans or use of personal protective equipment) that could be used to limit worker doses. Doses estimated for transportation are conservative and are based on

  6. An assessment of non-randomized medical treatment of long-term schizophrenia relapse using bivariate binary-response transition models.

    Science.gov (United States)

    Ten Have, Thomas R; Morabia, Alfredo

    2002-03-01

    The analyses of observational longitudinal studies involving concurrent changes in treatment and medical conditions present difficulties because of the multitude of directions of potential relationships: past medication influences current symptoms; past symptoms influence current medication; and current medication is associated with current symptoms. In the context of a long-term study of non-randomized pharmacological treatment of schizophrenic relapse, we present an analysis of bivariate discrete-time transitional data with binary responses in an attempt to understand the transitional and concurrent relationships between schizophrenia relapse and medication use. A naive analysis does not show any association between previous medication and current relapse. However, we provide evidence suggesting that current treatment may impact current relapse for those who have previously taken medication, but not for those who have not taken medication in the past. When univariate models are specified to assess these associations, the bivariate nature of the problem requires a choice of which response, relapse or medication, should be the dependent variable. In this case, the choice of relapse or medication as a dependent variable does matter. Hence, our results derive from models where both relapse and medication are treated as dependent variables. Specifically, we specify a bivariate log odds ratio for current relapse and current medication use and a separate univariate logit component for each of these outcomes. Each of these components contains transitional associations with previous relapse and medication. Such models represent extensions of univariate transitional association models (e.g. Diggle et al. (1994)) and correspond to bivariate transitional models (e.g. Zeger and Liang (1991)). We incorporate changes in transitional associations into the full-data parametric model for final inference, and investigate whether these temporal changes are due to learning effects or the
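
    The parameterization sketched in this abstract (two marginal logits plus a log odds ratio for the current pair of outcomes) implies a closed-form mapping from the two marginal probabilities and the odds ratio to the four joint cell probabilities of the 2x2 table. A hedged sketch of that mapping, with made-up numbers:

```python
import numpy as np

def joint_from_margins_and_or(p1, p2, psi):
    """Cell probabilities of a 2x2 table with margins p1, p2 and odds ratio psi."""
    if np.isclose(psi, 1.0):
        p11 = p1 * p2                                  # independence
    else:
        a = 1.0 + (p1 + p2) * (psi - 1.0)
        p11 = (a - np.sqrt(a ** 2 - 4.0 * psi * (psi - 1.0) * p1 * p2)) \
              / (2.0 * (psi - 1.0))
    p10, p01 = p1 - p11, p2 - p11
    return p11, p10, p01, 1.0 - p11 - p10 - p01

# e.g. P(relapse) = 0.3, P(medication) = 0.6, odds ratio 2.5 (all illustrative)
print(joint_from_margins_and_or(0.3, 0.6, 2.5))
```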

  7. Current misuses of multiple regression for investigating bivariate hypotheses: an example from the organizational domain.

    Science.gov (United States)

    O'Neill, Thomas A; McLarnon, Matthew J W; Schneider, Travis J; Gardner, Robert C

    2014-09-01

    By definition, multiple regression (MR) considers more than one predictor variable, and each variable's beta will depend on both its correlation with the criterion and its correlation with the other predictor(s). Despite ad nauseam coverage of this characteristic in organizational psychology and statistical texts, researchers' applications of MR in bivariate hypothesis testing have been the subject of recent and renewed interest. Accordingly, we conducted a targeted survey of the literature by coding articles from two top-tier organizational journals, covering a five-year span, that employed MR for testing bivariate relations. The results suggest that MR coefficients, rather than correlation coefficients, were most common for testing hypotheses of bivariate relations, yet supporting theoretical rationales were rarely offered. Regarding the potential impact on scientific advancement, in almost half of the articles reviewed (44%), at least one conclusion of each study (i.e., that the hypothesis was or was not supported) would have been different, depending on the author's use of correlation or beta to test the bivariate hypothesis. It follows that inappropriate decisions to interpret the correlation versus the beta will affect the accumulation of consistent and replicable scientific evidence. We conclude with recommendations for improving bivariate hypothesis testing.
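
    A toy demonstration of the article's point, using simulated data with correlated predictors: the zero-order correlation and the multiple-regression beta for the same predictor can support different conclusions, so the choice between them needs a theoretical rationale.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(scale=0.5, size=n)      # x2 correlated with x1
y = 0.4 * x1 + 0.4 * x2 + rng.normal(size=n)

r, p_r = stats.pearsonr(x1, y)                     # bivariate test via correlation
fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print(f"r = {r:.2f} (p = {p_r:.3g}); "
      f"beta_x1 = {fit.params[1]:.2f} (p = {fit.pvalues[1]:.3g})")
```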

  8. A Comparative Study on DDF Curve with Bivariate and Univariate Model

    Science.gov (United States)

    Joo, K.; Choi, S.; Heo, J.

    2012-12-01

    The DDF (or IDF) curve relates rainfall depth (or intensity), duration, and frequency, and it is useful for seeing how rainfall changes under various conditions. Moreover, multivariate frequency analysis has recently been applied in hydrology because of its scalability. In this study, to obtain DDF curves, rainfall quantiles are estimated by both univariate and bivariate (rainfall depth and duration) frequency analysis. For the bivariate model, three copula models (Frank, Gumbel-Hougaard, and Joe) are used. Copula models have been studied widely in various fields, and they are more flexible with respect to the marginal distributions than other conventional bivariate models. Hourly recorded data (1961-2010) of the Seoul weather station from the Korea Meteorological Administration (KMA) are used for the frequency analysis, and an inter-event time definition is used for the identification of rainfall events. To estimate the parameters of the copula models, the maximum pseudo-likelihood estimation method, a semi-parametric method, is used. The Gumbel distribution is examined and used for rainfall depth, and the generalized extreme value (GEV) distribution is examined and used for duration. As a result, four DDF curves are obtained (univariate and the three copula models). Compared with the univariate model, the rainfall quantiles of the bivariate models are less affected by duration. In detail, the Frank model shows the closest trend along the duration, and the Joe model shows little change along the duration. The change of rainfall quantile along the duration from the bivariate models is less significant than from the univariate model as the nonexceedance probability varies.
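
    For readers who want to reproduce the copula step, the sketch below implements maximum pseudo-likelihood estimation of the Gumbel-Hougaard copula parameter from rank-transformed depth and duration samples. The density is the standard one for this family; the PDS extraction and marginal fitting described in the record are omitted, and the data are simulated.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def gumbel_logpdf(u, v, theta):
    w1, w2 = -np.log(u), -np.log(v)
    A = w1 ** theta + w2 ** theta
    return (-A ** (1.0 / theta)                       # log C(u, v)
            + (theta - 1.0) * (np.log(w1) + np.log(w2))
            - np.log(u) - np.log(v)
            + (1.0 / theta - 2.0) * np.log(A)
            + np.log(A ** (1.0 / theta) + theta - 1.0))

def fit_gumbel(x, y):
    n = len(x)
    u = rankdata(x) / (n + 1.0)                       # pseudo-observations
    v = rankdata(y) / (n + 1.0)
    nll = lambda th: -np.sum(gumbel_logpdf(u, v, th))
    return minimize_scalar(nll, bounds=(1.001, 20.0), method="bounded").x

rng = np.random.default_rng(6)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
depth, duration = np.exp(z[:, 0]), np.exp(z[:, 1])    # stand-in positive data
print("theta_hat =", fit_gumbel(depth, duration))
```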

  9. Two new bivariate zero-inflated generalized Poisson distributions with a flexible correlation structure

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2015-05-01

    To model correlated bivariate count data with extra zero observations, this paper proposes two new bivariate zero-inflated generalized Poisson (ZIGP) distributions by incorporating a multiplicative factor (or dependency parameter) λ, named the Type I and Type II bivariate ZIGP distributions, respectively. The proposed distributions possess a flexible correlation structure and can be used to fit either positively or negatively correlated and either over- or under-dispersed count data, in contrast to existing models, which can only fit positively correlated count data with over-dispersion. The two marginal distributions of the Type I bivariate ZIGP share a common parameter of zero inflation, while the two marginal distributions of the Type II bivariate ZIGP have their own parameters of zero inflation, resulting in a much wider range of applications. The important distributional properties are explored, and some useful statistical inference methods, including maximum likelihood estimation of parameters, standard error estimation, bootstrap confidence intervals, and related hypothesis tests, are developed for the two distributions. A real data set is thoroughly analyzed using the proposed distributions and statistical methods. Several simulation studies are conducted to evaluate the performance of the proposed methods.

  10. Consumer Loyalty and Loyalty Programs: a topographic examination of the scientific literature using bibliometrics, spatial statistics and network analyses

    Directory of Open Access Journals (Sweden)

    Viviane Moura Rocha

    2015-04-01

    This paper presents a topographic analysis of the fields of consumer loyalty and loyalty programs, widely studied in recent decades and still relevant in the marketing literature. After the identification of 250 scientific papers published in indexed journals over the last ten years, a subset of 76 was chosen and their 3223 references were extracted. The journals in which these papers were published, their keywords, abstracts, authors, institutions of origin and citation patterns were identified and analyzed using bibliometrics, spatial statistics techniques and network analyses. The results allow the identification of the central components of the field, as well as its main authors, journals, institutions and countries that intermediate the diffusion of knowledge, which contributes to the understanding of the constitution of the field by researchers and students.

  11. A FORTRAN 77 Program and User's Guide for the Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C.; Shortencarier, Michael J.

    1999-08-01

    A description and user's guide are given for a computer program, PATTRN, developed at Sandia National Laboratories for use in sensitivity analyses of complex models. This program is intended for use in the analysis of input-output relationships in Monte Carlo analyses when the input has been selected using random or Latin hypercube sampling. Procedures incorporated into the program are based upon attempts to detect increasingly complex patterns in scatterplots and involve the detection of linear relationships, monotonic relationships, trends in measures of central tendency, trends in measures of variability, and deviations from randomness. The program was designed to be easy to use and portable.
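
    PATTRN itself is FORTRAN 77; as a loose modern analogue of one of its pattern tests (detecting monotonic input-output relationships in Latin hypercube samples), one might do the following. The toy simulator, dimension, and sample size are assumptions.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=2)
x = sampler.random(200)                          # 200 Latin hypercube samples
y = np.sin(2 * x[:, 0]) + 0.1 * x[:, 1]          # stand-in for a complex model

for j in range(x.shape[1]):                      # rank-based monotonicity test
    rho, p = stats.spearmanr(x[:, j], y)
    print(f"input {j}: Spearman rho = {rho:+.2f} (p = {p:.3g})")
```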

  12. A double SIMEX approach for bivariate random-effects meta-analysis of diagnostic accuracy studies

    Directory of Open Access Journals (Sweden)

    Annamaria Guolo

    2017-01-01

    Background: Bivariate random-effects models represent a widely accepted and recommended approach for meta-analysis of test accuracy studies. Standard likelihood methods routinely used for inference are prone to several drawbacks. Small sample size can give rise to unreliable inferential conclusions and convergence issues make the approach unappealing. This paper suggests a different methodology to address such difficulties. Methods: A SIMEX methodology is proposed. The method is a simulation-based technique originally developed as a correction strategy within the measurement error literature. It suits the meta-analysis framework as the diagnostic accuracy measures provided by each study are prone to measurement error. SIMEX can be straightforwardly adapted to cover different measurement error structures and to deal with covariates. The effortless implementation with standard software is an interesting feature of the method. Results: Extensive simulation studies highlight the improvement provided by SIMEX over the likelihood approach in terms of empirical coverage probabilities of confidence intervals under different scenarios, independently of the sample size and the values of the correlation between sensitivity and specificity. A remarkable amelioration is obtained in case of deviations from the normality assumption for the random-effects distribution. From a computational point of view, the application of SIMEX is shown to be neither involved nor subject to the convergence issues affecting likelihood-based alternatives. Application of the method to a diagnostic review of the performance of transesophageal echocardiography for assessing ascending aorta atherosclerosis enables overcoming limitations of the likelihood procedure. Conclusions: The SIMEX methodology represents an interesting alternative to likelihood-based procedures for inference in meta-analysis of diagnostic accuracy studies. The approach can provide more accurate inferential
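
    To make the SIMEX idea concrete outside the meta-analysis setting, here is a minimal univariate sketch: extra measurement error is added at increasing multiples lambda, the naive estimator is recomputed at each level, and a quadratic in lambda is extrapolated back to lambda = -1. The data, error variance, and extrapolant are illustrative choices, not the paper's bivariate implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
n, su = 500, 0.8                        # su: known measurement-error s.d.
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
w = x + rng.normal(scale=su, size=n)    # error-prone version of x

lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
betas = [np.mean([np.polyfit(w + rng.normal(scale=np.sqrt(lam) * su, size=n),
                             y, 1)[0]
                  for _ in range(50)])  # SIMulation step: inflate the error
         for lam in lams]

coef = np.polyfit(lams, betas, 2)       # EXtrapolation step to lambda = -1
print("naive:", round(betas[0], 3), " SIMEX:", round(np.polyval(coef, -1.0), 3))
```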

  13. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analyses of natural disasters, and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
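
    A sketch of the joint ("AND") return-period computation under an assumed Gumbel-Hougaard dependence structure; the record does not state which copula family was selected, so the family, its parameter, and the mean inter-arrival time below are placeholders.

```python
import numpy as np

def gumbel_C(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu_years):
    """Return period of exceeding BOTH thresholds with non-exceedance probs u, v."""
    p_exceed_both = 1.0 - u - v + gumbel_C(u, v, theta)
    return mu_years / p_exceed_both

# ~4 storm events/year => mu ~ 0.25 yr; 0.9-quantiles of wind speed and duration
print(joint_return_period_and(0.9, 0.9, theta=2.0, mu_years=0.25))
```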

  14. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  15. Review of a licensed dengue vaccine: Inappropriate subgroup analyses and selective reporting may cause harm in mass vaccination programs.

    Science.gov (United States)

    Dans, Antonio L; Dans, Leonila F; Lansang, Mary Ann D; Silvestre, Maria Asuncion A; Guyatt, Gordon H

    2017-11-24

    Severe life-threatening dengue fever usually occurs when a child is infected by dengue virus a second time. This is caused by a phenomenon called antibody-dependent enhancement (ADE). Since dengue vaccines can mimic a first infection in seronegative children (those with no previous infection), a natural infection later in life could lead to severe disease. The possibility that dengue vaccines can cause severe dengue through ADE has led to serious concern regarding the safety of mass vaccination programs. A published meta-analysis addressed this safety issue for a new vaccine against dengue fever - Dengvaxia™. The trials in this meta-analysis have been used to campaign for mass vaccination programs in developing countries. We discuss the results of this paper and point out problems in the analyses. Reporting the findings of an Asian trial (CYD14), the authors show a 7-fold rise in one outcome - hospitalization for dengue fever - in younger children, but downplay the harm for another outcome - hospitalization for severe dengue fever (as confirmed by an independent data monitoring committee): 1. in children younger than 9 years, the relative risk was 8.5 [95% CI 0.5, 146.8], and 2. in the overall study group, the relative risk was 5.5 [95% CI: 0.9, 33]. The authors conduct a subgroup analysis to support claims that the vaccine is probably safe among children aged 9 years or more. This subgroup analysis has limited credibility because: 1) it was a post-hoc analysis; 2) it was one of a large number of subgroup analyses; 3) the test of interaction was not reported, but was not significant (p=0.14); and 4) there is no biological basis for a threshold age of 9 years. The more likely explanation for the higher risk in younger children is ADE, that is, more frequent seronegativity, rather than age itself. The selective reporting and inappropriate subgroup claims mask the potential harm of dengue mass vaccination programs. Countries planning public use of the vaccine must conduct diligent post

  16. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2017-10-21

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m ≥ 50). When the

  17. Accuracy of body mass index in predicting pre-eclampsia: bivariate meta-analysis

    NARCIS (Netherlands)

    Cnossen, J. S.; Leeflang, M. M. G.; de Haan, E. E. M.; Mol, B. W. J.; van der Post, J. A. M.; Khan, K. S.; ter Riet, G.

    2007-01-01

    OBJECTIVE: The objective of this study was to determine the accuracy of body mass index (BMI) (pre-pregnancy or at booking) in predicting pre-eclampsia and to explore its potential for clinical application. DESIGN: Systematic review and bivariate meta-analysis. SETTING: Medline, Embase, Cochrane

  18. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with a single parameter that defines both the mean and the variance. Poisson regression therefore assumes that the mean and variance are equal (equidispersion). Nonetheless, count data often violate this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion leads to underestimated standard errors and, in turn, to incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution; when over-dispersion is present, simple bivariate Poisson regression is not sufficient for modeling them. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion, but it produces a single global model for all locations. Since each location has different geographic, social, cultural, and economic conditions, Geographically Weighted Regression (GWR) is needed; the weighting function at each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is therefore used to handle over-dispersion and to generate local models. Parameter estimates of the GWBPIGR model are obtained by the Maximum Likelihood Estimation (MLE) method, while hypothesis tests for the GWBPIGR model are based on the Maximum Likelihood Ratio Test (MLRT) method.
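
    The geographic-weighting step is the easiest part to show in code: each location gets its own local fit, with kernel weights that decay with distance to the focal point. A minimal sketch with an assumed Gaussian kernel and bandwidth (the full bivariate Poisson inverse Gaussian likelihood is beyond a short example):

```python
import numpy as np

def gwr_weights(coords, focal, bandwidth):
    """Gaussian kernel weights for the local regression at one focal location."""
    d = np.linalg.norm(coords - focal, axis=1)
    return np.exp(-0.5 * (d / bandwidth) ** 2)

coords = np.array([[0.0, 0.0], [1.0, 0.5], [3.0, 2.0]])
print(gwr_weights(coords, focal=np.array([0.0, 0.0]), bandwidth=1.5))
```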

  19. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    Science.gov (United States)

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretations of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  20. Sub-super-stabilizability of certain bivariate means via mean-convexity

    Directory of Open Access Journals (Sweden)

    Mustapha Raïssouli

    2016-11-01

    In this paper, we first show that the first Seiffert mean P is concave whereas the second Seiffert mean T and the Neuman-Sándor mean NS are convex. As applications, we establish the sub-stabilizability/super-stabilizability of certain bivariate means. Open problems are derived as well.

  1. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    Science.gov (United States)

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  2. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    Our numerical and theoretical results indicate that using BVRSS for the matched pairs sign test is substantially more efficient than using BVSRS. An illustration using palm tree data from the Sultanate of Oman is provided. Key words: Bootstrap method, bivariate ranked set sample, power of the test, P-value of the test, Pitman's ...

  3. Latent class bivariate model for the meta-analysis of diagnostic test accuracy studies

    NARCIS (Netherlands)

    Eusebi, P.; Reitsma, J.B.; Vermunt, J.K.

    2014-01-01

    Background Several types of statistical methods are currently available for the meta-analysis of studies on diagnostic test accuracy. One of these methods is the Bivariate Model which involves a simultaneous analysis of the sensitivity and specificity from a set of studies. In this paper, we review

  4. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation...

  5. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    In this article we use the concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation...

  6. Technical note: Towards a continuous classification of climate using bivariate colour mapping

    NARCIS (Netherlands)

    Teuling, A.J.

    2011-01-01

    Climate is often defined in terms of discrete classes. Here I use bivariate colour mapping to show that the global distribution of Köppen-Geiger climate classes can largely be reproduced by combining the simple means of two key states of the climate system (i.e., air temperature and relative

  7. Bivariate analysis of sensitivity and specificity produces informative summary measures in diagnostic reviews

    NARCIS (Netherlands)

    Reitsma, Johannes B.; Glas, Afina S.; Rutjes, Anne W. S.; Scholten, Rob J. P. M.; Bossuyt, Patrick M.; Zwinderman, Aeilko H.

    2005-01-01

    Background and Objectives: Studies of diagnostic accuracy most often report pairs of sensitivity and specificity. We demonstrate the advantage of using bivariate meta-regression models to analyze such data. Methods: We discuss the methodology of both the summary Receiver Operating Characteristic

  8. A model for gust amplitude and gust length based on the bivariate gamma probability distribution function

    Science.gov (United States)

    Smith, O. E.; Adelfang, S. I.

    1981-01-01

    A model of the largest gust amplitude and gust length is presented which uses the properties of the bivariate gamma distribution. The gust amplitude and length are strongly dependent on the filter function; the amplitude increases with altitude and is larger in winter than in summer.

  9. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  10. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  11. A Bivariate Mixed Distribution with a Heavy-tailed Component and its Application to Single-site Daily Rainfall Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Chao; Singh, Vijay P.; Mishra, Ashok K.

    2013-02-06

    This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, its utility for single-site daily rainfall simulation is presented here. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three other alternative models: the conventional two-state Markov chain generator, the transition probability matrix model and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of available observed data. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way
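
    For contrast, the conventional two-state Markov chain generator used here as a comparison model is easy to sketch: a wet/dry chain drives occurrences and an independent gamma draw supplies wet-day amounts. The transition probabilities and gamma parameters below are illustrative, not fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
p01, p11 = 0.25, 0.65                   # P(wet | dry), P(wet | wet)
shape, scale = 0.7, 9.0                 # gamma parameters for wet-day amounts (mm)

def simulate(n_days):
    wet, amounts = 0, []
    for _ in range(n_days):
        wet = int(rng.random() < (p11 if wet else p01))
        amounts.append(rng.gamma(shape, scale) if wet else 0.0)
    return np.array(amounts)

r = simulate(365)
print(f"wet days: {(r > 0).sum()}, annual total: {r.sum():.0f} mm")
```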

  12. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males

    National Research Council Canada - National Science Library

    Liu, Yao-Zhong; Pei, Yu-Fang; Liu, Jian-Feng; Yang, Fang; Guo, Yan; Zhang, Lei; Liu, Xiao-Gang; Yan, Han; Wang, Liang; Zhang, Yin-Ping; Levy, Shawn; Recker, Robert R; Deng, Hong-Wen

    2009-01-01

    .... Hence this approach is difficult to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically...

  13. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  14. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  15. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    The traditional rainfall intensity-duration-frequency (IDF) curve is a reliable approach for representing the variation of rainfall intensity with duration for a given return period. In reality the rainfall variables intensity, depth and duration are dependent, and therefore a bivariate analysis using copulas can give a more accurate IDF curve. We study IDF curves using a copula in a bivariate frequency analysis of extreme rainfall. To be able to choose the most suitable copula among candidate copulas (i.e., Gumbel, Clayton, and Frank), we demonstrated IDF curves based on variation of depth … of individual rainfall events; and (3) by storage volume and duration. In each case we used partial duration series (PDS) to extract extreme rainfall variables. The DDF curves derived from each method are presented and compared. This study examines extreme rainfall data from catchment Vedbæk Renseanlæg…

  16. A spatial bivariate probit model for correlated binary data with application to adverse birth outcomes.

    Science.gov (United States)

    Neelon, Brian; Anthopolos, Rebecca; Miranda, Marie Lynn

    2014-04-01

    Motivated by a study examining geographic variation in birth outcomes, we develop a spatial bivariate probit model for the joint analysis of preterm birth and low birth weight. The model uses a hierarchical structure to incorporate individual and areal-level information, as well as spatially dependent random effects for each spatial unit. Because rates of preterm birth and low birth weight are likely to be correlated within geographic regions, we model the spatial random effects via a bivariate conditionally autoregressive prior, which induces regional dependence between the outcomes and provides spatial smoothing and sharing of information across neighboring areas. Under this general framework, one can obtain region-specific joint, conditional, and marginal inferences of interest. We adopt a Bayesian modeling approach and develop a practical Markov chain Monte Carlo computational algorithm that relies primarily on easily sampled Gibbs steps. We illustrate the model using data from the 2007-2008 North Carolina Detailed Birth Record.

  17. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic conditions. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range on the same global dataset, especially in regions with higher NDVI values, highlighting the importance of building the joint distribution to take into account the dependence structure of ω and NDVI, which provides a more accurate probabilistic evaluation of water availability.
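
    The record does not reproduce Wang-Tang's equation, so as a stand-in the sketch below evaluates water availability P - E along Fu's closely related one-parameter (omega) Budyko curve for a few omega values, much as one would when propagating a fitted omega-given-NDVI distribution. All numbers are illustrative.

```python
import numpy as np

def fu_budyko(P, PET, omega):
    """Fu's Budyko-type curve: evaporative fraction E/P as a function of
    the aridity index PET/P and a catchment parameter omega."""
    phi = PET / P
    return 1.0 + phi - (1.0 + phi ** omega) ** (1.0 / omega)

P, PET = 800.0, 1000.0                       # mm/yr, illustrative
for omega in (1.8, 2.6, 3.4):                # e.g. draws from omega | NDVI
    E = fu_budyko(P, PET, omega) * P
    print(f"omega = {omega}: water availability P - E = {P - E:.0f} mm/yr")
```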

  18. Modeling two-vehicle crash severity by a bivariate generalized ordered probit approach.

    Science.gov (United States)

    Chiou, Yu-Chiun; Hwang, Cherng-Chwan; Chang, Chih-Chin; Fu, Chiang

    2013-03-01

    This study simultaneously models crash severity of both parties in two-vehicle accidents at signalized intersections in Taipei City, Taiwan, using a novel bivariate generalized ordered probit (BGOP) model. Estimation results show that the BGOP model performs better than the conventional bivariate ordered probit (BOP) model in terms of goodness-of-fit indices and prediction accuracy and provides a better approach to identify the factors contributing to different severity levels. According to estimated parameters in latent propensity functions and elasticity effects, several key risk factors are identified: driver type (age>65), vehicle type (motorcycle), violation type (alcohol use), intersection type (three-leg and multiple-leg), collision type (rear-ended), and lighting conditions (night and night without illumination). Corresponding countermeasures for these risk factors are proposed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Reprint of "Modeling two-vehicle crash severity by a bivariate generalized ordered probit approach".

    Science.gov (United States)

    Chiou, Yu-Chiun; Hwang, Cherng-Chwan; Chang, Chih-Chin; Fu, Chiang

    2013-12-01

    This study simultaneously models crash severity of both parties in two-vehicle accidents at signalized intersections in Taipei City, Taiwan, using a novel bivariate generalized ordered probit (BGOP) model. Estimation results show that the BGOP model performs better than the conventional bivariate ordered probit (BOP) model in terms of goodness-of-fit indices and prediction accuracy and provides a better approach to identify the factors contributing to different severity levels. According to estimated parameters in latent propensity functions and elasticity effects, several key risk factors are identified: driver type (age>65), vehicle type (motorcycle), violation type (alcohol use), intersection type (three-leg and multiple-leg), collision type (rear-ended), and lighting conditions (night and night without illumination). Corresponding countermeasures for these risk factors are proposed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. The bivariate probit model of uncomplicated control of tumor: a heuristic exposition of the methodology.

    Science.gov (United States)

    Herbert, D

    1997-08-01

    To describe the concept, models, and methods for the construction of estimates of joint probability of uncomplicated control of tumors in radiation oncology. Interpolations using this model can lead to the identification of more efficient treatment regimens for an individual patient. The requirement to find the treatment regimen that will maximize the joint probability of uncomplicated control of tumors suggests a new class of evolutionary experimental designs--Response Surface Methods--for clinical trials in radiation oncology. The software developed by Lesaffre and Molenberghs is used to construct bivariate probit models of the joint probability of uncomplicated control of cancer of the oropharynx from a set of 45 patients for each of whom the presence/absence of recurrent tumor (the binary event E1/Ē1) and the presence/absence of necrosis (the binary event E2/Ē2) of the normal tissues of the target volume is recorded, together with the treatment variables dose, time, and fractionation. The bivariate probit model can be used to select a treatment regime that will give a specified probability, say P(S) = 0.60, of uncomplicated control of tumor by interpolation within a set of treatment regimens with known outcomes of recurrence and necrosis. The bivariate probit model can be used to guide a sequence of clinical trials to find the maximum probability of uncomplicated control of tumor for patients in a given prognostic stratum using Response Surface Methods by extrapolation from an initial set of treatment regimens. The design of treatments for individual patients and the design of clinical trials might be improved by use of a bivariate probit model and Response Surface Methods.
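
    Under a bivariate probit, the joint probability of uncomplicated control is the probability that both latent adverse-event variables (recurrence and necrosis) fall below their thresholds, i.e. a bivariate normal orthant probability. A sketch with illustrative linear predictors and correlation:

```python
import numpy as np
from scipy.stats import multivariate_normal

def p_uncomplicated_control(eta_rec, eta_nec, rho):
    """P(no recurrence AND no necrosis): both latent probit variables
    fall below their thresholds."""
    return multivariate_normal.cdf([-eta_rec, -eta_nec], mean=[0.0, 0.0],
                                   cov=[[1.0, rho], [rho, 1.0]])

# illustrative values for one dose/time/fractionation scheme
print(p_uncomplicated_control(eta_rec=-0.3, eta_nec=-0.9, rho=0.2))
```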

  1. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    OpenAIRE

    Agrawal, Purshottam Narain; Baxhaku, Behar; Chauhan, Ruchi

    2017-01-01

    In this paper, we introduce a bivariate Kantorovich variant of a combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre's K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bög...

  2. Bipolar and bivariate models in multi-criteria decision analysis: descriptive and constructive approaches

    OpenAIRE

    Grabisch, Michel; Greco, Salvatore; Pirlot, Marc

    2008-01-01

    Multi-criteria decision analysis studies decision problems in which the alternatives are evaluated on several dimensions or viewpoints. In the problems we consider in this paper, the scales used for assessing the alternatives with respect to a viewpoint are bipolar and univariate or unipolar and bivariate. In the former case, the scale is divided in two zones by a neutral point; a positive feeling is associated to the zone above the neutral point and a negative feeling...

  3. Self-consistent nonparametric maximum likelihood estimator of the bivariate survivor function

    OpenAIRE

    R. L. Prentice

    2014-01-01

    As usually formulated the nonparametric likelihood for the bivariate survivor function is overparameterized, resulting in uniqueness problems for the corresponding nonparametric maximum likelihood estimator. Here the estimation problem is redefined to include parameters for marginal hazard rates, and for double failure hazard rates only at informative uncensored failure time grid points where there is pertinent empirical information. Double failure hazard rates at other grid points in the ris...

  4. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.

  5. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.

  6. Genetic relationship of discrete-time survival with fertility and production in dairy cattle using bivariate models

    Directory of Open Access Journals (Sweden)

    Alenda Rafael

    2007-07-01

    Full Text Available Abstract Bivariate analyses of functional longevity in dairy cattle, measured as survival to next lactation (SURV), with milk yield and fertility traits were carried out. A sequential threshold-linear censored model was implemented for the analyses of SURV. Records on 96 642 lactations from 41 170 cows were used to estimate genetic parameters, using animal models, for longevity, 305 d-standardized milk production (MY305), days open (DO) and number of inseminations to conception (INS) in the Spanish Holstein population; 31% and 30% of lactations were censored for DO and INS, respectively. Heritability estimates for SURV and MY305 were 0.11 and 0.27, respectively, while heritability estimates for fertility traits were lower (0.07 for DO and 0.03 for INS). Antagonistic genetic correlations were estimated between SURV and fertility (-0.78 and -0.54 for DO and INS, respectively) or production (-0.53 for MY305), suggesting reduced functional longevity with impaired fertility and increased milk production. Longer days open seems to affect survival more than increased INS. Also, highly productive cows were more problematic, less functional and more liable to being culled. The results suggest that the sequential threshold model is a method that might be considered when evaluating the genetic relationship between discrete-time survival and other traits, due to its flexibility.

  7. Bivariate random-effects meta-analysis and the estimation of between-study correlation

    Directory of Open Access Journals (Sweden)

    Lambert Paul C

    2007-01-01

    Full Text Available Abstract Background When multiple endpoints are of interest in evidence synthesis, a multivariate meta-analysis can jointly synthesise those endpoints and utilise their correlation. A multivariate random-effects meta-analysis must incorporate and estimate the between-study correlation (ρB). Methods In this paper we assess maximum likelihood estimation of a general normal model and a generalised model for bivariate random-effects meta-analysis (BRMA). We consider two applied examples, one involving a diagnostic marker and the other a surrogate outcome. These motivate a simulation study where estimation properties from BRMA are compared with those from two separate univariate random-effects meta-analyses (URMAs), the traditional approach. Results The normal BRMA model estimates ρB as -1 in both applied examples. Analytically we show this is due to the maximum likelihood estimator sensibly truncating the between-study covariance matrix on the boundary of its parameter space. Our simulations reveal this commonly occurs when the number of studies is small or the within-study variation is relatively large; it also causes upwardly biased between-study variance estimates, which are inflated to compensate for the restriction on ρ̂B. Importantly, this does not induce any systematic bias in the pooled estimates and produces conservative standard errors and mean-square errors. Furthermore, the normal BRMA is preferable to two normal URMAs; the mean-square error and standard error of pooled estimates is generally smaller in the BRMA, especially given data missing at random. For meta-analysis of proportions we then show that a generalised BRMA model is better still. This correctly uses a binomial
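
    To see where a boundary estimate ρ̂B = -1 comes from, the following sketch (assuming hypothetical study estimates and known within-study covariances) maximizes the normal BRMA likelihood with the correlation mapped through tanh, so boundary solutions appear as the transformed parameter drifting toward ±1:

    import numpy as np
    from scipy.optimize import minimize

    def neg_loglik(theta, y, S):
        mu = theta[:2]
        tau1, tau2 = np.exp(theta[2]), np.exp(theta[3])   # between-study SDs
        rho = np.tanh(theta[4])                           # between-study correlation
        Sigma = np.array([[tau1**2, rho*tau1*tau2],
                          [rho*tau1*tau2, tau2**2]])
        nll = 0.0
        for yi, Si in zip(y, S):
            Vi = Si + Sigma                               # marginal covariance
            resid = yi - mu
            nll += 0.5*(np.log(np.linalg.det(2*np.pi*Vi))
                        + resid @ np.linalg.solve(Vi, resid))
        return nll

    # Hypothetical data: four studies, two endpoints each
    y = [np.array([0.5, 0.3]), np.array([0.7, 0.1]),
         np.array([0.4, 0.4]), np.array([0.6, 0.2])]
    S = [np.diag([0.04, 0.09])]*4
    fit = minimize(neg_loglik, x0=np.zeros(5), args=(y, S), method="Nelder-Mead")
    print(fit.x[:2], np.tanh(fit.x[4]))   # pooled estimates and rho_B-hat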

  8. Analyses of genetic relationships between linear type traits, fat-to-protein ratio, milk production traits, and somatic cell count in first-parity Czech Holstein cows

    DEFF Research Database (Denmark)

    Zink, V; Zavadilová, L; Lassen, Jan

    2014-01-01

    Genetic and phenotypic correlations between production traits, selected linear type traits, and somatic cell score were estimated. The results could be useful for breeding programs involving Czech Holstein dairy cows or other populations. A series of bivariate analyses was applied whereby (co...... percentage per the standard first lactation. Fifteen classified linear type traits were added, as they were measured at first lactation in the Czech Holstein population. All phenotypic data were collected within the progeny testing program of the Czech-Moravian Breeders Corporation from 2005 to 2009...

  9. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu-Shawiesh

    2014-06-01

    Full Text Available This paper proposed and considered some bivariate control charts to monitor individual observations in statistical process control. Usual control charts, which use the sample mean and variance-covariance estimators, are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T²: T²MedMAD, T²MCD, and T²MVE. A simulation study has been conducted to compare the performance of these control charts. Two real-life datasets are analyzed to illustrate the application of these robust alternatives.
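
    A minimal sketch of one of the robust alternatives, the MCD-based chart, using scikit-learn's MinCovDet as the robust location/scatter estimator and an approximate chi-square control limit; the data and the limit are illustrative, not the paper's simulation design:

    import numpy as np
    from scipy.stats import chi2
    from sklearn.covariance import MinCovDet

    rng = np.random.default_rng(1)
    X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=100)
    X[::25] += 6.0                        # plant a few gross outliers

    mcd = MinCovDet(random_state=0).fit(X)
    t2 = mcd.mahalanobis(X)               # squared robust distances (T2-type statistic)
    ucl = chi2.ppf(0.99, df=2)            # simple approximate control limit
    print(np.where(t2 > ucl)[0])          # observations signalling out-of-control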

  10. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    Directory of Open Access Journals (Sweden)

    Purshottam Narain Agrawal

    2017-08-01

    Full Text Available Abstract In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre’s K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum of these operators and study the degree of approximation by means of the Lipschitz class of Bögel continuous functions. Finally, we present some graphical examples to illustrate the rate of convergence of the operators under consideration.

  11. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for the analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
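
    The return-period plot rests on a joint exceedance probability obtained from a fitted copula. A minimal sketch of that calculation, here with a Gumbel copula and an illustrative parameter value rather than the copula actually fitted to the reconstructed streamflow:

    import numpy as np

    def gumbel_copula(u, v, theta):
        return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1.0/theta))

    def and_return_period(u, v, theta, mu=1.0):
        """Return period of {severity > s AND duration > d}; u, v are the
        marginal non-exceedance probabilities, mu the mean interarrival time."""
        p_and = 1.0 - u - v + gumbel_copula(u, v, theta)   # joint survival probability
        return mu / p_and

    print(and_return_period(u=0.95, v=0.95, theta=2.0))    # years, if events are annual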

  12. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate resp......-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970....

  13. Assessing characteristics related to the use of seatbelts and cell phones by drivers: application of a bivariate probit model.

    Science.gov (United States)

    Russo, Brendan J; Kay, Jonathan J; Savolainen, Peter T; Gates, Timothy J

    2014-06-01

    The effects of cell phone use and safety belt use have been an important focus of research related to driver safety. Cell phone use has been shown to be a significant source of driver distraction contributing to substantial degradations in driver performance, while safety belts have been demonstrated to play a vital role in mitigating injuries to crash-involved occupants. This study examines the prevalence of cell phone use and safety belt non-use among the driving population through direct observation surveys. A bivariate probit model is developed to simultaneously examine the factors that affect cell phone and safety belt use among motor vehicle drivers. The results show that several factors may influence drivers' decision to use cell phones and safety belts, and that these decisions are correlated. Understanding the factors that affect both cell phone use and safety belt non-use is essential to targeting policy and programs that reduce such behavior. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization.

    Directory of Open Access Journals (Sweden)

    Margaretha A Vink

    Full Text Available Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused, e.g., by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination.
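
    The classification idea can be sketched with a generic four-component bivariate Gaussian mixture on log antibody concentrations; the authors fit their model by Bayesian analysis, so scikit-learn's EM fit below is only a stand-in, and the simulated component means and sizes are hypothetical:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Hypothetical log-concentrations: double-negative, HPV16+ only, HPV18+ only,
    # and double-positive groups
    comps = [([0.0, 0.0], 400), ([2.0, 0.0], 60), ([0.0, 2.0], 60), ([3.0, 3.0], 80)]
    X = np.vstack([rng.normal(loc=m, scale=0.5, size=(n, 2)) for m, n in comps])

    gm = GaussianMixture(n_components=4, covariance_type="full", random_state=0).fit(X)
    print(gm.means_)       # identify which components are seropositive on each axis
    print(gm.weights_)     # component prevalences (cf. vaccine-type seroprevalence)
    labels = gm.predict(X) # classification of individuals into the four groups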

  15. Semiparametric probit models with univariate and bivariate current-status data.

    Science.gov (United States)

    Liu, Hao; Qin, Jing

    2017-04-24

    Multivariate current-status data are frequently encountered in biomedical and public health studies. Semiparametric regression models have been extensively studied for univariate current-status data, but most existing estimation procedures are computationally intensive, involving either penalization or smoothing techniques. The analysis of multivariate current-status data is even more challenging. In this article, we study maximum likelihood estimation for univariate and bivariate current-status data under semiparametric probit regression models. We present a simple computational procedure combining the expectation-maximization algorithm with the pool-adjacent-violators algorithm for handling the monotone constraint on the baseline function. Asymptotic properties of the maximum likelihood estimators are investigated, including the calculation of the explicit information bound for univariate current-status data, as well as the asymptotic consistency and convergence rate for bivariate current-status data. Extensive simulation studies showed that the proposed computational procedures perform well under small or moderate sample sizes. We demonstrate the estimation procedure with two real data examples in the areas of diabetes and HIV research. © 2017, The International Biometric Society.
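
    The monotone-constraint step is the pool-adjacent-violators algorithm (PAVA). A minimal sketch, using scikit-learn's isotonic regression (which implements PAVA) on hypothetical unconstrained baseline estimates of the kind an M-step might produce:

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # examination times
    g_raw = np.array([-0.8, -0.2, -0.4, 0.3, 0.1])   # unconstrained estimates
    g_mono = IsotonicRegression().fit_transform(t, g_raw)
    print(g_mono)   # nondecreasing baseline values, ready for the next E-step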

  16. Issues concerning Landowner Management Plan Adoption Decisions: A Recursive Bivariate Probit Approach

    Directory of Open Access Journals (Sweden)

    Omkar Joshi

    2015-01-01

    Full Text Available Despite the likely benefits of having a written forest management plan, only a small number of landowners in the United States have one. A recursive bivariate probit model was used to identify the possible relationship between landowners' decision to obtain a management plan and their interest in future timber harvesting. Our results based on the recursive bivariate model suggest that landowners with larger land holdings, longer forest ownership tenure, and higher education were more likely to have a forest management plan and an interest in future timber harvesting. While landowners interested in wildlife management were also interested in having a written management plan, they did not intend to harvest in the future. The results indicate that a written management plan means more than a timber harvesting strategy to landowners in general. Many elderly landowners with a low level of income and less formal education, and those holding small or medium-sized tracts of forestland, are less likely to have a written management plan. Therefore, this group requires special attention in various government-sponsored forest management extension activities. Future research on understanding landowner perceptions behind written management plans is recommended.

  17. A bivariate extension of the Hosking and Wallis goodness-of-fit measure for regional distributions

    Science.gov (United States)

    Kjeldsen, T. R.; Prosdocimi, I.

    2015-02-01

    This study presents a bivariate extension of the goodness-of-fit measure for regional frequency distributions developed by Hosking and Wallis (1993) for use with the method of L-moments. Utilizing the approximate joint normal distribution of the regional L-skewness and L-kurtosis, a graphical representation of the confidence region on the L-moment diagram can be constructed as an ellipsoid. Candidate distributions can then be accepted where the corresponding theoretical relationship between the L-skewness and L-kurtosis intersects the confidence region, and the chosen distribution would be the one that minimizes the Mahalanobis distance measure. Based on a set of Monte Carlo simulations, it is demonstrated that the new bivariate measure generally selects the true population distribution more frequently than the original method. Results are presented to show that the new measure remains robust when applied to regions where the level of intersite correlation is at a level found in real world regions. Finally the method is applied to two different case studies involving annual maximum peak flow data from Italian and British catchments to identify suitable regional frequency distributions.
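
    The acceptance rule can be sketched directly: the regional (L-skewness, L-kurtosis) pair is compared with a candidate distribution's theoretical point through the Mahalanobis distance and accepted when it falls inside the confidence ellipse. The covariance matrix below is a hypothetical placeholder for the one the method obtains by simulation:

    import numpy as np
    from scipy.stats import chi2

    t_regional = np.array([0.20, 0.15])   # observed regional L-skewness, L-kurtosis
    t_theory = np.array([0.17, 0.14])     # candidate distribution's (tau3, tau4) point
    cov = np.array([[4e-4, 1e-4],         # sampling covariance of (t3, t4)
                    [1e-4, 3e-4]])

    diff = t_regional - t_theory
    d2 = diff @ np.linalg.solve(cov, diff)    # squared Mahalanobis distance
    accept = d2 <= chi2.ppf(0.90, df=2)       # inside the 90% confidence ellipse?
    print(d2, accept)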

  18. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    Full Text Available In this paper we discuss the problem of parametric and non-parametric estimation of the distributions generated by the Marshall-Olkin copula. This copula comes from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model through the copula and different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters, we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. This procedure can be used to estimate the parameters of complex survival functions in which it is difficult to find an explicit expression for the mixed moments. Moreover, it is preferred to the maximum likelihood approach for its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.
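
    A minimal sketch of the two-step idea for the exchangeable (Cuadras-Augé) special case of the Marshall-Olkin copula with exponential margins: step one fits the marginal rates by moments, step two recovers the copula parameter by inverting Kendall's tau (τ = θ/(2-θ) for this sub-family). The sample is simulated via the usual common-shock construction:

    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(3)
    # Hypothetical Marshall-Olkin bivariate exponential sample via common shocks
    z1, z2, z12 = (rng.exponential(1/l, 500) for l in (1.0, 1.0, 0.8))
    x, y = np.minimum(z1, z12), np.minimum(z2, z12)

    lam_x, lam_y = 1/x.mean(), 1/y.mean()   # step 1: marginal rates by moments
    tau, _ = kendalltau(x, y)               # step 2: dependence moment
    theta = 2*tau/(1 + tau)                 # invert tau = theta/(2 - theta)
    print(lam_x, lam_y, theta)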

  19. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and that from marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  20. Bivariate EMD-Based Data Adaptive Approach to the Analysis of Climate Variability

    Directory of Open Access Journals (Sweden)

    Md. Khademul Islam Molla

    2011-01-01

    Full Text Available This paper presents a data-adaptive approach for the analysis of climate variability using bivariate empirical mode decomposition (BEMD). The time series of climate factors (daily evaporation, maximum and minimum temperature) are taken into consideration in the variability analysis. All climate data are collected from a specific area of Bihar in India. Fractional Gaussian noise (fGn) is used here as the reference signal. The climate signal and the fGn (of the same length) are combined to produce a bivariate (complex) signal, which is decomposed using BEMD into a finite number of sub-band signals named intrinsic mode functions (IMFs). Both the climate signal and the fGn are thus decomposed together into IMFs. The instantaneous frequencies and Fourier spectra of the IMFs are observed to illustrate the properties of BEMD. The lowest-frequency oscillation of the climate signal represents the annual cycle (AC), which is an important factor in analyzing climate change and variability. The energies of the fGn's IMFs are used to define a data-adaptive threshold to separate the AC. The IMFs of the climate signal with energy exceeding this threshold are summed up to separate the AC. The interannual distance of the climate signal is also illustrated for a better understanding of climate change and variability.
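
    The separation step can be sketched once the IMFs are in hand: the fGn IMF energies define a data-adaptive threshold, and climate IMFs whose energy exceeds it are summed to recover the annual cycle. The sketch below assumes the BEMD decomposition has already been computed and uses a hypothetical threshold multiplier:

    import numpy as np

    def separate_annual_cycle(imfs_signal, imfs_fgn, c=2.0):
        """Sum the climate IMFs whose energy exceeds the fGn-derived threshold.

        c is a hypothetical multiplier; the paper derives its data-adaptive
        threshold from the fGn IMF energies.
        """
        e_sig = np.array([np.sum(imf**2) for imf in imfs_signal])
        e_fgn = np.array([np.sum(imf**2) for imf in imfs_fgn])
        keep = e_sig > c * e_fgn            # IMFs carrying real signal energy
        return np.asarray(imfs_signal)[keep].sum(axis=0)

    # Hypothetical IMFs (one per row), standing in for a BEMD decomposition:
    rng = np.random.default_rng(0)
    imfs_signal = rng.normal(size=(6, 365))
    imfs_signal[4] = 8*np.sin(2*np.pi*np.arange(365)/365)   # strong annual mode
    imfs_fgn = rng.normal(size=(6, 365))
    annual_cycle = separate_annual_cycle(imfs_signal, imfs_fgn)
    print(annual_cycle[:5])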

  1. Self-Consistent Nonparametric Maximum Likelihood Estimator of the Bivariate Survivor Function.

    Science.gov (United States)

    Prentice, R L

    2014-09-01

    As usually formulated the nonparametric likelihood for the bivariate survivor function is over-parameterized, resulting in uniqueness problems for the corresponding nonparametric maximum likelihood estimator. Here the estimation problem is redefined to include parameters for marginal hazard rates, and for double failure hazard rates only at informative uncensored failure time grid points where there is pertinent empirical information. Double failure hazard rates at other grid points in the risk region are specified rather than estimated. With this approach the nonparametric maximum likelihood estimator is unique, and can be calculated using a two-step procedure. The first step involves setting aside all doubly censored observations that are interior to the risk region. The nonparametric maximum likelihood estimator from the remaining data turns out to be the Dabrowska (1988) estimator. The omitted doubly censored observations are included in the procedure in the second stage using self-consistency, resulting in a non-iterative nonparametric maximum likelihood estimator for the bivariate survivor function. Simulation evaluation and asymptotic distributional results are provided. Moderate sample size efficiency for the survivor function nonparametric maximum likelihood estimator is similar to that for the Dabrowska estimator as applied to the entire dataset, while some useful efficiency improvement arises for the corresponding distribution function estimator, presumably due to the avoidance of negative mass assignments.

  2. Artificial neural networks versus bivariate logistic regression in prediction diagnosis of patients with hypertension and diabetes.

    Science.gov (United States)

    Adavi, Mehdi; Salehi, Masoud; Roudbari, Masoud

    2016-01-01

    Diabetes and hypertension are important non-communicable diseases and their prevalence is important for health authorities. The aim of this study was to determine the predictive precision of bivariate Logistic Regression (LR) and an Artificial Neural Network (ANN) in the concurrent diagnosis of diabetes and hypertension. This cross-sectional study was performed with 12,000 Iranian people in 2013 using stratified-cluster sampling. The research questionnaire included information on hypertension and diabetes and their risk factors. A perceptron ANN with two hidden layers was applied to the data. To build the joint LR model and the ANN, SAS 9.2 and Matlab software were used. The AUC was used to find the more accurate model for predicting diabetes and hypertension. The variables of gender, type of cooking oil, physical activity, family history, age, passive smoking and obesity entered the LR model and the ANN. The odds ratios of affliction with both diabetes and hypertension are high in females, users of solid oil, those with no physical activity, those with a positive family history, those aged 55 or older, passive smokers and those with obesity. The AUC for the LR model and the ANN were 0.78 (p=0.039) and 0.86 (p=0.046), respectively. The best model for concurrent affliction with hypertension and diabetes is the ANN, which has higher accuracy than the bivariate LR model.
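
    A minimal sketch of the comparison pipeline, logistic regression versus a two-hidden-layer perceptron scored by AUC on held-out data; scikit-learn stands in for the SAS/Matlab implementations used in the study, and the data are simulated:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=2000, n_features=7, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    lr = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    ann = MLPClassifier(hidden_layer_sizes=(10, 5), max_iter=2000,
                        random_state=0).fit(Xtr, ytr)

    for name, model in [("LR", lr), ("ANN", ann)]:
        # AUC on the held-out set, the criterion used to compare the models
        print(name, roc_auc_score(yte, model.predict_proba(Xte)[:, 1]))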

  3. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  4. Evaluation of PCR on bronchoalveolar lavage fluid for diagnosis of invasive aspergillosis: a bivariate metaanalysis and systematic review.

    Directory of Open Access Journals (Sweden)

    Wenkui Sun

    Full Text Available BACKGROUND: Nucleic acid detection by polymerase chain reaction (PCR is emerging as a sensitive and rapid diagnostic tool. PCR assays on serum have the potential to be a practical diagnostic tool. However, PCR on bronchoalveolar lavage fluid (BALF has not been well established. We performed a systematic review of published studies to evaluate the diagnostic accuracy of PCR assays on BALF for invasive aspergillosis (IA. METHODS: Relevant published studies were shortlisted to evaluate the quality of their methodologies. A bivariate regression approach was used to calculate pooled values of the method sensitivity, specificity, and positive and negative likelihood ratios. Hierarchical summary receiver operating characteristic curves were used to summarize overall performance. We calculated the post-test probability to evaluate clinical usefulness. Potential heterogeneity among studies was explored by subgroup analyses. RESULTS: Seventeen studies comprising 1191 at-risk patients were selected. The summary estimates of the BALF-PCR assay for proven and probable IA were as follows: sensitivity, 0.91 (95% confidence interval (CI, 0.79-0.96; specificity, 0.92 (95% CI, 0.87-0.96; positive likelihood ratio, 11.90 (95% CI, 6.80-20.80; and negative likelihood ratio, 0.10 (95% CI, 0.04-0.24. Subgroup analyses showed that the performance of the PCR assay was influenced by PCR assay methodology, primer design and the methods of cell wall disruption and DNA extraction. CONCLUSIONS: PCR assay on BALF is highly accurate for diagnosing IA in immunocompromised patients and is likely to be a useful diagnostic tool. However, further efforts towards devising a standard protocol are needed to enable formal validation of BALF-PCR.

  5. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gi Hwa [Seoul National Univ., Seoul (Korea, Republic of)

    1997-11-15

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize computer programs for the probabilistic estimation of design earthquakes. In part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to seismic hazard characterization of the Korean Peninsula. In part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulations are given in terms of magnitude and hypocentral distance.

  6. Analyse the risks of ad hoc programming in web development and develop a metrics of appropriate tools

    OpenAIRE

    Gubhaju, Manish; Al-Sherbaz, Ali

    2013-01-01

    Today the World Wide Web has become one of the most powerful tools for business promotion and social networking. As the use of websites and web applications to promote businesses has increased drastically over the past few years, the complexity of managing them and protecting them from security threats has become a complicated task for organizations. On the other hand, most web projects are at risk and less secure due to a lack of quality programming. Although there are plenty of...

  7. Integrated expression profiling and ChIP-seq analyses of the growth inhibition response program of the androgen receptor.

    Directory of Open Access Journals (Sweden)

    Biaoyang Lin

    2009-08-01

    Full Text Available The androgen receptor (AR) plays important roles in the development of the male phenotype and in different human diseases including prostate cancers. The AR can act either as a promoter or a tumor suppressor depending on cell type. The AR proliferative response program has been well studied, but its growth-inhibitory response program has not yet been thoroughly studied. Previous studies found that PC3 cells expressing the wild-type AR inhibit growth and suppress invasion. We applied expression profiling to identify the response program of PC3 cells expressing the AR (PC3-AR) under different growth conditions (i.e., with or without androgens and at different concentrations of androgens) and then applied the newly developed ChIP-seq technology to identify the AR binding regions in the PC3 cancer genome. A surprising finding was that the comparison of MOCK-transfected PC3 cells with AR-transfected cells identified 3,452 differentially expressed genes (two-fold cutoff) even without the addition of androgens (i.e., in the ethanol control), suggesting ligand-independent or extremely low-level androgen activation of the AR. ChIP-seq analysis revealed 6,629 AR binding regions in the cancer genome of PC3 cells with an FDR (false discovery rate) cutoff of 0.05. About 22.4% (638 of 2,849) can be mapped to within 2 kb of the transcription start site (TSS). Three novel AR binding motifs were identified in the AR binding regions of PC3-AR cells, and two of them share a core consensus sequence CGAGCTCTTC, which together mapped to 27.3% of AR binding regions (1,808/6,629). In contrast, only about 2.9% (190/6,629) of AR binding sites contain the canonical AR matrices M00481, M00447 and M00962 (from the Transfac database), which are derived mostly from AR proliferative-responsive genes in androgen-dependent cells. In addition, we identified four top-ranking co-occupancy transcription factors in the AR binding regions, which include TEF1 (Transcriptional enhancer factor

  8. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    Science.gov (United States)

    Sadri, S.; Madsen, H.; Mikkelsen, P. S.; Burn, D. H.

    2009-05-01

    The traditional rainfall intensity-duration-frequency (IDF) curve is a reliable approach for representing the variation of rainfall intensity with duration for a given return period. In reality rainfall variables intensity, depth and duration are dependent and therefore a bivariate analysis using copulas can give a more accurate IDF curve. We study IDF curves using a copula in a bivariate frequency analysis of extreme rainfall. To be able to choose the most suitable copula among candidate copulas (i.e., Gumbel, Clayton, and Frank) we demonstrated IDF curves based on variation of depth with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration of individual rainfall events; and (3) by storage volume and duration. In each case we used partial duration series (PDS) to extract extreme rainfall variables. The DDF curves derived from each method are presented and compared. This study examines extreme rainfall data from catchment Vedbæk Renseanlæg, situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration of individual rainfall events (method 2) are in agreement with empirically derived DDF curves obtained from maximum mean intensity (method 1) for a 10-year return period. For a 100-year
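
    For the dependence step, the Clayton copula parameter can be obtained by inverting Kendall's tau (θ = 2τ/(1-τ) for Clayton). A minimal sketch on hypothetical depth/duration pairs standing in for events extracted from a partial duration series:

    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(7)
    duration = rng.gamma(shape=2.0, scale=3.0, size=300)     # hours (hypothetical)
    depth = 2.0*duration + rng.gamma(2.0, 2.0, size=300)     # mm, dependent on duration

    tau, _ = kendalltau(depth, duration)    # rank-based dependence measure
    theta = 2*tau/(1 - tau)                 # invert tau = theta/(theta + 2)
    print(f"Kendall tau = {tau:.2f}, Clayton theta = {theta:.2f}")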

  9. Bayesian neural networks for bivariate binary data: an application to prostate cancer study.

    Science.gov (United States)

    Chakraborty, Sounak; Ghosh, Malay; Maiti, Tapabrata; Tewari, Ashutosh

    2005-12-15

    Prostate cancer is one of the most common cancers in American men. The cancer could either be locally confined, or it could spread outside the organ. When locally confined, there are several options for treating and curing this disease. Otherwise, surgery is the only option, and in extreme cases of outside spread, it could very easily recur within a short time even after surgery and subsequent radiation therapy. Hence, it is important to know, based on pre-surgery biopsy results, how likely the cancer is organ-confined or not. The paper considers a hierarchical Bayesian neural network approach for posterior prediction probabilities of certain features indicative of non-organ-confined prostate cancer. In particular, we find such probabilities for margin positivity (MP) and seminal vesicle (SV) positivity jointly. The available training set consists of bivariate binary outcomes indicating the presence or absence of the two. In addition, we have certain covariates such as prostate specific antigen (PSA), Gleason score and the indicator for the cancer to be unilateral or bilateral (i.e. spread on one or both sides) in one data set and gene expression microarrays in another data set. We take a hierarchical Bayesian neural network approach to find the posterior prediction probabilities for a test and validation set, and compare these with the actual outcomes for the first data set. In case of the microarray data we use leave-one-out cross-validation to assess the accuracy of our method. We also demonstrate the superiority of our method to the other competing methods through a simulation study. The Bayesian procedure is implemented by an application of the Markov chain Monte Carlo numerical integration technique. For the problem at hand, our Bayesian bivariate neural network procedure is shown to be superior to the classical neural network, Radford Neal's Bayesian neural network as well as bivariate logistic models to predict jointly the MP and SV in a patient in both the

  10. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    Full Text Available Contemporary traffic safety research offers little information quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. The potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern that needs investigation. This study therefore focused on investigating the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on the enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through the development of a bivariate ordered probit model using data extracted from WHO's global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with the statistically significant correlation between response outcomes, support the statistical suitability of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of enforcement levels of speed limit and drink driving laws, respectively. Also, the model encapsulates the effect of several other agency-related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on intermediate categories of the response outcomes. The results of this study are expected to provide necessary insights for elemental enforcement programs. Also, the marginal effects of explanatory variables may provide useful directions for formulating effective policy countermeasures for overcoming drivers' speeding and drink driving behavior.

  11. A composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews.

    Science.gov (United States)

    Chen, Yong; Liu, Yulun; Ning, Jing; Nie, Lei; Zhu, Hongjian; Chu, Haitao

    2017-04-01

    Diagnostic systematic review is a vital step in the evaluation of diagnostic technologies. In many applications, it involves pooling pairs of sensitivity and specificity of a dichotomized diagnostic test from multiple studies. We propose a composite likelihood (CL) method for bivariate meta-analysis in diagnostic systematic reviews. This method provides an alternative way to make inference on diagnostic measures such as sensitivity, specificity, likelihood ratios, and diagnostic odds ratio. Its main advantages over the standard likelihood method are the avoidance of the nonconvergence problem, which is nontrivial when the number of studies is relatively small, the computational simplicity, and some robustness to model misspecifications. Simulation studies show that the CL method maintains high relative efficiency compared to that of the standard likelihood method. We illustrate our method in a diagnostic review of the performance of contemporary diagnostic imaging technologies for detecting metastases in patients with melanoma.
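
    The core of the proposal is a working-independence composite likelihood: the bivariate likelihood is replaced by the product of the two univariate random-effects likelihoods, so no between-study correlation has to be estimated. A minimal sketch on hypothetical logit-sensitivities and logit-specificities with known within-study variances:

    import numpy as np
    from scipy.optimize import minimize

    def univ_nll(theta, y, v):
        """Normal random-effects NLL for one endpoint (logit sens or spec)."""
        mu, tau2 = theta[0], np.exp(theta[1])
        w = v + tau2
        return 0.5*np.sum(np.log(2*np.pi*w) + (y - mu)**2 / w)

    def composite_nll(theta, y1, v1, y2, v2):
        # Working independence: simply add the two univariate NLLs
        return univ_nll(theta[:2], y1, v1) + univ_nll(theta[2:], y2, v2)

    # Hypothetical study-level values on the logit scale
    y_sens = np.array([1.9, 2.3, 1.5, 2.0]); v_sens = np.array([.10, .15, .08, .12])
    y_spec = np.array([2.5, 2.1, 2.8, 2.4]); v_spec = np.array([.09, .11, .07, .10])
    fit = minimize(composite_nll, np.zeros(4),
                   args=(y_sens, v_sens, y_spec, v_spec), method="Nelder-Mead")
    print(fit.x[0], fit.x[2])   # pooled logit sensitivity and specificity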

  12. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    Science.gov (United States)

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.

  13. Bivariate cumulative probit model for the comparison of neuronal encoding hypotheses.

    Science.gov (United States)

    Hillmann, Julia; Kneib, Thomas; Koepcke, Lena; Juárez Paz, León M; Kretzberg, Jutta

    2014-01-01

    Understanding the way stimulus properties are encoded in the nerve cell responses of sensory organs is one of the fundamental scientific questions in neurosciences. Different neuronal coding hypotheses can be compared by use of an inverse procedure called stimulus reconstruction. Here, based on different attributes of experimentally recorded neuronal responses, the values of certain stimulus properties are estimated by statistical classification methods. Comparison of stimulus reconstruction results then allows to draw conclusions about relative importance of covariate features. Since many stimulus properties have a natural order and can therefore be considered as ordinal, we introduce a bivariate ordinal probit model to obtain classifications for the combination of light intensity and velocity of a visual dot pattern based on different covariates extracted from recorded spike trains. For parameter estimation, we develop a Bayesian Gibbs sampler and incorporate penalized splines to model nonlinear effects. We compare the classification performance of different individual cell covariates and simple features of groups of neurons and find that the combination of at least two covariates increases the classification performance significantly. Furthermore, we obtain a non-linear effect for the first spike latency. The model is compared to a naïve Bayesian stimulus estimation method where it yields comparable misclassification rates for the given dataset. Hence, the bivariate ordinal probit model is shown to be a helpful tool for stimulus reconstruction particularly thanks to its flexibility with respect to the number of covariates as well as their scale and effect type. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A bivariate contaminated binormal model for robust fitting of proper ROC curves to a pair of correlated, possibly degenerate, ROC datasets.

    Science.gov (United States)

    Zhai, Xuetong; Chakraborty, Dev P

    2017-06-01

    The objective was to design and implement a bivariate extension to the contaminated binormal model (CBM) to fit paired receiver operating characteristic (ROC) datasets, possibly degenerate, with proper ROC curves. Paired datasets yield two correlated ratings per case. Degenerate datasets have no interior operating points, and proper ROC curves do not inappropriately cross the chance diagonal. The existing method, developed more than three decades ago, utilizes a bivariate extension to the binormal model, implemented in the CORROC2 software, which yields improper ROC curves and cannot fit degenerate datasets. CBM can fit proper ROC curves to unpaired (i.e., yielding one rating per case) and degenerate datasets, and there is a clear scientific need to extend it to handle paired datasets. In CBM, nondiseased cases are modeled by a probability density function (pdf) consisting of a unit-variance peak centered at zero. Diseased cases are modeled with a mixture distribution whose pdf consists of two unit-variance peaks, one centered at positive μ with integrated probability α, the mixing fraction parameter, corresponding to the fraction of diseased cases where the disease was visible to the radiologist, and one centered at zero, with integrated probability (1-α), corresponding to disease that was not visible. It is shown that: (a) for nondiseased cases the bivariate extension is a unit-variance bivariate normal distribution centered at (0,0) with a specified correlation ρ1; (b) for diseased cases the bivariate extension is a mixture distribution with four peaks, corresponding to disease not visible in either condition, disease visible in only one condition, contributing two peaks, and disease visible in both conditions. An expression for the likelihood function is derived. A maximum likelihood estimation (MLE) algorithm, CORCBM, was implemented in the R programming language that yields parameter estimates and the covariance matrix of the parameters, and other statistics. A
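
    The diseased-case density described above can be sketched directly as a four-peak unit-variance bivariate normal mixture; this Python illustration is not the authors' R implementation (CORCBM), it simplifies by using a single common correlation across peaks, and the parameter values are made up:

    import numpy as np
    from scipy.stats import multivariate_normal

    def diseased_pdf(z, mu1, mu2, a1, a2, rho):
        """Mixture density for diseased cases rated in two conditions."""
        cov = [[1.0, rho], [rho, 1.0]]
        peaks = [((0.0, 0.0), (1-a1)*(1-a2)),   # visible in neither condition
                 ((mu1, 0.0), a1*(1-a2)),       # visible in condition 1 only
                 ((0.0, mu2), (1-a1)*a2),       # visible in condition 2 only
                 ((mu1, mu2), a1*a2)]           # visible in both conditions
        return sum(w * multivariate_normal(mean=m, cov=cov).pdf(z)
                   for m, w in peaks)

    print(diseased_pdf([1.0, 1.2], mu1=2.0, mu2=2.2, a1=0.70, a2=0.75, rho=0.4))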

  15. Dynamics of intracranial electroencephalographic recordings from epilepsy patients using univariate and bivariate recurrence networks.

    Science.gov (United States)

    Subramaniyam, Narayan Puthanmadam; Hyttinen, Jari

    2015-02-01

    Recently Andrezejak et al. combined the randomness and nonlinear independence test with iterative amplitude adjusted Fourier transform (iAAFT) surrogates to distinguish between the dynamics of seizure-free intracranial electroencephalographic (EEG) signals recorded from epileptogenic (focal) and nonepileptogenic (nonfocal) brain areas of epileptic patients. However, stationarity is a part of the null hypothesis for iAAFT surrogates and thus nonstationarity can violate the null hypothesis. In this work we first propose the application of the randomness and nonlinear independence test based on recurrence network measures to distinguish between the dynamics of focal and nonfocal EEG signals. Furthermore, we combine these tests with both iAAFT and truncated Fourier transform (TFT) surrogate methods, which also preserves the nonstationarity of the original data in the surrogates along with its linear structure. Our results indicate that focal EEG signals exhibit an increased degree of structural complexity and interdependency compared to nonfocal EEG signals. In general, we find higher rejections for randomness and nonlinear independence tests for focal EEG signals compared to nonfocal EEG signals. In particular, the univariate recurrence network measures, the average clustering coefficient C and assortativity R, and the bivariate recurrence network measure, the average cross-clustering coefficient C(cross), can successfully distinguish between the focal and nonfocal EEG signals, even when the analysis is restricted to nonstationary signals, irrespective of the type of surrogates used. On the other hand, we find that the univariate recurrence network measures, the average path length L, and the average betweenness centrality BC fail to distinguish between the focal and nonfocal EEG signals when iAAFT surrogates are used. However, these two measures can distinguish between focal and nonfocal EEG signals when TFT surrogates are used for nonstationary signals. We also
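
    A minimal sketch of the univariate recurrence-network pipeline: time-delay embed the signal, threshold the distance matrix into an adjacency matrix, and compute the average clustering coefficient C; the embedding parameters and recurrence threshold are illustrative choices, not those of the study:

    import numpy as np
    import networkx as nx

    def recurrence_network(x, dim=3, lag=2, eps=None):
        n = len(x) - (dim - 1)*lag
        emb = np.column_stack([x[i*lag:i*lag + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        if eps is None:
            eps = np.percentile(d, 10)          # fix the recurrence rate near 10%
        adj = (d < eps) & ~np.eye(n, dtype=bool)
        return nx.from_numpy_array(adj.astype(int))

    rng = np.random.default_rng(0)
    x = np.sin(0.2*np.arange(500)) + 0.3*rng.normal(size=500)   # toy signal
    G = recurrence_network(x)
    print(nx.average_clustering(G))             # the measure C discussed above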

  16. Synthetic control charts with two-stage sampling for monitoring bivariate processes

    Directory of Open Access Journals (Sweden)

    Antonio F. B. Costa

    2007-04-01

    Full Text Available In this article, we consider the synthetic control chart with two-stage sampling (SyTS chart) to control bivariate processes. During the first stage, one item of the sample is inspected and two correlated quality characteristics (x; y) are measured. If the Hotelling statistic T1² for these individual observations of (x; y) is lower than a specified value UCL1, the sampling is interrupted. Otherwise, the sampling goes on to the second stage, where the remaining items are inspected and the Hotelling statistic T2² for the sample means of (x; y) is computed. When the statistic T2² is larger than a specified value UCL2, the sample is classified as nonconforming. According to the synthetic control chart procedure, the signal is based on the number of conforming samples between two neighboring nonconforming samples. The proposed chart detects process disturbances faster than bivariate charts with variable sample size and is, from the practical viewpoint, more convenient to administer.

  17. Dynamics of intracranial electroencephalographic recordings from epilepsy patients using univariate and bivariate recurrence networks

    Science.gov (United States)

    Subramaniyam, Narayan Puthanmadam; Hyttinen, Jari

    2015-02-01

    Recently Andrezejak et al. combined the randomness and nonlinear independence test with iterative amplitude adjusted Fourier transform (iAAFT) surrogates to distinguish between the dynamics of seizure-free intracranial electroencephalographic (EEG) signals recorded from epileptogenic (focal) and nonepileptogenic (nonfocal) brain areas of epileptic patients. However, stationarity is a part of the null hypothesis for iAAFT surrogates and thus nonstationarity can violate the null hypothesis. In this work we first propose the application of the randomness and nonlinear independence test based on recurrence network measures to distinguish between the dynamics of focal and nonfocal EEG signals. Furthermore, we combine these tests with both iAAFT and truncated Fourier transform (TFT) surrogate methods, which also preserves the nonstationarity of the original data in the surrogates along with its linear structure. Our results indicate that focal EEG signals exhibit an increased degree of structural complexity and interdependency compared to nonfocal EEG signals. In general, we find higher rejections for randomness and nonlinear independence tests for focal EEG signals compared to nonfocal EEG signals. In particular, the univariate recurrence network measures, the average clustering coefficient C and assortativity R , and the bivariate recurrence network measure, the average cross-clustering coefficient Ccross, can successfully distinguish between the focal and nonfocal EEG signals, even when the analysis is restricted to nonstationary signals, irrespective of the type of surrogates used. On the other hand, we find that the univariate recurrence network measures, the average path length L , and the average betweenness centrality BC fail to distinguish between the focal and nonfocal EEG signals when iAAFT surrogates are used. However, these two measures can distinguish between focal and nonfocal EEG signals when TFT surrogates are used for nonstationary signals. We also

  18. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    Science.gov (United States)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on-board low Size, Weight, and Power (SWaP) platforms must provide high performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, places a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve the modeling fidelity of a previously developed, computationally efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWaP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM), which implements a bivariate wavelet shrinkage denoising algorithm that exploits interscale dependency between wavelet coefficients. We formulate optimization problems for the parameters controlling deadzone size, which leads to improved denoising performance. Two formulations are provided: one with a simple, closed-form solution, which we use for numerical result generation, and the second an integral equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery, and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate denoising performance improvement when using the enhanced modeling over performance obtained with the baseline SSM model.
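
    The baseline SSM shrinkage rule being improved can be sketched as follows: a child coefficient w1 is shrunk using its parent w2, with deadzone radius √3·σn²/σ. This is the standard Sendur-Selesnick estimator, shown here on made-up coefficients:

    import numpy as np

    def bishrink(w1, w2, sigma_n, sigma):
        """Bivariate shrinkage of child coefficients w1 given parents w2.

        sigma_n: noise standard deviation; sigma: marginal signal std of the
        coefficients (both would be estimated locally in a full denoiser).
        """
        r = np.sqrt(w1**2 + w2**2)                        # parent-child magnitude
        gain = np.maximum(r - np.sqrt(3.0)*sigma_n**2/sigma, 0.0)   # deadzone
        return np.where(r > 0, w1*gain/np.maximum(r, 1e-12), 0.0)

    w_child = np.array([0.4, -2.5, 0.1])
    w_parent = np.array([0.3, -1.8, 0.05])
    print(bishrink(w_child, w_parent, sigma_n=0.5, sigma=1.0))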

  19. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This difference arises because the approach assigns an occurrence probability to each rainstorm event and approaches rainfall frequency analysis from a different angle; it offers an alternative way of describing extreme rainfall properties and could potentially help improve the hydrologic design of stormwater management facilities in urban areas.
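
    As a rough sketch of the copula step described above, the following Python fragment joins two illustrative marginals for event intensity and duration with a Gaussian copula and converts an "AND" exceedance probability into an event-based return period. The marginal families, their parameters, the copula choice, and the mean number of events per year are all assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy import stats

# Illustrative marginals for storm-event intensity (mm/h) and duration (h)
F_int = stats.gumbel_r(loc=20.0, scale=8.0)
F_dur = stats.expon(scale=6.0)

rho = 0.4  # assumed dependence parameter of the Gaussian copula

def joint_cdf(i, d):
    """C(F_int(i), F_dur(d)) under a bivariate Gaussian copula."""
    u = stats.norm.ppf(F_int.cdf(i))
    v = stats.norm.ppf(F_dur.cdf(d))
    cov = [[1.0, rho], [rho, 1.0]]
    return stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([u, v])

# "AND" exceedance probability and event-based return period
i0, d0 = 40.0, 12.0
p_and = 1.0 - F_int.cdf(i0) - F_dur.cdf(d0) + joint_cdf(i0, d0)
events_per_year = 30                    # assumed mean number of storm events
T = 1.0 / (events_per_year * p_and)     # return period in years
print(p_and, T)
```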

  20. Bivariate flow cytometric analysis and sorting of different types of maize starch grains.

    Science.gov (United States)

    Zhang, Xudong; Feng, Jiaojiao; Wang, Heng; Zhu, Jianchu; Zhong, Yuyue; Liu, Linsan; Xu, Shutu; Zhang, Renhe; Zhang, Xinghua; Xue, Jiquan; Guo, Dongwei

    2017-10-04

    Particle-size distribution, granular structure, and composition significantly affect the physicochemical properties, rheological properties, and nutritional function of starch. Flow cytometry and flow sorting are widely considered convenient and efficient ways of classifying and separating natural biological particles or other substances into subpopulations, based on the differential response of each component to stimulation by a light beam; the results allow for the correlation analysis of parameters. In this study, different types of starches isolated from waxy maize, sweet maize, high-amylose maize, pop maize, and normal maize were initially classified into subgroups by a flow cytometer and then collected through flow sorting so that their morphology and particle-size distribution could be observed. The results showed that a 0.25% Gelzan solution served as an optimal reagent for keeping individual starch particles homogeneously dispersed in suspension for a relatively long time. The bivariate flow cytometric population distributions indicated that the starches of normal maize, sweet maize, and pop maize were divided into two subgroups, whereas high-amylose maize starch had only one subgroup. Waxy maize starch, conversely, showed three subpopulations. The subgroups sorted by the flow cytometer were determined and verified in terms of morphology and granule size by scanning electron microscopy and a laser particle-size analyzer. These results show that flow cytometry can be regarded as a novel method for classifying and sorting starch granules. © 2017 International Society for Advancement of Cytometry.

  1. Effects of three heavy metals on the bacteria growth kinetics. A bivariate model for toxicological assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rial, Diego; Vazquez, Jose Antonio; Murado, Miguel Anxo [Instituto de Investigacions Marinas (CSIC), Vigo (ES). Grupo de Reciclado y Valorizacion de Materiales Residuales (REVAL)

    2011-05-15

    The effects of three heavy metals (Co, Ni and Cd) on the growth kinetics of five bacterial strains with different characteristics (Pseudomonas sp., Phaeobacter sp. strain 27-4, Listonella anguillarum, Carnobacterium piscicola and Leuconostoc mesenteroides subsp. lysis) were studied in a batch system. A bivariate model, a function of time and dose, is proposed to describe simultaneously all the kinetic profiles obtained by incubating a microorganism at increasing concentrations of individual metals. This model combines the logistic equation for describing growth with a modification of the cumulative Weibull function for describing the dose-dependent variations of the growth parameters. The comprehensive model thus obtained - which minimizes the effects of experimental error - was statistically significant in all the studied cases, and it raises doubts about toxicological evaluations that are based on a single growth parameter, especially if it is not obtained from a kinetic equation. In lactic acid bacteria cultures (C. piscicola and L. mesenteroides), Cd induced remarkable differences in the yield and time course of characteristic metabolites. A global parameter is defined (ED_{50,τ}: the dose of a toxic chemical that reduces the biomass of a culture by 50% compared to that produced by the control at the time corresponding to its semi-maximum biomass) that allows toxic effects on growth kinetics to be compared using a single value. (orig.)
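
    The structure of such a bivariate dose-time model can be sketched as logistic growth in time whose parameters decline with dose according to a cumulative Weibull term, with the ED_{50,τ}-style summary read off numerically. The following Python fragment is a minimal illustration of that idea; every parameter value is invented for the sketch rather than taken from the paper.

```python
import numpy as np

def logistic_growth(t, K, r, tau):
    """Logistic biomass growth: K / (1 + exp(r*(tau - t)))."""
    return K / (1.0 + np.exp(r * (tau - t)))

def weibull_decline(dose, base, w, m, a):
    """Cumulative-Weibull dose response: the parameter falls from `base`
    towards base*(1 - w) as dose increases (m: scale, a: shape)."""
    return base * (1.0 - w * (1.0 - np.exp(-((dose / m) ** a))))

def biomass(t, dose):
    # Dose-dependent asymptote K and rate r; all numbers are illustrative.
    K = weibull_decline(dose, base=1.0, w=0.9, m=5.0, a=2.0)
    r = weibull_decline(dose, base=0.8, w=0.6, m=8.0, a=1.5)
    return logistic_growth(t, K, r, tau=10.0)

# ED_{50,tau}-style summary: the dose halving biomass at the time the
# control culture reaches its semi-maximum biomass (t = tau here)
tau = 10.0
control = biomass(tau, 0.0)
doses = np.linspace(0.0, 20.0, 2001)
ed50 = doses[np.argmin(np.abs(biomass(tau, doses) - control / 2.0))]
print(ed50)
```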

  2. Accommodating negative intracluster correlation with a mixed effects logistic model for bivariate binary data.

    Science.gov (United States)

    Ten Have, T R; Kunselman, A; Zharichenko, E

    1998-03-01

    We extend the random intercept logistic model to accommodate negative intracluster correlations for bivariate binary response data. This approach assumes a single random effect per cluster, but entails separate affine transformations of this random effect for the two responses of the pair. We show this approach works for two data sets and a simulation, whereas other mixed effects approaches fail. The two data sets are from a crossover trial and a developmental toxicity study of the effects of chemical exposure on malformation risk among rat pups. Comparisons are made with the conditional likelihood approach and with generalized estimating equations estimation of the population-averaged logit model. Simulations show the conditional likelihood approach does not perform well for moderate to strong negative correlations, as a positive intracluster correlation is assumed. The proposed mixed effects approach appears to be slightly more conservative than the population-averaged approach with respect to coverage of confidence intervals. Nonetheless, the statistical literature suggests that mixed effects models provide information in addition to that provided by population-averaged models under scientific contexts such as crossover trials. Extensions to trivariate and higher-dimensional responses also are addressed. However, such extensions require certain constraints on the correlation structure.

  3. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.

  4. Modeling both the number of pausibacillary and multibacillary leprosy patients by using bivariate Poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world total). This number automatically places Indonesia as the ASEAN country with the highest leprosy morbidity. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: pausibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of pausibacillary leprosy. This paper discusses modeling the numbers of multibacillary and pausibacillary leprosy patients as response variables. These responses are count variables, so modeling is conducted using the bivariate Poisson regression method. The units of observation are in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.
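
    The bivariate Poisson distribution used in such models is commonly built by trivariate reduction, which makes the positive dependence between the two counts explicit. A minimal Python simulation of that construction, with made-up regression coefficients linking the two means to a single covariate, is sketched below; it is not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(42)

def rbivpois(n, lam1, lam2, lam12):
    """Bivariate Poisson via trivariate reduction: X1 = Y1 + Y12 and
    X2 = Y2 + Y12, so that Cov(X1, X2) = lam12 >= 0."""
    y1 = rng.poisson(lam1, n)
    y2 = rng.poisson(lam2, n)
    y12 = rng.poisson(lam12, n)
    return y1 + y12, y2 + y12

# Illustrative unit-level intensities driven by one covariate (poverty);
# the coefficients are invented for the sketch, not estimated from data.
n = 500
poverty = rng.uniform(0.0, 1.0, n)
lam_pb = np.exp(0.2 + 0.8 * poverty)    # pausibacillary mean
lam_mb = np.exp(0.6 + 1.1 * poverty)    # multibacillary mean
pb, mb = rbivpois(n, lam_pb, lam_mb, 0.3)
print(np.corrcoef(pb, mb)[0, 1])        # positive by construction
```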

  5. The Bivariate Empirical Mode Decomposition and Its Contribution to Grinding Chatter Detection

    Directory of Open Access Journals (Sweden)

    Huanguo Chen

    2017-02-01

    Grinding chatter reduces the long-term reliability of grinding machines. Detecting the negative effects of chatter requires improved chatter detection techniques. The vibration signals collected from grinders are mainly nonstationary, nonlinear and multidimensional. Hence, bivariate empirical mode decomposition (BEMD) has been investigated as a method for processing multiple signals. In this paper, a feature vector extraction method based on BEMD and the Hilbert transform is applied to the problem of grinding chatter. The effectiveness of this method was tested and validated with a simulated chatter signal produced by a vibration signal generator. The extraction criterion for true intrinsic mode functions (IMFs) was also investigated, as well as a method for selecting the most suitable number of projection directions in the BEMD algorithm. Moreover, real-time variance and instantaneous energy were employed as chatter feature vectors for improving the prediction of chatter. Furthermore, the combination of BEMD and the Hilbert transform was validated on experimental data collected from a computer numerical control (CNC) guideway grinder. The results reveal the good behavior of BEMD in processing nonstationary and nonlinear signals and in indicating the synchronous characteristics of multiple signals. The extracted chatter feature vectors were demonstrated to be reliable predictors of early grinding chatter.
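
    After BEMD isolates the relevant IMFs, the chatter features named above are computed from the Hilbert transform of the selected component. The fragment below sketches just that feature-extraction step (instantaneous energy from the analytic signal, plus a sliding-window variance) on a toy signal with a synthetic chatter burst; the BEMD stage itself is omitted and all signal parameters are invented.

```python
import numpy as np
from scipy.signal import hilbert

# Toy single-channel sketch: a grinder vibration signal in which a
# synthetic chatter burst appears halfway through the record.
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(2)
chatter = (t > 1.0) * 0.8 * np.sin(2 * np.pi * 180.0 * t)
x = np.sin(2 * np.pi * 25.0 * t) + chatter + 0.1 * rng.normal(size=t.size)

analytic = hilbert(x)              # analytic signal via the Hilbert transform
envelope = np.abs(analytic)        # instantaneous amplitude
inst_energy = envelope ** 2        # instantaneous energy feature

# Real-time variance over a sliding window as a second chatter feature
win = 100
var_feat = np.array([x[i:i + win].var() for i in range(t.size - win)])
```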

  6. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study demonstrates the capability of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which indicates that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Overall, these techniques are very beneficial for groundwater potential analysis and can be of practical use to water-resource management experts.
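
    The statistical index at the heart of the SI method is simply the log-ratio of the spring density within each class of a conditioning factor to the overall spring density; summing the class scores across all factors yields the groundwater potential map. A minimal Python sketch on a toy raster is given below; the class layout and probabilities are invented for illustration.

```python
import numpy as np

def statistical_index(class_ids, spring_mask):
    """SI per class of one conditioning factor:
    SI_j = ln( (springs in class j / cells in class j) /
               (all springs / all cells) ), i.e. a log density ratio."""
    overall = spring_mask.sum() / spring_mask.size
    si = {}
    for j in np.unique(class_ids):
        in_class = class_ids == j
        dens = spring_mask[in_class].sum() / in_class.sum()
        si[j] = np.log(dens / overall) if dens > 0 else np.nan
    return si

# Toy raster: 3 slope classes and a boolean spring-occurrence grid
rng = np.random.default_rng(1)
slope_class = rng.integers(0, 3, size=10000)
springs = rng.random(10000) < np.where(slope_class == 1, 0.02, 0.005)
print(statistical_index(slope_class, springs))   # class 1 scores highest
```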

  7. Modelling bivariate astronomical data with multiple components and non-linear relationships

    Science.gov (United States)

    Koen, C.; Bere, A.

    2017-11-01

    A common approach towards modelling bivariate scatterplots is decomposition into Gaussian components, i.e. Gaussian mixture modelling. This implicitly assumes linear relationships between the variables within each of the components in the mixture. An alternative, namely dependence modelling by mixtures of copulas, is advocated in this paper. This approach allows separate modelling of the univariate marginal distributions and of the dependence, which can possibly be non-linear and/or asymmetric. It also accommodates the use of a variety of parametric families for modelling each component and each variable. The variety of dependence structures can be extended by introducing rotated versions of the copulas. Gaussian mixture modelling on the one hand, and separate modelling of univariate marginal distributions and dependence on the other hand, are illustrated by application to pulsar period versus period-derivative observations. Parameter estimation for mixtures of copulas is performed using the method of maximum likelihood, and selected copula models are subjected to non-parametric goodness-of-fit testing.

  8. Using the Bivariate Dale Model to jointly estimate predictors of frequency and quantity of alcohol use.

    Science.gov (United States)

    McMillan, Garnett P; Hanson, Tim; Bedrick, Edward J; Lapham, Sandra C

    2005-09-01

    This study demonstrates the usefulness of the Bivariate Dale Model (BDM) as a method for estimating the relationship between risk factors and the quantity and frequency of alcohol use, as well as the degree of association between these highly correlated drinking measures. The BDM is used to evaluate childhood sexual abuse, along with age and gender, as risk factors for the quantity and frequency of beer consumption in a sample of driving-while-intoxicated (DWI) offenders (N = 1,964; 1,612 men). The BDM allows one to estimate the relative odds of drinking up to each level of ordinal-scaled quantity and frequency of alcohol use, as well as to model the degree of association between quantity and frequency of alcohol consumption as a function of covariates. Individuals who experienced childhood sexual abuse have increased risks of higher quantity and frequency of beer consumption. A history of childhood sexual abuse has a greater effect on women, causing them to drink higher quantities of beer per drinking occasion. The BDM is a useful method for evaluating predictors of the quantity-frequency of alcohol consumption. SAS macro code for fitting the BDM is provided.

  9. Detection of recurrent chromosome abnormalities in Ewing's sarcoma and peripheral neuroectodermal tumor cells using bivariate flow karyotyping

    NARCIS (Netherlands)

    Boschman, G. A.; Rens, W.; Manders, E. M.; Slater, R. M.; Versteeg, R.; Aten, J. A.

    1992-01-01

    Bivariate flow karyotyping can be used for the detection of recurrent chromosome abnormalities in tumor cells. For this purpose 2 cell lines originally derived from Ewing's sarcomas and 4 cell lines from peripheral neuroectodermal tumors were used. The characteristic t(11;22) was known to be present

  10. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez, E-mail: valter.costa@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important because an early diagnosis allows the fault to be corrected, thereby preventing production stoppages, improving operator safety and avoiding economic losses. The objective of this work is to use bivariate correlation and canonical correlation to build the set of input variables of an artificial neural network so as to monitor the greatest possible number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, the variables nuclear power, primary circuit flow rate, control/safety rod position and pressure difference across the reactor core were selected for the network's input set, because almost all of the monitored variables are related to these variables, or their effects can result from the interaction of two or more of them. The nuclear power is related to temperature increases and decreases as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the temperature changes; and the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)
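
    The variable-selection idea described above can be sketched as a simple screening rule: a monitored variable is considered covered if its absolute bivariate correlation with at least one chosen network input exceeds a threshold. The Python fragment below illustrates this on synthetic data; the variable names, threshold, and data are all invented, and the canonical correlation part of the actual study is omitted here.

```python
import numpy as np

def select_covered(X, names, inputs, threshold=0.6):
    """Return the monitored variables whose absolute correlation with at
    least one chosen network input reaches the threshold."""
    corr = np.corrcoef(X, rowvar=False)
    idx = [names.index(c) for c in inputs]
    covered = []
    for j, name in enumerate(names):
        if name in inputs:
            continue
        if np.max(np.abs(corr[j, idx])) >= threshold:
            covered.append(name)
    return covered

# Toy data: 6 monitored variables, 2 of them chosen as network inputs
rng = np.random.default_rng(0)
power = rng.normal(size=500)
flow = rng.normal(size=500)
X = np.column_stack([power, flow,
                     0.9 * power + 0.1 * rng.normal(size=500),
                     0.8 * flow + 0.2 * rng.normal(size=500),
                     rng.normal(size=500),
                     0.7 * power + 0.3 * rng.normal(size=500)])
names = ["power", "flow", "t_core", "dp_core", "humidity", "gamma"]
print(select_covered(X, names, ["power", "flow"]))
```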

  11. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? And what is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
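
    The LagSVD idea can be sketched compactly: build one flattened two-dimensional histogram of (angle_t, angle_{t+lag}) pairs per lag, stack the histograms into a matrix, and examine how quickly the singular value spectrum decays as a nonparametric summary of how the lag-distributions change with lag. The following Python fragment is a loose illustration of that construction on a toy wrapped random-walk angle series; the bin counts, lag range, and data are assumptions of the sketch, not the authors' implementation.

```python
import numpy as np

def lag_distribution_svd(angles, max_lag=8, bins=36):
    """Stack one flattened 2-D histogram of (angle_t, angle_{t+lag})
    pairs per lag, then return the normalized singular value spectrum."""
    rows = []
    for lag in range(1, max_lag + 1):
        h, _, _ = np.histogram2d(angles[:-lag], angles[lag:], bins=bins,
                                 range=[[-180, 180], [-180, 180]],
                                 density=True)
        rows.append(h.ravel())
    M = np.vstack(rows)                       # max_lag x bins**2 matrix
    s = np.linalg.svd(M, compute_uv=False)
    return s / s.sum()

# Toy dihedral-angle chain with short-range sequential dependence
rng = np.random.default_rng(3)
angles = np.cumsum(rng.normal(0.0, 25.0, size=5000))
angles = (angles + 180.0) % 360.0 - 180.0     # wrap to [-180, 180)
print(lag_distribution_svd(angles)[:4])       # leading spectrum entries
```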

  12. Using bivariate signal analysis to characterize the epileptic focus: The benefit of surrogates

    Science.gov (United States)

    Andrzejak, R. G.; Chicharro, D.; Lehnertz, K.; Mormann, F.

    2011-04-01

    The disease epilepsy is related to hypersynchronous activity of networks of neurons. While acute epileptic seizures are the most extreme manifestation of this hypersynchronous activity, an elevated level of interdependence of neuronal dynamics is thought to persist also during the seizure-free interval. In multichannel recordings from brain areas involved in the epileptic process, this interdependence can be reflected in an increased linear cross correlation but also in signal properties of higher order. Bivariate time series analysis comprises a variety of approaches, each with different degrees of sensitivity and specificity for interdependencies reflected in lower- or higher-order properties of pairs of simultaneously recorded signals. Here we investigate which approach is best suited to detect putatively elevated interdependence levels in signals recorded from brain areas involved in the epileptic process. For this purpose, we use the linear cross correlation that is sensitive to lower-order signatures of interdependence, a nonlinear interdependence measure that integrates both lower- and higher-order properties, and a surrogate-corrected nonlinear interdependence measure that aims to specifically characterize higher-order properties. We analyze intracranial electroencephalographic recordings of the seizure-free interval from 29 patients with an epileptic focus located in the medial temporal lobe. Our results show that all three approaches detect higher levels of interdependence for signals recorded from the brain hemisphere containing the epileptic focus as compared to signals recorded from the opposite hemisphere. For the linear cross correlation, however, these differences are not significant. For the nonlinear interdependence measure, results are significant but only of moderate accuracy with regard to the discriminative power for the focal and nonfocal hemispheres. The highest significance and accuracy is obtained for the surrogate-corrected nonlinear

  13. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images

    Science.gov (United States)

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K.; Schad, Lothar R.; Zöllner, Frank Gerrit

    2015-01-01

    Background Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. Methods and Results In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. Validation To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Context Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics. PMID:26717571

  14. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Directory of Open Access Journals (Sweden)

    Jakob Nikolas Kather

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.
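
    A rough sketch of the unmix-then-analyze pipeline is given below: scikit-image's rgb2hed color deconvolution separates the hematoxylin and DAB channels, and a PCA of the channel pairs identifies the axis carrying the usable chromogen-versus-counterstain contrast. This is only the analysis skeleton on invented toy data; the paper's actual optimization of the bivariate color map is not reproduced here.

```python
import numpy as np
from skimage.color import rgb2hed

# A real immunostained RGB image would be loaded here; toy data instead.
rng = np.random.default_rng(0)
rgb = rng.random((64, 64, 3))

hed = rgb2hed(rgb)                              # unmix Hematoxylin/Eosin/DAB
pairs = np.column_stack([hed[..., 0].ravel(),   # hematoxylin channel
                         hed[..., 2].ravel()])  # DAB channel
pairs -= pairs.mean(axis=0)

# PCA: the leading axis carries most of the stain-vs-counterstain contrast
_, s, vt = np.linalg.svd(pairs, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("stain axis:", vt[0], "contrast share:", explained[0])
```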

  15. Determining the Architecture of a Protein-DNA Complex by Combining FeBABE Cleavage Analyses, 3-D Printed Structures, and the ICM Molsoft Program.

    Science.gov (United States)

    James, Tamara; Hsieh, Meng-Lun; Knipling, Leslie; Hinton, Deborah

    2015-01-01

    Determining the structure of a protein-DNA complex can be difficult, particularly if the protein does not bind tightly to the DNA, if there are no homologous proteins from which the DNA binding can be inferred, and/or if only portions of the protein can be crystallized. If the protein comprises just a part of a large multi-subunit complex, other complications can arise such as the complex being too large for NMR studies, or it is not possible to obtain the amounts of protein and nucleic acids needed for crystallographic analyses. Here, we describe a technique we used to map the position of an activator protein relative to the DNA within a large transcription complex. We determined the position of the activator on the DNA from data generated using activator proteins that had been conjugated at specific residues with the chemical cleaving reagent, iron bromoacetamidobenzyl-EDTA (FeBABE). These analyses were combined with 3-D models of the available structures of portions of the activator protein and B-form DNA to obtain a 3-D picture of the protein relative to the DNA. Finally, the Molsoft program was used to refine the position, revealing the architecture of the protein-DNA within the transcription complex.

  16. Systematic review of model-based analyses reporting the cost-effectiveness and cost-utility of cardiovascular disease management programs.

    Science.gov (United States)

    Maru, Shoko; Byrnes, Joshua; Whitty, Jennifer A; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A

    2015-02-01

    The reported cost effectiveness of cardiovascular disease management programs (CVD-MPs) is highly variable, potentially leading to different funding decisions. This systematic review evaluates published modeled analyses to compare study methods and quality. Articles were included if an incremental cost-effectiveness ratio (ICER) or cost-utility ratio (ICUR) was reported, the intervention was a multi-component intervention designed to manage or prevent a cardiovascular disease condition, and it addressed all domains specified in the American Heart Association Taxonomy for Disease Management. Nine articles (reporting 10 clinical outcomes) were included. Eight cost-utility and two cost-effectiveness analyses targeted hypertension (n=4), coronary heart disease (n=2), coronary heart disease plus stroke (n=1), heart failure (n=2) and hyperlipidemia (n=1). Study perspectives included the healthcare system (n=5), societal and fund holders (n=1), a third party payer (n=3), or were not explicitly stated (n=1). All analyses were modeled based on interventions of one to two years' duration. Time horizons ranged over two years (n=1), 10 years (n=1) and lifetime (n=8). Model structures included Markov models (n=8), 'decision analytic models' (n=1), or were not explicitly stated (n=1). Considerable variation was observed in clinical and economic assumptions and reporting practices. Of all ICERs/ICURs reported, including those of subgroups (n=16), four were above a US$50,000 acceptability threshold, six were below and six were dominant. The majority of CVD-MPs were reported to have favorable economic outcomes, but 25% were at unacceptably high cost for the outcomes. Use of standardized reporting tools should increase transparency and inform what drives the cost-effectiveness of CVD-MPs. © The European Society of Cardiology 2014.

  17. Impact of public programs on fertility and gender specific investment in human capital of children in rural India: cross sectional and time series analyses.

    Science.gov (United States)

    Duraisamy, P; Malathy, R

    1991-01-01

    Cross-sectional and time series analyses are conducted with 1971 and 1981 rural district level data for India in order to estimate variations in program impacts on household decision making concerning fertility, child mortality, and schooling; to analyze how the variation in public program subsidies and services influences sex-specific investments in schooling; and to examine the bias in cross-sectional estimates by employing a fixed effects methodology. The theory of household production uses the framework developed by Rosenzweig and Wolpin. The utility function is expressed as a function of families' desired number of children, sex-specific investment in the human capital of children measured by the schooling of males and females, and a composite consumption good. Budget constraints are characterized in terms of the biological supply of births or natural fertility, the number of births averted by fertility control, exogenous money income, the prices of the number of children, contraceptives, child schooling, and consumption goods. Demand functions are constructed by maximizing the utility function subject to the budget constraint. The data cover 40% of the total districts and 50% of the rural population. The empirical specification of the linear model and a description of the variables are provided. Other explanatory variables included are adult educational attainment; the percentages of scheduled castes and tribes and of Muslims; and the percentage of rural population. Estimation methods are described and justification is provided for the use of ordinary least squares and fixed effects methods. The results of the cross-sectional analysis reveal that own-program effects of family planning and primary health centers reduced family size in 1971 and 1981. The increase in secondary school enrollment is evident only in 1971. There is a significant effect of family planning (FP) clinics on the demand for surviving children only in 1971. The presence of a secondary school in a village reduces the demand for children in

  18. Programs in Fortran language for reporting the results of the analyses by ICP emission spectroscopy; Programas en lenguaje Fortran para la informacion de los resultados de los analisis efectuados mediante Espectroscopia Optica de emision con fuente de plasma

    Energy Technology Data Exchange (ETDEWEB)

    Roca, M.

    1985-07-01

    Three programs, written in FORTRAN IV language, for reporting the results of the analyses by ICP emission spectroscopy from data stored in files on floppy disks have been developed. They are intended, respectively, for the analyses of: 1) waters, 2) granites and slates, and 3) different kinds of geological materials. (Author) 8 refs.

  19. An application of UV-derivative spectrophotometry and bivariate calibration algorithm for study of photostability of levomepromazine hydrochloride.

    Science.gov (United States)

    Karpińska, Joanna; Sokół, Aneta; Skoczylas, Marta

    2008-12-15

    Derivative spectrophotometry and a bivariate calibration algorithm were used to study the course of the photooxidation of levomepromazine hydrochloride (LV). The actual concentrations of LV and its main degradation product, levomepromazine sulphoxide (LV-SO), were calculated using data provided by the applied methods. In the bivariate method, direct readings of the absorbance values at 302 nm and 334 nm were employed for the quantification of LV and LV-SO, respectively. The derivative spectrophotometric method is based on the transformation of zero-order spectra into the first derivative. The values of the first derivative at 334 nm were used for the quantification of LV, while those at 278 nm were used for the assay of LV-SO. The quantitative data obtained were applied to the investigation of the kinetics of the photodegradation of LV.

  20. Inland dissolved salt chemistry: statistical evaluation of bivariate and ternary diagram models for surface and subsurface waters

    Directory of Open Access Journals (Sweden)

    Stephen T. THRELKELD

    2000-08-01

    We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models, even if large water bodies were evaluated separately from small water bodies. Atmospheric precipitation effects were identified using ternary diagrams in water with total dissolved salts (TDS) < 1000 mg l-1. A principal components analysis showed that the variability in the relative proportions of the major ions was related to atmospheric precipitation, weathering, and evaporation. About half of the variation in the distribution of inorganic ions was related to rock weathering. By considering most of the important inorganic ions, ternary diagrams are able to distinguish the contributions of atmospheric precipitation, rock weathering, and evaporation to inland water chemistry.

  1. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts (York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to usual quantities leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts related to extreme structural models, i.e. where the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter (Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (∓σ) for both
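
    For readers who want a concrete starting point, generalized orthogonal regression with known error scales on both axes is available in SciPy's odr module. The fragment below fits a straight line to toy errors-in-variables data; the error levels and data are invented, and this reproduces only the weighted orthogonal subcase discussed above, not the paper's full formalism.

```python
import numpy as np
from scipy import odr

# Toy errors-in-variables data; true line y = 0.5 x + 1, errors on both axes
rng = np.random.default_rng(7)
x_true = np.linspace(-2.0, 2.0, 50)
x = x_true + rng.normal(scale=0.15, size=x_true.size)
y = 0.5 * x_true + 1.0 + rng.normal(scale=0.10, size=x_true.size)

def line(beta, x):
    return beta[0] * x + beta[1]

data = odr.RealData(x, y, sx=0.15, sy=0.10)     # per-axis error estimates
out = odr.ODR(data, odr.Model(line), beta0=[1.0, 0.0]).run()
print(out.beta, out.sd_beta)    # slope/intercept and their standard errors
```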

  2. On the sources of the height–intelligence correlation: New insights from a bivariate ACE model with assortative mating

    OpenAIRE

    Beauchamp, Jonathan P; Cesarini, David; Johannesson, Magnus; Lindqvist, Erik; Apicella, Coren Lee

    2010-01-01

    A robust positive correlation between height and intelligence, as measured by IQ tests, has been established in the literature. This paper makes several contributions toward establishing the causes of this association. First, we extend the standard bivariate ACE model to account for assortative mating. The more general theoretical framework provides several key insights, including formulas to decompose a cross-trait genetic correlation into components attributable to assortative mating and pl...

  3. Bivariate Extension of the Quadrature Method of Moments for Modeling Simultaneous Coagulation and Sintering of Particle Populations.

    Science.gov (United States)

    Wright, Douglas L.; McGraw, Robert; Rosner, Daniel E.

    2001-04-15

    We extend the application of moment methods to multivariate suspended particle population problems - those for which size alone is insufficient to specify the state of a particle in the population. Specifically, a bivariate extension of the quadrature method of moments (QMOM) (R. McGraw, Aerosol Sci. Technol. 27, 255 (1997)) is presented for efficiently modeling the dynamics of a population of inorganic nanoparticles undergoing simultaneous coagulation and particle sintering. Continuum regime calculations are presented for the Koch-Friedlander-Tandon-Rosner model, which includes coagulation by Brownian diffusion (evaluated for particle fractal dimensions, D(f), in the range 1.8-3) and simultaneous sintering of the resulting aggregates (P. Tandon and D. E. Rosner, J. Colloid Interface Sci. 213, 273 (1999)). For evaluation purposes, and to demonstrate the computational efficiency of the bivariate QMOM, benchmark calculations are carried out using a high-resolution discrete method to evolve the particle distribution function n(nu, a) for short to intermediate times (where nu and a are particle volume and surface area, respectively). Time evolution of a selected set of 36 low-order mixed moments is obtained by integration of the full bivariate distribution and compared with the corresponding moments obtained directly using two different extensions of the QMOM. With the more extensive treatment, errors of less than 1% are obtained over substantial aerosol evolution, while requiring only a few minutes (rather than days) of CPU time. Longer time QMOM simulations lend support to the earlier finding of a self-preserving limit for the dimensionless joint (nu, a) particle distribution function under simultaneous coagulation and sintering (Tandon and Rosner, 1999; D. E. Rosner and S. Yu, AIChE J., 47 (2001)). We demonstrate that, even in the bivariate case, it is possible to use the QMOM to rapidly model the approach to asymptotic behavior, allowing an immediate assessment of

  4. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Directory of Open Access Journals (Sweden)

    Jose Javier Gorgoso-Varela

    2016-04-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson's SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results, and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula, and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.

  5. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)

  6. Giant Galápagos tortoises; molecular genetic analyses identify a trans-island hybrid in a repatriation program of an endangered taxon

    Directory of Open Access Journals (Sweden)

    Caccone Adalgisa

    2007-02-01

    Background: Giant Galápagos tortoises on the island of Española have been the focus of an intensive captive breeding-repatriation programme for over 35 years that saved the taxon from extinction. However, analysis of 118 samples from released individuals indicated that the biased sex ratio and large variance in reproductive success among the 15 breeders have severely reduced the effective population size (Ne). Results: We report here that an analysis of an additional 473 captive-bred tortoises released back to the island reveals an individual (E1465) that exhibits nuclear microsatellite alleles not found in any of the 15 breeders. Statistical analyses incorporating genotypes of 304 field-sampled individuals from all populations on the major islands indicate that E1465 is most probably a hybrid between an Española female tortoise and a male from the island of Pinzón, likely present on Española due to human transport. Conclusion: Removal of E1465, as well as its father and possible (half-)siblings, is warranted to prevent further contamination within this taxon of particular conservation significance. Despite this single detected contamination, the success of this repatriation program, conducted over nearly 40 years and involving the release of over 2000 captive-bred tortoises that now reproduce in situ, is highly noteworthy. The incorporation of molecular genetic analysis into the program is providing guidance that will aid in monitoring the genetic integrity of this ambitious effort to restore a unique lineage of a spectacular animal.

  7. The accuracy of cell-free fetal DNA-based non-invasive prenatal testing in singleton pregnancies: a systematic review and bivariate meta-analysis.

    Science.gov (United States)

    Mackie, F L; Hemming, K; Allen, S; Morris, R K; Kilby, M D

    2017-01-01

    Cell-free fetal DNA (cffDNA) non-invasive prenatal testing (NIPT) is rapidly expanding, and is being introduced at varying rates depending on country and condition. The objectives were to determine the accuracy of cffDNA-based NIPT for all conditions and to evaluate the influence of other factors on test performance. Medline, Embase, CINAHL and the Cochrane Library were searched from 1997 to April 2015 for cohort studies reporting cffDNA-based NIPT performance in singleton pregnancies. Bivariate or univariate meta-analysis and subgroup analysis were performed to explore the influence of test type and population risk. A total of 117 studies were included that analysed 18 conditions. Bivariate meta-analysis demonstrated sensitivities and specificities, respectively, for: fetal sex, 0.989 (95% CI 0.980-0.994) and 0.996 (95% CI 0.989-0.998), 11 179 tests; rhesus D, 0.993 (95% CI 0.982-0.997) and 0.984 (95% CI 0.964-0.993), 10 290 tests; trisomy 21, 0.994 (95% CI 0.983-0.998) and 0.999 (95% CI 0.999-1.000), 148 344 tests; trisomy 18, 0.977 (95% CI 0.952-0.989) and 0.999 (95% CI 0.998-1.000), 146 940 tests; monosomy X, 0.929 (95% CI 0.741-0.984) and 0.999 (95% CI 0.995-0.999), 6712 tests. Trisomy 13 was analysed by univariate meta-analysis, with a summary sensitivity of 0.906 (95% CI 0.823-0.958) and specificity of 1.00 (95% CI 0.999-1.000), from 134 691 tests. False and inconclusive results were poorly reported across all conditions. Although the test type affected both sensitivity and specificity, there was no evidence that population risk had any effect. The performance of cffDNA-based NIPT is affected by the condition under investigation. For fetal sex and rhesus D status, NIPT can be considered diagnostic. For trisomy 21, 18, and 13, the lower sensitivity and specificity and the lower disease prevalence, combined with the biological influence of confined placental mosaicism, designate it a screening test. These factors must be considered when counselling patients and assessing the cost of introduction into routine care

  8. Computation of the Integral of the Bivariate Normal Distribution Over Arbitrary Polygons

    Science.gov (United States)

    1980-06-01

    Table-of-contents fragments recovered from the report: IV. Normal Probability over Arbitrary Polygons; V. Discussion of Computer Program B (Flow Charts). The report evaluates the probability P for each angular region needed; the computer program (Fortran IV, run on a CDC-6700) and its flow charts are discussed in Section V. Equations (27)-(28), which express the polygon probabilities P(D) and P(E) in terms of angular regions of simple polygons with P(D) + P(E) = P(S), are illegible in the scanned source.
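
    Where the report's angular-region decomposition is not at hand, the same quantity, the bivariate normal probability content of an arbitrary simple polygon, can be approximated by plain Monte Carlo, as in the hedged Python sketch below (the polygon, mean, and covariance are arbitrary examples, not values from the report).

```python
import numpy as np
from matplotlib.path import Path

def mvn_polygon_prob(vertices, mean, cov, n=200_000, seed=0):
    """Monte Carlo estimate of the bivariate normal probability content
    of a simple polygon: sample from the normal, count points inside."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean, cov, size=n)
    inside = Path(vertices).contains_points(samples)
    return inside.mean()

poly = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.5, 2.0)]   # arbitrary polygon
p = mvn_polygon_prob(poly, mean=[0.5, 0.5],
                     cov=[[1.0, 0.3], [0.3, 0.8]])
print(p)
```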

  9. VH Replacement Footprint Analyser-I (VHRFA-I), a Java-based Computer Program for Analyses of Immunoglobulin Heavy Chain Genes and Potential VH Replacement Products in Human and Mouse

    Directory of Open Access Journals (Sweden)

    Lin Huang

    2014-02-01

    VH replacement occurs through RAG-mediated secondary recombination between a rearranged VH gene and an upstream unrearranged VH gene. Due to the location of the cryptic Recombination Signal Sequence (cRSS, TACTGTG) at the 3' end of the VH gene coding region, a short stretch of nucleotides from the previously rearranged VH gene can be retained in the newly formed VH-DH junction as a footprint of VH replacement. Such footprints can be used as markers to identify IgH genes potentially generated through VH replacement. To explore the contribution of VH replacement products to the antibody repertoire, we developed a Java-based computer program, VH replacement footprint analyzer-I (VHRFA-I), to analyze published or newly obtained IgH genes from human or mouse. The VHRFA-I program has multiple functional modules: it first uses the service provided by the IMGT/V-QUEST program to assign potential VH, DH, and JH germline genes; it then searches for VH replacement footprint motifs within the VH-DH junction (N1) regions of IgH gene sequences to identify potential VH replacement products; it can also analyze the frequencies of VH replacement products in correlation with publications, keywords, or VH, DH, and JH gene usage, and mutation status; and it can further analyze the amino acid usage encoded by the identified VH replacement footprints. In summary, this program provides a useful computational tool for exploring the biological significance of VH replacement products in human and mouse.
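
    The footprint-scanning module reduces, at its core, to motif matching against the cRSS heptamer within the N1 region. Although the published tool is Java-based, the idea can be sketched in a few lines of Python; the one-mismatch tolerance and the example junction sequence are illustrative assumptions, not the tool's actual matching rules.

```python
# Scan a VH-DH junction (N1) sequence for the cRSS heptamer TACTGTG,
# optionally allowing a small number of mismatches.
CRSS = "TACTGTG"

def footprint_hits(n1_region, max_mismatch=1):
    hits = []
    for i in range(len(n1_region) - len(CRSS) + 1):
        window = n1_region[i:i + len(CRSS)]
        mismatches = sum(a != b for a, b in zip(window, CRSS))
        if mismatches <= max_mismatch:
            hits.append((i, window, mismatches))
    return hits

print(footprint_hits("GGTACTGTGCCA"))   # exact hit at offset 2
```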

  10. A Distribution-Based Multiple Imputation Method for Handling Bivariate Pesticide Data with Values below the Limit of Detection

    Science.gov (United States)

    Chen, Haiying; Quandt, Sara A.; Grzywacz, Joseph G.; Arcury, Thomas A.

    2011-01-01

    Background: Environmental and biomedical researchers frequently encounter laboratory data constrained by a lower limit of detection (LOD). Commonly used methods to address these left-censored data, such as simple substitution of a constant for all values below the LOD, may bias parameter estimation. In contrast, multiple imputation (MI) methods yield valid and robust parameter estimates and explicit imputed values for variables that can be analyzed as outcomes or predictors. Objective: In this article we expand distribution-based MI methods for left-censored data to a bivariate setting, specifically, a longitudinal study with biological measures at two points in time. Methods: We present the likelihood function for a bivariate normal distribution taking into account values below the LOD as well as missing data assumed missing at random, and we use the estimated distributional parameters to impute values below the LOD and to generate multiple plausible data sets for analysis by standard statistical methods. We conducted a simulation study to evaluate the sampling properties of the estimators, and we illustrate a practical application using data from the Community Participatory Approach to Measuring Farmworker Pesticide Exposure (PACE3) study to estimate associations between urinary acephate (APE) concentrations (indicating pesticide exposure) at two points in time and self-reported symptoms. Results: Simulation study results demonstrated that imputed and observed values together were consistent with the assumed and estimated underlying distribution. Our analysis of PACE3 data using MI to impute APE values below the LOD showed that urinary APE concentration was significantly associated with potential pesticide poisoning symptoms. Results based on simple substitution methods were substantially different from those based on the MI method. Conclusions: The distribution-based MI method is a valid and feasible approach to analyze bivariate data with values below the LOD, especially when explicit values for the
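
    The imputation step of such a distribution-based MI procedure amounts to drawing replacements for the left-censored entries from the fitted normal distribution truncated at the LOD. The Python fragment below sketches a single univariate imputation pass with scipy.stats.truncnorm; in full MI this is repeated across several data sets with parameters drawn from their estimated sampling distribution, and the bivariate extension conditions each draw on the other time point, which is omitted here. All numbers are invented.

```python
import numpy as np
from scipy import stats

def impute_below_lod(x, lod, mu, sigma, rng):
    """Replace left-censored entries (coded as NaN) with draws from the
    fitted normal distribution truncated to (-inf, lod]."""
    b = (lod - mu) / sigma                    # standardized upper bound
    n_cens = int(np.isnan(x).sum())
    draws = stats.truncnorm.rvs(-np.inf, b, loc=mu, scale=sigma,
                                size=n_cens, random_state=rng)
    out = x.copy()
    out[np.isnan(out)] = draws
    return out

rng = np.random.default_rng(5)
true = rng.normal(1.0, 0.5, size=200)         # e.g. log exposure levels
lod = 0.6
obs = np.where(true < lod, np.nan, true)      # left-censor below the LOD
completed = impute_below_lod(obs, lod, mu=1.0, sigma=0.5, rng=rng)
```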

  11. Survival advantage associated with treatment of injury at designated trauma centers: a bivariate probit model with instrumental variables.

    Science.gov (United States)

    Pracht, Etienne E; Tepas, Joseph J; Celso, Brian G; Langland-Orban, Barbara; Flint, Lewis

    2007-02-01

    This article analyzes the effectiveness of designated trauma centers in Florida in reducing the mortality risk of severely injured trauma victims. A bivariate probit model is used to compute the differential impact of two alternative acute care treatment sites. The alternative sites are defined as (1) a nontrauma center (NC) or (2) a designated trauma center (DTC). An instrumental-variables method was used to adjust for prehospital selection bias in addition to the influence of age, gender, race, risk of mortality, and type of injury. Treatment at a DTC was associated with a reduction of 0.13 in the probability of mortality.

  12. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    National Research Council Canada - National Science Library

    Behram Wali; Anwaar Ahmed; Shahid Iqbal; Arshad Hussain

    2017-01-01

    ... levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through the development of a bivariate ordered probit model using data extracted from WHO's global...

  13. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors -Exploratory empirical analysis using a bivariate ordered probit model

    National Research Council Canada - National Science Library

    Behram Wali; Anwaar Ahmed; Shahid Iqbal; Arshad Hussain

    2017-01-01

    ... levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through the development of a bivariate ordered probit model using data extracted from...

  14. Simultaneous use of serum IgG and IgM for risk scoring of suspected early Lyme borreliosis: graphical and bivariate analyses

    DEFF Research Database (Denmark)

    Dessau, Ram B; Ejlertsen, Tove; Hilden, Jørgen

    2010-01-01

    The laboratory diagnosis of early disseminated Lyme borreliosis (LB) rests on IgM and IgG antibodies in serum. The purpose of this study was to refine the statistical interpretation of IgM and IgG by combining the diagnostic evidence provided by the two immunoglobulins and exploiting the whole ra...

  15. Operator identities involving the bivariate Rogers-Szegö polynomials and their applications to the multiple q-series identities

    Science.gov (United States)

    Zhang, Zhizheng; Wang, Tianze

    2008-07-01

    In this paper, we first give several operator identities involving the bivariate Rogers-Szegö polynomials. By applying the technique of parameter augmentation to the multiple q-binomial theorems given by Milne [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187], we obtain several new multiple q-series identities involving the bivariate Rogers-Szegö polynomials. These include multiple extensions of Mehler's formula and Rogers's formula. Our U(n+1) generalizations are quite natural as they are also a direct and immediate consequence of their (often classical) known one-variable cases and Milne's fundamental theorem for An or U(n+1) basic hypergeometric series in Theorem 1.49 of [S.C. Milne, An elementary proof of the Macdonald identities for , Adv. Math. 57 (1985) 34-70], as rewritten in Lemma 7.3 on p. 163 of [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187] or Corollary 4.4 on pp. 768-769 of [S.C. Milne, M. Schlosser, A new An extension of Ramanujan's summation with applications to multilateral An series, Rocky Mountain J. Math. 32 (2002) 759-792].

  16. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  17. Requirements for implementation of Kuessner and Wagner indicial lift growth functions into the FLEXSTAB computer program system for use in dynamic loads analyses

    Science.gov (United States)

    Miller, R. D.; Rogers, J. T.

    1975-01-01

    General requirements for dynamic loads analyses are described. The indicial lift growth function unsteady subsonic aerodynamic representation is reviewed, and the FLEXSTAB CPS is evaluated with respect to these general requirements. The effects of residual flexibility techniques on dynamic loads analyses are also evaluated using a simple dynamic model.

  18. Root Cause Analyses of Nunn-McCurdy Breaches. Volume 2: Excalibur Artillery Projectile and the Navy Enterprise Resource Planning Program, with an Approach to Analyzing Program Complexity and Risk

    Science.gov (United States)

    2012-01-01

  19. Nonparametric association analysis of bivariate left-truncated competing risks data.

    Science.gov (United States)

    Cheng, Yu; Shen, Pao-Sheng; Zhang, Zhumin; Lai, HuiChuan J

    2016-05-01

    We develop time-varying association analyses for onset ages of two lung infections to address the statistical challenges in utilizing registry data where onset ages are left-truncated by ages of entry and competing-risk censored by deaths. Two types of association estimators are proposed based on conditional cause-specific hazard function and cumulative incidence function that are adapted from unconditional quantities to handle left truncation. Asymptotic properties of the estimators are established by using the empirical process techniques. Our simulation study shows that the estimators perform well with moderate sample sizes. We apply our methods to the Cystic Fibrosis Foundation Registry data to study the relationship between onset ages of Pseudomonas aeruginosa and Staphylococcus aureus infections. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Calibration exercise for the Community Aquatic Monitoring Program (CAMP) nutrient analyses: establishing variability between filtered and unfiltered water samples and two analytical laboratories

    National Research Council Canada - National Science Library

    Thériault, M.-H; Courtenay, S.C

    2012-01-01

    As part of the Community Aquatic Monitoring Program (CAMP) unfiltered water samples were collected between 2006 and 2008 and analyzed for dissolved inorganic nutrients (i.e., nitrate + nitrite (NO3 + NO2...

  1. Fitting a Bivariate Measurement Error Model for Episodically Consumed Dietary Components

    KAUST Repository

    Zhang, Saijuan

    2011-01-06

    There has been great public health interest in estimating usual, i.e., long-term average, intake of episodically consumed dietary components that are not consumed daily by everyone, e.g., fish, red meat and whole grains. Short-term measurements of episodically consumed dietary components have zero-inflated skewed distributions. So-called two-part models have been developed for such data in order to correct for measurement error due to within-person variation and to estimate the distribution of usual intake of the dietary component in the univariate case. However, there is arguably much greater public health interest in the usual intake of an episodically consumed dietary component adjusted for energy (caloric) intake, e.g., ounces of whole grains per 1000 kilo-calories, which reflects usual dietary composition and adjusts for different total amounts of caloric intake. Because of this public health interest, it is important to have models to fit such data, and it is important that the model-fitting methods can be applied to all episodically consumed dietary components. We have recently developed a nonlinear mixed effects model (Kipnis et al., 2010), and have fit it by maximum likelihood using nonlinear mixed effects programs and methodology (the SAS NLMIXED procedure). Maximum likelihood fitting of such a nonlinear mixed model is generally slow because of 3-dimensional adaptive Gaussian quadrature, and there are times when the programs either fail to converge or converge to models with a singular covariance matrix. For these reasons, we develop a Markov chain Monte Carlo (MCMC) computation for fitting this model, which allows for both frequentist and Bayesian inference. There are technical challenges to developing this solution because one of the covariance matrices in the model is patterned. Our main application is to the National Institutes of Health (NIH)-AARP Diet and Health Study, where we illustrate our methods for modeling the energy-adjusted usual intake of fish and whole
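
    To make the two-part structure concrete, here is a small Python simulation (not the authors' model or data) of an episodically consumed component: a person-level probability of consuming on a given day, a lognormal amount when consumption occurs, and the resulting contrast between short-term averages and usual intake. All numeric settings are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_persons, n_days = 1000, 4

    # person-level latent effects (assumed variances, for illustration only)
    u_prob   = rng.normal(0.0, 0.8, n_persons)   # propensity to consume at all
    u_amount = rng.normal(0.0, 0.5, n_persons)   # typical amount when consuming

    def expit(z):
        return 1.0 / (1.0 + np.exp(-z))

    p_consume = expit(-0.5 + u_prob)             # part 1: any consumption today?
    mean_log_amount = 3.0 + u_amount             # part 2: lognormal amount if consumed

    consumed = rng.random((n_days, n_persons)) < p_consume
    amounts = np.where(consumed,
                       rng.lognormal(mean_log_amount, 0.6, (n_days, n_persons)),
                       0.0)                      # zero-inflated short-term data

    # usual intake = long-run average = P(consume) * E[amount | consumed]
    usual = p_consume * np.exp(mean_log_amount + 0.6**2 / 2)

    four_day_avg = amounts.mean(axis=0)
    print(np.percentile(four_day_avg, [10, 50, 90]))   # dispersed, zero-inflated
    print(np.percentile(usual, [10, 50, 90]))          # much tighter distribution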

  2. Choice of dialysis treatment and type of medical unit (private vs public): application of a recursive bivariate probit.

    Science.gov (United States)

    Gitto, Lara; Santoro, Domenico; Sobbrio, Giuseppe

    2006-11-01

    ESRD patients face two choices: the first concerns the dialysis modality; the second concerns the type of dialysis unit (public vs private) in which to undertake the treatment. Such choices depend on unobservable factors, among which there might be patients' clinical factors as well as factors related to the characteristics of each unit. We employ a recursive bivariate probit estimation on a sample of Sicilian ESRD patients in order to evaluate the impact of these factors. The results can have important implications for the organization of dialysis services in Sicily, where the number of private centres is higher than in other Italian regions. Copyright (c) 2006 John Wiley & Sons, Ltd.

  3. Local linear estimation of concordance probability with application to covariate effects models on association for bivariate failure-time data.

    Science.gov (United States)

    Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing

    2015-01-01

    Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.
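
    For readers unfamiliar with the Clayton copula, the one-to-one correspondence mentioned above can be made explicit with a standard identity from copula theory (stated here in LaTeX; it is general background, not a result specific to this article). For copula parameter θ > 0, Kendall's τ and the concordance probability are

        \tau = \frac{\theta}{\theta + 2}, \qquad
        P(\text{concordance}) = \frac{1 + \tau}{2} = \frac{\theta + 1}{\theta + 2}.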

  4. VH Replacement Footprint Analyzer-I, a Java-Based Computer Program for Analyses of Immunoglobulin Heavy Chain Genes and Potential VH Replacement Products in Human and Mouse.

    Science.gov (United States)

    Huang, Lin; Lange, Miles D; Zhang, Zhixin

    2014-01-01

    VH replacement occurs through RAG-mediated secondary recombination between a rearranged VH gene and an upstream unrearranged VH gene. Due to the location of the cryptic recombination signal sequence (cRSS, TACTGTG) at the 3' end of the VH gene coding region, a short stretch of nucleotides from the previously rearranged VH gene can be retained in the newly formed VH-DH junction as a "footprint" of VH replacement. Such footprints can be used as markers to identify Ig heavy chain (IgH) genes potentially generated through VH replacement. To explore the contribution of VH replacement products to the antibody repertoire, we developed a Java-based computer program, VH replacement footprint analyzer-I (VHRFA-I), to analyze published or newly obtained IgH genes from human or mouse. The VHRFA-I program has multiple functional modules: it first uses the service provided by the IMGT/V-QUEST program to assign potential VH, DH, and JH germline genes; then, it searches for VH replacement footprint motifs within the VH-DH junction (N1) regions of IgH gene sequences to identify potential VH replacement products; it can also analyze the frequencies of VH replacement products in correlation with publications, keywords, or VH, DH, and JH gene usages, and mutation status; and it can further analyze the amino acid usages encoded by the identified VH replacement footprints. In summary, this program provides a useful computational tool for exploring the biological significance of VH replacement products in human and mouse.
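
    The footprint search itself is, at its core, a motif scan. The Python sketch below shows the idea on a made-up junction sequence; the motif list beyond the cRSS heptamer is hypothetical, and the actual VHRFA-I program adds germline-gene assignment via IMGT/V-QUEST on top of this.

    CRSS = "TACTGTG"
    FOOTPRINT_MOTIFS = ["TACTGTG", "ACTGTG", "CTGTG"]   # hypothetical motif set

    def find_footprints(junction: str, motifs=FOOTPRINT_MOTIFS):
        """Return (motif, position) pairs for every motif occurrence."""
        hits = []
        seq = junction.upper()
        for motif in motifs:
            start = seq.find(motif)
            while start != -1:
                hits.append((motif, start))
                start = seq.find(motif, start + 1)
        return hits

    junction = "GGGTACTGTGCCAT"        # hypothetical N1-region sequence
    print(find_footprints(junction))   # [('TACTGTG', 3), ('ACTGTG', 4), ('CTGTG', 5)]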

  5. Optimizing the analysis strategy for the CANVAS Program : A prespecified plan for the integrated analyses of the CANVAS and CANVAS-R trials

    NARCIS (Netherlands)

    Neal, Bruce; Perkovic, Vlado; Mahaffey, Kenneth W.; Fulcher, Greg; Erondu, Ngozi; Desai, Mehul; Shaw, Wayne; Law, Gordon; Walton, Marc K.; Rosenthal, Norm; de Zeeuw, Dick; Matthews, David R.

    Two large cardiovascular outcome trials of canagliflozin, comprising the CANVAS Program, will complete in early 2017: the CANagliflozin cardioVascular Assessment Study (CANVAS) and the CANagliflozin cardioVascular Assessment Study-Renal (CANVAS-R). Accruing data for the sodium glucose co-transporter

  6. (In)security factor atomic bomb. An analysis of the crisis with the Iranian nuclear program; (Un-)Sicherheitsfaktor Atombombe. Eine Analyse der Krise um das iranische Nuklearprogramm

    Energy Technology Data Exchange (ETDEWEB)

    Bock, Andreas [Augsburg Univ. (Germany). Lehrstuhl fuer Friedens- und Konfliktforschung

    2012-04-15

    Iran is a rational actor in international politics that decides on the basis of its perception of threat. Iran's security situation is comparable to that of Israel, with the rational consequence of relying on the nuclear program for deterrence and self-defense. The solution of the Iran crisis therefore depends essentially on a change in the perception of threat. A military strike against the Iranian nuclear facilities would be counterproductive: it would only slow the program down, not prevent further activities. In fact, a military strike would heighten the perception of threat. For the analysis of the Iran crisis the author used the Cuban missile crisis as a blueprint, where misguided perceptions were responsible for the escalation.

  7. Employment program for patients with severe mental illness in Malaysia: a 3-month outcome.

    Science.gov (United States)

    Wan Kasim, Syarifah Hafizah; Midin, Marhani; Abu Bakar, Abdul Kadir; Sidi, Hatta; Nik Jaafar, Nik Ruzyanei; Das, Srijit

    2014-01-01

    This study aimed to examine the rate and predictive factors of successful employment at 3 months upon enrolment into an employment program among patients with severe mental illness (SMI). A cross-sectional study using a universal sampling technique was conducted on patients with SMI who completed a 3-month period of being employed at Hospital Permai, Malaysia. A total of 147 patients were approached and 126 were finally included in the statistical analyses. Successful employment was defined as the ability to work 40 or more hours per month. Factors significantly associated with successful employment in bivariate analyses were entered into a multiple logistic regression analysis to identify predictors of successful employment. The rate of successful employment at 3 months was 68.3% (n=81). Significant factors associated with successful employment in bivariate analyses were having a past history of working, good family support, a lower number of psychiatric admissions, good compliance with medication, good interest in work, living in a hostel, being motivated to work, being satisfied with the job or salary, getting a preferred job, being in competitive or supported employment, and having higher than median PANSS scores on the positive, negative and general psychopathology scales. The significant predictors of employment from the logistic regression model were having a good past history of working and getting a preferred job. The study thus found a relatively high rate of successful employment among patients with SMI, with good past history of working and getting a preferred job as significant predictors. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Analyse d'une formation plurilingue à distance : actions et interactions Analysis of a plurilingual e-learning program: action and interaction

    Directory of Open Access Journals (Sweden)

    Monica Masperi

    2006-04-01

    Full Text Available This paper analyses three training sessions in cross-comprehension between Romance languages, which followed a similar scenario and took place on an Internet platform specifically developed for this purpose: the Galanet platform. Galanet sessions bring together a large number of students (generally between one and two hundred), spread across several universities and engaged in a joint project: a quadrilingual "press dossier" published on the web. The study is developed along two complementary axes: a quantitative analysis of the messages posted in the platform's forum space for each session, and a qualitative analysis of a sample of contributions generated in one of these sessions. The quantitative, comparative approach should allow a number of lessons to be drawn about the overall course of the training and about the actions of its participants (students and tutors); the qualitative analysis, in turn, helps to better characterize the discursive practices elicited by the pedagogical scenario and the plurilingual dimension of the exchanges.

  9. Chromosome analyses in dogs.

    Science.gov (United States)

    Reimann-Berg, N; Bullerdiek, J; Murua Escobar, H; Nolte, I

    2012-01-01

    Cytogenetics is the study of normal and abnormal chromosomes. Every species is characterized by a given number of chromosomes that can be recognized by their specific shape. The chromosomes are arranged according to standard classification schemes for the respective species. While pre- and postnatal chromosome analyses investigate the constitutional karyotype, tumor cytogenetics is focused on the detection of clonal acquired, tumor-associated chromosome aberrations. Cytogenetic investigations in dogs are of great value especially for breeders dealing with fertility problems within their pedigrees, for veterinarians and last but not least for the dog owners. Dogs and humans share a variety of genetic diseases, including cancer. Thus, the dog has become an increasingly important model for genetic diseases. However, cytogenetic analyses of canine cells are complicated by the complex karyotype of the dog. Only just 15 years ago, a standard classification scheme for the complete canine karyotype was established. For chromosome analyses of canine cells the same steps of chromosome preparation are used as in human cytogenetics. There are few reports about cytogenetic changes in non-neoplastic cells, involving predominantly the sex chromosomes. Cytogenetic analyses of different entities of canine tumors revealed that, comparable to human tumors, tumors of the dog are often characterized by clonal chromosome aberrations, which might be used as diagnostic and prognostic markers. The integration of modern techniques (molecular genetic approaches, adaptive computer programs) will facilitate and complete conventional cytogenetic studies. However, conventional cytogenetics is still non-replaceable.

  10. MicroRNAs level as an initial screening method for early-stage lung cancer: a bivariate diagnostic random-effects meta-analysis

    Science.gov (United States)

    He, Wen-Jie; Li, Wen-Hui; Jiang, Bo; Wang, Yu-Feng; Xia, Yao-Xiong; Wang, Li

    2015-01-01

    Accumulating studies have suggested that microRNAs (miRNAs) can have high diagnostic value as a non-invasive and cost-effective procedure with high sensitivity and specificity in the detection of early-stage lung cancer. However, inconsistency is observed in the results of the relevant studies. Therefore, we performed this meta-analysis to evaluate the diagnostic value of miRNAs based on all related studies. A total of 38 studies from 13 included articles were used for the analysis, consisting of 510 patients and 465 healthy controls. All analyses were performed with the R 3.2.0 software. The bivariate random-effects meta-analysis model was applied to obtain the following pooled parameters: sensitivity, 0.797 (95% CI: 0.756-0.832); false positive rate, 0.296 (95% CI: 0.250-0.346); and AUC, 0.818. In addition, subgroup analyses were conducted, showing not only that a combination of multiple miRNAs as biomarkers (sensitivity, false positive rate and AUC of 83%, 25.2% and 0.858, respectively) had higher diagnostic accuracy than a single miRNA (78.3%, 31.6% and 0.799, respectively), but also that specimens from the circulating system (82.5%, 30.5% and 0.836, respectively) provide better biomarkers than specimens from the non-circulating system (73.8%, 26.5% and 0.796, respectively). In summary, the current meta-analysis suggests that miRNAs as biomarkers, particularly a combination of multiple tumor-specific miRNAs from the circulating system, have moderately high clinical diagnostic value in the detection of early-stage lung cancer. However, the clinical diagnostic utilization and additional improvements of miRNAs as biomarkers for early-stage lung cancer detection remain to be validated by future studies. PMID:26550141

  11. On the sources of the height-intelligence correlation: new insights from a bivariate ACE model with assortative mating.

    Science.gov (United States)

    Beauchamp, Jonathan P; Cesarini, David; Johannesson, Magnus; Lindqvist, Erik; Apicella, Coren

    2011-03-01

    A robust positive correlation between height and intelligence, as measured by IQ tests, has been established in the literature. This paper makes several contributions toward establishing the causes of this association. First, we extend the standard bivariate ACE model to account for assortative mating. The more general theoretical framework provides several key insights, including formulas to decompose a cross-trait genetic correlation into components attributable to assortative mating and pleiotropy and to decompose a cross-trait within-family correlation. Second, we use a large dataset of male twins drawn from Swedish conscription records and examine how well genetic and environmental factors explain the association between (i) height and intelligence and (ii) height and military aptitude, a professional psychologist's assessment of a conscript's ability to deal with wartime stress. For both traits, we find suggestive evidence of a shared genetic architecture with height, but we demonstrate that point estimates are very sensitive to assumed degrees of assortative mating. Third, we report a significant within-family correlation between height and intelligence (ρ̂ = 0.10), suggesting that pleiotropy might be at play.

  12. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    Science.gov (United States)

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
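
    The weight-given-height question quoted above reduces to a conditional normal calculation. A minimal Python example, with assumed parameter values (not estimates from the authors' dataset):

    from scipy.stats import norm

    mu_h, mu_w = 65.0, 130.0          # mean height (in) and weight (lb), assumed
    sd_h, sd_w, rho = 3.0, 15.0, 0.5  # SDs and correlation, assumed

    h = mu_h                          # condition on average height
    cond_mean = mu_w + rho * sd_w / sd_h * (h - mu_h)   # E[W | H = h]
    cond_sd = sd_w * (1 - rho**2) ** 0.5                # SD[W | H = h]

    p = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
    print(f"P(120 <= W <= 140 | H = {h}) = {p:.3f}")    # about 0.56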

  13. Bivariate ordered-response probit model of driver's and passenger's injury severities in collisions with fixed objects.

    Science.gov (United States)

    Yamamoto, Toshiyuki; Shankar, Venkataraman N

    2004-09-01

    A bivariate ordered-response probit model of the injury severity (IS) of the driver and of the most severely injured passenger in collisions with fixed objects is developed in this study. The exact passenger IS is not necessarily observed, especially when only the most severe injury of the accident and the driver's injury are recorded in the police reports. To accommodate passenger IS as well, we explicitly develop a partial observability model of passenger IS in multi-occupant vehicles (MOV). The model has consistent coefficients for driver IS between single-occupant vehicle (SOV) and multi-occupant vehicle accidents, and provides more efficient coefficient estimates by taking into account the common unobserved factors between driver and passenger IS. The results of the empirical analysis, using 4-year statewide accident data from Washington State, reveal the effects of driver characteristics, vehicle attributes, types of objects, and environmental conditions on both driver and passenger IS, and show that their IS have different elasticities to some of the risk factors. Copyright 2003 Elsevier Ltd.

  14. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells, and subsequent staining with the fluorochromes 33258 Hoechst or DAPI, allowed bivariate flow karyotyping of isolated chromosomes.

  15. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone...

  16. Aquaculture in artificially developed wetlands in urban areas: an application of the bivariate relationship between soil and surface water in landscape ecology.

    Science.gov (United States)

    Paul, Abhijit

    2011-01-01

    Wetlands show a strong bivariate relationship between soil and surface water. Artificially developed wetlands help to build landscape ecology and make built environments sustainable. The bheries, wetlands of eastern Calcutta (India), utilize the city sewage to develop urban aquaculture that supports the local fish industries and opens a new frontier in sustainable environmental planning research.

  17. Introduction of Transplant Registry Unified Management Program 2 (TRUMP2): scripts for TRUMP data analyses, part I (variables other than HLA-related data).

    Science.gov (United States)

    Atsuta, Yoshiko

    2016-01-01

    Collection and analysis of information on diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and enable more efficient data management. The TRUMP2 system will also expand the possible uses of the data, as it is capable of building a more complex relational database. Constructing a system that gives researchers ready access to the data would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of the processes for data manipulation and analysis will also affect study outcomes. Shared scripts that define variables according to standard definitions have been introduced for quality control and to improve the efficiency of registry studies using TRUMP data.

  18. Modeling mood variation and covariation among adolescent smokers: application of a bivariate location-scale mixed-effects model.

    Science.gov (United States)

    Pugach, Oksana; Hedeker, Donald; Richmond, Melanie J; Sokolovsky, Alexander; Mermelstein, Robin

    2014-05-01

    Ecological momentary assessments (EMAs) are useful for understanding both between- and within-subject dynamic changes in smoking and mood. Modeling 2 moods (positive affect [PA] and negative affect [NA]) simultaneously will better enable researchers to explore the association between mood variables and what influences them at both the momentary and subject level. The EMA component of a natural history study of adolescent smoking was analyzed with a bivariate location-scale mixed-effects model. The proposed model separately estimates the between- and within-subject variances and jointly models the 2 mood constructs. A total of 461 adolescents completed the baseline EMA wave, which resulted in 14,105 random prompts. Smoking level, represented by the number of smoking events on EMA, entered the model as 2 predictors: one that compared nonsmokers during the EMA week to 1-cigarette smokers, and a second that estimated the effect of smoking level on mood among smokers. Results suggest that nonsmokers had more consistent positive and negative moods compared to 1-cigarette smokers. Among those who smoked, both moods were more consistent at higher smoking levels. The effects of smoking level were greater for NA than for PA. The within-subject association between mood constructs was negative and strongest among 1-cigarette smokers; the within-subject association between positive and negative moods was negatively associated with smoking. Mood variation and the association between mood constructs varied across smoking levels. The most infrequent smokers were characterized by more inconsistent moods, whereas mood was more consistent for subjects with higher smoking levels.

  19. Rotational distortion in conventional allometric analyses.

    Science.gov (United States)

    Packard, Gary C

    2011-08-01

    Three data sets from the recent literature were submitted to new analyses to illustrate the rotational distortion that commonly accompanies traditional allometric analyses and that often causes allometric equations to be inaccurate and misleading. The first investigation focused on the scaling of evaporative water loss to body mass in passerine birds; the second was concerned with the influence of body size on field metabolic rates of rodents; and the third addressed interspecific variation in kidney mass among primates. Straight lines were fitted to logarithmic transformations by Ordinary Least Squares and Generalized Linear Models, and the resulting equations then were re-expressed as two-parameter power functions in the original arithmetic scales. The re-expressed models were displayed on bivariate graphs together with tracings for equations fitted directly to untransformed data by nonlinear regression. In all instances, models estimated by back-transformation failed to describe major features of the arithmetic distribution whereas equations fitted by nonlinear regression performed quite well. The poor performance of equations based on models fitted to logarithms can be traced to the increased weight and leverage exerted in those analyses by observations for small species and to the decreased weight and leverage exerted by large ones. The problem of rotational distortion can be avoided by performing exploratory analysis on untransformed values and by validating fitted models in the scale of measurement. Copyright © 2011 Elsevier Inc. All rights reserved.
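
    A compact Python sketch of the comparison described above, on simulated data with additive arithmetic-scale error (the setting in which back-transformed log-OLS is most biased); the true exponent and all noise settings are assumptions for illustration:

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(42)
    x = np.linspace(1, 100, 80)
    y = 2.0 * x**0.75 + rng.normal(0, 4, x.size)   # true a=2, b=0.75
    y = np.clip(y, 0.1, None)                      # keep logarithms defined

    # (1) traditional approach: OLS on log-log data, then back-transform
    b_log, loga = np.polyfit(np.log(x), np.log(y), 1)
    a_log = np.exp(loga)

    # (2) nonlinear regression y = a * x**b fitted in the measurement scale
    (a_nl, b_nl), _ = curve_fit(lambda x, a, b: a * x**b, x, y, p0=(1.0, 1.0))

    print(f"log-OLS back-transformed: a={a_log:.2f}, b={b_log:.2f}")
    print(f"nonlinear regression:     a={a_nl:.2f}, b={b_nl:.2f}")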

  20. A Hybrid ANN-GA Model to Prediction of Bivariate Binary Responses: Application to Joint Prediction of Occurrence of Heart Block and Death in Patients with Myocardial Infarction.

    Science.gov (United States)

    Mirian, Negin-Sadat; Sedehi, Morteza; Kheiri, Soleiman; Ahmadi, Ali

    2016-01-01

    In medical studies, when a joint prediction about the occurrence of two events is needed, a statistical bivariate model is used. Due to the limitations of the usual statistical models, other methods such as Artificial Neural Networks (ANN) and hybrid models can be used. In this paper, we propose a hybrid Artificial Neural Network-Genetic Algorithm (ANN-GA) model to predict the occurrence of heart block and death in myocardial infarction (MI) patients simultaneously. For fitting and comparing the models, 263 new patients with a definite diagnosis of MI hospitalized in the Cardiology Ward of Hajar Hospital, Shahrekord, Iran, from March 2014 to March 2016 were enrolled. Occurrence of heart block and death were employed as bivariate binary outcomes. Bivariate Logistic Regression (BLR), ANN and hybrid ANN-GA models were fitted to the data. Prediction accuracy was used to compare the models. The codes were written in Matlab 2013a and the Zelig package in R 3.2.2. The prediction accuracy of the BLR, ANN and hybrid ANN-GA models was 77.7%, 83.69% and 93.85% for the training data and 78.48%, 84.81% and 96.2% for the test data, respectively. In both the training and test data sets, the hybrid ANN-GA model had better accuracy. The ANN model can be a suitable alternative for modeling and predicting bivariate binary responses when the presuppositions of statistical models are not met in actual data. In addition, using optimization methods, such as the hybrid ANN-GA model, can improve the precision of the ANN model.

  1. Análise comparativa da aplicação do programa Seis Sigma em processos de manufatura e serviços Comparative analyses of Six-Sigma program application in manufacturing and services process

    Directory of Open Access Journals (Sweden)

    Luis Ricardo Galvani

    2013-01-01

    Full Text Available The Six Sigma program was born and evolved in the manufacturing environment, but it can also be used in service processes. Its use in that setting, however, has been more modest, with fewer participating companies and, consequently, fewer published cases and reports. This study compares and analyzes the application of the Six Sigma program in manufacturing and in services, by means of a literature review, an analysis of projects experienced by the author, and field research in companies that practice the program. Despite the limited sample, the results show strong indications of significant differences; a better understanding of these differences can help achieve better results in service applications.

  2. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping in the targeting of schistosomiasis control. However, not all schools return questionnaires and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modeled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to the nearest river and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10%) prevalence schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but is unable to distinguish between moderate and high prevalence schools.

  3. Task 1 Report - Assessment of Data Availability to Inform Energy Planning Analyses: Energy Alternatives Study for the Lao People's Democratic Republic: Smart Infrastructure for the Mekong Program

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Nathan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Katz, Jessica R. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cardoso de Oliveira, Ricardo P. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hayter, Sheila J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-24

    In an effort to address concerns such as energy security, reliability, affordability, and other objectives, the Government of the Lao People's Democratic Republic (Lao PDR) is seeking to advance its expertise and experience in energy system analysis and planning to explore energy alternatives. Assessing the potential and alternatives for deploying energy technology options is often an early step - and, in most cases, an ongoing process - in planning for the development of the energy sector as a whole. Reliable and robust data are crucial to conducting these types of planning-related analyses in a transparent manner that builds confidence among power sector stakeholders and encourages investment in future energy project development and infrastructure opportunities. This report represents the first output of the Energy Alternatives Study for the Lao PDR (Energy Alternatives Study), a collaboration between Ministry of Energy and Mines and the United States Agency for International Development (USAID) under the auspices of the Smart Infrastructure for the Mekong (SIM) program. The Energy Alternatives Study includes five tasks that build upon each other to meet the goal of the project. The report summarizes the availability, quality, and accessibility of data that serve as key inputs to energy planning activities for the power sector. The purpose of this data assessment is two-fold: 1. To facilitate the informed use of existing data by highlighting applications for these data as they relate to priority energy planning analyses; and 2. To inform future investments in energy data collection and management by identifying significant data gaps and providing guidance on how to fill these gaps.

  4. Bayesian hierarchical models for cost-effectiveness analyses that use data from cluster randomized trials.

    Science.gov (United States)

    Grieve, Richard; Nixon, Richard; Thompson, Simon G

    2010-01-01

    Cost-effectiveness analyses (CEA) may be undertaken alongside cluster randomized trials (CRTs) where randomization is at the level of the cluster (for example, the hospital or primary care provider) rather than the individual. Costs (and outcomes) within clusters may be correlated so that the assumption made by standard bivariate regression models, that observations are independent, is incorrect. This study develops a flexible modeling framework to acknowledge the clustering in CEA that use CRTs. The authors extend previous Bayesian bivariate models for CEA of multicenter trials to recognize the specific form of clustering in CRTs. They develop new Bayesian hierarchical models (BHMs) that allow mean costs and outcomes, and also variances, to differ across clusters. They illustrate how each model can be applied using data from a large (1732 cases, 70 primary care providers) CRT evaluating alternative interventions for reducing postnatal depression. The analyses compare cost-effectiveness estimates from BHMs with standard bivariate regression models that ignore the data hierarchy. The BHMs show high levels of cost heterogeneity across clusters (intracluster correlation coefficient, 0.17). Compared with standard regression models, the BHMs yield substantially increased uncertainty surrounding the cost-effectiveness estimates, and altered point estimates. The authors conclude that ignoring clustering can lead to incorrect inferences. The BHMs that they present offer a flexible modeling framework that can be applied more generally to CEA that use CRTs.
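
    A minimal sketch of the core idea, a hierarchical model in which cluster-level mean costs are partially pooled, is given below using PyMC (assuming PyMC >= 4 is installed). It models costs only and omits outcomes, covariances and the cost-effectiveness step, so it is a starting point rather than the authors' full BHM; all priors and simulated values are assumptions.

    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(1)
    J, n = 20, 30                                  # clusters, patients per cluster
    cluster_means = rng.normal(1000, 200, J)       # simulated true mean costs
    costs = np.concatenate([rng.normal(m, 300, n) for m in cluster_means])
    cluster = np.repeat(np.arange(J), n)

    with pm.Model() as bhm:
        mu0 = pm.Normal("mu0", 1000, 500)          # overall mean cost
        tau = pm.HalfNormal("tau", 500)            # between-cluster SD
        mu = pm.Normal("mu", mu0, tau, shape=J)    # partially pooled cluster means
        sigma = pm.HalfNormal("sigma", 500)        # within-cluster SD
        pm.Normal("y", mu[cluster], sigma, observed=costs)
        idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)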

  5. A Hybrid Forecasting Model Based on Bivariate Division and a Backpropagation Artificial Neural Network Optimized by Chaos Particle Swarm Optimization for Day-Ahead Electricity Price

    Directory of Open Access Journals (Sweden)

    Zhilong Wang

    2014-01-01

    Full Text Available In the electricity market, the electricity price plays a pivotal role. Nevertheless, accurate price forecasting, a vital factor affecting both government regulatory agencies and public power companies, remains a huge challenge and a critical problem. Determining how to address the accurate forecasting problem becomes an even more significant task in an era in which electricity is increasingly important. Based on chaos particle swarm optimization (CPSO), the backpropagation artificial neural network (BPANN), and the idea of bivariate division, this paper proposes a bivariate division BPANN (BD-BPANN) method and a CPSO-BD-BPANN method for forecasting the electricity price. The former transforms electricity demand and price into a new variable, named DV, computed by division; the day-ahead price is then forecast by multiplying the forecasted values of the DV by the forecasted values of the demand. Next, to improve the accuracy of BD-BPANN, chaos particle swarm optimization and BD-BPANN are synthesized into a novel model, CPSO-BD-BPANN, in which CPSO is utilized to optimize the initial parameters of BD-BPANN, making its output more stable than that of the original model. Finally, two forecasting strategies are proposed for different situations.
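
    A toy numpy illustration of the division idea, assuming DV = price / demand (the abstract does not spell the ratio out) and using naive persistence in place of the BD-BPANN forecasts:

    import numpy as np

    rng = np.random.default_rng(7)
    hours = 96
    demand = 500 + 100 * np.sin(np.arange(hours) * 2 * np.pi / 24) \
             + rng.normal(0, 10, hours)            # synthetic hourly demand
    price = 0.08 * demand + rng.normal(0, 2, hours)

    dv = price / demand                            # the transformed variable

    # stand-in "forecasts": repeat the last observed day (persistence)
    dv_hat = dv[-24:]
    demand_hat = demand[-24:]
    price_hat = dv_hat * demand_hat                # recombine: price = DV * demand
    print(price_hat[:4])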

  6. A feeding education program to prevent mother-to-child transmission of HIV in Haiti.

    Science.gov (United States)

    Deschamps, Marie-Marcelle; Dévieux, Jessy G; Théodore, Harry; Saint-Jean, Gilbert; Antillus, Lynda; Cadot, Ionie; Pape, Jean William; Malow, Robert M

    2009-03-01

    In Haiti, as in most of the developing world, vertical transmission of HIV from infected mother to infant through postpartum breastfeeding remains a significant mode of transmission. As part of their prevention of mother-to-child transmission program, the Groupe Haitien d'Etude du Sarcome de Kaposi et des Infections Opportunistes (GHESKIO) Centers developed a feeding education program in which over 83% of the eligible HIV-positive pregnant women enrolled. Bivariate and adjusted multivariate logistic regression analyses were used to compare the feeding choices of the 290 women who participated in the feeding education program to those of the 58 who did not. Of those who participated, 91.7% chose to use replacement formulas for their newborns, while 75.9% of those who did not participate chose replacement feeding. After adjustment for socio-demographic variables, the analyses revealed that the no-education group was less likely to adopt replacement feeding and more likely to use mixed feeding (OR=0.31, p=0.004; and OR=2.74, p=0.05, respectively). This suggests that a targeted and culturally appropriate education program can be effective in encouraging replacement feeding, even in countries where breastfeeding is the norm.

  7. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependencies and non-normality in residual terms. The models are estimated using 2013 Virginia police reported two-vehicle head-on collision data, where exactly one driver is at-fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects injury severity of at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on at-fault injury outcome. Contrarily, and importantly, the effect of at-fault vehicle speed on injury severity of not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on injury outcome of not-at-fault driver. Compared to traditional ordered probability

  8. Sproglig Metode og Analyse

    DEFF Research Database (Denmark)

    le Fevre Jakobsen, Bjarne

    The publication contains exercise materials, texts, PowerPoint presentations and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) in the BA programme and as an elective in Danish/Nordic Studies, 2010-2011.

  9. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis

    Science.gov (United States)

    Bain, Robert E.S.; Cronk, Ryan; Wright, Jim A.; Bartram, Jamie

    2015-01-01

    Background Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. Objectives We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. Methods We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Results Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source. Source water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Conclusions Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water. Citation Shields KF, Bain RE, Cronk R, Wright JA, Bartram J. 2015. Association of supply type with fecal contamination of source water and household stored drinking water in developing countries: a bivariate meta-analysis. Environ Health Perspect 123:1222–1231; http://dx.doi.org/10.1289/ehp.1409002 PMID:25956006

  10. Meal programs improve nutritional risk: a longitudinal analysis of community-living seniors.

    Science.gov (United States)

    Keller, Heather H

    2006-07-01

    To determine the independent association of meal programs (eg, Meals On Wheels and other meal programs with a social component) and shopping help with seniors' nutritional risk. Cohort design. Baseline data were collected with an in-person interview and subjects were followed up for 18 months via telephone interview. Cognitively well, vulnerable (ie, required informal or formal supports for activities of daily living) seniors were recruited through community service agencies in southwestern Ontario, Canada. Three hundred sixty-seven seniors participated in baseline interviews and 263 completed data collection at 18-month follow-up; 70% participated in meal programs at baseline. The 15-item Seniors in the Community: Risk Evaluation for Eating and Nutrition (SCREEN) questionnaire identified nutritional risk at 18 months. Descriptive and bivariate analyses were performed, and significant associations (P < .05) were carried forward into multivariable models; meal program participation was independently associated with reduced nutritional risk for vulnerable seniors. Increased use of these programs over time may indicate a senior's declining status. Seniors who are in need of informal or formal supports for food shopping or preparation should be encouraged to participate in meal programs as a means of maintaining or improving their nutrition.

  11. Supporting analyses and assessments

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1995-09-01

    Supporting analysis and assessments can provide a sound analytic foundation and focus for program planning, evaluation, and coordination, particularly if issues of hydrogen production, distribution, storage, safety, and infrastructure can be analyzed in a comprehensive and systematic manner. The overall purpose of this activity is to coordinate all key analytic tasks-such as technology and market status, opportunities, and trends; environmental costs and benefits; and regulatory constraints and opportunities-within a long-term and systematic analytic foundation for program planning and evaluation. Within this context, the purpose of the project is to help develop and evaluate programmatic pathway options that incorporate near and mid-term strategies to achieve the long-term goals of the Hydrogen Program. In FY 95, NREL will develop a comprehensive effort with industry, state and local agencies, and other federal agencies to identify and evaluate programmatic pathway options to achieve the long-term goals of the Program. Activity to date is reported.

  12. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture

  13. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, Maria A.; Luyten, Johannes W.; Scheerens, Jaap; Sleegers, P.J.C.; Scheerens, J

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  14. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach...

  15. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  16. Report sensory analyses veal

    NARCIS (Netherlands)

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,

  17. Filmstil - teori og analyse

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    Film style decisively shapes our experience of a film. Yet when we talk about films, style, the way the moving images organize the narrative, receives rather less attention than the film's plot. Filmstil - teori og analyse is a richly exemplified presentation, critique and further development of...

  18. Testing bivariate independence and normality

    NARCIS (Netherlands)

    Kallenberg, W.C.M.; Ledwina, Teresa; Rafajlowicz, Ewaryst

    1997-01-01

    In many statistical studies the relationship between two random variables X and Y is investigated and in particular the question whether X and Y are independent and normally distributed is of interest. Smooth tests may be used for testing this. They consist of several components, the first measuring

  19. Dropped object protection analyses

    OpenAIRE

    Nilsen, Ingve

    2014-01-01

    Master's thesis in Offshore Structural Engineering. Impact from dropped objects is a typical accidental action (NORSOK N-004, 2013). Hence, the DOP structure is to be analyzed in an accidental limit state (ALS) design practice, which means that a non-linear finite element analysis can be applied. The analysis will be based on a typical DOP structure. Several FEM analyses are performed for the DOP structure. Different shapes, sizes and weights and various impact positions are used for si...

  20. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  1. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  2. Measuring dispersed spot of positioning CMOS camera from star image quantitative interpretation based on a bivariate-error least squares curve fitting algorithm

    Science.gov (United States)

    Bu, Fan; Qiu, Yuehong; Yao, Dalei; Yan, Xingtao

    2017-02-01

    For a positioning CMOS camera, we put forward a system that can quantitatively measure the dispersed spot parameters and the degree of energy concentration of a given optical system. Based on this method, the detection capability of the positioning CMOS camera can be verified. The measurement setup comprises several key instruments: a 550 mm collimator, a 0.2 mm star point, a turntable and the positioning CMOS camera. Firstly, the definition of the dispersed spot parameters is introduced. Then, the steps for measuring them are listed. The energy center of the dispersed spot is calculated using a centroid algorithm, and a bivariate-error least squares Gaussian fitting method is then presented to fit the dispersed spot energy distribution curve. Finally, the connected region shaped by the energy contour of the defocused spots is analyzed. Both the diameter enclosing 80% of the total energy of the defocused spots and the energy percentage falling within the central 3×3 pixel area are calculated. The experimental results show that 80% of the total energy of the defocused spots is concentrated within an inner circle of 15 μm diameter, and that the percentage within the central 3×3 pixel area reaches 80% or higher. The method therefore meets the imaging quality control needs of optical systems in positioning CMOS cameras.
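
    The two spot metrics described above, the energy centroid and the 80% encircled-energy diameter, can be sketched compactly. This is an illustrative reconstruction rather than the authors' code; the pixel pitch and the synthetic Gaussian spot are assumptions.

    ```python
    import numpy as np

    def spot_centroid(spot):
        """Energy-weighted centroid of a background-subtracted spot image."""
        ys, xs = np.indices(spot.shape)
        total = spot.sum()
        return (xs * spot).sum() / total, (ys * spot).sum() / total

    def encircled_energy_diameter(spot, fraction=0.8, pixel_pitch_um=5.5):
        """Diameter (micrometres) of the centroid-centred circle containing
        `fraction` of the total spot energy. Pixel pitch is an assumed value."""
        cx, cy = spot_centroid(spot)
        ys, xs = np.indices(spot.shape)
        r = np.hypot(xs - cx, ys - cy).ravel()
        order = np.argsort(r)
        cum = np.cumsum(spot.ravel()[order]) / spot.sum()
        return 2.0 * r[order][np.searchsorted(cum, fraction)] * pixel_pitch_um

    # Synthetic Gaussian spot standing in for a star-point image
    y, x = np.mgrid[0:64, 0:64]
    spot = np.exp(-(((x - 32.3) ** 2 + (y - 31.7) ** 2) / (2 * 1.2 ** 2)))
    print(spot_centroid(spot))
    print(f"80% encircled-energy diameter: {encircled_energy_diameter(spot):.1f} um")
    ```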

  3. Modelling the vicious circle between obesity and physical activity in children and adolescents using a bivariate probit model with endogenous regressors.

    Science.gov (United States)

    Yeh, C-Y; Chen, L-J; Ku, P-W; Chen, C-M

    2015-01-01

    The increasing prevalence of obesity in children and adolescents has become one of the most important public health issues around the world. Lack of physical activity is a risk factor for obesity, while being obese could reduce the likelihood of participating in physical activity. Failing to account for the endogeneity between obesity and physical activity would result in biased estimation. This study investigates the relationship between overweight and physical activity by taking endogeneity into consideration. It develops an endogenous bivariate probit model estimated by the maximum likelihood method. The data included 4008 boys and 4197 girls in the 5th-9th grades in Taiwan in 2007-2008. The relationship between overweight and physical activity is significantly negative in the endogenous model, but insignificant in the comparative exogenous model. This endogenous relationship presents a vicious circle in which lower levels of physical activity lead to overweight, while those who are already overweight engage in less physical activity. The results not only reveal the importance of endogenous treatment, but also demonstrate the robust negative relationship between these two factors. An emphasis should be put on overweight and obese children and adolescents in order to break the vicious circle. Promotion of physical activity by appropriate counselling programmes and peer support could be effective in reducing the prevalence of obesity in children and adolescents.
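
    To make the model family concrete, the sketch below estimates a plain bivariate probit by maximum likelihood; the endogenous (recursive) variant used in the study additionally places one outcome among the regressors of the other equation. All data and names here are synthetic illustrations, not the study's data.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import multivariate_normal

    def neg_loglik(params, X1, X2, y1, y2):
        k1, k2 = X1.shape[1], X2.shape[1]
        b1, b2 = params[:k1], params[k1:k1 + k2]
        rho = np.tanh(params[-1])               # keeps rho inside (-1, 1)
        q1, q2 = 2 * y1 - 1, 2 * y2 - 1         # sign flips for 0/1 outcomes
        a, b = q1 * (X1 @ b1), q2 * (X2 @ b2)
        ll = 0.0
        for ai, bi, ri in zip(a, b, q1 * q2 * rho):   # P = Phi2(a, b; r)
            ll += np.log(multivariate_normal.cdf(
                [ai, bi], mean=[0.0, 0.0], cov=[[1.0, ri], [ri, 1.0]]))
        return -ll

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)
    y1 = (X @ np.array([0.2, 1.0]) + e[:, 0] > 0).astype(int)
    y2 = (X @ np.array([-0.1, 0.8]) + e[:, 1] > 0).astype(int)

    res = minimize(neg_loglik, np.zeros(5), args=(X, X, y1, y2), method="BFGS")
    print(res.x)    # [b1 (2 coefs) | b2 (2 coefs) | atanh(rho)]
    ```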

  4. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. This method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples. [Spanish] The principle of maximum entropy, known as POME, is used to derive an alternative procedure for estimating the parameters of the bivariate extreme-value distribution with Gumbel marginals. The model is applied to the analysis of maximum 24-hour precipitation in a region of Mexico, and the design events obtained are compared with those provided by the maximum likelihood technique. From the results obtained, it is concluded that the proposed technique is a good option, above all in the case of small samples.
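
    The frequency-analysis workflow being compared can be illustrated with the maximum likelihood baseline mentioned above: fit Gumbel margins to annual precipitation maxima, then read design events off as return levels. The POME estimator derived in the paper would replace the fitting line; the rainfall series here is synthetic.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(1)
    annual_max_mm = gumbel_r.rvs(loc=60, scale=15, size=40, random_state=rng)

    loc, scale = gumbel_r.fit(annual_max_mm)   # maximum likelihood estimates
    for T in (10, 50, 100):
        design = gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)  # return level
        print(f"{T:>3}-year design precipitation: {design:.1f} mm")
    ```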

  5. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    Directory of Open Access Journals (Sweden)

    Yanyong Guo

    2017-01-01

    The primary objective of this study is to evaluate factors affecting e-bike involved crashes and e-bike license plate use in China. E-bike crash data were collected from a police database and completed through telephone interviews. Non-crash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike involved crashes and e-bike license plate use, and to account for the correlation between them. Marginal effects for contributory factors were calculated to quantify their impacts on the outcomes. The results show that several contributory factors, including gender, age, education level, driver license, car in household, experience in using e-bikes, law compliance, and aggressive driving behaviors, have significant impacts on both e-bike involved crashes and license plate use. Moreover, type of e-bike, frequency of using e-bike, impulse behavior, degree of riding experience, and risk perception scale are found to be associated with e-bike involved crashes. It is also found that e-bike involved crashes and e-bike license plate use are strongly and negatively correlated. These results enhance our comprehension of the factors related to e-bike involved crashes and e-bike license plate use.

  6. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis.

    Science.gov (United States)

    Shields, Katherine F; Bain, Robert E S; Cronk, Ryan; Wright, Jim A; Bartram, Jamie

    2015-12-01

    Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source, and water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water.
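
    As a simplified, univariate analogue of the bivariate random-effects model used above, the following sketch pools study-level log-odds of noncompliance with the DerSimonian-Laird estimator; the paper's bivariate model additionally handles source and stored-water outcomes jointly. Study counts are invented.

    ```python
    import numpy as np

    def dersimonian_laird(events, totals):
        """Random-effects pooling of study-level log-odds (noncompliance)."""
        events = np.asarray(events, float)
        totals = np.asarray(totals, float)
        p = (events + 0.5) / (totals + 1.0)             # continuity-corrected
        y = np.log(p / (1.0 - p))                       # log-odds per study
        v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
        w = 1.0 / v                                     # fixed-effect weights
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
        tau2 = max(0.0, (q - (len(y) - 1)) /
                   (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_re = 1.0 / (v + tau2)                         # random-effects weights
        mu = np.sum(w_re * y) / np.sum(w_re)
        return mu, np.sqrt(1.0 / np.sum(w_re)), tau2

    mu, se, tau2 = dersimonian_laird(events=[30, 55, 12], totals=[60, 80, 40])
    print(f"pooled noncompliance ~ {1 / (1 + np.exp(-mu)):.0%}, tau2 = {tau2:.2f}")
    ```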

  7. Information system and geographic information system tools in the data analyses of the control program for visceral leishmaniasis from 2006 to 2010 in the Sanitary District of Venda Nova, Belo Horizonte, Minas Gerais, Brazil.

    Science.gov (United States)

    Saraiva, Lara; Leite, Camila Gonçalves; de Carvalho, Luiz Otávio Alves; Andrade Filho, José Dilermando; de Menezes, Fernanda Carvalho; Fiúza, Vanessa de Oliveira Pires

    2012-01-01

    The aim of this paper is to report a brief history of control actions for Visceral Leishmaniasis (VL) from 2006 to 2010 in the Sanitary District (DS) of Venda Nova, Belo Horizonte, Minas Gerais, Brazil, focusing on the use of information systems and Geographic Information System (GIS) tools. The analyses showed that the use of an automated database allied with geoprocessing tools may favor control measures for VL, especially with regard to the evaluation of the control actions carried out. Descriptive analyses of control measures indicated that the information system and GIS tools promoted greater efficiency in decision-making and activity planning. These analyses also pointed to the necessity of new approaches to the control of VL in large urban centers.

  8. Force-plate analyses of balance following a balance exercise program during acute post-operative phase in individuals with total hip and knee arthroplasty: A randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Pankaj Jogi

    2016-11-01

    Objectives: Typical rehabilitation programs following total hip arthroplasty and total knee arthroplasty include joint range-of-motion and muscle-strengthening exercises. Balance and balance exercises following total hip arthroplasty and total knee arthroplasty have not received much attention. The purpose of this study was to determine whether an intervention of balance exercises added to a typical rehabilitation program positively affects patients' balance. Methods: A total of 63 patients were provided with outpatient physical therapy at their home. Patients were randomly assigned to either the typical (n = 33) or the balance (n = 30) exercise group. The typical group completed seven typical surgery-specific joint range-of-motion and muscle-strengthening exercises, while the balance group completed the typical exercises plus three balance exercises. After 5 weeks of administering the rehabilitation program, patients' balance was assessed on a force plate using the 95% ellipse area of the center-of-pressure amplitude. Results: Patients in the balance group demonstrated a significant reduction in the 95% ellipse area for the anterior and posterior lean standing conditions (p < 0.01). Conclusion: Balance exercises added to the typical outpatient physical therapy program resulted in significantly greater improvements in balance for participants with total hip arthroplasty or total knee arthroplasty, compared to the typical exercise program alone. Physical therapists might consider the use of balance exercises to improve balance in individuals in the acute post-operative phase following total hip arthroplasty or total knee arthroplasty.
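
    The balance outcome reported above, the 95% ellipse area of centre-of-pressure (COP) excursions, has a standard closed form under a bivariate-normal sway assumption. The sketch below is illustrative; the coordinate units and synthetic sway data are assumptions.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def ellipse_area_95(cop_x, cop_y):
        """Area of the ellipse expected to contain 95% of COP excursions,
        assuming bivariate-normal sway."""
        cov = np.cov(cop_x, cop_y)
        return np.pi * chi2.ppf(0.95, df=2) * np.sqrt(np.linalg.det(cov))

    rng = np.random.default_rng(2)
    x = rng.normal(0.0, 0.4, 3000)    # medio-lateral COP, cm (synthetic)
    y = rng.normal(0.0, 0.9, 3000)    # antero-posterior COP, cm (synthetic)
    print(f"95% ellipse area: {ellipse_area_95(x, y):.2f} cm^2")
    ```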

  9. Persistent Monitoring of Urban Infrasound Phenomenology. Report 1: Modeling an Urban Environment for Acoustical Analyses using the 3-D Finite-Difference Time-Domain Program PSTOP3D

    Science.gov (United States)

    2015-08-01

    ERDC TR-15-5, August 2015: Persistent Monitoring of Urban Infrasound Phenomenology. Report 1: Modeling an Urban Environment for Acoustical Analyses. [The rest of this record is unrecoverable residue of the report cover and documentation page.]

  10. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  11. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and in groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital lobe. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
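
    To make the second-order idea behind SOBI concrete, the sketch below implements AMUSE, a simplified relative that whitens the data and diagonalises a single time-lagged covariance (full SOBI jointly diagonalises several lags). The two-channel mixture is synthetic, not project data.

    ```python
    import numpy as np

    def amuse(X, lag=1):
        """Second-order separation: whiten, then diagonalise one lagged
        covariance. Returns the unmixing matrix (sources = unmix @ X)."""
        X = X - X.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(np.cov(X))          # zero-lag covariance
        W = E @ np.diag(d ** -0.5) @ E.T          # whitening matrix
        Z = W @ X
        C = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
        _, U = np.linalg.eigh((C + C.T) / 2)      # symmetrised lagged cov.
        return U.T @ W

    # Two sources with different autocorrelation, mixed into two channels
    rng = np.random.default_rng(3)
    t = np.arange(5000)
    S = np.vstack([np.sin(2 * np.pi * t / 100), rng.normal(size=t.size)])
    A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing matrix
    X = A @ S
    recovered = amuse(X) @ X    # rows approximate S up to order, sign, scale
    ```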

  12. Analyse af elbilers forbrug

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2014-01-01

    This report examines the GPS and CAN bus data collected while driving electric vehicles and analyses the vehicles' energy consumption. The analyses are based on roughly 133 million GPS and CAN bus measurements collected from 164 electric vehicles (Citroen C-Zero, Mitsubishi iMiev and Peugeot Ion) during the 2012 calendar year... Regarding the data, it can be concluded that substantial but simple improvements are needed to make it easier to use GPS/CAN bus data from electric vehicles in other analyses going forward. The use of electric vehicles is compared with that of combustion cars, and the conclusion is that electric vehicles generally drive 10-15 km/h slower on...

  13. Landslide susceptibility assessment in Lianhua County (China): A comparison between a random forest data mining technique and bivariate and multivariate statistical models

    Science.gov (United States)

    Hong, Haoyuan; Pourghasemi, Hamid Reza; Pourtaghi, Zohre Sadat

    2016-04-01

    Landslides are an important natural hazard that causes a great amount of damage around the world every year, especially during the rainy season. The Lianhua area is located in the middle of China's southern mountainous area, west of Jiangxi Province, and is known to be prone to landslides. The aim of this study was to evaluate and compare landslide susceptibility maps produced using the random forest (RF) data mining technique with those produced by bivariate (evidential belief function and frequency ratio) and multivariate (logistic regression) statistical models for Lianhua County, China. First, a landslide inventory map was prepared using aerial photograph interpretation, satellite images, and extensive field surveys. In total, 163 landslide events were recognized in the study area, with 114 landslides (70%) used for training and 49 landslides (30%) used for validation. Next, the landslide conditioning factors, including the slope angle, altitude, slope aspect, topographic wetness index (TWI), slope length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, annual precipitation, land use, normalized difference vegetation index (NDVI), and lithology, were derived from the spatial database. Finally, the landslide susceptibility maps of Lianhua County were generated in ArcGIS 10.1 based on the random forest (RF), evidential belief function (EBF), frequency ratio (FR), and logistic regression (LR) approaches and were validated using receiver operating characteristic (ROC) curves. The ROC assessment showed that the landslide susceptibility maps produced using the EBF, FR, LR, and RF models had area under the curve (AUC) values of 0.8122, 0.8134, 0.7751, and 0.7172, respectively. Therefore, all four models have an AUC of more than 0.70 and can be used in landslide susceptibility mapping in the study area; meanwhile, the EBF and FR models had the best performance for Lianhua...
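
    The random-forest branch of this comparison follows a standard supervised pattern: train on 70% of the inventory, score susceptibility per cell, validate with ROC/AUC. The sketch below uses synthetic stand-ins for the 14 conditioning factors and the inventory points; it is not the study's pipeline.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    X = rng.normal(size=(326, 14))       # 14 conditioning factors per cell
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=326) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
    susceptibility = rf.predict_proba(X_te)[:, 1]    # per-cell susceptibility
    print(f"AUC = {roc_auc_score(y_te, susceptibility):.3f}")
    ```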

  14. Flash flood susceptibility analysis and its mapping using different bivariate models in Iran: a comparison between Shannon's entropy, statistical index, and weighting factor models.

    Science.gov (United States)

    Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh

    2016-12-01

    Flooding is a very common worldwide natural hazard causing large-scale casualties every year, and Iran is not immune to this threat. Comprehensive flood susceptibility mapping is very important to reduce losses of lives and property. The aim of this study is therefore to map susceptibility to flooding by different bivariate statistical methods, namely Shannon's entropy (SE), the statistical index (SI), and the weighting factor (Wf), and to evaluate model performance in the Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified from documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the models. In the second step, ten factors influential in flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared with these methods in ArcGIS. As the last step, receiver operating characteristic (ROC) curves were drawn and the area under the curve (AUC) was calculated for quantitative assessment of each model. The results showed that the best model for estimating susceptibility to flooding in the Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were the distance from river and geology. Flood susceptibility maps are informative for managers and decision makers in the Haraz Watershed when contemplating measures to reduce human and financial losses.
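
    The statistical index (SI) named above weights each class of a conditioning factor by the log of its flood density relative to the overall flood density. A minimal sketch, with invented cell counts:

    ```python
    import numpy as np

    def statistical_index(flood_cells, class_cells):
        """SI weight per class: log of in-class flood density over the
        overall flood density."""
        flood_cells = np.asarray(flood_cells, float)
        class_cells = np.asarray(class_cells, float)
        overall = flood_cells.sum() / class_cells.sum()
        return np.log((flood_cells / class_cells) / overall)

    # e.g. a "distance from river" factor binned into four classes
    si = statistical_index(flood_cells=[80, 45, 20, 6],
                           class_cells=[1000, 1500, 2500, 5000])
    print(si)   # positive: flood-prone class; negative: flood-averse class
    ```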

  15. Inductive Temporal Logic Programming

    OpenAIRE

    Kolter, Robert

    2009-01-01

    We study the extension of techniques from Inductive Logic Programming (ILP) to temporal logic programming languages. To this end, we present two temporal logic programming languages and analyse the learnability of programs in these languages from finite sets of examples. For first-order temporal logic, the following topics are analysed: How can we characterize the denotational semantics of programs? Which proof techniques are best suited? How complex is the learning task? In propositional ...

  16. Assessment of coupling between trans-abdominally acquired fetal ECG and uterine activity by bivariate phase-rectified signal averaging analysis.

    Directory of Open Access Journals (Sweden)

    Daniela Casati

    Full Text Available Couplings between uterine contractions (UC and fetal heart rate (fHR provide important information on fetal condition during labor. At present, couplings between UC and fHR are assessed by visual analysis and interpretation of cardiotocography. The application of computerized approaches is restricted due to the non-stationarity of the signal, missing data and noise, typical for fHR. Herein, we propose a novel approach to assess couplings between UC and fHR, based on a signal-processing algorithm termed bivariate phase-rectified signal averaging (BPRSA.Electrohysterogram (EHG and fetal electrocardiogram (fECG were recorded non-invasively by a trans-abdominal device in 73 women at term with uneventful singleton pregnancy during the first stage of labor. Coupling between UC and fHR was analyzed by BPRSA and by conventional cross power spectral density analysis (CPSD. For both methods, degree of coupling was assessed by the maximum coefficient of coherence (CPRSA and CRAW, respectively in the UC frequency domain. Coherence values greater than 0.50 were consider significant. CPRSA and CRAW were compared by Wilcoxon test.At visual inspection BPRSA analysis identified coupled periodicities in 86.3% (63/73 of the cases. 11/73 (15% cases were excluded from further analysis because no 30 minutes of fECG recording without signal loss was available for spectral analysis. Significant coupling was found in 90.3% (56/62 of the cases analyzed by BPRSA, and in 24.2% (15/62 of the cases analyzed by CPSD, respectively. The difference between median value of CPRSA and CRAW was highly significant (0.79 [IQR 0.69-0.90] and 0.29 [IQR 0.17-0.47], respectively; p<0.0001.BPRSA is a novel computer-based approach that can be reliably applied to trans-abdominally acquired EHG-fECG. It allows the assessment of correlations between UC and fHR patterns in the majority of labors, overcoming the limitations of non-stationarity and artifacts. Compared to standard techniques of

  17. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X, are available. We…

  18. Impact of a pharmacy technician-centered medication reconciliation program on medication discrepancies and implementation of recommendations

    Directory of Open Access Journals (Sweden)

    Kraus SK

    2017-06-01

    Objectives: To evaluate the impact of a pharmacy technician-centered medication reconciliation (PTMR) program by identifying and quantifying medication discrepancies and outcomes of pharmacist medication reconciliation recommendations. Methods: A retrospective chart review was performed on 200 patients admitted to the internal medicine teaching services at Cooper University Hospital in Camden, NJ. Patients were selected using a stratified systematic sampling approach and were included if they received a pharmacy technician medication history and a pharmacist medication reconciliation at any point during their hospital admission. Pharmacist-identified medication discrepancies were analyzed using descriptive statistics and bivariate analyses. Potential risk factors were identified using multivariate analyses, such as logistic regression and CART. The a priori level of significance was set at 0.05. Results: Three hundred and sixty-five medication discrepancies were identified among the 200 included patients. The four most common discrepancies were omission (64.7%), non-formulary omission (16.2%), dose discrepancy (10.1%), and frequency discrepancy (4.1%). Twenty-two percent of pharmacist recommendations were implemented by the prescriber within 72 hours. Conclusion: A PTMR program with dedicated pharmacy technicians and pharmacists identifies many medication discrepancies at admission and provides opportunities for pharmacist reconciliation recommendations.
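
    The two multivariate approaches named above, logistic regression and CART, can be sketched on patient-level predictors of having at least one discrepancy. The predictors and data below are invented for illustration; they are not the study's variables.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(6)
    n = 200
    X = np.column_stack([rng.integers(18, 90, n),    # age (invented)
                         rng.integers(0, 2, n),      # high-risk medication flag
                         rng.integers(1, 15, n)])    # number of home medications
    y = (0.15 * X[:, 2] + X[:, 1] + rng.normal(size=n) > 2).astype(int)

    logit = LogisticRegression(max_iter=1000).fit(X, y)
    print(np.exp(logit.coef_))          # odds ratios per predictor

    cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(cart.feature_importances_)    # CART's view of the same risk factors
    ```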

  19. Analysing qualitative research data using computer software.

    Science.gov (United States)

    McLafferty, Ella; Farley, Alistair H

    An increasing number of clinical nurses are choosing to undertake qualitative research. A number of computer software packages are available designed for the management and analysis of qualitative data. However, while it is claimed that the use of these programs is also increasing, this claim is not supported by a search of recent publications. This paper discusses the advantages and disadvantages of using computer software packages to manage and analyse qualitative data.

  20. Hepatitis B vaccination: an unmet challenge in the era of harm reduction programs.

    Science.gov (United States)

    Vallejo, Fernando; Toro, Carlos; de la Fuente, Luis; Brugal, M Teresa; Barrio, Gregorio; Soriano, Vicente; Ballesta, Rosario; Bravo, María J

    2008-06-01

    The prevalence of vaccination against hepatitis B virus (HBV), factors associated with vaccination, and missed opportunities for vaccination were assessed among 949 street-recruited young injecting heroin users (IHUs) and noninjecting HUs (NIHUs). A cross-sectional study was carried out in Madrid, Barcelona, and Seville. Face-to-face interviews were held using a structured questionnaire with computer-assisted personal interviewing. Dried blood spot samples were tested for anti-HBV core antigen and HBV surface antigen. Bivariate and logistic regression analyses were performed. The prevalence of HBV vaccination was 21.7%, with significant differences among the cities (13.3% in Madrid, 18.4% in Seville, and 33.2% in Barcelona) and between IHUs (23.8%) and NIHUs (17.9%). In the logistic regression analysis, living in Barcelona and being aged 25 years or younger were associated with HBV vaccination in IHUs and NIHUs; in IHUs, vaccination was also associated with living in the street or in institutions for most of the last 12 months. Practically all those susceptible to HBV infection had missed at least one opportunity for vaccination, and most of them had missed such an opportunity in the last year. The proportion of vaccinated HUs remains very low despite efforts to set up harm reduction programs. New and more active strategies must be incorporated in these programs.

  1. Selection of interest and inflation rates for infrastructure investment analyses.

    Science.gov (United States)

    2014-12-01

    The South Dakota Department of Transportation (SDDOT) uses engineering economic analyses (EEA) to support planning, design, and construction decision-making such as project programming and planning, pavement type selection, and the occasional val...

  2. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  3. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    Science.gov (United States)

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  4. Inorganic Analyses in Water Quality Control Programs. Training Manual.

    Science.gov (United States)

    Kroner, Audrey; And Others

    This lecture/laboratory manual for a five-day course deals with the analysis of selected inorganic pollutants. The manual is an instructional aid for classroom presentations to those with little or no experience in the field, but having one year (or equivalent) of college level inorganic chemistry, one semester of college level quantitative…

  5. Organic Analyses in Water Quality Control Programs. Training Manual.

    Science.gov (United States)

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This document is a lecture/laboratory manual dealing with the analysis of selected organic pollutants. It is intended for use by those having little or no experience in the field, but having one year (or equivalent) of college organic chemistry, and having basic laboratory skills (volumetric glassware, titration, analytical and trip balances).…

  6. National Defense Stockpile Program. Phase 1. Development and Analyses

    Science.gov (United States)

    1990-03-01

    percent respectively for each year. Examples include appliances, furniture, jewelry, power tools and sporting equipment (NIPA categories 3, 5-16 and 18). [The remainder of this record is a fragment of the report's NIPA consumption-category table: Mattresses & Bedding; Appliances; China, Glassware & Utensils; Jewelry & Watches; Ophthalmic & Orthopedic Goods; Books & Maps; Telephone & Telegraph; Domestic Services; Misc. Household Services; Misc. Transportation; Rail Transportation; Airline Transportation; Boats.]

  7. Conjoint-Analyse und Marktsegmentierung

    OpenAIRE

    Steiner, Winfried J.; Baumgartner, Bernhard

    2003-01-01

    Market segmentation, along with new product planning and pricing, is one of the principal fields of application of conjoint analysis. Besides the traditional two-stage procedures, in which conjoint analysis and segmentation are carried out in two separate steps, newer developments such as clusterwise regression and mixture models now make simultaneous segmentation and preference estimation possible. The article gives an overview...

  8. Analyse des Organisations en Afrique

    African Journals Online (AJOL)

    ... of the organisation, in order to try to identify and elucidate the social dynamics unfolding within it, and then to examine them in relation to actors' strategies and the general context (the environment) in which this organisation evolves, ... And Its Analysis. The analysis of organisations is a long-standing approach whose ...

  9. A Program Transformation for Backwards Analysis of Logic Programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2003-01-01

    programs presented here is based on a transformation of the input program, which makes explicit the dependencies of the given program points on the initial goals. The transformation is derived from the resultants semantics of logic programs. The transformed program is then analysed using a standard... framework and no special properties of the abstract domain...

  10. CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.

    1970-06-01

    BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle-Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule, so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.

  11. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well...

  12. Beskrivende analyse af mekaniske systemer

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    Descriptive analysis is the activity where a given product is analysed for obtaining insight into different aspects, leading to an explicit description of each of these aspects. This textbook is worked out for course 72101 Produktanalyse (Analysis of products) given at DTU.

  13. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security...

  14. Bivariate analysis of the genetic variability among some accessions of African Yam Bean (Sphenostylis stenocarpa (Hochst. ex A. Rich.) Harms)

    Directory of Open Access Journals (Sweden)

    Solomon Tayo AKINYOSOYE

    2017-12-01

    Variability is an important factor to consider in crop improvement programmes. This study was conducted over two years to assess genetic variability and to determine the relationship between seed yield, its components and tuber production characters among twelve accessions of African yam bean. Data collected were subjected to combined analysis of variance (ANOVA), Principal Component Analysis (PCA), and hierarchical and K-means clustering analyses. The results revealed that the genotype by year (G × Y) interaction had significant effects on some of the variables measured in this study (days to first flowering, days to 50% flowering, number of pods per plant, pod length, seed yield and tuber yield per plant). The first five principal components (PCs) with eigenvalues greater than 1.0 accounted for about 66.70% of the total variation; PC1 and PC2 accounted for 39.48% of the variation and were associated with seed and tuber yield variables. Three heterotic groups were clearly delineated among the genotypes, with accessions AY03 and AY10 identified for high seed yield and tuber yield, respectively. Given the non-significant relationship between tuber and seed yield per plant, these accessions were recommended for further testing in various agro-ecologies for their suitability, adaptability and possible exploitation of heterosis to further improve the accessions.
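
    The multivariate steps reported above, PCA on standardised traits followed by clustering into three groups, can be sketched as follows. The trait matrix is a synthetic stand-in for the twelve accessions and nine illustrative traits, not the study's data.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)
    traits = rng.normal(size=(12, 9))   # 12 accessions x 9 invented traits

    Z = StandardScaler().fit_transform(traits)
    pca = PCA(n_components=5).fit(Z)
    print(pca.explained_variance_ratio_.cumsum())  # cf. ~66.7% for five PCs

    scores = pca.transform(Z)
    groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
    print(groups)                       # heterotic group per accession
    ```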

  15. Mitogenomic analyses from ancient DNA.

    Science.gov (United States)

    Paijmans, Johanna L A; Gilbert, M Thomas P; Hofreiter, Michael

    2013-11-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well. To date, at least 124 partially or fully assembled mitogenomes from more than 20 species have been obtained, and, given the rapid progress in sequencing technology, this number is likely to dramatically increase in the future. The increased information content offered by analysing full mitogenomes has yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim, in order to understand cultural, sociological, design-related, business-related and many other conditions. One sub-area of this is the systemic analysis and description of products and systems. The present compendium...

  17. Mitogenomic analyses of caniform relationships.

    Science.gov (United States)

    Arnason, Ulfur; Gullberg, Anette; Janke, Axel; Kullberg, Morgan

    2007-12-01

    Extant members of the order Carnivora split into two basal groups, Caniformia (dog-like carnivorans) and Feliformia (cat-like carnivorans). In this study we address phylogenetic relationships within Caniformia by applying various methodological approaches to analyses of complete mitochondrial genomes. Pinnipeds are currently well represented with respect to mitogenomic data, and here we add seven mt genomes to the non-pinniped caniform collection. The analyses identified a basal caniform divergence between Cynoidea and Arctoidea. Arctoidea split into three primary groups: Ursidae (including the giant panda), Pinnipedia, and a branch, Musteloidea, which encompassed Ailuridae (red panda), Mephitidae (skunks), Procyonidae (raccoons) and Mustelidae (mustelids). The analyses favored a basal arctoid split between Ursidae and a branch containing Pinnipedia and Musteloidea. Within the Musteloidea there was a preference for a basal divergence between Ailuridae and the remaining families. Among the latter, the analyses identified a sister group relationship between Mephitidae and a branch that contained Procyonidae and Mustelidae. The mitogenomic distance between the wolf and the dog was shown to be at the same level as that of basal human divergences. The wolf and the dog are commonly considered as separate species in the popular literature. The mitogenomic result is inconsistent with that understanding, at the same time as it provides insight into the time of the domestication of the dog relative to basal human mitogenomic divergences.

  18. Evaluation "Risk analyses of agroparks"

    NARCIS (Netherlands)

    Ge, L.

    2011-01-01

    This TransForum project focuses on analysing the uncertainties and opportunities of agroparks. It has resulted in a risk model that maps the qualitative and/or quantitative uncertainties of an agropark project, on the basis of which measures and management strategies can be

  19. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated, partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, it was decided to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto I plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality, both super-prompt power bursts and quasi steady-state power...

  20. The CAMAC logic state analyser

    CERN Document Server

    Centro, Sandro

    1981-01-01

    Summary form only given, as follows. Large electronic experiments using distributed processors for parallel readout and data reduction need to analyse the data acquisition components status and monitor dead time constants of each active readout module and processor. For the UA1 experiment, a microprocessor-based CAMAC logic status analyser (CLSA) has been developed in order to implement these functions autonomously. CLSA is a single unit CAMAC module, able to record, up to 256 times, the logic status of 32 TTL inputs gated by a common clock, internal or external, with a maximum frequency of 2 MHz. The data stored in the internal CLSA memory can be read directly via CAMAC function or preprocessed by CLSA 6800 microprocessor. The 6800 resident firmware (4Kbyte) expands the module features to include an interactive monitor, data recording control, data reduction and histogram accumulation with statistics parameter evaluation. The microprocessor memory and the resident firmware can be externally extended using st...

  1. National Assessment of Quality Programs in Emergency Medical Services.

    Science.gov (United States)

    Redlener, Michael; Olivieri, Patrick; Loo, George T; Munjal, Kevin; Hilton, Michael T; Potkin, Katya Trudeau; Levy, Michael; Rabrich, Jeffrey; Gunderson, Michael R; Braithwaite, Sabina A

    2018-01-03

    This study aims to understand the adoption of clinical quality measurement at the EMS agency level throughout the United States, the features of agencies that do participate in quality measurement, and the level of physician involvement. It also aims to identify barriers to implementing quality improvement initiatives in EMS. A 46-question survey was developed to gather agency-level data on current quality improvement practices and measurement. The survey was distributed nationally via State EMS Offices to EMS agencies nationwide using Surveymonkey©. A convenience sample of respondents was enrolled between August and November 2015. Univariate, bivariate and multiple logistic regression analyses were conducted to describe demographics and relationships between outcomes of interest and their covariates using SAS 9.3©. A total of 1,733 surveys were initiated and 1,060 had complete or near-complete responses, including agencies from 45 states representing over 6.23 million 9-1-1 responses annually. In total, 70.5% (747) of agencies reported dedicated QI personnel, 62.5% (663) followed clinical metrics and 33.3% (353) participated in an outside quality or research program. Medical director hours varied; notably, 61.5% (649) of EMS agencies had quality measures, compared to fire-based agencies. Agencies in rural-only environments were less likely to follow clinical quality metrics (OR 0.47, CI 0.31-0.72). Agencies varied in quality improvement resources, medical direction and specific clinical quality measures. More research is needed to understand the impact of this variation on patient care outcomes.

  2. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    Science.gov (United States)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) in most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.

  3. Mitogenomic analyses of eutherian relationships.

    Science.gov (United States)

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology. Copyright 2002 S. Karger AG, Basel

  4. Partial correlation analyses of global diffusion tensor imaging-derived metrics in glioblastoma multiforme: Pilot study

    Science.gov (United States)

    Cortez-Conradis, David; Rios, Camilo; Moreno-Jimenez, Sergio; Roldan-Valadez, Ernesto

    2015-01-01

    AIM: To determine existing correlates among diffusion tensor imaging (DTI)-derived metrics in healthy brains and brains with glioblastoma multiforme (GBM). METHODS: Case-control study using DTI data from brain magnetic resonance imaging of 34 controls (mean ± SD, 41.47 ± 21.94 years; range, 21-80 years) and 27 patients with GBM (mean ± SD, 48.41 ± 15.18 years; range, 18-78 years). Image postprocessing using FSL software calculated eleven tensor metrics: fractional (FA) and relative anisotropy; pure isotropic (p) and anisotropic diffusion (q); total magnitude of diffusion (L); linear (Cl), planar (Cp) and spherical tensors (Cs); and mean (MD), axial (AD) and radial diffusivities (RD). Partial correlation analyses (controlling for the effects of age and gender) and multivariate MANCOVA were performed. RESULTS: All metrics were normally distributed. Comparing healthy brains vs. brains with GBM, significant very strong bivariate correlations were depicted only in GBM: [FA↔Cl (+)], [FA↔q (+)], [p↔AD (+)], [AD↔MD (+)], and [MD↔RD (+)]. Among 56 pairs of bivariate correlations, only seven were significantly different. The diagnosis variable depicted a main effect [F(11, 23) = 11.842, P ≤ 0.001] with partial eta squared = 0.850, a large effect size. Age also had a significant influence as a covariate [F(11, 23) = 10.523, P < 0.001], with a large effect size (partial eta squared = 0.834). CONCLUSION: DTI-derived metrics depict significant differences between healthy brains and brains with GBM, with specific magnitudes and correlations. This study provides reference data and makes a contribution to decreasing the underlying empiricism in the use of DTI parameters in brain imaging. PMID:26644826
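
    A partial correlation controlling for age and gender can be computed by regressing the covariates out of both metrics and correlating the residuals, as sketched below. The FA and q series are synthetic stand-ins, not the study's measurements.

    ```python
    import numpy as np

    def partial_corr(a, b, covars):
        """Correlation of a and b after regressing out the covariates."""
        C = np.column_stack([np.ones(len(a)), covars])
        resid = lambda v: v - C @ np.linalg.lstsq(C, v, rcond=None)[0]
        ra, rb = resid(a), resid(b)
        return (ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb))

    rng = np.random.default_rng(8)
    n = 34
    age = rng.uniform(21, 80, n)
    sex = rng.integers(0, 2, n).astype(float)
    fa = 0.5 - 0.001 * age + rng.normal(0, 0.05, n)   # stand-in for FA
    q = 2 * fa + rng.normal(0, 0.05, n)               # stand-in for q
    print(partial_corr(fa, q, np.column_stack([age, sex])))
    ```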

  5. Systematic Derivation of Static Analyses for Software Product Lines

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Brabrand, Claus; Wasowski, Andrzej

    2014-01-01

    A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting program analyses to SPLs using abstract interpretation. Abstract interpretation...... is a classical framework for deriving static analyses in a compositional, step-by-step manner. We show how to take an analysis expressed as an abstract interpretation and lift each of the abstract interpretation steps to a family of programs. This includes schemes for how to lift domain types, and combinators...

  6. Providing traceability for neuroimaging analyses.

    Science.gov (United States)

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist, of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the finding of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of

  7. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type.

  8. Is bilingualism associated with a lower risk of dementia in community-living older adults? Cross-sectional and prospective analyses.

    Science.gov (United States)

    Yeung, Caleb M; St John, Philip D; Menec, Verena; Tyas, Suzanne L

    2014-01-01

    The aim of this study was to determine whether bilingualism is associated with dementia in cross-sectional or prospective analyses of older adults. In 1991, 1616 community-living older adults were assessed and were followed 5 years later. Measures included age, sex, education, subjective memory loss (SML), and the modified Mini-Mental State Examination (3MS). Dementia was determined by clinical examination in those who scored below the cut point on the 3MS. Language status was categorized based upon self-report into 3 groups: English as a first language (monolingual English, bilingual English) and English as a Second Language (ESL). Participants in the ESL category had lower education, lower 3MS scores, and more SML, and were more likely to be diagnosed with cognitive impairment, no dementia at both time 1 and time 2 compared with those speaking English as a first language. There was no association between being bilingual (ESL and bilingual English vs. monolingual) and having dementia at time 1 in bivariate or multivariate analyses. In those who were cognitively intact at time 1, there was no association between being bilingual and having dementia at time 2 in bivariate or multivariate analyses. We did not find any association between speaking >1 language and dementia.

  9. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  10. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)? For some carbohydrates, we

  11. Distinguishing signs of opioid overdose and indication for naloxone: an evaluation of six overdose training and naloxone distribution programs in the United States

    Science.gov (United States)

    Green, Traci C.; Heimer, Robert; Grau, Lauretta E.

    2011-01-01

    Aims This study assessed overdose and naloxone administration knowledge among current or former opioid abusers trained and untrained in overdose response in the United States. Design and participants Ten individuals, divided equally between those trained or not trained in overdose recognition and response, were recruited from each of six sites (n = 62). Setting US-based overdose training and naloxone distribution programs in Baltimore, San Francisco, Chicago, New York and New Mexico. Measurements Participants completed a brief questionnaire on overdose knowledge that included the task of rating 16 putative overdose scenarios for: (i) whether an overdose was occurring and (ii) if naloxone was indicated. Bivariate and multivariable analyses compared results for those trained to untrained. Responses were also compared to those of 11 medical experts using weighted and unweighted kappa statistics. Findings Respondents were primarily male (72.6%); 45.8% had experienced an overdose and 72% had ever witnessed an overdose. Trained participants recognized more opioid overdose scenarios accurately (t60 = 3.76) and more scenarios in which naloxone was indicated (t59 = 2.2); training and experience of overdose were associated independently with higher overdose recognition scores. Trained respondents were as skilled as medical experts in recognizing opioid overdose situations (weighted kappa = 0.85) and when naloxone was indicated (kappa = 1.0). Conclusions Results suggest that naloxone training programs in the United States improve participants' ability to recognize and respond to opioid overdoses in the community. Drug users with overdose training and confidence in their abilities to respond may effectively prevent overdose mortality. PMID:18422830
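
    As an illustration of the agreement statistics reported above, the sketch below computes unweighted and weighted Cohen's kappa with scikit-learn; the 16 scenario ratings are invented placeholder data, and the three-level coding is an assumption, not the study's instrument.

```python
# Sketch of the agreement statistics the abstract mentions: unweighted and
# weighted Cohen's kappa between a trained respondent's scenario ratings and
# an expert consensus. Ratings here are invented placeholder data.
from sklearn.metrics import cohen_kappa_score

# 16 scenario ratings: 0 = no overdose, 1 = overdose, naloxone not indicated,
# 2 = overdose, naloxone indicated (an assumed 3-level coding)
expert     = [0, 2, 2, 1, 0, 2, 1, 0, 2, 2, 0, 1, 2, 0, 2, 1]
respondent = [0, 2, 2, 1, 0, 2, 2, 0, 2, 1, 0, 1, 2, 0, 2, 1]

print("unweighted kappa:", cohen_kappa_score(expert, respondent))
print("weighted kappa:  ", cohen_kappa_score(expert, respondent, weights="linear"))
```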

  12. Waste Stream Analyses for Nuclear Fuel Cycles

    Energy Technology Data Exchange (ETDEWEB)

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  13. Statistical Analyses of Digital Collections

    DEFF Research Database (Denmark)

    Frandsen, Tove Faber; Nicolaisen, Jeppe

    2016-01-01

    Using statistical methods to analyse digital material for patterns makes it possible to detect patterns in big data that we would otherwise not be able to detect. This paper seeks to exemplify this fact by statistically analysing a large corpus of references in systematic reviews. The aim...... of the analysis is to study the phenomenon of non-citation: Situations where just one (or some) document(s) are cited from a pool of otherwise equally citable documents. The study is based on more than 120,000 cited studies, and a total number of non-cited studies of more than 1.6 million. The number of cited...... 10 years. After 10 years the cited and non-cited studies tend to be more similar in terms of age. Separating the data set into different sub-disciplines reveals that the sub-disciplines vary in terms of age of cited vs. non-cited references. Some fields may be expanding and the number of published...

  14. Bivariate extension of univariate reliability index for evaluating phenotypic stability

    Directory of Open Access Journals (Sweden)

    Suzankelly Cunha Arruda de Abreu

    2004-10-01

    Full Text Available The objective of this work was to derive the bivariate extension of the methods proposed by Annicchiarico (1992) and Annicchiarico et al. (1995) for studying phenotypic stability. Considering trials with genotypes evaluated in several environments, with two variates measured, each genotype had its response for each variate (k = 1, 2) standardized against the environment mean, as Wijk = (Yijk / Ȳ.jk) × 100, where Wijk represents the standardized value of the ith genotype in the jth environment for the kth variate, Yijk the observed value of the ith genotype in the jth environment for the kth variate, and Ȳ.jk the mean of all genotypes for the jth environment and kth variate. From the standardized values, the mean vector and the variance-covariance matrix of each genotype were estimated. The theoretical derivation of the bivariate extension of Annicchiarico's risk index (Ii) was obtained successfully, and a second risk index based on bivariate probabilities (Prb i) was proposed; the two indices showed strong agreement in the results obtained in an illustrative example with melon genotypes.
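
    A minimal sketch of the standardization and per-genotype summaries described above, with invented data; the array shapes and variable names are illustrative assumptions.

```python
# Sketch of the standardization step described above, with invented data:
# W_ijk = 100 * Y_ijk / mean_i(Y_ijk), i.e. each observation is expressed as
# a percentage of its environment mean for that variate.
import numpy as np

rng = np.random.default_rng(0)
n_geno, n_env, n_var = 5, 4, 2          # genotypes, environments, variates
Y = rng.normal(50, 10, size=(n_geno, n_env, n_var))

env_mean = Y.mean(axis=0, keepdims=True)   # mean over genotypes, per env/variate
W = 100.0 * Y / env_mean                   # standardized values W_ijk

# Per-genotype bivariate summaries used by the index: mean vector and
# 2x2 covariance matrix across environments
for i in range(n_geno):
    mu = W[i].mean(axis=0)                 # length-2 mean vector
    S = np.cov(W[i], rowvar=False)         # 2x2 covariance matrix
    print(f"genotype {i}: mean={np.round(mu, 1)}, cov diag={np.round(np.diag(S), 1)}")
```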

  15. Analysing ESP Texts, but How?

    Directory of Open Access Journals (Sweden)

    Borza Natalia

    2015-03-01

    Full Text Available English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English-language subject textbooks written for secondary school students, with the aim of gaining information about what bilingual secondary school students need to know in terms of language to process academic textbooks, cannot avoid dealing with a dilemma. It needs to be decided which way is most appropriate to analyse the texts in question. Handbooks of English applied linguistics are not immensely helpful with regard to this problem, as they tend not to give recommendations as to which major text-analytical approaches are advisable to follow in a pre-college setting. The present theoretical research aims to address this lacuna. Respectively, the purpose of this pedagogically motivated theoretical paper is to investigate two major approaches of ESP text analysis, register analysis and genre analysis, in order to find the more suitable one for exploring the language use of secondary school subject texts from the point of view of an English as a second language teacher. Comparing and contrasting the merits and limitations of the two contrastive approaches allows for a better understanding of the nature of the two different perspectives of text analysis. The study examines the goals, the scope of analysis, and the achievements of the register perspective and those of the genre approach alike. The paper also investigates and reviews in detail the starkly different methods of ESP text analysis applied by the two perspectives. Discovering text analysis from a theoretical and methodological angle supports a practical aspect of English teaching, namely making an informed choice when setting out to analyse

  16. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  17. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second aiming to create value. The traditional approach to performance is based on some indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.

  18. Retorisk analyse af historiske tekster

    DEFF Research Database (Denmark)

    Kock, Christian Erik J

    2014-01-01

    scholars who identify themselves as rhetoricians tend to define and conduct such an analysis. It is argued that while rhetoricians would sympathize with Skinner’s adoption of speech act theory in his reading of historical documents, they would generally extend their rhetorical readings of such documents......In recent years, rhetoric and the rhetorical tradition has attracted increasing interest from historians, such as, e.g., Quentin Skinner. The paper aims to explain and illustrate what may be understood by a rhetorical analysis (or “rhetorical criticism”) of historical documents, i.e., how those...... to many more features than just the key concepts invoked in them. The paper discusses examples of rhetorical analyses done by prominent contemporary rhetoricians, including Edwin Black, Kenneth Burke, Maurice Charland, and Michael Leff. It relates its view of rhetorical documents to trends in current...

  19. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  20. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future...... of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...... reasons why those competing views fail provide important insights into the ethics of killing....

  1. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial...... standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach...... to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols....

  2. THOR Turbulence Electron Analyser: TEA

    Science.gov (United States)

    Fazakerley, Andrew; Samara, Marilia; Hancock, Barry; Wicks, Robert; Moore, Tom; Rust, Duncan; Jones, Jonathan; Saito, Yoshifumi; Pollock, Craig; Owen, Chris; Rae, Jonny

    2017-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves). A novel capability to time tag individual electron events during short intervals for the purposes of ground analysis of wave-particle interactions is also planned.

  3. Genetic analyses of captive Alala (Corvus hawaiiensis) using AFLP analyses

    Science.gov (United States)

    Jarvi, Susan I.; Bianchi, Kiara R.

    2006-01-01

    affected by the mutation rate at microsatellite loci, thus introducing a bias. Also, the number of loci that can be studied is frequently limited to fewer than 10. This theoretically represents a maximum of one marker for each of 10 chromosomes. Dominant markers like AFLP allow a larger fraction of the genome to be screened. Large numbers of loci can be screened by AFLP to resolve very small individual differences that can be used for identification of individuals, estimates of pairwise relatedness and, in some cases, for parentage analyses. Since AFLP is a dominant marker (can not distinguish between +/+ homozygote versus +/- heterozygote), it has limitations for parentage analyses. Only when both parents are homozygous for the absence of alleles (-/-) and offspring show a presence (+/+ or +/-) can the parents be excluded. In this case, microsatellites become preferable as they have the potential to exclude individual parents when the other parent is unknown. Another limitation of AFLP is that the loci are generally less polymorphic (only two alleles/locus) than microsatellite loci (often >10 alleles/locus). While generally fewer than 10 highly polymorphic microsatellite loci are enough to exclude and assign parentage, it might require up to 100 or more AFLP loci. While there are pros and cons to different methodologies, the total number of loci evaluated by AFLP generally offsets the limitations imposed due to the dominant nature of this approach and end results between methods are generally comparable. Overall objectives of this study were to evaluate the level of genetic diversity in the captive population of Alala, to compare genetic data with currently available pedigree information, and to determine the extent of relatedness of mating pairs and among founding individuals.

  4. Assessing the hydrodynamic boundary conditions for risk analyses in coastal areas: a multivariate statistical approach based on Copula functions

    Directory of Open Access Journals (Sweden)

    T. Wahl

    2012-02-01

    Full Text Available This paper presents an advanced approach to statistically analyse storm surge events. In former studies the highest water level during a storm surge event usually was the only parameter that was used for the statistical assessment. This is not always sufficient, especially when statistically analysing storm surge scenarios for event-based risk analyses. Here, Archimedean Copula functions are applied and allow for the consideration of further important parameters in addition to the highest storm surge water levels. First, a bivariate model is presented and used to estimate exceedance probabilities of storm surges (for two tide gauges in the German Bight) by jointly analysing the important storm surge parameters "highest turning point" and "intensity". Second, another dimension is added and a trivariate fully nested Archimedean Copula model is applied to additionally incorporate the significant wave height as an important wave parameter. With the presented methodology, reliable and realistic exceedance probabilities are derived and can be considered (among others) for integrated flood risk analyses, contributing to improve the overall results. It is highlighted that the concept of Copulas represents a promising alternative for facing multivariate problems in coastal engineering.
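
    A minimal sketch of the bivariate step, assuming a Gumbel copula (one common Archimedean family) and invented marginal probabilities and dependence parameter; the paper's fitted copula family and parameter values may differ.

```python
# Minimal sketch of the bivariate step: a Gumbel (Archimedean) copula joining
# the marginal non-exceedance probabilities of "highest turning point" and
# "intensity", then the joint exceedance probability via the survival formula.
import math

def gumbel_copula(u, v, theta):
    # C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    # P(U > u, V > v) = 1 - u - v + C(u, v)
    return 1.0 - u - v + gumbel_copula(u, v, theta)

u = 0.99      # marginal non-exceedance prob. of the water-level threshold (invented)
v = 0.95      # marginal non-exceedance prob. of the intensity threshold (invented)
theta = 2.0   # invented dependence parameter
print(f"joint exceedance probability: {joint_exceedance(u, v, theta):.5f}")
```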

  5. Environmental monitoring program for the Ormen Lange Onshore Processing Plant and the Reserve Power Plant at Nyhamna, Gossa. Monitoring of vegetation and soil: re-analyses and establishment of new monitoring plots in 2010.; Miljoeovervaakingsprogram for Ormen Lange landanlegg og Reservegasskraftverk paa Nyhamna, Gossa. Overvaaking av vegetasjon og jord: gjenanalyser og nyetablering av overvaakingsfelter i 2010

    Energy Technology Data Exchange (ETDEWEB)

    Aarrestad, P.A.; Bakkestuen, V.; Stabbetorp, O.E.; Myklebost, Heidi

    2011-07-01

    The Ormen Lange Onshore Processing Plant in Aukra municipality (Moere og Romsdal county) receives unprocessed gas and condensate from the Ormen Lange field in the Norwegian Sea. During processing of sales gas and condensate, the plant emits CO, CO2, NOx, CH4, NMVOC (including BTEX), SO2 and small amounts of heavy metals, as specified in the discharge permit issued by the Climate and Pollution Directorate. The plant started production in 2007, with A/S Norske Shell as operator. In general, emissions of nitrogen- and sulphur-containing gases may affect terrestrial ecosystems through acidification and fertilization of soil and vegetation. The emissions from the onshore plant are calculated to be below the current critical loads for the terrestrial nature types. However, the nitrogen background level in the area of influence is close to the critical loads for oligotrophic habitats. To be able to document any effects of emissions to air on terrestrial ecosystems, a monitoring program for vegetation and soil was established in 2008 in the area of influence of the Ormen Lange Onshore Plant. The monitoring is planned at regular intervals according to the same methods employed in 2008, with the first reanalysis in 2010. The benefits of the monitoring parameters will be continuously evaluated. Statnett has established a Reserve Power Plant with discharge permits for similar substances in the same area as the Ormen Lange Onshore Processing Plant, and participates in an extended monitoring program from 2010. In 2008 two monitoring sites were established, one with rather high deposition of nitrogen north of the plant within the Gule-Stavmyran nature reserve in Fraena municipality (site Gulmyran) and one south of the plant on the island Gossa (site Aukra). Deposition values have been estimated by the Norwegian Institute for Air Research (NILU). Within each site, integrated monitoring of the species composition of the vegetation, plant growth, and chemical content of plants and soil is

  6. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window, unborated water from ECCS systems will start to reflood the partly control-rod-free core. Recriticality might take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during the super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. the containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  7. Ergonomic analyses of downhill skiing.

    Science.gov (United States)

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS)

  8. Blast sampling for structural and functional analyses.

    Science.gov (United States)

    Friedrich, Anne; Ripp, Raymond; Garnier, Nicolas; Bettler, Emmanuel; Deléage, Gilbert; Poch, Olivier; Moulinier, Luc

    2007-02-23

    The post-genomic era is characterised by a torrent of biological information flooding the public databases. As a direct consequence, similarity searches starting with a single query sequence frequently lead to the identification of hundreds, or even thousands of potential homologues. The huge volume of data renders the subsequent structural, functional and evolutionary analyses very difficult. It is therefore essential to develop new strategies for efficient sampling of this large sequence space, in order to reduce the number of sequences to be processed. At the same time, it is important to retain the most pertinent sequences for structural and functional studies. An exhaustive analysis on a large scale test set (284 protein families) was performed to compare the efficiency of four different sampling methods aimed at selecting the most pertinent sequences. These four methods sample the proteins detected by BlastP searches and can be divided into two categories: two customisable methods where the user defines either the maximal number or the percentage of sequences to be selected; two automatic methods in which the number of sequences selected is determined by the program. We focused our analysis on the potential information content of the sampled sets of sequences using multiple alignment of complete sequences as the main validation tool. The study considered two criteria: the total number of sequences in BlastP and their associated E-values. The subsequent analyses investigated the influence of the sampling methods on the E-value distributions, the sequence coverage, the final multiple alignment quality and the active site characterisation at various residue conservation thresholds as a function of these criteria. The comparative analysis of the four sampling methods allows us to propose a suitable sampling strategy that significantly reduces the number of homologous sequences required for alignment, while at the same time maintaining the relevant information

  9. Blast sampling for structural and functional analyses

    Directory of Open Access Journals (Sweden)

    Friedrich Anne

    2007-02-01

    Full Text Available Abstract Background The post-genomic era is characterised by a torrent of biological information flooding the public databases. As a direct consequence, similarity searches starting with a single query sequence frequently lead to the identification of hundreds, or even thousands of potential homologues. The huge volume of data renders the subsequent structural, functional and evolutionary analyses very difficult. It is therefore essential to develop new strategies for efficient sampling of this large sequence space, in order to reduce the number of sequences to be processed. At the same time, it is important to retain the most pertinent sequences for structural and functional studies. Results An exhaustive analysis on a large scale test set (284 protein families was performed to compare the efficiency of four different sampling methods aimed at selecting the most pertinent sequences. These four methods sample the proteins detected by BlastP searches and can be divided into two categories: two customisable methods where the user defines either the maximal number or the percentage of sequences to be selected; two automatic methods in which the number of sequences selected is determined by the program. We focused our analysis on the potential information content of the sampled sets of sequences using multiple alignment of complete sequences as the main validation tool. The study considered two criteria: the total number of sequences in BlastP and their associated E-values. The subsequent analyses investigated the influence of the sampling methods on the E-value distributions, the sequence coverage, the final multiple alignment quality and the active site characterisation at various residue conservation thresholds as a function of these criteria. Conclusion The comparative analysis of the four sampling methods allows us to propose a suitable sampling strategy that significantly reduces the number of homologous sequences required for alignment, while
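
    As a toy illustration of one of the customisable strategies described above (not the authors' software), the sketch below keeps a user-defined fraction of BlastP hits ranked by E-value; the hit data are invented.

```python
# Illustrative sketch of one customisable sampling strategy the study
# evaluates: keep a fixed percentage of BlastP hits, ranked by E-value,
# so downstream alignment works on fewer sequences.
def sample_hits(hits, keep_fraction=0.2):
    """hits: list of (sequence_id, e_value); returns the best fraction."""
    ranked = sorted(hits, key=lambda h: h[1])          # smaller E-value = better
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]

hits = [("seqA", 1e-80), ("seqB", 1e-42), ("seqC", 3e-5),
        ("seqD", 0.7), ("seqE", 1e-60)]
print(sample_hits(hits, keep_fraction=0.4))  # [('seqA', 1e-80), ('seqE', 1e-60)]
```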

  10. GPU based framework for geospatial analyses

    Science.gov (United States)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The necessity for high-speed computation in geosciences has increased in the last 10 years, mostly due to the increase in the available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets one small part at a time. This way of computing speeds up the process, because instead of using just one process for each dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometry analyses at multiscale level. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have infinitely many sources. These uncertainties play important roles when a spatial delineation of the phenomena is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia Cuda technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
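
    A CPU-side sketch of the Monte Carlo uncertainty propagation idea described above, using NumPy on an invented elevation grid; in the actual framework the per-realization work would run as CUDA kernels, and the error model here (independent Gaussian noise) is an assumption.

```python
# CPU-side sketch of Monte Carlo uncertainty propagation: perturb an input
# elevation grid with an assumed error model many times, recompute a derived
# quantity (here, slope), and summarise the spread across realizations.
import numpy as np

rng = np.random.default_rng(42)
dem = rng.normal(100.0, 5.0, size=(64, 64))   # invented elevation grid (m)
sigma = 0.5                                    # assumed vertical error (m)

def slope(z, cell=10.0):
    gy, gx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

runs = np.stack([slope(dem + rng.normal(0.0, sigma, dem.shape))
                 for _ in range(200)])
print("mean slope:", runs.mean(axis=0)[32, 32])
print("slope std (uncertainty):", runs.std(axis=0)[32, 32])
```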

  11. Predictors of Place of Death of Individuals in a Home-Based Primary and Palliative Care Program.

    Science.gov (United States)

    Prioleau, Phoebe G; Soones, Tacara N; Ornstein, Katherine; Zhang, Meng; Smith, Cardinale B; Wajnberg, Ania

    2016-11-01

    To investigate factors associated with place of death of individuals in the Mount Sinai Visiting Doctors Program (MSVD). A retrospective chart review was performed of all MSVD participants who died in 2012 to assess predictors of place of death in the last month of life. MSVD, a home-based primary and palliative care program in New York. MSVD participants who were discharged from the program because of death between January 2012 and December 2012 and died at home, in inpatient hospice, or in the hospital (N = 183). Electronic medical records were reviewed to collect information on demographic characteristics, physician visits, and end-of-life conversations. Of 183 participants, 103 (56%) died at home, approximately twice the national average; 28 (15%) died in inpatient hospice; and 52 (28%) died in the hospital. Bivariate analyses showed that participants who were white, aged 90 and older, non-Medicaid, or had a recorded preference for place of death were more likely to die outside the hospital. Diagnoses and living situation were not significantly associated with place of death. Multivariate logistic regression analysis showed no statistical association between place of death and home visits in the last month of life (odds ratio = 1.21, 95% confidence interval = 0.52-2.77). Home-based primary and palliative care results in a high likelihood of nonhospital death, although certain demographic characteristics are strong predictors of death in the hospital. For MSVD participants, home visits in the last month of life were not associated with death outside the hospital. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
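
    For illustration, a sketch of the kind of multivariable logistic model behind the reported odds ratio, fitted with statsmodels on invented data; the predictors and coding are assumptions, not the study's variables.

```python
# Sketch of a multivariable logistic model of the kind the abstract reports
# (odds ratio with 95% CI for home visits in the last month of life).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 183
home_visit = rng.integers(0, 2, n)      # 1 = visit in last month of life
age90 = rng.integers(0, 2, n)           # 1 = aged 90 and older
died_home = rng.integers(0, 2, n)       # outcome: 1 = died outside hospital

X = sm.add_constant(np.column_stack([home_visit, age90]))
fit = sm.Logit(died_home, X).fit(disp=False)

odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())             # 95% CI on the odds-ratio scale
print("OR (home visit):", odds_ratios[1], "95% CI:", ci[1])
```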

  12. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  13. Reducing injuries among Native Americans: five cost-outcome analyses.

    Science.gov (United States)

    Zaloshnja, Eduard; Miller, Ted R; Galbraith, Maury S; Lawrence, Bruce A; DeBruyn, Lemyra M; Bill, Nancy; Hicks, Kenny R; Keiffer, Michael; Perkins, Ronald

    2003-09-01

    This paper presents cost-outcome analyses of five injury prevention efforts in Native American jurisdictions: a safety-belt program, a streetlight project, a livestock control project, a drowning prevention program, and a suicide prevention and intervention program. Pre- and post-intervention data were analyzed to estimate the projects' impact on injury reduction. Projects' costs were amortized over the time period covered by the evaluation or over the useful life of physical capital invested. Projects' savings were calculated based on estimated reduction in medical and public program expenses, on estimated decrease in lost productivity, and on estimated quality-adjusted life years saved. All projects yielded positive benefit-cost ratios. The net cost per quality-adjusted life year was less than zero (i.e. the monetary savings exceeded project costs) for all but one of the projects.

  14. Conveyor: a workflow engine for bioinformatic analyses.

    Science.gov (United States)

    Linke, Burkhard; Giegerich, Robert; Goesmann, Alexander

    2011-04-01

    The rapidly increasing amounts of data available from new high-throughput methods have made data processing without automated pipelines infeasible. As was pointed out in several publications, integration of data and analytic resources into workflow systems provides a solution to this problem, simplifying the task of data analysis. Various applications for defining and running workflows in the field of bioinformatics have been proposed and published, e.g. Galaxy, Mobyle, Taverna, Pegasus or Kepler. One of the main aims of such workflow systems is to enable scientists to focus on analysing their datasets instead of taking care of data management, job management or monitoring the execution of computational tasks. The currently available workflow systems achieve this goal, but fundamentally differ in their way of executing workflows. We have developed the Conveyor software library, a multitiered generic workflow engine for composition, execution and monitoring of complex workflows. It features an open, extensible system architecture and concurrent program execution to exploit resources available on modern multicore CPU hardware. It offers the ability to build complex workflows with branches, loops and other control structures. Two example use cases illustrate the application of the versatile Conveyor engine to common bioinformatics problems. The Conveyor application including client and server are available at http://conveyor.cebitec.uni-bielefeld.de.

  15. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  16. Designing and recasting LHC analyses with MadAnalysis 5

    Energy Technology Data Exchange (ETDEWEB)

    Conte, Eric [Universite de Haute-Alsace, IUT Colmar, Groupe de Recherche de Physique des Hautes Energies (GRPHE), 34 rue du Grillenbreit, BP 50568, Colmar Cedex (France); Dumont, Beranger [Universite Grenoble-Alpes, CNRS/IN2P3, LPSC, Grenoble (France); Fuks, Benjamin [CERN, Theory Division, Physics Department, Geneva 23 (Switzerland); Universite de Strasbourg/CNRS-IN2P3, Institut Pluridisciplinaire Hubert Curien/Departement Recherches Subatomiques, Strasbourg (France); Wymant, Chris [Laboratoire d' Annecy-le-Vieux de Physique Theorique, Annecy-le-Vieux (France); Imperial College London, Department of Infectious Disease Epidemiology, London (United Kingdom)

    2014-10-15

    We present an extension of the expert mode of the MadAnalysis 5 program dedicated to the design or reinterpretation of high-energy physics collider analyses. We detail the predefined classes, functions and methods available to the user and emphasize the most recent developments. The latter include the possible definition of multiple sub-analyses and a novel user-friendly treatment for the selection criteria. We illustrate this approach by two concrete examples: a CMS search for supersymmetric partners of the top quark and a phenomenological analysis targeting hadronically decaying monotop systems. (orig.)

  17. Determination of volatile compounds in wine by gas chromatography-flame ionization detection: comparison between the U.S. Environmental Protection Agency 3sigma approach and Hubaux-Vos calculation of detection limits using ordinary and bivariate least squares.

    Science.gov (United States)

    Caruso, Rosario; Scordino, Monica; Traulo, Pasqualino; Gagliano, Giacomo

    2012-01-01

    A capillary GC-flame ionization detection (FID) method to determine volatile compounds (ethyl acetate, 1,1-diethoxyethane, methyl alcohol, 1-propanol, 2-methyl-1-propanol, 2-methyl-1-butanol, 3-methyl-1-butanol, 1-butanol, and 2-butanol) in wine was investigated in terms of calculation of detection limits and calibration method. The main objectives were: (1) calculation of regression coefficient parameters by ordinary least-squares (OLS) and bivariate least-squares (BLS) regression models, taking into account errors in both axes; (2) estimation of linear dynamic range (LDR) according to International Conference on Harmonization recommendations; (3) performance evaluation of a method by using three different internal standards (ISs) such as acetonitrile, acetone, and 1-pentanol; (4) evaluation of LODs according to the U.S. Environmental Protection Agency (EPA) 3sigma approach and the Hubaux-Vos (H-V) method; (5) application of H-V theory to a gas chromatographic analytical method and to a food matrix; and (6) accuracy assessment of the method relative to methyl alcohol content through a Unione Italiana Vini (UIV) interlaboratory proficiency test. Calibration curves calculated via BLS and OLS show similar slopes, while intercepts are closer to zero in the first case, independent of the chosen IS. The studied ISs show a substantially equivalent behavior, even though the IS closer to the analyte retention time seems to be more appropriate in terms of LDR and LOD. Results indicate an underestimation of LODs using the EPA 3sigma approach instead of the more realistic H-V method, both with OLS and BLS regression models. Methanol contents compared with UIV average values indicate recovery between 90 and 110%.
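
    A hedged sketch contrasting the two detection-limit recipes compared in the study, the EPA 3sigma value and a Hubaux-Vos limit derived from the OLS calibration prediction band, on invented calibration data; the BLS variant and the study's actual alpha/beta choices are not reproduced here.

```python
# EPA 3-sigma LOD (3 * s / slope) versus a Hubaux-Vos LOD from the OLS
# calibration prediction band. Calibration data are invented.
import numpy as np
from scipy import stats, optimize

x = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])   # standard conc. (mg/L)
y = np.array([0.1, 5.4, 9.8, 20.6, 39.5, 80.9])    # detector response

n = len(x)
b1, b0 = np.polyfit(x, y, 1)                       # slope, intercept
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))          # s_{y/x}
sxx = np.sum((x - x.mean()) ** 2)
t = stats.t.ppf(0.95, n - 2)                       # one-sided, alpha = 0.05

def half_band(x0):
    # half-width of the prediction band at concentration x0
    return t * s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / sxx)

y_c = b0 + half_band(0.0)                          # decision limit (response)
# Detection limit: smallest x whose *lower* prediction bound clears y_c
x_d = optimize.brentq(lambda xd: (b0 + b1 * xd - half_band(xd)) - y_c,
                      0.0, x.max())

print("EPA 3-sigma LOD:", 3 * s / b1)
print("Hubaux-Vos LOD :", x_d)
```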

  18. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...... to be applied to more complex pre-interpretations and larger programs. There are two main techniques presented: the first is a novel algorithm for determinising finite tree automata, yielding a compact "product" form of the transitions of the result automaton, that is often orders of magnitude smaller than...

  19. Pediatric Program Leadership's Contribution Toward Resident Wellness.

    Science.gov (United States)

    Carson, Savanna L; Perkins, Kate; Reilly, Maura R; Sim, Myung-Shin; Li, Su-Ting T

    2018-02-27

    Residency program leaders are required to support resident well-being, but often do not receive training in how to do so. Determine the frequency with which program leadership provides support for resident well-being, comfort in supporting resident well-being, and factors associated with the need for additional training in supporting resident well-being. National cross-sectional web-based survey of pediatric program directors, associate program directors, and coordinators in June 2015, on their experience supporting resident well-being. Univariate and bivariate descriptive statistics compared responses between groups. Generalized linear modeling, adjusting for program region, size, program leadership role, and number of years in role, determined factors associated with the need for additional training. 39.3% (322/820) of participants responded. Most respondents strongly agreed that supporting resident well-being is an important part of their role, but few reported supporting resident well-being as part of their job description. Most reported supporting residents' clinical, personal, and health issues at least annually, and in some cases weekly, with 72% spending >10% of their time on resident well-being. Most program leaders desired more training. After adjusting for level of comfort in dealing with resident well-being issues, program leaders more frequently exposed to resident well-being issues were more likely to desire additional training. Program leaders spend a significant amount of time supporting resident well-being. While they feel that supporting resident well-being is an important part of their job, opportunities exist for developing program leaders by including resident wellness in job descriptions and training program leaders in how to support resident well-being. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  20. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM networks can be used for analysing complex problems in engineering projects.

  1. Effects of an integrated Yoga Program on Self-reported Depression Scores in Breast Cancer Patients Undergoing Conventional Treatment: A Randomized Controlled Trial.

    Science.gov (United States)

    Rao, Raghavendra Mohan; Raghuram, Nagarathna; Nagendra, H R; Usharani, M R; Gopinath, K S; Diwakar, Ravi B; Patil, Shekar; Bilimagga, Ramesh S; Rao, Nalini

    2015-01-01

    To compare the effects of a yoga program with supportive therapy on self-reported symptoms of depression in breast cancer patients undergoing conventional treatment. Ninety-eight breast cancer patients with stage II and III disease from a cancer center were randomly assigned to receive yoga (n = 45) or supportive therapy (n = 53) over a 24-week period during which they underwent surgery followed by adjuvant radiotherapy (RT) or chemotherapy (CT) or both. The study stoppage criteria were progressive disease rendering the patient bedridden, any physical musculoskeletal injury resulting from the intervention, or less than 60% attendance at the yoga intervention. Subjects underwent the yoga intervention for 60 min daily, with the control group undergoing supportive therapy during their hospital visits. Beck's Depression Inventory (BDI) and a symptom checklist were assessed at baseline, after surgery, and before, during, and after RT and six cycles of CT. We used analysis of covariance (intent-to-treat) to study the effects of the intervention on depression scores and Pearson correlation analyses to evaluate the bivariate relationships. A total of 69 participants contributed data to the current analysis (yoga, n = 33, and controls, n = 36). There was 29% attrition in this study. The results suggest an overall decrease in self-reported depression with time in both groups. There was a significant decrease in depression scores in the yoga group as compared to controls following surgery, RT, and CT. The results suggest possible antidepressant effects with yoga intervention in breast cancer patients undergoing conventional treatment.

  2. Comparison of connectivity analyses for resting state EEG data

    Science.gov (United States)

    Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo

    2017-06-01

    Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by phase synchronization value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections with respect to the bivariate estimates. The multivariate TE differentiated better between eyes closed and eyes open conditions compared to DTF. Moreover, the multivariate TE evidenced non-linear phenomena in information transfer, which are not evidenced by the use of DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods, and indicated a relationship existing between the flow of information and the level of synchronization of the brain.
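
    As a small illustration of the synchronization measure used above, the sketch below computes the phase locking value (PLV) from Hilbert-transform phases of two synthetic, partially coupled signals; the signal model is an assumption for demonstration only.

```python
# Phase locking value (PLV): instantaneous phases via the Hilbert transform,
# then the magnitude of the mean phase-difference phasor.
import numpy as np
from scipy.signal import hilbert

fs, T = 250, 4.0                          # sampling rate (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(7)
x = np.sin(2 * np.pi * 10 * t)                                  # 10 Hz source
y = np.sin(2 * np.pi * 10 * t + 0.5) + 0.5 * rng.normal(size=t.size)

phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"PLV = {plv:.3f}")                 # 1 = perfect locking, 0 = none
```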

  3. Overview of Meta-Analyses of the Prevention of Mental Health, Substance Use and Conduct Problems

    Science.gov (United States)

    Sandler, Irwin; Wolchik, Sharlene A.; Cruden, Gracelyn; Mahrer, Nicole E.; Ahn, Soyeon; Brincks, Ahnalee; Brown, C. Hendricks

    2014-01-01

    This paper presents findings from an overview of meta-analyses of the effects of prevention and promotion programs to prevent mental health, substance use and conduct problems. The review of 48 meta-analyses found small but significant effects to reduce depression, anxiety, anti-social behavior and substance use. Further, the effects are sustained over time. Meta-analyses often found that the effects were heterogeneous. A conceptual model is proposed to guide the study of moderators of program effects in future meta-analyses and methodological issues in synthesizing findings across preventive interventions are discussed. PMID:24471372
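
    A minimal sketch of the pooling step behind such meta-analyses, a DerSimonian-Laird random-effects combination of per-study effect sizes; the effect sizes and variances are invented illustration values.

```python
# DerSimonian-Laird random-effects pooling of per-study effect sizes
# (invented standardized mean differences and variances).
import numpy as np

effects = np.array([0.12, 0.25, 0.08, 0.30, 0.18])    # per-study effect sizes
variances = np.array([0.010, 0.020, 0.008, 0.030, 0.015])

w = 1.0 / variances                                    # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
q = np.sum(w * (effects - fixed) ** 2)                 # Cochran's Q (heterogeneity)
df = len(effects) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                          # between-study variance

w_star = 1.0 / (variances + tau2)                      # random-effects weights
pooled = np.sum(w_star * effects) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled effect = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```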

  4. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    10 CFR 61.13 (2010), Energy: Nuclear Regulatory Commission (continued), Licensing Requirements for Land Disposal of Radioactive Waste, Licenses, § 61.13 Technical analyses. The specific technical information must also include the following analyses...

  5. Finite element analyses of a linear-accelerator electron gun

    Energy Technology Data Exchange (ETDEWEB)

    Iqbal, M., E-mail: muniqbal.chep@pu.edu.pk, E-mail: muniqbal@ihep.ac.cn [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Wasy, A. [Department of Mechanical Engineering, Changwon National University, Changwon 641773 (Korea, Republic of); Islam, G. U. [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Zhou, Z. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China)

    2014-02-15

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in CATIA (computer-aided three-dimensional interactive application) for finite element analyses through the ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning, without any thermally induced failures, in the BEPCII linear accelerator.

  6. Finite element analyses of a linear-accelerator electron gun

    Science.gov (United States)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-02-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in CATIA (computer-aided three-dimensional interactive application) for finite element analyses through the ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning, without any thermally induced failures, in the BEPCII linear accelerator.

  7. Thermo-Elastic Finite Element Analyses of Annular Nuclear Fuels

    Science.gov (United States)

    Kwon, Y. D.; Kwon, S. B.; Rho, K. T.; Kim, M. S.; Song, H. J.

    In this study, we examined the pros and cons of the annular fuel type, concerned mainly with the temperatures and stresses of the pellet and cladding. The inner and outer gaps between pellet and cladding may play an important role in the temperature and stress distributions of the fuel system. Thus, we tested several inner and outer gap cases and evaluated the effect of the gaps on the fuel systems. We conducted thermo-elastic-plastic-creep analyses using an in-house thermo-elastic-plastic-creep finite element program that adopted the 'effective-stress-function' algorithm. Most analyses were conducted until the gaps disappeared; however, certain analyses lasted for 1582 days, after which the fuels were replaced. Further study on the optimal gap sizes for annular nuclear fuel systems is still required.

  8. Attitudes Sociales et Intellectuelles et Valeurs a Developper dans L'Enseignement Quebecois des Sciences Humaines au Primaire: Une Analyse des Propositions du Programme D'Etudes (Social and Intellectual Attitudes and Values to Be Developed in Social Science Instruction in the Elementary Grades in Quebec: An Analysis of the Proposals of the Program of Study).

    Science.gov (United States)

    Laforest, Mario; Lenoir, Yves

    1994-01-01

    Discusses Quebec's elementary school social studies program, which aims to promote the development of politically correct social attitudes and scientifically correct intellectual attitudes. Suggests that, under the cover of a developmental approach to "savoir-etre" ("knowing how to be"), the program supports a technobehaviorist…

  9. Abstract Interpretation of PIC programs through Logic Programming

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    A logic based general approach to abstract interpretation of low-level machine programs is reported. It is based on modelling the behavior of the machine as a logic program. General purpose program analysis and transformation of logic programs, such as partial evaluation and convex hull analysis, can then be applied to specialise the emulator with respect to a given program. The specialised emulator can now be further analysed to gain insight into the given program for the PIC microcontroller. The method describes a general framework for applying abstractions, illustrated here by linear constraints and convex hull analysis, to logic programs. Using these techniques on the specialised...

  10. Predicting likelihood of seeking help through the employee assistance program among salaried and union hourly employees.

    Science.gov (United States)

    Delaney, W; Grube, J W; Ames, G M

    1998-03-01

    This research investigated belief, social support and background predictors of employee likelihood to use an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief the EAP can help, social support for the EAP from co-workers/others, belief that EAP use will harm employment, and supervisor encourages the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on likelihood to go to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from coworkers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.
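
    To make the modelling step above concrete, here is a minimal sketch of a two-equation path model in which social support affects likelihood of EAP use both directly and indirectly through belief in EAP efficacy, assuming statsmodels is available; the variable names, effect sizes, and simulated data are hypothetical, not the survey's.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 852                                            # sample size as above
      support = rng.standard_normal(n)                   # social support for the EAP
      efficacy = 0.4 * support + rng.standard_normal(n)  # belief the EAP can help
      likelihood = 0.5 * efficacy + 0.2 * support + rng.standard_normal(n)

      # Equation 1: mediator on predictor; Equation 2: outcome on both.
      m1 = sm.OLS(efficacy, sm.add_constant(support)).fit()
      m2 = sm.OLS(likelihood, sm.add_constant(np.column_stack([support, efficacy]))).fit()

      direct = m2.params[1]                   # support -> likelihood
      indirect = m1.params[1] * m2.params[2]  # support -> efficacy -> likelihood
      print(f"direct = {direct:.2f}, indirect = {indirect:.2f}")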

  11. Guide: a desktop application for analysing gene expression data.

    Science.gov (United States)

    Choi, Jarny

    2013-10-07

    Multiple competing bioinformatics tools exist for next-generation sequencing data analysis. Many of these tools are available as R/Bioconductor modules, and it can be challenging for the bench biologist without any programming background to quickly analyse genomics data. Here, we present an application that is designed to be simple to use, while leveraging the power of R as the analysis engine behind the scenes. Genome Informatics Data Explorer (Guide) is a desktop application designed for the bench biologist to analyse RNA-seq and microarray gene expression data. It requires a text file of summarised read counts or expression values as input data, and performs differential expression analyses at both the gene and pathway level. It uses well-established R/Bioconductor packages such as limma for its analyses, without requiring the user to have specific knowledge of the underlying R functions. Results are presented in figures or interactive tables which integrate useful data from multiple sources such as gene annotation and orthologue data. Advanced options include the ability to edit R commands to customise the analysis pipeline. Guide is a desktop application designed to query gene expression data in a user-friendly way while automatically communicating with R. Its customisation options make it possible to use different bioinformatics tools available through R/Bioconductor for its analyses, while keeping the core usage simple. Guide is written in the cross-platform framework of Qt, and is freely available for use from http://guide.wehi.edu.au.

  12. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive) [A contrastive analysis of word-formation patterns in German and Estonian, exemplified by adjectives of appearance]. Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  13. Global Health Education in US Pediatric Residency Programs.

    Science.gov (United States)

    Butteris, Sabrina M; Schubert, Charles J; Batra, Maneesh; Coller, Ryan J; Garfunkel, Lynn C; Monticalvo, David; Moore, Molly; Arora, Gitanjli; Moore, Melissa A; Condurache, Tania; Sweet, Leigh R; Hoyos, Catalina; Suchdev, Parminder S

    2015-09-01

    Despite the growing importance of global health (GH) training for pediatric residents, few mechanisms have cataloged GH educational opportunities offered by US pediatric residency programs. We sought to characterize GH education opportunities across pediatric residency programs and identify program characteristics associated with key GH education elements. Data on program and GH training characteristics were sought from program directors or their delegates of all US pediatric residency programs during 2013 to 2014. These data were used to compare programs with and without a GH track as well as across small, medium, and large programs. Program characteristics associated with the presence of key educational elements were identified by using bivariate logistic regression. Data were collected from 198 of 199 active US pediatric residency programs (99.5%). Seven percent of pediatric trainees went abroad during 2013 to 2014. Forty-nine programs (24.7%) reported having a GH track, 66.1% had a faculty lead, 58.1% offered international field experiences, and 48.5% offered domestic field experiences. Forty-two percent of programs reported international partnerships across 153 countries. Larger programs, those with lead faculty, GH tracks, or partnerships had significantly increased odds of having each GH educational element, including pretravel preparation. The number of pediatric residency programs offering GH training opportunities continues to rise. However, smaller programs and those without tracks, lead faculty, or formal partnerships lag behind with organized GH curricula. As GH becomes an integral component of pediatric training, a heightened commitment is needed to ensure consistency of training experiences that encompass best practices in all programs. Copyright © 2015 by the American Academy of Pediatrics.
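
    As a sketch of the bivariate logistic regression step described above, the following fits a single binary program characteristic against the presence of one GH educational element, assuming statsmodels is available; the variable names, the effect size, and the simulated data are invented for illustration, not taken from the survey.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 198                                        # number of responding programs
      has_lead_faculty = rng.integers(0, 2, size=n)  # binary program characteristic
      logit_p = -0.5 + 1.2 * has_lead_faculty        # assumed, illustrative effect
      has_gh_track = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

      X = sm.add_constant(has_lead_faculty)
      fit = sm.Logit(has_gh_track, X).fit(disp=False)
      print(f"odds ratio = {np.exp(fit.params[1]):.2f}")  # odds of having the element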

  14. 7 CFR 93.5 - Fees for citrus product analyses set by cooperative agreement.

    Science.gov (United States)

    2010-01-01

    7 CFR 93.5 (2010), Agriculture: Commodity Laboratory Testing Programs (continued), Processed Fruits and Vegetables, Citrus Juices and Certain Citrus Products, § 93.5 Fees for citrus product analyses set by cooperative agreement. The fees for the...

  15. Analysing the English-Xhosa parallel corpus of technical texts with ...

    African Journals Online (AJOL)

    Paraconc has been used in analysing a parallel corpus of English-Xhosa texts. Paraconc is a parallel concordancer which makes it easy to analyse translated texts. This software program was developed by Michael Barlow (1995, 2003). He designed this software for linguists and translators who wish to work with translated ...

  16. Child hunger and the protective effects of Supplemental Nutrition Assistance Program (SNAP) and alternative food sources among Mexican-origin families in Texas border colonias.

    Science.gov (United States)

    Sharkey, Joseph R; Dean, Wesley R; Nalty, Courtney C

    2013-09-13

    Nutritional health is essential for children's growth and development. Many Mexican-origin children who reside in limited-resource colonias along the Texas-Mexico border are at increased risk for poor nutrition as a result of household food insecurity. However, little is known about the prevalence of child hunger or its associated factors among children of Mexican immigrants. This study determines the prevalence of child hunger and identifies protective and risk factors associated with it in two Texas border areas. This study uses 2009 Colonia Household and Community Food Resource Assessment (C-HCFRA) data from 470 mothers who were randomly recruited by promotora-researchers. Participants from colonias near two small towns in two South Texas counties participated in an in-home community and household assessment. Interviewer-administered surveys collected data in Spanish on sociodemographics, federal food assistance program participation, and food security status. Frequencies and bivariate correlations were examined while a random-effects logistic regression model with backward elimination was used to determine correlates of childhood hunger. Hunger among children was reported in 51% (n = 239) of households in this C-HCFRA sample. Bivariate analyses revealed that hunger status was associated with select maternal characteristics, such as lower educational attainment and Mexican nativity, and household characteristics, including household composition, reliance on friend or neighbor for transportation, food purchase at dollar stores and from neighbors, and participation in school-based nutrition programs. A smaller percentage of households with child hunger participated in school-based nutrition programs (51%) or used alternative food sources, while 131 households were unable to give their child or children a balanced meal during the school year and 145 households during summer months. In the random effects model (RE = small town), increased household

  17. Pegasys: software for executing and integrating analyses of biological sequences.

    Science.gov (United States)

    Shah, Sohrab P; He, David Y M; Sawkins, Jessica N; Druce, Jeffrey C; Quon, Gerald; Lett, Drew; Zheng, Grace X Y; Xu, Tao; Ouellette, B F Francis

    2004-04-19

    We present Pegasys--a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.

  18. TOTO and YOYO: New very bright fluorochromes for DNA content analyses by flow cytometry

    Energy Technology Data Exchange (ETDEWEB)

    Hirons, G.T.; Fawcett, J.J.; Crissman, H.A. (Los Alamos National Lab., NM (United States))

    1994-02-01

    Flow cytometric (FCM) studies were performed on nuclei, ethanol-fixed CHO cells, and isolated human GM130 chromosomes stained with two new cyanine dyes, TOTO and YOYO. These fluorochromes, which are dimers of thiazole orange and oxazole yellow, respectively, have high quantum efficiencies and exhibit specificities for both DNA and RNA. Bound to dsDNA in solution, TOTO and YOYO emit at 530 and 510 nm, respectively, when excited at 488 nm and 457 nm, wavelengths available from most lasers employed in FCM. RNase-treated CHO nuclei, stained with either TOTO or YOYO, provided DNA histograms, with low coefficients of variation, that were as good as or better than those obtained with nuclei stained with propidium iodide (PI) or mithramycin (MI). In addition, by comparison on an equimolar basis, nuclei stained with YOYO fluoresced over 1,000 times more intensely than nuclei stained with MI. Fluorescence ratio analyses of nuclei stained with both YOYO and Hoechst 33258 showed that the ratio of YOYO to Hoechst fluorescence remained relatively constant for G1 and S phase cells, but decreased significantly for cells in G2/M. These results indicate that the cyanine dyes may be useful in examining specific changes in chromatin structure during G2/M phases of the cell cycle. Ethanol-fixed CHO cells stained with TOTO or YOYO did not yield reproducible DNA histograms of good quality, presumably because of the poor accessibility of DNA to these large fluorochromes. However, bivariate analyses of human GM130 chromosomes stained with TOTO or YOYO alone and excited sequentially with UV and visible wavelengths showed resolution of many individual chromosome peaks similar to results obtained for chromosomes stained with HO and chromomycin A3. Collectively, these studies show potential advantages for the use of these new cyanine dyes in FCM studies that require the sensitive detection of DNA. 20 refs., 5 figs., 2 tabs.

  19. TOTO and YOYO: new very bright fluorochromes for DNA content analyses by flow cytometry.

    Science.gov (United States)

    Hirons, G T; Fawcett, J J; Crissman, H A

    1994-02-01

    Flow cytometric (FCM) studies were performed on nuclei, ethanol-fixed CHO cells, and isolated human GM130 chromosomes stained with two new cyanine dyes, TOTO and YOYO. These fluorochromes, which are dimers of thiazole orange and oxazole yellow, respectively, have high quantum efficiencies and exhibit specificities for both DNA and RNA. Bound to dsDNA in solution, TOTO and YOYO emit at 530 and 510 nm, respectively, when excited at 488 nm and 457 nm, wavelengths available from most lasers employed in FCM. RNase-treated CHO nuclei, stained with either TOTO or YOYO, provided DNA histograms, with low coefficients of variation, that were as good as or better than those obtained with nuclei stained with propidium iodide (PI) or mithramycin (MI). In addition, by comparison on an equimolar basis, nuclei stained with YOYO fluoresced over 1,000 times more intensely than nuclei stained with MI. Fluorescence ratio analyses of nuclei stained with both YOYO and Hoechst 33258 showed that the ratio of YOYO to Hoechst fluorescence remained relatively constant for G1 and S phase cells, but decreased significantly for cells in G2/M. These results indicate that the cyanine dyes may be useful in examining specific changes in chromatin structure during G2/M phases of the cell cycle. Ethanol-fixed CHO cells stained with TOTO or YOYO did not yield reproducible DNA histograms of good quality, presumably because of the poor accessibility of DNA to these large fluorochromes. However, bivariate analyses of human GM130 chromosomes stained with TOTO or YOYO alone and excited sequentially with UV and visible wavelengths showed resolution of many individual chromosome peaks similar to results obtained for chromosomes stained with HO and chromomycin A3. Collectively, these studies show potential advantages for the use of these new cyanine dyes in FCM studies that require the sensitive detection of DNA.

  20. Analysing Biochemical Oscillations through Probabilistic Model Checking

    DEFF Research Database (Denmark)

    Ballarini, Paolo; Mardare, Radu Iulian; Mura, Ivan

    2009-01-01

    Analysing Biochemical Oscillations through Probabilistic Model Checking. In Proc. of the Second International Workshop "From Biology To Concurrency" (FBTC 2008), Electronic Notes in Theoretical Computer Science.

  1. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for ta...

  2. Spectral analyses of asteroids' linear features

    Science.gov (United States)

    Longobardo, A.; Palomba, E.; Scully, J. E. C.; De Sanctis, M. C.; Capaccioni, F.; Tosi, F.; Zinzi, A.; Galiano, A.; Ammannito, E.; Filacchione, G.; Ciarniello, M.; Raponi, A.; Zambon, F.; Capria, M. T.; Erard, S.; Bockelee-Morvan, D.; Leyrat, C.; Dirri, F.; Nardi, L.; Raymond, C. A.

    2017-09-01

    Linear features are commonly found on small bodies and can have a geomorphic or tectonic origin. Generally, these features are studied by means of morphological analyses. Here we propose a spectroscopic analysis of the linear features of different asteroids visited by space missions, in order to search for correspondences between the spectral properties and the origin of these features.

  3. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We...

  4. Analyse de risques d'installations industrielles

    OpenAIRE

    Pineau, Jean-Philippe

    1995-01-01

    Industrial risk analysis has long been undertaken for industries in which accidents could have catastrophic effects: explosives manufacturing, coal mining, chemicals, and petroleum. This analysis has made it possible to better understand the phenomena involved and their consequences.

  5. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness ...

  6. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    Science.gov (United States)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping is a fundamental component of hazard assessment and is of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as the mapping unit. The methods used are based on linear relationships between landslides and nine considered conditioning factors (altimetry, slope angle, exposition, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9.5 km², mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. These materials, together with the steepness of the slopes (38.9% of the area has slope angles higher than 35°, reaching a maximum of 87.5°), make the area very prone to landslide activity. A total of 1,495 shallow landslides were mapped (at 1:5,000 scale) and included in a GIS database. The total affected area is 401,744 m² (4.5% of the study area). Most slope movements are translational slides frequently evolving into debris-flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1.5 m and the volume is usually smaller than 700 m³. For modelling
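
    To illustrate one of the bi-variate methods named above, here is a minimal sketch of the Weights of Evidence calculation for a single binary conditioning factor over a pixel grid, assuming NumPy is available; the simulated factor and landslide rasters are hypothetical stand-ins, not the Ribeira Quente data.

      import numpy as np

      def weights_of_evidence(factor, slide):
          # factor, slide: boolean arrays over the mapping pixels.
          n11 = np.sum(factor & slide)    # factor present, landslide
          n10 = np.sum(factor & ~slide)   # factor present, stable
          n01 = np.sum(~factor & slide)   # factor absent, landslide
          n00 = np.sum(~factor & ~slide)  # factor absent, stable
          # W+ = ln[P(factor | slide) / P(factor | stable)]; W- likewise for
          # factor absence. A positive W+ marks a factor favouring instability.
          w_plus = np.log((n11 / (n11 + n01)) / (n10 / (n10 + n00)))
          w_minus = np.log((n01 / (n11 + n01)) / (n00 / (n10 + n00)))
          return w_plus, w_minus

      rng = np.random.default_rng(3)
      steep = rng.random(10_000) < 0.4   # hypothetical binary factor map
      slide = rng.random(10_000) < np.where(steep, 0.08, 0.02)
      print(weights_of_evidence(steep, slide))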

  7. Program specialization

    CERN Document Server

    Marlet, Renaud

    2013-01-01

    This book presents the principles and techniques of program specialization - a general method to make programs faster (and possibly smaller) when some inputs can be known in advance. As an illustration, it describes the architecture of Tempo, an offline program specializer for C that can also specialize code at runtime, and provides figures for concrete applications in various domains. Technical details address issues related to program analysis precision, value reification, incomplete program specialization, strategies to exploit specialized program, incremental specialization, and data speci

  8. Nuclear Analyses of Indian LLCB Test Blanket System in ITER

    Science.gov (United States)

    Swami, H. L.; Shaw, A. K.; Danani, C.; Chaudhuri, Paritosh

    2017-04-01

    As part of its nuclear fusion reactor program, India is developing a Lead Lithium Ceramic Breeder (LLCB) tritium breeding blanket for its future fusion reactor. A mock-up of the LLCB blanket is proposed to be tested in ITER equatorial port no. 2, to ensure the overall performance of the blanket in a reactor-relevant nuclear fusion environment. Nuclear analyses play an important role in LLCB Test Blanket System design and development. They are required for tritium breeding estimation, thermal-hydraulic design, coolant process design, radioactive waste management, equipment maintenance and replacement strategies, and nuclear safety. The nuclear behaviour of the LLCB test blanket module in ITER is predicted in terms of nuclear responses such as tritium production, nuclear heating, neutron fluxes and radiation damage. The radiation shielding capability of the LLCB TBS inside and outside the bio-shield was also assessed to fulfil ITER shielding requirements. To support the rad-waste and safety assessments, nuclear activation analyses were carried out and radioactivity data were generated for LLCB TBS components. The nuclear analyses of the LLCB TBS were performed using ITER-recommended nuclear analysis codes (i.e. MCNP, EASY), nuclear cross section data libraries (i.e. FENDL 2.1, EAF) and the neutronic model (ITER C-lite v.1). The paper describes the comprehensive nuclear performance of the LLCB TBS in ITER.

  9. MARLIN, software to create, run, and analyse spatially realistic simulations.

    Science.gov (United States)

    Meirmans, Patrick G

    2011-01-01

    MARLIN is a software to create, run, analyse, and visualize spatially explicit population genetic simulations. It provides an intuitive user interface with which the geographical layout of a metapopulation can be drawn by hand or loaded from a map. Furthermore, the interface allows easy selection of the many different simulation settings. MARLIN then uses the program QuantiNemo to run the simulation in the background. When simulations are finished, MARLIN directly analyses and plots the results, thereby greatly simplifying the simulation workflow. This combination of simulation and analysis makes MARLIN ideal for teaching and for scientists who are interested in doing simulations without having to learn command-line operations. MARLIN is available for computers running Mac OS X and can be downloaded from: http://www.patrickmeirmans.com/software. © 2010 Blackwell Publishing Ltd.

  10. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte, Marilyn Joyce (The Joyce...). Contents: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...

  11. Advanced Mass Spectrometers for Hydrogen Isotope Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chastagner, P.

    2001-08-01

    This report is a summary of the results of a joint Savannah River Laboratory (SRL) - Savannah River Plant (SRP) "Hydrogen Isotope Mass Spectrometer Evaluation Program". The program was undertaken to evaluate two prototype hydrogen isotope mass spectrometers and obtain sufficient data to permit SRP personnel to specify the mass spectrometers to replace obsolete instruments.

  12. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  13. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment. In addition, main input data are based on transport modelling analyses based on a misleading `local ontology' among the model makers. The ontological misconceptions translate into erroneous epistemological assumptions about the possibility of precise predictions and the validity of willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  14. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  15. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Although the target was the maintenance costs in ST, the overall budget has also been analysed, since there is a close relation between investment & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.

  16. Interactive graphics for functional data analyses.

    Science.gov (United States)

    Wrobel, Julia; Park, So Young; Staicu, Ana Maria; Goldsmith, Jeff

    Although there are established graphics that accompany the most common functional data analyses, generating these graphics for each dataset and analysis can be cumbersome and time consuming. Often, the barriers to visualization inhibit useful exploratory data analyses and prevent the development of intuition for a method and its application to a particular dataset. The refund.shiny package was developed to address these issues for several of the most common functional data analyses. After conducting an analysis, the plot_shiny() function is used to generate an interactive visualization environment that contains several distinct graphics, many of which are updated in response to user input. These visualizations reduce the burden of exploratory analyses and can serve as a useful tool for the communication of results to non-statisticians.

  17. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

    Full Text Available The paper summarizes the results obtained from structural analysis measurements: Differential Scanning Calorimetry (DSC), Thermogravimetry (TG), Thermomechanical Analysis (TMA) and Fourier transform infrared spectroscopy (FT-IR). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out on a TA Instruments TMA Q400EM apparatus.

  18. ANALYSE DES PERCEPTIONS LOCALES ET DES FACTEURS ...

    African Journals Online (AJOL)

    AISA

    (Euphorbiaceae). ANALYSIS OF LOCAL PERCEPTIONS AND FACTORS ... A principal components analysis was applied to the matrix ... value of the standard normal random variable for a risk α equal to 0.05. The margin of error d anticipated for any parameter to be estimated from the survey is 3%.

  19. Multi-directional program efficiency

    DEFF Research Database (Denmark)

    Asmild, Mette; Balezentis, Tomas; Hougaard, Jens Leth

    2016-01-01

    The present paper analyses both managerial and program efficiencies of Lithuanian family farms, in the tradition of Charnes et al. (Manag Sci 27(6):668–697, 1981) but with the important difference that multi-directional efficiency analysis rather than the traditional data envelopment analysis...... farms have the highest program efficiency, but the lowest managerial efficiency and that the mixed farms have the lowest program efficiency (yet not the highest managerial efficiency)....

  20. TETRA-COM: a comprehensive SPSS program for estimating the tetrachoric correlation.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2012-12-01

    We provide an SPSS program that implements descriptive and inferential procedures for estimating tetrachoric correlations. These procedures have two main purposes: (1) bivariate estimation in contingency tables and (2) constructing a correlation matrix to be used as input for factor analysis (in particular, the SPSS FACTOR procedure). In both cases, the program computes accurate point estimates, as well as standard errors and confidence intervals that are correct for any population value. For purpose (1), the program computes the contingency table together with five other measures of association. For purpose (2), the program checks the positive definiteness of the matrix, and if it is found not to be Gramian, performs a nonlinear smoothing procedure at the user's request. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
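
    For readers who want to see the underlying computation, the following is a minimal sketch of maximum-likelihood tetrachoric estimation for a 2x2 table, assuming SciPy is available; it is not the TETRA-COM SPSS code, and the example table is hypothetical.

      import numpy as np
      from scipy.stats import norm, multivariate_normal
      from scipy.optimize import minimize_scalar

      def tetrachoric(table):
          # table = [[n00, n01], [n10, n11]] counts for two dichotomized items.
          n = np.asarray(table, dtype=float)
          total = n.sum()
          tau_x = norm.ppf(n[0].sum() / total)     # row threshold from margins
          tau_y = norm.ppf(n[:, 0].sum() / total)  # column threshold from margins

          def neg_loglik(rho):
              # Cell probabilities under a bivariate normal dichotomized at
              # (tau_x, tau_y) with latent correlation rho.
              p00 = multivariate_normal.cdf([tau_x, tau_y], mean=[0.0, 0.0],
                                            cov=[[1.0, rho], [rho, 1.0]])
              p01 = norm.cdf(tau_x) - p00
              p10 = norm.cdf(tau_y) - p00
              p11 = 1.0 - p00 - p01 - p10
              probs = np.clip([p00, p01, p10, p11], 1e-12, 1.0)
              return -np.sum(n.ravel() * np.log(probs))

          return minimize_scalar(neg_loglik, bounds=(-0.999, 0.999),
                                 method="bounded").x

      print(round(tetrachoric([[40, 10], [15, 35]]), 3))  # hypothetical counts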

  1. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  2. Effectiveness in delaying the initiation of sexual intercourse of girls aged 12-14. Two components of the Girls Incorporated Preventing Adolescent Pregnancy Program.

    Science.gov (United States)

    Postrado, L T; Nicholson, H J

    1992-03-01

    The Girls Incorporated (formerly Girls Clubs of America) program in Preventing Adolescent Pregnancy, initiated in 1985, was evaluated to obtain information about sexual behavior and attitudes related to pregnancy, educational and career expectations, and sociodemographic characteristics, and to ascertain the effectiveness of two components for girls aged 12-14 in delaying early sexual involvement. The theoretical base of the intervention is discussed, along with three themes recurrent in the literature. Program interventions are also described. The two program components evaluated were Will Power/Won't Power, which taught skill in general and specific assertiveness, and Growing Together, which facilitated communication between parents and daughters. The sample consisted of 412 girls aged 12-14 who had not initiated sexual intercourse, of whom 25% did not participate in either program, from communities in which the adolescent pregnancy rate was higher than the national average. Participants were required to have completed one full year, which included before and after surveys. 257 participated in Will Power/Won't Power, 84 in Growing Together, and 46 in both programs. Participants and nonparticipants were similar in background characteristics. The profile was one of primarily African American 12-year-olds of a Protestant religion. Growing Together participants were slightly different and less likely to engage in early sexual intercourse. The findings of the bivariate and logistic regression analyses were that nonparticipants were 2.5 times more likely to initiate sexual intercourse during the study year than participants in Growing Together, although this was only marginally statistically significant. It is suggested, however, that when controlling for age, religion, race, and contact with a pregnant teen, Growing Together participation contributed to a delay in the initiation of sexual intercourse. Participation in Will Power did not account for any differences in likelihood of initiating

  3. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...... through the analysis of one of the earliest recorded examples of preschool education (initiated by J. F. Oberlin in northeastern France in 1767). The general idea of societal need is elaborated as a way of analysing practices, and a general analytic schema is presented for characterising preschool...

  4. A Compendium of Wind Statistics and Models for the NASA Space Shuttle and Other Aerospace Vehicle Programs

    Science.gov (United States)

    Smith, O. E.; Adelfang, S. I.

    1998-01-01

    The wind profile with all of its variations with respect to altitude has been, is now, and will continue to be important for aerospace vehicle design and operations. Wind profile databases and models are used for the vehicle ascent flight design for structural wind loading, flight control systems, performance analysis, and launch operations. This report presents the evolution of wind statistics and wind models from the empirical scalar wind profile model established for the Saturn Program through the development of the vector wind profile model used for the Space Shuttle design to the variations of this wind modeling concept for the X-33 program. Because wind is a vector quantity, the vector wind models use the rigorous mathematical probability properties of the multivariate normal probability distribution. When the vehicle ascent steering commands (ascent guidance) are wind biased to the wind profile measured on the day-of-launch, ascent structural wind loads are reduced and launch probability is increased. This wind load alleviation technique is recommended in the initial phase of vehicle development. The vehicle must fly through the largest load allowable versus altitude to achieve its mission. The Gumbel extreme value probability distribution is used to obtain the probability of exceeding (or not exceeding) the load allowable. The time conditional probability function is derived from the Gumbel bivariate extreme value distribution. This time conditional function is used for calculation of wind loads persistence increments using 3.5-hour Jimsphere wind pairs. These increments are used to protect the commit-to-launch decision. Other topics presented include the Shuttle load response to smoothed wind profiles, a new gust model, and advancements in wind profile measuring systems. From the lessons learned and knowledge gained from past vehicle programs, the development of future launch vehicles can be accelerated. However, new vehicle programs by their very
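
    As a sketch of the Gumbel step described above, once a location and scale have been fitted to peak wind-load samples, the probability of exceeding a load allowable follows directly from the Gumbel CDF; the parameter values below are invented for illustration, not taken from any vehicle program.

      from math import exp

      def gumbel_exceedance(allowable, mu, beta):
          # P(peak load > allowable) for a Gumbel(mu, beta) maximum.
          cdf = exp(-exp(-(allowable - mu) / beta))
          return 1.0 - cdf

      # Hypothetical load units: allowable at 120, fitted mu = 90, beta = 12.
      print(f"{gumbel_exceedance(120.0, mu=90.0, beta=12.0):.4f}")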

  5. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimensional analyses in order to analyse the effect of different layouts on the flow characteristics. In particular, flow configurations going all the way through the structure were revealed. A couple of suggestions to minimize the risk of through-flow have been tested.

  6. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  7. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.

  8. M-CGH: Analysing microarray-based CGH experiments

    Directory of Open Access Journals (Sweden)

    Meza-Zepeda Leonardo A

    2004-06-01

    Full Text Available Abstract Background Microarray-based comparative genomic hybridisation (array CGH) is a technique by which variation in relative copy numbers between two genomes can be analysed by competitive hybridisation to DNA microarrays. This technology has most commonly been used to detect chromosomal amplifications and deletions in cancer. Dedicated tools are needed to analyse the results of such experiments, which include appropriate visualisation, and to take into consideration the physical relation in the genome between the probes on the array. Results M-CGH is a MATLAB toolbox with a graphical user interface designed specifically for the analysis of array CGH experiments, with multiple approaches to ratio normalization. Specifically, the distributions of three classes of DNA copy numbers (gains, normal and losses) can be estimated using a maximum likelihood method. Amplicon boundaries are computed by either the fuzzy K-nearest neighbour method or a wavelet approach. The program also allows linking each genomic clone with the corresponding genomic information in the Ensembl database http://www.ensembl.org. Conclusions M-CGH, which encompasses the basic tools needed for analysing array CGH experiments, is freely available for academics http://www.uio.no/~junbaiw/mcgh, and does not require any other MATLAB toolbox.
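
    To illustrate the maximum-likelihood estimation of the three DNA copy-number classes described above, here is a minimal sketch fitting a three-component Gaussian mixture to simulated log2 ratios, assuming scikit-learn is available; it is a stand-in for, not a port of, the M-CGH MATLAB routine.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      log2_ratios = np.concatenate([
          rng.normal(-0.8, 0.15, 50),   # losses
          rng.normal(0.0, 0.10, 400),   # normal copy number
          rng.normal(0.7, 0.20, 50),    # gains
      ]).reshape(-1, 1)

      # One mixture component per copy-number class: loss, normal, gain.
      gmm = GaussianMixture(n_components=3, random_state=0).fit(log2_ratios)
      print(np.sort(gmm.means_.ravel()))  # estimated class centres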

  9. Considerations for planning and evaluating economic analyses of telemental health.

    Science.gov (United States)

    Luxton, David D

    2013-08-01

    The economic evaluation of telemental health (TMH) is necessary to inform ways to decrease the cost of delivering care, to improve access to care, and to make decisions about the allocation of resources. Previous reviews of telehealth economic analysis studies have concluded that there are significant methodological deficiencies and inconsistencies that limit the ability to make generalized conclusions about the costs and benefits of telehealth programs. Published economic evaluations specific to TMH are also limited. There are unique factors that influence costs in TMH that are necessary for those who are planning and evaluating economic analyses to consider. The purpose of this review is to summarize the main problems and limitations of published economic analyses, to discuss considerations specific to TMH, and to inform and encourage the economic evaluation of TMH in both the public and private sectors. The topics presented here include perspective of costs, direct and indirect costs, and technology, as well as research methodology considerations. The integration of economic analyses into effectiveness trials, the standardization of outcome measurement, and the development of TMH economic evaluation guidelines are recommended. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. Subgroup analyses in cost-effectiveness analyses to support health technology assessments.

    Science.gov (United States)

    Fletcher, Christine; Chuang-Stein, Christy; Paget, Marie-Ange; Reid, Carol; Hawkins, Neil

    2014-01-01

    'Success' in drug development is bringing to patients a new medicine that has an acceptable benefit-risk profile and that is also cost-effective. Cost-effectiveness means that the incremental clinical benefit is deemed worth paying for by a healthcare system, and it has an important role in enabling manufacturers to bring new medicines to patients as soon as possible following regulatory approval. Subgroup analyses are increasingly being utilised by decision-makers in the determination of the cost-effectiveness of new medicines when making recommendations. This paper highlights the statistical considerations when using subgroup analyses to support cost-effectiveness for a health technology assessment. The key principles recommended for subgroup analyses supporting clinical effectiveness published by Paget et al. are evaluated with respect to subgroup analyses supporting cost-effectiveness. A health technology assessment case study is included to highlight the importance of subgroup analyses when incorporated into cost-effectiveness analyses. In summary, we recommend planning subgroup analyses for cost-effectiveness analyses early in the drug development process and adhering to good statistical principles when using subgroup analyses in this context. In particular, we consider it important to provide transparency in how subgroups are defined, be able to demonstrate the robustness of the subgroup results and be able to quantify the uncertainty in the subgroup analyses of cost-effectiveness. Copyright © 2014 John Wiley & Sons, Ltd.

  11. FLUOR HANFORD SAFETY MANAGEMENT PROGRAMS

    Energy Technology Data Exchange (ETDEWEB)

    GARVIN, L. J.; JENSEN, M. A.

    2004-04-13

    This document summarizes safety management programs used within the scope of the "Project Hanford Management Contract". The document has been developed to meet the format and content requirements of DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses". This document provides summary descriptions of Fluor Hanford safety management programs, which Fluor Hanford nuclear facilities may reference and incorporate into their safety basis when producing facility- or activity-specific documented safety analyses (DSA). Facility- or activity-specific DSAs will identify any variances to the safety management programs described in this document and any specific attributes of these safety management programs that are important for controlling potentially hazardous conditions. In addition, facility- or activity-specific DSAs may identify unique additions to the safety management programs that are needed to control potentially hazardous conditions.

  12. Roux 105 PHONETIC DATA AND PHONOLOGICAL ANALYSES ...

    African Journals Online (AJOL)

    it will be demonstrated that a phonological analysis (of the same phenomenon), based on verified phonetic data, ... rectness of phonetic data on which phonological analyses are eventually based. Or, seen from another angle, very seldom ... An Introduction to Tswana Grammar. Longmans. Daniloff, R. & R. Hammarberg: ...

  13. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  14. Thermodynamic modeling to analyse composition of carbonaceous ...

    Indian Academy of Sciences (India)

    Thermodynamic modeling to analyse composition of carbonaceous coatings of MnO and other oxides of manganese grown by MOCVD. Sukanya Dhar, A. Varade and S. A. Shivashankar, Materials Research Centre, Indian Institute of Science, Bangalore 560 012, India; Mechanical Engineering Department ...

  15. Comparative sequence analyses of genome and transcriptome ...

    Indian Academy of Sciences (India)

    2015-12-04

    Dec 4, 2015 ... This work therefore provides a valuable resource to explore the immense research potential of comparative analyses of transcriptome .... species and identified domain architectures that are overrepresented in elephants. 2. Methods. 2.1 Sample collection, extraction of nucleic acids and next-generation ...

  16. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Heritability estimates derived from threshold analyses for reproduction and stayability traits in a beef cattle herd. ... South African Journal of Animal Science ... The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a ...

  17. Regression og geometrisk data analyse (2. del)

    DEFF Research Database (Denmark)

    Brinkkjær, Ulf

    2010-01-01

    The article seeks to show how regression analysis and geometric data analysis can be integrated. This is interesting because these methods are often set up as opposites, for example as an opposition between descriptive and explanatory methods. The first part of the article appeared in Praktiske Grunde 3-4 / 2007.

  18. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  19. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse: it is a medium that is not visualised through images or supported by printed text. The purpose of this article is to describe a new quantitative method for analysing radio that takes particular account of the radio medium's modality: sound structured as a linear progression in time. The method thereby supports radio both as a medium in time and as a blind medium. It was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to analysing not only radio but also other media platforms and various journalistic subject areas.

  20. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Linear model methodology, such as Henderson's method III, has frequently been used for the analysis of discontinuous as well as continuous data (Olivier et al. 1998). This method of analysing discontinuous data with linear procedures is based on continuous phenotypic distributions and does not take the discontinuity of ...

  1. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types is generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
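
    The 95%/95% criterion described above is commonly met in practice with Wilks' nonparametric tolerance-limit formula, which gives the number of best-estimate runs needed so that the largest observed result bounds the 95th percentile with 95% confidence. The report does not prescribe a particular formula; the following Python sketch merely illustrates the standard first-order, one-sided calculation.

        def wilks_sample_size(coverage=0.95, confidence=0.95):
            # Smallest n such that the maximum of n random runs bounds the
            # `coverage` quantile with probability `confidence` (first-order,
            # one-sided Wilks criterion): 1 - coverage**n >= confidence.
            n = 1
            while 1.0 - coverage ** n < confidence:
                n += 1
            return n

        print(wilks_sample_size())  # 59 runs for the classic 95%/95% criterion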

  2. ASCORBIC ACID AND MICROBIOLOGICAL ANALYSES OF EXTRA ...

    African Journals Online (AJOL)

    BSN

    Ascorbic acid and microbiological analyses of extra-cotyledonous deposits of Pride of Barbados (Caesalpina pulcherrima) stored at various temperatures were investigated. 2,6-Dichlorophenolindophenol (dye) solution titration method was used in ascorbic acid determination, while Nutrient and Sabouraud agar were ...

  3. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  4. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    How organisational policy on psychosocial workload (PSA) looks in 2014, and how it relates to other policies and outcome measures, are the central questions of this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  5. Chemical Analyses of Silicon Aerogel Samples

    CERN Document Server

    van der Werf, I; De Leo, R; Marrone, S

    2008-01-01

    After five years of operating, two Aerogel counters: A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of degradation have been studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  6. Three Bibliometric Analyses of Anthropology Literature.

    Science.gov (United States)

    Hider, Philip M.

    1996-01-01

    Describes three bibliometric analyses of articles in the United Kingdom anthropology journal, "Man," focusing on forms of materials cited; relative age of cited publications; and presentation history of articles. Discusses the applicability of bibliometric studies to anthropology librarianship, outlining problems and benefits. (AEF)

  7. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  8. Microbiological And Physicochemical Analyses Of Oil Contaminated ...

    African Journals Online (AJOL)

    Michael Horsfall

    Microbiological And Physicochemical Analyses Of Oil Contaminated Soil From Major Motor Mechanic Workshops In Benin City Metropolis, Edo State, Nigeria. EKHAISE, F O; NKWELLE, J. Department of Microbiology, Faculty of Life Sciences, University of Benin, Edo State, Nigeria. ABSTRACT: The ability of microorganisms ...

  9. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  10. Comparison of veterinary import risk analyses studies

    NARCIS (Netherlands)

    Vos-de Jong, de C.J.; Conraths, F.J.; Adkin, A.; Jones, E.M.; Hallgren, G.S.; Paisley, L.G.

    2011-01-01

    Twenty-two veterinary import risk analyses (IRAs) were audited: a) for inclusion of the main elements of risk analysis; b) between different types of IRAs; c) between reviewers' scores. No significant differences were detected between different types of IRAs, although quantitative IRAs and IRAs

  11. Chemical Analyses of Silicon Aerogel Samples

    Energy Technology Data Exchange (ETDEWEB)

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operating, two Aerogel counters: A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of degradation have been studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  12. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
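
    The quantitative claims above (published effects roughly one-third larger than unpublished ones) can be illustrated with simple inverse-variance, fixed-effect pooling. The effect sizes and variances below are hypothetical, not data from the article; the sketch only shows how a pooled estimate shifts when grey literature enters the evidential base.

        import numpy as np

        def fixed_effect_pooled(effects, variances):
            # Inverse-variance weighted pooled effect size and its standard error.
            w = 1.0 / np.asarray(variances)
            est = np.sum(w * np.asarray(effects)) / np.sum(w)
            se = np.sqrt(1.0 / np.sum(w))
            return est, se

        # Hypothetical effect sizes (and variances): published vs. grey studies
        published = ([0.45, 0.52, 0.38], [0.010, 0.020, 0.015])
        grey = ([0.30, 0.25], [0.020, 0.030])

        print(fixed_effect_pooled(*published))
        print(fixed_effect_pooled(published[0] + grey[0], published[1] + grey[1]))
        # The pooled estimate shrinks once the grey literature is included.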

  13. Analysing Stagecoach Network Problem Using Dynamic ...

    African Journals Online (AJOL)

    In this paper we present a recursive dynamic programming algorithm for solving the stagecoach problem. The algorithm is computationally more efficient than the first method as it obtains its minimum total cost using the suboptimal policies of the different stages without computing the cost of all the routes. By the dynamic ...
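
    For readers unfamiliar with the stagecoach problem, the backward recursion the abstract alludes to can be written in a few lines. The three-stage network and costs below are hypothetical, and the code is a generic dynamic-programming sketch rather than the paper's algorithm: f(i) = min over successors j of c(i, j) + f(j), so the cost of every complete route is never enumerated.

        # Stage-by-stage arc costs of a hypothetical network: A -> {B1, B2} -> {C1, C2} -> D
        costs = [
            {('A', 'B1'): 2, ('A', 'B2'): 4},
            {('B1', 'C1'): 7, ('B1', 'C2'): 4, ('B2', 'C1'): 3, ('B2', 'C2'): 2},
            {('C1', 'D'): 1, ('C2', 'D'): 5},
        ]

        def stagecoach(costs, destination='D'):
            # Backward recursion: f(i) = min_j [ c(i, j) + f(j) ].
            f = {destination: (0, None)}  # node -> (minimal cost to go, next node)
            for stage in reversed(costs):
                for (i, j), c in stage.items():
                    candidate = c + f[j][0]
                    if i not in f or candidate < f[i][0]:
                        f[i] = (candidate, j)
            return f

        f = stagecoach(costs)
        node, route = 'A', ['A']
        while f[node][1] is not None:   # trace the optimal policy forward
            node = f[node][1]
            route.append(node)
        print(f['A'][0], route)         # minimal total cost 8 via A-B2-C1-D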

  14. Point processes statistics of stable isotopes: analysing water uptake patterns in a mixed stand of Aleppo pine and Holm oak

    Directory of Open Access Journals (Sweden)

    Carles Comas

    2015-04-01

    Full Text Available Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H as hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area= 888 m2. We tested the hypothesis that both species uptake water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency, depending on whether neighbouring trees belong to one species or the other. Material and Methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a mark point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰ were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰, pointing to a greater contribution of deeper soil layers for water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.

  15. Parenting Programs

    Directory of Open Access Journals (Sweden)

    Juan Carlos Martín-Quintana

    2010-01-01

    Full Text Available This paper was aimed at emphasizing the importance of using parenting programs to promote parental competences. There is a need for this support taking into account the complexity of the parenting task in our modern societies. Following the European recommendation on positive parenting, such parenting programs are considered important measures to support parents in their educational role. Several generations of parenting programs in the international context are then briefly described, along with some examples of programs in the national context. This paper provides some reflection on three models of parental education, and shows the results of an experiential parenting program addressed to parents in psychosocial risk situations in two Spanish communities. A new program, “Crecer felices en familia”, still in the implementation phase, is also described. As a conclusion, the paper emphasizes the importance of evaluating programs in order to know more about their efficacy and to improve the way they are implemented in real settings.

  16. Radiation shielding analyses for JT-60SU

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Y. [Mitsubishi Heavy Industries Ltd., Tokyo (Japan); Neyatani, Y. [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment; Ishida, S.; Ashida, S.; Sawamura, H.; Tominaga, M.; Nishimura, K. [Computer Software Development Co., Ltd., Shinjuku, Tokyo (Japan)

    2000-03-01

    Radiation shielding analyses were done for the JT-60 Super Upgrade (JT-60SU) of JAERI in Japan, and are reported here. From the viewpoint of operation and maintenance, it is important to evaluate accurately the nuclear heating rate in the superconducting magnet and the activation of components around the vacuum vessel by irradiation of fusion neutrons. A boot-strapped calculation step was applied to the analyses to reduce excess conservatism in the results. The nuclear heating rate in the superconducting magnets and the maximum {gamma}-ray dose rate one month after shutdown in the cryostat were within the design limits of 2 mW/cc and 10 {mu}Sv/hr, respectively, in the nuclear shielding for a D-D neutron production rate of 1x10{sup 18} s{sup -1}. (author)

  17. Analyse om udviklingen i familieretlige konflikter

    DEFF Research Database (Denmark)

    Ottosen, Mai Heide

    2016-01-01

    The purpose of this analysis is to map what can be assessed, from existing data sources and studies, about the extent and nature of Danish divorced parents' family-law conflicts today. The analysis can be regarded as preparatory work for, and a basis for, further research on family-law conflicts. The analysis has three main questions: 1) Can population studies of divorce trends and of divorced parents' conflicts and cooperation tell us whether children are increasingly becoming involved in family-law conflicts? 2) Based on the family-law authorities'...

  18. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that there is a correspondence between some features of site design and user participation patterns in the projects.

  19. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly adapted to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines this method precisely, as well as its fields of application. It describes the most effective methods for product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimising product design processes within a company. -- Key ideas, by Business Digest

  20. Analytical method used for PIXE analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chassot, E. [Laboratoire de Physique Corpusculaire, CNRS-IN2P3, Universite Blaise Pascal, 24, Avenue des Landais, F-63177 Aubiere Cedex (France)]. E-mail: chassot@clermont.in2p3.fr; Irigaray, J.L. [Laboratoire de Physique Corpusculaire, CNRS-IN2P3, Universite Blaise Pascal, 24, Avenue des Landais, F-63177 Aubiere Cedex (France); Terver, S. [Centre Hospitalier Universitaire de Clermont-Ferrand, Service d' Orthopedie et de Traumatologie, Chirurgie Plastique et Reconstructive II, Universite d' Auvergne, F-63000 Clermont-Ferrand Cedex (France); Vanneuville, G. [Faculte de Medecine, Laboratoire d' Anatomie, Universite d' Auvergne, F-63000 Clermont-Ferrand Cedex (France)

    2005-04-01

    Some metallic prostheses undergo physico-chemical modifications a few years after their implantation. The organism reaction leads to a corrosion process. Consequently, liberation of metallic debris or wear might be observed. The purpose of our analysis was to determine the importance, in surrounding tissues, of the contamination by metallic elements released from joint prosthesis and to localise it. We analysed two types of samples: the first ones were taken during surgical procedure and the other ones were extracted in post-mortem tissues. PIXE technique (Particles Induced X-ray Emission) with a 50 {mu}m proton beam and 3 MeV of energy is an efficient technique to analyse these tissues. We used it to determine qualitatively and quantitatively trace element contamination, and we have developed an analytical method based on the {chi} {sup 2} distribution to separate the different contributions of the different elements, to detect the possible correlation between these metallic elements.
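
    The χ²-based separation of elemental contributions is not spelled out in the abstract; a common approach to the same task is to unmix the measured spectrum into a non-negative combination of single-element reference peaks and judge the fit by a reduced chi-square. The Python sketch below is such an analogue on simulated Gaussian K-alpha peaks (approximate energies for Cr, Fe and Ni), not the authors' exact procedure.

        import numpy as np
        from scipy.optimize import nnls

        energy = np.linspace(4.0, 9.0, 500)  # keV grid (hypothetical detector binning)

        def peak(centre, width=0.08):
            return np.exp(-0.5 * ((energy - centre) / width) ** 2)

        # Approximate K-alpha reference peaks for Cr, Fe and Ni
        refs = np.column_stack([peak(5.41), peak(6.40), peak(7.48)])

        # Simulated measured spectrum: a mixture plus Poisson counting noise
        rng = np.random.default_rng(0)
        measured = rng.poisson(refs @ np.array([120.0, 300.0, 40.0]) + 5.0).astype(float)

        coeffs, _ = nnls(refs, measured)  # non-negative least-squares unmixing
        resid = measured - refs @ coeffs
        chi2 = np.sum(resid ** 2 / np.maximum(measured, 1.0))  # Poisson-weighted
        print(coeffs, chi2 / (energy.size - refs.shape[1]))    # reduced chi-square ~ 1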

  1. Introduction à l'analyse fonctionnelle

    CERN Document Server

    Reischer, Corina; Hengartner, Walter

    1981-01-01

    The fruit of a collaboration between Professor Walter Hengartner of Université Laval and Marcel Lambert and Corina Reischer of the Université du Québec à Trois-Rivières, Introduction à l'analyse fonctionnelle stands out for both the breadth of its content and the accessibility of its presentation. Without conceding anything on rigour, it is perfectly suited to a first course in functional analysis. While primarily intended for students of mathematics, it will certainly also be useful to graduate students in science and engineering.

  2. Using ENSO to analyse Cloud Radiative Feedback

    Science.gov (United States)

    Kolly, Allison; Huang, Yi

    2017-04-01

    When attempting to diagnose the climate sensitivity, clouds are the cause of much uncertainty as they are highly variable. There exists a discrepancy between climate models and observations on the sign and magnitude of cloud radiative feedback. For example, Dessler (2013) shows that models predict a very strong, positive feedback response to ENSO sea surface temperature anomalies in the central Pacific which is not present in observations. To better understand these discrepancies we are using radiation data from the CERES satellite and ERAi reanalysis data to look at the most recent El Nino events. By looking at temperature and humidity anomalies in the central Pacific which are associated with these events, and using radiative kernels, we can calculate their radiative effects. We extend previous work by not only performing an analysis of TOA but also analysing the surface and atmospheric radiation budgets. Additionally we analyse the latest GCMs (e.g. CMIP5 models) and compare them to observations.

  3. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for analysing GPS data. The platform is built entirely from open-source components, each of which is described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy initiatives ... organisations with a digital road map and GPS data can begin to perform traffic analyses on these data. It is a requirement that appropriate IT competencies are present in the organisation.

  4. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2016-06-01

    Full Text Available This paper is the second part of our bird ringing data analyses series (Harnos et al. 2015a) in which we continue to focus on exploring data using the R software. We give a short description of data distributions and the measures of data spread and explain how to obtain basic descriptive statistics. We show how to detect and select one- and two-dimensional outliers and explain how to treat these in the case of avian ringing data.
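
    The tutorial itself works in R; as a rough Python analogue of the one- and two-dimensional outlier checks it describes, the sketch below applies Tukey's 1.5 x IQR fence to a single measurement and a Mahalanobis-distance cutoff to a pair of measurements. The wing-length and body-mass values are simulated and hypothetical.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(1)
        wing = rng.normal(52.0, 1.5, 200)   # hypothetical wing lengths (mm)
        mass = rng.normal(11.0, 0.8, 200)   # hypothetical body masses (g)
        wing[0], mass[0] = 60.0, 8.0        # plant one suspicious record

        # One-dimensional outliers: Tukey's 1.5 * IQR fence
        q1, q3 = np.percentile(wing, [25, 75])
        iqr = q3 - q1
        out_1d = (wing < q1 - 1.5 * iqr) | (wing > q3 + 1.5 * iqr)

        # Two-dimensional outliers: squared Mahalanobis distance vs. chi-square cutoff
        X = np.column_stack([wing, mass])
        diff = X - X.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
        out_2d = d2 > chi2.ppf(0.999, df=2)
        print(np.where(out_1d)[0], np.where(out_2d)[0])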

  5. Inelastic and Dynamic Fracture and Stress Analyses

    Science.gov (United States)

    Atluri, S. N.

    1984-01-01

    Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations, (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.

  6. TARDEC FMEA TRAINING: Understanding and Evaluating Failure Mode and Effects Analyses (FMEA)

    Science.gov (United States)

    2012-06-07

    Unclassified briefing charts (May 2012): TARDEC FMEA TRAINING: Understanding and Evaluating Failure Mode and Effects Analyses (FMEA). TARDEC Systems Engineering Risk... Author: Kadry Rizk.

  7. Renforcement de l'analyse sociale et de l'analyse sexospécifique en ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ENRM in the MENA region ... The project should make it possible to build a solid base of researchers well versed in social and gender analysis, and to improve the means of ... Value chains as strategic levers.

  8. TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  9. The Family Startup Program

    DEFF Research Database (Denmark)

    Trillingsgaard, Tea; Maimburg, Rikke Damkjær; Simonsen, Marianne

    2015-01-01

    However, little is known about the effect of universal approaches to parenting support during the transition to parenthood. This protocol describes an experimental evaluation of group-based parenting support, the Family Startup Program (FSP), currently implemented at large scale in Denmark. Methods/design: Participants will be approximately 2500 pregnant women and their partners. Inclusion criteria are parental age above 18 and the mother expecting her first child. Families are recruited when attending routine pregnancy scans provided as part of the publicly available prenatal care program at Aarhus University Hospital ... and community resources. The program consists of twelve group sessions, with nine families in each group, continuing from pregnancy until the child is 15 months old. TAU is the publicly available pre- and postnatal care available to families in both conditions. Analyses will employ survey data, administrative...

  10. SERI Wind Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Noun, R. J.

    1983-06-01

    The SERI Wind Energy Program manages the areas of innovative research, wind systems analysis, and environmental compatibility for the U.S. Department of Energy. Since 1978, SERI wind program staff have conducted in-house aerodynamic and engineering analyses of novel concepts for wind energy conversion and have managed over 20 subcontracts to determine technical feasibility; the most promising of these concepts is the passive blade cyclic pitch control project. In the area of systems analysis, the SERI program has analyzed the impact of intermittent generation on the reliability of electric utility systems using standard utility planning models. SERI has also conducted methodology assessments. Environmental issues related to television interference and acoustic noise from large wind turbines have been addressed. SERI has identified the causes, effects, and potential control of acoustic noise emissions from large wind turbines.

  11. The role of unconditioned and conditioned cumulative distribution functions in reservoir reliability programing

    Science.gov (United States)

    Simonović, Slobodan P.; Mariño, Miguel A.

    1981-07-01

    This paper presents a comparison of results obtained from the reliability program of a reservoir management problem based on the use of unconditioned and conditioned cumulative distribution functions (CDF's). The reliability program considers the reservoir releases and the reliabilities of the elements of the system as decision variables. Data from the Vardar River system in Yugoslavia are fitted with Pearson Type-III distributions for the case of unconditioned CDF's and with bivariate gamma distributions for the case of conditioned CDF's. The use of conditioned CDF's in the reliability program yields objective-function values and reliability tolerances that are greater than those obtained from the use of unconditioned CDF's. Thus, the use of conditioned CDF's represents one step forward in overcoming the conservative nature of stochastic models used for the design and/or operation of multipurpose multiple reservoir systems.
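
    As a minimal illustration of the unconditioned side of this comparison, the sketch below fits a Pearson Type III distribution to a hypothetical inflow record and evaluates the reliability of meeting a target release as an exceedance probability. The conditioned case, which the paper handles with bivariate gamma distributions, has no off-the-shelf scipy counterpart and is omitted here.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        # Hypothetical 60-year annual inflow record (arbitrary volume units)
        inflows = stats.gamma.rvs(a=4.0, scale=50.0, size=60, random_state=rng)

        # Fit a three-parameter Pearson Type III distribution to the record
        skew, loc, scale = stats.pearson3.fit(inflows)

        # Unconditioned reliability of meeting a target release: P(inflow >= target)
        target = 150.0
        print(stats.pearson3.sf(target, skew, loc=loc, scale=scale))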

  12. Advanced composite turboprops - Modeling, structural, and dynamic analyses

    Science.gov (United States)

    Aiello, R. A.; Chi, S.

    1988-01-01

    This paper presents a structural and dynamic analysis of a scaled-down wind tunnel model propfan blade made from fiber composites. This blade is one of a series of propfan blades that have been tested at the NASA Lewis Research Center wind tunnel facilities. The blade is highly swept and twisted and of the spar/shell construction. Due to the complexity of the blade geometry and its high performance, it is subjected to much higher loads and tends to be much less stable than conventional blades. The structural and dynamic analyses of the blade were performed using the NASA-Lewis COBSTRAN computer code. COBSTRAN is designed to generate the mesh and calculate the anisotropic material properties for composite blade analysis. Comparison of analytical and experimental mode shapes and frequencies are shown, verifying the model development and analysis techniques used. The methodologies and programs developed for this analysis are directly applicable to other propfan blades.

  13. Trend analyses of sediment data for the DEC project

    Science.gov (United States)

    Rebich, Richard Allen

    1995-01-01

    Daily stream discharge, suspended-sediment concentration, and suspended-sediment discharge data were collected at eight sites in six watersheds of the Demonstration Erosion Control project in the Yazoo River Basin in north-central Mississippi during the period July 1985 through September 1991. The project is part of an ongoing interagency program of planning, design, construction, monitoring, and evaluation to alleviate flooding, erosion, sedimentation, and water-quality problems for watersheds located in the bluff hills upstream of the Mississippi River alluvial plain. This paper presents preliminary results of trend analyses for stream discharge and sediment data for the eight project sites. More than 550 stream discharge measurements and 20,000 suspended-sediment samples have been collected at the eight sites since 1985.

  14. Material Programming

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Tsaknaki, Vasiliki

    2017-01-01

    , and color, but additionally being capable of sensing, actuating, and computing. Indeed, computers will not be things in and by themselves, but embedded into the materials that make up our surroundings. This also means that the way we interact with computers and the way we program them will change. Consequently, we ask what the practice of programming and giving form to such materials would be like. How would we be able to familiarize ourselves with the dynamics of these materials and their different combinations of cause and effect? Which tools would we need and what would they look like? Will we program these computational composites through external computers and then transfer the code to them, or will the programming happen closer to the materials? In this feature we outline a new research program that floats between imagined futures and the development of a material programming practice.

  15. Selection Effects and Prevention Program Outcomes

    Science.gov (United States)

    Hill, Laura G.; Rosenman, Robert; Tennekoon, Vidhura; Mandal, Bidisha

    2013-01-01

    A primary goal of the paper is to provide an example of an evaluation design and analytic method that can be used to strengthen causal inference in nonexperimental prevention research. We used this method in a nonexperimental multisite study to evaluate short-term outcomes of a preventive intervention, and we accounted for effects of two types of selection bias: self-selection into the program and differential dropout. To provide context for our analytic approach, we present an overview of the counterfactual model (also known as Rubin’s causal model or the potential outcomes model) and several methods derived from that model, including propensity score matching, the Heckman two-step approach, and full information maximum likelihood based on a bivariate probit model and its trivariate generalization. We provide an example using evaluation data from a community-based family intervention and a nonexperimental control group constructed from the Washington state biennial Healthy Youth risk behavior survey data (HYS) (HYS n = 68,846; intervention n = 1502). We identified significant effects of participant, program, and community attributes in self-selection into the program and program completion. Identification of specific selection effects is useful for developing recruitment and retention strategies, and failure to identify selection may lead to inaccurate estimation of outcomes and their public health impact. Counterfactual models allow us to evaluate interventions in uncontrolled settings and still maintain some confidence in the internal validity of our inferences; their application holds great promise for the field of prevention science as we scale up to community dissemination of preventive interventions. PMID:23417667
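
    Of the methods listed above, the Heckman two-step approach is the simplest to sketch. The simulation below is entirely hypothetical (the covariate x, instrument z and all coefficients are invented): a probit selection equation yields an inverse Mills ratio, which then enters the outcome regression on the self-selected sample to correct for selection on unobservables.

        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n = 2000
        z = rng.normal(size=n)  # instrument: affects enrolment, not the outcome
        x = rng.normal(size=n)  # covariate in the outcome equation
        u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n).T

        enrolled = (0.5 + 1.0 * z + u > 0)  # self-selection into the program
        y = 1.0 + 2.0 * x + e               # outcome, used for enrolees only

        # Step 1: probit selection equation, then the inverse Mills ratio
        probit = sm.Probit(enrolled.astype(float),
                           sm.add_constant(np.column_stack([z, x]))).fit(disp=0)
        xb = probit.fittedvalues            # linear predictor
        imr = norm.pdf(xb) / norm.cdf(xb)

        # Step 2: outcome regression on the enrolled sample, augmented with the IMR
        X2 = sm.add_constant(np.column_stack([x[enrolled], imr[enrolled]]))
        ols = sm.OLS(y[enrolled], X2).fit()
        print(ols.params)  # slope on x near 2; the IMR term absorbs the selection bias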

  16. Radiochemical analyses of several spent fuel Approved Testing Materials

    Energy Technology Data Exchange (ETDEWEB)

    Guenther, R.J.; Blahnik, D.E.; Wildung, N.J.

    1994-09-01

    Radiochemical characterization data are described for UO{sub 2} and UO{sub 2} plus 3 wt% Gd{sub 2}O{sub 3} commercial spent nuclear fuel taken from a series of Approved Testing Materials (ATMs). These full-length nuclear fuel rods include MLA091 of ATM-103, MKP070 of ATM-104, NBD095 and NBD131 of ATM-106, and ADN0206 of ATM-108. ATMs 103, 104, and 106 were all irradiated in the Calvert Cliffs Nuclear Power Plant (Reactor No.1), a pressurized-water reactor that used fuel fabricated by Combustion Engineering. ATM-108 was part of the same fuel bundle designated as ATM-105 and came from boiling-water reactor fuel fabricated by General Electric and irradiated in the Cooper Nuclear Power Plant. Rod average burnups and expected fission gas releases ranged from 2,400 to 3,700 GJ/kgM (25 to 40 MWd/kgM) and from less than 1% to greater than 10%, respectively, depending on the specific ATM. The radiochemical analyses included uranium and plutonium isotopes in the fuel, selected fission products in the fuel, fuel burnup, cesium and iodine on the inner surfaces of the cladding, {sup 14}C in the fuel and cladding, and analyses of the gases released to the rod plenum. Supporting examinations such as fuel rod design and material descriptions, power histories, and gamma scans used for sectioning diagrams are also included. These ATMs were examined as part of the Materials Characterization Center Program conducted at Pacific Northwest Laboratory to provide a source of well-characterized spent fuel for testing in support of the U.S. Department of Energy Office of Civilian Radioactive Waste Management Program.

  17. Effective Programming

    DEFF Research Database (Denmark)

    Frost, Jacob

    To investigate the use of VTLoE as a basis for formal derivation of functional programs with effects. As a part of the process, a number of issues central to effective formal programming are considered. In particular it is considered how to develop a proof system suitable for practical reasoning, how to implement this system in the generic proof assistant Isabelle and finally how to apply the logic and the implementation to programming.

  18. Program Fullerene

    DEFF Research Database (Denmark)

    Wirz, Lukas; Schwerdtfeger, Peter; Avery, James Emil

    2013-01-01

    Fullerene (Version 4.4) is a general-purpose open-source program that can generate any fullerene isomer, perform topological and graph theoretical analysis, as well as calculate a number of physical and chemical properties. The program creates symmetric planar drawings of the fullerene graph ......-Fowler, and Brinkmann-Fowler vertex insertions. The program is written in standard Fortran and C++, and can easily be installed in a Linux or UNIX environment.

  19. Stable isotopic analyses in paleoclimatic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Wigand, P.E. [Univ. and Community College System of Nevada, Reno, NV (United States)

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO{sub 2} composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that when calibrated with modern analogue studies have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  20. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...... on the expected impact. We demonstrate our approach on a home-payment system. The system is specifically designed to help elderly or disabled people, who may have difficulties leaving their home, to pay for some services, e.g., care-taking or rent. The payment is performed using the remote control of a television...

  1. Deux perspectives pour analyser les relations professionnelles

    OpenAIRE

    Dunlop, John T.; Whyte, William F.; Mias, Arnaud

    2016-01-01

    This article is a translation of an article published in the Industrial and Labor Relations Review, following a debate organised at Princeton University in early 1949 between William Foote Whyte (1914-2000) and John Thomas Dunlop (1914-2003) on the analytical framework for industrial relations, which were then the subject of a rapidly growing body of research in the United States. This controversy between one of the leading figures of the movement of “relatio...

  2. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    , if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach that makes behaviour a separate component in system models, and explore how to integrate...... attackers. Therefore, many attacks are considerably easier for insiders to perform than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex...

  3. Analysing human genomes at different scales

    DEFF Research Database (Denmark)

    Liu, Siyang

    genotypes in the Danish Genetics of Overweight Young Adults (GOYA) obesity cohort and demonstrate the clinical usage of the Danish reference panel in genome-wide association studies. In the second project, we have collected ultra-low-depth sequencing data from more than 140,000 Chinese pregnant women. We developed and applied novel methods for analysing the data, which are accumulating rapidly and now reach the scale of millions of samples. We show that we are able to discover mutations with allele frequencies down to around 0.2% and to explore fine-scale population structure and ancestry across the 31 administrative divisions...

  4. Visuelle Analyse von Eye-Tracking-Daten

    OpenAIRE

    Chen, Xuemei

    2011-01-01

    Eye tracking is one of the most frequently used techniques for analysing human-computer interaction and for studying perception. The captured eye-tracking data are usually analysed with heat maps or scan paths in order to assess the usability of the tested application or to draw conclusions about higher cognitive processes. The goal of this diploma thesis is the development of new visualisation techniques for eye-tracking data and the development of a study concept...

  5. Method of performing computational aeroelastic analyses

    Science.gov (United States)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.

  6. En analyse av Yoga-kundalini-upanisad

    OpenAIRE

    Martinsen, Fred

    2006-01-01

    The thesis En analyse av Yoga-kundalini-upanisad is based on the Indian ascetic Narayanaswamy Aiyer's English translation of the Yoga-kundalini-upanisad, published in Thirty Minor Upanisad-s, Including the Yoga Upanisad-s (Oklahoma, Santarasa Publications, 1980). This Hindu text is described as one of the 21 yoga upanishads, the eighty-sixth of the 108 classical upanishads, and forms part of the Krsna-Yajurveda text corpus. The text serves as a manual of exercises from the disciplines of hathayoga, ...

  7. Programming Interactivity

    CERN Document Server

    Noble, Joshua

    2009-01-01

    Make cool stuff. If you're a designer or artist without a lot of programming experience, this book will teach you to work with 2D and 3D graphics, sound, physical interaction, and electronic circuitry to create all sorts of interesting and compelling experiences -- online and off. Programming Interactivity explains programming and electrical engineering basics, and introduces three freely available tools created specifically for artists and designers: Processing, a Java-based programming language and environment for building projects on the desktop, Web, or mobile phones; Arduino, a system t

  8. Programming F#

    CERN Document Server

    Smith, Chris

    2009-01-01

    Why learn F#? This multi-paradigm language not only offers you an enormous productivity boost through functional programming, it also lets you develop applications using your existing object-oriented and imperative programming skills. With Programming F#, you'll quickly discover the many advantages of Microsoft's new language, which includes access to all the great tools and libraries of the .NET platform. Learn how to reap the benefits of functional programming for your next project -- whether it's quantitative computing, large-scale data exploration, or even a pursuit of your own. With th

  9. Impactos do Programa Bolsa Família federal sobre o trabalho infantil e a frequência escolar Impacts of the Bolsa Família Program on child labor and school attendance

    Directory of Open Access Journals (Sweden)

    Maria Cristina Cacciamali

    2010-08-01

    Full Text Available This paper analyses the impact of the Bolsa Família Program on the incidence of child labor and the school attendance of children from poor families in Brazil in 2004, according to census situation and region. For the statistical tests we used a bivariate probit model, which jointly estimates children's decisions to work and to study. The results corroborate the efficiency of the Bolsa Família Program in raising the school attendance of children; however, the Program has perverse effects on the incidence of child labor, raising the probability of its occurrence. Moreover, children of poor families located in rural areas face worse conditions than those in urban areas, demanding specific actions in their favor.

  10. Quantitative analyses of circadian gene expression in mammalian cell cultures.

    Directory of Open Access Journals (Sweden)

    Mariko Izumo

    2006-10-01

    Full Text Available The central circadian pacemaker is located in the hypothalamus of mammals, but essentially the same oscillating system operates in peripheral tissues and even in immortalized cell lines. Using luciferase reporters that allow automated monitoring of circadian gene expression in mammalian fibroblasts, we report the collection and analysis of precise rhythmic data from these cells. We use these methods to analyze signaling pathways of peripheral tissues by studying the responses of Rat-1 fibroblasts to ten different compounds. To quantify these rhythms, which show significant variation and large non-stationarities (damping and baseline drifting), we developed a new fast Fourier transform-nonlinear least squares analysis procedure that specifically optimizes the quantification of amplitude for circadian rhythm data. This enhanced analysis method successfully distinguishes among the ten signaling compounds for their rhythm-inducing properties. We pursued detailed analyses of the responses to two of these compounds that induced the highest amplitude rhythms in fibroblasts: forskolin (an activator of adenylyl cyclase) and dexamethasone (an agonist of glucocorticoid receptors). Our quantitative analyses clearly indicate that the synchronization mechanisms by the cAMP and glucocorticoid pathways are different, implying that actions of different genes stimulated by these pathways lead to distinctive programs of circadian synchronization.
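
    The FFT-NLLS procedure itself is specialised, but its core idea (seed a nonlinear least-squares fit of a damped cosine with a period taken from the FFT) can be sketched on simulated luminescence data. All values below are hypothetical, and the single-component model is a simplification of the published method.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(4)
        t = np.arange(0, 120, 0.5)  # hours of recording (hypothetical)
        y = (100 + 40 * np.exp(-t / 60) * np.cos(2 * np.pi * t / 24.2)
             + rng.normal(0, 3, t.size))

        def damped_cosine(t, base, amp, decay, period, phase):
            return base + amp * np.exp(-t / decay) * np.cos(2 * np.pi * t / period + phase)

        # Initialise the period from the dominant FFT frequency, then refine by NLLS
        freqs = np.fft.rfftfreq(t.size, d=0.5)
        spec = np.abs(np.fft.rfft(y - y.mean()))
        period0 = 1.0 / freqs[np.argmax(spec)]
        popt, _ = curve_fit(damped_cosine, t, y,
                            p0=[y.mean(), y.std(), 50.0, period0, 0.0])
        print(popt[3], abs(popt[1]))  # fitted period (~24 h) and amplitude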

  11. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.
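
    The modified algorithms are not reproduced in the abstract, but the underlying fractal analysis of networks commonly uses greedy-colouring box covering: nodes farther apart than the box size may not share a box, so a proper colouring of that conflict graph yields the box count, and the log-log slope of box count versus box size estimates the box-counting dimension. The sketch below runs on a stand-in networkx graph rather than the paper's data sets.

        import networkx as nx
        import numpy as np

        def box_count(G, l_box):
            # Greedy-colouring box covering: boxes have diameter < l_box.
            dist = dict(nx.all_pairs_shortest_path_length(G))
            aux = nx.Graph()
            aux.add_nodes_from(G)
            for u in G:  # connect nodes that must NOT share a box
                for v in G:
                    if u < v and dist[u].get(v, np.inf) >= l_box:
                        aux.add_edge(u, v)
            return len(set(nx.greedy_color(aux).values()))

        G = nx.barabasi_albert_graph(200, 2, seed=5)  # stand-in network
        sizes = [2, 3, 4, 5, 6]
        counts = [box_count(G, l) for l in sizes]
        # For a fractal network, log(counts) vs log(sizes) is linear with slope -d_B
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        print(-slope)  # box-counting dimension estimate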

  12. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2015-12-01

    Full Text Available Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns. However, data deriving from bird ringing have been used in an array of other disciplines including population monitoring, changes in demography, conservation management and the study of the effects of climate change, to name a few. Despite the widespread usage and importance, there are no guidelines available specifically describing the practice of data management, preparation and analyses of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail, and through a real-life example, the intricacies of data cleaning and how to create a data table ready for analyses from raw ringing data in the R software environment. Moreover, we created and present here the R package ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxa or research question.

  13. Autisme et douleur – analyse bibliographique

    Science.gov (United States)

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review aims to take stock of published work in the field of pain and autism. The article first addresses published studies concerning the modes of pain expression observed in this population. Various hypotheses that might explain the expressive particularities of people with autism are then reviewed: excess endorphins, particularities in sensory processing, and socio-communicative deficit. The review finally addresses the question of assessing and managing pain in people with autism. The authors conclude that the results of published studies are not homogeneous and that further research is needed to arrive at consensual data in a field still little explored scientifically. On a clinical level, deepening knowledge in this area should make it possible to develop pain assessment tools and thus ensure better day-to-day pain management. PMID:20808970

  14. Analyses of containment structures with corrosion damage

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, J.L. [Sandia National Labs., Albuquerque, NM (United States)

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a ''lower bound'', ''best estimate'', and ''upper bound'' failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  15. Post hoc analyses: after the facts.

    Science.gov (United States)

    Srinivas, Titte R; Ho, Bing; Kang, Joseph; Kaplan, Bruce

    2015-01-01

    Prospective clinical trials are constructed with high levels of internal validity. Sample size and power considerations usually address primary endpoints. Primary endpoints have traditionally included events that are becoming increasingly less common and thus have led to growing use of composite endpoints and noninferiority trial designs in transplantation. This approach may mask real clinical benefit in one or the other domain with regard to either clinically relevant secondary endpoints or other unexpected findings. In addition, endpoints solely chosen based on power considerations are prone to misjudgment of actual treatment effect size as well as consistency of that effect. In the instances where treatment effects may have been underestimated, valuable information may be lost if buried within a composite endpoint. In all these cases, analyses and post hoc analyses of data become relevant in informing practitioners about clinical benefits or safety signals that may not be captured by the primary endpoint. On the other hand, there are many pitfalls in using post hoc determined endpoints. This short review is meant to allow readers to appreciate post hoc analysis not as an entity with a single approach, but rather as an analysis with unique limitations and strengths that often raise new questions to be addressed in further inquiries.

  16. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications effectively. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. Software components themselves are well tested, of course, but when they are composed, problems occur. Most problems arise from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework, and it describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  17. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
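
    As a toy illustration of the "pinching" idea (a plain Monte Carlo simplification, not the report's probability-bounds machinery), the sketch below represents epistemic uncertainty about an input's mean as an interval of candidate values and measures how much the output's percentile envelope shrinks when that interval is pinched to a single point. The model y = a * b and all numbers are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def pbox_width(mu_candidates, n=20_000):
            # Envelope of the output's 5th-95th percentile band over all
            # epistemic candidates for the mean of input a.
            lo, hi = np.inf, -np.inf
            for mu in mu_candidates:
                a = rng.normal(mu, 0.1, n)      # aleatory variability
                b = rng.uniform(1.0, 2.0, n)    # second input
                y = a * b
                lo = min(lo, np.percentile(y, 5))
                hi = max(hi, np.percentile(y, 95))
            return hi - lo

        base = pbox_width(np.linspace(0.8, 1.2, 9))  # epistemic interval on the mean
        pinched = pbox_width([1.0])                  # interval "pinched" to a point
        print(f"uncertainty reduction from pinching: {1 - pinched / base:.0%}")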

  18. Bioinformatics tools for analysing viral genomic data.

    Science.gov (United States)

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.

  19. Leukoreduction program for red blood cell transfusions in coronary surgery: association with reduced acute kidney injury and in-hospital mortality.

    Science.gov (United States)

    Romano, Gianpaolo; Mastroianni, Ciro; Bancone, Ciro; Della Corte, Alessandro; Galdieri, Nicola; Nappi, Gianantonio; De Santo, Luca Salvatore

    2010-07-01

    Leukocytes in allogeneic blood transfusions cause several immunomodulatory events. This before-and-after cohort study evaluated clinical outcomes after adoption of a prestorage leukoreduction program for blood transfusions, with particular focus on acute kidney injury. One thousand thirty-four consecutive patients who underwent on-pump coronary artery bypass grafting between January 2004 and December 2007 were included. Propensity score analysis for transfusion was performed in the whole population; patients who were actually transfused were then divided according to leukoreduction. From these 2 groups, 147 pairs matched for propensity score were considered to evaluate the effects of leukoreduction with bivariate and multivariable analyses, with all-cause in-hospital mortality and morbidity as main outcomes. Unadjusted in-hospital mortalities were 6.6% for the entire cohort and 44.2% for those with acute kidney injury. In the matched population, after introduction of leukoreduction, mortality rates decreased to 5.4% (vs 11.4%) and acute kidney injury (RIFLE [Risk, Injury, Failure, Loss of function, End-stage renal disease] class R or greater) dropped from 51.7% to 41.5% (relative risk -20%, P < .045). No difference emerged regarding other major complications. In multivariable analysis, intra-aortic balloon pump, RIFLE score, and propensity score for transfusion proved independent predictors of in-hospital mortality. Intra-aortic balloon pump and nonleukodepleted transfusion emerged as independent predictors of acute kidney injury. Multivariable analysis on the overall cohort of transfused patients confirmed that nonleukodepleted transfusion was an independent predictor of acute kidney injury. Leukoreduction of allogeneic blood products is associated with decreased acute kidney injury and mortality in highly transfused patients. 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
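
    A minimal sketch of the general propensity-score workflow described above (a logistic model for the probability of transfusion followed by 1:1 nearest-neighbour matching), on synthetic data; the covariates, coefficients, and sample are hypothetical, not the study's actual code.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 1000
        age = rng.normal(65, 10, n)          # hypothetical covariates
        euroscore = rng.normal(5, 2, n)
        X = np.column_stack([age, euroscore])

        # Treatment (transfusion) depends on the covariates -> confounding.
        p_treat = 1 / (1 + np.exp(-(0.05 * (age - 65) + 0.3 * (euroscore - 5))))
        treated = rng.random(n) < p_treat

        # Step 1: propensity score = P(treated | covariates).
        ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

        # Step 2: 1:1 nearest-neighbour matching on the propensity score.
        pairs, used = [], set()
        controls = np.where(~treated)[0]
        for i in np.where(treated)[0]:
            j = controls[np.argmin(np.abs(ps[controls] - ps[i]))]
            if j not in used:
                used.add(j)
                pairs.append((i, j))
        print(f"{len(pairs)} matched pairs; outcomes are then compared within pairs")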

  20. Awareness of pre-exposure prophylaxis (PrEP) among women who inject drugs in NYC: the importance of networks and syringe exchange programs for HIV prevention.

    Science.gov (United States)

    Walters, Suzan M; Reilly, Kathleen H; Neaigus, Alan; Braunstein, Sarah

    2017-06-29

    Women who inject drugs (WWID) are at heightened risk for HIV due to biological, behavioral, and structural factors. Pre-exposure prophylaxis (PrEP) could aid in HIV prevention for WWID. However, little is known about WWID awareness of PrEP, which is a necessary step that must occur before PrEP uptake. We report factors associated with greater awareness among WWID to identify efficient means of awareness dissemination. Data from the 2015 National HIV Behavioral Surveillance (NHBS) system cycle on injection drug use collected in New York City (NYC) were used. Bivariable analyses, using chi-squared statistics, were conducted to examine correlates of awareness of PrEP with socio-demographic, behavioral, and health care variables. Multivariable logistic regression was used to estimate adjusted associations and determine differences in awareness of PrEP. The analysis consisted of 118 WWID. Awareness of PrEP was relatively low (31%), and risk factors were high. In the last 12 months, almost two thirds (65%) reported condomless sex, approximately one third (31%) reported transactional sex, and one third (32%) reported sharing injection equipment. In multivariable logistic regression, increased PrEP awareness was associated with reported transactional sex (AOR 3.32, 95% CI 1.22-9.00) and having a conversation about HIV prevention at a syringe exchange program (SEP) (AOR 7.61, 95% CI 2.65-21.84). We did not find race, education, household income, age, binge drinking, or sexual identity to be significantly associated with PrEP awareness. Large proportions of WWID were unaware of PrEP. These findings suggest that social networks (specifically sex work and SEP networks) are an efficient means for disseminating messaging about prevention materials such as PrEP. We recommend that SEP access increase, SEP processes be adopted in other health care settings, and WWID networks be utilized to increase PrEP awareness.
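
    A minimal sketch of the two analysis steps described (bivariable chi-squared tests, then multivariable logistic regression reporting adjusted odds ratios with 95% confidence intervals), using scipy and statsmodels on synthetic data; the prevalences are loosely based on the figures above and all variable names are assumptions.

        import numpy as np
        import scipy.stats as st
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 118
        transactional_sex = rng.random(n) < 0.31     # hypothetical prevalences
        sep_conversation = rng.random(n) < 0.40
        aware = rng.random(n) < (0.15 + 0.3 * sep_conversation)

        # Bivariable analysis: chi-squared test on the 2x2 table.
        table = np.array([[np.sum(aware & sep_conversation),
                           np.sum(aware & ~sep_conversation)],
                          [np.sum(~aware & sep_conversation),
                           np.sum(~aware & ~sep_conversation)]])
        res = st.chi2_contingency(table)
        p_bivar = res[1]

        # Multivariable logistic regression: adjusted odds ratios with 95% CI.
        X = sm.add_constant(np.column_stack([transactional_sex, sep_conversation]))
        fit = sm.Logit(aware.astype(float), X).fit(disp=0)
        aor = np.exp(fit.params[1:])
        ci = np.exp(fit.conf_int()[1:])
        print(f"chi2 p = {p_bivar:.3f}; AORs = {aor.round(2)}; 95% CIs =\n{ci.round(2)}")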

  1. Under-ascertainment from healthcare settings of child abuse events among children of soldiers by the U.S. Army Family Advocacy Program.

    Science.gov (United States)

    Wood, Joanne N; Griffis, Heather M; Taylor, Christine M; Strane, Douglas; Harb, Gerlinde C; Mi, Lanyu; Song, Lihai; Lynch, Kevin G; Rubin, David M

    2017-01-01

    In cases of maltreatment involving children of U.S. Army service members, the U.S. Army Family Advocacy Program (FAP) is responsible for providing services to families and ensuring child safety. The percentage of cases of maltreatment that are known to FAP, however, is uncertain. Thus, the objective of this retrospective study was to estimate the percentage of U.S. Army dependent children with child maltreatment as diagnosed by a military or civilian medical provider who had a substantiated report with FAP from 2004 to 2007. Medical claims data were used to identify 0- to 17-year-old child dependents of soldiers who received a medical diagnosis of child maltreatment. Linkage rates of maltreatment medical diagnoses with corresponding substantiated FAP reports were calculated. Bivariate and multivariable analyses examined the association of child, maltreatment episode, and soldier characteristics with linkage to substantiated FAP reports. Across 5945 medically diagnosed maltreatment episodes, 20.3% had a substantiated FAP report. Adjusting for covariates, the predicted probability of linkage to a substantiated FAP report was higher for physical abuse than for sexual abuse, 25.8%, 95% CI (23.4, 28.3) versus 14.5%, 95% CI (11.2, 17.9). Episodes in which early care was provided at civilian treatment facilities were less likely to have a FAP report than those treated at military facilities, 9.8%, 95% CI (7.3, 12.2) versus 23.6%, 95% CI (20.8, 26.4). The observed low rates of linkage of medically diagnosed child maltreatment to substantiated FAP reports may signal the need for further regulation of FAP reporting requirements, particularly for children treated at civilian facilities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Impact of teachers training on HIV/AIDS education program among secondary school students in Bangladesh: A cross-sectional survey.

    Science.gov (United States)

    Sarma, Haribondhu; Islam, Mohammad Ashraful; Khan, Jahidur Rahman; Chowdhury, Kamal Ibne Amin; Gazi, Rukhsana

    2017-01-01

    In 2007, the Government of Bangladesh incorporated a chapter on HIV/AIDS into the national curriculum for an HIV-prevention program for school students. For the efficient dissemination of knowledge, an intervention was designed to train the teachers and equip them to educate on the topic of HIV/AIDS. The present study intended to understand the impact of this intervention by assessing the knowledge, attitudes and behaviours related to HIV/AIDS among the targeted students. A cross-sectional survey was conducted with the students at randomly selected schools from two adjacent districts. Considering exposure to the intervention, one district was assigned for intervention and the other as a control. In total, 1,381 students aged 13-18 years (or above) were interviewed, 675 from the control areas and 706 from the intervention areas. Univariate and bivariate analyses were performed on the collected data. A significantly higher proportion (p < 0.05) of students in the intervention areas attended HIV/AIDS classes, demonstrated better knowledge, and held fewer misconceptions regarding the transmission and prevention of HIV. The same held for their attitude towards people living with HIV, with a significantly higher proportion (p < 0.05) in the intervention areas expressing a positive attitude. Students in the intervention areas were also more likely to have good knowledge of HIV transmission (OR 2.71, 95% CI 1.74-4.22) and prevention (OR 2.15, 95% CI 1.41-3.26) compared to the students in the control areas. The training programme needs to be scaled up, since it is likely to have an impact among students; we have witnessed that the intervention particularly helped increase HIV/AIDS knowledge among students and positively change the students' attitudes towards HIV/AIDS.

  3. Computer Programs.

    Science.gov (United States)

    Anderson, Tiffoni

    This module provides information on development and use of a Material Safety Data Sheet (MSDS) software program that seeks to link literacy skills education, safety training, and human-centered design. Section 1 discusses the development of the software program that helps workers understand the MSDSs that accompany the chemicals with which they…

  4. HIV Testing Among Young People Aged 16-24 in South Africa: Impact of Mass Media Communication Programs.

    Science.gov (United States)

    Do, Mai; Figueroa, Maria Elena; Lawrence Kincaid, D

    2016-09-01

    Knowing one's serostatus is critical in the HIV prevention, care and treatment continuum. This study examines the impact of communication programs on HIV testing in South Africa. Data came from 2204 young men and women aged 16-24 who reported to be sexually active in a population based survey. Structural equation modeling was used to test the directions and causal pathways between communication program exposure, HIV testing discussion, and having a test in the last 12 months. Bivariate and multivariate probit regressions provided evidence of exogeneity of communication exposure and the two HIV-related outcomes. One in three sampled individuals had been tested in the last 12 months. Communication program exposure only had an indirect effect on getting tested by encouraging young people to talk about testing. The study suggests that communication programs may create an environment that supports open HIV-related discussions and may have a long-term impact on behavior change.

  5. Choreographic Programming

    DEFF Research Database (Denmark)

    Montesi, Fabrizio

    , as they offer a concise view of the message flows enacted by a system. For this reason, in the last decade choreographies have been used in the development of programming languages, giving rise to a programming paradigm that in this dissertation we refer to as Choreographic Programming. Recent studies show...... endpoint described in a choreography can then be automatically generated, ensuring that such implementations are safe by construction. However, current formal models for choreographies do not deal with critical aspects of distributed programming, such as asynchrony, mobility, modularity, and multiparty...... sessions; it remains thus unclear whether choreographies can still guarantee safety when dealing with such nontrivial features. This PhD dissertation argues for the suitability of choreographic programming as a paradigm for the development of safe distributed systems. We proceed by investigating its...

  6. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with: ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study…

  7. Selecting Television Programs for Language Learning: Investigating Television Programs from the Same Genre

    Science.gov (United States)

    Webb, Stuart

    2011-01-01

    The scripts of 288 television episodes were analysed to determine the extent to which vocabulary reoccurs in television programs from the same subgenres and unrelated television programs from different genres. Episodes from two programs from each of the following three subgenres of the American drama genre: medical, spy/action, and criminal…

  8. Department of Energy's team's analyses of Soviet designed VVERs

    Energy Technology Data Exchange (ETDEWEB)

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed are, respectively: radiation-induced embrittlement and annealing of reactor pressure vessel steels; loss-of-coolant-accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; "S" seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  9. 40 CFR 270.63 - Permits for land treatment demonstrations using field test or laboratory analyses.

    Science.gov (United States)

    2010-07-01

    40 CFR § 270.63 (2010-07-01 edition), Protection of Environment, Hazardous Waste Permit Program, Special Forms of Permits: Permits for land treatment demonstrations using field test or laboratory analyses.

  10. Closing the poor-rich gap in contraceptive use in urban Kenya: are family planning programs increasingly reaching the urban poor?

    Science.gov (United States)

    Fotso, Jean Christophe; Speizer, Ilene S; Mukiira, Carol; Kizito, Paul; Lumumba, Vane

    2013-08-27

    Kenya is characterized by high unmet need for family planning (FP) and high unplanned pregnancy, in a context of urban population explosion and increased urban poverty. It witnessed an improvement of its FP and reproductive health (RH) indicators in the recent past, after a period of stalled progress. The objectives of the paper are to: a) describe inequities in modern contraceptive use, types of methods used, and the main sources of contraceptives in urban Kenya; b) examine the extent to which differences in contraceptive use between the poor and the rich widened or shrank over time; and c) attempt to relate these findings to the FP programming context, with a focus on whether the services are increasingly reaching the urban poor. We use data from the 1993, 1998, 2003 and 2008/09 Kenya Demographic and Health Surveys. Bivariate analyses describe the patterns of modern contraceptive use and the types and sources of methods used, while multivariate logistic regression models assess how the gap between the poor and the rich varied over time. The quantitative analysis is complemented by a review of the major FP/RH programs carried out in Kenya. There was a dramatic change in contraceptive use between 2003 and 2008/09 that resulted in virtually no gap between the poor and the rich in 2008/09, in contrast to the period 1993-1998, during which the improvement in contraceptive use did not significantly benefit the urban poor. Indeed, the late 1990s marked the realization by the Government of Kenya and its development partners of the need to deliberately target the poor with family planning services. Most urban women use short-term and less effective methods, with the proportion of long-acting method users dropping by half during the review period. The proportion of private sector users also declined between 2003 and 2008/09. The narrowing gap in the recent past between the urban poor and the urban rich in the use of modern contraception is undoubtedly good news…

  11. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    -code in their publications, thereby shortening the road from reading their paper to employing the considered methods on one’s own data. In this commentary, we will try to follow up on these developments by providing a snapshot of how applied mediation analysis was actually conducted in 2015. While we do not expect to find...... it simultaneously ensures that the comparison is based on properties, which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis namely to put...... applications using the inverse odds ration approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope...

  12. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

    Full Text Available The major obstacles to the development of broiler raising are the high price of feed and the fluctuating price of day-old chicks (DOCs). The low price of imported leg quarters also reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers, and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. In addition, a simulation was conducted to analyze the effects of changes in DOC price, broiler price, and production capacity. The analyses showed that integrated farming, and a mere combination of broiler raising with a feed factory, are not financially feasible at a capacity of 10,000 birds. Increasing production to 25,000 broiler chickens makes the integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.

  13. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  14. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.

  15. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and their mutual connections on the basis of good-quality numeric parameters; such features matter for a range of natural processes. Important morphological characteristics include, in particular, the slope angle of the topography, hypsometry, and topographic exposition. Even small and little-studied relief slants can deeply affect land configuration, hypsometry, and topographic exposition. Expositions modify light and heat and, through them, interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the complexity of photosynthesis, the fruitfulness of agricultural crops, the height of the snow limit, etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]

  16. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
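
    The paper's exact specification and data are not reproduced here; a minimal Engle-Granger-style cointegration test with statsmodels, run on synthetic stand-ins for log wages and log prices, looks as follows.

        import numpy as np
        from statsmodels.tsa.stattools import coint

        rng = np.random.default_rng(3)
        n = 144  # monthly observations, e.g. 1994:6-2006:6

        # Synthetic stand-ins: wages follow a random walk, prices share
        # its stochastic trend plus stationary noise -> cointegrated pair.
        log_wages = np.cumsum(rng.normal(0.005, 0.01, n))
        log_prices = 0.8 * log_wages + rng.normal(0, 0.01, n)

        t_stat, p_value, crit = coint(log_prices, log_wages)
        print(f"Engle-Granger t = {t_stat:.2f}, p = {p_value:.4f}")
        if p_value < 0.05:
            print("reject no-cointegration: a long-run relation exists")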

  17. Attitude stability analyses for small artificial satellites

    Science.gov (United States)

    Silva, W. R.; Zanardi, M. C.; Formiga, J. K. S.; Cabette, R. E. S.; Stuchi, T. J.

    2013-10-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft, in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by the Andoyer's variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem. With the application of the Kovalev-Savchenko theorem, it is possible to verify if they remain stable under the influence of the terms of higher order of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined and regions around these points have been established by variations in the orbital inclination and in the spacecraft principal moment of inertia. The present analysis can directly contribute in the maintenance of the spacecraft's attitude.

  18. Planning analyses for geothermal district heating

    Energy Technology Data Exchange (ETDEWEB)

    Tessmer, R.G. Jr.; Karkheck, J.

    1979-12-01

    Methodology and data bases are described which can provide a comprehensive planning assessment of the potential for geothermal district heating in any US market. This economic systems model encompasses life-cycle costing over a period of rising competitive fuel prices; it addresses the expansion and financing of a district system over time; and it includes an overall optimization of system design. The elemental area for all analyses is the census tract, for which published data allow estimation of residential and commercial heating demands, building retrofit requirements, and competitive fuel consumption and cost. A system type design, an appropriate hot-water district piping system, and costing of heat supply are performed for groups of contiguous tracts in any urban market. Groups are aggregated, in decreasing benefit-to-cost order, to achieve optimal systems. A specific application to Salt Lake City, Utah, is also described.

  19. Accurate renormalization group analyses in neutrino sector

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)

    2014-08-15

    We investigate accurate renormalization group analyses in neutrino sector between ν-oscillation and seesaw energy scales. We consider decoupling effects of top quark and Higgs boson on the renormalization group equations of light neutrino mass matrix. Since the decoupling effects are given in the standard model scale and independent of high energy physics, our method can basically apply to any models beyond the standard model. We find that the decoupling effects of Higgs boson are negligible, while those of top quark are not. Particularly, the decoupling effects of top quark affect neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at high energy scale.

  20. Analysing magnetism using scanning SQUID microscopy.

    Science.gov (United States)

    Reith, P; Renshaw Wang, X; Hilgenkamp, H

    2017-12-01

    Scanning superconducting quantum interference device microscopy (SSM) is a scanning probe technique that images local magnetic flux, which allows for mapping of magnetic fields with high field and spatial accuracy. Many studies involving SSM have been published in the last few decades, using SSM to make qualitative statements about magnetism. However, quantitative analysis using SSM has received less attention. In this work, we discuss several aspects of interpreting SSM images and methods to improve quantitative analysis. First, we analyse the spatial resolution and how it depends on several factors. Second, we discuss the analysis of SSM scans and the information obtained from the SSM data. Using simulations, we show how signals evolve as a function of changing scan height, SQUID loop size, magnetization strength, and orientation. We also investigated 2-dimensional autocorrelation analysis to extract information about the size, shape, and symmetry of magnetic features. Finally, we provide an outlook on possible future applications and improvements.
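
    A minimal sketch of the FFT-based 2-dimensional autocorrelation analysis mentioned above, applied to a synthetic stand-in for a flux map; the smoothing kernel and the 1/e criterion for the correlation length are assumptions, not the authors' procedure.

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic "flux map": smoothed random field standing in for an SSM scan.
        raw = rng.normal(size=(128, 128))
        kernel = np.exp(-np.linspace(-3, 3, 15) ** 2)
        flux = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 0, raw)
        flux = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, flux)

        # 2D autocorrelation via the Wiener-Khinchin theorem.
        f = flux - flux.mean()
        power = np.abs(np.fft.fft2(f)) ** 2
        acf = np.fft.fftshift(np.fft.ifft2(power).real)
        acf /= acf.max()

        # Correlation length: distance at which the central peak drops below 1/e.
        centre = np.array(acf.shape) // 2
        profile = acf[centre[0], centre[1]:]
        corr_len = np.argmax(profile < 1 / np.e)
        print(f"estimated magnetic feature size: ~{corr_len} pixels")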

  1. Natural hazards in mountain areas and spatial analysis (Risques naturels en montagne et analyse spatiale)

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and the vulnerability, which represents the set of goods and people that may be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is mainly concerned with taking vulnerability into account in the management of natural hazards. Assessing vulnerability necessarily involves a certain amount of spatial analysis that accounts for human occupation and the different scales at which space is used. But spatial assessment, whether of goods and people or of indirect effects, runs into numerous problems: the extent of land occupation must be estimated, and data processing requires constant changes of scale to pass from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management imposes strong town-planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural hazards imply. Keywords: hazard, spatial analysis, natural hazards, GIS, vulnerability.

  2. JENDL-3.2 performance in analyses of MISTRAL critical experiments for high-moderation MOX cores

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Naoyuki [Nuclear Fuel Industries, Ltd., Fuel Engineering and Development Department, Tokai, Ibaraki (Japan); Hibi, Koki [Mitsubishi Heavy Industries Ltd., Tokyo (Japan); Ishii, Kazuya [Hitachi Ltd., Tokyo (Japan); Ando, Yoshihira [Toshiba Corp., Tokyo (Japan); Yamamoto, Toru; Ueji, Masao; Iwata, Yutaka [Nuclear Power Engineering Corp., Tokyo (Japan)

    2001-03-01

    NUPEC and CEA have launched an extensive experimental program called MISTRAL to study highly moderated MOX cores for advanced LWRs. Analyses using the SRAC system and the MVP code with the JENDL-3.2 library are in progress for the experiments of the MISTRAL and the earlier EPICURE programs. Various comparisons have been made between calculation results and measured values. (author)

  3. Programming Python

    CERN Document Server

    Lutz, Mark

    2011-01-01

    If you've mastered Python's fundamentals, you're ready to start using it to get real work done. Programming Python will show you how, with in-depth tutorials on the language's primary application domains: system administration, GUIs, and the Web. You'll also explore how Python is used in databases, networking, front-end scripting layers, text processing, and more. This book focuses on commonly used tools and libraries to give you a comprehensive understanding of Python's many roles in practical, real-world programming. You'll learn language syntax and programming techniques in a clear and co

  4. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled, though, and thus at infrared wavelengths one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band whose precise position depends on the isotope. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic…

  5. Application of Sigma Metrics and Performance Comparison Between Two Biochemistry Analyser and a Blood Gas Analyser for the Determination of Electrolytes.

    Science.gov (United States)

    Ustundag-Budak, Yasemin; Huysal, Kagan

    2017-02-01

    Electrolytes have a narrow range of biological variation, and small changes are clinically significant. It is important to select the best method for clinical decision making and patient monitoring in the emergency room. The sigma metrics model provides an objective method to evaluate the performance of a method. The aim was to calculate sigma metrics for electrolytes measured with one arterial blood gas analyser and two auto-analysers that use different technologies, and to identify the best approach for electrolyte monitoring in an emergency setting in the context of routine emergency room workflow. The coefficient of variation (CV) was determined from internal quality control (IQC) data measured from July 2015 to January 2016 for all three analysers. The records of KBUD external quality data (Association of Clinical Biochemists, Istanbul, Turkey) for both the Mindray BS-2000M analyser (Mindray, Shenzhen, China) and the Architect C16000 (Abbott Diagnostics, Abbott Park, IL), and of the MLE clinical laboratory evaluation program (Washington, DC, USA) for the Radiometer ABL 700 (Radiometer Trading, Copenhagen, Denmark), were used during the study period to determine the bias. The calculated average sigma values for sodium (-1.1), potassium (3.3), and chloride (0.06) were obtained with the Radiometer ABL700. All calculated sigma values were better than those of the auto-analysers. The sigma values obtained from all analysers suggest that running more controls and increasing the calibration frequency for electrolytes are necessary for quality assurance.
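
    The sigma metric itself is conventionally computed as sigma = (TEa - |bias|) / CV, with all three terms in percent. A minimal sketch with illustrative numbers; the TEa targets and the bias/CV values below are assumptions, not the study's data.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma = (allowable total error - |bias|) / imprecision."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Hypothetical example values, not the study's data:
        analytes = {
            # analyte: (TEa %, bias %, CV %)
            "sodium":    (0.73, 0.5, 0.6),
            "potassium": (5.61, 1.2, 1.3),
            "chloride":  (1.50, 1.0, 1.1),
        }
        for name, (tea, bias, cv) in analytes.items():
            print(f"{name:10s} sigma = {sigma_metric(tea, bias, cv):5.2f}")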

  6. Efficient ALL vs. ALL collision risk analyses

    Science.gov (United States)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System, in which GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as the computational engine and numerical propagator to be used in the tool, which can be considered an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at the algorithm level to ensure that the most time-demanding scenarios (e.g., all catalogued objects analysed against each other - all vs. all scenarios) can be analysed in a reasonable amount of time with commercial off-the-shelf hardware. However, the amount of space debris increases steadily due to human activities. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict the trajectories of space debris more accurately. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques. In this paper we investigate…
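
    To see why an all-vs-all analysis scales as the square of the number of objects, and what the inner screening step looks like, here is a minimal vectorised sketch on synthetic ephemerides (uniform random positions, not real orbits); the object count, epoch count, and 10 km threshold are assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        n_objects, n_epochs = 400, 100
        threshold_km = 10.0

        # Synthetic ephemerides: positions (km) of each object at each epoch.
        pos = rng.uniform(-7000, 7000, size=(n_epochs, n_objects, 3))

        # All-vs-all screen: minimum pairwise distance over the whole window.
        min_dist = np.full((n_objects, n_objects), np.inf)
        for t in range(n_epochs):
            diff = pos[t, :, None, :] - pos[t, None, :, :]   # (n, n, 3)
            d = np.linalg.norm(diff, axis=-1)
            np.minimum(min_dist, d, out=min_dist)

        i, j = np.triu_indices(n_objects, k=1)               # each pair once
        close = min_dist[i, j] < threshold_km
        print(f"{close.sum()} candidate conjunctions out of {len(i)} pairs")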

  7. Analysing Scenarios of Cell Population System Development

    Directory of Open Access Journals (Sweden)

    M. S. Vinogradova

    2014-01-01

    Full Text Available The article considers an isolated population system consisting of two types of human stem cells, namely normal cells and cells with chromosomal abnormalities (abnormal ones). The system develops in the laboratory (in vitro). The article analyses possible scenarios of the population system's development, which are realised for different values of its parameters. The investigated model of the cell population system takes the limited resources into account. It is represented as a system of two nonlinear differential equations with a continuous right-hand side. The model is considered for non-negative values of the variables; the domain is divided into four sets. A feature of the model is that in each set the right-hand side of the system of differential equations has a different form. The article analyses the quality of the rest points of the system in each of the four sets. Analytical conditions are obtained for the number of rest points and for the quality of rest points with at least one zero coordinate. It is shown that the population system under study cannot have more than two rest points with both coordinates positive (non-zero). Determining the quality of such rest points as a function of the model parameters is difficult, owing to the complexity of the expressions that define the first-approximation systems recorded in a neighborhood of these rest points. Numerical results on the stability of these rest points are obtained, and phase portraits are demonstrated for specific values of the system parameters. The main scenarios for the development of the cell population are presented. Analysis of the mathematical model shows that the cell population system may remain a system consisting of populations of normal and abnormal cells; it can degenerate into a population of abnormal cells, or perish. The scenario in which only the population of normal cells remains is not realised. The numerical simulation…
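
    The article's piecewise-defined equations are not reproduced here; a generic two-population competition model with scipy illustrates the kind of scenario analysis described, where changing a single growth parameter switches the system between qualitatively different outcomes. All rates and initial values are made up.

        from scipy.integrate import solve_ivp

        def rhs(t, y, r1, r2, d, k):
            """Toy competition model, not the article's actual equations:
            n = normal cells, a = abnormal cells, shared resource limit k,
            growth rates r1/r2, common death rate d."""
            n, a = y
            crowd = 1 - (n + a) / k
            return [n * (r1 * crowd - d), a * (r2 * crowd - d)]

        for r2, label in [(0.3, "abnormal cells die out"),
                          (0.9, "abnormal cells take over")]:
            sol = solve_ivp(rhs, (0, 200), [100.0, 1.0], args=(0.5, r2, 0.1, 1e4))
            n_end, a_end = sol.y[:, -1]
            print(f"r2 = {r2}: n = {n_end:7.1f}, a = {a_end:7.1f}  -> {label}")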

  8. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    Science.gov (United States)

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
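
    One simple sensitivity analysis in this spirit is leave-one-out: recompute the pooled estimate with each study removed and see how far it moves. A minimal sketch with a fixed-effect inverse-variance pooled mean on made-up effect sizes (a full analysis would typically use a random-effects or multilevel model):

        import numpy as np

        # Hypothetical effect sizes (e.g., Zr) and their sampling variances.
        yi = np.array([0.10, 0.25, 0.31, 0.05, 0.62, 0.18])
        vi = np.array([0.02, 0.05, 0.03, 0.01, 0.04, 0.02])

        def pooled(y, v):
            """Fixed-effect inverse-variance weighted mean."""
            w = 1 / v
            return np.sum(w * y) / np.sum(w)

        overall = pooled(yi, vi)
        print(f"overall estimate: {overall:.3f}")

        # Leave-one-out: how much does any single study move the estimate?
        for i in range(len(yi)):
            keep = np.arange(len(yi)) != i
            loo = pooled(yi[keep], vi[keep])
            print(f"without study {i + 1}: {loo:.3f} (shift {loo - overall:+.3f})")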

  9. Programming Python

    National Research Council Canada - National Science Library

    Lutz, Mark

    2006-01-01

    Table of contents (excerpt): 2. A Sneak Preview: "Programming Python: The Short Story"; The Task; Step 1: Representing Records; Step 2: Storing Records Persistently; Step 3…

  10. Linear programming

    CERN Document Server

    Solow, Daniel

    2014-01-01

    This text covers the basic theory and computation for a first course in linear programming, including substantial material on mathematical proof techniques and sophisticated computation methods. Includes Appendix on using Excel. 1984 edition.

  11. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
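
    The Dice coefficient used above to compare subcortical classifications across operating systems is straightforward to compute; a minimal sketch on two toy label volumes that are identical except for a small band of flipped voxels:

        import numpy as np

        def dice(a, b):
            """Dice coefficient between two binary masks: 2*|A&B| / (|A|+|B|)."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.sum(a & b) / (np.sum(a) + np.sum(b))

        rng = np.random.default_rng(6)

        # Toy "segmentations" of the same structure from two OS builds.
        mask_os1 = rng.random((64, 64, 64)) < 0.2
        mask_os2 = mask_os1.copy()
        mask_os2[30:32] = rng.random((2, 64, 64)) < 0.2

        print(f"Dice = {dice(mask_os1, mask_os2):.3f}")  # 1.0 means identical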

  12. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  13. Applications of Parallel Processing in Configuration Analyses

    Science.gov (United States)

    Sundaram, Ppchuraman; Hager, James O.; Biedron, Robert T.

    1999-01-01

    The paper presents the recent progress made towards developing an efficient and user-friendly parallel environment for routine analysis of large CFD problems. The coarse-grain parallel version of the CFL3D Euler/Navier-Stokes analysis code, CFL3Dhp, has been ported onto most available parallel platforms. The CFL3Dhp solution accuracy on these parallel platforms has been verified against the CFL3D sequential analyses. User-friendly pre- and post-processing tools that enable a seamless transfer from sequential to parallel processing have been written. A static load-balancing tool for CFL3Dhp analyses has also been implemented to achieve good parallel efficiency. For large problems, load-balancing efficiency as high as 95% can be achieved even when a large number of processors is used. Linear scalability of the CFL3Dhp code with increasing number of processors has also been shown using a large installed transonic nozzle boattail analysis. To highlight the fast turn-around time of parallel processing, a Navier-Stokes drag polar of the full TCA configuration in sideslip at supersonic cruise has been obtained in a day. CFL3Dhp is currently being used as a production analysis tool.

  14. Analysis of lead in paints (Analyse de plomb dans les peintures)

    Science.gov (United States)

    Broll, N.; Frezouls, J.-M.

    2002-07-01

    The analysis of lead in paints has previously been used for the characterisation of pigments; in this way, the analysis can help establish the century in which a work of art was painted. Recently, the technique has also been used to determine the toxicity of lead paints in buildings. This paper compares the results of several X-ray fluorescence spectrometers, either wavelength- or energy-dispersive laboratory instruments, or portable equipment based on an X-ray microtube or a radioactive source.

  15. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    Full Text Available We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with every other. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore assimilated to the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free Python script and a tutorial are also made available to facilitate the application of the method.
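
    The core of the method is concrete enough to sketch directly: connect two values when they lie within ±ζ of each other, take each value's degree as its binless "frequency", and read the mode off the highest-degree value. The sketch below omits the cluster-stability rule for choosing ζ; the sample and the ζ value are assumptions.

        import numpy as np

        def binless_degrees(x, zeta):
            """Degree of each value = number of other values within +/- zeta."""
            x = np.asarray(x, dtype=float)
            d = np.abs(x[:, None] - x[None, :]) <= zeta
            return d.sum(axis=1) - 1          # exclude the self-match

        rng = np.random.default_rng(7)
        x = rng.normal(5.0, 1.0, 1000)        # unimodal test sample

        deg = binless_degrees(x, zeta=0.1)
        mode = x[np.argmax(deg)]              # highest-degree node ~ the mode
        print(f"estimated mode: {mode:.2f} (true mode 5.0)")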

  16. Analysing the thermal characteristics of LAMP joining

    Directory of Open Access Journals (Sweden)

    Tamás Markovits

    2014-07-01

    Full Text Available The increasing utilisation of different material groups (plastic, metal, ceramics) in advanced constructions provides many benefits, but the fastening process is a challenge. LAMP (laser-assisted metal-plastic joining) is a new technique in fastening technology, an alternative to existing techniques such as adhesives, screwing, riveting, etc. The authors have been working on this technology for years. In this research, some important thermal phenomena were analysed in order to understand the joining process more thoroughly. The temperature of the steel partner was measured for different laser settings and experimental situations with two measuring techniques: thermocouples and an infrared camera. The results show the effect of different influencing factors during heating and the applicability of the different measuring methods. The measured temperatures can be compared to the characteristic temperatures of the PMMA polymer (decomposition temperature) in order to determine the root cause of bubble formation in the polymer material. From the results, the differences between the applied laser pulse modes during heating were also determined, and it was possible to measure the heating rate during the laser process.

  17. Design and Analyses of Electromagnetic Microgenerator

    Directory of Open Access Journals (Sweden)

    Nibras Awaja

    2009-04-01

    Full Text Available This paper presents a new design of an electromagnetic microgenerator. The design aspects of the microgenerator, comprising the spring, the coil, and a rare-earth magnet, are addressed. Theoretical analyses of the electromagnetic microgenerator are established. First, a steady-state analysis is undertaken to determine the practical performance of the device; it is found that the generator produces more power in applications with a high frequency of vibration. Second, an electromagnetic analysis is established to calculate the power delivered to the load; it is found that the output power is maximized when the impedance of the coil is less than the load impedance and when a magnet with a high magnetic field is used. Mechanical parameters (damping factor, resonant frequency, proof mass, and maximum displacement) and magnetic parameters (load resistance, coil resistance, and magnetic field) have been adjusted to optimize the output power through a comprehensive theoretical study. A range of microgenerator output power values is obtained in accordance with the considered design parameters.

  18. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between 0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters show some cross-contamination in the iron assay.

  19. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  20. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a finite element code that solves the two-dimensional steady-state equations of groundwater flow and pollutant transport, and (2) a first-order reliability method (FORM) code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were combined into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
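
    PAGAP couples a finite element solver with a first-order reliability method; the sketch below shows only the probabilistic shell of such an analysis, estimating the probability that a toy closed-form concentration exceeds a limit by crude Monte Carlo rather than FORM. The formula C = M / (Q * R), all distributions, and the threshold are assumptions, not the thesis's model.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 200_000

        # Toy stand-in for the transport solver: steady-state concentration at a
        # well, C = M / (Q * R), with uncertain source mass M, flow Q, retardation R.
        M = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=n)   # g
        Q = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n)    # m3/day
        R = rng.lognormal(mean=np.log(2.0), sigma=0.1, size=n)     # dimensionless

        C = M / (Q * R)          # g/m3
        limit = 1.5              # hypothetical regulatory threshold

        p_exceed = np.mean(C > limit)
        print(f"P(C > {limit}) ~= {p_exceed:.4f}")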