WorldWideScience

Sample records for bivariate analyses showed

  1. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    Science.gov (United States)

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, rather shows convergence problems. The random-effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.
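
    The generalized bivariate mixed model evaluated above treats, within each study, the true-positive and true-negative counts as binomial, with logit-scale sensitivity and specificity varying across studies according to a bivariate normal random effect. The following Python sketch generates data for one such meta-analytic scenario and recovers crude pooled estimates; it is an illustration of the simulation design only, not the authors' code, and the study sizes, between-study standard deviations and correlation are assumed values.

```python
import numpy as np
from scipy.special import expit, logit

rng = np.random.default_rng(42)

# Assumed scenario: 30 studies, true sensitivity 0.90, true specificity 0.93;
# between-study SDs and correlation are chosen for illustration only.
n_studies = 30
mu = np.array([logit(0.90), logit(0.93)])            # pooled logits (Se, Sp)
sd = np.array([0.6, 0.5])                            # between-study SDs (assumed)
rho = -0.4                                           # Se/Sp trade-off (assumed)
cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                [rho*sd[0]*sd[1], sd[1]**2]])

# Study-specific logits from the bivariate normal random-effects distribution
study_logits = rng.multivariate_normal(mu, cov, size=n_studies)
n_diseased = rng.integers(20, 200, size=n_studies)   # diseased subjects per study
n_healthy = rng.integers(20, 200, size=n_studies)    # healthy subjects per study

tp = rng.binomial(n_diseased, expit(study_logits[:, 0]))   # true positives
tn = rng.binomial(n_healthy, expit(study_logits[:, 1]))    # true negatives

# Crude two-stage summary (continuity-corrected logits averaged over studies);
# the paper fits this model with (generalized) linear mixed models instead.
se_i = (tp + 0.5) / (n_diseased + 1.0)
sp_i = (tn + 0.5) / (n_healthy + 1.0)
print("pooled sensitivity ~", round(float(expit(logit(se_i).mean())), 3))
print("pooled specificity ~", round(float(expit(logit(sp_i).mean())), 3))
```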

  2. A dynamic bivariate Poisson model for analysing and forecasting match results in the English Premier League

    NARCIS (Netherlands)

    Koopman, S.J.; Lit, R.

    2015-01-01

    Summary: We develop a statistical model for the analysis and forecasting of football match results which assumes a bivariate Poisson distribution with intensity coefficients that change stochastically over time. The dynamic model is a novelty in the statistical time series analysis of match results
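
    A static building block of this model, the bivariate Poisson distribution, can be obtained by trivariate reduction: the two goal counts share a common Poisson component, which induces positive correlation. The sketch below uses assumed, constant intensities purely for illustration; the paper's stochastically time-varying intensity coefficients are not modelled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed intensities: lam1 and lam2 drive the team-specific parts,
# lam3 is the shared component that creates dependence between the scores.
lam1, lam2, lam3 = 1.2, 0.9, 0.3
n_matches = 200_000

z1 = rng.poisson(lam1, n_matches)
z2 = rng.poisson(lam2, n_matches)
z3 = rng.poisson(lam3, n_matches)

home_goals = z1 + z3          # X ~ Poisson(lam1 + lam3)
away_goals = z2 + z3          # Y ~ Poisson(lam2 + lam3), Cov(X, Y) = lam3

print("mean home goals:", home_goals.mean())            # ~ lam1 + lam3 = 1.5
print("mean away goals:", away_goals.mean())            # ~ lam2 + lam3 = 1.2
print("empirical covariance:", np.cov(home_goals, away_goals)[0, 1])  # ~ 0.3
```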

  3. On bivariate geometric distribution

    Directory of Open Access Journals (Sweden)

    K. Jayakumar

    2013-05-01

    Full Text Available Characterizations of the bivariate geometric distribution using univariate and bivariate geometric compounding are obtained. Autoregressive models with bivariate geometric marginals are developed. Various bivariate geometric distributions analogous to important bivariate exponential distributions, such as Marshall-Olkin’s bivariate exponential, Downton’s bivariate exponential and Hawkes’ bivariate exponential, are presented.
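
    As a hedged illustration of one standard construction behind such models (componentwise minima of independent geometric variables, the discrete analogue of the Marshall-Olkin bivariate exponential, and not necessarily the paper's exact derivation), the sketch below simulates a bivariate geometric pair and checks that each margin is again geometric.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed parameters for illustration.
p1, p2, p3 = 0.2, 0.3, 0.1
n = 500_000

u = rng.geometric(p1, n)   # support 1, 2, 3, ...
v = rng.geometric(p2, n)
w = rng.geometric(p3, n)   # common shock shared by both components

x = np.minimum(u, w)       # X is geometric with parameter 1 - (1-p1)(1-p3)
y = np.minimum(v, w)       # Y is geometric with parameter 1 - (1-p2)(1-p3)

px = 1 - (1 - p1) * (1 - p3)
print("E[X] simulated vs 1/px:", x.mean(), 1 / px)
print("corr(X, Y):", np.corrcoef(x, y)[0, 1])   # positive dependence via the shared shock
```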

  4. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Full Text Available Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits and therefore makes it difficult to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10−7 and 1.47×10−6, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  5. Powerful Bivariate Genome-Wide Association Analyses Suggest the SOX6 Gene Influencing Both Obesity and Osteoporosis Phenotypes in Males

    Science.gov (United States)

    Liu, Yao-Zhong; Pei, Yu-Fang; Liu, Jian-Feng; Yang, Fang; Guo, Yan; Zhang, Lei; Liu, Xiao-Gang; Yan, Han; Wang, Liang; Zhang, Yin-Ping; Levy, Shawn; Recker, Robert R.; Deng, Hong-Wen

    2009-01-01

    Background Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits. Hence this approach is difficult to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. Principal Findings To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning ∼380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10−7 and 1.47×10−6, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the ∼380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Conclusions Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis. PMID:19714249

  6. Analysing risk factors of co-occurrence of schistosomiasis haematobium and hookworm using bivariate regression models: Case study of Chikwawa, Malawi

    Directory of Open Access Journals (Sweden)

    Bruce B.W. Phiri

    2016-06-01

    Full Text Available Schistosomiasis and soil-transmitted helminth (STH) infections constitute a major public health problem in many parts of sub-Saharan Africa. In areas where prevalence of geo-helminths and schistosomes is high, co-infection with multiple parasite species is common, resulting in a disproportionately elevated burden compared with single infections. Determining risk factors of co-infection intensity is important for better design of targeted interventions. In this paper, we examined risk factors of hookworm and S. haematobium co-infection intensity in Chikwawa district, southern Malawi, in 2005, using bivariate count models. Results show that hookworm and S. haematobium infections were highly localised, with a small proportion of individuals, especially school-aged children, harbouring most of the parasites. The risk of co-intensity with both hookworm and S. haematobium was high for all ages, although it diminished with increasing age, and increased with fishing (hookworm: coef. = 12.29; 95% CI = 11.50–13.09; S. haematobium: coef. = 0.040; 95% CI = 0.0037, 3.832). Both infections were abundant in those with primary education (hookworm: coef. = 0.072; 95% CI = 0.056, 0.401; S. haematobium: coef. = 0.286; 95% CI = 0.034, 0.538). However, a much lower risk was observed for those who were farmers (hookworm: coef. = −0.349, 95% CI = −0.547, −0.150; S. haematobium: coef. = −0.239, 95% CI = −0.406, −0.072). In conclusion, our findings suggest that efforts to control helminth infections should be integrated, and health promotion campaigns should be aimed at school-going children and adults who are in constant contact with water.

  7. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2x2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  8. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  9. An Affine Invariant Bivariate Version of the Sign Test.

    Science.gov (United States)

    1987-06-01

    Keywords: affine invariance, bivariate quantile, bivariate symmetry model, generalized median, influence function, permutation test, normal efficiency... calculate a bivariate version of the influence function, and the resulting form is bounded, as is the case for the univariate sign test, and shows the... terms of a bivariate analogue of Hampel's (1974) influence function. The latter, though usually defined as a von-Mises derivative of certain

  10. Advanced Behavioral Analyses Show that the Presence of Food Causes Subtle Changes in C. elegans Movement.

    Science.gov (United States)

    Angstman, Nicholas B; Frank, Hans-Georg; Schmitz, Christoph

    2016-01-01

    As a widely used and studied model organism, Caenorhabditis elegans worms offer the ability to investigate implications of behavioral change. Although investigation of C. elegans behavioral traits has been shown, analysis is often narrowed down to measurements based on a single point, and thus cannot pick up on subtle behavioral and morphological changes. In the present study, videos were captured of four different C. elegans strains grown in liquid cultures and transferred to NGM-agar plates with an E. coli lawn or with no lawn. Using advanced software, WormLab, the full skeleton and outline of worms were tracked to determine whether the presence of food affects behavioral traits. In all seven investigated parameters, statistically significant differences were found in worm behavior between those moving on NGM-agar plates with an E. coli lawn and NGM-agar plates with no lawn. Furthermore, multiple test groups showed differences in interaction between variables, as the parameters that were statistically significantly correlated with speed of locomotion varied. In the present study, we demonstrate the validity of a model to analyze C. elegans behavior beyond simple speed of locomotion. The need to account for a nested design while performing statistical analyses in similar studies is also demonstrated. With extended analyses, C. elegans behavioral change can be investigated with greater sensitivity, which could have wide utility in fields such as, but not limited to, toxicology, drug discovery, and RNAi screening.

  11. Advanced behavioral analyses show that the presence of food causes subtle changes in C. elegans movement

    Directory of Open Access Journals (Sweden)

    Nicholas eAngstman

    2016-03-01

    Full Text Available As a widely used and studied model organism, C. elegans worms offer the ability to investigate implications of behavioral change. Although investigation of C. elegans behavioral traits has been shown, analysis is often narrowed down to measurements based on a single point, and thus cannot pick up on subtle behavioral and morphological changes. In the present study, videos were captured of four different C. elegans strains grown in liquid cultures and transferred to NGM-agar plates with an E. coli lawn or with no lawn. Using advanced software, WormLab, the full skeleton and outline of worms were tracked to determine whether the presence of food affects behavioral traits. In all seven investigated parameters, statistically significant differences were found in worm behavior between those moving on NGM-agar plates with an E. coli lawn and NGM-agar plates with no lawn. Furthermore, multiple test groups showed differences in interaction between variables, as the parameters that were statistically significantly correlated with speed of locomotion varied. In the present study, we demonstrate the validity of a model to analyze C. elegans behavior beyond simple speed of locomotion. The need to account for a nested design while performing statistical analyses in similar studies is also demonstrated. With extended analyses, C. elegans behavioral change can be investigated with greater sensitivity, which could have wide utility in fields such as, but not limited to, toxicology, drug discovery, and RNAi screening.

  12. Chin Shan analyses show advantages of whole pool multi-rack approach

    International Nuclear Information System (INIS)

    Singh, K.P.; Soler, A.I.

    1991-01-01

    Nuclear fuel storage racks are essentially thin-walled, cellular structures of prismatic cross-section. Although the details of design vary from one supplier to another, certain key physical attributes are common to all designs. For example, all racks feature square cells of sufficient opening size and height to enable insertion and withdrawal of the fuel assembly. The array of cells is positioned in a vertical orientation and is supported off the pool slab surface by four or more support legs. The spent fuel pool is filled with the individual fuel racks. The plenum created by the support legs is essential for proper cooling of the fuel assemblies stored in the rack, which relies on natural convective cooling to extract the heat emitted by the spent fuel. However, it also has the undesirable effect of making the rack kinematically less stable. Regulatory authorities require careful and comprehensive analysis of the response of the racks under the seismic motions postulated for the pool slab. Results from whole pool multi-rack (WPMR) analyses at the Chin Shan and Oyster Creek nuclear plants highlight the potential inadequacies of single rack 3D analyses, and show just how important it is to carry out WPMR simulations, despite their abstruseness and high cost. (author)

  13. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    Full Text Available In this paper we extend the concept of Value-at-Risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset taking into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (β-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided by using Italian stock market data.

  14. Preliminary Analyses Showed Short-Term Mental Health Improvements after a Single-Day Manager Training.

    Science.gov (United States)

    Boysen, Elena; Schiller, Birgitta; Mörtl, Kathrin; Gündel, Harald; Hölzer, Michael

    2018-01-10

    Psychosocial working conditions attract more and more attention when it comes to mental health in the workplace. Trying to support managers to deal with their own as well as their employees' psychological risk factors, we conducted a specific manager training. Within this investigation, we wanted to learn about the training's effects and acceptance. A single-day manager training was provided in a large industrial company in Germany. The participants were asked to fill out questionnaires regarding their own physical and mental health condition as well as their working situation. Questionnaires were distributed at baseline, 3-month, and 12-month follow-up. At this point in time, the investigation is still ongoing. The current article focuses on short-term preliminary effects. Analyses only included participants that had already completed the baseline and 3-month follow-up assessments. Preliminary results from the 3-month follow-up survey (n = 33, n_male = 30, M_age = 47.5) indicated positive changes in the managers' mental health condition measured by the Patient Health Questionnaire for depression (PHQ-9: M_t1 = 3.82, M_t2 = 3.15). Training managers about common mental disorders and risk factors at the workplace within a single-day workshop seems to promote positive effects on their own mental health. Especially working with the managers on their own early stress symptoms might have been an important element.

  15. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity need to be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
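
    The marginal modelling idea can be sketched numerically: each margin (true positives among the diseased, true negatives among the healthy) gets its own beta-binomial likelihood, and the composite likelihood is simply their sum. The study counts and starting values below are made up for illustration, and the correlation parameter of the full model is deliberately omitted from this simplified sketch.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import betabinom

# Toy per-study counts: true positives among diseased, true negatives among healthy.
# These numbers are invented for illustration only.
tp = np.array([18, 40, 7, 25, 33]);  n1 = np.array([20, 50, 10, 30, 40])
tn = np.array([45, 80, 19, 52, 70]); n0 = np.array([50, 90, 20, 60, 75])

def neg_composite_loglik(params):
    # Each beta margin is parametrized by a mean (logit scale) and a precision (log scale).
    mu_se, mu_sp, log_phi_se, log_phi_sp = params
    a1 = expit(mu_se) * np.exp(log_phi_se); b1 = (1 - expit(mu_se)) * np.exp(log_phi_se)
    a0 = expit(mu_sp) * np.exp(log_phi_sp); b0 = (1 - expit(mu_sp)) * np.exp(log_phi_sp)
    ll = betabinom.logpmf(tp, n1, a1, b1).sum() + betabinom.logpmf(tn, n0, a0, b0).sum()
    return -ll

res = minimize(neg_composite_loglik, x0=np.array([0.5, 0.5, 1.0, 1.0]), method="Nelder-Mead")
mu_se, mu_sp = res.x[:2]
print("pooled sensitivity:", expit(mu_se), "pooled specificity:", expit(mu_sp))
```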

  16. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)


    This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In the view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-...

  17. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable to be applied to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators to identify differences in standing posture between groups.
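
    Given an already-extracted IMF, the two indicators mentioned above (average rotation frequency of the analytic signal and the area of its roughly circular trace in the complex plane) can be approximated as below. The IMF here is a synthetic oscillation, and the area formula is one simple approximation rather than necessarily the estimator used in the paper.

```python
import numpy as np
from scipy.signal import hilbert

fs = 100.0                                 # sampling frequency in Hz (assumed)
t = np.arange(0, 20, 1 / fs)
imf = 0.8 * np.cos(2 * np.pi * 0.5 * t)    # synthetic IMF oscillating at 0.5 Hz

z = hilbert(imf)                           # analytic signal; its complex-plane trace rotates
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)

mean_rotation_freq = inst_freq.mean()            # ~0.5 Hz for this synthetic IMF
circle_area = np.pi * np.mean(np.abs(z)) ** 2    # crude area of the near-circular trace

print("average rotation frequency [Hz]:", mean_rotation_freq)
print("approximate trace area:", circle_area)    # ~ pi * 0.8**2 ≈ 2.01
```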

  18. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on a bivariate extreme of renormalized componentwise maxima with the generalized extreme value distribution as the marginal function. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation, and the best model is selected based on the Akaike Information Criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily maximum PM10 data for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data, from 2001 to 2010, are considered for both stations. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly componentwise maxima series. However, the dependence parameters show that the variables in the weekly maxima series are more dependent on each other than those in the monthly maxima series.
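
    The componentwise block maxima and the GEV marginal fits described above take only a few lines with numpy/scipy. The daily PM10-like series below are synthetic stand-ins for the station data, and only the marginal step is shown, not the bivariate dependence models.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Synthetic daily PM10-like series for two stations over 10 "years" (assumed data).
days, block = 3650, 30                    # monthly componentwise maxima (30-day blocks)
station1 = rng.gamma(shape=4.0, scale=12.0, size=days)
station2 = 0.6 * station1 + rng.gamma(shape=4.0, scale=8.0, size=days)   # dependent stations

n_blocks = days // block
m1 = station1[:n_blocks * block].reshape(n_blocks, block).max(axis=1)
m2 = station2[:n_blocks * block].reshape(n_blocks, block).max(axis=1)

# Fit a GEV to each componentwise maxima series (scipy uses the c = -shape convention).
c1, loc1, scale1 = genextreme.fit(m1)
c2, loc2, scale2 = genextreme.fit(m2)
print("station 1 GEV (c, loc, scale):", c1, loc1, scale1)
print("station 2 GEV (c, loc, scale):", c2, loc2, scale2)
```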

  19. Bivariate copula in fitting rainfall data

    Science.gov (United States)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The usage of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is more ideal than standard bivariate modelling, where the copula is believed to overcome some limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations. The copula models are Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study are selected from rain gauge stations located in the southern part of Peninsular Malaysia, during the period from 1980 to 2011. The goodness-of-fit test in this study is based on the Akaike information criterion (AIC).
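
    As an illustration of the model-selection step, the sketch below fits one of the six candidate families (Clayton, whose density has a simple closed form) to pseudo-observations by maximum likelihood and reports the AIC. The rainfall values are synthetic, and the other families would be compared in the same way.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(3)

# Synthetic positively dependent "rainfall" at two gauges (illustrative data only).
n = 400
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
rain1, rain2 = np.exp(z[:, 0]), np.exp(z[:, 1])

# Pseudo-observations: ranks rescaled to (0, 1).
u = rankdata(rain1) / (n + 1)
v = rankdata(rain2) / (n + 1)

def neg_loglik_clayton(theta):
    # Clayton density: c(u,v) = (1+theta) (uv)^(-theta-1) (u^-theta + v^-theta - 1)^(-2-1/theta)
    if theta <= 0:
        return np.inf
    s = u**(-theta) + v**(-theta) - 1
    logc = (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s))
    return -logc.sum()

res = minimize_scalar(neg_loglik_clayton, bounds=(1e-6, 20), method="bounded")
aic = 2 * 1 + 2 * res.fun        # one dependence parameter
print("Clayton theta:", res.x, "AIC:", aic)
```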

  20. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when X and Y follow a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.
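
    A quick way to see what R = Pr(X < Y) looks like under dependence is Monte Carlo simulation with beta margins coupled through a Gaussian copula. This sketch is purely illustrative with assumed parameters; the paper instead derives explicit expressions for a specific bivariate beta model.

```python
import numpy as np
from scipy.stats import norm, beta

rng = np.random.default_rng(5)

# Beta margins for stress X and strength Y, coupled by a Gaussian copula (assumed setup).
a_x, b_x = 2.0, 5.0
a_y, b_y = 4.0, 3.0
rho = 0.5
n = 500_000

z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
x = beta.ppf(norm.cdf(z[:, 0]), a_x, b_x)   # dependent beta-distributed stress
y = beta.ppf(norm.cdf(z[:, 1]), a_y, b_y)   # dependent beta-distributed strength

print("Monte Carlo estimate of R = Pr(X < Y):", np.mean(x < y))
```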

  1. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models, there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when X and Y follow a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.

  2. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  3. Phenotypic and functional analyses show stem cell-derived hepatocyte-like cells better mimic fetal rather than adult hepatocytes.

    Science.gov (United States)

    Baxter, Melissa; Withey, Sarah; Harrison, Sean; Segeritz, Charis-Patricia; Zhang, Fang; Atkinson-Dell, Rebecca; Rowe, Cliff; Gerrard, Dave T; Sison-Young, Rowena; Jenkins, Roz; Henry, Joanne; Berry, Andrew A; Mohamet, Lisa; Best, Marie; Fenwick, Stephen W; Malik, Hassan; Kitteringham, Neil R; Goldring, Chris E; Piper Hanley, Karen; Vallier, Ludovic; Hanley, Neil A

    2015-03-01

    Hepatocyte-like cells (HLCs), differentiated from pluripotent stem cells by the use of soluble factors, can model human liver function and toxicity. However, at present HLC maturity and whether any deficit represents a true fetal state or aberrant differentiation is unclear and compounded by comparison to potentially deteriorated adult hepatocytes. Therefore, we generated HLCs from multiple lineages, using two different protocols, for direct comparison with fresh fetal and adult hepatocytes. Protocols were developed for robust differentiation. Multiple transcript, protein and functional analyses compared HLCs to fresh human fetal and adult hepatocytes. HLCs were comparable to those of other laboratories by multiple parameters. Transcriptional changes during differentiation mimicked human embryogenesis and showed more similarity to pericentral than periportal hepatocytes. Unbiased proteomics demonstrated greater proximity to liver than 30 other human organs or tissues. However, by comparison to fresh material, HLC maturity was proven by transcript, protein and function to be fetal-like and short of the adult phenotype. The expression of 81% phase 1 enzymes in HLCs was significantly upregulated and half were statistically not different from fetal hepatocytes. HLCs secreted albumin and metabolized testosterone (CYP3A) and dextrorphan (CYP2D6) like fetal hepatocytes. In seven bespoke tests, devised by principal components analysis to distinguish fetal from adult hepatocytes, HLCs from two different source laboratories consistently demonstrated fetal characteristics. HLCs from different sources are broadly comparable with unbiased proteomic evidence for faithful differentiation down the liver lineage. This current phenotype mimics human fetal rather than adult hepatocytes. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  4. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    A matched pairs sign test based on bivariate ranked set sampling (BVRSS) is introduced and investigated. We show that this test is asymptotically more efficient than its counterpart sign test based on a bivariate simple random sample (BVSRS). The asymptotic null distribution and the efficiency of the test are derived.

  5. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution, that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg

  6. Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2017-11-01

    Full Text Available A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie–Gumbel–Morgenstern) bivariate copulas for constructing several different bivariate Kumaraswamy type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman’s correlation coefficient ρ and Kendall’s τ.
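
    The dependence flexibility mentioned above is easy to check numerically for the classical (unmodified) FGM copula, C(u,v) = uv[1 + θ(1−u)(1−v)], whose Spearman's ρ equals θ/3. The sketch below samples from it by conditional inversion and compares the empirical ρ with the theoretical value; the paper's modified FGM-Kumaraswamy constructions are not reproduced here.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)

theta = 0.8            # FGM dependence parameter in [-1, 1] (chosen for illustration)
n = 200_000

u = rng.uniform(size=n)
p = rng.uniform(size=n)
a = theta * (1 - 2 * u)

# Invert the conditional CDF C_{2|1}(v|u) = v + a*v*(1-v) = p, a quadratic in v.
disc = np.sqrt((1 + a) ** 2 - 4 * a * p)
v = np.where(np.abs(a) < 1e-10, p, (1 + a - disc) / (2 * a))

rho_emp, _ = spearmanr(u, v)
print("empirical Spearman rho:", rho_emp, " theoretical theta/3:", theta / 3)
```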

  7. Computational approach to Thornley's problem by bivariate operational calculus

    Science.gov (United States)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for a linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  8. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over or under dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  9. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Full Text Available Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave length, wave-induced pitch, and wave and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the "significant waves" (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which it must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of the ships. Although the problem is complex, it appears feasible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the "random walk" frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with Rayleigh marginal distribution functions and discuss its fundamental properties.
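
    The derivation mentioned at the end, that the radial amplitude of two independent, equal-variance normal components follows a Rayleigh distribution, is easy to verify numerically; the check below is a simple illustration and is not part of the paper.

```python
import numpy as np
from scipy.stats import rayleigh, kstest

rng = np.random.default_rng(2)

sigma = 2.0
n = 100_000
x = rng.normal(0, sigma, n)       # independent normal components with equal variance
y = rng.normal(0, sigma, n)

r = np.hypot(x, y)                # radial amplitude

# Compare with a Rayleigh(scale=sigma) distribution via a Kolmogorov-Smirnov test.
stat, pvalue = kstest(r, rayleigh(scale=sigma).cdf)
print("KS statistic:", stat, "p-value:", pvalue)   # large p-value: Rayleigh fits
```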

  10. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    Science.gov (United States)

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Multi-Scale Carbon Isotopic Analyses Show Allende Nanodiamonds are Mostly Solar with Some PreSolar

    Science.gov (United States)

    Lewis, J. B.; Isheim, D.; Floss, C.; Gyngard, F.; Seidman, D. N.

    2017-07-01

    NanoSIMS and atom-probe experiments on different-sized aggregates of meteoritic nanodiamonds show mostly normal C isotopes, with a fraction of 13C-enriched material. The best interpretation is a combination of solar system and supernova formation.

  12. A Bivariate return period for levee failure monitoring

    Science.gov (United States)

    Isola, M.; Caporali, E.

    2017-12-01

    Levee breaches are strongly linked to the interaction processes among water, soil and structure, so many factors affect breach development. One of the main factors is the hydraulic load, characterized by intensity and duration, i.e. by the flood event hydrograph. Levee design is generally based on the magnitude of the hydraulic load, without considering fatigue failure due to the load duration. Moreover, in many cases levee breaches are caused by floods of magnitude lower than the design flood. In order to improve flood risk management strategies, we build here a procedure based on a multivariate statistical analysis of flood peak and volume, together with the analysis of past levee failure events. In particular, in order to define the probability of occurrence of the hydraulic load on a levee, a bivariate copula model is used to obtain the joint distribution of flood peak and volume. The flood peak expresses the load magnitude, while the volume expresses the stress over time. We consider the annual flood peak and the corresponding volume. The volume is given by the hydrograph area between the beginning and the end of the event. The beginning of the event is identified as an abrupt rise of the discharge by more than 20%. The end is identified as the point from which the receding limb is characterized by baseflow, using a nonlinear reservoir algorithm as the baseflow separation technique. On this basis, with the aim of defining warning thresholds, we consider past levee failure events and their bivariate return period (BTr), compared with the estimate from a traditional univariate model. The discharge data of 30 hydrometric stations of the Arno River in Tuscany, Italy, in the period 1995-2016 are analysed. A database of levee failure events, recording for each event the location as well as the failure mode, is also created. The events were registered in the period 2000-2014 by EEA
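
    Once a copula C of annual flood peak and volume has been fitted, one common bivariate return period is the "OR" version, T = μ / (1 − C(F_Q(q), F_V(v))), with μ the mean interarrival time of events (one year for annual maxima). The sketch below evaluates it for a Gumbel copula with assumed GEV marginals and dependence parameter; the copula family and parameter values actually fitted in the study are not reproduced.

```python
import numpy as np
from scipy.stats import genextreme

# Assumed GEV marginals for annual flood peak Q [m^3/s] and event volume V [Mm^3],
# and an assumed Gumbel copula parameter; all values are illustrative only.
gev_q = genextreme(c=-0.1, loc=300.0, scale=80.0)
gev_v = genextreme(c=-0.1, loc=40.0, scale=12.0)
theta = 2.0                      # Gumbel copula dependence parameter (theta >= 1)
mu_years = 1.0                   # mean interarrival time of annual maxima

def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def bivariate_or_return_period(q, v):
    u, w = gev_q.cdf(q), gev_v.cdf(v)
    return mu_years / (1.0 - gumbel_copula(u, w, theta))

# Return period of exceeding a peak of 500 m^3/s OR a volume of 70 Mm^3.
print("T_or [years]:", bivariate_or_return_period(500.0, 70.0))
```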

  13. Elapid Snake Venom Analyses Show the Specificity of the Peptide Composition at the Level of Genera Naja and Notechis

    Directory of Open Access Journals (Sweden)

    Aisha Munawar

    2014-02-01

    Full Text Available Elapid snake venom is a highly valuable, but till now mainly unexplored, source of pharmacologically important peptides. We analyzed the peptide fractions with molecular masses up to 10 kDa of two elapid snake venoms—that of the African cobra, N. m. mossambica (genus Naja), and the Peninsula tiger snake, N. scutatus, from Kangaroo Island (genus Notechis). A combination of chromatographic methods was used to isolate the peptides, which were characterized by combining complementary mass spectrometric techniques. Comparative analysis of the peptide compositions of the two venoms showed specificity at the genus level. Three-finger (3-F) cytotoxins, bradykinin-potentiating peptides (BPPs) and a bradykinin inhibitor were isolated from the Naja venom. 3-F neurotoxins, Kunitz/basic pancreatic trypsin inhibitor (BPTI)-type inhibitors and a natriuretic peptide were identified in the Notechis venom. The inhibiting activity of the peptides was confirmed in vitro with a selected array of proteases. Cytotoxin 1 (P01467) from the Naja venom might be involved in the disturbance of cellular processes by inhibiting the cell 20S-proteasome. A high degree of similarity between BPPs from elapid and viperid snake venoms was observed, suggesting that these molecules play a key role in snake venoms and also indicating that these peptides were recruited into the snake venom prior to the evolutionary divergence of the snakes.

  14. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    The high particulate matter (PM10) level is a prominent issue causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation of extreme PM10 data from two nearby air quality monitoring stations. The series of daily maximum PM10 for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike Information Criterion and the quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.

  15. An assessment on the use of bivariate, multivariate and soft computing techniques for collapse susceptibility in GIS environ

    Science.gov (United States)

    Yilmaz, Işik; Marschalko, Marian; Bednarik, Martin

    2013-04-01

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for collapse susceptibility modelling. Conditional probability (CP), logistic regression (LR) and artificial neural network (ANN) models, representing the bivariate, multivariate and soft computing techniques, were used in GIS-based collapse susceptibility mapping in an area of the Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, distance from roads and distance from settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the models and then compared by means of their validations. Although the Area Under Curve (AUC) values obtained from all three models suggest that the map obtained from the soft computing (ANN) model is somewhat more accurate than the others, the accuracies of all three models can be considered relatively similar. The results also showed that conditional probability is an essential method for the preparation of collapse susceptibility maps and is highly compatible with GIS operating features.

  16. The relative performance of bivariate causality tests in small samples

    NARCIS (Netherlands)

    Bult, J..R.; Leeflang, P.S.H.; Wittink, D.R.

    1997-01-01

    Causality tests have been applied to establish directional effects and to reduce the set of potential predictors. For the latter type of application only bivariate tests can be used. In this study we compare bivariate causality tests. Although the problem addressed is general and could benefit
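
    A bivariate causality test of the kind compared in the paper can be written directly as a restricted-versus-unrestricted regression F-test (the Granger form). The sketch below does this with plain least squares on a small simulated sample in which x genuinely leads y; it is a generic illustration, not the specific test variants evaluated by the authors.

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(8)

# Simulate a small sample where x leads y (illustrative data).
T, p = 80, 2                       # sample size and lag order
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + rng.normal(scale=0.5)

def lagmat(series, lags):
    # Column k holds the series lagged by k periods, aligned with observations lags..T-1.
    return np.column_stack([series[lags - k: len(series) - k] for k in range(1, lags + 1)])

Y = y[p:]
X_restricted = np.column_stack([np.ones(T - p), lagmat(y, p)])   # own lags only
X_full = np.column_stack([X_restricted, lagmat(x, p)])           # own lags plus lags of x

def rss(X, Y):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return resid @ resid

rss_r, rss_f = rss(X_restricted, Y), rss(X_full, Y)
df1, df2 = p, T - p - X_full.shape[1]
F = ((rss_r - rss_f) / df1) / (rss_f / df2)
pval = f_dist.sf(F, df1, df2)
print(f"F = {F:.2f}, p-value = {pval:.4f}  (small p: lags of x help predict y)")
```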

  17. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.

  18. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  19. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables. A bivariable index that combines the previous parameters is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial infarction patients. The results show that the bivariable index allows discrimination between the population of patients with ventricular tachycardia and the subjects of the control group. Also, it was found that the bivariable technique performs well as a diagnostic test. It is concluded that, comparatively, the value of the bivariable technique as a diagnostic test is superior to that of the time-domain method and the time-frequency technique evaluated individually.

  20. A COMPARATIVE STUDY OF THE BIVARIATE MARGINAL DISTRIBUTION ALGORITHM AND THE GENETIC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

    Full Text Available The Bivariate Marginal Distribution Algorithm is an extension of the Estimation of Distribution Algorithm. This heuristic algorithm introduces a new approach to recombination that generates new individuals without the crossover and mutation processes used in genetic algorithms. The Bivariate Marginal Distribution Algorithm uses the connectivity between pairs of gene variables for recombination to generate new individuals, and this connectivity between variables is discovered during the optimization process. In this research, the performance of a genetic algorithm with one-point crossover is compared with the performance of the Bivariate Marginal Distribution Algorithm on the Onemax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For the Onemax case with a small problem size, the genetic algorithm performs better, reaching the optimum with a small number of iterations and in less time. However, the Bivariate Marginal Distribution Algorithm achieves better optimization results for the Onemax case with a large problem size. For the De Jong F2 function, the genetic algorithm performs better than the Bivariate Marginal Distribution Algorithm in terms of the number of iterations and time. For the Traveling Salesman Problem, the Bivariate Marginal Distribution Algorithm obtains better optimization results than the genetic algorithm.
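
    For readers unfamiliar with the idea, the sketch below implements a heavily simplified, chain-structured estimation-of-distribution algorithm on Onemax: after selection it estimates each bit's probability conditional on its left neighbour and samples new individuals from these pairwise (bivariate) marginals, with no crossover or mutation. The real Bivariate Marginal Distribution Algorithm learns a dependency forest with statistical tests rather than using a fixed chain, so treat this only as an illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

n_bits, pop_size, n_gens, eps = 40, 200, 60, 1e-3

def onemax(pop):
    return pop.sum(axis=1)

pop = rng.integers(0, 2, size=(pop_size, n_bits))
for gen in range(n_gens):
    fitness = onemax(pop)
    # Truncation selection: keep the better half of the population.
    selected = pop[np.argsort(fitness)[pop_size // 2:]]

    # Estimate the marginal for bit 0 and pairwise conditionals along a fixed chain.
    p0 = selected[:, 0].mean()
    new_pop = np.empty_like(pop)
    new_pop[:, 0] = rng.random(pop_size) < p0
    for i in range(1, n_bits):
        prev = selected[:, i - 1]
        # P(bit_i = 1 | bit_{i-1} = b), clipped to avoid complete fixation.
        p_given = np.array([
            np.clip(selected[prev == b, i].mean() if np.any(prev == b) else p0, eps, 1 - eps)
            for b in (0, 1)
        ])
        new_pop[:, i] = rng.random(pop_size) < p_given[new_pop[:, i - 1]]
    pop = new_pop.astype(int)

print("best fitness found:", onemax(pop).max(), "of", n_bits)
```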

  1. Bivariate discrete beta Kernel graduation of mortality data.

    Science.gov (United States)

    Mazza, Angelo; Punzo, Antonio

    2015-07-01

    Various parametric and nonparametric techniques have been proposed in the literature to graduate mortality data as a function of age. Nonparametric approaches, such as kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidth selection. Using simulation studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make the simulations realistic, a bivariate dataset, based on probabilities of dying recorded for US males, is used. The simulations confirm the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.

  2. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  3. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z.

    2011-01-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  4. Bivariational calculations for radiation transfer in an inhomogeneous participating media

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Machali, H.M.; Haggag, M.H.; Attia, M.T.

    1986-07-01

    Equations for radiation transfer are obtained for dispersive media with a space-dependent albedo. The bivariational bound principle is used to calculate the reflection and transmission coefficients for such media. Numerical results are given and compared. (author)

  5. Comparison between two bivariate Poisson distributions through the ...

    African Journals Online (AJOL)

    These two models are specified by their probability mass functions. ... To remedy this problem, Berkhout and Plug proposed a bivariate Poisson distribution that allows the correlation to be negative, zero, or positive.

  6. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Full Text Available Abstract This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML. The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular in the case when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  7. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    Science.gov (United States)

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of the bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model can not only model paired data with correlation, but also handle under- or over-dispersed data sets. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers with a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ1, λ2 and λ3). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.
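
    The common-shock (trivariate reduction) construction behind a bivariate Poisson model with parameters λ1, λ2 and λ3 can be simulated in a few lines; this is only a sketch of the base model, not of the diagonal inflation or the covariate regression used in the study:

    import numpy as np

    rng = np.random.default_rng(0)

    def rbivpois(n, lam1, lam2, lam3):
        """Common-shock bivariate Poisson draws: X1 = Y1 + Y3, X2 = Y2 + Y3
        with Yi ~ Poisson(lam_i) independent, so Cov(X1, X2) = lam3 >= 0."""
        y1, y2, y3 = rng.poisson(lam1, n), rng.poisson(lam2, n), rng.poisson(lam3, n)
        return y1 + y3, y2 + y3

    reported, carcass = rbivpois(100_000, lam1=2.0, lam2=1.5, lam3=0.8)
    print(np.cov(reported, carcass)[0, 1])   # close to lam3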

  8. Technical note: Towards a continuous classification of climate using bivariate colour mapping

    NARCIS (Netherlands)

    Teuling, A.J.

    2011-01-01

    Climate is often defined in terms of discrete classes. Here I use bivariate colour mapping to show that the global distribution of Köppen-Geiger climate classes can largely be reproduced by combining the simple means of two key states of the climate system (i.e., air temperature and relative

  9. A Basic Bivariate Structure of Personality Attributes Evident Across Nine Languages.

    Science.gov (United States)

    Saucier, Gerard; Thalmayer, Amber Gayle; Payne, Doris L; Carlson, Robert; Sanogo, Lamine; Ole-Kotikash, Leonard; Church, A Timothy; Katigbak, Marcia S; Somer, Oya; Szarota, Piotr; Szirmák, Zsofia; Zhou, Xinyue

    2014-02-01

    Here, two studies seek to characterize a parsimonious common-denominator personality structure with optimal cross-cultural replicability. Personality differences are observed in all human populations and cultures, but lexicons for personality attributes contain so many distinctions that parsimony is lacking. Models stipulating the most important attributes have been formulated by experts or by empirical studies drawing on experience in a very limited range of cultures. Factor analyses of personality lexicons of nine languages of diverse provenance (Chinese, Korean, Filipino, Turkish, Greek, Polish, Hungarian, Maasai, and Senoufo) were examined, and their common structure was compared to that of several prominent models in psychology. A parsimonious bivariate model showed evidence of substantial convergence and ubiquity across cultures. Analyses involving key markers of these dimensions in English indicate that they are broad dimensions involving the overlapping content of the interpersonal circumplex, models of communion and agency, and morality/warmth and competence. These "Big Two" dimensions-Social Self-Regulation and Dynamism-provide a common-denominator model involving the two most crucial axes of personality variation, ubiquitous across cultures. The Big Two might serve as an umbrella model serving to link diverse theoretical models and associated research literatures. © 2013 Wiley Periodicals, Inc.

  10. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    Science.gov (United States)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is the key component for many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model the rainfall-runoff variables because they either have constraints on the range of the dependence or a fixed form for the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allow different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of marginal distributions, which includes two steps, (a) using the nonparametric statistics approach to detect modes and the underlying probability density, and (b) fitting the appropriate parametric probability density functions; (ii) define the constraints based on the univariate analysis and the dependence structure; (iii) derive and validate the entropy-based joint distribution. To validate the method, the rainfall-runoff data are collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results of the entropy-based joint distribution indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit statistical tests confirm the marginal distributions re-derived reveal the underlying univariate probability densities which further

  11. A bivariate model for analyzing recurrent multi-type automobile failures

    Science.gov (United States)

    Sunethra, A. A.; Sooriyarachchi, M. R.

    2017-09-01

    The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures, where failures can occur due to various multi-type failure modes and these failures are repetitive such that more than one failure can occur from each failure mode. In analysing such automobile failures, both the time and the type of the failure serve as response variables. However, these two response variables are highly correlated with each other since the timing of failures has an association with the mode of the failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models, for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions that were suitable for the two responses of time to failure and type of failure were the Weibull distribution and the multinomial distribution, respectively. The proposed bivariate model was implemented in the SAS procedure PROC NLMIXED by programming appropriate likelihood functions. The performance of the bivariate model was compared with separate univariate models fitted for the two responses and it was identified that better performance is secured by

  12. Bivariate Genomic Footprinting Detects Changes in Transcription Factor Activity

    Directory of Open Access Journals (Sweden)

    Songjoon Baek

    2017-05-01

    Full Text Available In response to activating signals, transcription factors (TFs) bind DNA and regulate gene expression. TF binding can be measured by protection of the bound sequence from DNase digestion (i.e., footprint). Here, we report that 80% of TF binding motifs do not show a measurable footprint, partly because of a variable cleavage pattern within the motif sequence. To more faithfully portray the effect of TFs on chromatin, we developed an algorithm that captures two TF-dependent effects on chromatin accessibility: footprinting and motif-flanking accessibility. The algorithm, termed bivariate genomic footprinting (BaGFoot), efficiently detects TF activity. BaGFoot is robust to different accessibility assays (DNase-seq, ATAC-seq), all examined peak-calling programs, and a variety of cut bias correction approaches. BaGFoot reliably predicts TF binding and provides valuable information regarding the TFs affecting chromatin accessibility in various biological systems and following various biological events, including in cases where an absolute footprint cannot be determined.

  13. Preparation and bivariate analysis of suspensions of human chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    van den Engh, G.J.; Trask, B.J.; Gray, J.W.; Langlois, R.G.; Yu, L.C.

    1985-01-01

    Chromosomes were isolated from a variety of human cell types using a HEPES-buffered hypotonic solution (pH 8.0) containing KCl, MgSO4, dithioerythritol, and RNase. The chromosomes isolated by this procedure could be stained with a variety of fluorescent stains including propidium iodide, chromomycin A3, and Hoechst 33258. Addition of sodium citrate to the stained chromosomes was found to improve the total fluorescence resolution. High-quality bivariate Hoechst vs. chromomycin fluorescence distributions were obtained for chromosomes isolated from a human fibroblast cell strain, a human colon carcinoma cell line, and human peripheral blood lymphocyte cultures. Good flow karyotypes were also obtained from primary amniotic cell cultures. The Hoechst vs. chromomycin flow karyotypes of a given cell line, made at different times and at dye concentrations varying over fourfold ranges, show little variation in the relative peak positions of the chromosomes. The size of the DNA in chromosomes isolated using this procedure ranges from 20 to 50 kilobases. The described isolation procedure is simple, it yields high-quality flow karyotypes, and it can be used to prepare chromosomes from clinical samples. 22 references, 7 figures, 1 table.

  14. Bivariate Cointegration Analysis of Energy-Economy Interactions in Iran

    Directory of Open Access Journals (Sweden)

    Ismail Oladimeji Soile

    2015-12-01

    Full Text Available Fixing the prices of energy products below their opportunity cost for welfare and redistribution purposes is common among governments of many oil-producing developing countries. This has often resulted in huge energy consumption in developing countries, and the question that emerges is whether this increased energy consumption results in higher economic activity. Available statistics show that Iran's economic growth shrank for the first time in two decades from 2011, amidst the introduction of pricing reforms in 2010 and 2014, suggesting a relationship between energy use and economic growth. Accordingly, the study examined the causality and the likelihood of a long-term relationship between energy and economic growth in Iran. Unlike previous studies, which have focused on the effects and effectiveness of the reform, the paper investigates the rationale for the reform. The study applied a bivariate cointegration time series econometric approach. The results reveal a one-way causality running from economic growth to energy with no feedback, with evidence of a long-run connection. The implication of this is that energy conservation policy is not inimical to economic growth. This evidence lends further support to the ongoing subsidy reforms in Iran as a measure to check excessive and inefficient use of energy.
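
    A bivariate cointegration and causality check of this kind can be sketched with statsmodels; the series below are synthetic stand-ins, not the Iranian energy and GDP data used in the paper:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import coint, grangercausalitytests

    rng = np.random.default_rng(1)
    n = 50
    log_gdp = np.cumsum(rng.normal(0.03, 0.02, n))       # random-walk proxy for log GDP
    log_energy = 0.8 * log_gdp + rng.normal(0, 0.01, n)  # energy use tied to GDP

    # Engle-Granger test (null hypothesis: no cointegration)
    t_stat, p_value, _ = coint(log_energy, log_gdp)
    print("cointegration p-value:", round(p_value, 3))

    # Granger causality on growth rates: does GDP growth help predict energy growth?
    growth = pd.DataFrame({"d_energy": np.diff(log_energy), "d_gdp": np.diff(log_gdp)})
    grangercausalitytests(growth[["d_energy", "d_gdp"]], maxlag=2)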

  15. A bivariate optimal replacement policy for a multistate repairable system

    International Nuclear Information System (INIS)

    Zhang Yuanlin; Yam, Richard C.M.; Zuo, Ming J.

    2007-01-01

    In this paper, a deteriorating simple repairable system with k+1 states, including k failure states and one working state, is studied. It is assumed that the system after repair is not 'as good as new' and the deterioration of the system is stochastic. We consider a bivariate replacement policy, denoted by (T,N), in which the system is replaced when its working age has reached T or the number of failures it has experienced has reached N, whichever occurs first. The objective is to determine the optimal replacement policy (T,N)* such that the long-run expected profit per unit time is maximized. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal replacement policy can be determined analytically or numerically. We prove that the optimal policy (T,N)* is better than the optimal policy N* for a multistate simple repairable system. We also show that a general monotone process model for a multistate simple repairable system is equivalent to a geometric process model for a two-state simple repairable system in the sense that they have the same structure for the long-run expected profit (or cost) per unit time and the same optimal policy. Finally, a numerical example is given to illustrate the theoretical results

  16. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimation of the model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
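
    A sketch of how data from such a bivariate zero-inflated Poisson model can be generated (the common-shock parameterization and the parameter values below are assumptions; the Bayesian MCMC fitting of the study is not reproduced):

    import numpy as np

    rng = np.random.default_rng(42)

    def rbzip(n, lam1, lam2, lam3, pi0):
        """Bivariate zero-inflated Poisson draws: with probability pi0 the pair
        is an excess (0, 0) (e.g. donors who never return); otherwise the pair
        comes from a common-shock bivariate Poisson with covariance lam3."""
        never_return = rng.random(n) < pi0
        y1, y2, y3 = rng.poisson(lam1, n), rng.poisson(lam2, n), rng.poisson(lam3, n)
        donations = np.where(never_return, 0, y1 + y3)
        deferrals = np.where(never_return, 0, y2 + y3)
        return donations, deferrals

    d, f = rbzip(50_000, lam1=1.2, lam2=0.4, lam3=0.3, pi0=0.35)
    print((d == 0).mean(), (f == 0).mean(), np.corrcoef(d, f)[0, 1])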

  17. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  18. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)

    Copulas are applied to overcome the restriction of traditional bivariate frequency ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ..... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...

  19. A New Measure Of Bivariate Asymmetry And Its Evaluation

    International Nuclear Information System (INIS)

    Ferreira, Flavio Henn; Kolev, Nikolai Valtchev

    2008-01-01

    In this paper we propose a new measure of bivariate asymmetry, based on conditional correlation coefficients. A decomposition of the Pearson correlation coefficient in terms of its conditional versions is studied and an example of application of the proposed measure is given.

  20. Building Bivariate Tables: The compareGroups Package for R

    Directory of Open Access Journals (Sweden)

    Isaac Subirana

    2014-05-01

    Full Text Available The R package compareGroups provides functions meant to facilitate the construction of bivariate tables (descriptives of several variables for comparison between groups) and generates reports in several formats (LaTeX, HTML or plain text CSV). Moreover, bivariate tables can be viewed directly on the R console in a nice format. A graphical user interface (GUI) has been implemented to build the bivariate tables more easily for those users who are not familiar with the R software. Some new functions and methods have been incorporated in the newest version of the compareGroups package (version 1.x) to deal with time-to-event variables, stratifying tables, merging several tables, and revising the statistical methods used. The GUI interface also has been improved, making it much easier and more intuitive to set the inputs for building the bivariate tables. The first version (version 0.x) and this version were presented at the 2010 useR! conference (Sanz, Subirana, and Vila 2010) and the 2011 useR! conference (Sanz, Subirana, and Vila 2011), respectively. Package compareGroups is available from the Comprehensive R Archive Network at http://CRAN.R-project.org/package=compareGroups.

  1. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular bivariate function spline, built and algorithmically implemented in previous papers. The properties typical of this family of splines impact the field of computer graphics, in particular that of reverse engineering.

  2. Dairy shows different associations with abdominal and BMI-defined overweight: Cross-sectional analyses exploring a variety of dairy products.

    Science.gov (United States)

    Brouwer-Brolsma, E M; Sluik, D; Singh-Povel, C M; Feskens, E J M

    2018-05-01

    Previous studies have suggested weight-regulatory properties for several dairy nutrients, but population-based studies on dairy and body weight are inconclusive. We explored cross-sectional associations between dairy consumption and indicators of overweight. We included 114,682 Dutch adults, aged ≥18 years. Dairy consumption was quantified by a food frequency questionnaire. Abdominal overweight was defined as waist circumference (WC) ≥88 cm (women) or ≥102 cm (men) (n = 37,391), overweight as BMI ≥25-30 kg/m2 (n = 44,772) and obesity as BMI ≥30 kg/m2 (n = 15,339). Associations were quantified by logistic (abdominal overweight, no/yes), multinomial logistic (BMI-defined overweight and obesity) and linear regression analyses (continuous measures of WC and BMI), and they were adjusted for relevant covariates. Total dairy showed a positive association with abdominal overweight (OR Q1 ref vs. Q5: 1.09; 95% CI: 1.04-1.14) and with BMI-defined overweight (OR Q5 1.13; 95% CI: 1.08-1.18) and obesity (OR Q5 1.09; 95% CI: 1.02-1.16). Skimmed, semi-skimmed and non-fermented dairy also showed positive associations with overweight categories. Full-fat dairy showed an inverse association with overweight and obesity (OR Q5 for obesity: 0.78; 95% CI: 0.73-0.83). Moreover, inverse associations were observed for yoghurt and custard and positive associations for milk, buttermilk, flavoured yoghurt drinks, cheese and cheese snacks. Fermented dairy, curd cheese and Dutch cheese did not show a consistent association with overweight categories. Total, skimmed, semi-skimmed and non-fermented dairy; milk; buttermilk; flavoured yoghurt drinks; total cheese and cheese snacks showed a positive association with overweight categories, whereas full-fat dairy, custard and yoghurt showed an inverse association with overweight categories. Copyright © 2018 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human

  3. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone... as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass. Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total...

  4. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  5. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  6. Chain Plot: A Tool for Exploiting Bivariate Temporal Structures

    OpenAIRE

    Taylor, CC; Zempeni, A

    2004-01-01

    In this paper we present a graphical tool useful for visualizing the cyclic behaviour of bivariate time series. We investigate its properties and link it to the asymmetry of the two variables concerned. We also suggest adding approximate confidence bounds to the points on the plot and investigate the effect of lagging on the chain plot. We conclude our paper with some standard Fourier analysis, relating and comparing this to the chain plot.

  7. Spectrum-based estimators of the bivariate Hurst exponent

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 90, č. 6 (2014), art. 062802 ISSN 1539-3755 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: bivariate Hurst exponent * power-law cross-correlations * estimation Subject RIV: AH - Economics Impact factor: 2.288, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0436818.pdf

  8. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
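
    The joint ("AND") return period underlying such an analysis can be computed from the marginal probabilities and a copula; the Clayton copula and the numbers below are assumptions for illustration, not the dependence model fitted in the paper:

    import numpy as np

    def clayton_cdf(u, v, theta):
        """Clayton copula CDF, theta > 0."""
        return (u**(-theta) + v**(-theta) - 1.0)**(-1.0 / theta)

    def joint_return_period_and(u, v, theta, mu=1.0):
        """Return period (in units of mu, e.g. years) of the event
        {X > x and Y > y}, with u = F_X(x) and v = F_Y(y)."""
        p_both_exceed = 1.0 - u - v + clayton_cdf(u, v, theta)
        return mu / p_both_exceed

    # both variables above their 90th percentiles under moderate dependence
    print(joint_return_period_and(0.9, 0.9, theta=1.5))   # roughly 46 years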

  9. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.
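
    The distance-based flagging idea can be illustrated with a robust elliptical fit in place of the bivariate skew-t used by the authors (the simulated wind data, the MinCovDet estimator and the chi-square cutoff are all assumptions of this sketch):

    import numpy as np
    from scipy import stats
    from sklearn.covariance import MinCovDet

    rng = np.random.default_rng(13)
    winds = rng.multivariate_normal([5.0, 1.0], [[9.0, 3.0], [3.0, 4.0]], 2000)
    winds[:10] += rng.normal(0, 40, size=(10, 2))   # inject a few gross errors

    mcd = MinCovDet(random_state=0).fit(winds)      # robust location and scatter
    d2 = mcd.mahalanobis(winds)                     # squared robust distances
    flagged = d2 > stats.chi2.ppf(0.999, df=2)      # distance-based outlier flag
    print(int(flagged.sum()), "observations flagged")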

  10. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying; Hering, Amanda S.; Browning, Joshua M.

    2017-01-01

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.

  11. A systematic review and meta-analyses show that carbapenem use and medical devices are the leading risk factors for carbapenem-resistant Pseudomonas aeruginosa

    NARCIS (Netherlands)

    A.F. Voor (Anne); J.A. Severin (Juliëtte); E.M.E.H. Lesaffre (Emmanuel); M.C. Vos (Margreet)

    2014-01-01

    A systematic review and meta-analyses were performed to identify the risk factors associated with carbapenem-resistant Pseudomonas aeruginosa and to identify sources and reservoirs for the pathogen. A systematic search of PubMed and Embase databases from 1 January 1987 until 27 January

  12. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
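
    A minimal sketch of the reconstruction step with a small neural network (the arrays are hypothetical placeholders for the instrumental overlap period and the nine tree-ring chronologies; the copula-based drought analysis is not shown):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    X_overlap = rng.normal(size=(100, 9))     # tree-ring indices, instrumental years
    q_overlap = 10 + X_overlap @ rng.normal(size=9) + rng.normal(0, 0.5, 100)
    X_past = rng.normal(size=(312, 9))        # indices for 1560-1871

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                       random_state=0))
    model.fit(X_overlap, q_overlap)           # calibrate on the overlap period
    q_reconstructed = model.predict(X_past)   # reconstructed annual streamflow
    print(q_reconstructed[:5])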

  13. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was......, on the other hand, lighter than the single-step method....

  14. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  15. Recurrent major depression and right hippocampal volume: A bivariate linkage and association study.

    Science.gov (United States)

    Mathias, Samuel R; Knowles, Emma E M; Kent, Jack W; McKay, D Reese; Curran, Joanne E; de Almeida, Marcio A A; Dyer, Thomas D; Göring, Harald H H; Olvera, Rene L; Duggirala, Ravi; Fox, Peter T; Almasy, Laura; Blangero, John; Glahn, David C

    2016-01-01

    Previous work has shown that the hippocampus is smaller in the brains of individuals suffering from major depressive disorder (MDD) than those of healthy controls. Moreover, right hippocampal volume specifically has been found to predict the probability of subsequent depressive episodes. This study explored the utility of right hippocampal volume as an endophenotype of recurrent MDD (rMDD). We observed a significant genetic correlation between the two traits in a large sample of Mexican American individuals from extended pedigrees (ρg = -0.34, p = 0.013). A bivariate linkage scan revealed a significant pleiotropic quantitative trait locus on chromosome 18p11.31-32 (LOD = 3.61). Bivariate association analysis conducted under the linkage peak revealed a variant (rs574972) within an intron of the gene SMCHD1 meeting the corrected significance level (χ² = 19.0, p = 7.4 × 10⁻⁵). Univariate association analyses of each phenotype separately revealed that the same variant was significant for right hippocampal volume alone, and also revealed a suggestively significant variant (rs12455524) within the gene DLGAP1 for rMDD alone. The results implicate right-hemisphere hippocampal volume as a possible endophenotype of rMDD, and in so doing highlight a potential gene of interest for rMDD risk. © 2015 Wiley Periodicals, Inc.

  16. Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.

    Science.gov (United States)

    Vetter, Thomas R; Mascha, Edward J

    2018-01-01

    Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal, thereby accepting the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intragroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate. The
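
    The tests discussed above map directly onto standard library calls; a small sketch with simulated data (group sizes and values are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    group_a = rng.normal(50, 10, 40)                 # continuous outcome, group A
    group_b = rng.normal(55, 10, 40)                 # continuous outcome, group B

    print(stats.ttest_ind(group_a, group_b))         # unpaired (independent-samples) t test
    print(stats.ttest_rel(group_a, group_a + rng.normal(2, 3, 40)))  # paired t test

    table = np.array([[30, 10],
                      [22, 18]])                     # 2 x 2 contingency table
    print(stats.chi2_contingency(table))             # Pearson chi-square test

    print(stats.mannwhitneyu(group_a, group_b))      # ordinal / non-normal alternative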

  17. Perceived Social Support and Academic Achievement: Cross-Lagged Panel and Bivariate Growth Curve Analyses

    Science.gov (United States)

    Mackinnon, Sean P.

    2012-01-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help…

  18. Metabolic and molecular analyses of white mutant Vaccinium berries show down-regulation of MYBPA1-type R2R3 MYB regulatory factor.

    Science.gov (United States)

    Primetta, Anja K; Karppinen, Katja; Riihinen, Kaisu R; Jaakola, Laura

    2015-09-01

    MYBPA1-type R2R3 MYB transcription factor shows down-regulation in white mutant berries of Vaccinium uliginosum that are deficient in anthocyanins but not proanthocyanidins, suggesting a role in the regulation of anthocyanin biosynthesis. Berries of the genus Vaccinium are among the best natural sources of flavonoids. In this study, the expression of structural and regulatory flavonoid biosynthetic genes and the accumulation of flavonoids in white mutant and blue-colored wild-type bog bilberry (V. uliginosum) fruits were measured at different stages of berry development. In contrast to the high contents of anthocyanins in ripe blue-colored berries, only traces were detected by HPLC-ESI-MS in ripe white mutant berries. However, similar profiles and high levels of flavonol glycosides and proanthocyanidins were quantified in both ripe white and ripe wild-type berries. Analysis with qRT-PCR showed strong down-regulation of the structural genes chalcone synthase (VuCHS), dihydroflavonol 4-reductase (VuDFR) and anthocyanidin synthase (VuANS), as well as of the MYBPA1-type transcription factor VuMYBPA1, in white berries during ripening compared to wild-type berries. The profiles of transcript accumulation of chalcone isomerase (VuCHI), anthocyanidin reductase (VuANR), leucoanthocyanidin reductase (VuLAR) and flavonoid 3'5'-hydroxylase (VuF3'5'H) were more similar between the white and the wild-type berries during fruit development, while expression of UDP-glucose: flavonoid 3-O-glucosyltransferase (VuUFGT) showed a similar trend but a fourfold lower level in the white mutant. VuMYBPA1, the R2R3 MYB family member, is a homologue of VmMYB2 of V. myrtillus and VcMYBPA1 of V. corymbosum and belongs to the MYBPA1-type MYB family, whose members have been shown in some species to be related to proanthocyanidin biosynthesis in fruits. Our results, combined with earlier data on the role of VmMYB2 in white mutant berries of V. myrtillus, suggest that the regulation of anthocyanin biosynthesis in Vaccinium species could differ

  19. Comparative sequence, structure and redox analyses of Klebsiella pneumoniae DsbA show that anti-virulence target DsbA enzymes fall into distinct classes.

    Directory of Open Access Journals (Sweden)

    Fabian Kurth

    Full Text Available Bacterial DsbA enzymes catalyze oxidative folding of virulence factors, and have been identified as targets for antivirulence drugs. However, DsbA enzymes characterized to date exhibit a wide spectrum of redox properties and divergent structural features compared to the prototypical DsbA enzyme of Escherichia coli DsbA (EcDsbA). Nonetheless, sequence analysis shows that DsbAs are more highly conserved than their known substrate virulence factors, highlighting the potential to inhibit virulence across a range of organisms by targeting DsbA. For example, Salmonella enterica typhimurium (SeDsbA, 86 % sequence identity to EcDsbA) shares almost identical structural, surface and redox properties. Using comparative sequence and structure analysis we predicted that five other bacterial DsbAs would share these properties. To confirm this, we characterized Klebsiella pneumoniae DsbA (KpDsbA, 81 % identity to EcDsbA). As expected, the redox properties, structure and surface features (from crystal and NMR data) of KpDsbA were almost identical to those of EcDsbA and SeDsbA. Moreover, KpDsbA and EcDsbA bind peptides derived from their respective DsbBs with almost equal affinity, supporting the notion that compounds designed to inhibit EcDsbA will also inhibit KpDsbA. Taken together, our data show that DsbAs fall into different classes; that DsbAs within a class may be predicted by sequence analysis of binding loops; that DsbAs within a class are able to complement one another in vivo and that compounds designed to inhibit EcDsbA are likely to inhibit DsbAs within the same class.

  20. SNPMClust: Bivariate Gaussian Genotype Clustering and Calling for Illumina Microarrays

    Directory of Open Access Journals (Sweden)

    Stephen W. Erickson

    2016-07-01

    Full Text Available SNPMClust is an R package for genotype clustering and calling with Illumina microarrays. It was originally developed for studies using the GoldenGate custom genotyping platform but can be used with other Illumina platforms, including Infinium BeadChip. The algorithm first rescales the fluorescent signal intensity data, adds empirically derived pseudo-data to minor allele genotype clusters, then uses the package mclust for bivariate Gaussian model fitting. We compared the accuracy and sensitivity of SNPMClust to that of GenCall, Illumina's proprietary algorithm, on a data set of 94 whole-genome amplified buccal (cheek swab) DNA samples. These samples were genotyped on a custom panel which included 1064 SNPs for which the true genotype was known with high confidence. SNPMClust produced uniformly lower false call rates over a wide range of overall call rates.
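
    The underlying idea, three bivariate Gaussian clusters in two-channel intensity space, can be sketched with scikit-learn in place of mclust (the simulated intensities and cluster centres are assumptions; the pseudo-data augmentation of SNPMClust is omitted):

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(11)
    aa = rng.multivariate_normal([5.0, 0.5], np.diag([0.10, 0.05]), 60)
    ab = rng.multivariate_normal([3.0, 3.0], np.diag([0.10, 0.10]), 30)
    bb = rng.multivariate_normal([0.5, 5.0], np.diag([0.05, 0.10]), 10)
    X = np.vstack([aa, ab, bb])                     # two-channel intensities, one SNP

    gmm = GaussianMixture(n_components=3, covariance_type="full",
                          random_state=0).fit(X)    # one component per genotype cluster
    calls = gmm.predict(X)                          # cluster labels ~ genotype calls
    confidence = gmm.predict_proba(X).max(axis=1)   # posterior probability of each call
    print(np.bincount(calls), round(confidence.min(), 3))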

  1. Selection effects in the bivariate brightness distribution for spiral galaxies

    International Nuclear Information System (INIS)

    Phillipps, S.; Disney, M.

    1986-01-01

    The joint distribution of total luminosity and characteristic surface brightness (the bivariate brightness distribution) is investigated for a complete sample of spiral galaxies in the Virgo cluster. The influence of selection and physical limits of various kinds on the apparent distribution is detailed. While the distribution of surface brightness for bright galaxies may be genuinely fairly narrow, faint galaxies exist right across the (quite small) range of accessible surface brightnesses, so no statement can be made about the true extent of the distribution. The lack of high surface brightness bright galaxies in the Virgo sample relative to an overall RC2 sample (mostly field galaxies) supports the contention that the star-formation rate is reduced in the inner region of the cluster for environmental reasons. (author)

  2. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. In the first study different target width parameters were investigated. The second study considered the effect of the motion angle. Based on the results of the two studies a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task 250 rectangular target objects were displayed at a randomly chosen position on the screen, covering a broad range of ID values (ID = [1.01; 4.88]). Compared to existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application of the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and the size of buttons, menus or icons.
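
    For reference, the classic univariate form of Fitts' Law is easy to compute; the refined bivariate model's extra width and angle terms are not specified in the abstract, so only the baseline with illustrative coefficients is sketched here:

    import numpy as np

    def index_of_difficulty(distance, width):
        """Shannon formulation: ID = log2(D / W + 1)."""
        return np.log2(distance / width + 1.0)

    def movement_time(distance, width, a=0.1, b=0.15):
        """Classic Fitts' Law, MT = a + b * ID, with a and b in seconds
        (illustrative values, not the fitted coefficients of the study)."""
        return a + b * index_of_difficulty(distance, width)

    print(movement_time(distance=400.0, width=40.0))   # predicted pointing time (s)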

  3. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-changing dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  4. A bivariate space-time downscaler under space and time misalignment.

    Science.gov (United States)

    Berrocal, Veronica J; Gelfand, Alan E; Holland, David M

    2010-12-01

    Ozone and particulate matter PM2.5 are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM2.5 is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM2.5 for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain.

  5. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and then all the possible pairwise combinations among the 30-feature set are used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features among those and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as of the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h⁻¹.
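
    A sketch of the feature pipeline (band powers for 5 bands x 6 channels, then all 435 pairwise combinations fed to an SVM); the sampling rate, window length, the use of ratios as the pairwise combination, and the random labels are assumptions of this sketch, and the "Firing Power" regularization is omitted:

    import numpy as np
    from itertools import combinations
    from scipy.signal import welch
    from sklearn.svm import SVC

    FS = 256                                    # sampling rate in Hz (assumed)
    BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}

    def band_powers(window):
        """Spectral power of 5 bands x 6 channels -> 30 base features."""
        feats = []
        for channel in window:                  # window shape: (6, n_samples)
            f, pxx = welch(channel, fs=FS, nperseg=2 * FS)
            for lo, hi in BANDS.values():
                sel = (f >= lo) & (f < hi)
                feats.append(np.sum(pxx[sel]) * (f[1] - f[0]))  # integrated band power
        return np.asarray(feats)

    def pairwise_features(window):
        """All 435 pairwise combinations of the 30 band powers (ratios here)."""
        base = band_powers(window)
        return np.array([base[i] / (base[j] + 1e-12)
                         for i, j in combinations(range(base.size), 2)])

    rng = np.random.default_rng(5)
    X = np.array([pairwise_features(rng.normal(size=(6, 5 * FS))) for _ in range(40)])
    y = rng.integers(0, 2, 40)                  # preictal / non-preictal labels
    clf = SVC(kernel="rbf").fit(X, y)
    print(X.shape, clf.score(X, y))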

  6. Asymptotics of bivariate generating functions with algebraic singularities

    Science.gov (United States)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple, we again compute the explicit locations of peak amplitudes. In a scaling window of size the square root of n near the peaks, each amplitude is asymptotic to an Airy function.

  7. Bivariate-t distribution for transition matrix elements in Breit-Wigner to Gaussian domains of interacting particle systems.

    Science.gov (United States)

    Kota, V K B; Chavda, N D; Sahu, R

    2006-04-01

    Interacting many-particle systems with a mean-field one-body part plus a chaos-generating random two-body interaction having strength λ exhibit Poisson to Gaussian orthogonal ensemble and Breit-Wigner (BW) to Gaussian transitions in level fluctuations and strength functions, with transition points marked by λ = λc and λ = λF, respectively; λF > λc. For these systems a theory for the matrix elements of one-body transition operators is available, as valid in the Gaussian domain, with λ > λF, in terms of orbital occupation numbers, level densities, and an integral involving a bivariate Gaussian in the initial and final energies. Here we show that, using a bivariate-t distribution, the theory extends below from the Gaussian regime to the BW regime up to λ = λc. This is well tested in numerical calculations for 6 spinless fermions in 12 single-particle states.

  8. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    Directory of Open Access Journals (Sweden)

    S. S. Motsa

    2014-01-01

    Full Text Available This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature.

  9. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    Science.gov (United States)

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature.
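
    One building block of such a scheme is the Chebyshev collocation differentiation matrix in each spatial variable; a standard construction (Trefethen's) is sketched below, independent of the quasilinearisation step:

    import numpy as np

    def cheb(N):
        """Chebyshev-Gauss-Lobatto points on [-1, 1] and the first-order
        spectral differentiation matrix (Trefethen's classic recipe)."""
        if N == 0:
            return np.zeros((1, 1)), np.array([1.0])
        x = np.cos(np.pi * np.arange(N + 1) / N)
        c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
        X = np.tile(x, (N + 1, 1)).T
        D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))
        D -= np.diag(D.sum(axis=1))             # diagonal from the row-sum identity
        return D, x

    D, x = cheb(16)
    # spectral accuracy check: differentiate sin(pi * x)
    print(np.max(np.abs(D @ np.sin(np.pi * x) - np.pi * np.cos(np.pi * x))))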

  10. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2017-07-01

    Full Text Available Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature–precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 % than models relying directly on temperature and precipitation as predictors (36 %. Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate–crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
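
    As a rough illustration of the idea (not the study's data or exact estimator), the sketch below computes empirical joint return periods along a hot-and-dry gradient from synthetic temperature, precipitation and yield series, and compares a linear yield model built on the log return period with one built directly on temperature and precipitation. All variable values are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic yearly series: summer temperature (degC), precipitation (mm), crop yield (t/ha)
    n = 60
    temp = rng.normal(18, 2, n)
    precip = rng.normal(300, 60, n) - 10 * (temp - 18)        # drier summers tend to be hotter
    yield_t = (6.0
               - 0.002 * np.maximum(temp - 19, 0) * np.maximum(250 - precip, 0)
               + rng.normal(0, 0.3, n))

    def joint_return_period(t, p):
        """Empirical return period of 'at least as hot as t AND at least as dry as p'."""
        prob = np.mean((temp >= t) & (precip <= p))
        return 1.0 / prob                 # each observed year counts itself, so prob >= 1/n

    rp = np.array([joint_return_period(t, p) for t, p in zip(temp, precip)])

    # Compare a linear yield model on the log joint return period with one on raw T and P
    designs = {"log joint return period": np.column_stack([np.ones(n), np.log(rp)]),
               "temperature + precipitation": np.column_stack([np.ones(n), temp, precip])}
    for name, X in designs.items():
        beta, *_ = np.linalg.lstsq(X, yield_t, rcond=None)
        r2 = 1.0 - np.var(yield_t - X @ beta) / np.var(yield_t)
        print(f"{name}: R^2 = {r2:.2f}")
    ```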

  11. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at a regional and probabilistic evaluation of bivariate drought characteristics to assess both past and future drought duration and severity in Bangladesh. The procedure involves applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. These results suggest that, compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under- or overestimates the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed on the west side of the country. Future drought trends based on four climate models and two scenarios showed the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
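
    A minimal sketch of the copula step is given below: a hand-coded Gumbel copula combined with assumed gamma marginals yields "AND" and "OR" joint return periods for a drought of given duration and severity. The marginal parameters, copula parameter theta and mean interarrival time mu are placeholders, not values fitted in the study.

    ```python
    import numpy as np
    from scipy import stats

    def gumbel_copula_cdf(u, v, theta):
        """Gumbel-Hougaard copula C(u, v); theta >= 1."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # Assumed fitted marginals for drought duration (months) and severity (SPI sum), and copula
    F_dur = stats.gamma(a=2.0, scale=2.5)
    F_sev = stats.gamma(a=1.8, scale=3.0)
    theta = 2.0                         # assumed Gumbel dependence parameter
    mu = 1.6                            # assumed mean interarrival time of drought events (years)

    def joint_return_periods(d, s):
        u, v = F_dur.cdf(d), F_sev.cdf(s)
        C = gumbel_copula_cdf(u, v, theta)
        p_and = 1.0 - u - v + C         # P(duration > d AND severity > s)
        p_or = 1.0 - C                  # P(duration > d OR severity > s)
        return mu / p_and, mu / p_or

    T_and, T_or = joint_return_periods(d=8.0, s=10.0)
    print(f"AND return period: {T_and:.1f} yr, OR return period: {T_or:.1f} yr")
    print("Univariate return periods:",
          f"{mu / F_dur.sf(8.0):.1f} yr (duration), {mu / F_sev.sf(10.0):.1f} yr (severity)")
    ```

    As in the study, the "AND" return period exceeds the univariate ones and the "OR" return period falls below them.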

  12. Genetic correlations between body condition scores and fertility in dairy cattle using bivariate random regression models.

    Science.gov (United States)

    De Haas, Y; Janss, L L G; Kadarmideen, H N

    2007-10-01

    Genetic correlations between body condition score (BCS) and fertility traits in dairy cattle were estimated using bivariate random regression models. BCS was recorded by the Swiss Holstein Association on 22,075 lactating heifers (primiparous cows) from 856 sires. Fertility data during first lactation were extracted for 40,736 cows. The fertility traits were days to first service (DFS), days between first and last insemination (DFLI), calving interval (CI), number of services per conception (NSPC) and conception rate to first insemination (CRFI). A bivariate model was used to estimate genetic correlations between BCS, treated as a longitudinal trait through random regression components, and daughter fertility at the sire level, treated as a single lactation measurement. Heritability of BCS was 0.17, and heritabilities for fertility traits were low (0.01-0.08). Genetic correlations between BCS and fertility over the lactation varied from -0.45 to -0.14 for DFS; from -0.75 to 0.03 for DFLI; from -0.59 to -0.02 for CI; from -0.47 to 0.33 for NSPC; and from 0.08 to 0.82 for CRFI. These results show (genetic) interactions between fat reserves and reproduction along the lactation trajectory of modern dairy cows, which can be useful in genetic selection as well as in management. Maximum genetic gain in fertility from indirect selection on BCS should be based on measurements taken in mid lactation, when the genetic variance for BCS is largest and the genetic correlations between BCS and fertility are strongest.

  13. Aquaculture in artificially developed wetlands in urban areas: an application of the bivariate relationship between soil and surface water in landscape ecology.

    Science.gov (United States)

    Paul, Abhijit

    2011-01-01

    Wetlands show a strong bivariate relationship between soil and surface water. Artificially developed wetlands help to build landscape ecology and make built environments sustainable. The bheries, wetlands of eastern Calcutta (India), utilize the city sewage to develop urban aquaculture that supports the local fish industries and opens a new frontier in sustainable environmental planning research.

  14. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area are covered by the poor groundwater-potential zone, and 43.93% and 36.3% of the Broujerd region are covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which indicates that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
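
    In its generic form, the statistical index (SI) weight of a factor class is the log ratio of the spring density within that class to the spring density over the whole area. The sketch below implements that generic formula with invented class counts; it illustrates the technique, not the study's implementation or data.

    ```python
    import numpy as np

    def statistical_index(spring_counts, cell_counts):
        """
        Statistical index (SI) weight per class of a conditioning factor:
        SI_i = ln( spring density in class i / spring density over the whole area ).
        """
        spring_counts = np.asarray(spring_counts, dtype=float)
        cell_counts = np.asarray(cell_counts, dtype=float)
        class_density = spring_counts / cell_counts
        overall_density = spring_counts.sum() / cell_counts.sum()
        with np.errstate(divide="ignore"):
            si = np.log(class_density / overall_density)
        # Classes with no springs get a strongly negative weight instead of -inf
        return np.where(np.isfinite(si), si, np.log(1e-6))

    # Hypothetical slope classes (counts are illustrative, not the study's data)
    springs_per_class = [120, 210, 110, 40, 16]
    cells_per_class = [50_000, 90_000, 80_000, 60_000, 40_000]
    print(np.round(statistical_index(springs_per_class, cells_per_class), 3))
    # A groundwater potential map is then the cell-wise sum of SI weights over all factors.
    ```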

  15. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    Science.gov (United States)

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
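
    A minimal sketch of that idea: each of the common averages can be recovered from an intercept-only least-squares regression after a suitable transformation of the data (the exact framing used in the note may differ). The numbers below are an invented example.

    ```python
    import numpy as np

    def ols_intercept(y, w=None):
        """Intercept of a constant-only (optionally weighted) least-squares regression."""
        X = np.ones((len(y), 1))
        w = np.ones(len(y)) if w is None else np.asarray(w, dtype=float)
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return beta[0]

    y = np.array([2.0, 4.0, 8.0, 16.0])
    w = np.array([1.0, 1.0, 2.0, 4.0])

    arithmetic = ols_intercept(y)                    # regress y on a constant
    geometric = np.exp(ols_intercept(np.log(y)))     # regress log(y) on a constant, exponentiate
    harmonic = 1.0 / ols_intercept(1.0 / y)          # regress 1/y on a constant, invert
    weighted = ols_intercept(y, w)                   # weighted least squares gives the weighted mean

    print(arithmetic, geometric, harmonic, weighted)
    # 7.5, ~5.66, ~4.27, and the w-weighted average 10.75
    ```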

  16. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Ali Haghizadeh

    2017-11-23

    This study shows the potency of two GIS-based data-driven bivariate techniques, the statistical index (SI) and Dempster-Shafer theory (DST), for groundwater potential analysis in the Broujerd region of Iran.

  17. Improving runoff risk estimates: Formulating runoff as a bivariate process using the SCS curve number method

    Science.gov (United States)

    Shaw, Stephen B.; Walter, M. Todd

    2009-03-01

    The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
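
    For reference, the classic SCS-CN runoff equation is Q = (P - Ia)^2 / (P - Ia + S) with S = 1000/CN - 10 (depths in inches) and Ia typically 0.2 S. The sketch below applies it to one design storm under several antecedent moisture states with assumed frequencies, which conveys the bivariate flavour of the proposal; the CN values and state probabilities are illustrative, not the authors' calibration to base flow.

    ```python
    import numpy as np

    def scs_cn_runoff(P, CN, ia_ratio=0.2):
        """Classic SCS curve number runoff (depths in inches)."""
        S = 1000.0 / CN - 10.0                 # potential maximum retention
        Ia = ia_ratio * S                      # initial abstraction
        return np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)

    # Treat runoff as a bivariate outcome: one design storm combined with several
    # antecedent moisture states (encoded here as CN values with assumed frequencies).
    P_design = 3.5                                     # e.g. a 24-h design rainfall (in)
    cn_states = np.array([65.0, 75.0, 85.0])           # dry / average / wet watershed states
    state_prob = np.array([0.3, 0.5, 0.2])             # assumed frequencies of those states

    Q_states = scs_cn_runoff(P_design, cn_states)
    print("Runoff by moisture state (in):", np.round(Q_states, 2))
    print("Probability-weighted runoff (in):", round(float(state_prob @ Q_states), 2))
    # The traditional single-CN, 1:1 rainfall-to-runoff assumption reports only the middle value.
    ```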

  18. The Role of Wealth and Health in Insurance Choice: Bivariate Probit Analysis in China

    Directory of Open Access Journals (Sweden)

    Yiding Yue

    2014-01-01

    Full Text Available This paper captures the correlation between the choices of health insurance and pension insurance using the bivariate probit model and then studies the effect of wealth and health on insurance choice. Our empirical evidence shows that people who participate in a health care program are more likely to participate in a pension plan at the same time, while wealth and health have different effects on the choices of the health care program and the pension program. Generally, the higher an individual's wealth level is, the more likely he is to participate in a health care program; but wealth has no effect on pension participation. Health status has opposite effects on the choices of health care programs and pension plans: the poorer an individual's health is, the more likely he is to participate in health care programs, while the better his health, the more likely he is to participate in pension plans. When the investigation scope narrows down to commercial insurance, only health status has a significant effect on commercial health insurance. The commercial insurance choice and the insurance choice of the agricultural population are more complicated.
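
    A hedged sketch of a bivariate probit model fitted by direct maximum likelihood is shown below, using simulated stand-ins for the wealth/health covariates and the two insurance choices (all coefficients and sample sizes are invented). It illustrates the model class used in the paper, not the paper's data or estimates.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)

    # Simulated stand-ins: intercept, wealth and health scores, two binary insurance choices
    n = 400
    X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
    rho_true = 0.4
    eps = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
    y1 = (X @ np.array([0.2, 0.8, -0.5]) + eps[:, 0] > 0).astype(int)   # "health care program"
    y2 = (X @ np.array([-0.1, 0.6, 0.3]) + eps[:, 1] > 0).astype(int)   # "pension plan"

    def negloglik(params):
        k = X.shape[1]
        b1, b2, rho = params[:k], params[k:2 * k], np.tanh(params[-1])
        q1, q2 = 2 * y1 - 1, 2 * y2 - 1
        ll = 0.0
        for i in range(n):
            r = rho * q1[i] * q2[i]
            p = multivariate_normal.cdf([q1[i] * (X[i] @ b1), q2[i] * (X[i] @ b2)],
                                        mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]])
            ll += np.log(max(p, 1e-12))
        return -ll

    fit = minimize(negloglik, np.zeros(2 * X.shape[1] + 1), method="BFGS")
    k = X.shape[1]
    print("coefficients (choice 1):", np.round(fit.x[:k], 2))
    print("coefficients (choice 2):", np.round(fit.x[k:2 * k], 2))
    print("estimated rho:", round(float(np.tanh(fit.x[-1])), 2))
    ```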

  19. Two new bivariate zero-inflated generalized Poisson distributions with a flexible correlation structure

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2015-05-01

    Full Text Available To model correlated bivariate count data with extra zero observations, this paper proposes two new bivariate zero-inflated generalized Poisson (ZIGP) distributions by incorporating a multiplicative factor (or dependency parameter) λ, named Type I and Type II bivariate ZIGP distributions, respectively. The proposed distributions possess a flexible correlation structure and can be used to fit either positively or negatively correlated and either over- or under-dispersed count data, compared to existing models that can only fit positively correlated count data with over-dispersion. The two marginal distributions of the Type I bivariate ZIGP share a common zero-inflation parameter, while the two marginal distributions of the Type II bivariate ZIGP have their own zero-inflation parameters, resulting in a much wider range of applications. The important distributional properties are explored and some useful statistical inference methods, including maximum likelihood estimation of parameters, standard error estimation, bootstrap confidence intervals and related hypothesis tests, are developed for the two distributions. A real data set is thoroughly analyzed using the proposed distributions and statistical methods. Several simulation studies are conducted to evaluate the performance of the proposed methods.

  20. Bivariate copulas on the exponentially weighted moving average control chart

    Directory of Open Access Journals (Sweden)

    Sasigarn Kuvattana

    2016-10-01

    Full Text Available This paper proposes four types of copulas on the Exponentially Weighted Moving Average (EWMA) control chart when observations are from an exponential distribution, using a Monte Carlo simulation approach. The performance of the control chart is based on the Average Run Length (ARL), which is compared for each copula. Copula functions for specifying dependence between random variables are used and measured by Kendall's tau. The results show that the Normal copula can be used for almost all shifts.
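
    As a rough illustration of this kind of study (not the paper's exact design), the sketch below simulates exponential observations linked by a Normal copula, runs an EWMA on each component with approximate normal-theory limits, and estimates ARLs by Monte Carlo. The smoothing constant, limit width and copula parameter are assumed values.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)

    def correlated_exponentials(n, rho, mean=1.0):
        """n pairs with exponential marginals linked by a Normal (Gaussian) copula."""
        z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
        u = norm.cdf(z)                      # copula (uniform) scale
        return -mean * np.log(1.0 - u)       # inverse exponential CDF, applied column-wise

    def run_length(lam=0.1, L=2.7, rho=0.5, shift=1.0, mean0=1.0, max_n=20_000):
        """Steps until the EWMA of either component leaves its (approximate) control limits."""
        sigma_z = mean0 * np.sqrt(lam / (2.0 - lam))     # asymptotic EWMA std for sd = mean0
        ucl, lcl = mean0 + L * sigma_z, mean0 - L * sigma_z
        x = correlated_exponentials(max_n, rho, mean=mean0 * shift)
        z = np.full(2, mean0)
        for t in range(max_n):
            z = lam * x[t] + (1.0 - lam) * z
            if np.any((z > ucl) | (z < lcl)):
                return t + 1
        return max_n                                     # truncated run length

    arl_in = np.mean([run_length(shift=1.0) for _ in range(500)])    # in control
    arl_out = np.mean([run_length(shift=1.5) for _ in range(500)])   # mean shifted by 50%
    print(f"estimated in-control ARL ~ {arl_in:.0f}, shifted ARL ~ {arl_out:.0f}")
    ```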

  1. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....
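
    A minimal sketch of canonical correlation analysis on two co-varying fields is given below using scikit-learn; the synthetic "SST" and "SSH" matrices stand in for the gridded anomaly fields and are not the study's data (the real analysis also involves preprocessing of the global fields that is not shown here).

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(3)

    # Synthetic stand-ins for monthly anomaly fields (months x grid points)
    n_months, n_sst, n_ssh = 60, 15, 12
    shared = rng.normal(size=(n_months, 1))                    # one common (El Nino-like) mode
    sst = shared @ rng.normal(size=(1, n_sst)) + 0.5 * rng.normal(size=(n_months, n_sst))
    ssh = shared @ rng.normal(size=(1, n_ssh)) + 0.5 * rng.normal(size=(n_months, n_ssh))

    cca = CCA(n_components=2)
    sst_scores, ssh_scores = cca.fit_transform(sst, ssh)

    # Canonical correlations are the correlations between paired score time series
    for k in range(2):
        r = np.corrcoef(sst_scores[:, k], ssh_scores[:, k])[0, 1]
        print(f"canonical pair {k + 1}: r = {r:.2f}")
    # A one-month phase lag could be probed by refitting with shifted fields, e.g. sst[:-1], ssh[1:].
    ```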

  2. Estimating twin concordance for bivariate competing risks twin data

    DEFF Research Database (Denmark)

    Scheike, Thomas; Holst, Klaus K.; Hjelmborg, Jacob B.

    2014-01-01

    For twin time-to-event data, we consider different concordance probabilities, such as the casewise concordance that are routinely computed as a measure of the lifetime dependence/correlation for specific diseases. The concordance probability here is the probability that both twins have experience...... events with the competing risk death. We thus aim to quantify the degree of dependence through the casewise concordance function and show a significant genetic component...... the event of interest. Under the assumption that both twins are censored at the same time, we show how to estimate this probability in the presence of right censoring, and as a consequence, we can then estimate the casewise twin concordance. In addition, we can model the magnitude of within pair dependence...... over time, and covariates may be further influential on the marginal risk and dependence structure. We establish the estimators large sample properties and suggest various tests, for example, for inferring familial influence. The method is demonstrated and motivated by specific twin data on cancer...

  3. Discrete bivariate population balance modelling of heteroaggregation processes.

    Science.gov (United States)

    Rollié, Sascha; Briesen, Heiko; Sundmacher, Kai

    2009-08-15

    Heteroaggregation in binary particle mixtures was simulated with a discrete population balance model in terms of two internal coordinates describing the particle properties. The considered particle species are of different size and zeta-potential. Property space is reduced with a semi-heuristic approach to enable an efficient solution. Aggregation rates are based on deterministic models for Brownian motion and stability, under consideration of DLVO interaction potentials. A charge-balance kernel is presented, relating the electrostatic surface potential to the property space by a simple charge balance. Parameter sensitivity with respect to the fractal dimension, aggregate size, hydrodynamic correction, ionic strength and absolute particle concentration was assessed. Results were compared to simulations with the literature kernel based on geometric coverage effects for clusters with heterogeneous surface properties. In both cases electrostatic phenomena, which dominate the aggregation process, show identical trends: impeded cluster-cluster aggregation at low particle mixing ratio (1:1), restabilisation at high mixing ratios (100:1) and formation of complex clusters for intermediate ratios (10:1). The particle mixing ratio controls the surface coverage extent of the larger particle species. Simulation results are compared to experimental flow cytometric data and show very satisfactory agreement.

  4. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  5. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi; Zhou, Lan; Najibi, Seyed Morteza; Gao, Xin; Huang, Jianhua Z.

    2015-01-01

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  6. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    Science.gov (United States)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single tree detection from the airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) with per-tree aboveground biomass. The incapability of the ALS technology in directly measuring DBH leads to the need to predict DBH with other ALS-measured tree-level structural parameters. A copula-based method is proposed in the study to predict DBH with the ALS-measured tree height and crown diameter using a dataset measured in the Lassen National Forest in California. Instead of exploring an explicit mathematical equation that explains the underlying relationship between DBH and other structural parameters, the copula-based prediction method utilizes the dependency between cumulative distributions of these variables, and solves the DBH based on an assumption that for a single tree, the cumulative probability of each structural parameter is identical. Results show that compared with the bench-marking least-square linear regression and the k-MSN imputation, the copula-based method obtains better accuracy in the DBH for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and it contributes to the reduction of prediction uncertainty.
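
    The identical-cumulative-probability idea reduces, for a single predictor, to quantile matching between fitted marginals. The sketch below illustrates this with assumed lognormal marginals for height and DBH on synthetic data; the study's copula-based treatment of several ALS-measured predictors is not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Synthetic field data: tree height (m) and DBH (cm), roughly allometrically related
    n = 500
    height = rng.lognormal(mean=2.9, sigma=0.25, size=n)
    dbh = 1.2 * height ** 1.1 * rng.lognormal(0.0, 0.12, size=n)

    # Fit marginal distributions (lognormal is an assumption of this sketch)
    h_params = stats.lognorm.fit(height, floc=0)
    d_params = stats.lognorm.fit(dbh, floc=0)

    def predict_dbh_from_height(h):
        """Quantile matching: place the tree at the same cumulative probability in both
        marginals, then invert the DBH marginal."""
        u = stats.lognorm.cdf(h, *h_params)
        return stats.lognorm.ppf(u, *d_params)

    print(np.round(predict_dbh_from_height(np.array([12.0, 18.0, 25.0])), 1))
    ```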

  7. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  8. Cost-offsets of prescription drug expenditures: data analysis via a copula-based bivariate dynamic hurdle model.

    Science.gov (United States)

    Deb, Partha; Trivedi, Pravin K; Zimmer, David M

    2014-10-01

    In this paper, we estimate a copula-based bivariate dynamic hurdle model of prescription drug and nondrug expenditures to test the cost-offset hypothesis, which posits that increased expenditures on prescription drugs are offset by reductions in other nondrug expenditures. We apply the proposed methodology to data from the Medical Expenditure Panel Survey, which have the following features: (i) the observed bivariate outcomes are a mixture of zeros and continuously measured positives; (ii) both the zero and positive outcomes show state dependence and inter-temporal interdependence; and (iii) the zeros and the positives display contemporaneous association. The point mass at zero is accommodated using a hurdle or a two-part approach. The copula-based approach to generating joint distributions is appealing because the contemporaneous association involves asymmetric dependence. The paper studies samples categorized by four health conditions: arthritis, diabetes, heart disease, and mental illness. There is evidence of greater than dollar-for-dollar cost-offsets of expenditures on prescribed drugs for relatively low levels of spending on drugs and less than dollar-for-dollar cost-offsets at higher levels of drug expenditures. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide the total classification accuracy of 86.67% and the area (Az of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564 or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533. Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
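
    A hedged sketch of the core pipeline (class-conditional bivariate kernel density estimates plus a maximal posterior probability decision) is given below on synthetic two-dimensional features; the actual VAG features, priors and kernel choices of the paper are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)

    # Hypothetical two-feature representation of VAG signals (one cluster per class)
    n = 60
    healthy = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], size=n)
    abnormal = rng.multivariate_normal([1.5, 1.2], [[1.3, -0.2], [-0.2, 1.3]], size=n)

    # Kernel density estimates of the bivariate feature distributions, one per class
    kde_healthy = gaussian_kde(healthy.T)
    kde_abnormal = gaussian_kde(abnormal.T)
    prior_h, prior_a = 0.5, 0.5          # assumed equal class priors

    def classify(x):
        """Maximal posterior probability decision between the two KDE class models."""
        post_h = prior_h * kde_healthy(x.T)
        post_a = prior_a * kde_abnormal(x.T)
        return np.where(post_a > post_h, "abnormal", "healthy")

    test = np.array([[0.1, -0.2], [1.8, 1.0]])
    print(classify(test))     # typically ['healthy' 'abnormal'] for these test points
    ```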

  10. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain.

    Science.gov (United States)

    Rabbani, Hossein; Sonka, Milan; Abramoff, Michael D

    2013-01-01

    In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the proposed distribution for the noise-free data plays a key role in the performance of the MMSE estimator, an a priori distribution for the pdf of noise-free 3D complex wavelet coefficients is proposed that is able to model the main statistical properties of wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters, which are able to capture the heavy-tailed property and inter- and intrascale dependencies of coefficients. In addition, based on the special structure of OCT images, we use an anisotropic windowing procedure for local parameter estimation that results in visual quality improvement. On this basis, several OCT despeckling algorithms are obtained based on using a Gaussian/two-sided Rayleigh noise distribution and a homomorphic/nonhomomorphic model. In order to evaluate the performance of the proposed algorithm, we use 156 selected ROIs from a 650 × 512 × 128 OCT dataset in the presence of wet AMD pathology. Our simulations show that the best MMSE estimator using the local bivariate mixture prior is for the nonhomomorphic model in the presence of Gaussian noise, which results in an improvement of 7.8 ± 1.7 in CNR.

  11. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain

    Directory of Open Access Journals (Sweden)

    Hossein Rabbani

    2013-01-01

    Full Text Available In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the proposed distribution for the noise-free data plays a key role in the performance of the MMSE estimator, an a priori distribution for the pdf of noise-free 3D complex wavelet coefficients is proposed that is able to model the main statistical properties of wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters, which are able to capture the heavy-tailed property and inter- and intrascale dependencies of coefficients. In addition, based on the special structure of OCT images, we use an anisotropic windowing procedure for local parameter estimation that results in visual quality improvement. On this basis, several OCT despeckling algorithms are obtained based on using a Gaussian/two-sided Rayleigh noise distribution and a homomorphic/nonhomomorphic model. In order to evaluate the performance of the proposed algorithm, we use 156 selected ROIs from a 650 × 512 × 128 OCT dataset in the presence of wet AMD pathology. Our simulations show that the best MMSE estimator using the local bivariate mixture prior is for the nonhomomorphic model in the presence of Gaussian noise, which results in an improvement of 7.8 ± 1.7 in CNR.

  12. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Science.gov (United States)

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K; Schad, Lothar R; Zöllner, Frank Gerrit

    2015-01-01

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  13. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Directory of Open Access Journals (Sweden)

    Jakob Nikolas Kather

    Full Text Available Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  14. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  15. Carbon and oxygen isotopic ratio bi-variate distribution for marble artifacts quarry assignment

    International Nuclear Information System (INIS)

    Pentia, M.

    1995-01-01

    A statistical description of the 13C/12C and 18O/16O isotopic ratios in ancient marble quarries, based on a bivariate Gaussian probability distribution, has been given, and a new method for obtaining confidence-level quarry assignment of marble artifacts has been presented. (author) 8 figs., 3 tabs., 4 refs

  16. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  17. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with a single parameter that defines both the mean and the variance. Poisson regression therefore assumes that the mean and variance are equal (equidispersion). Nonetheless, in some cases count data do not satisfy this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion leads to underestimated standard errors and, in turn, to incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution; when there is over-dispersion, simple bivariate Poisson regression is not sufficient for modeling such data. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion. The BPIGR model produces a single global model for all locations. On the other hand, each location has different geographic, social, cultural and economic conditions, so Geographically Weighted Regression (GWR) is needed. The weighting function of each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimates of the GWBPIGR model are obtained by the Maximum Likelihood Estimation (MLE) method, while hypothesis testing of the GWBPIGR model uses the Maximum Likelihood Ratio Test (MLRT) method.

  18. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  19. Using bivariate signal analysis to characterize the epileptic focus: the benefit of surrogates.

    Science.gov (United States)

    Andrzejak, R G; Chicharro, D; Lehnertz, K; Mormann, F

    2011-04-01

    The disease epilepsy is related to hypersynchronous activity of networks of neurons. While acute epileptic seizures are the most extreme manifestation of this hypersynchronous activity, an elevated level of interdependence of neuronal dynamics is thought to persist also during the seizure-free interval. In multichannel recordings from brain areas involved in the epileptic process, this interdependence can be reflected in an increased linear cross correlation but also in signal properties of higher order. Bivariate time series analysis comprises a variety of approaches, each with different degrees of sensitivity and specificity for interdependencies reflected in lower- or higher-order properties of pairs of simultaneously recorded signals. Here we investigate which approach is best suited to detect putatively elevated interdependence levels in signals recorded from brain areas involved in the epileptic process. For this purpose, we use the linear cross correlation that is sensitive to lower-order signatures of interdependence, a nonlinear interdependence measure that integrates both lower- and higher-order properties, and a surrogate-corrected nonlinear interdependence measure that aims to specifically characterize higher-order properties. We analyze intracranial electroencephalographic recordings of the seizure-free interval from 29 patients with an epileptic focus located in the medial temporal lobe. Our results show that all three approaches detect higher levels of interdependence for signals recorded from the brain hemisphere containing the epileptic focus as compared to signals recorded from the opposite hemisphere. For the linear cross correlation, however, these differences are not significant. For the nonlinear interdependence measure, results are significant but only of moderate accuracy with regard to the discriminative power for the focal and nonfocal hemispheres. The highest significance and accuracy is obtained for the surrogate-corrected nonlinear

  20. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    Science.gov (United States)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.

  1. Bivariate Probit Models for Analysing how “Knowledge” Affects Innovation and Performance in Small and Medium Sized Firms

    OpenAIRE

    FARACE, Salvatore; MAZZOTTA, Fernanda

    2011-01-01

    This paper examines the determinants of innovation and its effects on small- and medium-sized firms. We use data from the OPIS databank, which provides a survey of a representative sample of firms from a province of Southern Italy. We want to study whether small and medium-sized firms can gain a competitive advantage from their innovative capabilities, regardless of their sectoral and size limits. The main factor influencing the likelihood of innovation is knowledge, which is acquired...

  2. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  3. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such a function can be described as a superposition of virtually infinitely many local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.

  4. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on climatic conditions. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which provides a more accurate probabilistic evaluation of water availability.

  5. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    Science.gov (United States)

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do this, we propose a score-based modification of Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic with expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and an application to the AIDS study. We compare our method to alternative approaches such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
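
    For complete (uncensored) data, Kendall's tau is simply a normalized difference of concordant and discordant pair counts; the paper's modification replaces the observed counts with expected counts under interval censoring. The sketch below shows only the complete-data statistic, checked against the library implementation.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    def kendall_tau_counting(x, y):
        """Kendall's tau from explicit concordant/discordant pair counts (no ties assumed)."""
        n = len(x)
        concordant = discordant = 0
        for i in range(n):
            for j in range(i + 1, n):
                s = np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
                concordant += s > 0
                discordant += s < 0
        return (concordant - discordant) / (n * (n - 1) / 2)

    rng = np.random.default_rng(6)
    x = rng.normal(size=200)
    y = 0.6 * x + 0.8 * rng.normal(size=200)
    print(round(kendall_tau_counting(x, y), 3))
    print(round(kendalltau(x, y).correlation, 3))   # library version for comparison
    ```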

  6. Genetics of Obesity Traits: A Bivariate Genome-Wide Association Analysis

    DEFF Research Database (Denmark)

    Wu, Yili; Duan, Haiping; Tian, Xiaocao

    2018-01-01

    Previous genome-wide association studies on anthropometric measurements have identified more than 100 related loci, but only a small portion of heritability in obesity was explained. Here we present a bivariate twin study to look for the genetic variants associated with body mass index and waist-hip ratio, and to explore the obesity-related pathways in Northern Han Chinese. A Cholesky decomposition model for 242 monozygotic and 140 dizygotic twin pairs indicated a moderate genetic correlation (r = 0.53, 95% CI: 0.42–0.64) between body mass index and waist-hip ratio. Bivariate genome-wide association.......05. Expression quantitative trait loci analysis identified rs2242044 as a significant cis-eQTL in both the normal adipose-subcutaneous (P = 1.7 × 10−9) and adipose-visceral (P = 4.4 × 10−15) tissue. These findings may provide an important entry point to unravel genetic pleiotropy in obesity traits.

  7. On minimum divergence adaptation of discrete bivariate distributions to given marginals

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; van der Meulen, E. C.

    2005-01-01

    Roč. 51, č. 1 (2005), s. 313-320 ISSN 0018-9448 R&D Projects: GA ČR GA201/02/1391; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : approximation of contingency tables * bivariate discrete distributions * minimization of divergences Subject RIV: BD - Theory of Information Impact factor: 2.183, year: 2005

  8. Evidence for bivariate linkage of obesity and HDL-C levels in the Framingham Heart Study.

    Science.gov (United States)

    Arya, Rector; Lehman, Donna; Hunt, Kelly J; Schneider, Jennifer; Almasy, Laura; Blangero, John; Stern, Michael P; Duggirala, Ravindranath

    2003-12-31

    Epidemiological studies have indicated that obesity and low high-density lipoprotein (HDL) levels are strong cardiovascular risk factors, and that these traits are inversely correlated. Despite the belief that these traits are correlated in part due to pleiotropy, knowledge on specific genes commonly affecting obesity and dyslipidemia is very limited. To address this issue, we first conducted univariate multipoint linkage analysis for body mass index (BMI) and HDL-C to identify loci influencing variation in these phenotypes using Framingham Heart Study data relating to 1702 subjects distributed across 330 pedigrees. Subsequently, we performed bivariate multipoint linkage analysis to detect common loci influencing covariation between these two traits. We scanned the genome and identified a major locus near marker D6S1009 influencing variation in BMI (LOD = 3.9) using the program SOLAR. We also identified a major locus for HDL-C near marker D2S1334 on chromosome 2 (LOD = 3.5) and another region near marker D6S1009 on chromosome 6 with suggestive evidence for linkage (LOD = 2.7). Since these two phenotypes have been independently mapped to the same region on chromosome 6q, we used the bivariate multipoint linkage approach using SOLAR. The bivariate linkage analysis of BMI and HDL-C implicated the genetic region near marker D6S1009 as harboring a major gene commonly influencing these phenotypes (bivariate LOD = 6.2; LODeq = 5.5) and appears to improve power to map the correlated traits to a region, precisely. We found substantial evidence for a quantitative trait locus with pleiotropic effects, which appears to influence both BMI and HDL-C phenotypes in the Framingham data.

  9. The bivariate probit model of uncomplicated control of tumor: a heuristic exposition of the methodology

    International Nuclear Information System (INIS)

    Herbert, Donald

    1997-01-01

    Purpose: To describe the concept, models, and methods for the construction of estimates of joint probability of uncomplicated control of tumors in radiation oncology. Interpolations using this model can lead to the identification of more efficient treatment regimens for an individual patient. The requirement to find the treatment regimen that will maximize the joint probability of uncomplicated control of tumors suggests a new class of evolutionary experimental designs--Response Surface Methods--for clinical trials in radiation oncology. Methods and Materials: The software developed by Lesaffre and Molenberghs is used to construct bivariate probit models of the joint probability of uncomplicated control of cancer of the oropharynx from a set of 45 patients for each of whom the presence/absence of recurrent tumor (the binary event Ē1/E1) and the presence/absence of necrosis (the binary event E2/Ē2) of the normal tissues of the target volume is recorded, together with the treatment variables dose, time, and fractionation. Results: The bivariate probit model can be used to select a treatment regime that will give a specified probability, say P(S) = 0.60, of uncomplicated control of tumor by interpolation within a set of treatment regimes with known outcomes of recurrence and necrosis. The bivariate probit model can be used to guide a sequence of clinical trials to find the maximum probability of uncomplicated control of tumor for patients in a given prognostic stratum using Response Surface methods by extrapolation from an initial set of treatment regimens. Conclusions: The design of treatments for individual patients and the design of clinical trials might be improved by use of a bivariate probit model and Response Surface Methods.
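
    The quantity being interpolated can be written down directly: if the fitted model gives marginal probabilities of tumor control and of freedom from necrosis, together with a latent correlation rho, the joint probability of uncomplicated control is a bivariate normal CDF evaluated at the corresponding probit thresholds. The sketch below uses invented marginal probabilities and rho values, not fitted ones.

    ```python
    import numpy as np
    from scipy.stats import norm, multivariate_normal

    def p_uncomplicated_control(p_control, p_no_necrosis, rho):
        """
        Joint probability that the tumor is controlled AND no necrosis occurs, assuming a
        bivariate-probit (Gaussian latent) dependence with correlation rho.
        """
        a = norm.ppf(p_control)
        b = norm.ppf(p_no_necrosis)
        return multivariate_normal.cdf([a, b], mean=[0.0, 0.0],
                                       cov=[[1.0, rho], [rho, 1.0]])

    # Illustrative marginal outcome probabilities for a candidate regimen (assumed values)
    for rho in (0.0, 0.3, 0.6):
        p = p_uncomplicated_control(p_control=0.75, p_no_necrosis=0.85, rho=rho)
        print(f"rho = {rho}: P(uncomplicated control) = {p:.3f}")
    # Independence gives 0.75 * 0.85 = 0.6375; positive latent correlation raises the joint value.
    ```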

  10. Comparison of Six Methods for the Detection of Causality in a Bivariate Time Series

    Czech Academy of Sciences Publication Activity Database

    Krakovská, A.; Jakubík, J.; Chvosteková, M.; Coufal, David; Jajcay, Nikola; Paluš, Milan

    2018-01-01

    Roč. 97, č. 4 (2018), č. článku 042207. ISSN 2470-0045 R&D Projects: GA MZd(CZ) NV15-33250A Institutional support: RVO:67985807 Keywords : comparative study * causality detection * bivariate models * Granger causality * transfer entropy * convergent cross mappings Impact factor: 2.366, year: 2016 https://journals.aps.org/pre/abstract/10.1103/PhysRevE.97.042207

  11. Can the bivariate Hurst exponent be higher than an average of the separate Hurst exponents?

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2015-01-01

    Roč. 431, č. 1 (2015), s. 124-127 ISSN 0378-4371 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords : Correlations * Power-law cross-correlations * Bivariate Hurst exponent * Spectrum coherence Subject RIV: AH - Economics Impact factor: 1.785, year: 2015 http://library.utia.cas.cz/separaty/2015/E/kristoufek-0452314.pdf

  12. Histograms showing variations in oil yield, water yield, and specific gravity of oil from Fischer assay analyses of oil-shale drill cores and cuttings from the Piceance Basin, northwestern Colorado

    Science.gov (United States)

    Dietrich, John D.; Brownfield, Michael E.; Johnson, Ronald C.; Mercier, Tracey J.

    2014-01-01

    Recent studies indicate that the Piceance Basin in northwestern Colorado contains over 1.5 trillion barrels of oil in place, making the basin the largest known oil-shale deposit in the world. Previously published histograms display oil-yield variations with depth and widely correlate rich and lean oil-shale beds and zones throughout the basin. Histograms in this report display oil-yield data plotted alongside either water-yield or oil specific-gravity data. Fischer assay analyses of core and cutting samples collected from exploration drill holes penetrating the Eocene Green River Formation in the Piceance Basin can aid in determining the origins of those deposits, as well as estimating the amount of organic matter, halite, nahcolite, and water-bearing minerals. This report focuses only on the oil yield plotted against water yield and oil specific gravity.

  13. Review: Martin Spetsmann-Kunkel (2004. Die Moral der Daytime Talkshow. Eine soziologische Analyse eines umstrittenen Fernsehformats [The Morality of the Daytime Talk Show. A Sociological Analysis of a Controversial Television Format

    Directory of Open Access Journals (Sweden)

    Nicola Döring

    2006-05-01

    Full Text Available This book deals with the phenomenon of the daytime talk show from a sociological perspective. The author questions the common cultural pessimism surrounding this TV format ("exhibitionist guests," "voyeuristic spectators"). He first describes the characteristics of the daytime talk show and summarizes the results of previous surveys that reveal a broad variety of talk show guests' and recipients' motives—beyond pathology. Drawing on concepts like civilization and individualisation, the book outlines the societal functions of the daytime talk show. A participatory observation study in the editorial office of "Hans Meiser" and free interpretations of three series from "Vera am Mittag" are presented as "empirical evidence." Unfortunately the book lacks theoretical and methodological rigor and a sound empirical basis. The bibliography could have been more comprehensive. The work is useful, though, as an inspired, readable introduction to the topic. URN: urn:nbn:de:0114-fqs0603119

  14. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu - Shawiesh

    2014-06-01

    Full Text Available This paper proposes and considers some bivariate control charts to monitor individual observations in statistical process control. The usual control charts, which use mean and variance-covariance estimators, are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T2: T2MedMAD, T2MCD, and T2MVE. A simulation study has been conducted to compare the performance of these control charts. Two real-life data sets are analyzed to illustrate the application of these robust alternatives.
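
    A minimal sketch contrasting the classical Hotelling T2 statistic for individual observations with a simplified median/MAD-based variant (a stand-in for the T2MedMAD idea that ignores the off-diagonal term) is shown below on synthetic data with two planted outliers; the MCD and MVE versions would substitute robust covariance estimators such as scikit-learn's MinCovDet.

    ```python
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(7)

    # Synthetic individual bivariate observations with two planted outliers
    X = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.6], [0.6, 1.0]], size=50)
    X[[3, 17], :] += [4.0, -4.0]

    def t2_classical(X):
        """Classical Hotelling T^2 for individuals: sample mean and sample covariance."""
        d = X - X.mean(axis=0)
        S_inv = np.linalg.inv(np.cov(X, rowvar=False))
        return np.einsum("ij,jk,ik->i", d, S_inv, d)

    def t2_med_mad(X, c=1.4826):
        """Simplified robust variant: coordinate-wise median centre and MAD-based scales
        (correlation between the variables is ignored in this illustration)."""
        d = X - np.median(X, axis=0)
        scale = c * np.median(np.abs(d), axis=0)
        return np.einsum("ij,jk,ik->i", d, np.diag(1.0 / scale ** 2), d)

    ucl = chi2.ppf(0.995, df=2)     # rough chi-square limit; exact limits use Beta quantiles
    for name, t2 in [("classical", t2_classical(X)), ("median/MAD", t2_med_mad(X))]:
        print(f"{name:11s} chart flags observations:", np.where(t2 > ucl)[0])
    ```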

  15. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
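
    The report's own algorithm is not reproduced here, but the sketch below shows a standard construction for the same task: two independent standard normals are combined so that the resulting pair has the requested means, standard deviations and correlation coefficient.

```python
# Generic sketch (not necessarily the algorithm in the report): combine two
# independent standard normals so the pair has the requested moments.
import numpy as np

def bivariate_normal_pairs(n, mu1, mu2, sigma1, sigma2, rho, seed=None):
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n)
    z2 = rng.standard_normal(n)
    x = mu1 + sigma1 * z1
    y = mu2 + sigma2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
    return x, y

x, y = bivariate_normal_pairs(100_000, 1.0, -2.0, 2.0, 0.5, 0.8, seed=42)
print(np.corrcoef(x, y)[0, 1])   # should be close to the requested rho = 0.8
```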

  16. A comparison between multivariate and bivariate analysis used in marketing research

    Directory of Open Access Journals (Sweden)

    Constantin, C.

    2012-01-01

    Full Text Available This paper presents an instrumental research study conducted in order to compare the information given by two multivariate data analysis methods with that given by the usual bivariate analysis. The outcomes of the research reveal that sometimes the multivariate methods use more information from a certain variable, but sometimes they use only the part of the information considered most important for certain associations. For this reason, a researcher should use both categories of data analysis in order to obtain fully useful information.

  17. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is potentially important for the development of hypothesis testing in linear models. Thirdly, it is very easy to simulate...

  18. Regression analysis for bivariate gap time with missing first gap time data.

    Science.gov (United States)

    Huang, Chia-Hui; Chen, Yi-Hau

    2017-01-01

    We consider ordered bivariate gap time while data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and gap time between HIV and AIDS diagnoses. Besides, the association between the two gap times is also of interest. Accordingly, in the data analysis we are faced with two-fold complexity, namely data on the first gap time is completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap time under the complexity of the data. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application of data from the HIV and AIDS study mentioned above is reported for illustration.

  19. Geovisualization of land use and land cover using bivariate maps and Sankey flow diagrams

    Science.gov (United States)

    Strode, Georgianna; Mesev, Victor; Thornton, Benjamin; Jerez, Marjorie; Tricarico, Thomas; McAlear, Tyler

    2018-05-01

    The terms `land use' and `land cover' typically describe categories that convey information about the landscape. Despite the major difference of land use implying some degree of anthropogenic disturbance, the two terms are commonly used interchangeably, especially when anthropogenic disturbance is ambiguous, say managed forestland or abandoned agricultural fields. Cartographically, land use and land cover are also sometimes represented interchangeably within common legends, giving the impression that the landscape is a seamless continuum of land use parcels spatially adjacent to land cover tracts. We believe this is misleading, and feel we need to reiterate the well-established symbiosis of land uses as amalgams of land covers; in other words land covers are subsets of land use. Our paper addresses this spatially complex, and frequently ambiguous relationship, and posits that bivariate cartographic techniques are an ideal vehicle for representing both land use and land cover simultaneously. In more specific terms, we explore the use of nested symbology as a way to represent land use and land cover graphically, where land covers are circles nested within land use squares. We also investigate bivariate legends for representing statistical covariance as a means for visualizing the combinations of land use and cover. Lastly, we apply Sankey flow diagrams to further illustrate the complex, multifaceted relationships between land use and land cover. Our work is demonstrated on land use and land cover data for the US state of Florida.

  20. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing the methane within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to the top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from the coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
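
    As an illustration of how a fitted bivariate normal model of this kind can be queried, the sketch below computes a conditional tail probability from the textbook conditional distribution of one component given the other; the parameter values are hypothetical and are not those estimated in the paper.

```python
# Hedged sketch: conditional tail probability from a bivariate normal model.
# All parameter values below are illustrative placeholders.
import numpy as np
from scipy.stats import norm

mu_x, mu_y = 60.0, 2.0      # e.g. distance above the coal bed (m), displacement (cm) -- hypothetical
sd_x, sd_y = 20.0, 0.8
rho = -0.5                  # hypothetical correlation between the two quantities

def conditional_exceedance(y0, x):
    """P(Y > y0 | X = x) under the bivariate normal model."""
    cond_mean = mu_y + rho * sd_y / sd_x * (x - mu_x)
    cond_sd = sd_y * np.sqrt(1.0 - rho**2)
    return norm.sf(y0, loc=cond_mean, scale=cond_sd)

print(conditional_exceedance(y0=2.5, x=40.0))
```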

  1. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.

  2. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    Full Text Available In this paper we discuss the problem of parametric and non-parametric estimation of the distributions generated by the Marshall-Olkin copula. This copula comes from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model by combining the copula with different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. This procedure can be used to estimate the parameters of complex survival functions in which it is difficult to find an explicit expression for the mixed moments. Moreover, it is preferred to the maximum likelihood approach for its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.
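
    The sketch below illustrates the generic two-step moment idea with stand-in choices: exponential marginals fitted from their sample means and a Clayton copula whose parameter is recovered from Kendall's tau. The paper itself works with the Marshall-Olkin copula, whose moment equations differ, so this is a structural illustration only.

```python
# Illustrative two-step moment-type estimation. The Clayton copula is used here
# only as a convenient stand-in with a closed-form tau relation; the paper's
# Marshall-Olkin copula has different estimating equations.
import numpy as np
from scipy.stats import kendalltau

def two_step_fit(x, y):
    # Step 1: marginal parameters by moments (exponential rate = 1 / sample mean).
    rate_x = 1.0 / np.mean(x)
    rate_y = 1.0 / np.mean(y)
    # Step 2: dependence parameter from Kendall's tau (Clayton: theta = 2*tau/(1-tau)).
    tau, _ = kendalltau(x, y)
    theta = 2.0 * tau / (1.0 - tau)
    return rate_x, rate_y, theta

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=500)
y = 0.5 * x + rng.exponential(scale=1.0, size=500)   # crude positive dependence, illustration only
print(two_step_fit(x, y))
```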

  3. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  4. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
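
    The core bivariate statistic that such a tool automates can be sketched as follows for the frequency ratio model: the ratio of the share of hazard occurrences falling in a class to the share of the study area occupied by that class. The class names and counts below are hypothetical.

```python
# Minimal sketch of the frequency-ratio (FR) computation automated by such tools:
# FR = (% of hazard pixels in a class) / (% of all pixels in that class).
# The class counts below are hypothetical.
import pandas as pd

classes = pd.DataFrame({
    "class":         ["slope_0_5", "slope_5_15", "slope_15_30", "slope_gt_30"],
    "class_pixels":  [40_000, 30_000, 20_000, 10_000],
    "hazard_pixels": [100, 300, 500, 600],
})

classes["pct_area"]   = classes["class_pixels"]  / classes["class_pixels"].sum()
classes["pct_hazard"] = classes["hazard_pixels"] / classes["hazard_pixels"].sum()
classes["FR"] = classes["pct_hazard"] / classes["pct_area"]
print(classes[["class", "FR"]])   # FR > 1 flags classes over-represented among hazards
```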

  5. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Science.gov (United States)

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...

  6. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    Directory of Open Access Journals (Sweden)

    Wang Kuan-Min

    2013-01-01

    Full Text Available This paper extends recent investigations into risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are sourced to empirically validate the contagion effects between the stock market in Vietnam and those in China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. First, using the correlation contagion test and Dungey et al.’s (2005) contagion test, we find contagion effects between the Vietnamese and four other stock markets, namely Japan, Singapore, China, and the US. Second, we show that the Japanese stock market causes stronger contagion risk in the Vietnamese stock market compared to the stock markets of China, Singapore, and the US. Finally, we show that the Chinese and US stock markets cause weaker contagion effects in the Vietnamese stock market because of stronger interdependence effects between the former two markets.

  7. Showing/Sharing: Analysing Visual Communication from a Praxeological Perspective

    Directory of Open Access Journals (Sweden)

    Maria Schreiber

    2017-12-01

    Full Text Available This contribution proposes a methodological framework for empirical research into visual practices on social media. The framework identifies practices, pictures and platforms as relevant dimensions of analysis. It is mainly developed within, and is compatible with qualitative, interpretive approaches which focus on visual communication as part of everyday personal communicative practices. Two screenshots from Instagram and Facebook are introduced as empirical examples to investigate collaborative practices of meaning-making relating to pictures on social media. While social media seems to augment reflexive, processual practices of negotiating identities, visual media, in particular, amps up aesthetic, ambivalent and embodied dimensions within these practices.

  8. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    ... with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration ..., situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration...

  9. REGRES: A FORTRAN-77 program to calculate nonparametric and ``structural'' parametric solutions to bivariate regression equations

    Science.gov (United States)

    Rock, N. M. S.; Duffy, T. R.

    REGRES allows a range of regression equations to be calculated for paired sets of data values in which both variables are subject to error (i.e. neither is the "independent" variable). Nonparametric regressions, based on medians of all possible pairwise slopes and intercepts, are treated in detail. Estimated slopes and intercepts are output, along with confidence limits, Spearman and Kendall rank correlation coefficients. Outliers can be rejected with user-determined stringency. Parametric regressions can be calculated for any value of λ (the ratio of the variances of the random errors for y and x)—including: (1) major axis (λ = 1); (2) reduced major axis (λ = variance of y/variance of x); (3) Y on X (λ = infinity); or (4) X on Y (λ = 0) solutions. Pearson linear correlation coefficients also are output. REGRES provides an alternative to conventional isochron assessment techniques where bivariate normal errors cannot be assumed, or weighting methods are inappropriate.
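
    A minimal sketch of the nonparametric regression REGRES implements is given below: the slope is the median of all pairwise slopes and the intercept the median of the residual offsets. The program's confidence limits, rank correlations, outlier rejection and weighted parametric solutions are not reproduced.

```python
# Sketch of the median-of-pairwise-slopes (Theil-Sen type) regression used by
# REGRES; the full program also reports confidence limits, rank correlations
# and the parametric lambda-family solutions, which are omitted here.
import numpy as np
from itertools import combinations

def median_slope_regression(x, y):
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]
    b = np.median(slopes)
    a = np.median(y - b * x)
    return a, b

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.1, 1.9, 3.2, 3.9, 5.2, 12.0])   # the last point is an outlier
print(median_slope_regression(x, y))             # slope stays close to 1
```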

  10. Assessing characteristics related to the use of seatbelts and cell phones by drivers: application of a bivariate probit model.

    Science.gov (United States)

    Russo, Brendan J; Kay, Jonathan J; Savolainen, Peter T; Gates, Timothy J

    2014-06-01

    The effects of cell phone use and safety belt use have been an important focus of research related to driver safety. Cell phone use has been shown to be a significant source of driver distraction contributing to substantial degradations in driver performance, while safety belts have been demonstrated to play a vital role in mitigating injuries to crash-involved occupants. This study examines the prevalence of cell phone use and safety belt non-use among the driving population through direct observation surveys. A bivariate probit model is developed to simultaneously examine the factors that affect cell phone and safety belt use among motor vehicle drivers. The results show that several factors may influence drivers' decision to use cell phones and safety belts, and that these decisions are correlated. Understanding the factors that affect both cell phone use and safety belt non-use is essential to targeting policy and programs that reduce such behavior. Copyright © 2014 Elsevier Ltd. All rights reserved.
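
    A hedged sketch of a bivariate probit likelihood of the kind used in such studies is shown below: each binary outcome has its own probit index and the two latent errors share a correlation, estimated here by direct numerical maximization on synthetic data. It is not the paper's specification or covariate set.

```python
# Hedged sketch of a bivariate probit log-likelihood (two binary outcomes whose
# latent normal errors share a correlation), fitted on synthetic data. This is
# an illustration of the model class, not the paper's model, and it is slow.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([np.ones(n), rng.standard_normal(n)])      # intercept + one covariate
b1_true, b2_true, rho_true = np.array([0.3, 0.8]), np.array([-0.2, 0.5]), 0.4
e = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
y1 = (X @ b1_true + e[:, 0] > 0).astype(int)                   # e.g. "wears safety belt" (hypothetical)
y2 = (X @ b2_true + e[:, 1] > 0).astype(int)                   # e.g. "uses cell phone" (hypothetical)

def negloglik(params):
    b1, b2, rho = params[:2], params[2:4], np.tanh(params[4])  # tanh keeps rho in (-1, 1)
    q1, q2 = 2 * y1 - 1, 2 * y2 - 1                            # sign flips select the observed orthant
    probs = [multivariate_normal.cdf([a, b], mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]])
             for a, b, r in zip(q1 * (X @ b1), q2 * (X @ b2), q1 * q2 * rho)]
    return -np.sum(np.log(np.clip(probs, 1e-300, None)))

res = minimize(negloglik, x0=np.zeros(5), method="Nelder-Mead")
print(res.x[:4], np.tanh(res.x[4]))   # should land in the neighborhood of the true values
```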

  11. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  12. Bivariate tensor product (p, q)-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators

    Directory of Open Access Journals (Sweden)

    Qing-Bo Cai

    2017-11-01

    Full Text Available In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of (p, q)-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case, and establish a convergence theorem for Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  13. Quasi-bivariate variational mode decomposition as a tool of scale analysis in wall-bounded turbulence

    Science.gov (United States)

    Wang, Wenkang; Pan, Chong; Wang, Jinjun

    2018-01-01

    The identification and separation of multi-scale coherent structures is a critical task for the study of scale interaction in wall-bounded turbulence. Here, we propose a quasi-bivariate variational mode decomposition (QB-VMD) method to extract structures with various scales from an instantaneous two-dimensional (2D) velocity field which has only one primary dimension. This method is developed from the one-dimensional VMD algorithm proposed by Dragomiretskiy and Zosso (IEEE Trans Signal Process 62:531-544, 2014) to cope with a quasi-2D scenario. It imposes a length-scale bandwidth constraint along the decomposed dimension, together with central frequency re-balancing along the non-decomposed dimension. The feasibility of this method is tested on both a synthetic flow field and a turbulent boundary layer at moderate Reynolds number (Re_τ = 3458) measured by 2D particle image velocimetry (PIV). Some other popular scale separation tools, including pseudo-bi-dimensional empirical mode decomposition (PB-EMD), bi-dimensional EMD (B-EMD) and proper orthogonal decomposition (POD), are also tested for comparison. Among all these methods, QB-VMD shows advantages in both scale characterization and energy recovery. More importantly, the mode mixing problem, which degrades the performance of EMD-based methods, is avoided or minimized in QB-VMD. Finally, QB-VMD analysis of the wall-parallel plane in the log layer (at y/δ = 0.12) of the studied turbulent boundary layer shows the coexistence of large- or very large-scale motions (LSMs or VLSMs) and inner-scaled structures, which can be fully decomposed in both physical and spectral domains.

  14. Bivariate Gaussian bridges: directional factorization of diffusion in Brownian bridge models.

    Science.gov (United States)

    Kranstauber, Bart; Safi, Kamran; Bartumeus, Frederic

    2014-01-01

    In recent years high resolution animal tracking data has become the standard in movement ecology. The Brownian Bridge Movement Model (BBMM) is a widely adopted approach to describe animal space use from such high resolution tracks. One of the underlying assumptions of the BBMM is isotropic diffusive motion between consecutive locations, i.e. invariant with respect to the direction. Here we propose to relax this often unrealistic assumption by separating the Brownian motion variance into two directional components, one parallel and one orthogonal to the direction of the motion. Our new model, the Bivariate Gaussian bridge (BGB), tracks movement heterogeneity across time. Using the BGB and identifying directed and non-directed movement within a trajectory resulted in more accurate utilisation distributions compared to dynamic Brownian bridges, especially for trajectories with a non-isotropic diffusion, such as directed movement or Lévy like movements. We evaluated our model with simulated trajectories and observed tracks, demonstrating that the improvement of our model scales with the directional correlation of a correlated random walk. We find that many of the animal trajectories do not adhere to the assumptions of the BBMM. The proposed model improves accuracy when describing the space use both in simulated correlated random walks as well as observed animal tracks. Our novel approach is implemented and available within the "move" package for R.

  15. Improving risk estimates of runoff producing areas: formulating variable source areas as a bivariate process.

    Science.gov (United States)

    Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd

    2014-05-01

    Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resource Conservation Service-Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. Copyright © 2014 Elsevier Ltd. All rights reserved.
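
    The NRCS Curve Number relation that this approach builds on can be sketched as follows; the paper's bivariate extension, which brings in an antecedent-wetness index based on pre-event baseflow, is not reproduced here.

```python
# The standard NRCS Curve Number runoff relation underlying the approach
# (the paper's bivariate rainfall/antecedent-wetness extension is not shown).
def cn_runoff(P_in, CN, ia_ratio=0.2):
    """Event runoff depth Q (inches) for rainfall P (inches) and curve number CN."""
    S = 1000.0 / CN - 10.0          # potential maximum retention (inches)
    Ia = ia_ratio * S               # initial abstraction
    if P_in <= Ia:
        return 0.0
    return (P_in - Ia) ** 2 / (P_in - Ia + S)

print(cn_runoff(P_in=3.0, CN=75))   # roughly 0.96 inches of runoff
```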

  16. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    Science.gov (United States)

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  17. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.

  18. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals and so might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subjects has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using the ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency using a kNN classifier in 10-fold cross validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variation in the ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.

  19. Applying Emax model and bivariate thin plate splines to assess drug interactions.

    Science.gov (United States)

    Kong, Maiying; Lee, J Jack

    2010-01-01

    We review the semiparametric approach previously proposed by Kong and Lee and extend it to a case in which the dose-effect curves follow the Emax model instead of the median effect equation. When the maximum effects for the investigated drugs are different, we provide a procedure to obtain the additive effect based on the Loewe additivity model. Then, we apply a bivariate thin plate spline approach to estimate the effect beyond additivity along with its 95 per cent point-wise confidence interval as well as its 95 per cent simultaneous confidence interval for any combination dose. Thus, synergy, additivity, and antagonism can be identified. The advantages of the method are that it provides an overall assessment of the combination effect on the entire two-dimensional dose space spanned by the experimental doses, and it enables us to identify complex patterns of drug interaction in combination studies. In addition, this approach is robust to outliers. To illustrate this procedure, we analyzed data from two case studies.
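
    For reference, the single-agent dose-effect curve assumed in this setting is the Emax model; a minimal sketch is given below. The Loewe additivity surface and the bivariate thin plate spline step are not shown, and the parameter values are illustrative.

```python
# The standard (sigmoid) Emax dose-effect model assumed for the single-agent
# curves; the additivity surface and spline estimation steps are not shown.
import numpy as np

def emax(dose, e0, emax_, ed50, hill=1.0):
    """Effect at a given dose under the (sigmoid) Emax model."""
    d = np.asarray(dose, dtype=float)
    return e0 + emax_ * d**hill / (ed50**hill + d**hill)

doses = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
print(emax(doses, e0=0.05, emax_=0.90, ed50=1.5))   # illustrative parameter values
```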

  20. Modeling both the number of paucibacillary and multibacillary leprosy patients by using bivariate Poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the bacterium Mycobacterium leprae. Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the third highest number of new leprosy patients, after India and Brazil, with 18,994 cases (8.7% of the world total). This number automatically places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modeling the numbers of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so modeling is conducted by using the bivariate Poisson regression method. The observation units are located in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.
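
    A common way to obtain correlated Poisson counts of the kind modeled here is trivariate reduction, where both counts share a latent Poisson component; the sketch below simulates such a pair with illustrative rates. In the regression setting the rates would be linked to the covariates through log-linear models.

```python
# Trivariate-reduction construction of correlated Poisson counts, which underlies
# many bivariate Poisson regression models: X1 = Y1 + Y0 and X2 = Y2 + Y0 share
# the component Y0, so Cov(X1, X2) = lambda0. Rates below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
lam0, lam1, lam2 = 2.0, 5.0, 3.0
y0 = rng.poisson(lam0, size=100_000)
x1 = rng.poisson(lam1, size=100_000) + y0
x2 = rng.poisson(lam2, size=100_000) + y0
print(np.cov(x1, x2)[0, 1])   # close to lam0 = 2
```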

  1. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z ~ 4-5

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Kuang-Han; Su, Jian [Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Ferguson, Henry C. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Ravindranath, Swara, E-mail: kuanghan@pha.jhu.edu [The Inter-University Center for Astronomy and Astrophysics, Pune University Campus, Pune 411007, Maharashtra (India)

    2013-03-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in GOODS and the HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution to the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to that found for local disk galaxies, but considerably shallower than that of local early-type galaxies.
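
    A hedged sketch of a bivariate density of the stated form, a Schechter function in luminosity multiplied by a log-normal distribution in size whose median scales with luminosity, is given below; the parameter values and the size-luminosity scaling exponent are placeholders rather than the fitted values from the paper.

```python
# Hedged sketch of a Schechter (luminosity) x log-normal (size) bivariate density.
# All parameter values and the R-L scaling exponent are illustrative placeholders.
import numpy as np

def schechter(L, L_star, alpha, phi_star=1.0):
    x = L / L_star
    return phi_star * x**alpha * np.exp(-x) / L_star

def lognormal_size(R, L, R0, beta, sigma_lnR, L0=1.0):
    median_R = R0 * (L / L0) ** beta                     # median size scales with luminosity
    return np.exp(-0.5 * (np.log(R / median_R) / sigma_lnR) ** 2) / (R * sigma_lnR * np.sqrt(2 * np.pi))

def size_luminosity_density(R, L, L_star=1.0, alpha=-1.6, R0=1.5, beta=0.3, sigma_lnR=0.8):
    return schechter(L, L_star, alpha) * lognormal_size(R, L, R0, beta, sigma_lnR)

print(size_luminosity_density(R=1.2, L=0.5))
```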

  2. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z ∼ 4-5

    International Nuclear Information System (INIS)

    Huang, Kuang-Han; Su, Jian; Ferguson, Henry C.; Ravindranath, Swara

    2013-01-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in GOODS and the HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution to the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to that found for local disk galaxies, but considerably shallower than that of local early-type galaxies.

  3. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    Science.gov (United States)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from the univariate analysis approach, which only considers the intensity of rainfall at fixed time intervals. As several rainfall variables, such as intensity and duration, are correlated with each other, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are obtained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both the Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves derived from the typical empirical IDF formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.

  4. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.; Jun, M.

    2015-01-01

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.

  5. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event-based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach could give a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from the occurrence probabilities of individual rainstorm events and from a different angle on rainfall frequency analysis, and the approach could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
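
    One way such event-based joint probabilities translate into return periods is the "OR" formula T = mu / (1 - C(u, v)), with mu the mean inter-event time and C a fitted copula of intensity and duration; the sketch below uses a Gumbel copula with illustrative parameters as a stand-in, since the copula family and parameters fitted in the study are not reproduced here.

```python
# Hedged sketch of an event-based joint ("OR") return period, T = mu / (1 - C(u, v)),
# using a Gumbel copula with illustrative parameters as a stand-in for the copula
# actually fitted to intensity and duration in the study.
import numpy as np

def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_or(u, v, theta, mu_years):
    return mu_years / (1.0 - gumbel_copula(u, v, theta))

# u, v: non-exceedance probabilities of a storm's intensity and duration;
# mu_years: mean inter-arrival time between rainstorm events, in years.
print(joint_return_period_or(u=0.99, v=0.98, theta=2.0, mu_years=0.01))
```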

  6. Application of bivariate mapping for hydrological classification and analysis of temporal change and scale effects in Switzerland

    NARCIS (Netherlands)

    Speich, Matthias J.R.; Bernhard, Luzi; Teuling, Ryan; Zappa, Massimiliano

    2015-01-01

    Hydrological classification schemes are important tools for assessing the impacts of a changing climate on the hydrology of a region. In this paper, we present bivariate mapping as a simple means of classifying hydrological data for a quantitative and qualitative assessment of temporal change.

  7. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi; Gao, Xin; Huang, Jianhua Z.

    2012-01-01

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu. edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  8. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez, E-mail: valter.costa@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important because an early diagnosis allows the fault to be corrected, preventing production stoppages, improving operator safety, and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network able to monitor the greatest number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, for the input set of the neural network we selected the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference in the reactor core, because almost all of the monitored variables are related to the variables just described, or their behavior can result from the interaction of two or more of them. The nuclear power is related to temperature increases and decreases as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)
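
    The two screening steps described, pairwise (bivariate) correlations followed by a canonical correlation between variable blocks, can be sketched as follows on synthetic data; the variable names and dimensions are hypothetical and do not correspond to the actual IEA-R1 instrumentation.

```python
# Hedged sketch of the two screening steps: pairwise (bivariate) correlations
# between candidate inputs and monitored variables, then canonical correlation
# between the two blocks. The data are synthetic and the names hypothetical.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 500
inputs = rng.standard_normal((n, 4))                 # e.g. power, flow rate, rod position, dP
monitored = inputs @ rng.standard_normal((4, 6)) + 0.3 * rng.standard_normal((n, 6))

# Bivariate screening: correlation of each candidate input with each monitored variable.
corr = np.corrcoef(inputs.T, monitored.T)[:4, 4:]
print(np.round(corr, 2))

# Canonical correlation between the input block and the monitored block.
cca = CCA(n_components=2).fit(inputs, monitored)
U, V = cca.transform(inputs, monitored)
print([np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(2)])
```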

  9. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of multivariate analysis of natural disasters, and its current scarcity, and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the bivariate dust storm definition, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.

  10. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    International Nuclear Information System (INIS)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez

    2011-01-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important because an early diagnosis allows the fault to be corrected, preventing production stoppages, improving operator safety, and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network able to monitor the greatest number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, for the input set of the neural network we selected the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference in the reactor core, because almost all of the monitored variables are related to the variables just described, or their behavior can result from the interaction of two or more of them. The nuclear power is related to temperature increases and decreases as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)

  11. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu. edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  12. A view on coupled cluster perturbation theory using a bivariational Lagrangian formulation.

    Science.gov (United States)

    Kristensen, Kasper; Eriksen, Janus J; Matthews, Devin A; Olsen, Jeppe; Jørgensen, Poul

    2016-02-14

    We consider two distinct coupled cluster (CC) perturbation series that both expand the difference between the energies of the CCSD (CC with single and double excitations) and CCSDT (CC with single, double, and triple excitations) models in orders of the Møller-Plesset fluctuation potential. We initially introduce the E-CCSD(T-n) series, in which the CCSD amplitude equations are satisfied at the expansion point, and compare it to the recently developed CCSD(T-n) series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)], in which not only the CCSD amplitude, but also the CCSD multiplier equations are satisfied at the expansion point. The computational scaling is similar for the two series, and both are term-wise size extensive with a formal convergence towards the CCSDT target energy. However, the two series are different, and the CCSD(T-n) series is found to exhibit a more rapid convergence up through the series, which we trace back to the fact that more information at the expansion point is utilized than for the E-CCSD(T-n) series. The present analysis can be generalized to any perturbation expansion representing the difference between a parent CC model and a higher-level target CC model. In general, we demonstrate that, whenever the parent parameters depend upon the perturbation operator, a perturbation expansion of the CC energy (where only parent amplitudes are used) differs from a perturbation expansion of the CC Lagrangian (where both parent amplitudes and parent multipliers are used). For the latter case, the bivariational Lagrangian formulation becomes more than a convenient mathematical tool, since it facilitates a different and faster convergent perturbation series than the simpler energy-based expansion.

  13. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970....

  14. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts ( York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to usual quantities leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts related to extreme structural models i.e. the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter ( Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (∓ σ) for both
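
    As an illustration of how the limiting subcases differ in practice, the sketch below (plain numpy on simulated unweighted data; it is not the paper's trace formalism) computes the ordinary least squares slope (case Y), the reduced major-axis slope (case R) and the orthogonal-regression slope (case O).

        import numpy as np

        rng = np.random.default_rng(1)
        x_true = np.linspace(0.0, 10.0, 200)
        x = x_true + rng.normal(scale=0.5, size=x_true.size)               # errors in X
        y = 2.0 * x_true + 1.0 + rng.normal(scale=0.5, size=x_true.size)   # errors in Y

        sx, sy = x.std(ddof=1), y.std(ddof=1)
        r = np.corrcoef(x, y)[0, 1]

        b_ols = r * sy / sx                        # case (Y): errors only in Y
        b_rma = np.sign(r) * sy / sx               # case (R): reduced major-axis
        # case (O): orthogonal regression via total least squares (last right singular vector)
        A = np.column_stack([x - x.mean(), y - y.mean()])
        v = np.linalg.svd(A, full_matrices=False)[2][-1]
        b_orth = -v[0] / v[1]

        for name, b in [("OLS (Y)", b_ols), ("RMA (R)", b_rma), ("orthogonal (O)", b_orth)]:
            print(f"{name}: slope={b:.3f}, intercept={y.mean() - b * x.mean():.3f}")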

  15. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    OpenAIRE

    Wang Kuan-Min; Lai Hung-Cheng

    2013-01-01

    This paper extends recent investigations into risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are sourced to empirically validate the contagion effects between stock markets in Vietnam, and China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. Using the...

  16. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)
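
    A small illustration of the Plackett-copula construction used in the study (numpy/scipy; the Weibull marginals and the dependence parameter below are placeholders, not the fitted Eucalyptus values): the joint CDF of diameter and height is obtained by feeding the two marginal CDFs into the copula.

        import numpy as np
        from scipy import stats

        def plackett_cdf(u, v, theta):
            """Plackett copula C(u, v; theta); theta = 1 reduces to independence."""
            if np.isclose(theta, 1.0):
                return u * v
            s = 1.0 + (theta - 1.0) * (u + v)
            return (s - np.sqrt(s**2 - 4.0 * theta * (theta - 1.0) * u * v)) / (2.0 * (theta - 1.0))

        # Placeholder Weibull marginals for diameter (cm) and height (m)
        diam_marg = stats.weibull_min(c=2.5, scale=20.0)
        height_marg = stats.weibull_min(c=3.0, scale=18.0)

        d, h, theta = 22.0, 19.0, 8.0   # illustrative evaluation point and dependence parameter
        u, v = diam_marg.cdf(d), height_marg.cdf(h)
        print("P(D <= %.1f, H <= %.1f) = %.3f" % (d, h, plackett_cdf(u, v, theta)))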

  17. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    Science.gov (United States)

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copulas rarely perform worse and frequently perform better than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer. Copyright © 2013 John Wiley & Sons, Ltd.
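
    To make the marginal building block concrete, the sketch below (scipy; toy counts and arbitrary shape parameters, not the telomerase data) evaluates beta-binomial log-likelihood contributions for the true-positive and true-negative counts of a few studies; in the proposed model these margins would then be linked by a copula rather than summed independently.

        import numpy as np
        from scipy import stats

        # Toy per-study margins: diseased (n_dis, true positives) and non-diseased (n_nondis, true negatives)
        tp, n_dis = np.array([18, 25, 40]), np.array([20, 30, 45])
        tn, n_nondis = np.array([50, 62, 70]), np.array([60, 70, 80])

        def betabin_loglik(k, n, a, b):
            """Summed beta-binomial log-likelihood with shape parameters a, b."""
            return stats.betabinom(n, a, b).logpmf(k).sum()

        # Illustrative shape parameters for the sensitivity and specificity margins
        ll = betabin_loglik(tp, n_dis, a=8.0, b=2.0) + betabin_loglik(tn, n_nondis, a=6.0, b=2.0)
        print("marginal log-likelihood (independence case):", round(ll, 2))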

  18. A power study of bivariate LOD score analysis of a complex trait and fear/discomfort with strangers.

    Science.gov (United States)

    Ji, Fei; Lee, Dayoung; Mendell, Nancy Role

    2005-12-30

    Complex diseases are often reported along with disease-related traits (DRT). Sometimes investigators consider both disease and DRT phenotypes separately and sometimes they consider individuals as affected if they have either the disease or the DRT, or both. We propose instead to consider the joint distribution of the disease and the DRT and do a linkage analysis assuming a pleiotropic model. We evaluated our results through analysis of the simulated datasets provided by Genetic Analysis Workshop 14. We first conducted univariate linkage analysis of the simulated disease, Kofendrerd Personality Disorder and one of its simulated associated traits, phenotype b (fear/discomfort with strangers). Subsequently, we considered the bivariate phenotype, which combined the information on Kofendrerd Personality Disorder and fear/discomfort with strangers. We developed a program to perform bivariate linkage analysis using an extension to the Elston-Stewart peeling method of likelihood calculation. Using this program we considered the microsatellites within 30 cM of the gene pleiotropic for this simulated disease and DRT. Based on 100 simulations of 300 families we observed excellent power to detect linkage within 10 cM of the disease locus using the DRT and the bivariate trait.

  19. Show-Bix &

    DEFF Research Database (Denmark)

    2014-01-01

    The anti-reenactment 'Show-Bix &' consists of 5 dias projectors, a dial phone, quintophonic sound, and interactive elements. A responsive interface will enable the Dias projectors to show copies of original dias slides from the Show-Bix piece ”March på Stedet”, 265 images in total. The copies are...

  20. Risk Aversion in Game Shows

    DEFF Research Database (Denmark)

    Andersen, Steffen; Harrison, Glenn W.; Lau, Morten I.

    2008-01-01

    We review the use of behavior from television game shows to infer risk attitudes. These shows provide evidence when contestants are making decisions over very large stakes, and in a replicated, structured way. Inferences are generally confounded by the subjective assessment of skill in some games......, and the dynamic nature of the task in most games. We consider the game shows Card Sharks, Jeopardy!, Lingo, and finally Deal Or No Deal. We provide a detailed case study of the analyses of Deal Or No Deal, since it is suitable for inference about risk attitudes and has attracted considerable attention....

  1. Talking with TV shows

    DEFF Research Database (Denmark)

    Sandvik, Kjetil; Laursen, Ditte

    2014-01-01

    User interaction with radio and television programmes is not a new thing. However, with new cross-media production concepts such as X Factor and Voice, this is changing dramatically. The second-screen logic of these productions encourages viewers, along with TV’s traditional one-way communication...... mode, to communicate on interactive (dialogue-enabling) devices such as laptops, smartphones and tablets. Using the TV show Voice as our example, this article shows how the technological and situational set-up of the production invites viewers to engage in new ways of interaction and communication...

  2. Talk Show Science.

    Science.gov (United States)

    Moore, Mitzi Ruth

    1992-01-01

    Proposes having students perform skits in which they play the roles of the science concepts they are trying to understand. Provides the dialog for a skit in which hot and cold gas molecules are interviewed on a talk show to study how these properties affect wind, rain, and other weather phenomena. (MDH)

  3. Obesity in show cats.

    Science.gov (United States)

    Corbee, R J

    2014-12-01

    Obesity is an important disease with a high prevalence in cats. Because obesity is related to several other diseases, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain cat breeds has been suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, 268 cats of 22 different breeds were investigated by determining their body condition score (BCS) on a nine-point scale by inspection and palpation, at two different cat shows. Overall, 45.5% of the show cats had a BCS > 5, and 4.5% of the show cats had a BCS > 7. There were significant differences between breeds, which could be related to the breed standards. Most overweight and obese cats were in the neutered group. This warrants firm discussions with breeders and cat show judges to arrive at different interpretations of the standards, in order to prevent overweight conditions in certain breeds from becoming the standard of beauty. Neutering predisposes to obesity and requires early nutritional intervention to prevent obese conditions. Journal of Animal Physiology and Animal Nutrition © 2014 Blackwell Verlag GmbH.

  4. Honored Teacher Shows Commitment.

    Science.gov (United States)

    Ratte, Kathy

    1987-01-01

    Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)

  5. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain

    OpenAIRE

    Rabbani, Hossein; Sonka, Milan; Abramoff, Michael D.

    2013-01-01

    In this paper, MMSE estimator is employed for noise-free 3D OCT data recovery in 3D complex wavelet domain. Since the proposed distribution for noise-free data plays a key role in the performance of MMSE estimator, a priori distribution for the pdf of noise-free 3D complex wavelet coefficients is proposed which is able to model the main statistical properties of wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters which are able to capture th...
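
    A minimal numerical illustration of this kind of prior (scipy; the variances, correlation and mixing weight are arbitrary local values, not estimates from OCT data): a two-component mixture of zero-mean bivariate Gaussians evaluated for a wavelet coefficient and its parent.

        import numpy as np
        from scipy.stats import multivariate_normal

        def mixture_pdf(w, s1, s2, rho, p=0.5):
            """Two-component mixture of zero-mean bivariate Gaussians for a coefficient pair w."""
            cov1 = s1**2 * np.array([[1.0, rho], [rho, 1.0]])
            cov2 = s2**2 * np.array([[1.0, rho], [rho, 1.0]])
            return (p * multivariate_normal(mean=[0, 0], cov=cov1).pdf(w)
                    + (1 - p) * multivariate_normal(mean=[0, 0], cov=cov2).pdf(w))

        w = np.array([0.8, 0.3])            # hypothetical wavelet coefficient and its parent
        print(mixture_pdf(w, s1=1.5, s2=0.3, rho=0.6))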

  6. The energy show

    International Nuclear Information System (INIS)

    1988-01-01

    The Energy Show is a new look at the problems of world energy, where our supplies come from, now and in the future. The programme looks at how we need energy to maintain our standards of living. Energy supply is shown as the complicated set of problems it is - that Fossil Fuels are both raw materials and energy sources, that some 'alternatives' so readily suggested as practical options are in reality a long way from being effective. (author)

  7. Simultaneous use of serum IgG and IgM for risk scoring of suspected early Lyme borreliosis: graphical and bivariate analyses

    DEFF Research Database (Denmark)

    Dessau, Ram B; Ejlertsen, Tove; Hilden, Jørgen

    2010-01-01

    The laboratory diagnosis of early disseminated Lyme borreliosis (LB) rests on IgM and IgG antibodies in serum. The purpose of this study was to refine the statistical interpretation of IgM and IgG by combining the diagnostic evidence provided by the two immunoglobulins and exploiting the whole...... range of the quantitative variation in test values. ELISA assays based on purified flagella antigen were performed on sera from 815 healthy Danish blood donors as negative controls and 117 consecutive patients with confirmed neuroborreliosis (NB). A logistic regression model combining the standardized...... units of the IgM and IgG ELISA assays was constructed and the resulting disease risks were graphically evaluated by receiver operating characteristic and 'predictiveness' curves. The combined model improves the discrimination between NB patients and blood donors. Hence, it is possible to report a predicted...
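
    A minimal sketch of this type of combined risk score (scikit-learn on simulated standardized antibody units, not the Danish serum data): fit a logistic model on IgM and IgG values and summarise discrimination with the ROC AUC.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n_ctrl, n_case = 800, 120
        # Simulated standardized ELISA units (controls lower, cases higher); columns: IgM, IgG
        X = np.vstack([rng.normal([0.0, 0.0], 1.0, size=(n_ctrl, 2)),
                       rng.normal([1.5, 2.0], 1.0, size=(n_case, 2))])
        y = np.r_[np.zeros(n_ctrl), np.ones(n_case)]

        model = LogisticRegression().fit(X, y)
        risk = model.predict_proba(X)[:, 1]          # predicted probability of neuroborreliosis
        print("coefficients (IgM, IgG):", model.coef_.round(2))
        print("ROC AUC:", round(roc_auc_score(y, risk), 3))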

  8. Using bivariate latent basis growth curve analysis to better understand treatment outcome in youth with anorexia nervosa.

    Science.gov (United States)

    Byrne, Catherine E; Wonderlich, Joseph A; Curby, Timothy; Fischer, Sarah; Lock, James; Le Grange, Daniel

    2018-04-25

    This study explored the relation between eating-related obsessionality and weight restoration utilizing bivariate latent basis growth curve modelling. Eating-related obsessionality is a moderator of treatment outcome for adolescents with anorexia nervosa (AN). This study examined the degree to which the rate of change in eating-related obsessionality was associated with the rate of change in weight over time in family-based treatment (FBT) and individual therapy for AN. Data were drawn from a 2-site randomized controlled trial that compared FBT and adolescent focused therapy for AN. Bivariate latent basis growth curves were used to examine the relations between trajectories of body weight and symptoms associated with eating and weight obsessionality. In the FBT group, the slope of eating-related obsessionality scores and the slope of weight were significantly (negatively) correlated. This finding indicates that a decrease in overall eating-related obsessionality is significantly associated with an increase in weight for individuals who received FBT. However, there was no relation between change in obsessionality scores and change in weight in the adolescent focused therapy group. Results suggest that FBT has a specific impact on both weight gain and obsessive compulsive behaviour that is distinct from individual therapy. Copyright © 2018 John Wiley & Sons, Ltd and Eating Disorders Association.

  9. Showing Value (Editorial

    Directory of Open Access Journals (Sweden)

    Denise Koufogiannakis

    2009-06-01

    Full Text Available When Su Cleyle and I first decided to start Evidence Based Library and Information Practice, one of the things we agreed upon immediately was that the journal be open access. We knew that a major obstacle to librarians using the research literature was that they did not have access to the research literature. Although Su and I are both academic librarians who can access a wide variety of library and information literature from our institutions, we belong to a profession where not everyone has equal access to the research in our field. Without such access to our own body of literature, how can we ever hope for practitioners to use research evidence in their decision making? It would have been contradictory to the principles of evidence based library and information practice to do otherwise.One of the specific groups we thought could use such an open access venue for discovering research literature was school librarians. School librarians are often isolated and lacking access to the research literature that may help them prove to stakeholders the importance of their libraries and their role within schools. Certainly, school libraries have been in decline and the use of evidence to show value is needed. As Ken Haycock noted in his 2003 report, The Crisis in Canada’s School Libraries: The Case for Reform and Reinvestment, “Across the country, teacher-librarians are losing their jobs or being reassigned. Collections are becoming depleted owing to budget cuts. Some principals believe that in the age of the Internet and the classroom workstation, the school library is an artifact” (9. Within this context, school librarians are looking to our research literature for evidence of the impact that school library programs have on learning outcomes and student success. They are integrating that evidence into their practice, and reflecting upon what can be improved locally. They are focusing on students and showing the impact of school libraries and

  10. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
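
    For example, the power of a two-sided test of a single bivariate (Pearson) correlation can be approximated with Fisher's z transformation; the short sketch below (scipy, illustrative effect size and sample size) mirrors the kind of calculation such tools perform.

        import numpy as np
        from scipy.stats import norm

        def correlation_power(r, n, alpha=0.05):
            """Approximate power of the two-sided test of H0: rho = 0 via Fisher's z."""
            z_r = np.arctanh(r) * np.sqrt(n - 3)
            z_crit = norm.ppf(1 - alpha / 2)
            return norm.cdf(z_r - z_crit) + norm.cdf(-z_r - z_crit)

        print(round(correlation_power(r=0.3, n=84), 3))   # roughly 0.80 for a medium effect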

  11. Bi-variate statistical attribute filtering : A tool for robust detection of faint objects

    NARCIS (Netherlands)

    Teeninga, Paul; Moschini, Ugo; Trager, Scott C.; Wilkinson, M.H.F.

    2013-01-01

    We present a new method for morphological connected attribute filtering for object detection in astronomical images. In this approach, a threshold is set on one attribute (power), based on its distribution due to noise, as a function of object area. The results show an order of magnitude higher

  12. Bivariate and Multivariate Associations between Trait Listening Goals and Trait Communicator Preferences

    Science.gov (United States)

    Keaton, Shaughan A.; Keteyian, Robert V.; Bodie, Graham D.

    2014-01-01

    This article provides validity evidence for a measure of listening goals by showing theoretically consistent relationships with an existing communication preference questionnaire. Participants (N = 257) were administered trait measures for listening goals and communicator preferences. The four listening goals--relational, task-oriented,…

  13. Aesthetic Surgery Reality Television Shows: Do they Influence Public Perception of the Scope of Plastic Surgery?

    Science.gov (United States)

    Denadai, Rafael; Araujo, Karin Milleni; Samartine Junior, Hugo; Denadai, Rodrigo; Raposo-Amaral, Cassio Eduardo

    2015-12-01

    The purpose of this survey was to assess the influence of aesthetic surgery "reality television" show viewing on the public's perception of the scope of plastic surgery practice. Perceptions of the scope of plastic surgery (33 scenarios), aesthetic surgery "reality television" viewing patterns ("high," "moderate," or "low" familiarity, similarity, confidence, and influence viewers), sociodemographic data, and previous plastic surgery interaction were collected from 2148 members of the public. Response patterns were created and bivariate and multivariate analyses were applied to assess the possible determinants of overall public choice of plastic surgeons as experts in the plastic surgery-related scenarios. Both "plastic surgeons" and "plastic surgeons alone" were the main response patterns (all p …). Aesthetic surgery "reality television" viewing negatively influences the public perception of the broad scope of plastic surgery. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  14. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework to the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.

  15. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    Directory of Open Access Journals (Sweden)

    Yanyong Guo

    2017-01-01

    Full Text Available The primary objective of this study is to evaluate factors affecting e-bike involved crash and license plate use in China. E-bike crash data were collected from a police database and completed through a telephone interview. Noncrash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike involved crash and e-bike license plate use and to account for the correlations between them. Marginal effects for contributory factors were calculated to quantify their impacts on the outcomes. The results show that several contributory factors, including gender, age, education level, driver license, car in household, experiences in using e-bike, law compliance, and aggressive driving behaviors, have significant impacts on both e-bike involved crash and license plate use. Moreover, type of e-bike, frequency of using e-bike, impulse behavior, degree of riding experience, and risk perception scale are found to be associated with e-bike involved crash. It is also found that e-bike involved crash and e-bike license plate use are strongly correlated and negative in direction. The results enhance our comprehension of the factors related to e-bike involved crash and e-bike license plate use.
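
    To make the model structure concrete, here is a compact sketch of a bivariate probit log-likelihood (scipy on simulated data; the single covariate and coefficient values are placeholders, not the study's variables), in which the two binary outcomes share a latent correlation rho.

        import numpy as np
        from scipy.stats import multivariate_normal
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        n, rho_true = 200, -0.4
        x = rng.normal(size=n)                               # single illustrative covariate
        eps = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
        y1 = (0.5 + 0.8 * x + eps[:, 0] > 0).astype(int)     # e.g. crash involvement
        y2 = (-0.2 + 0.6 * x + eps[:, 1] > 0).astype(int)    # e.g. license plate use

        def negloglik(params):
            a1, b1, a2, b2, arho = params
            rho = np.tanh(arho)                              # keeps the correlation inside (-1, 1)
            q1, q2 = 2 * y1 - 1, 2 * y2 - 1                  # sign flips give all four cell probabilities
            upper = np.column_stack([q1 * (a1 + b1 * x), q2 * (a2 + b2 * x)])
            ll = 0.0
            for s in (-1, 1):                                # group observations by the sign of q1*q2
                mask = (q1 * q2) == s
                cov = [[1.0, s * rho], [s * rho, 1.0]]
                p = multivariate_normal(mean=[0, 0], cov=cov).cdf(upper[mask])
                ll += np.log(np.clip(p, 1e-12, None)).sum()
            return -ll

        fit = minimize(negloglik, x0=np.zeros(5), method="Nelder-Mead",
                       options={"xatol": 1e-3, "fatol": 1e-3})
        print("estimated latent correlation:", round(float(np.tanh(fit.x[-1])), 2))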

  16. Mortality as a bivariate function of age and size in indeterminate growers

    DEFF Research Database (Denmark)

    Colchero, Fernando; Schaible, Ralf

    2014-01-01

    Mortality in organisms that grow indefinitely, known as indeterminate growers, is thought to be driven primarily by size. However, a number of ageing mechanisms also act as functions of age. Thus, to explain mortality in these species, both size and age need to be explicitly modelled. Here we...... contribution of age, as a proxy for chronological deterioration, is of typical senescence; while a seemingly senescent population can have underlying age-related negative senescence, which is, however, overcome by negative underlying size effects. We show how inference about these unobserved processes can...

  17. BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition

    Directory of Open Access Journals (Sweden)

    Abdullah Makkeh

    2018-04-01

    Full Text Available Makkeh, Theis, and Vicente found that the Cone Programming model is the most robust way to compute the Bertschinger et al. partial information decomposition (BROJA PID) measure. We developed production-quality robust software that computes the BROJA PID measure based on the Cone Programming model. In this paper, we prove the important property of strong duality for the Cone Program and prove an equivalence between the Cone Program and the original Convex problem. Then, we describe our software in detail, explain how to use it, and perform some experiments comparing it to other estimators. Finally, we show that the software can be extended to compute some quantities of a trivariate PID measure.

  18. Calculus of bivariant function

    OpenAIRE

    PTÁČNÍK, Jan

    2011-01-01

    This thesis deals with the introduction of functions of two variables and the differential calculus of such functions. The work should serve as a textbook for students training to become elementary school teachers. Each chapter contains a summary of basic concepts and explanations of relationships, then worked model exercises on the topic, and finally exercises for the student to solve independently. The thesis aims to give students a basic knowledge of the differential calculus of functions of two variables, inc...

  19. Bivariate quadratic method in quantifying the differential capacitance and energy capacity of supercapacitors under high current operation

    Science.gov (United States)

    Goh, Chin-Teng; Cruden, Andrew

    2014-11-01

    Capacitance and resistance are the fundamental electrical parameters used to evaluate the electrical characteristics of a supercapacitor, namely the dynamic voltage response, energy capacity, state of charge and health condition. In the British Standards EN62391 and EN62576, the constant capacitance method can be further improved with a differential capacitance that more accurately describes the dynamic voltage response of supercapacitors. This paper presents a novel bivariate quadratic-based method to model the dynamic voltage response of supercapacitors under high current charge-discharge cycling, and to enable the derivation of the differential capacitance and energy capacity directly from terminal measurements, i.e. voltage and current, rather than from multiple pulsed-current or excitation signal tests across different bias levels. The estimation results achieved are in close agreement with experimental measurements, within a relative error of 0.2% at various high current levels (25-200 A), and are more accurate than the constant capacitance method (4-7%). The archival value of this paper is the introduction of an improved quantification method for the electrical characteristics of supercapacitors, and the disclosure of the distinct properties of supercapacitors: the nonlinear capacitance-voltage characteristic, capacitance variation between charging and discharging, and distribution of energy capacity across the operating voltage window.
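
    As a generic illustration of a bivariate quadratic fit of this kind (numpy on synthetic data; the variables and coefficients are placeholders rather than the paper's supercapacitor model), a full quadratic in two variables is fitted by linear least squares and can then be differentiated analytically, which is the kind of derivative a differential-capacitance estimate relies on.

        import numpy as np

        rng = np.random.default_rng(4)
        q = rng.uniform(0.0, 1.0, 300)          # e.g. normalised charge
        i = rng.uniform(0.2, 1.0, 300)          # e.g. normalised current
        v = 0.4 + 1.8 * q - 0.3 * q**2 + 0.1 * i - 0.2 * q * i + rng.normal(0, 0.01, 300)

        # Design matrix for v = c0 + c1*q + c2*i + c3*q^2 + c4*q*i + c5*i^2
        X = np.column_stack([np.ones_like(q), q, i, q**2, q * i, i**2])
        coef, *_ = np.linalg.lstsq(X, v, rcond=None)
        print("fitted coefficients:", coef.round(3))

        q0, i0 = 0.5, 0.6                        # illustrative operating point
        dv_dq = coef[1] + 2 * coef[3] * q0 + coef[4] * i0
        print("dv/dq at the operating point:", round(dv_dq, 3))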

  20. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells and subsequent staining with the fluorochromes 33258 Hoechst or DAPI allowed bivariate flow karyotyping of the isolated chromosomes.

  1. Diagnostic value of sTREM-1 in bronchoalveolar lavage fluid in ICU patients with bacterial lung infections: a bivariate meta-analysis.

    Science.gov (United States)

    Shi, Jia-Xin; Li, Jia-Shu; Hu, Rong; Li, Chun-Hua; Wen, Yan; Zheng, Hong; Zhang, Feng; Li, Qin

    2013-01-01

    The serum soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) is a useful biomarker in differentiating bacterial infections from others. However, the diagnostic value of sTREM-1 in bronchoalveolar lavage fluid (BALF) in lung infections has not been well established. We performed a meta-analysis to assess the accuracy of sTREM-1 in BALF for diagnosis of bacterial lung infections in intensive care unit (ICU) patients. We searched PUBMED, EMBASE and Web of Knowledge (from January 1966 to October 2012) databases for relevant studies that reported diagnostic accuracy data of BALF sTREM-1 in the diagnosis of bacterial lung infections in ICU patients. Pooled sensitivity, specificity, and positive and negative likelihood ratios were calculated by a bivariate regression analysis. Measures of accuracy and Q point value (Q*) were calculated using summary receiver operating characteristic (SROC) curve. The potential between-studies heterogeneity was explored by subgroup analysis. Nine studies were included in the present meta-analysis. Overall, the prevalence was 50.6%; the sensitivity was 0.87 (95% confidence interval (CI), 0.72-0.95); the specificity was 0.79 (95% CI, 0.56-0.92); the positive likelihood ratio (PLR) was 4.18 (95% CI, 1.78-9.86); the negative likelihood ratio (NLR) was 0.16 (95% CI, 0.07-0.36), and the diagnostic odds ratio (DOR) was 25.60 (95% CI, 7.28-89.93). The area under the SROC curve was 0.91 (95% CI, 0.88-0.93), with a Q* of 0.83. Subgroup analysis showed that the assay method and cutoff value influenced the diagnostic accuracy of sTREM-1. BALF sTREM-1 is a useful biomarker of bacterial lung infections in ICU patients. Further studies are needed to confirm the optimized cutoff value.
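
    For reference, the likelihood-ratio and odds-ratio summaries quoted above follow directly from pooled sensitivity and specificity; a short sketch using the pooled point estimates from the abstract (the values differ slightly from the reported ones because those were derived from the full bivariate model rather than from these identities):

        sens, spec = 0.87, 0.79                 # pooled estimates reported in the abstract

        plr = sens / (1 - spec)                 # positive likelihood ratio
        nlr = (1 - sens) / spec                 # negative likelihood ratio
        dor = plr / nlr                         # diagnostic odds ratio

        print(f"PLR={plr:.2f}, NLR={nlr:.2f}, DOR={dor:.1f}")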

  2. Bivariate threshold models for genetic evaluation of susceptibility to and ability to recover from mastitis in Danish Holstein cows.

    Science.gov (United States)

    Welderufael, B G; Janss, L L G; de Koning, D J; Sørensen, L P; Løvendahl, P; Fikse, W F

    2017-06-01

    Mastitis in dairy cows is an unavoidable problem and genetic variation in recovery from mastitis, in addition to susceptibility, is therefore of interest. Genetic parameters for susceptibility to and recovery from mastitis were estimated for Danish Holstein-Friesian cows using data from automatic milking systems equipped with online somatic cell count measuring units. The somatic cell count measurements were converted to elevated mastitis risk, a continuous variable [on a (0-1) scale] indicating the risk of mastitis. Risk values >0.6 were assumed to indicate that a cow had mastitis. For each cow and lactation, the sequence of health states (mastitic or healthy) was converted to a weekly transition: 0 if the cow stayed within the same state and 1 if the cow changed state. The result was 2 series of transitions: one for healthy to diseased (HD, to model mastitis susceptibility) and the other for diseased to healthy (DH, to model recovery ability). The 2 series of transitions were analyzed with bivariate threshold models, including several systematic effects and a function of time. The model included effects of herd, parity, herd-test-week, permanent environment (to account for the repetitive nature of transition records from a cow) plus two time-varying effects (lactation stage and time within episode). In early lactation, there was an increased risk of getting mastitis but the risk remained stable afterwards. Mean recovery rate was 45% per lactation. Heritabilities were 0.07 [posterior mean of standard deviations (PSD) = 0.03] for HD and 0.08 (PSD = 0.03) for DH. The genetic correlation between HD and DH has a posterior mean of -0.83 (PSD = 0.13). Although susceptibility and recovery from mastitis are strongly negatively correlated, recovery can be considered as a new trait for selection. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under

  3. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
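
    A toy illustration of the superposition idea (numpy; three arbitrary deterministic maps on a four-state system, not the yeast cell-cycle class): averaging the per-member transition matrices gives a stochastic matrix T whose row entropies quantify how much the ensemble disagrees about the next state.

        import numpy as np

        n_states = 4

        def transition_matrix(next_state):
            """Deterministic dynamics encoded as a 0/1 transition-by-transition matrix."""
            T = np.zeros((n_states, n_states))
            T[np.arange(n_states), next_state] = 1.0
            return T

        # Three hypothetical ensemble members (each maps a state index to its successor)
        members = [np.array([1, 2, 3, 0]), np.array([1, 2, 0, 0]), np.array([2, 2, 3, 3])]
        T = sum(transition_matrix(m) for m in members) / len(members)   # class superposition

        with np.errstate(divide="ignore", invalid="ignore"):
            plogp = np.where(T > 0, T * np.log2(T), 0.0)
        print("T =\n", T)
        print("Shannon entropy per state (bits):", (-plogp.sum(axis=1)).round(3))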

  4. Measuring performance at trade shows

    DEFF Research Database (Denmark)

    Hansen, Kåre

    2004-01-01

    Trade shows are an increasingly important marketing activity for many companies, but current measures of trade show performance do not adequately capture dimensions important to exhibitors. Based on the marketing literature's outcome- and behavior-based control system taxonomy, a model is built...... that captures an outcome-based sales dimension and four behavior-based dimensions (i.e. information-gathering, relationship building, image building, and motivation activities). A 16-item instrument is developed for assessing exhibitors' perceptions of their trade show performance. The paper presents evidence...

  5. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping in the targeting of schistosomiasis control. However, not all schools return questionnaires and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modeled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to nearest river and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10%) prevalence schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but is unable to distinguish between moderate and high prevalence schools.

  6. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization.

    Directory of Open Access Journals (Sweden)

    Margaretha A Vink

    Full Text Available Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination.
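
    As a rough frequentist analogue of the model described (scikit-learn on simulated log-antibody concentrations; the Bayesian estimation, age structure and assay details of the original are omitted), a four-component bivariate Gaussian mixture can be fitted as follows.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        # Simulated log10 concentrations: double-negative, HPV16-only, HPV18-only, double-positive
        groups = [rng.normal([0.0, 0.0], 0.3, size=(3000, 2)),
                  rng.normal([1.5, 0.2], 0.4, size=(300, 2)),
                  rng.normal([0.2, 1.4], 0.4, size=(250, 2)),
                  rng.normal([1.9, 1.8], 0.4, size=(325, 2))]
        X = np.vstack(groups)

        gm = GaussianMixture(n_components=4, covariance_type="full", random_state=0).fit(X)
        print("component weights:", gm.weights_.round(3))
        print("component means:\n", gm.means_.round(2))
        # Posterior membership probabilities classify subjects as singly or doubly seropositive
        print("posterior for the first subject:", gm.predict_proba(X[:1]).round(2))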

  7. Tokyo Motor Show 2003; Tokyo Motor Show 2003

    Energy Technology Data Exchange (ETDEWEB)

    Joly, E.

    2004-01-01

    The text which follows presents the different technologies exhibited during the 37th Tokyo Motor Show. The report points out the main development trends of the Japanese automobile industry. Hybrid electric vehicles and vehicles equipped with fuel cells were highlighted by the Japanese manufacturers, which devote considerable budgets to research on less polluting vehicles. The exhibited models, although all different depending on the manufacturer, always use a hybrid system: fuel cell/battery. The manufacturers also stressed intelligent systems for navigation and safety, as well as design and comfort. (O.M.)

  8. Reality show: um paradoxo nietzschiano

    Directory of Open Access Journals (Sweden)

    Ilana Feldman

    2011-01-01

    Full Text Available

    The phenomenon of reality shows - and the ensuing relationship between image and truth - rests on a series of paradoxes. These paradoxes can be understood in the light of the thought of the German philosopher Friedrich Nietzsche, who, through the use of paradoxical formulations, conceived reality as a world of pure appearance and truth as a fictional addition, as an effect. Fiction is thus taken, in Nietzsche's philosophy, not in its falsifying, de-realizing aspect - as our metaphysical tradition has always claimed - but as a necessary condition for a certain kind of invention to operate as truth. Accordingly, the very expression reality show, through its paradoxical formulation, explicitly engenders a world of pure appearance, in which truth, the reality part of the proposition, is of the order of the supplement, of that which is fictionally added - like an adjective - to show. The ornament, in this case, comes to occupy the central place, pointing to the effect produced: the truth-effect. Following Nietzschean thought and its contemporary reworkings, we investigate how televised "reality shows" operate paradoxically, in line with our own paradoxical cultural practices.

  9. Complexity analyses show two distinct types of nonlinear dynamics in short heart period variability recordings

    Science.gov (United States)

    Porta, Alberto; Bari, Vlasta; Marchi, Andrea; De Maria, Beatrice; Cysarz, Dirk; Van Leeuwen, Peter; Takahashi, Anielle C. M.; Catai, Aparecida M.; Gnecchi-Ruscone, Tomaso

    2015-01-01

    Two diverse complexity metrics quantifying time irreversibility and local prediction, in connection with a surrogate data approach, were utilized to detect nonlinear dynamics in short heart period (HP) variability series recorded in fetuses, as a function of the gestational period, and in healthy humans, as a function of the magnitude of the orthostatic challenge. The metrics indicated the presence of two distinct types of nonlinear HP dynamics characterized by diverse ranges of time scales. These findings stress the need to render more specific the analysis of nonlinear components of HP dynamics by accounting for different temporal scales. PMID:25806002

  10. Advanced Behavioral Analyses Show that the Presence of Food Causes Subtle Changes in C. elegans Movement

    OpenAIRE

    Angstman, Nicholas B.; Frank, Hans-Georg; Schmitz, Christoph

    2016-01-01

    As a widely used and studied model organism, Caenorhabditis elegans worms offer the ability to investigate implications of behavioral change. Although investigation of C. elegans behavioral traits is well established, analysis is often narrowed down to measurements based on a single point, and thus cannot pick up on subtle behavioral and morphological changes. In the present study videos were captured of four different C. elegans strains grown in liquid cultures and transferred to NGM-agar plate...

  11. A Hybrid Forecasting Model Based on Bivariate Division and a Backpropagation Artificial Neural Network Optimized by Chaos Particle Swarm Optimization for Day-Ahead Electricity Price

    Directory of Open Access Journals (Sweden)

    Zhilong Wang

    2014-01-01

    Full Text Available In the electricity market, the electricity price plays an inevitable role. Nevertheless, accurate price forecasting, a vital factor affecting both government regulatory agencies and public power companies, remains a huge challenge and a critical problem. Determining how to address the accurate forecasting problem becomes an even more significant task in an era in which electricity is increasingly important. Based on chaos particle swarm optimization (CPSO), the backpropagation artificial neural network (BPANN), and the idea of bivariate division, this paper proposes a bivariate division BPANN (BD-BPANN) method and the CPSO-BD-BPANN method for forecasting electricity price. The former method transforms the electricity demand and price into a new variable, named DV, calculated using the division principle; the day-ahead electricity price is then forecast by multiplying the forecasted values of the DV and the forecasted values of the demand. Next, to improve the accuracy of BD-BPANN, chaos particle swarm optimization and BD-BPANN are synthesized to form a novel model, CPSO-BD-BPANN. In this study, CPSO is utilized to optimize the initial parameters of BD-BPANN to make its output more stable than that of the original model. Finally, two forecasting strategies are proposed regarding different situations.
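
    The division idea itself is simple to sketch (scikit-learn on a synthetic hourly series; the CPSO step and the real market data are not reproduced): forecast the ratio DV = price/demand and the demand separately, then multiply the two forecasts to obtain the day-ahead price.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(6)
        hours = np.arange(24 * 60)
        demand = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
        price = 0.5 * demand + 10 * np.sin(2 * np.pi * hours / 24 + 1) + rng.normal(0, 1, hours.size)
        dv = price / demand                                   # the "bivariate division" variable

        def lagged(series, n_lags=24):
            X = np.column_stack([series[i:-(n_lags - i)] for i in range(n_lags)])
            return X, series[n_lags:]

        X_dv, y_dv = lagged(dv)
        X_d, y_d = lagged(demand)
        split = -24                                           # hold out the last day

        nn_dv = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_dv[:split], y_dv[:split])
        nn_d = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_d[:split], y_d[:split])

        price_hat = nn_dv.predict(X_dv[split:]) * nn_d.predict(X_d[split:])
        print("day-ahead MAE:", float(np.abs(price_hat - price[split:]).mean()))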

  12. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    Full Text Available Contemporary traffic safety research contains little information on quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. Potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern which needs investigation. This study therefore focused on investigating the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through development of a bivariate ordered probit model using data extracted from WHO's global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with the statistically significant correlation between response outcomes, validate the statistical supremacy of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of enforcement levels of speed limit and drink driving laws, respectively. Also, the model encapsulates the effect of several other agency-related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on intermediate categories of response outcomes. The results of this study are expected to provide necessary insights to elemental enforcement programs. Also, marginal effects of explanatory variables may provide useful directions for formulating effective policy countermeasures for overcoming drivers' speeding and drink driving behavior.

  13. A bivariate contaminated binormal model for robust fitting of proper ROC curves to a pair of correlated, possibly degenerate, ROC datasets.

    Science.gov (United States)

    Zhai, Xuetong; Chakraborty, Dev P

    2017-06-01

    The objective was to design and implement a bivariate extension to the contaminated binormal model (CBM) to fit paired receiver operating characteristic (ROC) datasets-possibly degenerate-with proper ROC curves. Paired datasets yield two correlated ratings per case. Degenerate datasets have no interior operating points and proper ROC curves do not inappropriately cross the chance diagonal. The existing method, developed more than three decades ago utilizes a bivariate extension to the binormal model, implemented in CORROC2 software, which yields improper ROC curves and cannot fit degenerate datasets. CBM can fit proper ROC curves to unpaired (i.e., yielding one rating per case) and degenerate datasets, and there is a clear scientific need to extend it to handle paired datasets. In CBM, nondiseased cases are modeled by a probability density function (pdf) consisting of a unit variance peak centered at zero. Diseased cases are modeled with a mixture distribution whose pdf consists of two unit variance peaks, one centered at positive μ with integrated probability α, the mixing fraction parameter, corresponding to the fraction of diseased cases where the disease was visible to the radiologist, and one centered at zero, with integrated probability (1-α), corresponding to disease that was not visible. It is shown that: (a) for nondiseased cases the bivariate extension is a unit variances bivariate normal distribution centered at (0,0) with a specified correlation ρ 1 ; (b) for diseased cases the bivariate extension is a mixture distribution with four peaks, corresponding to disease not visible in either condition, disease visible in only one condition, contributing two peaks, and disease visible in both conditions. An expression for the likelihood function is derived. A maximum likelihood estimation (MLE) algorithm, CORCBM, was implemented in the R programming language that yields parameter estimates and the covariance matrix of the parameters, and other statistics
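
    To illustrate the univariate ingredient, the sketch below (scipy; alpha and mu are illustrative values, not fitted estimates) traces the proper ROC curve implied by a contaminated binormal model whose diseased-case distribution mixes a visible-disease peak at mu with an invisible-disease peak at zero.

        import numpy as np
        from scipy.stats import norm

        alpha, mu = 0.7, 2.0                     # illustrative: 70% of disease is visible, separation 2
        t = np.linspace(-5, 8, 400)              # decision thresholds

        fpf = norm.sf(t)                                            # non-diseased cases: N(0, 1)
        tpf = alpha * norm.sf(t - mu) + (1 - alpha) * norm.sf(t)    # diseased cases: mixture

        x, y = fpf[::-1], tpf[::-1]                                 # order with FPF increasing
        auc = np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2)             # trapezoidal area under the ROC
        print("AUC of the proper ROC curve:", round(float(auc), 3))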

  14. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependencies and non-normality in residual terms. The models are estimated using 2013 Virginia police reported two-vehicle head-on collision data, where exactly one driver is at-fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects injury severity of at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on at-fault injury outcome. Contrarily, and importantly, the effect of at-fault vehicle speed on injury severity of not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on injury outcome of not-at-fault driver. Compared to traditional ordered probability

  15. Evolution of association between renal and liver functions while awaiting heart transplant: An application using a bivariate multiphase nonlinear mixed effects model.

    Science.gov (United States)

    Rajeswaran, Jeevanantham; Blackstone, Eugene H; Barnard, John

    2018-07-01

    In many longitudinal follow-up studies, we observe more than one longitudinal outcome. Impaired renal and liver functions are indicators of poor clinical outcomes for patients who are on mechanical circulatory support and awaiting heart transplant. Hence, monitoring organ functions while waiting for heart transplant is an integral part of patient management. Longitudinal measurements of bilirubin can be used as a marker for liver function and glomerular filtration rate for renal function. We derive an approximation to evolution of association between these two organ functions using a bivariate nonlinear mixed effects model for continuous longitudinal measurements, where the two submodels are linked by a common distribution of time-dependent latent variables and a common distribution of measurement errors.

  16. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating......The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  17. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach....... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  18. Point processes statistics of stable isotopes: analysing water uptake patterns in a mixed stand of Aleppo pine and Holm oak

    Directory of Open Access Journals (Sweden)

    Carles Comas

    2015-04-01

    Full Text Available Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H) as hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area = 888 m2). We tested the hypothesis that both species uptake water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency, depending on whether neighbouring trees belong to one species or the other. Material and Methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a mark point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰) were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰), pointing to a greater contribution of deeper soil layers for water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.

  19. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks of fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a...

  20. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
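
    At its core such an analyser is an ADC whose successive output codes increment a histogramming memory. A minimal software model of that data path (8-bit codes into 256 channels) is sketched below; the sampled waveform is synthetic and purely illustrative of the histogramming step, not of the CAMAC hardware.

```python
import numpy as np

N_CHANNELS = 256                # an 8-bit ADC addresses 256 histogram channels

def histogram_codes(samples):
    """Accumulate ADC codes into a 256-channel spectrum, mimicking the
    ADC -> histogramming RAM data path of a multichannel analyser."""
    spectrum = np.zeros(N_CHANNELS, dtype=np.uint64)
    codes = np.clip(np.asarray(samples), 0, N_CHANNELS - 1).astype(np.uint16)
    np.add.at(spectrum, codes, 1)               # one increment per sample
    return spectrum

# Illustrative input: 10^6 noisy samples centred on channel 120
rng = np.random.default_rng(0)
adc_codes = rng.normal(120, 10, size=1_000_000).round()
spec = histogram_codes(adc_codes)
print(spec.argmax(), int(spec.sum()))
```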

  1. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)

  2. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  3. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  4. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings or the proof of their absence are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exists a bunch of different criteria and different variants of criteria in order to reason in different settings. This leads to incomparable results. Moreover it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit to this setting. We show how to formally reason about and compare encodability criteria by mapping them on requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  5. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  6. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. Any direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was not found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  7. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...
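
    The fixed-interval bivariate histogram that serves as the baseline here can be computed directly with numpy; the sketch below normalises the cell heights so they form a proper density estimate. The sample is simulated and the bin count of 20 is arbitrary.

```python
import numpy as np

# Illustrative sample from a correlated bivariate normal distribution
rng = np.random.default_rng(42)
xy = rng.multivariate_normal(mean=[0.0, 0.0],
                             cov=[[1.0, 0.6], [0.6, 1.0]], size=5000)

# Fixed-interval bivariate histogram density estimate: equal-width bins,
# heights scaled so the estimate integrates to one over the plane.
density, x_edges, y_edges = np.histogram2d(xy[:, 0], xy[:, 1],
                                           bins=20, density=True)

cell_area = np.diff(x_edges)[0] * np.diff(y_edges)[0]
print(np.isclose(density.sum() * cell_area, 1.0))   # sanity check: integrates to 1
```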

  8. BIMOND3, Monotone Bivariate Interpolation

    International Nuclear Information System (INIS)

    Fritsch, F.N.; Carlson, R.E.

    2001-01-01

    1 - Description of program or function: BIMOND is a FORTRAN-77 subroutine for piecewise bi-cubic interpolation to data on a rectangular mesh, which reproduces the monotonicity of the data. A driver program, BIMOND1, is provided which reads data, computes the interpolating surface parameters, and evaluates the function on a mesh suitable for plotting. 2 - Method of solution: Monotonic piecewise bi-cubic Hermite interpolation is used. 3 - Restrictions on the complexity of the problem: The current version of the program can treat data which are monotone in only one of the independent variables, but cannot handle piecewise monotone data
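
    BIMOND itself is a Fortran-77 routine; for readers who want to experiment with the underlying idea of monotonicity-preserving piecewise-cubic Hermite interpolation, a one-dimensional analogue is available in SciPy. The sketch below contrasts it with an ordinary cubic spline on step-like data; it illustrates the concept only and is not a reimplementation of BIMOND's bivariate algorithm.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator, CubicSpline

# Monotone data with a sharp step, where plain splines tend to overshoot
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.1, 0.2, 3.0, 3.1, 3.2])

pchip = PchipInterpolator(x, y)    # monotonicity-preserving Hermite cubic
spline = CubicSpline(x, y)         # ordinary cubic spline, no shape constraint

xs = np.linspace(0.0, 5.0, 501)
print("PCHIP stays within the data range:",
      bool(pchip(xs).min() >= 0.0 and pchip(xs).max() <= 3.2))
print("Cubic spline over/undershoots:    ",
      bool(spline(xs).max() > 3.2 or spline(xs).min() < 0.0))
```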

  9. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. This method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples. [Spanish] El principio de maxima entropia, conocido como POME, es utilizado para derivar un procedimiento alternativo de estimacion de parametros de la distribucion bivariada de valores extremos con marginales Gumbel. El modelo se aplica al analisis de la precipitacion maxima en 24 horas en una region de Mexico y los eventos de diseno obtenidos son comparados con los proporcionados por la tecnica de maxima verosimilitud. De acuerdo con los resultados obtenidos, se concluye que la tecnica propuesta representa una buena opcion, sobre todo para el caso de muestras pequenas.
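
    As a point of reference for the comparison mentioned in the abstract, maximum-likelihood fitting of a Gumbel marginal and the derivation of a design event are one-liners in SciPy; the data below are simulated, and the POME estimation of the bivariate model itself is not reproduced here.

```python
import numpy as np
from scipy import stats

# Simulated annual maximum 24-h precipitation (mm); small sample on purpose
rng = np.random.default_rng(7)
annual_max = stats.gumbel_r.rvs(loc=60.0, scale=15.0, size=25, random_state=rng)

# Maximum-likelihood estimates of the Gumbel location and scale parameters
loc_hat, scale_hat = stats.gumbel_r.fit(annual_max)

# 100-year design event from the fitted marginal (quantile at p = 1 - 1/T)
T = 100.0
design_event = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc_hat, scale=scale_hat)
print(round(loc_hat, 1), round(scale_hat, 1), round(design_event, 1))
```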

  10. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis.

    Science.gov (United States)

    Shields, Katherine F; Bain, Robert E S; Cronk, Ryan; Wright, Jim A; Bartram, Jamie

    2015-12-01

    Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source. Water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water.
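
    The pooling behind summary figures of this kind can be illustrated with a standard DerSimonian-Laird random-effects model on study-level log-odds of noncompliance. The sketch below is a univariate simplification of the bivariate model used in the paper, and the study counts are invented.

```python
import numpy as np

def dersimonian_laird(events, totals):
    """Random-effects pooling of proportions on the log-odds scale.

    events, totals : contaminated samples and samples tested per study.
    Returns the pooled proportion with an approximate 95% confidence interval.
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    a, b = events + 0.5, totals - events + 0.5      # 0.5 continuity correction
    y = np.log(a / b)                               # study log-odds
    v = 1.0 / a + 1.0 / b                           # within-study variances
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    expit = lambda z: 1.0 / (1.0 + np.exp(-z))
    return expit(mu), expit(mu - 1.96 * se), expit(mu + 1.96 * se)

# Hypothetical studies: contaminated samples / samples tested
print(dersimonian_laird(events=[12, 30, 45, 8], totals=[40, 60, 90, 50]))
```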

  11. Modelling the vicious circle between obesity and physical activity in children and adolescents using a bivariate probit model with endogenous regressors.

    Science.gov (United States)

    Yeh, C-Y; Chen, L-J; Ku, P-W; Chen, C-M

    2015-01-01

    The increasing prevalence of obesity in children and adolescents has become one of the most important public health issues around the world. Lack of physical activity is a risk factor for obesity, while being obese could reduce the likelihood of participating in physical activity. Failing to account for the endogeneity between obesity and physical activity would result in biased estimation. This study investigates the relationship between overweight and physical activity by taking endogeneity into consideration. It develops an endogenous bivariate probit model estimated by the maximum likelihood method. The data included 4008 boys and 4197 girls in the 5th-9th grades in Taiwan in 2007-2008. The relationship between overweight and physical activity is significantly negative in the endogenous model, but insignificant in the comparative exogenous model. This endogenous relationship presents a vicious circle in which lower levels of physical activity lead to overweight, while those who are already overweight engage in less physical activity. The results not only reveal the importance of endogenous treatment, but also demonstrate the robust negative relationship between these two factors. An emphasis should be put on overweight and obese children and adolescents in order to break the vicious circle. Promotion of physical activity by appropriate counselling programmes and peer support could be effective in reducing the prevalence of obesity in children and adolescents.
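
    The endogenous (recursive) bivariate probit described above can be written as two latent equations with jointly normal errors, the binary activity indicator entering the overweight equation directly. The sketch below simulates such data and maximises the standard likelihood numerically; everything here is illustrative: the data are simulated, the tiny specification and variable names are invented, and the brute-force likelihood is written for transparency rather than speed.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
n = 500

# Simulate a recursive bivariate probit: physical activity (pa) is an
# endogenous binary regressor in the overweight (ow) equation.
z = rng.normal(size=n)                        # covariate driving activity
x = rng.normal(size=n)                        # covariate driving overweight
rho_true = -0.4
e = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
pa = (0.3 + 0.8 * z + e[:, 0] > 0).astype(float)
ow = (-0.2 + 0.5 * x - 0.7 * pa + e[:, 1] > 0).astype(float)

def neg_loglik(theta):
    a0, a1, b0, b1, g, atanh_rho = theta
    rho = np.tanh(atanh_rho)                  # keep the correlation in (-1, 1)
    m1 = a0 + a1 * z
    m2 = b0 + b1 * x + g * pa
    q1, q2 = 2 * pa - 1, 2 * ow - 1
    pts = np.column_stack([q1 * m1, q2 * m2])
    ll = 0.0
    for s in (1.0, -1.0):                     # effective correlation is +rho or -rho
        mask = (q1 * q2) == s
        if mask.any():
            p = multivariate_normal.cdf(pts[mask], mean=[0.0, 0.0],
                                        cov=[[1.0, s * rho], [s * rho, 1.0]])
            ll += np.sum(np.log(np.clip(p, 1e-12, 1.0)))
    return -ll

# Rough maximum-likelihood fit (derivative-free, so it takes a little while)
fit = minimize(neg_loglik, np.zeros(6), method="Nelder-Mead",
               options={"maxiter": 3000, "xatol": 1e-3, "fatol": 1e-3})
print(np.round(fit.x[:5], 2), round(float(np.tanh(fit.x[5])), 2))
```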

  12. Investigating the relationship between costs and outcomes for English mental health providers: a bi-variate multi-level regression analysis.

    Science.gov (United States)

    Moran, Valerie; Jacobs, Rowena

    2018-06-01

    Provider payment systems for mental health care that incentivize cost control and quality improvement have been a policy focus in a number of countries. In England, a new prospective provider payment system is being introduced to mental health that should encourage providers to control costs and improve outcomes. The aim of this research is to investigate the relationship between costs and outcomes to ascertain whether there is a trade-off between controlling costs and improving outcomes. The main data source is the Mental Health Minimum Data Set (MHMDS) for the years 2011/12 and 2012/13. Costs are calculated using NHS reference cost data while outcomes are measured using the Health of the Nation Outcome Scales (HoNOS). We estimate a bivariate multi-level model with costs and outcomes simultaneously. We calculate the correlation and plot the pairwise relationship between residual costs and outcomes at the provider level. After controlling for a range of demographic, need, social, and treatment variables, residual variation in costs and outcomes remains at the provider level. The correlation between residual costs and outcomes is negative, but very small, suggesting that cost-containment efforts by providers should not undermine outcome-improving efforts under the new payment system.
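
    A joint bivariate multi-level model is not available off the shelf in most general-purpose Python libraries, but the core quantity of interest (the provider-level residual correlation between costs and outcomes after case-mix adjustment) can be approximated in two steps with statsmodels mixed models. The sketch below runs on simulated data with invented variable names and is a simplification, not the estimator used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n_providers, n_per = 40, 50

# Simulated patient-level data: provider random effects are negatively
# correlated between (log) cost and outcome.
u = rng.multivariate_normal([0, 0], [[0.30, -0.05], [-0.05, 0.20]], size=n_providers)
rows = []
for j in range(n_providers):
    age = rng.normal(50, 10, n_per)
    severity = rng.normal(0, 1, n_per)
    rows.append(pd.DataFrame({
        "provider": j, "age": age, "severity": severity,
        "log_cost": 7.0 + 0.01 * age + 0.30 * severity + u[j, 0] + rng.normal(0, 0.5, n_per),
        "outcome": 10.0 - 0.02 * age - 1.00 * severity + u[j, 1] + rng.normal(0, 1.0, n_per),
    }))
df = pd.concat(rows, ignore_index=True)

# Step 1: separate random-intercept models for costs and outcomes
m_cost = smf.mixedlm("log_cost ~ age + severity", df, groups=df["provider"]).fit()
m_out = smf.mixedlm("outcome ~ age + severity", df, groups=df["provider"]).fit()

# Step 2: correlate the provider-level random intercepts from both models
groups = sorted(m_cost.random_effects)
re_cost = np.array([m_cost.random_effects[g].iloc[0] for g in groups])
re_out = np.array([m_out.random_effects[g].iloc[0] for g in groups])
print(round(float(np.corrcoef(re_cost, re_out)[0, 1]), 2))
```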

  13. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with a workload analysis of a largely manual assembly technology for a roller-bearing assembly process, carried out in a large company with integrated bearing manufacturing processes. In this analysis the delay sampling technique was used to identify and classify all bearing assemblers' activities and to determine how much of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without supplementary investments and also indicates that process automation could be the solution to gain maximum productivity.
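
    Delay (work) sampling reduces to estimating, from random spot observations, the share of the 480-minute shift spent on each activity together with a confidence interval for that share. The observation counts and activity names below are invented for illustration.

```python
import numpy as np

SHIFT_MINUTES = 480

def work_sampling_summary(counts, z=1.96):
    """counts: number of spot observations per activity category.
    Returns estimated shares, minutes per shift and 95% CI half-widths."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n
    half_width = z * np.sqrt(p * (1.0 - p) / n)     # binomial proportion CI
    return p, p * SHIFT_MINUTES, half_width * SHIFT_MINUTES

# Hypothetical observation counts for four activity categories
activities = ["assembling", "material handling", "waiting", "personal/other"]
shares, minutes, ci = work_sampling_summary([620, 140, 90, 150])
for name, s, m, c in zip(activities, shares, minutes, ci):
    print(f"{name:18s} {s:6.1%}  {m:5.0f} min  +/- {c:.0f} min")
```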

  14. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  15. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  16. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of the numerical computation. First, we describe the high accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis and the latter is closely related to the limiting beta value which is a very important theoretical issue of the tokamak research. To attain a stable tokamak plasma with good confinement property it is necessary to control or suppress disruptive instabilities. We, next, describe the nonlinear MHD instabilities which relate with the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming and parts of the codes which need a lot of CPU time concentrate on a small portion of the codes, moreover, the codes are usually used by the developers of the codes themselves, which make it comparatively easy to attain a high performance ratio on the vector processor. (author)

  17. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  18. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments allow to measure the missing energy distribution as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are respectively analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include time coincidences and accidentals. Signal-to-noise ratio is dependent on the physics of the experiment and the resolution of the coincidence, therefore it is mandatory to get a beam current distribution as flat as possible. Using new technologies has allowed to monitor in real time the behavior of the beam pulse and determine when the duty cycle can be considered as being good with respect to a numerical basis

  19. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  20. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  1. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions—if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  2. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental and calculated simplified-analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  3. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by the probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility could be derived either based on scaling procedures or on the basis of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
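
    The relation β_E = ∫ [dβ(x)/dx] P(f|x) dx can be evaluated numerically once a hazard curve β(x) and a fragility curve P(f|x) are chosen. The sketch below uses an illustrative power-law hazard and a lognormal fragility with invented parameters; it demonstrates the convolution only, not any site-specific assessment.

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import trapezoid

# Peak ground acceleration grid (in g)
x = np.linspace(0.01, 2.0, 2000)

# Illustrative hazard curve: annual frequency of exceeding PGA level x
beta = 1e-3 * (x / 0.1) ** -2.0

# Illustrative lognormal fragility: median capacity 0.6 g, log-std 0.4
p_fail = lognorm.cdf(x, s=0.4, scale=0.6)

# beta_E = integral of |d beta/dx| * P(f|x) dx (hazard density times fragility)
hazard_density = -np.gradient(beta, x)          # beta decreases with x
beta_E = trapezoid(hazard_density * p_fail, x)
print(f"annual failure frequency ~ {beta_E:.2e}")
```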

  4. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    The website is increasingly the preferred medium for information retrieval, corporate presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...) ... or dead ends when he/she visits the site. Studies of the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only to a limited extent been subject to reflective treatment. That is the background for this chapter, which opens with a review of aesthetics...

  5. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that due to the wide band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to exactly find out the shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed due to the sharing at the boundaries of channels. In the flat region of the profile alternate memory locations are channels with zero counts and channels with the full scale counts. At the boundaries all memory locations will have counts. The shape of this is a direct display of the channel boundaries. (orig.)

  6. Analyse van kwalitatief onderzoeksmateriaal

    NARCIS (Netherlands)

    Wester, F.P.J.

    2004-01-01

    Qualitative research is characterised by its analytical goals: the development of categories, the elaboration of concepts or the formulation of a theory. Because of this analytical openness, the research design shows successive phases, each with its own objective and specific demands for data

  7. Analysing international relations

    DEFF Research Database (Denmark)

    Corry, Olaf

    2014-01-01

    Presented with conflicting evidence and interpretations, how do we ever come to valid conclusions about complex questions of continuity and change in global politics? In any analytical task you need to consider a number of things and this final chapter will take you through some of them, including... matters by depicting reality in new ways. I then show how different theories rely on different ‘pictures’ of what makes up the international system. Section 2 shows how theories differ in terms of their scope, their aims and their purposes. Section 3 explores some of the choices to be made when using theories to ‘explain’ international relations and distinguishes between different kinds of explanation. In Section 4 I look at how different theories have been grouped – first according to their underlying views of what is valid knowledge, and second in terms of different accounts of how history works.

  8. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…

  9. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principle advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the available information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
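
    The Newtonian nudging used for assimilation simply relaxes the model state toward each incoming observation with a chosen gain. A toy single-variable illustration on a synthetic snow water equivalent series follows; the gain, bias and noise levels are invented and bear no relation to the operational NOHRSC configuration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_hours = 240
truth = np.cumsum(rng.normal(0.10, 0.3, n_hours)).clip(min=0)   # "true" SWE (mm)

gain = 0.3                                # nudging coefficient (0 = ignore obs)
obs_times = set(range(0, n_hours, 24))    # one ground observation per day

model = np.zeros(n_hours)
state = 0.0
for t in range(1, n_hours):
    state = max(0.0, state + rng.normal(0.05, 0.3))    # biased model increment
    if t in obs_times:
        obs = truth[t] + rng.normal(0.0, 0.5)          # noisy observation
        state += gain * (obs - state)                  # Newtonian nudging update
    model[t] = state

print("RMSE vs truth:", round(float(np.sqrt(np.mean((model - truth) ** 2))), 2))
```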

  10. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies ... with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power ..., which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below...

  11. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  12. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  13. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  14. Molecular ecological network analyses.

    Science.gov (United States)

    Deng, Ye; Jiang, Yi-Huei; Yang, Yunfeng; He, Zhili; Luo, Feng; Zhou, Jizhong

    2012-05-30

    Understanding the interaction among different species within a community and their responses to environmental changes is a central goal in ecology. However, defining the network structure in a microbial community is very challenging due to their extremely high diversity and as-yet uncultivated status. Although recent advances in metagenomic technologies, such as high-throughput sequencing and functional gene arrays, provide revolutionary tools for analyzing microbial community structure, it is still difficult to examine network interactions in a microbial community based on high-throughput metagenomics data. Here, we describe a novel mathematical and bioinformatics framework to construct ecological association networks named molecular ecological networks (MENs) through Random Matrix Theory (RMT)-based methods. Compared to other network construction methods, this approach is remarkable in that the network is automatically defined and robust to noise, thus providing excellent solutions to several common issues associated with high-throughput metagenomics data. We applied it to determine the network structure of microbial communities subjected to long-term experimental warming based on pyrosequencing data of 16S rRNA genes. We showed that the constructed MENs under both warming and unwarming conditions exhibited topological features of scale free, small world and modularity, which were consistent with previously described molecular ecological networks. Eigengene analysis indicated that the eigengenes represented the module profiles relatively well. In consistency with many other studies, several major environmental traits including temperature and soil pH were found to be important in determining network interactions in the microbial communities examined. To facilitate its application by the scientific community, all these methods and statistical tools have been integrated into a comprehensive Molecular Ecological Network Analysis Pipeline (MENAP), which is open
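
    The distinctive part of the MEN approach is the RMT-based choice of the similarity threshold, which is not reproduced here. The sketch below only illustrates the generic front end, building a correlation-based co-occurrence network from an OTU abundance table with an arbitrarily fixed cut-off; the data and the 0.7 threshold are invented.

```python
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
n_samples, n_otus = 24, 40
abundance = rng.poisson(lam=rng.uniform(1, 20, n_otus), size=(n_samples, n_otus))

# Pairwise Spearman correlations between OTU abundance profiles (columns)
corr, _ = spearmanr(abundance)

# Keep only strong associations; an RMT-based method would derive this
# cut-off from eigenvalue spacing statistics instead of fixing it by hand.
threshold = 0.7
G = nx.Graph()
G.add_nodes_from(range(n_otus))
for i in range(n_otus):
    for j in range(i + 1, n_otus):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(i, j, weight=float(corr[i, j]))

print(G.number_of_nodes(), G.number_of_edges(), round(nx.density(G), 3))
```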

  15. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  16. Multi-level Bayesian analyses for single- and multi-vehicle freeway crashes.

    Science.gov (United States)

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-09-01

    This study presents multi-level analyses for single- and multi-vehicle crashes on a mountainous freeway. Data from a 15-mile mountainous freeway section on I-70 were investigated. Both aggregate and disaggregate models for the two crash conditions were developed. Five years of crash data were used in the aggregate investigation, while the disaggregate models utilized one year of crash data along with real-time traffic and weather data. For the aggregate analyses, safety performance functions were developed for the purpose of revealing the contributing factors for each crash type. Two methodologies, a Bayesian bivariate Poisson-lognormal model and a Bayesian hierarchical Poisson model with correlated random effects, were estimated to simultaneously analyze the two crash conditions with consideration of possible correlations. Except for the factors related to geometric characteristics, two exposure parameters (annual average daily traffic and segment length) were included. Two different sets of significant explanatory and exposure variables were identified for the single-vehicle (SV) and multi-vehicle (MV) crashes. It was found that the Bayesian bivariate Poisson-lognormal model is superior to the Bayesian hierarchical Poisson model, the former with a substantially lower DIC and more significant variables. In addition to the aggregate analyses, microscopic real-time crash risk evaluation models were developed for the two crash conditions. Multi-level Bayesian logistic regression models were estimated with the random parameters accounting for seasonal variations, crash-unit-level diversity and segment-level random effects capturing unobserved heterogeneity caused by the geometric characteristics. The model results indicate that the effects of the selected variables on crash occurrence vary across seasons and crash units; and that geometric characteristic variables contribute to the segment variations: the more unobserved heterogeneity has been accounted for, the better...
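
    A safety performance function of the kind used in the aggregate analysis is, at its simplest, a Poisson regression of crash counts on geometric variables with traffic volume, segment length and study period as exposure. The sketch below fits such a model with statsmodels on simulated data; it is frequentist rather than Bayesian, and the covariates and coefficients are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(21)
n_seg = 120

df = pd.DataFrame({
    "aadt": rng.uniform(10_000, 60_000, n_seg),     # annual average daily traffic
    "length_mi": rng.uniform(0.2, 1.5, n_seg),      # segment length
    "grade_pct": rng.uniform(0.0, 7.0, n_seg),      # longitudinal grade
    "curve": rng.integers(0, 2, n_seg),             # horizontal curve indicator
})

# Simulated single-vehicle crash counts over a five-year period
mu = np.exp(-7.5 + 0.8 * np.log(df["aadt"]) + np.log(df["length_mi"])
            + 0.08 * df["grade_pct"] + 0.3 * df["curve"]) * 5.0
df["sv_crashes"] = rng.poisson(mu)

# Poisson SPF: segment length and study period enter through an offset,
# while the log-AADT exposure coefficient is estimated.
X = sm.add_constant(df[["grade_pct", "curve"]])
X["log_aadt"] = np.log(df["aadt"])
offset = np.log(df["length_mi"] * 5.0)
spf = sm.GLM(df["sv_crashes"], X, family=sm.families.Poisson(), offset=offset).fit()
print(spf.params.round(3))
```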

  17. UV Photography Shows Hidden Sun Damage

    Science.gov (United States)

    UV photography shows hidden sun damage. A UV photograph gives ... developing skin cancer and prematurely aged skin. [Image captions: Normal photography; UV photography; 18 months of age: This boy's ...]

  18. Foodstuff analyses show that seafood and water are major perfluoroalkyl acids (PFAAs) sources to humans in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jin-Ju; Lee, Ji-Woo [Department of Civil and Environmental Engineering, Pusan National University, Busan, 609-735 (Korea, Republic of); Kim, Seung-Kyu [Department of Marine Science, College of Natural Sciences, Incheon National University, Incheon, 406-772 (Korea, Republic of); Oh, Jeong-Eun, E-mail: jeoh@pusan.ac.kr [Department of Civil and Environmental Engineering, Pusan National University, Busan, 609-735 (Korea, Republic of)

    2014-08-30

    Graphical abstract: - Highlights: • 16 PFAAs in 397 samples of 66 food types and 34 tap water samples were analyzed. • Dietary exposure to PFAAs was estimated by using the PFAAs measured concentrations. • The major contributors of PFAAs dietary exposure were confirmed. - Abstract: We measured concentrations of PFAAs in 397 foods, of 66 types, in Korea, and determined the daily human dietary PFAAs intake and the contribution of each foodstuff to that intake. The PFAAs concentration in the 66 different food types ranged from below the detection limit to 48.3 ng/g. Perfluorooctane sulfonate (PFOS) and long-chain perfluorocarboxylic acids (PFCAs) were the dominant PFAAs in fish, shellfish, and processed foods, while perfluorooctanoic acid (PFOA) and short-chain PFCAs dominated dairy foodstuffs and beverages. The Korean adult dietary intake ranges, estimated for a range of scenarios, were 0.60–3.03 and 0.17–1.68 ng kg⁻¹ bw d⁻¹ for PFOS and PFOA, respectively, which were lower than the total daily intake limits suggested by European Food Safety Authority (PFOS: 150 ng kg⁻¹ bw d⁻¹; PFOA: 1500 ng kg⁻¹ bw d⁻¹). The major contributors to PFAAs dietary exposure varied with subject age and PFAAs. For example, fish was a major contributor of PFOS but dairy foods were major contributors of PFOA. However, tap water was a major contributor to PFOA intake when it was the main source of drinking water (rather than bottled water)

  19. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    Science.gov (United States)

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  20. Foodstuff analyses show that seafood and water are major perfluoroalkyl acids (PFAAs) sources to humans in Korea

    International Nuclear Information System (INIS)

    Heo, Jin-Ju; Lee, Ji-Woo; Kim, Seung-Kyu; Oh, Jeong-Eun

    2014-01-01

    Graphical abstract: - Highlights: • 16 PFAAs in 397 samples of 66 food types and 34 tap water samples were analyzed. • Dietary exposure to PFAAs was estimated by using the PFAAs measured concentrations. • The major contributors of PFAAs dietary exposure were confirmed. - Abstract: We measured concentrations of PFAAs in 397 foods, of 66 types, in Korea, and determined the daily human dietary PFAAs intake and the contribution of each foodstuff to that intake. The PFAAs concentration in the 66 different food types ranged from below the detection limit to 48.3 ng/g. Perfluorooctane sulfonate (PFOS) and long-chain perfluorocarboxylic acids (PFCAs) were the dominant PFAAs in fish, shellfish, and processed foods, while perfluorooctanoic acid (PFOA) and short-chain PFCAs dominated dairy foodstuffs and beverages. The Korean adult dietary intake ranges, estimated for a range of scenarios, were 0.60–3.03 and 0.17–1.68 ng kg⁻¹ bw d⁻¹ for PFOS and PFOA, respectively, which were lower than the total daily intake limits suggested by European Food Safety Authority (PFOS: 150 ng kg⁻¹ bw d⁻¹; PFOA: 1500 ng kg⁻¹ bw d⁻¹). The major contributors to PFAAs dietary exposure varied with subject age and PFAAs. For example, fish was a major contributor of PFOS but dairy foods were major contributors of PFOA. However, tap water was a major contributor to PFOA intake when it was the main source of drinking water (rather than bottled water)

  1. Genomic analyses of the Chlamydia trachomatis core genome show an association between chromosomal genome, plasmid type and disease

    NARCIS (Netherlands)

    Versteeg, Bart; Bruisten, Sylvia M.; Pannekoek, Yvonne; Jolley, Keith A.; Maiden, Martin C. J.; van der Ende, Arie; Harrison, Odile B.

    2018-01-01

    Background: Chlamydia trachomatis (Ct) plasmid has been shown to encode genes essential for infection. We evaluated the population structure of Ct using whole-genome sequence data (WGS). In particular, the relationship between the Ct genome, plasmid and disease was investigated. Results: WGS data

  2. Community-level physiological profiling analyses show potential to identify the copiotrophic bacteria present in soil environments

    Czech Academy of Sciences Publication Activity Database

    Lladó, Salvador; Baldrian, Petr

    2017-01-01

    Roč. 12, č. 2 (2017), s. 1-9, č. článku e0171638. E-ISSN 1932-6203 R&D Projects: GA ČR(CZ) GP14-09040P; GA MŠk(CZ) LD15086 Institutional support: RVO:61388971 Keywords: SUBSTRATE UTILIZATION PATTERNS * CARBON-SOURCE UTILIZATION * MICROBIAL COMMUNITIES Subject RIV: EE - Microbiology, Virology OBOR OECD: Microbiology Impact factor: 2.806, year: 2016

  3. Educational Outreach: The Space Science Road Show

    Science.gov (United States)

    Cox, N. L. J.

    2002-01-01

    The poster presented will give an overview of a study towards a "Space Road Show". The topic of this show is space science. The target group is adolescents, aged 12 to 15, at Dutch high schools. The show and its accompanying experiments would be supported with suitable educational material. Science teachers at schools can decide for themselves if they want to use this material in advance, afterwards or not at all. The aims of this outreach effort are: to motivate students for space science and engineering, to help them understand the importance of (space) research, to give them a positive feeling about the possibilities offered by space and in the process give them useful knowledge on space basics. The show revolves around three main themes: applications, science and society. First the students will get some historical background on the importance of space/astronomy to civilization. Secondly they will learn more about novel uses of space. On the one hand they will learn of "Views on Earth" involving technologies like Remote Sensing (or Spying), Communication, Broadcasting, GPS and Telemedicine. On the other hand they will experience "Views on Space" illustrated by past, present and future space research missions, like the space exploration missions (Cassini/Huygens, Mars Express and Rosetta) and the astronomy missions (Soho and XMM). Meanwhile, the students will learn more about the technology of launchers and satellites needed to accomplish these space missions. Throughout the show and especially towards the end attention will be paid to the third theme "Why go to space"? Other reasons for people to get into space will be explored. An important question in this is the commercial (manned) exploration of space. Thus, the questions of benefit of space to society are integrated in the entire show. It raises some fundamental questions about the effects of space travel on our environment, poverty and other moral issues. The show attempts to connect scientific with

  4. 2008 LHC Open Days Physics: the show

    CERN Multimedia

    2008-01-01

    A host of events and activities await visitors to the LHC Open Days on 5 and 6 April. A highlight will be the physics shows funded by the European Physical Society (EPS), which are set to surprise and challenge children and adults alike! [Image captions: School children use their experience of riding a bicycle to understand how planets move around the sun (Copyright: Circus Naturally); Participating in the Circus Naturally show could leave a strange taste in your mouth! (Copyright: Circus Naturally); The Rino Foundation's experiments with liquid nitrogen can be pretty exciting! (Copyright: The Rino Foundation)] What does a bicycle have in common with the solar system? Have you ever tried to weigh air or visualise sound? Ever heard of a vacuum bazooka? If you want to discover the answers to these questions and more then come to the Physics Shows taking place at the CERN O...

  5. Online Italian fandoms of American TV shows

    Directory of Open Access Journals (Sweden)

    Eleonora Benecchi

    2015-06-01

    Full Text Available The Internet has changed media fandom in two main ways: it helps fans connect with each other despite physical distance, leading to the formation of international fan communities; and it helps fans connect with the creators of the TV show, deepening the relationship between TV producers and international fandoms. To assess whether Italian fan communities active online are indeed part of transnational online communities and whether the Internet has actually altered their relationship with the creators of the original text they are devoted to, qualitative analysis and narrative interviews of 26 Italian fans of American TV shows were conducted to explore the fan-producer relationship. Results indicated that the online Italian fans surveyed preferred to stay local, rather than using geography-leveling online tools. Further, the sampled Italian fans' relationships with the show runners were mediated or even absent.

  6. Archaea Signal Recognition Particle Shows the Way

    Directory of Open Access Journals (Sweden)

    Christian Zwieb

    2010-01-01

    Full Text Available Archaea SRP is composed of an SRP RNA molecule and two bound proteins named SRP19 and SRP54. Regulated by the binding and hydrolysis of guanosine triphosphates, the RNA-bound SRP54 protein transiently associates not only with the hydrophobic signal sequence as it emerges from the ribosomal exit tunnel, but also interacts with the membrane-associated SRP receptor (FtsY. Comparative analyses of the archaea genomes and their SRP component sequences, combined with structural and biochemical data, support a prominent role of the SRP RNA in the assembly and function of the archaea SRP. The 5e motif, which in eukaryotes binds a 72 kilodalton protein, is preserved in most archaea SRP RNAs despite the lack of an archaea SRP72 homolog. The primary function of the 5e region may be to serve as a hinge, strategically positioned between the small and large SRP domain, allowing the elongated SRP to bind simultaneously to distant ribosomal sites. SRP19, required in eukaryotes for initiating SRP assembly, appears to play a subordinate role in the archaea SRP or may be defunct. The N-terminal A region and a novel C-terminal R region of the archaea SRP receptor (FtsY are strikingly diverse or absent even among the members of a taxonomic subgroup.

  7. Duchenne muscular dystrophy models show their age

    OpenAIRE

    Chamberlain, Jeffrey S.

    2010-01-01

    The lack of appropriate animal models has hampered efforts to develop therapies for Duchenne muscular dystrophy (DMD). A new mouse model lacking both dystrophin and telomerase (Sacco et al., 2010) closely mimics the pathological progression of human DMD and shows that muscle stem cell activity is a key determinant of disease severity.

  8. Show Them You Really Want the Job

    Science.gov (United States)

    Perlmutter, David D.

    2012-01-01

    Showing that one really "wants" the job entails more than just really wanting the job. An interview is part Broadway casting call, part intellectual dating game, part personality test, and part, well, job interview. When there are 300 applicants for a position, many of them will "fit" the required (and even the preferred) skills listed in the job…

  9. A Talk Show from the Past.

    Science.gov (United States)

    Gallagher, Arlene F.

    1991-01-01

    Describes a two-day activity in which elementary students examine voting rights, the right to assemble, and women's suffrage. Explains the game, "Assemble, Reassemble," and a student-produced talk show with five students playing the roles of leaders of the women's suffrage movement. Profiles Elizabeth Cady Stanton, Lucretia Mott, Susan…

  10. Laser entertainment and light shows in education

    Science.gov (United States)

    Sabaratnam, Andrew T.; Symons, Charles

    2002-05-01

    Laser shows and beam effects have been a source of entertainment since their first public performance on May 9, 1969, at Mills College in Oakland, California. Since 1997, the Photonics Center, NgeeAnn Polytechnic, Singapore, has been using laser shows as a teaching tool. Students are able to exhibit their creative skills and learn at the same time how lasers are used in the entertainment industry. Students acquire a number of skills, including handling a three-phase power supply, operating a cooling system, and aligning lasers. Students also acquire an appreciation of the arts, learning about shapes and contours as they develop graphics for the shows. After holography, laser show animation provides a combination of the arts and technology. This paper aims to briefly describe how a krypton-argon laser, galvanometer scanners, a polychromatic acousto-optic modulator and related electronics are put together to develop a laser projector. The paper also describes how students are trained to make their own laser animation and beam effects with music, and at the same time gain an appreciation of the operation of a Class IV laser and the handling of optical components.

  11. The Last Great American Picture Show

    NARCIS (Netherlands)

    Elsaesser, Thomas; King, Noel; Horwath, Alexander

    2004-01-01

    The Last Great American Picture Show brings together essays by scholars and writers who chart the changing evaluations of the American cinema of the 1970s, sometimes referred to as the decade of the lost generation, but now more and more recognized as the first New Hollywood, without which the

  12. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied: Open Street Map and The Pirate Bay. Results show that ...

  13. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  14. Do men and women show love differently in marriage?

    Science.gov (United States)

    Schoenfeld, Elizabeth A; Bredow, Carrie A; Huston, Ted L

    2012-11-01

    In Western societies, women are considered more adept than men at expressing love in romantic relationships. Although scholars have argued that this view of love gives short shrift to men's ways of showing love (e.g., Cancian, 1986; Noller, 1996), the widely embraced premise that men and women "love differently" has rarely been examined empirically. Using data collected at four time points over 13 years of marriage, the authors examined whether love is associated with different behaviors for husbands and wives. Multilevel analyses revealed that, counter to theoretical expectations, both genders were equally likely to show love through affection. But whereas wives expressed love by enacting fewer negative or antagonistic behaviors, husbands showed love by initiating sex, sharing leisure activities, and doing household work together with their wives. Overall, the findings indicate that men and women show their love in more nuanced ways than cultural stereotypes suggest.

  15. Reality, ficción o show

    Directory of Open Access Journals (Sweden)

    Sandra Ruíz Moreno

    2002-01-01

    Full Text Available To establish a clear and objective point of view on the controversy surrounding the programme "Protagonistas de novela" and the growing proliferation of reality shows in Colombian television schedules, a textual and content analysis of the programme was carried out, attempting to define it in terms of its possibilities as reality, fiction and show. The units of analysis and the study of their treatment revealed content revolving largely around human emotions related to living together, treated as a show and with some textual contributions of fiction, but without fiction's basic mediating element, the actor, which removes any possibility of handling subjects of this kind with the depth, distance and ethics they require. The result is a format that seeks only high ratings and belongs more to so-called "trash" television than to a search for the reality of the individual, much less of society.

  16. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    Science.gov (United States)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping are fundamental components of hazard assessment and are of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as mapping unit. The methods used are based on linear relationships between landslides and 9 considered conditioning factors (altimetry, slope angle, exposition, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The presented work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9,5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. These materials, associated with the steepness of the slopes (38,9% of the area has slope angles higher than 35°, reaching a maximum of 87,5°), make the area very prone to landslide activity. A total of 1.495 shallow landslides were mapped (at 1:5.000 scale) and included in a GIS database. The total affected area is 401.744 m2 (4,5% of the study area). Most slope movements are translational slides frequently evolving into debris-flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1,5 m and the volume is usually smaller than 700 m3. For modelling
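    To make the bi-variate approach above concrete, the sketch below computes the Informative Value of each class of a single conditioning factor from a landslide inventory grid. It is a minimal Python illustration, not the authors' implementation: the rasters, class codes and array sizes are invented for the example.

```python
import numpy as np

# Hypothetical inputs: one categorical conditioning-factor raster (e.g. four
# lithology classes coded 0..3) and a binary landslide-inventory raster of the
# same shape (1 = landslide pixel). Both are simulated here.
rng = np.random.default_rng(0)
factor = rng.integers(0, 4, size=(200, 200))
landslide = (rng.random((200, 200)) < 0.05).astype(int)

def informative_value(factor, landslide):
    """Bivariate Informative Value: for each class c, IV_c = ln(landslide
    density inside class c / overall landslide density)."""
    overall_density = landslide.mean()
    iv = {}
    for c in np.unique(factor):
        mask = factor == c
        class_density = landslide[mask].mean() if mask.any() else 0.0
        iv[c] = np.log(class_density / overall_density) if class_density > 0 else np.nan
    return iv

iv = informative_value(factor, landslide)
print({int(c): round(float(v), 3) for c, v in iv.items()})

# Pixel susceptibility score for this single factor; a real application would
# sum the IV scores of all nine conditioning factors pixel by pixel.
score = np.vectorize(lambda c: iv[c])(factor)
```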

  17. Myopes show increased susceptibility to nearwork aftereffects.

    Science.gov (United States)

    Ciuffreda, K J; Wallis, D M

    1998-09-01

    Some aspects of accommodation may be slightly abnormal (or different) in myopes, compared with accommodation in emmetropes and hyperopes. For example, the initial magnitude of accommodative adaptation in the dark after nearwork is greatest in myopes. However, the critical test is to assess this initial accommodative aftereffect and its subsequent decay in the light under more natural viewing conditions with blur-related visual feedback present, if a possible link between this phenomenon and clinical myopia is to be considered. Subjects consisted of adult late- (n = 11) and early-onset (n = 13) myopes, emmetropes (n = 11), and hyperopes (n = 9). The distance-refractive state was assessed objectively using an autorefractor immediately before and after a 10-minute binocular near task at 20 cm (5 diopters [D]). Group results showed that myopes were most susceptible to the nearwork aftereffect. It averaged 0.35 D in initial magnitude, with considerably faster posttask decay to baseline in the early-onset (35 seconds) versus late-onset (63 seconds) myopes. There was no myopic aftereffect in the remaining two refractive groups. The myopes showed particularly striking accommodatively related nearwork aftereffect susceptibility. As has been speculated and found by many others, transient pseudomyopia may cause or be a precursor to permanent myopia or myopic progression. Time-integrated increased retinal defocus causing axial elongation is proposed as a possible mechanism.

  18. Ancient bacteria show evidence of DNA repair

    DEFF Research Database (Denmark)

    Johnson, Sarah Stewart; Hebsgaard, Martin B; Christensen, Torben R

    2007-01-01

    ...geological timescales. There has been no direct evidence in ancient microbes for the most likely mechanism, active DNA repair, or for the metabolic activity necessary to sustain it. In this paper, we couple PCR and enzymatic treatment of DNA with direct respiration measurements to investigate long-term survival of bacteria sealed in frozen conditions for up to one million years. Our results show evidence of bacterial survival in samples up to half a million years in age, making this the oldest independently authenticated DNA to date obtained from viable cells. Additionally, we find strong evidence that this long-term survival is closely tied to cellular metabolic activity and DNA repair that over time proves to be superior to dormancy as a mechanism in sustaining bacteria viability.

  19. Microbiological and environmental issues in show caves.

    Science.gov (United States)

    Saiz-Jimenez, Cesareo

    2012-07-01

    Cultural tourism expanded in the last half of the twentieth century, and the interest of visitors has come to include caves containing archaeological remains. Some show caves attracted mass tourism, and economic interests prevailed over conservation, which led to a deterioration of the subterranean environment and the rock art. The presence and the role of microorganisms in caves is a topic that is often ignored in cave management. Knowledge of these microorganisms' colonisation patterns, their dispersion mechanisms, and their effects on human health and, when present, on rock art paintings is of the utmost importance. In this review the most recent advances in the study of microorganisms in caves are presented, together with the environmental implications of the findings.

  20. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized.
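    As an illustration of the kind of routine sorption-data reduction alluded to above, the sketch below fits the linearized BET equation to a hypothetical nitrogen-adsorption isotherm to estimate a monolayer capacity and a specific surface area. The data points and the assumed N2 cross-section are illustrative only; the nonclassical analyses developed in the report are not reproduced here.

```python
import numpy as np

# Hypothetical isotherm: relative pressures p/p0 and adsorbed volumes v (cm3/g, STP).
p_rel = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
v = np.array([32.0, 38.5, 43.0, 47.2, 51.0, 55.1])

# Linearised BET equation:
#   (p/p0) / [v (1 - p/p0)] = 1/(vm c) + [(c - 1)/(vm c)] (p/p0)
y = p_rel / (v * (1.0 - p_rel))
slope, intercept = np.polyfit(p_rel, y, 1)

vm = 1.0 / (slope + intercept)     # monolayer capacity, cm3/g STP
c_bet = slope / intercept + 1.0    # BET constant

# Specific surface area, assuming 0.162 nm2 per adsorbed N2 molecule.
s_bet = vm / 22414.0 * 6.022e23 * 0.162e-18   # m2/g
print(f"vm = {vm:.1f} cm3/g, C = {c_bet:.1f}, S_BET = {s_bet:.0f} m2/g")
```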

  1. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  2. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg/s injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg/s. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated

  3. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg/s injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg/s. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated quasi steady

  4. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  5. NASA GIBS Use in Live Planetarium Shows

    Science.gov (United States)

    Emmart, C. B.

    2015-12-01

    The American Museum of Natural History's Hayden Planetarium was rebuilt in 2000 as an immersive theater for scientific data visualization, to show the universe in context with our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe browsing capabilities have been built into a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network among the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of data available, in context with other sources such as Science on a Sphere, NASA Earth Observatory and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues that could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easy-access, multi-scale streaming format like WMS has enabled particularly facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent global interdependency topics.

  6. Geoscience is Important? Show Me Why

    Science.gov (United States)

    Boland, M. A.

    2017-12-01

    "The public" is not homogenous and no single message or form of messaging will connect the entire public with the geosciences. One approach to promoting trust in, and engagement with, the geosciences is to identify specific sectors of the public and then develop interactions and communication products that are immediately relevant to that sector's interests. If the content and delivery are appropriate, this approach empowers people to connect with the geosciences on their own terms and to understand the relevance of the geosciences to their own situation. Federal policy makers are a distinct and influential subgroup of the general public. In preparation for the 2016 presidential election, the American Geosciences Institute (AGI) in collaboration with its 51 member societies prepared Geoscience for America's Critical Needs: Invitation to a National Dialogue, a document that identified major geoscience policy issues that should be addressed in a national policy platform. Following the election, AGI worked with eight other geoscience societies to develop Geoscience Policy Recommendations for the New Administration and the 115th Congress, which outlines specific policy actions to address national issues. State and local decision makers are another important subgroup of the public. AGI has developed online content, factsheets, and case studies with different levels of technical complexity so people can explore societally-relevant geoscience topics at their level of technical proficiency. A related webinar series is attracting a growing worldwide audience from many employment sectors. Partnering with government agencies and other scientific and professional societies has increased the visibility and credibility of these information products with our target audience. Surveys and other feedback show that these products are raising awareness of the geosciences and helping to build reciprocal relationships between geoscientists and decision makers. The core message of all

  7. Bacteriophages show promise as antimicrobial agents.

    Science.gov (United States)

    Alisky, J; Iczkowski, K; Rapoport, A; Troitsky, N

    1998-01-01

    The emergence of antibiotic-resistant bacteria has prompted interest in alternatives to conventional drugs. One possible option is to use bacteriophages (phage) as antimicrobial agents. We have conducted a literature review of all Medline citations from 1966-1996 that dealt with the therapeutic use of phage. There were 27 papers from Poland, the Soviet Union, Britain and the U.S.A. The Polish and Soviets administered phage orally, topically or systemically to treat a wide variety of antibiotic-resistant pathogens in both adults and children. Infections included suppurative wound infections, gastroenteritis, sepsis, osteomyelitis, dermatitis, empyemas and pneumonia; pathogens included Staphylococcus, Streptococcus, Klebsiella, Escherichia, Proteus, Pseudomonas, Shigella and Salmonella spp. Overall, the Polish and Soviets reported success rates of 80-95% for phage therapy, with rare, reversible gastrointestinal or allergic side effects. However, efficacy of phage was determined almost exclusively by qualitative clinical assessment of patients, and details of dosages and clinical criteria were very sketchy. There were also six British reports describing controlled trials of phage in animal models (mice, guinea pigs and livestock), measuring survival rates and other objective criteria. All of the British studies raised phage against specific pathogens then used to create experimental infections. Demonstrable efficacy against Escherichia, Acinetobacter, Pseudomonas and Staphylococcus spp. was noted in these model systems. Two U.S. papers dealt with improving the bioavailability of phage. Phage is sequestered in the spleen and removed from circulation. This can be overcome by serial passage of phage through mice to isolate mutants that resist sequestration. In conclusion, bacteriophages may show promise for treating antibiotic resistant pathogens. To facilitate further progress, directions for future research are discussed and a directory of authors from the reviewed

  8. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that both wrong sampling and sample treatment cannot be corrected anymore. These, in the past frequently neglected facts, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  9. Bivariate Correlation Analysis of the Chemometric Profiles of Chinese Wild Salvia miltiorrhiza Based on UPLC-Qqq-MS and Antioxidant Activities

    Directory of Open Access Journals (Sweden)

    Xiaodan Zhang

    2018-02-01

    Full Text Available To better understand the mechanisms underlying the pharmacological actions of Salvia miltiorrhiza, the correlation between the chemical profiles and in vitro antioxidant activities in 50 batches of wild S. miltiorrhiza samples was analyzed. Our ultra-performance liquid chromatography–tandem mass spectrometry analysis detected twelve phenolic acids and five tanshinones and obtained various chemical profiles from different origins. In a principal component analysis (PCA) and cluster analysis, the tanshinones cryptotanshinone, tanshinone IIA and dihydrotanshinone I exhibited higher weights in PC1, whereas the phenolic acids danshensu, salvianolic acids A and B and lithospermic acid were highly loaded in PC2. All components could be optimized as markers of different locations and might be suitable for S. miltiorrhiza quality analyses. Additionally, the DPPH and ABTS assays used to comprehensively evaluate antioxidant activities indicated large variations, with mean DPPH and ABTS scavenging potencies of 32.24 and 23.39 μg/mL, respectively, among S. miltiorrhiza extract solutions. Notably, samples that exceeded the mean IC50 values had higher phenolic acid contents. A correlation analysis indicated a strong correlation between the antioxidant activities and phenolic acid contents. Caffeic acid, danshensu, rosmarinic acid, lithospermic acid and salvianolic acid B were major contributors to antioxidant activity. In conclusion, phenolic compounds were the predominant antioxidant components in the investigated plant species. These plants may be sources of potent natural antioxidants and beneficial chemopreventive agents.
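    A minimal sketch of the two analysis steps named above, PCA of the chemical profiles and bivariate (Pearson) correlation with an antioxidant readout, is given below. The constituent names, batch count and all numerical values are simulated assumptions, not the published measurements.

```python
import numpy as np

# Simulated data: 50 batches x 4 constituents (mg/g) plus one DPPH IC50 value
# per batch (ug/mL; lower IC50 = stronger antioxidant activity).
rng = np.random.default_rng(1)
names = ["danshensu", "salvianolic_acid_B", "rosmarinic_acid", "tanshinone_IIA"]
X = rng.lognormal(mean=1.0, sigma=0.4, size=(50, 4))
ic50 = 60.0 - 3.0 * X[:, 1] - 2.0 * X[:, 0] + rng.normal(0.0, 2.0, 50)  # toy relation

# PCA via SVD of the column-standardised matrix.
Z = (X - X.mean(0)) / X.std(0, ddof=1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by PC1, PC2:", np.round(explained[:2], 2))
print("PC1 loadings:", dict(zip(names, np.round(Vt[0], 2))))

# Bivariate correlation of each constituent with the antioxidant readout.
for j, name in enumerate(names):
    r = np.corrcoef(X[:, j], ic50)[0, 1]
    print(f"r({name}, DPPH IC50) = {r:+.2f}")
```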

  10. Bivariate analysis of the genetic variability among some accessions of African Yam Bean (Sphenostylis stenocarpa (Hochst ex A. RichHarms

    Directory of Open Access Journals (Sweden)

    Solomon Tayo AKINYOSOYE

    2017-12-01

    Full Text Available Variability is an important factor to consider in crop improvement programmes. This study was conducted over two years to assess genetic variability and determine the relationship between seed yield, its components and tuber production characters among twelve accessions of African yam bean. Data collected were subjected to combined analysis of variance (ANOVA), principal component analysis (PCA), and hierarchical and K-means clustering analyses. Results revealed that the genotype by year (G × Y) interaction had significant effects on some of the variables measured in this study (days to first flowering, days to 50 % flowering, number of pods per plant, pod length, seed yield and tuber yield per plant). The first five principal components (PCs) with eigenvalues greater than 1.0 accounted for about 66.70 % of the total variation, where PC1 and PC2 accounted for 39.48 % of the variation and were associated with seed and tuber yield variables. Three heterotic groups were clearly delineated among the genotypes, with accessions AY03 and AY10 identified for high seed yield and high tuber yield, respectively. Given the non-significant relationship between tuber and seed yield per plant in these accessions, they were recommended for further testing in various agro-ecologies for their suitability, adaptability and possible exploitation of heterosis to further improve the accessions.
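    The clustering step described above can be sketched as follows. The trait matrix, accession labels and the choice of three groups are hypothetical stand-ins, intended only to show how a hierarchical (Ward) solution and a K-means solution can be compared on standardised traits.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

# Simulated trait matrix: 12 accessions x 4 traits (e.g. days to 50% flowering,
# pods per plant, seed yield per plant, tuber yield per plant).
rng = np.random.default_rng(7)
accessions = [f"AY{i:02d}" for i in range(1, 13)]
X = rng.normal(size=(12, 4))
X[[2, 9], :] += 2.0                       # make two accessions stand apart

Z = (X - X.mean(0)) / X.std(0, ddof=1)    # standardise traits before clustering

# Hierarchical clustering with Ward linkage, cut into three groups.
tree = linkage(Z, method="ward")
hier_groups = fcluster(tree, t=3, criterion="maxclust")

# K-means with k = 3 for comparison with the hierarchical solution.
km_groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

for name, h, k in zip(accessions, hier_groups, km_groups):
    print(f"{name}: hierarchical group {h}, k-means group {k}")
```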

  11. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim in order to understand cultural, sociological, design-related, business-related and many other aspects. A sub-area of this is the systemic analysis and description of products and systems. The present compend...

  12. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  13. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  14. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  15. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons and a display for the different characteristics of the electron beam

  16. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 μs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse derandomizer unit using tunnel diodes is described. The scheme of the time analyser is then described showing how the various circuits can be integrated together to form a versatile time analyser. (author)
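    The analyser itself is hardware built from ferrite cores and transistors, but its central bookkeeping, sorting detector pulses into fixed-width time channels after an initial delay, can be mimicked in a few lines of software. The pulse times below are simulated; the channel count, width and delay simply mirror the figures quoted in the abstract.

```python
import numpy as np

# Settings mirroring the description: 30 channels, 10 us channel width,
# initial delay equal to 100 channel widths.
n_channels = 30
channel_width = 10e-6                   # seconds
initial_delay = 100 * channel_width

# Simulated detector pulse arrival times (seconds after the trigger).
rng = np.random.default_rng(3)
pulse_times = rng.exponential(scale=5e-4, size=5000)

# Keep pulses that arrive after the initial delay and inside the 30 channels,
# then count how many fall into each channel.
t = pulse_times - initial_delay
in_window = (t >= 0.0) & (t < n_channels * channel_width)
channel_index = (t[in_window] // channel_width).astype(int)
counts = np.bincount(channel_index, minlength=n_channels)
print(counts)
```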

  17. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    bone mineral density loci: WNT4, GALNT3, MEPE, CPED1/WNT16, TNFSF11, RIN3, and PPP6R3/LRP5. Variants in the TOM1L2/SREBF1 locus exert opposing effects on TB-LM and TBLH-BMD, and have a stronger association with the former trait. We show that SREBF1 is expressed in murine and human osteoblasts, as well...

  18. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences, which have yielded major progress with regard to both the phylogenetic positions of extinct species and the resolution of population genetics questions in both extinct and extant species.

  19. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  20. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  1. Migraineurs without aura show microstructural abnormalities in the cerebellum and frontal lobe.

    Science.gov (United States)

    Granziera, C; Romascano, D; Daducci, A; Roche, A; Vincent, M; Krueger, G; Hadjikhani, N

    2013-12-01

    The involvement of the cerebellum in migraine pathophysiology is not well understood. We used a biparametric approach at high-field MRI (3 T) to assess the structural integrity of the cerebellum in 15 migraineurs with aura (MWA), 23 migraineurs without aura (MWoA), and 20 healthy controls (HC). High-resolution T1 relaxation maps were acquired together with magnetization transfer images in order to probe microstructural and myelin integrity. Clusterwise analysis was performed on T1 and magnetization transfer ratio (MTR) maps of the cerebellum of MWA, MWoA, and HC using an ANOVA and a non-parametric clusterwise permutation F test, with age and gender as covariates and correction for familywise error rate. In addition, mean MTR and T1 in frontal regions known to be highly connected to the cerebellum were computed. Clusterwise comparison among groups showed a cluster of lower MTR in the right Crus I of MWoA patients vs. HC and MWA subjects (p = 0.04). Univariate and bivariate analysis on T1 and MTR contrasts showed that MWoA patients had longer T1 and lower MTR in the right and left pars orbitalis compared to MWA (p < 0.01 and 0.05, respectively), but no differences were found with HC. Lower MTR and longer T1 point at a loss of macromolecules and/or micro-edema in Crus I and pars orbitalis in MWoA patients vs. HC and vs. MWA. The pathophysiological implications of these findings are discussed in light of recent literature.
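    For readers unfamiliar with permutation-based group comparisons of the kind used above, here is a much-simplified sketch: a one-way F test on per-subject mean MTR values in a single region, with a p-value obtained by shuffling group labels. The group means and spreads are invented (only the sample sizes mirror the abstract); the published analysis was clusterwise, covariate-adjusted and familywise-error corrected, which this toy example does not attempt.

```python
import numpy as np
from scipy.stats import f_oneway

# Simulated per-subject mean MTR (per cent) in one cerebellar region for the
# three groups; sample sizes follow the abstract (15 MWA, 23 MWoA, 20 HC).
rng = np.random.default_rng(5)
mwa  = rng.normal(38.0, 1.5, 15)
mwoa = rng.normal(36.8, 1.5, 23)   # simulated to sit slightly lower
hc   = rng.normal(38.1, 1.5, 20)

f_obs, _ = f_oneway(mwa, mwoa, hc)

# Permutation version of the F test: shuffle the pooled values, recompute F,
# and take the fraction of permuted statistics at least as large as observed.
pooled = np.concatenate([mwa, mwoa, hc])
sizes = [len(mwa), len(mwoa), len(hc)]
n_perm = 5000
exceed = 0
for _ in range(n_perm):
    g1, g2, g3 = np.split(rng.permutation(pooled), np.cumsum(sizes)[:-1])
    f_perm, _ = f_oneway(g1, g2, g3)
    exceed += f_perm >= f_obs
print(f"observed F = {f_obs:.2f}, permutation p = {(exceed + 1) / (n_perm + 1):.4f}")
```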

  2. Is bilingualism associated with a lower risk of dementia in community-living older adults? Cross-sectional and prospective analyses.

    Science.gov (United States)

    Yeung, Caleb M; St John, Philip D; Menec, Verena; Tyas, Suzanne L

    2014-01-01

    The aim of this study was to determine whether bilingualism is associated with dementia in cross-sectional or prospective analyses of older adults. In 1991, 1616 community-living older adults were assessed and were followed 5 years later. Measures included age, sex, education, subjective memory loss (SML), and the modified Mini-mental State Examination (3MS). Dementia was determined by clinical examination in those who scored below the cut point on the 3MS. Language status was categorized based upon self-report into 3 groups: English as a first language (monolingual English, bilingual English) and English as a Second Language (ESL). Those in the ESL category had lower education, lower 3MS scores, and more SML, and were more likely to be diagnosed with cognitive impairment, no dementia, at both time 1 and time 2 compared with those speaking English as a first language. There was no association between being bilingual (ESL and bilingual English vs. monolingual) and having dementia at time 1 in bivariate or multivariate analyses. In those who were cognitively intact at time 1, there was no association between being bilingual and having dementia at time 2 in bivariate or multivariate analyses. We did not find any association between speaking >1 language and dementia.
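    The contrast between bivariate (unadjusted) and multivariate (adjusted) analyses reported above can be illustrated with a small logistic-regression sketch. The simulated cohort, variable names and effect sizes are assumptions chosen only to show how adjusting for age and education can change an unadjusted association; they do not reproduce the study data.

```python
import numpy as np
import statsmodels.api as sm

# Simulated cohort loosely shaped like the study: a bilingualism indicator,
# years of education (lower, on average, in the ESL/bilingual group), age,
# and a dementia outcome driven here by age and education only.
rng = np.random.default_rng(11)
n = 1616
bilingual = rng.integers(0, 2, n)
education = rng.normal(10 - 2 * bilingual, 3, n)
age = rng.normal(75, 6, n)
logit_p = -7.0 + 0.08 * age - 0.12 * education
dementia = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

# Bivariate (unadjusted) model: dementia ~ bilingual.
biv = sm.Logit(dementia, sm.add_constant(bilingual)).fit(disp=0)

# Multivariate (adjusted) model: dementia ~ bilingual + age + education.
X = sm.add_constant(np.column_stack([bilingual, age, education]))
multi = sm.Logit(dementia, X).fit(disp=0)

print("unadjusted OR for bilingualism:", round(float(np.exp(biv.params[1])), 2))
print("adjusted OR for bilingualism:  ", round(float(np.exp(multi.params[1])), 2))
```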

  3. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  4. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    In 2001 the activity in the field of safety analyses was focused on verification of the safety analyses reports for NPP V-2 Bohunice and NPP Mochovce concerning the new profiled fuel and probabilistic safety assessment study for NPP Mochovce. The calculation safety analyses were performed and expert reviews for the internal UJD needs were elaborated. An important part of work was performed also in solving of scientific and technical tasks appointed within bilateral projects of co-operation between UJD and its international partnership organisations as well as within international projects ordered and financed by the European Commission. All these activities served as an independent support for UJD in its deterministic and probabilistic safety assessment of nuclear installations. A special attention was paid to a review of probabilistic safety assessment study of level 1 for NPP Mochovce. The probabilistic safety analysis of NPP related to the full power operation was elaborated in the study and a contribution of the technical and operational improvements to the risk decreasing was quantified. A core damage frequency of the reactor was calculated and the dominant initiating events and accident sequences with the major contribution to the risk were determined. The target of the review was to determine the acceptance of the sources of input information, assumptions, models, data, analyses and obtained results, so that the probabilistic model could give a real picture of the NPP. The review of the study was performed in co-operation of UJD with the IAEA (IPSART mission) as well as with other external organisations, which were not involved in the elaboration of the reviewed document and probabilistic model of NPP. The review was made in accordance with the IAEA guidelines and methodical documents of UJD and US NRC. In the field of calculation safety analyses the UJD activity was focused on the analysis of an operational event, analyses of the selected accident scenarios

  5. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...

  6. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)? For some carbohydrates, we

  7. Show Horse Welfare: Horse Show Competitors' Understanding, Awareness, and Perceptions of Equine Welfare.

    Science.gov (United States)

    Voigt, Melissa A; Hiney, Kristina; Richardson, Jennifer C; Waite, Karen; Borron, Abigail; Brady, Colleen M

    2016-01-01

    The purpose of this study was to gain a better understanding of stock-type horse show competitors' understanding of welfare and level of concern for stock-type show horses' welfare. Data were collected through an online questionnaire that included questions relating to (a) interest and general understanding of horse welfare, (b) welfare concerns of the horse show industry and specifically the stock-type horse show industry, (c) decision-making influences, and (d) level of empathic characteristics. The majority of respondents indicated they agree or strongly agree that physical metrics should be a factor when assessing horse welfare, while fewer agreed that behavioral and mental metrics should be a factor. Respondent empathy levels were moderate to high and were positively correlated with the belief that mental and behavioral metrics should be a factor in assessing horse welfare. Respondents indicated the inhumane practices that most often occur at stock-type shows include excessive jerking on reins, excessive spurring, and induced excessive unnatural movement. Additionally, respondents indicated association rules, hired trainers, and hired riding instructors are the most influential regarding the decisions they make related to their horses' care and treatment.

  8. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  9. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, 2 examples are presented of applications of CFD analysis to nuclear problems: Determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  10. Five Kepler target stars that show multiple transiting exoplanet candidates

    Energy Technology Data Exchange (ETDEWEB)

    Steffen, Jason H.; /Fermilab; Batalha, Natalie M.; /San Jose State U.; Borucki, William J.; /NASA, Ames; Buchhave, Lars A.; /Harvard-Smithsonian Ctr. Astrophys. /Bohr Inst.; Caldwell, Douglas A.; /NASA, Ames /SETI Inst., Mtn. View; Cochran, William D.; /Texas U.; Endl, Michael; /Texas U.; Fabrycky, Daniel C.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Ford, Eric B.; /Florida U.; Fortney, Jonathan J.; /UC, Santa Cruz, Phys. Dept. /NASA, Ames

    2010-06-01

    We present and discuss five candidate exoplanetary systems identified with the Kepler spacecraft. These five systems show transits from multiple exoplanet candidates. Should these objects prove to be planetary in nature, then these five systems open new opportunities for the field of exoplanets and provide new insights into the formation and dynamical evolution of planetary systems. We discuss the methods used to identify multiple transiting objects from the Kepler photometry as well as the false-positive rejection methods that have been applied to these data. One system shows transits from three distinct objects while the remaining four systems show transits from two objects. Three systems have planet candidates that are near mean motion commensurabilities - two near 2:1 and one just outside 5:2. We discuss the implications that multitransiting systems have on the distribution of orbital inclinations in planetary systems, and hence their dynamical histories; as well as their likely masses and chemical compositions. A Monte Carlo study indicates that, with additional data, most of these systems should exhibit detectable transit timing variations (TTV) due to gravitational interactions - though none are apparent in these data. We also discuss new challenges that arise in TTV analyses due to the presence of more than two planets in a system.
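    Checking how close a candidate pair sits to a mean-motion commensurability is a short calculation; the sketch below uses invented periods rather than the Kepler measurements and simply reports the fractional offset of each period ratio from the 2:1 and 5:2 commensurabilities mentioned above.

```python
# Hypothetical inner/outer orbital periods in days (not the Kepler values).
pairs = {"system A": (4.90, 9.85), "system B": (7.10, 14.35), "system C": (5.00, 12.72)}

def resonance_offset(p_in, p_out, j, k):
    """Fractional distance of the period ratio from the j:k commensurability."""
    return (p_out / p_in) / (j / k) - 1.0

for name, (p_in, p_out) in pairs.items():
    d21 = resonance_offset(p_in, p_out, 2, 1)
    d52 = resonance_offset(p_in, p_out, 5, 2)
    print(f"{name}: ratio {p_out / p_in:.3f}, offset from 2:1 {d21:+.3f}, from 5:2 {d52:+.3f}")
```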

  11. DMPD: Structural and functional analyses of bacterial lipopolysaccharides. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available Structural and functional analyses of bacterial lipopolysaccharides. PubmedID 12106784. Authors: Carof...

  12. Recurrent and multiple bladder tumors show conserved expression profiles

    International Nuclear Information System (INIS)

    Lindgren, David; Fioretos, Thoas; Månsson, Wiking; Höglund, Mattias; Gudjonsson, Sigurdur; Jee, Kowan Ja; Liedberg, Fredrik; Aits, Sonja; Andersson, Anna; Chebil, Gunilla; Borg, Åke; Knuutila, Sakari

    2008-01-01

    Urothelial carcinomas originate from the epithelial cells of the inner lining of the bladder and may appear as single or as multiple synchronous tumors. Patients with urothelial carcinomas frequently show recurrences after treatment making follow-up necessary. The leading hypothesis explaining the origin of meta- and synchronous tumors assumes a monoclonal origin. However, the genetic relationship among consecutive tumors has been shown to be complex in as much as the genetic evolution does not adhere to the chronological appearance of the metachronous tumors. Consequently, genetically less evolved tumors may appear chronologically later than genetically related but more evolved tumors. Forty-nine meta- or synchronous urothelial tumors from 22 patients were analyzed using expression profiling, conventional CGH, LOH, and mutation analyses. We show by CGH that partial chromosomal losses in the initial tumors may not be present in the recurring tumors, by LOH that different haplotypes may be lost and that detected regions of LOH may be smaller in recurring tumors, and that mutations present in the initial tumor may not be present in the recurring ones. In contrast we show that despite apparent genomic differences, the recurrent and multiple bladder tumors from the same patients display remarkably similar expression profiles. Our findings show that even though the vast majority of the analyzed meta- and synchronous tumors from the same patients are not likely to have originated directly from the preceding tumor they still show remarkably similar expressions profiles. The presented data suggests that an expression profile is established early in tumor development and that this profile is stable and maintained in recurring tumors

  13. Best in show but not best shape: a photographic assessment of show dog body condition.

    Science.gov (United States)

    Such, Z R; German, A J

    2015-08-01

    Previous studies suggest that owners often wrongly perceive overweight dogs to be in normal condition. The body shape of dogs attending shows might influence owners' perceptions, with online images of overweight show winners having a negative effect. This was an observational in silico study of canine body condition. 14 obese-prone breeds and 14 matched non-obese-prone breeds were first selected, and one operator then used an online search engine to identify 40 images, per breed, of dogs that had appeared at a major national UK show (Crufts). After images were anonymised and coded, a second observer subjectively assessed body condition, in a single sitting, using a previously validated method. Of 1120 photographs initially identified, 960 were suitable for assessing body condition, with all unsuitable images being from longhaired breeds. None of the dogs (0 per cent) were underweight, 708 (74 per cent) were in ideal condition and 252 (26 per cent) were overweight. Pugs, basset hounds and Labrador retrievers were most likely to be overweight, while standard poodles, Rhodesian ridgebacks, Hungarian vizslas and Dobermanns were least likely to be overweight. Given the proportion of show dogs from some breeds that are overweight, breed standards should be redefined to be consistent with a dog in optimal body condition. British Veterinary Association.

  14. Bivariate extension of the univariate reliability index for evaluating phenotypic stability

    Directory of Open Access Journals (Sweden)

    Suzankelly Cunha Arruda de Abreu

    2004-10-01

    Full Text Available This work presents the theoretical derivation of a bivariate extension of the methods of Annicchiarico (1992) and Annicchiarico et al. (1995) for studying phenotypic stability. Considering trials with genotypes evaluated in several environments and measurements of two variables, each genotype had its response standardized for each variable k = 1, 2. The standardization was made with respect to the environment mean, as Wijk = (Yijk / Ȳ.jk) × 100, where Wijk is the standardized value of genotype i in environment j for variable k, Yijk is the observed mean of genotype i in environment j for variable k, and Ȳ.jk is the mean of all genotypes in environment j for variable k. From the standardized values, the mean vector and the variance-covariance matrix of each genotype were estimated. The theoretical derivation of the bivariate extension of Annicchiarico's risk index (Ii) was obtained successfully, and a second risk index based on bivariate probabilities (Prb i) was proposed; the two indices showed close agreement in the results obtained for an illustrative example with melon genotypes.
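As a rough illustration of the procedure described above, the sketch below standardizes a small synthetic genotype-by-environment data set against the environment means and then computes, for each genotype, a univariate reliability index and a bivariate probability index under a bivariate normal assumption. The data, the 75% confidence level, and the use of scipy's multivariate normal CDF are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(1)
# Hypothetical data: Y[i, j, k] = genotype i, environment j, variable k (e.g. yield, fruit mass).
n_gen, n_env = 5, 8
Y = rng.normal(loc=[50.0, 2.0], scale=[8.0, 0.4], size=(n_gen, n_env, 2))

# Standardize against the environment mean: W_ijk = 100 * Y_ijk / mean_over_genotypes(Y_.jk).
W = 100.0 * Y / Y.mean(axis=0, keepdims=True)

alpha = 0.25  # risk level, i.e. a 75% confidence level as in Annicchiarico's index
for i in range(n_gen):
    mu = W[i].mean(axis=0)          # mean vector of the standardized responses
    S = np.cov(W[i], rowvar=False)  # 2x2 covariance matrix across environments
    # Univariate reliability index for each variable: mean minus z * standard deviation.
    I_uni = mu - norm.ppf(1 - alpha) * np.sqrt(np.diag(S))
    # Bivariate probability index: P(W1 > 100 and W2 > 100) by inclusion-exclusion.
    F_ab = multivariate_normal.cdf([100.0, 100.0], mean=mu, cov=S)
    F_a = norm.cdf(100.0, loc=mu[0], scale=np.sqrt(S[0, 0]))
    F_b = norm.cdf(100.0, loc=mu[1], scale=np.sqrt(S[1, 1]))
    p_biv = 1.0 - F_a - F_b + F_ab
    print(f"genotype {i}: I = {np.round(I_uni, 1)}, P(both above environment mean) = {p_biv:.2f}")
```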

  15. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R_0 and the nominal value of the potential V(R_0) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  16. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R_0 and the nominal value of the potential V(R_0) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  17. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between two ways of measuring financial performance: the first uses data provided by accounting and lays emphasis on maximizing profit, while the second aims at creating value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach based on value creation is needed. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.

  18. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  19. Signature of Nonstationarity in Precipitation Extremes over Urbanizing Regions in India Identified through a Multivariate Frequency Analyses

    Science.gov (United States)

    Singh, Jitendra; Hari, Vittal; Sharma, Tarul; Karmakar, Subhankar; Ghosh, Subimal

    2016-04-01

    The statistical assumption of stationarity in hydrologic extreme time/event series has been relied upon heavily in frequency analysis. However, owing to the perceivable impacts of climate change, urbanization and the concomitant changes in land use pattern, the assumption of stationarity in hydrologic time series can yield erroneous results, which in turn may affect policy and decision-making. Past studies have provided sufficient evidence of changes in the characteristics of Indian monsoon precipitation extremes, attributed to climate change and urbanization, which shows the need for nonstationary analysis of the Indian monsoon extremes. Therefore, a comprehensive multivariate nonstationary frequency analysis has been conducted for the whole of India to identify the precipitation characteristics (intensity, duration and depth) responsible for significant nonstationarity in the Indian monsoon. We use 1° resolution precipitation data for the period 1901-2004, in a Generalized Additive Model for Location, Scale and Shape (GAMLSS) framework. A cluster of GAMLSS models has been developed by considering nonstationarity in different combinations of distribution parameters through different regression techniques, and the best-fit model is then applied for bivariate analysis. Population density data were used to identify urban, urbanizing and rural regions. The results showed significant differences between the stationary and nonstationary bivariate return periods for the urbanizing grids when compared to urbanized and rural grids. A comprehensive multivariate analysis has also been conducted to identify the precipitation characteristics particularly responsible for imprinting the signature of nonstationarity.
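The core of a nonstationary frequency analysis of this kind is letting one or more distribution parameters depend on a covariate such as time. The paper works in a GAMLSS framework; the sketch below is a much simpler analogue in Python that fits a GEV distribution with a linearly time-varying location by maximum likelihood to synthetic annual maxima and then compares stationary and nonstationary return levels. All data and parameter values are invented for illustration.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(0)
years = np.arange(1901, 2005)
t = (years - years.mean()) / years.std()
# Synthetic annual-maximum daily rainfall (mm) with a weak upward trend in the location.
annmax = genextreme.rvs(c=-0.1, loc=60 + 3 * t, scale=15, random_state=rng)

def nll(params, x, t):
    # Location varies linearly with (scaled) time; scale and shape are held constant.
    mu0, mu1, log_sigma, xi = params
    return -genextreme.logpdf(x, c=-xi, loc=mu0 + mu1 * t, scale=np.exp(log_sigma)).sum()

res = minimize(nll, x0=[annmax.mean(), 0.0, np.log(annmax.std()), 0.1],
               args=(annmax, t), method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x

q = 1 - 1 / 100  # non-exceedance probability of the 100-year event
stationary = genextreme.ppf(q, c=-xi, loc=mu0, scale=np.exp(log_sigma))
nonstationary = genextreme.ppf(q, c=-xi, loc=mu0 + mu1 * t[-1], scale=np.exp(log_sigma))
print(f"location trend: {mu1:.2f} mm per scaled year")
print(f"100-year level: {stationary:.1f} mm (stationary location) vs {nonstationary:.1f} mm (in 2004)")
```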

  20. Herbarium specimens show contrasting phenological responses to Himalayan climate.

    Science.gov (United States)

    Hart, Robbie; Salick, Jan; Ranjitkar, Sailesh; Xu, Jianchu

    2014-07-22

    Responses by flowering plants to climate change are complex and only beginning to be understood. Through analyses of 10,295 herbarium specimens of Himalayan Rhododendron collected by plant hunters and botanists since 1884, we were able to separate these responses into significant components. We found a lack of directional change in mean flowering time over the past 45 y of rapid warming. However, over the full 125 y of collections, mean flowering time shows a significant response to year-to-year changes in temperature, and this response varies with season of warming. Mean flowering advances with annual warming (2.27 d earlier per 1 °C warming), and also is delayed with fall warming (2.54 d later per 1 °C warming). Annual warming may advance flowering through positive effects on overwintering bud formation, whereas fall warming may delay flowering through an impact on chilling requirements. The lack of a directional response suggests that contrasting phenological responses to temperature changes may obscure temperature sensitivity in plants. By drawing on large collections from multiple herbaria, made over more than a century, we show how these data may inform studies even of remote localities, and we highlight the increasing value of these and other natural history collections in understanding long-term change.

  1. [How children show positive and negative relationships on their drawings].

    Science.gov (United States)

    Gramel, Sabine

    2005-01-01

    This study analyses whether pictures in which children show a positive relationship differ significantly, with respect to several criteria, from those showing a negative one. The study involved a random selection of 45 children aged 4;6 to 11;6 years. The children painted a picture of themselves with a person they liked and a picture of themselves with someone they disliked. For the most part, the children drew pictures of themselves with peers, for the positive as well as the negative images. In an interview afterwards, the children specified the criteria in their drawings by which the quality of the particular relationship can be identified. Positive and negative relationship paintings differ in the character of the activity depicted. The sun is not painted more frequently on positive than on negative pictures. The colour black is used more often in drawings signifying negative relationships. While girls used more colour in negative relationship drawings, boys used more colour in positive ones. There was no significant difference in the use of favourite colours and decorative elements between the two groups. Only in negative relationship drawings were people looking away from each other. Smiling individuals were more common in the positive relationship pictures and in pictures painted by the 6 to 8 year olds. A greater distance between the individuals emerged in the negative relationship drawings of the girls.

  2. Tomato Fruits Show Wide Phenomic Diversity but Fruit Developmental Genes Show Low Genomic Diversity.

    Directory of Open Access Journals (Sweden)

    Vijee Mohan

    Full Text Available Domestication of tomato has resulted in large diversity in fruit phenotypes. An intensive phenotyping of 127 tomato accessions from 20 countries revealed extensive morphological diversity in fruit traits. The diversity in fruit traits clustered the accessions into nine classes and identified certain promising lines having desirable traits pertaining to total soluble solids (TSS), carotenoids, ripening index, weight and shape. Factor analysis of the morphometric data from Tomato Analyzer showed that fruit shape is a complex trait shared by several factors. The 100% variance between round and flat fruit shapes was explained by one discriminant function having a canonical correlation of 0.874 in stepwise discriminant analysis. A set of 10 genes (ACS2, COP1, CYC-B, RIN, MSH2, NAC-NOR, PHOT1, PHYA, PHYB and PSY1) involved in various plant developmental processes was screened for SNP polymorphism by EcoTILLING. The genetic diversity in these genes revealed a total of 36 non-synonymous and 18 synonymous changes, leading to the identification of 28 haplotypes. The average frequency of polymorphism across the genes was 0.038/kb. A significantly negative Tajima's D statistic in two of the genes, ACS2 and PHOT1, indicated the presence of rare alleles in low frequency. Our study indicates that while there is low polymorphic diversity in the genes regulating plant development, the population shows wider phenotypic diversity. Nonetheless, the morphological and genetic diversity of the present collection can be further exploited as potential resources in future.
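Tajima's D, mentioned above as the signal of rare alleles, is computed from the number of segregating sites and the mean pairwise difference in a sample of sequences. The sketch below is a generic implementation of that statistic for a 0/1 haplotype matrix; it is illustrative only and is not the pipeline used in the study.

```python
import numpy as np

def tajimas_d(haplotypes):
    """Tajima's D for a 0/1 haplotype matrix (rows = sequences, columns = sites)."""
    h = np.asarray(haplotypes)
    n = h.shape[0]
    counts_all = h.sum(axis=0)
    seg = (counts_all > 0) & (counts_all < n)   # segregating (polymorphic) sites
    S = int(seg.sum())
    if S == 0:
        return 0.0
    counts = counts_all[seg]
    # Mean number of pairwise differences (sum over sites of 2*p*q, sample-size corrected).
    pi = (2.0 * counts * (n - counts) / (n * (n - 1))).sum()
    i = np.arange(1, n)
    a1, a2 = (1.0 / i).sum(), (1.0 / i**2).sum()
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1, e2 = c1 / a1, c2 / (a1**2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

# Toy usage: 10 sequences, 50 sites, random alleles.
rng = np.random.default_rng(2)
print(f"Tajima's D = {tajimas_d(rng.integers(0, 2, size=(10, 50))):.2f}")
```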

  3. A Vehicle for Bivariate Data Analysis

    Science.gov (United States)

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  4. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela; de Carvalho, Miguel

    2016-01-01

    can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts

  5. Stereology of extremes; bivariate models and computation

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Hlubinka, D.

    2003-01-01

    Roč. 5, č. 3 (2003), s. 289-308 ISSN 1387-5841 R&D Projects: GA AV ČR IAA1075201; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z1075907 Keywords : sample extremes * domain of attraction * normalizing constants Subject RIV: BA - General Mathematics

  6. We're Playing "Jeremy Kyle"! Television Talk Shows in the Playground

    Science.gov (United States)

    Marsh, Jackie; Bishop, Julia

    2014-01-01

    This paper focuses on an episode of play in a primary school playground in England, which featured a group of children re-enacting elements of the television talk show "The Jeremy Kyle Show". The episode is analysed in the light of work that has identified the key elements of the talk show genre and the children's play is examined in…

  7. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  8. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography, which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis.

  9. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
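Extracting the oscillation parameters from logged magnetometer samples is essentially a sinusoidal curve fit. The sketch below shows one way to do this with scipy, seeding the frequency from an FFT peak; the sampling rate, signal values, and noise level are made up and do not come from the Sensor Kinetics data or the Eureqa analysis used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical magnetometer samples: a 2 Hz oscillation sampled at 100 Hz with noise.
fs = 100.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(3)
b = 40 + 12 * np.sin(2 * np.pi * 2.0 * t + 0.4) + rng.normal(0, 0.8, t.size)

def harmonic(t, amp, freq, phase, offset):
    return offset + amp * np.sin(2 * np.pi * freq * t + phase)

# Seed the frequency guess from the FFT peak so the fit converges reliably.
spec = np.abs(np.fft.rfft(b - b.mean()))
f0 = np.fft.rfftfreq(t.size, 1 / fs)[spec.argmax()]
popt, _ = curve_fit(harmonic, t, b, p0=[b.std() * np.sqrt(2), f0, 0.0, b.mean()])
print(f"fitted frequency: {popt[1]:.2f} Hz, amplitude: {popt[0]:.2f} uT")
```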

  10. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    The PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs, although the flow through the PSIS stopped temporarily if the break was very small and hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT in limiting rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed that the codes are capable of simulating the overall behaviour of the transients. The detailed analyses of the results showed that some models in the codes still need improvement. In particular, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  11. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties had not previously been studied for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite networks, which have two different types of nodes, we assign different weights to the nodes of the two classes and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  12. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes and also improve overall ethylene plant operations.

  13. Kinematic top analyses at CDF

    Energy Technology Data Exchange (ETDEWEB)

    Grassmann, H.; CDF Collaboration

    1995-03-01

    We present an update of the top quark analysis using kinematic techniques in pp̄ collisions at √s = 1.8 TeV with the Collider Detector at Fermilab (CDF). We reported before on a study which used 19.3 pb⁻¹ of data from the 1992-1993 collider run, but now we use a larger data sample of 67 pb⁻¹. First, we analyze the total transverse energy of the hard collision in W + ≥3 jet events, showing the likely presence of a tt̄ component in the event sample. Next, we compare in more detail the kinematic structure of W + ≥3 jet events with expectations for top pair production and with background processes, predominantly direct W + jet production. We again find W + ≥3 jet events which cannot be explained in terms of background, but show kinematic features as expected from top. These events also show evidence for beauty quarks, in agreement with expectations from top, but not compatible with expectations from backgrounds. The findings confirm the observation of top events made earlier in the data of the 1992-1993 collider run.

  14. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  15. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore well organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be combined without any problems. Software components themselves are well tested, of course, but when they are composed together problems can occur. Most problems are based on interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and it describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and results are shown using a test environment.
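The paper's ASLT-based analyser is not spelled out in this abstract, so the sketch below only illustrates the general idea of checking black-box component compatibility at their interfaces. The component classes, the REQUIRES table, and the use of Python's inspect module are all invented for illustration and are not the framework described by the author.

```python
import inspect

class ProviderComponent:
    """A 'black-box' component offering a small storage interface."""
    def read(self, path: str) -> bytes: ...
    def write(self, path: str, data: bytes) -> None: ...

class ConsumerComponent:
    """A component that expects a provider exposing read(path) and write(path, data)."""
    REQUIRES = {"read": ("path",), "write": ("path", "data")}

def compatibility_problems(provider_cls, required):
    """Report required methods that are missing or whose parameter names do not match."""
    problems = []
    for name, params in required.items():
        fn = getattr(provider_cls, name, None)
        if fn is None:
            problems.append(f"missing method: {name}")
            continue
        have = tuple(p for p in inspect.signature(fn).parameters if p != "self")
        if have[:len(params)] != params:
            problems.append(f"{name}: expected parameters {params}, found {have}")
    return problems

print(compatibility_problems(ProviderComponent, ConsumerComponent.REQUIRES) or "components look compatible")
```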

  16. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
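A response surface of this kind is simply a cheap statistical surrogate fitted to a modest number of code runs and then sampled in place of the code. The sketch below fits an ordinary quadratic surface by least squares to invented "code results"; the input variables, the fake peak cladding temperatures, and the quadratic form are illustrative assumptions and not the OSE algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical inputs of 59 code runs (two normalized parameters) and the computed
# peak cladding temperature; real values would come from the RELAP5 calculations.
X = rng.uniform(-1, 1, size=(59, 2))
pct = 900 + 80 * X[:, 0] - 40 * X[:, 1] + 25 * X[:, 0] ** 2 + rng.normal(0, 5, 59)

def design(X):
    # Quadratic response surface basis: [1, x1, x2, x1^2, x2^2, x1*x2]
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(design(X), pct, rcond=None)

# The surrogate can now stand in for the code in large statistical samplings.
X_new = rng.uniform(-1, 1, size=(10000, 2))
surrogate_pct = design(X_new) @ coef
print(f"95th percentile PCT from the surrogate: {np.percentile(surrogate_pct, 95):.0f} K")
```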

  17. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  18. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.

  19. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/ψ φ and Λ_b^0 → Λ J/ψ, presented in this document, show the experimental precisions that can be achieved in determination of B_s^0 and Λ_b^0 characteristics. (19 refs).

  20. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... vised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  1. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM networks can be used for analysing complex problems in engineering projects.

  2. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  3. International trade shows: Structure, strategy and performance of exhibitors at individual booths vs. joint booths

    DEFF Research Database (Denmark)

    Hansen, Kåre

    2000-01-01

    This paper examines differences between exhibitors who participate at international trade shows at joint booths and those who participate at individual booths. The structure, strategy, and trade show performance of exhibitors at joint booths and those at individual booths are analysed. The analysis of exhibitors at the international food shows SIAL (Paris) and ANUGA (Cologne) showed several significant differences with regard to structure and strategy. However, no significant differences in the performance assessments between the two participation modes were found. The findings have important implications for exhibitors at international trade shows and for export marketing programmes and other marketing programmes offering services to international trade show exhibitors.

  4. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
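The divergence toward extreme conservatism described above can be made concrete with a small Monte Carlo experiment: multiply a handful of lognormally distributed input factors and compare the mean and 95th percentile of the product with the "bounding" value obtained by setting every input to its own 95th percentile. The number of inputs and the factor-of-three uncertainty band below are invented for illustration and are not the paper's fictitious facility scenario.

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(42)
n_inputs, n_samples = 5, 200_000

# Five multiplicative input factors, each lognormal with a 95th/50th percentile ratio of 3.
sigma = np.log(3) / norm.ppf(0.95)
factors = lognorm.rvs(s=sigma, scale=1.0, size=(n_samples, n_inputs), random_state=rng)
consequence = factors.prod(axis=1)  # e.g. dose = source term x release fraction x dispersion x ...

mean = consequence.mean()
p95 = np.percentile(consequence, 95)
bounding = lognorm.ppf(0.95, s=sigma, scale=1.0) ** n_inputs  # every input at its upper bound

print(f"mean = {mean:.1f}, 95th percentile = {p95:.1f}, all-inputs-at-95th bound = {bounding:.1f}")
# The stacked-conservatism bound exceeds the true 95th percentile by more than an order
# of magnitude here, while the mean lands well above the median of the skewed output.
```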

  5. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project "Statistical analyses of extreme food habits", which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the "General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by the emission of radioactive substances from facilities of nuclear technology". Its aim is to show whether the calculation of the radiation ingested by 95% of the population through food intake, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only be estimated roughly. To identify its real extent, it would be necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.) [de]

  6. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless because remediation costs often exceed the value of the homes, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C due to the γ-transition of Fe9S10 [1] and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada, Ottawa, ON, Canada, 1975.

  7. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in the Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ² values surround the ecliptic poles. Thus, our results show good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
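The local test described above boils down to comparing the variance of the data in each sky patch with the distribution of the same statistic over the simulation ensemble. The sketch below shows that comparison in a generic form with invented numbers; the patch count, the simulated variances, and the injected anomaly are all illustrative, and none of it uses the actual Planck products or masks.

```python
import numpy as np

rng = np.random.default_rng(0)
n_patches, n_sims = 192, 300

# Hypothetical per-patch variances of the simulated convergence maps and of the data map.
sim_var = rng.normal(loc=1.0, scale=0.08, size=(n_sims, n_patches))
data_var = rng.normal(loc=1.0, scale=0.08, size=n_patches)
data_var[17] += 0.4  # artificially anomalous patch, for illustration only

mu = sim_var.mean(axis=0)
sd = sim_var.std(axis=0, ddof=1)
chi2 = ((data_var - mu) / sd) ** 2   # per-patch chi-square against the simulation ensemble

outliers = np.flatnonzero(chi2 > 2.0 ** 2)   # departures beyond 2 sigma
print("patches deviating from the simulations by more than 2 sigma:", outliers)
```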

  8. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  9. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  10. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  11. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm), 58–70% of the sample, moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of the vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  12. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  13. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  14. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m² respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW-vanadium activation is 3 orders of magnitude less than the other alloys' FW activation. 2 refs., 7 figs.

  15. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.
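Within-run and between-run imprecision of the kind quoted above are coefficients of variation computed from replicate measurements. The sketch below shows a crude way to compute both from a small replicate table; the numbers are invented, and a formal evaluation protocol would estimate the variance components by ANOVA rather than this shortcut.

```python
import numpy as np

# Hypothetical replicate results for one analyte (e.g. iron, umol/L):
# rows = runs on different days (between-run), columns = replicates within a run.
runs = np.array([
    [18.1, 18.4, 17.9, 18.2],
    [17.6, 17.8, 17.5, 17.9],
    [18.8, 18.6, 18.9, 18.5],
])

within_cv = 100 * (runs.std(axis=1, ddof=1) / runs.mean(axis=1)).mean()
between_cv = 100 * runs.mean(axis=1).std(ddof=1) / runs.mean()
print(f"within-run CV = {within_cv:.2f}%, between-run CV = {between_cv:.2f}%")
```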

  16. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available Abstract Background Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  17. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor, which contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the frame of an IAEA sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non-fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within a ∼5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. A substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX-fuelled hybrid core. (author)

  18. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    steps, good databases of molecular spectroscopic information are vital; the second step furthermore requires a good understanding of how these spectral properties change for each of the physical parameters of interest. Here, we will first present a general strategy using a convenient and powerful computational tool. We will then show two different, specific examples of analyses where some fundamental spectroscopic information is available in (a) database(s), and where that information has been successfully used to analyze observations. Whereas these examples do not directly address the question of automated analyses, they offer some insight into the possibilities and difficulties one can expect to encounter on this quest. In Section 13.2, we first describe the general formalism and method that we will be using, and we offer a few general considerations about what is involved in modeling. In Section 13.3, we illustrate our method for the case of polycyclic aromatic hydrocarbons (PAHs). In Section 13.4, we show how the same approach can be slightly modified to extract physical parameters from infrared observations of red giant stars. We then come back to the question of automated spectral analyses in Section 13.5. The precise shape of molecular bands depends primarily on the temperature of the gas, and infrared observations of molecular bands can thus be used to probe gas at temperatures as low as 10 K and up to a few thousand K. Similarly, molecular band shapes contain some information about the gas densities. The resulting ro-vibrational spectrum is truly unique for each molecular species, just as is the case for atoms. As a telling example, Figure 13.2 illustrates how we can even discriminate between different isotopologues - two species with the same chemical formula and structure, but where one (or more) of the atoms is a different

  19. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high-frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths, focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme of the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of the direct P-waves, the teleseismic P-waves are dominated by the depth phases. In that case, back projections are actually imaging the reflection points of the depth phases rather than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can introduce additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer the earthquake rupture physics.
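
    For readers unfamiliar with the basic idea being tested, the sketch below is a schematic delay-and-stack back projection over a grid of candidate sources, assuming straight rays in a constant-velocity medium; it is not the MUSIC or Compressive Sensing implementation used in this study, and all variable names are illustrative.

        # Schematic delay-and-sum back projection, assuming straight rays in a constant-
        # velocity medium.  This only conveys the stacking idea; it is not the MUSIC or
        # Compressive Sensing implementation evaluated in the synthetic tests.
        import numpy as np

        def back_project(waveforms, station_xy, grid_xy, origin_time_idx, dt, velocity):
            """Stack station waveforms at grid-dependent delays.

            waveforms  : (n_sta, n_samples) array of high-frequency seismograms
            station_xy : (n_sta, 2) station coordinates (km)
            grid_xy    : (n_grid, 2) candidate source locations (km)
            """
            n_sta, n_samp = waveforms.shape
            power = np.zeros(len(grid_xy))
            for g, src in enumerate(grid_xy):
                # predicted travel time from the candidate source to each station
                delays = np.linalg.norm(station_xy - src, axis=1) / velocity
                idx = origin_time_idx + np.round(delays / dt).astype(int)
                valid = idx < n_samp
                stack = waveforms[np.arange(n_sta)[valid], idx[valid]].sum()
                power[g] = abs(stack)
            return power  # the grid node with maximum power is the imaged radiator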

  20. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  1. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  2. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  3. Race, Gender, and Researcher Positionality Analysed Through Memory Work

    DEFF Research Database (Denmark)

    Andreassen, Rikke; Myong, Lene

    2017-01-01

    Drawing upon feminist standpoint theory and memory work, the authors analyse racial privilege by investigating their own racialized and gendered subjectifications as academic researchers. By looking at their own experiences within academia, they show how authority and agency are contingent upon...

  4. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    van de Wal, R.S.W.; Meijer, H.A.J.; van Rooij, M.; van der Veen, C.

    2007-01-01

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ produced 14CO.

  5. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    Van de Wal, R. S. W.; Meijer, H. A. J.; De Rooij, M.; Van der Veen, C.

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ

  6. Payload specialist Reinhard Furrer show evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  7. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), have examined thermal springs in southern Idaho and northern Utah as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (Cogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed, chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through a partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas. These will be taken using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  8. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    Report IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Authors: Andrew Marcotte, Marilyn Joyce (The Joyce...). Contents: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the

  9. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  10. Analyses of genetic relationships between linear type traits, fat-to-protein ratio, milk production traits, and somatic cell count in first-parity Czech Holstein cows

    DEFF Research Database (Denmark)

    Zink, V; Zavadilová, L; Lassen, Jan

    2014-01-01

    . The number of animals for each linear type trait was 59 454, except for locomotion, for which 53 424 animals were recorded. The numbers of animals with records of milk production data were 43 992 for milk yield, fat percentage, protein percentage, and fat-to-protein percentage ratio and 43 978 for fat yield...... and protein yield. In total, 27 098 somatic cell score records were available. The strongest positive genetic correlation between production traits and linear type traits was estimated between udder width and fat yield (0.51 ± 0.04), while the strongest negative correlation estimated was between body......Genetic and phenotypic correlations between production traits, selected linear type traits, and somatic cell score were estimated. The results could be useful for breeding programs involving Czech Holstein dairy cows or other populations. A series of bivariate analyses was applied whereby (co...

  11. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    serum contains numerous endogenous compounds often present in concentrations much greater than those of the analyte. Analyte concentrations are often low and, in the case of drugs, the endogenous compounds are sometimes structurally very similar to the drug to be measured. Binding of drugs to plasma proteins may also occur, which decreases the amount of free compound that is measured. In undertaking the analysis of drugs and metabolites in body fluids the analyst is faced with several problems. The first is due to the complex nature of the body fluid: the drugs must be isolated by an extraction technique, which ideally should provide a relatively clean extract, and the separation system must be capable of resolving the drugs of interest from co-extractives. All of this means that high performance liquid chromatography requires a good choice of detector, a good stationary phase and eluents, and an adequate separation programme. The UV/VIS detector is the most versatile detector used in high performance liquid chromatography, but it is not always ideal, since its lack of specificity means that high resolution of the analyte may be required. UV detection is preferred since it offers excellent linearity, and rapid quantitative analyses can be performed against a single standard of the drug being determined. Diode array and rapid scanning detectors are useful for peak identification and monitoring peak purity, but they are somewhat less sensitive than single wavelength detectors. In liquid chromatography some components may have poor UV chromophores if UV detection is being used, or be completely retained on the liquid chromatography column. Fluorescence and electrochemical detectors are not only considerably more sensitive toward appropriate analytes but also more selective than UV detectors for many compounds. Where applicable, fluorescence detectors are sensitive, stable, selective and easy to operate. This selectivity shows itself in the lack of frontal

  12. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    compounds often present in concentrations much greater than those of the analyte. Analyte concentrations are often low and, in the case of drugs, the endogenous compounds are sometimes structurally very similar to the drug to be measured. Binding of drugs to plasma proteins may also occur, which decreases the amount of free compound that is measured. In undertaking the analysis of drugs and metabolites in body fluids the analyst is faced with several problems. The first is due to the complex nature of the body fluid: the drugs must be isolated by an extraction technique, which ideally should provide a relatively clean extract, and the separation system must be capable of resolving the drugs of interest from co-extractives. All of this means that high performance liquid chromatography requires a good choice of detector, a good stationary phase and eluents, and an adequate separation programme. The UV/VIS detector is the most versatile detector used in high performance liquid chromatography, but it is not always ideal, since its lack of specificity means that high resolution of the analyte may be required. UV detection is preferred since it offers excellent linearity, and rapid quantitative analyses can be performed against a single standard of the drug being determined. Diode array and rapid scanning detectors are useful for peak identification and monitoring peak purity, but they are somewhat less sensitive than single wavelength detectors. In liquid chromatography some components may have poor UV chromophores if UV detection is being used, or be completely retained on the liquid chromatography column. Fluorescence and electrochemical detectors are not only considerably more sensitive toward appropriate analytes but also more selective than UV detectors for many compounds. Where applicable, fluorescence detectors are sensitive, stable, selective and easy to operate. This selectivity shows itself in the lack of frontal components observed in plasma extracts, whereas

  13. X-chromosome SNP analyses in 11 human Mediterranean populations show a high overall genetic homogeneity except in North-west Africans (Moroccans)

    DEFF Research Database (Denmark)

    Tomas Mas, Carmen; Sanchez Sanchez, Juan Jose; Barbaro, Anna

    2008-01-01

    BACKGROUND: Due to its history, with a high number of migration events, the Mediterranean basin represents a challenging area for population genetic studies. A large number of genetic studies have been carried out in the Mediterranean area using different markers but no consensus has been reached...

  14. Dairy shows different associations with abdominal and BMI-defined overweight: cross-sectional analyses exploring a variety of dairy products

    NARCIS (Netherlands)

    Brouwer, E.M.; Sluik, D.; Singh-Povel, C.M.; Feskens, E.J.M.

    2018-01-01

    Background and aims: Previous studies suggest weight-regulatory properties for several dairy nutrients, but population-based studies on dairy and body weight are inconclusive. We explored cross-sectional associations between dairy consumption and indicators of overweight. Methods and results: We

  15. Analyses adjusting for selective crossover show improved overall survival with adjuvant letrozole compared with tamoxifen in the BIG 1-98 study

    DEFF Research Database (Denmark)

    Colleoni, Marco; Giobbie-Hurder, Anita; Regan, Meredith M

    2011-01-01

    Among postmenopausal women with endocrine-responsive breast cancer, the aromatase inhibitor letrozole, when compared with tamoxifen, has been shown to significantly improve disease-free survival (DFS) and time to distant recurrence (TDR). We investigated whether letrozole monotherapy prolonged ov...

  16. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  17. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  18. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  19. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget as a whole has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. The analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.

  20. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  1. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of process steps of the RP, simplified considerations concerning safety, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. An incident analysis of process steps, the evaluation of the SRL-study and safety analyses of the storage and solidification facilities of the RP are performed in particular. (DG) [de

  2. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed as the probabilities of meltdown, radioactive releases, or harmful effects on the environment. Following risk policies for chemical installations, as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication "How to deal with risks", probabilistic risk analyses are required for nuclear power plants

  3. Design and manufacture of TL analyser by using the microcomputer

    International Nuclear Information System (INIS)

    Doh, Sih Hong; Woo, Chong Ho

    1986-01-01

    This paper describes the design of a thermoluminescence analyser using a microcomputer. The TL analyser is designed to perform a three-step heat treatment: pre-read heating, readout, and post-heating (or pre-irradiation) annealing. We used a 12-bit A/D converter to obtain precise measurements and the phase control method to control the heating temperature. Since the Apple II microcomputer used is cheap and popular, it is possible to design an economical system. Experimental results showed successful operation with flexibility. The error of the temperature control was less than ± 0.2% of the expected value. (Author)

  4. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  5. Uudised : Otsman taas Riias show'l. Rokkstaarist ministriks

    Index Scriptorium Estoniae

    2007-01-01

    Drag cabaret artist Erkki Otsman will perform in December in Riga at a Christmas show taking place at "Sapnu Fabrika". Peter Garrett, former singer of the Australian rock band Midnight Oil, has been appointed the government's minister for the environment

  6. Neoliberalism in education: Five images of critical analyses

    Directory of Open Access Journals (Sweden)

    Branislav Pupala

    2011-03-01

    Full Text Available The survey study brings information about the way that educational research copes with neoliberalism as a generalized form of social government in current western culture. It shows that neoliberalism is considered a universal frame for other changes in the basic segments of education, and that theoretical and critical analyses of this phenomenon represent an important part of production in the area of educational research. It emphasizes the contribution of the formation and development of the so-called governmental studies to the comprehension of the mechanisms and consequences of neoliberal government of society, and it shows how the methodology of these studies helps to identify neoliberal strategies used in the regulation of social subjects by education. Five selected segments of critical analysis are elaborated (from the concept of lifelong learning, through preschool and university education, to the education of teachers and the PISA project) that clearly show the ideological and theoretical cohesiveness of analysing education through the lens of neoliberal governmentality.

  7. Time dependent patient no-show predictive modelling development.

    Science.gov (United States)

    Huang, Yu-Li; Hanauer, David A

    2016-05-09

    Purpose - The purpose of this paper is to develop evidence-based predictive no-show models that consider each of a patient's past appointment statuses, a time-dependent component, as an independent predictor to improve predictability. Design/methodology/approach - A ten-year retrospective data set was extracted from a pediatric clinic. It consisted of 7,291 distinct patients who had at least two visits, along with their appointment characteristics, patient demographics, and insurance information. Logistic regression was adopted to develop the no-show models, using two-thirds of the data for training and the remaining data for validation. The no-show threshold was then determined based on minimizing the misclassification of show/no-show assignments. A total of 26 predictive models were developed based on the number of available past appointments. Simulation was employed to test the effectiveness of each model on the costs of patient wait time, physician idle time, and overtime. Findings - The results demonstrated that the misclassification rate and the area under the receiver operating characteristic curve gradually improved as more appointment history was included, until around the 20th predictive model. The overbooking method with no-show predictive models suggested incorporating up to the 16th model and outperformed other overbooking methods by as much as 9.4 per cent in the cost per patient while allowing two additional patients in a clinic day. Research limitations/implications - The challenge now is to actually implement the no-show predictive model systematically to further demonstrate its robustness and simplicity in various scheduling systems. Originality/value - This paper provides examples of how to build no-show predictive models with time-dependent components to improve the overbooking policy. Accurately identifying scheduled patients' show/no-show status allows clinics to proactively schedule patients to reduce the negative impact of patient no-shows.
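
    A minimal sketch of this kind of model is given below: a logistic regression in which recent past appointment statuses enter as predictors, with the show/no-show threshold chosen to minimize misclassification on held-out data. The synthetic data, column names and coefficients are hypothetical stand-ins, not the clinic's actual variables or results.

        # Sketch of a no-show model in the spirit of the paper: logistic regression with
        # past appointment statuses as predictors and a threshold that minimizes
        # misclassification on a validation split.  All data here are synthetic.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        df = pd.DataFrame({
            "lead_time_days": rng.integers(0, 60, n),
            "age": rng.integers(0, 18, n),
            "prev_status_1": rng.integers(0, 2, n),  # most recent past appointment (1 = no-show)
            "prev_status_2": rng.integers(0, 2, n),  # second most recent appointment
        })
        logit = -2.0 + 0.02 * df.lead_time_days + 1.2 * df.prev_status_1 + 0.6 * df.prev_status_2
        df["no_show"] = rng.random(n) < 1 / (1 + np.exp(-logit))

        X_train, X_val, y_train, y_val = train_test_split(
            df.drop(columns="no_show"), df["no_show"], test_size=1 / 3, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        prob = model.predict_proba(X_val)[:, 1]

        # choose the threshold that minimizes misclassification on the validation set
        thresholds = np.linspace(0.05, 0.95, 91)
        errors = [np.mean((prob >= t) != y_val) for t in thresholds]
        best = thresholds[int(np.argmin(errors))]
        print(f"selected no-show threshold: {best:.2f}")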

  8. Entertaining politics, seriously?! : How talk show formats blur conceptual boundaries

    NARCIS (Netherlands)

    Schohaus, Birte

    2017-01-01

    What happens behind the scenes of a talk show? Why do some politicians seem to appear on every show while others are hardly ever seen? Birte Schohaus carried out multi-layered research in which she conducted interviews with journalists, producers, PR advisors and (former) politicians and combined

  9. Effects of TV Crime Shows on Behavioural Development of Children

    Directory of Open Access Journals (Sweden)

    Abdullah Mudassar

    2017-01-01

    Full Text Available Television crime dramas and shows are very popular all over the world. This popularity is not bound to a certain age group; rather, TV viewers of all ages like these shows very much. As in other countries, dozens of TV channels telecast these crime shows in Pakistan. Furthermore, a few of the channels telecast crime shows at prime time, which attests to the popularity of the genre. Some of this media content behaves in morally disputed ways, and crime depictions presented as re-enactments in TV crime shows are questionable in research spanning diverse cultural contexts. A large number of people habitually watch these shows, which may well lead to negative behavioural outcomes. Children, who are in their behavioural developmental phase, are especially susceptible to adopting negative behavioural leanings. In this research effort, an introduction to and details of TV crime shows in Pakistan are provided, the literature concerning "media as a risk factor" in child development is discussed, and inferences from relevant theories are deliberated. It was found that media play a powerful role in shaping children's behaviour and that violent media portrayals (TV crime shows) may raise grave concerns. Previous scientific literature was reviewed to find and discuss the problem at hand. The literature review provides research propositions to explore further dimensions of TV crime shows' effects and possible negative or positive outcomes in children's behaviour.

  10. The Presentation of Science in Everyday Life: The Science Show

    Science.gov (United States)

    Watermeyer, Richard

    2013-01-01

    This paper constitutes a case-study of the "science show" model of public engagement employed by a company of science communicators focused on the popularization of science, technology, engineering and mathematics (STEM) subject disciplines with learner constituencies. It examines the potential of the science show to foster the interest…

  11. "The Daily Show with Jon Stewart": Part 1

    Science.gov (United States)

    Trier, James

    2008-01-01

    Comedy Central's popular program "The Daily Show With Jon Stewart" is the best critical media literacy program on television, and it can be used in valuable ways in the classroom as part of a media literacy pedagogy. This Media Literacy column provides an overview of the show and its accompanying website and considers ways it might be used in the…

  12. The Daily Show with Jon Stewart: Part 2

    Science.gov (United States)

    Trier, James

    2008-01-01

    "The Daily Show With Jon Stewart" is one of the best critical literacy programs on television, and in this Media Literacy column the author suggests ways that teachers can use video clips from the show in their classrooms. (For Part 1, see EJ784683.)

  13. 16 CFR 5.57 - Order to show cause.

    Science.gov (United States)

    2010-01-01

    16 CFR 5.57 - Order to show cause. Federal Trade Commission; Organization, Procedures and Rules of Practice; Standards of Conduct; Disciplinary Actions Concerning Postemployment Conflict of Interest. § 5.57 Order to show cause. (a...

  14. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    Science.gov (United States)

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907
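
    The toolbox itself is MATLAB, but as a rough illustration of one of the estimators it implements, the following is a minimal Python transcription of the percentage-bend correlation (following Wilcox), assuming the usual bending constant beta = 0.2.

        # Minimal Python transcription of the percentage-bend correlation described by
        # Wilcox, assuming bending constant beta = 0.2.  It down-weights marginal
        # outliers before computing the association.
        import numpy as np

        def _pb_scale(x, beta=0.2):
            """Percentage-bend measure of scale (omega) for x."""
            w = np.sort(np.abs(x - np.median(x)))
            return w[int(np.floor((1 - beta) * len(x))) - 1]

        def _pb_location(x, beta=0.2):
            """Percentage-bend measure of location."""
            omega = _pb_scale(x, beta)
            psi = (x - np.median(x)) / omega
            low, high = psi < -1, psi > 1
            trimmed = np.where(low | high, 0.0, x)
            return (trimmed.sum() + omega * (high.sum() - low.sum())) / (len(x) - low.sum() - high.sum())

        def percentage_bend_corr(x, y, beta=0.2):
            """Percentage-bend correlation between x and y."""
            a = np.clip((x - _pb_location(x, beta)) / _pb_scale(x, beta), -1, 1)
            b = np.clip((y - _pb_location(y, beta)) / _pb_scale(y, beta), -1, 1)
            return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))

        rng = np.random.default_rng(0)
        x = rng.normal(size=100)
        y = 0.5 * x + rng.normal(size=100)
        y[0] = 40.0                                   # one gross outlier
        print(round(percentage_bend_corr(x, y), 2))   # stays close to the bulk association

    With clean bivariate normal data this estimate tracks Pearson's r closely; with a single extreme outlier it stays near the association of the bulk of the data, which is the behaviour the paper validates.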

  15. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    Science.gov (United States)

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

    Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab((R)) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.

  16. PWR plant transient analyses using TRAC-PF1

    International Nuclear Information System (INIS)

    Ireland, J.R.; Boyack, B.E.

    1984-01-01

    This paper describes some of the pressurized water reactor (PWR) transient analyses performed at Los Alamos for the US Nuclear Regulatory Commission using the Transient Reactor Analysis Code (TRAC-PF1). Many of the transient analyses performed directly address current PWR safety issues. Included in this paper are examples of two safety issues addressed by TRAC-PF1: pressurized thermal shock (PTS) and feed-and-bleed cooling for Oconee-1. The calculations performed were plant specific in that details of both the primary and secondary sides were modeled, in addition to models of the plant integrated control systems. The results of these analyses show that for these two transients the reactor cores remained covered and cooled at all times, posing no real threat to the reactor system or to the public

  17. Davedan Show Di Amphi Theatre Nusa Dua Bali

    Directory of Open Access Journals (Sweden)

    Ni Made Ruastiti

    2018-05-01

    Full Text Available This article is based on research aimed at understanding the Davedan Show performance at the Amphi Theatre, Nusa Dua, Bali. The research was conducted because of a discrepancy between assumptions and the reality in the field. In general, tourists who come to Bali are only pleased and enthusiastic about watching tourist performing arts based on local arts and culture. The reality here is different: although the Davedan Show is not built from local arts and culture alone, tourists greatly enjoy watching the performance. The questions are: what is the form of the Davedan Show?; why do tourists enjoy watching it?; and what are its implications for the performers, the community, and the tourism industry in Nusa Dua, Bali? The research used qualitative methods, in particular a participatory-implementative approach that prioritizes cooperation between the researcher and the relevant informants. The data sources were the Davedan performance itself, the management, the dancers, the audience, and previously published research. All data, collected through observation, interviews, focus group discussions, and literature study, were analysed critically using postmodern aesthetic theory, practice theory, and the theory of power-knowledge relations. The results show that: (1) the Davedan Show is presented in the form of an oratorium, as can be seen from its mode of presentation, choreography, and musical accompaniment. Presenting the theme Treasure of The Archipelago and opening the gate to a new adventure, the show is accompanied by a continuous medley of recorded ethnic music of the Nusantara, with a performance structure covering the arts and cultures of Bali, Sumatra, Sunda, Solo, Kalimantan, and Papua; (2) the Davedan Show attracts many foreign tourists because its creation is underpinned by market ideology, aesthetic ideology, and the cultural ideology of the Nusantara; (3) to this day the Davedan Show continues to develop sustainably in Nusa Dua

  18. QLab 3 show control projects for live performances & installations

    CERN Document Server

    Hopgood, Jeromy

    2013-01-01

    Used from Broadway to Britain's West End, QLab software is the tool of choice for many of the world's most prominent sound, projection, and integrated media designers. QLab 3 Show Control: Projects for Live Performances & Installations is a project-based book on QLab software covering sound, video, and show control. With information on both sound and video system basics and the more advanced functions of QLab such as MIDI show control, new OSC capabilities, networking, video effects, and microphone integration, each chapter's specific projects will allow you to learn the software's capabilities

  19. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees, and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  20. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is presented. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
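
    As a minimal sketch of the core operation behind such analyses, the function below performs a Boolean line-of-sight test between two cells of a raster surface model by tracking the running horizon along the ray; the extended viewsheds discussed in the article build on this same traversal. The surface values, cell size and observer offset are illustrative, not data from the Opava case study.

        # Minimal Boolean line-of-sight check on a raster surface model.  A target cell
        # is visible if its angle above the observer clears the running local horizon
        # of every intermediate cell along the ray.
        import numpy as np

        def is_visible(dsm, cell_size, observer, target, observer_height=1.6):
            """True if `target` cell is visible from `observer` cell over surface `dsm`."""
            (r0, c0), (r1, c1) = observer, target
            z0 = dsm[r0, c0] + observer_height
            n = max(abs(r1 - r0), abs(c1 - c0))
            if n == 0:
                return True
            max_tangent = -np.inf
            for i in range(1, n + 1):
                r = round(r0 + (r1 - r0) * i / n)
                c = round(c0 + (c1 - c0) * i / n)
                dist = cell_size * np.hypot(r - r0, c - c0)
                tangent = (dsm[r, c] - z0) / dist      # angle of this cell above the observer
                if i == n:
                    return tangent >= max_tangent      # target must clear every earlier cell
                if tangent > max_tangent:
                    max_tangent = tangent              # running local horizon along the ray

        dsm = np.random.default_rng(1).uniform(200, 210, size=(50, 50))  # toy surface (m)
        print(is_visible(dsm, cell_size=0.25, observer=(5, 5), target=(40, 44)))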

  1. Adaptive governance good practice: Show me the evidence!

    Science.gov (United States)

    Sharma-Wallace, Lisa; Velarde, Sandra J; Wreford, Anita

    2018-09-15

    Adaptive governance has emerged in the last decade as an intriguing avenue of theory and practice for the holistic management of complex environmental problems. Research on adaptive governance has flourished since the field's inception, probing the process and mechanisms underpinning the new approach while offering various justifications and prescriptions for empirical use. Nevertheless, recent reviews of adaptive governance reveal some important conceptual and practical gaps in the field, particularly concerning challenges in its application to real-world cases. In this paper, we respond directly to the empirical challenge of adaptive governance, specifically asking: which methods contribute to the implementation of successful adaptive governance process and outcomes in practice and across cases and contexts? We adopt a systematic literature review methodology which considers the current body of empirical literature on adaptive governance of social-ecological systems in order to assess and analyse the methods affecting successful adaptive governance practice across the range of existing cases. We find that methods contributing to adaptive governance in practice resemble the design recommendations outlined in previous adaptive governance scholarship, including meaningful collaboration across actors and scales; effective coordination between stakeholders and levels; building social capital; community empowerment and engagement; capacity development; linking knowledge and decision-making through data collection and monitoring; promoting leadership capacity; and exploiting or creating governance opportunities. However, we critically contextualise these methods by analysing and summarising their patterns-in-use, drawing examples from the cases to explore the specific ways they were successfully or unsuccessfully applied to governance issues on-the-ground. Our results indicate some important underlying shared patterns, trajectories, and lessons learned for evidence

  2. marker development for two novel rice genes showing differential ...

    Indian Academy of Sciences (India)

    2014-08-19

    Aug 19, 2014 ... School of Crop Improvement, College of PostGraduate Studies, Central Agricultural University, ... from the root transcriptome data for tolerance to low P. .... Values show a representative result of three independent experiments ...

  3. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser that can simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated with ion cyclotron waves and neutral beam injection, the analyser was installed on the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the deuteron-to-proton density ratio obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the TiXIV line and from the line intensities of Hα and Dα, respectively. (author)
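
    As a schematic illustration only (not the analyser's actual data-reduction chain): if the ion distribution is Maxwellian, the energy spectrum behaves as f(E) ∝ √E·exp(−E/Ti), so an ion temperature can be read from the slope of ln(f/√E) versus E. The sketch below assumes a synthetic Ti of 0.8 keV over the instrument's 0.4-9 keV range.

        # Schematic illustration only: recovering an ion temperature from the slope of
        # a Maxwellian energy spectrum.  The spectrum below is synthetic.
        import numpy as np

        E = np.linspace(0.4, 9.0, 40)                  # analyser energy range (keV)
        Ti_true = 0.8                                  # assumed ion temperature (keV)
        f = np.sqrt(E) * np.exp(-E / Ti_true)          # synthetic Maxwellian spectrum

        slope, _ = np.polyfit(E, np.log(f / np.sqrt(E)), 1)
        print(f"fitted Ti = {-1.0 / slope:.2f} keV")   # recovers ~0.80 keV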

  4. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques for investigating the structural behavior of the design. The response of a large-scale finite element model was determined for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were employed in these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions made it possible to analyze the entire vessel mathematically while modelling only 1/12 of the vessel geometry, allowing the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  5. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of heat conduction and elastic/inelastic stresses, carried out at the Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using the ANSYS (Engineering Analysis System) program, are summarized. In chapter I, the present state of structural analysis programs available for an FBR (fast breeder reactor) at PNC is explained. Chapter II is a brief description of the current status of ANSYS. Chapter III presents 8 examples of steady-state and transient thermal analyses for fast-reactor plant components, and chapter IV presents 5 examples of inelastic structural analysis. With advances in the finite element method, its applications in design studies should extend progressively in the future. It is hoped that the present report will serve as a reference for similar analyses and at the same time help in understanding the deformation and strain behavior of structures. (Mori, K.)

  6. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program for the inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time the program should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety reporting is a method for testing the whole system, or a part of the safety system, against precise criteria. Periodic safety analyses are not meant for the qualification of plant components. Separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986-1989.

  7. Army Study Shows Decline In Behavioral Health Stigma

    Science.gov (United States)

    2012-01-01

    Army Study Shows Decline in Behavioral Health Stigma. By Rob McIlvaine, Army News Service. WASHINGTON, Jan. 20, 2012 - A newly released Army study on...conference yesterday. The three-year study outlines the problem of suicide in the Army and related issues of substance abuse, spouse abuse and child abuse...

  8. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
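
    A generic sketch of the sort of check described is given below: a Shewhart-style control chart for measurement bias with ±3σ limits derived from in-control baseline data, plus a simple one-sample t-test as an analytic supplement. The limits, test and numbers are standard textbook choices, not necessarily those adopted in the report.

        # Generic Shewhart-style control chart for measurement bias, with a one-sample
        # t-test as a statistical supplement.  All numbers are synthetic.
        import numpy as np
        from scipy import stats

        def control_limits(baseline, k=3.0):
            """Center line and +/- k-sigma limits from in-control baseline data."""
            mu, sigma = baseline.mean(), baseline.std(ddof=1)
            return mu, mu - k * sigma, mu + k * sigma

        rng = np.random.default_rng(42)
        baseline = rng.normal(0.0, 0.02, size=50)      # historical bias measurements
        new_points = rng.normal(0.01, 0.02, size=12)   # most recent control standards

        center, lcl, ucl = control_limits(baseline)
        out_of_control = (new_points < lcl) | (new_points > ucl)
        print(f"points outside control limits: {out_of_control.sum()}")

        # analytic supplement: t-test of the recent points against the baseline mean
        t, p = stats.ttest_1samp(new_points, popmean=center)
        print(f"t = {t:.2f}, p = {p:.3f}")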

  9. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  10. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.
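
    As a minimal illustration of how an oscillation appears in measurements, the sketch below scans a synthetic, PMU-like signal with a Welch periodogram and reports the dominant spectral peak; the sampling rate, oscillation frequency and noise level are invented for the example, and the report itself covers far more sophisticated analysis methods.

        # Minimal illustration of a sustained oscillation showing up as a narrow peak
        # in the Welch periodogram of a measured signal.  The signal is synthetic.
        import numpy as np
        from scipy import signal

        fs = 30.0                                    # PMU-like reporting rate (samples/s)
        t = np.arange(0, 120, 1 / fs)
        measurement = (np.sin(2 * np.pi * 0.7 * t)   # forced oscillation at 0.7 Hz
                       + 0.5 * np.random.default_rng(7).normal(size=t.size))  # ambient noise

        freqs, psd = signal.welch(measurement, fs=fs, nperseg=1024)
        print(f"dominant frequency: {freqs[np.argmax(psd)]:.2f} Hz")   # ~0.70 Hz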

  11. Dolphin shows and interaction programs: benefits for conservation education?

    Science.gov (United States)

    Miller, L J; Zeigler-Hill, V; Mellen, J; Koeppel, J; Greer, T; Kuczaj, S

    2013-01-01

    Dolphin shows and dolphin interaction programs are two types of education programs within zoological institutions used to educate visitors about dolphins and the marine environment. The current study examined the short- and long-term effects of these programs on visitors' conservation-related knowledge, attitude, and behavior. Participants of both dolphin shows and interaction programs demonstrated a significant short-term increase in knowledge, attitudes, and behavioral intentions. Three months following the experience, participants of both dolphin shows and interaction programs retained the knowledge learned during their experience and reported engaging in more conservation-related behaviors. Additionally, the number of dolphin shows attended in the past was a significant predictor of recent conservation-related behavior suggesting that repetition of these types of experiences may be important in inspiring people to conservation action. These results suggest that both dolphin shows and dolphin interaction programs can be an important part of a conservation education program for visitors of zoological facilities. © 2012 Wiley Periodicals, Inc.

  12. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are capable tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as that derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
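
    For orientation, the sketch below builds the basic recurrence matrix for a short scalar series and computes the simplest measure derived from it, the recurrence rate; the measures used in the letter (e.g. determinism) and the Lorenz96 experiments build on this same matrix but are not reproduced here. The series, threshold and noise level are illustrative.

        # Basic construction of a recurrence plot and its simplest derived measure,
        # the recurrence rate, for a short scalar time series.
        import numpy as np

        def recurrence_matrix(x, eps):
            """R[i, j] = 1 where states i and j are closer than the threshold eps."""
            d = np.abs(x[:, None] - x[None, :])     # pairwise distances of scalar states
            return (d <= eps).astype(int)

        rng = np.random.default_rng(3)
        x = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.1 * rng.normal(size=400)

        R = recurrence_matrix(x, eps=0.1)
        recurrence_rate = R.sum() / R.size          # fraction of recurrent point pairs
        print(f"recurrence rate: {recurrence_rate:.3f}")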

  13. Pedagogical Techniques Employed by the Television Show "MythBusters"

    Science.gov (United States)

    Zavrel, Erik

    2016-11-01

    "MythBusters," the long-running though recently discontinued Discovery Channel science entertainment television program, has proven itself to be far more than just a highly rated show. While its focus is on entertainment, the show employs an array of pedagogical techniques to communicate scientific concepts to its audience. These techniques include: achieving active learning, avoiding jargon, employing repetition to ensure comprehension, using captivating demonstrations, cultivating an enthusiastic disposition, and increasing intrinsic motivation to learn. In this content analysis, episodes from the show's 10-year history were examined for these techniques. "MythBusters" represents an untapped source of pedagogical techniques, which science educators may consider availing themselves of in their tireless effort to better reach their students. Physics educators in particular may look to "MythBusters" for inspiration and guidance in how to incorporate these techniques into their own teaching and help their students in the learning process.

  14. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  15. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  16. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because the radio medium is difficult to analyse, since it is a medium that is not visualized in the form of images or supported by printed text. This article aims to describe a new quantitative method for the analysis of radio that takes particular account of the modality of the radio medium - sound structured as a linear progression in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to the analysis not only of radio but also of other media platforms and various journalistic subject areas.

  17. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) was used to examine the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  18. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  19. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, aimed at assessing exposure to electromagnetic fields, modern spectrum analysers' features for correct signal characterisation have been reviewed. (authors)

  20. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  1. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  2. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and outcome measures, are the central questions of this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  3. Exergoeconomic and environmental analyses of CO

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  4. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  5. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  6. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and P. tajacu (four individuals), which were made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  7. Implications of the Goal Theory on air show programs planning

    Directory of Open Access Journals (Sweden)

    Dewald Venter

    2014-01-01

    Full Text Available Events have long played an important role in human society (Shone & Parry, 2010: 3). The toils and efforts of daily life have often been broken up by events of all kinds, as humans seek an escape from the harsh reality of existence and events provide the outlet. Events are classified into four categories according to Shone and Parry (2010: 5), namely leisure (sport, recreation), personal (weddings, birthdays), cultural (art, folklore) and organizational (politics, commercial). Successful events either match or exceed visitor motives and goals. It is critical that data be collected from visitors to determine their motives and goals in order to satisfy them and thereby encourage repeat visits. One such event is the annual air show held at the Zwartkop Air Force Base (AFB) in Pretoria, South Africa. Zwartkop AFB is also home to the South African Air Force (SAAF) museum, which hosts the air show. Much of the museum's funds are generated through hosting the air show and through sponsor contributions. Visitor goal satisfaction should therefore be of critical importance to the program planners. Military hardware has long held a fascination for those who used it and has inspired the imagination of young and old. Such hardware often serves as a remembrance of times past and as a testament to those who perished. For many visiting museums and air shows, curiosity plays a big role. The particular focus of this article is on how the goal theory of leisure travel can be utilized by air show organizers to enhance the visitor experience at an air show.

  8. CERN cars drive by the Geneva Motor Show

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    One of CERN's new gas-fuelled cars was a special guest at the press days of the Geneva motor show this year. The car enjoyed a prominent position on the Gazmobil stand, right next to the latest Maseratis and Ferraris. Journalists previewing the motor show could discover CERN's support for green technologies and also find out more about the lab - home to the fastest racetrack on the planet, with protons in the LHC running at 99.9999991% of the speed of light.

  9. The Biochemistry Show: a new and fun tool for learning

    Directory of Open Access Journals (Sweden)

    A.H Ono

    2006-07-01

    Full Text Available The traditional methods of teaching biochemistry in most universities are based on the memorization of chemical structures, biochemical pathways and reagent names, which is often demotivating for the students. We describe an innovative, interactive and alternative method for teaching biochemistry to medical and nutrition undergraduate students, called the Biochemistry Show (BioBio Show). The BioBio Show is based on the active participation of the students. They are divided into groups, and the groups face each other, one group against another at a time, in a game based on true-or-false questions involving subjects of applied biochemistry (exercise, obesity, diabetes, cholesterol, free radicals, among others). The questions of the Show are elaborated in advance by senior students. The BioBio Show has four phases: the first is a selection exam, and from the second to the fourth phase, eliminatory confrontations take place. In a confrontation, the first group must select a certain number of questions for the opponent to answer. The group that chooses the questions must know how to answer and justify the selected questions. This procedure is repeated in all phases of the show. In the last phase, the questions used are taken from an exam previously performed by the students: either the 9-hour biochemistry exam (Sé et al. A 9-hour biochemistry exam. An iron man competition or a good way of evaluating undergraduate students? SBBq 2005, abstract K-6) or the True-or-False exam (TFE) (Sé et al. Are tutor-students capable of writing good biochemistry exams? SBBq 2004, abstract K-18). The winning group receives an extra 0.5 point on the final grade. Over 70% of the students reported on a questionnaire that the BioBio Show is a valuable tool for learning biochemistry. That is a new way to enrich the discussion of biochemistry in the classroom without the students getting bored. Moreover, learning

  10. Analysing public relations education through international standards: The Portuguese case

    OpenAIRE

    Gonçalves, Gisela Marques Pereira; Spínola, Susana de Carvalho; Padamo, Celma

    2013-01-01

    By using international reports on PR education as a benchmark we analyse the status of PR higher education in Portugal. Despite differences among the study programs, the findings reveal that the standard five-course recommendation by the Commission on Public Relations Education (CPRE) is part of the Portuguese undergraduate curriculum. This includes 12 of the 14 content field guidelines needed to achieve the ideal master's program. The data show, however, the difficulty of positioning public rel...

  11. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
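    The published-versus-grey comparison described above (with published effect sizes roughly one-third larger) can be illustrated with a small random-effects calculation. The sketch below is not taken from the article; it applies the standard DerSimonian-Laird estimator to made-up effect sizes and variances, purely to show how a simple subgroup moderator analysis by publication status might be run.

    ```python
    # Minimal subgroup moderator sketch: pooled effects for published vs. grey
    # studies. Effect sizes and variances are invented, illustrative numbers.
    from math import sqrt

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate via the DerSimonian-Laird method."""
        k = len(effects)
        w = [1.0 / v for v in variances]                      # fixed-effect weights
        y_bar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
        q = sum(wi * (yi - y_bar) ** 2 for wi, yi in zip(w, effects))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
        w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
        se = sqrt(1.0 / sum(w_star))
        return pooled, se, tau2

    published = ([0.45, 0.52, 0.38, 0.60], [0.02, 0.03, 0.025, 0.04])   # hypothetical
    grey      = ([0.30, 0.22, 0.35],       [0.05, 0.04, 0.06])          # hypothetical

    for label, (es, var) in [("published", published), ("grey", grey)]:
        pooled, se, tau2 = dersimonian_laird(es, var)
        print(f"{label:9s}: pooled d = {pooled:.2f} (SE {se:.2f}, tau^2 {tau2:.3f})")
    ```

    Comparing the two pooled estimates (or adding publication status as a covariate in a meta-regression) is one simple way to quantify the inflation that excluding grey literature can introduce.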

  12. Fungal communities in wheat grain show significant co-existence patterns among species

    DEFF Research Database (Denmark)

    Nicolaisen, M.; Justesen, A. F.; Knorr, K.

    2014-01-01

    identified as ‘core’ OTUs as they were found in all or almost all samples and accounted for almost 99 % of all sequences. The remaining OTUs were only sporadically found and only in small amounts. Cluster and factor analyses showed patterns of co-existence among the core species. Cluster analysis grouped...... the 21 core OTUs into three clusters: cluster 1 consisting of saprotrophs, cluster 2 consisting mainly of yeasts and saprotrophs and cluster 3 consisting of wheat pathogens. Principal component extraction showed that the Fusarium graminearum group was inversely related to OTUs of clusters 1 and 2....

  13. An autopsied case of tuberculous meningitis showing interesting CT findings

    International Nuclear Information System (INIS)

    Abiko, Takashi; Higuchi, Hiroshi; Imada, Ryuichi; Nagai, Kenichi

    1983-01-01

    A 61-year-old female patient died of a neurological disorder of unknown origin one month after the first visit and was found to have had tuberculous meningitis at autopsy. CT revealed a low density area showing an enlargement of the cerebral ventricle but did not reveal contrast enhancement in the basal cistern peculiar to tuberculous meningitis. (Namekawa, K.)

  14. Auditory temporal-order thresholds show no gender differences

    NARCIS (Netherlands)

    van Kesteren, Marlieke T. R.; Wiersinga-Post, J. Esther C.

    2007-01-01

    Purpose: Several studies on auditory temporal-order processing showed gender differences. Women needed longer inter-stimulus intervals than men when indicating the temporal order of two clicks presented to the left and right ear. In this study, we examined whether we could reproduce these results in

  15. Auditory temporal-order thresholds show no gender differences

    NARCIS (Netherlands)

    van Kesteren, Marlieke T R; Wiersinga-Post, J Esther C

    2007-01-01

    PURPOSE: Several studies on auditory temporal-order processing showed gender differences. Women needed longer inter-stimulus intervals than men when indicating the temporal order of two clicks presented to the left and right ear. In this study, we examined whether we could reproduce these results in

  16. An Easy Way to Show Memory Color Effects.

    Science.gov (United States)

    Witzel, Christoph

    2016-01-01

    This study proposes and evaluates a simple stimulus display that allows one to measure memory color effects (the effect of object knowledge and memory on color perception). The proposed approach is fast and easy and does not require running an extensive experiment. It shows that memory color effects are robust to minor variations due to a lack of color calibration.

  17. Polypyridyl iron(II) complexes showing remarkable photocytotoxicity ...

    Indian Academy of Sciences (India)

    Garai, Aditya; Basu, Uttara; Pant, Ila; Kondaiah, Paturu; Chakravarty, Akhil R.

    Polypyridyl iron(II) complexes showing remarkable photocytotoxicity in visible light. Department of Inorganic and Physical Chemistry, Indian Institute of Science, Bangalore 560012, India.

  18. Manumycin from a new Streptomyces strain shows antagonistic ...

    African Journals Online (AJOL)

    Manumycin from a new Streptomyces strain shows antagonistic effect against methicillin-resistant Staphylococcus aureus (MRSA)/vancomycin-resistant enterococci (VRE) strains from Korean Hospitals. Yun Hee Choi, Seung Sik Cho, Jaya Ram Simkhada, Chi Nam Seong, Hyo Jeong Lee, Hong Seop Moon, Jin Cheol Yoo ...

  19. Five kepler target stars that show multiple transiting exoplanet candidates

    DEFF Research Database (Denmark)

    Steffen, Jason H.; Batalha, N. M.; Borucki, W. J.

    2010-01-01

    We present and discuss five candidate exoplanetary systems identified with the Kepler spacecraft. These five systems show transits from multiple exoplanet candidates. Should these objects prove to be planetary in nature, then these five systems open new opportunities for the field of exoplanets a...

  20. Your Town Television Show: SMART Program (Part 1) [video

    OpenAIRE

    Naval Postgraduate School, (U.S.); Sanders, John; Millsaps, Knox; Shifflett, Deborah

    2010-01-01

    From "Your Town" television show. SMART Scholarship Program featured on Your Town television program in Monterey, California. Host John Sanders, Special Collections Manager of the Naval Postgraduate School's Dudley Knox Library, interviews Dr. Knox Millsaps, Executive Agent for the SMART Program, and Deborah Shifflett, SMART Program Manager.

  1. Your Town Television Show: SMART Program (Part 3) [video

    OpenAIRE

    Naval Postgraduate School, (U.S.); Sanders, John; Millsaps, Knox; Shifflett, Deborah

    2010-01-01

    From "Your Town" television show. SMART Scholarship Program featured on Your Town television program in Monterey, California. Host John Sanders, Special Collections Manager of the Naval Postgraduate School's Dudley Knox Library, interviews Dr. Knox Millsaps, Executive Agent for the SMART Program, and Deborah Shifflett, SMART Program Manager.

  2. A Progress Evaluation of Four Bilingual Children's Television Shows.

    Science.gov (United States)

    Klein, Stephen P.; And Others

    An evaluation of a bilingual education TV series was conducted involving 6-year-old English speaking, Spanish speaking, and bilingual children at four sites. Children were assigned to control and experimental groups with the latter group seeing four 30 minute shows. A pretest-posttest design was employed with the pretest serving as the covariate…

  3. The neonicotinoid imidacloprid shows high chronic toxicity to mayfly nymphs

    NARCIS (Netherlands)

    Roessink, I.; Merga, L.B.; Zweers, A.J.; Brink, van den P.J.

    2013-01-01

    The present study evaluated the acute and chronic toxicity of imidacloprid to a range of freshwater arthropods. Mayfly and caddisfly species were most sensitive to short-term imidacloprid exposures (10 tests), whereas the mayflies showed by far the most sensitive response to long-term exposure of

  4. An Easy Way to Show Memory Color Effects

    OpenAIRE

    Witzel, Christoph

    2016-01-01

    This study proposes and evaluates a simple stimulus display that allows one to measure memory color effects (the effect of object knowledge and memory on color perception). The proposed approach is fast and easy and does not require running an extensive experiment. It shows that memory color effects are robust to minor variations due to a lack of color calibration.

  5. 36 CFR 14.24 - Showing as to citizenship required.

    Science.gov (United States)

    2010-07-01

    ... INTERIOR RIGHTS-OF-WAY Procedures § 14.24 Showing as to citizenship required. (a) Individuals. An individual applicant applying for a right-of-way under any right-of-way act, except the Act of March 3, 1891... applicant resided in the United States thereafter while a minor, should be furnished. Where the husband and...

  6. Mice lacking neuropeptide Y show increased sensitivity to cocaine

    DEFF Research Database (Denmark)

    Sørensen, Gunnar; Woldbye, David Paul Drucker

    2012-01-01

    There is increasing data implicating neuropeptide Y (NPY) in the neurobiology of addiction. This study explored the possible role of NPY in cocaine-induced behavior using NPY knockout mice. The transgenic mice showed a hypersensitive response to cocaine in three animal models of cocaine addiction...

  7. Television Judge Shows: Nordic and U.S. Perspectives

    DEFF Research Database (Denmark)

    Porsdam, Helle

    2017-01-01

    Legal discourse is language that people use in a globalizing and multicultural society to negotiate acceptable behaviors and values. We see this played out in popular cultural forums such as judicial television dramas. In the American context, television judge shows are virtually synonymous...

  8. Mixed cultures of Kimchi lactic acid bacteria show increased cell ...

    African Journals Online (AJOL)

    anaerobic organisms that are highly resistant to salts. Probiotic cultures for use in ... kimchi have a superior ability to decompose and utilize nutrients, and show ...

  9. Teaching Job Interviewing Skills with the Help of Television Shows

    Science.gov (United States)

    Bloch, Janel

    2011-01-01

    Because of its potential for humor and drama, job interviewing is frequently portrayed on television. This article discusses how scenes from popular television series such as "Everybody Loves Raymond," "Friends," and "The Mary Tyler Moore Show" can be used to teach effective job interview skills in business communication courses. Television…

  10. Airline Overbooking Problem with Uncertain No-Shows

    Directory of Open Access Journals (Sweden)

    Chunxiao Zhang

    2014-01-01

    Full Text Available This paper considers an airline overbooking problem for a new single-leg flight with a discount fare. Because historical no-show data are absent for a new flight, and because various uncertain human behaviors or unexpected events cause some passengers to miss their aircraft, the probability distribution of no-shows cannot be obtained. In this case, the airline has to invite domain experts to provide belief degrees of no-shows in order to estimate their distribution. However, people often overestimate unlikely events, which makes the variance of the belief degree much greater than that of the observed frequency; if the belief degree is still treated as a subjective probability, the derived results deviate from expectations. To deal with this uncertainty, the number of no-shows on the new flight is modeled as an uncertain variable. Given a chance constraint on social reputation, an overbooking model with discount fares is developed to maximize the profit rate based on uncertain programming theory. Finally, an analytic expression for the optimal booking limit is obtained and illustrated through a numerical example, and a sensitivity analysis indicates that the optimal booking limit is significantly affected by flight capacity, discount, confidence level, and the parameters of the uncertainty distribution.
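    As a rough illustration of the booking-limit search described above, the sketch below replaces the paper's uncertainty-theoretic model with a simple binomial no-show model and a chance constraint on denied boardings; the capacity, fare, bump cost and no-show rate are all assumed values, not taken from the article.

    ```python
    # Simplified overbooking sketch (illustrative only, not the paper's model):
    # maximise expected profit over booking limits subject to a chance
    # constraint that the probability of any denied boarding stays below ALPHA.
    from math import comb

    CAPACITY = 180        # seats on the flight (assumed)
    FARE = 0.8            # discounted fare, normalised so the full fare is 1.0
    BUMP_COST = 2.0       # cost per denied boarding (assumed)
    P_NOSHOW = 0.08       # assumed no-show probability per booked passenger
    ALPHA = 0.05          # allowed probability of any denied boarding

    def show_pmf(s, booked):
        """Probability that exactly s of the booked passengers show up."""
        p_show = 1 - P_NOSHOW
        return comb(booked, s) * p_show**s * (1 - p_show)**(booked - s)

    def prob_overflow(booked):
        """Probability that more passengers show up than there are seats."""
        return sum(show_pmf(s, booked) for s in range(CAPACITY + 1, booked + 1))

    def expected_profit(booked):
        """Ticket revenue minus the expected cost of bumping passengers."""
        exp_bumped = sum((s - CAPACITY) * show_pmf(s, booked)
                         for s in range(CAPACITY + 1, booked + 1))
        return FARE * booked - BUMP_COST * exp_bumped

    feasible = [b for b in range(CAPACITY, CAPACITY + 40) if prob_overflow(b) <= ALPHA]
    best = max(feasible, key=expected_profit)
    print("booking limit:", best, "expected profit:", round(expected_profit(best), 2))
    ```

    In the paper's setting the binomial distribution would be replaced by an expert-elicited uncertainty distribution, but the structure of the decision (a constrained search over booking limits) is the same.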

  11. Triphala, a formulation of traditional Ayurvedic medicine, shows

    Indian Academy of Sciences (India)

    Triphala, a formulation of traditional Ayurvedic medicine, shows a protective effect against X-radiation in HeLa cells ... with the cells cultured in vitro. This simple bioassay system with human cultured cells would facilitate the understanding of the molecular basis for the beneficial effects of Triphala.

  12. Bilinguals Show Weaker Lexical Access during Spoken Sentence Comprehension

    Science.gov (United States)

    Shook, Anthony; Goldrick, Matthew; Engstler, Caroline; Marian, Viorica

    2015-01-01

    When bilinguals process written language, they show delays in accessing lexical items relative to monolinguals. The present study investigated whether this effect extended to spoken language comprehension, examining the processing of sentences with either low or high semantic constraint in both first and second languages. English-German…

  13. Soil bacteria show different tolerance ranges to an unprecedented disturbance

    DEFF Research Database (Denmark)

    Nunes, Ines Marques; Jurburg, Stephanie; Jacquiod, Samuel Jehan Auguste

    2018-01-01

    stress doses. FRG1, the most sensitive group, was dominated by Actinobacteria. FRG2 and FRG3, with intermediate tolerance, displayed prevalence of Proteobacteria, while FRG4, the most resistant group, was driven by Firmicutes. While the most sensitive FRGs showed predictable responses linked to changes...

  14. Genoa Boat Show – Good Example of Event Management

    Directory of Open Access Journals (Sweden)

    Dunja Demirović

    2012-07-01

    Full Text Available The International Boat Show, a business and tourist event, has been held annually in the Italian city of Genoa since 1962. The fair is one of the oldest, largest and best known in the boating industry worldwide, primarily due to good management of the event, and it can serve as a case study for domestic fair organizers seeking to improve the quality of their business and services. Since Belgrade is a city of fairs, but compared to Genoa still underdeveloped in terms of trade shows, the following tasks arose naturally in this study: to examine how the organizers of the Genoa Boat Show handle the preparation of the fair offer, the selection of and communication with specific target groups (especially visitors), services during the fair, and the functioning of the city during the fair. During the research the authors mostly used the historical method, comparison, synthesis and the interview method. The results of this theoretical research may help not only managers of fairs and exhibitions but also organizers of other events in our country.

  15. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  16. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and how the information from these analyses can predict the process in practice. (mk)

  17. New Inspiring Planetarium Show Introduces ALMA to the Public

    Science.gov (United States)

    2009-03-01

    As part of a wide range of education and public outreach activities for the International Year of Astronomy 2009 (IYA2009), ESO, together with the Association of French Language Planetariums (APLF), has produced a 30-minute planetarium show, In Search of our Cosmic Origins. It is centred on the global ground-based astronomical Atacama Large Millimeter/submillimeter Array (ALMA) project and represents a unique chance for planetariums to be associated with the IYA2009. [Accompanying material: ESO PR Photos 09a-d/09 (logo of the ALMA planetarium show; Galileo's first observations with a telescope; the ALMA Observatory; the Milky Way band) and ESO PR Video 09a/09 (trailer in English).] ALMA is the leading telescope for observing the cool Universe -- the relic radiation of the Big Bang, and the molecular gas and dust that constitute the building blocks of stars, planetary systems, galaxies and life itself. It is currently being built in the extremely arid environment of the Chajnantor plateau, at 5000 metres altitude in the Chilean Andes, and will start scientific observations around 2011. ALMA, the largest current astronomical project, is a revolutionary telescope, comprising a state-of-the-art array of 66 giant 12-metre and 7-metre diameter antennas observing at millimetre and submillimetre wavelengths. In Search of our Cosmic Origins highlights the unprecedented window on the Universe that this facility will open for astronomers. "The show gives viewers a fascinating tour of the highest observatory on Earth, and takes them from there out into our Milky Way, and beyond," says Douglas Pierce-Price, the ALMA Public Information Officer at ESO. Edited by world fulldome experts Mirage3D, the emphasis of the new planetarium show is on the incomparable scientific adventure of the ALMA project. A young female astronomer guides the audience through a story that includes unique animations and footage, leading the viewer from the first observations by Galileo

  18. HUBBLE VISION: A Planetarium Show About Hubble Space Telescope

    Science.gov (United States)

    Petersen, Carolyn Collins

    1995-05-01

    In 1991, a planetarium show called "Hubble: Report From Orbit" outlining the current achievements of the Hubble Space Telescope was produced by the independent planetarium production company Loch Ness Productions, for distribution to facilities around the world. The program was subsequently converted to video. In 1994, that program was updated and re-produced under the name "Hubble Vision" and offered to the planetarium community. It is periodically updated and remains a sought-after and valuable resource within the community. This paper describes the production of the program, and the role of the astronomical community in the show's production (and subsequent updates). The paper is accompanied by a video presentation of Hubble Vision.

  19. Political Show-Technology in the Post-Soviet Space

    Directory of Open Access Journals (Sweden)

    O. E. Grishin

    2015-01-01

    Full Text Available In the modern political process in Russia, show technologies are actively used. With their help, political actors can influence public opinion and shape public interest in certain issues. In Russia these technologies are relevant and especially well developed. The urgency of the problem is due to a new round of the information war on the territory of Ukraine. The inclusion in the information space, and the discussion of the pressing issues of modern Ukraine, by media figures such as C. Shuster and V. Solovyov suggests that the political show combines entertainment, topicality and ambiguity in the conclusions it offers. At the same time it becomes part of the information war and of the everyday political reality of the state.

  20. Migraine patients consistently show abnormal vestibular bedside tests

    Directory of Open Access Journals (Sweden)

    Eliana Teixeira Maranhão

    2015-01-01

    Full Text Available Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Method Cross-sectional study including sixty individuals – thirty migraineurs, 25 women, 19-60 y-o; and 30 gender/age healthy paired controls. Results Migraineurs showed a tendency to perform worse in almost all tests, albeit only the Romberg tandem test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.

  1. Migraine patients consistently show abnormal vestibular bedside tests.

    Science.gov (United States)

    Maranhão, Eliana Teixeira; Maranhão-Filho, Péricles; Luiz, Ronir Raggio; Vincent, Maurice Borges

    2016-01-01

    Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Cross-sectional study including sixty individuals - thirty migraineurs, 25 women, 19-60 y-o; and 30 gender/age healthy paired controls. Migraineurs showed a tendency to perform worse in almost all tests, albeit only the Romberg tandem test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.
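    For readers unfamiliar with how such a combined-test rule is scored, the sketch below shows the basic sensitivity/specificity arithmetic for a hypothetical ">= 4 abnormal bedside tests" criterion; the per-subject counts are invented and are not the study's data.

    ```python
    # Illustrative only: scoring a ">= 4 abnormal bedside tests" rule against
    # case/control status. The counts below are invented, not study data.
    migraineurs = [5, 4, 6, 2, 4, 7, 3, 5, 4, 1]   # abnormal tests per patient (hypothetical)
    controls    = [1, 0, 2, 3, 1, 0, 4, 2, 1, 0]   # abnormal tests per control (hypothetical)

    THRESHOLD = 4
    tp = sum(x >= THRESHOLD for x in migraineurs)   # migraineurs flagged by the rule
    fn = len(migraineurs) - tp
    fp = sum(x >= THRESHOLD for x in controls)      # controls wrongly flagged
    tn = len(controls) - fp

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
    ```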

  2. CT findings of rectosigmoid carcinoma showing exophytic growth

    International Nuclear Information System (INIS)

    Ohgi, Kazuyuki; Kohno, Atsushi; Higuchi, Mutsumi

    1987-01-01

    CT findings of 7 rectosigmoid carcinomas showing exophytic growth were evaluated. All cases had bulky masses, ranging from 6.0 to 11.5 cm in maximum diameter. All masses were difficult to differentiate from other pelvic masses on CT, presumably due to their exophytic growth and/or invasion of the surrounding organs. However, 3 of the 7 cases showed diffuse rectosigmoidal wall thickening adjacent to the primary tumor, which is considered valuable in determining the primary site. All female cases had gynecological symptoms such as genital bleeding, due to uterine and/or vaginal invasion. When an indeterminate pelvic mass is revealed by CT, rectosigmoid carcinoma should be included in the differential diagnosis. (author)

  3. Preschoolers show less trust in physically disabled or obese informants

    Directory of Open Access Journals (Sweden)

    Sara eJaffer

    2015-01-01

    Full Text Available This research examined whether preschool-aged children show less trust in physically disabled or obese informants. In Study 1, when learning about novel physical activities and facts, 4- and 5-year-olds preferred to endorse the testimony of a physically abled, non-obese informant rather than a physically disabled or obese one. In Study 2, after seeing that the physically disabled or obese informant was previously reliable whereas the physically abled, non-obese one was unreliable, 4- and 5-year-olds did not show a significant preference for either informant. We conclude that in line with the literature on children’s negative stereotypes of physically disabled or obese others, preschoolers are biased against these individuals as potential sources of new knowledge. This bias is robust in that past reliability might undermine its effect on children, but cannot reverse it.

  4. El reality show a la hora de la merienda

    Directory of Open Access Journals (Sweden)

    Lic. Rosa María Ganga Ganga

    2000-01-01

    Full Text Available Testimony programmes, which fall within the reality-show television genre, are a variant of the broader talk-show subgenre and already have a certain tradition in our country. The present work focuses on this type of testimony programme, whose discursive strategy is based on the presentation and representation of the autobiographical account of anonymous men and women, thereby joining the most recent currents in sociology and historiography. It seeks to clarify some of their characteristics and functions, especially their socialising function, through the biographical mechanism and the concept of habitus borrowed from Pierre Bourdieu.

  5. High-frequency parameters of magnetic films showing magnetization dispersion

    International Nuclear Information System (INIS)

    Sidorenkov, V.V.; Zimin, A.B.; Kornev, Yu.V.

    1988-01-01

    Magnetization dispersion leads to skewed resonance curves shifted towards higher magnetizing fields, together with considerable reduction in the resonant absorption, while the FMR line width is considerably increased. These effects increase considerably with frequency, in contrast to films showing magnetic-anisotropy dispersion, where they decrease. It is concluded that there may be anomalies in the frequency dependence of the resonance parameters for polycrystalline magnetic films

  6. Persistent cannabis users show neuropsychological decline from childhood to midlife

    OpenAIRE

    Meier, Madeline H.; Caspi, Avshalom; Ambler, Antony; Harrington, HonaLee; Houts, Renate; Keefe, Richard S. E.; McDonald, Kay; Ward, Aimee; Poulton, Richie; Moffitt, Terrie E.

    2012-01-01

    Recent reports show that fewer adolescents believe that regular cannabis use is harmful to health. Concomitantly, adolescents are initiating cannabis use at younger ages, and more adolescents are using cannabis on a daily basis. The purpose of the present study was to test the association between persistent cannabis use and neuropsychological decline and determine whether decline is concentrated among adolescent-onset cannabis users. Participants were members of the Dunedin Study, a prospecti...

  7. Bonobos (Pan paniscus) show an attentional bias toward conspecifics' emotions.

    Science.gov (United States)

    Kret, Mariska E; Jaasma, Linda; Bionda, Thomas; Wijnen, Jasper G

    2016-04-05

    In social animals, the fast detection of group members' emotional expressions promotes swift and adequate responses, which is crucial for the maintenance of social bonds and ultimately for group survival. The dot-probe task is a well-established paradigm in psychology, measuring emotional attention through reaction times. Humans tend to be biased toward emotional images, especially when the emotion is of a threatening nature. Bonobos have rich, social emotional lives and are known for their soft and friendly character. In the present study, we investigated (i) whether bonobos, similar to humans, have an attentional bias toward emotional scenes compared with conspecifics showing a neutral expression, and (ii) which emotional behaviors attract their attention the most. As predicted, results consistently showed that bonobos' attention was biased toward the location of the emotional versus the neutral scene. Interestingly, their attention was grabbed most by images showing conspecifics engaged in behaviors such as sex, yawning, or grooming, and not so much, as is often observed in humans, by signs of distress or aggression. The results suggest that protective and affiliative behaviors are pivotal in bonobo society and therefore attract immediate attention in this species.
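    The dot-probe bias mentioned above is, at its core, a simple difference of mean reaction times. The sketch below shows that calculation on invented trial data (these numbers are not from the study); a positive score means faster responses when the probe replaced the emotional scene.

    ```python
    # Minimal dot-probe bias sketch with invented trials, for illustration only.
    trials = [
        # (probe replaced the emotional image?, reaction time in ms)
        (True, 412), (True, 398), (False, 447), (False, 431),
        (True, 405), (False, 452), (True, 390), (False, 440),
    ]

    congruent   = [rt for emo, rt in trials if emo]        # probe at emotional location
    incongruent = [rt for emo, rt in trials if not emo]    # probe at neutral location

    bias = sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)
    print(f"attentional bias toward emotional scenes: {bias:.1f} ms")
    ```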

  8. An Undergraduate Endeavor: Assembling a Live Planetarium Show About Mars

    Science.gov (United States)

    McGraw, Allison M.

    2016-10-01

    Viewing the mysterious red planet Mars goes back thousands of years with just the human eye, but in more recent years the growth of telescopes, satellites and lander missions has unveiled unrivaled detail of the Martian surface that tells a story worth listening to. This planetarium show will go through the observations, starting with the ancients and moving to current understandings of the Martian surface, atmosphere and inner workings through past and current Mars missions. Visual animations of its planetary motions, high-resolution imagery from the HiRISE (High Resolution Imaging Science Experiment) and CTX (Context Camera) instruments aboard the MRO (Mars Reconnaissance Orbiter), as well as other datasets, will be used to display the terrain detail and imagery of the planet Mars with a digital projection system. Local planetary scientists and Mars specialists from the Lunar and Planetary Lab at the University of Arizona (Tucson, AZ) will be interviewed and featured in the show to highlight current technology and understandings of the red planet. This is an undergraduate project that is looking for collaborations and insight in order to gain structure in script writing that will teach about this planetary body to all ages in the format of a live planetarium show.

  9. AirShow 1.0 CFD Software Users' Guide

    Science.gov (United States)

    Mohler, Stanley R., Jr.

    2005-01-01

    AirShow is visualization post-processing software for Computational Fluid Dynamics (CFD). Upon reading binary PLOT3D grid and solution files into AirShow, the engineer can quickly see how hundreds of complex 3-D structured blocks are arranged and numbered. Additionally, chosen grid planes can be displayed and colored according to various aerodynamic flow quantities such as Mach number and pressure. The user may interactively rotate and translate the graphical objects using the mouse. The software source code was written in cross-platform Java, C++, and OpenGL, and runs on Unix, Linux, and Windows. The graphical user interface (GUI) was written using Java Swing. Java also provides multiple synchronized threads. The Java Native Interface (JNI) provides a bridge between the Java code and the C++ code where the PLOT3D files are read, the OpenGL graphics are rendered, and numerical calculations are performed. AirShow is easy to learn and simple to use. The source code is available for free from the NASA Technology Transfer and Partnership Office.
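    For readers curious about the file format mentioned above, the sketch below reads a multi-block binary PLOT3D grid file. It assumes one particular variant (plain C-binary layout with no Fortran record markers, 4-byte little-endian integers, single-precision coordinates and no IBLANK array), so it is illustrative rather than a description of AirShow's own reader.

    ```python
    # Minimal multi-block binary PLOT3D grid reader (one assumed format variant).
    import struct
    import numpy as np

    def read_plot3d_grid(path):
        with open(path, "rb") as f:
            (nblocks,) = struct.unpack("<i", f.read(4))                     # number of blocks
            dims = [struct.unpack("<3i", f.read(12)) for _ in range(nblocks)]  # (ni, nj, nk) per block
            blocks = []
            for ni, nj, nk in dims:
                npts = ni * nj * nk
                # x, y and z are stored one after another for each block,
                # with the i index varying fastest.
                xyz = np.fromfile(f, dtype="<f4", count=3 * npts)
                blocks.append(xyz.reshape(3, nk, nj, ni))
            return dims, blocks

    # dims, blocks = read_plot3d_grid("grid.x")   # "grid.x" is a hypothetical file name
    ```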

  10. Compliance With Recommended Food Safety Practices in Television Cooking Shows.

    Science.gov (United States)

    Cohen, Nancy L; Olson, Rita Brennan

    Examine compliance with recommended food safety practices in television cooking shows. Using a tool based on the Massachusetts Food Establishment Inspection Report, raters examined 39 episodes from 10 television cooking shows. Chefs demonstrated conformance with good retail practices for proper use and storage of utensils in 78% of episodes; preventing contamination (62%), and fingernail care (82%). However, 50% to 88% of episodes were found to be out of compliance with other personal hygiene practices, proper use of gloves and barriers (85% to 100%), and maintaining proper time and temperature controls (93%). Over 90% failed to conform to recommendations regarding preventing contamination through wiping cloths and washing produce. In only 13% of episodes were food safety practices mentioned. There appears to be little attention to food safety during most cooking shows. Celebrity and competing chefs have the opportunity to model and teach good food safety practices for millions of viewers. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
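    The reliability indices quoted above (percentage agreement and Cohen's kappa) can be computed in a few lines. The sketch below uses two invented rater columns for a binary in/out-of-compliance code; it is only meant to make the two statistics concrete and does not reproduce the study's coding scheme.

    ```python
    # Percentage agreement and Cohen's kappa for two raters (invented ratings).
    rater_a = ["in", "out", "in", "in", "out", "in", "out", "in", "in", "out"]
    rater_b = ["in", "out", "in", "out", "out", "in", "out", "in", "in", "in"]

    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n     # raw agreement

    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)   # chance agreement
                   for c in categories)
    kappa = (observed - expected) / (1 - expected)

    print(f"percentage agreement = {observed:.2f}, kappa = {kappa:.2f}")
    ```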

  11. Architectures for Quantum Simulation Showing a Quantum Speedup

    Science.gov (United States)

    Bermejo-Vega, Juan; Hangleiter, Dominik; Schwarz, Martin; Raussendorf, Robert; Eisert, Jens

    2018-04-01

    One of the main aims in the field of quantum simulation is to achieve a quantum speedup, often referred to as "quantum computational supremacy," referring to the experimental realization of a quantum device that computationally outperforms classical computers. In this work, we show that one can devise versatile and feasible schemes of two-dimensional, dynamical, quantum simulators showing such a quantum speedup, building on intermediate problems involving nonadaptive, measurement-based, quantum computation. In each of the schemes, an initial product state is prepared, potentially involving an element of randomness as in disordered models, followed by a short-time evolution under a basic translationally invariant Hamiltonian with simple nearest-neighbor interactions and a mere sampling measurement in a fixed basis. The correctness of the final-state preparation in each scheme is fully efficiently certifiable. We discuss experimental necessities and possible physical architectures, inspired by platforms of cold atoms in optical lattices and a number of others, as well as specific assumptions that enter the complexity-theoretic arguments. This work shows that benchmark settings exhibiting a quantum speedup may require little control, in contrast to universal quantum computing. Thus, our proposal puts a convincing experimental demonstration of a quantum speedup within reach in the near term.

  12. Eccentric muscle challenge shows osteopontin polymorphism modulation of muscle damage.

    Science.gov (United States)

    Barfield, Whitney L; Uaesoontrachoon, Kitipong; Wu, Chung-Sheih; Lin, Stephen; Chen, Yue; Wang, Paul C; Kanaan, Yasmine; Bond, Vernon; Hoffman, Eric P

    2014-08-01

    A promoter polymorphism of the osteopontin (OPN) gene (rs28357094) has been associated with multiple inflammatory states, severity of Duchenne muscular dystrophy (DMD) and muscle size in healthy young adults. We sought to define the mechanism of action of the polymorphism, using allele-specific in vitro reporter assays in muscle cells, and a genotype-stratified intervention in healthy controls. In vitro reporter constructs showed the G allele to respond to estrogen treatment, whereas the T allele showed no transcriptional response. Young adult volunteers (n = 187) were enrolled into a baseline study, and subjects with specific rs28357094 genotypes enrolled into an eccentric muscle challenge intervention [n = 3 TT; n = 3 GG/GT (dominant inheritance model)]. Female volunteers carrying the G allele showed significantly greater inflammation and increased muscle volume change as determined by magnetic resonance imaging T1- and T2-weighted images after eccentric challenge, as well as greater decrement in biceps muscle force. Our data suggest a model where the G allele enables enhanced activities of upstream enhancer elements due to loss of Sp1 binding at the polymorphic site. This results in significantly greater expression of the pro-inflammatory OPN cytokine during tissue remodeling in response to challenge in G allele carriers, promoting muscle hypertrophy in normal females, but increased damage in DMD patients. © The Author 2014. Published by Oxford University Press.

  13. Fat stigmatization in television shows and movies: a content analysis.

    Science.gov (United States)

    Himes, Susan M; Thompson, J Kevin

    2007-03-01

    To examine the phenomenon of fat stigmatization messages presented in television shows and movies, a content analysis was used to quantify and categorize fat-specific commentary and humor. Fat stigmatization vignettes were identified using a targeted sampling procedure, and 135 scenes were excised from movies and television shows. The material was coded by trained raters. Reliability indices were uniformly high for the seven categories (percentage agreement ranged from 0.90 to 0.98; kappas ranged from 0.66 to 0.94). Results indicated that fat stigmatization commentary and fat humor were often verbal, directed toward another person, and often presented directly in the presence of the overweight target. Results also indicated that male characters were three times more likely to engage in fat stigmatization commentary or fat humor than female characters. To our knowledge, these findings provide the first information regarding the specific gender, age, and types of fat stigmatization that occur frequently in movies and television shows. The stimuli should prove useful in future research examining the role of individual difference factors (e.g., BMI) in the reaction to viewing such vignettes.

  14. Radon in Austrian tourist mines and show caves

    International Nuclear Information System (INIS)

    Ringer, W.; Graeser, J.

    2009-01-01

    The radon situation in tourist mines and show caves is barely investigated in Austria. This paper investigates the influence of its determining factors, such as climate, structure and geology. For this purpose, long-term time-resolved measurements over 6 to 12 months in 4 tourist mines and 2 show caves - with 5 to 9 measuring points each - have been carried out to obtain the course of radon concentration throughout the year. In addition, temperature and air-pressure were measured and compared to the data outside where available. Results suggest that the dominating factors of the average radon concentration are structure and location (geology) of the tunnel-system, whereas the diurnal and annual variation is mainly caused by the changing airflow, which is driven by the difference in temperature inside and outside. Downcast air is connected with very low radon concentrations, upcast air with high concentrations. In some locations the maximum values appear when the airflow ceases. But airflow can be different in different parts of mines and caves. Systems close to the surface show generally lower radon levels than the ones located deeper underground. Due to variation of structure, geology and local climate, the radon situation in mines and caves can only be described by simultaneous measurements at several measuring points. (orig.)

  15. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are not only within safe operating limits but also consistent with functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  16. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author)

  17. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author).

  18. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are not only within safe operating limits but also consistent with functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  19. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  20. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V.a code, which contains an enhanced geometry package, and a new control module which uses KENO V.a and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask response to accident conditions. The capability of the SCALE system is to provide the cask designer or evaluator with a computational system that offers automated procedures and easy-to-understand input, leading to standardization.

  1. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and therefore additional time. This work presents an approach that gives the cardiologist a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardiac physiological parameters are computed and visualised by means of diagrams and graphs. These computations are evaluated by comparing the derived values with manually measured ones. The resulting mean error of 2.85 mm for wall thickness and 1.61 mm for wall thickening still lies within the range of one pixel of the images used.
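    To make two of the reported parameter types concrete, the sketch below computes stroke volume, ejection fraction and relative systolic wall thickening from assumed segmentation values; the numbers are illustrative and are not outputs of the pipeline described in the abstract.

    ```python
    # Illustrative cardiac-function arithmetic with assumed values.
    edv_ml = 130.0       # end-diastolic left-ventricular volume in ml (assumed)
    esv_ml = 55.0        # end-systolic volume in ml (assumed)
    wall_ed_mm = 8.5     # end-diastolic wall thickness of one segment in mm (assumed)
    wall_es_mm = 12.0    # end-systolic wall thickness of the same segment in mm (assumed)

    stroke_volume = edv_ml - esv_ml
    ejection_fraction = stroke_volume / edv_ml
    wall_thickening = (wall_es_mm - wall_ed_mm) / wall_ed_mm   # relative systolic thickening

    print(f"SV = {stroke_volume:.0f} ml, EF = {ejection_fraction:.0%}, "
          f"wall thickening = {wall_thickening:.0%}")
    ```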

  2. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represents the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely thermodynamic performance and sensitivity to changes in process and/or component design variables.
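
    As a hedged illustration of the component-level bookkeeping underlying such an analysis (the notation below is chosen here and is not taken from the paper), the steady-state exergy balance and the exergetic cost balance for a generic component k can be written as:

        % Exergy balance: fuel exergy = product exergy + exergy destruction
        \dot{E}_{F,k} = \dot{E}_{P,k} + \dot{E}_{D,k}
        % Cost balance: cost rates of incoming streams plus capital and O&M
        % charges \dot{Z}_k equal the cost rates of outgoing streams,
        % with stream cost rates \dot{C} = c \, \dot{E}
        \sum_{i \in \mathrm{in}(k)} \dot{C}_i + \dot{Z}_k
            = \sum_{j \in \mathrm{out}(k)} \dot{C}_j

    Writing one such pair of balances per component and solving the resulting linear system for the unit costs c is the usual route to the production costs the abstract refers to.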

  3. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines the method precisely, along with its fields of application. It describes the most effective methods for product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimizing product design processes within one's company. -- Key ideas, by Business Digest

  4. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs with the purpose of jointly analyzing relevant incidents that had occurred at each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  5. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack ..., and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses ..., while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal.

  6. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region
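
    As a hedged sketch of the kind of model invoked here (the symbols below are defined locally and are not taken from the paper), an exponential distribution of interface trap states with total density N_t and characteristic temperature T_0 can be written as:

        % Exponential tail of trap states below the transport level
        D_{\mathrm{trap}}(E) = \frac{N_t}{k_B T_0}\,
            \exp\!\left(-\frac{E}{k_B T_0}\right)

    With such a distribution, the ratio of free to trapped interface charge becomes gate-voltage and temperature dependent, which is one simple way to rationalize the gate-voltage-dependent activation energy mentioned in the abstract.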

  7. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late ...
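
    The in silico PCR step mentioned in (i) can be illustrated with a toy primer-matching routine. This is only a minimal sketch of the idea: the function names, the degenerate-base handling and the fixed amplicon-length window are assumptions made here, not the authors' actual pipeline or parameters.

        import re

        # IUPAC degenerate bases expanded to regular-expression character classes
        IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "[AG]", "Y": "[CT]",
                 "S": "[CG]", "W": "[AT]", "K": "[GT]", "M": "[AC]", "B": "[CGT]",
                 "D": "[AGT]", "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]"}

        def revcomp(seq):
            comp = str.maketrans("ACGTRYSWKMBDHVN", "TGCAYRSWMKVHDBN")
            return seq.translate(comp)[::-1]

        def to_regex(primer):
            return "".join(IUPAC[base] for base in primer)

        def in_silico_pcr(template, fwd, rev, max_len=300):
            """Return predicted amplicon lengths for one template: the forward
            primer is matched on the given strand and the reverse primer as its
            reverse complement downstream, within max_len bases."""
            hits = []
            for f in re.finditer(to_regex(fwd), template):
                window = template[f.start(): f.start() + max_len]
                r = re.search(to_regex(revcomp(rev)), window)
                if r:
                    hits.append(r.end())  # amplicon length including both primers
            return hits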

  8. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between approximately 30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport ...

  9. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This work describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be extracted from an e-mail header, and geographic coordinates are assigned to these host and IP addresses by means of a database. A visualization displays several thousand e-mail routes in a clear and concise way. In addition, interactive manipulation facilities are presented which allow a visual exploration of the data...

  10. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than have previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component

  11. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT policy measures ... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competences are present in the organisation...

  12. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the diffic...

  13. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth.
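
    To make the orientation-analysis idea concrete, the sketch below estimates a fibre-orientation distribution from the directional distribution of 2-D Fourier power. This is a simplified stand-in for the curvelet-based method described in the abstract, not a reimplementation of it; the function name, the bin count and the radial mask are assumptions made here.

        import numpy as np

        def orientation_spectrum(image, n_bins=36):
            """Bin 2-D FFT power by direction for a grayscale image (2-D array).
            Fibres oriented at angle theta concentrate spectral energy roughly
            perpendicular to theta, so peaks in the histogram indicate the
            dominant structural directions."""
            f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
            power = np.abs(f) ** 2
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            y, x = y - ny // 2, x - nx // 2
            angles = (np.degrees(np.arctan2(y, x)) + 180.0) % 180.0
            radius = np.hypot(x, y)
            mask = (radius > 2) & (radius < min(nx, ny) // 2)  # drop DC, corners
            bins = np.linspace(0.0, 180.0, n_bins + 1)
            hist, _ = np.histogram(angles[mask], bins=bins, weights=power[mask])
            return bins[:-1], hist / hist.sum()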

  14. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of its data when changes are present helps determine the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs, ...
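
    One elementary building block of such an analysis is the computation of a joint angle from marker coordinates. The sketch below is only an illustrative fragment under assumptions made here (marker names, planar coordinates); it is not the study's protocol.

        import numpy as np

        def joint_angle(proximal, joint, distal):
            """Angle (degrees) at `joint` between the segments joint->proximal
            and joint->distal, from 2-D or 3-D marker coordinates."""
            u = np.asarray(proximal, float) - np.asarray(joint, float)
            v = np.asarray(distal, float) - np.asarray(joint, float)
            cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        # Hypothetical hip-knee-ankle markers in the sagittal plane (cm):
        stifle_angle = joint_angle([0.0, 40.0], [0.0, 0.0], [10.0, -35.0])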

  15. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodic safety reviews of nuclear power plants, safety-goal-oriented requirements have been formulated together with complementary evaluation criteria. Their application in an interdisciplinary cooperation covering the subject areas involved facilitates a complete safety-goal-oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.)

  16. Publication bias in dermatology systematic reviews and meta-analyses.

    Science.gov (United States)

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influences clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the underrepresentation of unpublished studies. This problem is due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes, which may lead to clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias. Further, we wanted to conduct our own evaluation of publication bias on meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in meta-analyses that failed to do so using trim and fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and was formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim and fill method, 7 meta-analyses (33.33%) showed evidence of publication bias. Although the trim and fill method only found evidence of publication bias in 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated following PRISMA reporting guidelines, 19 (45.2%) evaluated for publication bias. In
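
    The cumulative meta-analysis by precision mentioned above can be sketched as follows: studies are added from most to least precise, and a drift of the pooled estimate as less precise studies enter is taken as a signal of small-study effects. This is a minimal fixed-effect sketch under assumptions made here, not the exact procedure used in the article.

        import numpy as np

        def cumulative_meta_by_precision(effects, std_errors):
            """Cumulative inverse-variance (fixed-effect) estimates with studies
            ordered from smallest to largest standard error."""
            effects = np.asarray(effects, dtype=float)
            se = np.asarray(std_errors, dtype=float)
            order = np.argsort(se)                 # most precise first
            w = 1.0 / se[order] ** 2               # inverse-variance weights
            cum_est = np.cumsum(w * effects[order]) / np.cumsum(w)
            cum_se = np.sqrt(1.0 / np.cumsum(w))
            return cum_est, cum_se

        # Example with hypothetical log odds ratios:
        est, se = cumulative_meta_by_precision([0.10, 0.18, 0.35, 0.60],
                                               [0.05, 0.08, 0.15, 0.30])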

  17. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include: - Large number of potential flood sources - Wide variety of characteristics of flood sources - Large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources - Diversity of flood flows from one flood source, depending on the size of the rupture and mode of operation - Isolation times applicable - Uncertainties in respect of the structural resistance of doors, penetration seals and floors - Applicable degrees of obstruction of the floor drainage system. Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA Code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, must be taken into account in order to cover all the possible floods associated with each flood area
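
    The combinatorial scenario space described above can be illustrated with a short enumeration. The parameter names and values below are hypothetical placeholders chosen here for illustration; they are not RUNTA's actual input set.

        import itertools

        sources = ["service_water", "fire_main", "circulating_water"]
        rupture_sizes = ["small", "large", "guillotine"]
        operating_modes = ["power_operation", "shutdown"]
        isolation_times_min = [10, 30, 60]
        drain_obstruction = [0.0, 0.5, 1.0]      # fraction of drains blocked

        # Every combination is one flood scenario to be evaluated.
        scenarios = list(itertools.product(sources, rupture_sizes, operating_modes,
                                           isolation_times_min, drain_obstruction))
        print(len(scenarios), "flood scenarios")  # 3 * 3 * 2 * 3 * 3 = 162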

  18. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)
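
    The "track the plant against the forecast" idea can be sketched in a few lines: compare measured trends with the calibrated simulation and flag excursions outside a tolerance band. This is a minimal sketch under assumptions made here (signal, tolerance, array layout), not the analyser's actual algorithm.

        import numpy as np

        def track_against_forecast(measured, forecast, tolerance):
            """Return the indices of time steps where the measured response
            deviates from the simulator's forecast by more than `tolerance`."""
            measured = np.asarray(measured, dtype=float)
            forecast = np.asarray(forecast, dtype=float)
            return np.flatnonzero(np.abs(measured - forecast) > tolerance)

        # Hypothetical steam-pressure traces (MPa) and a 0.3 MPa band:
        alerts = track_against_forecast([6.9, 7.0, 7.4, 7.9],
                                        [7.0, 7.0, 7.1, 7.1], 0.3)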

  19. Remembering Operación Triunfo: a Latin Music Reality Show in the Era of Talent Shows

    NARCIS (Netherlands)

    Savini, Paola

    2016-01-01

    The music format Operación Triunfo (2001–2011), which aired on RTVE for the first time in 2001, started as a television (TV) and musical success in Spain and today is one of the most famous shows around the world as well as an incredible socio-economic phenomenon in Spanish TV. This paper

  20. Perforating pilomatrixoma showing atypical presentation: A rare clinical variant

    Directory of Open Access Journals (Sweden)

    Nevra Seyhan

    2018-03-01

    Pilomatrixoma, also known as calcifying epithelioma of Malherbe, is a rare benign skin tumor arising from hair follicle stem cells. The most common localization is the head and neck region. The female/male ratio is 3/2. It shows deep subcutaneous placement and occurs in the first two decades of life. Its diameter ranges from 0.5 cm to 3 cm. Multiple lesions are rarely seen. Histopathologically it is characterized by basaloid and ghost cells. The perforating type is a rare clinical variant. Treatment is surgical excision. Our case is presented to draw attention to this rare clinical variant of pilomatrixoma.