WorldWideScience

Sample records for bivariate analyses showed

  1. Bivariate genome-wide association analyses identified genetic pleiotropic effects for bone mineral density and alcohol drinking in Caucasians.

    Science.gov (United States)

    Lu, Shan; Zhao, Lan-Juan; Chen, Xiang-Ding; Papasian, Christopher J; Wu, Ke-Hao; Tan, Li-Jun; Wang, Zhuo-Er; Pei, Yu-Fang; Tian, Qing; Deng, Hong-Wen

    2017-11-01

    Several studies indicated bone mineral density (BMD) and alcohol intake might share common genetic factors. The study aimed to explore potential SNPs/genes related to both phenotypes in US Caucasians at the genome-wide level. A bivariate genome-wide association study (GWAS) was performed in 2069 unrelated participants. Regular drinking was graded as 1, 2, 3, 4, 5, or 6, representing drinking alcohol never, less than once, once or twice, three to six times, seven to ten times, or more than ten times per week, respectively. Hip, spine, and whole body BMDs were measured. The bivariate GWAS was conducted on the basis of a bivariate linear regression model. Sex-stratified association analyses were performed in the male and female subgroups. In males, the most significant association signal was detected in SNP rs685395 in DYNC2H1 with bivariate spine BMD and alcohol drinking (P = 1.94 × 10⁻⁸). SNP rs685395 and five other SNPs, rs657752, rs614902, rs682851, rs626330, and rs689295, located in the same haplotype block in DYNC2H1, were among the ten most significant SNPs in the bivariate GWAS in males. Additionally, two SNPs in GRIK4 in males and three SNPs in OPRM1 in females were suggestively associated with BMDs (of the hip, spine, and whole body) and alcohol drinking. Nine SNPs in IL1RN were only suggestively associated with female whole body BMD and alcohol drinking. Our study indicated that DYNC2H1 may contribute to the genetic mechanisms of both spine BMD and alcohol drinking in male Caucasians. Moreover, our study suggested potential pleiotropic roles of OPRM1 and IL1RN in females and GRIK4 in males underlying variation of both BMD and alcohol drinking.
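
    A minimal sketch of how a single marker can be screened for joint association with two phenotypes is shown below. It uses a multivariate linear model (Wilks' lambda test) on simulated data; the column names, covariates, and effect sizes are illustrative assumptions, not the authors' exact bivariate linear regression or data.

    ```python
    # Hedged sketch: joint (bivariate) association test of one SNP against two
    # phenotypes via a multivariate linear model.  Column names are illustrative.
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(1)
    n = 2000
    df = pd.DataFrame({
        "snp": rng.binomial(2, 0.3, n),          # additive genotype coding 0/1/2
        "sex": rng.integers(0, 2, n),
        "age": rng.normal(50, 10, n),
    })
    # Simulated phenotypes with a small shared SNP effect (pleiotropy).
    df["spine_bmd"] = 0.05 * df.snp + 0.01 * df.age + rng.normal(0, 1, n)
    df["alcohol"]   = 0.04 * df.snp + rng.normal(0, 1, n)

    # Wilks' lambda tests whether the SNP affects the two phenotypes jointly,
    # after adjusting for covariates.
    fit = MANOVA.from_formula("spine_bmd + alcohol ~ snp + sex + age", data=df)
    print(fit.mv_test())
    ```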

  2. Perceived social support and academic achievement: cross-lagged panel and bivariate growth curve analyses.

    Science.gov (United States)

    Mackinnon, Sean P

    2012-04-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help disentangle the direction of relationships. This study uses a cross-lagged panel and a bivariate growth curve analysis with a three-wave longitudinal design. Participants include 10,445 students (56% female; 12.6% born outside of Canada) transitioning to post-secondary education from ages 15-19. Self-report measures of academic achievement and a generalized measure of perceived social support were used. An increase in average relative standing in academic achievement predicted an increase in average relative standing on perceived social support 2 years later, but the reverse was not true. High levels of perceived social support at age 15 did not protect against declines in academic achievement over time. In sum, perceived social support appears to have no bearing on adolescents' future academic performance, despite commonly held assumptions of its importance.
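
    For readers unfamiliar with cross-lagged panel designs, the sketch below illustrates the idea on a simulated two-wave data set using ordinary regression; the study itself used three waves and latent panel/growth models, and all variable names and effect sizes here are hypothetical.

    ```python
    # Hedged sketch: a minimal two-wave cross-lagged panel with observed scores,
    # far simpler than the three-wave latent models used in the study.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000
    ach_t1 = rng.normal(size=n)                      # achievement, wave 1
    sup_t1 = 0.2 * ach_t1 + rng.normal(size=n)       # perceived support, wave 1
    # Achievement predicts later support, but not the reverse (as in the abstract).
    ach_t2 = 0.6 * ach_t1 + 0.0 * sup_t1 + rng.normal(size=n)
    sup_t2 = 0.5 * sup_t1 + 0.3 * ach_t1 + rng.normal(size=n)
    df = pd.DataFrame(dict(ach_t1=ach_t1, sup_t1=sup_t1, ach_t2=ach_t2, sup_t2=sup_t2))

    # Cross-lagged paths: each wave-2 variable regressed on both wave-1 variables.
    print(smf.ols("ach_t2 ~ ach_t1 + sup_t1", df).fit().params)
    print(smf.ols("sup_t2 ~ sup_t1 + ach_t1", df).fit().params)
    ```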

  3. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Full Text Available Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits; hence, it is difficult for this approach to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82 × 10⁻⁷ and 1.47 × 10⁻⁶, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  4. Analysing risk factors of co-occurrence of schistosomiasis haematobium and hookworm using bivariate regression models: Case study of Chikwawa, Malawi

    Directory of Open Access Journals (Sweden)

    Bruce B.W. Phiri

    2016-06-01

    Full Text Available Schistosomiasis and soil-transmitted helminth (STH) infections constitute a major public health problem in many parts of sub-Saharan Africa. In areas where the prevalence of geo-helminths and schistosomes is high, co-infection with multiple parasite species is common, resulting in a disproportionately elevated burden compared with single infections. Determining risk factors of co-infection intensity is important for better design of targeted interventions. In this paper, we examined risk factors of hookworm and S. haematobium co-infection intensity in Chikwawa district, southern Malawi, in 2005, using bivariate count models. Results show that hookworm and S. haematobium infections were much localised, with a small proportion of individuals harbouring more parasites, especially among school-aged children. The risk of co-intensity of hookworm and S. haematobium was high for all ages, although it diminished with increasing age, and it increased with fishing (hookworm: coefficient = 12.29, 95% CI = 11.50–13.09; S. haematobium: coefficient = 0.040, 95% CI = 0.0037, 3.832). Both infections were abundant in those with primary education (hookworm: coef. = 0.072, 95% CI = 0.056, 0.401; S. haematobium: coef. = 0.286, 95% CI = 0.034, 0.538). However, a much lower risk was observed for those who were farmers (hookworm: coef. = −0.349, 95% CI = −0.547, −0.150; S. haematobium: coef. = −0.239, 95% CI = −0.406, −0.072). In conclusion, our findings suggest that efforts to control helminth infections should be integrated, and health promotion campaigns should be aimed at school-going children and adults who are in constant contact with water.

  5. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  6. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  7. Monitoring bivariate process

    Directory of Open Access Journals (Sweden)

    Marcela A. G. Machado

    2009-12-01

    Full Text Available The T² chart and the generalized variance |S| chart are the usual tools for monitoring the mean vector and the covariance matrix of multivariate processes. The main drawback of these charts is the difficulty of obtaining and interpreting the values of their monitoring statistics. In this paper, we study control charts for monitoring bivariate processes that require only the computation of sample means (the ZMAX chart) for monitoring the mean vector, sample variances (the VMAX chart) for monitoring the covariance matrix, or both sample means and sample variances (the MCMAX chart) in the case of joint control of the mean vector and the covariance matrix.
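
    A hedged sketch of the two simpler statistics described above follows: ZMAX uses only standardized subgroup means and VMAX only subgroup variances. The in-control parameters, subgroup size, and control limits are illustrative assumptions; in practice the limits are chosen to meet a target average run length.

    ```python
    # Hedged sketch of ZMAX/VMAX-style monitoring statistics for a bivariate process.
    import numpy as np

    rng = np.random.default_rng(7)
    mu0 = np.array([0.0, 0.0])          # assumed in-control means
    sigma0 = np.array([1.0, 1.0])       # assumed in-control standard deviations
    n = 5                               # subgroup size

    def zmax_vmax(sample):
        """sample: array of shape (n, 2), one rational subgroup of the bivariate process."""
        xbar = sample.mean(axis=0)
        s2 = sample.var(axis=0, ddof=1)
        z = (xbar - mu0) / (sigma0 / np.sqrt(n))
        return np.max(np.abs(z)), np.max(s2 / sigma0**2)

    UCL_Z, UCL_V = 3.0, 4.0             # illustrative limits, not ARL-calibrated
    for t in range(10):
        sub = rng.normal(mu0, sigma0, size=(n, 2))
        zmax, vmax = zmax_vmax(sub)
        print(f"subgroup {t}: ZMAX={zmax:.2f} ({'alarm' if zmax > UCL_Z else 'ok'}), "
              f"VMAX={vmax:.2f} ({'alarm' if vmax > UCL_V else 'ok'})")
    ```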

  8. Financial Applications of Bivariate Markov Processes

    OpenAIRE

    Ortobelli Lozza, Sergio; Angelelli, Enrico; Bianchi, Annamaria

    2011-01-01

    This paper describes a methodology to approximate a bivariate Markov process by means of a proper Markov chain and presents possible financial applications in portfolio theory, option pricing and risk management. In particular, we first show how to model the joint distribution between market stochastic bounds and future wealth and propose an application to large-scale portfolio problems. Secondly, we examine an application to VaR estimation. Finally, we propose a methodology...

  9. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    Full Text Available In this paper we extend the concept of value-at-risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset that take into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (β-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided using Italian stock market data.

  10. New genetic and linguistic analyses show ancient human influence on baobab evolution and distribution in Australia.

    Directory of Open Access Journals (Sweden)

    Haripriya Rangan

    Full Text Available This study investigates the role of human agency in the gene flow and geographical distribution of the Australian baobab, Adansonia gregorii. The genus Adansonia is a charismatic tree endemic to Africa, Madagascar, and northwest Australia that has long been valued by humans for its multiple uses. The distribution of genetic variation in baobabs in Africa has been partially attributed to human-mediated dispersal over millennia, but this relationship has never been investigated for the Australian species. We combined genetic and linguistic data to analyse geographic patterns of gene flow and movement of word-forms for A. gregorii in the Aboriginal languages of northwest Australia. Comprehensive assessment of genetic diversity showed weak geographic structure and high gene flow. Of potential dispersal vectors, humans were identified as most likely to have enabled gene flow across biogeographic barriers in northwest Australia. Genetic-linguistic analysis demonstrated congruence of gene flow patterns and directional movement of Aboriginal loanwords for A. gregorii. These findings, along with previous archaeobotanical evidence from the Late Pleistocene and Holocene, suggest that ancient humans significantly influenced the geographic distribution of Adansonia in northwest Australia.

  11. Correlation analyses between volatiles and glucosinolates show no evidence for chemical defense signaling in Brassica rapa

    Directory of Open Access Journals (Sweden)

    Florian Paul Schiestl

    2014-04-01

    Full Text Available Positive correlations between volatile organic compounds (VOCs) and defense chemicals indicate signaling of defense status. Such aposematic signaling has been hypothesized to be widespread in plants; however, it has until now only been shown for visual signals. Correlations between identical compounds in different plant tissues, on the other hand, can be informative about the (co-)regulation of their biosynthesis or emission. Here I use Brassica rapa to investigate (1) correlations between identical metabolites (volatiles, glucosinolates) in leaf and flower tissue, and (2) correlations between volatiles and glucosinolates in the same plant organs (flowers and leaves). Whereas the amounts of many glucosinolates were positively correlated in leaf and flower tissue, identical leaf and floral VOCs showed no such correlations, indicating independent regulation of emission. None of the leaf or flower volatiles showed positive correlations with the two major glucosinolates (gluconapin, glucobrassicanapin) or with the sum of all glucosinolates in either leaves or flowers. Some VOCs showed positive correlations with minor glucosinolates, which, however, represented less than one percent of the total amount of glucosinolates. Some leaf monoterpenes showed negative associations with gluconapin. The lack of consistent positive correlations between VOCs and major defense compounds suggests that plants do not chemically signal their defense status. This could be adaptive, as it may avoid eavesdropping by specialist herbivores seeking to locate their host plants. Negative correlations likely indicate chemical trade-offs in the synthesis of secondary metabolites.

  12. Bibliographic study showed improving statistical methodology of network meta-analyses published between 1999 and 2015.

    Science.gov (United States)

    Petropoulou, Maria; Nikolakopoulou, Adriani; Veroniki, Areti-Angeliki; Rios, Patricia; Vafaei, Afshin; Zarin, Wasifa; Giannatsi, Myrsini; Sullivan, Shannon; Tricco, Andrea C; Chaimani, Anna; Egger, Matthias; Salanti, Georgia

    2017-02-01

    To assess the characteristics and core statistical methodology specific to network meta-analyses (NMAs) in clinical research articles. We searched MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews from inception until April 14, 2015, for NMAs of randomized controlled trials including at least four different interventions. Two reviewers independently screened potential studies, whereas data abstraction was performed by a single reviewer and verified by a second. A total of 456 NMAs, which included a median (interquartile range) of 21 (13-40) studies and 7 (5-9) treatment nodes, were assessed. A total of 125 NMAs (27%) were star networks; this proportion declined from 100% in 2005 to 19% in 2015 (P = 0.01 by test of trend). An increasing number of NMAs discussed transitivity or inconsistency (0% in 2005, 86% in 2015, P < 0.01) and 150 (45%) used appropriate methods to test for inconsistency (14% in 2006, 74% in 2015, P < 0.01). Heterogeneity was explored in 256 NMAs (56%), with no change over time (P = 0.10). All pairwise effects were reported in 234 NMAs (51%), with some increase over time (P = 0.02). The hierarchy of treatments was presented in 195 NMAs (43%), the probability of being best was most commonly reported (137 NMAs, 70%), but use of surface under the cumulative ranking curves increased steeply (0% in 2005, 33% in 2015, P < 0.01). Many NMAs published in the medical literature have significant limitations in both the conduct and reporting of the statistical analysis and numerical results. The situation has, however, improved in recent years, in particular with respect to the evaluation of the underlying assumptions, but considerable room for further improvements remains. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with the existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
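
    The sketch below fits beta-binomial marginals to simulated sensitivity and specificity counts by maximizing a composite (independence working) likelihood with scipy. It reproduces only the marginal part of the approach; the correlation parameter that the marginal beta-binomial model additionally estimates is omitted, and all data and parameter values are illustrative.

    ```python
    # Hedged sketch: marginal beta-binomial fit by composite likelihood,
    # ignoring the correlation parameter estimated in the paper.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import betabinom

    rng = np.random.default_rng(3)
    k_studies = 20
    n1 = rng.integers(30, 120, k_studies)          # diseased subjects per study
    n2 = rng.integers(30, 120, k_studies)          # healthy subjects per study
    y1 = betabinom.rvs(n1, 8, 2, random_state=1)   # true positives
    y2 = betabinom.rvs(n2, 9, 1, random_state=2)   # true negatives

    def neg_loglik(theta):
        a1, b1, a2, b2 = np.exp(theta)             # positivity via log-parameterization
        return -(betabinom.logpmf(y1, n1, a1, b1).sum()
                 + betabinom.logpmf(y2, n2, a2, b2).sum())

    res = minimize(neg_loglik, x0=np.zeros(4), method="Nelder-Mead")
    a1, b1, a2, b2 = np.exp(res.x)
    print("mean sensitivity:", a1 / (a1 + b1), " mean specificity:", a2 / (a2 + b2))
    ```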

  14. Bivariate hard thresholding in wavelet function estimation

    OpenAIRE

    Piotr Fryzlewicz

    2007-01-01

    We propose a generic bivariate hard thresholding estimator of the discrete wavelet coefficients of a function contaminated with i.i.d. Gaussian noise. We demonstrate its good risk properties in a motivating example, and derive upper bounds for its mean-square error. Motivated by the clustering of large wavelet coefficients in real-life signals, we propose two wavelet denoising algorithms, both of which use specific instances of our bivariate estimator. The BABTE algorithm uses basis averaging...

  15. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and the Dempster–Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran.

  16. Comparison between two bivariate Poisson distributions through the ...

    African Journals Online (AJOL)

    To remedy this problem, Berkhout and Plug proposed a bivariate Poisson distribution that allows the correlation to be negative, zero, or positive. In this paper, we show that these models are nearly everywhere asymptotically equal. Since the φ-divergence converges toward zero, both models are ...

  17. Comparative Analyses of the Lipooligosaccharides from Nontypeable Haemophilus influenzae and Haemophilus haemolyticus Show Differences in Sialic Acid and Phosphorylcholine Modifications.

    Science.gov (United States)

    Post, Deborah M B; Ketterer, Margaret R; Coffin, Jeremy E; Reinders, Lorri M; Munson, Robert S; Bair, Thomas; Murphy, Timothy F; Foster, Eric D; Gibson, Bradford W; Apicella, Michael A

    2016-01-04

    Haemophilus haemolyticus and nontypeable Haemophilus influenzae (NTHi) are closely related upper airway commensal bacteria that are difficult to distinguish phenotypically. NTHi causes upper and lower airway tract infections in individuals with compromised airways, while H. haemolyticus rarely causes such infections. The lipooligosaccharide (LOS) is an outer membrane component of both species and plays a role in NTHi pathogenesis. In this study, comparative analyses of the LOS structures and corresponding biosynthesis genes were performed. Mass spectrometric and immunochemical analyses showed that NTHi LOS contained terminal sialic acid more frequently and to a higher extent than H. haemolyticus LOS did. Genomic analyses of 10 strains demonstrated that H. haemolyticus lacked the sialyltransferase genes lic3A and lic3B (9/10) and siaA (10/10), but all strains contained the sialic acid uptake genes siaP and siaT (10/10). However, isothermal titration calorimetry analyses of SiaP from two H. haemolyticus strains showed a 3.4- to 7.3-fold lower affinity for sialic acid compared to that of NTHi SiaP. Additionally, mass spectrometric and immunochemical analyses showed that the LOS from H. haemolyticus contained phosphorylcholine (ChoP) less frequently than the LOS from NTHi strains. These differences observed in the levels of sialic acid and ChoP incorporation in the LOS structures from H. haemolyticus and NTHi may explain some of the differences in their propensities to cause disease. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  18. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on bivariate extremes of renormalized componentwise maxima with the generalized extreme value distribution as the marginal function. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation, and the best model is selected based on the Akaike information criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily maximum PM10 data for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data, from 2001 to 2010, are considered for both stations. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly componentwise maxima series. However, the dependence parameters show that the weekly maxima series are more strongly dependent on each other than the monthly maxima series.
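
    A hedged sketch of the componentwise-maxima workflow is shown below: simulated daily PM10-like series for two stations are reduced to weekly componentwise maxima and GEV margins are fitted with scipy. The dependence models compared in the study (e.g. the asymmetric negative logistic) are not fitted here, and all numbers are illustrative.

    ```python
    # Hedged sketch: componentwise weekly maxima and marginal GEV fits.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(10)
    days = 7 * 520                                   # roughly 10 years of daily maxima
    common = rng.gamma(2.0, 10.0, days)              # shared pollution episodes
    pm10_a = common + rng.gamma(2.0, 5.0, days)      # station A (e.g. Pasir Gudang)
    pm10_b = common + rng.gamma(2.0, 5.0, days)      # station B (e.g. Johor Bahru)

    # Componentwise weekly maxima: each station's maximum over the same week.
    wk_a = pm10_a.reshape(-1, 7).max(axis=1)
    wk_b = pm10_b.reshape(-1, 7).max(axis=1)

    # GEV margins (scipy's genextreme uses c = -xi for the shape parameter).
    for name, wk in [("station A", wk_a), ("station B", wk_b)]:
        c, loc, scale = genextreme.fit(wk)
        print(f"{name}: shape c={c:.3f}, loc={loc:.1f}, scale={scale:.1f}")
    ```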

  19. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.
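
    The two indicators mentioned above can be computed from an already-extracted IMF with the Hilbert transform, as in the hedged sketch below; the EMD step itself, the sampling rate, and the test signal are assumptions for illustration.

    ```python
    # Hedged sketch: circle area and average rotation frequency of an analytic IMF.
    import numpy as np
    from scipy.signal import hilbert

    fs = 100.0                                    # assumed sampling frequency (Hz)
    t = np.arange(0, 30, 1 / fs)
    imf = 0.5 * np.sin(2 * np.pi * 0.4 * t)       # stand-in for one COP intrinsic mode function

    z = hilbert(imf)                              # analytic signal traces a near-circle
    radius = np.abs(z).mean()
    area = np.pi * radius**2                      # "area of the circle" indicator
    phase = np.unwrap(np.angle(z))
    rotation_freq = (phase[-1] - phase[0]) / (2 * np.pi * (t[-1] - t[0]))  # mean rotation (Hz)

    print(f"circle area ~ {area:.4f}, average rotation frequency ~ {rotation_freq:.3f} Hz")
    ```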

  20. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y). In this paper, we consider the case where X and Y follow a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.

  1. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models, there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y). In this paper, we consider the case where X and Y follow a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.
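
    As a purely numerical illustration of the quantity studied in these two records, the sketch below estimates R = Pr(X < Y) by Monte Carlo for gamma margins made dependent through an assumed Gaussian copula; the papers instead derive explicit expressions for specific bivariate beta and gamma families.

    ```python
    # Hedged numerical check of R = Pr(X < Y) under a dependent bivariate gamma
    # built from an assumed Gaussian copula (not the papers' constructions).
    import numpy as np
    from scipy.stats import norm, gamma, multivariate_normal

    def reliability(rho, a_x=2.0, a_y=3.0, n=200_000, seed=0):
        cov = [[1.0, rho], [rho, 1.0]]
        z = multivariate_normal.rvs(mean=[0, 0], cov=cov, size=n, random_state=seed)
        u = norm.cdf(z)                              # copula scale
        x = gamma.ppf(u[:, 0], a_x)                  # stress
        y = gamma.ppf(u[:, 1], a_y)                  # strength
        return np.mean(x < y)

    for rho in (0.0, 0.5, 0.9):
        print(f"rho={rho:.1f}  R ~ {reliability(rho):.3f}")
    ```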

  2. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  3. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    A matched pairs sign test based on bivariate ranked set sampling (BVRSS) is introduced and investigated. We show that this test is asymptotically more efficient than its counterpart sign test based on a bivariate simple random sample (BVSRS). The asymptotic null distribution and the efficiency of the test are derived.

  4. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean-constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg

  5. Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2017-11-01

    Full Text Available A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie–Gumbel–Morgenstern) bivariate copulas for constructing several different bivariate Kumaraswamy-type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman's correlation coefficient ρ and Kendall's τ.
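
    The sketch below illustrates the unmodified FGM copula that the paper starts from: it samples the copula by conditional inversion, maps the uniforms through Kumaraswamy quantile functions, and checks numerically that Spearman's ρ equals θ/3 (hence |ρ| ≤ 1/3, the limitation the modified class is designed to relax). The marginal parameters are illustrative.

    ```python
    # Hedged sketch: bivariate Kumaraswamy data via the plain FGM copula.
    import numpy as np
    from scipy.stats import spearmanr

    def fgm_sample(theta, n, rng):
        u = rng.uniform(size=n)
        p = rng.uniform(size=n)
        a = theta * (1 - 2 * u)                      # conditional CDF: (1+a)v - a*v^2 = p
        v = np.where(np.abs(a) < 1e-12, p,
                     ((1 + a) - np.sqrt((1 + a) ** 2 - 4 * a * p)) / (2 * a))
        return u, v

    def kumaraswamy_ppf(u, a, b):
        # Inverse of F(x) = 1 - (1 - x**a)**b on (0, 1).
        return (1 - (1 - u) ** (1 / b)) ** (1 / a)

    rng = np.random.default_rng(5)
    theta = 0.8
    u, v = fgm_sample(theta, 200_000, rng)
    x = kumaraswamy_ppf(u, a=2.0, b=3.0)             # illustrative marginal parameters
    y = kumaraswamy_ppf(v, a=1.5, b=2.0)
    rho_s, _ = spearmanr(x, y)                       # invariant under the monotone margins
    print("Spearman rho:", round(rho_s, 3), " theory:", theta / 3)
    ```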

  6. Computational approach to Thornley's problem by bivariate operational calculus

    Science.gov (United States)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for a linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  7. Solving Bivariate Polynomial Systems on a GPU

    International Nuclear Information System (INIS)

    Moreno Maza, Marc; Pan Wei

    2012-01-01

    We present a CUDA implementation of dense multivariate polynomial arithmetic based on Fast Fourier Transforms over finite fields. Our core routine computes on the device (GPU) the subresultant chain of two polynomials with respect to a given variable. This subresultant chain is encoded by values on a FFT grid and is manipulated from the host (CPU) in higher-level procedures. We have realized a bivariate polynomial system solver supported by our GPU code. Our experimental results (including detailed profiling information and benchmarks against a serial polynomial system solver implementing the same algorithm) demonstrate that our strategy is well suited for GPU implementation and provides large speedup factors with respect to pure CPU code.

  8. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Full Text Available Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave lengths, wave-induced pitch, and wave and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the "significant waves" (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which it must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of the ships. Nevertheless, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the "random walk" frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We try to construct a bivariate Rayleigh distribution with marginal Rayleigh distribution functions and discuss its fundamental properties.
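
    The derivation quoted above can be checked numerically, as in the hedged sketch below: the radial distance of two independent, equal-variance, zero-mean normal variates is compared against a Rayleigh distribution; the value of σ is an arbitrary choice.

    ```python
    # Hedged check: sqrt(X^2 + Y^2) of independent N(0, sigma^2) variates is Rayleigh(sigma).
    import numpy as np
    from scipy.stats import rayleigh, kstest

    rng = np.random.default_rng(2)
    sigma = 2.0
    x = rng.normal(0.0, sigma, 100_000)
    y = rng.normal(0.0, sigma, 100_000)
    r = np.hypot(x, y)

    # Kolmogorov-Smirnov test against Rayleigh(scale=sigma): a large p-value is expected.
    print(kstest(r, rayleigh(scale=sigma).cdf))
    ```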

  9. Multi-Scale Carbon Isotopic Analyses Show Allende Nanodiamonds are Mostly Solar with Some PreSolar

    Science.gov (United States)

    Lewis, J. B.; Isheim, D.; Floss, C.; Gyngard, F.; Seidman, D. N.

    2017-07-01

    NanoSIMS and atom-probe experiments on different-sized aggregates of meteoritic nanodiamonds show mostly normal C isotopes, with a fraction of 13C-enriched material. The best interpretation is a combination of solar system and supernova formation.

  10. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    High particulate matter (PM10) levels are a prominent issue, causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation between extreme PM10 data from two nearby air quality monitoring stations. The series of daily maximum PM10 for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are used to determine the threshold values and hence to construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as dependence functions for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike information criterion and the quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.
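
    A hedged sketch of the marginal step described above follows: the 95% quantile is taken as the threshold and a generalized Pareto distribution is fitted to the exceedances of a simulated series; the bivariate dependence functions (logistic, asymmetric logistic, and so on) compared in the study are not fitted here.

    ```python
    # Hedged sketch: threshold selection and marginal GPD fit for exceedances.
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(11)
    pm10 = rng.gamma(3.0, 15.0, 3650)            # stand-in for 10 years of daily maximum PM10

    u = np.quantile(pm10, 0.95)                  # threshold at the 95% marginal quantile
    exceedances = pm10[pm10 > u] - u
    shape, loc, scale = genpareto.fit(exceedances, floc=0)
    print(f"threshold={u:.1f}, GPD shape={shape:.3f}, scale={scale:.1f}, "
          f"n_exceedances={exceedances.size}")
    ```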

  11. Elapid Snake Venom Analyses Show the Specificity of the Peptide Composition at the Level of Genera Naja and Notechis

    Directory of Open Access Journals (Sweden)

    Aisha Munawar

    2014-02-01

    Full Text Available Elapid snake venom is a highly valuable, but till now mainly unexplored, source of pharmacologically important peptides. We analyzed the peptide fractions with molecular masses up to 10 kDa of two elapid snake venoms—that of the African cobra, N. m. mossambica (genus Naja), and the Peninsula tiger snake, N. scutatus, from Kangaroo Island (genus Notechis). A combination of chromatographic methods was used to isolate the peptides, which were characterized by combining complementary mass spectrometric techniques. Comparative analysis of the peptide compositions of the two venoms showed specificity at the genus level. Three-finger (3-F) cytotoxins, bradykinin-potentiating peptides (BPPs) and a bradykinin inhibitor were isolated from the Naja venom. 3-F neurotoxins, Kunitz/basic pancreatic trypsin inhibitor (BPTI)-type inhibitors and a natriuretic peptide were identified in the Notechis venom. The inhibiting activity of the peptides was confirmed in vitro with a selected array of proteases. Cytotoxin 1 (P01467) from the Naja venom might be involved in the disturbance of cellular processes by inhibiting the cell 20S-proteasome. A high degree of similarity between BPPs from elapid and viperid snake venoms was observed, suggesting that these molecules play a key role in snake venoms and also indicating that these peptides were recruited into the snake venom prior to the evolutionary divergence of the snakes.

  12. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.

  13. Bell-Type Inequalities for Bivariate Maps on Orthomodular Lattices

    Science.gov (United States)

    Pykacz, Jarosław; Valášková, L'ubica; Nánásiová, Ol'ga

    2015-08-01

    Bell-type inequalities on orthomodular lattices, in which conjunctions of propositions are not modeled by meets but by maps for simultaneous measurements (s-maps), are studied. It is shown that the simplest of these inequalities, which involves only two propositions, is always satisfied, contrary to what happens in the case of the traditional version of this inequality in which conjunctions of propositions are modeled by meets. The equivalence of various Bell-type inequalities formulated with the aid of bivariate maps on orthomodular lattices is studied. Our investigations shed new light on the interpretation of various multivariate maps defined on orthomodular lattices already studied in the literature. The paper is concluded by showing the possibility of using such bivariate maps to represent counterfactual conjunctions and disjunctions of non-compatible propositions about quantum systems.

  14. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when the vector (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz, and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  15. A Comparative Study of the Bivariate Marginal Distribution Algorithm and the Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

    Full Text Available The Bivariate Marginal Distribution Algorithm (BMDA) is an extension of the Estimation of Distribution Algorithm. This heuristic algorithm introduces a new approach to recombination for generating new individuals that does not use the crossover and mutation operators of a genetic algorithm. Instead, BMDA uses the connectivity between pairs of genes to generate new individuals, and this connectivity between variables is learned during the optimization process. In this research, the performance of a genetic algorithm with one-point crossover is compared with the performance of the Bivariate Marginal Distribution Algorithm on the OneMax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For OneMax with a small problem size, the genetic algorithm performs better, needing fewer iterations and less time to reach the optimum. However, the Bivariate Marginal Distribution Algorithm gives better optimization results for OneMax with a large problem size. For the De Jong F2 function, the genetic algorithm outperforms the Bivariate Marginal Distribution Algorithm in both the number of iterations and time. For the Traveling Salesman Problem, the Bivariate Marginal Distribution Algorithm shows better optimization results than the genetic algorithm.
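
    For concreteness, the sketch below implements the OneMax benchmark with a minimal genetic algorithm using one-point crossover and bit-flip mutation; the BMDA side of the comparison (learning pairwise dependencies between genes) is not reproduced, and the population size, string length, and mutation rate are illustrative.

    ```python
    # Hedged sketch: minimal GA with one-point crossover on the OneMax problem.
    import numpy as np

    rng = np.random.default_rng(4)
    n_bits, pop_size, generations = 60, 80, 200

    def onemax(pop):                       # fitness = number of ones
        return pop.sum(axis=1)

    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for g in range(generations):
        fit = onemax(pop)
        if fit.max() == n_bits:
            break
        # Tournament selection.
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((fit[i] >= fit[j])[:, None], pop[i], pop[j])
        # One-point crossover between consecutive parents.
        cut = rng.integers(1, n_bits, pop_size // 2)
        children = parents.copy()
        for k, c in enumerate(cut):
            a, b = 2 * k, 2 * k + 1
            children[a, c:], children[b, c:] = parents[b, c:], parents[a, c:]
        # Bit-flip mutation.
        flip = rng.random(children.shape) < (1.0 / n_bits)
        pop = np.where(flip, 1 - children, children)

    print("best fitness:", onemax(pop).max(), "of", n_bits, "after", g + 1, "generations")
    ```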

  16. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, a bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables. A bivariable index that combines these parameters is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial-infarction patients. The results show that the bivariable index allows discrimination between the population of patients with ventricular tachycardia and the subjects of the control group. Also, it was found that the bivariable technique performs well as a diagnostic test. It is concluded that, as a diagnostic test, the bivariable technique is superior to the time-domain method and the time-frequency technique evaluated individually.

  17. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.
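
    A hedged sketch of a single (unpatched) bivariate Fréchet copula follows: it is a weighted mixture of the comonotone, independence, and countermonotone copulas, and because Spearman's ρ is linear in the copula, ρ = w_M − w_W for this family. The weights are illustrative, and the paper's patching construction on sub-rectangles is not shown.

    ```python
    # Hedged sketch: evaluate a bivariate Frechet copula and check its Spearman rho.
    import numpy as np

    def bf_copula(u, v, w):
        """w = (w_M, w_Pi, w_W), non-negative weights summing to one."""
        m  = np.minimum(u, v)                 # comonotonicity
        pi = u * v                            # independence
        wl = np.maximum(u + v - 1.0, 0.0)     # countermonotonicity
        return w[0] * m + w[1] * pi + w[2] * wl

    w = (0.5, 0.3, 0.2)
    grid = np.linspace(0, 1, 401)
    U, V = np.meshgrid(grid, grid)
    # Spearman's rho = 12 * integral of C over the unit square - 3 (grid approximation).
    rho_num = 12 * bf_copula(U, V, w).mean() - 3
    print("numerical rho:", round(rho_num, 3), "  w_M - w_W:", w[0] - w[2])
    ```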

  18. A note on finding peakedness in bivariate normal distribution using Mathematica

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2007-07-01

    Full Text Available Peakedness measures the concentration around the central value. A classical standard measure of peakedness is kurtosis, the degree of peakedness of a probability distribution. In view of the inconsistency of kurtosis in measuring the peakedness of a distribution, Horn (1983) proposed a measure of peakedness for symmetrically unimodal distributions. The objective of this paper is two-fold. First, Horn's method is extended to the bivariate normal distribution. Second, it is shown that the computer algebra system Mathematica can be an extremely useful tool for all sorts of computations related to the bivariate normal distribution. Mathematica programs are also provided.

  19. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Full Text Available This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML). The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  20. A Basic Bivariate Structure of Personality Attributes Evident Across Nine Languages.

    Science.gov (United States)

    Saucier, Gerard; Thalmayer, Amber Gayle; Payne, Doris L; Carlson, Robert; Sanogo, Lamine; Ole-Kotikash, Leonard; Church, A Timothy; Katigbak, Marcia S; Somer, Oya; Szarota, Piotr; Szirmák, Zsofia; Zhou, Xinyue

    2014-02-01

    Here, two studies seek to characterize a parsimonious common-denominator personality structure with optimal cross-cultural replicability. Personality differences are observed in all human populations and cultures, but lexicons for personality attributes contain so many distinctions that parsimony is lacking. Models stipulating the most important attributes have been formulated by experts or by empirical studies drawing on experience in a very limited range of cultures. Factor analyses of personality lexicons of nine languages of diverse provenance (Chinese, Korean, Filipino, Turkish, Greek, Polish, Hungarian, Maasai, and Senoufo) were examined, and their common structure was compared to that of several prominent models in psychology. A parsimonious bivariate model showed evidence of substantial convergence and ubiquity across cultures. Analyses involving key markers of these dimensions in English indicate that they are broad dimensions involving the overlapping content of the interpersonal circumplex, models of communion and agency, and morality/warmth and competence. These "Big Two" dimensions (Social Self-Regulation and Dynamism) provide a common-denominator model involving the two most crucial axes of personality variation, ubiquitous across cultures. The Big Two might serve as an umbrella model serving to link diverse theoretical models and associated research literatures. © 2013 Wiley Periodicals, Inc.

  1. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus.

    Science.gov (United States)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L; Kreiner, Eskil; Chesi, Alessandra; Zemel, Babette S; Bønnelykke, Klaus; Boer, Cindy G; Ahluwalia, Tarunveer S; Bisgaard, Hans; Evangelou, Evangelos; Heppe, Denise H M; Bonewald, Lynda F; Gorski, Jeffrey P; Ghanbari, Mohsen; Demissie, Serkalem; Duque, Gustavo; Maurano, Matthew T; Kiel, Douglas P; Hsu, Yi-Hsiang; C J van der Eerden, Bram; Ackert-Bicknell, Cheryl; Reppe, Sjur; Gautvik, Kaare M; Raastad, Truls; Karasik, David; van de Peppel, Jeroen; Jaddoe, Vincent W V; Uitterlinden, André G; Tobias, Jonathan H; Grant, Struan F A; Bagos, Pantelis G; Evans, David M; Rivadeneira, Fernando

    2017-07-25

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone mineral density (TBLH-BMD) regions in 10,414 children. The estimated SNP heritability is 43% (95% CI: 34-52%) for TBLH-BMD, and 39% (95% CI: 30-48%) for TB-LM, with a shared genetic component of 43% (95% CI: 29-56%). We identify variants with pleiotropic effects in eight loci, including seven established bone mineral density loci: WNT4, GALNT3, MEPE, CPED1/WNT16, TNFSF11, RIN3, and PPP6R3/LRP5. Variants in the TOM1L2/SREBF1 locus exert opposing effects on TB-LM and TBLH-BMD, and have a stronger association with the former trait. We show that SREBF1 is expressed in murine and human osteoblasts, as well as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass. Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total body lean mass and bone mass density in children, and show genetic loci with pleiotropic effects on both traits.

  2. Fitting statistical models in bivariate allometry.

    Science.gov (United States)

    Packard, Gary C; Birchard, Geoffrey F; Boardman, Thomas J

    2011-08-01

    Several attempts have been made in recent years to formulate a general explanation for what appear to be recurring patterns of allometric variation in morphology, physiology, and ecology of both plants and animals (e.g. the Metabolic Theory of Ecology, the Allometric Cascade, the Metabolic-Level Boundaries hypothesis). However, published estimates for parameters in allometric equations often are inaccurate, owing to undetected bias introduced by the traditional method for fitting lines to empirical data. The traditional method entails fitting a straight line to logarithmic transformations of the original data and then back-transforming the resulting equation to the arithmetic scale. Because of fundamental changes in distributions attending transformation of predictor and response variables, the traditional practice may cause influential outliers to go undetected, and it may result in an underparameterized model being fitted to the data. Also, substantial bias may be introduced by the insidious rotational distortion that accompanies regression analyses performed on logarithms. Consequently, the aforementioned patterns of allometric variation may be illusions, and the theoretical explanations may be wide of the mark. Problems attending the traditional procedure can be largely avoided in future research simply by performing preliminary analyses on arithmetic values and by validating fitted equations in the arithmetic domain. The goal of most allometric research is to characterize relationships between biological variables and body size, and this is done most effectively with data expressed in the units of measurement. Back-transforming from a straight line fitted to logarithms is not a generally reliable way to estimate an allometric equation in the original scale. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.
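
    The recommendation above can be illustrated with a short sketch: fit y = a·x^b directly on the arithmetic scale by nonlinear least squares and compare with the back-transformed log-log regression, using simulated data with additive error (the situation in which the traditional approach is most misleading). All parameter values are illustrative.

    ```python
    # Hedged sketch: allometric fit in the arithmetic domain vs. back-transformed log-log fit.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(8)
    x = rng.uniform(1, 100, 200)
    y = 2.0 * x**0.75 + rng.normal(0, 3.0, 200)      # additive error on the original scale
    y = np.clip(y, 0.1, None)                        # keep values positive for the log fit

    # Traditional approach: straight line on logarithms, then back-transform.
    b_log, log_a = np.polyfit(np.log(x), np.log(y), 1)
    a_log = np.exp(log_a)

    # Direct fit in the arithmetic domain (the approach recommended above).
    (a_nl, b_nl), _ = curve_fit(lambda x, a, b: a * x**b, x, y, p0=(1.0, 1.0))

    print(f"log-log back-transformed: a={a_log:.2f}, b={b_log:.3f}")
    print(f"nonlinear (arithmetic):   a={a_nl:.2f}, b={b_nl:.3f}   (true: a=2, b=0.75)")
    ```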

  3. A bivariate model for analyzing recurrent multi-type automobile failures

    Science.gov (United States)

    Sunethra, A. A.; Sooriyarachchi, M. R.

    2017-09-01

    The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures, where failures can occur due to various failure modes and are repetitive, such that more than one failure can occur from each failure mode. In analysing such automobile failures, both the time and the type of the failure serve as response variables. However, these two response variables are highly correlated with each other, since the timing of failures has an association with the mode of the failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other, while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions found suitable for the two responses, time to failure and type of failure, were the Weibull distribution and the multinomial distribution, respectively. The proposed bivariate model was programmed in the SAS procedure PROC NLMIXED by specifying the appropriate likelihood functions. The performance of the bivariate model was compared with separate univariate models fitted for the two responses, and it was identified that better performance is secured by the bivariate model.

  4. A bivariate optimal replacement policy for a multistate repairable system

    International Nuclear Information System (INIS)

    Zhang Yuanlin; Yam, Richard C.M.; Zuo, Ming J.

    2007-01-01

    In this paper, a deteriorating simple repairable system with k+1 states, including k failure states and one working state, is studied. It is assumed that the system after repair is not 'as good as new' and the deterioration of the system is stochastic. We consider a bivariate replacement policy, denoted by (T,N), in which the system is replaced when its working age has reached T or the number of failures it has experienced has reached N, whichever occurs first. The objective is to determine the optimal replacement policy (T,N)* such that the long-run expected profit per unit time is maximized. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal replacement policy can be determined analytically or numerically. We prove that the optimal policy (T,N)* is better than the optimal policy N* for a multistate simple repairable system. We also show that a general monotone process model for a multistate simple repairable system is equivalent to a geometric process model for a two-state simple repairable system in the sense that they have the same structure for the long-run expected profit (or cost) per unit time and the same optimal policy. Finally, a numerical example is given to illustrate the theoretical results
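
    A hedged numerical companion to the policy described above is sketched below: it estimates the long-run expected profit per unit time of a (T, N) policy by simulating renewal cycles of a toy geometric-process model and searching a small grid. All distributions, parameters, and costs are assumptions for illustration; the paper obtains this quantity in closed form.

    ```python
    # Hedged sketch: renewal-reward simulation of a (T, N) replacement policy.
    import numpy as np

    rng = np.random.default_rng(9)

    def profit_rate(T, N, n_cycles=5000, reward=10.0, repair_cost=3.0,
                    replace_cost=40.0, a=1.15, b=1.05):
        total_gain, total_time = 0.0, 0.0
        for _ in range(n_cycles):
            age, time, gain, failures = 0.0, 0.0, 0.0, 0
            while True:
                x = rng.exponential(5.0) / a**failures       # shrinking working times
                if age + x >= T:                             # replace at working age T
                    gain += reward * (T - age); time += T - age
                    break
                gain += reward * x; time += x; age += x
                failures += 1
                if failures >= N:                            # replace at the N-th failure
                    break
                r = rng.exponential(0.5) * b**(failures - 1) # growing repair times
                gain -= repair_cost * r; time += r
            gain -= replace_cost; time += 1.0                # replacement cost and duration
            total_gain += gain; total_time += time
        return total_gain / total_time                       # renewal-reward ratio

    best = max(((profit_rate(T, N), T, N) for T in (10, 20, 30, 40) for N in (2, 4, 6)))
    print(f"best policy in the grid: T={best[1]}, N={best[2]}, profit rate ~ {best[0]:.2f}")
    ```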

  5. Preparation and bivariate analysis of suspensions of human chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    van den Engh, G.J.; Trask, B.J.; Gray, J.W.; Langlois, R.G.; Yu, L.C.

    1985-01-01

    Chromosomes were isolated from a variety of human cell types using a HEPES-buffered hypotonic solution (pH 8.0) containing KCl, MgSO4, dithioerythritol, and RNase. The chromosomes isolated by this procedure could be stained with a variety of fluorescent stains including propidium iodide, chromomycin A3, and Hoechst 33258. Addition of sodium citrate to the stained chromosomes was found to improve the total fluorescence resolution. High-quality bivariate Hoechst vs. chromomycin fluorescence distributions were obtained for chromosomes isolated from a human fibroblast cell strain, a human colon carcinoma cell line, and human peripheral blood lymphocyte cultures. Good flow karyotypes were also obtained from primary amniotic cell cultures. The Hoechst vs. chromomycin flow karyotypes of a given cell line, made at different times and at dye concentrations varying over fourfold ranges, show little variation in the relative peak positions of the chromosomes. The size of the DNA in chromosomes isolated using this procedure ranges from 20 to 50 kilobases. The described isolation procedure is simple, it yields high-quality flow karyotypes, and it can be used to prepare chromosomes from clinical samples. 22 references, 7 figures, 1 table.

  6. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimating the parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
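
    As a concrete illustration of the model family described above (a minimal sketch only: the mixing probability p and the rates lam1, lam2, lam0 below are hypothetical, and the study itself fitted the model in a Bayesian framework via MCMC), the probability mass function of a bivariate zero-inflated Poisson built on the common-shock bivariate Poisson can be written out directly:

      import math

      def bivariate_poisson_pmf(x, y, lam1, lam2, lam0):
          # Common-shock construction: X = U1 + U0, Y = U2 + U0 with independent Poissons.
          total = 0.0
          for k in range(min(x, y) + 1):
              total += (lam0 ** k / math.factorial(k)
                        * lam1 ** (x - k) / math.factorial(x - k)
                        * lam2 ** (y - k) / math.factorial(y - k))
          return math.exp(-(lam1 + lam2 + lam0)) * total

      def bzip_pmf(x, y, p, lam1, lam2, lam0):
          # Zero inflation: with probability p the pair is a structural (0, 0).
          base = bivariate_poisson_pmf(x, y, lam1, lam2, lam0)
          return p * (1.0 if x == 0 and y == 0 else 0.0) + (1.0 - p) * base

      # Example: probability of 2 donations and 1 deferral under illustrative parameters.
      print(bzip_pmf(2, 1, p=0.3, lam1=1.2, lam2=0.5, lam0=0.4))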

  7. Bivariate Developmental Relations between Calculations and Word Problems: A Latent Change Approach.

    Science.gov (United States)

    Gilbert, Jennifer K; Fuchs, Lynn S

    2017-10-01

    The relation between 2 forms of mathematical cognition, calculations and word problems, was examined. Across grades 2-3, performance of 328 children (mean starting age 7.63 [SD = 0.43]) was assessed 3 times. Comparison of a priori latent change score models indicated that a dual change model, with consistently positive but slowing growth, described development in each domain better than a constant or proportional change model. The bivariate model including change models for both calculations and word problems indicated prior calculation performance and change were not predictors of subsequent word-problem change, and prior word-problem performance and change were not predictors of subsequent calculation change. Results were comparable for boys versus girls. The bivariate model, along with correlations among intercepts and slopes, suggests calculation and word-problem development are related, but through an external set of overlapping factors. Exploratory supplemental analyses corroborate these findings and provide direction for future study.

  8. Concomitants of Order Statistics from Bivariate Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aleem

    2006-01-01

    Full Text Available The probability density function (pdf) of the rth, 1 ≤ r ≤ n, concomitant and the joint pdf of the rth and sth, 1 ≤ r < s ≤ n, concomitants of order statistics from the Bivariate Inverse Rayleigh Distribution, together with their moments and product moments, are obtained. Its percentiles are also obtained.
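
    For orientation, the quantities derived in the paper follow the standard definitions for concomitants of order statistics (the textbook forms, not the paper's specific Bivariate Inverse Rayleigh expressions): writing Y_[r:n] for the concomitant of the r-th order statistic X_{r:n},

        f_{[r:n]}(y) = \int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x)\, f_{r:n}(x)\, dx,

        f_{[r,s:n]}(y_1, y_2) = \iint_{x_1 < x_2} f_{Y \mid X}(y_1 \mid x_1)\, f_{Y \mid X}(y_2 \mid x_2)\, f_{r,s:n}(x_1, x_2)\, dx_1\, dx_2,

    where f_{r:n} and f_{r,s:n} are the single and joint densities of the order statistics of X; the moments and product moments reported in the paper follow by integrating against these densities.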

  9. Dissecting the correlation structure of a bivariate phenotype ...

    Indian Academy of Sciences (India)

    Dissecting the correlation structure of a bivariate phenotype: common genes or shared environment? High correlations between two quantitative traits may be either due to common genetic factors or common environmental factors or a combination of both.

  10. Modelling of Uncertainty and Bi-Variable Maps

    Science.gov (United States)

    Nánásiová, Ol'ga; Pykacz, Jarosław

    2016-05-01

    The paper gives an overview and compares various bi-variable maps from orthomodular lattices into the unit interval. It focuses mainly on such bi-variable maps that may be used for constructing joint probability distributions for random variables which are not defined on the same Boolean algebra.

  11. An assessment on the use of bivariate, multivariate and soft ...

    Indian Academy of Sciences (India)

    Conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN) models, representing the bivariate, multivariate and soft computing techniques, were used in GIS-based collapse susceptibility mapping in an area of the Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the ...

  12. An assessment on the use of bivariate, multivariate and soft ...

    Indian Academy of Sciences (India)

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for ... map is a useful tool in urban planning. (Table 1 of the paper reports the frequency ratio of geological factors to collapse occurrences and the P(A/Bi) results obtained from the Conditional Probability model.)

  13. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular bivariate spline function, built and algorithmically implemented in previous papers. The properties typical of this family of splines have an impact on the field of computer graphics, in particular on reverse engineering.

  14. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  15. Spectrum-based estimators of the bivariate Hurst exponent

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 90, č. 6 (2014), art. 062802 ISSN 1539-3755 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: bivariate Hurst exponent * power-law cross-correlations * estimation Subject RIV: AH - Economics Impact factor: 2.288, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0436818.pdf

  16. Nucleotide sequence analyses of coat protein gene of peanut stunt virus isolates from alfalfa and different hosts show a new tentative subgroup from Iran.

    Science.gov (United States)

    Amid-Motlagh, Mohammad Hadi; Massumi, Hossein; Heydarnejad, Jahangir; Mehrvar, Mohsen; Hajimorad, Mohammad Reza

    2017-09-01

    Alfalfa cultivars grown in 14 provinces of Iran were surveyed for the relative incidence of peanut stunt virus (PSV) during 2013-2016. PSV was detected in 41.89% of symptomatic alfalfa samples and in a few alternate hosts by plate-trapped antigen ELISA. Among the other hosts tested, only Chenopodium album, Robinia pseudoacacia and Arachis hypogaea were found naturally infected with PSV. Twenty-five PSV isolates were chosen for biological and molecular characterization based on their geographical distribution. There were no differences in the experimental host range of these isolates; however, variation in systemic symptoms was observed on Nicotiana glutinosa. Total RNA from the 25 viral isolates was subjected to reverse transcription polymerase chain reaction analysis using primers directed against the coat protein (CP) gene. The CP genes of the 25 Iranian PSV isolates were either 651 or 666 nucleotides long. The nucleotide and amino acid identities of the CP gene among Iranian PSV isolates were 79.3-99.7% and 72-100%, respectively. They also shared between 67.4 and 82.4% pairwise nucleotide identity with other PSV isolates reported elsewhere in the world. Phylogenetic analyses of the CP gene sequences showed the formation of a new subgroup comprising only the Iranian isolates. Natural infection of a few alternate hosts with PSV is reported for the first time from Iran.

  17. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.
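
    The bivariate-normal comparison method mentioned above is easy to sketch (an illustration on synthetic wind components, not the authors' robust BST procedure): observations whose squared Mahalanobis distance exceeds a chi-square quantile with 2 degrees of freedom are flagged.

      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(0)
      winds = rng.normal(size=(1000, 2))             # placeholder (u, v) wind components
      winds[:5] += 8.0                               # inject a few gross errors

      mean = winds.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(winds, rowvar=False))
      diff = winds - mean
      d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distances

      flagged = d2 > chi2.ppf(0.999, df=2)           # distance-based outlier flag
      print(flagged.sum(), "observations flagged")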

  18. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
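
    A minimal sketch of the joint-exceedance idea behind bivariate return periods (illustrative only; the annual series, thresholds and the 'hot and dry' direction below are assumptions for the example, not the authors' exact indicator):

      import numpy as np

      def joint_return_period(temp, precip, t_thresh, p_thresh):
          """Return period (in years) of years at least as hot as t_thresh
          and at least as dry as p_thresh, from annual series."""
          temp, precip = np.asarray(temp), np.asarray(precip)
          exceed = (temp >= t_thresh) & (precip <= p_thresh)
          prob = exceed.mean()
          return np.inf if prob == 0 else 1.0 / prob

      rng = np.random.default_rng(1)
      temp = rng.normal(20, 2, size=60)      # hypothetical 60-year summer temperature record
      precip = rng.normal(200, 50, size=60)  # hypothetical summer precipitation totals
      print(joint_return_period(temp, precip, t_thresh=23, p_thresh=150))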

  19. Short preheating at 41°C leads to a red blood cells count comparable to that in RET channel of Sysmex analysers in samples showing cold agglutination.

    Science.gov (United States)

    La Gioia, Antonio; Fumi, Maurizio; Fiorini, Fabiana; Pezzati, Paola; Balboni, Fiamma; Bombara, Maria; Marini, Alessandra; Pancione, Ylenia; Solarino, Leonardo; Marchese, Elisa; Sale, Silvia; Rocco, Vincenzo; Fiorini, Marcello

    2018-03-13

    The presence of cold agglutinins in blood samples can cause spontaneous agglutination of red blood cells (RBCs) when low temperatures occur. This phenomenon causes a spurious lowering of the RBC count on automated haematological analysers, which is detected by incongruous values (≥370 g/L) of the mean cellular haemoglobin concentration (MCHC). Preheating at 37°C can remove the RBC agglutination, generally resulting in a reliable count. It has been reported that the same result can be reached by using the optical reticulocyte (RET) channel of Sysmex analysers, where the RBC count is not influenced by the presence of cold agglutinins. This study aims to evaluate these data in a larger population, with regard to environmental conditions, on Sysmex analysers. We have also evaluated the influence of different thermal pretreatments on the RBC count. This study was performed on 96 remnants of peripheral blood samples (48 with MCHC in the normal range and 48 with MCHC > 370 g/L) which were analysed under different preanalytical conditions on the Sysmex analysers. Preheating of samples at 41°C for 1 min leads to a reversibility of the cold agglutination comparable to that observed in the RET channel and yields better results compared with 37°C for 2 hours. None of the described procedures assures complete reversibility of the cold agglutination in every case. Consequently, since the haematological analysers do not yet provide reliable parameters to confirm the complete resolution of agglutination, further verification of RBC count accuracy needs to be performed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. A systematic review and meta-Analyses show that carbapenem use and medical devices are the leading risk factors for carbapenem- resistant pseudomonas aeruginosa

    NARCIS (Netherlands)

    A.F. Voor (Anne); J.A. Severin (Juliëtte); E.M.E.H. Lesaffre (Emmanuel); M.C. Vos (Margreet)

    2014-01-01

    textabstractA systematic review and meta-Analyses were performed to identify the risk factors associated with carbapenem-resistant Pseudomonas aeruginosa and to identify sources and reservoirs for the pathogen. A systematic search of PubMed and Embase databases from 1 January 1987 until 27 January

  1. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    Science.gov (United States)

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  2. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
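
    To make the copula step concrete, a sketch under stated assumptions (a Gumbel copula with a hand-picked parameter theta and illustrative marginal probabilities, not the fitted model from the study) of the joint 'AND' return period of a drought exceeding both a duration and a severity threshold:

      import math

      def gumbel_copula_cdf(u, v, theta):
          # C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1
          return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

      def and_return_period(u, v, theta, mu=1.0):
          # mu: mean interarrival time of drought events (years); u, v: marginal
          # non-exceedance probabilities of duration and severity at the thresholds.
          p_joint_exceed = 1.0 - u - v + gumbel_copula_cdf(u, v, theta)
          return mu / p_joint_exceed

      # Example with assumed values: 90th-percentile duration and severity, theta = 2.
      print(and_return_period(u=0.9, v=0.9, theta=2.0))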

  3. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  4. Bivariate genetic analyses of stuttering and nonfluency in a large sample of 5-year old twins

    NARCIS (Netherlands)

    van Beijsterveldt, C.E.M.; Felsenfeld, S.; Boomsma, D.I.

    2010-01-01

    Purpose: Behavioral genetic studies of speech fluency have focused on participants who present with clinical stuttering. Knowledge about genetic influences on the development and regulation of normal speech fluency is limited. The primary aims of this study were to identify the heritability of

  5. Perceived Social Support and Academic Achievement: Cross-Lagged Panel and Bivariate Growth Curve Analyses

    Science.gov (United States)

    Mackinnon, Sean P.

    2012-01-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help…

  6. Descriptive and network analyses of the equine contact network at an equestrian show in Ontario, Canada and implications for disease spread.

    Science.gov (United States)

    Spence, Kelsey L; O'Sullivan, Terri L; Poljak, Zvonimir; Greer, Amy L

    2017-06-21

    Identifying the contact structure within a population of horses attending a competition is an important element towards understanding the potential for the spread of equine pathogens as the horses subsequently travel from location to location. However, there is limited information in Ontario, Canada to quantify contact patterns of horses. The objective of this study was to describe the network of potential contacts associated with an equestrian show to determine how this network structure may influence potential disease transmission. This was a descriptive study of horses attending an equestrian show in southern Ontario, Canada on July 6 and 7, 2014. Horse show participants completed a questionnaire about their horse, travel patterns, and infection control practices. Questionnaire responses were received from the owners of 79.7% (55/69) of the horses attending the show. Owners reported that horses attending the show were vaccinated for diseases such as rabies, equine influenza, and equine herpesvirus. Owners demonstrated high compliance with most infection control practices by reporting reduced opportunities for direct and indirect contact while away from home. The two-mode undirected network consisted of 820 nodes (41 locations and 779 horses). Eight percent of the nodes in the network represented horses attending the show, 87% represented horses not attending the show but boarded at individual home facilities, and 5% represented locations. The median degree of a horse in the network was 33 (range: 1-105). Developing disease management strategies without the explicit consideration of horses boarded at individual home facilities would underestimate the connectivity of horses in the population. The results of this study provide information that can be used by equestrian show organizers to configure event management in such a way as to limit the extent of potential disease spread.
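
    A small sketch of how such a two-mode (horse-location) contact network can be assembled and summarized with networkx (toy, invented edges; the study's network of 820 nodes was built from the questionnaire responses):

      import networkx as nx

      # Toy two-mode network: horses are linked to the locations they visited or board at.
      edges = [
          ("horse_A", "show_grounds"), ("horse_B", "show_grounds"),
          ("horse_A", "home_farm_1"), ("horse_B", "home_farm_2"),
          ("horse_C", "home_farm_1"),  # boarded at home, did not attend the show
      ]
      G = nx.Graph()
      G.add_edges_from(edges)

      # Project onto horses: two horses are connected if they share a location.
      horses = {n for n, _ in edges}
      horse_net = nx.bipartite.projected_graph(G, horses)

      degrees = dict(horse_net.degree())
      print("median horse degree:", sorted(degrees.values())[len(degrees) // 2])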

  7. Selection effects in the bivariate brightness distribution for spiral galaxies

    International Nuclear Information System (INIS)

    Phillipps, S.; Disney, M.

    1986-01-01

    The joint distribution of total luminosity and characteristic surface brightness (the bivariate brightness distribution) is investigated for a complete sample of spiral galaxies in the Virgo cluster. The influence of selection and physical limits of various kinds on the apparent distribution is detailed. While the distribution of surface brightness for bright galaxies may be genuinely fairly narrow, faint galaxies exist right across the (quite small) range of accessible surface brightnesses, so no statement can be made about the true extent of the distribution. The lack of high surface brightness bright galaxies in the Virgo sample relative to an overall RC2 sample (mostly field galaxies) supports the contention that the star-formation rate is reduced in the inner region of the cluster for environmental reasons. (author)

  8. Comparative sequence, structure and redox analyses of Klebsiella pneumoniae DsbA show that anti-virulence target DsbA enzymes fall into distinct classes.

    Directory of Open Access Journals (Sweden)

    Fabian Kurth

    Full Text Available Bacterial DsbA enzymes catalyze oxidative folding of virulence factors, and have been identified as targets for antivirulence drugs. However, DsbA enzymes characterized to date exhibit a wide spectrum of redox properties and divergent structural features compared to the prototypical DsbA enzyme of Escherichia coli DsbA (EcDsbA). Nonetheless, sequence analysis shows that DsbAs are more highly conserved than their known substrate virulence factors, highlighting the potential to inhibit virulence across a range of organisms by targeting DsbA. For example, Salmonella enterica typhimurium (SeDsbA, 86 % sequence identity to EcDsbA) shares almost identical structural, surface and redox properties. Using comparative sequence and structure analysis we predicted that five other bacterial DsbAs would share these properties. To confirm this, we characterized Klebsiella pneumoniae DsbA (KpDsbA, 81 % identity to EcDsbA). As expected, the redox properties, structure and surface features (from crystal and NMR data) of KpDsbA were almost identical to those of EcDsbA and SeDsbA. Moreover, KpDsbA and EcDsbA bind peptides derived from their respective DsbBs with almost equal affinity, supporting the notion that compounds designed to inhibit EcDsbA will also inhibit KpDsbA. Taken together, our data show that DsbAs fall into different classes; that DsbAs within a class may be predicted by sequence analysis of binding loops; that DsbAs within a class are able to complement one another in vivo; and that compounds designed to inhibit EcDsbA are likely to inhibit DsbAs within the same class.

  9. GATA2 mutations in patients with acute myeloid leukemia-paired samples analyses show that the mutation is unstable during disease evolution.

    Science.gov (United States)

    Hou, Hsin-An; Lin, Yun-Chu; Kuo, Yuan-Yeh; Chou, Wen-Chien; Lin, Chien-Chin; Liu, Chieh-Yu; Chen, Chien-Yuan; Lin, Liang-In; Tseng, Mei-Hsuan; Huang, Chi-Fei; Chiang, Ying-Chieh; Liu, Ming-Chih; Liu, Chia-Wen; Tang, Jih-Luh; Yao, Ming; Huang, Shang-Yi; Ko, Bor-Sheng; Hsu, Szu-Chun; Wu, Shang-Ju; Tsay, Woei; Chen, Yao-Chang; Tien, Hwei-Fang

    2015-02-01

    Recently, mutations of the GATA binding protein 2 (GATA2) gene were identified in acute myeloid leukemia (AML) patients with CEBPA double mutations (CEBPA (double-mut)), but the interaction of this mutation with other genetic alterations and its dynamic changes during disease progression remain to be determined. In this study, 14 different missense GATA2 mutations, which were all clustered in the highly conserved N-terminal zinc finger 1 domain, were identified in 27.4, 6.7, and 1 % of patients with CEBPA (double-mut), CEBPA (single-mut), and CEBPA wild type, respectively. All but one patient with GATA2 mutation had concurrent CEBPA mutation. GATA2 mutations were closely associated with younger age, FAB M1 subtype, intermediate-risk cytogenetics, expression of HLA-DR, CD7, CD15, or CD34 on leukemic cells, and CEBPA mutation, but negatively associated with FAB M4 subtype, favorable-risk cytogenetics, and NPM1 mutation. Patients with GATA2 mutation had significantly better overall survival and relapse-free survival than those without GATA2 mutation. Sequential analysis showed that the original GATA2 mutations might be lost during disease progression in GATA2-mutated patients, while novel GATA2 mutations might be acquired at relapse in GATA2-wild patients. In conclusion, AML patients with GATA2 mutations had distinct clinic-biological features and a favorable prognosis. GATA2 mutations might be lost or acquired at disease progression, implying that it was a second hit in the leukemogenesis of AML, especially those with CEBPA mutation.

  10. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-changing dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and from marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  11. A bivariate space-time downscaler under space and time misalignment.

    Science.gov (United States)

    Berrocal, Veronica J; Gelfand, Alan E; Holland, David M

    2010-12-01

    Ozone and particulate matter PM(2.5) are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM(2.5) is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM(2.5) for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain.

  12. Semiparametric probit models with univariate and bivariate current-status data.

    Science.gov (United States)

    Liu, Hao; Qin, Jing

    2018-03-01

    Multivariate current-status data are frequently encountered in biomedical and public health studies. Semiparametric regression models have been extensively studied for univariate current-status data, but most existing estimation procedures are computationally intensive, involving either penalization or smoothing techniques. It becomes more challenging for the analysis of multivariate current-status data. In this article, we study the maximum likelihood estimations for univariate and bivariate current-status data under the semiparametric probit regression models. We present a simple computational procedure combining the expectation-maximization algorithm with the pool-adjacent-violators algorithm for solving the monotone constraint on the baseline function. Asymptotic properties of the maximum likelihood estimators are investigated, including the calculation of the explicit information bound for univariate current-status data, as well as the asymptotic consistency and convergence rate for bivariate current-status data. Extensive simulation studies showed that the proposed computational procedures performed well under small or moderate sample sizes. We demonstrate the estimation procedure with two real data examples in the areas of diabetic and HIV research. © 2017, The International Biometric Society.
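
    The monotone-baseline step can be illustrated on its own (a sketch only, using scikit-learn's isotonic regression, which implements pool-adjacent-violators, on made-up data; the paper embeds this step inside an EM algorithm for the semiparametric probit model):

      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 10, size=50))            # monitoring times
      noisy_baseline = 0.2 * t + rng.normal(0, 0.3, 50)   # unconstrained estimates at those times

      # Pool-adjacent-violators: project the noisy estimates onto non-decreasing functions.
      iso = IsotonicRegression(increasing=True)
      monotone_baseline = iso.fit_transform(t, noisy_baseline)

      assert np.all(np.diff(monotone_baseline) >= 0)      # monotonicity holds by construction
      print(monotone_baseline[:5])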

  13. Inheritance of dermatoglyphic traits in twins: univariate and bivariate variance decomposition analysis.

    Science.gov (United States)

    Karmakar, Bibha; Malkin, Ida; Kobyliansky, Eugene

    2012-01-01

    Dermatoglyphic traits in a sample of twins were analyzed to estimate the resemblance between MZ and DZ twins and to evaluate the mode of inheritance using maximum likelihood-based variance decomposition analysis. The additive genetic variance component was significant in both sexes for four traits: PII, AB_RC, RC_HB, and ATD_L. AB_RC and RC_HB had significant sex differences in means, whereas PII and ATD_L did not. The bivariate variance decomposition analysis revealed that PII and RC_HB have a significant correlation in both the genetic and residual components, and a significant correlation in the additive genetic variance between AB_RC and ATD_L was observed. The same analysis restricted to the female subsample for the three traits RBL, RBR and AB_DIS showed that the additive genetic component for RBR was significant and the sibling component for AB_DIS was not, while the other components could not be constrained to zero. The additive, sibling and residual components were significantly correlated between each pair of traits in the bivariate variance decomposition analysis.

  14. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. The first study investigated different target width parameters. The second study considered the effect of the motion angle. Based on the results of the two studies, a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study, 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task, 250 rectangular target objects were displayed at randomly chosen positions on the screen, covering a broad range of ID values (ID = [1.01; 4.88]). Compared to existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application of the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and the size of buttons, menus or icons.
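
    For orientation, the classical univariate (Shannon-formulation) Fitts' law that such refinements extend looks like this in code; the coefficients a and b are hypothetical, and the paper's bivariate refinement, which additionally uses target height and motion angle, is not reproduced here:

      import math

      def index_of_difficulty(amplitude, width):
          # Shannon formulation: ID = log2(A / W + 1), in bits.
          return math.log2(amplitude / width + 1.0)

      def movement_time(amplitude, width, a=0.10, b=0.15):
          # MT = a + b * ID; a (s) and b (s/bit) are illustrative regression coefficients.
          return a + b * index_of_difficulty(amplitude, width)

      print(movement_time(amplitude=400, width=40))  # e.g. a 40-px-wide target 400 px away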

  15. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and then all possible pairwise combinations among the resulting set of 30 features are used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as of the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h(-1).
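
    A condensed sketch of the spectral-power front end (synthetic signals, an assumed 256 Hz sampling rate, and a plain SVM, without the paper's pairwise feature combinations, feature selection, or 'Firing Power' regularization):

      import numpy as np
      from scipy.signal import welch
      from sklearn.svm import SVC

      FS = 256                    # assumed sampling rate (Hz)
      BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
               "beta": (13, 30), "gamma": (30, 45)}

      def band_powers(segment):
          f, pxx = welch(segment, fs=FS, nperseg=FS * 2)
          return [pxx[(f >= lo) & (f < hi)].sum() for lo, hi in BANDS.values()]

      rng = np.random.default_rng(3)
      segments = rng.normal(size=(40, FS * 5))          # 40 synthetic 5-second epochs
      X = np.array([band_powers(s) for s in segments])  # 5 spectral-power features each
      y = rng.integers(0, 2, size=40)                   # toy preictal / non-preictal labels

      clf = SVC(kernel="rbf").fit(X, y)
      print(clf.score(X, y))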

  16. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.

  17. Inland dissolved salt chemistry: statistical evaluation of bivariate and ternary diagram models for surface and subsurface waters

    Directory of Open Access Journals (Sweden)

    Stephen T. THRELKELD

    2000-08-01

    Full Text Available We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models, even if large water bodies were evaluated separately from small water bodies. Atmospheric precipitation effects were identified using ternary diagrams in water with total dissolved salts (TDS) below 1000 mg l-1. A principal components analysis showed that the variability in the relative proportions of the major ions was related to atmospheric precipitation, weathering, and evaporation. About half of the variation in the distribution of inorganic ions was related to rock weathering. By considering most of the important inorganic ions, ternary diagrams are able to distinguish the contributions of atmospheric precipitation, rock weathering, and evaporation to inland water chemistry.

  18. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    Directory of Open Access Journals (Sweden)

    S. S. Motsa

    2014-01-01

    Full Text Available This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, the highly nonlinear modified KdV equation, Fisher's equation, the Burgers-Fisher equation, the Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm the accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are presented to show the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results from this study and the known results from the literature.

  19. A composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews.

    Science.gov (United States)

    Chen, Yong; Liu, Yulun; Ning, Jing; Nie, Lei; Zhu, Hongjian; Chu, Haitao

    2017-04-01

    Diagnostic systematic review is a vital step in the evaluation of diagnostic technologies. In many applications, it involves pooling pairs of sensitivity and specificity of a dichotomized diagnostic test from multiple studies. We propose a composite likelihood (CL) method for bivariate meta-analysis in diagnostic systematic reviews. This method provides an alternative way to make inference on diagnostic measures such as sensitivity, specificity, likelihood ratios, and diagnostic odds ratio. Its main advantages over the standard likelihood method are the avoidance of the nonconvergence problem, which is nontrivial when the number of studies is relatively small, the computational simplicity, and some robustness to model misspecifications. Simulation studies show that the CL method maintains high relative efficiency compared to that of the standard likelihood method. We illustrate our method in a diagnostic review of the performance of contemporary diagnostic imaging technologies for detecting metastases in patients with melanoma.

  20. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data are extracted from the production trait evaluation of Nordic Red dairy cattle. Genotyped bulls with daughters are used as training animals, and genotyped bulls and producing cows as candidate animals. For simplicity, the size of the data is chosen so that the full inverses of the mixed model equation coefficient matrices can be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally the bivariate blending method was

  1. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at a regional and probabilistic evaluation of bivariate drought characteristics to assess both past and future drought duration and severity in Bangladesh. The procedure involves applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. This suggests that, compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under- or overestimates the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed in the west of the country. Future drought trends based on four climate models and two scenarios showed the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).

  2. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2017-07-01

    Full Text Available Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature–precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate–crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.

  3. Bivariate flow cytometric analysis and sorting of different types of maize starch grains.

    Science.gov (United States)

    Zhang, Xudong; Feng, Jiaojiao; Wang, Heng; Zhu, Jianchu; Zhong, Yuyue; Liu, Linsan; Xu, Shutu; Zhang, Renhe; Zhang, Xinghua; Xue, Jiquan; Guo, Dongwei

    2018-02-01

    Particle-size distribution, granular structure, and composition significantly affect the physicochemical properties, rheological properties, and nutritional function of starch. Flow cytometry and flow sorting are widely considered convenient and efficient ways of classifying and separating natural biological particles or other substances into subpopulations, respectively, based on the differential response of each component to stimulation by a light beam; the results allow for the correlation analysis of parameters. In this study, different types of starches isolated from waxy maize, sweet maize, high-amylose maize, pop maize, and normal maize were initially classified into various subgroups by flow cytometer and then collected through flow sorting to observe their morphology and particle-size distribution. The results showed that a 0.25% Gelzan solution served as an optimal reagent for keeping individual starch particles homogeneously dispersed in suspension for a relatively long time. The bivariate flow cytometric population distributions indicated that the starches of normal maize, sweet maize, and pop maize were divided into two subgroups, whereas high-amylose maize starch had only one subgroup. Waxy maize starch, conversely, showed three subpopulations. The subgroups sorted by flow cytometer were determined and verified in terms of morphology and granule size by scanning electron microscopy and laser particle distribution analyzer. Results showed that flow cytometry can be regarded as a novel method for classifying and sorting starch granules. © 2017 International Society for Advancement of Cytometry.

  4. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis prepares a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method has a slightly better performance than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
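
    The statistical index weight of a class of a conditioning factor is commonly computed as the logarithm of the ratio of the spring density in that class to the overall spring density; a minimal sketch under that assumption (the class areas and spring counts below are invented):

      import math

      # Hypothetical classes of one conditioning factor (e.g. slope), with the number of
      # springs falling in each class and the area of each class in km^2.
      classes = {"gentle": (120, 400.0), "moderate": (300, 350.0), "steep": (76, 250.0)}

      total_springs = sum(n for n, _ in classes.values())
      total_area = sum(a for _, a in classes.values())
      overall_density = total_springs / total_area

      si = {name: math.log((n / a) / overall_density) for name, (n, a) in classes.items()}
      print(si)   # positive values indicate classes with above-average spring density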

  5. Multiscale Fluctuation Features of the Dynamic Correlation between Bivariate Time Series

    Directory of Open Access Journals (Sweden)

    Meihui Jiang

    2016-01-01

    Full Text Available The fluctuation of the dynamic correlation between bivariate time series has some special features in the time-frequency domain. In order to study these fluctuation features, this paper built dynamic correlation network models using two kinds of time series as sample data. After studying the dynamic correlation networks at different time scales, we found that the correlation between time series is a dynamic process. The correlation is strong and stable in the long term, but weak and unstable in the short and medium term. There are key correlation modes which can effectively indicate the trend of the correlation. The transmission characteristics of the correlation modes show that it is easier to judge the trend of the fluctuation of the correlation between time series from the short term to the long term. The evolution of the media capability of the correlation modes shows that the transmission media in the long term have higher value for predicting the trend of the correlation. This work not only proposes a new perspective for analyzing the correlation between time series but also provides important information for investors and decision makers.

  6. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    Science.gov (United States)

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
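
    The idea can be sketched quickly (an illustration of the general approach with made-up numbers, not the note's own worked examples): an intercept-only OLS regression recovers the arithmetic mean, and regressing transformed data recovers the geometric and harmonic means.

      import numpy as np
      import statsmodels.api as sm

      y = np.array([2.0, 4.0, 8.0, 16.0])
      X = np.ones((len(y), 1))                      # intercept-only design matrix

      arithmetic = sm.OLS(y, X).fit().params[0]     # plain mean
      geometric = np.exp(sm.OLS(np.log(y), X).fit().params[0])
      harmonic = 1.0 / sm.OLS(1.0 / y, X).fit().params[0]

      print(arithmetic, geometric, harmonic)        # 7.5, ~5.66, ~4.27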

  7. Bivariate copulas on the exponentially weighted moving average control chart

    Directory of Open Access Journals (Sweden)

    Sasigarn Kuvattana

    2016-10-01

    Full Text Available This paper proposes four types of copulas on the Exponentially Weighted Moving Average (EWMA) control chart when observations are from an exponential distribution, using a Monte Carlo simulation approach. The performance of the control chart is based on the Average Run Length (ARL), which is compared for each copula. Copula functions for specifying dependence between random variables are used and measured by Kendall's tau. The results show that the Normal copula can be used for almost all shifts.
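
    For reference, the EWMA statistic monitored by such a chart follows the simple recursion below (an illustrative univariate sketch; the smoothing constant, control limit and exponential observations are assumed, and the copula-modelled dependence studied in the paper is not included):

      import numpy as np

      def ewma(x, lam=0.1, z0=None):
          # Z_t = lam * X_t + (1 - lam) * Z_{t-1}
          z = np.empty_like(x, dtype=float)
          prev = x.mean() if z0 is None else z0
          for t, xt in enumerate(x):
              prev = lam * xt + (1 - lam) * prev
              z[t] = prev
          return z

      rng = np.random.default_rng(4)
      obs = rng.exponential(scale=1.0, size=200)   # in-control exponential observations
      z = ewma(obs, lam=0.1, z0=1.0)               # start at the in-control mean
      ucl = 1.0 + 3 * np.sqrt(0.1 / (2 - 0.1))     # rough asymptotic upper control limit
      print(int(np.argmax(z > ucl)) if (z > ucl).any() else "no signal")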

  8. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    Directory of Open Access Journals (Sweden)

    Simone Fiori

    2007-07-01

    Full Text Available Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets do not correspond in size or in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are "holes" in the data) or when the data sets have been acquired independently. Also, statistical modeling is useful when the amount of available data is enough to show relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure.

  9. The Role of Wealth and Health in Insurance Choice: Bivariate Probit Analysis in China

    Directory of Open Access Journals (Sweden)

    Yiding Yue

    2014-01-01

    Full Text Available This paper captures the correlation between the choices of health insurance and pension insurance using the bivariate probit model and then studies the effect of wealth and health on insurance choice. Our empirical evidence shows that people who participate in a health care program are more likely to participate in a pension plan at the same time, while wealth and health have different effects on the choices of the health care program and the pension program. Generally, the higher an individual's wealth level is, the more likely he is to participate in a health care program; but wealth has no effect on pension participation. Health status has opposite effects on the choices of health care programs and pension plans: the poorer an individual's health is, the more likely he is to participate in health care programs, while the better health he enjoys, the more likely he is to participate in pension plans. When the investigation scope narrows down to commercial insurance, there is only a significant effect of health status on commercial health insurance.

  10. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....

  11. Measuring early or late dependence for bivariate lifetimes of twins

    DEFF Research Database (Denmark)

    Scheike, Thomas; Holst, Klaus K; Hjelmborg, Jacob B

    2015-01-01

    -Oakes model. This model can be extended in several directions. One extension is to allow the dependence parameter to depend on covariates. Another extension is to model dependence via piecewise constant cross-hazard ratio models. We show how both these models can be implemented for large sample data......, and suggest a computational solution for obtaining standard errors for such models for large registry data. In addition we consider alternative models that have some computational advantages and with different dependence parameters based on odds ratios of the survival function using the Plackett distribution...

  12. Natural-color maps via coloring of bivariate grid data

    Science.gov (United States)

    Darbyshire, Jane E.; Jenny, Bernhard

    2017-09-01

    Natural ground color is useful for maps where a representation of the Earth's surface matters. Natural color schemes are less likely to be misinterpreted, as opposed to hypsometric color schemes, and are generally preferred by map readers. The creation of natural-color maps was once limited to manual cartographic techniques, but they can now be created digitally with the aid of raster graphics editing software. However, the creation of natural-color maps still requires many steps, a significant time investment, and fairly detailed digital land cover information, which makes this technique impossible to apply to global web maps at medium and large scales. A particular challenge for natural-color map creation is adjusting colors with location to create smoothly blending transitions. Adjustments with location are required to show land cover transitions between climate zones with a natural appearance. This study takes the first step in automating the process in order to facilitate the creation of medium- and large-scale natural-color maps covering large areas. A coloring method based on two grid inputs is presented. Here, we introduce an algorithmic method and prototype software for creating maps with this technique. The prototype software allows the map author to interactively assign colors to design the appearance of the map. This software can generate web map tiles at a global level for medium and large scales. Example natural-color web maps created with this coloring technique are provided.

  13. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    Science.gov (United States)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single tree detection from the airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) with per-tree aboveground biomass. The incapability of the ALS technology in directly measuring DBH leads to the need to predict DBH with other ALS-measured tree-level structural parameters. A copula-based method is proposed in the study to predict DBH with the ALS-measured tree height and crown diameter using a dataset measured in the Lassen National Forest in California. Instead of exploring an explicit mathematical equation that explains the underlying relationship between DBH and other structural parameters, the copula-based prediction method utilizes the dependency between cumulative distributions of these variables, and solves the DBH based on an assumption that for a single tree, the cumulative probability of each structural parameter is identical. Results show that compared with the benchmark least-squares linear regression and the k-MSN imputation, the copula-based method obtains better accuracy in DBH prediction for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and it contributes to the reduction of prediction uncertainty.
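    The core idea, matching cumulative probabilities across structural parameters, can be illustrated with a plain quantile-matching step. The sketch below uses empirical CDFs and a single predictor (height); the paper itself models the joint dependence of height, crown diameter, and DBH with a fitted copula, which is omitted here, so treat this only as a hedged illustration of the matching step.

```python
import numpy as np

def ecdf(values):
    """Return a function u(x) giving the empirical CDF of `values` (Hazen plotting positions)."""
    v = np.sort(np.asarray(values, dtype=float))
    probs = (np.arange(1, v.size + 1) - 0.5) / v.size
    return lambda x: np.interp(x, v, probs, left=probs[0], right=probs[-1])

def predict_dbh(height_new, height_train, dbh_train):
    """Quantile-matching prediction: map a new tree's height to its empirical
    cumulative probability, then read off the DBH value at that same probability.
    This is only the 'equal cumulative probability' step; the paper's copula
    model of the joint dependence is not reproduced."""
    u = ecdf(height_train)(height_new)
    dbh_sorted = np.sort(np.asarray(dbh_train, dtype=float))
    probs = (np.arange(1, dbh_sorted.size + 1) - 0.5) / dbh_sorted.size
    return np.interp(u, probs, dbh_sorted)

# toy example with made-up (hypothetical) field measurements
rng = np.random.default_rng(1)
h_train = rng.gamma(8.0, 2.5, size=200)            # tree heights (m)
d_train = 2.0 * h_train + rng.normal(0, 3, 200)    # DBH (cm), loosely allometric
print(predict_dbh(np.array([15.0, 25.0]), h_train, d_train))
```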

  14. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  15. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  16. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Directory of Open Access Journals (Sweden)

    Jakob Nikolas Kather

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  17. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Science.gov (United States)

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K; Schad, Lothar R; Zöllner, Frank Gerrit

    2015-01-01

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  18. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area under the receiver operating characteristic curve (Az) of 0.9096, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
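    A minimal sketch of the density-based decision rule: fit a bivariate Gaussian kernel density per class and assign a new point to the class with the largest prior times estimated density. The features, priors, and kernel bandwidth below are illustrative stand-ins; the paper's exact kernel estimator and VAG features are not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fit_class_kdes(X, y):
    """Fit a bivariate Gaussian kernel density for each class.
    X: (n, 2) feature array, y: class labels."""
    classes = np.unique(y)
    kdes = {c: gaussian_kde(X[y == c].T) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}
    return classes, kdes, priors

def predict_max_posterior(X_new, classes, kdes, priors):
    """Assign each point to the class maximising prior * estimated density,
    i.e. a maximal posterior probability decision rule."""
    scores = np.column_stack([priors[c] * kdes[c](X_new.T) for c in classes])
    return classes[np.argmax(scores, axis=1)]

# toy illustration with synthetic bivariate features (not real VAG data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, (60, 2)),
               rng.normal([2, 2], 1.2, (40, 2))])
y = np.array([0] * 60 + [1] * 40)
cls, kdes, priors = fit_class_kdes(X, y)
print(predict_max_posterior(np.array([[0.1, -0.2], [2.3, 1.8]]), cls, kdes, priors))
```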

  19. Accuracy of body mass index in predicting pre-eclampsia: bivariate meta-analysis

    NARCIS (Netherlands)

    Cnossen, J. S.; Leeflang, M. M. G.; de Haan, E. E. M.; Mol, B. W. J.; van der Post, J. A. M.; Khan, K. S.; ter Riet, G.

    2007-01-01

    OBJECTIVE: The objective of this study was to determine the accuracy of body mass index (BMI) (pre-pregnancy or at booking) in predicting pre-eclampsia and to explore its potential for clinical application. DESIGN: Systematic review and bivariate meta-analysis. SETTING: Medline, Embase, Cochrane

  20. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with a single parameter that defines both the mean and the variance. Poisson regression therefore assumes that the mean and variance are equal (equidispersion). Nonetheless, in some cases the count data do not satisfy this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion leads to underestimated standard errors and, in turn, to incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution. If there is over-dispersion, modeling paired count data with simple bivariate Poisson regression is not sufficient. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion. The BPIGR model produces a global model for all locations. On the other hand, each location has different geographic, social, cultural, and economic conditions, so Geographically Weighted Regression (GWR) is needed. The weighting function for each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimation for the GWBPIGR model is obtained by the Maximum Likelihood Estimation (MLE) method, while hypothesis testing is carried out with the Maximum Likelihood Ratio Test (MLRT) method.

  1. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    Science.gov (United States)

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  2. First-order dominance: stronger characterization and a bivariate checking algorithm

    DEFF Research Database (Denmark)

    Range, Troels Martin; Østerdal, Lars Peter Raahave

    2018-01-01

    distributions. Utilizing that this problem can be formulated as a transportation problem with a special structure, we provide a stronger characterization of multivariate first-order dominance and develop a linear time complexity checking algorithm for the bivariate case. We illustrate the use of the checking...

  3. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    In this article we use the concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation...

  4. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  5. Bivariate analysis of sensitivity and specificity produces informative summary measures in diagnostic reviews

    NARCIS (Netherlands)

    Reitsma, Johannes B.; Glas, Afina S.; Rutjes, Anne W. S.; Scholten, Rob J. P. M.; Bossuyt, Patrick M.; Zwinderman, Aeilko H.

    2005-01-01

    Background and Objectives: Studies of diagnostic accuracy most often report pairs of sensitivity and specificity. We demonstrate the advantage of using bivariate meta-regression models to analyze such data. Methods: We discuss the methodology of both the summary Receiver Operating Characteristic

  6. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  7. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    Science.gov (United States)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.
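    The statistical test used to relate copula choice to catchment factors can be reproduced directly; the sketch below runs Fisher's exact test on a hypothetical 2x2 table (flood regime versus one- or two-parameter copula family). The counts are invented for illustration and do not come from the study.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table (counts invented for illustration only):
# rows    = flood regime (rainfall-dominated, snowmelt-dominated)
# columns = selected copula family (two-parameter BB1/BB7, one-parameter)
table = [[9, 3],
         [4, 11]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```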

  8. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone...... mineral density (TBLH-BMD) regions in 10,414 children. The estimated SNP heritability is 43% (95% CI: 34-52%) for TBLH-BMD, and 39% (95% CI: 30-48%) for TB-LM, with a shared genetic component of 43% (95% CI: 29-56%). We identify variants with pleiotropic effects in eight loci, including seven established...... as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass.Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total...

  9. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  10. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a narrower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of building the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.

  11. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.

  12. Can the bivariate Hurst exponent be higher than an average of the separate Hurst exponents?

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2015-01-01

    Roč. 431, č. 1 (2015), s. 124-127 ISSN 0378-4371 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords : Correlations * Power-law cross-correlations * Bivariate Hurst exponent * Spectrum coherence Subject RIV: AH - Economics Impact factor: 1.785, year: 2015 http://library.utia.cas.cz/separaty/2015/E/kristoufek-0452314.pdf

  13. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    OpenAIRE

    Agrawal, Purshottam Narain; Baxhaku, Behar; Chauhan, Ruchi

    2017-01-01

    In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre’s K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bög...

  14. On minimum divergence adaptation of discrete bivariate distributions to given marginals

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; van der Meulen, E. C.

    2005-01-01

    Roč. 51, č. 1 (2005), s. 313-320 ISSN 0018-9448 R&D Projects: GA ČR GA201/02/1391; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : approximation of contingency tables * bivariate discrete distributions * minimization of divergences Subject RIV: BD - Theory of Information Impact factor: 2.183, year: 2005

  15. Multiresolution transmission of the correlation modes between bivariate time series based on complex network theory

    Science.gov (United States)

    Huang, Xuan; An, Haizhong; Gao, Xiangyun; Hao, Xiaoqing; Liu, Pengpeng

    2015-06-01

    This study introduces an approach for studying the multiscale transmission characteristics of the correlation modes between bivariate time series. The correlation between the bivariate time series fluctuates over time, and the transmission among the correlation modes exhibits a multiscale phenomenon, which provides richer information. To investigate the multiscale transmission of the correlation modes, this paper describes a hybrid model integrating wavelet analysis and complex network theory to decompose and reconstruct the original bivariate time series into sequences in a joint time-frequency domain and to define the correlation modes in each time-frequency domain. We chose the crude oil spot and futures prices as the sample data. The empirical results indicate that the main duration of volatility (32-64 days) for the strongly positive correlation between the crude oil spot price and the futures price provides more useful information for investors. Moreover, the weighted degree, weighted indegree and weighted outdegree of the correlation modes follow power-law distributions. The correlation fluctuation strengthens the extent of persistence over the long term, whereas persistence weakens over the short and medium term. The primary correlation modes dominating the transmission process and the major intermediary modes in the transmission process are clustered both in the short and long term.

  16. Bivariate spline solution of time dependent nonlinear PDE for a population density over irregular domains.

    Science.gov (United States)

    Gutierrez, Juan B; Lai, Ming-Jun; Slavov, George

    2015-12-01

    We study a time dependent partial differential equation (PDE) which arises from classic models in ecology involving logistic growth with Allee effect by introducing a discrete weak solution. Existence, uniqueness and stability of the discrete weak solutions are discussed. We use bivariate splines to approximate the discrete weak solution of the nonlinear PDE. A computational algorithm is designed to solve this PDE. A convergence analysis of the algorithm is presented. We present some simulations of population development over some irregular domains. Finally, we discuss applications in epidemiology and other ecological problems. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. A comparison between multivariate and bivariate analysis used in marketing research

    Directory of Open Access Journals (Sweden)

    Constantin, C.

    2012-01-01

    This paper describes an instrumental study conducted to compare the information given by two multivariate data analysis methods with that given by the usual bivariate analysis. The outcomes of the research reveal that sometimes the multivariate methods use more information from a certain variable, but sometimes they use only the part of the information considered most important for certain associations. For this reason, a researcher should use both categories of data analysis in order to obtain entirely useful information.

  18. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu - Shawiesh

    2014-06-01

    This paper proposes and examines several bivariate control charts for monitoring individual observations in statistical process control. Usual control charts that use the classical mean and variance-covariance estimators are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T2: T2MedMAD, T2MCD, and T2MVE. A simulation study has been conducted to compare the performance of these control charts. Two real-life data sets are analyzed to illustrate the application of these robust alternatives.
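    A hedged sketch of one of the robust alternatives: an MCD-based T2 chart, where location and scatter come from the minimum covariance determinant estimator and points are flagged against an approximate chi-square limit. The limit and the scikit-learn based implementation are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

def robust_t2_chart(X, alpha=0.005):
    """Robust alternative to the classical Hotelling T^2 chart for bivariate
    individual observations: location and scatter are estimated with the
    minimum covariance determinant (MCD) so outliers do not inflate the limits.
    The chi-square control limit is a common approximation."""
    mcd = MinCovDet(random_state=0).fit(X)
    t2 = mcd.mahalanobis(X)                      # squared robust distances
    ucl = chi2.ppf(1.0 - alpha, df=X.shape[1])   # upper control limit
    return t2, ucl, np.where(t2 > ucl)[0]

# synthetic bivariate data with two planted outliers
rng = np.random.default_rng(3)
X = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=100)
X[[5, 42]] += [4.0, -4.0]
t2, ucl, flagged = robust_t2_chart(X)
print("UCL =", round(ucl, 2), "flagged points:", flagged)
```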

  19. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    Directory of Open Access Journals (Sweden)

    Purshottam Narain Agrawal

    2017-08-01

    In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre’s K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bögel continuous functions. Finally, we present some graphical examples to illustrate the rate of convergence of the operators under consideration.

  20. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase {type distributions, The latter property is potentially important for the development hypothesis testing in linear models. Thirdly, it is very easy to simulate......In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation...

  1. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators.

    Science.gov (United States)

    Agrawal, Purshottam Narain; Baxhaku, Behar; Chauhan, Ruchi

    2017-01-01

    In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre's K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bögel continuous functions. Finally, we present some graphical examples to illustrate the rate of convergence of the operators under consideration.

  2. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
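    The construction the abstract refers to is standard and can be sketched directly: draw two independent standard normals and combine them so that the second coordinate has correlation rho with the first; means and standard deviations are then applied as location and scale.

```python
import numpy as np

def bivariate_normal_pairs(n, mu1, mu2, s1, s2, rho, seed=None):
    """Generate n pairs from a bivariate normal with means mu1, mu2,
    standard deviations s1, s2 and correlation rho, using the standard
    conditional construction: the second coordinate mixes the first
    standard normal with weight rho and an independent one with
    weight sqrt(1 - rho**2)."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n)
    z2 = rng.standard_normal(n)
    x1 = mu1 + s1 * z1
    x2 = mu2 + s2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
    return x1, x2

x, y = bivariate_normal_pairs(100_000, 0.0, 1.0, 1.0, 2.0, rho=0.7, seed=42)
print(np.corrcoef(x, y)[0, 1])   # should be close to 0.7
```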

  3. Review: Martin Spetsmann-Kunkel (2004. Die Moral der Daytime Talkshow. Eine soziologische Analyse eines umstrittenen Fernsehformats [The Morality of the Daytime Talk Show. A Sociological Analysis of a Controversial Television Format

    Directory of Open Access Journals (Sweden)

    Nicola Döring

    2006-05-01

    This book deals with the phenomenon of the daytime talk show from a sociological perspective. The author questions the common cultural pessimism about this TV format ("exhibitionist guests," "voyeuristic spectators"). He first describes the characteristics of the daytime talk show and summarizes the results of previous surveys that reveal a broad variety of talk show guests' and recipients' motives that go beyond pathology. Drawing on concepts like civilization and individualisation, the book outlines the societal functions of the daytime talk show. A participatory observation study in the editorial office of "Hans Meiser" and free interpretations of three series from "Vera am Mittag" are presented as "empirical evidence." Unfortunately the book lacks theoretical and methodological rigor and a sound empirical basis. The bibliography could have been more comprehensive. The work is useful, though, as an inspired, readable introduction to the topic. URN: urn:nbn:de:0114-fqs0603119

  4. Bivariate tensor product (p, q)-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators.

    Science.gov (United States)

    Cai, Qing-Bo; Xu, Xiao-Wei; Zhou, Guorong

    2017-01-01

    In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of (p, q)-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case and estimate a convergence theorem for the Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  5. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.

  6. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing it within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to the top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from the coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
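    The conditional probabilities described here follow from the standard bivariate normal conditioning formula; a minimal sketch is given below. The variable names and the numbers in the example are hypothetical and are not taken from the borehole data.

```python
import numpy as np
from scipy.stats import norm

def conditional_exceedance(x_value, threshold, mu, sigma, rho):
    """P(Y > threshold | X = x_value) under a bivariate normal model.
    mu = (mu_x, mu_y), sigma = (sigma_x, sigma_y), rho = correlation.
    The conditional distribution of Y given X = x is normal with
    mean mu_y + rho*sigma_y/sigma_x*(x - mu_x) and
    standard deviation sigma_y*sqrt(1 - rho**2)."""
    mu_x, mu_y = mu
    s_x, s_y = sigma
    cond_mean = mu_y + rho * s_y / s_x * (x_value - mu_x)
    cond_sd = s_y * np.sqrt(1.0 - rho**2)
    return norm.sf(threshold, loc=cond_mean, scale=cond_sd)

# hypothetical numbers only: probability that the flow fraction exceeds 0.3
# at a depth of 60 units above the coal bed
print(conditional_exceedance(x_value=60.0, threshold=0.3,
                             mu=(80.0, 0.25), sigma=(25.0, 0.10), rho=-0.5))
```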

  7. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  8. Bivariate Random Effects Meta-analysis of Diagnostic Studies Using Generalized Linear Mixed Models

    Science.gov (United States)

    GUO, HONGFEI; ZHOU, YIJIE

    2011-01-01

    Bivariate random effect models are currently one of the main methods recommended to synthesize diagnostic test accuracy studies. However, only the logit-transformation on sensitivity and specificity has been previously considered in the literature. In this paper, we consider a bivariate generalized linear mixed model to jointly model the sensitivities and specificities, and discuss the estimation of the summary receiver operating characteristic curve (ROC) and the area under the ROC curve (AUC). As the special cases of this model, we discuss the commonly used logit, probit and complementary log-log transformations. To evaluate the impact of misspecification of the link functions on the estimation, we present two case studies and a set of simulation studies. Our study suggests that point estimation of the median sensitivity and specificity, and AUC is relatively robust to the misspecification of the link functions. However, the misspecification of link functions has a noticeable impact on the standard error estimation and the 95% confidence interval coverage, which emphasizes the importance of choosing an appropriate link function to make statistical inference. PMID:19959794

  9. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    In this paper we discuss the problem of parametric and nonparametric estimation of the distributions generated by the Marshall-Olkin copula. This copula comes from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model through the copula and different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. This procedure can be used to estimate the parameters of complex survival functions for which it is difficult to find an explicit expression of the mixed moments. Moreover, it is preferable to maximum likelihood because of its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.
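    A sketch of the two-step moment idea for the Marshall-Olkin bivariate exponential itself: marginal rates from the sample means, and the common-shock rate from the sample Kendall's tau, assuming the standard relation tau = lambda3 / (lambda1 + lambda2 + lambda3). This illustrates the structure of the procedure, not the paper's estimator for general marginals.

```python
import numpy as np
from scipy.stats import kendalltau

def fit_marshall_olkin(x, y):
    """Two-step moment-type fit of the Marshall-Olkin bivariate exponential
    (X = min(Z1, Z3), Y = min(Z2, Z3), Z_i ~ Exp(lambda_i)).
    Step 1: marginal rates from the sample means, since X ~ Exp(l1 + l3)
    and Y ~ Exp(l2 + l3).
    Step 2: the common-shock rate l3 from the sample Kendall's tau, using
    the assumed relation tau = l3 / (l1 + l2 + l3)."""
    s1 = 1.0 / np.mean(x)                    # = l1 + l3
    s2 = 1.0 / np.mean(y)                    # = l2 + l3
    tau, _ = kendalltau(x, y)
    l3 = max(tau * (s1 + s2) / (1.0 + tau), 0.0)
    return s1 - l3, s2 - l3, l3              # l1, l2, l3

# simulate from the model and recover the parameters
rng = np.random.default_rng(7)
l1, l2, l3 = 0.5, 1.0, 0.8
z = rng.exponential(scale=[1 / l1, 1 / l2, 1 / l3], size=(20_000, 3))
x, y = np.minimum(z[:, 0], z[:, 2]), np.minimum(z[:, 1], z[:, 2])
print(fit_marshall_olkin(x, y))              # should be near (0.5, 1.0, 0.8)
```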

  10. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
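    Of the three techniques wrapped by the tool, the frequency ratio is the simplest to state in code: the share of hazard pixels in a class divided by the share of all pixels in that class. The sketch below computes it with plain NumPy on a toy raster; the inputs and class labels are illustrative assumptions, not part of the tool itself.

```python
import numpy as np

def frequency_ratio(class_map, hazard_mask):
    """Frequency ratio per class of a categorical raster:
    FR(c) = (share of hazard pixels in class c) / (share of all pixels in class c).
    FR > 1 means the class is positively associated with the hazard.
    Inputs are flattened arrays of equal length; nodata handling is omitted."""
    fr = {}
    total_pix = class_map.size
    total_haz = hazard_mask.sum()
    for c in np.unique(class_map):
        in_class = class_map == c
        pct_haz = hazard_mask[in_class].sum() / total_haz
        pct_area = in_class.sum() / total_pix
        fr[int(c)] = pct_haz / pct_area
    return fr

# toy raster: 3 slope classes, hazard concentrated in class 2
rng = np.random.default_rng(0)
slope_class = rng.integers(0, 3, size=10_000)
hazard = (slope_class == 2) & (rng.random(10_000) < 0.3) | (rng.random(10_000) < 0.02)
print(frequency_ratio(slope_class, hazard))
```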

  11. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Science.gov (United States)

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...

  12. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    , situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto...... with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration...... distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration...

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  14. Showing/Sharing: Analysing Visual Communication from a Praxeological Perspective

    Directory of Open Access Journals (Sweden)

    Maria Schreiber

    2017-12-01

    This contribution proposes a methodological framework for empirical research into visual practices on social media. The framework identifies practices, pictures and platforms as relevant dimensions of analysis. It is mainly developed within, and is compatible with, qualitative, interpretive approaches which focus on visual communication as part of everyday personal communicative practices. Two screenshots from Instagram and Facebook are introduced as empirical examples to investigate collaborative practices of meaning-making relating to pictures on social media. While social media seems to augment reflexive, processual practices of negotiating identities, visual media, in particular, amps up aesthetic, ambivalent and embodied dimensions within these practices.

  15. Bivariate tensor product (p, q)-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators

    Directory of Open Access Journals (Sweden)

    Qing-Bo Cai

    2017-11-01

    In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of (p, q)-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case and estimate a convergence theorem for the Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  16. Quasi-bivariate variational mode decomposition as a tool of scale analysis in wall-bounded turbulence

    Science.gov (United States)

    Wang, Wenkang; Pan, Chong; Wang, Jinjun

    2018-01-01

    The identification and separation of multi-scale coherent structures is a critical task for the study of scale interaction in wall-bounded turbulence. Here, we propose a quasi-bivariate variational mode decomposition (QB-VMD) method to extract structures with various scales from instantaneous two-dimensional (2D) velocity field which has only one primary dimension. This method is developed from the one-dimensional VMD algorithm proposed by Dragomiretskiy and Zosso (IEEE Trans Signal Process 62:531-544, 2014) to cope with a quasi-2D scenario. It poses the feature of length-scale bandwidth constraint along the decomposed dimension, together with the central frequency re-balancing along the non-decomposed dimension. The feasibility of this method is tested on both a synthetic flow field and a turbulent boundary layer at moderate Reynolds number (Re_{τ } = 3458) measured by 2D particle image velocimetry (PIV). Some other popular scale separation tools, including pseudo-bi-dimensional empirical mode decomposition (PB-EMD), bi-dimensional EMD (B-EMD) and proper orthogonal decomposition (POD), are also tested for comparison. Among all these methods, QB-VMD shows advantages in both scale characterization and energy recovery. More importantly, the mode mixing problem, which degrades the performance of EMD-based methods, is avoided or minimized in QB-VMD. Finally, QB-VMD analysis of the wall-parallel plane in the log layer (at y/δ = 0.12) of the studied turbulent boundary layer shows the coexistence of large- or very large-scale motions (LSMs or VLSMs) and inner-scaled structures, which can be fully decomposed in both physical and spectral domains.

  17. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    Science.gov (United States)

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  18. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals and might therefore affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subject has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is taken into account. Using the ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variation in ECG signals, e.g., normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.
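    The evaluation protocol (kNN with 10-fold cross-validation) is easy to reproduce once features are available; the sketch below uses scikit-learn with a random stand-in feature matrix, since the BEMD feature extraction itself is not reproduced here.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# `features` would hold BEMD-derived statistics per ECG segment and `subject_id`
# the identity labels; here both are random stand-ins just to show the protocol.
rng = np.random.default_rng(0)
features = rng.normal(size=(260, 8))          # hypothetical feature matrix
subject_id = np.repeat(np.arange(26), 10)     # 26 subjects, 10 segments each

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, features, subject_id, cv=cv)
print(f"mean accuracy over 10 folds: {scores.mean():.3f}")
```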

  19. Modeling both the numbers of paucibacillary and multibacillary leprosy patients by using bivariate Poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important public-health issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world total). This number automatically places Indonesia as the ASEAN country with the highest leprosy morbidity. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modeling the numbers of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so the modeling is conducted using the bivariate Poisson regression method. The units of observation are in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.
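
    For orientation, the sketch below implements the trivariate-reduction bivariate Poisson probability mass function that is the usual basis of bivariate Poisson regression; it assumes the standard form of the model only, and the covariate specification used for East Java is not reproduced (the parameter values are illustrative).

        import numpy as np
        from math import exp, factorial

        def bivariate_poisson_pmf(y1, y2, lam1, lam2, lam0):
            # Trivariate reduction: Y1 = X1 + X0, Y2 = X2 + X0 with X1, X2, X0
            # independent Poisson(lam1), Poisson(lam2), Poisson(lam0); lam0 induces
            # the positive correlation between the two counts.
            s = 0.0
            for k in range(min(y1, y2) + 1):
                s += (lam1 ** (y1 - k) / factorial(y1 - k)
                      * lam2 ** (y2 - k) / factorial(y2 - k)
                      * lam0 ** k / factorial(k))
            return exp(-(lam1 + lam2 + lam0)) * s

        # In the regression version each mean gets a log link, e.g.
        # lam1_i = exp(x_i @ beta1), lam2_i = exp(x_i @ beta2), with x_i the
        # unit-level predictors (environment, demography, poverty).
        print(bivariate_poisson_pmf(2, 3, 1.5, 2.0, 0.5))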

  20. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event-based bivariate frequency analysis approach to determining design rainfalls, in which the number, intensity, and duration of actual rainstorm events were considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea, and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1 h). This stems from accounting for the occurrence probability of each rainstorm event and from taking a different angle on rainfall frequency analysis, and it could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
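
    To illustrate the copula step, the sketch below computes the joint ("AND") return period of a storm whose intensity and duration both exceed given design values, assuming a Gumbel-Hougaard copula. The paper's actual copula family, event definition, and parameter values may differ; the numbers below are placeholders.

        import numpy as np

        def gumbel_copula(u, v, theta):
            # Gumbel-Hougaard copula C(u, v); theta >= 1, theta = 1 is independence.
            return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

        def and_return_period(u, v, theta, mean_interarrival_yr):
            # Return period of events whose intensity AND duration both exceed the
            # design values, where u = F_I(i) and v = F_D(d) are marginal CDF values
            # and mean_interarrival_yr is the mean time between rainstorm events.
            p_both_exceeded = 1.0 - u - v + gumbel_copula(u, v, theta)
            return mean_interarrival_yr / p_both_exceeded

        print(and_return_period(u=0.95, v=0.90, theta=2.0, mean_interarrival_yr=0.1))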

  1. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.
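
    A sketch of the Matérn building block underlying such a model is given below; the cross-covariance shown assumes the parsimonious specification (a common spatial scale and a cross-smoothness equal to the average of the marginal smoothnesses). This is a generic illustration with made-up parameter values, not a transcription of the paper's fitted model.

        import numpy as np
        from scipy.special import gamma, kv

        def matern_corr(h, nu, a):
            # Matérn correlation with smoothness nu and inverse range a at distance h.
            h = np.atleast_1d(np.asarray(h, dtype=float))
            out = np.ones_like(h)
            nz = h > 0
            ah = a * h[nz]
            out[nz] = (2.0 ** (1.0 - nu) / gamma(nu)) * (ah ** nu) * kv(nu, ah)
            return out

        def cross_covariance(h, sig_temp, sig_prec, rho, nu_t, nu_p, a):
            # Parsimonious bivariate Matérn: common scale a, cross-smoothness
            # (nu_t + nu_p) / 2; rho is the co-located cross-correlation.
            return rho * sig_temp * sig_prec * matern_corr(h, 0.5 * (nu_t + nu_p), a)

        print(cross_covariance(np.array([0.0, 100.0, 500.0]), sig_temp=1.2,
                               sig_prec=2.0, rho=0.4, nu_t=0.8, nu_p=1.5, a=1.0 / 300.0))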

  2. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    Science.gov (United States)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on-board low Size, Weight, and Power (SWaP) platforms must provide high performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, place a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve the modeling fidelity of a previously-developed, computationally-efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWaP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM), which implements a bivariate wavelet shrinkage denoising algorithm that exploits interscale dependency between wavelet coefficients. We formulate optimization problems for parameters controlling deadzone size, which leads to improved denoising performance. Two formulations are provided; one with a simple, closed form solution which we use for numerical result generation, and the second as an integral equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery, and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate denoising performance improvement when using the enhanced modeling over performance obtained with the baseline SSM model.
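
    For reference, the classical Sendur-Selesnick bivariate shrinkage rule (the baseline the paper improves on) can be sketched as follows. The threshold term sets the deadzone size; the factor k, which is sqrt(3) in the classical rule, stands in schematically for the parameter whose optimization the paper addresses. This is a sketch of the baseline rule only, not of the paper's enhanced model.

        import numpy as np

        def bivariate_shrink(w_child, w_parent, sigma_noise, sigma_signal, k=np.sqrt(3.0)):
            # Sendur-Selesnick bivariate shrinkage of a wavelet coefficient given its
            # parent at the next coarser scale. The term k * sigma_noise**2 / sigma_signal
            # is the deadzone (threshold) size.
            r = np.sqrt(w_child ** 2 + w_parent ** 2)
            thresh = k * sigma_noise ** 2 / max(sigma_signal, 1e-12)
            gain = np.maximum(r - thresh, 0.0) / np.maximum(r, 1e-12)
            return gain * w_child

        # Example with illustrative coefficient values and noise/signal estimates.
        print(bivariate_shrink(w_child=2.0, w_parent=1.0, sigma_noise=1.0, sigma_signal=2.5))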

  3. Bivariate Gaussian bridges: directional factorization of diffusion in Brownian bridge models.

    Science.gov (United States)

    Kranstauber, Bart; Safi, Kamran; Bartumeus, Frederic

    2014-01-01

    In recent years, high-resolution animal tracking data have become the standard in movement ecology. The Brownian Bridge Movement Model (BBMM) is a widely adopted approach to describe animal space use from such high-resolution tracks. One of the underlying assumptions of the BBMM is isotropic diffusive motion between consecutive locations, i.e. invariant with respect to direction. Here we propose to relax this often unrealistic assumption by separating the Brownian motion variance into two directional components, one parallel and one orthogonal to the direction of the motion. Our new model, the Bivariate Gaussian bridge (BGB), tracks movement heterogeneity across time. Using the BGB and identifying directed and non-directed movement within a trajectory resulted in more accurate utilisation distributions compared to dynamic Brownian bridges, especially for trajectories with a non-isotropic diffusion, such as directed movement or Lévy-like movements. We evaluated our model with simulated trajectories and observed tracks, demonstrating that the improvement of our model scales with the directional correlation of a correlated random walk. We find that many of the animal trajectories do not adhere to the assumptions of the BBMM. The proposed model improves accuracy when describing space use in simulated correlated random walks as well as in observed animal tracks. Our novel approach is implemented and available within the "move" package for R.

  4. Bayesian bivariate meta-analysis of diagnostic test studies using integrated nested Laplace approximations.

    Science.gov (United States)

    Paul, M; Riebler, A; Bachmann, L M; Rue, H; Held, L

    2010-05-30

    For bivariate meta-analysis of diagnostic studies, likelihood approaches are very popular. However, they often run into numerical problems with possible non-convergence. In addition, the construction of confidence intervals is controversial. Bayesian methods based on Markov chain Monte Carlo (MCMC) sampling could be used, but are often difficult to implement, and require long running times and diagnostic convergence checks. Recently, a new Bayesian deterministic inference approach for latent Gaussian models using integrated nested Laplace approximations (INLA) has been proposed. With this approach MCMC sampling becomes redundant as the posterior marginal distributions are directly and accurately approximated. By means of a real data set we investigate the influence of the prior information provided and compare the results obtained by INLA, MCMC, and the maximum likelihood procedure SAS PROC NLMIXED. Using a simulation study we further extend the comparison of INLA and SAS PROC NLMIXED by assessing their performance in terms of bias, mean-squared error, coverage probability, and convergence rate. The results indicate that INLA is more stable and gives generally better coverage probabilities for the pooled estimates and less biased estimates of variance parameters. The user-friendliness of INLA is demonstrated by documented R-code. Copyright (c) 2010 John Wiley & Sons, Ltd.

  5. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  6. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    International Nuclear Information System (INIS)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez

    2011-01-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important, because an early diagnosis allows the fault to be corrected, thereby preventing production stoppages, improving operator safety, and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, a set of input variables for an artificial neural network so that the greatest possible number of variables can be monitored. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference across the reactor core were selected for the network input set, because almost all of the monitored variables are related to these variables or their behavior results from the interaction of two or more of them. The nuclear power is related to the increase and decrease of temperatures as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)

  7. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez, E-mail: valter.costa@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important, because an early diagnosis allows the fault to be corrected, thereby preventing production stoppages, improving operator safety, and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, a set of input variables for an artificial neural network so that the greatest possible number of variables can be monitored. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference across the reactor core were selected for the network input set, because almost all of the monitored variables are related to these variables or their behavior results from the interaction of two or more of them. The nuclear power is related to the increase and decrease of temperatures as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)
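
    A minimal sketch of the two selection tools named in the abstract, pairwise (bivariate) Pearson correlation and canonical correlation analysis, is given below, applied to synthetic data standing in for the reactor variables. The actual IEA-R1 variables, their values, and the neural network itself are not reproduced here.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        # X: candidate input variables (e.g. power, flow rate, rod position, core dP);
        # Y: the other monitored variables. Both arrays are synthetic stand-ins.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))
        Y = X @ rng.normal(size=(4, 10)) + 0.1 * rng.normal(size=(500, 10))

        # Bivariate (pairwise Pearson) correlations between each input and each output.
        pairwise = np.corrcoef(X, Y, rowvar=False)[:X.shape[1], X.shape[1]:]

        # Canonical correlations between the input block and the output block.
        cca = CCA(n_components=min(X.shape[1], Y.shape[1]))
        Xc, Yc = cca.fit_transform(X, Y)
        canonical_r = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(Xc.shape[1])]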

  8. Genetic determinant of trabecular bone score (TBS) and bone mineral density: A bivariate analysis.

    Science.gov (United States)

    Ho-Pham, Lan T; Hans, Didier; Doan, Minh C; Mai, Linh D; Nguyen, Tuan V

    2016-11-01

    This study sought to estimate the extent of genetic influence on the variation in trabecular bone score (TBS). We found that genetic factors accounted for ~45% of the variance in TBS, and that the co-variation between TBS and bone density is partially determined by genetic factors. Trabecular bone score has emerged as an important predictor of fragility fracture, but factors underlying the individual differences in TBS have not been explored. In this study, we sought to determine the genetic contribution to the variation of TBS in the general population. The study included 556 women and 189 men from 265 families. The participants' mean age was 53 years (SD 11). We measured lumbar spine bone mineral density (BMD; Hologic Horizon) and then derived TBS from the same scan from which BMD was derived. A biometric model was applied to the data to partition the variance of TBS into two components: one due to additive genetic factors, and one due to environmental factors. The index of heritability was estimated as the ratio of genetic variance to total variance of a trait. Bivariate genetic analysis was conducted to estimate the genetic correlation between TBS and BMD measurements. TBS was strongly correlated with lumbar spine BMD (r=0.73; P<0.001). On average, TBS was higher in men than in women after adjusting for age and height, both of which are significantly associated with TBS and lumbar spine BMD. The age- and height-adjusted index of heritability of TBS was 0.46 (95% CI, 0.39-0.54), which was not much different from that of LSBMD (0.44; 95% CI, 0.31-0.55). Moreover, the genetic correlation between TBS and LSBMD was 0.35 (95% CI, 0.21-0.46), and that between TBS and femoral neck BMD was 0.21 (95% CI, 0.10-0.33). Approximately 45% of the variance in TBS is under genetic influence, and this effect magnitude is similar to that of lumbar spine BMD. This finding provides a scientific justification for the search for specific genetic variants that may be associated with TBS and fracture risk.

  9. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference curve is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970.

  10. Bivariate Extension of the Quadrature Method of Moments for Modeling Simultaneous Coagulation and Sintering of Particle Populations.

    Science.gov (United States)

    Wright, Douglas L.; McGraw, Robert; Rosner, Daniel E.

    2001-04-15

    We extend the application of moment methods to multivariate suspended particle population problems: those for which size alone is insufficient to specify the state of a particle in the population. Specifically, a bivariate extension of the quadrature method of moments (QMOM) (R. McGraw, Aerosol Sci. Technol. 27, 255 (1997)) is presented for efficiently modeling the dynamics of a population of inorganic nanoparticles undergoing simultaneous coagulation and particle sintering. Continuum regime calculations are presented for the Koch-Friedlander-Tandon-Rosner model, which includes coagulation by Brownian diffusion (evaluated for particle fractal dimensions, D(f), in the range 1.8-3) and simultaneous sintering of the resulting aggregates (P. Tandon and D. E. Rosner, J. Colloid Interface Sci. 213, 273 (1999)). For evaluation purposes, and to demonstrate the computational efficiency of the bivariate QMOM, benchmark calculations are carried out using a high-resolution discrete method to evolve the particle distribution function n(nu, a) for short to intermediate times (where nu and a are particle volume and surface area, respectively). Time evolution of a selected set of 36 low-order mixed moments is obtained by integration of the full bivariate distribution and compared with the corresponding moments obtained directly using two different extensions of the QMOM. With the more extensive treatment, errors of less than 1% are obtained over substantial aerosol evolution, while requiring only a few minutes (rather than days) of CPU time. Longer time QMOM simulations lend support to the earlier finding of a self-preserving limit for the dimensionless joint (nu, a) particle distribution function under simultaneous coagulation and sintering (Tandon and Rosner, 1999; D. E. Rosner and S. Yu, AIChE J., 47 (2001)). We demonstrate that, even in the bivariate case, it is possible to use the QMOM to rapidly model the approach to asymptotic behavior, allowing an immediate assessment of

  11. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Directory of Open Access Journals (Sweden)

    Jose Javier Gorgoso-Varela

    2016-04-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.

  12. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)
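
    The Plackett copula density used to couple the marginals can be sketched directly, as below. In practice the marginal CDFs (Weibull or logit-logistic, fitted per plot) would supply the pseudo-observations u and v; here they are simulated stand-ins, and the fitting routine is a generic maximum-likelihood sketch rather than the study's exact procedure.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def plackett_density(u, v, theta):
            # Plackett copula density, theta > 0 (theta = 1 gives independence).
            s = 1.0 + (theta - 1.0) * (u + v)
            num = theta * (1.0 + (theta - 1.0) * (u + v - 2.0 * u * v))
            den = (s ** 2 - 4.0 * theta * (theta - 1.0) * u * v) ** 1.5
            return num / den

        def fit_plackett(u, v):
            # Maximize the copula log-likelihood over theta (marginals fitted beforehand).
            nll = lambda theta: -np.sum(np.log(plackett_density(u, v, theta)))
            return minimize_scalar(nll, bounds=(1e-3, 1e3), method="bounded").x

        rng = np.random.default_rng(0)
        u = rng.uniform(size=200)                                      # stand-in for F_D(diameter)
        v = np.clip(u + 0.1 * rng.normal(size=200), 1e-3, 1 - 1e-3)    # stand-in for F_H(height)
        print(fit_plackett(u, v))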

  13. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    Science.gov (United States)

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse but frequently perform better than the standard model. We use an example from a meta-analysis to judge the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer for illustration. Copyright © 2013 John Wiley & Sons, Ltd.

  14. Long-lead station-scale prediction of hydrological droughts in South Korea based on bivariate pattern-based downscaling

    Science.gov (United States)

    Sohn, Soo-Jin; Tam, Chi-Yung

    2016-05-01

    Capturing climatic variations in boreal winter to spring (December-May) is essential for properly predicting droughts in South Korea. This study investigates the variability and predictability of the South Korean climate during this extended season, based on observations from 60 station locations and multi-model ensemble (MME) hindcast experiments (1983/1984-2005/2006) archived at the APEC Climate Center (APCC). Multivariate empirical orthogonal function (EOF) analysis results based on observations show that the first two leading modes of winter-to-spring precipitation and temperature variability, which together account for ~80 % of the total variance, are characterized by regional-scale anomalies covering the whole South Korean territory. These modes were also closely related to some of the recurrent large-scale circulation changes in the northern hemisphere during the same season. Consistent with the above, examination of the standardized precipitation evapotranspiration index (SPEI) indicates that drought conditions in South Korea tend to be accompanied by regional-to-continental-scale circulation anomalies over East Asia to the western north Pacific. Motivated by the aforementioned findings on the spatial-temporal coherence among station-scale precipitation and temperature anomalies, a new bivariate and pattern-based downscaling method was developed. The novelty of this method is that precipitation and temperature data were first filtered using multivariate EOFs to enhance their spatial-temporal coherence, before being linked to large-scale circulation variables using canonical correlation analysis (CCA). To test its applicability and to investigate its related potential predictability, a perfect empirical model was first constructed with observed datasets as predictors. Next, a model output statistics (MOS)-type hybrid dynamical-statistical model was developed, using products from nine one-tier climate models as inputs. It was found that, with model sea
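
    A schematic of the two statistical steps described here (multivariate EOF filtering of the station predictands, then CCA against large-scale predictors) is sketched below using random arrays in place of the APCC hindcasts and the 60-station observations. It is a sketch of the general idea only, with assumed array shapes and component counts, not the MOS system itself.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(1)
        X_large = rng.normal(size=(23, 200))    # large-scale circulation predictors, 23 winters
        Y_station = rng.normal(size=(23, 120))  # standardized precip + temp at 60 stations

        # Step 1: multivariate EOF (PCA) filtering of the joint precip/temperature field.
        eof = PCA(n_components=2).fit(Y_station)

        # Step 2: compress the predictors, link the score series via CCA, map back to stations.
        pcs = PCA(n_components=5).fit(X_large)
        cca = CCA(n_components=2).fit(pcs.transform(X_large), eof.transform(Y_station))
        Y_hat_scores = cca.predict(pcs.transform(X_large))
        Y_hat = eof.inverse_transform(Y_hat_scores)   # downscaled station-scale anomalies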

  15. VolHOG: a volumetric object recognition approach based on bivariate histograms of oriented gradients for vertebra detection in cervical spine MRI.

    Science.gov (United States)

    Daenzer, Stefan; Freitag, Stefan; von Sachsen, Sandra; Steinke, Hanno; Groll, Mathias; Meixensberger, Jürgen; Leimert, Mario

    2014-08-01

    The automatic recognition of vertebrae in volumetric images is an important step toward automatic spinal diagnosis and therapy support systems. There are many applications such as the detection of pathologies and segmentation which would benefit from automatic initialization by the detection of vertebrae. One possible application is the initialization of local vertebral segmentation methods, eliminating the need for manual initialization by a human operator. Automating the initialization process would optimize the clinical workflow. However, automatic vertebra recognition in magnetic resonance (MR) images is a challenging task due to noise in images, pathological deformations of the spine, and image contrast variations. This work presents a fully automatic algorithm for 3D cervical vertebra detection in MR images. We propose a machine learning method for cervical vertebra detection based on new features combined with a linear support vector machine for classification. An algorithm for bivariate gradient orientation histogram generation from three-dimensional raster image data is introduced which allows us to describe three-dimensional objects using the authors' proposed bivariate histograms. A detailed performance evaluation on 21 T2-weighted MR images of the cervical vertebral region is given. A single model for cervical vertebrae C3-C7 is generated and evaluated. The results show that the generic model performs equally well for each of the cervical vertebrae C3-C7. The algorithm's performance is also evaluated on images containing various levels of artificial noise. The results indicate that the proposed algorithm achieves good results despite the presence of severe image noise. The proposed detection method delivers accurate locations of cervical vertebrae in MR images which can be used in diagnosis and therapy. In order to achieve absolute comparability with the results of future work, the authors are following an open data approach by making the image dataset
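
    As a rough stand-in for the descriptor idea (not the authors' VolHOG implementation), the sketch below builds a bivariate histogram of 3D gradient orientations, binning azimuth against elevation and weighting by gradient magnitude; the bin counts and the random test volume are arbitrary choices.

        import numpy as np

        def bivariate_orientation_histogram(volume, bins=(8, 8)):
            # Bivariate histogram of 3D gradient orientations (azimuth vs. elevation),
            # weighted by gradient magnitude and normalized to sum to one.
            g0, g1, g2 = np.gradient(volume.astype(float))
            mag = np.sqrt(g0 ** 2 + g1 ** 2 + g2 ** 2)
            azimuth = np.arctan2(g1, g2)                             # in [-pi, pi]
            elevation = np.arctan2(g0, np.sqrt(g1 ** 2 + g2 ** 2))   # in [-pi/2, pi/2]
            hist, _, _ = np.histogram2d(
                azimuth.ravel(), elevation.ravel(), bins=bins,
                range=[[-np.pi, np.pi], [-np.pi / 2, np.pi / 2]],
                weights=mag.ravel())
            return hist / (hist.sum() + 1e-12)

        rng = np.random.default_rng(0)
        descriptor = bivariate_orientation_histogram(rng.normal(size=(16, 16, 16)))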

  16. Show-Bix &

    DEFF Research Database (Denmark)

    2014-01-01

    The anti-reenactment 'Show-Bix &' consists of 5 dias projectors, a dial phone, quintophonic sound, and interactive elements. A responsive interface will enable the Dias projectors to show copies of original dias slides from the Show-Bix piece ”March på Stedet”, 265 images in total. The copies are...

  17. Predicting the Size of Sunspot Cycle 24 on the Basis of Single- and Bi-Variate Geomagnetic Precursor Methods

    Science.gov (United States)

    Wilson, Robert M.; Hathaway, David H.

    2009-01-01

    Examined are single- and bi-variate geomagnetic precursors for predicting the maximum amplitude (RM) of a sunspot cycle several years in advance. The best single-variate fit is one based on the average of the ap index 36 mo prior to cycle minimum occurrence (E(Rm)), having a coefficient of correlation (r) equal to 0.97 and a standard error of estimate (se) equal to 9.3. Presuming cycle 24 not to be a statistical outlier and its minimum in March 2008, the fit suggests cycle 24's RM to be about 69 +/- 20 (the 90% prediction interval). The weighted mean prediction of 11 statistically important single-variate fits is 116 +/- 34. The best bi-variate fit is one based on the maximum and minimum values of the 12-mma of the ap index; i.e., APM# and APm*, where # means the value post-E(RM) for the preceding cycle and * means the value in the vicinity of cycle minimum, having r = 0.98 and se = 8.2. It predicts cycle 24's RM to be about 92 +/- 27. The weighted mean prediction of 22 statistically important bi-variate fits is 112 +/- 32. Thus, cycle 24's RM is expected to lie somewhere within the range of about 82 to 144. Also examined are the late-cycle 23 behaviors of geomagnetic indices and solar wind velocity in comparison to the mean behaviors of cycles 20-23 and the geomagnetic indices of cycle 14 (RM = 64.2), the weakest sunspot cycle of the modern era.

  18. Simultaneous use of serum IgG and IgM for risk scoring of suspected early Lyme borreliosis: graphical and bivariate analyses

    DEFF Research Database (Denmark)

    Dessau, Ram B; Ejlertsen, Tove; Hilden, Jørgen

    2010-01-01

    The laboratory diagnosis of early disseminated Lyme borreliosis (LB) rests on IgM and IgG antibodies in serum. The purpose of this study was to refine the statistical interpretation of IgM and IgG by combining the diagnostic evidence provided by the two immunoglobulins and exploiting the whole...

  19. Diagnostic performance of des-γ-carboxy prothrombin (DCP) for hepatocellular carcinoma: a bivariate meta-analysis.

    Science.gov (United States)

    Gao, P; Li, M; Tian, Q B; Liu, Dian-Wu

    2012-01-01

    Serum markers need to be developed to specifically diagnose hepatocellular carcinoma (HCC). Des-γ-carboxy prothrombin (DCP) is a promising tool of limited expense and wide accessibility, but the reported results have been controversial. In order to review the performance of DCP for the diagnosis of HCC, a meta-analysis was performed. After a systematic review of relevant studies, the sensitivity, specificity, and positive and negative likelihood ratios (PLR and NLR, respectively) were pooled using a bivariate meta-analysis. Potential between-study heterogeneity was explored by a meta-regression model. The post-test probability and the likelihood ratio scattergram were calculated to evaluate clinical usefulness. Based on a literature review of 20 publications, the overall sensitivity, specificity, PLR and NLR of DCP for the detection of HCC were 67% (95%CI, 58%-74%), 92% (95%CI, 88%-94%), 7.9 (95%CI, 5.6-11.2) and 0.36 (95%CI, 0.29-0.46), respectively. The area under the bivariate summary receiver operating characteristic curve was 0.89 (95%CI, 0.85-0.92). Significant heterogeneity was present. In conclusion, the major role of DCP is the moderate confirmation of HCC. More prospective studies of DCP are needed in the future.

  20. Talk Show Science.

    Science.gov (United States)

    Moore, Mitzi Ruth

    1992-01-01

    Proposes having students perform skits in which they play the roles of the science concepts they are trying to understand. Provides the dialog for a skit in which hot and cold gas molecules are interviewed on a talk show to study how these properties affect wind, rain, and other weather phenomena. (MDH)

  1. Physics Reality Show

    Science.gov (United States)

    Erukhimova, Tatiana

    The attention span of K-12 students is very short; they are used to digesting information in short snippets through social media and TV. To get the students interested in physics, we created the Physics Reality Show: a series of staged short videos with duration no longer than a few minutes. Each video explains and illustrates one physics concept or law through a fast-paced sequence of physics demonstrations and experiments. The cast consists entirely of physics undergraduate students with artistic abilities and substantial experience in showing physics demonstrations at current outreach events run by the department: Physics Shows and Physics & Engineering Festival. Undergraduate students are of almost the same age as their high-school audience. They are in the best position to connect with kids and convey their fascination with physics. The PI and other faculty members who are involved in the outreach advise and coach the cast. They help students in staging the episodes and choosing the most exciting and relevant demonstrations. Supported by the APS mini-outreach Grant.

  2. Newton Leibniz integration for ket-bra operators in quantum mechanics (V)—Deriving normally ordered bivariate-normal-distribution form of density operators and developing their phase space formalism

    Science.gov (United States)

    Fan, Hong-yi

    2008-06-01

    We show that Newton-Leibniz integration over Dirac's ket-bra projection operators with continuum variables, which can be performed by the technique of integration within ordered product (IWOP) of operators [Hong-yi Fan, Hai-liang Lu, Yue Fan, Ann. Phys. 321 (2006) 480], can directly recast density operators and generalized Wigner operators into a normally ordered bivariate-normal-distribution form, which parallels the bivariate normal distribution of statistics. In this way the phase space formalism of quantum mechanics can be developed. The Husimi operator, entangled Husimi operator and entangled Wigner operator for entangled particles with different masses are naturally introduced by virtue of the IWOP technique, and their physical meanings are explained.

  3. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
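
    For the bivariate correlation case, the power calculation can be sketched with the standard Fisher z approximation; this is a generic approximation, not G*Power's exact algorithm, and the sample size and effect size in the example are arbitrary.

        import numpy as np
        from scipy.stats import norm

        def power_pearson_r(rho, n, alpha=0.05):
            # Approximate power of a two-sided test of H0: rho = 0 for a Pearson
            # correlation, via the Fisher z transform (SE of atanh(r) is 1/sqrt(n-3)).
            z_alt = np.arctanh(rho) * np.sqrt(n - 3)
            z_crit = norm.ppf(1.0 - alpha / 2.0)
            return norm.cdf(z_alt - z_crit) + norm.cdf(-z_alt - z_crit)

        # Example: power to detect rho = 0.3 with n = 84 at two-sided alpha = .05.
        print(round(power_pearson_r(0.3, 84), 3))   # ~0.80 with this approximation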

  4. Not a "reality" show.

    Science.gov (United States)

    Wrong, Terence; Baumgart, Erica

    2013-01-01

    The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show.

  5. Showing Value (Editorial)

    Directory of Open Access Journals (Sweden)

    Denise Koufogiannakis

    2009-06-01

    When Su Cleyle and I first decided to start Evidence Based Library and Information Practice, one of the things we agreed upon immediately was that the journal be open access. We knew that a major obstacle to librarians using the research literature was that they did not have access to the research literature. Although Su and I are both academic librarians who can access a wide variety of library and information literature from our institutions, we belong to a profession where not everyone has equal access to the research in our field. Without such access to our own body of literature, how can we ever hope for practitioners to use research evidence in their decision making? It would have been contradictory to the principles of evidence based library and information practice to do otherwise. One of the specific groups we thought could use such an open access venue for discovering research literature was school librarians. School librarians are often isolated and lacking access to the research literature that may help them prove to stakeholders the importance of their libraries and their role within schools. Certainly, school libraries have been in decline and the use of evidence to show value is needed. As Ken Haycock noted in his 2003 report, The Crisis in Canada’s School Libraries: The Case for Reform and Reinvestment, “Across the country, teacher-librarians are losing their jobs or being reassigned. Collections are becoming depleted owing to budget cuts. Some principals believe that in the age of the Internet and the classroom workstation, the school library is an artifact” (9). Within this context, school librarians are looking to our research literature for evidence of the impact that school library programs have on learning outcomes and student success. They are integrating that evidence into their practice, and reflecting upon what can be improved locally. They are focusing on students and showing the impact of school libraries and

  6. Attenuation of vagal modulation with aging: Univariate and bivariate analysis of HRV.

    Science.gov (United States)

    Junior, E C; Oliveira, F M

    2017-07-01

    The aging process leads to diverse changes in the human organism, including in autonomic modulation. In this study, we calculated HRV indices in the frequency domain (power spectral density, PSD) and in the time domain (the impulse response (IR) method), using data from healthy young and elderly volunteers (Fantasia database from Physionet). The results showed that aging leads to an attenuation of vagal modulation in elderly individuals compared to young volunteers.
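
    A minimal sketch of the frequency-domain step is given below: Welch PSD estimation of an evenly resampled RR-interval series and integration over the conventional LF and HF bands (HF power is the usual index of vagal modulation). The resampling rate, band edges, and segment length are common conventions assumed here, not details taken from the study.

        import numpy as np
        from scipy.signal import welch

        def lf_hf_power(rr_ms, fs=4.0):
            # LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) power of an RR-interval series
            # already resampled evenly at fs Hz.
            f, pxx = welch(rr_ms - np.mean(rr_ms), fs=fs, nperseg=min(256, len(rr_ms)))
            df = f[1] - f[0]
            lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
            hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
            return lf, hf, lf / hf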

  7. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    Directory of Open Access Journals (Sweden)

    Yanyong Guo

    2017-01-01

    The primary objective of this study is to evaluate factors affecting e-bike-involved crashes and license plate use in China. E-bike crash data were collected from a police database and completed through telephone interviews. Non-crash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike-involved crashes and e-bike license plate use, and to account for the correlation between them. Marginal effects for contributory factors were calculated to quantify their impacts on the outcomes. The results show that several contributory factors, including gender, age, education level, driver license, car in household, experience in using e-bikes, law compliance, and aggressive driving behaviors, have significant impacts on both e-bike-involved crashes and license plate use. Moreover, type of e-bike, frequency of e-bike use, impulsive behavior, degree of riding experience, and risk perception scale are found to be associated with e-bike-involved crashes. It is also found that e-bike-involved crashes and e-bike license plate use are strongly and negatively correlated. These results enhance our understanding of the factors related to e-bike-involved crashes and e-bike license plate use.
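
    The likelihood behind a bivariate probit can be sketched directly, as below; the two binary outcomes stand for crash involvement and license plate use, with shared regressors. This is a generic maximum-likelihood sketch (data arrays X, y1, y2 are assumed to exist), not the authors' estimation code or variable set.

        import numpy as np
        from scipy.stats import multivariate_normal
        from scipy.optimize import minimize

        def bivariate_probit_negloglik(params, X, y1, y2):
            # y1: crash involvement (0/1), y2: license plate use (0/1), regressors X.
            # params = [beta1 | beta2 | atanh(rho)], rho is the cross-equation correlation.
            k = X.shape[1]
            b1, b2, rho = params[:k], params[k:2 * k], np.tanh(params[-1])
            q1, q2 = 2 * y1 - 1, 2 * y2 - 1
            ll = 0.0
            for i in range(len(y1)):
                r = q1[i] * q2[i] * rho
                p = multivariate_normal.cdf([q1[i] * X[i] @ b1, q2[i] * X[i] @ b2],
                                            mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]])
                ll += np.log(max(p, 1e-12))
            return -ll

        # fit = minimize(bivariate_probit_negloglik, x0=np.zeros(2 * X.shape[1] + 1),
        #                args=(X, y1, y2))   # given observed arrays X, y1, y2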

  8. A bivariate approach to the widening of the frontal lobes in the genus Homo.

    Science.gov (United States)

    Bruner, Emiliano; Holloway, Ralph L

    2010-02-01

    Within the genus Homo, the most encephalized taxa (Neandertals and modern humans) show relatively wider frontal lobes than either Homo erectus or australopithecines. The present analysis considers whether these changes are associated with a single size-based or allometric pattern (positive allometry of the width of the anterior endocranial fossa) or with a more specific and non-allometric pattern. The relationship between hemispheric length, maximum endocranial width, and frontal width at Broca's area was investigated in extant and extinct humans. Our results do not support positive allometry for the frontal lobe's width in relation to the main endocranial diameters within modern humans (Homo sapiens). Also, the correlation between frontal width and hemispheric length is lower than the correlation between frontal width and parieto-temporal width. When compared with the australopithecines, the genus Homo could have experienced a non-allometric widening of the brain at the temporo-parietal areas, which is most evident in Neandertals. Modern humans and Neandertals also display a non-allometric widening of the anterior endocranial fossa at the Broca's cap when compared with early hominids, again more prominent in the latter group. Taking into account the contrast between the intra-specific patterns and the between-species differences, the relative widening of the anterior fossa can be interpreted as a definite evolutionary character instead of a passive consequence of brain size increase. This expansion is most likely associated with corresponding increments of the underlying neural mass, or at least with a geometrical reallocation of the frontal cortical volumes. Although different structural changes of the cranial architecture can be related to such variations, the widening of the frontal areas is nonetheless particularly interesting when some neural functions (like language or working memory, decision processing, etc.) and related fronto-parietal cortico

  9. Aesthetic Surgery Reality Television Shows: Do they Influence Public Perception of the Scope of Plastic Surgery?

    Science.gov (United States)

    Denadai, Rafael; Araujo, Karin Milleni; Samartine Junior, Hugo; Denadai, Rodrigo; Raposo-Amaral, Cassio Eduardo

    2015-12-01

    The purpose of this survey was to assess the influence of aesthetic surgery "reality television" shows viewing on the public's perception of the scope of plastic surgery practice. Perceptions of the scope of plastic surgery (33 scenarios), aesthetic surgery "reality television" viewing patterns ("high," "moderate," or "low" familiarity, similarity, confidence, and influence viewers), sociodemographic data, and previous plastic surgery interaction were collected from 2148 members of the public. Response patterns were created and bivariate and multivariate analyses were applied to assess the possible determinants of overall public choice of plastic surgeons as experts in the plastic surgery-related scenarios. Both "plastic surgeons" and "plastic surgeons alone" were the main response patterns (all p television" viewing negatively influences the public perception of the broad scope of plastic surgery. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  10. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    1997-01-01

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants in order to help ensure that the safety of the plant is adequate in all plant operational states.

  11. ABACUS: an entropy-based cumulative bivariate statistic robust to rare variants and different direction of genotype effect.

    Science.gov (United States)

    Di Camillo, Barbara; Sambo, Francesco; Toffolo, Gianna; Cobelli, Claudio

    2014-02-01

    In the past years, both sequencing and microarray have been widely used to search for relations between genetic variations and predisposition to complex pathologies such as diabetes or neurological disorders. These studies, however, have been able to explain only a small fraction of disease heritability, possibly because complex pathologies cannot be attributed to a few dysfunctional genes, but are rather heterogeneous and multicausal, as a result of a combination of rare and common variants possibly impairing multiple regulatory pathways. Rare variants, though, are difficult to detect, especially when the effects of causal variants are in different directions, i.e. with protective and detrimental effects. Here, we propose ABACUS, an Algorithm based on a BivAriate CUmulative Statistic to identify single nucleotide polymorphisms (SNPs) significantly associated with a disease within predefined sets of SNPs such as pathways or genomic regions. ABACUS is robust to the concurrent presence of SNPs with protective and detrimental effects and of common and rare variants; moreover, it is powerful even when only a few SNPs in the SNP-set are associated with the phenotype. We assessed ABACUS performance on simulated and real data and compared it with three state-of-the-art methods. When ABACUS was applied to type 1 and 2 diabetes data, besides observing a wide overlap with already known associations, we found a number of biologically sound pathways, which might shed light on the mechanisms and etiology of diabetes. ABACUS is available at http://www.dei.unipd.it/∼dicamill/pagine/Software.html.

  12. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone...

  13. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    NARCIS (Netherlands)

    M.C. Medina-Gomez (Carolina); J.P. Kemp (John); Dimou, N.L. (Niki L.); Kreiner, E. (Eskil); A. Chesi (Alessandra); B.S. Zemel (Babette S.); K. Bønnelykke (Klaus); Boer, C.G. (Cindy G.); T.S. Ahluwalia (Tarunveer Singh); H. Bisgaard; E. Evangelou (Evangelos); D.H.M. Heppe (Denise); Bonewald, L.F. (Lynda F.); Gorski, J.P. (Jeffrey P.); M. Ghanbari (Mohsen); S. Demissie (Serkalem); Duque, G. (Gustavo); M.T. Maurano (Matthew T.); D.P. Kiel (Douglas P.); Y.-H. Hsu (Yi-Hsiang); B.C.J. van der Eerden (Bram); Ackert-Bicknell, C. (Cheryl); S. Reppe (Sjur); K.M. Gautvik (Kaare); Raastad, T. (Truls); D. Karasik (David); J. van de Peppel (Jeroen); V.W.V. Jaddoe (Vincent); A.G. Uitterlinden (André); J.H. Tobias (Jon); S.F.A. Grant (Struan); Bagos, P.G. (Pantelis G.); D.M. Evans (David); F. Rivadeneira Ramirez (Fernando)

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body

  14. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells and subsequent staining with the fluorochromes 33258 Hoechst or DAPI allowed bivariate flow karyotyping of isolated

  15. Secondary analyses of data from four studies with fourth-grade children show that sex, race, amounts eaten of standardized portions, and energy content given in trades explain the positive relationship between BMI and energy intake at school-provided meals

    Science.gov (United States)

    Baxter, Suzanne Domel; Paxton-Aiken, Amy E.; Tebbs, Joshua M.; Royer, Julie A.; Guinn, Caroline H.; Finney, Christopher J.

    2012-01-01

    Results from a 2012 article showed a positive relationship between children’s body mass index (BMI) and energy intake at school-provided meals. To help explain that positive relationship, secondary analyses investigated 1) whether the relationship differed by sex and race, and 2) the relationship between BMI and six aspects of school-provided meals—amounts eaten of standardized portions, energy content given in trades, energy intake received in trades, energy intake from flavored milk, energy intake from a la carte ice cream, and breakfast type. Data were from four studies conducted one per school year (1999–2000 to 2002–2003). Fourth-grade children (n=328; 50% female; 54% Black) from 13 schools total were observed eating school-provided breakfast and lunch on one to three days per child for 1,178 total meals (50% breakfast). Children were weighed and measured. Marginal regression models were fit using BMI as the dependent variable. For Purpose One, independent variables were energy intake at school-provided meals, sex, race, age, and study; additional models included interaction terms involving energy intake and sex/race. For Purpose Two, independent variables were the six aspects of school-provided meals, sex, race, age, and study. The relationship between BMI and energy intake at school-provided meals differed by sex (p<0.0001; stronger for females) and race (p=0.0063; stronger for Black children). BMI was positively related to amounts eaten of standardized portions (p<0.0001) and negatively related to energy content given in trades (p=0.0052). Explaining the positive relationship between BMI and energy intake at school-provided meals may contribute to school-based obesity prevention efforts. PMID:23084638

  16. Diagnostic value of sTREM-1 in bronchoalveolar lavage fluid in ICU patients with bacterial lung infections: a bivariate meta-analysis.

    Science.gov (United States)

    Shi, Jia-Xin; Li, Jia-Shu; Hu, Rong; Li, Chun-Hua; Wen, Yan; Zheng, Hong; Zhang, Feng; Li, Qin

    2013-01-01

    Soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) in serum is a useful biomarker for differentiating bacterial infections from other conditions. However, the diagnostic value of sTREM-1 in bronchoalveolar lavage fluid (BALF) in lung infections has not been well established. We performed a meta-analysis to assess the accuracy of sTREM-1 in BALF for diagnosis of bacterial lung infections in intensive care unit (ICU) patients. We searched PUBMED, EMBASE and Web of Knowledge (from January 1966 to October 2012) databases for relevant studies that reported diagnostic accuracy data of BALF sTREM-1 in the diagnosis of bacterial lung infections in ICU patients. Pooled sensitivity, specificity, and positive and negative likelihood ratios were calculated by a bivariate regression analysis. Measures of accuracy and Q point value (Q*) were calculated using the summary receiver operating characteristic (SROC) curve. Potential between-study heterogeneity was explored by subgroup analysis. Nine studies were included in the present meta-analysis. Overall, the prevalence was 50.6%; the sensitivity was 0.87 (95% confidence interval (CI), 0.72-0.95); the specificity was 0.79 (95% CI, 0.56-0.92); the positive likelihood ratio (PLR) was 4.18 (95% CI, 1.78-9.86); the negative likelihood ratio (NLR) was 0.16 (95% CI, 0.07-0.36), and the diagnostic odds ratio (DOR) was 25.60 (95% CI, 7.28-89.93). The area under the SROC curve was 0.91 (95% CI, 0.88-0.93), with a Q* of 0.83. Subgroup analysis showed that the assay method and cutoff value influenced the diagnostic accuracy of sTREM-1. BALF sTREM-1 is a useful biomarker of bacterial lung infections in ICU patients. Further studies are needed to confirm the optimized cutoff value.
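
    The headline accuracy numbers translate into bedside quantities by simple arithmetic; the sketch below back-calculates likelihood ratios, the diagnostic odds ratio, and a post-test probability from the pooled sensitivity and specificity quoted above. Because the bivariate model pools PLR and NLR jointly rather than by back-calculation, the values differ slightly from the abstract's directly pooled estimates.

        sens, spec = 0.87, 0.79          # pooled values reported in the abstract
        plr = sens / (1.0 - spec)        # positive likelihood ratio  -> ~4.1
        nlr = (1.0 - sens) / spec        # negative likelihood ratio  -> ~0.16
        dor = plr / nlr                  # diagnostic odds ratio      -> ~25
        # Post-test probability after a positive BALF sTREM-1 result, at the reported
        # 50.6% prevalence, via Bayes' rule on the odds scale:
        pre_odds = 0.506 / (1.0 - 0.506)
        post_prob_positive = (pre_odds * plr) / (1.0 + pre_odds * plr)   # ~0.81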

  17. Flash flood susceptibility analysis and its mapping using different bivariate models in Iran: a comparison between Shannon's entropy, statistical index, and weighting factor models.

    Science.gov (United States)

    Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh

    2016-12-01

    Flooding is a very common natural hazard worldwide, causing large-scale casualties every year, and Iran is not immune to this threat. Comprehensive flood susceptibility mapping is very important for reducing losses of life and property. Thus, the aim of this study is to map susceptibility to flooding with different bivariate statistical methods, namely Shannon's entropy (SE), statistical index (SI), and weighting factor (Wf). Model performance evaluation is also carried out in the Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified from documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the models. In the second step, ten factors influencing flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared with these methods in ArcGIS. As the last step, the receiver operating characteristic (ROC) curve was drawn and the area under the curve (AUC) was calculated for quantitative assessment of each model. The results showed that the best model for estimating susceptibility to flooding in the Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were distance from river and geology, respectively. Flood susceptibility maps are informative for managers and decision makers in the Haraz Watershed when contemplating measures to reduce human and financial losses.
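
    The statistical index (SI) weights at the heart of such bivariate methods reduce to a log density ratio per factor class; the sketch below computes them for hypothetical classes of one conditioning factor (the cell counts are made up and classes with zero flood cells would need separate handling).

        import numpy as np

        def statistical_index_weights(flood_cells, total_cells):
            # SI weight per class of one conditioning factor: the log of the flood
            # density in the class relative to the overall flood density of the area.
            flood_cells = np.asarray(flood_cells, dtype=float)
            total_cells = np.asarray(total_cells, dtype=float)
            class_density = flood_cells / total_cells
            overall_density = flood_cells.sum() / total_cells.sum()
            return np.log(class_density / overall_density)

        # Hypothetical classes of "distance from river" (nearest to farthest):
        # counts of flooded cells and of all cells per class.
        print(statistical_index_weights([90, 40, 15, 6], [12000, 18000, 25000, 30000]))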

  18. Landslide susceptibility analysis in central Vietnam based on an incomplete landslide inventory: Comparison of a new method to calculate weighting factors by means of bivariate statistics

    Science.gov (United States)

    Meinhardt, Markus; Fink, Manfred; Tünschel, Hannes

    2015-04-01

    Vietnam is regarded as a country strongly impacted by climate change. Population and economic growth result in additional pressures on the ecosystems in the region. In particular, changes in land use and precipitation extremes lead to a higher landslide susceptibility in the study area (approx. 12,400 km2), located in central Vietnam and impacted by a tropical monsoon climate. Hence, this natural hazard is a serious problem in the study area. A probability assessment of landslides is therefore undertaken through the use of bivariate statistics. However, the landslide inventory based only on field campaigns does not cover the whole area. To avoid a systematic bias due to the limited mapping area, the investigated regions are represented as the viewshed in the calculations. On this basis, the distribution of the landslides is evaluated in relation to the maps of 13 parameters, showing the strongest correlation to distance to roads and precipitation increase. An additional weighting of the input parameters leads to better results, since some parameters contribute more to landslides than others. The method developed in this work is based on the validation of different parameter sets used within the statistical index method. It is called "omit error" because each parameter is omitted in turn, and the resulting change in the objective function yields a weighting that describes how strongly that parameter improves or degrades the model. Furthermore, this approach is used to find a better input parameter set by excluding some parameters. After this optimization, nine input parameters are left, and they are weighted by the omit error method, providing the best susceptibility map with a success rate of 92.9% and a prediction rate of 92.3%. This is an improvement of 4.4% and 4.2%, respectively, compared to the basic statistical index method with the 13 input parameters.

  19. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...... and finally data analysis based on the ISO approach. The device was calibrated and tested on commercially available laser systems. It showed good reproducibility. It was the target to be able to measure CW lasers with a power up to 200 W, focused down to spot diameters in the range of 10µm. In order...

  20. HPLC-MS/MS analyses show that the near-starchless aps1 and pgm leaves accumulate wild type levels of ADPglucose: further evidence for the occurrence of important ADPglucose biosynthetic pathway(s) alternative to the pPGI-pPGM-AGP pathway.

    Directory of Open Access Journals (Sweden)

    Abdellatif Bahaji

    Full Text Available In leaves, it is widely assumed that starch is the end-product of a metabolic pathway exclusively taking place in the chloroplast that (a) involves plastidic phosphoglucomutase (pPGM), ADPglucose (ADPG) pyrophosphorylase (AGP) and starch synthase (SS), and (b) is linked to the Calvin-Benson cycle by means of the plastidic phosphoglucose isomerase (pPGI). This view also implies that AGP is the sole enzyme producing the starch precursor molecule, ADPG. However, mounting evidence has been compiled pointing to the occurrence of important sources, other than the pPGI-pPGM-AGP pathway, of ADPG. To further explore this possibility, in this work two independent laboratories have carried out HPLC-MS/MS analyses of ADPG content in leaves of the near-starchless pgm and aps1 mutants impaired in pPGM and AGP, respectively, and in leaves of double aps1/pgm mutants grown under two different culture conditions. We also measured the ADPG content in wild type (WT) and aps1 leaves expressing in the plastid two different ADPG cleaving enzymes, and in aps1 leaves expressing in the plastid GlgC, a bacterial AGP. Furthermore, we measured the ADPG content in ss3/ss4/aps1 mutants impaired in starch granule initiation and chloroplastic ADPG synthesis. We found that, irrespective of their starch contents, pgm and aps1 leaves, WT and aps1 leaves expressing in the plastid ADPG cleaving enzymes, and aps1 leaves expressing in the plastid GlgC accumulate WT ADPG content. In clear contrast, ss3/ss4/aps1 leaves accumulated ca. 300-fold more ADPG than WT leaves. The overall data showed that, in Arabidopsis leaves, (a) there are important ADPG biosynthetic pathways, other than the pPGI-pPGM-AGP pathway, (b) pPGM and AGP are not major determinants of intracellular ADPG content, and (c) the contribution of the chloroplastic ADPG pool to the total ADPG pool is low.

  1. SU-C-BRC-04: Efficient Dose Calculation Algorithm for FFF IMRT with a Simplified Bivariate Gaussian Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Li, F; Park, J; Barraclough, B; Lu, B; Li, J; Liu, C; Yan, G [University Florida, Gainesville, FL (United States)

    2016-06-15

    Purpose: To develop an efficient and accurate independent dose calculation algorithm with a simplified analytical source model for the quality assurance and safe delivery of Flattening Filter Free (FFF)-IMRT on an Elekta Versa HD. Methods: The source model consisted of a point source and a 2D bivariate Gaussian source, respectively modeling the primary photons and the combined effect of head scatter, monitor chamber backscatter and collimator exchange effect. The in-air fluence was firstly calculated by back-projecting the edges of beam defining devices onto the source plane and integrating the visible source distribution. The effect of the rounded MLC leaf end, tongue-and-groove and interleaf transmission was taken into account in the back-projection. The in-air fluence was then modified with a fourth degree polynomial modeling the cone-shaped dose distribution of FFF beams. Planar dose distribution was obtained by convolving the in-air fluence with a dose deposition kernel (DDK) consisting of the sum of three 2D Gaussian functions. The parameters of the source model and the DDK were commissioned using measured in-air output factors (Sc) and cross beam profiles, respectively. A novel method was used to eliminate the volume averaging effect of ion chambers in determining the DDK. Planar dose distributions of five head-and-neck FFF-IMRT plans were calculated and compared against measurements performed with a 2D diode array (MapCHECK™) to validate the accuracy of the algorithm. Results: The proposed source model predicted Sc for both 6MV and 10MV with an accuracy better than 0.1%. With a stringent gamma criterion (2%/2mm/local difference), the passing rate of the FFF-IMRT dose calculation was 97.2±2.6%. Conclusion: The removal of the flattening filter represents a simplification of the head structure which allows the use of a simpler source model for very accurate dose calculation. The proposed algorithm offers an effective way to ensure the safe delivery of FFF-IMRT.
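    The final convolution step described above can be illustrated with a small numerical sketch. Everything below is a toy example under stated assumptions: the Gaussian amplitudes and widths, the field size, the pixel spacing and the fourth-degree modulation coefficient are placeholders, not the commissioned Elekta Versa HD parameters.

        import numpy as np
        from scipy.signal import fftconvolve

        def gaussian2d(x, y, sigma):
            return np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)

        grid = np.arange(-50, 50.5, 0.5)          # mm, 0.5 mm pixels (assumed)
        X, Y = np.meshgrid(grid, grid)

        # Dose deposition kernel (DDK) as the sum of three 2D Gaussians
        # with placeholder weights and widths.
        ddk = (0.7 * gaussian2d(X, Y, 1.0) +
               0.2 * gaussian2d(X, Y, 4.0) +
               0.1 * gaussian2d(X, Y, 15.0))

        # Toy in-air fluence: open 40 mm x 40 mm field modulated by a
        # fourth-degree radial term mimicking the cone-shaped FFF profile.
        r2 = X**2 + Y**2
        fluence = ((np.abs(X) <= 20) & (np.abs(Y) <= 20)) * (1.0 - 1e-7 * r2**2)

        dose = fftconvolve(fluence, ddk, mode="same") * 0.5**2   # scale by pixel area
        print(dose.shape, float(dose.max()))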

  2. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    OpenAIRE

    Guo, Yanyong; Zhou, Jibiao; Wu, Yao; Chen, Jingxu

    2017-01-01

    The primary objective of this study is to evaluate factors affecting e-bike involved crashes and license plate use in China. E-bike crash data were collected from a police database and supplemented through telephone interviews. Noncrash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike involved crashes and e-bike license plate use, and to account for the correlations between them. Margina...

  3. A Hybrid ANN-GA Model to Prediction of Bivariate Binary Responses: Application to Joint Prediction of Occurrence of Heart Block and Death in Patients with Myocardial Infarction.

    Science.gov (United States)

    Mirian, Negin-Sadat; Sedehi, Morteza; Kheiri, Soleiman; Ahmadi, Ali

    2016-01-01

    In medical studies, when a joint prediction about the occurrence of two events is required, a statistical bivariate model is used. Due to the limitations of the usual statistical models, other methods such as Artificial Neural Networks (ANN) and hybrid models can be used. In this paper, we propose a hybrid Artificial Neural Network-Genetic Algorithm (ANN-GA) model to predict the occurrence of heart block and death in myocardial infarction (MI) patients simultaneously. For fitting and comparing the models, 263 new patients with a definite diagnosis of MI hospitalized in the Cardiology Ward of Hajar Hospital, Shahrekord, Iran, from March, 2014 to March, 2016 were enrolled. Occurrence of heart block and death were employed as bivariate binary outcomes. Bivariate Logistic Regression (BLR), ANN and hybrid ANN-GA models were fitted to the data. Prediction accuracy was used to compare the models. The codes were written in Matlab 2013a and the Zelig package in R 3.2.2. The prediction accuracy of the BLR, ANN and hybrid ANN-GA models was 77.7%, 83.69% and 93.85% for the training data and 78.48%, 84.81% and 96.2% for the test data, respectively. In both the training and test data sets, the hybrid ANN-GA model had better accuracy. An ANN model can be a suitable alternative for modeling and predicting bivariate binary responses when the presuppositions of statistical models are not met in actual data. In addition, using optimization methods, such as the hybrid ANN-GA model, can improve the precision of the ANN model.
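    As a rough illustration of the hybrid idea, the sketch below evolves the weights of a tiny two-output network with a genetic algorithm on synthetic data. It is an assumption-laden stand-in, not the paper's method: the paper uses the GA to tune the ANN before training, whereas here, for brevity, the GA searches the weights directly, and the data, network size and GA settings are all made up.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in for the MI data: 3 predictors, two binary outcomes.
        n, p = 300, 3
        X = rng.normal(size=(n, p))
        true_w = rng.normal(size=(p, 2))
        Y = (X @ true_w + rng.normal(scale=0.5, size=(n, 2)) > 0).astype(float)

        H = 5                                      # hidden units (assumed)
        n_weights = p * H + H * 2                  # biases omitted for brevity

        def forward(w, X):
            W1 = w[:p * H].reshape(p, H)
            W2 = w[p * H:].reshape(H, 2)
            return 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ W2)))

        def fitness(w):
            return ((forward(w, X) > 0.5) == Y).mean()   # joint prediction accuracy

        # Very small genetic algorithm: selection + uniform crossover + mutation.
        pop = rng.normal(size=(40, n_weights))
        for _ in range(60):
            scores = np.array([fitness(w) for w in pop])
            parents = pop[np.argsort(scores)[-10:]]      # keep the 10 fittest
            children = []
            for _ in range(30):
                a, b = parents[rng.integers(10, size=2)]
                mask = rng.random(n_weights) < 0.5
                children.append(np.where(mask, a, b) + rng.normal(scale=0.1, size=n_weights))
            pop = np.vstack([parents, children])

        print("best joint accuracy:", max(fitness(w) for w in pop))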

  4. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis

    OpenAIRE

    Shields, Katherine F.; Bain, Robert E.S.; Cronk, Ryan; Wright, Jim A.; Bartram, Jamie

    2015-01-01

    Background Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. Objectives We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. Methods We performed a bivari...

  5. Rotational distortion in conventional allometric analyses.

    Science.gov (United States)

    Packard, Gary C

    2011-08-01

    Three data sets from the recent literature were submitted to new analyses to illustrate the rotational distortion that commonly accompanies traditional allometric analyses and that often causes allometric equations to be inaccurate and misleading. The first investigation focused on the scaling of evaporative water loss to body mass in passerine birds; the second was concerned with the influence of body size on field metabolic rates of rodents; and the third addressed interspecific variation in kidney mass among primates. Straight lines were fitted to logarithmic transformations by Ordinary Least Squares and Generalized Linear Models, and the resulting equations then were re-expressed as two-parameter power functions in the original arithmetic scales. The re-expressed models were displayed on bivariate graphs together with tracings for equations fitted directly to untransformed data by nonlinear regression. In all instances, models estimated by back-transformation failed to describe major features of the arithmetic distribution whereas equations fitted by nonlinear regression performed quite well. The poor performance of equations based on models fitted to logarithms can be traced to the increased weight and leverage exerted in those analyses by observations for small species and to the decreased weight and leverage exerted by large ones. The problem of rotational distortion can be avoided by performing exploratory analysis on untransformed values and by validating fitted models in the scale of measurement. Copyright © 2011 Elsevier Inc. All rights reserved.
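    The core contrast described here (back-transforming a straight line fitted to logarithms versus fitting the power function directly in arithmetic scale) can be reproduced in a few lines. The data below are synthetic with assumed parameter values; only the comparison of the two fitting strategies is the point.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(1)

        # Synthetic allometric data y = a * x^b with additive, constant-variance
        # error in the arithmetic scale (assumed values, not the paper's data).
        a_true, b_true = 2.0, 0.7
        x = np.geomspace(5, 5000, 60)
        y = a_true * x**b_true + rng.normal(scale=1.0, size=x.size)

        # 1) Traditional allometric approach: OLS on log-log values, back-transformed.
        slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
        a_log, b_log = np.exp(intercept), slope

        # 2) Direct nonlinear regression on the untransformed values.
        (a_nl, b_nl), _ = curve_fit(lambda x, a, b: a * x**b, x, y, p0=(1.0, 1.0))

        print(f"log-OLS  : a = {a_log:.3f}, b = {b_log:.3f}")
        print(f"nonlinear: a = {a_nl:.3f}, b = {b_nl:.3f}")
        # The log-scale fit gives the small-x observations disproportionate weight
        # and leverage - the rotational distortion described in the abstract - while
        # the nonlinear fit describes the arithmetic-scale distribution directly.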

  6. Comparison of connectivity analyses for resting state EEG data

    Science.gov (United States)

    Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo

    2017-06-01

    Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. the directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by the phase locking value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections than the bivariate estimates. The multivariate TE differentiated better between eyes closed and eyes open conditions compared to DTF. Moreover, the multivariate TE evidenced non-linear phenomena in information transfer, which are not evidenced by the use of DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods, and indicated a relationship existing between the flow of information and the level of synchronization of the brain.
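    Of the measures compared above, the synchronization measure is the simplest to sketch. The snippet below computes a phase locking value between two toy channels via the Hilbert transform; it is an illustrative, assumption-based example (made-up sampling rate and signals), not the paper's TE or DTF estimators.

        import numpy as np
        from scipy.signal import hilbert

        def plv(x, y):
            """Phase locking value between two equally sampled signals."""
            phase_x = np.angle(hilbert(x))
            phase_y = np.angle(hilbert(y))
            return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

        # Two toy "channels": a shared 10 Hz rhythm plus independent noise.
        fs = 250
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(2)
        shared = np.sin(2 * np.pi * 10 * t)
        ch1 = shared + 0.5 * rng.normal(size=t.size)
        ch2 = np.roll(shared, 5) + 0.5 * rng.normal(size=t.size)   # phase-lagged copy

        print(f"PLV = {plv(ch1, ch2):.2f}")   # close to 1 for strongly locked channels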

  7. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.

  8. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping in the targeting of schistosomiasis control. However, not all schools return questionnaires and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modeled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to nearest river and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10% prevalence) schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but is unable to distinguish between moderate and high prevalence schools.

  9. Risk Aversion in Game Shows

    DEFF Research Database (Denmark)

    Andersen, Steffen; Harrison, Glenn W.; Lau, Morten I.

    2008-01-01

    We review the use of behavior from television game shows to infer risk attitudes. These shows provide evidence when contestants are making decisions over very large stakes, and in a replicated, structured way. Inferences are generally confounded by the subjective assessment of skill in some games...

  10. Measuring performance at trade shows

    DEFF Research Database (Denmark)

    Hansen, Kåre

    2004-01-01

    Trade shows are an increasingly important marketing activity to many companies, but current measures of trade show performance do not adequately capture dimensions important to exhibitors. Based on the marketing literature's outcome and behavior-based control system taxonomy, a model is built...... that captures an outcome-based sales dimension and four behavior-based dimensions (i.e. information-gathering, relationship building, image building, and motivation activities). A 16-item instrument is developed for assessing exhibitors' perceptions of their trade show performance. The paper presents evidence...... of the scale's reliability, factor structure, and validity on the basis of analyzing data from independent samples of exhibitors at the international trade shows SIAL (Paris) and ANUGA (Cologne); and it concludes with a discussion of potential managerial applications and implications for future research. New...

  11. A Hybrid Forecasting Model Based on Bivariate Division and a Backpropagation Artificial Neural Network Optimized by Chaos Particle Swarm Optimization for Day-Ahead Electricity Price

    Directory of Open Access Journals (Sweden)

    Zhilong Wang

    2014-01-01

    Full Text Available In the electricity market, the electricity price plays an indispensable role. Nevertheless, accurate price forecasting, a vital factor affecting both government regulatory agencies and public power companies, remains a huge challenge and a critical problem. Determining how to address the accurate forecasting problem becomes an even more significant task in an era in which electricity is increasingly important. Based on the chaos particle swarm optimization (CPSO), the backpropagation artificial neural network (BPANN), and the idea of bivariate division, this paper proposes a bivariate division BPANN (BD-BPANN) method and the CPSO-BD-BPANN method for forecasting electricity price. The former method creatively transforms the electricity demand and price into a new variable, named DV, which is calculated using the division principle, and forecasts the day-ahead electricity price by multiplying the forecasted values of the DVs and the forecasted values of the demand. Next, to improve the accuracy of BD-BPANN, chaos particle swarm optimization and BD-BPANN are synthesized to form a novel model, CPSO-BD-BPANN. In this study, CPSO is utilized to optimize the initial parameters of BD-BPANN to make its output more stable than that of the original model. Finally, two forecasting strategies are proposed regarding different situations.
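    The bivariate-division idea itself is simple enough to sketch numerically. The following toy example uses synthetic hourly series and naive placeholder forecasters, all assumptions, and none of the BPANN or CPSO machinery; it only shows the transform-forecast-multiply pattern the abstract describes.

        import numpy as np

        rng = np.random.default_rng(3)
        hours = 24 * 30
        demand = 800 + 200 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 20, hours)
        price = 0.05 * demand + rng.normal(0, 2, hours)

        dv = price / demand                      # the bivariate-division variable

        def naive_day_ahead(series):
            """Placeholder forecaster: repeat the last 24 hourly values."""
            return series[-24:]

        # Day-ahead price = forecasted DV * forecasted demand.
        price_forecast = naive_day_ahead(dv) * naive_day_ahead(demand)
        print(np.round(price_forecast[:5], 2))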

  12. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    Full Text Available Contemporary traffic safety research offers little information on quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. The potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern which needs investigation. This study therefore focused on investigating the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through the development of a bivariate ordered probit model using data extracted from WHO's global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with the statistically significant correlation between response outcomes, validate the statistical supremacy of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of enforcement levels of speed limit and drink driving laws, respectively. Also, the model encapsulates the effect of several other agency related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on intermediate categories of response outcomes. The results of this study are expected to provide necessary insights to enforcement programs. Also, marginal effects of explanatory variables may provide useful directions for formulating effective policy countermeasures for overcoming drivers' speeding and drink driving behavior.
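    For readers unfamiliar with the model class, the sketch below simulates the data-generating process a bivariate ordered probit assumes: two latent propensities driven by a shared covariate plus correlated normal errors, each cut into ordered categories. The covariate, coefficients, correlation and thresholds are invented for illustration; the joint maximum-likelihood estimation used in the paper is not shown.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 5_000
        x = rng.normal(size=n)                       # one explanatory factor (assumed)

        rho = 0.6                                    # error correlation between outcomes
        errors = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

        latent_speed = 0.8 * x + errors[:, 0]        # speed-limit enforcement propensity
        latent_drink = 0.5 * x + errors[:, 1]        # drink-driving enforcement propensity

        cuts = [-0.5, 0.7]                           # thresholds -> low / medium / high
        speed_level = np.digitize(latent_speed, cuts)
        drink_level = np.digitize(latent_drink, cuts)

        # Induced association between the two ordered outcomes:
        print(np.corrcoef(speed_level, drink_level)[0, 1])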

  13. Tokyo Motor Show 2003; Tokyo Motor Show 2003

    Energy Technology Data Exchange (ETDEWEB)

    Joly, E.

    2004-01-01

    The following text presents the different technologies exhibited at the 37th Tokyo Motor Show. The report highlights the main development trends in the Japanese automobile industry. Hybrid electric vehicles and those equipped with fuel cells were given prominence by the Japanese manufacturers, which devote considerable budgets to research on less-polluting vehicles. Although the exhibited models differ from one manufacturer to another, they all use a hybrid system: fuel cell/battery. The manufacturers also emphasized intelligent navigation and safety systems, as well as design and comfort. (O.M.)

  14. Create a Polarized Light Show.

    Science.gov (United States)

    Conrad, William H.

    1992-01-01

    Presents a lesson that introduces students to polarized light using a problem-solving approach. After illustrating the concept using a slinky and poster board with a vertical slot, students solve the problem of creating a polarized light show using Polya's problem-solving methods. (MDH)

  15. Producing Talent and Variety Shows.

    Science.gov (United States)

    Szabo, Chuck

    1995-01-01

    Identifies key aspects of producing talent shows and outlines helpful hints for avoiding pitfalls and ensuring a smooth production. Presents suggestions concerning publicity, scheduling, and support personnel. Describes types of acts along with special needs and problems specific to each act. Includes a list of resources. (MJP)

  16. Complexity analyses show two distinct types of nonlinear dynamics in short heart period variability recordings

    Science.gov (United States)

    Porta, Alberto; Bari, Vlasta; Marchi, Andrea; De Maria, Beatrice; Cysarz, Dirk; Van Leeuwen, Peter; Takahashi, Anielle C. M.; Catai, Aparecida M.; Gnecchi-Ruscone, Tomaso

    2015-01-01

    Two diverse complexity metrics quantifying time irreversibility and local prediction, in connection with a surrogate data approach, were utilized to detect nonlinear dynamics in short heart period (HP) variability series recorded in fetuses, as a function of the gestational period, and in healthy humans, as a function of the magnitude of the orthostatic challenge. The metrics indicated the presence of two distinct types of nonlinear HP dynamics characterized by diverse ranges of time scales. These findings stress the need to render more specific the analysis of nonlinear components of HP dynamics by accounting for different temporal scales. PMID:25806002
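    One of the complexity metrics mentioned, time irreversibility, can be illustrated with a simple index. The sketch below computes Porta's index (the percentage of negative successive differences among non-zero differences) on two synthetic series; the series are assumptions, and the surrogate-data testing used in the study is only noted in a comment.

        import numpy as np

        def porta_index(hp):
            """Percentage of negative successive differences (about 50% if reversible)."""
            diffs = np.diff(hp)
            nonzero = diffs[diffs != 0]
            return 100.0 * np.mean(nonzero < 0)

        rng = np.random.default_rng(5)
        reversible = rng.normal(1000, 30, size=300)                  # ms, white noise
        asymmetric = 1000 + np.cumsum(rng.exponential(5, 300) - 5)   # skewed increments

        print(f"white noise : {porta_index(reversible):.1f}%")
        print(f"asymmetric  : {porta_index(asymmetric):.1f}%")
        # In practice the index is compared against surrogate series that destroy
        # temporal asymmetry while preserving the distribution and spectrum.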

  17. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependencies and non-normality in residual terms. The models are estimated using 2013 Virginia police-reported two-vehicle head-on collision data, where exactly one driver is at-fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking, the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects the injury severity of the at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on the at-fault injury outcome. Contrarily, and importantly, the effect of at-fault vehicle speed on the injury severity of the not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on the injury outcome of the not-at-fault driver. Compared to traditional ordered probability

  18. "Medicine show." Alice in Doctorland.

    Science.gov (United States)

    1987-01-01

    This is an excerpt from the script of a 1939 play provided to the Institute of Social Medicine and Community Health by the Library of Congress Federal Theater Project Collection at George Mason University Library, Fairfax, Virginia, pages 2-1-8 thru 2-1-14. The Federal Theatre Project (FTP) was part of the New Deal program for the arts 1935-1939. Funded by the Works Progress Administration (WPA), its goal was to employ theater professionals from the relief rolls. A number of FTP plays deal with aspects of medicine and public health. Pageants, puppet shows and documentary plays celebrated progress in medical science while examining social controversies in medical services and the public health movement. "Medicine Show" sharply contrasts technological wonders with social backwardness. The play was rehearsed by the FTP but never opened because funding ended. A revised version ran on Broadway in 1940. The preceding comments are adapted from an excellent, well-illustrated review of five of these plays by Barbara Melosh: "The New Deal's Federal Theatre Project," Medical Heritage, Vol. 2, No. 1 (Jan/Feb 1986), pp. 36-47.

  19. Periodic safety analyses

    International Nuclear Information System (INIS)

    Gouffon, A.; Zermizoglou, R.

    1990-12-01

    The IAEA Safety Guide 50-SG-S8 devoted to 'Safety Aspects of Foundations of Nuclear Power Plants' indicates that the operator of an NPP should establish a program of inspection of safe operation during construction, start-up and the service life of the plant, to obtain the data needed for estimating the lifetime of structures and components. At the same time the program should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection program. The periodic safety report is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to: start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for PWR-900 and PWR-1300 units from 1986 to 1989

  20. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach....... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  1. Point processes statistics of stable isotopes: analysing water uptake patterns in a mixed stand of Aleppo pine and Holm oak

    Directory of Open Access Journals (Sweden)

    Carles Comas

    2015-04-01

    Full Text Available Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H) as hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area = 888 m2). We tested the hypothesis that both species uptake water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency, depending on whether neighbouring trees belong to one species or the other. Material and Methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a mark point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰) were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰), pointing to a greater contribution of deeper soil layers for water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.
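    A rough flavour of the bivariate (cross-type) pair correlation analysis can be given in a few lines. The sketch below histograms oak-pine distances on synthetic coordinates and normalises by the expectation under independence; it is an assumption-heavy toy (random coordinates, no edge correction, arbitrary distance bins), not the estimator used in the study.

        import numpy as np
        from scipy.spatial.distance import cdist

        rng = np.random.default_rng(6)
        area = 888.0                                   # plot area (m2), as in the study
        side = np.sqrt(area)
        oaks = rng.uniform(0, side, size=(33, 2))      # synthetic oak coordinates
        pines = rng.uniform(0, side, size=(78, 2))     # synthetic pine coordinates

        r_edges = np.linspace(0.5, 10, 20)             # distance bins (m), assumed
        counts, _ = np.histogram(cdist(oaks, pines).ravel(), bins=r_edges)

        ring_areas = np.pi * (r_edges[1:]**2 - r_edges[:-1]**2)
        lambda_pine = len(pines) / area
        expected = len(oaks) * lambda_pine * ring_areas  # expectation under independence
        g12 = counts / expected                          # ~1 when the types are independent

        print(np.round(g12, 2))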

  2. Testing for Bivariate Spherical Symmetry

    NARCIS (Netherlands)

    Einmahl, J.H.J.; Gantner, M.

    2010-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distribution-free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the

  3. Testing for bivariate spherical symmetry

    OpenAIRE

    Einmahl, J.H.J.; Gantner, M.

    2012-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distribution-free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the asymptotic ones, are presented. In a simulation study, the good performance of the test is demonstrated. Furthermore, a real data example is presented.

  4. BIMOND3, Monotone Bivariate Interpolation

    International Nuclear Information System (INIS)

    Fritsch, F.N.; Carlson, R.E.

    2001-01-01

    1 - Description of program or function: BIMOND is a FORTRAN-77 subroutine for piecewise bi-cubic interpolation to data on a rectangular mesh, which preserves the monotonicity of the data. A driver program, BIMOND1, is provided which reads data, computes the interpolating surface parameters, and evaluates the function on a mesh suitable for plotting. 2 - Method of solution: Monotone piecewise bi-cubic Hermite interpolation is used. 3 - Restrictions on the complexity of the problem: The current version of the program can treat data which are monotone in only one of the independent variables, but cannot handle piecewise monotone data
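    BIMOND itself is a Fortran routine and is not reproduced here. As a loosely related one-dimensional illustration of the same idea, Hermite interpolation constrained to preserve the monotonicity of the data, the sketch below uses SciPy's PCHIP interpolator on a small monotone data set. This is an analogy under stated assumptions, not the BIMOND algorithm.

        import numpy as np
        from scipy.interpolate import PchipInterpolator

        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        y = np.array([0.0, 0.1, 0.2, 3.0, 3.1])   # monotone data with a sharp step

        pchip = PchipInterpolator(x, y)            # monotone piecewise-cubic Hermite
        xx = np.linspace(0, 4, 9)
        print(np.round(pchip(xx), 3))              # interpolant stays monotone in x
        # An unconstrained cubic spline fitted to the same data would overshoot
        # around the step; the monotone Hermite construction avoids that.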

  5. Statistical Modeling of Bivariate Data.

    Science.gov (United States)

    1982-08-01

    to one. Following Crain (1974), one may consider order-m approximators log f_m(x) = Σ_{k=-m}^{m} θ_k φ_k(x) − c(θ), a ≤ x ≤ b, (4.4.5) and attempt to find...literature. Consider the approximate model log f_n(x) = Σ_{k=-M_n}^{M_n} θ_k φ_k(x) + σ G(x), a ≤ x ≤ b, (4.4.8) where G(x) is a Gaussian process and n is a

  6. Testing for bivariate spherical symmetry

    NARCIS (Netherlands)

    Einmahl, J.H.J.; Gantner, M.

    2012-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distribution free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the asymptotic

  7. Dialogisk kommunikationsteoretisk analyse

    DEFF Research Database (Denmark)

    Phillips, Louise Jane

    2018-01-01

    analytical method that has been developed within dialogic communication research - The Integrated Framework for Analysing Dialogic Knowledge Production and Communication (IFADIA). The IFADIA method builds on a combination of Bakhtin's dialogue theory and Foucault's theory of power/knowledge and discourse. The method is intended...

  8. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, Maria A.; Luyten, Johannes W.; Scheerens, Jaap; Sleegers, P.J.C.; Scheerens, J

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  9. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    . Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  10. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
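    As a concrete counterpart to the description above, the sketch below implements a bare-bones continuous wavelet transform with a Morlet wavelet and applies it to a chirp, a standard example of a nonstationary signal. The wavelet parameters, scales and signal are illustrative assumptions, not taken from the article.

        import numpy as np

        def morlet_cwt(signal, scales, dt, w0=6.0):
            """Continuous wavelet transform via convolution with scaled Morlet wavelets."""
            out = np.empty((len(scales), len(signal)), dtype=complex)
            t = (np.arange(len(signal)) - len(signal) // 2) * dt
            for i, s in enumerate(scales):
                wavelet = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s)
                out[i] = np.convolve(signal, np.conj(wavelet)[::-1], mode="same")
            return out

        dt = 1 / 200.0
        t = np.arange(0, 5, dt)
        chirp = np.sin(2 * np.pi * (2 + 3 * t) * t)   # frequency rises over time

        scales = np.geomspace(0.02, 0.5, 40)
        power = np.abs(morlet_cwt(chirp, scales, dt)) ** 2
        print(power.shape)   # (scales, time); the power ridge tracks the rising frequency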

  11. Filmstil - teori og analyse

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    Film style decisively shapes our experience of a film. Yet film style - the way the moving images organise the narrative - receives rather less attention than the film's plot when we talk about film. Filmstil - teori og analyse is a richly exemplified presentation, critique and further development of...

  12. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a

  13. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis.

    Science.gov (United States)

    Shields, Katherine F; Bain, Robert E S; Cronk, Ryan; Wright, Jim A; Bartram, Jamie

    2015-12-01

    Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source. Source water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water.

  14. Modelling the vicious circle between obesity and physical activity in children and adolescents using a bivariate probit model with endogenous regressors.

    Science.gov (United States)

    Yeh, C-Y; Chen, L-J; Ku, P-W; Chen, C-M

    2015-01-01

    The increasing prevalence of obesity in children and adolescents has become one of the most important public health issues around the world. Lack of physical activity is a risk factor for obesity, while being obese could reduce the likelihood of participating in physical activity. Failing to account for the endogeneity between obesity and physical activity would result in biased estimation. This study investigates the relationship between overweight and physical activity by taking endogeneity into consideration. It develops an endogenous bivariate probit model estimated by the maximum likelihood method. The data included 4008 boys and 4197 girls in the 5th-9th grades in Taiwan in 2007-2008. The relationship between overweight and physical activity is significantly negative in the endogenous model, but insignificant in the comparative exogenous model. This endogenous relationship presents a vicious circle in which lower levels of physical activity lead to overweight, while those who are already overweight engage in less physical activity. The results not only reveal the importance of endogenous treatment, but also demonstrate the robust negative relationship between these two factors. An emphasis should be put on overweight and obese children and adolescents in order to break the vicious circle. Promotion of physical activity by appropriate counselling programmes and peer support could be effective in reducing the prevalence of obesity in children and adolescents.
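    To make the endogeneity argument concrete, the sketch below simulates overweight and physical activity with correlated unobservables and shows that a naive single-equation probit misstates the assumed structural effect. All numbers (sample size, coefficients, error correlation) are invented, and the paper's endogenous bivariate probit maximum-likelihood estimator is not implemented here.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 20_000
        z = rng.normal(size=n)                              # exogenous driver of activity
        u = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)  # correlated errors

        active = (0.8 * z + u[:, 0] > 0).astype(float)      # physical activity
        overweight = (-0.6 * active + u[:, 1] > 0).astype(float)  # true effect = -0.6

        naive = sm.Probit(overweight, sm.add_constant(active)).fit(disp=0)
        print(naive.params)
        # The activity coefficient is pulled away from the assumed -0.6 because
        # activity is correlated with the error term - the bias an endogenous
        # bivariate probit is designed to remove.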

  15. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings or the proof of their absence are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exists a bunch of different criteria and different variants of criteria in order to reason in different settings. This leads to incomparable results. Moreover it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit to this setting. We show how to formally reason about and compare encodability criteria by mapping them on requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  16. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  17. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  18. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller bearing assembling process, carried out in a big company with integrated bearing manufacturing processes. In these analyses the delay sample technique has been used to identify and classify all bearing assemblers' activities, and to obtain information on how much of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without supplementary investments and also indicates that process automation could be the solution to achieve maximum productivity.

  19. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  20. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  1. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  2. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments make it possible to measure the missing energy distribution as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are respectively analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include time coincidences and accidentals. The signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence; therefore it is mandatory to get a beam current distribution as flat as possible. The use of new technologies has made it possible to monitor the behavior of the beam pulse in real time and to determine when the duty cycle can be considered good with respect to a numerical basis

  3. Analyse af elbilers forbrug

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2014-01-01

    This report examines the GPS and CAN bus data collected while driving electric cars and analyses the energy consumption of electric cars. The analyses are based on roughly 133 million GPS and CAN bus measurements collected from 164 electric cars (Citroen C-Zero, Mitsubishi iMiev and Peugeot Ion) in the calendar year 2012.... As regards the data set, it can be concluded that substantial but simple improvements are needed to make it easier to use GPS/CAN bus data from electric cars in other analyses in the future. The use of electric cars is compared with conventional fuel cars, and the conclusion is that electric cars generally drive 10-15 km/h slower on...

  4. Analyse de "La banlieue"

    Directory of Open Access Journals (Sweden)

    Nelly Morais

    2006-11-01

    Full Text Available 1. Preamble - conditions under which this analysis was carried out. A group of first-year Master's students in FLE (French as a foreign language) at Université Paris 3 - that is, students of language didactics preparing to teach FLE - examined the product during a module on ICT (Information and Communication Technologies) and language didactics. A discussion then took place on the forum of a distance-learning platform, starting from a few questions posed by the teacher...

  5. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental and calculated simplified-analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  6. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could also be used for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by the probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to seismic events is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the basis of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The

  7. Analysing international relations

    DEFF Research Database (Denmark)

    Corry, Olaf

    2014-01-01

    ... matters by depicting reality in new ways. I then show how different theories rely on different ‘pictures’ of what makes up the international system. Section 2 shows how theories differ in terms of their scope, their aims and their purposes. Section 3 explores some of the choices to be made when using theories to ‘explain’ international relations and distinguishes between different kinds of explanation. In Section 4 I look at how different theories have been grouped – first according to their underlying views of what is valid knowledge, and second in terms of different accounts of how history works...

  8. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    The website is increasingly the preferred medium for information retrieval, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction... or dead ends when he/she visits the site. Studies of the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only to a limited extent been treated reflectively. That is the background for this chapter, which opens with a review of aesthetics...

  9. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that due to the wide band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to exactly find out the shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed due to the sharing at the boundaries of channels. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with the full scale counts. At the boundaries all memory locations will have counts. This shape is a direct display of the channel boundaries. (orig.)

  10. Analyse van kwalitatief onderzoeksmateriaal

    NARCIS (Netherlands)

    Wester, F.P.J.

    2004-01-01

    Qualitative research is characterised by its analytical goals: the development of categories, the elaboration of concepts or the formulation of a theory. Because of this analytical openness, the research design shows successive phases, each with its own objective and specific demands for data

  11. Systematic review with meta-analysis: faecal occult blood tests show lower colorectal cancer detection rates in the proximal colon in colonoscopy-verified diagnostic studies.

    Science.gov (United States)

    Hirai, H W; Tsoi, K K F; Chan, J Y C; Wong, S H; Ching, J Y L; Wong, M C S; Wu, J C Y; Chan, F K L; Sung, J J Y; Ng, S C

    2016-04-01

    The performance of faecal occult blood tests (FOBTs) to screen for proximally located colorectal cancer (CRC) has produced inconsistent results. To assess, in a meta-analysis, the diagnostic accuracy of FOBTs for the relative detection of CRC according to the anatomical location of the CRC. Diagnostic studies including both symptomatic and asymptomatic cohorts assessing the performance of FOBTs for CRC were searched from MEDLINE and EMBASE. The primary outcome was the accuracy of FOBTs according to the anatomical location of the CRC. A bivariate random-effects model was used. Subgroup analyses were performed to evaluate the test performance of guaiac-based FOBT (gFOBT) and immunochemical-based FOBT (iFOBT). Thirteen studies, with 17 cohorts, reporting the performance of FOBT were included; a total of 26 342 patients (mean age 58.9 years; 58.1% male) underwent both colonoscopy and FOBT. Pooled sensitivity, specificity, positive likelihood ratio and negative likelihood ratio of FOBTs for CRC detection in the proximal colon were 71.2% (95% CI 61.3-79.4%), 93.6% (95% CI 90.7-95.7%), 11.1 (95% CI 7.8-15.8) and 0.3 (95% CI 0.2-0.4) respectively. Corresponding findings for CRC detection in the distal colon were 80.1% (95% CI 70.9-87.0%), 93.6% (95% CI 90.7-95.7%), 12.6 (95% CI 8.8-18.1) and 0.2 (95% CI 0.1-0.3). The areas under the curve for FOBT detection of proximal and distal CRC were 90% vs. 94% (P = 0.0143). Both gFOBT and iFOBT showed significantly lower sensitivity but comparable specificity for the detection of proximally located CRC compared with distal CRC. Faecal occult blood tests, both guaiac- and immunochemical-based, show better diagnostic performance for the relative detection of colorectal cancer in the distal colon than in the proximal bowel. © 2016 John Wiley & Sons Ltd.
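
    For readers unfamiliar with the quoted measures, the sketch below shows how sensitivity, specificity and the two likelihood ratios follow from a single 2x2 table; the counts are hypothetical, and the bivariate random-effects pooling used in the review itself is not reproduced here.

```python
# Minimal sketch of diagnostic accuracy measures from one 2x2 table
# (tp, fp, fn, tn are hypothetical counts, not data from the review).
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),         # positive likelihood ratio
        "LR-": (1 - sens) / spec,         # negative likelihood ratio
    }

print(diagnostic_accuracy(tp=71, fp=64, fn=29, tn=936))
```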

  12. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…

  13. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  14. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  15. Molecular ecological network analyses

    Directory of Open Access Journals (Sweden)

    Deng Ye

    2012-05-01

    Full Text Available Abstract Background Understanding the interaction among different species within a community and their responses to environmental changes is a central goal in ecology. However, defining the network structure in a microbial community is very challenging due to their extremely high diversity and as-yet uncultivated status. Although recent advances in metagenomic technologies, such as high-throughput sequencing and functional gene arrays, provide revolutionary tools for analyzing microbial community structure, it is still difficult to examine network interactions in a microbial community based on high-throughput metagenomics data. Results Here, we describe a novel mathematical and bioinformatics framework to construct ecological association networks named molecular ecological networks (MENs) through Random Matrix Theory (RMT)-based methods. Compared to other network construction methods, this approach is remarkable in that the network is automatically defined and robust to noise, thus providing excellent solutions to several common issues associated with high-throughput metagenomics data. We applied it to determine the network structure of microbial communities subjected to long-term experimental warming based on pyrosequencing data of 16S rRNA genes. We showed that the constructed MENs under both warming and unwarming conditions exhibited topological features of scale-free, small-world and modularity, which were consistent with previously described molecular ecological networks. Eigengene analysis indicated that the eigengenes represented the module profiles relatively well. Consistent with many other studies, several major environmental traits including temperature and soil pH were found to be important in determining network interactions in the microbial communities examined. To facilitate its application by the scientific community, all these methods and statistical tools have been integrated into a comprehensive Molecular Ecological
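
    As a schematic of the association-network idea described above (not of the RMT machinery itself), the sketch below builds an adjacency matrix from pairwise Spearman correlations of a hypothetical OTU abundance table; the MEN approach selects the similarity threshold automatically via Random Matrix Theory, whereas a fixed cut-off of 0.7 stands in for that step here.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Hypothetical OTU abundance table: 20 samples x 30 OTUs (placeholder data).
abundances = rng.poisson(lam=5.0, size=(20, 30))

# Pairwise Spearman correlations between OTUs across samples (30 x 30 matrix).
rho, _ = spearmanr(abundances)

# Fixed cut-off used only for illustration; the paper derives it from RMT.
threshold = 0.7
adjacency = (np.abs(rho) >= threshold) & ~np.eye(rho.shape[0], dtype=bool)
print("edges in the association network:", adjacency.sum() // 2)
```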

  16. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  17. Late-night talk show v USA

    OpenAIRE

    Halamásek, Šimon

    2013-01-01

    This thesis focuses on the history of the talk show in the USA, with emphasis on its specific form, the late-night talk show. The first chapter focuses on the creation of new television networks and the overall state of American broadcasting during the first era of the television talk show format. The thesis briefly describes radio broadcasting, which served not only as an important source of inspiration for television but also as a starting platform for most talk show hosts. The next chapter theoreti...

  18. Multi-level Bayesian analyses for single- and multi-vehicle freeway crashes.

    Science.gov (United States)

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-09-01

    This study presents multi-level analyses for single- and multi-vehicle crashes on a mountainous freeway. Data from a 15-mile mountainous freeway section on I-70 were investigated. Both aggregate and disaggregate models for the two crash conditions were developed. Five years of crash data were used in the aggregate investigation, while the disaggregate models utilized one year of crash data along with real-time traffic and weather data. For the aggregate analyses, safety performance functions were developed for the purpose of revealing the contributing factors for each crash type. Two methodologies, a Bayesian bivariate Poisson-lognormal model and a Bayesian hierarchical Poisson model with correlated random effects, were estimated to simultaneously analyze the two crash conditions with consideration of possible correlations. In addition to the factors related to geometric characteristics, two exposure parameters (annual average daily traffic and segment length) were included. Two different sets of significant explanatory and exposure variables were identified for the single-vehicle (SV) and multi-vehicle (MV) crashes. It was found that the Bayesian bivariate Poisson-lognormal model is superior to the Bayesian hierarchical Poisson model, the former with a substantially lower DIC and more significant variables. In addition to the aggregate analyses, microscopic real-time crash risk evaluation models were developed for the two crash conditions. Multi-level Bayesian logistic regression models were estimated with the random parameters accounting for seasonal variations, crash-unit-level diversity and segment-level random effects capturing unobserved heterogeneity caused by the geometric characteristics. The model results indicate that the effects of the selected variables on crash occurrence vary across seasons and crash units; and that geometric characteristic variables contribute to the segment variations: the more unobserved heterogeneity has been accounted for, the better
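
    As a rough illustration of the kind of model compared above, the sketch below simulates single- and multi-vehicle crash counts from a bivariate Poisson-lognormal structure with correlated segment-level random effects; all coefficients, exposure ranges and the covariance matrix are invented for demonstration and are not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_segments = 200

# Correlated lognormal random effects for SV and MV crash frequencies (assumed values).
cov = np.array([[0.30, 0.15],
                [0.15, 0.40]])
eps = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_segments)

# Linear predictors with exposure (AADT, segment length) entering in log form.
aadt = rng.uniform(5_000, 40_000, n_segments)
length = rng.uniform(0.2, 2.0, n_segments)          # miles
beta_sv = np.array([-8.0, 0.6, 0.9])                # intercept, ln(AADT), ln(length)
beta_mv = np.array([-9.5, 0.8, 1.0])
X = np.column_stack([np.ones(n_segments), np.log(aadt), np.log(length)])

mu_sv = np.exp(X @ beta_sv + eps[:, 0])
mu_mv = np.exp(X @ beta_mv + eps[:, 1])
sv_crashes = rng.poisson(mu_sv)
mv_crashes = rng.poisson(mu_mv)
print("empirical SV/MV correlation:",
      np.corrcoef(sv_crashes, mv_crashes)[0, 1].round(3))
```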

  19. Acculturation, Cultivation, and Daytime TV Talk Shows.

    Science.gov (United States)

    Woo, Hyung-Jin; Dominick, Joseph R.

    2003-01-01

    Explores the cultivation phenomenon among international college students in the United States by examining the connection between levels of acculturation, daytime TV talk show viewing, and beliefs about social reality. Finds that students who scored low on acculturation and watched a great deal of daytime talk shows had a more negative perception…

  20. Effects of Talk Show Viewing on Adolescents.

    Science.gov (United States)

    Davis, Stacy; Mares, Marie-Louise

    1998-01-01

    Investigates the effects of talk-show viewing on high-school students' social-reality beliefs. Supports the hypothesis that viewers overestimate the frequency of deviant behaviors; does not find support for the hypothesis that viewers become desensitized to the suffering of others; and finds that talk-show viewing was positively related, among…

  1. Multiple Imputation for Network Analyses

    NARCIS (Netherlands)

    Krause, Robert; Huisman, Mark; Steglich, Christian; Snijders, Thomas

    2016-01-01

    Missing data on network ties is a fundamental problem for network analyses. The biases induced by missing edge data, even when missing completely at random (MCAR), are widely acknowledged and problematic for network analyses (Kossinets, 2006; Huisman & Steglich, 2008; Huisman, 2009). Although

  2. Genomic analyses of the Chlamydia trachomatis core genome show an association between chromosomal genome, plasmid type and disease

    NARCIS (Netherlands)

    Versteeg, Bart; Bruisten, Sylvia M.; Pannekoek, Yvonne; Jolley, Keith A.; Maiden, Martin C. J.; van der Ende, Arie; Harrison, Odile B.

    2018-01-01

    Background: Chlamydia trachomatis (Ct) plasmid has been shown to encode genes essential for infection. We evaluated the population structure of Ct using whole-genome sequence data (WGS). In particular, the relationship between the Ct genome, plasmid and disease was investigated. Results: WGS data

  3. Community-level physiological profiling analyses show potential to identify the copiotrophic bacteria present in soil environments

    Czech Academy of Sciences Publication Activity Database

    Lladó, Salvador; Baldrian, Petr

    2017-01-01

    Vol. 12, No. 2 (2017), p. 1-9, article No. e0171638. E-ISSN 1932-6203 R&D Projects: GA ČR(CZ) GP14-09040P; GA MŠk(CZ) LD15086 Institutional support: RVO:61388971 Keywords: SUBSTRATE UTILIZATION PATTERNS * CARBON-SOURCE UTILIZATION * MICROBIAL COMMUNITIES Subject RIV: EE - Microbiology, Virology OECD field: Microbiology Impact factor: 2.806, year: 2016

  4. Tract-Specific Analyses of Diffusion Tensor Imaging Show Widespread White Matter Compromise in Autism Spectrum Disorder

    Science.gov (United States)

    Shukla, Dinesh K.; Keehn, Brandon; Muller, Ralph-Axel

    2011-01-01

    Background: Previous diffusion tensor imaging (DTI) studies have shown white matter compromise in children and adults with autism spectrum disorder (ASD), which may relate to reduced connectivity and impaired function of distributed networks. However, tract-specific evidence remains limited in ASD. We applied tract-based spatial statistics (TBSS)…

  5. Foodstuff analyses show that seafood and water are major perfluoroalkyl acids (PFAAs) sources to humans in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jin-Ju; Lee, Ji-Woo [Department of Civil and Environmental Engineering, Pusan National University, Busan, 609-735 (Korea, Republic of); Kim, Seung-Kyu [Department of Marine Science, College of Natural Sciences, Incheon National University, Incheon, 406-772 (Korea, Republic of); Oh, Jeong-Eun, E-mail: jeoh@pusan.ac.kr [Department of Civil and Environmental Engineering, Pusan National University, Busan, 609-735 (Korea, Republic of)

    2014-08-30

    Graphical abstract: - Highlights: • 16 PFAAs in 397 samples of 66 food types and 34 tap water samples were analyzed. • Dietary exposure to PFAAs was estimated by using the PFAAs measured concentrations. • The major contributors of PFAAs dietary exposure were confirmed. - Abstract: We measured concentrations of PFAAs in 397 foods, of 66 types, in Korea, and determined the daily human dietary PFAAs intake and the contribution of each foodstuff to that intake. The PFAAs concentration in the 66 different food types ranged from below the detection limit to 48.3 ng/g. Perfluorooctane sulfonate (PFOS) and long-chain perfluorocarboxylic acids (PFCAs) were the dominant PFAAs in fish, shellfish, and processed foods, while perfluorooctanoic acid (PFOA) and short-chain PFCAs dominated dairy foodstuffs and beverages. The Korean adult dietary intake ranges, estimated for a range of scenarios, were 0.60–3.03 and 0.17–1.68 ng kg⁻¹ bw d⁻¹ for PFOS and PFOA, respectively, which were lower than the total daily intake limits suggested by the European Food Safety Authority (PFOS: 150 ng kg⁻¹ bw d⁻¹; PFOA: 1500 ng kg⁻¹ bw d⁻¹). The major contributors to PFAAs dietary exposure varied with subject age and PFAAs. For example, fish was a major contributor of PFOS but dairy foods were major contributors of PFOA. However, tap water was a major contributor to PFOA intake when it was the main source of drinking water (rather than bottled water).
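
    The intake estimates quoted above follow the usual scheme of summing concentration times consumption over food groups and dividing by body weight. The sketch below is a hypothetical worked example; the concentrations, consumption rates and body weight are illustrative placeholders, not values reported in the study.

```python
# Estimated daily intake (ng/kg bw/day) =
#   sum over foods of concentration (ng/g) * daily consumption (g/day) / body weight (kg).
foods = {
    # food: (PFOS concentration ng/g, daily consumption g/day) -- illustrative values only
    "fish":      (1.2,   60.0),
    "shellfish": (0.8,   15.0),
    "dairy":     (0.05, 200.0),
    "tap water": (0.002, 1500.0),  # ng/mL treated as ng/g, mL/day as g/day
}
body_weight_kg = 60.0

edi = sum(conc * cons for conc, cons in foods.values()) / body_weight_kg
print(f"estimated daily intake: {edi:.2f} ng/kg bw/day")
```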

  6. Career development at London Vet Show.

    Science.gov (United States)

    2016-09-03

    Are you considering a career change? Perhaps you want help to develop within your current role? Either way, you will find a relevant session in the BVA Career Development stream at the London Vet Show in November. British Veterinary Association.

  7. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    Science.gov (United States)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping is a fundamental component of hazard assessment and is of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as mapping unit. The methods used are based on linear relationships between landslides and 9 considered conditioning factors (altimetry, slope angle, exposition, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The presented work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9,5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. These materials, associated with the steepness of the slopes (38,9% of the area has slope angles higher than 35°, reaching a maximum of 87,5°), make the area very prone to landslide activity. A total of 1.495 shallow landslides were mapped (at 1:5.000 scale) and included in a GIS database. The total affected area is 401.744 m2 (4,5% of the study area). Most slope movements are translational slides frequently evolving into debris-flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1,5 m and the volume is usually smaller than 700 m3. For modelling
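
    One of the bi-variate methods named above, the Informative Value method, reduces to a simple log-ratio of landslide densities per factor class. The sketch below is illustrative only; the slope-angle classes and pixel counts are hypothetical, not the study's data.

```python
import numpy as np

def informative_value(landslide_pixels, class_pixels):
    """Bivariate Informative Value per class of a conditioning factor:
    IV_i = ln( (S_i / N_i) / (S / N) ), where S_i/N_i is the landslide density
    inside class i and S/N the density over the whole study area."""
    landslide_pixels = np.asarray(landslide_pixels, dtype=float)
    class_pixels = np.asarray(class_pixels, dtype=float)
    overall_density = landslide_pixels.sum() / class_pixels.sum()
    return np.log((landslide_pixels / class_pixels) / overall_density)

# Hypothetical slope-angle classes (counts are illustrative, not from the study).
iv = informative_value(landslide_pixels=[120, 540, 830],        # 0-15, 15-35, >35 degrees
                       class_pixels=[400_000, 350_000, 200_000])
print(iv.round(2))   # positive values flag classes more prone to landsliding
```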

  8. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied: Open Street Map and The Pirate Bay. Results show...

  9. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  10. Microbiological And Physicochemical Analyses Of Oil Contaminated ...

    African Journals Online (AJOL)

    Michael Horsfall

    The physicochemical properties of the soil samples analysed show the pH ... Keywords: Oil contaminated soil, microbial isolates, mechanical workshops and physicochemical parameters. Pollution of the environment by petroleum ... strains capable of degrading polyaromatic hydrocarbons have been isolated from soil and.

  11. Ergonomic analyses of downhill skiing.

    Science.gov (United States)

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS)
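
    The processing chain described above (full-wave rectification, moving-average linear envelope, peak-amplitude normalization and integration) is straightforward to prototype. The sketch below uses synthetic data and an assumed 50 ms averaging window, since the exact window length is not given in the excerpt.

```python
import numpy as np

def linear_envelope(raw_emg, fs, window_ms=50.0):
    """Full-wave rectification followed by a moving-average linear envelope
    (window length is an assumption, not taken from the paper)."""
    rectified = np.abs(raw_emg)
    win = max(1, int(fs * window_ms / 1000.0))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

# Synthetic raw EMG for illustration.
fs = 1000.0                                   # sampling rate [Hz]
t = np.arange(0, 2.0, 1.0 / fs)
raw = np.random.default_rng(1).normal(0, 1, t.size) * (1 + np.sin(2 * np.pi * 1.5 * t))

env = linear_envelope(raw, fs)
env_norm = env / env.max()                    # peak-amplitude normalization per subject
iemg = env_norm.sum() / fs                    # integrated EMG as an intensity reference
print(f"iEMG (normalized envelope): {iemg:.3f}")
```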

  12. Online Italian fandoms of American TV shows

    Directory of Open Access Journals (Sweden)

    Eleonora Benecchi

    2015-06-01

    Full Text Available The Internet has changed media fandom in two main ways: it helps fans connect with each other despite physical distance, leading to the formation of international fan communities; and it helps fans connect with the creators of the TV show, deepening the relationship between TV producers and international fandoms. To assess whether Italian fan communities active online are indeed part of transnational online communities and whether the Internet has actually altered their relationship with the creators of the original text they are devoted to, qualitative analysis and narrative interviews of 26 Italian fans of American TV shows were conducted to explore the fan-producer relationship. Results indicated that the online Italian fans surveyed preferred to stay local, rather than using geography-leveling online tools. Further, the sampled Italian fans' relationships with the show runners were mediated or even absent.

  13. 2008 LHC Open Days Physics: the show

    CERN Multimedia

    2008-01-01

    A host of events and activities await visitors to the LHC Open Days on 5 and 6 April. A highlight will be the physics shows funded by the European Physical Society (EPS), which are set to surprise and challenge children and adults alike! [Image captions: School children use their experience of riding a bicycle to understand how planets move around the sun (Copyright: Circus Naturally); Participating in the Circus Naturally show could leave a strange taste in your mouth! (Copyright: Circus Naturally); The Rino Foundation’s experiments with liquid nitrogen can be pretty exciting! (Copyright: The Rino Foundation).] What does a bicycle have in common with the solar system? Have you ever tried to weigh air or visualise sound? Ever heard of a vacuum bazooka? If you want to discover the answers to these questions and more then come to the Physics Shows taking place at the CERN O...

  14. Archaea Signal Recognition Particle Shows the Way

    Directory of Open Access Journals (Sweden)

    Christian Zwieb

    2010-01-01

    Full Text Available Archaea SRP is composed of an SRP RNA molecule and two bound proteins named SRP19 and SRP54. Regulated by the binding and hydrolysis of guanosine triphosphates, the RNA-bound SRP54 protein transiently associates not only with the hydrophobic signal sequence as it emerges from the ribosomal exit tunnel, but also interacts with the membrane-associated SRP receptor (FtsY. Comparative analyses of the archaea genomes and their SRP component sequences, combined with structural and biochemical data, support a prominent role of the SRP RNA in the assembly and function of the archaea SRP. The 5e motif, which in eukaryotes binds a 72 kilodalton protein, is preserved in most archaea SRP RNAs despite the lack of an archaea SRP72 homolog. The primary function of the 5e region may be to serve as a hinge, strategically positioned between the small and large SRP domain, allowing the elongated SRP to bind simultaneously to distant ribosomal sites. SRP19, required in eukaryotes for initiating SRP assembly, appears to play a subordinate role in the archaea SRP or may be defunct. The N-terminal A region and a novel C-terminal R region of the archaea SRP receptor (FtsY are strikingly diverse or absent even among the members of a taxonomic subgroup.

  15. A Talk Show from the Past.

    Science.gov (United States)

    Gallagher, Arlene F.

    1991-01-01

    Describes a two-day activity in which elementary students examine voting rights, the right to assemble, and women's suffrage. Explains the game, "Assemble, Reassemble," and a student-produced talk show with five students playing the roles of leaders of the women's suffrage movement. Profiles Elizabeth Cady Stanton, Lucretia Mott, Susan…

  16. See you at London Vet Show.

    Science.gov (United States)

    2016-11-05

    London Vet Show is fast approaching: it takes place from November 17 to 18 and is being held at ExCeL London for the first time. Zoe Davies, marketing manager, highlights some of what BVA is offering at the event. British Veterinary Association.

  17. Show Them You Really Want the Job

    Science.gov (United States)

    Perlmutter, David D.

    2012-01-01

    Showing that one really "wants" the job entails more than just really wanting the job. An interview is part Broadway casting call, part intellectual dating game, part personality test, and part, well, job interview. When there are 300 applicants for a position, many of them will "fit" the required (and even the preferred) skills listed in the job…

  18. Identification of genes showing differential expression profile

    Indian Academy of Sciences (India)

    Suppression subtractive hybridization was used to identify genes showing differential expression profile associated with growth rate in skeletal muscle tissue of Landrace weanling pig. Two subtracted cDNA populations were generated from musculus longissimus muscle tissues of selected pigs with extreme expected ...

  19. Mike Pentz showing visitors around CESAR

    CERN Multimedia

    CERN PhotoLab

    1964-01-01

    Mike Pentz, leader of the CESAR Group, shows visitors around the 2 MeV electron storage ring. Here they are in the vault of the injector (a 2 MV van de Graaff generator), next to the 2 beam lines, one leading to the ring, the other to the spectrometer.

  20. Identification of genes showing differential expression profile ...

    Indian Academy of Sciences (India)

    Abstract. Suppression subtractive hybridization was used to identify genes showing differential expression profile associated with growth rate in skeletal muscle tissue of Landrace weanling pig. Two subtracted cDNA populations were generated from musculus longissimus muscle tissues of selected pigs with extreme ...

  1. Laser entertainment and light shows in education

    Science.gov (United States)

    Sabaratnam, Andrew T.; Symons, Charles

    2002-05-01

    Laser shows and beam effects have been a source of entertainment since their first public performance on May 9, 1969, at Mills College in Oakland, California. Since 1997, the Photonics Center, NgeeAnn Polytechnic, Singapore, has been using laser shows as a teaching tool. Students are able to exhibit their creative skills and learn at the same time how lasers are used in the entertainment industry. Students will acquire a number of skills including handling a three-phase power supply, operation of a cooling system, and laser alignment. Students also acquire an appreciation of the arts, learning about shapes and contours as they develop graphics for the shows. After holography, laser show animation provides a combination of the arts and technology. This paper aims to briefly describe how a krypton-argon laser, galvanometer scanners, a polychromatic acousto-optic modulator and related electronics are put together to develop a laser projector. The paper also describes how students are trained to make their own laser animation and beam effects with music, and at the same time have an appreciation of the operation of a Class IV laser and the handling of optical components.

  2. Tilapia show immunization response against Ich

    Science.gov (United States)

    This study compares the immune response of Nile tilapia and red tilapia against parasite Ichthyophthirius multifiliis (Ich) using a cohabitation challenge model. Both Nile and red tilapia showed strong immune response post immunization with live Ich theronts by IP injection or immersion. Blood serum...

  3. The Last Great American Picture Show

    NARCIS (Netherlands)

    Elsaesser, Thomas; King, Noel; Horwath, Alexander

    2004-01-01

    The Last Great American Picture Show brings together essays by scholars and writers who chart the changing evaluations of the American cinema of the 1970s, sometimes referred to as the decade of the lost generation, but now more and more recognized as the first New Hollywood, without which the

  4. The British Show in Australia, 1985

    Directory of Open Access Journals (Sweden)

    Anthony Bond

    2016-07-01

    Full Text Available In 1984–85, The British Show, an exhibition largely made up of New British Sculpture, was curated for Australia and New Zealand. This essay discusses the context and effects of the exhibition on art in Australia. It also seeks to define the sources of originality and innovation of the artists included.

  5. Identification of genes showing differential expression profile ...

    Indian Academy of Sciences (India)

    Suppression subtractive hybridization was used to identify genes showing differential expression profile associated with growth rate in skeletal muscle tissue of Landrace weanling pig. Two subtracted cDNA populations were generated from musculus longissimus muscle tissues of selected pigs with extreme expected ...

  6. Do men and women show love differently in marriage?

    Science.gov (United States)

    Schoenfeld, Elizabeth A; Bredow, Carrie A; Huston, Ted L

    2012-11-01

    In Western societies, women are considered more adept than men at expressing love in romantic relationships. Although scholars have argued that this view of love gives short shrift to men's ways of showing love (e.g., Cancian, 1986; Noller, 1996), the widely embraced premise that men and women "love differently" has rarely been examined empirically. Using data collected at four time points over 13 years of marriage, the authors examined whether love is associated with different behaviors for husbands and wives. Multilevel analyses revealed that, counter to theoretical expectations, both genders were equally likely to show love through affection. But whereas wives expressed love by enacting fewer negative or antagonistic behaviors, husbands showed love by initiating sex, sharing leisure activities, and doing household work together with their wives. Overall, the findings indicate that men and women show their love in more nuanced ways than cultural stereotypes suggest.

  7. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies that is often not recognized nor utilized

  8. La mujer, en los talk shows

    Directory of Open Access Journals (Sweden)

    Antrop. José Gamboa Cetina

    2001-01-01

    Full Text Available Among the mass media, one of those that has had the greatest impact on the Latin American population is television, and within broadcasters' programming a type of programme known as the talk show has emerged. In recent months, Mexican society has experienced the "boom" of talk shows, which in a short time have saturated the afternoon programming schedules of the two main television companies of the Mexican republic. This phenomenon can be studied from various perspectives. However, for reasons of space, on this occasion we analyse its impact on women, from different dimensions.

  9. Reality, ficción o show

    Directory of Open Access Journals (Sweden)

    Sandra Ruíz Moreno

    2002-01-01

    Full Text Available In order to take a clear and objective view of the controversy surrounding the programme “Protagonistas de novela” and the growing proliferation of reality shows in Colombian television schedules, a text and content analysis of the programme was carried out, attempting to define it in terms of its possibilities as reality, fiction and show. The units of analysis and the study of their treatment revealed a high level of content revolving around human emotions related to living together, treated as a show and with some textual contributions of fiction, but without its basic mediating element, the actor, removing any possibility of treating these subjects with the depth, distance and ethics they require. The result is a format that only seeks high ratings and belongs more to so-called "trash" television than to a search for the reality of human beings, much less of society.

  10. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B₄C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time window, unborated water from ECCS systems will start to reflood the partly control-rod-free core. Recriticality might take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during the super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. the containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in a large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  11. Bivariate Correlation Analysis of the Chemometric Profiles of Chinese Wild Salvia miltiorrhiza Based on UPLC-Qqq-MS and Antioxidant Activities

    Directory of Open Access Journals (Sweden)

    Xiaodan Zhang

    2018-02-01

    Full Text Available To better understand the mechanisms underlying the pharmacological actions of Salvia miltiorrhiza, correlation between the chemical profiles and in vitro antioxidant activities in 50 batches of wild S. miltiorrhiza samples was analyzed. Our ultra-performance liquid chromatography–tandem mass spectrometry analysis detected twelve phenolic acids and five tanshinones and obtained various chemical profiles from different origins. In a principal component analysis (PCA) and cluster analysis, the tanshinones cryptotanshinone, tanshinone IIA and dihydrotanshinone I exhibited higher weights in PC1, whereas the phenolic acids danshensu, salvianolic acids A and B and lithospermic acid were highly loaded in PC2. All components could be optimized as markers of different locations and might be suitable for S. miltiorrhiza quality analyses. Additionally, the DPPH and ABTS assays used to comprehensively evaluate antioxidant activities indicated large variations, with mean DPPH and ABTS scavenging potencies of 32.24 and 23.39 μg/mL, respectively, among S. miltiorrhiza extract solutions. Notably, samples that exceeded the mean IC50 values had higher phenolic acid contents. A correlation analysis indicated a strong correlation between the antioxidant activities and phenolic acid contents. Caffeic acid, danshensu, rosmarinic acid, lithospermic acid and salvianolic acid B were major contributors to antioxidant activity. In conclusion, phenolic compounds were the predominant antioxidant components in the investigated plant species. These plants may be sources of potent natural antioxidants and beneficial chemopreventive agents.
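
    The bivariate correlation step described above can be illustrated with a simple Pearson test between phenolic content and an antioxidant IC50; the data below are simulated placeholders, not the measured values from the 50 batches.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n_batches = 50

# Hypothetical data: total phenolic acid content (mg/g) and DPPH IC50 (ug/mL);
# a lower IC50 means stronger antioxidant activity.
phenolics = rng.uniform(20, 80, n_batches)
ic50_dpph = 60 - 0.5 * phenolics + rng.normal(0, 5, n_batches)

r, p = pearsonr(phenolics, ic50_dpph)
print(f"Pearson r = {r:.2f}, p = {p:.1e}")   # negative r: more phenolics, lower IC50
```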

  12. Bivariate analysis of the genetic variability among some accessions of African Yam Bean (Sphenostylis stenocarpa (Hochst ex A. Rich.) Harms)

    Directory of Open Access Journals (Sweden)

    Solomon Tayo AKINYOSOYE

    2017-12-01

    Full Text Available Variability is an important factor to consider in crop improvement programmes. This study was conducted over two years to assess genetic variability and determine the relationship between seed yield, its components and tuber production characters among twelve accessions of African yam bean. Data collected were subjected to combined analysis of variance (ANOVA), Principal Component Analysis (PCA), and hierarchical and K-means clustering analyses. Results revealed that the genotype by year (G × Y) interaction had significant effects on some of the variables measured in this study (days to first flowering, days to 50 % flowering, number of pods per plant, pod length, seed yield and tuber yield per plant). The first five principal components (PCs) with eigenvalues greater than 1.0 accounted for about 66.70 % of the total variation, where PC1 and PC2 accounted for 39.48 % of the variation and were associated with seed and tuber yield variables. Three heterotic groups were clearly delineated among the genotypes, with accessions AY03 and AY10 identified for high seed yield and tuber yield respectively. A non-significant relationship existed between tuber and seed yield per plant, and these accessions were recommended for further testing in various agro-ecologies for their suitability, adaptability and possible exploitation of heterosis to further improve the accessions.
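
    A minimal sketch of the PCA plus K-means workflow described above follows; the 12 x 6 trait matrix is random placeholder data standing in for the measured agronomic variables, and three clusters are requested simply to mirror the three heterotic groups reported.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Hypothetical trait matrix: 12 accessions x 6 agronomic variables
# (days to flowering, pods/plant, pod length, seeds/pod, seed yield, tuber yield).
traits = rng.normal(size=(12, 6))

X = StandardScaler().fit_transform(traits)   # standardize variables before PCA
pca = PCA(n_components=5)
scores = pca.fit_transform(X)
print("variance explained by PC1-PC5:", pca.explained_variance_ratio_.round(2))

# Three clusters, mirroring the three heterotic groups reported above.
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores[:, :2])
print("cluster assignment per accession:", groups)
```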

  13. Varicose veins show enhanced chemokine expression.

    Science.gov (United States)

    Solá, L del Rio; Aceves, M; Dueñas, A I; González-Fajardo, J A; Vaquero, C; Crespo, M Sanchez; García-Rodríguez, C

    2009-11-01

    Leucocyte infiltration in the wall of varicose veins has been reported previously. This study was designed to investigate the expression of pro-inflammatory cytokines and chemokines in controls and in patients with varicose veins and to test the effect of treating varicose vein patients with acetylsalicylic acid (ASA) on cytokine expression prior to removal of varices. Sections of vein were removed during operation from both patient groups, and ribonuclease protection assays (RPAs) were performed to assess the expression of chemokines. Group I included non-varicose saphenous veins from healthy patients undergoing amputation for trauma. Varicose veins were obtained from patients with primary varicose veins undergoing surgical treatment who received no drug (group II) or treatment with 300 mg day(-1) of ASA for 15 days before surgery (group III). Non-varicose veins constitutively expressed low levels of monocyte-chemoattractant protein (MCP-1) and interleukin (IL)-8 mRNA. Varicose veins had a distinct chemokine expression pattern, since significant up-regulation of MCP-1 and IL-8 and a marked expression of IP-10, RANTES, MIP-1alpha and MIP-1beta mRNA were detected. Removal of the endothelium did not alter this pattern. Varicose veins obtained from patients treated with ASA showed a consistent decrease in chemokine expression, although it did not reach statistical significance. Varicose veins showed increased expression of several chemokines compared to control veins. A non-significant reduction of activation was observed following treatment with ASA for 15 days.

  14. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software......, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security...... are based on (quite successful) ad-hoc techniques. We believe they can be significantly improved beyond the state-of-the-art by pairing them with static analyses techniques. In this paper we present an approach to both formalising those real-world systems, as well as providing an underlying semantics, which...

  15. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that both wrong sampling and sample treatment cannot be corrected anymore. These, in the past frequently neglected facts, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  16. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA...... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well...

  17. Microbiological and environmental issues in show caves.

    Science.gov (United States)

    Saiz-Jimenez, Cesareo

    2012-07-01

    Cultural tourism expanded in the last half of the twentieth century, and the interest of visitors has come to include caves containing archaeological remains. Some show caves attracted mass tourism, and economical interests prevailed over conservation, which led to a deterioration of the subterranean environment and the rock art. The presence and the role of microorganisms in caves is a topic that is often ignored in cave management. Knowledge of the colonisation patterns, the dispersion mechanisms, and the effect on human health and, when present, over rock art paintings of these microorganisms is of the utmost importance. In this review the most recent advances in the study of microorganisms in caves are presented, together with the environmental implications of the findings.

  18. Ancient bacteria show evidence of DNA repair

    DEFF Research Database (Denmark)

    Johnson, Sarah Stewart; Hebsgaard, Martin B; Christensen, Torben R

    2007-01-01

    Recent claims of cultivable ancient bacteria within sealed environments highlight our limited understanding of the mechanisms behind long-term cell survival. It remains unclear how dormancy, a favored explanation for extended cellular persistence, can cope with spontaneous genomic decay over...... long-term survival of bacteria sealed in frozen conditions for up to one million years. Our results show evidence of bacterial survival in samples up to half a million years in age, making this the oldest independently authenticated DNA to date obtained from viable cells. Additionally, we find strong evidence...... that this long-term survival is closely tied to cellular metabolic activity and DNA repair that over time proves to be superior to dormancy as a mechanism in sustaining bacteria viability....

  19. Plant species descriptions show signs of disease.

    Science.gov (United States)

    Hood, Michael E; Antonovics, Janis

    2003-11-07

    It is well known that diseases can greatly influence the morphology of plants, but often the incidence of disease is either too rare or the symptoms too obvious for the 'abnormalities' to cause confusion in systematics. However, we have recently come across several misinterpretations of disease-induced traits that may have been perpetuated into modern species inventories. Anther-smut disease (caused by the fungus Microbotryum violaceum) is common in many members of the Caryophyllaceae and related plant families. This disease causes anthers of infected plants to be filled with dark-violet fungal spores rather than pollen. Otherwise, their vegetative morphology is within the normal range of healthy plants. Here, we present the results of a herbarium survey showing that a number of type specimens (on which the species name and original description are based) in the genus Silene from Asia are diseased with anther smut. The primary visible disease symptom, namely the dark-violet anthers, is incorporated into the original species descriptions and some of these descriptions have persisted unchanged into modern floras. This raises the question of whether diseased type specimens have erroneously been given unique species names.

  20. NASA GIBS Use in Live Planetarium Shows

    Science.gov (United States)

    Emmart, C. B.

    2015-12-01

    The American Museum of Natural History's Hayden Planetarium was rebuilt in the year 2000 as an immersive theater for scientific data visualization to show the universe in context to our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe browsing capabilities have been built into a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network amongst the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of data available in context to other sources such as Science on a Sphere, NASA Earth Observatory and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues that could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easy-access, multi-scale streaming format like WMS has tremendously enabled particularly facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential for impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent, global interdependency topics.

  1. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim in order to understand cultural, sociological, design-related, commercial and many other aspects. One sub-area within this is the systemic analysis and description of products and systems. The present compend...

  2. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  3. Geoscience is Important? Show Me Why

    Science.gov (United States)

    Boland, M. A.

    2017-12-01

    "The public" is not homogenous and no single message or form of messaging will connect the entire public with the geosciences. One approach to promoting trust in, and engagement with, the geosciences is to identify specific sectors of the public and then develop interactions and communication products that are immediately relevant to that sector's interests. If the content and delivery are appropriate, this approach empowers people to connect with the geosciences on their own terms and to understand the relevance of the geosciences to their own situation. Federal policy makers are a distinct and influential subgroup of the general public. In preparation for the 2016 presidential election, the American Geosciences Institute (AGI) in collaboration with its 51 member societies prepared Geoscience for America's Critical Needs: Invitation to a National Dialogue, a document that identified major geoscience policy issues that should be addressed in a national policy platform. Following the election, AGI worked with eight other geoscience societies to develop Geoscience Policy Recommendations for the New Administration and the 115th Congress, which outlines specific policy actions to address national issues. State and local decision makers are another important subgroup of the public. AGI has developed online content, factsheets, and case studies with different levels of technical complexity so people can explore societally-relevant geoscience topics at their level of technical proficiency. A related webinar series is attracting a growing worldwide audience from many employment sectors. Partnering with government agencies and other scientific and professional societies has increased the visibility and credibility of these information products with our target audience. Surveys and other feedback show that these products are raising awareness of the geosciences and helping to build reciprocal relationships between geoscientists and decision makers. The core message of all

  4. Safety analyses of surface facilities

    International Nuclear Information System (INIS)

    Anspach, W.; Baran, A.; Dorst, H.J.; Eifert, B.; Gruen, M.; Behrendt, V.; Berkhan, W.; Dincklage, R.D. v.; Doehler, J.; Bruecher, H.

    1981-01-01

    The investigations were carried out using the example of the Gorleben waste disposal center and the planning documents established for this center. The safety analyses refer to the transport of spent fuel elements, the water-cooled interim storage and the reprocessing stage. Regarding the risk analysis of the technical systems, a methodical further development allows the dynamics of incident sequences to be taken into account more accurately. (DG) [de

  5. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  6. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  7. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    Science.gov (United States)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three times) in most of the Southern hemisphere, the North Pacific ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.

  8. Providing traceability for neuroimaging analyses.

    Science.gov (United States)

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential, but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of

  9. Is bilingualism associated with a lower risk of dementia in community-living older adults? Cross-sectional and prospective analyses.

    Science.gov (United States)

    Yeung, Caleb M; St John, Philip D; Menec, Verena; Tyas, Suzanne L

    2014-01-01

    The aim of this study was to determine whether bilingualism is associated with dementia in cross-sectional or prospective analyses of older adults. In 1991, 1616 community-living older adults were assessed and were followed 5 years later. Measures included age, sex, education, subjective memory loss (SML), and the modified Mini-mental State Examination (3MS). Dementia was determined by clinical examination in those who scored below the cut point on the 3MS. Language status was categorized based upon self-report into 3 groups: English as a first language (monolingual English, bilingual English) and English as a Second Language (ESL). The ESL category had lower education, lower 3MS scores, more SML, and were more likely to be diagnosed with cognitive impairment, no dementia at both time 1 and time 2 compared with those speaking English as a first language. There was no association between being bilingual (ESL and bilingual English vs. monolingual) and having dementia at time 1 in bivariate or multivariate analyses. In those who were cognitively intact at time 1, there was no association between being bilingual and having dementia at time 2 in bivariate or multivariate analyses. We did not find any association between speaking >1 language and dementia.

  10. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)? For some carbohydrates, we

  11. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  12. Incorporating nonlinearity into mediation analyses.

    Science.gov (United States)

    Knafl, George J; Knafl, Kathleen A; Grey, Margaret; Dixon, Jane; Deatrick, Janet A; Gallo, Agatha M

    2017-03-21

    Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.
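
    As a rough illustration of the quantities involved, the Python sketch below fits the classical two-regression mediation decomposition, with a single power-transformed mediator term standing in for the fractional-polynomial (monotonic) idea; the data are synthetic and the code is not the authors' adaptive procedure.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 300
      X = rng.normal(size=n)                        # predictor, e.g. family functioning (synthetic)
      M = 0.6 * X + rng.normal(scale=0.8, size=n)   # mediator, e.g. management difficulty
      Y = 0.5 * np.sign(M) * np.abs(M) ** 0.5 + 0.2 * X + rng.normal(scale=0.8, size=n)

      def ols(y, *cols):
          """Ordinary least squares with an intercept; returns the fitted results."""
          return sm.OLS(y, sm.add_constant(np.column_stack(cols))).fit()

      a = ols(M, X).params[1]                       # X -> M path
      M_t = np.sign(M) * np.abs(M) ** 0.5           # one fractional-polynomial-style monotonic term
      fit_y = ols(Y, X, M_t)
      b = fit_y.params[2]                           # (transformed) M -> Y path
      c_prime = fit_y.params[1]                     # direct effect of X

      print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")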

  13. Incorporating nonlinearity into mediation analyses

    Directory of Open Access Journals (Sweden)

    George J. Knafl

    2017-03-01

    Full Text Available Abstract Background Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Methods Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Results Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Conclusions Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.

  14. DMPD: Structural and functional analyses of bacterial lipopolysaccharides. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available Structural and functional analyses of bacterial lipopolysaccharides. PubmedID 12106784. Authors: Carof...

  15. Stereology of extremes; bivariate models and computation

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Hlubinka, D.

    2003-01-01

    Roč. 5, č. 3 (2003), s. 289-308 ISSN 1387-5841 R&D Projects: GA AV ČR IAA1075201; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z1075907 Keywords: sample extremes * domain of attraction * normalizing constants Subject RIV: BA - General Mathematics

  16. A Vehicle for Bivariate Data Analysis

    Science.gov (United States)

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  17. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between two ways of measuring financial performance: the first uses data offered by accounting and lays emphasis on maximizing profit, while the second aims to create value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
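
    For readers unfamiliar with the value-based indicators mentioned above, the short sketch below applies the standard textbook definitions of EVA and MVA in Python; the figures are invented purely for illustration and are not taken from the paper's case analysis.

      def eva(nopat, invested_capital, wacc):
          """Economic Value Added = NOPAT minus the capital charge."""
          return nopat - wacc * invested_capital

      def mva(market_value, invested_capital):
          """Market Value Added = market value of the firm minus invested capital."""
          return market_value - invested_capital

      print(eva(nopat=120.0, invested_capital=1000.0, wacc=0.09))   # 30.0 -> value created
      print(mva(market_value=1400.0, invested_capital=1000.0))      # 400.0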

  18. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  19. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Efurd, D.W.; Rokop, D.J.

    1997-01-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  20. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  1. Five Kepler target stars that show multiple transiting exoplanet candidates

    Energy Technology Data Exchange (ETDEWEB)

    Steffen, Jason H.; /Fermilab; Batalha, Natalie M.; /San Jose State U.; Borucki, William J.; /NASA, Ames; Buchhave, Lars A.; /Harvard-Smithsonian Ctr. Astrophys. /Bohr Inst.; Caldwell, Douglas A.; /NASA, Ames /SETI Inst., Mtn. View; Cochran, William D.; /Texas U.; Endl, Michael; /Texas U.; Fabrycky, Daniel C.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Ford, Eric B.; /Florida U.; Fortney, Jonathan J.; /UC, Santa Cruz, Phys. Dept. /NASA, Ames

    2010-06-01

    We present and discuss five candidate exoplanetary systems identified with the Kepler spacecraft. These five systems show transits from multiple exoplanet candidates. Should these objects prove to be planetary in nature, then these five systems open new opportunities for the field of exoplanets and provide new insights into the formation and dynamical evolution of planetary systems. We discuss the methods used to identify multiple transiting objects from the Kepler photometry as well as the false-positive rejection methods that have been applied to these data. One system shows transits from three distinct objects while the remaining four systems show transits from two objects. Three systems have planet candidates that are near mean motion commensurabilities - two near 2:1 and one just outside 5:2. We discuss the implications that multitransiting systems have on the distribution of orbital inclinations in planetary systems, and hence their dynamical histories; as well as their likely masses and chemical compositions. A Monte Carlo study indicates that, with additional data, most of these systems should exhibit detectable transit timing variations (TTV) due to gravitational interactions - though none are apparent in these data. We also discuss new challenges that arise in TTV analyses due to the presence of more than two planets in a system.

  2. Recurrent and multiple bladder tumors show conserved expression profiles

    International Nuclear Information System (INIS)

    Lindgren, David; Fioretos, Thoas; Månsson, Wiking; Höglund, Mattias; Gudjonsson, Sigurdur; Jee, Kowan Ja; Liedberg, Fredrik; Aits, Sonja; Andersson, Anna; Chebil, Gunilla; Borg, Åke; Knuutila, Sakari

    2008-01-01

    Urothelial carcinomas originate from the epithelial cells of the inner lining of the bladder and may appear as single or as multiple synchronous tumors. Patients with urothelial carcinomas frequently show recurrences after treatment, making follow-up necessary. The leading hypothesis explaining the origin of meta- and synchronous tumors assumes a monoclonal origin. However, the genetic relationship among consecutive tumors has been shown to be complex, inasmuch as the genetic evolution does not adhere to the chronological appearance of the metachronous tumors. Consequently, genetically less evolved tumors may appear chronologically later than genetically related but more evolved tumors. Forty-nine meta- or synchronous urothelial tumors from 22 patients were analyzed using expression profiling, conventional CGH, LOH, and mutation analyses. We show by CGH that partial chromosomal losses in the initial tumors may not be present in the recurring tumors, by LOH that different haplotypes may be lost and that detected regions of LOH may be smaller in recurring tumors, and that mutations present in the initial tumor may not be present in the recurring ones. In contrast, we show that despite apparent genomic differences, the recurrent and multiple bladder tumors from the same patients display remarkably similar expression profiles. Our findings show that even though the vast majority of the analyzed meta- and synchronous tumors from the same patients are not likely to have originated directly from the preceding tumor, they still show remarkably similar expression profiles. The presented data suggest that an expression profile is established early in tumor development and that this profile is stable and maintained in recurring tumors

  3. Analyses Of High Voltage Transmission Cables In South Western ...

    African Journals Online (AJOL)

    ... unit diameter) of the analysed underground cable is good compared with the standard conductor, the overhead cable has a poor load bearing capability. SEM and EDX analyses show that both the underground and the overhead cables contain impurities that are deleterious to the structure and their functional properties.

  4. Signature of Nonstationarity in Precipitation Extremes over Urbanizing Regions in India Identified through a Multivariate Frequency Analyses

    Science.gov (United States)

    Singh, Jitendra; Hari, Vittal; Sharma, Tarul; Karmakar, Subhankar; Ghosh, Subimal

    2016-04-01

    The statistical assumption of stationarity in hydrologic extreme time/event series has been relied on heavily in frequency analysis. However, due to the analytically perceivable impacts of climate change, urbanization and concomitant land use patterns, the assumption of stationarity in hydrologic time series can yield erroneous results, which in turn may affect policy and decision-making. Past studies provided sufficient evidence of changes in the characteristics of Indian monsoon precipitation extremes, attributed to climate change and urbanization, which shows the need for nonstationary analysis of the Indian monsoon extremes. Therefore, a comprehensive multivariate nonstationary frequency analysis has been conducted for the whole of India to identify the precipitation characteristics (intensity, duration and depth) responsible for significant nonstationarity in the Indian monsoon. We use 1° resolution precipitation data for the period 1901-2004, in a Generalized Additive Model for Location, Scale and Shape (GAMLSS) framework. A cluster of GAMLSS models has been developed by considering nonstationarity in different combinations of distribution parameters through different regression techniques, and the best-fit model is further applied for bivariate analysis. Population density data have been utilized to identify the urban, urbanizing and rural regions. The results showed significant differences between the stationary and nonstationary bivariate return periods for the urbanizing grids, when compared to urbanized and rural grids. A comprehensive multivariate analysis has also been conducted to identify the precipitation characteristics particularly responsible for imprinting the signature of nonstationarity.
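
    A minimal sketch of the basic idea, assuming a GEV distribution whose location parameter drifts linearly with time (one of the simplest GAMLSS-style formulations), is given below in Python; the annual-maximum series is synthetic and the code does not reproduce the paper's model selection or bivariate machinery.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      years = np.arange(1901, 2005)
      t = (years - years.mean()) / years.std()
      # Synthetic annual maxima with a drifting location parameter
      x = genextreme.rvs(c=-0.1, loc=50 + 3 * t, scale=10, random_state=rng)

      def neg_log_lik(theta):
          mu0, mu1, log_sigma, shape = theta
          mu = mu0 + mu1 * t                        # time-varying location
          return -genextreme.logpdf(x, c=shape, loc=mu, scale=np.exp(log_sigma)).sum()

      res = minimize(neg_log_lik, x0=[x.mean(), 0.0, np.log(x.std()), -0.1],
                     method="Nelder-Mead")
      mu0, mu1, log_sigma, shape = res.x
      print(f"fitted location trend: {mu1:.2f} per standardised year")

      # 100-year return level at the start and end of the record
      for ti in (t[0], t[-1]):
          rl = genextreme.ppf(1 - 1 / 100, c=shape, loc=mu0 + mu1 * ti,
                              scale=np.exp(log_sigma))
          print(f"100-yr return level at t = {ti:+.2f}: {rl:.1f}")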

  5. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  6. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCD's which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis

  7. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
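
    The analysis step lends itself to a simple curve fit. The sketch below fits a damped sinusoid to a magnetometer trace in Python with scipy, in place of the Eureqa step described in the paper; the trace is synthesised here, and the column layout of a CSV exported from the Sensor Kinetics application would need to be checked against the app itself.

      import numpy as np
      from scipy.optimize import curve_fit

      def damped_sine(t, A, gamma, f, phi, offset):
          return A * np.exp(-gamma * t) * np.cos(2 * np.pi * f * t + phi) + offset

      t = np.linspace(0, 10, 1000)                   # seconds
      clean = damped_sine(t, A=40.0, gamma=0.15, f=1.2, phi=0.3, offset=25.0)
      b = clean + np.random.default_rng(2).normal(scale=1.0, size=t.size)  # noisy field, µT

      p0 = [np.ptp(b) / 2, 0.1, 1.0, 0.0, b.mean()]  # rough initial guesses
      popt, _ = curve_fit(damped_sine, t, b, p0=p0)
      print(f"fitted frequency: {popt[2]:.2f} Hz, damping: {popt[1]:.3f} 1/s")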

  8. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations

  9. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there is no work studying these properties for bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.
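
    The fractality referred to above is usually estimated by box covering. The sketch below implements the standard unweighted greedy box-covering estimate of the box dimension in Python with networkx, run on a small toy graph; the weighted variants the authors develop for rating data are not reproduced, and the toy graph is used only to show the mechanics.

      import numpy as np
      import networkx as nx
      from networkx.algorithms.coloring import greedy_color

      def number_of_boxes(G, l_B):
          """Greedy box covering: colour a dual graph linking nodes at distance >= l_B."""
          dist = dict(nx.all_pairs_shortest_path_length(G))
          dual = nx.Graph()
          dual.add_nodes_from(G)
          for u in G:
              for v in G:
                  if u < v and dist[u].get(v, np.inf) >= l_B:
                      dual.add_edge(u, v)           # cannot share a box of diameter < l_B
          colouring = greedy_color(dual, strategy="largest_first")
          return len(set(colouring.values()))

      G = nx.barabasi_albert_graph(200, 2, seed=3)  # toy graph, purely illustrative
      sizes = np.array([2, 3, 4, 5, 6])
      boxes = np.array([number_of_boxes(G, l) for l in sizes])

      # Box dimension d_B from N_B ~ l_B^(-d_B)
      slope, _ = np.polyfit(np.log(sizes), np.log(boxes), 1)
      print(f"estimated box dimension d_B ≈ {-slope:.2f}")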

  10. Herbarium specimens show contrasting phenological responses to Himalayan climate.

    Science.gov (United States)

    Hart, Robbie; Salick, Jan; Ranjitkar, Sailesh; Xu, Jianchu

    2014-07-22

    Responses by flowering plants to climate change are complex and only beginning to be understood. Through analyses of 10,295 herbarium specimens of Himalayan Rhododendron collected by plant hunters and botanists since 1884, we were able to separate these responses into significant components. We found a lack of directional change in mean flowering time over the past 45 y of rapid warming. However, over the full 125 y of collections, mean flowering time shows a significant response to year-to-year changes in temperature, and this response varies with season of warming. Mean flowering advances with annual warming (2.27 d earlier per 1 °C warming), and also is delayed with fall warming (2.54 d later per 1 °C warming). Annual warming may advance flowering through positive effects on overwintering bud formation, whereas fall warming may delay flowering through an impact on chilling requirements. The lack of a directional response suggests that contrasting phenological responses to temperature changes may obscure temperature sensitivity in plants. By drawing on large collections from multiple herbaria, made over more than a century, we show how these data may inform studies even of remote localities, and we highlight the increasing value of these and other natural history collections in understanding long-term change.

  11. We're Playing "Jeremy Kyle"! Television Talk Shows in the Playground

    Science.gov (United States)

    Marsh, Jackie; Bishop, Julia

    2014-01-01

    This paper focuses on an episode of play in a primary school playground in England, which featured a group of children re-enacting elements of the television talk show "The Jeremy Kyle Show". The episode is analysed in the light of work that has identified the key elements of the talk show genre and the children's play is examined in…

  12. Tomato Fruits Show Wide Phenomic Diversity but Fruit Developmental Genes Show Low Genomic Diversity.

    Directory of Open Access Journals (Sweden)

    Vijee Mohan

    Full Text Available Domestication of tomato has resulted in large diversity in fruit phenotypes. An intensive phenotyping of 127 tomato accessions from 20 countries revealed extensive morphological diversity in fruit traits. The diversity in fruit traits clustered the accessions into nine classes and identified certain promising lines having desirable traits pertaining to total soluble solids (TSS), carotenoids, ripening index, weight and shape. Factor analysis of the morphometric data from Tomato Analyzer showed that the fruit shape is a complex trait shared by several factors. The 100% variance between round and flat fruit shapes was explained by one discriminant function having a canonical correlation of 0.874 by stepwise discriminant analysis. A set of 10 genes (ACS2, COP1, CYC-B, RIN, MSH2, NAC-NOR, PHOT1, PHYA, PHYB and PSY1) involved in various plant developmental processes were screened for SNP polymorphism by EcoTILLING. The genetic diversity in these genes revealed a total of 36 non-synonymous and 18 synonymous changes, leading to the identification of 28 haplotypes. The average frequency of polymorphism across the genes was 0.038/Kb. A significantly negative Tajima's D statistic in two of the genes, ACS2 and PHOT1, indicated the presence of rare alleles in low frequency. Our study indicates that while there is low polymorphic diversity in the genes regulating plant development, the population shows wider phenotype diversity. Nonetheless, morphological and genetic diversity of the present collection can be further exploited as potential resources in future.

  13. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  14. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We present an approach, CaPiTo, to modelling service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial...... standards used for implementing the service-oriented applications. By doing so, we will be able not only to reason about applications at different levels of abstraction, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach......

  15. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  16. Bayesian analyses of cognitive architecture.

    Science.gov (United States)

    Houpt, Joseph W; Heathcote, Andrew; Eidels, Ami

    2017-06-01

    The question of cognitive architecture - how cognitive processes are temporally organized - has arisen in many areas of psychology. This question has proved difficult to answer, with many proposed solutions turning out to be spurious. Systems factorial technology (Townsend & Nozawa, 1995) provided the first rigorous empirical and analytical method of identifying cognitive architecture, using the survivor interaction contrast (SIC) to determine when people are using multiple sources of information in parallel or in series. Although the SIC is based on rigorous nonparametric mathematical modeling of response time distributions, for many years inference about cognitive architecture has relied solely on visual assessment. Houpt and Townsend (2012) recently introduced null hypothesis significance tests, and here we develop both parametric and nonparametric (encompassing prior) Bayesian inference. We show that the Bayesian approaches can have considerable advantages. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
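
    The survivor interaction contrast itself is straightforward to compute from double-factorial response-time data: SIC(t) = [S_LL(t) - S_LH(t)] - [S_HL(t) - S_HH(t)]. The Python sketch below evaluates it from empirical survivor functions on synthetic response times; it does not reproduce the parametric or encompassing-prior Bayesian tests developed in the paper.

      import numpy as np

      rng = np.random.default_rng(4)
      rt = {  # response times (s) for the four double-factorial conditions (synthetic)
          "LL": rng.gamma(6.0, 0.09, 500),
          "LH": rng.gamma(5.0, 0.09, 500),
          "HL": rng.gamma(5.0, 0.09, 500),
          "HH": rng.gamma(4.0, 0.09, 500),
      }

      grid = np.linspace(0.0, 1.5, 300)

      def survivor(x):
          """Empirical survivor function S(t) = P(T > t) evaluated on the common time grid."""
          return 1.0 - np.searchsorted(np.sort(x), grid, side="right") / x.size

      S = {k: survivor(v) for k, v in rt.items()}
      sic = (S["LL"] - S["LH"]) - (S["HL"] - S["HH"])
      print("max SIC:", sic.max().round(3), "min SIC:", sic.min().round(3))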

  17. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time-interval, to reduce costs and to achieve a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components which are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be combined without any problems. Software components themselves are well tested, of course, but when they are composed together problems can still occur. Most problems are based on interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified by using an analyser framework, and discusses the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and its results are shown using a test environment.

  18. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

    Full Text Available The major obstacles in the development of broiler raising are the high price of feed and the fluctuating price of DOCs (day-old chicks). The cheap price of imported leg quarters reduces the competitiveness of the local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. Besides that, a simulation was conducted to analyze the effects of changes in DOC price, broiler price and production capacity. The analyses showed that integrated farming, and a mere combination of broiler raising with a feed factory at a 10,000-bird capacity, is not financially feasible. Increasing the production to 25,000 broiler chickens makes the integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.

  19. Preserving the nuclear option: analyses and recommendations

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    It is certain that a future role for nuclear power will depend on substantial changes in the management and regulation of the enterprise. It is widely believed that institutional, rather than technological, change is, at least in the short term, the key to resuscitating the nuclear option. Several recent analyses of the problems facing nuclear power, together with the current congressional hearing on the Nuclear Regulatory Commission's fiscal year 1986 budget request, have examined both the future of nuclear power and what can be done to address present institutional shortcomings. The congressional sessions have provided an indication of the views of both legislators and regulators, and this record, although mixed, generally shows continued optimism about the prospects of the nuclear option if needed reforms are accomplished

  20. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence from the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
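
    To make the idea concrete, the sketch below fits a plain quadratic response surface by least squares to a small set of code-run results, standing in for the OSE described above; the two input parameters and the peak cladding temperature values are synthetic, not RELAP5 outputs.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 59
      x1 = rng.uniform(0.0, 1.0, n)          # e.g. normalised break size (hypothetical input)
      x2 = rng.uniform(0.0, 1.0, n)          # e.g. normalised decay-heat multiplier
      pct = 900 + 150 * x1 + 80 * x2 - 60 * x1 * x2 + rng.normal(0, 5, n)  # synthetic PCT, K

      # Full quadratic surface in two variables, fitted by least squares
      A = np.column_stack([np.ones(n), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(A, pct, rcond=None)

      def surface(a, b):
          return coef @ np.array([1.0, a, b, a * b, a**2, b**2])

      print(f"predicted PCT at (0.5, 0.5): {surface(0.5, 0.5):.1f} K")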

  1. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  2. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.
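
    For readers unfamiliar with the critical path method, the sketch below performs the standard forward and backward passes over a small activity-on-node network in Python; the activities and durations are invented for illustration and are unrelated to the projects analysed in the paper.

      # Activity-on-node network: durations (days) and predecessor lists (all hypothetical)
      durations = {"A": 3, "B": 2, "C": 4, "D": 2, "E": 3}
      preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}
      order = ["A", "B", "C", "D", "E"]      # a topological order of the activities

      # Forward pass: earliest start / finish
      es, ef = {}, {}
      for task in order:
          es[task] = max((ef[p] for p in preds[task]), default=0)
          ef[task] = es[task] + durations[task]

      # Backward pass: latest start / finish
      project_end = max(ef.values())
      ls, lf = {}, {}
      for task in reversed(order):
          succs = [s for s in order if task in preds[s]]
          lf[task] = min((ls[s] for s in succs), default=project_end)
          ls[task] = lf[task] - durations[task]

      critical = [task for task in order if es[task] == ls[task]]   # zero total float
      print("project duration:", project_end, "critical path:", critical)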

  3. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    Institute of Information Technology (IIIT), Hyderabad. The study was made in various directions to compare the results of the system we developed with the output of the Hindi analyser developed by IIIT, Hyderabad. 3.1 Comparison of rule-based morphological analyser with unsupervised morphological analyser. 3.1a Calculation of ...

  4. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project "Statistical analyses of extreme food habits", which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the "General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by the emission of radioactive substances from facilities of nuclear technology". Its aim is to show whether the calculation of the radiation ingested by 95% of the population through food intake, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only be roughly estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different groups of foods influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.) [de

  5. A novel nucleic acid analogue shows strong angiogenic activity

    Energy Technology Data Exchange (ETDEWEB)

    Tsukamoto, Ikuko, E-mail: tukamoto@med.kagawa-u.ac.jp [Department of Pharmaco-Bio-Informatics, Faculty of Medicine, Kagawa University, 1750-1 Ikenobe, Miki, Kita, Kagawa 761-0793 (Japan); Sakakibara, Norikazu; Maruyama, Tokumi [Kagawa School of Pharmaceutical Sciences, Tokushima Bunri University, 1314-1 Shido, Sanuki, Kagawa 769-2193 (Japan); Igarashi, Junsuke; Kosaka, Hiroaki [Department of Cardiovascular Physiology, Faculty of Medicine, Kagawa University, 1750-1 Ikenobe, Miki, Kita, Kagawa 761-0793 (Japan); Kubota, Yasuo [Department of Dermatology, Faculty of Medicine, Kagawa University, 1750-1 Ikenobe, Miki, Kita, Kagawa 761-0793 (Japan); Tokuda, Masaaki [Department of Cell Physiology, Faculty of Medicine, Kagawa University, 1750-1 Ikenobe, Miki, Kita, Kagawa 761-0793 (Japan); Ashino, Hiromi [The Tokyo Metropolitan Institute of Medical Science, 1-6 Kamikitazawa2-chome, Setagaya-ku, Tokyo 156-8506 (Japan); Hattori, Kenichi; Tanaka, Shinji; Kawata, Mitsuhiro [Teikoku Seiyaku Co., Ltd., Sanbonmatsu, Higashikagawa, Kagawa 769-2695 (Japan); Konishi, Ryoji [Department of Pharmaco-Bio-Informatics, Faculty of Medicine, Kagawa University, 1750-1 Ikenobe, Miki, Kita, Kagawa 761-0793 (Japan)

    2010-09-03

    Research highlights: → A novel nucleic acid analogue (2Cl-C.OXT-A, m.w. 284) showed angiogenic potency. → It stimulated the tube formation, proliferation and migration of HUVEC in vitro. → 2Cl-C.OXT-A induced the activation of ERK1/2 and MEK in HUVEC. → Angiogenic potency in vivo was confirmed in CAM assay and rabbit cornea assay. → A synthesized small angiogenic agent would have great clinical therapeutic value. -- Abstract: A novel nucleic acid analogue (2Cl-C.OXT-A) significantly stimulated tube formation of human umbilical endothelial cells (HUVEC). Its maximum potency at 100 µM was stronger than that of vascular endothelial growth factor (VEGF), a positive control. At this concentration, 2Cl-C.OXT-A moderately stimulated proliferation as well as migration of HUVEC. To gain mechanistic insights into how 2Cl-C.OXT-A promotes angiogenic responses in HUVEC, we performed immunoblot analyses using phospho-specific antibodies as probes. 2Cl-C.OXT-A induced robust phosphorylation/activation of MAP kinase ERK1/2 and an upstream MAP kinase kinase MEK. Conversely, a MEK inhibitor PD98059 abolished ERK1/2 activation and tube formation both enhanced by 2Cl-C.OXT-A. In contrast, MAP kinase responses elicited by 2Cl-C.OXT-A were not inhibited by SU5416, a specific inhibitor of VEGF receptor tyrosine kinase. Collectively these results suggest that 2Cl-C.OXT-A induces angiogenic responses in HUVEC mediated by a MAP kinase cascade comprising MEK and ERK1/2, but independently of VEGF receptor tyrosine kinase. In vivo assay using chicken chorioallantoic membrane (CAM) and rabbit cornea also suggested the angiogenic potency of 2Cl-C.OXT-A.

  6. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ² values are surrounding the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.

  7. Analyses of tropistic responses using metabolomics.

    Science.gov (United States)

    Millar, Katherine D L; Kiss, John Z

    2013-01-01

    Characterization of phototropism and gravitropism has been through gene expression studies, assessment of curvature response, and protein expression experiments. To our knowledge, the current study is the first to determine how the metabolome, the complete set of small-molecule metabolites within a plant, is impacted during these tropisms. We have determined the metabolic profile of plants during gravitropism and phototropism. Seedlings of Arabidopsis thaliana wild type (WT) and phyB mutant were exposed to unidirectional light (red or blue) or reoriented to induce a tropistic response, and small-molecule metabolites were assayed and quantified. A subset of the WT was analyzed using microarray experiments to obtain gene profiling data. Analyses of the metabolomic data using principal component analysis showed a common profile in the WT during the different tropistic curvatures, but phyB mutants produced a distinctive profile for each tropism. Interestingly, the gravity treatment elicited the greatest changes in gene expression of the WT, followed by blue light, then by red light treatments. For all tropisms, we identified genes that were downregulated by a large magnitude in carbohydrate metabolism and secondary metabolism. These included ATCSLA15, CELLULOSE SYNTHASE-LIKE, and ATCHS/SHS/TT4, CHALCONE SYNTHASE. In addition, genes involved in amino acid biosynthesis were strongly upregulated, and these included THA1 (THREONINE ALDOLASE 1) and ASN1 (DARK INDUCIBLE asparagine synthase). We have established the first metabolic profile of tropisms in conjunction with transcriptomic analyses. This approach has been useful in characterizing the similarities and differences in the molecular mechanisms involved with phototropism and gravitropism.
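
    The dimensionality-reduction step mentioned above is illustrated below with a minimal principal component analysis of a samples-by-metabolites matrix in Python using scikit-learn; the matrix is synthetic and merely stands in for the Arabidopsis metabolite measurements, it does not reproduce them.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(6)
      n_samples, n_metabolites = 24, 50
      X = rng.normal(size=(n_samples, n_metabolites))     # synthetic metabolite matrix
      X[:12] += 1.5 * rng.normal(size=n_metabolites)      # crude "treatment" shift for half the samples

      Xs = StandardScaler().fit_transform(X)              # autoscale each metabolite
      pca = PCA(n_components=2)
      scores = pca.fit_transform(Xs)

      print("explained variance ratios:", pca.explained_variance_ratio_)
      print("scores of first sample:", scores[0])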

  8. Accounting analyses of momentum and contrarian strategies in emerging markets

    OpenAIRE

    Nnadi, Matthias Akandu; Tanna, S.

    2017-01-01

    We analyse the momentum and contrarian effects of stock markets in Brazil, Russia, India, China and South Africa (BRICS) using accounting data. The five markets show different characteristics with the Indian market having the strongest momentum effect. Stock markets in China and Brazil show significant short-term contrarian profit and intermediate to long-term momentum profit while South Africa shows short-term momentum effect and intermediate to long-term contrarian effect. The Russian stock...

  9. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  10. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel or the manganese stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW-vanadium activation is 3 orders of magnitude less than the other alloys' FW activation. 2 refs., 7 figs

  11. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies between +0.5% and -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.
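
    A minimal Python sketch of how within-run and between-run imprecision (CV%) figures of the kind quoted above can be computed from replicate control results; the replicate values are invented, and the between-run figure uses a simplified definition (CV of the run means) rather than a full ANOVA variance decomposition.

      # Within-run and between-run CV% from replicate results of a control sample.
      # Rows are analytical runs (e.g. days), columns are replicates within a run.
      import numpy as np

      results = np.array([
          [4.10, 4.05, 4.12, 4.08],
          [4.20, 4.18, 4.22, 4.15],
          [4.02, 4.00, 4.05, 4.07],
      ])

      within_run_cv = 100 * np.mean(results.std(axis=1, ddof=1) / results.mean(axis=1))
      between_run_cv = 100 * results.mean(axis=1).std(ddof=1) / results.mean()

      print(f"within-run CV%  : {within_run_cv:.2f}")
      print(f"between-run CV% : {between_run_cv:.2f}")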

  12. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), made to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of maximum 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm; 58–70%) moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of the vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  13. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  14. Map and table showing isotopic age data in Alaska

    Science.gov (United States)

    Wilson, Frederic H.; Shew, Nora B.; DuBois, G.D.

    1994-01-01

    The source of the data reported here is a compilation of radiometric ages maintained in conjunction with the Alaska Mineral Resource Assessment Program (AMRAP) studies for Alaska. The symbol shape plotted at each location is coded for rock type, whether igneous, metamorphic, or other; the color of the symbol shows the geologic era or period for the sample(s) at each locale. A list of references for each quadrangle is given to enable the user to find specific information, including analytical data, for each sample dated within a particular quadrangle. At the scale of this map, the very large number of samples and the clustering of the samples in limited areas prevented the showing of individual sample numbers on the map. Synthesis and interpretation of any data set requires the user to evaluate the reliability or value of each component of the data set with respect to his or her intended use of the data. For geochronological data, this evaluation must be based on both analytical and geological criteria. Most age determinations are published with calculated estimates of analytical precision. Replicate analyses are infrequently performed; therefore, reported analytical precision is based on estimates of the precision of various components of the analysis and often on an intuitive factor to cover components that may not have been considered. Analytical accuracy is somewhat more difficult to determine; it is not only dependent on the actual measurement, it is also concerned with uncertainties in decay and abundance constants, uncertainties in the isotopic composition and size of the tracer for conventional K-Ar ages, and uncertainties in the original isotopic composition of the sample. Geologic accuracy of a date is variable; the interpretation of the meaning of an age determination is important in the evaluation of its geologic accuracy. Potassium-argon, rubidium-strontium, and uranium-lead age determinations on a single sample can differ widely yet none or all may be

  15. Analysing Scenarios of Cell Population System Development

    Directory of Open Access Journals (Sweden)

    M. S. Vinogradova

    2014-01-01

    The article considers an isolated population system consisting of two types of human stem cells, namely normal cells and cells with chromosomal abnormalities (abnormal ones). The system develops in the laboratory (in vitro). The article analyses possible scenarios of the population system development, which are implemented for different values of its parameters. The investigated model of the cell population system takes the limited resources into account. It is represented as a system of two nonlinear differential equations with a continuous right-hand side. The model is considered for non-negative values of the variables; the domain is divided into four sets. The model's feature is that in each set the right-hand side of the system of differential equations has a different form. The article analyses the quality of the rest points of the system in each of the four sets. Analytical conditions are obtained for determining the number of rest points and the quality of rest points with at least one zero coordinate. It is shown that the population system under study cannot have more than two rest points with both coordinates positive (non-zero). It is difficult to determine the quality of such rest points as a function of the model parameters, due to the complexity of the expressions that define the systems of the first approximation recorded in a neighborhood of these rest points. Numerical results on the stability of these rest points are obtained, and phase portraits for specific values of the system parameters are demonstrated. The main scenarios for the cell population development are presented. Analysis of the mathematical model shows that the cell population system may remain a system consisting of populations of normal and abnormal cells; it can degenerate into a population of abnormal cells or perish. The scenario in which there is only a population of normal cells is not implemented. The numerical simulation
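
    The abstract does not reproduce the model's equations, so the Python sketch below only illustrates the general setting: two coupled nonlinear equations for normal and abnormal cell counts sharing limited resources. The logistic-type right-hand side and every parameter value are assumptions for illustration, not the authors' piecewise-defined model.

      # Illustrative simulation of a two-population system: normal cells x and
      # abnormal cells y competing for a shared, limited resource. All terms and
      # parameter values are assumed; the paper's model is piecewise-defined.
      import numpy as np
      from scipy.integrate import solve_ivp

      r_x, r_y = 1.0, 0.8       # growth rates (assumed)
      K = 1000.0                # shared carrying capacity (assumed)
      m = 0.05                  # rate of normal cells acquiring abnormalities (assumed)

      def rhs(t, state):
          x, y = state
          crowding = 1.0 - (x + y) / K
          return [r_x * x * crowding - m * x,
                  r_y * y * crowding + m * x]

      sol = solve_ivp(rhs, (0.0, 50.0), [100.0, 1.0], dense_output=True)
      for ti in np.linspace(0.0, 50.0, 6):
          x, y = sol.sol(ti)
          print(f"t={ti:5.1f}  normal={x:8.1f}  abnormal={y:8.1f}")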

  16. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high-frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths, focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme of the Direct Solution Method and the Spectral Element Method. Then we back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can lead to additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
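
    A toy delay-and-stack back projection over a grid of trial epicentres, written in Python to illustrate the general idea behind array back projection; it is not the MUSIC or compressive-sensing implementation tested above, and the station geometry, apparent speed, and source wavelet are synthetic assumptions.

      # Toy back projection: generate a pulse recorded at an array, then scan a
      # grid of trial source locations and stack the traces after removing the
      # trial travel times. Geometry, speed, and wavelet are assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      c = 6.0                                         # assumed apparent speed, km/s
      stations = rng.uniform(-50, 50, size=(20, 2))   # station coordinates, km
      true_src = np.array([10.0, -5.0])

      fs = 20.0
      t = np.arange(0.0, 30.0, 1.0 / fs)
      wavelet = lambda tt: np.exp(-((tt - 5.0) ** 2) / 0.1)   # pulse emitted at t = 5 s

      delays = np.linalg.norm(stations - true_src, axis=1) / c
      records = np.array([wavelet(t - d) for d in delays])    # synthetic traces

      xs = np.linspace(-30.0, 30.0, 61)
      grid = np.array([(gx, gy) for gx in xs for gy in xs])
      power = np.empty(len(grid))
      for i, g in enumerate(grid):
          trial = np.linalg.norm(stations - g, axis=1) / c
          aligned = [np.interp(t, t - d, rec) for d, rec in zip(trial, records)]
          power[i] = np.max(np.sum(aligned, axis=0) ** 2)     # peak stacked energy

      print("true source:", true_src, " recovered:", grid[np.argmax(power)])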

  17. PHONETIC DATA AND PHONOLOGICAL ANALYSES ...

    African Journals Online (AJOL)

    This paper is basically concerned with the relationship between phonetic data and phonological analyses. I) It will be shown that phonological analyses based on unverified phonetic data tend to accommodate ad hoc, unmotivated, and even phonetically implausible phonological rules. On the other hand, it will be ...

  18. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  19. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  20. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    2016-01-27

    Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness ...

  1. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural … -to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences … and needs among population groups with a low ability to pay. Instead of cost-benefit analyses, impact analyses evaluating the likely effects of project alternatives against a wide range of societal goals are recommended, with quantification and economic valorisation only for impact categories where this can …

  2. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    van de Wal, R.S.W.; Meijer, H.A.J.; van Rooij, M.; van der Veen, C.

    2007-01-01

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ produced 14CO.

  3. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    Van de Wal, R. S. W.; Meijer, H. A. J.; De Rooij, M.; Van der Veen, C.

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ

  4. Race, Gender, and Researcher Positionality Analysed Through Memory Work

    DEFF Research Database (Denmark)

    Andreassen, Rikke; Myong, Lene

    2017-01-01

    Drawing upon feminist standpoint theory and memory work, the authors analyse racial privilege by investigating their own racialized and gendered subjectifications as academic researchers. By looking at their own experiences within academia, they show how authority and agency are contingent upon...

  5. Towards a morphological analyser for past tense forms in Northern ...

    African Journals Online (AJOL)

    A sample from the Comprehensive Northern Sotho Dictionary (Ziervogel & Mokgokong, 1985) was used to analyse the formation of tense forms in verb stems with final 'm' and 'n'. The research findings show that stems with final 'm' and 'n' are not governed by the same rules, as purported by some grammars, and that the ...

  6. "Youth Amplified": Using Critical Pedagogy to Stimulate Learning through Dialogue at a Youth Radio Show

    Science.gov (United States)

    Cooper, Adam

    2016-01-01

    In this paper I describe and analyse how critical pedagogy, an approach to teaching and learning that encourages students to reflect on their socio-political contexts, may stimulate critical consciousness and dialogue at a youth radio show. The participants, who attended four diverse Cape Town high schools and predominantly lived in poor…

  7. Fracturing and brittleness index analyses of shales

    Science.gov (United States)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock for e.g. hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014, Jarvie et al., 2007 and Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and determined the three brittleness indices for each sample. We show that the three brittleness indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness index based on the acoustic data (BI1) lies around 0.5 for all samples, the brittleness index based on the stress-strain data (BI2) gives an average of about 0.75, whereas the mineralogy brittleness index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
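
    The exact formulations used in the study are not given in the abstract, so the Python sketch below uses common literature forms of the three indices; the mapping of BI1/BI2/BI3 to the elastic, stress-strain, and mineralogical definitions follows the results quoted above, and the normalisation bounds and example numbers are assumptions.

      # Common literature forms of three brittleness indices (assumed, not the
      # study's exact definitions). Example inputs mimic the pattern above:
      # acoustic/elastic ~0.5, stress-strain ~0.75, mineralogy < 0.2.
      def bi_elastic(E, nu, E_min=10.0, E_max=80.0, nu_min=0.15, nu_max=0.40):
          """BI1-style: average of normalised Young's modulus (GPa) and Poisson's ratio."""
          e_norm = (E - E_min) / (E_max - E_min)
          nu_norm = (nu - nu_max) / (nu_min - nu_max)
          return 0.5 * (e_norm + nu_norm)

      def bi_stress_strain(elastic_strain_energy, total_strain_energy):
          """BI2-style: recoverable (elastic) fraction of the strain energy at failure."""
          return elastic_strain_energy / total_strain_energy

      def bi_mineralogy(quartz, carbonate, clay):
          """BI3-style (Jarvie-type): brittle mineral fraction of the bulk mineralogy."""
          return quartz / (quartz + carbonate + clay)

      print(round(bi_elastic(E=45.0, nu=0.275), 2))            # -> 0.5
      print(round(bi_stress_strain(0.75, 1.0), 2))             # -> 0.75
      print(round(bi_mineralogy(15.0, 5.0, 80.0), 2))          # -> 0.15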

  8. High perfomance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    compounds often present in concentrations much greater than those of the analyte. Analyte concentrations are often low, and in the case of drugs, the endogenous compounds are sometimes structurally very similar to the drug to be measured. Binding of drugs to plasma proteins may also occur, which decreases the amount of free compound that is measured. To undertake the analyses of drugs and metabolites in body fluids the analyst is faced with several problems. The first problem is due to the complex nature of the body fluid: the drugs must be isolated by an extraction technique, which ideally should provide a relatively clean extract, and the separation system must be capable of resolving the drugs of interest from co-extractives. All of this, when using high-performance liquid chromatography, requires a good selection of detectors, a good stationary phase, suitable eluents and an adequate programme during separation. The UV/VIS detector is the most versatile detector used in high-performance liquid chromatography, but it is not always ideal, since its lack of specificity means that high resolution of the analyte may be required. UV detection is preferred since it offers excellent linearity, and rapid quantitative analyses can be performed against a single standard of the drug being determined. Diode array and rapid scanning detectors are useful for peak identification and monitoring peak purity, but they are somewhat less sensitive than single-wavelength detectors. In liquid chromatography some components may have poor UV chromophores, if UV detection is being used, or be completely retained on the liquid chromatography column. Fluorescence and electrochemical detectors are not only considerably more sensitive toward appropriate analytes but also more selective than UV detectors for many compounds. If at all possible, fluorescence detectors are used: they are sensitive, stable, selective and easy to operate. Their selectivity shows itself in the lack of frontal components observed in plasma extracts whereas

  9. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  10. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  11. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the overall budget has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us in defending our budget, making better priorities, and satisfying the requirements from our external auditors.

  12. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  13. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for meltdown, radioactive releases, or harmful effects on the environment. Following risk policies for chemical installations, as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication "How to deal with risks", probabilistic risk analyses are required for nuclear power plants

  14. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  15. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

    The paper summarizes results obtained from structural analysis measurements (differential scanning calorimetry, DSC; thermogravimetry, TG; thermomechanical analysis, TMA; and Fourier transform infrared spectroscopy, FT-IR). Samples of cross-linked polyethylene cable insulation were tested with these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.

  16. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  17. Systematic Derivation of Static Analyses for Software Product Lines

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Brabrand, Claus; Wasowski, Andrzej

    2014-01-01

    A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting program analyses to SPLs using abstract interpretation. Abstract interpretation is a classical framework for deriving static analyses in a compositional, step-by-step manner. We show how to take an analysis expressed as an abstract interpretation and lift each of the abstract interpretation steps to a family of programs. This includes schemes for how to lift domain types, and combinators … analysis for a simple imperative language …

  18. Payload specialist Reinhard Furrer show evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  19. Neoliberalism in education: Five images of critical analyses

    Directory of Open Access Journals (Sweden)

    Branislav Pupala

    2011-03-01

    The survey study brings information about the way that educational research copes with neoliberalism as a generalized form of social government in current western culture. It shows that neoliberalism is considered a universal scope of other changes in the basic segments of education, and that theoretical and critical analyses of this phenomenon represent an important part of production in the area of educational research. It emphasizes the contribution of the formation and development of so-called governmental studies to the comprehension of mechanisms and consequences of neoliberal government of society, and shows how the methodology of these studies helps to identify neoliberal strategies used in the regulation of social subjects by education. Five selected segments of critical analyses are elaborated (from the concept of lifelong learning, through preschool and university education, to the education of teachers and the PISA project) that clearly show the ideological and theoretical cohesiveness of analysing education through the lens of neoliberal governmentality.

  20. Analyses adjusting for selective crossover show improved overall survival with adjuvant letrozole compared with tamoxifen in the BIG 1-98 study

    DEFF Research Database (Denmark)

    Colleoni, Marco; Giobbie-Hurder, Anita; Regan, Meredith M

    2011-01-01

    Among postmenopausal women with endocrine-responsive breast cancer, the aromatase inhibitor letrozole, when compared with tamoxifen, has been shown to significantly improve disease-free survival (DFS) and time to distant recurrence (TDR). We investigated whether letrozole monotherapy prolonged ov...

  1. How to Show the Real Microbial Biodiversity? A Comparison of Seven DNA Extraction Methods for Bacterial Population Analyses in Matrices Containing Highly Charged Natural Nanoparticles.

    Science.gov (United States)

    Kaden, Rene; Krolla-Sidenstein, Peter

    2015-10-20

    A DNA extraction that comprises the DNA of all available taxa in an ecosystem is an essential step in population analysis, especially for next generation sequencing applications. Many nanoparticles, as well as naturally occurring clay minerals, have charged surfaces or edges that capture negatively charged DNA molecules after cell lysis during DNA extraction. Depending on the methodology of DNA extraction, this phenomenon causes a shift in the detection of microbial taxa in ecosystems and a possible misinterpretation of microbial interactions. With the aim of describing microbial interactions and the bio-geo-chemical reactions during a clay alteration experiment, several methods for the detection of a high number of microbial taxa were examined in this study. Altogether, 13 different methods from commercially available DNA extraction kits provided by seven companies, as well as the classical phenol-chloroform DNA extraction, were compared. The amount and the quality of nucleic acid extracts were determined and compared to the amplifiable amount of DNA. The 16S rRNA gene fragments of several taxa were separated using denaturing gradient gel electrophoresis (DGGE) to determine the number of different species, and sequenced to obtain information about which species the microbial population consists of. A total of 13 species was detected in the system. Up to nine taxa could be detected with commercially available DNA extraction kits, while phenol-chloroform extraction led to three detected species. In this paper, we describe how to combine several DNA extraction methods for the investigation of microbial community structures in clay.

  2. Whole proteome analyses on Ruminiclostridium cellulolyticum show a modulation of the cellulolysis machinery in response to cellulosic materials with subtle differences in chemical and structural properties

    NARCIS (Netherlands)

    Badalato, Nelly; Guillot, Alain; Sabarly, Victor; Dubois, Marc; Pourette, Nina; Pontoire, Bruno; Robert, Paul; Bridier, Arnaud; Monnet, Véronique; Machado de Sousa, Diana; Durand, Sylvie; Mazéas, Laurent; Buléon, Alain; Bouchez, Théodore; Mortha, Gerard; Bize, Ariane

    2017-01-01

    Lignocellulosic materials from municipal solid waste emerge as attractive resources for anaerobic digestion biorefinery. To increase the knowledge required for establishing efficient bioprocesses, dynamics of batch fermentation by the cellulolytic bacterium Ruminiclostridium cellulolyticum were

  3. X-chromosome SNP analyses in 11 human Mediterranean populations show a high overall genetic homogeneity except in North-west Africans (Moroccans)

    DEFF Research Database (Denmark)

    Tomas Mas, Carmen; Sanchez Sanchez, Juan Jose; Barbaro, Anna

    2008-01-01

    homogeneity was found among the Mediterranean populations except for the population from Morocco, which seemed to differ genetically from the rest of the populations in the Mediterranean area. A very low genetic distance was found between populations in the Middle East and most of the western part...

  4. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    Science.gov (United States)

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

    Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab®-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down-weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.
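
    A simplified Python analogue of a skipped correlation, flagging bivariate outliers with a robust (minimum covariance determinant) Mahalanobis distance and correlating the remaining points; this illustrates the idea only and is not a port of the Matlab toolbox, whose skipped correlations rely on a projection-based outlier rule.

      # Skipped-correlation sketch: drop bivariate outliers flagged by a robust
      # distance, then compute Spearman's correlation on the remaining points.
      import numpy as np
      from scipy import stats
      from sklearn.covariance import MinCovDet

      def skipped_correlation(x, y, chi2_quantile=0.975):
          xy = np.column_stack([x, y])
          d2 = MinCovDet(random_state=0).fit(xy).mahalanobis(xy)   # squared robust distances
          keep = d2 < stats.chi2.ppf(chi2_quantile, df=2)
          r, p = stats.spearmanr(x[keep], y[keep])
          return r, p, np.flatnonzero(~keep)

      # Example: a clear linear relation contaminated by two bivariate outliers
      rng = np.random.default_rng(1)
      x = rng.normal(size=50)
      y = 0.8 * x + rng.normal(scale=0.5, size=50)
      x[:2], y[:2] = (4.0, 4.5), (-4.0, -4.5)

      print("Pearson with outliers:", stats.pearsonr(x, y)[0])
      print("skipped correlation  :", skipped_correlation(x, y)[0])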

  5. Genome-Facilitated Analyses of Geomicrobial Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth H. Nealson

    2012-05-02

    This project had the goal(s) of understanding the mechanism(s) of extracellular electron transport (EET) in the microbe Shewanella oneidensis MR-1, and a number of other strains and species in the genus Shewanella. The major accomplishments included sequencing, annotation, and analysis of more than 20 Shewanella genomes. The comparative genomics enabled the beginning of a systems biology approach to this genus. Another major contribution involved the study of gene regulation, primarily in the model organism, MR-1. As part of this work, we took advantage of special facilities at the DOE: e.g., the synchrotron radiation facility at ANL, where we successfully used this system for elemental characterization of single cells in different metabolic states (1). We began work with purified enzymes, and identification of partially purified enzymes, leading to initial characterization of several of the 42 c-type cytochromes from MR-1 (2). As the genome became annotated, we began experiments on transcriptome analysis under different conditions of growth, the first step towards systems biology (3,4). Conductive appendages of Shewanella, called bacterial nanowires were identified and characterized during this work (5, 11, 20,21). For the first time, it was possible to measure the electron transfer rate between single cells and a solid substrate (20), a rate that has been confirmed by several other laboratories. We also showed that MR-1 cells preferentially attach to cells at a given charge, and are not attracted, or even repelled by other charges. The interaction with the charged surfaces begins with a stimulation of motility (called electrokinesis), and eventually leads to attachment and growth. One of the things that genomics allows is the comparative analysis of the various Shewanella strains, which led to several important insights. First, while the genomes predicted that none of the strains looked like they should be able to degrade N-acetyl glucosamine (NAG), the monomer

  6. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences requires a number of preparatory inter-related steps from finding gene families, to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  7. Snacking on Television: A Content Analysis of Adolescents' Favorite Shows.

    Science.gov (United States)

    Eisenberg, Marla E; Larson, Nicole I; Gollust, Sarah E; Neumark-Sztainer, Dianne

    2016-05-19

    Snacking is a complex behavior that may be influenced by entertainment media. Research suggests that snacking and unhealthy foods are commonly shown in programming that targets young audiences, but shows selected for study have been limited. We conducted a content analysis on shows that were named as favorites by adolescents to characterize portrayals of snacking on popular television. A diverse sample of 2,130 adolescents (mean age, 14.3 y) listed 3 favorite television shows in a 2010 school-based survey. Three episodes each of the 25 most popular shows were coded for food-related content, including healthfulness, portion size, screen time use, setting, and social context. We also analyzed the characteristics of characters involved in eating incidents, the show type, and the show rating. We used χ2 tests, binomial tests, and multilevel regression models to compare the incidence of snacks versus meals, the characteristics of those involved, and snacking across show characteristics. Almost half of food incidents on television shows were snacks. Snacks were significantly more likely than meals to be "mostly unhealthy" (69.3% vs 22.6%, P …). Media awareness and literacy programs should include foods and snacking behaviors among the issues they address. More healthful portrayals of food and dietary intake in entertainment shows' content would create a healthier media environment for youth.

  8. Uudised : Otsman taas Riias show'l. Rokkstaarist ministriks

    Index Scriptorium Estoniae

    2007-01-01

    Drag cabaret artist Erkki Otsman will perform in December at a Christmas show held at "Sapnu Fabrika" in Riga. Peter Garrett, former singer of the Australian rock band Midnight Oil, has been appointed the government's Minister for the Environment.

  9. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated through the analysis of one of the earliest recorded examples of preschool education (initiated by J. F. Oberlin in northeastern France in 1767). The general idea of societal need is elaborated as a way of analysing practices, and a general analytic schema is presented for characterising preschool…

  10. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  11. Advanced Toroidal Facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described

  12. Proteomic Analyses of the Vitreous Humour

    Directory of Open Access Journals (Sweden)

    Martina Angi

    2012-01-01

    The human vitreous humour (VH) is a transparent, highly hydrated gel, which occupies the posterior segment of the eye between the lens and the retina. Physiological and pathological conditions of the retina are reflected in the protein composition of the VH, which can be sampled as part of routine surgical procedures. Historically, many studies have investigated levels of individual proteins in VH from healthy and diseased eyes. In the last decade, proteomics analyses have been performed to characterise the proteome of the human VH and explore networks of functionally related proteins, providing insight into the aetiology of diabetic retinopathy and proliferative vitreoretinopathy. Recent proteomic studies on the VH from animal models of autoimmune uveitis have identified new signalling pathways associated with autoimmune triggers and intravitreal inflammation. This paper aims to guide biological scientists through the different proteomic techniques that have been used to analyse the VH and present future perspectives for the study of intravitreal inflammation using proteomic analyses.

  13. A 256-channel pulse-height analyser

    International Nuclear Information System (INIS)

    Berset, J.C.; Delavallade, G.; Lindsay, J.

    1975-01-01

    The design, construction, and testing of a small, low-cost 256-channel pulse-height analyser is briefly discussed. The analyser, intended for use in the setting up of experiments in high-energy physics, is fully compatible with the CERN/NIM nucleonic instrumentation. It has a digital display of channel and content as well as outputs for printing, plotting, and binary transfer. The logic circuitry is made with TTL integrated circuits and has a static random-access MOS memory. Logic and timing diagrams are given. Detailed specifications are also included. (Author)

  14. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.
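
    A generic Python illustration of spotting a sustained oscillation in a measured signal with Welch's power spectral density estimate; the synthetic 0.7 Hz mode, sampling rate, and noise level are assumptions, and the snippet is not a method taken from the report.

      # Detect a narrow-band oscillation in a synthetic measurement via Welch's PSD.
      import numpy as np
      from scipy import signal

      fs = 30.0                                  # assumed reporting rate, Hz
      t = np.arange(0.0, 120.0, 1.0 / fs)
      rng = np.random.default_rng(2)
      x = 0.05 * np.sin(2 * np.pi * 0.7 * t) + rng.normal(scale=0.2, size=t.size)

      f, pxx = signal.welch(x, fs=fs, nperseg=1024)
      band = f > 0.1                             # ignore the near-DC bins
      peak = f[band][np.argmax(pxx[band])]
      print(f"estimated oscillation frequency: {peak:.2f} Hz")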

  15. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
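
    A small Python sketch in the spirit of the bias control charts described above: each new result on a reference standard is flagged against 2-sigma warning and 3-sigma action limits. The limits and example data are assumptions, not the facility's actual procedure.

      # Shewhart-style control chart check for measurement bias against a standard.
      import numpy as np

      def control_chart_flags(measured, reference, sigma):
          """Flag each bias value (measured - reference) against 2/3-sigma limits."""
          bias = np.asarray(measured) - reference
          flags = ["action" if abs(b) > 3 * sigma
                   else "warning" if abs(b) > 2 * sigma
                   else "in control" for b in bias]
          return bias, flags

      # Example: repeated measurements of a standard with certified value 100.0
      measured = [100.2, 99.8, 100.1, 101.4, 99.9, 102.3]
      bias, flags = control_chart_flags(measured, reference=100.0, sigma=0.5)
      for m, b, flag in zip(measured, bias, flags):
          print(f"{m:6.1f}  bias={b:+5.1f}  {flag}")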

  16. The Daily Show with Jon Stewart: Part 2

    Science.gov (United States)

    Trier, James

    2008-01-01

    "The Daily Show With Jon Stewart" is one of the best critical literacy programs on television, and in this Media Literacy column the author suggests ways that teachers can use video clips from the show in their classrooms. (For Part 1, see EJ784683.)

  17. "The Daily Show with Jon Stewart": Part 1

    Science.gov (United States)

    Trier, James

    2008-01-01

    Comedy Central's popular program "The Daily Show With Jon Stewart" is the best critical media literacy program on television, and it can be used in valuable ways in the classroom as part of a media literacy pedagogy. This Media Literacy column provides an overview of the show and its accompanying website and considers ways it might be used in the…

  18. Pedagogical Techniques Employed by the Television Show "MythBusters"

    Science.gov (United States)

    Zavrel, Erik

    2016-01-01

    "MythBusters," the long-running though recently discontinued Discovery Channel science entertainment television program, has proven itself to be far more than just a highly rated show. While its focus is on entertainment, the show employs an array of pedagogical techniques to communicate scientific concepts to its audience. These…

  19. Entertaining politics, seriously?! : How talk show formats blur conceptual boundaries

    NARCIS (Netherlands)

    Schohaus, Birte

    2017-01-01

    What happens behind the scenes of a talk show? Why do some politicians seem to appear on every show while others are hardly ever seen? Birte Schohaus conducted a multi-layered research in which she conducted interviews with journalists, producers, PR advisors and (former) politicians and combined

  20. The Easy Way to Create Computer Slide Shows.

    Science.gov (United States)

    Anderson, Mary Alice

    1995-01-01

    Discusses techniques for creating computer slide shows. Topics include memory; format; color use; HyperCard and CD-ROM; font styles and sizes; graphs and graphics; the slide show option; special effects; and tips for effective presentation. (Author/AEF)

  1. Computer Slide Shows: A Trap for Bad Teaching

    Science.gov (United States)

    Klemm, W. R.

    2007-01-01

    Slide shows presented with software such as PowerPoint or WordPerfect Presentations can trap instructors into bad teaching practices. Research on memory suggests that slide-show instruction can actually be less effective than traditional lecturing when the teacher uses a blackboard or overhead projector. The author proposes a model of classroom…

  2. Showing and Telling: The Difference That Makes a Difference.

    Science.gov (United States)

    Lewis, David

    2001-01-01

    Attempts to clarify an essential difference between the ways in which pictures and words convey meaning. Examines one attempt to differentiate and characterize various types of picture books and concludes by showing how Anthony Browne exploits the distinction between showing and telling to create the atmosphere of uncertainty and mystery in his…

  3. 47 CFR 73.33 - Antenna systems; showing required.

    Science.gov (United States)

    2010-10-01

    47 CFR Part 73, Radio Broadcast Services, AM Broadcast Stations, § 73.33 Antenna systems; showing required. (a) An application for authority to install a broadcast antenna shall specify a definite site and include full...

  4. International Team Shows that Primes Can Be Found in Surprising ...

    Indian Academy of Sciences (India)

    1997-12-05

    The significance of the work of Friedlander and Iwaniec can be seen in its historical context. It was Euclid, in ancient Greece, who first showed that there are an infinity of primes among the integers. Much later, in 1837, Gustave Lejeune Dirichlet showed that there are infinitely many primes among the.

  5. Runtime and Pressurization Analyses of Propellant Tanks

    Science.gov (United States)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down, (b) tank pressurization and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, contamination of the propellant by inclusion of the pressurant gas from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states, and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  6. QLab 3 show control projects for live performances & installations

    CERN Document Server

    Hopgood, Jeromy

    2013-01-01

    Used from Broadway to Britain's West End, QLab software is the tool of choice for many of the world's most prominent sound, projection, and integrated media designers. QLab 3 Show Control: Projects for Live Performances & Installations is a project-based book on QLab software covering sound, video, and show control. With information on both sound and video system basics and the more advanced functions of QLab such as MIDI show control, new OSC capabilities, networking, video effects, and microphone integration, each chapter's specific projects will allow you to learn the software's capabilitie

  7. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  8. Good Governance Analysing Performance of Economic Community ...

    African Journals Online (AJOL)

    Good Governance Analysing Performance of Economic Community of West African States and Southern African Development Community Members on Mo Ibrahim Index of ... The Index is important, significant and appropriate because it outlines criteria and conditions deemed essential for Africans to live meaningful lives.

  9. Regression og geometrisk data analyse (2. del)

    DEFF Research Database (Denmark)

    Brinkkjær, Ulf

    2010-01-01

    The article seeks to show how regression analysis and geometric data analysis can be integrated. This is interesting because these methods are often set up as opposites, for example as an opposition between descriptive and explanatory methods. The first part of the article appeared in Praktiske Grunde 3-4 / 2007.

  10. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  11. Phonetic data and phonological analyses | Roux | Stellenbosch ...

    African Journals Online (AJOL)

    Stellenbosch Papers in Linguistics, Vol 1 (1978). Phonetic data and phonological analyses. JC Roux. Abstract.

  12. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse, since it is a medium that is not visualised through images or supported by printed text. The purpose of this article is to describe a new quantitative method for analysing radio that takes particular account of the modality of the radio medium: sound structured as a linear progression in time. The method thus supports radio both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of culture programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to analysing not only radio but also other media platforms and a range of journalistic subject areas.

  13. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  14. Some Tools for Robustifying Econometric Analyses

    NARCIS (Netherlands)

    V. Hoornweg (Victor)

    2013-01-01

    Abstract: We use automated algorithms to update and evaluate ad hoc judgments that are made in applied econometrics. Such an application of automated algorithms robustifies empirical econometric analyses, it achieves lower and more consistent prediction errors, and it helps to

  15. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    The phylogenetic relationships among the tayassuids are unclear and have instigated debate over the ... [Adega F., Chaves R. and Guedes-Pinto H. 2007 Chromosomal evolution and phylogenetic analyses in Tayassu pecari and Pecari tajacu (Tayassuidae): tales ...] Chromosome banding in Amphibia. XXV. Karyotype ...

  16. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  17. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  18. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... must clearly identify and differentiate between the roles performed by the natural disposal site... and segregation requirements will be met and that adequate barriers to inadvertent intrusion will be... need for ongoing active maintenance after closure must be based upon analyses of active natural...

  19. Chemical Analyses of Silicon Aerogel Samples

    Energy Technology Data Exchange (ETDEWEB)

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two Aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of degradation have been studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  20. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U. S. Nuclear Regulatory Commission required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
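    The 95%/95% criterion quoted above is commonly operationalized with Wilks' order-statistics formula. As a hedged illustration of that generic formula only (not of any tooling described in the report), the minimal sketch below finds the smallest number of best-estimate code runs N for which the largest observed result is a one-sided tolerance bound with 95% coverage and 95% confidence, i.e. the smallest N with 1 - 0.95^N >= 0.95.

```python
# Minimal sketch of the first-order, one-sided Wilks formula behind the 95/95
# criterion: find the smallest number of best-estimate runs N such that the
# largest observed result bounds 95% of the population with 95% confidence,
# i.e. 1 - coverage**N >= confidence.

def wilks_sample_size(coverage: float = 0.95, confidence: float = 0.95) -> int:
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())  # -> 59 runs for the classic 95%/95% criterion
```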

  1. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and to outcome measures, are the central questions of this study. The results of these in-depth analyses can benefit the ongoing campaign ‘Check je

  2. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1993-01-01

    In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques. (J.P.N.)

  3. Pecheries maritimes artisanales Togolaises : analyse des ...

    African Journals Online (AJOL)

    Pecheries maritimes artisanales Togolaises : analyse des debarquements et de la valeur commerciale des captures. K.M. Sedzro, E.D. Fiogbe, E.B. Guerra. Abstract. Subject description: Scientific knowledge of the pressure exerted by artisanal fisheries on Togolese marine resources is necessary in order to ...

  4. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  5. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  6. What do international comparisons of health care expenditures really show?

    Science.gov (United States)

    Parkin, D W; McGuire, A J; Yule, B F

    1989-05-01

    There is much interest in international comparisons of health care expenditures, in particular their relation to national income. They have been widely used to judge countries' performance in cost-containment, and in the United Kingdom have been widely quoted in debates about the funding of the National Health Service. This paper challenges conclusions drawn from simple analyses of this topic, which have used dubious and inappropriate data, questionable methods and assumptions, and simplistic ad-hoc reasoning. It looks particularly at price differences between countries, which have usually been hidden by using exchange rates to standardize national figures. When more appropriate conversion factors called purchasing power parities are used, many of the simple and conventionally-accepted conclusions no longer appear so obvious. The attempt to create apparent scientific facts for policy debates has been based on a misuse of international comparisons.
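    The conversion issue raised above, market exchange rates versus purchasing power parities (PPPs), can be shown with a deliberately artificial example. The sketch below is only an illustration of the arithmetic; every figure is a hypothetical placeholder, not data from the cited paper.

```python
# Illustrative sketch (not data from the cited paper): converting another
# country's per-capita health spending into US dollars with a market exchange
# rate versus a purchasing power parity (PPP) can give very different answers.
# All figures below are hypothetical placeholders.

spending_local = 2000.0  # per-capita spending in local currency units (placeholder)
exchange_rate = 0.50     # USD per local unit at market rates (placeholder)
ppp_rate = 0.80          # USD of equivalent purchasing power per local unit (placeholder)

usd_via_exchange_rate = spending_local * exchange_rate  # 1000 USD
usd_via_ppp = spending_local * ppp_rate                 # 1600 USD

print(f"converted at market exchange rate: {usd_via_exchange_rate:.0f} USD")
print(f"converted at purchasing power parity: {usd_via_ppp:.0f} USD")
```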

  7. Mexican obsidian samples analysed by PIXE and AAS

    Energy Technology Data Exchange (ETDEWEB)

    Tenorio, D.; Jimenez-Reyes, M. [Inst. Nacional de Investigaciones Nucleares, Mexico (Mexico)]; Lagarde, G.

    1997-12-31

    Proton induced X-ray emission analysis results are reported for obsidian artifacts from different sites of the State of Mexico: Teotenango, Calixtlahuaca, La Marqueza, Malinalco and Tonatico. Twenty elements were analysed by PIXE and some of them were verified by AAS. The results show that the samples came from three different sources: Teotenango and Calixtlahuaca samples from the first, La Marqueza and Malinalco samples from the second and Tonatico samples from the third. (author)

  8. Analyse proxémique d'interactions didactiques

    OpenAIRE

    Forest, Dominique

    2006-01-01

    Our study explores the use of non-verbal practices by primary school teachers. In our research, we have used concepts from the didactics of mathematics (Brousseau, 1998) and from anthropology. These tools have been used to analyse classroom situations and teachers' actions, and they form the background of a specific methodology for video-data analysis. In this thesis, we test this methodology, which aims at showing how teachers use proxemic techniques (Hall, 1971) as an important part of...

  9. Pedagogy of the Talk Show Hosts and Hostesses.

    Science.gov (United States)

    Heinze, Denise

    2001-01-01

    Explores the apparently disparate link between successfully hosting a talk show and successfully teaching a college class and describes methods this professor has used to foster class discussion and student interest. (EV)

  10. Protein Replacement Therapy Shows Promise in Treating Rare Skin Disorder

    Science.gov (United States)

    ... are under development. One promising approach involves protein replacement. Previous research has suggested that replacing the missing ...

  11. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  12. Phytoceramide Shows Neuroprotection and Ameliorates Scopolamine-Induced Memory Impairment

    OpenAIRE

    Jung, Jae-Chul; Lee, Yeonju; Moon, Sohyeon; Ryu, Jong Hoon; Oh, Seikwan

    2011-01-01

    The function and role of phytoceramide (PCER) and phytosphingosine (PSO) in the central nervous system have not been well studied. This study was aimed at investigating the possible roles of PCER and PSO in glutamate-induced neurotoxicity in cultured neuronal cells and memory function in mice. Phytoceramide showed neuroprotective activity in the glutamate-induced toxicity in cultured cortical neuronal cells. Neither phytosphingosine nor tetraacetylphytosphingosine (TAPS) showed neuroprotective...

  13. Caulis Lonicerae Japonicae extract shows protective effect on ...

    African Journals Online (AJOL)

    Caulis Lonicerae Japonicae extract shows protective effect on osteoporosis in rats. ... Result: The results show that a high-dose of CLJE (600 mg/kg) significantly inhibited bone mineral density (BMD) reduction of L4 vertebrae (0.24 ± 0.02, p < 0.05) and femur (0.24 ± 0.03, p < 0.05) caused by OVX, and prevented the ...

  14. REALITY SHOW AS A TYPE OF MEDIA DISCOURSE (A STUDY OF THE REALITY SHOW KEEPING UP WITH THE KARDASHIANS)

    Directory of Open Access Journals (Sweden)

    L.M. Ikalyuk

    2015-09-01

    Full Text Available The article focuses on defining the peculiarities of the US reality show as a type of media discourse. Based on a study of the reality show Keeping up with the Kardashians, an attempt has been made to determine the intralinguistic and extralinguistic factors involved in creating an image of an ordinary American family in order to attract public attention.

  15. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  16. Dolphin shows and interaction programs: benefits for conservation education?

    Science.gov (United States)

    Miller, L J; Zeigler-Hill, V; Mellen, J; Koeppel, J; Greer, T; Kuczaj, S

    2013-01-01

    Dolphin shows and dolphin interaction programs are two types of education programs within zoological institutions used to educate visitors about dolphins and the marine environment. The current study examined the short- and long-term effects of these programs on visitors' conservation-related knowledge, attitude, and behavior. Participants of both dolphin shows and interaction programs demonstrated a significant short-term increase in knowledge, attitudes, and behavioral intentions. Three months following the experience, participants of both dolphin shows and interaction programs retained the knowledge learned during their experience and reported engaging in more conservation-related behaviors. Additionally, the number of dolphin shows attended in the past was a significant predictor of recent conservation-related behavior suggesting that repetition of these types of experiences may be important in inspiring people to conservation action. These results suggest that both dolphin shows and dolphin interaction programs can be an important part of a conservation education program for visitors of zoological facilities. © 2012 Wiley Periodicals, Inc.

  17. New Planetarium Show: "Max Goes To The Moon"

    Science.gov (United States)

    Benjamin, Matthew

    2012-05-01

    As part of our NASA Lunar Science Institute funding we have focused on making a children’s planetarium show about space science and exploration. We decided to adapt an award winning children’s book, “Max Goes to the Moon” by Dr. Jeffrey Bennett, into a planetarium show. This story follows the adventure of a dog named Max and his friend/owner Tori. The two of them go on an amazing journey to the Moon and back. Not only is the show a great adventure but it also teaches many concepts pertaining to our current understanding of the Earth-Moon system. We aligned many of these concepts with the new State and Federal education standards.

  18. Uni Dufour | Ig Nobel Show with Marc Abrahams | 7 May

    CERN Multimedia

    2013-01-01

    On 7 May, Marc Abrahams, founder of the Ig Nobel Prize, will give an "Ig Nobel show", in English at Uni Dufour. The Ig Nobel Prizes are an American parody of the Nobel Prizes. In early October of each year, they are awarded to ten unusual or trivial achievements in scientific research. The stated aim of the prizes is to "first make people laugh, and then make them think". Marc Abrahams will introduce this funny and dynamic evening with a short presentation before handing over to a selection of recipients. The show is free and open to all. Tuesday 7 May Ig Nobel Show 6:30 p.m. - Room U600 Uni Dufour

  19. The Biochemistry Show: a new and fun tool for learning

    Directory of Open Access Journals (Sweden)

    A.H Ono

    2006-07-01

    Full Text Available The traditional methods of teaching biochemistry in most universities are based on the memorization of chemical structures, biochemical pathways and reagent names, which is often demotivating for students. We describe an innovative, interactive and alternative method for teaching biochemistry to medical and nutrition undergraduate students, called the Biochemistry Show (BioBio Show). The BioBio Show is based on the active participation of the students. They are divided into groups and the groups face each other, one group against another at a time, in a game based on true-or-false questions on subjects of applied biochemistry (exercise, obesity, diabetes, cholesterol and free radicals, among others). The questions of the Show are prepared in advance by senior students. The BioBio Show has four phases: the first is a selection exam, and from the second to the fourth phase, eliminatory confrontations take place. In a confrontation, the first group selects a certain number of questions for the opponent to answer. The group that chooses the questions must know how to answer and justify the selected questions. This procedure is repeated in all phases of the show. In the last phase, the questions used are taken from an exam previously sat by the students: either the 9-hour biochemistry exam (Sé et al. A 9-hour biochemistry exam. An iron man competition or a good way of evaluating undergraduate students? SBBq 2005, abstract K-6) or the True-or-False exam (TFE) (Sé et al. Are tutor-students capable of writing good biochemistry exams? SBBq 2004, abstract K-18). The winning group receives an extra 0.5 points on the final grade. Over 70% of the students reported on a questionnaire that the BioBio Show is a valuable tool for learning biochemistry. This is a new way to enrich the discussion of biochemistry in the classroom without the students getting bored. Moreover, learning

  20. CERN cars drive by the Geneva Motor Show

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    One of CERN's new gas-fuelled cars was a special guest at the press days of the Geneva motor show this year. The car enjoyed a prominent position on the Gazmobil stand, right next to the latest Maseratis and Ferraris. Journalists previewing the motor show could discover CERN's support for green technologies and also find out more about the lab - home to the fastest racetrack on the planet, with protons in the LHC running at 99.9999991% of the speed of light.

  1. TODDLERS WITH ELEVATED AUTISM SYMPTOMS SHOW SLOWED HABITUATION TO FACES

    Science.gov (United States)

    Webb, Sara Jane; Jones, Emily J. H.; Merkle, Kristen; Namkung, Jessica; Toth, Karen; Greenson, Jessica; Murias, Michael; Dawson, Geraldine

    2010-01-01

    We explored social information processing and its relation to social and communicative symptoms in toddlers with Autism Spectrum Disorder (ASD) and their siblings. Toddlers with more severe symptoms of autism showed slower habituation to faces than comparison groups; slower face learning correlated with poorer social skills and lower verbal ability. Unaffected toddlers who were siblings of children with ASD also showed slower habituation to faces compared with toddlers without siblings with ASD. We conclude that slower rates of face learning may be an endophenotype of ASD and is associated with more severe symptoms among affected individuals. PMID:20301009

  2. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses for the influence of such size-effects on cavitation instabilities are presented. When a metal contains a distribution of micro voids, and the void spacing compared to void size is not extremely large, the surrounding voids may affect the occurrence of a cavitation instability at one of the voids. This has been...

  3. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project of 'DCH issue resolution for ice condenser plants' which is sponsored by NRC at SNL. Even though the calculations were performed for the Ice Condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment and the DCH module can be commonly applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads in descending order. 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author)

  4. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project of 'DCH issue resolution for ice condenser plants' which is sponsored by NRC at SNL. Even though the calculations were performed for the Ice Condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment and the DCH module can be commonly applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads in descending order. 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author).

  5. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  6. Introduction à l'analyse fonctionnelle

    CERN Document Server

    Reischer, Corina; Hengartner, Walter

    1981-01-01

    The fruit of a collaboration between Professor Walter Hengartner of Université Laval and Marcel Lambert and Corina Reischer of the Université du Québec à Trois-Rivières, Introduction à l'analyse fonctionnelle stands out both for the breadth of its content and for the accessibility of its presentation. Without conceding anything in terms of rigour, it is perfectly suited to a first course in functional analysis. While intended first of all for students of mathematics, it will certainly also be useful to graduate students in the sciences and engineering.

  7. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  8. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines this method precisely, along with its fields of application. It describes the most effective methods for product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimising product design processes within a company. -- Key ideas, by Business Digest

  9. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late......Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized...... for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups...

  10. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between similar to 30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport

  11. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than has previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component

  12. Analysing qualitative research data using computer software.

    Science.gov (United States)

    McLafferty, Ella; Farley, Alistair H

    An increasing number of clinical nurses are choosing to undertake qualitative research. A number of computer software packages are available designed for the management and analysis of qualitative data. However, while it is claimed that the use of these programs is also increasing, this claim is not supported by a search of recent publications. This paper discusses the advantages and disadvantages of using computer software packages to manage and analyse qualitative data.

  13. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT policy measures ... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation.

  14. Analysing customer behaviour in mobile app usage

    OpenAIRE

    Chen, Qianling; Zhang, Min; Zhao, Xiande

    2017-01-01

    Purpose – Big data produced by mobile apps contains valuable knowledge about customers and markets and has been viewed as productive resources. This study proposes a multiple methods approach to elicit intelligence and value from big data by analysing customer behaviour in mobile app usage. Design/methodology/approach – The big data analytical approach is developed using three data mining techniques: RFM (Recency, Frequency, Monetary) analysis, link analysis, and association rule learning. We...
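    The record names RFM (Recency, Frequency, Monetary) analysis as one of its three data mining techniques. As a hedged illustration of that general technique only (not the authors' pipeline), the sketch below computes per-customer recency, frequency and monetary totals from a toy, entirely hypothetical transaction list; real analyses would then bin these values into scores and segment customers.

```python
# Minimal sketch of an RFM (Recency, Frequency, Monetary) summarisation step.
# Customer names, dates and amounts are hypothetical placeholders.

from collections import defaultdict
from datetime import date

# (customer_id, purchase_date, amount) -- placeholder data
transactions = [
    ("alice", date(2024, 1, 5), 30.0),
    ("alice", date(2024, 3, 2), 45.0),
    ("bob",   date(2023, 11, 20), 120.0),
]
today = date(2024, 4, 1)

stats = defaultdict(lambda: {"recency": None, "frequency": 0, "monetary": 0.0})
for customer, when, amount in transactions:
    s = stats[customer]
    s["frequency"] += 1
    s["monetary"] += amount
    days_ago = (today - when).days
    if s["recency"] is None or days_ago < s["recency"]:
        s["recency"] = days_ago  # days since the most recent purchase

for customer, summary in stats.items():
    print(customer, summary)
```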

  15. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre-fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth

  16. Show me the money: state contributions toward STD prevention, 2007.

    Science.gov (United States)

    Meyerson, Beth E; Gilbert, Lisa K

    2010-01-01

    The importance of state investment in sexually transmitted disease (STD) prevention has been discussed since the mid-1990s; however, little has become known about state public health funding for STD prevention. To establish a baseline understanding of state STD prevention funding, financial data for fiscal year 2007 were gathered by survey of state STD, immunization, laboratory, and hepatitis program directors. Results revealed that on average states funded 25.8 percent of their total STD prevention budgets and invested $0.23 per capita in STD prevention. The percentage of state funding in the total state STD prevention budget ranged from 0 percent to 70.2 percent, and state investment in STD prevention ranged from $0.00 to $1.55 per capita. The direction and expenditure of state STD prevention resources was also examined. This study strengthens the national understanding of what states are doing to fund STD prevention, and it broadens state public health awareness of the overall STD prevention investment at the state level. The inclusion of Medicaid data and expenditure of federal resources by states would strengthen the study and assist longitudinal analyses focused on the impact of investment on epidemiologic indicators.

  17. A conceptual framework to analyse supply chain designs

    Directory of Open Access Journals (Sweden)

    J. A. Badenhorst-Weiss

    2011-12-01

    Full Text Available Objectives: Supply chain design (SCD) is a concept that forms an integral part of supply chain management (SCM). Effective SCD enhances supply chain integration (SCI) which in turn contributes towards improved supply chain performance. Therefore, organisations' supply chain designs need to be analysed. This article proposes a conceptual framework to analyse organisations' supply chain designs. The objective of this article is to determine whether the proposed conceptual framework is a workable instrument with which organisations can analyse their supply chain designs. Problem investigated: Effective SCD is a complex and demanding undertaking and has become a major challenge for organisations. Moreover, the literature suggests that organisations allow their supply chains to evolve rather than consciously designing them. Although the importance of SCD is emphasised, very little attention is given to what it entails exactly. The problem statement of this article is thus: What are the elements of SCD and how can these elements be included in a conceptual framework to analyse organisations' supply chain designs? Methodology: The methodology used in this article comprised two phases. Firstly, a literature review was conducted to identify SCD elements. The elements were used to develop a conceptual framework with which organisations can analyse their supply chain designs. Secondly, the conceptual framework was tested in 13 organisations to determine whether it is a workable instrument to analyse supply chain designs. The respondents were selected by means of non-probability sampling. Purposive, judgmental and convenience sampling methods were used to select the sample. Findings and implications: As mentioned, the conceptual framework was tested empirically within 13 organisations. The findings show that the conceptual framework is in fact a workable instrument to analyse supply chain designs. Value of the research: The research will make a contribution in

  18. Publication bias in dermatology systematic reviews and meta-analyses.

    Science.gov (United States)

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influence clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the under representation of unpublished studies. This problem is due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes which may lead to clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias. Further, we wanted to conduct our own evaluation of publication bias on meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in meta-analyses that failed to do so using trim and fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and was formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim and fill method, 7 meta-analyses (33.33%) showed evidence of publication bias. Although the trim and fill method only found evidence of publication bias in 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated following PRISMA reporting guidelines, 19 (45.2%) evaluated for publication bias. In
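    The record above names trim-and-fill and cumulative meta-analysis by precision as its formal checks. As a simpler, related illustration of how funnel-plot asymmetry is commonly quantified (and explicitly not the cited study's own analysis), the sketch below runs an Egger-style regression of standardized effect sizes on precision; a large intercept is conventionally read as asymmetry. The effect sizes and standard errors are hypothetical placeholders.

```python
# Sketch of an Egger-style funnel-plot asymmetry check, shown only as a simple
# relative of the trim-and-fill and cumulative-by-precision methods named in
# the record.  Effect sizes and standard errors are hypothetical placeholders.

import numpy as np

effects = np.array([0.42, 0.35, 0.50, 0.61, 0.20, 0.75])   # study effect sizes
std_errs = np.array([0.10, 0.12, 0.15, 0.20, 0.08, 0.25])  # their standard errors

# Regress the standardized effect (effect / SE) on precision (1 / SE); an
# intercept far from zero is one conventional sign of funnel-plot asymmetry,
# which may (but need not) indicate publication bias.
precision = 1.0 / std_errs
standardized = effects / std_errs
slope, intercept = np.polyfit(precision, standardized, 1)

print(f"Egger intercept (asymmetry indicator): {intercept:.2f}")
print(f"slope (precision-adjusted effect): {slope:.2f}")
```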

  19. Comparative genomic analyses of the Taylorellae.

    Science.gov (United States)

    Hauser, Heidi; Richter, Daniel C; van Tonder, Andries; Clark, Louise; Preston, Andrew

    2012-09-14

    Contagious equine metritis (CEM) is an important venereal disease of horses that is of concern to the thoroughbred industry. Taylorella equigenitalis is a causative agent of CEM but very little is known about it or its close relative Taylorella asinigenitalis. To reveal novel information about Taylorella biology, comparative genomic analyses were undertaken. Whole genome sequencing was performed for the T. equigenitalis type strain, NCTC11184. Draft genome sequences were produced for a second T. equigenitalis strain and for a strain of T. asinigenitalis. These genome sequences were analysed and compared to each other and the recently released genome sequence of T. equigenitalis MCE9. These analyses revealed that T. equigenitalis strains appear to be very similar to each other with relatively little strain-specific DNA content. A number of genes were identified that encode putative toxins and adhesins that are possibly involved in infection. Analysis of T. asinigenitalis revealed that it has a very similar gene repertoire to that of T. equigenitalis but shares surprisingly little DNA sequence identity with it. The generation of genome sequence information greatly increases knowledge of these poorly characterised bacteria and greatly facilitates study of them. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)
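    The tracking idea described above, comparing the plant's measured behaviour against the analyser's simulated reference and alerting the operator to atypical excursions, can be caricatured in a few lines. The sketch below is purely illustrative of that comparison step; channel names, readings and tolerances are hypothetical placeholders, not drawn from any plant or from the cited work.

```python
# Minimal sketch of "track the plant against the analyser's forecast": compare
# measured readings with simulated reference values and flag channels whose
# deviation exceeds a tolerance.  All values are hypothetical placeholders.

def flag_deviations(measured: dict, simulated: dict, tolerance: dict) -> list:
    """Return the channels whose measured value strays too far from the reference."""
    alerts = []
    for channel, value in measured.items():
        if abs(value - simulated[channel]) > tolerance[channel]:
            alerts.append(channel)
    return alerts

measured = {"steam_pressure": 6.9, "feedwater_flow": 480.0}
simulated = {"steam_pressure": 6.5, "feedwater_flow": 478.0}
tolerance = {"steam_pressure": 0.2, "feedwater_flow": 10.0}

print(flag_deviations(measured, simulated, tolerance))  # -> ['steam_pressure']
```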

  1. Freshwater pearl mussels show plasticity of responses to different predation risks but also show consistent individual differences in responsiveness.

    Science.gov (United States)

    Wilson, Conor D; Arnott, Gareth; Elwood, Robert W

    2012-03-01

    Animals often show behavioural plasticity with respect to predation risk but also show behavioural syndromes in terms of consistency of responses to different stimuli. We examine these features in the freshwater pearl mussel. These bivalves often aggregate presumably to reduce predation risk to each individual. Predation risk, however, will be higher in the presence of predator cues. Here we use dimming light, vibration and touch as novel stimuli to examine the trade-off between motivation to feed and motivation to avoid predation. We present two experiments that each use three sequential novel stimuli to cause the mussels to close their valves and hence cease feeding. We find that mussels within a group showed shorter closure times than solitary mussels, consistent with decreased vulnerability to predation in group-living individuals. Mussels exposed to the odour of a predatory crayfish showed longer closures than control mussels, highlighting the predator assessment abilities of this species. However, individuals showed significant consistency in their closure responses across the trial series, in line with behavioural syndrome theory. Our results show that bivalves trade-off feeding and predator avoidance according to predation risk but the degree to which this is achieved is constrained by behavioural consistency. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. The neonicotinoid imidacloprid shows high chronic toxicity to mayfly nymphs

    NARCIS (Netherlands)

    Roessink, I.; Merga, L.B.; Zweers, A.J.; Brink, van den P.J.

    2013-01-01

    The present study evaluated the acute and chronic toxicity of imidacloprid to a range of freshwater arthropods. Mayfly and caddisfly species were most sensitive to short-term imidacloprid exposures (10 tests), whereas the mayflies showed by far the most sensitive response to long-term exposure of

  3. Polypyridyl iron(II) complexes showing remarkable photocytotoxicity ...

    Indian Academy of Sciences (India)

    Polypyridyl iron(II) complexes showing remarkable photocytotoxicity in visible light. ADITYA GARAI, UTTARA BASU, ILA PANT, PATURU KONDAIAH and AKHIL R CHAKRAVARTY. Department of Inorganic and Physical Chemistry, Indian Institute of Science, Bangalore 560012, India. Department of Molecular ...

  4. Polypyridyl iron(II) complexes showing remarkable photocytotoxicity ...

    Indian Academy of Sciences (India)

    Polypyridyl iron(II) complexes showing remarkable photocytotoxicity in visible light. ADITYA GARAI, UTTARA BASU, ILA PANT, PATURU KONDAIAH and AKHIL R. CHAKRAVARTY. Department of Inorganic and Physical Chemistry, Indian Institute of Science, Bangalore 560012, India. E-mail: ...

  5. Manumycin from a new Streptomyces strain shows antagonistic ...

    African Journals Online (AJOL)

    Manumycin from a new Streptomyces strain shows antagonistic effect against methicillin-resistant Staphylococcus aureus (MRSA)/vancomycin-resistant enterococci (VRE) strains from Korean Hospitals. Yun Hee Choi, Seung Sik Cho, Jaya Ram Simkhada, Chi Nam Seong, Hyo Jeong Lee, Hong Seop Moon, Jin Cheol Yoo ...

  6. Auditory temporal-order thresholds show no gender differences

    NARCIS (Netherlands)

    van Kesteren, Marlieke T. R.; Wiersinga-Post, J. Esther C.

    2007-01-01

    Purpose: Several studies on auditory temporal-order processing showed gender differences. Women needed longer inter-stimulus intervals than men when indicating the temporal order of two clicks presented to the left and right ear. In this study, we examined whether we could reproduce these results in

  7. Mice lacking neuropeptide Y show increased sensitivity to cocaine

    DEFF Research Database (Denmark)

    Sørensen, Gunnar; Woldbye, David Paul Drucker

    2012-01-01

    There is increasing data implicating neuropeptide Y (NPY) in the neurobiology of addiction. This study explored the possible role of NPY in cocaine-induced behavior using NPY knockout mice. The transgenic mice showed a hypersensitive response to cocaine in three animal models of cocaine addiction...

  8. Television Judge Shows: Nordic and U.S. Perspectives

    DEFF Research Database (Denmark)

    Porsdam, Helle

    2017-01-01

    Legal discourse is language that people use in a globalizing and multicultural society to negotiate acceptable behaviors and values. We see this played out in popular cultural forums such as judicial television dramas. In the American context, television judge shows are virtually synonymous...

  9. Strobes: Pyrotechnic Compositions That Show a Curious Oscillatory Combustion

    NARCIS (Netherlands)

    Corbel, J.M.L.; van Lingen, J.N.J.; Zevenbergen, J.F.; Gijzeman, O.L.J.; Meijerink, A.

    2013-01-01

    Strobes are pyrotechnic compositions which show an oscillatory combustion; a dark phase and a flash phase alternate periodically. The strobe effect has applications in various fields, most notably in the fireworks industry and in the military area. All strobe compositions mentioned in the literature

  10. An Easy Way to Show Memory Color Effects.

    Science.gov (United States)

    Witzel, Christoph

    2016-01-01

    This study proposes and evaluates a simple stimulus display that allows one to measure memory color effects (the effect of object knowledge and memory on color perception). The proposed approach is fast and easy and does not require running an extensive experiment. It shows that memory color effects are robust to minor variations due to a lack of color calibration.

  11. An Easy Way to Show Memory Color Effects

    OpenAIRE

    Witzel, Christoph

    2016-01-01

    This study proposes and evaluates a simple stimulus display that allows one to measure memory color effects (the effect of object knowledge and memory on color perception). The proposed approach is fast and easy and does not require running an extensive experiment. It shows that memory color effects are robust to minor variations due to a lack of color calibration.

  12. A Progress Evaluation of Four Bilingual Children's Television Shows.

    Science.gov (United States)

    Klein, Stephen P.; And Others

    An evaluation of a bilingual education TV series was conducted involving 6-year-old English speaking, Spanish speaking, and bilingual children at four sites. Children were assigned to control and experimental groups with the latter group seeing four 30 minute shows. A pretest-posttest design was employed with the pretest serving as the covariate…

  13. 10 CFR 110.62 - Order to show cause.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Order to show cause. 110.62 Section 110.62 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL Violations and... of his right, within 20 days or other specified time, to file a written answer and demand a hearing...

  14. THE MUSLIM SHOW: SOFT CONTRA "LABELING" MELALUI MEDIA SOSIAL

    Directory of Open Access Journals (Sweden)

    Yuliana Rakhmawati

    2016-03-01

    Full Text Available After the events of 9/11, Muslims have been given a “labeling” as an entity associated with fatalism, deviance, fundamentalism, terrorism and a number of other negative labels. Labeling cannot be separated from the role of the mass media in distributing messages built around such labels. Ideas promoted by some Western media tend to reinforce this labeling. As one of the most sensitive issues, such a tendentious, label-laden depiction of Islam becomes a provocative theme. Many Muslims respond to such labeling in counterproductive ways; others respond with high-context communication, disseminating clarifying messages through social media. One such response is the production and distribution of the comic «The Muslim Show» (TMS). The packaging of the visualization and the design of the messages in TMS show the creativity of the authors in presenting the condition of the Muslim community in its many dimensions. The themes in the comic, made by a duo of French Muslims, try to show the positive side of Islam, a peaceful Islam, in the fight against Islamophobia.

  15. 47 CFR 73.24 - Broadcast facilities; showing required.

    Science.gov (United States)

    2010-10-01

    ... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.24 Broadcast facilities; showing required. An authorization for a new AM broadcast station or increase in facilities of an existing station will be issued... transmitter, and other technical phases of operation comply with the regulations governing the same, and the...

  16. International Team Shows that Primes Can Be Found in Surprising ...

    Indian Academy of Sciences (India)

    International Team Shows that Primes Can Be Found in Surprising Places. Andrew Granville. Research News, Resonance – Journal of Science Education, Volume 3, Issue 3, March 1998, pp 71-72.

  17. Triphala, a formulation of traditional Ayurvedic medicine, shows

    Indian Academy of Sciences (India)

    Triphala, a formulation of traditional Ayurvedic medicine, shows protective effect against X-radiation in HeLa cells. YUKI TAKAUJI KENSUKE ... with the cells cultured in vitro. The simple bioassay system with human cultured cells would facilitate the understanding of the molecular basis for the beneficial effects of Triphala.

  18. Five Kepler target stars that show multiple transiting exoplanet candidates

    DEFF Research Database (Denmark)

    Steffen, Jason H.; Batalha, N. M.; Borucki, W. J.

    2010-01-01

    We present and discuss five candidate exoplanetary systems identified with the Kepler spacecraft. These five systems show transits from multiple exoplanet candidates. Should these objects prove to be planetary in nature, then these five systems open new opportunities for the field of exoplanets...

  19. On with the Show! A Guide for Directors and Actors

    Science.gov (United States)

    Bestor, Sheri

    2005-01-01

    Divided into two parts, the Director's Handbook and the Actor's Handbook, On With the Show! enables directors and actors to get the most out of rehearsal time at home and on the stage. Providing essential time-saving techniques, worksheets, and samples, this guide allows performers and directors to work more effectively and efficiently toward the…

  20. 47 CFR 74.131 - Licensing requirements, necessary showing.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Licensing requirements, necessary showing. 74.131 Section 74.131 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES EXPERIMENTAL RADIO, AUXILIARY, SPECIAL BROADCAST AND OTHER PROGRAM DISTRIBUTIONAL SERVICES Experimental Broadcast Stations § 74.131 Licensing...

  1. Teaching Job Interviewing Skills with the Help of Television Shows

    Science.gov (United States)

    Bloch, Janel

    2011-01-01

    Because of its potential for humor and drama, job interviewing is frequently portrayed on television. This article discusses how scenes from popular television series such as "Everybody Loves Raymond," "Friends," and "The Mary Tyler Moore Show" can be used to teach effective job interview skills in business communication courses. Television…

  2. Using "The Daily Show" to Promote Media Literacy

    Science.gov (United States)

    Garrett, H. James; Schmeichel, Mardi

    2012-01-01

    Social studies teachers are tasked with aiding their students' abilities to engage in public debate and make politically sound decisions. One way the authors have found to help facilitate this is to draw connections between content knowledge and current political conversations through the use of clips from "The Daily Show with Jon Stewart." While…

  3. Airline Overbooking Problem with Uncertain No-Shows

    Directory of Open Access Journals (Sweden)

    Chunxiao Zhang

    2014-01-01

    Full Text Available This paper considers an airline overbooking problem for a new single-leg flight with a discount fare. Because historical no-show data are absent for a new flight, and because various uncertain human behaviors or unexpected events may prevent a few passengers from boarding their aircraft on time, the probability distribution of no-shows cannot be obtained. In this case, the airlines have to invite domain experts to provide belief degrees of no-shows in order to estimate its distribution. However, human beings often overestimate unlikely events, which makes the variance of the belief degree much greater than that of the frequency. If we still regard the belief degree as a subjective probability, the derived results will exceed our expectations. In order to deal with this uncertainty, the number of no-shows on the new flight is assumed to be an uncertain variable in this paper. Given the chance constraint of social reputation, an overbooking model with discount fares is developed to maximize the profit rate based on uncertain programming theory. Finally, the analytic expression of the optimal booking limit is obtained and illustrated through a numerical example, and the results of a sensitivity analysis indicate that the optimal booking limit is significantly affected by flight capacity, discount, confidence level, and the parameters of the uncertainty distribution.
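
    As a rough illustration of the kind of booking-limit optimisation described above (not the paper's uncertain-programming formulation, which treats no-shows as an uncertain variable rather than a random one), the sketch below maximises expected profit under a simple binomial stand-in for no-shows, with a cap on the probability of denied boarding playing the role of the social-reputation chance constraint. All parameter values and function names are hypothetical.

    ```python
    import math

    def binom_pmf(k, n, p):
        """Probability that exactly k of n booked passengers are no-shows."""
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    def expected_profit(limit, capacity, fare, bump_cost, p_no_show):
        """Expected profit for one booking limit (simplified illustrative model)."""
        profit = 0.0
        for k in range(limit + 1):                       # k = number of no-shows
            shows = limit - k
            revenue = fare * min(shows, capacity)        # only boarded passengers counted
            penalty = bump_cost * max(shows - capacity, 0)
            profit += binom_pmf(k, limit, p_no_show) * (revenue - penalty)
        return profit

    def prob_denied_boarding(limit, capacity, p_no_show):
        """Chance that more passengers show up than there are seats."""
        return sum(binom_pmf(k, limit, p_no_show)
                   for k in range(limit + 1) if limit - k > capacity)

    # Hypothetical inputs: 180 seats, 10% no-show rate, 5% reputation-risk tolerance.
    capacity, fare, bump_cost, p_no_show, alpha = 180, 200.0, 600.0, 0.10, 0.05
    feasible = [b for b in range(capacity, capacity + 41)
                if prob_denied_boarding(b, capacity, p_no_show) <= alpha]
    best = max(feasible, key=lambda b: expected_profit(b, capacity, fare, bump_cost, p_no_show))
    print("booking limit:", best)
    ```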

  4. Coat protein sequence shows that Cucumber mosaic virus isolate ...

    Indian Academy of Sciences (India)

    Madhu

    CMV subgroup I has recently been subdivided into IA and IB on the basis of gene sequences available for CMV strains. Coat protein sequence shows that Cucumber mosaic virus isolate from geraniums (Pelargonium spp.) belongs to subgroup II†. NEERAJ VERMA*, B K MAHINGHARA, RAJA RAM and A A ZAIDI.

  5. Synthetic analogs of anoplin show improved antimicrobial activities

    DEFF Research Database (Denmark)

    Munk, Jens; Uggerhøj, Lars Erik; Poulsen, Tanja Juul

    2013-01-01

    We present the antimicrobial and hemolytic activities of the decapeptide anoplin and 19 analogs thereof tested against methicillin-resistant Staphylococcus aureus ATCC 33591 (MRSA), Escherichia coli (ATCC 25922), Pseudomonas aeruginosa (ATCC 27853), vancomycin-resistant Enterococcus faecium (ATCC...... compounds are more specific than anoplin. Both 2Nal(6) and Cha(6) show improved therapeutic index against all strains tested....

  6. An autopsied case of tuberculous meningitis showing interesting CT findings

    International Nuclear Information System (INIS)

    Abiko, Takashi; Higuchi, Hiroshi; Imada, Ryuichi; Nagai, Kenichi

    1983-01-01

    A 61-year-old female patient died of a neurological disorder of unknown origin one month after the first visit and was found to have had tuberculous meningitis at autopsy. CT revealed a low density area showing an enlargement of the cerebral ventricle but did not reveal contrast enhancement in the basal cistern peculiar to tuberculous meningitis. (Namekawa, K.)

  7. Autopsied case of tuberculous meningitis showing interesting CT findings

    Energy Technology Data Exchange (ETDEWEB)

    Abiko, Takashi; Higuchi, Hiroshi; Imada, Ryuichi; Nagai, Kenichi (Iwate Prefectural Central Hospital (Japan))

    1983-11-01

    A 61-year-old female patient died of a neurological disorder of unknown origin one month after the first visit and was found to have had tuberculous meningitis at autopsy. CT revealed a low density area showing an enlargement of the cerebral ventricle but did not reveal contrast enhancement in the basal cistern peculiar to tuberculous meningitis.

  8. 34 CFR 300.193 - Request to show cause.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Request to show cause. 300.193 Section 300.193 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION ASSISTANCE TO STATES FOR THE EDUCATION OF...

  9. 34 CFR 300.194 - Show cause hearing.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Show cause hearing. 300.194 Section 300.194 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION ASSISTANCE TO STATES FOR THE EDUCATION OF CHILDREN WITH...

  10. Caulis Lonicerae Japonicae extract shows protective effect on ...

    African Journals Online (AJOL)

    Result: The results show that a high dose of CLJE (600 mg/kg) significantly inhibited bone mineral density (BMD) reduction of L4 vertebrae (0.24 ... Keywords: Caulis Lonicerae Japonicae, Post-menopausal osteoporosis, Ovariectomy, Bone mineral density, Trabecular ...

  11. Genoa Boat Show – Good Example of Event Management

    Directory of Open Access Journals (Sweden)

    Dunja Demirović

    2012-07-01

    Full Text Available The International Boat Show, a business and tourist event, has been held annually in the Italian city of Genoa since 1962. The fair is one of the oldest, largest and best known in the boating industry worldwide, primarily due to good management of the event, and it can serve as a case study for domestic fair organizers seeking to improve the quality of their business and services. Since Belgrade is a city of fairs, but compared with Genoa still underdeveloped in terms of trade shows, the following tasks arose naturally in this study: to examine how the organizers of the Genoa Boat Show handle preparation and the fair offer, the selection of and communication with specific target groups (especially visitors), services during the fair, and the functioning of the city during the fair. In the research the authors mainly used the historical method, comparison, synthesis and interviews. The results of this theoretical research may help not only managers of fairs and exhibitions but also organizers of other events in our country.

  12. Coupled biochemical genetic and karyomorphological analyses for ...

    African Journals Online (AJOL)

    S. esocinus showed five bands, S. curvifrons five, S. niger seven, S. labiatus and S. plagiostomus each showed six bands; they also showed species characteristic bands. Karyotypic study of these was carried out. The diploid chromosome numbers recorded were 98 in S. niger (24 m + 32 sm + 22 st + 20 t), 98 in S. esocinus ...

  13. New Inspiring Planetarium Show Introduces ALMA to the Public

    Science.gov (United States)

    2009-03-01

    As part of a wide range of education and public outreach activities for the International Year of Astronomy 2009 (IYA2009), ESO, together with the Association of French Language Planetariums (APLF), has produced a 30-minute planetarium show, In Search of our Cosmic Origins. It is centred on the global ground-based astronomical Atacama Large Millimeter/submillimeter Array (ALMA) project and represents a unique chance for planetariums to be associated with the IYA2009. [Accompanying release material: ESO PR Photos 09a-d/09 (show logo, Galileo's first telescope observations, the ALMA Observatory, the Milky Way band) and ESO PR Video 09a/09 (trailer in English).] ALMA is the leading telescope for observing the cool Universe -- the relic radiation of the Big Bang, and the molecular gas and dust that constitute the building blocks of stars, planetary systems, galaxies and life itself. It is currently being built in the extremely arid environment of the Chajnantor plateau, at 5000 metres altitude in the Chilean Andes, and will start scientific observations around 2011. ALMA, the largest current astronomical project, is a revolutionary telescope, comprising a state-of-the-art array of 66 giant 12-metre and 7-metre diameter antennas observing at millimetre and submillimetre wavelengths. In Search of our Cosmic Origins highlights the unprecedented window on the Universe that this facility will open for astronomers. "The show gives viewers a fascinating tour of the highest observatory on Earth, and takes them from there out into our Milky Way, and beyond," says Douglas Pierce-Price, the ALMA Public Information Officer at ESO. Edited by world fulldome experts Mirage3D, the emphasis of the new planetarium show is on the incomparable scientific adventure of the ALMA project. A young female astronomer guides the audience through a story that includes unique animations and footage, leading the viewer from the first observations by Galileo

  14. Heater probe thermocoagulation for high-risk patients who show rebleeding from peptic ulcers.

    Science.gov (United States)

    Hsieh, Yu-Hsi; Lin, Hwai-Jeng

    2011-08-26

    To investigate whether heater probe therapy is effective for patients showing rebleeding from peptic ulcers, we retrospectively reviewed the case profiles from our previous studies on endoscopic therapy for high-risk patients with peptic ulcer bleeding in the past two decades. We analysed the outcomes of 191 patients who showed rebleeding after initial endoscopic haemostasis and received endoscopic therapy with heater probe thermocoagulation. A total of 191 patients showing rebleeding received heater probe thermocoagulation. After re-therapy, 158 patients (82.7%) achieved ultimate haemostasis. Twenty-five of the 33 patients who failed to achieve haemostasis received surgical intervention. Ten patients (5.2%) died within 1 month after re-therapy. Heater probe thermocoagulation can be used as the first choice for management of patients showing rebleeding after initial endoscopic therapy.

  15. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Full Text Available Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
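
    As a toy example of turning simulated dominance interactions into the kind of network metrics compared above (not the authors' agent-based model), the sketch below builds a weighted, directed dominance network from made-up win/loss counts using the networkx library and reports simple degree centralities.

    ```python
    import networkx as nx

    # Hypothetical dominance interactions: (winner, loser, number_of_wins).
    interactions = [
        ("A", "B", 5), ("A", "C", 3), ("B", "C", 4),
        ("C", "D", 2), ("B", "D", 6), ("A", "D", 1),
    ]

    # Build a weighted, directed dominance network (edges point from winner to loser).
    G = nx.DiGraph()
    for winner, loser, wins in interactions:
        G.add_edge(winner, loser, weight=wins)

    # Simple metrics that could be compared between simulated agents and real animals:
    # out-degree centrality ~ tendency to dominate, in-degree ~ tendency to be dominated.
    out_centrality = nx.out_degree_centrality(G)
    in_centrality = nx.in_degree_centrality(G)

    for node in sorted(G.nodes):
        print(node, round(out_centrality[node], 2), round(in_centrality[node], 2))
    ```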

  16. Severe weather as a spectacle: the Meteo-Show

    Science.gov (United States)

    Orbe, Iñaki; Gaztelumendi, Santiago

    2017-06-01

    In this work we focus on perhaps one of the worst journalistic practices in dealing with severe weather: the Meteo-Show, the widespread practice, especially on TV, of using weather and meteorology as spectacle. Journalism today has found a real goldmine in weather information in terms of audience, due to the growing public interest in this matter. However, as happens with other content, sensationalism and exaggeration have also reached weather information, primarily when episodes of an adverse nature (snow, heavy rain, floods, etc.) are addressed. In this paper we seek to identify the worst practices in weather communication through an analysis of examples from real journalistic work. We present some keys to understanding this trend, highlighting the ingredients that are present in the worst Meteo-Show.

  17. Severe weather as a spectacle: the Meteo-Show

    Directory of Open Access Journals (Sweden)

    I. Orbe

    2017-06-01

    Full Text Available In this work we focus on perhaps one of the worst journalistic practices in dealing with severe weather: the Meteo-Show, the widespread practice, especially on TV, of using weather and meteorology as spectacle. Journalism today has found a real goldmine in weather information in terms of audience, due to the growing public interest in this matter. However, as happens with other content, sensationalism and exaggeration have also reached weather information, primarily when episodes of an adverse nature (snow, heavy rain, floods, etc.) are addressed. In this paper we seek to identify the worst practices in weather communication through an analysis of examples from real journalistic work. We present some keys to understanding this trend, highlighting the ingredients that are present in the worst Meteo-Show.

  18. El reality show a la hora de la merienda

    Directory of Open Access Journals (Sweden)

    Lic. Rosa María Ganga Ganga

    2000-01-01

    Full Text Available Testimony programmes, belonging to the television genre of the Reality Show, are a variant of the broader Talk Show subgenre and already have a certain tradition in our country. The present work focuses on this type of testimony programme, whose discursive strategy rests on the presentation and representation of the autobiographical account of the anonymous man or woman, thereby joining the most recent currents in sociology and historiography. It seeks to clarify some of their characteristics and functions, especially their socializing function, through the biographical mechanism and the concept of habitus taken from Pierre Bourdieu.

  19. Preschoolers show less trust in physically disabled or obese informants

    Directory of Open Access Journals (Sweden)

    Sara eJaffer

    2015-01-01

    Full Text Available This research examined whether preschool-aged children show less trust in physically disabled or obese informants. In Study 1, when learning about novel physical activities and facts, 4- and 5-year-olds preferred to endorse the testimony of a physically abled, non-obese informant rather than a physically disabled or obese one. In Study 2, after seeing that the physically disabled or obese informant was previously reliable whereas the physically abled, non-obese one was unreliable, 4- and 5-year-olds did not show a significant preference for either informant. We conclude that in line with the literature on children’s negative stereotypes of physically disabled or obese others, preschoolers are biased against these individuals as potential sources of new knowledge. This bias is robust in that past reliability might undermine its effect on children, but cannot reverse it.

  20. "Einstein's Playground": An Interactive Planetarium Show on Special Relativity

    Science.gov (United States)

    Sherin, Zachary; Tan, Philip; Fairweather, Heather; Kortemeyer, Gerd

    2017-12-01

    The understanding of many aspects of astronomy is closely linked with relativity and the finite speed of light, yet relativity is generally not discussed in great detail during planetarium shows for the general public. One reason may be the difficulty to visualize these phenomena in a way that is appropriate for planetariums; another may be their distance from everyday experiences that makes it hard for audiences to connect with the presentation. In this paper, we describe an effort to visualize special relativity phenomena in an immersive "everyday" scenario. In order to bring special relativity to human scale, we simulate a universe in which the speed of light is slower, so that "everyday" speeds become relativistic. We describe the physics and the technical details of our first planetarium show, "Einstein's Playground," which premiered at the Museum of Science, Boston.
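
    To make the show's central device concrete: once the simulated speed of light is lowered, everyday speeds produce measurable relativistic effects. The snippet below computes the Lorentz factor for a brisk walk under an assumed in-show light speed of 10 m/s (an illustrative value, not one taken from the show itself).

    ```python
    import math

    def lorentz_factor(v, c):
        """gamma = 1 / sqrt(1 - v^2 / c^2)"""
        return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

    c_sim = 10.0   # assumed in-show speed of light, m/s (hypothetical value)
    v_walk = 2.0   # brisk walking speed, m/s

    gamma = lorentz_factor(v_walk, c_sim)
    print(f"gamma = {gamma:.3f}")   # ~1.021: about 2% time dilation while walking
    ```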

  1. Experimental Lung Cancer Drug Shows Early Promise | Poster

    Science.gov (United States)

    By Frank Blanchard, Staff Writer A first-of-its-kind drug is showing early promise in attacking certain lung cancers that are hard to treat because they build up resistance to conventional chemotherapy. The drug, CO-1686, performed well in a preclinical study involving xenograft and transgenic mice, as reported in the journal Cancer Discovery. It is now being evaluated for safety and efficacy in Phase I and II clinical trials.

  2. Learning from a dive show in an aquarium setting

    Science.gov (United States)

    Walsh, Lori M.

    A study was conducted at an aquarium next to a theme park to understand information recalled from two versions of shows viewed at the largest display. The goal of this research was to determine if learning was enhanced by having a diver in water as the treatment group. This project focused on the knowledge recalled about shark and ray feeding adaptations, the information recalled about the mentioned conservation message about sustainable seafood and the potential of the two shows to make memorable experiences. During the project, 30 adult participants from each group were given a survey with five open-ended questions. Results suggest that the diver might distract from biological content information, or that the diver is such a novel element that it interferes with recall. While guests seemed to recall information about rays and sharks, the amount of information was not substantial. It appears that the diver does not affect content messaging but does impact whether guests attend to Seafood Watch messaging. The diver may have been so novel that the treatment group could not attend to the conservation message that was delivered, regardless of topic, or the control group recalled the message because the guests were not distracted by the diver or feeding. The absence of a diver seems to allow the guests to better attend to what is happening outside of the tank. While adding a diver increases photo opportunities and may bring guests to a show, the results seem to indicate that it does not significantly increase recall. The results of this study show that guests in a theme park setting can recall information from an educational program. Guests may not enter this hybrid aquarium with the intention of learning, but recall, one of the components in learning, does occur.

  3. High-frequency parameters of magnetic films showing magnetization dispersion

    International Nuclear Information System (INIS)

    Sidorenkov, V.V.; Zimin, A.B.; Kornev, Yu.V.

    1988-01-01

    Magnetization dispersion leads to skewed resonance curves shifted towards higher magnetizing fields, together with considerable reduction in the resonant absorption, while the FMR line width is considerably increased. These effects increase considerably with frequency, in contrast to films showing magnetic-anisotropy dispersion, where they decrease. It is concluded that there may be anomalies in the frequency dependence of the resonance parameters for polycrystalline magnetic films

  4. An Undergraduate Endeavor: Assembling a Live Planetarium Show About Mars

    Science.gov (United States)

    McGraw, Allison M.

    2016-10-01

    Viewing the mysterious red planet Mars with just the human eye goes back thousands of years, but in more recent years the growth of telescopes, satellites and lander missions has unveiled unrivaled detail of the Martian surface that tells a story worth listening to. This planetarium show will move from the observations of the ancients to current understandings of the Martian surface, atmosphere and inner workings through past and current Mars missions. Visual animations of its planetary motions, high-resolution imagery from HiRISE (High Resolution Imaging Science Experiment) and CTX (Context Camera) aboard the MRO (Mars Reconnaissance Orbiter), as well as other datasets, will be used to display the terrain detail and imagery of the planet Mars with a digital projection system. Local planetary scientists and Mars specialists from the Lunar and Planetary Lab at the University of Arizona (Tucson, AZ) will be interviewed and featured in the show to highlight current technology and understandings of the red planet. This is an undergraduate project that is looking for collaborations and insight in order to gain structure in script writing that will teach about this planetary body to audiences of all ages in the format of a live planetarium show.

  5. Radon in Austrian tourist mines and show caves

    International Nuclear Information System (INIS)

    Ringer, W.; Graeser, J.

    2009-01-01

    The radon situation in tourist mines and show caves is barely investigated in Austria. This paper investigates the influence of its determining factors, such as climate, structure and geology. For this purpose, long-term time-resolved measurements over 6 to 12 months in 4 tourist mines and 2 show caves - with 5 to 9 measuring points each - have been carried out to obtain the course of radon concentration throughout the year. In addition, temperature and air-pressure were measured and compared to the data outside where available. Results suggest that the dominating factors of the average radon concentration are structure and location (geology) of the tunnel-system, whereas the diurnal and annual variation is mainly caused by the changing airflow, which is driven by the difference in temperature inside and outside. Downcast air is connected with very low radon concentrations, upcast air with high concentrations. In some locations the maximum values appear when the airflow ceases. But airflow can be different in different parts of mines and caves. Systems close to the surface show generally lower radon levels than the ones located deeper underground. Due to variation of structure, geology and local climate, the radon situation in mines and caves can only be described by simultaneous measurements at several measuring points. (orig.)

  6. AirShow 1.0 CFD Software Users' Guide

    Science.gov (United States)

    Mohler, Stanley R., Jr.

    2005-01-01

    AirShow is visualization post-processing software for Computational Fluid Dynamics (CFD). Upon reading binary PLOT3D grid and solution files into AirShow, the engineer can quickly see how hundreds of complex 3-D structured blocks are arranged and numbered. Additionally, chosen grid planes can be displayed and colored according to various aerodynamic flow quantities such as Mach number and pressure. The user may interactively rotate and translate the graphical objects using the mouse. The software source code was written in cross-platform Java, C++, and OpenGL, and runs on Unix, Linux, and Windows. The graphical user interface (GUI) was written using Java Swing. Java also provides multiple synchronized threads. The Java Native Interface (JNI) provides a bridge between the Java code and the C++ code where the PLOT3D files are read, the OpenGL graphics are rendered, and numerical calculations are performed. AirShow is easy to learn and simple to use. The source code is available for free from the NASA Technology Transfer and Partnership Office.

  7. Architectures for Quantum Simulation Showing a Quantum Speedup

    Science.gov (United States)

    Bermejo-Vega, Juan; Hangleiter, Dominik; Schwarz, Martin; Raussendorf, Robert; Eisert, Jens

    2018-04-01

    One of the main aims in the field of quantum simulation is to achieve a quantum speedup, often referred to as "quantum computational supremacy," referring to the experimental realization of a quantum device that computationally outperforms classical computers. In this work, we show that one can devise versatile and feasible schemes of two-dimensional, dynamical, quantum simulators showing such a quantum speedup, building on intermediate problems involving nonadaptive, measurement-based, quantum computation. In each of the schemes, an initial product state is prepared, potentially involving an element of randomness as in disordered models, followed by a short-time evolution under a basic translationally invariant Hamiltonian with simple nearest-neighbor interactions and a mere sampling measurement in a fixed basis. The correctness of the final-state preparation in each scheme is fully efficiently certifiable. We discuss experimental necessities and possible physical architectures, inspired by platforms of cold atoms in optical lattices and a number of others, as well as specific assumptions that enter the complexity-theoretic arguments. This work shows that benchmark settings exhibiting a quantum speedup may require little control, in contrast to universal quantum computing. Thus, our proposal puts a convincing experimental demonstration of a quantum speedup within reach in the near term.

  8. Elemental abundance and analyses with coadded DAO spectrograms

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1987-01-01

    One can improve the quality of elemental abundance analyses by using higher signal-to-noise data than has been the practice at high resolution. The procedures developed at the Dominion Astrophysical Observatory to coadd high-dispersion coudé spectrograms are used with a minimum of 10 spectrograms at 6.5 Å mm⁻¹ on IIa-O plates for each of three field horizontal-branch (FHB) A stars to increase the signal-to-noise ratio of the photographic data over a considerable wavelength region. Fine analyses of the sharp-lined prototype FHB stars HD 109995 and 161817 show an internal consistency which justifies this effort. Their photospheric elemental abundances are similar to those of Population II globular cluster giants. (author)
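
    The benefit of coadding rests on the fact that, for independent exposures with comparable noise, the signal adds linearly while the noise adds in quadrature, so the signal-to-noise ratio grows roughly as the square root of the number of spectrograms. The sketch below checks that scaling on synthetic spectra; it is not the DAO coaddition procedure itself, and all quantities are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_spectra, n_pixels = 10, 500
    true_signal = 1.0 + 0.3 * np.sin(np.linspace(0, 20, n_pixels))  # synthetic spectrum
    noise_sigma = 0.2

    # Simulate N independent exposures of the same spectrum.
    exposures = true_signal + rng.normal(0.0, noise_sigma, size=(n_spectra, n_pixels))

    def snr(spectrum):
        """Crude S/N estimate: mean signal over r.m.s. residual from the truth."""
        residual = spectrum - true_signal
        return true_signal.mean() / residual.std()

    coadd = exposures.mean(axis=0)
    print(f"single exposure S/N ~ {snr(exposures[0]):.1f}")
    print(f"coadd of {n_spectra} S/N ~ {snr(coadd):.1f}  (expected gain ~ sqrt(10) ~ 3.2)")
    ```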

  9. Physiological and enzymatic analyses of pineapple subjected to ionizing radiation

    International Nuclear Information System (INIS)

    Silva, Josenilda Maria da; Silva, Juliana Pizarro; Spoto, Marta Helena Fillet

    2007-01-01

    The physiological and enzymatic post-harvest characteristics of the pineapple cultivar Smooth Cayenne were evaluated after the fruits were gamma-irradiated with doses of 100 and 150 Gy and the fruits were stored for 10, 20 and 30 days at 12 deg C (±1) and relative humidity of 85% (±5). Physiological and enzymatic analyses were made for each storage period to evaluate the alterations resulting from the application of ionizing radiation. Control specimens showed higher values of soluble pectins, total pectins, reducing sugars, sucrose and total sugars and lower values of polyphenyloxidase and polygalacturonase enzyme activities. All the analyses indicated that storage time is a significantly influencing factor. The 100 Gy dosage and 20-day storage period presented the best results from the standpoint of maturation and conservation of the fruits quality. (author)

  10. SVM models for analysing the headstreams of mine water inrush

    Energy Technology Data Exchange (ETDEWEB)

    Yan Zhi-gang; Du Pei-jun; Guo Da-zhi [China University of Science and Technology, Xuzhou (China). School of Environmental Science and Spatial Informatics

    2007-08-15

    The support vector machine (SVM) model was introduced to analyse the headstream of water inrush in a coal mine. The SVM model, based on a hydrogeochemical method, was constructed for recognising two kinds of headstreams, and the H-SVMs model was constructed for recognising multiple headstreams. The SVM method was applied to analyse the conditions of two mixed headstreams, and the value of the SVM decision function was investigated as a means of denoting the hydrogeochemical abnormality. The experimental results show that the SVM is based on a strict mathematical theory, has a simple structure and gives good overall performance. Moreover, the parameter W in the decision function can describe the weights of the discrimination indices of the headstream of water inrush. The value of the decision function can denote hydrogeochemical abnormality, which is significant in the prevention of water inrush in a coal mine. 9 refs., 1 fig., 7 tabs.
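
    As a generic sketch of the workflow the abstract describes (made-up ion-concentration features and labels, not the authors' data or parameters), the fragment below standardises hydrogeochemical indices and fits a linear-kernel SVM with scikit-learn; the fitted decision function plays the role of the discriminant whose weights and value are discussed above.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical training samples: ion concentrations (e.g. Na+, Ca2+, Cl-, SO4 2-, HCO3-)
    # for water samples whose source aquifer is known (0 = roof sandstone, 1 = floor limestone).
    X = np.array([
        [120.0,  80.0, 150.0, 300.0, 200.0],
        [115.0,  85.0, 140.0, 310.0, 195.0],
        [ 40.0, 160.0,  60.0, 900.0, 120.0],
        [ 45.0, 155.0,  65.0, 880.0, 125.0],
        [118.0,  78.0, 148.0, 305.0, 210.0],
        [ 42.0, 162.0,  58.0, 910.0, 118.0],
    ])
    y = np.array([0, 0, 1, 1, 0, 1])

    # Standardise the indices, then fit a linear-kernel SVM.
    model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    model.fit(X, y)

    sample = np.array([[50.0, 150.0, 70.0, 870.0, 130.0]])  # unknown inrush water
    print("predicted source:", model.predict(sample)[0])
    print("decision value  :", model.decision_function(sample)[0])
    ```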

  11. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there are currently no acceptance criteria for this type of analysis approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences between the two analysis techniques.

  12. Remembering Operación Triunfo: a Latin Music Reality Show in the Era of Talent Shows

    NARCIS (Netherlands)

    Savini, Paola

    2016-01-01

    The music format Operación Triunfo (2001–2011), which aired on RTVE for the first time in 2001, started as a television (TV) and musical success in Spain and today is one of the most famous shows around the world as well as an incredible socio-economic phenomenon in Spanish TV. This paper

  13. [Analyses of deaths can provide meaningful learning].

    Science.gov (United States)

    Jensen, Marie Rosenørn Hviid; Jørsboe, Hanne Blæhr

    2016-05-16

    Learning based on deceased patients has provided medicine with substantial knowledge and is still a source of new information. The basic learning approach has been autopsies, but focus has shifted towards analysis of registry data. This article evaluates different ways to analyse natural deaths, including autopsies, audits, clinical databases and hospital standardised mortality ratios, with regard to clinical learning. We claim that data-powered analysis cannot stand alone, and recommend that clinicians organise multidisciplinary, theoretically based audits in order to keep learning from the deceased.

  14. Fully Coupled FE Analyses of Buried Structures

    Directory of Open Access Journals (Sweden)

    James T. Baylot

    1994-01-01

    Full Text Available Current procedures for determining the response of buried structures to the effects of the detonation of buried high explosives recommend decoupling the free-field stress analysis from the structure response analysis. A fully coupled (explosive–soil–structure) finite element analysis procedure was developed so that the accuracies of current decoupling procedures could be evaluated. Comparisons of the results of analyses performed using this procedure with scale-model experiments indicate that this finite element procedure can be used to effectively evaluate the accuracies of the methods currently being used to decouple the free-field stress analysis from the structure response analysis.

  15. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases we employed the residuals of the integral equation or of the differential equations as the objective function for the training process of the neural network. This differs from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and the methods are considered promising for some kinds of problems. (author)
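
    The distinguishing idea is that the training objective is the residual of the governing equation at collocation points rather than a sum of squared output errors. The sketch below applies that idea to du/dx = -u with u(0) = 1, using a polynomial trial function in place of a multi-layer network so that the minimisation reduces to a least-squares solve; it illustrates the residual objective only, not the authors' network or their tomography application.

    ```python
    import numpy as np

    # Solve du/dx = -u, u(0) = 1 on [0, 1] by minimising the squared ODE residual
    # at collocation points, with the trial function
    #   u(x) = 1 + a_1 x + a_2 x^2 + ... + a_d x^d   (u(0) = 1 is built in).
    # The residual r(x) = u'(x) + u(x) is linear in the coefficients, so the
    # "training" step is a least-squares solve; a neural-network trial function
    # would instead require an iterative optimiser on the same residual objective.

    x = np.linspace(0.0, 1.0, 50)
    degree = 6

    # Column k of A is d/dx[x^(k+1)] + x^(k+1), the part of the residual
    # multiplying coefficient a_k; the constant trial term contributes +1.
    A = np.column_stack([(k + 1) * x**k + x**(k + 1) for k in range(degree)])
    b = -np.ones_like(x)

    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

    def u(xv):
        """Trial solution u(x) = 1 + sum_k a_k x^(k+1)."""
        return 1.0 + sum(coeffs[k] * xv ** (k + 1) for k in range(degree))

    print("u(1.0) ~", float(u(1.0)), "   exact exp(-1) ~", float(np.exp(-1.0)))
    ```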

  16. ORNL analyses of AVR performance and safety

    Energy Technology Data Exchange (ETDEWEB)

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal.

  17. ORNL analyses of AVR performance and safety

    International Nuclear Information System (INIS)

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal

  18. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  19. Patients with polymyositis show changes in muscle protein charges

    DEFF Research Database (Denmark)

    Bartels, E M; Jacobsen, Søren; Rasmussen, L

    1989-01-01

    Polymyositis (PM) appears with indolent proximal muscle weakness and is an inflammatory disease with breakdown of muscle cells. In our study the protein charge concentrations of the contractile proteins in the A and I bands were determined, applying a microelectrode technique. Patients with PM show...... a lower protein charge concentration than healthy control subjects which may be caused by the breakdown and removal of the proteins in the contractile filaments. A tool to judge the state of the disease as well as an aid in diagnosis may have been found in this method....

  20. Evaluating social marketing: lessons from ShowCase.

    Science.gov (United States)

    Christopoulos, Alex; Reynolds, Lucy

    2009-11-01

    In April 2009, the National Social Marketing Centre launched its new case study resource, ShowCase: a collection of 40 best practice social marketing programmes, predominantly from the UK. The process of collecting and researching these case studies has provided a unique opportunity to look at the current state of 'evaluation' within the field of social marketing. This paper shares some of the observations made during the review, exploring common challenges faced when evaluating social marketing. It also provides recommendations for improving this process to guide future social marketing delivery.