WorldWideScience

Sample records for maximum lod score

  1. The power and robustness of maximum LOD score statistics.

    Science.gov (United States)

    Yoo, Y J; Mendell, N R

    2008-07-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores for fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.

  2. Percentiles of the null distribution of 2 maximum lod score tests.

    Science.gov (United States)

    Ulgen, Ayse; Yoo, Yun Joo; Gordon, Derek; Finch, Stephen J; Mendell, Nancy R

    2004-01-01

    We here consider the null distribution of the maximum lod score (LOD-M) obtained upon maximizing over transmission model parameters (penetrance values, dominance, and allele frequency) as well as the recombination fraction. Also considered is the lod score maximized over a fixed choice of genetic model parameters and recombination-fraction values set prior to the analysis (MMLS) as proposed by Hodge et al. The objective is to fit parametric distributions to MMLS and LOD-M. Our results are based on 3,600 simulations of samples of n = 100 nuclear families ascertained for having one affected member and at least one other sibling available for linkage analysis. Each null distribution is approximately a mixture pχ²(0) + (1 - p)χ²(v). The values of MMLS appear to fit the mixture 0.20χ²(0) + 0.80χ²(1.6). The mixture distribution 0.13χ²(0) + 0.87χ²(2.8) appears to describe the null distribution of LOD-M. From these results we derive a simple method for obtaining critical values of LOD-M and MMLS. Copyright 2004 S. Karger AG, Basel
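
    The critical-value recipe implied by these fitted mixtures can be sketched in a few lines of Python. The sketch assumes the usual relation χ² = 2 ln(10) × lod and uses the mixture weights and fractional degrees of freedom quoted in the abstract purely as illustrative inputs (scipy's chi-square quantile accepts non-integer df).

      from math import log
      from scipy.stats import chi2

      def lod_critical_value(p0, df, alpha=0.05):
          # Solve (1 - p0) * P(chi2_df > 2*ln(10)*c) = alpha for the critical lod
          # value c; the point mass at zero contributes nothing for c > 0.
          q = chi2.ppf(1 - alpha / (1 - p0), df)
          return q / (2 * log(10))

      print(f"MMLS 5% critical value: {lod_critical_value(0.20, 1.6):.2f}")
      print(f"LOD-M 5% critical value: {lod_critical_value(0.13, 2.8):.2f}")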

  3. The lod score method.

    Science.gov (United States)

    Rice, J P; Saccone, N L; Corbett, J

    2001-01-01

    The lod score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential, so that pedigrees or lod curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders, where the maximum lod score (MLS) statistic shares some of the advantages of the traditional lod score approach but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the lod score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.

  4. Comparison of empirical strategies to maximize GENEHUNTER lod scores.

    Science.gov (United States)

    Chen, C H; Finch, S J; Mendell, N R; Gordon, D

    1999-01-01

    We compare four strategies for finding the settings of genetic parameters that maximize the lod scores reported in GENEHUNTER 1.2. The four strategies are iterated complete factorial designs, iterated orthogonal Latin hypercubes, evolutionary operation, and numerical optimization. The genetic parameters that are set are the phenocopy rate, penetrance, and disease allele frequency; both recessive and dominant models are considered. We selected the optimization of a recessive model on the Collaborative Study on the Genetics of Alcoholism (COGA) data of chromosome 1 for complete analysis. Convergence to a setting producing a local maximum required the evaluation of over 100 settings (for a time budget of 800 minutes on a Pentium II 300 MHz PC). Two notable local maxima were detected, suggesting the need for a more extensive search before claiming that a global maximum had been found. The orthogonal Latin hypercube design was the best strategy for finding areas that produced high lod scores with small numbers of evaluations. Numerical optimization starting from a region producing high lod scores was the strategy that found the highest maximum observed.
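
    The simplest of the four strategies above, a complete factorial (grid) search over the genetic parameters, can be sketched as follows. The function toy_lod is a made-up stand-in for a real pedigree lod-score routine (for example, one that wraps runs of a linkage program such as GENEHUNTER); only the grid-search scaffolding is the point of the example.

      import itertools

      def toy_lod(phenocopy, penetrance, freq, theta):
          # Arbitrary smooth stand-in so the example runs end to end; replace with
          # a call to a real linkage program to reproduce the strategy.
          return (3.0 - 8 * (theta - 0.1) ** 2 - 2 * (penetrance - 0.8) ** 2
                  - 5 * phenocopy - (freq - 0.05) ** 2)

      phenocopies = [0.0, 0.01, 0.05]
      penetrances = [0.5, 0.8, 1.0]
      freqs = [0.01, 0.1]
      thetas = [i / 100 for i in range(0, 50, 5)]

      best_lod, best_setting = max(
          (toy_lod(pc, pen, q, th), (pc, pen, q, th))
          for pc, pen, q, th in itertools.product(phenocopies, penetrances, freqs, thetas)
      )
      print(f"best lod {best_lod:.2f} at setting {best_setting}")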

  5. Lod scores for gene mapping in the presence of marker map uncertainty.

    Science.gov (United States)

    Stringham, H M; Boehnke, M

    2001-07-01

    Multipoint lod scores are typically calculated for a grid of locus positions, moving the putative disease locus across a fixed map of genetic markers. Changing the order of a set of markers and/or the distances between the markers can make a substantial difference in the resulting lod score curve and the location and height of its maximum. The typical approach of using the best maximum likelihood marker map is not easily justified if other marker orders are nearly as likely and give substantially different lod score curves. To deal with this problem, we propose three weighted multipoint lod score statistics that make use of information from all plausible marker orders. In each of these statistics, the information conditional on a particular marker order is included in a weighted sum, with weight equal to the posterior probability of that order. We evaluate the type 1 error rate and power of these three statistics on the basis of results from simulated data, and compare these results to those obtained using the best maximum likelihood map and the map with the true marker order. We find that the lod score based on a weighted sum of maximum likelihoods improves on using only the best maximum likelihood map, having a type 1 error rate and power closest to that of using the true marker order in the simulation scenarios we considered. Copyright 2001 Wiley-Liss, Inc.

  6. Extension of the lod score: the mod score.

    Science.gov (United States)

    Clerget-Darpoux, F

    2001-01-01

    In 1955 Morton proposed the lod score method both for testing linkage between loci and for estimating the recombination fraction between them. If a disease is controlled by a gene at one of these loci, the lod score computation requires the prior specification of an underlying model that assigns the probabilities of genotypes from the observed phenotypes. To address the case of linkage studies for diseases with unknown mode of inheritance, we suggested (Clerget-Darpoux et al., 1986) extending the lod score function to a so-called mod score function. In this function, the variables are both the recombination fraction and the disease model parameters. Maximizing the mod score function over all these parameters amounts to maximizing the probability of marker data conditional on the disease status. Under the absence of linkage, the mod score conforms to a chi-square distribution, with extra degrees of freedom in comparison to the lod score function (MacLean et al., 1993). The mod score is asymptotically maximum for the true disease model (Clerget-Darpoux and Bonaïti-Pellié, 1992; Hodge and Elston, 1994). Consequently, the power to detect linkage through mod score will be highest when the space of models where the maximization is performed includes the true model. On the other hand, one must avoid overparametrization of the model space. For example, when the approach is applied to affected sibpairs, only two constrained disease model parameters should be used (Knapp et al., 1994) for the mod score maximization. It is also important to emphasize the existence of a strong correlation between the disease gene location and the disease model. Consequently, there is poor resolution of the location of the susceptibility locus when the disease model at this locus is unknown. Of course, this is true regardless of the statistics used. The mod score may also be applied in a candidate gene strategy to model the potential effect of this gene in the disease. Since, however, it

  7. Posterior probability of linkage and maximal lod score.

    Science.gov (United States)

    Génin, E; Martinez, M; Clerget-Darpoux, F

    1995-01-01

    To detect linkage between a trait and a marker, Morton (1955) proposed to calculate the lod score z(theta 1) at a given value theta 1 of the recombination fraction. If z(theta 1) reaches +3 then linkage is concluded. However, in practice, lod scores are calculated for different values of the recombination fraction between 0 and 0.5 and the test is based on the maximum value of the lod score Zmax. The impact of this deviation of the test on the probability that in fact linkage does not exist, when linkage was concluded, is documented here. This posterior probability of no linkage can be derived by using Bayes' theorem. It is less than 5% when the lod score at a predetermined theta 1 is used for the test. But, for a Zmax of +3, we showed that it can reach 16.4%. Thus, considering a composite alternative hypothesis instead of a single one decreases the reliability of the test. The reliability decreases rapidly when Zmax is less than +3. Given a Zmax of +2.5, there is a 33% chance that linkage does not exist. Moreover, the posterior probability depends not only on the value of Zmax but also jointly on the family structures and on the genetic model. For a given Zmax, the chance that linkage exists may then vary.
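
    The fixed-theta figure quoted above follows from a one-line application of Bayes' theorem. The prior probability of linkage of about 0.02 (roughly 1:50 prior odds, a value traditionally used for two autosomal loci) is an assumption of this sketch, not a number taken from the abstract.

      prior_linkage = 0.02                  # assumed prior probability of linkage
      lod = 3.0
      lr = 10 ** lod                        # likelihood ratio implied by the lod score
      posterior_no_linkage = (1 - prior_linkage) / ((1 - prior_linkage) + lr * prior_linkage)
      print(f"posterior probability of no linkage: {posterior_no_linkage:.3f}")
      # about 0.047, i.e. just under 5%; the abstract's point is that when Zmax is
      # obtained by maximizing over theta this simple calculation no longer applies,
      # and the posterior probability of no linkage can be considerably larger.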

  8. Lod score curves for phase-unknown matings.

    Science.gov (United States)

    Hulbert-Shearon, T; Boehnke, M; Lange, K

    1996-01-01

    For a phase-unknown nuclear family, we show that the likelihood and lod score are unimodal, and we describe conditions under which the maximum occurs at recombination fraction theta = 0, theta = 1/2, and 0 < theta < 1/2. These simply stated necessary and sufficient conditions seem to have escaped the notice of previous statistical geneticists.
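
    A minimal sketch of the lod score in question, assuming a fully penetrant dominant trait, a fully informative marker, and a phase-unknown parent with n scorable children of whom r are recombinant under one phase (and n - r under the other):

      from math import log10

      def phase_unknown_lod(r, n, theta):
          # The likelihood averages the two equally likely parental phases; the
          # null likelihood at theta = 1/2 is (1/2)**n.
          l_theta = 0.5 * (theta ** r * (1 - theta) ** (n - r)
                           + theta ** (n - r) * (1 - theta) ** r)
          return log10(l_theta / 0.5 ** n)

      # With no apparent recombinants (r = 0 of n = 5) the curve rises monotonically
      # as theta decreases, so the maximum sits at the boundary theta = 0.
      for theta in (0.01, 0.05, 0.1, 0.2, 0.3, 0.4, 0.49):
          print(f"theta={theta:.2f}  z={phase_unknown_lod(0, 5, theta):.3f}")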

  9. The power to detect linkage in complex disease by means of simple LOD-score analyses.

    Science.gov (United States)

    Greenberg, D A; Abreu, P; Hodge, S E

    1998-09-01

    Maximum-likelihood analysis (via LOD score) provides the most powerful method for finding linkage when the mode of inheritance (MOI) is known. However, because one must assume an MOI, the application of LOD-score analysis to complex disease has been questioned. Although it is known that one can legitimately maximize the maximum LOD score with respect to genetic parameters, this approach raises three concerns: (1) multiple testing, (2) effect on power to detect linkage, and (3) adequacy of the approximate MOI for the true MOI. We evaluated the power of LOD scores to detect linkage when the true MOI was complex but a LOD score analysis assumed simple models. We simulated data from 14 different genetic models, including dominant and recessive at high (80%) and low (20%) penetrances, intermediate models, and several additive two-locus models. We calculated LOD scores by assuming two simple models, dominant and recessive, each with 50% penetrance, then took the higher of the two LOD scores as the raw test statistic and corrected for multiple tests. We call this test statistic "MMLS-C." We found that the ELODs for MMLS-C are >=80% of the ELOD under the true model when the ELOD for the true model is >=3. Similarly, the power to reach a given LOD score was usually >=80% that of the true model, when the power under the true model was >=60%. These results underscore that a critical factor in LOD-score analysis is the MOI at the linked locus, not that of the disease or trait per se. Thus, a limited set of simple genetic models in LOD-score analysis can work well in testing for linkage.

  10. Sensitivity of lod scores to changes in diagnostic status.

    Science.gov (United States)

    Hodge, S E; Greenberg, D A

    1992-05-01

    This paper investigates effects on lod scores when one individual in a data set changes diagnostic or recombinant status. First we examine the situation in which a single offspring in a nuclear family changes status. The nuclear-family situation, in addition to being of interest in its own right, also has general theoretical importance, since nuclear families are "transparent"; that is, one can track genetic events more precisely in nuclear families than in complex pedigrees. We demonstrate that in nuclear families log10 [(1-theta)/theta] gives an upper limit on the impact that a single offspring's change in status can have on the lod score at that recombination fraction (theta). These limits hold for a fully penetrant dominant condition and fully informative marker, in either phase-known or phase-unknown matings. Moreover, log10 [(1-theta-hat)/theta-hat] (where theta-hat denotes the value of theta at which Zmax occurs) gives an upper limit on the impact of a single offspring's status change on the maximum lod score (Zmax). In extended pedigrees, in contrast to nuclear families, no comparable limit can be set on the impact of a single individual on the lod score. Complex pedigrees are subject to both stabilizing and destabilizing influences, and these are described. Finally, we describe a "sensitivity analysis," in which, after all linkage analysis is completed, every informative individual in the data set is changed, one at a time, to see the effect which each separate change has on the lod scores. The procedure includes identifying "critical individuals," i.e., those who would have the greatest impact on the lod scores, should their diagnostic status in fact change. To illustrate use of the sensitivity analysis, we apply it to the large bipolar pedigree reported by Egeland et al. and Kelsoe et al. We show that the changes in lod scores observed there, on the order of 1.1-1.2 per person, are not unusual. We recommend that investigators include a sensitivity analysis as a
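
    The bound can be evaluated directly; for example:

      from math import log10

      for theta in (0.01, 0.05, 0.1, 0.2, 0.3):
          print(f"theta={theta:.2f}  max single-offspring impact = {log10((1 - theta) / theta):.2f}")
      # e.g. at theta = 0.05 the bound is log10(19), about 1.28 lod units.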

  11. Serial evaluation of the MODS, SOFA and LOD scores to predict ICU mortality in mixed critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-09-01

    To perform a serial assessment and compare the ability of the multiple organ dysfunction score (MODS), sequential organ failure assessment (SOFA) and logistic organ dysfunction (LOD) score to predict intensive care unit (ICU) mortality. The data were collected prospectively on consecutive ICU admissions over a 24-month period at a tertiary referral university hospital. The MODS, SOFA, and LOD scores were calculated at admission and repeated every 24 hrs. Two thousand and fifty-four patients were enrolled in the present study. The maximum and delta scores of all the organ dysfunction scores correlated with ICU mortality. The maximum score of each model had better ability to predict ICU mortality than the initial or delta score. The areas under the receiver operating characteristic curve (AUC) for the maximum scores were 0.892 for the MODS, 0.907 for the SOFA, and 0.92 for the LOD. No statistical difference existed between the maximum scores and the Acute Physiology and Chronic Health Evaluation II (APACHE II) score. Serial assessment of organ dysfunction during the ICU stay is reliable for predicting ICU mortality. The maximum score provides the best discrimination, comparable with the APACHE II score, in predicting ICU mortality.
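
    As a small illustration of the summaries compared above, with made-up daily LOD scores for one patient and taking the delta score to be the maximum minus the admission value (one common definition, assumed here rather than taken from the study):

      daily_lod = [5, 7, 9, 8, 6]          # made-up daily LOD scores for one ICU stay
      initial_score = daily_lod[0]
      maximum_score = max(daily_lod)
      delta_score = maximum_score - initial_score
      print(initial_score, maximum_score, delta_score)   # 5 9 4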

  12. Effect of misspecification of gene frequency on the two-point LOD score.

    Science.gov (United States)

    Pal, D K; Durner, M; Greenberg, D A

    2001-11-01

    In this study, we used computer simulation of simple and complex models to ask: (1) What is the penalty in evidence for linkage when the assumed gene frequency is far from the true gene frequency? (2) If the assumed model for gene frequency and inheritance are misspecified in the analysis, can this lead to a higher maximum LOD score than that obtained under the true parameters? Linkage data simulated under simple dominant, recessive, dominant and recessive with reduced penetrance, and additive models, were analysed assuming a single locus with both the correct and incorrect dominance model and assuming a range of different gene frequencies. We found that misspecifying the analysis gene frequency led to little penalty in maximum LOD score in all models examined, especially if the assumed gene frequency was lower than the generating one. Analysing linkage data assuming a gene frequency of the order of 0.01 for a dominant gene, and 0.1 for a recessive gene, appears to be a reasonable tactic in the majority of realistic situations because underestimating the gene frequency, even when the true gene frequency is high, leads to little penalty in the LOD score.

  13. Direct power comparisons between simple LOD scores and NPL scores for linkage analysis in complex diseases.

    Science.gov (United States)

    Abreu, P C; Greenberg, D A; Hodge, S E

    1999-09-01

    Several methods have been proposed for linkage analysis of complex traits with unknown mode of inheritance. These methods include the LOD score maximized over disease models (MMLS) and the "nonparametric" linkage (NPL) statistic. In previous work, we evaluated the increase of type I error when maximizing over two or more genetic models, and we compared the power of MMLS to detect linkage, in a number of complex modes of inheritance, with analysis assuming the true model. In the present study, we compare MMLS and NPL directly. We simulated 100 data sets with 20 families each, using 26 generating models: (1) 4 intermediate models (penetrance of heterozygote between that of the two homozygotes); (2) 6 two-locus additive models; and (3) 16 two-locus heterogeneity models (admixture alpha = 1.0, .7, .5, and .3; alpha = 1.0 replicates simple Mendelian models). For LOD scores, we assumed dominant and recessive inheritance with 50% penetrance. We took the higher of the two maximum LOD scores and subtracted 0.3 to correct for multiple tests (MMLS-C). We compared expected maximum LOD scores and power, using MMLS-C and NPL as well as the true model. Since NPL uses only the affected family members, we also performed an affecteds-only analysis using MMLS-C. The MMLS-C was uniformly more powerful than NPL for most cases we examined, except when linkage information was low, and it was close to the results for the true model under locus heterogeneity. We still found better power for the MMLS-C compared with NPL in affecteds-only analysis. The results show that use of two simple modes of inheritance at a fixed penetrance can have more power than NPL when the trait mode of inheritance is complex and when there is heterogeneity in the data set.
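
    The MMLS-C construction described here is simple enough to state as code. The two input lod scores stand for the output of a standard lod-score analysis under the fixed dominant and recessive models with 50% penetrance; the 0.3 subtraction is the multiple-testing correction quoted in the abstract.

      def mmls_c(lod_dominant, lod_recessive, correction=0.3):
          # Higher of the two fixed-model maximum lod scores, minus the correction
          # for having performed two tests.
          return max(lod_dominant, lod_recessive) - correction

      print(f"{mmls_c(2.9, 1.4):.1f}")   # 2.6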

  14. Effect of heterogeneity and assumed mode of inheritance on lod scores.

    Science.gov (United States)

    Durner, M; Greenberg, D A

    1992-02-01

    Heterogeneity is a major factor in many common, complex diseases and can confound linkage analysis. Using computer-simulated heterogeneous data we tested what effect unlinked families have on a linkage analysis when heterogeneity is not taken into account. We created 60 data sets of 40 nuclear families each with different proportions of linked and unlinked families and with different modes of inheritance. The ascertainment probability was 0.05, the disease had a penetrance of 0.6, and the recombination fraction for the linked families was zero. For the analysis we used a variety of assumed modes of inheritance and penetrances. Under these conditions we looked at the effect of the unlinked families on the lod score, the evaluation of the mode of inheritance, and the estimate of penetrance and of the recombination fraction in the linked families. 1. When the analysis was done under the correct mode of inheritance for the linked families, we found that the mode of inheritance of the unlinked families had minimal influence on the highest maximum lod score (MMLS) (i.e., we maximized the maximum lod score with respect to penetrance). Adding sporadic families decreased the MMLS less than adding recessive or dominant unlinked families. 2. The mixtures of dominant linked families with unlinked families always led to a higher MMLS when analyzed under the correct (dominant) mode of inheritance than when analyzed under the incorrect mode of inheritance. In the mixtures with recessive linked families, assuming the correct mode of inheritance generally led to a higher MMLS, but we observed broad variation.(ABSTRACT TRUNCATED AT 250 WORDS)

  15. Major strengths and weaknesses of the lod score method.

    Science.gov (United States)

    Ott, J

    2001-01-01

    Strengths and weaknesses of the lod score method for human genetic linkage analysis are discussed. The main weakness is its requirement for the specification of a detailed inheritance model for the trait. Various strengths are identified. For example, the lod score (likelihood) method has optimality properties when the trait to be studied is known to follow a Mendelian mode of inheritance. The ELOD is a useful measure for information content of the data. The lod score method can emulate various "nonparametric" methods, and this emulation is equivalent to the nonparametric methods. Finally, the possibility of building errors into the analysis will prove to be essential for the large amount of linkage and disequilibrium data expected in the near future.

  16. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance bounds to zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  17. Pseudoautosomal region in schizophrenia: linkage analysis of seven loci by sib-pair and lod-score methods.

    Science.gov (United States)

    d'Amato, T; Waksman, G; Martinez, M; Laurent, C; Gorwood, P; Campion, D; Jay, M; Petit, C; Savoye, C; Bastard, C

    1994-05-01

    In a previous study, we reported a nonrandom segregation between schizophrenia and the pseudoautosomal locus DXYS14 in a sample of 33 sibships. That study has been extended by the addition of 16 new sibships from 16 different families. Data from six other loci of the pseudoautosomal region and of the immediately adjacent part of the X specific region have also been analyzed. Two methods of linkage analysis were used: the affected sibling pair (ASP) method and the lod-score method. Lod-score analyses were performed on the basis of three different models--A, B, and C--all shown to be consistent with the epidemiological data on schizophrenia. No clear evidence for linkage was obtained with any of these models. However, whatever the genetic model and the disease classification, maximum lod scores were positive with most of the markers, with the highest scores generally being obtained for the DXYS14 locus. When the ASP method was used, the earlier finding of nonrandom segregation between schizophrenia and the DXYS14 locus was still supported in this larger data set, at an increased level of statistical significance. Findings of ASP analyses were not significant for the other loci. Thus, findings obtained from analyses using the ASP method, but not the lod-score method, were consistent with the pseudoautosomal hypothesis for schizophrenia.

  18. The score statistic of the LD-lod analysis: detecting linkage adaptive to linkage disequilibrium.

    Science.gov (United States)

    Huang, J; Jiang, Y

    2001-01-01

    We study the properties of a modified lod score method for testing linkage that incorporates linkage disequilibrium (LD-lod). By examination of its score statistic, we show that the LD-lod score method adaptively combines two sources of information: (a) the IBD sharing score which is informative for linkage regardless of the existence of LD and (b) the contrast between allele-specific IBD sharing scores which is informative for linkage only in the presence of LD. We also consider the connection between the LD-lod score method and the transmission-disequilibrium test (TDT) for triad data and the mean test for affected sib pair (ASP) data. We show that, for triad data, the recessive LD-lod test is asymptotically equivalent to the TDT; and for ASP data, it is an adaptive combination of the TDT and the ASP mean test. We demonstrate that the LD-lod score method has relatively good statistical efficiency in comparison with the ASP mean test and the TDT for a broad range of LD and the genetic models considered in this report. Therefore, the LD-lod score method is an interesting approach for detecting linkage when the extent of LD is unknown, such as in a genome-wide screen with a dense set of genetic markers. Copyright 2001 S. Karger AG, Basel
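
    For reference, the TDT that the LD-lod score is related to above takes a particularly simple form: among heterozygous parents of affected children, count transmissions (b) and non-transmissions (c) of the allele of interest and compare (b - c)²/(b + c) with a 1-df chi-square. The counts below are illustrative only.

      from scipy.stats import chi2

      b, c = 62, 38                       # made-up transmission counts
      tdt = (b - c) ** 2 / (b + c)
      print(f"TDT = {tdt:.2f}, p = {chi2.sf(tdt, df=1):.4f}")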

  19. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    Science.gov (United States)

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
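
    A minimal sketch of the standard interval-mapping lod score described above, at a single fixed position, using simulated data. The per-individual genotype probabilities and phenotypes are made up; the point is only that a two-component normal mixture is compared against a single normal by a log10 likelihood ratio, and therefore can never fit worse, which is the source of the spurious peaks the paper addresses.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      n = 200
      p_aa = rng.uniform(0, 1, n)          # P(genotype AA | flanking markers) at this position
      y = rng.normal(0.0, 1.0, n)          # phenotype simulated with no QTL effect

      def neg_loglik_mixture(params):
          mu_aa, mu_ab, sd = params[0], params[1], abs(params[2]) + 1e-6
          dens = p_aa * norm.pdf(y, mu_aa, sd) + (1 - p_aa) * norm.pdf(y, mu_ab, sd)
          return -np.sum(np.log(dens))

      def neg_loglik_single(params):
          mu, sd = params[0], abs(params[1]) + 1e-6
          return -np.sum(norm.logpdf(y, mu, sd))

      fit1 = minimize(neg_loglik_mixture, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
      fit0 = minimize(neg_loglik_single, x0=[0.0, 1.0], method="Nelder-Mead")
      lod = (fit0.fun - fit1.fun) / np.log(10)
      print(f"LOD = {lod:.3f}")   # small under the null; the mixture never fits worse at its maximum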

  20. Robust LOD scores for variance component-based linkage analysis.

    Science.gov (United States)

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.

  1. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction score (LOD) is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24-month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). The calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of the model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. The mortality in the ICU was 20.9% and in the hospital was 27.9%. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC for the LOD and APACHE II were 0.860 [95% confidence interval (CI) = 0.838-0.882] and 0.898 (95% CI = 0.879-0.917), respectively. The LOD score had perfect calibration with the Hosmer-Lemeshow goodness-of-fit H chi-2 = 10 (p = 0.44). However, the APACHE II had poor calibration with the Hosmer-Lemeshow goodness-of-fit H chi-2 = 75.69 (p < 0.001). The Brier score showed that the overall fit was 0.123 (95% CI = 0.107-0.141) for the LOD and 0.114 (95% CI = 0.098-0.132) for the APACHE II. Thus, the LOD score was found to be accurate for predicting hospital mortality for general critically ill patients in Thailand.
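
    The three performance measures used above can be sketched as follows, assuming numpy, scipy and scikit-learn are available. The outcomes and predicted probabilities are simulated stand-ins, not data from the study; the Hosmer-Lemeshow statistic is computed over deciles of predicted risk with the usual g - 2 degrees of freedom.

      import numpy as np
      from scipy.stats import chi2
      from sklearn.metrics import roc_auc_score, brier_score_loss

      rng = np.random.default_rng(0)
      y = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 0] * 20)                     # made-up outcomes
      p = np.clip(y * 0.5 + 0.2 + rng.normal(0, 0.1, y.size), 0.01, 0.99)   # predicted risks

      print("AUROC:", roc_auc_score(y, p))                                  # discrimination
      print("Brier:", brier_score_loss(y, p))                               # overall fit

      # Hosmer-Lemeshow goodness of fit: compare observed and expected events
      # within deciles of predicted risk.
      groups = np.array_split(np.argsort(p), 10)
      hl = sum((y[g].sum() - p[g].sum()) ** 2 / (p[g].sum() * (1 - p[g].mean()))
               for g in groups)
      print("Hosmer-Lemeshow chi2:", hl, "p =", chi2.sf(hl, df=8))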

  2. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.
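
    The exclusion rule itself is a thresholded likelihood ratio. In the sketch below, the two log-likelihoods are hypothetical inputs standing for the fit of the data under the candidate QTL with the specified heritability and under the model with no effect at that locus; they are not the output of any particular program.

      from math import log

      def exclusion_lod(loglik_specified_effect, loglik_no_effect):
          # lod = log10 likelihood ratio of the specified-effect model to the no-effect model
          return (loglik_specified_effect - loglik_no_effect) / log(10)

      lod = exclusion_lod(-1052.7, -1046.9)     # made-up log-likelihoods
      print(f"lod = {lod:.2f}, excluded: {lod <= -2.0}")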

  3. Using lod-score differences to determine mode of inheritance: a simple, robust method even in the presence of heterogeneity and reduced penetrance.

    Science.gov (United States)

    Greenberg, D A; Berger, B

    1994-10-01

    Determining the mode of inheritance is often difficult under the best of circumstances, but when segregation analysis is used, the problems of ambiguous ascertainment procedures, reduced penetrance, heterogeneity, and misdiagnosis make mode-of-inheritance determinations even more unreliable. The mode of inheritance can also be determined using a linkage-based method (maximized maximum lod score or mod score) and association-based methods, which can overcome many of these problems. In this work, we determined how much information is necessary to reliably determine the mode of inheritance from linkage data when heterogeneity and reduced penetrance are present in the data set. We generated data sets under both dominant and recessive inheritance with reduced penetrance and with varying fractions of linked and unlinked families. We then analyzed those data sets, assuming reduced penetrance, both dominant and recessive inheritance, and no heterogeneity. We investigated the reliability of two methods for determining the mode of inheritance from the linkage data. The first method examined the difference (delta) between the maximum lod scores calculated under the two mode-of-inheritance assumptions. We found that if delta was > 1.5, then the higher of the two maximum lod scores reflected the correct mode of inheritance with high reliability and that a delta of 2.5 appeared to practically guarantee a correct mode-of-inheritance inference. Furthermore, this reliability appeared to be virtually independent of alpha, the fraction of linked families in the data set, although the reliability decreased slightly as alpha fell below .50.(ABSTRACT TRUNCATED AT 250 WORDS)

  4. Multilocus lod scores in large pedigrees: combination of exact and approximate calculations.

    Science.gov (United States)

    Tong, Liping; Thompson, Elizabeth

    2008-01-01

    To detect the positions of disease loci, lod scores are calculated at multiple chromosomal positions given trait and marker data on members of pedigrees. Exact lod score calculations are often impossible when the size of the pedigree and the number of markers are both large. In this case, a Markov chain Monte Carlo (MCMC) approach provides an approximation. However, to provide accurate results, mixing performance is always a key issue in these MCMC methods. In this paper, we propose two methods to improve MCMC sampling and hence obtain more accurate lod score estimates in shorter computation time. The first improvement generalizes the block-Gibbs meiosis (M) sampler to a multiple-meiosis (MM) sampler in which multiple meioses are updated jointly, across all loci. The second divides the computations on a large pedigree into several parts by conditioning on the haplotypes of some 'key' individuals. We perform exact calculations for the descendant parts, where more data are often available, and combine this information with sampling of the hidden variables in the ancestral parts. Our approaches are expected to be most useful for data on a large pedigree with a lot of missing data. (c) 2007 S. Karger AG, Basel

  5. Conclusions of LOD-score analysis for family data generated under two-locus models.

    Science.gov (United States)

    Dizier, M H; Babron, M C; Clerget-Darpoux, F

    1996-06-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to search for linkage with genetic markers by the LOD-score method using the MG parameters. We already showed that segregation analysis can lead to evidence for an MG effect for many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase in power to detect linkage. The linkage-homogeneity test among subsamples differing in the familial disease distribution provides evidence of parameter misspecification when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in estimation of the recombination fraction and sometimes also to a rejection of linkage for the true recombination fraction. A final important point is that strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker.

  6. Conclusions of LOD-score analysis for family data generated under two-locus models

    Energy Technology Data Exchange (ETDEWEB)

    Dizier, M.H.; Babron, M.C.; Clerget-Darpoux, F. [Unite de Recherches d'Epidemiologie Genetique, Paris (France)]

    1996-06-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to search for linkage with genetic markers by the LOD-score method using the MG parameters. We already showed that segregation analysis can lead to evidence for an MG effect for many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase in power to detect linkage. The linkage-homogeneity test among subsamples differing in the familial disease distribution provides evidence of parameter misspecification when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in estimation of the recombination fraction and sometimes also to a rejection of linkage for the true recombination fraction. A final important point is that strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker. 17 refs., 3 tabs.

  7. Quantification of type I error probabilities for heterogeneity LOD scores.

    Science.gov (United States)

    Abreu, Paula C; Hodge, Susan E; Greenberg, David A

    2002-02-01

    Locus heterogeneity is a major confounding factor in linkage analysis. When no prior knowledge of linkage exists, and one aims to detect linkage and heterogeneity simultaneously, classical distribution theory of log-likelihood ratios does not hold. Despite some theoretical work on this problem, no generally accepted practical guidelines exist. Nor has anyone rigorously examined the combined effect of testing for linkage and heterogeneity and simultaneously maximizing over two genetic models (dominant, recessive). The effect of linkage phase represents another uninvestigated issue. Using computer simulation, we investigated type I error (P value) of the "admixture" heterogeneity LOD (HLOD) score, i.e., the LOD score maximized over both recombination fraction theta and admixture parameter alpha, and we compared this with the P values when one maximizes only with respect to theta (i.e., the standard LOD score). We generated datasets of phase-known and -unknown nuclear families, sizes k = 2, 4, and 6 children, under fully penetrant autosomal dominant inheritance. We analyzed these datasets (1) assuming a single genetic model, and maximizing the HLOD over theta and alpha; and (2) maximizing the HLOD additionally over two dominance models (dominant vs. recessive), then subtracting a 0.3 correction. For both (1) and (2), P values increased with family size k; rose less for phase-unknown families than for phase-known ones, with the former approaching the latter as k increased; and did not exceed those given by the one-sided mixture distribution ξ = ½χ²₁ + ½χ²₂. Thus, maximizing the HLOD over theta and alpha appears to add considerably less than an additional degree of freedom to the associated χ²₁ distribution. We conclude with practical guidelines for linkage investigators. Copyright 2002 Wiley-Liss, Inc.
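
    The admixture HLOD being maximized above has a compact form: with per-family lod scores z_i(theta), HLOD is the maximum over (theta, alpha) of the sum over families of log10(alpha * 10**z_i(theta) + 1 - alpha). A sketch with a made-up per-family lod grid:

      from math import log10

      family_lods = {                           # theta -> per-family lod scores z_i(theta)
          0.01: [0.8, -0.5, 1.1, 0.3],
          0.05: [0.7, -0.3, 1.0, 0.3],
          0.10: [0.6, -0.2, 0.8, 0.2],
          0.20: [0.4, -0.1, 0.5, 0.1],
      }
      alphas = [a / 20 for a in range(1, 21)]   # admixture proportion alpha in (0, 1]

      hlod, (best_theta, best_alpha) = max(
          (sum(log10(a * 10 ** z + 1 - a) for z in zs), (theta, a))
          for theta, zs in family_lods.items()
          for a in alphas
      )
      print(f"HLOD = {hlod:.3f} at theta = {best_theta:.2f}, alpha = {best_alpha:.2f}")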

  8. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  9. Using lod-score differences to determine mode of inheritance: A simple, robust method even in the presence of heterogeneity and reduced penetrance

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.A.; Berger, B. [Mount Sinai Medical Center, New York, NY (United States)]

    1994-10-01

    Determining the mode of inheritance is often difficult under the best of circumstances, but when segregation analysis is used, the problems of ambiguous ascertainment procedures, reduced penetrance, heterogeneity, and misdiagnosis make mode-of-inheritance determinations even more unreliable. The mode of inheritance can also be determined using a linkage-based method and association-based methods, which can overcome many of these problems. In this work, we determined how much information is necessary to reliably determine the mode of inheritance from linkage data when heterogeneity and reduced penetrance are present in the data set. We generated data sets under both dominant and recessive inheritance with reduced penetrance and with varying fractions of linked and unlinked families. We then analyzed those data sets, assuming reduced penetrance, both dominant and recessive inheritance, and no heterogeneity. We investigated the reliability of two methods for determining the mode of inheritance from the linkage data. The first method examined the difference (delta) between the maximum lod scores calculated under the two mode-of-inheritance assumptions. We found that if delta was >1.5, then the higher of the two maximum lod scores reflected the correct mode of inheritance with high reliability and that a delta of 2.5 appeared to practically guarantee a correct mode-of-inheritance inference. Furthermore, this reliability appeared to be virtually independent of alpha, the fraction of linked families in the data set. The second method we tested was based on choosing the higher of the two maximum lod scores calculated under the different mode-of-inheritance assumptions. This method became unreliable as alpha decreased. These results suggest that the mode of inheritance can be inferred from linkage data with high reliability, even in the presence of heterogeneity and reduced penetrance. 12 refs., 3 figs., 2 tabs.

  10. Accuracy of a composite score using daily SAPS II and LOD scores for predicting hospital mortality in ICU patients hospitalized for more than 72 h.

    Science.gov (United States)

    Timsit, J F; Fosse, J P; Troché, G; De Lassence, A; Alberti, C; Garrouste-Orgeas, M; Azoulay, E; Chevret, S; Moine, P; Cohen, Y

    2001-06-01

    In most databases used to build general severity scores the median duration of intensive care unit (ICU) stay is less than 3 days. Consequently, these scores are not the most appropriate tools for measuring prognosis in studies dealing with ICU patients hospitalized for more than 72 h. To develop a new prognostic model based on a general severity score (SAPS II), an organ dysfunction score (LOD) and evolution of both scores during the first 3 days of ICU stay. Prospective multicenter study. Twenty-eight intensive care units (ICUs) in France. A training data set was created with four ICUs during an 18-month period (893 patients). Seventy percent of the patients (n = 628) were medical, aged 66 years. The median SAPS II was 38. The ICU and hospital mortality rates were 22.7% and 30%, respectively. Forty-seven percent (420 patients) were transferred from hospital wards. In this population, the calibration (Hosmer-Lemeshow chi-square: 37.4, P = 0.001) and the discrimination [area under the ROC curve: 0.744 (95% CI: 0.714-0.773)] of the original SAPS II were relatively poor. A validation data set was created with a random panel of 24 French ICUs during March 1999 (312 patients). The LOD and SAPS II scores were calculated during the first (SAPS1, LOD1), second (SAPS2, LOD2), and third (SAPS3, LOD3) calendar days. The LOD and SAPS score alterations were assigned the value "1" when scores increased with time and "0" otherwise. A multivariable logistic regression model was used to select variables measured during the first three calendar days and independently associated with death. Selected variables were: SAPS II at admission [OR: 1.04 (95% CI: 1.027-1.053) per point], LOD [OR: 1.16 (95% CI: 1.085-1.253) per point], transfer from ward [OR: 1.74 (95% CI: 1.25-2.42)], as well as SAPS3-SAPS2 alterations [OR: 1.516 (95% CI: 1.04-2.22)], and LOD3-LOD2 alterations [OR: 2.00 (95% CI: 1.29-3.11)]. The final model has good calibration and discrimination properties in the

  11. Easy calculations of lod scores and genetic risks on small computers.

    Science.gov (United States)

    Lathrop, G M; Lalouel, J M

    1984-01-01

    A computer program that calculates lod scores and genetic risks for a wide variety of both qualitative and quantitative genetic traits is discussed. An illustration is given of the joint use of a genetic marker, affection status, and quantitative information in counseling situations regarding Duchenne muscular dystrophy. PMID:6585139

  12. Comparison of multipoint linkage analyses for quantitative traits in the CEPH data: parametric LOD scores, variance components LOD scores, and Bayes factors.

    Science.gov (United States)

    Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M

    2007-01-01

    We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus.

  13. Distribution of model-based multipoint heterogeneity lod scores.

    Science.gov (United States)

    Xing, Chao; Morris, Nathan; Xing, Guan

    2010-12-01

    The distribution of two-point heterogeneity lod scores (HLOD) has been intensively investigated because the conventional χ² approximation to the likelihood ratio test is not directly applicable. However, there was no study investigating the distribution of the multipoint HLOD despite its wide application. Here we want to point out that, compared with the two-point HLOD, the multipoint HLOD essentially tests for homogeneity given linkage and follows a relatively simple limiting distribution ½χ²₀ + ½χ²₁, which can be obtained by established statistical theory. We further examine the theoretical result by simulation studies. © 2010 Wiley-Liss, Inc.
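
    Turning that limiting distribution into a p-value is a one-liner once the lod is converted to the likelihood-ratio scale (χ² = 2 ln(10) × lod):

      from math import log
      from scipy.stats import chi2

      def multipoint_hlod_pvalue(hlod):
          # 50:50 mixture of a point mass at zero and a 1-df chi-square
          x = 2 * log(10) * hlod
          return 0.5 * chi2.sf(x, df=1) if hlod > 0 else 1.0

      print(f"{multipoint_hlod_pvalue(3.0):.6f}")   # p-value for a multipoint HLOD of 3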

  14. Linkage analysis in nuclear families. 2: Relationship between affected sib-pair tests and lod score analysis.

    Science.gov (United States)

    Knapp, M; Seuchter, S A; Baur, M P

    1994-01-01

    It is believed that the main advantage of affected sib-pair tests is that their application requires no information about the underlying genetic mechanism of the disease. However, here it is proved that the mean test, which can be considered the most prominent of the affected sib-pair tests, is equivalent to lod score analysis for an assumed recessive mode of inheritance, irrespective of the true mode of the disease. Further relationships of certain sib-pair tests and lod score analysis under specific assumed genetic modes are investigated.

  15. Allele-sharing models: LOD scores and accurate linkage tests.

    Science.gov (United States)

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.

  16. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is < or = - 2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  17. Two-locus maximum lod score analysis of a multifactorial trait: Joint consideration of IDDM2 and IDDM4 with IDDM1 in type 1 diabetes

    Energy Technology Data Exchange (ETDEWEB)

    Cordell, H.J.; Todd, J.A.; Bennett, S.T. [Univ. of Oxford (United Kingdom)] [and others]

    1995-10-01

    To investigate the genetic component of multifactorial diseases such as type 1 (insulin-dependent) diabetes mellitus (IDDM), models involving the joint action of several disease loci are important. These models can give increased power to detect an effect and a greater understanding of etiological mechanisms. Here, we present an extension of the maximum lod score method of N. Risch, which allows the simultaneous detection and modeling of two unlinked disease loci. Genetic constraints on the identical-by-descent sharing probabilities, analogous to the "triangle" restrictions in the single-locus method, are derived, and the size and power of the test statistics are investigated. The method is applied to affected-sib-pair data, and the joint effects of IDDM1 (HLA) and IDDM2 (the INS VNTR) and of IDDM1 and IDDM4 (FGF3-linked) are assessed with relation to the development of IDDM. In the presence of genetic heterogeneity, there is seen to be a significant advantage in analyzing more than one locus simultaneously. Analysis of these families indicates that the effects at IDDM1 and IDDM2 are well described by a multiplicative genetic model, while those at IDDM1 and IDDM4 follow a heterogeneity model. 17 refs., 9 tabs.

  18. Using lod scores to detect sex differences in male-female recombination fractions.

    Science.gov (United States)

    Feenstra, B; Greenberg, D A; Hodge, S E

    2004-01-01

    Human recombination fraction (RF) can differ between males and females, but investigators do not always know which disease genes are located in genomic areas of large RF sex differences. Knowledge of RF sex differences contributes to our understanding of basic biology and can increase the power of a linkage study, improve gene localization, and provide clues to possible imprinting. One way to detect these differences is to use lod scores. In this study we focused on detecting RF sex differences and answered the following questions, in both phase-known and phase-unknown matings: (1) How large a sample size is needed to detect an RF sex difference? (2) What are "optimal" proportions of paternally vs. maternally informative matings? (3) Does ascertaining nonoptimal proportions of paternally or maternally informative matings lead to ascertainment bias? Our results were as follows: (1) We calculated expected lod scores (ELODs) under two different conditions: "unconstrained," allowing sex-specific RF parameters (theta(female), theta(male)); and "constrained," requiring theta(female) = theta(male). We then examined the DeltaELOD (the difference between the maximized constrained and unconstrained ELODs) and calculated minimum sample sizes required to achieve statistically significant DeltaELODs. For large RF sex differences, samples as small as 10 to 20 fully informative matings can achieve statistical significance. We give general sample size guidelines for detecting RF differences in informative phase-known and phase-unknown matings. (2) We defined p as the proportion of paternally informative matings in the dataset, and the optimal proportion p(circ) as that value of p that maximizes DeltaELOD. We determined that, surprisingly, p(circ) does not necessarily equal (1/2), although it does fall between approximately 0.4 and 0.6 in most situations. (3) We showed that if p in a sample deviates from its optimal value, no bias is introduced (asymptotically) to the maximum
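
    A rough sketch of the DeltaELOD idea, under the simplifying assumptions of fully informative, phase-known meioses and a common recombination fraction estimated (asymptotically) as the weighted average of the sex-specific values; the numerical values of theta_m, theta_f and p below are made up, not taken from the study:

      from math import log10

      def elod_per_meiosis(true_theta, assumed_theta):
          # Expected lod contribution of one fully informative meiosis with true
          # recombination fraction true_theta, analysed at assumed_theta.
          return (true_theta * log10(assumed_theta)
                  + (1 - true_theta) * log10(1 - assumed_theta) + log10(2))

      theta_m, theta_f = 0.05, 0.25      # assumed true male and female recombination fractions
      p = 0.5                            # proportion of paternally informative meioses

      theta_common = p * theta_m + (1 - p) * theta_f
      unconstrained = p * elod_per_meiosis(theta_m, theta_m) + (1 - p) * elod_per_meiosis(theta_f, theta_f)
      constrained = p * elod_per_meiosis(theta_m, theta_common) + (1 - p) * elod_per_meiosis(theta_f, theta_common)
      print(f"DeltaELOD per informative meiosis: {unconstrained - constrained:.4f}")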

  19. Two-locus maximum lod score analysis of a multifactorial trait: joint consideration of IDDM2 and IDDM4 with IDDM1 in type 1 diabetes.

    Science.gov (United States)

    Cordell, H J; Todd, J A; Bennett, S T; Kawaguchi, Y; Farrall, M

    1995-10-01

    To investigate the genetic component of multifactorial diseases such as type 1 (insulin-dependent) diabetes mellitus (IDDM), models involving the joint action of several disease loci are important. These models can give increased power to detect an effect and a greater understanding of etiological mechanisms. Here, we present an extension of the maximum lod score method of N. Risch, which allows the simultaneous detection and modeling of two unlinked disease loci. Genetic constraints on the identical-by-descent sharing probabilities, analogous to the "triangle" restrictions in the single-locus method, are derived, and the size and power of the test statistics are investigated. The method is applied to affected-sib-pair data, and the joint effects of IDDM1 (HLA) and IDDM2 (the INS VNTR) and of IDDM1 and IDDM4 (FGF3-linked) are assessed with relation to the development of IDDM. In the presence of genetic heterogeneity, there is seen to be a significant advantage in analyzing more than one locus simultaneously. Analysis of these families indicates that the effects at IDDM1 and IDDM2 are well described by a multiplicative genetic model, while those at IDDM1 and IDDM4 follow a heterogeneity model.

  20. Replication of linkage to quantitative trait loci: variation in location and magnitude of the lod score.

    Science.gov (United States)

    Hsueh, W C; Göring, H H; Blangero, J; Mitchell, B D

    2001-01-01

    Replication of linkage signals from independent samples is considered an important step toward verifying the significance of linkage signals in studies of complex traits. The purpose of this empirical investigation was to examine the variability in the precision of localizing a quantitative trait locus (QTL) by analyzing multiple replicates of a simulated data set with the use of variance components-based methods. Specifically, we evaluated across replicates the variation in both the magnitude and the location of the peak lod scores. We analyzed QTLs whose effects accounted for 10-37% of the phenotypic variance in the quantitative traits. Our analyses revealed that the precision of QTL localization was directly related to the magnitude of the QTL effect. For a QTL with effect accounting for > 20% of total phenotypic variation, > 90% of the linkage peaks fall within 10 cM from the true gene location. We found no evidence that, for a given magnitude of the lod score, the presence of interaction influenced the precision of QTL localization.

  1. D-dimer as marker for microcirculatory failure: correlation with LOD and APACHE II scores.

    Science.gov (United States)

    Angstwurm, Matthias W A; Reininger, Armin J; Spannagl, Michael

    2004-01-01

    The relevance of plasma d-dimer levels as a marker of morbidity and organ dysfunction in severely ill patients is largely unknown. In a prospective study we determined d-dimer plasma levels of 800 unselected patients at admission to our intensive care unit. In 91% of the patients' samples, d-dimer levels were elevated, in some patients up to several hundred times normal values. The highest mean d-dimer values were present in the patient group with thromboembolic diseases, and particularly in non-survivors of pulmonary embolism. In patients with circulatory impairment (r=0.794) and in patients with infections (r=0.487) a statistically significant correlation was present between d-dimer levels and the APACHE II score (P<0.001). The logistic organ dysfunction score (LOD, P<0.001) correlated with d-dimer levels only in patients with circulatory impairment (r=0.474). In contrast, patients without circulatory impairment demonstrated no correlation of d-dimer levels with the APACHE II or LOD score. Taking all patients together, no correlations of d-dimer levels with single organ failure or with indicators of infection could be detected. In conclusion, d-dimer plasma levels strongly correlated with the severity of the disease and organ dysfunction in patients with circulatory impairment or infections, suggesting that elevated d-dimer levels may reflect the extent of microcirculatory failure. Thus, a therapeutic strategy to improve the microcirculation in such patients may be monitored using d-dimer plasma levels.

  2. Hereditary spastic paraplegia: LOD-score considerations for confirmation of linkage in a heterogeneous trait

    Energy Technology Data Exchange (ETDEWEB)

    Dube, M.P.; Kibar, Z.; Rouleau, G.A. [McGill Univ., Quebec (Canada)] [and others]

    1997-03-01

    Hereditary spastic paraplegia (HSP) is a degenerative disorder of the motor system, defined by progressive weakness and spasticity of the lower limbs. HSP may be inherited as an autosomal dominant (AD), autosomal recessive, or an X-linked trait. AD HSP is genetically heterogeneous, and three loci have been identified so far: SPG3 maps to chromosome 14q, SPG4 to 2p, and SPG4a to 15q. We have undertaken linkage analysis with 21 uncomplicated AD families to the three AD HSP loci. We report significant linkage for three of our families to the SPG4 locus and exclude several families by multipoint linkage. We used linkage information from several different research teams to evaluate the statistical probability of linkage to the SPG4 locus for uncomplicated AD HSP families and established the critical LOD-score value necessary for confirmation of linkage to the SPG4 locus from Bayesian statistics. In addition, we calculated the empirical P-values for the LOD scores obtained with all families with computer simulation methods. Power to detect significant linkage, as well as type I error probabilities, were evaluated. This combined analytical approach permitted conclusive linkage analyses on small to medium-size families, under the restrictions of genetic heterogeneity. 19 refs., 1 fig., 1 tab.

  3. Hereditary spastic paraplegia: LOD-score considerations for confirmation of linkage in a heterogeneous trait.

    Science.gov (United States)

    Dubé, M P; Mlodzienski, M A; Kibar, Z; Farlow, M R; Ebers, G; Harper, P; Kolodny, E H; Rouleau, G A; Figlewicz, D A

    1997-03-01

    Hereditary spastic paraplegia (HSP) is a degenerative disorder of the motor system, defined by progressive weakness and spasticity of the lower limbs. HSP may be inherited as an autosomal dominant (AD), autosomal recessive, or an X-linked trait. AD HSP is genetically heterogeneous, and three loci have been identified so far: SPG3 maps to chromosome 14q, SPG4 to 2p, and SPG4a to 15q. We have undertaken linkage analysis with 21 uncomplicated AD families to the three AD HSP loci. We report significant linkage for three of our families to the SPG4 locus and exclude several families by multipoint linkage. We used linkage information from several different research teams to evaluate the statistical probability of linkage to the SPG4 locus for uncomplicated AD HSP families and established the critical LOD-score value necessary for confirmation of linkage to the SPG4 locus from Bayesian statistics. In addition, we calculated the empirical P-values for the LOD scores obtained with all families with computer simulation methods. Power to detect significant linkage, as well as type I error probabilities, were evaluated. This combined analytical approach permitted conclusive linkage analyses on small to medium-size families, under the restrictions of genetic heterogeneity.

  4. Clustering patterns of LOD scores for asthma-related phenotypes revealed by a genome-wide screen in 295 French EGEA families.

    Science.gov (United States)

    Bouzigon, Emmanuelle; Dizier, Marie-Hélène; Krähenbühl, Christine; Lemainque, Arnaud; Annesi-Maesano, Isabella; Betard, Christine; Bousquet, Jean; Charpin, Denis; Gormand, Frédéric; Guilloud-Bataille, Michel; Just, Jocelyne; Le Moual, Nicole; Maccario, Jean; Matran, Régis; Neukirch, Françoise; Oryszczyn, Marie-Pierre; Paty, Evelyne; Pin, Isabelle; Rosenberg-Bourgin, Myriam; Vervloet, Daniel; Kauffmann, Francine; Lathrop, Mark; Demenais, Florence

    2004-12-15

    A genome-wide scan for asthma phenotypes was conducted in the whole sample of 295 EGEA families selected through at least one asthmatic subject. In addition to asthma, seven phenotypes involved in the main asthma physiopathological pathways were considered: SPT (positive skin prick test response to at least one of 11 allergens), SPTQ score (the number of positive skin test responses to 11 allergens), Phadiatop (positive specific IgE response to a mixture of allergens), total IgE levels, eosinophils, bronchial responsiveness (BR) to methacholine challenge, and % predicted FEV1. Four regions showed evidence for linkage. A clustering analysis was applied to the genome-wide LOD scores of the eight phenotypes. This analysis revealed clustering of LODs for asthma, SPT and Phadiatop on one axis and clustering of LODs for %FEV1, BR and SPTQ on the other, while LODs for IgE and eosinophils appeared to be independent from all other LODs. These results provide new insights into the potential sharing of genetic determinants by asthma-related phenotypes.

  5. The quantitative LOD score: test statistic and sample size for exclusion and linkage of quantitative traits in human sibships.

    Science.gov (United States)

    Page, G P; Amos, C I; Boerwinkle, E

    1998-04-01

    We present a test statistic, the quantitative LOD (QLOD) score, for the testing of both linkage and exclusion of quantitative-trait loci in randomly selected human sibships. As with the traditional LOD score, the boundary values of 3, for linkage, and -2, for exclusion, can be used for the QLOD score. We investigated the sample sizes required for inferring exclusion and linkage, for various combinations of linked genetic variance, total heritability, recombination distance, and sibship size, using fixed-size sampling. The sample sizes required for both linkage and exclusion were not qualitatively different and depended on the percentage of variance being linked or excluded and on the total genetic variance. Information regarding linkage and exclusion in sibships larger than size 2 increased approximately as the number of possible pairs, n(n-1)/2, up to sibships of size 6. Increasing the recombination fraction (θ) between the marker and the trait loci empirically reduced the power for both linkage and exclusion, approximately as a function of (1 - 2θ)^4.
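    To make the reported (1 - 2θ)^4 decay concrete, here is a small numerical sketch (my own back-of-the-envelope reading, not a result from the paper) of how the relative linkage information, and hence the implied sample-size inflation, changes with the marker-trait recombination fraction.

```python
# Rough illustration of the approximate (1 - 2*theta)**4 decay of linkage
# information noted above; the implied sample-size inflation is taken as the
# reciprocal of the relative information.
for theta in (0.0, 0.05, 0.10, 0.20):
    info = (1 - 2 * theta) ** 4
    print(f"theta = {theta:.2f}: relative information ~ {info:.2f}, "
          f"sample-size inflation ~ {1 / info:.1f}x")
```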

  6. Another procedure for the preliminary ordering of loci based on two point lod scores.

    Science.gov (United States)

    Curtis, D

    1994-01-01

    Because of the difficulty of performing full likelihood analysis over multiple loci and the large numbers of possible orders, a number of methods have been proposed for quickly evaluating orders and, to a lesser extent, for generating good orders. A new method is proposed which uses a function which is moderately laborious to compute, the sum of lod scores between all pairs of loci. This function can be smoothly minimized by initially allowing the loci to be placed anywhere in space, and only subsequently constraining them to lie along a one-dimensional map. Application of this approach to sample data suggests that it has promise and might usefully be combined with other methods when loci need to be ordered.

  7. Linkage of familial Alzheimer disease to chromosome 14 in two large early-onset pedigrees: effects of marker allele frequencies on lod scores.

    Science.gov (United States)

    Nechiporuk, A; Fain, P; Kort, E; Nee, L E; Frommelt, E; Polinsky, R J; Korenberg, J R; Pulst, S M

    1993-05-01

    Alzheimer disease (AD) is a devastating neurodegenerative disease leading to global dementia. In addition to sporadic forms of AD, familial forms (FAD) have been recognized. Mutations in the amyloid precursor protein (APP) gene on chromosome (CHR) 21 have been shown to cause early-onset AD in a small number of pedigrees. Recently, linkage to markers on CHR 14 has been established in several early-onset FAD pedigrees. We now report lod scores for CHR 14 markers in two large early-onset FAD pedigrees. Pairwise linkage analysis suggested that in these pedigrees the mutation is tightly linked to the loci D14S43 and D14S53. However, assumptions regarding marker allele frequencies had a major and often unpredictable effect on calculated lod scores. Therefore, caution needs to be exercised when single pedigrees are analyzed with marker allele frequencies determined from the literature or from a pool of spouses.

  8. A new method of linkage analysis using LOD scores for quantitative traits supports linkage of monoamine oxidase activity to D17S250 in the Collaborative Study on the Genetics of Alcoholism pedigrees.

    Science.gov (United States)

    Curtis, David; Knight, Jo; Sham, Pak C

    2005-09-01

    Although LOD score methods have been applied to diseases with complex modes of inheritance, linkage analysis of quantitative traits has tended to rely on non-parametric methods based on regression or variance components analysis. Here, we describe a new method for LOD score analysis of quantitative traits which does not require specification of a mode of inheritance. The technique is derived from the MFLINK method for dichotomous traits. A range of plausible transmission models is constructed, constrained to yield the correct population mean and variance for the trait but differing with respect to the contribution to the variance due to the locus under consideration. Maximized LOD scores under homogeneity and admixture are calculated, as is a model-free LOD score which compares the maximized likelihoods under admixture assuming linkage and no linkage. These LOD scores have known asymptotic distributions and hence can be used to provide a statistical test for linkage. The method has been implemented in a program called QMFLINK. It was applied to data sets simulated using a variety of transmission models and to a measure of monoamine oxidase activity in 105 pedigrees from the Collaborative Study on the Genetics of Alcoholism. With the simulated data, the results showed that the new method could detect linkage well if the true allele frequency for the trait was close to that specified. However, it performed poorly on models in which the true allele frequency was much rarer. For the Collaborative Study on the Genetics of Alcoholism data set only a modest overlap was observed between the results obtained from the new method and those obtained when the same data were analysed previously using regression and variance components analysis. Of interest is that D17S250 produced a maximized LOD score under homogeneity and admixture of 2.6 but did not indicate linkage using the previous methods. However, this region did produce evidence for linkage in a separate data set

  9. LOD-a-lot : A queryable dump of the LOD cloud

    NARCIS (Netherlands)

    Fernández, Javier D.; Beek, Wouter; Martínez-Prieto, Miguel A.; Arias, Mario

    2017-01-01

    LOD-a-lot democratizes access to the Linked Open Data (LOD) Cloud by serving more than 28 billion unique triples from over 650K datasets over a single self-indexed file. This corpus can be queried online with a sustainable Linked Data Fragments interface, or downloaded and consumed locally: LOD-a-lot
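    As a rough sketch of the "consume locally" option: the single self-indexed file is an HDT file, which can be opened with, for example, the pyHDT bindings. The package, the file name and the exact API shown here are assumptions for illustration and may differ from the tooling the authors ship.

```python
# Hypothetical sketch of consuming LOD-a-lot locally, assuming the pyHDT
# bindings ("pip install hdt") and a local copy of the HDT file; the file
# path and the exact API are assumptions, not taken from the paper.
from itertools import islice
from hdt import HDTDocument

doc = HDTDocument("lod-a-lot.hdt")                      # hypothetical local path
triples, cardinality = doc.search_triples("", "", "")   # empty strings act as wildcards
print(f"total matching triples: {cardinality}")
for s, p, o in islice(triples, 5):                      # print just a handful
    print(s, p, o)
```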

  10. R-LODs: fast LOD-based ray tracing of massive models

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Sung-Eui; Lauterbach, Christian; Manocha, Dinesh

    2006-08-25

    We present a novel LOD (level-of-detail) algorithm to accelerate ray tracing of massive models. Our approach computes drastic simplifications of the model and the LODs are well integrated with the kd-tree data structure. We introduce a simple and efficient LOD metric to bound the error for primary and secondary rays. The LOD representation has small runtime overhead and our algorithm can be combined with ray coherence techniques and cache-coherent layouts to improve the performance. In practice, the use of LODs can alleviate aliasing artifacts and improve memory coherence. We implement our algorithm on both 32-bit and 64-bit machines and are able to achieve up to a 2.20× improvement in the frame rate when rendering models consisting of tens or hundreds of millions of triangles, with little loss in image quality.

  11. Meteorological interpretation of transient LOD changes

    Science.gov (United States)

    Masaki, Y.

    2008-04-01

    The Earth's spin rate is mainly changed by zonal winds. For example, seasonal changes in global atmospheric circulation and episodic changes accompanying El Niños are clearly detected in the length of day (LOD). Sub-global to regional meteorological phenomena can also change the wind field; however, their effects on the LOD are uncertain because such LOD signals are expected to be subtle and transient. In our previous study (Masaki, 2006), we introduced atmospheric pressure gradients in the upper atmosphere in order to obtain a rough picture of the meteorological features that can change the LOD. In this presentation, we compare one-year LOD data with meteorological elements (winds, temperature, pressure, etc.) and make an attempt to link transient LOD changes with sub-global meteorological phenomena.

  12. LOD estimation from DORIS observations

    Science.gov (United States)

    Stepanek, Petr; Filler, Vratislav; Buday, Michal; Hugentobler, Urs

    2016-04-01

    The difference between the astronomically determined duration of the day and 86400 seconds is called length of day (LOD). The LOD can also be understood as the daily rate of the difference between Universal Time UT1, based on the Earth's rotation, and International Atomic Time TAI. The LOD is estimated using various satellite geodesy techniques such as GNSS and SLR, while the absolute UT1-TAI difference is precisely determined by VLBI. Contrary to other IERS techniques, LOD estimation using DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) measurements did not achieve geodetic accuracy in the past, reaching a precision of only several ms per day. However, recent experiments performed by the IDS (International DORIS Service) analysis centre at Geodetic Observatory Pecny show the possibility of reaching an accuracy of around 0.1 ms per day when the cross-track harmonics in the satellite orbit model are not adjusted. The paper presents the long-term LOD series determined from the DORIS solutions. The series are compared with C04 as the reference. Results are discussed in the context of the accuracy achieved with GNSS and SLR. Besides the multi-satellite DORIS solutions, the LOD series from the individual DORIS satellite solutions are also analysed.

  13. LOD 1 VS. LOD 2 - Preliminary Investigations Into Differences in Mobile Rendering Performance

    Science.gov (United States)

    Ellul, C.; Altenbuchner, J.

    2013-09-01

    The increasing availability, size and detail of 3D City Model datasets has led to a challenge when rendering such data on mobile devices. Understanding the limitations to the usability of such models on these devices is particularly important given the broadening range of applications - such as pollution or noise modelling, tourism, planning, solar potential - for which these datasets and resulting visualisations can be utilized. Much 3D City Model data is created by extrusion of 2D topographic datasets, resulting in what is known as Level of Detail (LoD) 1 buildings - with flat roofs. However, in the UK the National Mapping Agency (the Ordnance Survey, OS) is now releasing test datasets to Level of Detail (LoD) 2 - i.e. including roof structures. These datasets are designed to integrate with the LoD 1 datasets provided by the OS, and provide additional detail in particular on larger buildings and in town centres. The availability of such integrated datasets at two different Levels of Detail permits investigation into the impact of the additional roof structures (and hence the display of a more realistic 3D City Model) on rendering performance on a mobile device. This paper describes preliminary work carried out to investigate this issue, for the test area of the city of Sheffield (in the UK Midlands). The data is stored in a 3D spatial database as triangles and then extracted and served as a web-based data stream which is queried by an App developed on the mobile device (using the Android environment, Java and OpenGL for graphics). Initial tests have been carried out on two dataset sizes, for the city centre and a larger area, rendering the data onto a tablet to compare results. Results of 52 seconds for rendering LoD 1 data, and 72 seconds for LoD 1 mixed with LoD 2 data, show that the impact of LoD 2 is significant.

  14. LOD wars: The affected-sib-pair paradigm strikes back!

    Energy Technology Data Exchange (ETDEWEB)

    Farrall, M. [Wellcome Trust Centre for Human Genetics, Oxford (United Kingdom)

    1997-03-01

    In a recent letter, Greenberg et al. aired their concerns that the affected-sib-pair (ASP) approach was becoming excessively popular, owing to misconceptions and ignorance of the properties and limitations of both the ASP and the classic LOD-score approaches. As an enthusiast of using the ASP approach to map susceptibility genes for multifactorial traits, I would like to contribute a few comments and explanatory notes in defense of the ASP paradigm. 18 refs.

  15. Pragmatic Use of LOD - a Modular Approach

    DEFF Research Database (Denmark)

    Treldal, Niels; Vestergaard, Flemming; Karlshøj, Jan

    The concept of Level of Development (LOD) is a simple approach to specifying the requirements for the content of object-oriented models in a Building Information Modelling process. The concept has been implemented in many national and organization-specific variations and, in recent years, several ... and reliability of deliveries along with use-case-specific information requirements provides a pragmatic approach for a LOD concept. The proposed solution combines LOD requirement definitions with Information Delivery Manual-based use case requirements to match the specific needs identified for a LOD framework ...

  16. LOD map--A visual interface for navigating multiresolution volume visualization.

    Science.gov (United States)

    Wang, Chaoli; Shen, Han-Wei

    2006-01-01

    In multiresolution volume visualization, a visual representation of level-of-detail (LOD) quality is important for us to examine, compare, and validate different LOD selection algorithms. While traditional methods rely on ultimate images for quality measurement, we introduce the LOD map--an alternative representation of LOD quality and a visual interface for navigating multiresolution data exploration. Our measure for LOD quality is based on the formulation of entropy from information theory. The measure takes into account the distortion and contribution of multiresolution data blocks. A LOD map is generated through the mapping of key LOD ingredients to a treemap representation. The ordered treemap layout is used for relative stable update of the LOD map when the view or LOD changes. This visual interface not only indicates the quality of LODs in an intuitive way, but also provides immediate suggestions for possible LOD improvement through visually-striking features. It also allows us to compare different views and perform rendering budget control. A set of interactive techniques is proposed to make the LOD adjustment a simple and easy task. We demonstrate the effectiveness and efficiency of our approach on large scientific and medical data sets.

  17. Systematic effects in LOD from SLR observations

    Science.gov (United States)

    Bloßfeld, Mathis; Gerstl, Michael; Hugentobler, Urs; Angermann, Detlef; Müller, Horst

    2014-09-01

    Besides the estimation of station coordinates and the Earth's gravity field, laser ranging observations to near-Earth satellites can be used to determine the rotation of the Earth. One parameter of this rotation is ΔLOD (excess Length Of Day), which describes the excess revolution time of the Earth w.r.t. 86,400 s. Due to correlations among the different parameter groups, it is difficult to obtain reliable estimates for all parameters. In the official ΔLOD products of the International Earth Rotation and Reference Systems Service (IERS), the ΔLOD information determined from laser ranging observations is excluded from the processing. In this paper, we study in detail the existing correlations between ΔLOD, the orbital node Ω, the even zonal gravity field coefficients, cross-track empirical accelerations and the relativistic accelerations caused by the Lense-Thirring and de Sitter effects, using first-order Gaussian perturbation equations. We found discrepancies of up to 1.0 ms for polar orbits at an altitude of 500 km due to different a priori gravity field models, and of up to 40.0 ms if the gravity field coefficients are estimated using only observations to LAGEOS 1. If observations to LAGEOS 2 are included, reliable ΔLOD estimates can be achieved. Nevertheless, an impact of the a priori gravity field even on the multi-satellite ΔLOD estimates can be clearly identified. Furthermore, we investigate the effect of empirical cross-track accelerations and the effect of relativistic accelerations of near-Earth satellites on ΔLOD. A total effect of 0.0088 ms is caused by unmodeled Lense-Thirring and de Sitter terms. The partial derivatives of these accelerations w.r.t. the position and velocity of the satellite cause very small variations (0.1 μs) in ΔLOD.

  18. Medium- and Long-term Prediction of LOD Change by the Leap-step Autoregressive Model

    Science.gov (United States)

    Wang, Qijie

    2015-08-01

    The accuracy of medium- and long-term prediction of length-of-day (LOD) change based on the combined least-squares and autoregressive (LS+AR) model deteriorates gradually. The leap-step autoregressive (LSAR) model can significantly reduce the edge effect of the observation sequence. In particular, the LSAR model greatly improves the resolution of the signals' low-frequency components and can therefore improve prediction performance. In this work, LSAR is used to forecast the LOD change. The LOD series from EOP 08 C04 provided by the IERS is modeled by both the LSAR and AR models. The results of the two models are analyzed and compared. When the prediction length is between 10 and 30 days, the accuracy improvement is less than 10%. When the prediction length exceeds 30 days, the accuracy improves markedly, by a maximum of around 19%. The results show that the LSAR model has higher prediction accuracy and stability in medium- and long-term prediction.
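    For readers unfamiliar with the LS+AR baseline that the leap-step model modifies, the following is a minimal sketch of the conventional scheme on synthetic LOD-like data: a least-squares fit of trend plus annual and semi-annual terms, followed by an autoregression on the residuals that is extrapolated forward. It illustrates the baseline only, not the LSAR variant, and all model orders and noise levels are arbitrary choices.

```python
# Minimal sketch of the conventional LS+AR scheme on synthetic data
# (trend + annual + semi-annual harmonics, then AR on the residuals).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(3000.0)                                   # days
lod = (1.5 + 1e-4 * t + 0.35 * np.sin(2 * np.pi * t / 365.25)
       + 0.15 * np.sin(2 * np.pi * t / 182.63) + 0.05 * rng.standard_normal(t.size))

def design(t):
    return np.column_stack([np.ones_like(t), t,
                            np.sin(2 * np.pi * t / 365.25), np.cos(2 * np.pi * t / 365.25),
                            np.sin(2 * np.pi * t / 182.63), np.cos(2 * np.pi * t / 182.63)])

coef, *_ = np.linalg.lstsq(design(t), lod, rcond=None)  # least-squares part
resid = lod - design(t) @ coef

p = 20                                                  # AR order (arbitrary)
X = np.column_stack([resid[p - k - 1: resid.size - k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(X, resid[p:], rcond=None)       # AR coefficients by least squares

horizon = 60                                            # extrapolate residuals, add back LS part
hist = list(resid[-p:])
for _ in range(horizon):
    hist.append(float(np.dot(a, hist[-1: -p - 1: -1])))
t_fut = np.arange(t[-1] + 1, t[-1] + 1 + horizon)
forecast = design(t_fut) @ coef + np.array(hist[p:])
print(forecast[:5])
```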

  19. Meta-data for a lot of LOD

    NARCIS (Netherlands)

    Rietveld, Laurens; Beek, Wouter; Hoekstra, Rinke; Schlobach, Stefan

    2017-01-01

    This paper introduces the LOD Laundromat meta-dataset, a continuously updated RDF meta-dataset that describes the documents crawled, cleaned and (re)published by the LOD Laundromat. This meta-dataset of over 110 million triples contains structural information for more than 650,000 documents (and

  20. TLS for generating multi-LOD of 3D building model

    International Nuclear Information System (INIS)

    Akmalia, R; Setan, H; Majid, Z; Suwardhi, D; Chong, A

    2014-01-01

    The popularity of Terrestrial Laser Scanners (TLS) for capturing three-dimensional (3D) objects has led to their wide use in various applications. Development in 3D modelling has also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since a building is an important object, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process of the point cloud are explored. TLS is used to capture all the building details in order to generate multiple LODs. In previous works, this task usually involves the integration of several sensors. In this research, however, the point cloud from TLS is processed to generate the LOD3 model; LOD2 and LOD1 are then generalized from the resulting LOD3 model. The result of this research is a guiding process for generating multi-LOD 3D building models starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model will also be shown

  1. TLS for generating multi-LOD of 3D building model

    Science.gov (United States)

    Akmalia, R.; Setan, H.; Majid, Z.; Suwardhi, D.; Chong, A.

    2014-02-01

    The popularity of Terrestrial Laser Scanners (TLS) for capturing three-dimensional (3D) objects has led to their wide use in various applications. Development in 3D modelling has also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since a building is an important object, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process of the point cloud are explored. TLS is used to capture all the building details in order to generate multiple LODs. In previous works, this task usually involves the integration of several sensors. In this research, however, the point cloud from TLS is processed to generate the LOD3 model; LOD2 and LOD1 are then generalized from the resulting LOD3 model. The result of this research is a guiding process for generating multi-LOD 3D building models starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model will also be shown.

  2. +2.71 LOD score at zero recombination is not sufficient for establishing linkage between X-linked mental retardation and X-chromosome markers

    Energy Technology Data Exchange (ETDEWEB)

    Robledo, R.; Melis, P.; Siniscalco, M. [and others]

    1996-07-12

    Nonspecific X-linked mental retardation (MRX) is the denomination attributed to the familial type of mental retardation compatible with X-linked inheritance but lacking specific phenotypic manifestations. It is thus to be expected that families falling under such broad definition are genetically heterogeneous in the sense that they may be due to different types of mutations occurring, most probably, at distinct X-chromosome loci. To facilitate a genetic classification of these conditions, the Nomenclature Committee of the Eleventh Human Gene Mapping Workshop proposed to assign a unique MRX-serial number to each family where evidence of linkage with one or more X-chromosome markers had been established with a LOD score of at least +2 at zero recombination. This letter is meant to emphasize the inadequacy of this criterion for a large pedigree where the segregation of the disease has been evaluated against the haplotype constitution of the entire X-chromosome carrying the mutation in question. 12 refs., 2 figs., 1 tab.

  3. Application of LOD Technology in German Libraries and Archives

    Directory of Open Access Journals (Sweden)

    Dong Jie

    2017-12-01

    Full Text Available [Purpose/significance] Linked Open Data (LOD) has been widely used in large industries, as well as by non-profit organizations and government organizations. Libraries and archives are among the early adopters of LOD technology, and they promote the development of LOD. Germany is one of the developed countries in the libraries and archives field, and there are many successful cases of the application of LOD in its libraries and archives. [Method/process] This paper analyzed the successful application of LOD technology in German libraries and archives by using the methods of document investigation, network survey and content analysis. [Result/conclusion] These cases reveal the relationships between library- and archive-related research topics and traditional fields of computer science such as artificial intelligence, databases and knowledge discovery. Summing up the characteristics and experience of German practice can provide a useful reference for the development of relevant practice in China.

  4. Secular change of LOD caused by core evolution

    Science.gov (United States)

    Denis, C.; Rybicki, K. R.; Varga, P.

    2003-04-01

    Fossils and tidal deposits suggest that, on average, the Earth's despinning rate was five times lower in the Proterozoic than in the Phanerozoic. This difference is probably due, for the most part, to the existence of a Proterozoic supercontinent. Nevertheless, core formation and core evolution should have compensated to some extent the effect of tidal friction, by diminishing the Earth's moment of inertia. We have investigated quantitatively this contribution of the evolving core to the change of LOD. For the present epoch, we find that the solidification of the inner core causes a relative secular decrease of LOD of approximately 3 μs per century, whereas the macrodiffusion of iron oxides and sulfides from the D" layer into the outer core across the CMB (insofar as Majewski's theory holds) leads to a relative secular decrease of LOD of about 15 μs per century. On the other hand, the theory of slow core formation developed by Runcorn in the early 1960s as a by-product of his theory of mantle-wide convection leads to a relative secular decrease of LOD during most of the Proterozoic of about 0.25 ms per century. Although core formation is now widely assumed to have been a thermal run-away process that occurred shortly after the Earth itself had formed, Runcorn's theory of the growing core would nicely explain the observed palaeo-LOD curve. In any case, formation of the core implies, all in all, a relative decrease of LOD of typically 3 hours.

  5. El Nino, La Nina and VLBI Measured LOD

    Science.gov (United States)

    Clark, Thomas A.; Gipson, J. M.; Ma, C.

    1998-01-01

    VLBI is one of the most important techniques for measuring Earth orientation parameters (EOP), and is unique in its ability to make high accuracy measurements of UT1 and its time derivative, which is related to changes in the length of day, conventionally called LOD. These measurements of EOP give constraints on geophysical models of the solid Earth, atmosphere and oceans. Changes in EOP are due either to external torques from gravitational forces, or to the exchange of angular momentum between the Earth, atmosphere and oceans. The effect of the external torques is strictly harmonic in nature, and is therefore easy to remove. We analyze an LOD time series derived from VLBI measurements with the goal of comparing it to predictions from AAM and various ENSO indices. Previous work by ourselves and other investigators demonstrated a high degree of coherence between atmospheric angular momentum (AAM) and EOP. We continue to see this. As the angular momentum of the atmosphere increases, the rate of rotation of the Earth decreases, and vice versa. The signature of ENSO is particularly strong. At its peak, the 1982-83 El Nino increased LOD by almost 1 ms. This was subsequently followed by a reduction in LOD of 0.75 ms. At its peak, in February of 1998, the 1997-98 El Nino increased LOD by 0.8 ms. As predicted at the 1998 Spring AGU, this has been followed by an abrupt decrease in LOD, which is currently -0.4 ms. At this time (August, 1998) the current ENSO continues to develop in new and unexpected ways. We plan to update our analysis with all data available prior to the Fall AGU.

  6. An LOD with improved breakdown voltage in full-frame CCD devices

    Science.gov (United States)

    Banghart, Edmund K.; Stevens, Eric G.; Doan, Hung Q.; Shepherd, John P.; Meisenzahl, Eric J.

    2005-02-01

    In full-frame image sensors, lateral overflow drain (LOD) structures are typically formed along the vertical CCD shift registers to provide a means for preventing charge blooming in the imager pixels. In a conventional LOD structure, the n-type LOD implant is made through the thin gate dielectric stack in the device active area and adjacent to the thick field oxidation that isolates the vertical CCD columns of the imager. In this paper, a novel LOD structure is described in which the n-type LOD impurities are placed directly under the field oxidation and are, therefore, electrically isolated from the gate electrodes. By reducing the electrical fields that cause breakdown at the silicon surface, this new structure permits a larger amount of n-type impurities to be implanted for the purpose of increasing the LOD conductivity. As a consequence of the improved conductance, the LOD width can be significantly reduced, enabling the design of higher resolution imaging arrays without sacrificing charge capacity in the pixels. Numerical simulations with MEDICI of the LOD leakage current are presented that identify the breakdown mechanism, while three-dimensional solutions to Poisson's equation are used to determine the charge capacity as a function of pixel dimension.

  7. Office microlaparoscopic ovarian drilling (OMLOD) versus conventional laparoscopic ovarian drilling (LOD) for women with polycystic ovary syndrome.

    Science.gov (United States)

    Salah, Imaduldin M

    2013-02-01

    This was a prospective controlled study comparing the beneficial effects of office microlaparoscopic ovarian drilling (OMLOD) under augmented local anesthesia, as a new treatment modality, with those following ovarian drilling using the conventional 10-mm laparoscope (laparoscopic ovarian drilling, LOD) under general anesthesia. The study included 60 anovulatory women with polycystic ovary syndrome (PCOS) who underwent OMLOD (study group) and 60 anovulatory PCOS women in whom conventional LOD using a 10-mm laparoscope under general anesthesia was performed (comparison group). Transvaginal ultrasound scanning and blood sampling to measure the serum concentrations of LH, FSH, testosterone and androstenedione were performed before and after the procedure. Intraoperative and postoperative pain scores were evaluated during the office microlaparoscopic procedure, in addition to the number of candidates who needed extra analgesia. Women undergoing OMLOD showed good intraoperative and postoperative pain scores. The number of patients discharged within 2 h after the office procedure was significantly higher, without the need for postoperative analgesia in most patients. The LH:FSH ratio, mean serum concentrations of LH and testosterone and the free androgen index decreased significantly after both OMLOD and LOD. The mean ovarian volume decreased significantly (P < 0.05) a year after both OMLOD and LOD. There were no significant differences in these results between the two procedures. Intra- and postoperatively augmented local anesthesia allows outpatient bilateral ovarian drilling by microlaparoscopy without general anesthesia. The high pregnancy rate, the simplicity of the method and the faster discharge time offer a new option for patients with PCOS who are resistant to clomiphene citrate. Moreover, ovarian drilling could be performed simultaneously during routine diagnostic microlaparoscopy and integrated into the fertility workup of

  8. One-Step Leapfrog LOD-BOR-FDTD Algorithm with CPML Implementation

    Directory of Open Access Journals (Sweden)

    Yi-Gang Wang

    2016-01-01

    Full Text Available An unconditionally stable one-step leapfrog locally one-dimensional finite-difference time-domain (LOD-FDTD) algorithm for body-of-revolution (BOR) problems is presented. The equations of the proposed algorithm are obtained by algebraic manipulation of those used in the conventional LOD-BOR-FDTD algorithm. The equations for the z-direction electric and magnetic fields in the proposed algorithm require special treatment. The new algorithm achieves higher computational efficiency while preserving the properties of the conventional LOD-BOR-FDTD algorithm. Moreover, the convolutional perfectly matched layer (CPML) is introduced into the one-step leapfrog LOD-BOR-FDTD algorithm. The equation of the one-step leapfrog CPML is concise. Numerical results show that its reflection error is small. It can be concluded that a similar CPML scheme can also be easily applied to the one-step leapfrog LOD-FDTD algorithm in the Cartesian coordinate system.

  9. The dynamic system corresponding to LOD and AAM.

    Science.gov (United States)

    Liu, Shida; Liu, Shikuo; Chen, Jiong

    2000-02-01

    Using the wavelet transform, the authors can reconstruct the 1-D map of a multifractal object. The wavelet transform of LOD and AAM shows that at the 20-year, annual and 2-3-year scales, the jump points of LOD and AAM accord with each other very well, and their reconstructed 1-D mapping dynamic systems are also very similar.

  10. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time both view-direction-based selection and the furthest distance for each direction was ...

  11. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    Science.gov (United States)

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
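    One widely used convention for computing these parameters from a calibration curve is LOD = 3.3·σ/S and LOQ = 10·σ/S, with σ the standard deviation of blank responses and S the calibration slope; other conventions exist, which is precisely the kind of discrepancy the paper discusses. A minimal sketch with made-up data:

```python
# Minimal sketch of one common LOD/LOQ convention (3.3*sigma/slope and
# 10*sigma/slope); concentrations, signals and blank readings are made up.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])         # e.g. ug/mL
signal = np.array([0.02, 0.55, 1.08, 2.11, 5.20, 10.35]) # instrument response
blanks = np.array([0.018, 0.022, 0.025, 0.017, 0.021, 0.019, 0.023])

slope, intercept = np.polyfit(conc, signal, 1)           # linear calibration curve
sigma = blanks.std(ddof=1)                               # SD of blank responses

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope = {slope:.3f}, LOD = {lod:.4f} ug/mL, LOQ = {loq:.4f} ug/mL")
```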

  12. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    Science.gov (United States)

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
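    As a toy illustration of the idea (not the article's general formula or its MATLAB code), maximum Pc in a two-interval forced-choice task can be estimated by Monte Carlo; with equal-variance Gaussian noise the ideal rule reduces to choosing the interval with the larger observation, and the estimate can be checked against the analytic value Φ(d′/√2).

```python
# Toy Monte Carlo estimate of maximum Pc in a 2-interval forced-choice task
# with equal-variance Gaussian noise, where the ideal observer simply picks
# the interval with the larger observation.  Illustration of the concept only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
d_prime = 1.0
n = 200_000

noise = rng.standard_normal(n)              # non-signal interval
signal = d_prime + rng.standard_normal(n)   # signal interval

pc_mc = np.mean(signal > noise)             # ideal-observer Pc, Monte Carlo
pc_analytic = norm.cdf(d_prime / np.sqrt(2))
print(f"Monte Carlo Pc = {pc_mc:.3f}, analytic Pc = {pc_analytic:.3f}")
```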

  13. PROPOSAL FOR A NEW LOD AND MULTI-REPRESENTATION CONCEPT FOR CITYGML

    Directory of Open Access Journals (Sweden)

    M.-O. Löwner

    2016-10-01

    Full Text Available The Open Geospatial Consortium (OGC) CityGML standard offers a Level of Detail (LoD) concept that enables the representation of CityGML features from a very detailed to a less detailed description. Due to a rising variety of applications, the current LoD concept seems to be too inflexible. Here, we present a multi-representation concept (MRC) that enables a user-defined specification of LoDs. Because CityGML is an international standard, official profiles of the MRC are proposed. However, encoding the defined profiles reveals many problems, including mapping the conceptual model to the normative encoding, missing technologies, and so on. Therefore, we propose to use the MRC as a meta model for the further definition of an LoD concept for CityGML 3.0.

  14. Application of General Regression Neural Network to the Prediction of LOD Change

    Science.gov (United States)

    Zhang, Xiao-Hong; Wang, Qi-Jie; Zhu, Jian-Jun; Zhang, Hao

    2012-01-01

    Traditional methods for predicting the change in length of day (LOD change) are mainly based on linear models, such as the least squares model and the autoregression model. However, the LOD change comprises complicated non-linear factors, and the prediction performance of the linear models is not ideal. Thus, a non-linear neural network, the general regression neural network (GRNN) model, is tried for the prediction of the LOD change, and the result is compared with the predicted results obtained from the BP (back propagation) neural network model and other models. The comparison shows that the application of the GRNN to the prediction of the LOD change is highly effective and feasible.
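    A GRNN is essentially Nadaraya-Watson kernel regression: the prediction is a Gaussian-weighted average of the training targets. The following is a minimal sketch of the model class on toy data, not the authors' implementation; the data and the smoothing parameter are arbitrary.

```python
# Minimal sketch of a general regression neural network (GRNN), i.e. a
# Gaussian-kernel weighted average of training targets; toy data only.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    # squared Euclidean distances between every query and training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# toy example: learn y = sin(x) from noisy samples, then predict on a grid
rng = np.random.default_rng(2)
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
x_new = np.linspace(0, 2 * np.pi, 5)

pred = grnn_predict(x[:, None], y, x_new[:, None], sigma=0.3)
print(np.round(pred, 3), np.round(np.sin(x_new), 3))
```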

  15. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  16. Enhanced LOD Concepts for Virtual 3d City Models

    Science.gov (United States)

    Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.

    2013-09-01

    Virtual 3D city models contain digital three-dimensional representations of city objects like buildings, streets or technical infrastructure. Because the size and complexity of these models continuously grow, a Level of Detail (LoD) concept is indispensable: one that effectively supports the partitioning of a complete model into alternative models of different complexity, and that provides metadata addressing the informational content, complexity and quality of each alternative model. After a short overview of various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates, first, between a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second, between the interior of a building and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of a UML model.

  17. LOD lab : Scalable linked data processing

    NARCIS (Netherlands)

    Beek, Wouter; Rietveld, Laurens; Ilievski, F.; Schlobach, Stefan

    2017-01-01

    With tens if not hundreds of billions of logical statements, the Linked Open Data (LOD) is one of the biggest knowledge bases ever built. As such it is a gigantic source of information for applications in various domains, but also given its size an ideal test-bed for knowledge representation and

  18. Quadtree of TIN: a new algorithm of dynamic LOD

    Science.gov (United States)

    Zhang, Junfeng; Fei, Lifan; Chen, Zhen

    2009-10-01

    Currently, real-time visualization of large-scale digital elevation models mainly employs either the regular GRID structure based on a quadtree or triangle simplification methods based on a triangulated irregular network (TIN). Compared with GRID, TIN is a more refined means of representing the terrain surface. However, the data structure of the TIN model is complex, and it is difficult to realize view-dependent level-of-detail (LOD) representation quickly. GRID is a simple way to realize terrain LOD, but it produces a higher triangle count. A new algorithm, which takes full advantage of the merits of both methods, is presented in this paper. This algorithm combines TIN with a quadtree structure to realize view-dependent LOD control over irregular sampling point sets, and preserves detail according to the viewpoint distance and the geometric error of the terrain. Experiments indicate that this approach can generate an efficient quadtree triangulation hierarchy over any irregular sampling point set and achieves dynamic, multi-resolution visualization of large-scale terrain in real time.
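    The core of such view-dependent refinement can be sketched as an error test applied while descending the quadtree: a node is split when its geometric error, scaled by the distance to the viewpoint, exceeds a tolerance. The sketch below is an illustration of that general idea only, not the paper's algorithm; the node layout, fields and threshold are hypothetical.

```python
# Minimal sketch of view-dependent quadtree refinement: a node is refined when
# its geometric error divided by the distance to the viewpoint exceeds a
# tolerance.  Node layout, fields and the threshold are hypothetical.
import math

def refine(node, viewpoint, tau=0.01, selected=None):
    """Collect the nodes to render for the given viewpoint."""
    if selected is None:
        selected = []
    dx, dy = node["center"][0] - viewpoint[0], node["center"][1] - viewpoint[1]
    dist = max(math.hypot(dx, dy), 1e-6)
    if node["children"] and node["error"] / dist > tau:
        for child in node["children"]:
            refine(child, viewpoint, tau, selected)
    else:
        selected.append(node)              # render this node's triangulation
    return selected

# toy two-level quadtree: one root with four children of smaller error
children = [{"center": (cx, cy), "error": 0.5, "children": []}
            for cx, cy in ((25, 25), (75, 25), (25, 75), (75, 75))]
root = {"center": (50, 50), "error": 4.0, "children": children}

print(len(refine(root, viewpoint=(60, 60))))        # nearby viewer -> children selected
print(len(refine(root, viewpoint=(5000, 5000))))    # distant viewer -> root only
```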

  19. An Application to the Prediction of LOD Change Based on General Regression Neural Network

    Science.gov (United States)

    Zhang, X. H.; Wang, Q. J.; Zhu, J. J.; Zhang, H.

    2011-07-01

    Traditional prediction of the LOD (length of day) change was based on linear models, such as the least square model and the autoregressive technique, etc. Due to the complex non-linear features of the LOD variation, the performances of the linear model predictors are not fully satisfactory. This paper applies a non-linear neural network - general regression neural network (GRNN) model to forecast the LOD change, and the results are analyzed and compared with those obtained with the back propagation neural network and other models. The comparison shows that the performance of the GRNN model in the prediction of the LOD change is efficient and feasible.

  20. The research of selection model based on LOD in multi-scale display of electronic map

    Science.gov (United States)

    Zhang, Jinming; You, Xiong; Liu, Yingzhen

    2008-10-01

    This paper proposes a selection model based on LOD to aid the display of electronic maps. The ratio of display scale to map scale is regarded as an LOD operator. The categorization rule, classification rule, elementary rule and spatial geometry character rule for setting the LOD operator are also presented.

  1. Status and Prospects for Combined GPS LOD and VLBI UT1 Measurements

    Science.gov (United States)

    Senior, K.; Kouba, J.; Ray, J.

    2010-01-01

    A Kalman filter was developed to combine VLBI estimates of UT1-TAI with biased length of day (LOD) estimates from GPS. The VLBI results are the analyses of the NASA Goddard Space Flight Center group from 24-hr multi-station observing sessions several times per week and the nearly daily 1-hr single-baseline sessions. Daily GPS LOD estimates from the International GNSS Service (IGS) are combined with the VLBI UT1-TAI by modeling the natural excitation of LOD as the integral of a white noise process (i.e., as a random walk) and the UT1 variations as the integration of LOD, similar to the method described by Morabito et al. (1988). To account for GPS technique errors, which express themselves mostly as temporally correlated biases in the LOD measurements, a Gauss-Markov model has been added to assimilate the IGS data, together with a fortnightly sinusoidal term to capture errors in the IGS treatments of tidal effects. Evaluated against independent atmospheric and oceanic axial angular momentum (AAM + OAM) excitations and compared to other UT1/LOD combinations, ours performs best overall in terms of lowest RMS residual and highest correlation with (AAM + OAM) over sliding intervals down to 3 d. The IERS 05C04 and Bulletin A combinations show strong high-frequency smoothing and other problems. Until modified, the JPL SPACE series suffered in the high frequencies from not including any GPS-based LODs. We find, surprisingly, that further improvements are possible in the Kalman filter combination by selective rejection of some VLBI data. The best combined results are obtained by excluding all the 1-hr single-baseline UT1 data as well as those 24-hr UT1 measurements with formal errors greater than 5 μs (about 18% of the multi-baseline sessions). A rescaling of the VLBI formal errors, rather than rejection, was not an effective strategy. These results suggest that the UT1 errors of the 1-hr and weaker 24-hr VLBI sessions are non-Gaussian and more heterogeneous than expected
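    The structure described above, UT1 as the integral of an LOD excitation modeled as a random walk plus a Gauss-Markov bias absorbing GPS technique errors, maps naturally onto a small three-state Kalman filter. The following is a schematic sketch on synthetic numbers, not the operational filter; the noise levels, time step, units and measurement schedule are placeholders.

```python
# Schematic 3-state Kalman filter in the spirit of the combination described
# above: state x = [UT1, LOD, b], where UT1 integrates a random-walk LOD and
# b is a Gauss-Markov bias on the GPS LOD measurements.  All values are
# placeholders for illustration.
import numpy as np

rng = np.random.default_rng(3)
ndays, phi = 200, 0.98
F = np.array([[1.0, -1.0, 0.0],     # UT1_{k+1} = UT1_k - LOD_k  (ms, ms/day, dt = 1 day)
              [0.0,  1.0, 0.0],     # LOD modeled as a random walk
              [0.0,  0.0, phi]])    # Gauss-Markov GPS LOD bias
Q = np.diag([1e-6, 0.02**2, 0.01**2])
R_vlbi, R_gps = 0.02**2, 0.03**2

x_true = np.zeros(3)                # simulated "truth"
x, P = np.zeros(3), np.eye(3)       # filter state and covariance
err = []
for k in range(ndays):
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(3), Q)
    x, P = F @ x, F @ P @ F.T + Q                       # prediction step
    # daily biased GPS LOD measurement: z = LOD + b + noise
    z = x_true[1] + x_true[2] + np.sqrt(R_gps) * rng.standard_normal()
    H = np.array([[0.0, 1.0, 1.0]])
    S = H @ P @ H.T + R_gps
    K = (P @ H.T) / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(3) - K @ H) @ P
    # VLBI UT1 measurement every third day: z = UT1 + noise
    if k % 3 == 0:
        z = x_true[0] + np.sqrt(R_vlbi) * rng.standard_normal()
        H = np.array([[1.0, 0.0, 0.0]])
        S = H @ P @ H.T + R_vlbi
        K = (P @ H.T) / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(3) - K @ H) @ P
    err.append(x[0] - x_true[0])
print(f"RMS UT1 error: {np.std(err):.3f} ms")
```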

  2. Proposal for a new LOD and multi-representation concept for CityGML

    NARCIS (Netherlands)

    Löwner, Marc-O; Gröger, Gerhard; Benner, Joachim; Biljecki, F.; Nagel, Claus; Dimopoulou, E.; van Oosterom, P.

    2016-01-01

    The Open Geospatial Consortium (OGC) CityGML standard offers a Level of Detail (LoD) concept that enables the representation of CityGML features from a very detailed to a less detailed description. Due to a rising application variety, the current LoD concept seems to be too inflexible. Here, we

  3. Forecasting irregular variations of UT1-UTC and LOD data caused by ENSO

    Science.gov (United States)

    Niedzielski, T.; Kosek, W.

    2008-04-01

    The research focuses on prediction of LOD and UT1-UTC time series up to one year into the future, with particular emphasis on the prediction improvement during El Niño or La Niña events. The polynomial-harmonic least-squares model is applied to fit the deterministic function to the LOD data. The stochastic residuals, computed as the difference between the LOD data and the polynomial-harmonic model, reveal extreme values driven by El Niño or La Niña. These peaks are modeled by stochastic bivariate autoregressive prediction. This approach focuses on the auto- and cross-correlations between LOD and the axial component of the atmospheric angular momentum. This technique allows one to derive more accurate predictions than purely univariate forecasts, particularly during El Niño/La Niña events.

  4. Frank : The LOD cloud at your fingertips?

    NARCIS (Netherlands)

    Beek, Wouter; Rietveld, Laurens

    2015-01-01

    Large-scale, algorithmic access to LOD Cloud data has been hampered by the absence of queryable endpoints for many datasets, a plethora of serialization formats, and an abundance of idiosyncrasies such as syntax errors. As of late, very large-scale - hundreds of thousands of document, tens of

  5. Maximum Potential Score (MPS: An operating model for a successful customer-focused strategy.

    Directory of Open Access Journals (Sweden)

    Cabello González, José Manuel

    2015-12-01

    Full Text Available One of marketers' chief objectives is to achieve customer loyalty, which is a key factor for profitable growth. Therefore, they need to develop a strategy that attracts and maintains customers, giving them adequate motives, both tangible (prices and promotions) and intangible (personalized service and treatment), to satisfy a customer and make him loyal to the company. Finding a way to accurately measure satisfaction and customer loyalty is very important. With regard to typical Relationship Marketing measures, we can consider listening to customers, which can help to achieve a competitive sustainable advantage. Customer satisfaction surveys are essential tools for listening to customers. Short questionnaires have gained considerable acceptance among marketers as a means to achieve a customer satisfaction measure. Our research provides an indication of the benefits of a short questionnaire (one to three questions). We find that the number of questions in a survey is significantly related to participation in the survey (Net Promoter Score, NPS). We also prove that the three-question survey is more likely to have more participants than a traditional survey (Maximum Potential Score, MPS). Our main goal is to analyse one method as a potential predictor of customer loyalty. Using surveys, we attempt to empirically establish the causal factors in determining the satisfaction of customers. This paper describes a maximum potential operating model that captures, with a three-question survey, important elements for a successful customer-focused strategy. MPS may give us lower participation rates than NPS, but important information that helps to convert unhappy or merely satisfied customers into loyal customers.

  6. The variants of an LOD of a 3D building model and their influence on spatial analyses

    Science.gov (United States)

    Biljecki, Filip; Ledoux, Hugo; Stoter, Jantien; Vosselman, George

    2016-06-01

    The level of detail (LOD) of a 3D city model indicates the model's grade and usability. However, there exist multiple valid variants of each LOD. As a consequence, the LOD concept is inconclusive as an instruction for the acquisition of 3D city models. For instance, the top surface of an LOD1 block model may be modelled at the eaves of a building or at its ridge height. Such variants, which we term geometric references, are often overlooked and are usually not documented in the metadata. Furthermore, the influence of a particular geometric reference on the performance of a spatial analysis is not known. In response to this research gap, we investigate a variety of LOD1 and LOD2 geometric references that are commonly employed, and perform numerical experiments to investigate their relative difference when used as input for different spatial analyses. We consider three use cases (estimation of the area of the building envelope, building volume, and shadows cast by buildings), and compute the deviations in a Monte Carlo simulation. The experiments, carried out with procedurally generated models, indicate that two 3D models representing the same building at the same LOD, but modelled according to different geometric references, may yield substantially different results when used in a spatial analysis. The outcome of our experiments also suggests that the geometric reference may have a bigger influence than the LOD, since an LOD1 with a specific geometric reference may yield a more accurate result than when using LOD2 models.
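    A worked toy example of the effect (my own numbers, not from the paper): for a simple gable-roofed building, an LOD1 block extruded to the ridge overestimates the volume and one extruded to the eaves underestimates it, relative to the true solid.

```python
# Toy illustration of the LOD1 "geometric reference" effect on building volume:
# a 10 m x 8 m footprint, eaves at 6 m, ridge at 9 m (all numbers made up).
footprint = 10.0 * 8.0          # m^2
h_eaves, h_ridge = 6.0, 9.0     # m

v_true = footprint * h_eaves + footprint * (h_ridge - h_eaves) / 2.0  # walls + gable roof
v_lod1_eaves = footprint * h_eaves
v_lod1_ridge = footprint * h_ridge

for name, v in [("true", v_true), ("LOD1 @ eaves", v_lod1_eaves), ("LOD1 @ ridge", v_lod1_ridge)]:
    print(f"{name:13s}: {v:7.1f} m^3  ({100 * (v - v_true) / v_true:+.1f}% vs true)")
```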

  7. Frank: The LOD cloud at your fingertips?

    NARCIS (Netherlands)

    Beek, Wouter; Rietveld, Laurens

    2015-01-01

    Large-scale, algorithmic access to LOD Cloud data has been hampered by the absence of queryable endpoints for many datasets, a plethora of serialization formats, and an abundance of idiosyncrasies such as syntax errors. As of late, very large-scale — hundreds of thousands of document, tens of

  8. Cycles, scaling and crossover phenomenon in length of the day (LOD) time series

    Science.gov (United States)

    Telesca, Luciano

    2007-06-01

    The dynamics of the temporal fluctuations of the length of day (LOD) time series from January 1, 1962 to November 2, 2006 were investigated. The power spectrum of the whole time series revealed annual, semi-annual, decadal and daily oscillatory behaviors, correlated with oceanic-atmospheric processes and interactions. The scaling behavior was analyzed by using detrended fluctuation analysis (DFA), which revealed two different scaling regimes, separated by a crossover timescale at approximately 23 days. A flicker-noise process can describe the dynamics of the LOD series in the regime of intermediate and long timescales, while Brownian dynamics characterizes the LOD time series at small timescales.
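
    As a hedged illustration of the analysis described above, the sketch below implements plain detrended fluctuation analysis (DFA) on a synthetic series; it is not the author's code, and the synthetic input is only a stand-in for the LOD series. The scaling exponent alpha distinguishes white noise (about 0.5), flicker noise (about 1.0) and Brownian dynamics (about 1.5), and a crossover would show up as a change of slope in the log-log plot of F(n).

```python
import numpy as np

def dfa(x, window_sizes, order=1):
    """Detrended fluctuation analysis: returns F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))             # integrated (profile) series
    F = []
    for n in window_sizes:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)  # local polynomial trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.normal(size=4096)                     # white noise -> alpha close to 0.5
sizes = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, sizes)
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
print(f"estimated scaling exponent alpha = {alpha:.2f}")
```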

  9. CONFIRMATION OF X-LINKED INHERITANCE AND PROVISIONAL MAPPING OF THE KERATOSIS FOLLICULARIS SPINULOSA DECALVANS GENE ON XP IN A LARGE DUTCH FAMILY

    NARCIS (Netherlands)

    Oosterwijk, JC; NELEN, M; VANZANDVOORT, PM; VANOSCH, LDM; ORANJE, AP; WITTEBOLPOST, D; VANOOST, BA

    In a large Dutch family with keratosis follicularis spinulosa decalvans (KFSD, MIM 308800), DNA linkage analysis was performed in order to locate the gene. Pedigree analysis and lod score calculation confirmed X-linked inheritance and revealed significant linkage to DNA markers on Xp. A maximum lod

  10. LOD Laundromat : Why the Semantic Web needs centralization (even if we don't like it)

    NARCIS (Netherlands)

    Beek, Wouter; Rietveld, Laurens; Schlobach, Stefan; van Harmelen, Frank

    2016-01-01

    LOD Laundromat poses a centralized solution for today's Semantic Web problems. This approach adheres more closely to the original vision of a Web of Data, providing uniform access to a large and ever-increasing subcollection of the LOD Cloud.

  11. Correlation between maximum phonetically balanced word recognition score and pure-tone auditory threshold in elder presbycusis patients over 80 years old.

    Science.gov (United States)

    Deng, Xin-Sheng; Ji, Fei; Yang, Shi-Ming

    2014-02-01

    The maximum phonetically balanced word recognition score (PBmax) showed poor correlation with pure-tone thresholds in presbycusis patients older than 80 years. The objective was to study the characteristics of monosyllable recognition in presbycusis patients older than 80 years of age. Thirty presbycusis patients older than 80 years were included as the test group (group 80+). Another 30 patients aged 60-80 years were selected as the control group (group 80-). PBmax was tested with Mandarin monosyllable recognition test materials at a signal level 30 dB above the averaged thresholds of 0.5, 1, 2, and 4 kHz (4FA) or at the maximum comfortable level. The PBmax values of the test group and control group were compared with each other, and the correlation between PBmax and the predicted maximum speech recognition scores based on 4FA (PBmax-predict) was statistically analyzed. Under the optimal test conditions, the averaged PBmax was (77.3 ± 16.7)% for group 80- and (52.0 ± 25.4)% for group 80+ (p < 0.001). The PBmax of group 80- was significantly correlated with PBmax-predict (Spearman correlation = 0.715, p < 0.001). The score for group 80+ was less strongly correlated with PBmax-predict (Spearman correlation = 0.572, p = 0.001).

  12. Tidal influence through LOD variations on the temporal distribution of earthquake occurrences

    Science.gov (United States)

    Varga, P.; Gambis, D.; Bizouard, Ch.; Bus, Z.; Kiszely, M.

    2006-10-01

    Stresses generated by the body tides are very small at the depth of crustal earthquakes (~10^2 N/m2). The maximum value of the lunisolar stress within the depth range of earthquakes is 10^3 N/m2 (at a depth of about 600 km). Surface loads due to oceanic tides in coastal areas are ~10^4 N/m2. These influences are, however, too small to affect the outbreak time of seismic events. The authors show that the effect on the time distribution of seismic activity due to the ΔLOD generated by the zonal tides (the Mf, Mm, Ssa and Sa constituents) can be much more effective in triggering earthquakes. According to this approach, the tides do not trigger seismic events directly but through the length-of-day variations they generate. That is why a correlation between the lunisolar effect and seismic activity exists for the zonal tides, but not for the tesseral and sectorial tides.

  13. Medium- and Long-term Prediction of LOD Change with the Leap-step Autoregressive Model

    Science.gov (United States)

    Liu, Q. B.; Wang, Q. J.; Lei, M. F.

    2015-09-01

    It is known that the accuracy of medium- and long-term prediction of changes in the length of day (LOD) based on the combined least-squares and autoregressive (LS+AR) model decreases gradually. The leap-step autoregressive (LSAR) model is more accurate and stable in medium- and long-term prediction, and it is therefore used to forecast the LOD changes in this work. The LOD series from EOP 08 C04, provided by the IERS (International Earth Rotation and Reference Systems Service), is then used to compare the effectiveness of the LSAR and traditional AR methods. The predicted series resulting from the two models show that the prediction accuracy of the LSAR model is better than that of the AR model in medium- and long-term prediction.

  14. GOOF: OCTOPUS error messages, ORDER, ORDERLIB, FLOE, CHAT, and LOD

    Energy Technology Data Exchange (ETDEWEB)

    Whitten, G.

    1977-07-10

    This is a compilation of the error messages returned by five parts of the Livermore timesharing system: the ORDER batch-processor, the ORDERLIB subroutine library, the FLOE operating system, the CHAT compiler, and the LOD loader.

  15. Secular changes of LOD associated with a growth of the inner core

    Science.gov (United States)

    Denis, C.; Rybicki, K. R.; Varga, P.

    2006-05-01

    From recent estimates of the age of the inner core based on the theory of thermal evolution of the core, we estimate that nowadays the growth of the inner core may perhaps contribute to the observed overall secular increase of LOD caused mainly by tidal friction (i.e., 1.72 ms per century) by a relative decrease of 2 to 7 μs per century. Another, albeit much less plausible, hypothesis is that crystallization of the inner core does not produce any change of LOD, but makes the inner core rotate differentially with respect to the outer core and mantle.

  16. Efficient Simplification Methods for Generating High Quality LODs of 3D Meshes

    Institute of Scientific and Technical Information of China (English)

    Muhammad Hussain

    2009-01-01

    Two simplification algorithms are proposed for the automatic decimation of polygonal models and for generating their LODs. Each algorithm orders vertices according to their priority values and then removes them iteratively. To set the priority value of each vertex, exploiting the normal field of its one-ring neighborhood, we introduce a new measure of geometric fidelity that reflects well the local geometric features of the vertex. After a vertex is selected, other measures of geometric distortion, based on normal field deviation and a distance measure, decide which of the edges incident on the vertex is collapsed to remove it. The collapsed edge is substituted with a new vertex whose position is found by minimizing the local quadric error measure. A comparison with state-of-the-art algorithms reveals that the proposed algorithms are simple to implement, are computationally more efficient, generate LODs of better quality, and preserve salient features even after drastic simplification. The methods are useful for applications such as 3D computer games and virtual reality, where the focus is on fast running time, reduced memory overhead, and high-quality LODs.
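
    A hedged sketch of the kind of priority measure described above (not the authors' exact metric): each vertex is scored by how much the face normals in its one-ring deviate from their mean, so vertices in flat regions get low values and become candidates for early removal.

```python
import numpy as np

def face_normals(V, F):
    n = np.cross(V[F[:, 1]] - V[F[:, 0]], V[F[:, 2]] - V[F[:, 0]])
    return n / np.linalg.norm(n, axis=1, keepdims=True)

def vertex_priorities(V, F):
    """Priority ~ deviation of one-ring face normals; flat neighbourhoods score lowest."""
    N = face_normals(V, F)
    prio = np.full(len(V), np.inf)                # vertices with a tiny one-ring are kept last
    for v in range(len(V)):
        ring = np.flatnonzero((F == v).any(axis=1))   # faces incident on vertex v
        if len(ring) < 2:
            continue
        n = N[ring]
        mean = n.mean(axis=0)
        mean /= np.linalg.norm(mean)
        prio[v] = 1.0 - float(np.min(n @ mean))   # 0 = perfectly flat neighbourhood
    return prio

# Toy mesh: four triangles of a shallow pyramid; the apex (index 4) gets the largest value.
V = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0], [0.5, 0.5, 0.3]], float)
F = np.array([[0, 1, 4], [1, 2, 4], [2, 3, 4], [3, 0, 4]])
print(vertex_priorities(V, F))
```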

  17. LOD-based clustering techniques for efficient large-scale terrain storage and visualization

    Science.gov (United States)

    Bao, Xiaohong; Pajarola, Renato

    2003-05-01

    Large multi-resolution terrain data sets are usually stored out-of-core. To visualize terrain data at interactive frame rates, the data needs to be organized on disk, loaded into main memory part by part, then rendered efficiently. Many main-memory algorithms have been proposed for efficient vertex selection and mesh construction. Organization of terrain data on disk is quite difficult because the error, the triangulation dependency and the spatial location of each vertex all need to be considered. Previous terrain clustering algorithms did not consider the per-vertex approximation error of individual terrain data sets. Therefore, the vertex sequences on disk are exactly the same for any terrain. In this paper, we propose a novel clustering algorithm which introduces the level-of-detail (LOD) information to terrain data organization to map multi-resolution terrain data to external memory. In our approach the LOD parameters of the terrain elevation points are reflected during clustering. The experiments show that dynamic loading and paging of terrain data at varying LOD is very efficient and minimizes page faults. Additionally, the preprocessing of this algorithm is very fast and works from out-of-core.

  18. LOD-climate Links: how the 2015-2016 El Niño Lengthened the Day by 0.8 ms, and Possible Rotational Forcing of Multidecadal Temperature Changes

    Science.gov (United States)

    Lambert, S. B.; de Viron, O.; Marcus, S.

    2016-12-01

    El Niño events are generally accompanied by significant changes in the Earth's length-of-day (LOD) that can be explained by two approaches. Considering the angular momentum conservation of the system composed of the solid Earth and the atmosphere, ENSO events are accompanied by a strengthening of the subtropical jet streams and, therefore, a decrease of the Earth's rotation rate. Using the torque approach, the low pressure field of the Eastern Pacific, which is close to the high mountain ranges along the western American coasts, creates a negative torque of the atmosphere on the solid Earth which tends to slow down the Earth's rotation. The large 1983 event was associated with a lengthening of the day of about 1 ms. During the 2015-2016 winter season, a major ENSO event occurred, classified as very strong by meteorological agencies. This central Pacific event, for which the Niño 3.4 index is as high as in 1983, was also concurrent with positive phases of the PDO, NAO, and AAO. It coincided with an excursion of the LOD as large as 0.8 ms over a few weeks, reaching its maximum around the 2016 New Year. We evaluate the mountain and friction torques responsible for the Earth's rotation variations during the winter season and compare them to the mean situation and to the previous strong ENSO events of 1983 and 1998. In particular, we notice that the contribution from the American mountain ranges is close to the value of 1983. The weaker LOD excursion comes from a nonexistent torque over the Himalayas, a weaker contribution from Europe, and a noticeable positive contribution from Antarctica. On longer time scales, core-generated ms-scale LOD excursions are found to precede NH surface and global SST fluctuations by nearly a decade; although the cause of this apparent rotational effect is not known, reported correlations of LOD and tidal-orbital forcing with surface and submarine volcanic activity offer prospects to explain these observations in a core-to-climate chain of causality.

  19. The Use of Daily Geodetic UT1 and LOD Data in the Optimal Estimation of UT1 and LOD With the JPL Kalman Earth Orientation Filter

    Science.gov (United States)

    Freedman, A. P.; Steppe, J. A.

    1995-01-01

    The Jet Propulsion Laboratory Kalman Earth Orientation Filter (KEOF) uses several of the Earth rotation data sets available to generate optimally interpolated UT1 and LOD series to support spacecraft navigation. This paper compares use of various data sets within KEOF.

  20. Exploring the Processes of Generating LOD (0-2) Citygml Models in Greater Municipality of Istanbul

    Science.gov (United States)

    Buyuksalih, I.; Isikdag, U.; Zlatanova, S.

    2013-08-01

    3D models of cities, visualised and explored in 3D virtual environments, have been available for several years. Currently a large number of impressive, realistic 3D models are regularly presented at scientific, professional and commercial events. One of the most promising developments is the OGC standard CityGML. CityGML is an object-oriented model that supports 3D geometry and thematic semantics, attributes and relationships, and offers advanced options for realistic visualization. One of the very attractive characteristics of the model is the support of 5 levels of detail (LOD), starting from a less accurate 2.5D model (LOD0) and ending with a very detailed indoor model (LOD4). Different local government offices and municipalities have different needs when utilizing CityGML models, and the process of model generation depends on local and domain-specific needs. Although the processes (i.e. the tasks and activities) for generating the models differ depending on the utilization purpose, there are also some common tasks (i.e. common denominator processes) in the generation of CityGML models. This paper focuses on defining the common tasks in the generation of LOD 0-2 CityGML models and representing them in a formal way with process modelling diagrams.

  1. LOD-A-lot: A single-file enabler for data science

    NARCIS (Netherlands)

    Beek, Wouter; Fernández, Javier D.; Verborgh, Ruben

    2017-01-01

    Many data scientists make use of Linked Open Data (LOD) as a huge interconnected knowledge base represented in RDF. However, the distributed nature of the information and the lack of a scalable approach to manage and consume such Big Semantic Data makes it difficult and expensive to conduct

  2. Search for linkage to schizophrenia on the X and Y chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, M.; Ott, J. [Columbia Univ., New York, NY (United States); Vita, A. [Univ. of Milan (Italy)] [and others]

    1994-06-15

    Markers for X chromosome loci were used in linkage studies of a large group of small families (n = 126) with at least two schizophrenic members in one sibship. Based on the hypothesis that a gene for schizophrenia could be X-Y linked, with homologous loci on both X and Y, our analyses included all families regardless of the pattern of familial inheritance. Lod scores were computed with both a standard X-linked model and a novel X-Y model, and sib-pair analyses were performed for all markers, examining the sharing of maternal alleles. Small positive lod scores were obtained for pericentromeric loci, from Xp11.4 to Xq12. Lod scores were also computed separately in families selected for evidence of maternal inheritance and absence of male-to-male transmission of psychosis. The lod scores for linkage to the locus DXS7 reached a maximum of 1.83 at a recombination fraction of 0.08, assuming dominant inheritance on the X chromosome in these families (n = 34). Further investigation of the X-Y homologous gene hypothesis focussing on this region is warranted. 39 refs., 1 fig., 6 tabs.

  3. Application of LOD technology to the economic residence GIS for industry and commerce administration

    Science.gov (United States)

    Song, Yongjun; Feng, Xuezhi; Zhao, Shuhe; Yin, Haiwei; Li, Yulin; Cui, Hongxia; Zhang, Hui; Zhong, Quanbao

    2007-06-01

    The LOD technology has an impact on the multi-scale representation of spatial databases. This paper takes advantage of LOD technology to express multi-scale geographical data and to establish the exchange of multi-scale electronic maps, so that the details of geographic features such as points, lines and polygons are displayed more and more clearly as the display scale is enlarged, making it convenient for the personnel of industry and commerce administration offices to label the locations of corporations or enterprises.

  4. Aspects Of 40- to 50-Day Oscillations In LOD And AAM

    Science.gov (United States)

    Dickey, Jean O.; Marcus, Steven L.; Ghil, Michael

    1992-01-01

    Report presents study of fluctuations in rotation of Earth, focusing on irregular intraseasonal oscillations in length of day (LOD) and atmospheric angular momentum (AAM) with periods varying from 40 to 50 days. Study draws upon and extends results of prior research.

  5. GENERATION OF MULTI-LOD 3D CITY MODELS IN CITYGML WITH THE PROCEDURAL MODELLING ENGINE RANDOM3DCITY

    Directory of Open Access Journals (Sweden)

    F. Biljecki

    2016-09-01

    The production and dissemination of semantic 3D city models is rapidly increasing, benefiting a growing number of use cases. However, their availability in multiple LODs and in the CityGML format is still problematic in practice. This hinders applications and experiments where multi-LOD datasets are required as input, for instance, to determine the performance of different LODs in a spatial analysis. An alternative approach to obtaining 3D city models is to generate them with procedural modelling, which is – as we discuss in this paper – well suited as a method to source multi-LOD datasets useful for a number of applications. However, procedural modelling has not yet been employed for this purpose. Therefore, we have developed RANDOM3DCITY, an experimental procedural modelling engine for generating synthetic datasets of buildings and other urban features. The engine is designed to produce models in CityGML and does so in multiple LODs. Besides the generation of multiple geometric LODs, we implement the realisation of multiple levels of spatiosemantic coherence, geometric reference variants, and indoor representations. As a result of their permutations, each building can be generated in 392 different CityGML representations, an unprecedented number of modelling variants of the same feature. The datasets produced by RANDOM3DCITY are suited for several applications, as we show in this paper with documented uses. The developed engine is available under an open-source licence on GitHub at http://github.com/tudelft3d/Random3Dcity.

  6. LOD BIM Element specification for Railway Turnout Systems Risk Mitigation using the Information Delivery Manual

    Science.gov (United States)

    Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan

    2017-10-01

    Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and monetary compensation in case of death or injury. These are fundamental drivers to consider when mitigating risks arising from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored and retrieved from a digital database, making it easy to take prompt decisions as information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. In order to overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets which describe, graphically and non-graphically, a rail turnout as the model progresses, thus permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.

  7. A smoothed maximum score estimator for the binary choice panel data model with individual fixed effects and applications to labour force participation

    NARCIS (Netherlands)

    Charlier, G.W.P.

    1994-01-01

    In a binary choice panel data model with individual effects and two time periods, Manski proposed the maximum score estimator, based on a discontinuous objective function, and proved its consistency under weak distributional assumptions. However, the rate of convergence of this estimator is low (N)

  8. Effects of lodoxamide (LOD), disodium cromoglycate (DSCG) and N-acetyl-aspartyl-glutamate sodium salt (NAAGA) on ocular active anaphylaxis.

    Science.gov (United States)

    Goldschmidt, P; Luyckx, J

    1996-04-01

    LOD, DSCG and NAAGA eye-drops were evaluated on experimentally induced ocular active anaphylaxis in guinea pigs. Twelve animals per group were sensitized with egg albumin i.p. and challenged on the surface of the eye 14 days later. Two days before the challenge, animals were treated with LOD, DSCG or NAAGA 4 times a day. Permeability indexes were calculated after intracardiac injection of Evans Blue. No effect on ocular active anaphylaxis was found with either LOD or DSCG. NAAGA was able to significantly reduce blood-eye permeability indexes.

  9. Linked open data creating knowledge out of interlinked data : results of the LOD2 project

    CERN Document Server

    Bryl, Volha; Tramp, Sebastian

    2014-01-01

    Linked Open Data (LOD) is a pragmatic approach for realizing the Semantic Web vision of making the Web a global, distributed, semantics-based information system. This book presents an overview on the results of the research project “LOD2 -- Creating Knowledge out of Interlinked Data”. LOD2 is a large-scale integrating project co-funded by the European Commission within the FP7 Information and Communication Technologies Work Program. Commencing in September 2010, this 4-year project comprised leading Linked Open Data research groups, companies, and service providers from across 11 European countries and South Korea. The aim of this project was to advance the state-of-the-art in research and development in four key areas relevant for Linked Data, namely 1. RDF data management; 2. the extraction, creation, and enrichment of structured RDF data; 3. the interlinking and fusion of Linked Data from different sources and 4. the authoring, exploration and visualization of Linked Data.

  10. A Microfluidic Lab-on-a-Disc (LOD) for Antioxidant Activities of Plant Extracts

    Directory of Open Access Journals (Sweden)

    Nurhaslina Abd Rahman

    2018-03-01

    Antioxidants are important substances that can fight the deterioration caused by free radicals and can easily oxidize when exposed to light. There are many methods to measure the antioxidant activity of a biological sample, for example the 2,2-diphenyl-1-picrylhydrazyl (DPPH) antioxidant activity test, which is one of the simplest methods used. Despite its simplicity, the organic solvent that is used to dilute DPPH easily evaporates and degrades with light exposure and time. Thus, it needs to be used at the earliest convenient time prior to the experiment. To overcome this issue, a rapid and closed system for antioxidant activity is required. In this paper, we introduce the Lab-on-a-Disc (LoD) method, which integrates the DPPH antioxidant activity test on a microfluidic compact disc (CD). We used ascorbic acid, quercetin, Areca catechu, Polygonum minus, and Syzygium polyanthum plant extracts to compare the results of our proposed LoD method with the conventional method. In contrast to the laborious conventional method, our proposed method offers rapid analysis and simple determination of antioxidants. This proposed LoD method for antioxidant activity in plants could be a platform for the further development of antioxidant assays.

  11. Strategy for determination of LOD and LOQ values--some basic aspects.

    Science.gov (United States)

    Uhrovčík, Jozef

    2014-02-01

    The paper is devoted to the evaluation of limit of detection (LOD) and limit of quantification (LOQ) values in the concentration domain by using 4 different approaches, namely the 3σ and 10σ approaches, the ULA2 approach, the PBA approach and the MDL approach. Brief theoretical analyses of all the above-mentioned approaches are given, together with directions for their practical use. Calculations and correct calibration design are exemplified using electrothermal atomic absorption spectrometry for the determination of lead in a drinking water sample. These validation parameters reached 1.6 μg L(-1) (LOD) and 5.4 μg L(-1) (LOQ) using the 3σ and 10σ approaches. For obtaining relevant values of analyte concentration, the influence of calibration design and measurement methodology was examined. The most preferred technique proved to be preconcentration of the analyte on the surface of the graphite cuvette (boost cycle).
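
    A minimal sketch of the 3σ and 10σ approaches on a linear calibration follows. The calibration data are made up (not the paper's lead/ETAAS measurements), and the residual standard deviation of the fit is used as the σ estimate, a common stand-in for the standard deviation of blank measurements.

```python
import numpy as np

# Illustrative calibration data: concentration (ug/L) vs. instrument signal (invented numbers).
conc   = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.012, 0.101, 0.188, 0.279, 0.362, 0.455])

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)  # s_y/x as the sigma estimate

lod = 3 * residual_sd / slope       # 3-sigma approach
loq = 10 * residual_sd / slope      # 10-sigma approach
print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
```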

  12. Ornithine aminotransferase (OAT): recombination between an X-linked OAT sequence (7.5 kb) and the Norrie disease locus.

    Science.gov (United States)

    Ngo, J T; Bateman, J B; Spence, M A; Cortessis, V; Sparkes, R S; Kivlin, J D; Mohandas, T; Inana, G

    1990-01-01

    A human ornithine aminotransferase (OAT) locus has been mapped to Xp11.2, as has the Norrie disease locus. We used a cDNA probe to investigate a three-generation UCLA family with Norrie disease; a 4.2-kb RFLP was detected and a maximum lod score of 0.602 at zero recombination fraction was calculated. We used the same probe to study a second multigeneration family with Norrie disease from Utah. A different RFLP, 7.5 kb in size, was identified, and a recombination event between the OAT locus represented by this RFLP and the disease locus was observed. Linkage analysis of these two loci in this family revealed a maximum lod score of 1.88 at a recombination fraction of 0.10. Although both families have affected members with the same disease, the lod scores are reported separately because the 4.2- and 7.5-kb RFLPs may represent two different loci for the X-linked OAT.

  13. Automated Reconstruction of Building LoDs from Airborne LiDAR Point Clouds Using an Improved Morphological Scale Space

    Directory of Open Access Journals (Sweden)

    Bisheng Yang

    2016-12-01

    Reconstructing building models at different levels of detail (LoDs) from airborne laser scanning point clouds is urgently needed for wide application, as this approach can balance between the user's requirements and economic costs. Previous methods reconstruct building LoDs from the finest 3D building models rather than from point clouds, resulting in heavy costs and inflexible adaptivity. The scale space is a sound theory for the multi-scale representation of an object from a coarser level to a finer level. Therefore, this paper proposes a novel method to reconstruct buildings at different LoDs from airborne Light Detection and Ranging (LiDAR) point clouds based on an improved morphological scale space. The proposed method first extracts building candidate regions following the separation of ground and non-ground points. For each building candidate region, the proposed method generates a scale space by iteratively applying the improved morphological reconstruction with increasing scale, and constructs the corresponding topological relationship graphs (TRGs) across scales. Secondly, the proposed method robustly extracts building points by using features based on the TRG. Finally, the proposed method reconstructs each building at different LoDs according to the TRG. The experiments demonstrate that the proposed method robustly extracts buildings with details (e.g., door eaves and roof furniture) and illustrate good performance in distinguishing buildings from vegetation or other objects, while automatically reconstructing building LoDs from the finest building points.
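
    The sketch below is only a loose, hedged illustration of the morphological scale-space idea, applied to a raster DSM rather than to the LiDAR point cloud and TRG structure used in the paper: grey-scale openings with increasing window size remove progressively larger above-ground objects, and objects that survive the small scales but vanish at the large scale become building candidates.

```python
import numpy as np
from scipy.ndimage import grey_opening

# Synthetic digital surface model: flat ground, one "building" block and one small bump.
dsm = np.zeros((100, 100))
dsm[30:50, 30:60] = 10.0                       # building, 10 m high
dsm[70:74, 70:74] = 8.0                        # small object (vegetation-like)

# Morphological scale space: openings with increasing window size.
scales = [5, 15, 41]
scale_space = [grey_opening(dsm, size=(s, s)) for s in scales]

candidates    = (dsm - scale_space[-1]) > 2.0  # gone only at the largest scale (assumed 2 m cut-off)
small_objects = (dsm - scale_space[0]) > 2.0   # already removed at the smallest scale
building_mask = candidates & ~small_objects
print("building cells:", building_mask.sum(), "small-object cells:", small_objects.sum())
```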

  14. Estimation of the POD function and the LOD of a qualitative microbiological measurement method.

    Science.gov (United States)

    Wilrich, Cordula; Wilrich, Peter-Theodor

    2009-01-01

    Qualitative microbiological measurement methods in which the measurement results are either 0 (microorganism not detected) or 1 (microorganism detected) are discussed. The performance of such a measurement method is described by its probability of detection as a function of the contamination (CFU/g or CFU/mL) of the test material, or by the LOD(p), i.e., the contamination that is detected (measurement result 1) with a specified probability p. A complementary log-log model was used to statistically estimate these performance characteristics. An intralaboratory experiment for the detection of Listeria monocytogenes in various food matrixes illustrates the method. The estimate of LOD50% is compared with the Spearman-Kaerber method.
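
    A hedged sketch of the statistical model described above: the probability of detection is modelled as POD(c) = 1 - exp(-exp(a + b*ln c)), a complementary log-log model, fitted here by maximum likelihood to made-up spiking data, and LOD50% is obtained by inverting the fitted curve. This illustrates the model form only; it is not the authors' estimation code.

```python
import numpy as np
from scipy.optimize import minimize

# Made-up spiking experiment: contamination level (CFU/g) and detected (1) / not detected (0).
conc = np.repeat([0.05, 0.1, 0.5, 1.0, 5.0], 20)
rng = np.random.default_rng(1)
true_pod = 1 - np.exp(-conc / 0.4)                 # synthetic "true" POD curve
y = rng.binomial(1, true_pod)

def negloglik(params):
    a, b = params
    eta = a + b * np.log(conc)
    p = np.clip(1 - np.exp(-np.exp(eta)), 1e-12, 1 - 1e-12)   # complementary log-log POD
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

a_hat, b_hat = minimize(negloglik, x0=[0.0, 1.0]).x

# LOD_p: contamination detected with probability p; invert the fitted model for p = 0.5.
p = 0.5
lod50 = np.exp((np.log(-np.log(1 - p)) - a_hat) / b_hat)
print(f"estimated LOD50% ~ {lod50:.2f} CFU/g")
```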

  15. CA-LOD: Collision Avoidance Level of Detail for Scalable, Controllable Crowds

    Science.gov (United States)

    Paris, Sébastien; Gerdelan, Anton; O'Sullivan, Carol

    The new wave of computer-driven entertainment technology throws audiences and game players into massive virtual worlds where entire cities are rendered in real time. Computer-animated characters run through inner-city streets teeming with pedestrians, all fully rendered with 3D graphics, animations and particle effects, and linked to 3D sound effects to produce more realistic and immersive computer-hosted entertainment experiences than ever before. Computing all of this detail at once is enormously computationally expensive, and game designers, as a rule, have sacrificed behavioural realism in favour of better graphics. In this paper we propose a new Collision Avoidance Level of Detail (CA-LOD) algorithm that allows games to support huge crowds in real time with the appearance of more intelligent behaviour. We propose two collision avoidance models used for two different CA-LODs: a fuzzy steering model focusing on performance, and a geometric steering model to obtain the best realism. Mixing these approaches makes it possible to obtain thousands of autonomous characters in real time, resulting in a scalable but still controllable crowd.

  16. Interannual variations in length-of-day (LOD) as a tool to assess climate variability and climate change

    Science.gov (United States)

    Lehmann, E.

    2016-12-01

    On interannual time scales the atmosphere significantly affects fluctuations in the geodetic quantity of length-of-day (LOD). This effect is directly proportional to perturbations in the relative angular momentum of the atmosphere (AAM) computed from zonal winds. During El Niño events tropospheric westerlies increase due to elevated sea surface temperatures (SST) in the Pacific, inducing peak anomalies in relative AAM and, correspondingly, in LOD. However, El Niño events affect LOD variations with different strength, and the causes of this varying effect are not yet clear. Here, we investigate the LOD-El Niño relationship in the 20th and 21st centuries (1982-2100) to determine whether the quantity of LOD can be used as a geophysical tool to assess variability and change in a future climate. In our analysis we applied a windowed discrete Fourier transform to all de-seasonalized data to remove climatic signals outside of the El Niño frequency band. LOD (data: IERS) was related in space and time to relative AAM and SSTs (data: ERA-40 reanalysis, IPCC ECHAM05-OM1 20C, A1B). Results from mapped Pearson correlation coefficients and time-frequency behavior analysis identified a teleconnection pattern that we term the EN≥65%-index. The EN≥65%-index prescribes a significant change in variation in length-of-day of +65% or more related to (1) SST anomalies of >2° in the Pacific Niño region (160°E-80°W, 5°S-5°N), (2) corresponding stratospheric warming anomalies of the quasi-biennial oscillation (QBO), and (3) strong westerly winds in the lower equatorial stratosphere. In our analysis we show that the coupled atmosphere-ocean conditions prescribed in the EN≥65%-index apply to the extreme El Niño events of 1982/83 and 1997/98, and to 75% of all El Niño events in the last third of the 21st century. At that period of time the EN≥65%-index describes a projected altered base state of the equatorial Pacific that shows almost continuous El Niño conditions under climate warming.

  17. Improvement of LOD in Fluorescence Detection with Spectrally Nonuniform Background by Optimization of Emission Filtering.

    Science.gov (United States)

    Galievsky, Victor A; Stasheuski, Alexander S; Krylov, Sergey N

    2017-10-17

    The limit-of-detection (LOD) in analytical instruments with fluorescence detection can be improved by reducing the noise of the optical background. Efficiently reducing optical background noise in systems with a spectrally nonuniform background requires complex optimization of the emission filter, the main element of spectral filtration. Here, we introduce a filter-optimization method which utilizes an expression for the signal-to-noise ratio (SNR) as a function of (i) all noise components (dark, shot, and flicker), (ii) the emission spectrum of the analyte, (iii) the emission spectrum of the optical background, and (iv) the transmittance spectrum of the emission filter. In essence, the noise components and the emission spectra are determined experimentally and substituted into the expression. This leaves a single variable, the transmittance spectrum of the filter, which is optimized numerically by maximizing the SNR. Maximizing the SNR provides an accurate way of filter optimization, while a previously used approach based on maximizing the signal-to-background ratio (SBR) is an approximation that can lead to much poorer LOD, specifically in the detection of fluorescently labeled biomolecules. The proposed filter-optimization method will be an indispensable tool for developing new and improving existing fluorescence-detection systems aiming at ultimately low LOD.
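
    The following is a rough, hedged numerical sketch of the idea, with assumed spectra, assumed noise parameters and ideal band-pass filters rather than the authors' instrument model: the SNR is written as signal over the root-sum-square of dark, shot and flicker noise, and the pass-band edges are scanned to find the SNR-optimal filter.

```python
import numpy as np

wl = np.arange(500.0, 701.0)                               # wavelength grid, nm
analyte    = np.exp(-0.5 * ((wl - 560) / 15.0) ** 2)       # assumed analyte emission spectrum
background = 0.4 + 0.004 * (wl - 500)                      # assumed sloped (nonuniform) background

dark_noise, flicker_coeff = 5.0, 0.02                      # illustrative noise parameters

def snr_for_bandpass(lo, hi):
    t = ((wl >= lo) & (wl <= hi)).astype(float)            # ideal band-pass transmittance
    signal = float(np.sum(analyte * t))
    bg     = float(np.sum(background * t))
    shot2  = signal + bg                                    # Poisson shot-noise variance
    flick2 = (flicker_coeff * bg) ** 2                      # flicker noise scales with background
    return signal / np.sqrt(dark_noise ** 2 + shot2 + flick2)

edges = [(lo, hi) for lo in range(500, 700, 5) for hi in range(lo + 10, 701, 5)]
best = max(edges, key=lambda e: snr_for_bandpass(*e))
print("SNR-optimal pass band:", best, "nm")
```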

  18. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    Science.gov (United States)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main approaches to gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of a gear crack is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE approach is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for the tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  19. Genome scan for linkage to asthma using a linkage disequilibrium-lod score test.

    Science.gov (United States)

    Jiang, Y; Slager, S L; Huang, J

    2001-01-01

    We report a genome-wide linkage study of asthma on the German and Collaborative Study on the Genetics of Asthma (CSGA) data. Using a combined linkage and linkage disequilibrium test and the nonparametric linkage score, we identified 13 markers from the German data, 1 marker from the African American (CSGA) data, and 7 markers from the Caucasian (CSGA) data in which the p-values ranged between 0.0001 and 0.0100. From our analysis and taking into account previous published linkage studies of asthma, we suggest that three regions in chromosome 5 (around D5S418, D5S644, and D5S422), one region in chromosome 6 (around three neighboring markers D6S1281, D6S291, and D6S1019), one region in chromosome 11 (around D11S2362), and two regions in chromosome 12 (around D12S351 and D12S324) especially merit further investigation.

  20. Wavelet analysis of interannual LOD, AAM, and ENSO: 1997-98 El Niño and 1998-99 La Niña signals

    Science.gov (United States)

    Zhou, Y. H.; Zheng, D. W.; Liao, X. H.

    2001-05-01

    On the basis of the data series of the length of day (LOD), the atmospheric angular momentum (AAM) and the Southern Oscillation Index (SOI) for January 1970-June 1999, the relationship among interannual LOD, AAM, and the El Niño/Southern Oscillation (ENSO) is analyzed by the wavelet transform method. The results suggest that they have similar time-varying spectral structures. The signals of the 1997-98 El Niño and 1998-99 La Niña events can be detected from the LOD or AAM data.

  1. Nance-Horan syndrome: linkage analysis in 4 families refines localization in Xp22.31-p22.13 region.

    Science.gov (United States)

    Toutain, A; Ronce, N; Dessay, B; Robb, L; Francannet, C; Le Merrer, M; Briard, M L; Kaplan, J; Moraine, C

    1997-02-01

    Nance-Horan syndrome (NHS) is an X-linked disease characterized by severe congenital cataract with microcornea, distinctive dental findings, evocative facial features and mental impairment in some cases. Previous linkage studies have placed the NHS gene in a large region from DXS143 (Xp22.31) to DXS451 (Xp22.13). To refine this localization further, we have performed linkage analysis in four families. As the maximum expected Lod score is reached in each family for several markers in the Xp22.31-p22.13 region and linkage to the rest of the X chromosome can be excluded, our study shows that NHS is a genetically homogeneous condition. An overall maximum two-point Lod score of 9.36 (theta = 0.00) is obtained with two closely linked markers taken together, DXS207 and DXS1053 in Xp22.2. Recombinant haplotypes indicate that the NHS gene lies between DXS85 and DXS1226. Multipoint analysis yields a maximum Lod score of 9.45, with the support interval spanning a 15-cM region that includes DXS16 and DXS1229/365. The deletion map of the Xp22.3-Xp21.3 region suggests that the phenotypic variability of NHS is not related to gross rearrangements of sequences of varying size but rather to allelic mutations in a single gene, presumably located proximal to DXS16 and distal to DXS1226.

  2. Relationship between visual prostate score (VPSS) and maximum flow rate (Qmax) in men with urinary tract symptoms

    Directory of Open Access Journals (Sweden)

    Mazhar A. Memon

    2016-04-01

    ABSTRACT Objective: To evaluate the correlation between the visual prostate score (VPSS) and maximum flow rate (Qmax) in men with lower urinary tract symptoms. Materials and Methods: This is a cross-sectional study conducted at a university hospital. Sixty-seven adult male patients >50 years of age were enrolled in the study after signing an informed consent. Qmax and voided volume were recorded from the uroflowmetry graph and, at the same time, VPSS was assessed. The education level was assessed in various defined groups. The Pearson correlation coefficient was computed for VPSS and Qmax. Results: Mean age was 66.1±10.1 years (median 68). The mean voided volume on uroflowmetry was 268±160 mL (median 208) and the mean Qmax was 9.6±4.96 mL/s (median 9.0). The mean VPSS score was 11.4±2.72 (median 11.0). In the univariate linear regression analysis there was a strong negative (Pearson's) correlation between VPSS and Qmax (r=-0.848, p<0.001). In the multiple linear regression analysis there was a significant correlation between VPSS and Qmax after adjusting for the effect of age, voided volume (V.V) and level of education. Multiple linear regression analysis for the independent variables showed no significant correlation between VPSS and the independent factors, including age (p=0.27), level of education (p=0.941) and V.V (p=0.082). Conclusion: There is a significant negative correlation between VPSS and Qmax. The VPSS can be used in lieu of the IPSS score. Men even with a limited educational background can complete the VPSS without assistance.

  3. Quantile-based permutation thresholds for quantitative trait loci hotspots.

    Science.gov (United States)

    Neto, Elias Chaibub; Keller, Mark P; Broman, Andrew F; Attie, Alan D; Jansen, Ritsert C; Broman, Karl W; Yandell, Brian S

    2012-08-01

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key importance. One approach, randomly allocating observed QTL across the genomic locations separately by trait, implicitly assumes all traits are uncorrelated. Recently, an empirical test for QTL hotspots was proposed on the basis of the number of traits that exceed a predetermined LOD value, such as the standard permutation LOD threshold. The permutation null distribution of the maximum number of traits across all genomic locations preserves the correlation structure among the phenotypes, avoiding the detection of spurious hotspots due to nongenetic correlation induced by uncontrolled environmental factors and unmeasured variables. However, by considering only the number of traits above a threshold, without accounting for the magnitude of the LOD scores, relevant information is lost. In particular, biologically interesting hotspots composed of a moderate to small number of traits with strong LOD scores may be neglected as nonsignificant. In this article we propose a quantile-based permutation approach that simultaneously accounts for the number and the LOD scores of traits within the hotspots. By considering a sliding scale of mapping thresholds, our method can assess the statistical significance of both small and large hotspots. Although the proposed approach can be applied to any type of heritable high-volume "omic" data set, we restrict our attention to expression (e)QTL analysis. We assess and compare the performances of these three methods in simulations and we illustrate how our approach can effectively assess the significance of moderate and small hotspots with strong LOD scores in a yeast expression data set.

  4. Multivariate estimation of the limit of detection by orthogonal partial least squares in temperature-modulated MOX sensors.

    Science.gov (United States)

    Burgués, Javier; Marco, Santiago

    2018-08-17

    Metal oxide semiconductor (MOX) sensors are usually temperature-modulated and calibrated with multivariate models such as partial least squares (PLS) to increase the inherently low selectivity of this technology. The multivariate sensor response patterns exhibit heteroscedastic and correlated noise, which suggests that maximum likelihood methods should outperform PLS. One contribution of this paper is the comparison between PLS and maximum likelihood principal components regression (MLPCR) in MOX sensors. PLS is often criticized for a lack of interpretability when the model complexity increases beyond the chemical rank of the problem. This happens in MOX sensors due to cross-sensitivities to interferences, such as temperature or humidity, and to non-linearity. Additionally, the estimation of fundamental figures of merit, such as the limit of detection (LOD), is still not standardized in multivariate models. Orthogonalization methods, such as orthogonal projection to latent structures (O-PLS), have been successfully applied in other fields to reduce the complexity of PLS models. In this work, we propose a LOD estimation method based on applying the well-accepted univariate LOD formulas to the scores of the first component of an orthogonal PLS model. The resulting LOD is compared to the multivariate LOD range derived from error propagation. The methodology is applied to data extracted from temperature-modulated MOX sensors (FIS SB-500-12 and Figaro TGS 3870-A04), aiming at the detection of low concentrations of carbon monoxide in the presence of uncontrolled humidity (chemical noise). We found that the PLS models were simpler and more accurate than the MLPCR models. Average LOD values of 0.79 ppm (FIS) and 1.06 ppm (Figaro) were found using the approach described in this paper. These values were contained within the LOD ranges obtained with the error-propagation approach. The mean LOD increased to 1.13 ppm (FIS) and 1.59 ppm (Figaro) when considering validation samples
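
    As a hedged sketch of the score-space LOD idea: the paper applies a univariate formula to the scores of the first component of an orthogonal PLS model, but since O-PLS is not available in scikit-learn this sketch substitutes a plain PLSRegression, and all data, spectral profiles and noise levels are invented for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)

# Synthetic "sensor" data: 60 samples x 40 features, response driven by CO concentration
# plus a humidity-like interferent and noise (all made up).
n, p = 60, 40
co = rng.uniform(0, 5, n)                      # ppm
humidity = rng.uniform(0, 1, n)
profile_co  = np.exp(-0.5 * ((np.arange(p) - 15) / 4) ** 2)
profile_hum = np.exp(-0.5 * ((np.arange(p) - 28) / 6) ** 2)
X = np.outer(co, profile_co) + np.outer(humidity, profile_hum) + 0.05 * rng.normal(size=(n, p))

pls = PLSRegression(n_components=2).fit(X, co)
t1 = pls.transform(X)[:, 0]                    # scores on the first latent variable

# Univariate calibration in the score space: slope and residual SD give a 3.3*s/slope LOD.
slope, intercept = np.polyfit(co, t1, 1)
s_res = np.std(t1 - (slope * co + intercept), ddof=2)
lod = 3.3 * s_res / abs(slope)
print(f"score-space LOD ~ {lod:.2f} ppm")
```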

  5. Implementace algoritmu LoD terénu

    OpenAIRE

    Radil, Přemek

    2012-01-01

    This thesis deals with the implementation of the terrain LoD visualization algorithm Seamless Patches for GPU-Based Terrain Rendering as an extension of the Coin3D library. It presents the techniques by which this algorithm displays large terrain datasets. The whole terrain is composed of patches that are stored in a hierarchical structure. At runtime, the patch hierarchy is traversed and active patches are generated from it based on the position of the observer. Each patch consists of predefined tiles and ...

  6. Establishing an Appropriate Level of Detail (LoD) for a Building Information Model (BIM) - West Block, Parliament Hill, Ottawa, Canada

    Science.gov (United States)

    Fai, S.; Rafeiro, J.

    2014-05-01

    In 2011, Public Works and Government Services Canada (PWGSC) embarked on a comprehensive rehabilitation of the historically significant West Block of Canada's Parliament Hill. With over 17 thousand square meters of floor space, the West Block is one of the largest projects of its kind in the world. As part of the rehabilitation, PWGSC is working with the Carleton Immersive Media Studio (CIMS) to develop a building information model (BIM) that can serve as a maintenance and life-cycle management tool once construction is completed. The scale and complexity of the model have presented many challenges. One of these challenges is determining appropriate levels of detail (LoD). While still a matter of debate in the development of international BIM standards, LoD is further complicated in the context of heritage buildings because we must reconcile the LoD of the BIM with that used in the documentation process (terrestrial laser scan and photogrammetric survey data). In this paper, we discuss our work to date on establishing appropriate LoD within the West Block BIM that will best serve the end use. To facilitate this, we have developed a single parametric model for Gothic pointed arches that can be used for over seventy-five unique window types present in the West Block. Using the AEC (CAN) BIM as a reference, we have developed a workflow to test each of these window types at three distinct levels of detail. We have found that the parametric Gothic arch significantly reduces the amount of time necessary to develop scenarios to test appropriate LoD.

  7. The Little Ice Age was 1.0-1.5 °C cooler than current warm period according to LOD and NAO

    Science.gov (United States)

    Mazzarella, Adriano; Scafetta, Nicola

    2018-02-01

    We study the yearly values of the length of day (LOD, 1623-2016) and its link to the zonal index (ZI, 1873-2003), the Northern Atlantic oscillation index (NAO, 1659-2000) and the global sea surface temperature (SST, 1850-2016). LOD is herein assumed to be mostly the result of the overall circulations occurring within the ocean-atmospheric system. We find that LOD is negatively correlated with the global SST and with both the integral function of ZI and NAO, which are labeled as IZI and INAO. A first result is that LOD must be driven by a climatic change induced by an external (e.g. solar/astronomical) forcing since internal variability alone would have likely induced a positive correlation among the same variables because of the conservation of the Earth's angular momentum. A second result is that the high correlation among the variables implies that the LOD and INAO records can be adopted as global proxies to reconstruct past climate change. Tentative global SST reconstructions since the seventeenth century suggest that around 1700, that is during the coolest period of the Little Ice Age (LIA), SST could have been about 1.0-1.5 °C cooler than the 1950-1980 period. This estimated LIA cooling is greater than what some multiproxy global climate reconstructions suggested, but it is in good agreement with other more recent climate reconstructions including those based on borehole temperature data.

  8. Autozygosity mapping of a large consanguineous Pakistani family reveals a novel non-syndromic autosomal recessive mental retardation locus on 11p15-tel

    DEFF Research Database (Denmark)

    Rehman, Shoaib ur; Baig, Shahid Mahmood; Eiberg, Hans

    2011-01-01

    Autosomal recessive inherited mental retardation is an extremely heterogeneous disease and accounts for approximately 25% of all non-syndromic mental retardation cases. Autozygosity mapping of a large consanguineous Pakistani family revealed a novel locus for non-syndromic autosomal recessive mental retardation on 11p15-tel. Genotyping was done in all sampled individuals in the family. The nuclear central loop in the five-generation family showed homozygosity for a 6-Mb telomeric region on 11p15, whereas all other linkage regions were excluded by calculation of logarithm of odds (LOD) scores for the SNP microarray data. A maximum LOD score of Z ...

  9. The Partition of Multi-Resolution LOD Based on Qtm

    Science.gov (United States)

    Hou, M.-L.; Xing, H.-Q.; Zhao, X.-S.; Chen, J.

    2011-08-01

    The partition hierarchy of the Quaternary Triangular Mesh (QTM) determines the accuracy of spatial analysis and applications based on QTM. In order to resolve the problem that the partition hierarchy of QTM is limited by the level of the computer hardware, a new method, Multi-Resolution LOD (Level of Details) based on QTM, is discussed in this paper. This method makes the resolution of the cells vary with the viewpoint position by partitioning the cells of QTM, selecting the particular area according to the viewpoint, and dealing with the cracks caused by different subdivisions; it thus satisfies the requirement of unlimited partitioning of local areas.

  10. THE PARTITION OF MULTI-RESOLUTION LOD BASED ON QTM

    Directory of Open Access Journals (Sweden)

    M.-L. Hou

    2012-08-01

    The partition hierarchy of the Quaternary Triangular Mesh (QTM) determines the accuracy of spatial analysis and applications based on QTM. In order to resolve the problem that the partition hierarchy of QTM is limited by the level of the computer hardware, a new method, Multi-Resolution LOD (Level of Details) based on QTM, is discussed in this paper. This method makes the resolution of the cells vary with the viewpoint position by partitioning the cells of QTM, selecting the particular area according to the viewpoint, and dealing with the cracks caused by different subdivisions; it thus satisfies the requirement of unlimited partitioning of local areas.

  11. Towards an Editable, Versionized LOD Service for Library Data

    Directory of Open Access Journals (Sweden)

    Felix Ostrowski

    2013-02-01

    The Northrhine-Westphalian Library Service Center (hbz) launched its LOD service lobid.org in August 2010 and has since then continuously been improving the underlying conversion processes, data models and software. The present paper first explains the background and motivation for developing lobid.org. It then describes the underlying software framework, Phresnel, which is written in PHP and which provides presentation and editing capabilities for RDF data based on the Fresnel Display Vocabulary for RDF. The paper gives an overview of the current state of Phresnel development and discusses the technical challenges encountered. Finally, possible prospects for further developing Phresnel are outlined.

  12. Animation Strategies for Smooth Transformations Between Discrete Lods of 3d Building Models

    Science.gov (United States)

    Kada, Martin; Wichmann, Andreas; Filippovska, Yevgeniya; Hermes, Tobias

    2016-06-01

    The cartographic 3D visualization of urban areas has experienced tremendous progress over the last years. An increasing number of applications operate interactively in real-time and thus require advanced techniques to improve the quality and time response of dynamic scenes. The main focus of this article concentrates on the discussion of strategies for smooth transformation between two discrete levels of detail (LOD) of 3D building models that are represented as restricted triangle meshes. Because the operation order determines the geometrical and topological properties of the transformation process as well as its visual perception by a human viewer, three different strategies are proposed and subsequently analyzed. The simplest one orders transformation operations by the length of the edges to be collapsed, while the other two strategies introduce a general transformation direction in the form of a moving plane. This plane either pushes the nodes that need to be removed, e.g. during the transformation of a detailed LOD model to a coarser one, towards the main building body, or triggers the edge collapse operations used as transformation paths for the cartographic generalization.

  13. LOD First Estimates In 7406 SLR San Juan Argentina Station

    Science.gov (United States)

    Pacheco, A.; Podestá, R.; Yin, Z.; Adarvez, S.; Liu, W.; Zhao, L.; Alvis Rojas, H.; Actis, E.; Quinteros, J.; Alacoria, J.

    2015-10-01

    In this paper we show results derived from satellite observations at the San Juan SLR station of the Felix Aguilar Astronomical Observatory (OAFA). The Satellite Laser Ranging (SLR) telescope was installed in early 2006, in accordance with an international cooperation agreement between the San Juan National University (UNSJ) and the Chinese Academy of Sciences (CAS). The SLR has been in successful operation since 2011, using NAOC SLR software for the data processing. This program was designed to calculate satellite orbits and station coordinates; however, it was used in this work for the determination of LOD (Length Of Day) time series and Earth rotation speed.

  14. Prediction of UT1-UTC, LOD and AAM χ3 by combination of least-squares and multivariate stochastic methods

    Science.gov (United States)

    Niedzielski, Tomasz; Kosek, Wiesław

    2008-02-01

    This article presents the application of a multivariate prediction technique for predicting universal time (UT1-UTC), length of day (LOD) and the axial component of atmospheric angular momentum (AAM χ3). The multivariate predictions of LOD and UT1-UTC are generated by means of the combination of (1) least-squares (LS) extrapolation of models for annual, semiannual, 18.6-year, 9.3-year oscillations and for the linear trend, and (2) multivariate autoregressive (MAR) stochastic prediction of LS residuals (LS + MAR). The MAR technique enables the use of the AAM χ3 time-series as the explanatory variable for the computation of LOD or UT1-UTC predictions. In order to evaluate the performance of this approach, two other prediction schemes are also applied: (1) LS extrapolation, (2) combination of LS extrapolation and univariate autoregressive (AR) prediction of LS residuals (LS + AR). The multivariate predictions of AAM χ3 data, however, are computed as a combination of the extrapolation of the LS model for annual and semiannual oscillations and the LS + MAR. The AAM χ3 predictions are also compared with LS extrapolation and LS + AR prediction. It is shown that the predictions of LOD and UT1-UTC based on LS + MAR taking into account the axial component of AAM are more accurate than the predictions of LOD and UT1-UTC based on LS extrapolation or on LS + AR. In particular, the UT1-UTC predictions based on LS + MAR during El Niño/La Niña events exhibit considerably smaller prediction errors than those calculated by means of LS or LS + AR. The AAM χ3 time-series is predicted using LS + MAR with higher accuracy than applying LS extrapolation itself in the case of medium-term predictions (up to 100 days in the future). However, the predictions of AAM χ3 reveal the best accuracy for LS + AR.
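
    A hedged sketch of the LS + AR part of the scheme on a synthetic LOD-like series follows; the LS + MAR variant with the AAM χ3 explanatory series is omitted, and all periods, coefficients and noise levels are illustrative. A harmonic-plus-trend model is fitted by least squares, an AR model is fitted to the residuals, and the forecast is the sum of the LS extrapolation and the AR continuation.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(3000.0)                                    # time in days
# Synthetic LOD-like series: trend + annual + semiannual terms + AR(1) noise (illustrative only).
x = 1e-4 * t + 0.4 * np.sin(2 * np.pi * t / 365.25) + 0.15 * np.sin(2 * np.pi * t / 182.63)
noise = np.zeros_like(t)
for i in range(1, len(t)):
    noise[i] = 0.8 * noise[i - 1] + 0.05 * rng.normal()
x += noise

def design(tt):
    cols = [np.ones_like(tt), tt]                        # mean and linear trend
    for period in (365.25, 182.63):                      # annual and semiannual terms
        cols += [np.sin(2 * np.pi * tt / period), np.cos(2 * np.pi * tt / period)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design(t), x, rcond=None)     # LS model
resid = x - design(t) @ beta

# Fit an AR(p) model to the LS residuals by conditional least squares.
p = 5
Y = resid[p:]
Z = np.column_stack([resid[p - k:-k] for k in range(1, p + 1)])
phi, *_ = np.linalg.lstsq(Z, Y, rcond=None)              # phi[0] is the lag-1 coefficient

# Forecast: LS extrapolation plus AR continuation of the residuals.
horizon = 100
t_future = np.arange(t[-1] + 1, t[-1] + 1 + horizon)
hist = list(resid[-p:])
ar_forecast = []
for _ in range(horizon):
    nxt = float(np.dot(phi, hist[::-1][:p]))             # most recent residual matches phi[0]
    ar_forecast.append(nxt)
    hist.append(nxt)
prediction = design(t_future) @ beta + np.array(ar_forecast)
print(prediction[:5])
```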

  15. Nance-Horan syndrome: localization within the region Xp21.1-Xp22.3 by linkage analysis.

    Science.gov (United States)

    Stambolian, D; Lewis, R A; Buetow, K; Bond, A; Nussbaum, R

    1990-07-01

    Nance-Horan Syndrome (NHS) or X-linked cataract-dental syndrome (MIM 302350) is a disease of unknown pathogenesis characterized by congenital cataracts and dental anomalies. We performed linkage analysis in three kindreds with NHS by using six RFLP markers between Xp11.3 and Xp22.3. Close linkage was found between NHS and polymorphic loci DXS43 (theta = 0 with lod score 2.89), DXS41 (theta = 0 with lod score 3.44), and DXS67 (theta = 0 with lod score 2.74), defined by probes pD2, p99-6, and pB24, respectively. Recombinations were found with the marker loci DXS84 (theta = .04 with lod score 4.13), DXS143 (theta = .06 with lod score 3.11) and DXS7 (theta = .09 with lod score 1.68). Multipoint linkage analysis determined the NHS locus to be linked completely to DXS41 (lod score = 7.07). Our linkage results, combined with analysis of Xp interstitial deletions, suggest that the NHS locus is located within or close to the Xp22.1-Xp22.2 region.
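
    For readers less familiar with the quantity being reported, a two-point lod score compares the likelihood of the observed meioses at a trial recombination fraction θ against the likelihood under free recombination (θ = 0.5). The toy calculation below assumes phase-known, fully informative meioses, which is an idealisation and not the actual NHS pedigree analysis.

        import math

        def two_point_lod(n_recomb, n_nonrecomb, theta):
            """lod(theta) = log10[L(theta) / L(0.5)] for phase-known informative meioses."""
            if theta == 0 and n_recomb > 0:
                return float("-inf")          # a single recombinant excludes theta = 0
            n = n_recomb + n_nonrecomb
            log_l_theta = (n_recomb * math.log10(theta) if n_recomb else 0.0) \
                          + n_nonrecomb * math.log10(1 - theta)
            return log_l_theta - n * math.log10(0.5)

        # Ten fully informative non-recombinant meioses give lod(0) = 10*log10(2), about 3.01
        print(two_point_lod(0, 10, 0.0))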

  16. Visualizing dynamic geosciences phenomena using an octree-based view-dependent LOD strategy within virtual globes

    Science.gov (United States)

    Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo

    2011-09-01

    Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional data (4D), 3D in space plus time, is huge for rendering. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphic cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
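
    The view-dependent refinement idea can be sketched as a recursive octree traversal in which a node is rendered at its own resolution when its size-to-distance ratio falls below a tolerance, and refined otherwise. This is a generic illustration; the node layout, error metric, and World Wind integration used by the authors are not reproduced.

        from dataclasses import dataclass, field

        @dataclass
        class OctreeNode:
            center: tuple           # (x, y, z) of the cell centre
            size: float             # edge length of the cubic cell
            payload: object = None  # volume brick stored at this resolution
            children: list = field(default_factory=list)  # up to 8 finer nodes

        def select_lod(node, eye, tau=0.05, out=None):
            """Collect the coarsest nodes whose size/distance ratio is below tau."""
            if out is None:
                out = []
            dx = [c - e for c, e in zip(node.center, eye)]
            dist = max(1e-9, sum(d * d for d in dx) ** 0.5)
            if node.size / dist <= tau or not node.children:
                out.append(node)             # render this resolution level
            else:
                for child in node.children:  # refine: descend to the next LOD
                    select_lod(child, eye, tau, out)
            return out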

  17. ANIMATION STRATEGIES FOR SMOOTH TRANSFORMATIONS BETWEEN DISCRETE LODS OF 3D BUILDING MODELS

    Directory of Open Access Journals (Sweden)

    M. Kada

    2016-06-01

    Full Text Available The cartographic 3D visualization of urban areas has made tremendous progress over recent years. An increasing number of applications operate interactively in real time and thus require advanced techniques to improve the quality and response time of dynamic scenes. This article focuses on strategies for smooth transformation between two discrete levels of detail (LOD) of 3D building models that are represented as restricted triangle meshes. Because the operation order determines the geometrical and topological properties of the transformation process as well as its visual perception by a human viewer, three different strategies are proposed and subsequently analyzed. The simplest one orders transformation operations by the length of the edges to be collapsed, while the other two strategies introduce a general transformation direction in the form of a moving plane. This plane either pushes the nodes that need to be removed, e.g. during the transformation of a detailed LOD model to a coarser one, towards the main building body, or triggers the edge collapse operations used as transformation paths for the cartographic generalization.
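
    The simplest strategy, ordering collapse operations by edge length, amounts to sorting the mesh edges by their Euclidean length before applying the collapses. The sketch below is a generic illustration of that ordering step only; the restricted-triangle-mesh constraints and the moving-plane variants described in the paper are not modelled.

        def edge_length_collapse_order(vertices, edges):
            """Order edges for collapse, shortest first (illustrative only).

            vertices: dict vertex_id -> (x, y, z); edges: iterable of (v0, v1) pairs.
            Topological validity checks of a real LOD pipeline (normal flips,
            boundary handling, restricted-mesh constraints) are omitted.
            """
            def length(edge):
                (x0, y0, z0), (x1, y1, z1) = vertices[edge[0]], vertices[edge[1]]
                return ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5

            return sorted(edges, key=length)

        order = edge_length_collapse_order(
            {0: (0.0, 0.0, 0.0), 1: (1.0, 0.0, 0.0), 2: (0.0, 3.0, 0.0)},
            [(0, 1), (0, 2), (1, 2)],
        )
        # order == [(0, 1), (0, 2), (1, 2)]: the shortest edge is collapsed first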

  18. Nonsyndromic cleft lip with or without cleft palate: New BCL3 information

    Energy Technology Data Exchange (ETDEWEB)

    Amos, C.; Hecht, J.T. [Univ. of Texas Medical School, Houston, TX (United States); Gasser, D. [Univ. of Pennsylvania School of Medicine, Philadelphia, PA (United States)

    1996-09-01

    We did not previously provide LOD scores for linkage assuming heterogeneity, as suggested by Ott for the linkage analysis of cleft lip with or without cleft palate (CL/P) and BCL3, ApoC2, and D19S178 in the paper by Stein et al. The results from analysis using the HOMOG program, allowing for heterogeneity under the reduced penetrance model, gave a maximum LOD score of 1.85 for ApoC2, 0.41 for BCL3, 0.03 for D19S178, and 1.72 for multipoint analysis in the interval. For the affecteds-only model, the values are 1.96 for ApoC2, 0.41 for BCL3, 0.01 for D19S178, and 1.44 for the multipoint analysis. 8 refs.

  19. Autosomal dominant distal myopathy: Linkage to chromosome 14

    Energy Technology Data Exchange (ETDEWEB)

    Laing, N.G.; Laing, B.A.; Wilton, S.D.; Dorosz, S.; Mastaglia, F.L.; Kakulas, B.A. [Australian Neuromuscular Research Institute, Perth (Australia); Robbins, P.; Meredith, C.; Honeyman, K.; Kozman, H.

    1995-02-01

    We have studied a family segregating a form of autosomal dominant distal myopathy (MIM 160500) and containing nine living affected individuals. The myopathy in this family is closest in clinical phenotype to that first described by Gowers in 1902. A search for linkage was conducted using microsatellite, VNTR, and RFLP markers. In total, 92 markers on all 22 autosomes were run. Positive linkage was obtained with 14 of 15 markers tested on chromosome 14, with little indication of linkage elsewhere in the genome. Maximum two-point LOD scores of 2.60 at recombination fraction .00 were obtained for the markers MYH7 and D14S64 - the family structure precludes a two-point LOD score ≥ 3. Recombinations with D14S72 and D14S49 indicate that this distal myopathy locus, MPD1, should lie between these markers. A multipoint analysis assuming 100% penetrance and using the markers D14S72, D14S50, MYH7, D14S64, D14S54, and D14S49 gave a LOD score of exactly 3 at MYH7. Analysis at a penetrance of 80% gave a LOD score of 2.8 at this marker. This probable localization of a gene for distal myopathy, MPD1, on chromosome 14 should allow other investigators studying distal myopathy families to test this region for linkage in other types of the disease, to confirm linkage or to demonstrate the likely genetic heterogeneity. 24 refs., 3 figs., 1 tab.

  20. Finanční analýza sportovního střediska Loděnice Trója UK FTVS

    OpenAIRE

    Ocman, Josef

    2010-01-01

    Title: The financial analysis of the sport centre Loděnice Troja FTVS UK. Annotation: The main goal of the project is to assess the prosperity and utilization of the sport centre Loděnice Troja FTVS UK on the basis of an evaluation of the economy of the department and its subdepartments. The goal is achieved by analysing accounting data with the help of financial analysis methods. The project was prepared on the basis of a request from the management of FTVS UK. Keywords: Financial analysis, municipal firm, ratio, calculation

  1. A comparison of LOD and UT1-UTC forecasts by different combined prediction techniques

    Science.gov (United States)

    Kosek, W.; Kalarus, M.; Johnson, T. J.; Wooden, W. H.; McCarthy, D. D.; Popiński, W.

    Stochastic prediction techniques including autocovariance, autoregressive, autoregressive moving average, and neural networks were applied to the UT1-UTC and Length of Day (LOD) International Earth Rotation and Reference Systems Service (IERS) EOPC04 time series to evaluate the capabilities of each method. All known effects such as leap seconds and solid Earth zonal tides were first removed from the observed values of UT1-UTC and LOD. Two combination procedures were applied to predict the resulting LODR time series: 1) the combination of the least-squares (LS) extrapolation with a stochastic prediction method, and 2) the combination of the discrete wavelet transform (DWT) filtering and a stochastic prediction method. The results of the combination of the LS extrapolation with different stochastic prediction techniques were compared with the results of the UT1-UTC prediction method currently used by the IERS Rapid Service/Prediction Centre (RS/PC). It was found that the prediction accuracy depends on the starting prediction epochs, and for the combined forecast methods, the mean prediction errors for 1 to about 70 days in the future are of the same order as those of the method used by the IERS RS/PC.

  2. Genetic localisation of MRX27 to Xq24-26 defines another discrete gene for non-specific X-linked mental retardation

    Energy Technology Data Exchange (ETDEWEB)

    Gedeon, A.K.; Connor, J.M.; Mulley, J.C. [Univ. of Adelaide (Australia); Connor, J.M. [Duncan Guthrie Inst. of Medical Genetics, Yorkhill (United Kingdom); Glass, I.A. [Univ. of California, San Franciso, CA (United States)

    1996-07-12

    A large family with non-specific X-linked mental retardation (MRX) was first described in 1991, with a suggestion of linkage to Xq26-27. The maximum lod score was 1.60 (θ = 0.10) with the F9 locus. The localization of this MRX gene has now been established by linkage to microsatellite markers. Peak pairwise lod scores of 4.02 and 4.01 (θ = 0.00) were attained at the DXS1114 and DXS994 loci respectively. This MRX gene is now designated MRX27 and is localized to Xq24-26 by recombination events detected by DXS424 and DXS102. This regional localization spans 26.2 cM on the genetic background map and defines another distinct MRX interval by linkage to a specific region of the X chromosome. 25 refs., 1 fig., 1 tab.

  3. A novel D458V mutation in the SANS PDZ binding motif causes atypical Usher syndrome.

    NARCIS (Netherlands)

    Kalay, E.; Brouwer, A.P.M. de; Caylan, R.; Nabuurs, S.B.; Wollnik, B.; Karaguzel, A.; Heister, J.G.A.M.; Erdol, H.; Cremers, F.P.M.; Cremers, C.W.R.J.; Brunner, H.G.; Kremer, J.M.J.

    2005-01-01

    Homozygosity mapping and linkage analysis in a Turkish family with autosomal recessive prelingual sensorineural hearing loss revealed a 15-cM critical region at 17q25.1-25.3 flanked by the polymorphic markers D17S1807 and D17S1806. The maximum two-point lod score was 4.07 at theta=0.0 for the marker

  4. Modulation of the SSTA decadal variation on ENSO events and relationships of SSTA With LOD,SOI, etc

    Science.gov (United States)

    Liao, D. C.; Zhou, Y. H.; Liao, X. H.

    2007-01-01

    Interannual and decadal components of the length of day (LOD), Southern Oscillation Index (SOI) and Sea Surface Temperature anomaly (SSTA) in the Nino regions are extracted by band-pass filtering and used to study the modulation of the SSTA on ENSO events. Results show that, besides the interannual components, the decadal components of the SSTA have strong impacts on monitoring and representing ENSO events. When the ENSO events are strong, the modulation of the decadal components of the SSTA tends to prolong the lifetime of the events and enlarge the extreme anomalies of the SST, while ENSO events that are so weak that they cannot be detected by the interannual components of the SSTA can still be detected with the help of the modulation of the SSTA decadal components. The study further draws attention to the relationships of the SSTA interannual and decadal components with those of LOD, SOI, the sea level pressure anomalies (SLPA) and the trade wind anomalies (TWA) in the tropical Pacific, and also with those of the axial components of the atmospheric angular momentum (AAM) and oceanic angular momentum (OAM). Results of the squared coherence and coherent phases among them reveal close connections between the SSTA and almost all of the parameters mentioned above on interannual time scales, while on the decadal time scale significant connections exist among the SSTA and SOI, SLPA, TWA, χ3w and χ3w+v as well, and slightly weaker connections between the SSTA and LOD, χ3pib and χ3bp.
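
    The squared coherence used to quantify such connections can be estimated with Welch-type cross-spectra, for example as sketched below; the sampling rate, segment length, and band limits are placeholder values, not the study's actual processing.

        import numpy as np
        from scipy.signal import coherence

        def interannual_coherence(x, y, fs=12.0, nperseg=240):
            """Squared coherence of two monthly series (fs given in samples per year)."""
            f, cxy = coherence(np.asarray(x, float), np.asarray(y, float),
                               fs=fs, nperseg=nperseg)
            band = (f > 1.0 / 8.0) & (f < 1.0 / 2.0)   # interannual band: 2-8 year periods
            return f[band], cxy[band]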

  5. Heritability and whole genome linkage of pulse pressure in Chinese twin pairs

    DEFF Research Database (Denmark)

    Jiang, Wengjie; Zhang, Dongfeng; Pang, Zengchang

    2012-01-01

    with a heritability estimate of 0.45. Genome-wide non-parametric linkage analysis identified three significant linkage peaks on chromosome 11 (lod score 4.06 at 30.5 cM), chromosome 12 (lod score 3.97 at 100.7 cM), and chromosome 18 (lod score 4.01 at 70.7 cM) with the last two peaks closely overlapping with linkage...

  6. Linkage analysis of candidate genes in autoimmune thyroid disease. II. Selected gender-related genes and the X-chromosome. International Consortium for the Genetics of Autoimmune Thyroid Disease.

    Science.gov (United States)

    Barbesino, G; Tomer, Y; Concepcion, E S; Davies, T F; Greenberg, D A

    1998-09-01

    Hashimoto's thyroiditis (HT) and Graves' disease (GD) are autoimmune thyroid diseases (AITD) in which multiple genetic factors are suspected to play an important role. Until now, only a few minor risk factors for these diseases have been identified. Susceptibility seems to be stronger in women, pointing toward a possible role for genes related to sex steroid action or mechanisms related to genes on the X-chromosome. We have studied a total of 45 multiplex families, each containing at least 2 members affected with either GD (55 patients) or HT (72 patients), and used linkage analysis to target as candidate susceptibility loci genes involved in estrogen activity, such as the estrogen receptor alpha and beta and the aromatase genes. We then screened the entire X-chromosome using a set of polymorphic microsatellite markers spanning the whole chromosome. We found a region of the X-chromosome (Xq21.33-22) giving positive logarithm of odds (LOD) scores and then reanalyzed this area with dense markers in a multipoint analysis. Our results excluded linkage to the estrogen receptor alpha and aromatase genes when either the patients with GD only, those with HT only, or those with any AITD were considered as affected. Linkage to the estrogen receptor beta could not be totally ruled out, partly due to incomplete mapping information for the gene itself at this time. The X-chromosome data revealed consistently positive LOD scores (maximum of 1.88 for marker DXS8020 and GD patients) when either definition of affectedness was considered. Analysis of the family data using a multipoint analysis with eight closely linked markers generated LOD scores suggestive of linkage to GD in a chromosomal area (Xq21.33-22) extending for about 6 cM and encompassing four markers. The maximum LOD score (2.5) occurred at DXS8020. In conclusion, we ruled out a major role for estrogen receptor alpha and the aromatase genes in the genetic predisposition to AITD. Estrogen receptor beta remains a

  7. Highly sensitive lactate biosensor by engineering chitosan/PVI-Os/CNT/LOD network nanocomposite.

    Science.gov (United States)

    Cui, Xiaoqiang; Li, Chang Ming; Zang, Jianfeng; Yu, Shucong

    2007-06-15

    A novel chitosan/PVI-Os (polyvinylimidazole-Os)/CNT (carbon nanotube)/LOD (lactate oxidase) network nanocomposite was constructed on a gold electrode for detection of lactate. The composite was nanoengineered by selecting matched material components and an optimized composition ratio to produce a superior lactate sensor. Positively charged chitosan and PVI-Os were used as the matrix and the mediator to immobilize the negatively charged LOD and to enhance the electron transfer, respectively. CNTs were introduced as the essential component in the composite for the network nanostructure. FESEM (field emission scanning electron microscopy) and electrochemical characterization demonstrated that CNT behaved as a cross-linker to network PVI and chitosan due to its nanoscale and negatively charged nature. This significantly improved the conductivity, stability and electroactivity for detection of lactate. The standard deviation of the sensor without CNT in the composite was greatly reduced from 19.6 to 4.9% by addition of CNTs. With optimized conditions, the sensitivity and detection limit of the lactate sensor were 19.7 microA mM(-1)cm(-2) and 5 microM, respectively. The sensitivity was remarkably improved in comparison to the newly reported values of 0.15-3.85 microA mM(-1)cm(-2). This novel nanoengineering approach of selecting matched components to form a network nanostructure could be extended to other enzyme biosensors and has broad potential applications in diagnostics, life science and food analysis.

  8. MCMC multilocus lod scores: application of a new approach.

    Science.gov (United States)

    George, Andrew W; Wijsman, Ellen M; Thompson, Elizabeth A

    2005-01-01

    On extended pedigrees with extensive missing data, the calculation of multilocus likelihoods for linkage analysis is often beyond the computational bounds of exact methods. Growing interest therefore surrounds the implementation of Monte Carlo estimation methods. In this paper, we demonstrate the speed and accuracy of a new Markov chain Monte Carlo method for the estimation of linkage likelihoods through an analysis of real data from a study of early-onset Alzheimer's disease. For those data sets where comparison with exact analysis is possible, we achieved up to a 100-fold increase in speed. Our approach is implemented in the program lm_bayes within the framework of the freely available MORGAN 2.6 package for Monte Carlo genetic analysis (http://www.stat.washington.edu/thompson/Genepi/MORGAN/Morgan.shtml).

  9. Comparison of sensitivity to artificial spectral errors and multivariate LOD in NIR spectroscopy - Determining the performance of miniaturizations on melamine in milk powder.

    Science.gov (United States)

    Henn, Raphael; Kirchler, Christian G; Grossgut, Maria-Elisabeth; Huck, Christian W

    2017-05-01

    This study compared three commercially available spectrometers - whereas two of them were miniaturized - in terms of prediction ability of melamine in milk powder (infant formula). Therefore all spectra were split into calibration- and validation-set using Kennard Stone and Duplex algorithm in comparison. For each instrument the three best performing PLSR models were constructed using SNV and Savitzky Golay derivatives. The best RMSEP values were 0.28g/100g, 0.33g/100g and 0.27g/100g for the NIRFlex N-500, the microPHAZIR and the microNIR2200 respectively. Furthermore the multivariate LOD interval [LODmin, LODmax] was calculated for all the PLSR models, unveiling significant differences among the spectrometers showing values of 0.20g/100g - 0.27g/100g, 0.28g/100g - 0.54g/100g and 0.44g/100g - 1.01g/100g for the NIRFlex N-500, the microPHAZIR and the microNIR2200 respectively. To assess the robustness of all models, artificial introduction of white noise, baseline shift, multiplicative effect, spectral shrink and stretch, stray light and spectral shift were applied. Monitoring the RMSEP as function of the perturbation gave indication of robustness of the models and helped to compare the performances of the spectrometers. Not taking the additional information from the LOD calculations into account one could falsely assume that all the spectrometers perform equally well which is not the case when the multivariate evaluation and robustness data were considered. Copyright © 2017 Elsevier B.V. All rights reserved.
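
    The robustness test described, tracking RMSEP while artificial perturbations are injected into the validation spectra, can be sketched generically as follows. The PLS workflow, component count, and noise levels here are assumptions for illustration (scikit-learn is used as a stand-in), not the study's actual models or settings.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import mean_squared_error

        def rmsep_under_noise(X_cal, y_cal, X_val, y_val, noise_levels, n_comp=8, seed=0):
            """RMSEP of a fixed PLSR model as white noise of growing amplitude is added."""
            rng = np.random.default_rng(seed)
            model = PLSRegression(n_components=n_comp).fit(X_cal, y_cal)
            out = []
            for s in noise_levels:
                X_pert = X_val + rng.normal(0.0, s, size=X_val.shape)  # white-noise perturbation
                pred = model.predict(X_pert).ravel()
                out.append(mean_squared_error(y_val, pred) ** 0.5)     # RMSEP at this noise level
            return out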

  10. Weighted combination of LOD values split into frequency windows

    Science.gov (United States)

    Fernandez, L. I.; Gambis, D.; Arias, E. F.

    In this analysis a one-day combined time series of LOD(length-of-day) estimates is presented. We use individual data series derived by 7 GPS and 3 SLR analysis centers, which routinely contribute to the IERS database over a recent 27-month period (Jul 1996 - Oct 1998). The result is compared to the multi-technique combined series C04 produced by the Central Bureau of the IERS that is commonly used as a reference for the study of the phenomena of Earth rotation variations. The Frequency Windows Combined Series procedure brings out a time series, which is close to C04 but shows an amplitude difference that might explain the evident periodic behavior present in the differences of these two combined series. This method could be useful to generate a new time series to be used as a reference in the high frequency variations of the Earth rotation studies.
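
    A frequency-window combination of this kind can be sketched as band-pass filtering each individual LOD series into a few windows, averaging across techniques with per-window weights, and summing the windows back. The filter design, band limits, and weights below are placeholders, and summing a handful of pass-bands is a simplification of the full procedure.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def bandpass(x, low, high, fs=1.0, order=4):
            """Zero-phase band-pass filter (frequencies in cycles per day for daily data)."""
            b, a = butter(order, [low, high], btype="band", fs=fs)
            return filtfilt(b, a, x)

        def combine_by_windows(series, weights, bands, fs=1.0):
            """series: list of equally sampled LOD series (bias/trend already removed);
            weights: one weight per series for each band, shape (n_bands, n_series);
            bands: list of (low, high) frequency limits."""
            combined = np.zeros(len(series[0]))
            for (lo, hi), w in zip(bands, weights):
                w = np.asarray(w, dtype=float) / np.sum(w)             # normalise the weights
                filtered = [bandpass(s, lo, hi, fs) for s in series]   # split into this window
                combined += sum(wi * f for wi, f in zip(w, filtered))  # weighted mean per window
            return combined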

  11. Sequential Organ Failure Assessment Score for Evaluating Organ Failure and Outcome of Severe Maternal Morbidity in Obstetric Intensive Care

    Directory of Open Access Journals (Sweden)

    Antonio Oliveira-Neto

    2012-01-01

    Full Text Available Objective. To evaluate the performance of the Sequential Organ Failure Assessment (SOFA) score in cases of severe maternal morbidity (SMM). Design. Retrospective study of diagnostic validation. Setting. An obstetric intensive care unit (ICU) in Brazil. Population. 673 women with SMM. Main Outcome Measures. Mortality and SOFA score. Methods. Organ failure was evaluated according to the maximum score for each one of its six components. The total maximum SOFA score was calculated using the poorest result of each component, reflecting the maximum degree of alteration in systemic organ function. Results. The highest total maximum SOFA score was associated with mortality, 12.06 ± 5.47 for women who died and 1.87 ± 2.56 for survivors. There was also a significant correlation between the number of failing organs and maternal mortality, ranging from 0.2% (no failure) to 85.7% (≥3 organs). Analysis of the area under the receiver operating characteristic (ROC) curve (AUC) confirmed the excellent performance of the total maximum SOFA score for cases of SMM (AUC = 0.958). Conclusions. The total maximum SOFA score proved to be an effective tool for evaluating severity and estimating prognosis in cases of SMM. The maximum SOFA score may be used to conceptually define and stratify the degree of severity in cases of SMM.
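
    The total maximum SOFA score described here is the sum, over the six organ systems, of the worst sub-score each system reached during the stay. A minimal bookkeeping sketch (component names and the example values are illustrative):

        def total_maximum_sofa(daily_scores):
            """daily_scores: list of dicts, one per day, mapping each SOFA component
            (e.g. 'respiration', 'coagulation', 'liver', 'cardiovascular', 'cns',
            'renal') to its 0-4 sub-score for that day."""
            components = sorted({k for day in daily_scores for k in day})
            worst = {c: max(day.get(c, 0) for day in daily_scores) for c in components}
            return sum(worst.values()), worst

        score, worst_by_organ = total_maximum_sofa([
            {"respiration": 2, "renal": 1, "cardiovascular": 3},
            {"respiration": 3, "renal": 1, "cardiovascular": 1},
        ])
        # score == 7; each organ contributes its maximum sub-score over the stay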

  12. Determination of Oversulphated Chondroitin Sulphate and Dermatan Sulphate in unfractionated heparin by (1)H-NMR - Collaborative study for quantification and analytical determination of LoD.

    Science.gov (United States)

    McEwen, I; Mulloy, B; Hellwig, E; Kozerski, L; Beyer, T; Holzgrabe, U; Wanko, R; Spieser, J-M; Rodomonte, A

    2008-12-01

    Oversulphated Chondroitin Sulphate (OSCS) and Dermatan Sulphate (DS) in unfractionated heparins can be identified by nuclear magnetic resonance spectrometry (NMR). The limit of detection (LoD) of OSCS is 0.1% relative to the heparin content. This LoD is obtained at a signal-to-noise ratio (S/N) of 2000:1 of the heparin methyl signal. Quantification is best obtained by comparing peak heights of the OSCS and heparin methyl signals. Reproducibility of less than 10% relative standard deviation (RSD) has been obtained. The accuracy of quantification was good.

  13. Genetic linkage analyses and Cx50 mutation detection in a large multiplex Chinese family with hereditary nuclear cataract.

    Science.gov (United States)

    He, Wei; Li, Xin; Chen, Jiajing; Xu, Ling; Zhang, Feng; Dai, Qiushi; Cui, Hao; Wang, Duen-Mei; Yu, Jun; Hu, Songnian; Lu, Shan

    2011-03-01

    The aim of the study was to characterize the underlying mutation in a large multiplex Chinese family with hereditary nuclear cataract. A 6-generation Chinese family having hereditary nuclear cataract was recruited and clinically verified. Blood DNA samples were obtained from 53 available family members. Linkage analyses were performed on the known candidate regions for hereditary cataract with 36 polymorphic microsatellite markers. To identify mutations related to cataract, a direct sequencing approach was applied to a candidate gene residing in our linkage locus. A linkage locus was identified with a maximum 2-point LOD score of 4.31 (recombination fraction = 0) at marker D1S498 and a maximum multipoint LOD score of 5.7 between markers D1S2344 and D1S498 on chromosome 1q21.1, where the candidate gene Cx50 is located. Direct sequencing of Cx50 showed a 139 G to A transition occurred in all affected family members. This transitional mutation resulted in a replacement of aspartic acid by asparagine at residue 47 (D47N) and led to a loss-of-function of the protein. The D47N mutation of Cx50 causes the hereditary nuclear cataract in this family in an autosomal dominant mode of inheritance with incomplete penetrance.

  14. Singularity Processing Method of Microstrip Line Edge Based on LOD-FDTD

    Directory of Open Access Journals (Sweden)

    Lei Li

    2014-01-01

    Full Text Available In order to improve the accuracy and efficiency of analyzing microstrip structures, a singularity processing method is proposed theoretically and experimentally based on the fundamental locally one-dimensional finite-difference time-domain (LOD-FDTD) method with second-order temporal accuracy (denoted as FLOD2-FDTD). The proposed method can greatly improve the performance of the FLOD2-FDTD even when the conductor is embedded into more than half of the cell by the coordinate transformation. The experimental results showed that the proposed method can achieve higher accuracy when the time step size is less than or equal to 5 times that allowed by the Courant-Friedrichs-Lewy (CFL) condition. In comparison with previously reported methods, the proposed method for calculating the electromagnetic field near the microstrip line edge not only improves efficiency but also provides higher accuracy.

  15. Using Parameters of Dynamic Pulse Function for 3d Modeling in LOD3 Based on Random Textures

    Science.gov (United States)

    Alizadehashrafi, B.

    2015-12-01

    The pulse function (PF) is a technique based on a procedural preprocessing system to generate a computerized virtual photo of the façade within a fixed-size square (Alizadehashrafi et al., 2009; Musliman et al., 2010). The Dynamic Pulse Function (DPF) is an enhanced version of PF which can create the final photo proportional to the real geometry. This avoids distortion while projecting the computerized photo onto the generated 3D model (Alizadehashrafi and Rahman, 2013). The challenge addressed in this paper is to obtain 3D models in LoD3 rather than LoD2. In the DPF-based technique, the geometries of the windows and doors are saved in an XML file schema which does not have any connection with the 3D model in LoD2 and CityGML format. In this research the parameters of Dynamic Pulse Functions are utilized via the Ruby programming language in Trimble SketchUp to generate the windows and doors automatically (with exact position and depth) in LoD3, based on the same concept as DPF. The advantage of this technique is the automatic generation of a huge number of similar geometries, e.g. windows, by utilizing the parameters of DPF along with defining entities and window layers. When the SKP file is converted to CityGML via FME software or CityGML plugins, the 3D model contains the semantic database about the entities and window layers, which can connect the CityGML to MySQL (Alizadehashrafi and Baig, 2014). The concept behind DPF is to use logical operations to project the texture onto the background image in a way that is dynamically proportional to the real geometry. The process of projection is based on two dynamic pulses, vertical and horizontal, starting from the upper-left corner of the background wall and running in the down and right directions respectively, in the image coordinate system. A logical one/zero at the intersections of the two dynamic pulses projects/does not project the texture onto the background image. It is possible to define
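
    One plausible reading of the projection rule, a horizontal and a vertical pulse whose logical intersection decides where the window texture is written onto the wall image, is sketched below. It is a simplified illustration only; the actual DPF parameters, XML schema, and SketchUp/Ruby integration are not reproduced.

        import numpy as np

        def pulse(length, starts, widths):
            """1-D boolean pulse train: True inside each [start, start+width) interval."""
            axis = np.arange(length)
            mask = np.zeros(length, dtype=bool)
            for s, w in zip(starts, widths):
                mask |= (axis >= s) & (axis < s + w)
            return mask

        def project_textures(background, texture, col_starts, col_widths, row_starts, row_widths):
            """Write `texture` onto `background` wherever both pulses are logical one."""
            h, w = background.shape[:2]
            horizontal = pulse(w, col_starts, col_widths)   # runs rightwards from the upper-left corner
            vertical = pulse(h, row_starts, row_widths)     # runs downwards from the upper-left corner
            mask = np.outer(vertical, horizontal)           # True only at the pulse intersections
            out = background.copy()
            out[mask] = texture[mask]
            return out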

  16. Search for intracranial aneurysm susceptibility gene(s) using Finnish families

    Directory of Open Access Journals (Sweden)

    Ryynänen Markku

    2002-08-01

    Full Text Available Abstract Background Cerebrovascular disease is the third leading cause of death in the United States, and about one-fourth of cerebrovascular deaths are attributed to ruptured intracranial aneurysms (IA. Epidemiological evidence suggests that IAs cluster in families, and are therefore probably genetic. Identification of individuals at risk for developing IAs by genetic tests will allow concentration of diagnostic imaging on high-risk individuals. We used model-free linkage analysis based on allele sharing with a two-stage design for a genome-wide scan to identify chromosomal regions that may harbor IA loci. Methods We previously estimated sibling relative risk in the Finnish population at between 9 and 16, and proceeded with a genome-wide scan for loci predisposing to IA. In 85 Finnish families with two or more affected members, 48 affected sibling pairs (ASPs were available for our genetic study. Power calculations indicated that 48 ASPs were adequate to identify chromosomal regions likely to harbor predisposing genes and that a liberal stage I lod score threshold of 0.8 provided a reasonable balance between detection of false positive regions and failure to detect real loci with moderate effect. Results Seven chromosomal regions exceeded the stage I lod score threshold of 0.8 and five exceeded 1.0. The most significant region, on chromosome 19q, had a maximum multipoint lod score (MLS of 2.6. Conclusions Our study provides evidence for the locations of genes predisposing to IA. Further studies are necessary to elucidate the genes and their role in the pathophysiology of IA, and to design genetic tests.

  17. Geophysical excitation of LOD/UT1 estimated from the output of the global circulation models of the atmosphere - ERA-40 reanalysis and of the ocean - OMCT

    Science.gov (United States)

    Korbacz, A.; Brzeziński, A.; Thomas, M.

    2008-04-01

    We use new estimates of the global atmospheric and oceanic angular momenta (AAM, OAM) to study the influence on LOD/UT1. The AAM series was calculated from the output fields of the atmospheric general circulation model ERA-40 reanalysis. The OAM series is an outcome of global ocean model OMCT simulation driven by global fields of the atmospheric parameters from the ERA-40 reanalysis. The excitation data cover the period between 1963 and 2001. Our calculations concern atmospheric and oceanic effects in LOD/UT1 over the periods between 20 days and decades. Results are compared to those derived from the alternative AAM/OAM data sets.

  18. A new sentence generator providing material for maximum reading speed measurement.

    Science.gov (United States)

    Perrin, Jean-Luc; Paillé, Damien; Baccino, Thierry

    2015-12-01

    A new method is proposed to generate text material for assessing the maximum reading speed of adult readers. The described procedure allows one to generate a vast number of equivalent short sentences. These sentences can be displayed for different durations in order to determine the reader's maximum speed using a psychophysical threshold algorithm. Each sentence is built so that it is either true or false according to common knowledge. The actual reading is verified by asking the reader to determine the truth value of each sentence. We based our design on the generator described by Crossland et al. and upgraded it. The new generator handles concepts distributed in an ontology, which allows an easy determination of the sentences' truth value and control of lexical and psycholinguistic parameters. In this way many equivalent sentences can be generated and displayed to perform the measurement. Maximum reading speed scores obtained with pseudo-randomly chosen sentences from the generator were strongly correlated with maximum reading speed scores obtained with traditional MNREAD sentences (r = .836). Furthermore, the large number of sentences that can be generated makes it possible to perform repeated measurements, since the possibility of a reader learning individual sentences is eliminated. Researchers interested in within-reader performance variability could use the proposed method for this purpose.

  19. Maximum Parsimony on Phylogenetic networks

    Science.gov (United States)

    2012-01-01

    Background Phylogenetic networks are generalizations of phylogenetic trees, that are used to model evolutionary events in various contexts. Several different methods and criteria have been introduced for reconstructing phylogenetic trees. Maximum Parsimony is a character-based approach that infers a phylogenetic tree by minimizing the total number of evolutionary steps required to explain a given set of data assigned on the leaves. Exact solutions for optimizing parsimony scores on phylogenetic trees have been introduced in the past. Results In this paper, we define the parsimony score on networks as the sum of the substitution costs along all the edges of the network; and show that certain well-known algorithms that calculate the optimum parsimony score on trees, such as Sankoff and Fitch algorithms extend naturally for networks, barring conflicting assignments at the reticulate vertices. We provide heuristics for finding the optimum parsimony scores on networks. Our algorithms can be applied for any cost matrix that may contain unequal substitution costs of transforming between different characters along different edges of the network. We analyzed this for experimental data on 10 leaves or fewer with at most 2 reticulations and found that for almost all networks, the bounds returned by the heuristics matched with the exhaustively determined optimum parsimony scores. Conclusion The parsimony score we define here does not directly reflect the cost of the best tree in the network that displays the evolution of the character. However, when searching for the most parsimonious network that describes a collection of characters, it becomes necessary to add additional cost considerations to prefer simpler structures, such as trees over networks. The parsimony score on a network that we describe here takes into account the substitution costs along the additional edges incident on each reticulate vertex, in addition to the substitution costs along the other edges which are
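
    As a reference point for the tree case that the network algorithms generalize, the classic Fitch bottom-up pass computes the minimum number of state changes for one character on a binary tree. The sketch below shows only that textbook step; the reticulate-vertex handling and the unequal-cost Sankoff variant discussed in the paper are not shown.

        def fitch_score(node, states, count=None):
            """node: ('leaf', name) or ('internal', left, right);
            states: dict mapping leaf name -> observed character state.
            Returns (candidate state set, running substitution count)."""
            if count is None:
                count = [0]
            if node[0] == "leaf":
                return {states[node[1]]}, count[0]
            left_set, _ = fitch_score(node[1], states, count)
            right_set, _ = fitch_score(node[2], states, count)
            inter = left_set & right_set
            if inter:
                return inter, count[0]
            count[0] += 1                       # one extra substitution is required here
            return left_set | right_set, count[0]

        tree = ("internal", ("internal", ("leaf", "A"), ("leaf", "B")), ("leaf", "C"))
        _, score = fitch_score(tree, {"A": "G", "B": "T", "C": "T"})
        # score == 1: a single substitution explains this character on this tree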

  20. WMAXC: a weighted maximum clique method for identifying condition-specific sub-network.

    Directory of Open Access Journals (Sweden)

    Bayarbaatar Amgalan

    Full Text Available Sub-networks can expose complex patterns in an entire bio-molecular network by extracting interactions that depend on temporal or condition-specific contexts. When genes interact with each other during cellular processes, they may form differential co-expression patterns with other genes across different cell states. The identification of condition-specific sub-networks is of great importance in investigating how a living cell adapts to environmental changes. In this work, we propose the weighted MAXimum clique (WMAXC) method to identify a condition-specific sub-network. WMAXC first proposes scoring functions that jointly measure condition-specific changes to both individual genes and gene-gene co-expressions. It then employs a weaker formula of a general maximum clique problem and relates the maximum scored clique of a weighted graph to the optimization of a quadratic objective function under sparsity constraints. We combine a continuous genetic algorithm and a projection procedure to obtain a single optimal sub-network that maximizes the objective function (scoring function) over the standard simplex (sparsity constraints). We applied the WMAXC method to both simulated data and real data sets of ovarian and prostate cancer. Compared with previous methods, WMAXC selected a large fraction of cancer-related genes, which were enriched in cancer-related pathways. The results demonstrated that our method efficiently captured a subset of genes relevant under the investigated condition.
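
    The optimization sub-problem, maximizing a quadratic score x^T W x over the standard simplex, can be approximated with a simple projected-gradient loop as sketched below. This is a generic illustration of that step only; the paper's continuous genetic algorithm and its condition-specific construction of the weight matrix W are not reproduced, and the step size and iteration count are arbitrary.

        import numpy as np

        def project_to_simplex(v):
            """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}."""
            u = np.sort(v)[::-1]
            css = np.cumsum(u) - 1.0
            rho = np.nonzero(u > css / (np.arange(len(v)) + 1))[0][-1]
            theta = css[rho] / (rho + 1.0)
            return np.maximum(v - theta, 0.0)

        def maximize_quadratic_on_simplex(W, steps=500, lr=0.01, seed=0):
            """Projected gradient ascent for f(x) = x^T W x with x constrained to the simplex."""
            rng = np.random.default_rng(seed)
            x = project_to_simplex(rng.random(W.shape[0]))
            for _ in range(steps):
                x = project_to_simplex(x + lr * (W + W.T) @ x)  # gradient of x^T W x is (W + W^T) x
            return x  # large entries indicate genes selected into the sub-network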

  1. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
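
    Because the CRPS of a Gaussian forecast has a closed form, the two estimators can be compared directly by minimizing either the mean CRPS or the negative log-likelihood over the same coefficients. The sketch below assumes a simple non-homogeneous Gaussian regression with one mean predictor and one spread predictor; the link functions and starting values are illustrative choices, not the study's setup.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def crps_normal(mu, sigma, y):
            """Closed-form CRPS of a N(mu, sigma^2) forecast against observation y."""
            z = (y - mu) / sigma
            return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

        def fit(ens_mean, ens_sd, obs, objective="crps"):
            """Gaussian NWP post-processing: mu = a + b*ens_mean, log(sigma) = c + d*log(ens_sd)."""
            def unpack(p):
                mu = p[0] + p[1] * ens_mean
                sigma = np.exp(p[2] + p[3] * np.log(ens_sd))
                return mu, sigma

            def loss(p):
                mu, sigma = unpack(p)
                if objective == "crps":
                    return np.mean(crps_normal(mu, sigma, obs))        # minimum CRPS estimation
                return -np.mean(norm.logpdf(obs, loc=mu, scale=sigma))  # maximum likelihood

            return minimize(loss, x0=np.array([0.0, 1.0, 0.0, 1.0]), method="Nelder-Mead").x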

  2. Further linkage data on Norrie disease.

    Science.gov (United States)

    Kivlin, J D; Sanborn, G E; Wright, E; Cannon, L; Carey, J

    1987-03-01

    We obtained a LOD score of +1.61 using DNA marker L1.28 in 5 generations of a family with Norrie disease, raising the total LOD score to +5.42. There have been no recombinations between the 2 loci in any family to date, making the marker useful for genetic counseling.

  3. Scoring of treatment-related late effects in prostate cancer

    International Nuclear Information System (INIS)

    Livsey, Jacqueline E.; Routledge, Jacqueline; Burns, Meriel; Swindell, Rick; Davidson, Susan E.; Cowan, Richard A.; Logue, John P.; Wylie, James P.

    2002-01-01

    Background and purpose: To assess the correlation between different general and organ specific quality of life and morbidity scoring methods in a cohort of men treated with radical radiotherapy for prostate cancer. Materials and methods: Men who had been treated with radical radiotherapy (50 Gy in 16 fractions over 21 days) for localized prostate cancer more than 3 years previously and who had no evidence of recurrent disease were invited to take part in the study. A total of 101 of 135 invited patients agreed and completed LENT/SOMA, UCLA Prostate Cancer Index, and 36 item RAND Health survey questionnaires. Results: The patients had comparable results with other published series with respect to the UCLA and SF-36 indices. There was significant correlation between the corresponding parts of the UCLA and LENT/SOMA scales (P<0.0005). However, for the same symptoms, a patient tended to score lower (worse) on the UCLA scale in comparison to LENT/SOMA. The relationship between the average LENT/SOMA score and maximum score was also not straightforward with each set of data revealing different information. Conclusions: The LENT/SOMA questions were, in the main, more wide-ranging and informative than the UCLA index. It is helpful to give both the overall and maximum LENT/SOMA scores to most efficiently use all of the data. There may need to be a further LENT/SOMA question to allow both symptoms of tenesmus and faecal urgency to be fully addressed

  4. Influence of LOD variations on seismic energy release

    Science.gov (United States)

    Riguzzi, F.; Krumm, F.; Wang, K.; Kiszely, M.; Varga, P.

    2009-04-01

    Tidal friction causes significant time variations of geodynamical parameters, among them geometrical flattening. The axial despinning of the Earth due to tidal friction, through the change of flattening, generates incremental meridional and azimuthal stresses. The stress pattern in an incompressible elastic upper mantle and crust is symmetric about the equator and has its inflection points at the critical latitude close to ±45°. Consequently, the distribution of seismic energy released by strong, shallow-focus earthquakes should also have sharp maxima at this latitude. To investigate the influence of length of day (LOD) variations on earthquake activity, an earthquake catalogue of the strongest seismic events (M>7.0) was compiled for the period 1900-2007. It is shown with the use of this catalogue that, for the studied time interval, the catalogue is complete and contains the seismic events responsible for more than 90% of the released seismic energy. Study of the catalogue for earthquakes M>7.0 shows that the seismic energy discharged by the strongest seismic events has significant maxima at ±45°, which makes it probable that the seismic activity of our planet is influenced by an external component, i.e. by tidal friction, which acts through the variation of the hydrostatic figure of the Earth that it causes. The distribution of earthquake numbers and energies along latitude was also investigated for global linear tectonic structures, such as mid-ocean ridges and subduction zones. It can be shown that the number of shallow-focus shocks has a distribution along latitude similar to that of the linear tectonic structures. This means that the position of the foci of seismic events is mainly controlled by tectonic activity.

  5. Performance of six prognostic scores in critically ill patients receiving renal replacement therapy

    Directory of Open Access Journals (Sweden)

    Elizabeth R. Maccariello

    2008-06-01

    Full Text Available BACKGROUND AND OBJECTIVES: There is no consensus about prognostic scores for use in patients with acute kidney injury (AKI). The aim of this study was to evaluate the performance of six prognostic scores in predicting hospital mortality in patients with AKI and need for renal replacement therapy (RRT). METHODS: Prospective cohort of patients admitted to the intensive care units (ICU) of three tertiary care hospitals that required RRT for AKI over a 32-month period. Patients with end-stage renal disease and those with ICU stay < 24h were excluded. Data from the first 24h of ICU admission were used to calculate SAPS II and APACHE II scores, and data from the first 24h of RRT were used in the calculation of LOD, ODIN, Liaño and Mehta scores. Discrimination was evaluated using the area under the ROC curve (AUROC) and calibration using the Hosmer-Lemeshow goodness-of-fit test. Hospital mortality was the end-point of interest. RESULTS: 467 patients were evaluated. The hospital mortality rate was 75%. Mean SAPS II and APACHE II scores were 48.5 ± 11.2 and 27.4 ± 6.3 points, and the median LOD score was 7 (5-8) points. Except for the Mehta score (p = 0.001), calibration was appropriate in all models. However, discrimination was uniformly unsatisfactory; the AUROC ranged from 0.60 for ODIN to 0.72 for the SAPS II and Mehta scores. In addition, except for Mehta, all models tended to underestimate hospital mortality. CONCLUSIONS: Organ dysfunction, general and

  6. Combination of Length of Day (LOD) values according to frequency windows

    Science.gov (United States)

    Fernández, L. I.; Arias, E. F.; Gambis, D.

    The concept of a combined solution rests on the fact that the different time series derived from the various space geodesy techniques are quite dissimilar. The main, easily detectable differences among the series are their sampling interval, temporal extent and quality. The data cover a recent 27-month period (July 1996 - October 1998). Length-of-day (LOD) estimates produced by 10 operational centres of the IERS (International Earth Rotation Service) from the GPS (Global Positioning System) and SLR (Satellite Laser Ranging) techniques were used. The combined time series obtained in this way was compared with the multi-technique combined EOP (Earth Orientation Parameters) solution derived by the IERS (C04). The noise behaviour of LOD for all techniques proved to be frequency dependent (Vondrak, 1998). For this reason, the data series were divided into frequency windows after biases and trends had been removed. Different weighting factors were then assigned to each window, discriminating by technique. Finally, these partially combined solutions were merged to obtain the final combined solution. We know that the best combined solution will have a precision lower than that of the data time series from which it originates. Even so, the importance of a reliable combined EOP series, that is, one of acceptable precision and free of obvious systematic errors, lies in the need for a reference EOP database for the study of the geophysical phenomena that drive variations in the Earth's rotation.

  7. Is a maximum Revised Trauma Score a safe triage tool for Helicopter Emergency Medical Services cancellations?

    NARCIS (Netherlands)

    Giannakopoulos, Georgios F.; Saltzherr, Teun Peter; Lubbers, Wouter D.; Christiaans, Herman M. T.; van Exter, Pieternel; de Lange-de Klerk, Elly S. M.; Bloemers, Frank W.; Zuidema, Wietse P.; Goslings, J. Carel; Bakker, Fred C.

    2011-01-01

    Introduction The Revised Trauma Score is used worldwide in the prehospital setting and provides a snapshot of the patient's physiological state. Several studies have shown that the reliability of the RTS is high in trauma outcomes. In the Netherlands, Helicopter Emergency Medical Services (HEMS) are

  8. Assessing the effect of the relative atmospheric angular momentum (AAM) on length-of-day (LOD) variations under climate warming

    Science.gov (United States)

    Lehmann, E.; Hansen, F.; Ulbrich, U.; Nevir, P.; Leckebusch, G. C.

    2009-04-01

    While most studies on model-projected future climate warming discuss climatological quantities, this study investigates the response of the relative atmospheric angular momentum (AAM) to climate warming for the 21st century and discusses its possible effects on future length-of-day variations. Following the derivation of the dynamic relation between atmosphere and solid Earth by Barnes et al. (Proc. Roy. Soc., 1985), this study relates the axial atmospheric excitation function X3 to changes in length-of-day that are proportional to variations in zonal winds. On interannual time scales, changes in the relative AAM (ERA40 reanalyses) are well correlated with observed length-of-day (LOD, IERS EOP C04) variability (r=0.75). The El Niño-Southern Oscillation (ENSO) is a prominent coupled ocean-atmosphere phenomenon causing global climate variability on interannual time scales. Correspondingly, changes in observed LOD relate to ENSO due to observed strong wind anomalies. This study investigates the varying effect of AAM anomalies on observed LOD by relating AAM variations to ENSO teleconnections (sea surface temperatures, SSTs) and the Pacific North America (PNA) oscillation for the 20th and 21st century. The varying strength of the effect of strong El Niño events (explained variance 71%-98%) on the present-day (1962-2000) observed LOD-AAM relation can be associated with variations in the location and strength of jet streams in the upper troposphere. Correspondingly, the relation between AAM and SSTs in the NIÑO 3.4 region also varies between explained variances of 15% and 73%. Recent coupled ocean-atmosphere projections of future climate warming suggest changes in the frequency and amplitude of ENSO events. Since changes in the relative AAM indicate shifts in large-scale atmospheric circulation patterns due to climate change, AAM - ENSO relations are assessed in coupled atmosphere-ocean (ECHAM5-OM1) climate warming projections (A1B) for the 21st century. A strong rise (+31%) in

  9. David Lodge's "Small World" and "Paradise News" as adventure novels

    OpenAIRE

    Kaušakīte, Inessa

    2007-01-01

    The aim of the master's thesis "David Lodge's 'Small World' and 'Paradise News' as adventure novels" is to draw attention to various features of the adventure genre in the works of the contemporary English novelist. The thesis also reveals the character traits of the hero of the postmodern adventure novel. The first chapter highlights the most important features of the classical adventure novel of 16th-century Spanish literature. This chapter examines the circumstances that helped the genre develop in England. The second chapter turns to the fate of David Lodge, who, as...

  10. [Familial febrile convulsions is supposed to link to human chromosome 19p13.3].

    Science.gov (United States)

    Qi, Y; Lü, J; Wu, X

    2001-01-10

    To localize the familial febrile convulsion (FC) genes on human chromosomes. For 63 FC pedigrees, tetranucleotide repeat markers D19S253, D19S395 and D19S591 on the short arm of chromosome 19, as well as dinucleotide repeat markers D8S84 and D8S85 on the long arm of chromosome 8, were genotyped. Transmission disequilibrium test (TDT) and Lod score calculations were carried out. The data were processed with the PPAP software package. All the alleles at every locus of FC probands and normal controls were in Hardy-Weinberg equilibrium. Transmission disequilibrium was found at D8S84, D19S395 and D19S591 in FC families. The chi(2) values were 4.0, 5.124 and 7.364, respectively; each P value was < 0.05 and statistically significant. The two-point Lod scores between D8S84 and FC, D8S85 and FC, D19S253 and FC, D19S395 and FC, and D19S591 and FC are 0.00002, 0.000017, 0.58, 1.53 and 1.42, respectively. The multi-point Lod score between markers on chromosome 8q and FC was 0.88, while the Lod score between markers on chromosome 19p and FC reached 2.78. The results of the non-parametric (TDT) and parametric (Lod score) methods were consistent on the whole. FC is linked with chromosome region 19p13.3, but not with chromosome 8q.
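
    The TDT statistic quoted is a McNemar-type chi-square on alleles transmitted versus not transmitted from heterozygous parents to affected offspring, χ² = (b − c)² / (b + c) with one degree of freedom. The counts in the example below are invented for illustration and are not those of the 63 pedigrees.

        from scipy.stats import chi2

        def tdt(transmitted, not_transmitted):
            """McNemar-type TDT on counts of the test allele transmitted vs. not
            transmitted by heterozygous parents to affected offspring."""
            stat = (transmitted - not_transmitted) ** 2 / (transmitted + not_transmitted)
            p = chi2.sf(stat, df=1)
            return stat, p

        stat, p = tdt(42, 25)   # hypothetical counts
        # stat is about 4.31, p is about 0.038: transmission disequilibrium at the 5% level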

  11. PASI and PQOL-12 score in psoriasis : Is there any correlation?

    Directory of Open Access Journals (Sweden)

    Vikas Shankar

    2011-01-01

    Full Text Available Background: Psoriasis, a common papulo-squamous disorder of the skin, is universal in occurrence and may adversely affect the quality of life. Whether the extent of the disease has any bearing upon the patients' psychology has not been studied much in this part of the world. Aims: The objective of this hospital-based cross-sectional study was to assess the disease severity objectively using the Psoriasis Area and Severity Index (PASI) score and the quality of life by the Psoriasis quality-of-life questionnaire-12 (PQOL-12), and to draw a correlation between them, if any. Materials and Methods: The PASI score denotes an objective method of scoring the severity of psoriasis, reflecting not only the body surface area but also erythema, induration and scaling. The PQOL-12 represents a 12-item self-administered, disease-specific psychometric instrument created to specifically assess quality-of-life issues that are particularly important to psoriasis patients. PASI and PQOL-12 scores were calculated for each patient to objectively assess their disease severity and quality of life. Results: In total, 34 psoriasis patients (16 males, 18 females), aged 8 to 55 years, were studied. Minimum and maximum PASI scores were 0.8 and 32.8, respectively, whereas minimum and maximum PQOL-12 scores were 4 and 120, respectively. PASI and PQOL-12 values showed minimal positive correlation (r = +0.422). Conclusion: Disease severity of psoriasis had no direct reflection on quality of life. Limited psoriasis on a visible area may also have a greater impact on mental health.

  12. [Cleft lip, alveolar and palate sequelae. Proposal of new alveolar score by the Alveolar Cleft Score (ACS) classification].

    Science.gov (United States)

    Molé, C; Simon, E

    2015-06-01

    The management of cleft lip, alveolar and palate sequelae remains problematic today. To optimize it, we tried to establish a new clinical index for diagnostic and prognostic purposes. Seven tissue indicators, that we consider to be important in the management of alveolar sequelae, are listed by assigning them individual scores. The final score, obtained by adding together the individual scores, can take a low, high or maximum value. We propose a new classification (ACS: Alveolar Cleft Score) that guides the therapeutic team to a prognosis approach, in terms of the recommended surgical and prosthetic reconstruction, the type of medical care required, and the preventive and supportive therapy to establish. Current studies are often only based on a standard radiological evaluation of the alveolar bone height at the cleft site. However, the gingival, the osseous and the cellular areas bordering the alveolar cleft sequelae induce many clinical parameters, which should be reflected in the morphological diagnosis, to better direct the surgical indications and the future prosthetic requirements, and to best maintain successful long term aesthetic and functional results. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  13. QTL Information Table - Q-TARO | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available [Flattened column-definition table; recoverable fields: ... population of plants; No. of plants: number of plants; LOD: LOD score; Parent A; Parent B; Direction (Parent); a) physical mapping marker; b) fine-mapping interval flanking marker of ...]

  14. Identification of Missense Mutation (I12T) in the BSND Gene and Bioinformatics Analysis

    Directory of Open Access Journals (Sweden)

    Hina Iqbal

    2011-01-01

    Full Text Available Nonsyndromic hearing loss is a paradigm of genetic heterogeneity, with 85 loci and 39 nuclear disease genes reported so far. Mutations of BSND have been shown to cause Bartter syndrome type IV, characterized by significant renal abnormalities and deafness, as well as nonsyndromic hearing loss. We studied a Pakistani consanguineous family. Clinical examinations of affected individuals did not reveal the presence of any associated signs that are hallmarks of Bartter syndrome type IV. Linkage analysis identified an area of 18.36 Mb shared by all affected individuals between markers D1S2706 and D1S1596. A maximum two-point LOD score of 2.55 with marker D1S2700 and a multipoint LOD score of 3.42 with marker D1S1661 were obtained. The BSND mutation p.I12T cosegregated in all extant members of our pedigree. BSND mutations can cause nonsyndromic hearing loss, and this is the second report of this mutation. The respective protein, BSND, was first modeled, and the identified mutation was then further analyzed using different bioinformatics tools; finally, the protein and its mutant were docked with CLCNKB and REN, respectively, to examine the interactions of BSND.

  15. Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM

    Science.gov (United States)

    Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman

    2012-01-01

    This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…

  16. Marfan syndrome is closely linked to a marker on chromosome 15q1.5→q2.1

    Energy Technology Data Exchange (ETDEWEB)

    Tsipouras, P.; Sarfarazi, M.; Devi, A. (Univ. of Connecticut Health Center, Farmington (United States)); Weiffenbach, B. (Collaborative Research, Inc., Waltham, MA (United States)); Boxer, M. (Ninewells Hospital and Medical School, Dundee (Scotland))

    1991-05-15

    Marfan syndrome is a systemic disorder of the connective tissue inherited as an autosomal dominant trait. The disorder imparts significant morbidity and mortality. The etiology of the disorder remains elusive. A recent study localized the gene for Marfan syndrome on chromosome 15. The authors present data showing that marker D15S48 is genetically linked to Marfan syndrome. Pairwise linkage analysis gave a maximum lod (logarithm of odds) score of Z = 11.78 at θ = 0.02. Furthermore, our data suggest that the Marfan syndrome locus is possibly flanked on either side by D15S48 and D15S49.

  17. Missing data methods for dealing with missing items in quality of life questionnaires. A comparison by simulation of personal mean score, full information maximum likelihood, multiple imputation, and hot deck techniques applied to the SF-36 in the French 2003 decennial health survey.

    Science.gov (United States)

    Peyre, Hugo; Leplège, Alain; Coste, Joël

    2011-03-01

    Missing items are common in quality of life (QoL) questionnaires and present a challenge for research in this field. It remains unclear which of the various methods proposed to deal with missing data performs best in this context. We compared personal mean score, full information maximum likelihood, multiple imputation, and hot deck techniques using various realistic simulation scenarios of item missingness in QoL questionnaires constructed within the framework of classical test theory. Samples of 300 and 1,000 subjects were randomly drawn from the 2003 INSEE Decennial Health Survey (of 23,018 subjects representative of the French population and having completed the SF-36), and various patterns of missing data were generated according to three different item non-response rates (3, 6, and 9%) and three types of missing data (Little and Rubin's "missing completely at random," "missing at random," and "missing not at random"). The missing data methods were evaluated in terms of accuracy and precision for the analysis of one descriptive and one association parameter for three different scales of the SF-36. For all item non-response rates and types of missing data, multiple imputation and full information maximum likelihood appeared superior to the personal mean score and especially to hot deck in terms of accuracy and precision; however, the use of the personal mean score was associated with insignificant bias. The personal mean score appears nonetheless appropriate for dealing with items missing from completed SF-36 questionnaires in most situations of routine use. These results can reasonably be extended to other questionnaires constructed according to classical test theory.
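    For context, the personal mean score evaluated above replaces a missing item with the mean of the items the respondent did complete on the same scale, usually subject to a minimum-response rule. A minimal sketch of that idea is given below; the half-items rule and the toy data are assumptions for illustration, not details taken from the article.
```python
import numpy as np
import pandas as pd

def personal_mean_score(items: pd.DataFrame, min_fraction: float = 0.5) -> pd.DataFrame:
    """Impute missing items with each respondent's mean over answered items
    on the same scale, provided enough items were answered."""
    answered = items.notna().sum(axis=1)
    enough = answered >= np.ceil(min_fraction * items.shape[1])
    row_means = items.mean(axis=1, skipna=True)
    imputed = items.apply(lambda col: col.fillna(row_means))
    imputed[~enough] = np.nan            # too few answered items: leave the scale missing
    return imputed

# Toy 4-item scale for three respondents
scale = pd.DataFrame({"q1": [3, 2, np.nan], "q2": [4, np.nan, np.nan],
                      "q3": [np.nan, 2, np.nan], "q4": [5, 3, 1]})
print(personal_mean_score(scale))
```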

  18. PNNL: A Supervised Maximum Entropy Approach to Word Sense Disambiguation

    Energy Technology Data Exchange (ETDEWEB)

    Tratz, Stephen C.; Sanfilippo, Antonio P.; Gregory, Michelle L.; Chappell, Alan R.; Posse, Christian; Whitney, Paul D.

    2007-06-23

    In this paper, we describe the PNNL Word Sense Disambiguation system as applied to the English All-Words task in SemEval 2007. We use a supervised learning approach, employing a large number of features and using Information Gain for dimension reduction. Our Maximum Entropy approach combined with a rich set of features produced results that are significantly better than baseline and are the highest F-score for the fine-grained English All-Words subtask.
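    A maximum entropy classifier over sparse lexical features is, in its multinomial form, equivalent to regularized multinomial logistic regression, and information-gain-style feature selection can be approximated with mutual information. The sketch below illustrates that generic pipeline with scikit-learn; the contexts, senses and the choice of k are invented for illustration and are not the PNNL feature set.
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training contexts for the ambiguous word "bank" (labels are senses)
contexts = ["deposit money at the bank", "the bank raised interest rates",
            "we sat on the river bank", "fishing from the grassy bank"]
senses = ["finance", "finance", "river", "river"]

wsd = make_pipeline(
    CountVectorizer(),                          # bag-of-words context features
    SelectKBest(mutual_info_classif, k=5),      # information-gain-like feature selection
    LogisticRegression(max_iter=1000),          # multinomial maximum entropy model
)
wsd.fit(contexts, senses)
print(wsd.predict(["loan officer at the bank"]))
```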

  19. The Pooling-score (P-score): inter- and intra-rater reliability in endoscopic assessment of the severity of dysphagia.

    Science.gov (United States)

    Farneti, D; Fattori, B; Nacci, A; Mancini, V; Simonelli, M; Ruoppolo, G; Genovese, E

    2014-04-01

    This study evaluated the intra- and inter-rater reliability of the Pooling score (P-score) in the clinical endoscopic evaluation of the severity of swallowing disorders, considering excess residue in the pharynx and larynx. The score (minimum 4 - maximum 11) is obtained as the sum of the scores given to the site of the bolus, the amount, and the ability to control residue/bolus pooling, the latter assessed on the basis of cough, raclage, and the number of dry voluntary or reflex swallowing acts ( 5). Four judges evaluated 30 short films of pharyngeal transit of 10 solid (1/4 of a cracker), 11 creamy (1 tablespoon of jam) and 9 liquid (1 tablespoon of 5 cc of water coloured with methylene blue, 1 ml in 100 ml) boluses in 23 subjects (10 M/13 F, age from 31 to 76 yrs, mean age 58.56±11.76 years) with different pathologies. The films were randomly distributed on two CDs, which differed in terms of the sequence of the films, and were given to the judges (after an explanatory session) at time 0, 24 hours later (time 1) and after 7 days (time 2). The inter- and intra-rater reliability of the P-score was calculated using the intra-class correlation coefficient (ICC; 3,k). The possibility that the consistency of boluses could affect the scoring of the films was considered. The ICC for site, amount, management and the P-score total was found to be, respectively, 0.999, 0.997, 1.00 and 0.999. Clinical evaluation of a criterion of severity of a swallowing disorder remains a crucial point in the management of patients with pathologies that predispose to complications. The P-score, derived from static and dynamic parameters, yielded a very high correlation among the scores attributed by the four judges during observations carried out at different times. Bolus consistencies did not affect the outcome of the test: the analysis of variance, performed to verify whether the scores attributed by the four judges to the selected parameters might be influenced by the different consistencies of the boluses, was not significant.
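    The reliability statistic used above, ICC(3,k), treats the judges as fixed raters and measures the consistency of their averaged ratings. The sketch below shows how such a coefficient can be computed from a long-format table of ratings with the pingouin package; the data frame layout, column names and toy scores are assumptions for illustration.
```python
import pandas as pd
import pingouin as pg

# Long-format ratings: each film scored by each judge (toy values)
ratings = pd.DataFrame({
    "film":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "judge": ["A", "B", "C"] * 3,
    "p_score": [5, 6, 5, 9, 10, 9, 7, 7, 8],
})

icc = pg.intraclass_corr(data=ratings, targets="film",
                         raters="judge", ratings="p_score")
# "ICC3k" corresponds to average ratings by fixed raters, i.e. ICC(3,k)
print(icc.loc[icc["Type"] == "ICC3k", ["Type", "ICC", "CI95%"]])
```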

  20. Description and validation of a scoring system for tomosynthesis in pulmonary cystic fibrosis

    Energy Technology Data Exchange (ETDEWEB)

    Vult von Steyern, Kristina; Bjoerkman-Burtscher, Isabella M.; Bozovic, Gracijela; Wiklund, Marie; Geijer, Mats [Skaane University Hospital, Lund University, Centre for Medical Imaging and Physiology, Lund (Sweden); Hoeglund, Peter [Skaane University Hospital, Competence Centre for Clinical Research, Lund (Sweden)

    2012-12-15

    To design and validate a scoring system for tomosynthesis (digital tomography) in pulmonary cystic fibrosis. A scoring system dedicated to tomosynthesis in pulmonary cystic fibrosis was designed. Three radiologists independently scored 88 pairs of radiographs and tomosynthesis examinations of the chest in 60 patients with cystic fibrosis and 7 oncology patients. Radiographs were scored according to the Brasfield scoring system and tomosynthesis examinations were scored using the new scoring system. Observer agreements for the tomosynthesis score were almost perfect for the total score with square-weighted kappa >0.90, and generally substantial to almost perfect for subscores. Correlation between the tomosynthesis score and the Brasfield score was good for the three observers (Kendall's rank correlation tau 0.68, 0.77 and 0.78). Tomosynthesis was generally scored higher as a percentage of the maximum score. Observer agreements for the total score for Brasfield score were almost perfect (square-weighted kappa 0.80, 0.81 and 0.85). The tomosynthesis scoring system seems robust and correlates well with the Brasfield score. Compared with radiography, tomosynthesis is more sensitive to cystic fibrosis changes, especially bronchiectasis and mucus plugging, and the new tomosynthesis scoring system offers the possibility of more detailed and accurate scoring of disease severity. (orig.)

  1. Description and validation of a scoring system for tomosynthesis in pulmonary cystic fibrosis

    International Nuclear Information System (INIS)

    Vult von Steyern, Kristina; Bjoerkman-Burtscher, Isabella M.; Bozovic, Gracijela; Wiklund, Marie; Geijer, Mats; Hoeglund, Peter

    2012-01-01

    To design and validate a scoring system for tomosynthesis (digital tomography) in pulmonary cystic fibrosis. A scoring system dedicated to tomosynthesis in pulmonary cystic fibrosis was designed. Three radiologists independently scored 88 pairs of radiographs and tomosynthesis examinations of the chest in 60 patients with cystic fibrosis and 7 oncology patients. Radiographs were scored according to the Brasfield scoring system and tomosynthesis examinations were scored using the new scoring system. Observer agreements for the tomosynthesis score were almost perfect for the total score with square-weighted kappa >0.90, and generally substantial to almost perfect for subscores. Correlation between the tomosynthesis score and the Brasfield score was good for the three observers (Kendall's rank correlation tau 0.68, 0.77 and 0.78). Tomosynthesis was generally scored higher as a percentage of the maximum score. Observer agreements for the total score for Brasfield score were almost perfect (square-weighted kappa 0.80, 0.81 and 0.85). The tomosynthesis scoring system seems robust and correlates well with the Brasfield score. Compared with radiography, tomosynthesis is more sensitive to cystic fibrosis changes, especially bronchiectasis and mucus plugging, and the new tomosynthesis scoring system offers the possibility of more detailed and accurate scoring of disease severity. (orig.)
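    Square-weighted (quadratic) kappa, as used for the observer agreement above, penalizes disagreements by the squared distance between ordinal score categories. A brief sketch with scikit-learn follows, using invented ratings rather than the study data.
```python
from sklearn.metrics import cohen_kappa_score

# Ordinal total scores assigned by two observers to the same examinations (toy data)
observer_1 = [3, 5, 2, 8, 6, 4, 7, 5]
observer_2 = [3, 4, 2, 8, 7, 4, 6, 5]

kappa_quadratic = cohen_kappa_score(observer_1, observer_2, weights="quadratic")
print(f"square-weighted kappa = {kappa_quadratic:.2f}")
```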

  2. The normalization of data in the Constant-Murley score for the shoulder. A study conducted on 563 healthy subjects.

    Science.gov (United States)

    Grassi, F A; Tajana, M S

    2003-01-01

    The study was conducted in order to evaluate the theoretical design of the Constant-Murley system and to reveal any difficulties in obtaining data when it is used in 563 subjects not affected with shoulder pathology. The total mean score for the subjects examined was 85.2 points (minimum 75, maximum 100 points). Values revealed a decreasing trend beginning at 50 years of age for men and 30 for women. Only 4 subjects achieved a maximum score of 100. The measurements taken allowed us to elaborate a reference table based on sex and age, which was required to calculate the correct score. These values differ from those reported by the inventors of the system and they reveal the need to compile personal tables for the normalization of scores.
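    Normalization of the kind described is commonly expressed as the raw Constant-Murley score divided by the age- and sex-matched reference value. The sketch below assumes that convention; the reference values in the table are placeholders, not those derived in this study.
```python
# Hypothetical age/sex reference table (points); real values must come from normative data
REFERENCE = {("M", "40-49"): 94, ("M", "50-59"): 90,
             ("F", "30-39"): 91, ("F", "40-49"): 87}

def relative_constant_score(raw: float, sex: str, age_band: str) -> float:
    """Raw Constant-Murley score expressed as a percentage of the matched reference value."""
    return 100.0 * raw / REFERENCE[(sex, age_band)]

print(f"{relative_constant_score(78, 'M', '50-59'):.1f}% of the age/sex reference")
```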

  3. New Combined Scoring System for Predicting Respiratory Failure in Iraqi Patients with Guillain-Barré Syndrome

    Directory of Open Access Journals (Sweden)

    Zaki Noah Hasan

    2010-09-01

    Full Text Available The Guillain-Barré syndrome (GBS) is an acute post-infective autoimmune polyradiculoneuropathy and the commonest peripheral neuropathy causing respiratory failure. The aim of the study is to use the New Combined Scoring System to anticipate respiratory failure, so that elective measures can be performed without waiting for emergency situations to occur.
    Patients and methods: Fifty patients with GBS were studied. Eight clinical parameters were assessed for each patient, including progression to maximum weakness, respiratory rate per minute, breath-holding count (the number of digits the patient can count while holding his breath), facial muscle weakness (unilateral or bilateral), bulbar muscle weakness, neck flexor muscle weakness, and limb weakness. A score was given to each parameter, and a combined score was constructed by taking all the above clinical parameters into consideration. Results and discussion: Fifteen patients (30%) enrolled in our study developed respiratory failure. There was a highly significant statistical association between the development of respiratory failure and lower grades of the bulbar muscle weakness score, breath-holding count score, neck muscle weakness score, lower- and upper-limb weakness scores and respiratory rate score, as well as a total sum score above 16 out of 30 (p-value = 0.000). No significant statistical difference was found regarding progression to maximum weakness (p-value = 0.675) or facial muscle weakness (p-value = 0.482).
    Conclusion: Patients who obtain a combined score above 16 out of 30 are at great risk of developing respiratory failure.
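    As a concrete illustration of how such a combined score flags high-risk patients, the sketch below sums per-parameter points and applies the reported cut-off of 16 out of 30. The parameter names and the individual point values are assumptions, since the article's exact scoring grid is not reproduced here; only the cut-off comes from the abstract.
```python
def gbs_combined_score(parameter_points: dict[str, int]) -> tuple[int, bool]:
    """Sum the per-parameter points; a total above 16/30 flags high risk of respiratory failure."""
    total = sum(parameter_points.values())
    return total, total > 16

# Hypothetical per-parameter points (higher = worse); names are illustrative only
patient = {"respiratory_rate": 3, "breath_holding_count": 4, "bulbar_weakness": 3,
           "neck_flexor_weakness": 2, "facial_weakness": 1, "limb_weakness": 3,
           "progression_to_max_weakness": 2}
total, high_risk = gbs_combined_score(patient)
print(f"combined score = {total}/30, high risk = {high_risk}")
```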

  4. Statins Reduce the Risk of Dementia in Patients with Late-Onset Depression: A Retrospective Cohort Study.

    Science.gov (United States)

    Yang, Ya-Hsu; Teng, Hao-Wei; Lai, Yen-Ting; Li, Szu-Yuan; Lin, Chih-Ching; Yang, Albert C; Chan, Hsiang-Lin; Hsieh, Yi-Hsuan; Lin, Chiao-Fan; Hsu, Fu-Ying; Liu, Chih-Kuang; Liu, Wen-Sheng

    2015-01-01

    Patients with late-onset depression (LOD) have been reported to run a higher risk of subsequent dementia. The present study was conducted to assess whether statins can reduce the risk of dementia in these patients. We used data from the National Health Insurance of Taiwan during 1996-2009. Standardized Incidence Ratios (SIRs) were calculated for LOD and subsequent dementia. The criteria for LOD diagnoses included age ≥65 years, diagnosis of depression after 65 years of age, at least three service claims, and treatment with antidepressants. The time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with the one-to-one nearest-neighbor matching model were used to select matching patients for validation studies. Kaplan-Meier estimates were used to describe the occurrence of dementia after the diagnosis of LOD. In total, 45,973 patients aged ≥65 years were enrolled. The prevalence of LOD was 12.9% (5,952/45,973). Patients with LOD had a higher incidence of subsequent dementia than those without LOD (Odds Ratio: 2.785; 95% CI 2.619-2.958). Among patients with LOD, users of lipid-lowering agents (LLAs) for at least 3 months had a lower incidence of subsequent dementia than non-users (Hazard Ratio = 0.781, 95% CI 0.685-0.891). Nevertheless, only statin users showed a reduced risk of dementia (Hazard Ratio = 0.674, 95% CI 0.547-0.832), whereas other LLAs did not; this was further validated by Kaplan-Meier estimates after we used propensity scores with the one-to-one nearest-neighbor matching model to control for confounding factors. Statins may reduce the risk of subsequent dementia in patients with LOD.
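    The time-dependent Cox model and propensity matching described above require the original claims data, but the basic Cox proportional hazards fit can be sketched with the lifelines package on a simulated cohort. The column names, the simulated protective effect of statins and the censoring scheme are illustrative assumptions only.
```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
statin = rng.binomial(1, 0.4, n)                      # simulated statin use
age = rng.normal(75, 6, n)
baseline = rng.exponential(10, n)
# Simulated time to dementia: statin users get longer dementia-free time on average
time = baseline * np.exp(0.35 * statin - 0.02 * (age - 75))
event = rng.binomial(1, 0.7, n)                       # 1 = dementia observed, 0 = censored

cohort = pd.DataFrame({"time": time, "event": event, "statin": statin, "age": age})
cph = CoxPHFitter()
cph.fit(cohort, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])                # hazard ratios for statin and age
```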

  5. Scoring sacroiliac joints by magnetic resonance imaging. A Multiple-reader reliability experiment

    DEFF Research Database (Denmark)

    Landewé, RB; Hermann, KG; van der Heijde, DM

    2005-01-01

    Magnetic resonance imaging (MRI) of the sacroiliac (SI) joints and the spine is increasingly important in the assessment of inflammatory activity and structural damage in clinical trials with patients with ankylosing spondylitis (AS). We investigated inter-reader reliability and sensitivity...... for 'depth' and 'intensity,' and the fifth method included the SPARCC slice with the maximum score. Inter-reader reliability was investigated by calculating intraclass correlation coefficients (ICC) for all readers together and for all possible reader pairs. Sensitivity to change was investigated...... values close to zero (no agreement) and highest observed values over 0.80 (excellent agreement). In general, agreement of status scores was somewhat better than agreement of change scores, and agreement of the comprehensive SPARCC scoring system was somewhat better than agreement of the more condensed...

  6. The worst case complexity of maximum parsimony.

    Science.gov (United States)

    Carmel, Amir; Musa-Lempel, Noa; Tsur, Dekel; Ziv-Ukelson, Michal

    2014-11-01

    One of the core classical problems in computational biology is that of constructing the most parsimonious phylogenetic tree interpreting an input set of sequences from the genomes of evolutionarily related organisms. We reexamine the classical maximum parsimony (MP) optimization problem for the general (asymmetric) scoring matrix case, where rooted phylogenies are implied, and analyze the worst case bounds of three approaches to MP: The approach of Cavalli-Sforza and Edwards, the approach of Hendy and Penny, and a new agglomerative, "bottom-up" approach we present in this article. We show that the second and third approaches are faster than the first one by a factor of Θ(√n) and Θ(n), respectively, where n is the number of species.
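    For readers unfamiliar with MP scoring under a general scoring matrix, the Sankoff dynamic program computes the minimum total substitution cost of a single character on a fixed rooted tree. The sketch below is only that per-character scoring step, not a tree search; the tiny tree, the observed states and the transition/transversion-style cost matrix are illustrative (the matrix here happens to be symmetric, but the same code accepts asymmetric costs).
```python
import math

STATES = "ACGT"
# Illustrative cost matrix: cost[parent_state][child_state]; transitions cheaper than transversions
COST = {p: {c: (0 if p == c else (1 if {p, c} <= set("AG") or {p, c} <= set("CT") else 2))
            for c in STATES} for p in STATES}

def sankoff(node, leaf_states):
    """Return dict state -> minimal substitution cost of the subtree rooted at `node`."""
    if isinstance(node, str):                      # leaf: the name of a taxon
        observed = leaf_states[node]
        return {s: (0.0 if s == observed else math.inf) for s in STATES}
    left, right = (sankoff(child, leaf_states) for child in node)
    return {s: min(COST[s][t] + left[t] for t in STATES) +
               min(COST[s][t] + right[t] for t in STATES)
            for s in STATES}

tree = (("human", "chimp"), ("mouse", "rat"))      # rooted binary tree as nested tuples
leaves = {"human": "A", "chimp": "G", "mouse": "C", "rat": "C"}
costs = sankoff(tree, leaves)
print("minimum parsimony cost:", min(costs.values()))
```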

  7. Recombinational event between Norrie disease and DXS7 loci.

    Science.gov (United States)

    Ngo, J T; Spence, M A; Cortessis, V; Sparkes, R S; Bateman, J B

    1988-07-01

    We have identified a family affected with X-linked recessive Norrie disease, in which a recombinational event occurred between the disease locus and the DXS7 locus identified by the probe L1.28. The addition of our family brings the total of published informative families to seven, with a maximum lod score of 7.58 at a recombination frequency of 0.038 +/- 0.036. This finding indicates that the L1.28 probe is useful but may not be completely reliable for prenatal diagnosis and that the gene for Norrie disease is not within the DNA sequence identified by the L1.28 probe.

  8. 3D BUILDING MODELING IN LOD2 USING THE CITYGML STANDARD

    Directory of Open Access Journals (Sweden)

    D. Preka

    2016-10-01

    Full Text Available Over the last decade, scientific research has been increasingly focused on the third dimension in all fields and especially in sciences related to geographic information, the visualization of natural phenomena and the visualization of the complex urban reality. The field of 3D visualization has achieved rapid development and dynamic progress, especially in urban applications, while the technical restrictions on the use of 3D information tend to subside due to advancements in technology. A variety of 3D modeling techniques and standards has already been developed, as they gain more traction in a wide range of applications. Such a modern standard is CityGML, which is open and allows for sharing and exchanging of 3D city models. Within the scope of this study, key issues for the 3D modeling of spatial objects and cities are considered, and specifically the key elements and abilities of the CityGML standard, which is used in order to produce a 3D model of 14 buildings that constitute a block in the municipality of Kaisariani, Athens, in Level of Detail 2 (LoD2), as well as the corresponding relational database. The proposed tool is based upon the 3DCityDB package in tandem with a geospatial database (PostgreSQL with the PostGIS 2.0 extension). The latter allows for the execution of complex queries regarding the spatial distribution of data. The system is implemented in order to facilitate a real-life scenario in a suburb of Athens.

  9. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  10. Description and validation of a scoring system for tomosynthesis in pulmonary cystic fibrosis.

    Science.gov (United States)

    Vult von Steyern, Kristina; Björkman-Burtscher, Isabella M; Höglund, Peter; Bozovic, Gracijela; Wiklund, Marie; Geijer, Mats

    2012-12-01

    To design and validate a scoring system for tomosynthesis (digital tomography) in pulmonary cystic fibrosis. A scoring system dedicated to tomosynthesis in pulmonary cystic fibrosis was designed. Three radiologists independently scored 88 pairs of radiographs and tomosynthesis examinations of the chest in 60 patients with cystic fibrosis and 7 oncology patients. Radiographs were scored according to the Brasfield scoring system and tomosynthesis examinations were scored using the new scoring system. Observer agreements for the tomosynthesis score were almost perfect for the total score with square-weighted kappa >0.90, and generally substantial to almost perfect for subscores. Correlation between the tomosynthesis score and the Brasfield score was good for the three observers (Kendall's rank correlation tau 0.68, 0.77 and 0.78). Tomosynthesis was generally scored higher as a percentage of the maximum score. Observer agreements for the total score for Brasfield score were almost perfect (square-weighted kappa 0.80, 0.81 and 0.85). The tomosynthesis scoring system seems robust and correlates well with the Brasfield score. Compared with radiography, tomosynthesis is more sensitive to cystic fibrosis changes, especially bronchiectasis and mucus plugging, and the new tomosynthesis scoring system offers the possibility of more detailed and accurate scoring of disease severity. Tomosynthesis is more sensitive than conventional radiography for pulmonary cystic fibrosis changes. The radiation dose from chest tomosynthesis is low compared with computed tomography. Tomosynthesis may become useful in the regular follow-up of patients with cystic fibrosis.

  11. Influence of Dynamic Neuromuscular Stabilization Approach on Maximum Kayak Paddling Force

    Directory of Open Access Journals (Sweden)

    Davidek Pavel

    2018-03-01

    Full Text Available The purpose of this study was to examine the effect of Dynamic Neuromuscular Stabilization (DNS exercise on maximum paddling force (PF and self-reported pain perception in the shoulder girdle area in flatwater kayakers. Twenty male flatwater kayakers from a local club (age = 21.9 ± 2.4 years, body height = 185.1 ± 7.9 cm, body mass = 83.9 ± 9.1 kg were randomly assigned to the intervention or control groups. During the 6-week study, subjects from both groups performed standard off-season training. Additionally, the intervention group engaged in a DNS-based core stabilization exercise program (quadruped exercise, side sitting exercise, sitting exercise and squat exercise after each standard training session. Using a kayak ergometer, the maximum PF stroke was measured four times during the six weeks. All subjects completed the Disabilities of the Arm, Shoulder and Hand (DASH questionnaire before and after the 6-week interval to evaluate subjective pain perception in the shoulder girdle area. Initially, no significant differences in maximum PF and the DASH questionnaire were identified between the two groups. Repeated measures analysis of variance indicated that the experimental group improved significantly compared to the control group on maximum PF (p = .004; Cohen’s d = .85, but not on the DASH questionnaire score (p = .731 during the study. Integration of DNS with traditional flatwater kayak training may significantly increase maximum PF, but may not affect pain perception to the same extent.
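    Cohen's d for a between-group comparison, as reported above, is the difference in group means divided by the pooled standard deviation. A short sketch with made-up paddling-force values:
```python
import numpy as np

def cohens_d(group_a, group_b):
    """Between-group effect size: mean difference over pooled standard deviation."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    pooled_var = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Illustrative maximum paddling-force gains (N) after the intervention
dns_group = [42, 38, 45, 40, 44, 39, 41, 43, 37, 46]
control   = [31, 35, 30, 33, 29, 34, 32, 36, 28, 33]
print(f"Cohen's d = {cohens_d(dns_group, control):.2f}")
```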

  12. Mammographic image restoration using maximum entropy deconvolution

    International Nuclear Information System (INIS)

    Jannetta, A; Jackson, J C; Kotre, C J; Birch, I P; Robson, K J; Padgett, R

    2004-01-01

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization

  13. [Different scoring systems to evaluate the prognosis of Fournier's gangrene: A comparative study].

    Science.gov (United States)

    Zhu, Xiao-dong; Ding, Fei; Wang, Guo-dong; Shao, Qiang

    2015-08-01

    To sum up the experience in diagnosis and treatment of Fournier's gangrene and find an optimal evaluation tool for its prognosis by comparing currently used prognostic scoring systems. We retrospectively analyzed 16 cases of Fournier's gangrene diagnosed and surgically treated in our hospital between 2004 and 2012. Using Fournier's Gangrene Severity Index (FGSI), Uludag Fournier's Gangrene Severity Index (UFGSI), Age-Adjusted Charlson Comorbidity Index (ACCI), and Surgical Apgar Score (sAPGAR) , we obtained the prognostic scores of the patients and made comparisons among different scoring systems. FGSI, UFGSI, ACCI, and sAPGAR were all clinically used scoring systems. Statistically significant differences were found in the scores of ACCI and UFGSI but not in those of FGSI and sAPGAR between the death and survival groups, with the maximum area under the ROC curve and minimum standard error for the ACCI score. Both ACCI and UFGSI are useful for evaluating the prognosis of Fournier's gangrene. However, ACCI is even better for its higher sensitivity and specificity and easier clinical collection.
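    Comparing prognostic scores by the area under the ROC curve, as done above, can be sketched as follows; the simulated scores and outcomes are stand-ins, not the study's 16 patients, and the noise levels are chosen only to make one score more discriminative than the other.
```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
died = rng.binomial(1, 0.3, 60)                 # simulated outcome (1 = death)
# Simulated prognostic scores: higher in non-survivors, with different amounts of noise
acci = 3 + 2.0 * died + rng.normal(0, 1.0, 60)
fgsi = 5 + 0.5 * died + rng.normal(0, 2.0, 60)

for name, score in [("ACCI", acci), ("FGSI", fgsi)]:
    print(f"{name}: AUC = {roc_auc_score(died, score):.2f}")
```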

  14. Difference of achalasia subtypes based on clinical symptoms, radiographic findings, and stasis scores

    Directory of Open Access Journals (Sweden)

    A. Meillier

    2018-01-01

    Conclusions: Achalasia subtypes had similar clinical symptoms, except for increased vomiting severity in subtype I. The maximum esophageal diameter in subtype II was significantly greater than in subtype III. Esophageal stasis scores were similar. Thus, high-resolution esophageal manometry remains essential in assessing achalasia subtypes.

  15. Morphologic and functional scoring of cystic fibrosis lung disease using MRI

    International Nuclear Information System (INIS)

    Eichinger, Monika; Optazaite, Daiva-Elzbieta; Kopp-Schneider, Annette; Hintze, Christian; Biederer, Jürgen; Niemann, Anne; Mall, Marcus A.; Wielpütz, Mark O.; Kauczor, Hans-Ulrich; Puderbach, Michael

    2012-01-01

    Magnetic resonance imaging (MRI) gains increasing importance in the assessment of cystic fibrosis (CF) lung disease. The aim of this study was to develop a morpho-functional MR-scoring-system and to evaluate its intra- and inter-observer reproducibility and clinical practicability to monitor CF lung disease over a broad severity range from infancy to adulthood. 35 CF patients with broad age range (mean 15.3 years; range 0.5–42) were examined by morphological and functional MRI. Lobe based analysis was performed for parameters bronchiectasis/bronchial-wall-thickening, mucus plugging, abscesses/sacculations, consolidations, special findings and perfusion defects. The maximum global score was 72. Two experienced radiologists scored the images at two time points (interval 10 weeks). Upper and lower limits of agreement, concordance correlation coefficients (CCC), total deviation index and coverage probability were calculated for global, morphology, function, component and lobar scores. Global scores ranged from 6 to 47. Intra- and inter-reader agreement for global scores were good (CCC: 0.98 (R1), 0.94 (R2), 0.97 (R1/R2)) and were comparable between high and low scores. Our results indicate that the proposed morpho-functional MR-scoring-system is reproducible and applicable for semi-quantitative evaluation of a large spectrum of CF lung disease severity. This scoring-system can be applied for the routine assessment of CF lung disease and maybe as endpoint for clinical trials.
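    The concordance correlation coefficient (CCC) quoted above measures agreement between two continuous ratings relative to the identity line. Its standard (Lin) form is simple to compute directly; the paired scores below are invented.
```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between two sets of ratings."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * cov_xy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

reader_1 = [6, 12, 20, 35, 41, 15, 28, 9]       # illustrative global MR scores
reader_2 = [7, 11, 22, 33, 43, 14, 27, 10]
print(f"CCC = {concordance_correlation(reader_1, reader_2):.3f}")
```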

  16. Differences in the accuracy of the Wells, Caprini and Padua scores in deep vein thrombosis diagnosis

    Science.gov (United States)

    Gatot, D.; Mardia, A. I.

    2018-03-01

    Deep Vein Thrombosis (DVT) is venous thrombosis in the lower limbs. Diagnosis is by venography or compression ultrasound. However, these examinations are not yet available in some health facilities. Therefore, many scoring systems have been developed for the diagnosis of DVT. The scoring method is practical and safe to use, in addition to its efficacy and effectiveness in terms of treatment and costs. The existing scoring systems are the Wells, Caprini and Padua scores. There have been many studies comparing the accuracy of these scores, but not in Medan. Therefore, we were interested in a comparative study of the Wells, Caprini and Padua scores in Medan. An observational, analytical, case-control study was conducted to perform diagnostic tests on the Wells, Caprini and Padua scores to predict the risk of DVT. The study was conducted at H. Adam Malik Hospital in Medan. Of a total of 72 subjects, 39 (54.2%) were men and the mean age was 53.14 years. The Wells, Caprini and Padua scores had sensitivities of 80.6%, 61.1% and 50%, respectively; specificities of 80.6%, 66.7% and 75%, respectively; and accuracies of 87.5%, 64.3% and 65.7%, respectively. The Wells score has better sensitivity, specificity and accuracy than the Caprini and Padua scores in diagnosing DVT.
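    Sensitivity, specificity and accuracy as reported above follow directly from a 2x2 table of the dichotomized score against the reference diagnosis. A short sketch with invented labels:
```python
from sklearn.metrics import confusion_matrix

# 1 = DVT present; reference standard vs. dichotomized clinical score (toy data)
reference = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
wells_pos = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(reference, wells_pos).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, accuracy {accuracy:.0%}")
```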

  17. Genome-wide linkage scan for maximum and length-dependent knee muscle strength in young men: significant evidence for linkage at chromosome 14q24.3.

    Science.gov (United States)

    De Mars, G; Windelinckx, A; Huygens, W; Peeters, M W; Beunen, G P; Aerssens, J; Vlietinck, R; Thomis, M A I

    2008-05-01

    Maintenance of high muscular fitness is positively related to bone health, functionality in daily life and increasing insulin sensitivity, and negatively related to falls and fractures, morbidity and mortality. Heritability of muscle strength phenotypes ranges between 31% and 95%, but little is known about the identity of the genes underlying this complex trait. As a first attempt, this genome-wide linkage study aimed to identify chromosomal regions linked to muscle and bone cross-sectional area, isometric knee flexion and extension torque, and torque-length relationship for knee flexors and extensors. In total, 283 informative male siblings (17-36 years old), belonging to 105 families, were used to conduct a genome-wide SNP-based multipoint linkage analysis. The strongest evidence for linkage was found for the torque-length relationship of the knee flexors at 14q24.3 (LOD = 4.09; p<10(-5)). Suggestive evidence for linkage was found at 14q32.2 (LOD = 3.00; P = 0.005) for muscle and bone cross-sectional area, at 2p24.2 (LOD = 2.57; p = 0.01) for isometric knee torque at 30 degrees flexion, at 1q21.3, 2p23.3 and 18q11.2 (LOD = 2.33, 2.69 and 2.21; p<10(-4) for all) for the torque-length relationship of the knee extensors and at 18p11.31 (LOD = 2.39; p = 0.0004) for muscle-mass adjusted isometric knee extension torque. We conclude that many small contributing genes rather than a few important genes are involved in causing variation in different underlying phenotypes of muscle strength. Furthermore, some overlap in promising genomic regions were identified among different strength phenotypes.

  18. Targeted maximum likelihood estimation for a binary treatment: A tutorial.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Schomaker, Michael; Rachet, Bernard; Schnitzer, Mireille E

    2018-04-23

    When estimating the average effect of a binary treatment (or exposure) on an outcome, methods that incorporate propensity scores, the G-formula, or targeted maximum likelihood estimation (TMLE) are preferred over naïve regression approaches, which are biased under misspecification of a parametric outcome model. In contrast propensity score methods require the correct specification of an exposure model. Double-robust methods only require correct specification of either the outcome or the exposure model. Targeted maximum likelihood estimation is a semiparametric double-robust method that improves the chances of correct model specification by allowing for flexible estimation using (nonparametric) machine-learning methods. It therefore requires weaker assumptions than its competitors. We provide a step-by-step guided implementation of TMLE and illustrate it in a realistic scenario based on cancer epidemiology where assumptions about correct model specification and positivity (ie, when a study participant had 0 probability of receiving the treatment) are nearly violated. This article provides a concise and reproducible educational introduction to TMLE for a binary outcome and exposure. The reader should gain sufficient understanding of TMLE from this introductory tutorial to be able to apply the method in practice. Extensive R-code is provided in easy-to-read boxes throughout the article for replicability. Stata users will find a testing implementation of TMLE and additional material in the Appendix S1 and at the following GitHub repository: https://github.com/migariane/SIM-TMLE-tutorial. © 2018 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
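    The tutorial's worked code is in R, with a Stata port linked above. As an independent illustration, the following is a minimal Python sketch of the same TMLE steps for a binary treatment and binary outcome on simulated data, using simple parametric nuisance models instead of machine learning; the simulation, variable names and truncation bounds are all assumptions made here for the example.
```python
import numpy as np
import statsmodels.api as sm
from scipy.special import expit, logit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
W = rng.normal(size=(n, 2))                           # baseline covariates
A = rng.binomial(1, expit(0.4 * W[:, 0] - 0.3 * W[:, 1]))   # binary treatment
Y = rng.binomial(1, expit(-0.5 + A + 0.5 * W[:, 0]))        # binary outcome

# Step 1: initial outcome model Q(A, W)
Qmod = LogisticRegression().fit(np.column_stack([A, W]), Y)
Q1 = Qmod.predict_proba(np.column_stack([np.ones(n), W]))[:, 1]
Q0 = Qmod.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1]
QA = np.where(A == 1, Q1, Q0)

# Step 2: propensity model g(W) = P(A=1 | W), bounded to protect positivity
g = LogisticRegression().fit(W, A).predict_proba(W)[:, 1]
g = np.clip(g, 0.025, 0.975)

# Step 3: targeting step -- fit the fluctuation epsilon on the clever covariate
H = (A / g - (1 - A) / (1 - g)).reshape(-1, 1)
offset = logit(np.clip(QA, 1e-6, 1 - 1e-6))
eps = sm.GLM(Y, H, family=sm.families.Binomial(), offset=offset).fit().params[0]

# Step 4: update the counterfactual predictions and average their difference
Q1_star = expit(logit(np.clip(Q1, 1e-6, 1 - 1e-6)) + eps / g)
Q0_star = expit(logit(np.clip(Q0, 1e-6, 1 - 1e-6)) - eps / (1 - g))
print(f"TMLE estimate of the ATE: {np.mean(Q1_star - Q0_star):.3f}")
```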

  19. LOD +: Augmenting LOD with Skeletons

    OpenAIRE

    Lange , Benoit; Rodriguez , Nancy

    2010-01-01

    Until now, computer graphics researchers have tried to solve visualization problems introduced by the size of meshes. Modern tools produce large models and hardware is not able to render them at full resolution. For example, the digital Michelangelo project extracted a model with more than one billion polygons. One can notice that hardware has become more and more powerful, but meshes have also become more and more complex. To solve this issue, people have worked on many solut...

  20. MPBoot: fast phylogenetic maximum parsimony tree inference and bootstrap approximation.

    Science.gov (United States)

    Hoang, Diep Thi; Vinh, Le Sy; Flouri, Tomáš; Stamatakis, Alexandros; von Haeseler, Arndt; Minh, Bui Quang

    2018-02-02

    The nonparametric bootstrap is widely used to measure the branch support of phylogenetic trees. However, bootstrapping is computationally expensive and remains a bottleneck in phylogenetic analyses. Recently, an ultrafast bootstrap approximation (UFBoot) approach was proposed for maximum likelihood analyses. However, such an approach is still missing for maximum parsimony. To close this gap we present MPBoot, an adaptation and extension of UFBoot to compute branch supports under the maximum parsimony principle. MPBoot works for both uniform and non-uniform cost matrices. Our analyses on biological DNA and protein data showed that under uniform cost matrices, MPBoot runs on average 4.7 (DNA) to 7 times (protein data) (range: 1.2-20.7) faster than the standard parsimony bootstrap implemented in PAUP*; but 1.6 (DNA) to 4.1 times (protein data) slower than the standard bootstrap with a fast search routine in TNT (fast-TNT). However, for non-uniform cost matrices MPBoot is 5 (DNA) to 13 times (protein data) (range: 0.3-63.9) faster than fast-TNT. We note that MPBoot achieves better scores more frequently than PAUP* and fast-TNT. However, this effect is less pronounced if an intensive but slower search in TNT is invoked. Moreover, experiments on large-scale simulated data show that while both PAUP* and TNT bootstrap estimates are too conservative, MPBoot bootstrap estimates appear more unbiased. MPBoot provides an efficient alternative to the standard maximum parsimony bootstrap procedure. It shows favorable performance in terms of run time, the capability of finding a maximum parsimony tree, and high bootstrap accuracy on simulated as well as empirical data sets. MPBoot is easy-to-use, open-source and available at http://www.cibiv.at/software/mpboot .

  1. Maximum permissible dose

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This chapter presents a historic overview of the establishment of radiation guidelines by various national and international agencies. The use of maximum permissible dose and maximum permissible body burden limits to derive working standards is discussed

  2. TopFed: TCGA tailored federated query processing and linking to LOD.

    Science.gov (United States)

    Saleem, Muhammad; Padmanabhuni, Shanmukha S; Ngomo, Axel-Cyrille Ngonga; Iqbal, Aftab; Almeida, Jonas S; Decker, Stefan; Deus, Helena F

    2014-01-01

    The Cancer Genome Atlas (TCGA) is a multidisciplinary, multi-institutional effort to catalogue genetic mutations responsible for cancer using genome analysis techniques. One of the aims of this project is to create a comprehensive and open repository of cancer-related molecular analysis, to be exploited by bioinformaticians towards advancing cancer knowledge. However, devising bioinformatics applications to analyse such a large dataset is still challenging, as it often requires downloading large archives and parsing the relevant text files. This makes it difficult to enable virtual data integration in order to collect the critical co-variates necessary for analysis. We address these issues by transforming the TCGA data into the Semantic Web standard Resource Description Framework (RDF), linking it to relevant datasets in the Linked Open Data (LOD) cloud, and further proposing an efficient data distribution strategy to host the resulting 20.4 billion triples of data via several SPARQL endpoints. Having the TCGA data distributed across multiple SPARQL endpoints, we enable biomedical scientists to query and retrieve information from these SPARQL endpoints through a TCGA-tailored federated SPARQL query processing engine named TopFed. We compare TopFed with the well-established federation engine FedX in terms of source selection and query execution time by using 10 different federated SPARQL queries with varying requirements. Our evaluation results show that TopFed selects on average less than half of the sources (with 100% recall), with query execution time equal to one third of that of FedX. With TopFed, we aim to offer biomedical scientists a single point of access through which distributed TCGA data can be accessed in unison. We believe the proposed system can greatly help researchers in the biomedical domain to carry out their research effectively with TCGA, as the amount and diversity of data exceeds the ability of local resources to handle its retrieval and
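    TopFed itself is a dedicated federation engine, but the basic pattern of issuing a SPARQL query against a TCGA endpoint can be sketched with the SPARQLWrapper package. The endpoint URL and the predicate below are placeholders, not the actual TopFed deployment or vocabulary.
```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint and predicate; a real query needs the deployed TCGA endpoints and vocabulary
endpoint = SPARQLWrapper("http://example.org/tcga/sparql")
endpoint.setQuery("""
    SELECT ?patient ?value WHERE {
        ?patient <http://example.org/tcga/vocab#expression_value> ?value .
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["patient"]["value"], row["value"]["value"])
```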

  3. Diet Quality Scores of Australian Adults Who Have Completed the Healthy Eating Quiz.

    Science.gov (United States)

    Williams, Rebecca L; Rollo, Megan E; Schumacher, Tracy; Collins, Clare E

    2017-08-15

    Higher scores obtained using diet quality and variety indices are indicators of more optimal food and nutrient intakes and lower chronic disease risk. The aim of this paper is to describe the overall diet quality and variety in a sample of Australian adults who completed an online diet quality self-assessment tool, the Healthy Eating Quiz. The Healthy Eating Quiz takes approximately five minutes to complete online and computes user responses into a total diet quality score (out of a maximum of 73 points), which is then categorized into feedback groups ranging from 'needs work' upwards. Healthy Eating Quiz scores were higher in those aged 45-75 years compared to those aged 16-44 years. Although the Healthy Eating Quiz data indicate that individuals receiving feedback on how to improve their score can improve their diet quality, there is a need for further nutrition promotion interventions in Australian adults.

  4. Sequential organ failure assessment scoring and prediction of patient's outcome in Intensive Care Unit of a tertiary care hospital

    Directory of Open Access Journals (Sweden)

    Aditi Jain

    2016-01-01

    Conclusion: The SOFA score is a simple but effective prognostic indicator and evaluator of patient progress in the ICU. The Day 1 SOFA score can triage patients into risk categories. For further management, the mean and maximum scores help determine the severity of illness and can act as a guide for the intensity of therapy required for each patient.

  5. [Definition of the Diagnosis Osteomyelitis-Osteomyelitis Diagnosis Score (ODS)].

    Science.gov (United States)

    Schmidt, H G K; Tiemann, A H; Braunschweig, R; Diefenbeck, M; Bühler, M; Abitzsch, D; Haustedt, N; Walter, G; Schoop, R; Heppert, V; Hofmann, G O; Glombitza, M; Grimme, C; Gerlach, U-J; Flesch, I

    2011-08-01

    The disease "osteomyelitis" is characterised by different symptoms and parameters. Decisive roles in the development of the disease are played by the causative bacteria, the route of infection and the individual defense mechanisms of the host. The diagnosis is based on different symptoms and findings from the clinical history, clinical symptoms, laboratory results, diagnostic imaging, microbiological and histopathological analyses. While different osteomyelitis classifications have been published, there is to the best of our knowledge no score that gives information how sure the diagnosis "osteomyelitis" is in general. For any scientific study of a disease a valid definition is essential. We have developed a special osteomyelitis diagnosis score for the reliable classification of clinical, laboratory and technical findings. The score is based on five diagnostic procedures: 1) clinical history and risk factors, 2) clinical examination and laboratory results, 3) diagnostic imaging (ultrasound, radiology, CT, MRI, nuclear medicine and hybrid methods), 4) microbiology, and 5) histopathology. Each diagnostic procedure is related to many individual findings, which are weighted by a score system, in order to achieve a relevant value for each assessment. If the sum of the five diagnostic criteria is 18 or more points, the diagnosis of osteomyelitis can be viewed as "safe" (diagnosis class A). Between 8-17 points the diagnosis is "probable" (diagnosis class B). Less than 8 points means that the diagnosis is "possible, but unlikely" (class C diagnosis). Since each parameter can score six points at a maximum, a reliable diagnosis can only be achieved if at least 3 parameters are scored with 6 points. The osteomyelitis diagnosis score should help to avoid the false description of a clinical presentation as "osteomyelitis". A safe diagnosis is essential for the aetiology, treatment and outcome studies of osteomyelitis. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Quantitative trait loci on chromosomes 2p, 4p, and 13q influence bone mineral density of the forearm and hip in Mexican Americans.

    Science.gov (United States)

    Kammerer, Candace M; Schneider, Jennifer L; Cole, Shelley A; Hixson, James E; Samollow, Paul B; O'Connell, Jeffrey R; Perez, Reina; Dyer, Thomas D; Almasy, Laura; Blangero, John; Bauer, Richard L; Mitchell, Braxton D

    2003-12-01

    We performed a genome scan using BMD data of the forearm and hip on 664 individuals in 29 Mexican-American families. We obtained evidence for QTL on chromosome 4p, affecting forearm BMD overall, and on chromosomes 2p and 13q, affecting hip BMD in men. The San Antonio Family Osteoporosis Study (SAFOS) was designed to identify genes and environmental factors that influence bone mineral density (BMD) using data from large Mexican-American families. We performed a genome-wide linkage analysis using 416 highly polymorphic microsatellite markers spaced approximately 9.5 cM apart to locate and identify quantitative trait loci (QTL) that affect BMD of the forearm and hip. Multipoint variance components linkage analyses were done using data on all 664 subjects, as well as two subgroups of 259 men and 261 premenopausal women, from 29 families for which genotypic and phenotypic data were available. We obtained significant evidence for a QTL affecting forearm (radius midpoint) BMD in men and women combined on chromosome 4p near D4S2639 (maximum LOD = 4.33, genomic p = 0.006) and suggestive evidence for a QTL on chromosome 12q near locus D12S2070 (maximum conditional LOD = 2.35). We found suggestive evidence for a QTL influencing trochanter BMD on chromosome 6 (maximum LOD = 2.27), but no evidence for QTL affecting the femoral neck in men and women combined. In men, we obtained evidence for QTL affecting neck and trochanter BMD on chromosomes 2p near D2S1780 (maximum LOD = 3.98, genomic p = 0.013) and 13q near D13S788 (maximum LOD = 3.46, genomic p = 0.039), respectively. We found no evidence for QTL affecting forearm or hip BMD in premenopausal women. These results provide strong evidence that a QTL on chromosome 4p affects radius BMD in Mexican-American men and women, as well as evidence that QTL on chromosomes 2p and 13q affect hip BMD in men. Our results are consistent with some reports in humans and mice. J Bone Miner Res 2003;18:2245-2252

  7. Visualizing whole-brain DTI tractography with GPU-based Tuboids and LoD management.

    Science.gov (United States)

    Petrovic, Vid; Fallon, James; Kuester, Falko

    2007-01-01

    Diffusion Tensor Imaging (DTI) of the human brain, coupled with tractography techniques, enable the extraction of large-collections of three-dimensional tract pathways per subject. These pathways and pathway bundles represent the connectivity between different brain regions and are critical for the understanding of brain related diseases. A flexible and efficient GPU-based rendering technique for DTI tractography data is presented that addresses common performance bottlenecks and image-quality issues, allowing interactive render rates to be achieved on commodity hardware. An occlusion query-based pathway LoD management system for streamlines/streamtubes/tuboids is introduced that optimizes input geometry, vertex processing, and fragment processing loads, and helps reduce overdraw. The tuboid, a fully-shaded streamtube impostor constructed entirely on the GPU from streamline vertices, is also introduced. Unlike full streamtubes and other impostor constructs, tuboids require little to no preprocessing or extra space over the original streamline data. The supported fragment processing levels of detail range from texture-based draft shading to full raycast normal computation, Phong shading, environment mapping, and curvature-correct text labeling. The presented text labeling technique for tuboids provides adaptive, aesthetically pleasing labels that appear attached to the surface of the tubes. Furthermore, an occlusion query aggregating and scheduling scheme for tuboids is described that reduces the query overhead. Results for a tractography dataset are presented, and demonstrate that LoD-managed tuboids offer benefits over traditional streamtubes both in performance and appearance.

  8. Predicting the number and sizes of IBD regions among family members and evaluating the family size requirement for linkage studies.

    Science.gov (United States)

    Yang, Wanling; Wang, Zhanyong; Wang, Lusheng; Sham, Pak-Chung; Huang, Peng; Lau, Yu Lung

    2008-12-01

    With genotyping of high-density single nucleotide polymorphisms (SNPs) replacing that of microsatellite markers in linkage studies, it becomes possible to accurately determine the genomic regions shared identical by descent (IBD) by family members. In addition to evaluating the likelihood of linkage for a region with the underlying disease (the LOD score approach), an appropriate question to ask is what the expected number and sizes of IBD regions among the affecteds would be, as there could be more than one region reaching the maximum achievable LOD score for a given family. Here, we introduce a computer program to allow the prediction of the total number of IBD regions among family members and their sizes. Conversely, it can be used to predict the portion of the genome that can be excluded from consideration according to the family size and user-defined inheritance mode and penetrance. Such information has implications for the feasibility of conducting linkage analysis on a given family of a certain size and structure, or on a few small families when interfamily homogeneity can be assumed. It can also help determine the most relevant members to be genotyped for such a study. Simulation results showed that the IBD regions containing true mutations are usually larger than regions that are IBD due to random chance. We have made use of this feature in our program to allow evaluation of the identified IBD regions based on Bayesian probability calculation and simulation results.

  9. Genetic heterogeneity of Usher syndrome: analysis of 151 families with Usher type I.

    Science.gov (United States)

    Astuto, L M; Weston, M D; Carney, C A; Hoover, D M; Cremers, C W; Wagenaar, M; Moller, C; Smith, R J; Pieke-Dahl, S; Greenberg, J; Ramesar, R; Jacobson, S G; Ayuso, C; Heckenlively, J R; Tamayo, M; Gorin, M B; Reardon, W; Kimberling, W J

    2000-12-01

    Usher syndrome type I is an autosomal recessive disorder marked by hearing loss, vestibular areflexia, and retinitis pigmentosa. Six Usher I genetic subtypes at loci USH1A-USH1F have been reported. The MYO7A gene is responsible for USH1B, the most common subtype. In our analysis, 151 families with Usher I were screened by linkage and mutation analysis. MYO7A mutations were identified in 64 families with Usher I. Of the remaining 87 families, who were negative for MYO7A mutations, 54 were informative for linkage analysis and were screened with the remaining USH1 loci markers. Results of linkage and heterogeneity analyses showed no evidence of Usher types Ia or Ie. However, one maximum LOD score was observed lying within the USH1D region. Two lesser peak LOD scores were observed outside and between the putative regions for USH1D and USH1F, on chromosome 10. A HOMOG chi(2)((1)) plot shows evidence of heterogeneity across the USH1D, USH1F, and intervening regions. These results provide conclusive evidence that the second-most-common subtype of Usher I is due to genes on chromosome 10, and they confirm the existence of one Usher I gene in the previously defined USH1D region, as well as providing evidence for a second, and possibly a third, gene in the 10p/q region.

  10. Assignment of an Usher syndrome type III (USH3) gene to chromosome 3q.

    Science.gov (United States)

    Sankila, E M; Pakarinen, L; Kääriäinen, H; Aittomäki, K; Karjalainen, S; Sistonen, P; de la Chapelle, A

    1995-01-01

    Usher syndrome (USH) refers to genetically and clinically heterogeneous autosomal recessive disorders with combined visual and hearing loss. Type I (USH1) is characterized by a congenital, severe to profound hearing loss and absent vestibular function; in type II (USH2) the hearing loss is congenital and moderate to severe, and the vestibular function is normal. Progressive pigmentary retinopathy (PPR) is present in both types. A third type (USH3) differing from USH2 by the progressive nature of its hearing loss has been suggested. USH3 has previously been estimated to comprise 2% of all USH. However, based on clinical criteria, in Finland 42% of USH patients have progressive hearing loss suggesting enrichment of an USH3 gene. We excluded the four previously mapped USH regions as the site of the USH3 disease locus. Systematic search for USH3 by genetic linkage analyses in 10 multiple affected families using polymorphic microsatellite markers revealed significant linkage with markers mapping to chromosome 3q. Pairwise lod scores at zero recombination distance were 7.87 for D3S1308, and 11.29 for D3S1299, incorporating the observed linkage disequilibrium. Conventional multipoint linkage analysis gave a maximum lod score of 9.88 at D3S1299 assigning USH3 to the 5 cM interval between markers D3S1555 and D3S1279 in 3q21-25.(ABSTRACT TRUNCATED AT 250 WORDS)

  11. Linkage analysis using co-phenotypes in the BRIGHT study reveals novel potential susceptibility loci for hypertension.

    Science.gov (United States)

    Wallace, Chris; Xue, Ming-Zhan; Newhouse, Stephen J; Marcano, Ana Carolina B; Onipinla, Abiodun K; Burke, Beverley; Gungadoo, Johannie; Dobson, Richard J; Brown, Morris; Connell, John M; Dominiczak, Anna; Lathrop, G Mark; Webster, John; Farrall, Martin; Mein, Charles; Samani, Nilesh J; Caulfield, Mark J; Clayton, David G; Munroe, Patricia B

    2006-08-01

    Identification of the genetic influences on human essential hypertension and other complex diseases has proved difficult, partly because of genetic heterogeneity. In many complex-trait resources, additional phenotypic data have been collected, allowing comorbid intermediary phenotypes to be used to characterize more genetically homogeneous subsets. The traditional approach to analyzing covariate-defined subsets has typically depended on researchers' previous expectations for definition of a comorbid subset and leads to smaller data sets, with a concomitant attrition in power. An alternative is to test for dependence between genetic sharing and covariates across the entire data set. This approach offers the advantage of exploiting the full data set and could be widely applied to complex-trait genome scans. However, existing maximum-likelihood methods can be prohibitively computationally expensive, especially since permutation is often required to determine significance. We developed a less computationally intensive score test and applied it to biometric and biochemical covariate data, from 2,044 sibling pairs with severe hypertension, collected by the British Genetics of Hypertension (BRIGHT) study. We found genomewide-significant evidence for linkage with hypertension and several related covariates. The strongest signals were with lean-body-mass measures on chromosome 20q (maximum LOD = 4.24) and with parameters of renal function on chromosome 5p (maximum LOD = 3.71). After correction for the multiple traits and genetic locations studied, our global genomewide P value was .046. This is the first identity-by-descent regression analysis of hypertension to our knowledge, and it demonstrates the value of this approach for the incorporation of additional phenotypic information in genetic studies of complex traits.

  12. Compliance with the CURB-65 score and the consequences of non-implementation.

    Science.gov (United States)

    Guo, Q; Li, H-Y; Zhou, Y-P; Li, M; Chen, X-K; Liu, H; Peng, H-L; Yu, H-Q; Chen, X; Liu, N; Liang, L-H; Zhao, Q-Z; Jiang, M

    2011-12-01

    The CURB-65 (confusion, urea >7 mmol/l, respiratory rate ≥ 30 breaths/min, low blood pressure and age ≥ 65 years) score is a simple, well-validated tool for the assessment of severity in community-acquired pneumonia (CAP). It is unknown whether it is used routinely in China. To determine the frequency of use of the CURB-65 score in routine hospital practice and the consequences of non-implementation, we performed a retrospective analysis of medical records from 1230 in-patients with CAP in a Chinese medical college-affiliated hospital. No CAP patient had the CURB-65 score calculated at admission. Based on the British Thoracic Society guidelines, the 716 (58.2%) in-patients with a CURB-65 score of 0 and the 402 (32.7%) in-patients with a CURB-65 score of 1 should have received ambulatory treatment, whereas the 14 (1.2%) patients with CURB-65 scores of ≥3 should have been admitted to the critical care unit. The maximum excess total annual costs for managing CAP patients with CURB-65 scores of 0 and 1 were estimated at US$94,383.12 and US$66,313.92, respectively, in the hospital. The CURB-65 scoring tool was not applied to patients with CAP in routine hospital practice, resulting in inappropriate hospitalisation and excess costs.
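    The CURB-65 criteria listed in the abstract translate directly into a five-item count. A small sketch follows; the precise threshold used for 'low blood pressure' (systolic <90 mmHg or diastolic ≤60 mmHg) is the conventional one and is an assumption here, since the abstract only says 'low blood pressure'.
```python
def curb65(confusion: bool, urea_mmol_l: float, resp_rate: int,
           systolic_bp: int, diastolic_bp: int, age: int) -> int:
    """One point per criterion: Confusion, Urea >7 mmol/l, RR >=30, low BP, age >=65."""
    return sum([
        confusion,
        urea_mmol_l > 7,
        resp_rate >= 30,
        systolic_bp < 90 or diastolic_bp <= 60,   # assumed conventional BP threshold
        age >= 65,
    ])

score = curb65(confusion=False, urea_mmol_l=8.2, resp_rate=24,
               systolic_bp=110, diastolic_bp=70, age=78)
print(f"CURB-65 = {score}")   # 0-1: consider ambulatory care; >=3: consider critical care
```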

  13. Cost-effective optimization of real-time PCR based detection of Campylobacter and Salmonella with inhibitor tolerant DNA polymerases

    DEFF Research Database (Denmark)

    Fachmann, Mette Sofie Rousing; Josefsen, Mathilde Hasseldam; Hoorfar, Jeffrey

    2015-01-01

    … bacterial cells in two validated real-time PCR assays for Campylobacter and Salmonella. The five best performing (based on: limit of detection (LOD), maximum fluorescence, shape of amplification curves, and amplification efficiency) were subsequently applied to meat and fecal samples. The VeriQuest qPCR master mix performed best for both meat and fecal samples (LODs of 10² and 10⁴ CFU ml⁻¹ in the purest and crudest DNA extractions, respectively) compared with Tth (LOD = 10²-10³ and 10⁵-10⁶ CFU ml⁻¹). AmpliTaqGold and HotMasterTaq both performed well (LOD = 10²-10⁴ CFU ml⁻¹) with meat samples and poorly (LOD = 10³-10⁶ CFU ml⁻¹/not detected) with fecal samples. CONCLUSIONS: Applying the VeriQuest qPCR master mix in the two tested real-time PCR assays could allow for simpler sample preparation and thus a reduction in cost. SIGNIFICANCE AND IMPACT OF STUDY: This work exemplifies a cost-effective strategy …

  14. CytoMCS: A Multiple Maximum Common Subgraph Detection Tool for Cytoscape

    DEFF Research Database (Denmark)

    Larsen, Simon; Baumbach, Jan

    2017-01-01

    … such analyses we have developed CytoMCS, a Cytoscape app for computing inexact solutions to the maximum common edge subgraph problem for two or more graphs. Our algorithm uses an iterative local search heuristic for computing conserved subgraphs, optimizing a squared edge conservation score that is able to detect not only fully conserved edges but also partially conserved edges. It can be applied to any set of directed or undirected, simple graphs loaded as networks into Cytoscape, e.g. protein-protein interaction networks or gene regulatory networks. CytoMCS is available as a Cytoscape app at http://apps.cytoscape.org/apps/cytomcs.

  15. A genome-wide search for genes involved in the radiation-induced gastroschisis

    International Nuclear Information System (INIS)

    Hillebrandt, S.; Streffer, C.

    1997-01-01

    Whole-genome linkage analysis of gastroschisis (abdominal wall defect) was performed using genotyping with microsatellites of affected BC1 mice [(HLGxC57BL/6J)xHLG]. The HLG inbred strain shows an increased risk of gastroschisis after irradiation of embryos at the 1-cell stage. Previous studies demonstrated that gastroschisis is a polygenic trait with a recessive mode of inheritance. Since a recessive inheritance of gastroschisis is assumed, the involved genes must be linked to markers showing a high level of homozygosity in the affected animals. For marker loci on chromosomes 13 and 19, a significantly increased number of homozygotes has been found in mice with gastroschisis compared to mice without this malformation. The linkage analysis allowed us to determine intervals likely to contain genes related to gastroschisis on these two chromosomes. The highest lod score value has been found for the marker locus D19MIT27, very close to Pax2 (lod score=1.23; p=0.017). For the marker D13MIT99 a lod score of 0.85 (p=0.047) was calculated. However, markers closer to the homeobox gene Msx-2 on chromosome 13 show lower lod score values than D13MIT99, suggesting that this homeobox gene is probably not involved in gastroschisis. According to the classification of results of linkage analysis of complex traits described by Lander and Kruglyak (1995), our data provide suggestive evidence for the involvement of the analyzed intervals on chromosomes 19 and 13 in gastroschisis. Further studies are necessary to prove this linkage. (authors)

  16. Assessing Hourly Precipitation Forecast Skill with the Fractions Skill Score

    Science.gov (United States)

    Zhao, Bin; Zhang, Bo

    2018-02-01

    Statistical methods for categorical (yes/no) forecasts, such as the Threat Score, are typically used in the verification of precipitation forecasts. However, these standard methods are affected by the so-called "double-penalty" problem caused by slight displacements in either space or time with respect to the observations. Spatial techniques have recently been developed to help solve this problem. The fractions skill score (FSS), a neighborhood spatial verification method, directly compares the fractional coverage of events in windows surrounding the observations and forecasts. We applied the FSS to hourly precipitation verification by taking hourly forecast products from the GRAPES (Global/Regional Assimilation Prediction System) regional model and quantitative precipitation estimation products from the National Meteorological Information Center of China during July and August 2016, and investigated the difference between these results and those obtained with the traditional category score. We found that the model spin-up period affected the assessment of stability. Systematic errors had an insignificant role in the fraction Brier score and could be ignored. The dispersion of observations followed a diurnal cycle and the standard deviation of the forecast had a similar pattern to the reference maximum of the fraction Brier score. The correlation coefficient between the forecasts and the observations behaved similarly to the FSS; that is, the FSS may be a useful index for indicating correlation. Compared with the traditional skill score, the FSS has obvious advantages in distinguishing differences in precipitation time series, especially in the assessment of heavy rainfall.
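
    For readers unfamiliar with the metric, the following is a minimal sketch of the standard FSS computation (the Roberts and Lean 2008 formulation), not the authors' own code: both fields are thresholded, fractional event coverage is computed in square neighbourhoods, and the fraction Brier score is normalised by its no-skill reference.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill_score(forecast, observed, threshold, window):
    """Standard FSS: compare fractional event coverage of forecast and
    observation fields within square neighbourhoods of a given size."""
    fcst_binary = (forecast >= threshold).astype(float)
    obs_binary = (observed >= threshold).astype(float)

    # Fraction of grid points exceeding the threshold in each window
    p_fcst = uniform_filter(fcst_binary, size=window, mode="constant")
    p_obs = uniform_filter(obs_binary, size=window, mode="constant")

    fbs = np.mean((p_fcst - p_obs) ** 2)           # fraction Brier score
    fbs_worst = np.mean(p_fcst ** 2 + p_obs ** 2)  # no-overlap reference
    return 1.0 - fbs / fbs_worst if fbs_worst > 0 else np.nan
```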

  17. D-score: a search engine independent MD-score.

    Science.gov (United States)

    Vaudel, Marc; Breiter, Daniela; Beck, Florian; Rahnenführer, Jörg; Martens, Lennart; Zahedi, René P

    2013-03-01

    While peptides carrying PTMs are routinely identified in gel-free MS, the localization of the PTMs onto the peptide sequences remains challenging. Search engine scores of secondary peptide matches have been used in different approaches in order to infer the quality of site inference, by penalizing the localization whenever the search engine similarly scored two candidate peptides with different site assignments. In the present work, we show how the estimation of posterior error probabilities for peptide candidates allows the estimation of a PTM score called the D-score, for multiple search engine studies. We demonstrate the applicability of this score to three popular search engines: Mascot, OMSSA, and X!Tandem, and evaluate its performance using an already published high resolution data set of synthetic phosphopeptides. For those peptides with phosphorylation site inference uncertainty, the number of spectrum matches with correctly localized phosphorylation increased by up to 25.7% when compared to using Mascot alone, although the actual increase depended on the fragmentation method used. Since this method relies only on search engine scores, it can be readily applied to the scoring of the localization of virtually any modification at no additional experimental or in silico cost. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. LOD lab: Experiments at LOD scale

    NARCIS (Netherlands)

    Rietveld, Laurens; Beek, Wouter; Schlobach, Stefan

    2015-01-01

    Contemporary Semantic Web research is in the business of optimizing algorithms for only a handful of datasets such as DBpedia, BSBM, DBLP and only a few more. This means that current practice does not generally take the true variety of Linked Data into account. With hundreds of thousands of datasets

  19. The Zhongshan Score

    Science.gov (United States)

    Zhou, Lin; Guo, Jianming; Wang, Hang; Wang, Guomin

    2015-01-01

    Abstract In the zero ischemia era of nephron-sparing surgery (NSS), a new anatomic classification system (ACS) is needed to adjust to these new surgical techniques. We devised a novel and simple ACS, and compared it with the RENAL and PADUA scores to predict the risk of NSS outcomes. We retrospectively evaluated 789 patients who underwent NSS with available imaging between January 2007 and July 2014. Demographic and clinical data were assessed. The Zhongshan (ZS) score consisted of three parameters. RENAL, PADUA, and ZS scores are divided into three groups, that is, high, moderate, and low scores. For operative time (OT), significant differences were seen between any two groups of ZS score and PADUA score (all P < …). RENAL showed no significant difference between moderate and high complexity in OT, WIT, estimated blood loss, and increase in SCr. Compared with patients with a low ZS score, those with a high or moderate score had 8.1-fold or 3.3-fold higher risk of surgical complications, respectively (all P < …). For the RENAL score, patients with a high or moderate score had 5.7-fold or 1.9-fold higher risk of surgical complications, respectively (all P < …) … RENAL and PADUA scores. The ZS score could be used to reflect the surgical complexity and predict the risk of surgical complications in patients undergoing NSS. PMID:25654399

  20. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  1. Re-Scoring the Game’s Score

    DEFF Research Database (Denmark)

    Gasselseder, Hans-Peter

    2014-01-01

    This study explores immersive presence as well as emotional valence and arousal in the context of dynamic and non-dynamic music scores in the 3rd person action-adventure video game genre while also considering relevant personality traits of the player. 60 subjects answered self-report questionnaires … -temporal alignment in the resulting emotional congruency of nondiegetic music. Whereas imaginary aspects of immersive presence are systemically affected by the presentation of dynamic music, sensory spatial aspects show higher sensitivity towards the arousal potential of the music score. It is argued …

  2. College Math Assessment: SAT Scores vs. College Math Placement Scores

    Science.gov (United States)

    Foley-Peres, Kathleen; Poirier, Dawn

    2008-01-01

    Many colleges and universities use SAT math scores or math placement tests to place students in the appropriate math course. This study compares the use of math placement scores and SAT scores for 188 freshman students. The students' grades and faculty observations were analyzed to determine if the SAT scores and/or college math assessment scores…

  3. Histopathological Validation of the Surface-Intermediate-Base Margin Score for Standardized Reporting of Resection Technique during Nephron Sparing Surgery.

    Science.gov (United States)

    Minervini, Andrea; Campi, Riccardo; Kutikov, Alexander; Montagnani, Ilaria; Sessa, Francesco; Serni, Sergio; Raspollini, Maria Rosaria; Carini, Marco

    2015-10-01

    The surface-intermediate-base margin score is a novel standardized reporting system of resection techniques during nephron sparing surgery. We validated the surgeon assessed surface-intermediate-base score with microscopic histopathological assessment of partial nephrectomy specimens. Between June and August 2014 data were prospectively collected from 40 consecutive patients undergoing nephron sparing surgery. The surface-intermediate-base score was assigned to all cases. The score specific areas were color coded with tissue margin ink and sectioned for histological evaluation of healthy renal margin thickness. Maximum, minimum and mean thickness of healthy renal margin for each score specific area grade (surface [S] = 0, S = 1 ; intermediate [I] or base [B] = 0, I or B = 1, I or B = 2) was reported. The Mann-Whitney U and Kruskal-Wallis tests were used to compare the thickness of healthy renal margin in S = 0 vs 1 and I or B = 0 vs 1 vs 2 grades, respectively. Maximum, minimum and mean thickness of healthy renal margin was significantly different among score specific area grades S = 0 vs 1, and I or B = 0 vs 1, 0 vs 2 and 1 vs 2 (p <0.001). The main limitations of the study are the low number of the I or B = 1 and I or B = 2 samples and the assumption that each microscopic slide reflects the entire score specific area for histological analysis. The surface-intermediate-base scoring method can be readily harnessed in real-world clinical practice and accurately mirrors histopathological analysis for quantification and reporting of healthy renal margin thickness removed during tumor excision. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  4. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: There are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: For some simple, four-taxa (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero, and are not adjacent, so this resolved tree is in fact a simple path. While for MP, the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  5. Prehospital score for acute disease: a community-based observational study in Japan

    Directory of Open Access Journals (Sweden)

    Fujiwara Hidekazu

    2007-10-01

    Full Text Available Abstract Background Ambulance usage in Japan has increased consistently because it is free under the national health insurance system. The introduction of refusal for ambulance transfer is being debated nationally. The purpose of the present study was to investigate the relationship between prehospital data and hospitalization outcome for acute disease patients, and to develop a simple prehospital evaluation tool using prehospital data for Japan's emergency medical service system. Methods The subjects were 9,160 consecutive acute disease patients aged ≥ 15 years who were transferred to hospital by Kishiwada City Fire Department ambulance between July 2004 and March 2006. The relationship between prehospital data (age, systolic blood pressure, pulse rate, respiration rate, level of consciousness, SpO2 level and ability to walk) and outcome (hospitalization or non-hospitalization) was analyzed using logistic regression models. The prehospital score component of each item of prehospital data was determined by beta coefficients. Eligible patients were scored retrospectively and the distribution of outcome was examined. For patients transported to the two main hospitals, outcome after hospitalization was also confirmed. Results A total of 8,330 (91%) patients were retrospectively evaluated using a prehospital score with a maximum value of 14. The percentage of patients requiring hospitalization rose from 9% with score = 0 to 100% with score = 14. With a cut-off point score ≥ 2, the sensitivity, specificity, positive predictive value and negative predictive value were 97%, 16%, 39% and 89%, respectively. Among the 6,498 patients transported to the two main hospitals, there were no deaths at scores ≤ 1 and the proportion of non-hospitalization was over 90%. The proportion of deaths increased rapidly at scores ≥ 11. Conclusion The prehospital score could be a useful tool for deciding the refusal of ambulance transfer in Japan's emergency medical
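
    The abstract describes the general recipe (logistic regression beta coefficients converted into integer points that are summed, here to a maximum of 14, with a decision cut-off of ≥ 2) but not the individual point values. The sketch below therefore uses entirely hypothetical coefficients and item names, purely to illustrate that construction.

```python
# Hypothetical beta coefficients and item names for illustration only;
# the paper's actual items and weights are not reproduced in the abstract.
BETAS = {
    "age_65_or_over": 0.45,
    "abnormal_systolic_bp": 0.62,
    "abnormal_pulse_rate": 0.38,
    "abnormal_respiration_rate": 0.41,
    "reduced_consciousness": 1.10,
    "low_spo2": 0.85,
    "unable_to_walk": 0.55,
}

def betas_to_points(betas: dict) -> dict:
    """Common score-building heuristic: scale each coefficient by the
    smallest one and round to the nearest integer point value."""
    base = min(betas.values())
    return {item: round(beta / base) for item, beta in betas.items()}

def prehospital_score(findings: dict, points: dict) -> int:
    """Sum the points of the findings that are present (True)."""
    return sum(points[item] for item, present in findings.items() if present)

points = betas_to_points(BETAS)
patient = {item: False for item in BETAS}
patient["low_spo2"] = True
score = prehospital_score(patient, points)
print(score, "-> transport advisable" if score >= 2 else "-> low risk")
```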

  6. Exploring a Source of Uneven Score Equity across the Test Score Range

    Science.gov (United States)

    Huggins-Manley, Anne Corinne; Qiu, Yuxi; Penfield, Randall D.

    2018-01-01

    Score equity assessment (SEA) refers to an examination of population invariance of equating across two or more subpopulations of test examinees. Previous SEA studies have shown that score equity may be present for examinees scoring at particular test score ranges but absent for examinees scoring at other score ranges. No studies to date have…

  7. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Full Text Available Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.

  8. Maximum Acceleration Recording Circuit

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1995-01-01

    Coarsely digitized maximum levels recorded in blown fuses. Circuit feeds power to accelerometer and makes nonvolatile record of maximum level to which output of accelerometer rises during measurement interval. In comparison with inertia-type single-preset-trip-point mechanical maximum-acceleration-recording devices, circuit weighs less, occupies less space, and records accelerations within narrower bands of uncertainty. In comparison with prior electronic data-acquisition systems designed for same purpose, circuit simpler, less bulky, consumes less power, and costs less than systems that require readout and analysis of data recorded in magnetic or electronic memory devices. Circuit used, for example, to record accelerations to which commodities are subjected during transportation on trucks.

  9. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always brings a positive solution over the whole energy range. Moreover, the theory unifies both the overdetermined and the underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is removed by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system, which appears in the present study, has been established. Results of computer simulation showed the effectiveness of the present theory. (author)

  10. Maximum Power from a Solar Panel

    Directory of Open Access Journals (Sweden)

    Michael Miller

    2010-01-01

    Full Text Available Solar energy has become a promising alternative to conventional fossil fuel sources. Solar panels are used to collect solar radiation and convert it into electricity. One of the techniques used to maximize the effectiveness of this energy alternative is to maximize the power output of the solar collector. In this project the maximum power is calculated by determining the voltage and the current at maximum power. These quantities are determined by finding the maximum value of the power equation using differentiation. After the maximum values are found for each time of day, each individual quantity (the voltage at maximum power, the current at maximum power, and the maximum power itself) is plotted as a function of the time of day.
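
    As a hedged illustration of the idea (the article's own panel model and data are not shown here), the sketch below uses a simple single-diode I-V model with made-up parameters and locates the point where dP/dV = 0 numerically; the same calculation can be repeated for each time of day with the corresponding irradiance-dependent photocurrent.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative single-diode panel parameters (not taken from the article)
I_PH = 5.0                  # photo-generated current [A]
I_0 = 1e-9                  # diode saturation current [A]
N_VT = 1.3 * 0.0258 * 36    # ideality factor * thermal voltage * cells in series [V]

def current(v):
    """Panel current from a simple single-diode model."""
    return I_PH - I_0 * (np.exp(v / N_VT) - 1.0)

def power(v):
    return v * current(v)

# The maximum power point is where dP/dV = 0; found here numerically.
res = minimize_scalar(lambda v: -power(v), bounds=(0.0, 25.0), method="bounded")
v_mp = res.x
print(f"V_mp = {v_mp:.2f} V, I_mp = {current(v_mp):.2f} A, P_max = {power(v_mp):.1f} W")
```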

  11. 75 FR 53730 - Culturally Significant Object Imported for Exhibition Determinations: “The Roman Mosaic from Lod...

    Science.gov (United States)

    2010-09-01

    ...Notice is hereby given of the following determinations: Pursuant to the authority vested in me by the Act of October 19, 1965 (79 Stat. 985; 22 U.S.C. 2459), Executive Order 12047 of March 27, 1978, the Foreign Affairs Reform and Restructuring Act of 1998 (112 Stat. 2681, et seq.; 22 U.S.C. 6501 note, et seq.), Delegation of Authority No. 234 of October 1, 1999, and Delegation of Authority No. 236-3 of August 28, 2000, I hereby determine that the object to be included in the exhibition ``The Roman Mosaic from Lod, Israel,'' imported from abroad for temporary exhibition within the United States, is of cultural significance. The object is imported pursuant to a loan agreement with the foreign owner or custodian. I also determine that the exhibition or display of the exhibit object at the Metropolitan Museum of Art, New York, New York, from on or about September 28, 2010, until on or about April 3, 2011, the Legion of Honor Museum, San Francisco, California, from on or about April 23, 2011, until on or about July 24, 2011, and at possible additional exhibitions or venues yet to be determined, is in the national interest. I have ordered that Public Notice of these Determinations be published in the Federal Register.

  12. A procedure for the detection of linkage with high density SNP arrays in a large pedigree with colorectal cancer

    International Nuclear Information System (INIS)

    Middeldorp, Anneke; Wijnen, Juul T; Wezel, Tom van; Jagmohan-Changur, Shantie; Helmer, Quinta; Klift, Heleen M van der; Tops, Carli MJ; Vasen, Hans FA; Devilee, Peter; Morreau, Hans; Houwing-Duistermaat, Jeanine J

    2007-01-01

    The apparent dominant model of colorectal cancer (CRC) inheritance in several large families, without mutations in known CRC susceptibility genes, suggests the presence of so far unidentified genes with strong or moderate effect on the development of CRC. Linkage analysis could lead to identification of susceptibility genes in such families. In comparison to classical linkage analysis with multi-allelic markers, single nucleotide polymorphism (SNP) arrays have increased information content and can be processed with higher throughput. Therefore, SNP arrays can be excellent tools for linkage analysis. However, the vast number of SNPs on the SNP arrays, combined with large informative pedigrees (e.g. >35–40 bits), presents us with a computational complexity that is challenging for existing statistical packages or even exceeds their capacity. We therefore setup a procedure for linkage analysis in large pedigrees and validated the method by genotyping using SNP arrays of a colorectal cancer family with a known MLH1 germ line mutation. Quality control of the genotype data was performed in Alohomora, Mega2 and SimWalk2, with removal of uninformative SNPs, Mendelian inconsistencies and Mendelian consistent errors, respectively. Linkage disequilibrium was measured by SNPLINK and Merlin. Parametric linkage analysis using two flanking markers was performed using MENDEL. For multipoint parametric linkage analysis and haplotype analysis, SimWalk2 was used. On chromosome 3, in the MLH1-region, a LOD score of 1.9 was found by parametric linkage analysis using two flanking markers. On chromosome 11 a small region with LOD 1.1 was also detected. Upon linkage disequilibrium removal, multipoint linkage analysis yielded a LOD score of 2.1 in the MLH1 region, whereas the LOD score dropped to negative values in the region on chromosome 11. Subsequent haplotype analysis in the MLH1 region perfectly matched the mutation status of the family members. We developed a workflow for linkage

  13. Maximum-Entropy Models of Sequenced Immune Repertoires Predict Antigen-Antibody Affinity.

    Directory of Open Access Journals (Sweden)

    Lorenzo Asti

    2016-04-01

    Full Text Available The immune system has developed a number of distinct complex mechanisms to shape and control the antibody repertoire. One of these mechanisms, the affinity maturation process, works in an evolutionary-like fashion: after binding to a foreign molecule, the antibody-producing B-cells exhibit a high-frequency mutation rate in the genome region that codes for the antibody active site. Eventually, cells that produce antibodies with higher affinity for their cognate antigen are selected and clonally expanded. Here, we propose a new statistical approach based on maximum entropy modeling in which a scoring function related to the binding affinity of antibodies against a specific antigen is inferred from a sample of sequences of the immune repertoire of an individual. We use our inference strategy to infer a statistical model on a data set obtained by sequencing a fairly large portion of the immune repertoire of an HIV-1 infected patient. The Pearson correlation coefficient between our scoring function and the IC50 neutralization titer measured on 30 different antibodies of known sequence is as high as 0.77 (p-value 10⁻⁶), outperforming other sequence- and structure-based models.

  14. WebScore: An Effective Page Scoring Approach for Uncertain Web Social Networks

    Directory of Open Access Journals (Sweden)

    Shaojie Qiao

    2011-10-01

    Full Text Available To effectively score pages with uncertainty in web social networks, we first proposed a new concept called the transition probability matrix and formally defined the uncertainty in web social networks. Second, we proposed a hybrid page scoring algorithm, called WebScore, based on the PageRank algorithm and three centrality measures including degree, betweenness, and closeness. In particular, WebScore takes full consideration of the uncertainty of web social networks by computing the transition probability from one page to another. The basic idea of WebScore is to: (1) integrate uncertainty into PageRank in order to accurately rank pages, and (2) apply the centrality measures to calculate the importance of pages in web social networks. In order to verify the performance of WebScore, we developed a web social network analysis system which can partition web pages into distinct groups and score them in an effective fashion. Finally, we conducted extensive experiments on real data and the results show that WebScore is effective at scoring uncertain pages with less time overhead than PageRank and centrality-measure-based page scoring algorithms.
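
    The combination described (uncertainty-aware PageRank blended with degree, betweenness and closeness centrality) can be sketched in a few lines with networkx; the mixing weights and the `prob` edge attribute below are illustrative assumptions, not values from the paper.

```python
import networkx as nx

def webscore_like(G, alpha=0.85, w=(0.4, 0.2, 0.2, 0.2)):
    """Sketch of a WebScore-style ranking: PageRank over uncertainty-weighted
    transition probabilities combined with degree, betweenness and closeness
    centrality.  The mixing weights `w` are illustrative only."""
    pr = nx.pagerank(G, alpha=alpha, weight="prob")  # edge attr = transition probability
    deg = nx.degree_centrality(G)
    btw = nx.betweenness_centrality(G)
    clo = nx.closeness_centrality(G)
    return {n: w[0] * pr[n] + w[1] * deg[n] + w[2] * btw[n] + w[3] * clo[n]
            for n in G.nodes()}

# Toy example with link-existence probabilities stored as edge attributes
G = nx.DiGraph()
G.add_edge("a", "b", prob=0.9)
G.add_edge("b", "c", prob=0.6)
G.add_edge("c", "a", prob=0.8)
print(sorted(webscore_like(G).items(), key=lambda kv: -kv[1]))
```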

  15. Developing a cumulative anatomic scoring system for military perineal and pelvic blast injuries.

    Science.gov (United States)

    Mossadegh, Somayyeh; Midwinter, M; Parker, P

    2013-03-01

    Improvised explosive device (IED) yields in Afghanistan have increased, resulting in more proximal injuries. The injury severity score (ISS) is an anatomic aggregate score of the three most severely injured anatomical areas but does not accurately predict severity in IED-related pelvi-perineal trauma patients. A scoring system based on the abbreviated injury score (AIS) was developed to reflect the severity of these injuries in order to better understand risk factors, develop a tool for future audit and improve performance. Using standard AIS descriptors, injury scales were constructed for the pelvis (1, minor, to 6, maximal). The perineum was divided into anterior and posterior zones, as relevant to injury patterns and blast direction, with each soft tissue structure being allocated a score from its own severity scale. A cumulative score, from 1 to 36 for soft tissue, or a maximum of 42 if a pelvic fracture was involved, was created for all structures injured in the anterior and posterior zones. Using this new scoring system, 77% of patients survived with a pelvi-perineal trauma score (PPTS) below 5. There was a significant increase in mortality, number of pelvic fractures and amputations with increasing score when comparing the first group (score 1-5) to the second group (score 6-10). Survival was 42% for scores between 6 and 16 and 22% for scores between 17 and 21. In our cohort of 62 survivors, 1 patient with an IED-related pelvi-perineal injury had a 'theoretically unsurvivable' maximal ISS of 75 and survived, whereas there were no survivors with a PPTS greater than 22; however, this group included no one with an ISS of 75, suggesting that ISS is not an accurate reflection of the true severity of pelvi-perineal blast injury. This scoring system is the initial part of a more complex logistic regression model that will contribute towards a unique trauma scoring system to aid surgical teams in predicting fluid requirements and operative timelines. In austere environments, it may also

  16. Heart valve surgery: EuroSCORE vs. EuroSCORE II vs. Society of Thoracic Surgeons score

    Directory of Open Access Journals (Sweden)

    Muhammad Sharoz Rabbani

    2014-12-01

    Full Text Available Background This is a validation study comparing the European System for Cardiac Operative Risk Evaluation (EuroSCORE) II with the previous additive (AES) and logistic EuroSCORE (LES) and the Society of Thoracic Surgeons' (STS) risk prediction algorithm, for patients undergoing valve replacement with or without bypass in Pakistan. Patients and Methods Clinical data of 576 patients undergoing valve replacement surgery between 2006 and 2013 were retrospectively collected and individual expected risks of death were calculated by all four risk prediction algorithms. Performance of these risk algorithms was evaluated in terms of discrimination and calibration. Results There were 28 deaths (4.8%) among 576 patients, which was lower than the predicted mortality of 5.16%, 6.96% and 4.94% by AES, LES and EuroSCORE II but was higher than the 2.13% predicted by the STS scoring system. For single and double valve replacement procedures, EuroSCORE II was the best predictor of mortality, with the highest Hosmer-Lemeshow test (H-L) p value (0.346 to 0.689) and area under the receiver operating characteristic (ROC) curve (0.637 to 0.898). For valve plus concomitant coronary artery bypass grafting (CABG) patients, actual mortality was 1.88%. The STS calculator came out to be the best predictor of mortality for this subgroup, with H-L p value (0.480 to 0.884) and ROC (0.657 to 0.775). Conclusions For the Pakistani population, EuroSCORE II is an accurate predictor of individual operative risk in patients undergoing isolated valve surgery, whereas STS performs better in the valve plus CABG group.

  17. Male-female differences in Scoliosis Research Society-30 scores in adolescent idiopathic scoliosis.

    Science.gov (United States)

    Roberts, David W; Savage, Jason W; Schwartz, Daniel G; Carreon, Leah Y; Sucato, Daniel J; Sanders, James O; Richards, Benjamin Stephens; Lenke, Lawrence G; Emans, John B; Parent, Stefan; Sarwark, John F

    2011-01-01

    Longitudinal cohort study. To compare functional outcomes between male and female patients before and after surgery for adolescent idiopathic scoliosis (AIS). There is no clear consensus in the existing literature with respect to sex differences in functional outcomes in the surgical treatment of AIS. A prospective, consecutive, multicenter database of patients who underwent surgical correction for adolescent idiopathic scoliosis was analyzed retrospectively. All patients completed Scoliosis Research Society-30 (SRS-30) questionnaires before and 2 years after surgery. Patients with previous spine surgery were excluded. Data were collected for sex, age, Risser grade, previous bracing history, maximum preoperative Cobb angle, curve correction at 2 years, and SRS-30 domain scores. Paired sample t tests were used to compare preoperative and postoperative scores within each sex. Independent sample t tests were used to compare scores between sexes. A P value of … was considered significant. Self-image/appearance had the greatest relative improvement. Males had better self-image/appearance scores preoperatively, better pain scores at 2 years, and better mental health and total scores both preoperatively and at 2 years. Both males and females were similarly satisfied with surgery. Males treated with surgery for AIS report better preoperative self-image, less postoperative pain, and better mental health than females. These differences may be clinically significant. For both males and females, the most beneficial effect of surgery is improved self-image/appearance. Overall, the benefits of surgery for AIS are similar for both sexes.

  18. Association of a Dietary Score with Incident Type 2 Diabetes: The Dietary-Based Diabetes-Risk Score (DDS.

    Directory of Open Access Journals (Sweden)

    Ligia J Dominguez

    Full Text Available Strong evidence supports that dietary modifications may decrease incident type 2 diabetes mellitus (T2DM). Numerous diabetes risk models/scores have been developed, but most do not rely specifically on dietary variables or do not fully capture the overall dietary pattern. We prospectively assessed the association of a dietary-based diabetes-risk score (DDS), which integrates optimal food patterns, with the risk of developing T2DM in the SUN ("Seguimiento Universidad de Navarra") longitudinal study. We assessed 17,292 participants initially free of diabetes, followed up for a mean of 9.2 years. A validated 136-item FFQ was administered at baseline. Taking into account previous literature, the DDS positively weighted vegetables, fruit, whole cereals, nuts, coffee, low-fat dairy, fiber, PUFA, and alcohol in moderate amounts, while it negatively weighted red meat, processed meats and sugar-sweetened beverages. Energy-adjusted quintiles of each item (with the exception of moderate alcohol consumption, which received either 0 or 5 points) were used to build the DDS (maximum: 60 points). Incident T2DM was confirmed through additional detailed questionnaires and review of medical records of participants. We used Cox proportional hazards models adjusted for socio-demographic and anthropometric parameters, health-related habits, and clinical variables to estimate hazard ratios (HR) of T2DM. We observed 143 confirmed T2DM cases during follow-up. Better baseline conformity with the DDS was associated with lower incidence of T2DM (multivariable-adjusted HR for intermediate (25-39 points) vs. low (11-24 points) category 0.43 [95% confidence interval (CI) 0.21, 0.89]; and for high (40-60 points) vs. low category 0.32 [95% CI: 0.14, 0.69]; p for linear trend: 0.019). The DDS, a simple score exclusively based on dietary components, showed a strong inverse association with incident T2DM. This score may be applicable in clinical practice to improve dietary habits of subjects at high risk of T2DM

  19. Maximum concentrations at work and maximum biologically tolerable concentration for working materials 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The meaning of the term 'maximum concentration at work' with regard to various pollutants is discussed. Specifically, a number of dusts and smokes are dealt with. The evaluation criteria for maximum biologically tolerable concentrations for working materials are indicated. The working materials in question are carcinogenic substances or substances liable to cause allergies or mutate the genome. (VT)

  20. 40 CFR 1042.140 - Maximum engine power, displacement, power density, and maximum in-use engine speed.

    Science.gov (United States)

    2010-07-01

    § 1042.140 Maximum engine power, displacement, power density, and maximum in-use engine speed. This section describes … For an engine with cylinders having an internal diameter of 13.0 cm and a 15.5 cm stroke length, the rounded displacement would …

  1. Credal Networks under Maximum Entropy

    OpenAIRE

    Lukasiewicz, Thomas

    2013-01-01

    We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy model of its conditional distributions. This result, however, does not hold anymore for general Bayesian networks. We thus present a new kind of maximum entropy model, which is computed sequentially. ...

  2. A Comparison of Two Scoring Methods for an Automated Speech Scoring System

    Science.gov (United States)

    Xi, Xiaoming; Higgins, Derrick; Zechner, Klaus; Williamson, David

    2012-01-01

    This paper compares two alternative scoring methods--multiple regression and classification trees--for an automated speech scoring system used in a practice environment. The two methods were evaluated on two criteria: construct representation and empirical performance in predicting human scores. The empirical performance of the two scoring models…

  3. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes' pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.

  4. NCACO-score: An effective main-chain dependent scoring function for structure modeling

    Directory of Open Access Journals (Sweden)

    Dong Xiaoxi

    2011-05-01

    Full Text Available Abstract Background Development of effective scoring functions is a critical component to the success of protein structure modeling. Previously, many efforts have been dedicated to the development of scoring functions. Despite these efforts, development of an effective scoring function that can achieve both good accuracy and fast speed still presents a grand challenge. Results Based on a coarse-grained representation of a protein structure by using only four main-chain atoms: N, Cα, C and O, we develop a knowledge-based scoring function, called NCACO-score, that integrates different structural information to rapidly model protein structure from sequence. In testing on the Decoys'R'Us sets, we found that NCACO-score can effectively recognize native conformers from their decoys. Furthermore, we demonstrate that NCACO-score can effectively guide fragment assembly for protein structure prediction, which has achieved a good performance in building the structure models for hard targets from CASP8 in terms of both accuracy and speed. Conclusions Although NCACO-score is developed based on a coarse-grained model, it is able to discriminate native conformers from decoy conformers with high accuracy. NCACO is a very effective scoring function for structure modeling.

  5. Maximum Quantum Entropy Method

    OpenAIRE

    Sim, Jae-Hoon; Han, Myung Joon

    2018-01-01

    Maximum entropy method for analytic continuation is extended by introducing quantum relative entropy. This new method is formulated in terms of matrix-valued functions and therefore invariant under arbitrary unitary transformation of input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...

  6. Predicting the need for massive transfusion in trauma patients: the Traumatic Bleeding Severity Score.

    Science.gov (United States)

    Ogura, Takayuki; Nakamura, Yoshihiko; Nakano, Minoru; Izawa, Yoshimitsu; Nakamura, Mitsunobu; Fujizuka, Kenji; Suzukawa, Masayuki; Lefor, Alan T

    2014-05-01

    The ability to easily predict the need for massive transfusion may improve the process of care, allowing early mobilization of resources. There are currently no clear criteria to activate massive transfusion in severely injured trauma patients. The aims of this study were to create a scoring system to predict the need for massive transfusion and then to validate this scoring system. We reviewed the records of 119 severely injured trauma patients and identified massive transfusion predictors using statistical methods. Each predictor was converted into a simple score based on the odds ratio in a multivariate logistic regression analysis. The Traumatic Bleeding Severity Score (TBSS) was defined as the sum of the component scores. The predictive value of the TBSS for massive transfusion was then validated, using data from 113 severely injured trauma patients. Receiver operating characteristic curve analysis was performed to compare the results of TBSS with the Trauma-Associated Severe Hemorrhage score and the Assessment of Blood Consumption score. In the development phase, five predictors of massive transfusion were identified, including age, systolic blood pressure, the Focused Assessment with Sonography for Trauma scan, severity of pelvic fracture, and lactate level. The maximum TBSS is 57 points. In the validation study, the average TBSS in patients who received massive transfusion was significantly greater (24.2 [6.7]) than the score of patients who did not (6.2 [4.7]) (p < …). The area under the receiver operating characteristic curve, sensitivity, and specificity for a TBSS greater than 15 points were 0.985 (significantly higher than the other scoring systems evaluated, at 0.892 and 0.813, respectively), 97.4%, and 96.2%, respectively. The TBSS is simple to calculate using an available iOS application and is accurate in predicting the need for massive transfusion. Additional multicenter studies are needed to further validate this scoring system and further assess its utility. Prognostic study
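
    The abstract names the five TBSS predictors and the decision cut-off (more than 15 points out of a maximum of 57) but not the per-category point values, so the sketch below only illustrates the additive structure, with hypothetical cut-offs and weights.

```python
# Hypothetical cut-offs and point values for illustration only; the actual
# TBSS weights are not given in the abstract.
def tbss_like_score(age: int, systolic_bp: int, fast_positive: bool,
                    pelvic_fracture_grade: int, lactate_mmol_l: float) -> int:
    """Additive score over the five predictors named in the abstract."""
    score = 0
    score += 5 if age >= 55 else 0                              # age category
    score += 10 if systolic_bp < 90 else 0                      # hypotension
    score += 8 if fast_positive else 0                          # positive FAST scan
    score += {0: 0, 1: 4, 2: 8, 3: 12}[pelvic_fracture_grade]   # pelvic fracture severity
    score += 6 if lactate_mmol_l >= 2.5 else 0                  # elevated lactate
    return score

def predicts_massive_transfusion(score: int, cutoff: int = 15) -> bool:
    """The validation study reported high sensitivity/specificity above 15 points."""
    return score > cutoff

print(predicts_massive_transfusion(tbss_like_score(62, 84, True, 2, 4.1)))  # True
```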

  7. Weighted Maximum-Clique Transversal Sets of Graphs

    OpenAIRE

    Chuan-Min Lee

    2011-01-01

    A maximum-clique transversal set of a graph G is a subset of vertices intersecting all maximum cliques of G. The maximum-clique transversal set problem is to find a maximum-clique transversal set of G of minimum cardinality. Motivated by the placement of transmitters for cellular telephones, Chang, Kloks, and Lee introduced the concept of maximum-clique transversal sets on graphs in 2001. In this paper, we study the weighted version of the maximum-clique transversal set problem for split grap...

  8. Maximum power demand cost

    International Nuclear Information System (INIS)

    Biondi, L.

    1998-01-01

    The charging for a service is a supplier's remuneration for the expenses incurred in providing it. There are currently two charges for electricity: consumption and maximum demand. While no problem arises about the former, the issue is more complicated for the latter and the analysis in this article tends to show that the annual charge for maximum demand arbitrarily discriminates among consumer groups, to the disadvantage of some.

  9. Prognostic value of Angiographic Perfusion Score (APS) following percutaneous interventions in acute coronary syndromes.

    Science.gov (United States)

    Narain, V S; Fischer, L; Puri, A; Sethi, R; Dwivedi, S K

    2013-01-01

    Identifying reperfusion and predicting post-procedure risk is important following Percutaneous Coronary Interventions (PCI). An Angiographic Perfusion Score (APS) combining TIMI flow (TFG) and myocardial perfusion (TMPG) grades before and after PCI can accurately measure both epicardial and myocardial perfusion and predict Major Adverse Cardiac Events (MACE). APS was calculated in 226 patients (88 with ST elevation Myocardial Infarction (STEMI) and 138 with Non-STEMI). With a maximum score of 12, reperfusion was defined as failed (APS 0-3), partial (4-9), and full (10-12). Thirty-day MACE were observed. APS identified reperfusion significantly more often than TMPG alone (STEMI: 50.6% vs 11.8%, p < …); … APS group (1.8% vs 22.5%) (p < …). APS detects more low-risk reperfused patients post PCI.
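
    On the reading suggested by the abstract (two TIMI grades, each 0-3, taken both before and after PCI and summed to a maximum of 12), the score and its reperfusion categories can be sketched as follows; this is an illustration of that reading, not code from the study.

```python
def angiographic_perfusion_score(tfg_pre: int, tmpg_pre: int,
                                 tfg_post: int, tmpg_post: int) -> int:
    """Sum of TIMI flow grade (TFG) and TIMI myocardial perfusion grade
    (TMPG), each 0-3, before and after PCI (maximum 12)."""
    grades = (tfg_pre, tmpg_pre, tfg_post, tmpg_post)
    if any(g not in (0, 1, 2, 3) for g in grades):
        raise ValueError("TIMI grades must be integers between 0 and 3")
    return sum(grades)

def reperfusion_category(aps: int) -> str:
    """Categories used in the abstract: failed 0-3, partial 4-9, full 10-12."""
    if aps <= 3:
        return "failed"
    return "partial" if aps <= 9 else "full"

print(reperfusion_category(angiographic_perfusion_score(1, 1, 3, 3)))  # "partial"
```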

  10. Identification of dynapenia in older adults through the use of grip strength t-scores.

    Science.gov (United States)

    Bohannon, Richard W; Magasi, Susan

    2015-01-01

    The aim of this study was to generate reference values and t-scores (1.0-2.5 standard deviations below average) for grip strength for healthy young adults and to examine the utility of t-scores from this group for the identification of dynapenia in older adults. Our investigation was a population-based, general community secondary analysis of cross-sectional grip strength data utilizing the NIH Toolbox Assessment norming sample. Participants consisted of community-dwelling adults, with age ranges of 20-40 years (n = 558) and 60-85 years (n = 390). The main outcome measure was grip strength using a Jamar plus dynamometer. Maximum grip strengths were consistent over the 20-40-year age group [men 108.0 (SD 22.6) pounds, women 65.8 (SD 14.6) pounds]. Comparison of older group grip strengths to those of the younger reference group revealed (depending on age strata) that 46.2-87.1% of older men and 50.0-82.4% of older women could be designated as dynapenic on the basis of t-scores. The use of reference value t-scores from younger adults is a promising method for determining dynapenia in older adults. © 2014 Wiley Periodicals, Inc.
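
    Using the young-adult reference values reported in the abstract, a grip strength "t-score" of the kind described can be sketched as follows; the exact cut-point the authors applied within the 1.0-2.5 SD band is an assumption here.

```python
# Sex-specific young-adult (ages 20-40) reference values from the abstract:
# mean and SD of maximum grip strength in pounds.
REFERENCE = {"men": (108.0, 22.6), "women": (65.8, 14.6)}

def grip_t_score(grip_lbs: float, sex: str) -> float:
    """Express grip strength in SD units relative to the young reference group."""
    mean, sd = REFERENCE[sex]
    return (grip_lbs - mean) / sd

def classify(t: float) -> str:
    """Hedged reading of the 1.0-2.5 SD-below-average band used in the paper."""
    if t <= -2.5:
        return "at or beyond 2.5 SD below the young-adult mean"
    if t <= -1.0:
        return "dynapenic range (1.0-2.5 SD below the young-adult mean)"
    return "within 1 SD of the young-adult mean"

print(classify(grip_t_score(62.0, "men")))  # (62 - 108) / 22.6 ~ -2.0 -> dynapenic range
```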

  11. Soetomo score: score model in early identification of acute haemorrhagic stroke

    Directory of Open Access Journals (Sweden)

    Moh Hasan Machfoed

    2016-06-01

    Full Text Available Aim of the study: When brain imaging is limited by financial or facility constraints, a score model can be used to predict the occurrence of acute haemorrhagic stroke. Accordingly, this study attempts to develop a new score model, called the Soetomo score. Material and methods: The researchers performed a cross-sectional study of 176 acute stroke patients with onset of ≤24 hours who visited the emergency unit of Dr. Soetomo Hospital from July 14th to December 14th, 2014. The diagnosis of haemorrhagic stroke was confirmed by head computed tomography scan. There were seven predictors of haemorrhagic stroke which were analysed by using bivariate and multivariate analyses. Furthermore, a multiple discriminant analysis resulted in an equation for the Soetomo score model. The receiver operating characteristic procedure resulted in values for the area under the curve and the intersection point identifying haemorrhagic stroke. Afterward, the diagnostic test value was determined. Results: The equation of the Soetomo score model was (3 × loss of consciousness) + (3.5 × headache) + (4 × vomiting) − 4.5. The area under the curve value of this score was 88.5% (95% confidence interval = 83.3–93.7%). At a Soetomo score model value of ≥−0.75, the score reached a sensitivity of 82.9%, specificity of 83%, positive predictive value of 78.8%, negative predictive value of 86.5%, positive likelihood ratio of 4.88, negative likelihood ratio of 0.21, false negative rate of 17.1%, false positive rate of 17%, and accuracy of 83%. Conclusions: The Soetomo score model value of ≥−0.75 can identify acute haemorrhagic stroke properly under financial or facility constraints on brain imaging.
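
    Because the abstract gives the discriminant equation and the operating point explicitly, the score is straightforward to reproduce; the sketch below assumes each clinical feature is coded 1 when present and 0 when absent.

```python
def soetomo_score(loss_of_consciousness: bool, headache: bool, vomiting: bool) -> float:
    """Soetomo score as given in the abstract:
    (3 x loss of consciousness) + (3.5 x headache) + (4 x vomiting) - 4.5,
    with each feature coded 1 if present and 0 if absent (assumed coding)."""
    return 3.0 * loss_of_consciousness + 3.5 * headache + 4.0 * vomiting - 4.5

def suggests_haemorrhagic_stroke(score: float, cutoff: float = -0.75) -> bool:
    """Abstract's operating point: scores >= -0.75 flag acute haemorrhagic stroke."""
    return score >= cutoff

print(soetomo_score(False, True, False))                                # -1.0, below cut-off
print(suggests_haemorrhagic_stroke(soetomo_score(True, False, True)))   # True
```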

  12. Close linkage of the locus for X chromosome-linked severe combined immunodeficiency to polymorphic DNA markers in Xq11-q13

    International Nuclear Information System (INIS)

    de Saint Basile, G.; Arveiler, B.; Oberle, I.

    1987-01-01

    The gene for X chromosome-linked severe combined immunodeficiency (SCID), a disease characterized by a block in early T-cell differentiation, has been mapped to the region Xq11-q13 by linkage analysis with restriction fragment length polymorphisms. High logarithm of odds (lod) scores were obtained with the marker 19.2 (DXS3) and with the marker cpX73 (DXS159) that showed complete cosegregation with the disease locus in the informative families analyzed. Other significant linkages were obtained with several markers from Xq11 to q22. With the help of a recently developed genetic map of the region, it was possible to perform multipoint linkage analysis, and the most likely genetic order is DXS1-(SCID, DXS159)-DXYS1-DXYS12-DXS3, with a maximum multipoint logarithm of odds score of 11.0. The results demonstrate that the SCID locus (gene symbol IMD4) is not closely linked to the locus of Bruton's agammaglobulinemia (a defect in B-cell maturation). They also provide a way for a better estimation of risk for carrier and antenatal diagnosis

  13. Algorithm improvement program nuclide identification algorithm scoring criteria and scoring application.

    Energy Technology Data Exchange (ETDEWEB)

    Enghauser, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.

  14. Genetic homogeneity of autoimmune polyglandular disease type I

    Energy Technology Data Exchange (ETDEWEB)

    Bjoerses, P.; Aaltonen, J.; Vikman, A. [Univ. of Helsinki (Finland)] [and others

    1996-10-01

    Autoimmune polyglandular disease type I (APECED) is an autosomal recessive autoimmune disease (MIM 240300) characterized by hypoparathyroidism, primary adrenocortical failure, and chronic mucocutaneous candidiasis. The disease is highly prevalent in two isolated populations, the Finnish population and the Iranian Jewish one. Sporadic cases have been identified in many other countries, including almost all European countries. The APECED locus has previously been assigned to chromosome 21q22.3 by linkage analyses in 14 Finnish families. Locus heterogeneity is a highly relevant question in this disease affecting multiple tissues and with great phenotypic diversity. To solve this matter, we performed linkage and haplotype analyses on APECED families arising from different populations. Six microsatellite markers on the critical chromosomal region of 2.6 cM on 21q22.3 were analyzed. Pair-wise linkage analyses revealed significant LOD scores for all these markers, the maximum LOD score being 10.23. The obtained haplotype data and the geographic distribution of the great-grandparents of the Finnish APECED patients suggest the presence of one major, relatively old mutation responsible for approximately 90% of the Finnish cases. Similar evidence for one founder mutation was also found in analyses of Iranian Jewish APECED haplotypes. These haplotypes, however, differed totally from the Finnish ones. The linkage analyses in 21 non-Finnish APECED families originating from several European countries provided independent evidence for linkage to the same chromosomal region on 21q22.3 and revealed no evidence for locus heterogeneity. The haplotype analyses of APECED chromosomes suggest that in different populations APECED is due to a spectrum of mutations in a still unknown gene on chromosome 21. 21 refs., 3 figs., 3 tabs.

  15. Clinical and genetic linkage analysis of a large Venezuelan kindred with Usher syndrome.

    Science.gov (United States)

    Keogh, Ivan J; Godinho, R N; Wu, T Po; Diaz de Palacios, A M; Palacios, N; Bello de Alford, M; De Almada, M I; MarPalacios, N; Vazquez, A; Mattei, R; Seidman, C; Seidman, J; Eavey, R D

    2004-08-01

    To undertake a comprehensive investigation into the very high incidence of congenital deafness on the Macano peninsula of Margarita Island, Venezuela. Numerous visits were made to the isolated island community over a 4-year-period. During these visits, it became apparent that a significant number of individuals complained of problems with hearing and vision. Socioeconomic assessments, family pedigrees and clinical histories were recorded on standard questionnaires. All individuals underwent thorough otolaryngologic and ophthalmologic examinations. Twenty milliliters of peripheral venous blood was obtained from each participant. A genome-wide linkage analysis study was performed. Polymorphic microsatellite markers were amplified by polymerase chain reaction and separated on polyacrylamide gels. An ABI 377XL sequencer was used to separate fragments and LOD scores were calculated by using published software. Twenty-four families were identified, comprising 329 individuals, age range 1-80 years, including 184 children. All families were categorized in the lower two (least affluent) socioeconomic categories. A high incidence of consanguinity was detected. Fifteen individuals (11 adults, 4 children) had profound congenital sensorineural hearing loss, vestibular areflexia and retinitis pigmentosa. A maximum LOD score of 6.76 (Linkage >3.0), between markers D11s4186 and D11s911, confirmed linkage to chromosome 11q13.5. The gene myosin VIIA (MYO7A) was confirmed in the interval. Clinical and genetic findings are consistent with a diagnosis of Usher syndrome 1B for those with hearing and vision problems. We report 15 Usher syndrome 1B individuals from a newly detected Latin American socio-demographic origin, with a very high prevalence of 76 per 100,000 population.

  16. [Results of applying a paediatric early warning score system as a healthcare quality improvement plan].

    Science.gov (United States)

    Rivero-Martín, M J; Prieto-Martínez, S; García-Solano, M; Montilla-Pérez, M; Tena-Martín, E; Ballesteros-García, M M

    2016-06-01

    The aims of this study were to introduce a paediatric early warning score (PEWS) into our daily clinical practice, to evaluate its ability to detect clinical deterioration in admitted children, and to train nursing staff to communicate the information and response effectively. An analysis was performed on the implementation of PEWS in the electronic health records of children (0-15 years) in our paediatric ward from February 2014 to September 2014. The maximum score was 6. Nursing staff reviewed scores >2, and scores >3 were reviewed by both medical and nursing staff. Monitoring indicators: % of admissions with scoring; % of complete data capture; % of scores >3; % of scores >3 reviewed by medical staff; % of changes in treatment due to the warning system; and number of patients who needed Paediatric Intensive Care Unit (PICU) admission, or died, without an increased warning score. Data were collected from all 931 patients admitted. The scale was measured 7,917 times, 78.8% of them with complete data capture. Very few (1.9%) showed scores >3, and 14% of those led to changes in clinical management (intensifying treatment or new diagnostic tests). One patient (scored 2) required PICU admission. There were no deaths. Parental or nursing staff concern was registered in 80% of cases. The PEWS is useful for providing a standardised assessment of clinical status in the inpatient setting, using a single scale and systematic data capture. Given the lack of severe complications requiring PICU admission and the absence of deaths, other data will be needed to evaluate these scales. Copyright © 2016 SECA. Published by Elsevier Espana. All rights reserved.
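
    A minimal sketch of the escalation rule as described above (scores above 2 reviewed by nursing staff, above 3 by medical and nursing staff). The function name is an assumption, and the individual score components are not given in the abstract.

        def pews_escalation(score: int) -> str:
            """Escalation rule as described in the abstract: the summed warning
            score (maximum 6 in this implementation) triggers increasing review."""
            if score > 3:
                return "medical and nursing review"
            if score > 2:
                return "nursing review"
            return "routine monitoring"

        for s in range(7):
            print(s, "->", pews_escalation(s))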

  17. 49 CFR 230.24 - Maximum allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Maximum allowable stress. 230.24 Section 230.24... Allowable Stress § 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...

  18. A maximum power point tracking for photovoltaic-SPE system using a maximum current controller

    Energy Technology Data Exchange (ETDEWEB)

    Muhida, Riza [Osaka Univ., Dept. of Physical Science, Toyonaka, Osaka (Japan); Osaka Univ., Dept. of Electrical Engineering, Suita, Osaka (Japan); Park, Minwon; Dakkak, Mohammed; Matsuura, Kenji [Osaka Univ., Dept. of Electrical Engineering, Suita, Osaka (Japan); Tsuyoshi, Akira; Michira, Masakazu [Kobe City College of Technology, Nishi-ku, Kobe (Japan)

    2003-02-01

    Processes to produce hydrogen from solar photovoltaic (PV)-powered water electrolysis using solid polymer electrolysis (SPE) are reported. An alternative control of maximum power point tracking (MPPT) in the PV-SPE system, based on a maximum current searching method, has been designed and implemented. Based on the voltage-current characteristics and a theoretical analysis of the SPE, it can be shown that tracking the maximum current output of the DC-DC converter on the SPE side simultaneously tracks the maximum power point of the photovoltaic panel. The method uses a proportional-integral (PI) controller to control the duty factor of the DC-DC converter via a pulse-width modulator (PWM). The MPPT performance and hydrogen production performance of this method have been evaluated and discussed based on the experimental results. (Author)
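
    A minimal discrete-time sketch of the PI portion of such a scheme: the converter duty factor is adjusted so the measured current follows a reference, which the full method would in turn push toward its maximum. The plant model, gains, and function names are illustrative assumptions, not the authors' implementation.

        def converter_current(duty: float) -> float:
            """Hypothetical current-vs-duty characteristic standing in for the
            PV-fed DC-DC converter on the SPE side (illustrative only)."""
            return max(0.0, 8.0 - 40.0 * (duty - 0.55) ** 2)

        def pi_duty_control(current_ref: float = 6.0, kp: float = 0.02,
                            ki: float = 0.01, steps: int = 300):
            """Velocity-form discrete PI loop: nudge the PWM duty factor so the
            measured converter current follows current_ref.  In the full scheme
            the reference itself is pushed upward to search for the maximum
            current; only the inner PI regulation is sketched here."""
            duty = 0.2
            prev_error = current_ref - converter_current(duty)
            for _ in range(steps):
                error = current_ref - converter_current(duty)
                duty += kp * (error - prev_error) + ki * error
                duty = min(0.5, max(0.05, duty))   # stay on the rising branch
                prev_error = error
            return duty, converter_current(duty)

        print(pi_duty_control())   # duty settles where the current reaches ~6 A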

  19. Genetic and Dynamic Analyses of Murine Peak Bone Density

    National Research Council Canada - National Science Library

    Beamer, Wesley

    2001-01-01

    .... Locations of the QTLs (LOD scores > 2.9) were established, major effects and modes of inheritance were determined, and B6 background congenic strains were constructed to pursue biological and fine mapping/cloning of genes...

  20. GalaxyDock BP2 score: a hybrid scoring function for accurate protein-ligand docking

    Science.gov (United States)

    Baek, Minkyung; Shin, Woong-Hee; Chung, Hwan Won; Seok, Chaok

    2017-07-01

    Protein-ligand docking is a useful tool for providing atomic-level understanding of protein functions in nature and design principles for artificial ligands or proteins with desired properties. The ability to identify the true binding pose of a ligand to a target protein among numerous possible candidate poses is an essential requirement for successful protein-ligand docking. Many previously developed docking scoring functions were trained to reproduce experimental binding affinities and were also used for scoring binding poses. However, in this study, we developed a new docking scoring function, called GalaxyDock BP2 Score, by directly training the scoring power of binding poses. This function is a hybrid of physics-based, empirical, and knowledge-based score terms that are balanced to strengthen the advantages of each component. The performance of the new scoring function exhibits significant improvement over existing scoring functions in decoy pose discrimination tests. In addition, when the score is used with the GalaxyDock2 protein-ligand docking program, it outperformed other state-of-the-art docking programs in docking tests on the Astex diverse set, the Cross2009 benchmark set, and the Astex non-native set. GalaxyDock BP2 Score and GalaxyDock2 with this score are freely available at http://galaxy.seoklab.org/softwares/galaxydock.html.

  1. Lake Basin Fetch and Maximum Length/Width

    Data.gov (United States)

    Minnesota Department of Natural Resources — Linear features representing the Fetch, Maximum Length and Maximum Width of a lake basin. Fetch, maximum length and average width are calculated from the lake polygon...

  2. Maximum permissible voltage of YBCO coated conductors

    Energy Technology Data Exchange (ETDEWEB)

    Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z. [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Hong, Z., E-mail: zhiyong.hong@sjtu.edu.cn [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang, D.; Zhou, H.; Shen, X.; Shen, C. [Qingpu Power Supply Company, State Grid Shanghai Municipal Electric Power Company, Shanghai (China)

    2014-06-15

    Highlights: • We examine the maximum permissible voltage of three kinds of tapes. • We examine the relationship between quenching duration and maximum permissible voltage. • Continuous Ic degradation occurs under repetitive quenching when tapes reach the maximum permissible voltage. • We examine the relationship between maximum permissible voltage and resistance and temperature. - Abstract: A superconducting fault current limiter (SFCL) can reduce short-circuit currents in an electrical power system. One of the most important tasks in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer from critical current (Ic) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until Ic degradation or burnout occurs. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, 12 mm AMSC CC and 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm, respectively. Based on the results for these samples, the whole length of CC needed in the design of an SFCL can be determined.

  3. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule for determining the auto-correlation and cross-correlation functions. The Toeplitz equation and the Levinson algorithm are used to derive the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy extrapolation of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method for measuring receiver functions in the time domain.
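
    The Levinson recursion mentioned above solves the Toeplitz normal equations for the prediction-error filter directly from autocorrelation values. A minimal sketch, run here on a synthetic autocorrelation rather than seismogram data:

        import numpy as np

        def levinson_durbin(r: np.ndarray, order: int):
            """Levinson recursion for the Toeplitz normal equations: given
            autocorrelation values r[0..order], return the prediction-error filter
            coefficients a (a[0] = 1), the reflection coefficients, and the final
            prediction-error power."""
            a = np.zeros(order + 1)
            a[0] = 1.0
            error = float(r[0])
            reflections = []
            for m in range(1, order + 1):
                acc = r[m] + np.dot(a[1:m], r[1:m][::-1])
                k = -acc / error                  # reflection coefficient, |k| < 1
                a_prev = a.copy()
                for i in range(1, m):
                    a[i] = a_prev[i] + k * a_prev[m - i]
                a[m] = k
                error *= (1.0 - k * k)            # shrinking prediction-error power
                reflections.append(k)
            return a, np.array(reflections), error

        # Illustrative use: autocorrelation of a short synthetic (non-seismic) signal.
        rng = np.random.default_rng(0)
        x = np.convolve(rng.standard_normal(512), [1.0, 0.6, 0.2])
        r = np.array([np.dot(x[:x.size - lag], x[lag:]) for lag in range(4)]) / x.size
        print(levinson_durbin(r, 3))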

  4. A novel stroke locus identified in a northern Sweden pedigree

    DEFF Research Database (Denmark)

    Janunger, T.; Nilsson-Ardnor, S.; Wiklund, P.-G.

    2009-01-01

    OBJECTIVES: The population of northern Sweden is characterized by reduced genetic diversity and a high incidence of stroke. We sought to reduce genetic variation further, using genealogic analysis in a set of nuclear families affected by stroke, and we subsequently performed a genome-wide scan...... to identify novel stroke susceptibility loci. METHODS: Through genealogy, 7 nuclear families with a common ancestor, connected over 8 generations, were identified. A genome-wide scan using 449 microsatellite markers was performed with subsequent haplotype analyses. RESULTS: A maximum allele-sharing lod score...... of 4.81 on chromosome 9q31-q33 was detected. Haplotype analysis identified a common 2.2-megabase interval in the chromosomal region in 4 of the nuclear families, where an overrepresentation of intracerebral hemorrhage was observed. CONCLUSIONS: We have identified a novel susceptibility locus for stroke...

  5. Linkage of the gene that encodes the alpha 1 chain of type V collagen (COL5A1) to type II Ehlers-Danlos syndrome (EDS II).

    Science.gov (United States)

    Loughlin, J; Irven, C; Hardwick, L J; Butcher, S; Walsh, S; Wordsworth, P; Sykes, B

    1995-09-01

    Ehlers-Danlos syndrome (EDS) is a group of heritable disorders of connective tissue with skin, ligaments and blood vessels being the main sites affected. The commonest variant (EDS II) exhibits an autosomal dominant mode of inheritance and is characterized by joint hypermobility, cigarette paper scars, lax skin and excessive bruising. As yet no gene has been linked to EDS II, nor has linkage been established to a specific region of the genome. However, several candidate genes encoding proteins of the extracellular matrix have been excluded. Using an intragenic simple sequence repeat polymorphism, we report linkage of the COL5A1 gene, which encodes the alpha 1(V) chain of type V collagen, to EDS II. A maximum LOD score (Zmax) for linkage of 8.3 at theta = 0.00 was generated for a single large pedigree.

  6. Allegheny County Walk Scores

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Walk Score measures the walkability of any address using a patented system developed by the Walk Score company. For each 2010 Census Tract centroid, Walk Score...

  7. Maximum likelihood fitting of FROC curves under an initial-detection-and-candidate-analysis model

    International Nuclear Information System (INIS)

    Edwards, Darrin C.; Kupinski, Matthew A.; Metz, Charles E.; Nishikawa, Robert M.

    2002-01-01

    We have developed a model for FROC curve fitting that relates the observer's FROC performance not to the ROC performance that would be obtained if the observer's responses were scored on a per image basis, but rather to a hypothesized ROC performance that the observer would obtain in the task of classifying a set of 'candidate detections' as positive or negative. We adopt the assumptions of the Bunch FROC model, namely that the observer's detections are all mutually independent, as well as assumptions qualitatively similar to, but different in nature from, those made by Chakraborty in his AFROC scoring methodology. Under the assumptions of our model, we show that the observer's FROC performance is a linearly scaled version of the candidate analysis ROC curve, where the scaling factors are just given by the FROC operating point coordinates for detecting initial candidates. Further, we show that the likelihood function of the model parameters given observational data takes on a simple form, and we develop a maximum likelihood method for fitting a FROC curve to this data. FROC and AFROC curves are produced for computer vision observer datasets and compared with the results of the AFROC scoring method. Although developed primarily with computer vision schemes in mind, we hope that the methodology presented here will prove worthy of further study in other applications as well
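
    Under the model described above, each FROC operating point is the corresponding candidate-analysis ROC point scaled by the coordinates of the initial-candidate operating point. A minimal numeric sketch of that scaling, assuming a binormal candidate-level ROC; the parameter values and variable names are illustrative and do not come from the paper.

        import numpy as np
        from scipy.stats import norm

        def froc_from_candidate_roc(llf0, nlf0, a, b, thresholds):
            """Scale a binormal candidate-level ROC curve (parameters a, b) by an
            assumed initial-candidate operating point: llf0 = fraction of lesions
            captured as candidates, nlf0 = non-lesion candidates per image."""
            fpf = norm.sf(thresholds)            # candidate-level false-positive fraction
            tpf = norm.sf(b * thresholds - a)    # candidate-level true-positive fraction
            return nlf0 * fpf, llf0 * tpf

        nlf, llf = froc_from_candidate_roc(llf0=0.9, nlf0=2.5, a=1.5, b=1.0,
                                           thresholds=np.linspace(-3.0, 3.0, 7))
        print(np.round(nlf, 3))   # non-lesion localizations per image (x axis)
        print(np.round(llf, 3))   # fraction of lesions localized (y axis)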

  8. Recurrent epistaxis: predicting risk of 30-day readmission, derivation and validation of RHINO-ooze score.

    Science.gov (United States)

    Addison, A; Paul, C; Kuo, R; Lamyman, A; Martinez-Devesa, P; Hettige, R

    2017-06-01

    To derive and validate a predictive scoring tool (the RHINO-ooze score) with good sensitivity and specificity for identifying patients with epistaxis at high risk of 30-day readmission, and to enable risk stratification for possible definitive intervention. Using medical databases, we searched for factors influencing recurrent epistaxis. The information ascertained, together with our analysis of retrospective data on patients admitted with epistaxis between October 2013 and September 2014, was used as the derivation cohort to develop the predictive scoring model (RHINO-ooze score). The tool was validated by performing statistical analysis on the validation cohort of patients admitted with epistaxis between October 2014 and October 2015. Multiple linear regression with backwards elimination was used to derive the predictive model. The area under the curve (AUC), sensitivity and specificity were calculated. 834 admissions were encountered within the study period. Using the derivation cohort (n = 302), the RHINO-ooze score, with a maximum score of 8 from five variables (Recent admission, Haemorrhage point unidentified, Increasing age over 70, posterior Nasal packing, Oral anticoagulant), was developed. The RHINO-ooze score had a chi-square value of 99.72 with a significance level of less than 0.0001 and hence an overall good model fit. Comparison between the derivation and validation groups revealed similar rates of 30-day readmission between the cohorts. The sensitivity and specificity of predicting 30-day readmission in high-risk patients with recurrent epistaxis (RHINO-ooze score of 6 or greater) were 81% and 84%, respectively. The RHINO-ooze scoring tool demonstrates good specificity and sensitivity in predicting the risk of 30-day readmission in patients with epistaxis and can be used as an adjunct to clinical decision making with regard to the timing of operative intervention in order to reduce readmission rates.
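
    A minimal sketch of the five-item score as described above. The abstract gives the item names, a maximum total of 8 and a high-risk cut-off of 6 or more, but not the individual item weights, so the weights below are hypothetical placeholders chosen only so the maximum equals 8.

        def rhino_ooze_score(recent_admission: bool, haemorrhage_unidentified: bool,
                             age_over_70: bool, posterior_packing: bool,
                             oral_anticoagulant: bool) -> int:
            """Additive five-item score; the weights are hypothetical placeholders."""
            weights = [
                (recent_admission, 2),          # hypothetical weight
                (haemorrhage_unidentified, 2),  # hypothetical weight
                (age_over_70, 2),               # hypothetical weight
                (posterior_packing, 1),         # hypothetical weight
                (oral_anticoagulant, 1),        # hypothetical weight
            ]
            return sum(w for present, w in weights if present)

        score = rhino_ooze_score(True, True, True, False, True)
        print(score, "high risk of 30-day readmission" if score >= 6 else "lower risk")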

  9. Ossification score is a better indicator of maturity related changes in eating quality than animal age.

    Science.gov (United States)

    Bonny, S P F; Pethick, D W; Legrand, I; Wierzbicki, J; Allen, P; Farmer, L J; Polkinghorne, R J; Hocquette, J-F; Gardner, G E

    2016-04-01

    Ossification score and animal age are both used as proxies for maturity-related collagen crosslinking and consequently decreases in beef tenderness. Ossification score is strongly influenced by the hormonal status of the animal and may therefore better reflect physiological maturity and consequently eating quality. As part of a broader cross-European study, local consumers scored 18 different muscle types cooked in three ways from 482 carcasses with ages ranging from 590 to 6135 days and ossification scores ranging from 110 to 590. The data were studied across three different maturity ranges; the complete range of maturities, a lesser range and a more mature range. The lesser maturity group consisted of carcasses having either an ossification score of 200 or less or an age of 987 days or less with the remainder in the greater maturity group. The three different maturity ranges were analysed separately with a linear mixed effects model. Across all the data, and for the greater maturity group, animal age had a greater magnitude of effect on eating quality than ossification score. This is likely due to a loss of sensitivity in mature carcasses where ossification approached and even reached the maximum value. In contrast, age had no relationship with eating quality for the lesser maturity group, leaving ossification score as the more appropriate measure. Therefore ossification score is more appropriate for most commercial beef carcasses, however it is inadequate for carcasses with greater maturity such as cull cows. Both measures may therefore be required in models to predict eating quality over populations with a wide range in maturity.

  10. Maximum Gene-Support Tree

    Directory of Open Access Journals (Sweden)

    Yunfeng Shan

    2008-01-01

    Genomes and genes diversify during evolution; however, it is unclear to what extent genes still retain the relationship among species. Model species for molecular phylogenetic studies include yeasts and viruses whose genomes were sequenced as well as plants that have the fossil-supported true phylogenetic trees available. In this study, we generated single gene trees of seven yeast species as well as single gene trees of nine baculovirus species using all the orthologous genes among the species compared. Homologous genes among seven known plants were used for validation of the finding. Four algorithms—maximum parsimony (MP), minimum evolution (ME), maximum likelihood (ML), and neighbor-joining (NJ)—were used. Trees were reconstructed before and after weighting the DNA and protein sequence lengths among genes. Rarely can a gene always generate the “true tree” by all four algorithms. However, the most frequent gene tree, termed “maximum gene-support tree” (MGS tree, or WMGS tree for the weighted one), in yeasts, baculoviruses, or plants was consistently found to be the “true tree” among the species. The results provide insights into the overall degree of divergence of orthologous genes of the genomes analyzed and suggest the following: (1) the true tree relationship among the species studied is still maintained by the largest group of orthologous genes; (2) there are usually more orthologous genes with higher similarities between genetically closer species than between genetically more distant ones; and (3) the maximum gene-support tree reflects the phylogenetic relationship among species in comparison.

  11. A comparison between modified Alvarado score and RIPASA score in the diagnosis of acute appendicitis.

    Science.gov (United States)

    Singla, Anand; Singla, Satpaul; Singh, Mohinder; Singla, Deeksha

    2016-12-01

    Acute appendicitis is a common but elusive surgical condition and remains a diagnostic dilemma. It has many clinical mimickers and the diagnosis is primarily made on clinical grounds, leading to the evolution of clinical scoring systems for pinpointing the right diagnosis. The modified Alvarado and RIPASA scoring systems are two important scoring systems for the diagnosis of acute appendicitis. We prospectively compared the two scoring systems for diagnosing acute appendicitis in 50 patients presenting with right iliac fossa pain. The RIPASA score correctly classified 88% of patients with histologically confirmed acute appendicitis, compared with 48.0% for the modified Alvarado score, indicating that the RIPASA score is superior to the modified Alvarado score in our clinical setting.

  12. A score for measuring health risk perception in environmental surveys.

    Science.gov (United States)

    Marcon, Alessandro; Nguyen, Giang; Rava, Marta; Braggion, Marco; Grassi, Mario; Zanolin, Maria Elisabetta

    2015-09-15

    In environmental surveys, risk perception may be a source of bias when information on health outcomes is reported using questionnaires. Using the data from a survey carried out in the largest chipboard industrial district in Italy (Viadana, Mantova), we devised a score of health risk perception and described its determinants in an adult population. In 2006, 3697 parents of children were administered a questionnaire that included ratings on 7 environmental issues. Item dimensionality was studied by factor analysis. After testing equidistance across response options by homogeneity analysis, a risk perception score was devised by summing up item ratings. Factor analysis identified one latent factor, which we interpreted as health risk perception, that explained 65.4% of the variance of the five items retained after scaling. The scale (range 0-10, mean ± SD 9.3 ± 1.9) had good internal consistency (Cronbach's alpha 0.87). Most subjects (80.6%) expressed maximum risk perception (score = 10). Italian mothers showed significantly higher risk perception than foreign fathers. Risk perception was higher for parents of young children, and for older parents with a higher education, than for their counterparts. Actual distance to major roads was not associated with the score, while self-reported intense traffic and frequent air refreshing at home predicted higher risk perception. When investigating health effects of environmental hazards using questionnaires, care should be taken to reduce the possibility of awareness bias at the stage of study planning and data analysis. Including appropriate items in study questionnaires can be useful to derive a measure of health risk perception, which can help to identify confounding of association estimates by risk perception. Copyright © 2015 Elsevier B.V. All rights reserved.
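
    Cronbach's alpha, reported above as 0.87 for the five retained items, can be computed directly from the respondent-by-item rating matrix. A minimal sketch with made-up ratings on the same 0-2 item scale (illustrative data only, not the survey's):

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """Cronbach's alpha for an (n_respondents x n_items) rating matrix:
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

        # Made-up ratings: 8 respondents x 5 items on a 0-2 scale (summed score 0-10).
        ratings = np.array([[2, 2, 2, 2, 2],
                            [2, 2, 1, 2, 2],
                            [1, 1, 1, 1, 2],
                            [2, 2, 2, 2, 1],
                            [0, 1, 0, 1, 0],
                            [2, 2, 2, 1, 2],
                            [1, 2, 1, 1, 1],
                            [2, 2, 2, 2, 2]])
        print(round(cronbach_alpha(ratings), 2))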

  13. Maximum neutron flux in thermal reactors

    International Nuclear Information System (INIS)

    Strugar, P.V.

    1968-12-01

    The direct approach to the problem is to calculate the spatial distribution of fuel concentration in the reactor core using the condition of maximum neutron flux while complying with thermal limitations. This paper shows that the problem can be solved by applying the variational calculus, i.e. by using the maximum principle of Pontryagin. The mathematical model of the reactor core is based on two-group neutron diffusion theory with some simplifications that make it suitable for treatment by the maximum principle. The solution for the optimum distribution of fuel concentration in the reactor core is obtained in explicit analytical form. The reactor critical dimensions are roots of a system of nonlinear equations, and verification of the optimum conditions can be done only for specific examples.

  14. Good validity and reliability of the forgotten joint score in evaluating the outcome of total knee arthroplasty

    DEFF Research Database (Denmark)

    Thomsen, Morten G; Latifi, Roshan; Kallemose, Thomas

    2016-01-01

    . We investigated the validity and reliability of the FJS. Patients and methods - A Danish version of the FJS questionnaire was created according to internationally accepted standards. 360 participants who underwent primary TKA were invited to participate in the study. Of these, 315 were included...... in a validity study and 150 in a reliability study. Correlation between the Oxford knee score (OKS) and the FJS was examined and test-retest evaluation was performed. A ceiling effect was defined as participants reaching a score within 15% of the maximum achievable score. Results - The validity study revealed...... of the FJS (ICC 0.79). We found a high level of internal consistency (Cronbach's α = 0.96). The ceiling effect for the FJS was 16%, as compared to 37% for the OKS. Interpretation - The FJS showed good construct validity and test-retest reliability. It had a lower ceiling effect than the OKS. The FJS appears...

  15. Late onset autosomal dominant cerebellar ataxia: a family description and linkage analysis with the HLA system

    Directory of Open Access Journals (Sweden)

    Walter O. Arruda

    1991-09-01

    A family suffering from an autosomal dominant form of late-onset hereditary cerebellar ataxia is described. Eight affected family members were personally studied, and data from another four were obtained through anamnesis. The mean age of onset was 37.1±5.4 years (27-47 years). The clinical picture consisted basically of a pure ataxic cerebellar syndrome. CT scan disclosed diffuse cerebellar atrophy with relative sparing of the brainstem (and no involvement of supratentorial structures). Neurophysiological studies (nerve conduction, VEP and BAEP) were normal. Twenty-six individuals were typed for HLA histocompatibility antigens. Lod scores were calculated with the computer program LINKMAP. Close linkage of the ataxia gene with the HLA system in this family could be excluded (θ = 0.02, Z = -2.17), and the overall analysis of the lod scores suggests a chromosomal location other than chromosome 6.

  16. Ripasa score: a new diagnostic score for diagnosis of acute appendicitis

    International Nuclear Information System (INIS)

    Butt, M.Q.

    2014-01-01

    Objective: To determine the usefulness of the RIPASA score for the diagnosis of acute appendicitis using histopathology as a gold standard. Study Design: Cross-sectional study. Place and Duration of Study: Department of General Surgery, Combined Military Hospital, Kohat, from September 2011 to March 2012. Methodology: A total of 267 patients were included in this study. RIPASA score was assessed. The diagnosis of appendicitis was made clinically, aided by routine sonography of the abdomen. After appendicectomies, resected appendices were sent for histopathological examination. The 15 parameters and the scores generated were age (less than 40 years = 1 point; greater than 40 years = 0.5 point), gender (male = 1 point; female = 0.5 point), Right Iliac Fossa (RIF) pain (0.5 point), migration of pain to RIF (0.5 point), nausea and vomiting (1 point), anorexia (1 point), duration of symptoms (less than 48 hours = 1 point; more than 48 hours = 0.5 point), RIF tenderness (1 point), guarding (2 points), rebound tenderness (1 point), Rovsing's sign (2 points), fever (1 point), raised white cell count (1 point), negative urinalysis (1 point) and foreign national registration identity card (1 point). The optimal cut-off threshold score from the ROC was 7.5. Sensitivity analysis was done. Results: Out of 267 patients, 156 (58.4%) were male while the remaining 111 patients (41.6%) were female, with a mean age of 23.5 ± 9.1 years. Sensitivity of the RIPASA score was 96.7%, specificity 93.0%, diagnostic accuracy 95.1%, positive predictive value 94.8% and negative predictive value 95.54%. Conclusion: The RIPASA score at a cut-off total score of 7.5 was a useful tool to diagnose appendicitis in equivocal cases of pain. (author)
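
    The point values listed above translate directly into a simple additive calculator. A minimal sketch: the weights and the 7.5 cut-off are the ones reported in the abstract, while the dictionary keys and the example findings are illustrative.

        RIPASA_WEIGHTS = {
            "age_under_40": 1.0, "age_40_or_over": 0.5,
            "male": 1.0, "female": 0.5,
            "rif_pain": 0.5, "pain_migration_to_rif": 0.5,
            "nausea_vomiting": 1.0, "anorexia": 1.0,
            "symptoms_under_48h": 1.0, "symptoms_over_48h": 0.5,
            "rif_tenderness": 1.0, "guarding": 2.0, "rebound_tenderness": 1.0,
            "rovsing_sign": 2.0, "fever": 1.0, "raised_white_cell_count": 1.0,
            "negative_urinalysis": 1.0, "foreign_nric": 1.0,
        }

        def ripasa_score(findings: set) -> float:
            """Sum the point values of the findings present (weights as listed in
            the abstract); a total above the 7.5 cut-off supported the diagnosis
            of acute appendicitis in this study."""
            return sum(RIPASA_WEIGHTS[f] for f in findings)

        example = {"age_under_40", "male", "rif_pain", "nausea_vomiting",
                   "symptoms_under_48h", "rif_tenderness", "guarding",
                   "raised_white_cell_count"}
        total = ripasa_score(example)
        print(total, "suggests appendicitis" if total > 7.5 else "below cut-off")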

  17. 75 FR 43840 - Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum Civil Monetary Penalties for...

    Science.gov (United States)

    2010-07-27

    ...-17530; Notice No. 2] RIN 2130-ZA03 Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum... remains at $250. These adjustments are required by the Federal Civil Penalties Inflation Adjustment Act [email protected] . SUPPLEMENTARY INFORMATION: The Federal Civil Penalties Inflation Adjustment Act of 1990...

  18. Half-width at half-maximum, full-width at half-maximum analysis

    Indian Academy of Sciences (India)

    addition to the well-defined parameter full-width at half-maximum (FWHM). The distribution of ... optical side-lobes in the diffraction pattern resulting in steep central maxima [6], reduction of effects of ... and broad central peak. The idea of.

  19. The International Bleeding Risk Score

    DEFF Research Database (Denmark)

    Laursen, Stig Borbjerg; Laine, L.; Dalton, H.

    2017-01-01

    The International Bleeding Risk Score: A New Risk Score that can Accurately Predict Mortality in Patients with Upper GI-Bleeding.

  20. Two-dimensional maximum entropy image restoration

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.

    1977-07-01

    An optical check problem was constructed to test P LOG P maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures

  1. Variation in WNT7A is unlikely to be a cause of familial Congenital Talipes Equinovarus

    Directory of Open Access Journals (Sweden)

    Hennekam Raoul

    2008-06-01

    Background: Genetic factors make an important contribution to the aetiology of congenital talipes equinovarus (CTEV), the most common developmental disorder of the lower limb. WNT7A was suggested as a candidate gene for CTEV on the basis of a genome-wide scan for linkage in a large multi-case family. WNT7A is a plausible candidate gene for CTEV as it provides a signal for pattern formation during limb development, and mutation in WNT7A has been reported in a number of limb malformation syndromes. Methods: We investigated the role of WNT7A using a family-based linkage approach in our large series of European multi-case CTEV families. Three microsatellite markers were used, of which one (D3S2385) is intragenic, and the other two (D3S2403, D3S1252) are 700 kb 5' to the start and 20 kb from the 3' end of the gene, respectively. Ninety-one CTEV families, comprising 476 individuals of whom 211 were affected, were genotyped. LOD scores using recessive and incomplete-dominant inheritance models, and non-parametric linkage scores, excluded linkage. Results: No significant evidence for linkage was observed using either parametric or non-parametric models. LOD scores for the parametric models remained strongly negative in the regions between the markers, and in the 0.5 cM intervals outside the marker map. No significant lod scores were obtained when the data were analysed allowing for heterogeneity. Conclusion: Our evidence suggests that the WNT7A gene is unlikely to be a major contributor to the aetiology of familial CTEV.

  2. Maximum power point tracking

    International Nuclear Information System (INIS)

    Enslin, J.H.R.

    1990-01-01

    A well-engineered renewable remote energy system, utilizing the principle of Maximum Power Point Tracking, can be more cost effective, has a higher reliability and can improve the quality of life in remote areas. This paper reports that a high-efficiency power electronic converter, for converting the output voltage of a solar panel or wind generator to the required DC battery bus voltage, has been realized. The converter is controlled to track the maximum power point of the input source under varying input and output parameters. Maximum power point tracking for relatively small systems is achieved by maximization of the output current in a battery charging regulator, using an optimized hill-climbing, inexpensive microprocessor-based algorithm. Through practical field measurements it is shown that a minimum input source saving of 15% on 3-5 kWh/day systems can easily be achieved. A total cost saving of at least 10-15% on the capital cost of these systems is achievable for relatively small Remote Area Power Supply systems. The advantages are much greater for larger temperature variations and higher power rated systems. Other advantages include optimal sizing and system monitoring and control.
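
    A minimal perturb-and-observe sketch of the hill-climbing idea described above: keep stepping the operating point in the direction that increased the measured power, and reverse when power falls. The power curve, step size, and function names are illustrative assumptions, not the paper's implementation.

        def pv_power(voltage: float) -> float:
            """Hypothetical PV power-voltage curve with a single maximum
            (an illustrative stand-in for panel measurements)."""
            return max(0.0, voltage * (8.0 - 0.5 * voltage))   # peaks near 8 V

        def hill_climb_mppt(v_start: float = 2.0, step: float = 0.1,
                            iterations: int = 200) -> float:
            """Perturb-and-observe: keep stepping the operating voltage in the
            direction that increased the measured power; reverse when power drops."""
            v, direction = v_start, 1.0
            last_power = pv_power(v)
            for _ in range(iterations):
                v += direction * step
                power = pv_power(v)
                if power < last_power:        # walked past the maximum, turn around
                    direction = -direction
                last_power = power
            return v

        print(round(hill_climb_mppt(), 2))    # oscillates around the maximum power point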

  3. 7 CFR 3565.210 - Maximum interest rate.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Maximum interest rate. 3565.210 Section 3565.210... AGRICULTURE GUARANTEED RURAL RENTAL HOUSING PROGRAM Loan Requirements § 3565.210 Maximum interest rate. The interest rate for a guaranteed loan must not exceed the maximum allowable rate specified by the Agency in...

  4. Prediction of Beck Depression Inventory (BDI-II) Score Using Acoustic Measurements in a Sample of IIUM Engineering Students

    Science.gov (United States)

    Fikri Zanil, Muhamad; Nur Wahidah Nik Hashim, Nik; Azam, Huda

    2017-11-01

    Psychiatrists currently rely on questionnaires and interviews for psychological assessment. These conventional methods often miss true positives and might lead to death, especially in cases where a patient is experiencing a suicidal predisposition but was only diagnosed with major depressive disorder (MDD). With modern technology, an assessment tool might aid psychiatrists with a more accurate diagnosis and thus help reduce casualties. This project explores the relationship between speech features of spoken audio signals (reading) in Bahasa Malaysia and Beck Depression Inventory (BDI-II) scores. The speech features used in this project were Power Spectral Density (PSD), Mel-Frequency Cepstral Coefficients (MFCC), Transition Parameters, formants and pitch. According to the analysis, the optimum combination of speech features to predict BDI-II scores includes PSD, MFCC and Transition Parameters. A linear regression approach with a sequential forward/backward method was used to predict the BDI-II scores from reading speech. The result showed a mean absolute error (MAE) of 0.4096 for female reading speech. For males, the BDI-II scores were all predicted to within a difference of less than 1, with an MAE of 0.098437. A prediction system called the Depression Severity Evaluator (DSE) was developed. The DSE managed to predict one out of five subjects correctly. Although the prediction rate was low, the system predicted the scores to within a maximum difference of 4.93 for each person. This demonstrates that the scores are not random numbers.
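
    A minimal sketch of linear regression with sequential forward feature selection, in the spirit of the approach described above; the scikit-learn selector, the random stand-in features, and all parameter values are assumptions for illustration, not the authors' pipeline or data.

        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_error

        # Random stand-in features in place of the PSD / MFCC / transition-parameter
        # measurements; the target is a synthetic BDI-II-like score.
        rng = np.random.default_rng(42)
        X = rng.standard_normal((60, 12))                 # 60 recordings, 12 features
        y = 3.0 * X[:, 0] + 2.0 * X[:, 3] + rng.standard_normal(60)

        selector = SequentialFeatureSelector(LinearRegression(),
                                             n_features_to_select=4,
                                             direction="forward", cv=5)
        selector.fit(X, y)
        kept = selector.get_support()
        model = LinearRegression().fit(X[:, kept], y)
        pred = model.predict(X[:, kept])
        print("selected feature indices:", np.flatnonzero(kept))
        print("in-sample MAE:", round(mean_absolute_error(y, pred), 3))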

  5. How is the injury severity scored? a brief review of scoring systems

    Directory of Open Access Journals (Sweden)

    Mohsen Ebrahimi

    2015-06-01

    The management of injured patients is a critical issue in pre-hospital and emergency departments. Trauma victims are usually young, and the injuries may lead to mortality or severe morbidity. The severity of injury can be estimated by observing the anatomic and physiologic evidence. Scoring systems are used to provide a scale for describing the severity of the injuries in the victims. We reviewed the evidence on well-known scoring systems, the history of their development, their applications and their evolution. We searched the electronic databases PubMed and Google Scholar with the keywords: (trauma OR injury) AND (severity OR intensity) AND (score OR scale). In this paper, we present a definition of scoring systems and discuss the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS), the most widely accepted systems, their applications and their advantages and limitations. Several injury-scoring methods have been introduced. Each method has specific features, advantages and disadvantages. The AIS is an anatomically based scoring system, which provides a standard numerical scale for ranking and comparing injuries. The ISS was established as a platform for trauma data registry. The ISS is also an anatomically based ordinal scale, with a range of 1-75. Several databases and studies have been formed based on the ISS and are available for trauma management research. Although the ISS is not perfect, it is established as the basic platform of health services and public health research. The ISS registering system can provide many opportunities for the development of efficient data recording and statistical analysis models.
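
    The ISS mentioned above is conventionally computed from the AIS grades of the body regions: the standard rule (not spelled out in the abstract) squares the highest AIS grade in each of the three most severely injured regions and sums them, with any AIS of 6 forcing the maximum score of 75. A minimal sketch with an illustrative patient:

        def injury_severity_score(region_ais: dict) -> int:
            """Standard ISS rule: square the highest AIS grade in each of the three
            most severely injured body regions and sum them; any unsurvivable
            injury (AIS 6) sets the score to its maximum of 75."""
            if any(grade == 6 for grade in region_ais.values()):
                return 75
            worst_three = sorted(region_ais.values(), reverse=True)[:3]
            return sum(grade * grade for grade in worst_three)

        # Illustrative patient: highest AIS grade recorded per body region.
        print(injury_severity_score({"head": 4, "face": 1, "chest": 3,
                                     "abdomen": 2, "extremities": 2,
                                     "external": 1}))     # 4^2 + 3^2 + 2^2 = 29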

  6. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
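
    A minimal sketch of the same idea in modern terms: form the likelihood of noisy measurements under a parametric model and maximize it by minimizing the negative log-likelihood. The exponential-decay model, noise level, and optimizer choice are illustrative assumptions, not MXLKID's algorithm.

        import numpy as np
        from scipy.optimize import minimize

        # Noisy measurements from a hypothetical single-parameter system
        # (exponential decay), standing in for the dynamic system of interest.
        rng = np.random.default_rng(1)
        true_decay, noise_sd = 0.8, 0.2
        t = np.linspace(0.0, 5.0, 40)
        y = np.exp(-true_decay * t) + rng.normal(0.0, noise_sd, t.size)

        def negative_log_likelihood(params: np.ndarray) -> float:
            """Gaussian negative log-likelihood (up to a constant) of the
            measurements for a candidate decay-rate parameter."""
            decay = params[0]
            residuals = y - np.exp(-decay * t)
            return 0.5 * np.sum(residuals ** 2) / noise_sd ** 2

        fit = minimize(negative_log_likelihood, x0=np.array([0.3]))
        print("maximum likelihood estimate of the decay rate:", round(fit.x[0], 3))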

  7. Late-onset Stargardt-like macular dystrophy maps to chromosome 1p13

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, J.; Gerber, S.; Rozet, J.M. [Hopital des Enfants Malades, Paris (France)] [and others]

    1994-09-01

    Stargardt's disease (MIM 248200), originally described in 1909, is an autosomal recessive condition of childhood, characterized by a sudden and bilateral loss of central vision. Typically, it has an early onset (7 to 12 years), a rapidly progressive course and a poor final outcome. The central area of the retina (macula) displays pigmentary changes in a ring form with depigmentation and atrophy of the retinal pigmentary epithelium (RPE). Perimacular yellowish spots, termed fundus flavimaculatus, are observed in a high percentage of patients. We have recently reported the genetic mapping of Stargardt's disease to chromosome 1p13. On the other hand, considering that fundus flavimaculatus (MIM 230100) is another form of fleck fundus disease, with a Stargardt-like retinal aspect but with a late onset and a more progressive course, we decided to test the hypothesis of allelism between typical Stargardt's disease and late-onset autosomal recessive fundus flavimaculatus. Significant pairwise lod scores were obtained in each of four multiplex families (11 affected individuals, 12 relatives) with four markers of the 1p13 region (Z = 4.79, 4.64, 3.07, 3.16 at loci D1S435, D1S424, D1S236, and D1S415, respectively, at θ = 0). Multipoint analysis showed that the best estimate for the location of the disease gene is between D1S424 and D1S236 (maximum lod score of 5.20), as also observed in Stargardt's disease. Our results are consistent with the location of the gene responsible for the late-onset Stargardt-like macular dystrophy in the 1p13 region and raise the hypothesis of either allelic mutational events or contiguous genes in this chromosomal region. The question of a possible relationship with some age-related macular dystrophies is now open to debate.

  8. Quantification of temporal changes in calcium score in active atherosclerotic plaque in major vessels by 18F-sodium fluoride PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Ishiwata, Yoshinobu; Kaneta, Tomohiro; Nawata, Shintaro; Hino-Shishikura, Ayako; Yoshida, Keisuke; Inoue, Tomio [Yokohama City University, Graduate School of Medicine, Department of Radiology, Yokohama, Kanagawa (Japan)

    2017-08-15

    Our aim was to assess whether 18F-NaF PET/CT is able to predict progression of the CT calcium score. Between August 2007 and November 2015, 34 patients (18 women, 16 men; age, mean ± standard deviation, 57.5 ± 13.9 years; age range 19-78 years) with malignancy or orthopaedic disease were enrolled in this study, with approximately 1-year follow-up data. Baseline and follow-up CT images were retrospectively evaluated for the presence of calcification sites in major vessel walls. The maximum and mean CT values (CTmax and CTmean, in Hounsfield units), calcification volumetric score (CVS, in cubic millimetres) and Agatston units score (AU) were evaluated for each site. Subsequent changes in CTmax, CTmean, CVS and AU were calculated and expressed as ΔCTmax, ΔCTmean, ΔCVS and ΔAU, respectively. We then evaluated the relationship between 18F-NaF uptake (using the maximum target-to-background ratio, TBRmax, and the maximum blood-subtracted 18F-NaF activity, bsNaFmax, which was obtained by subtracting the SUVmax of each calcified plaque lesion and NaF-avid site from the SUVmean in the right atrium blood pool) and the change in calcified plaque volume and characteristics obtained after 1 year. We detected and analysed 182 calcified plaque sites and 96 hot spots on major vessel walls. 18F-NaF uptake showed very weak correlations with CTmax, CTmean, CVS, CVS after 1 year, AU and AU after 1 year on both baseline and follow-up PET/CT scans for each site. 18F-NaF uptake showed no correlation with ΔCTmax or ΔCTmean. However, there was a significant correlation between the intensity of 18F-NaF uptake and ΔCVS and ΔAU. 18F-NaF uptake has a strong correlation with calcium score progression, which was a predictor of future cardiovascular disease risk. PET/CT using 18F-NaF may be able to predict calcium score progression, which is known to be the major characteristic of atherosclerosis. (orig.)

  9. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    Science.gov (United States)

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area (z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus lower baseline values. Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  10. SCORE - A DESCRIPTION.

    Science.gov (United States)

    SLACK, CHARLES W.

    Reinforcement and role-reversal techniques are used in the SCORE project, a low-cost program of delinquency prevention for hard-core teenage street corner boys. Committed to the belief that the boys have the potential for ethical behavior, the SCORE worker follows B.F. Skinner's theory of operant conditioning and reinforces the delinquent's good…

  11. Pediatric siMS score: A new, simple and accurate continuous metabolic syndrome score for everyday use in pediatrics.

    Science.gov (United States)

    Vukovic, Rade; Milenkovic, Tatjana; Stojan, George; Vukovic, Ana; Mitrovic, Katarina; Todorovic, Sladjana; Soldatovic, Ivan

    2017-01-01

    The dichotomous nature of the current definition of metabolic syndrome (MS) in youth results in loss of information. On the other hand, the calculation of continuous MS scores using standardized residuals in linear regression (Z scores) or factor scores of principal component analysis (PCA) is highly impractical for clinical use. Recently, a novel, easily calculated continuous MS score called siMS score was developed based on the IDF MS criteria for the adult population. To develop a Pediatric siMS score (PsiMS), a modified continuous MS score for use in the obese youth, based on the original siMS score, while keeping the score as simple as possible and retaining high correlation with more complex scores. The database consisted of clinical data on 153 obese (BMI ≥95th percentile) children and adolescents. Continuous MS scores were calculated using Z scores and PCA, as well as the original siMS score. Four variants of PsiMS score were developed in accordance with IDF criteria for MS in youth and correlation of these scores with PCA and Z score derived MS continuous scores was assessed. PsiMS score calculated using formula: (2xWaist/Height) + (Glucose(mmol/l)/5.6) + (triglycerides(mmol/l)/1.7) + (Systolic BP/130)-(HDL(mmol/l)/1.02) showed the highest correlation with most of the complex continuous scores (0.792-0.901). The original siMS score also showed high correlation with continuous MS scores. PsiMS score represents a practical and accurate score for the evaluation of MS in the obese youth. The original siMS score should be used when evaluating large cohorts consisting of both adults and children.
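
    The PsiMS formula quoted above maps directly to code. A minimal sketch (the function name and example values are illustrative; units follow the abstract: mmol/l for glucose, triglycerides and HDL, mm Hg for systolic blood pressure, and the same length unit for waist and height):

        def psims_score(waist: float, height: float, glucose_mmol_l: float,
                        triglycerides_mmol_l: float, systolic_bp: float,
                        hdl_mmol_l: float) -> float:
            """Pediatric siMS score exactly as quoted in the abstract:
            (2*Waist/Height) + (Glucose/5.6) + (Triglycerides/1.7)
            + (Systolic BP/130) - (HDL/1.02)."""
            return (2.0 * waist / height
                    + glucose_mmol_l / 5.6
                    + triglycerides_mmol_l / 1.7
                    + systolic_bp / 130.0
                    - hdl_mmol_l / 1.02)

        # Illustrative values only (waist and height in the same unit).
        print(round(psims_score(waist=90, height=150, glucose_mmol_l=5.0,
                                triglycerides_mmol_l=1.5, systolic_bp=120,
                                hdl_mmol_l=1.1), 2))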

  12. The dopamine transporter protein gene (SLC6A3): Primary linkage mapping and linkage studies in Tourette syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Gelernter, J.; Kruger, S.D.; Pakstis, A.J. [Yale Univ., New Haven, CT (United States)]|[West Haven Veterans Affairs Medical Center, CT (United States)] [and others]

    1995-12-10

    The dopamine transporter, the molecule responsible for presynaptic reuptake of dopamine and a major site of action of psychostimulant drugs, including cocaine, is encoded by locus SLC6A3 (alias DAT1). The protein's actions and DAT's specific localization to dopaminergic neurons make it a candidate gene for several psychiatric illnesses. SLC6A3 has been mapped to distal chromosome 5p, using physical methods. Genetic linkage methods were used to place SLC6A3 in the genetic linkage map. Four extended pedigrees (one of which overlaps with CEPH) were typed. Linkage with Tourette syndrome (TS) was also examined. SLC6A3 showed close linkage with several markers previously mapped to distal chromosome 5p, including D5S11 (Zmax = 16.0, θM = θF = 0.03, results from four families) and D5S678 (Zmax = 7.84, θM = θF = 0, results from two families). Observed crossovers established that SLC6A3 is a distal marker close to D5S10 and D5S678, but these three distal markers could not be ordered. Linkage between TS and SLC6A3 could be excluded independently in two branches of a large kindred segregating TS; the lod score in a third family was also negative, but not significant. Cumulative results show a lod score of -6.2 at θ = 0 and of -3.9 at θ = 0.05 (dominant model, narrow disease definition). SLC6A3 thus maps to distal chromosome 5p by linkage analysis, in agreement with previous physical mapping data. A mutation at SLC6A3 is not causative for TS in the two large families that generated significant negative lod scores (if the parameters of our analyses were correct) and is unlikely to be causative in the family that generated a negative lod score that did not reach significance. These results do not exclude a role for the dopamine transporter in influencing risk for TS in combination with other loci. 23 refs., 1 fig., 2 tabs.

  13. Dual-energy X-ray absorptiometry diagnostic discordance between Z-scores and T-scores in young adults.

    LENUS (Irish Health Repository)

    Carey, John J

    2009-01-01

    Diagnostic criteria for postmenopausal osteoporosis using central dual-energy X-ray absorptiometry (DXA) T-scores have been widely accepted. The validity of these criteria for other populations, including premenopausal women and young men, has not been established. The International Society for Clinical Densitometry (ISCD) recommends using DXA Z-scores, not T-scores, for diagnosis in premenopausal women and men aged 20-49 yr, though studies supporting this position have not been published. We examined diagnostic agreement between DXA-generated T-scores and Z-scores in a cohort of men and women aged 20-49 yr, using 1994 World Health Organization and 2005 ISCD DXA criteria. Four thousand two hundred and seventy-five unique subjects were available for analysis. The agreement between DXA T-scores and Z-scores was moderate (Cohen's kappa: 0.53-0.75). The use of Z-scores resulted in significantly fewer (McNemar's p<0.001) subjects diagnosed with "osteopenia," "low bone mass for age," or "osteoporosis." Thirty-nine percent of Hologic (Hologic, Inc., Bedford, MA) subjects and 30% of Lunar (GE Lunar, GE Madison, WI) subjects diagnosed with "osteoporosis" by T-score were reclassified as either "normal" or "osteopenia" when their Z-score was used. Substitution of DXA Z-scores for T-scores results in significant diagnostic disagreement and significantly fewer persons being diagnosed with low bone mineral density.

  14. Which clinical variable influences health-related quality of life the most after spontaneous subarachnoid hemorrhage? Hunt and Hess scale, Fisher score, World Federation of Neurosurgeons score, Brussels coma score, and Glasgow coma score compared.

    Science.gov (United States)

    Kapapa, Thomas; Tjahjadi, Martin; König, Ralph; Wirtz, Christian Rainer; Woischneck, Dieter

    2013-12-01

    To determine the strength of the correlation between the Hunt and Hess scale, Fisher score, Brussels coma score, World Federation of Neurosurgeons score, and Glasgow coma score and health-related quality of life. Evaluable questionnaires from 236 patients (5.6 years [± standard deviation, 2.854 years] on average after hemorrhage) were included in the analysis. Quality of life was documented using the MOS-36 item short form health survey. Because of the ordinal nature of the variables, Kendall tau was used for calculation. Significance was established as P ≤ 0.05. Weak and very weak correlations were found in general (r ≤ 0.28). The strongest correlations were found between the Glasgow coma score and quality of life (r = 0.236, P = 0.0001). In particular, the "best verbal response" achieved the strongest correlations in the comparison, at r = 0.28/P = 0.0001. The Fisher score showed very weak correlations (r = -0.148/P = 0.012). The Brussels coma score (r = -0.216/P = 0.0001), Hunt and Hess scale (r = -0.197/P = 0.0001), and the World Federation of Neurosurgeons score (r = -0.185/P = 0.0001) revealed stronger correlations, especially in terms of the physical aspects of quality of life. The Glasgow coma scale revealed the strongest, and the Fisher score showed the weakest correlations. Thus the Fisher score, as an indicator of the severity of a hemorrhage, has little significance in terms of health-related quality of life. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Normative scores on the Berg Balance Scale decline after age 70 years in healthy community-dwelling people: a systematic review.

    Science.gov (United States)

    Downs, Stephen; Marquez, Jodie; Chiarelli, Pauline

    2014-06-01

    What is the mean Berg Balance Scale score of healthy elderly people living in the community and how does it vary with age? How much variability in Berg Balance Scale scores is present in groups of healthy elderly people and how does this vary with age? Systematic review with meta-analysis. Any group of healthy community-dwelling people with a mean age of 70 years or greater that has undergone assessment using the Berg Balance Scale. Mean and standard deviations of Berg Balance Scale scores within cohorts of elderly people of known mean age. The search yielded 17 relevant studies contributing data from a total of 1363 participants. The mean Berg Balance Scale scores ranged from 37 to 55 out of a possible maximum score of 56. The standard deviation of Berg Balance Scale scores varied from 1.0 to 9.2. Although participants aged around 70 years had very close to normal Berg Balance Scale scores, there was a significant decline in balance with age at a rate of 0.7 points on the 56-point Berg Balance Scale per year. There was also a strong association between increasing age and increasing variability in balance (R² = 0.56). Healthy community-dwelling people aged around 70 years show few balance deficits, as measured by the Berg Balance Scale, although balance scores deteriorate and become more variable with age. Copyright © 2014. Published by Elsevier B.V.

  16. Association of modified NUTRIC score with 28-day mortality in critically ill patients.

    Science.gov (United States)

    Mukhopadhyay, Amartya; Henry, Jeyakumar; Ong, Venetia; Leong, Claudia Shu-Fen; Teh, Ai Ling; van Dam, Rob M; Kowitlawakul, Yanika

    2017-08-01

    For patients in the intensive care unit (ICU), nutritional risk assessment is often difficult. Traditional scoring systems cannot be used for patients who are sedated or unconscious since they are unable to provide information on their history of food intake and weight loss. We aim to validate the NUTRIC (NUTrition RIsk in Critically ill) score, an ICU-specific nutrition risk assessment tool, in Asian patients. This was an observational study in the medical ICU of a university-affiliated tertiary hospital. We included all adult patients (≥18 years) admitted between October 2013 and September 2014 who stayed for more than 24 hours in the ICU. Components of the modified NUTRIC (mNUTRIC) score, demographic details, body mass index (BMI), use of mechanical ventilation (MV), vasopressor drugs, and renal replacement therapy (RRT) were obtained from the ICU database. For patients on MV (maximum 12 days), we calculated the energy intake and nutritional adequacy (energy received ÷ energy recommended) from enteral or parenteral feeding data. Multivariable logistic regression analysis was used with 28-day mortality as the outcome of interest. 401 patients (62% male, mean age 60.0 ± 16.3 years, mean BMI 23.9 ± 6.2 kg/m²) were included. In the univariate analysis, BMI, mNUTRIC score, MV, vasopressor drug, and RRT were associated with 28-day mortality. In the multivariable logistic regression analysis, the mNUTRIC score (odds ratio, OR 1.48; confidence interval, CI 1.25-1.74) remained independently associated with 28-day mortality. Nutritional adequacy was assessed in a subgroup of 273 (68%) patients who received MV for at least 48 hours. Median (IQR) nutritional adequacy was 0.44 (0.15-0.70). In patients with a high mNUTRIC score (5-9), higher nutritional adequacy was associated with a lower predicted 28-day mortality; this was not observed in patients with a low mNUTRIC (0-4) score (effect modification by mNUTRIC score). Increasing nutritional adequacy may therefore reduce the 28-day mortality in patients with a high mNUTRIC score. Copyright © 2016

  17. Walk Score® and Transit Score® and Walking in the Multi-Ethnic Study of Atherosclerosis

    Science.gov (United States)

    Hirsch, Jana A.; Moore, Kari A.; Evenson, Kelly R.; Rodriguez, Daniel A; Diez Roux, Ana V.

    2013-01-01

    Background: Walk Score® and Transit Score® are open-source measures of the neighborhood built environment to support walking (“walkability”) and access to transportation. Purpose: To investigate associations of Street Smart Walk Score and Transit Score with self-reported transport and leisure walking using data from a large multi-city and diverse population-based sample of adults. Methods: Data from a sample of 4552 residents of Baltimore MD; Chicago IL; Forsyth County NC; Los Angeles CA; New York NY; and St. Paul MN from the Multi-Ethnic Study of Atherosclerosis (2010–2012) were linked to Walk Score and Transit Score (collected in 2012). Logistic and linear regression models estimated ORs of not walking and mean differences in minutes walked, respectively, associated with continuous and categoric Walk Score and Transit Score. All analyses were conducted in 2012. Results: After adjustment for site, key sociodemographic, and health variables, a higher Walk Score was associated with lower odds of not walking for transport and more minutes/week of transport walking. Compared to those in a “walker’s paradise,” lower categories of Walk Score were associated with a linear increase in odds of not transport walking and a decline in minutes of leisure walking. An increase in Transit Score was associated with lower odds of not transport walking or leisure walking, and additional minutes/week of leisure walking. Conclusions: Walk Score and Transit Score appear to be useful as measures of walkability in analyses of neighborhood effects. PMID:23867022

  18. A split hand-split foot (SHFM3) gene is located at 10q24→25

    Energy Technology Data Exchange (ETDEWEB)

    Gurrieri, F.; Genuardi, M.; Nanni, L.; Sangiorgi, E.; Garofalo, G. [Catholic Univ. of Rome (Italy)] [and others]

    1996-04-24

    The split hand-split foot (SHSF) malformation affects the central rays of the upper and lower limbs. It presents either as an isolated defect or in association with other skeletal or non-skeletal abnormalities. An autosomal SHSF locus (SHFM1) was previously mapped to 7q22.1. We report the mapping of a second autosomal SHSF locus to the 10q24→25 region. Maximum lod scores of 3.73, 4.33 and 4.33 at a recombination fraction of zero were obtained for the loci D10S198, PAX2 and D10S1239, respectively. A 19 cM critical region could be defined by haplotype analysis, and several genes with a potential role in limb morphogenesis are located in this region. Heterogeneity testing indicates the existence of at least one additional autosomal SHSF locus. 36 refs., 3 figs., 3 tabs.
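
    For readers meeting the notation in these records for the first time, the lod score at a recombination fraction θ is the parametric linkage statistic

        Z(\theta) = \log_{10} \frac{L(\theta)}{L(\theta = 1/2)}

    and the maximum lod score is the maximum of Z(θ) over 0 ≤ θ ≤ 1/2. A value of 3.73 at θ = 0, as reported here, therefore means the pedigree data are roughly 10^3.73 (about 5,400) times more likely under complete linkage than under free recombination.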

  19. Genetic linkage of mild pseudoachondroplasia (PSACH) to markers in the pericentromeric region of chromosome 19

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.D.; Rasmussen, M.; Garber, P.; Rimoin, D.L.; Cohn, D.H. (Steven Spielberg Pediatric Research Center, Los Angeles, CA (United States)); Weber, J.L. (Marshfield Medical Research Foundation, WI (United States)); Yuen, J.; Reinker, K. (Univ. of Hawaii, Honolulu, HI (United States))

    1993-12-01

    Pseudoachondroplasia (PSACH) is a dominantly inherited form of short-limb dwarfism characterized by dysplastic changes in the spine, epiphyses, and metaphyses and early onset osteoarthropathy. Chondrocytes from affected individuals accumulate an unusual appearing material in the rough endoplasmic reticulum, which has led to the hypothesis that a structural abnormality in a cartilage-specific protein produces the phenotype. The authors recently identified a large family with a mild form of pseudoachondroplasia. By genetic linkage to a dinucleotide repeat polymorphic marker (D19S199), they have localized the disease gene to chromosome 19 (maximum lod score of 7.09 at a recombination fraction of 0.03). Analysis of additional markers and recombinations between the linked markers and the phenotype suggests that the disease gene resides within a 6.3-cM interval in the immediate pericentromeric region of the chromosome. 39 refs., 2 figs., 1 tab.

  20. 78 FR 9845 - Minimum and Ordinary Maximum and Aggravated Maximum Civil Monetary Penalties for a Violation of...

    Science.gov (United States)

    2013-02-12

    ... maximum penalty amount of $75,000 for each violation, except that if the violation results in death... the maximum civil penalty for a violation is $175,000 if the violation results in death, serious... Penalties for a Violation of the Hazardous Materials Transportation Laws or Regulations, Orders, Special...

  1. Predicting occupational personality test scores.

    Science.gov (United States)

    Furnham, A; Drakeley, R

    2000-01-01

    The relationship between students' actual test scores and their self-estimated scores on the Hogan Personality Inventory (HPI; R. Hogan & J. Hogan, 1992), an omnibus personality questionnaire, was examined. Despite being given descriptive statistics and explanations of each of the dimensions measured, the students tended to overestimate their scores; yet all correlations between actual and estimated scores were positive and significant. Correlations between self-estimates and actual test scores were highest for sociability, ambition, and adjustment (r = .62 to r = .67). The results are discussed in terms of employers' use and abuse of personality assessment for job recruitment.

  2. Performance analysis and comparison of an Atkinson cycle coupled to variable temperature heat reservoirs under maximum power and maximum power density conditions

    International Nuclear Information System (INIS)

    Wang, P.-Y.; Hou, S.-S.

    2005-01-01

    In this paper, performance analysis and comparison based on the maximum power and maximum power density conditions have been conducted for an Atkinson cycle coupled to variable temperature heat reservoirs. The Atkinson cycle is internally reversible but externally irreversible, since there is external irreversibility of heat transfer during the processes of constant volume heat addition and constant pressure heat rejection. The study is based purely on classical thermodynamic analysis, and all results and conclusions follow from that methodology. The power density, defined as the ratio of power output to maximum specific volume in the cycle, is taken as the optimization objective because it accounts for engine size, which is related to investment cost. The results show that an engine design based on maximum power density with constant effectiveness of the hot- and cold-side heat exchangers, or with a constant inlet temperature ratio of the heat reservoirs, will have a smaller size but higher efficiency, compression ratio, expansion ratio and maximum temperature than one based on maximum power. From the viewpoints of engine size and thermal efficiency, a design based on maximum power density is therefore preferable; however, because of the higher compression ratio and maximum cycle temperature, it requires tougher materials for engine construction than a design based on maximum power conditions.
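
    For reference, the power density used as the objective in this record is simply the cycle power output P normalized by the largest specific volume reached in the cycle,

        p_{\mathrm{d}} = \frac{P}{v_{\max}}

    so maximizing p_d trades a little power for a more compact engine, which is why the maximum-power-density design comes out smaller (and, per the results above, more efficient) than the maximum-power design.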

  3. Prediction of true test scores from observed item scores and ancillary data.

    Science.gov (United States)

    Haberman, Shelby J; Yao, Lili; Sinharay, Sandip

    2015-05-01

    In many educational tests which involve constructed responses, a traditional test score is obtained by adding together item scores obtained through holistic scoring by trained human raters. For example, this practice was used until 2008 in the case of GRE® General Analytical Writing and until 2009 in the case of TOEFL® iBT Writing. With use of natural language processing, it is possible to obtain additional information concerning item responses from computer programs such as e-rater®. In addition, available information relevant to examinee performance may include scores on related tests. We suggest application of standard results from classical test theory to the available data to obtain best linear predictors of true traditional test scores. In performing such analysis, we require estimation of variances and covariances of measurement errors, a task which can be quite difficult in the case of tests with limited numbers of items and with multiple measurements per item. As a consequence, a new estimation method is suggested based on samples of examinees who have taken an assessment more than once. Such samples are typically not random samples of the general population of examinees, so that we apply statistical adjustment methods to obtain the needed estimated variances and covariances of measurement errors. To examine practical implications of the suggested methods of analysis, applications are made to GRE General Analytical Writing and TOEFL iBT Writing. Results obtained indicate that substantial improvements are possible both in terms of reliability of scoring and in terms of assessment reliability. © 2015 The British Psychological Society.
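
    In its simplest single-score form, the classical-test-theory predictor behind this approach is Kelley's estimate, which shrinks an observed score X toward the group mean in proportion to the test reliability:

        \hat{T} = \rho_{XX'}\, X + (1 - \rho_{XX'})\, \bar{X}

    The best linear predictors described in the record generalize this idea to several observed quantities (human item scores, e-rater scores, related tests), which is why the error variances and covariances must be estimated, here from examinees who have taken the assessment more than once.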

  4. Cardiovascular risk scores for coronary atherosclerosis.

    Science.gov (United States)

    Yalcin, Murat; Kardesoglu, Ejder; Aparci, Mustafa; Isilak, Zafer; Uz, Omer; Yiginer, Omer; Ozmen, Namik; Cingozbay, Bekir Yilmaz; Uzun, Mehmet; Cebeci, Bekir Sitki

    2012-10-01

    The objective of this study was to compare frequently used cardiovascular risk scores in predicting the presence of coronary artery disease (CAD) and 3-vessel disease. In 350 consecutive patients (218 men and 132 women) who underwent coronary angiography, the cardiovascular risk level was determined using the Framingham Risk Score (FRS), the Modified Framingham Risk Score (MFRS), the Prospective Cardiovascular Münster (PROCAM) score, and the Systematic Coronary Risk Evaluation (SCORE). The area under the curve for receiver operating characteristic curves showed that FRS had more predictive value than the other scores for CAD (area under curve, 0.76). The cardiovascular risk scores (FRS, MFRS, PROCAM, and SCORE) may predict the presence and severity of coronary atherosclerosis; the FRS had the best predictive value among them.

  5. Forecasting the value of credit scoring

    Science.gov (United States)

    Saad, Shakila; Ahmad, Noryati; Jaffar, Maheran Mohd

    2017-08-01

    Nowadays, credit scoring systems play an important role in the banking sector. The process is important in assessing the creditworthiness of customers requesting credit from banks or other financial institutions. Usually, credit scoring is applied when customers submit applications for credit facilities. Based on the score from credit scoring, the bank is able to segregate the "good" clients from the "bad" clients. However, in most cases the score is useful only at that specific time and cannot be used to forecast the creditworthiness of the same applicant afterwards. Hence, the bank does not know whether "good" clients will always remain good or whether "bad" clients may become "good" clients after a certain time. To fill this gap, this study proposes an equation to forecast the credit score of a potential borrower at a certain time by using the historical scores under the model's assumptions. The Mean Absolute Percentage Error (MAPE) is used to measure the accuracy of the forecast scores. Results show that the forecast scores are highly accurate compared with the actual credit scores.
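
    The accuracy measure used above, MAPE, is a simple average of relative errors; a minimal sketch with made-up actual and forecast scores:

        # Mean Absolute Percentage Error between actual and forecast credit scores.
        def mape(actual, forecast):
            assert len(actual) == len(forecast) and all(a != 0 for a in actual)
            return 100.0 * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / len(actual)

        # Example with hypothetical scores: an average relative error of a few percent.
        print(round(mape([620, 700, 560], [600, 710, 590]), 1))  # 3.3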

  6. A novel syndrome of autosomal-dominant hyperinsulinemic hypoglycemia linked to a mutation in the human insulin receptor gene

    DEFF Research Database (Denmark)

    Højlund, Kurt; Hansen, Torben; Lajer, Maria

    2004-01-01

    a missense mutation (Arg1174Gln) in the tyrosine kinase domain of the insulin receptor gene that cosegregated with the disease phenotype (logarithm of odds [LOD] score 3.21). In conclusion, we report a novel syndrome of autosomal-dominant hyperinsulinemic hypoglycemia. The findings demonstrate...

  7. LCLS Maximum Credible Beam Power

    International Nuclear Information System (INIS)

    Clendenin, J.

    2005-01-01

    The maximum credible beam power is defined as the highest credible average beam power that the accelerator can deliver to the point in question, given the laws of physics, the beam line design, and assuming all protection devices have failed. For a new accelerator project, the official maximum credible beam power is determined by project staff in consultation with the Radiation Physics Department, after examining the arguments and evidence presented by the appropriate accelerator physicist(s) and beam line engineers. The definitive parameter becomes part of the project's safety envelope. This technical note will first review the studies that were done for the Gun Test Facility (GTF) at SSRL, where a photoinjector similar to the one proposed for the LCLS is being tested. In Section 3 the maximum charge out of the gun for a single rf pulse is calculated. In Section 4, PARMELA simulations are used to track the beam from the gun to the end of the photoinjector. Finally in Section 5 the beam through the matching section and injected into Linac-1 is discussed

  8. Assessment of average of normals (AON) procedure for outlier-free datasets including qualitative values below limit of detection (LoD): an application within tumor markers such as CA 15-3, CA 125, and CA 19-9.

    Science.gov (United States)

    Usta, Murat; Aral, Hale; Mete Çilingirtürk, Ahmet; Kural, Alev; Topaç, Ibrahim; Semerci, Tuna; Hicri Köseoğlu, Mehmet

    2016-11-01

    Average of normals (AON) is a quality control procedure that is sensitive only to systematic errors that can occur in an analytical process in which patient test results are used. The aim of this study was to develop an alternative model in order to apply the AON quality control procedure to datasets that include qualitative values below limit of detection (LoD). The reported patient test results for tumor markers, such as CA 15-3, CA 125, and CA 19-9, analyzed by two instruments, were retrieved from the information system over a period of 5 months, using the calibrator and control materials with the same lot numbers. The median as a measure of central tendency and the median absolute deviation (MAD) as a measure of dispersion were used for the complementary model of AON quality control procedure. The u bias values, which were determined for the bias component of the measurement uncertainty, were partially linked to the percentages of the daily median values of the test results that fall within the control limits. The results for these tumor markers, in which lower limits of reference intervals are not medically important for clinical diagnosis and management, showed that the AON quality control procedure, using the MAD around the median, can be applied for datasets including qualitative values below LoD.
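
    A minimal sketch of the median/MAD flavour of AON described here; the ±3 MAD flag rule and the example values are illustrative assumptions, not the paper's exact control limits.

        import statistics

        def mad(values):
            """Median absolute deviation around the median."""
            med = statistics.median(values)
            return statistics.median([abs(v - med) for v in values])

        def aon_flag(daily_results, baseline_results, k=3.0):
            """Flag a possible systematic shift when the daily median of patient results
            falls outside baseline_median +/- k * baseline_MAD (illustrative rule only)."""
            center = statistics.median(baseline_results)
            spread = mad(baseline_results)
            daily_median = statistics.median(daily_results)
            return abs(daily_median - center) > k * spread

        # Example with made-up CA 15-3 values (U/mL).
        baseline = [12, 15, 14, 18, 11, 16, 13, 17, 14, 15]
        today = [19, 21, 18, 22, 20, 23, 19]
        print(aon_flag(today, baseline))  # True: today's median sits well above baseline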

  9. Ocular manifestation in Marfan syndrome: corneal biomechanical properties relate to increased systemic score points.

    Science.gov (United States)

    Scheibenberger, Dido; Frings, Andreas; Steinberg, Johannes; Schüler, Helke; Druchkiv, Vasyl; Katz, Toam; von Kodolitsch, Yskert; Linke, Stephan

    2018-06-01

    To evaluate corneal deformation to an air puff as a new noninvasive tool to document disease status in Marfan syndrome (MFS). METHODS: Prospective observational cohort study. We included patients diagnosed with MFS who had their routine cardiovascular follow-up and applied the revised Ghent nosology to define two subgroups according to a high (≥ 7) and a low (< 7 points) systemic score. Dynamic Scheimpflug-based biomechanical analyses (CorvisST® [CST; Oculus GmbH]) were performed. The main outcome measure was the displacement of the corneal apex as given by the parameters highest concavity (HC; in ms), peak distance (PD; in mm), and highest concavity deformation amplitude (DA; mm). Forty-three eyes of 43 individuals (19 female, 24 male; mean age 42.0 ± 12.0 years, range 18-67 years) diagnosed with MFS were included. Applying the Ghent criteria, 21 patients had an advanced systemic score of ≥ 7, and 22 had score points < 7. There were no differences in age or sex between the two groups. In contrast, HC was faster (P = 0.004) and PD was longer (P < 0.001) in the individuals with a systemic score ≥ 7; maximum DA did not differ significantly between the groups (P = 0.250). In vivo biomechanical analysis with CST offers a new, noninvasive method to identify pathologic corneal deformation responses in adults with MFS. In the future, corneal deformation to an air puff could thus assist early identification of patients with a high Ghent score as an adjunct to existing diagnostic tests.

  10. Development of the Crohn's disease digestive damage score, the Lémann score

    DEFF Research Database (Denmark)

    Pariente, Benjamin; Cosnes, Jacques; Danese, Silvio

    2011-01-01

    is to outline the methods to develop an instrument that can measure cumulative bowel damage. The project is being conducted by the International Program to develop New Indexes in Crohn's disease (IPNIC) group. This instrument, called the Crohn's Disease Digestive Damage Score (the Lémann score), should take...

  11. Plasma metabolite score correlates with Hypoxia time in a newly born piglet model for asphyxia

    Directory of Open Access Journals (Sweden)

    Julia Kuligowski

    2017-08-01

    Full Text Available Hypoxic-ischemic encephalopathy (HIE) secondary to perinatal asphyxia is a leading cause of mortality and acquired long-term neurologic co-morbidities in the neonate. The most successful intervention for the treatment of moderate to severe HIE is moderate whole body hypothermia initiated within 6 h from birth. The objective and prompt identification of infants who are at risk of developing moderate to severe HIE in the critical first hours still remains a challenge. This work proposes a metabolite score calculated based on the relative intensities of three metabolites (choline, 6,8-dihydroxypurine and hypoxanthine) that showed maximum correlation with hypoxia time in a consolidated piglet model for neonatal hypoxia-ischemia. The metabolite score's performance as a biomarker for perinatal hypoxia and its usefulness for clinical grading and decision making have been assessed and compared to the performance of lactate, which is currently considered the gold standard. For plasma samples withdrawn before and directly after a hypoxic insult, the metabolite score performed similar to lactate. However, it provided an enhanced predictive capacity at 2 h after resuscitation. The present study evidences the usefulness of the metabolite score for improving the early assessment of the severity of the hypoxic insult based on serial determinations in a minimally invasive biofluid. The applicability of the metabolite score for clinical diagnosis and patient stratification for hypothermia treatment has to be confirmed in multicenter trials involving newborns suffering from HIE. Keywords: Hypoxia, Perinatal asphyxia, Newborn, Metabolic biomarker, Neonatal piglet model, Liquid Chromatography – Time-of-Flight Mass Spectrometry (LC-TOF-MS)

  12. Risk score predicts high-grade prostate cancer in DNA-methylation positive, histopathologically negative biopsies.

    Science.gov (United States)

    Van Neste, Leander; Partin, Alan W; Stewart, Grant D; Epstein, Jonathan I; Harrison, David J; Van Criekinge, Wim

    2016-09-01

    to a NPV of 96% for high-grade cancer. The risk score, comprising DNA-methylation intensity and traditional clinical risk factors, improved the identification of men with high-grade cancer, with a maximum avoidance of unnecessary repeat biopsies. This risk score resulted in better patient risk stratification and significantly outperformed current risk prediction models such as PCPTRC and PSA. The risk score could help to identify patients with histopathologically negative biopsies harboring high-grade PCa. Prostate 76:1078-1087, 2016. © 2016 The Authors. The Prostate Published by Wiley Periodicals, Inc.

  13. Multiple Score Comparison: a network meta-analysis approach to comparison and external validation of prognostic scores

    Directory of Open Access Journals (Sweden)

    Sarah R. Haile

    2017-12-01

    Full Text Available Abstract Background Prediction models and prognostic scores have been increasingly popular in both clinical practice and clinical research settings, for example to aid in risk-based decision making or control for confounding. In many medical fields, a large number of prognostic scores are available, but practitioners may find it difficult to choose between them due to lack of external validation as well as lack of comparisons between them. Methods Borrowing methodology from network meta-analysis, we describe an approach to Multiple Score Comparison meta-analysis (MSC) which permits concurrent external validation and comparisons of prognostic scores using individual patient data (IPD) arising from a large-scale international collaboration. We describe the challenges in adapting network meta-analysis to the MSC setting, for instance the need to explicitly include correlations between the scores on a cohort level, and how to deal with many multi-score studies. We propose first using IPD to make cohort-level aggregate discrimination or calibration scores, comparing all to a common comparator. Then, standard network meta-analysis techniques can be applied, taking care to consider correlation structures in cohorts with multiple scores. Transitivity, consistency and heterogeneity are also examined. Results We provide a clinical application, comparing prognostic scores for 3-year mortality in patients with chronic obstructive pulmonary disease using data from a large-scale collaborative initiative. We focus on the discriminative properties of the prognostic scores. Our results show clear differences in performance, with ADO and eBODE showing higher discrimination with respect to mortality than other considered scores. The assumptions of transitivity and local and global consistency were not violated. Heterogeneity was small. Conclusions We applied a network meta-analytic methodology to externally validate and concurrently compare the prognostic properties

  14. 13 CFR 107.840 - Maximum term of Financing.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Maximum term of Financing. 107.840... COMPANIES Financing of Small Businesses by Licensees Structuring Licensee's Financing of An Eligible Small Business: Terms and Conditions of Financing § 107.840 Maximum term of Financing. The maximum term of any...

  15. Do Test Scores Buy Happiness?

    Science.gov (United States)

    McCluskey, Neal

    2017-01-01

    Since at least the enactment of No Child Left Behind in 2002, standardized test scores have served as the primary measures of public school effectiveness. Yet, such scores fail to measure the ultimate goal of education: maximizing happiness. This exploratory analysis assesses nation level associations between test scores and happiness, controlling…

  16. 20 CFR 226.52 - Total annuity subject to maximum.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Total annuity subject to maximum. 226.52... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Railroad Retirement Family Maximum § 226.52 Total annuity subject to maximum. The total annuity amount which is compared to the maximum monthly amount to...

  17. Effect of telescopic distal extension removable partial dentures on oral health related quality of life and maximum bite force: A preliminary cross over study.

    Science.gov (United States)

    Elsyad, Moustafa Abdou; Mostafa, Aisha Zakaria

    2018-01-01

    This cross over study aimed to evaluate the effect of telescopic distal extension removable partial dentures on oral health related quality of life and maximum bite force. MATERIALS AND METHODS: Twenty patients with complete maxillary edentulism and partially edentulous mandibles with anterior teeth only remaining were selected for this cross over study. All patients received complete maxillary dentures and mandibular partial removable dental prosthesis (PRDP, control). After 3 months of adaptation, PRDP was replaced with conventional telescopic partial dentures (TPD) or telescopic partial dentures with cantilevered extensions (TCPD) in a quasi-random method. Oral health related quality of life (OHRQoL) was measured using the OHIP-14 questionnaire and maximum bite force (MBF) was measured using a bite force transducer. Measurements were performed 3 months after using each of the following prostheses: PRDP, TPD, and TCPD. TCPD showed the lowest OHIP-14 scores (i.e., the highest patient satisfaction with their OHRQoL), followed by TPD, and PRDP showed the highest OHIP-14 scores (i.e., the lowest patient satisfaction with OHRQoL). TCPD showed the highest MBF (70.7 ± 3.71), followed by TPD (57.4 ± 3.43), and the lowest MBF (40.2 ± 2.20) was noted with PRDP. Within the limitations of this study, mandibular telescopic distal extension removable partial dentures with cantilevered extensions were associated with improved oral health related quality of life and maximum bite force compared to telescopic or conventional PRDP. Telescopic distal extension removable prostheses are an esthetic restoration for partially edentulous patients with a free-end saddle. This article describes the addition of cantilevered extensions to this prosthesis. The results showed that telescopic distal extension removable prostheses with cantilevered extensions were associated with improved oral health related quality of life and maximum bite force compared to telescopic or conventional RPDs.

  18. Oswestry Disability Index scoring made easy.

    Science.gov (United States)

    Mehra, A; Baker, D; Disney, S; Pynsent, P B

    2008-09-01

    Low back pain affects up to 80% of the population at some time during their active life. Questionnaires are available to help measure pain and disability. The Oswestry Disability Index (ODI) is the most commonly used outcome measure for low back pain. The aim of this study was to see if training in completing the ODI forms improved the scoring accuracy. The last 100 ODI forms completed in a hospital's spinal clinic were reviewed retrospectively and errors in the scoring were identified. Staff members involved in scoring the questionnaire were made aware of the errors and the correct method of scoring was explained. A chart was created with all possible scores to aid the staff with scoring. A prospective audit of 50 questionnaires was subsequently performed. The retrospective study showed that 33 of the 100 forms had been incorrectly scored. All questionnaires where one or more sections were not completed by the patient were incorrectly scored. A scoring chart was developed and staff training was implemented. This reduced the error rate to 14% in the prospective audit. Clinicians applying outcome measures should read the appropriate literature to ensure they understand the scoring system. Staff must then be given adequate training in the application of the questionnaires.
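
    The error the audit targets comes from the denominator changing when sections are skipped. A hedged sketch of the usual ODI percentage calculation (each of the 10 sections scored 0-5, unanswered sections excluded; the example values are hypothetical):

        def odi_percent(section_scores):
            """section_scores: list of up to 10 values, each 0-5, with None for skipped sections."""
            answered = [s for s in section_scores if s is not None]
            if not answered:
                raise ValueError("no sections completed")
            return 100.0 * sum(answered) / (5 * len(answered))

        # Example: one section skipped, so the denominator is 45, not 50.
        print(round(odi_percent([3, 2, 4, None, 1, 2, 3, 2, 1, 2]), 1))  # 44.4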

  19. Sway Area and Velocity Correlated With MobileMat Balance Error Scoring System (BESS) Scores.

    Science.gov (United States)

    Caccese, Jaclyn B; Buckley, Thomas A; Kaminski, Thomas W

    2016-08-01

    The Balance Error Scoring System (BESS) is often used for sport-related concussion balance assessment. However, moderate intratester and intertester reliability may cause low initial sensitivity, suggesting that a more objective balance assessment method is needed. The MobileMat BESS was designed for objective BESS scoring, but the outcome measures must be validated with reliable balance measures. Thus, the purpose of this investigation was to compare MobileMat BESS scores to linear and nonlinear measures of balance. Eighty-eight healthy collegiate student-athletes (age: 20.0 ± 1.4 y, height: 177.7 ± 10.7 cm, mass: 74.8 ± 13.7 kg) completed the MobileMat BESS. MobileMat BESS scores were compared with 95% area, sway velocity, approximate entropy, and sample entropy. MobileMat BESS scores were significantly correlated with 95% area for single-leg (r = .332) and tandem firm (r = .474), and double-leg foam (r = .660); and with sway velocity for single-leg (r = .406) and tandem firm (r = .601), and double-leg (r = .575) and single-leg foam (r = .434). MobileMat BESS scores were not correlated with approximate or sample entropy. MobileMat BESS scores were low to moderately correlated with linear measures, suggesting the ability to identify changes in the center of mass-center of pressure relationship, but not higher-order processing associated with nonlinear measures. These results suggest that the MobileMat BESS may be a clinically-useful tool that provides objective linear balance measures.

  20. [Effect of maximum blood pressure fluctuation on prognosis of patients with acute ischemic stroke within 24 hours after hospital admission].

    Science.gov (United States)

    Wang, H; Tang, Y; Zhang, Y; Xu, K; Zhao, J B

    2018-05-10

    Objective: To investigate the relationship between the maximum blood pressure fluctuation within 24 hours after admission and the prognosis at discharge. Methods: Patients with ischemic stroke admitted to the Department of Neurology of the First Affiliated Hospital of Harbin Medical University within 24 hours after onset were consecutively selected from April 2016 to March 2017. The patients were grouped according to the diagnostic criteria of hypertension. Ambulatory blood pressure within 24 hours after admission was measured with bedside monitors and baseline data were collected. The patients were scored by NIHSS at discharge. The relationships between the maximum values of systolic blood pressure (SBP) or diastolic blood pressure (DBP) and the prognosis at discharge were analyzed. Results: A total of 521 patients with acute ischemic stroke were enrolled. They were divided into a normal blood pressure group (82 cases) and a hypertension group (439 cases). In the normal blood pressure group, the maximum values of SBP and DBP were normally distributed (P > 0.05). The maximum value of SBP fluctuation was set at 146.6 mmHg. After adjustment for potential confounders, the OR for poor prognosis at discharge in patients with SBP fluctuation ≥146.6 mmHg was 2.669 (95% CI: 0.594-11.992) compared with those with SBP fluctuation below 146.6 mmHg. In acute ischemic stroke patients with normal blood pressure at admission, the maximum values of SBP and DBP within 24 hours after admission had no relationship with prognosis at discharge. In acute ischemic stroke patients with hypertension at admission, the maximum values of SBP and DBP within 24 hours after admission were associated with poor prognosis at discharge.

  1. How to score questionnaires

    NARCIS (Netherlands)

    Hofstee, W.K.B.; Ten Berge, J.M.F.; Hendriks, A.A.J.

    The standard practice in scoring questionnaires consists of adding item scores and standardizing these sums. We present a set of alternative procedures, consisting of (a) correcting for the acquiescence variance that disturbs the structure of the questionnaire; (b) establishing item weights through

  2. Scoring haemophilic arthropathy on X-rays: improving inter- and intra-observer reliability and agreement using a consensus atlas

    Energy Technology Data Exchange (ETDEWEB)

    Foppen, Wouter; Schaaf, Irene C. van der; Beek, Frederik J.A. [University Medical Center Utrecht, Department of Radiology (Netherlands); Verkooijen, Helena M. [University Medical Center Utrecht, Department of Radiology (Netherlands); University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); Fischer, Kathelijn [University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); University Medical Center Utrecht, Van Creveldkliniek, Department of Hematology, Utrecht (Netherlands)

    2016-06-15

    The radiological Pettersson score (PS) is widely applied for classification of arthropathy to evaluate costly haemophilia treatment. This study aims to assess and improve inter- and intra-observer reliability and agreement of the PS. Two series of X-rays (bilateral elbows, knees, and ankles) of 10 haemophilia patients (120 joints) with haemophilic arthropathy were scored by three observers according to the PS (maximum score 13/joint). Subsequently, (dis-)agreement in scoring was discussed until consensus. Example images were collected in an atlas. Thereafter, a second series of 120 joints was scored using the atlas. One observer rescored the second series after three months. Reliability was assessed by intraclass correlation coefficients (ICC), agreement by limits of agreement (LoA). The median Pettersson score at joint level (PS_joint) of affected joints was 6 (interquartile range 3-9). Using the consensus atlas, inter-observer reliability of the PS_joint improved significantly from 0.94 (95% confidence interval (CI) 0.91-0.96) to 0.97 (CI 0.96-0.98). LoA improved from ±1.7 to ±1.1 for the PS_joint. Therefore, true differences in arthropathy were differences in the PS_joint of >2 points. Intra-observer reliability of the PS_joint was 0.98 (CI 0.97-0.98), and intra-observer LoA were ±0.9 points. Reliability and agreement of the PS improved by using a consensus atlas. (orig.)

  3. Health Disparities Score Composite of Youth and Parent Dyads from an Obesity Prevention Intervention: iCook 4-H

    Directory of Open Access Journals (Sweden)

    Melissa D. Olfert

    2018-05-01

    Full Text Available iCook 4-H is a lifestyle intervention to improve diet, physical activity and mealtime behavior. Control and treatment dyads (adult primary meal preparer and a 9–10-year-old youth) completed surveys at baseline and 4, 12, and 24 months. A Health Disparity (HD) score composite was developed utilizing a series of 12 questions (maximum score = 12), with a higher score indicating a more severe health disparity. Questions came from the USDA short form U.S. Household Food Security Survey (5), participation in food assistance programs (1), food behavior (2), level of adult education completed (1), marital status (1), and race (1 adult and 1 child). There were 228 dyads (control n = 77; treatment n = 151) enrolled in the iCook 4-H study. Baseline HD scores were 3.00 ± 2.56 among control dyads and 2.97 ± 2.91 among treatment dyads, p = 0.6632. There was a significant decline in the HD score of the treatment group from baseline to 12 months (p = 0.0047) and baseline to 24 months (p = 0.0354). A treatment by 12-month time interaction was found (baseline mean 2.97 ± 2.91 vs. 12-month mean 1.78 ± 2.31; p = 0.0406). This study shows that behavioral change interventions for youth and adults can help improve factors that impact health equity; although, further research is needed to validate this HD score as a measure of health disparities across time.

  4. Maximum likely scale estimation

    DEFF Research Database (Denmark)

    Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo

    2005-01-01

    A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...

  5. Linkage between company scores and stock returns

    Directory of Open Access Journals (Sweden)

    Saban Celik

    2017-12-01

    Full Text Available Previous studies on company scores conducted at firm level generally concluded that there exists a positive relation between company scores and stock returns. Motivated by these studies, this study examines the relationship between company scores (Corporate Governance Score, Economic Score, Environmental Score, and Social Score) and stock returns, both in portfolio-level analysis and in firm-level cross-sectional regressions. In the portfolio-level analysis, stocks are sorted based on each company score and quintile portfolios are formed with different levels of company scores. Then, the existence and significance of differences in raw returns and risk-adjusted returns between the portfolios with the extreme company scores (portfolio 10 and portfolio 1) are tested. In addition, firm-level cross-sectional regression is performed to examine the significance of company score effects with control variables. While the portfolio-level analysis results indicate that there is no significant relation between company scores and stock returns, the firm-level analysis indicates that economic, environmental, and social scores have an effect on stock returns; however, the significance and direction of these effects change depending on the control variables included in the cross-sectional regression.

  6. Assignment of the Nance-Horan syndrome to the distal short arm of the X chromosome.

    Science.gov (United States)

    Zhu, D; Alcorn, D M; Antonarakis, S E; Levin, L S; Huang, P C; Mitchell, T N; Warren, A C; Maumenee, I H

    1990-11-01

    There are three types of X-linked cataracts recorded in Mendelian Inheritance in Man (McKusick 1988): congenital total, with posterior sutural opacities in heterozygotes; congenital, with microcornea or slight microphthalmia; and the cataract-dental syndrome or Nance-Horan (NH) syndrome. To identify a DNA marker close to the gene responsible for the NH syndrome, linkage analysis of 36 members of a three-generation pedigree including seven affected males and nine carrier females was performed using 31 DNA markers. A LOD score of 1.662 at theta = 0.16 was obtained with probe 782 from locus DXS85 on Xp22.2-p22.3. Negative LOD scores were found at six loci on the short arm (one distal to DXS85, five proximal), and six probes spanning the long arm were highly negative. These results make the assignment of the locus for NH to the distal end of the short arm of the X chromosome likely.

  7. Determining the sample size for co-dominant molecular marker-assisted linkage detection for a monogenic qualitative trait by controlling the type-I and type-II errors in a segregating F2 population.

    Science.gov (United States)

    Hühn, M; Piepho, H P

    2003-03-01

    Tests for linkage are usually performed using the lod score method. A critical question in linkage analyses is the choice of sample size. The appropriate sample size depends on the desired type-I error and power of the test. This paper investigates the exact type-I error and power of the lod score method in a segregating F(2) population with co-dominant markers and a qualitative monogenic dominant-recessive trait. For illustration, a disease-resistance trait is considered, where the susceptible allele is recessive. A procedure is suggested for finding the appropriate sample size. It is shown that recessive plants have about twice the information content of dominant plants, so the former should be preferred for linkage detection. In some cases the exact alpha-values for a given nominal alpha may be rather small due to the discrete nature of the sampling distribution in small samples. We show that a gain in power is possible by using exact methods.

  8. Possible linkage of non-syndromic cleft lip and palate to the MSX1 homeobox gene on chromosome 4p

    Energy Technology Data Exchange (ETDEWEB)

    Wang, S.; Walczak, C.; Erickson, R.P.

    1994-09-01

    The MSX1 (HOX7) gene has been shown recently to cause cleft palate in a mouse model deficient for its product. Several features of this mouse model make the human homolog of this gene an excellent candidate for non-syndromic cleft palate. We tested this hypothesis by linkage studies in two large multiplex human families using a microsatellite marker in the human MSX1 gene. A LOD score of 1.7 was obtained, maximizing at a recombination fraction of 0.09. Computer simulation power calculations using the program SIMLINK indicated that a LOD score this large is expected to occur only about 1 in 200 times by chance alone for a marker locus with comparable informativeness if unlinked to the disease gene. This suggestive finding is being followed up by attempts to recruit and study additional families and by DNA sequence analyses of the MSX1 gene in these families and in other cleft lip and/or cleft palate subjects; these further results will also be reported.

  9. The Bandim tuberculosis score

    DEFF Research Database (Denmark)

    Rudolf, Frauke; Joaquim, Luis Carlos; Vieira, Cesaltina

    2013-01-01

    Background: This study was carried out in Guinea-Bissau's capital Bissau among inpatients and outpatients attending for tuberculosis (TB) treatment within the study area of the Bandim Health Project, a Health and Demographic Surveillance Site. Our aim was to assess the variability between 2 physicians in performing the Bandim tuberculosis score (TBscore), a clinical severity score for pulmonary TB (PTB), and to compare it to the Karnofsky performance score (KPS). Method: From December 2008 to July 2009 we assessed the TBscore and the KPS of 100 PTB patients at inclusion in the TB cohort and...

  10. What Do Test Scores Really Mean? A Latent Class Analysis of Danish Test Score Performance

    DEFF Research Database (Denmark)

    Munk, Martin D.; McIntosh, James

    2014-01-01

    Latent class Poisson count models are used to analyze a sample of Danish test score results from a cohort of individuals born in 1954-55, tested in 1968, and followed until 2011. The procedure takes account of unobservable effects as well as excessive zeros in the data. We show that the test scores...... of intelligence explain a significant proportion of the variation in test scores. This adds to the complexity of interpreting test scores and suggests that school culture and possible incentive problems make it more difficult to understand what the tests measure....

  11. [Propensity score matching in SPSS].

    Science.gov (United States)

    Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli

    2015-11-01

    To implement propensity score matching in the PS Matching module of SPSS and to interpret the analysis results. The R software, the plug-in linking R with the corresponding version of SPSS, and the propensity score matching package were installed. A PS matching module was then added to the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest neighbor matching were achieved with the PS matching module, and the results of qualitative and quantitative statistical description and evaluation were presented in the form of matching graphs. Propensity score matching can be accomplished conveniently using SPSS software.
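
    For readers outside SPSS, a rough Python equivalent of the score-estimation plus nearest-neighbor matching step (scikit-learn based; the column names and the 1:1 greedy rule with replacement are illustrative assumptions, not the module's exact algorithm):

        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        def match_nearest(df, treatment_col, covariate_cols):
            """Estimate propensity scores by logistic regression and perform 1:1 greedy
            nearest-neighbor matching on the score (no caliper; illustration only)."""
            ps = LogisticRegression(max_iter=1000).fit(
                df[covariate_cols], df[treatment_col]
            ).predict_proba(df[covariate_cols])[:, 1]
            df = df.assign(pscore=ps)
            treated = df[df[treatment_col] == 1]
            control = df[df[treatment_col] == 0]
            nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
            _, idx = nn.kneighbors(treated[["pscore"]])
            matched_controls = control.iloc[idx.ravel()]
            return treated, matched_controls

        # Usage sketch with a hypothetical pandas DataFrame:
        # treated, controls = match_nearest(cohort_df, "treated", ["age", "sex", "bmi"])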

  12. [Prognostic scores for pulmonary embolism].

    Science.gov (United States)

    Junod, Alain

    2016-03-23

    Nine prognostic scores for pulmonary embolism (PE), based on retrospective and prospective studies published between 2000 and 2014, have been analyzed and compared. Most of them aim at identifying low-risk PE cases in order to validate their ambulatory care. Important differences exist between these scores in the outcomes considered: overall mortality, PE-specific mortality, other complications, and the sizes of the low-risk groups. The most popular score appears to be the PESI and its simplified version. Few good-quality studies have tested the applicability of these scores to PE outpatient care, although this approach is already becoming widespread in medical practice.

  13. Maximum gravitational redshift of white dwarfs

    International Nuclear Information System (INIS)

    Shapiro, S.L.; Teukolsky, S.A.

    1976-01-01

    The stability of uniformly rotating, cold white dwarfs is examined in the framework of the Parametrized Post-Newtonian (PPN) formalism of Will and Nordtvedt. The maximum central density and gravitational redshift of a white dwarf are determined as functions of five of the nine PPN parameters (γ, β, ζ2, ζ3, and ζ4), the total angular momentum J, and the composition of the star. General relativity predicts that the maximum redshift is 571 km s⁻¹ for nonrotating carbon and helium dwarfs, but is lower for stars composed of heavier nuclei. Uniform rotation can increase the maximum redshift to 647 km s⁻¹ for carbon stars (the neutronization limit) and to 893 km s⁻¹ for helium stars (the uniform rotation limit). The redshift distribution of a larger sample of white dwarfs may help determine the composition of their cores.

  14. IW-Scoring: an Integrative Weighted Scoring framework for annotating and prioritizing genetic variations in the noncoding genome.

    Science.gov (United States)

    Wang, Jun; Dayem Ullah, Abu Z; Chelala, Claude

    2018-01-30

    The vast majority of germline and somatic variations occur in the noncoding part of the genome, only a small fraction of which are believed to be functional. From the tens of thousands of noncoding variations detectable in each genome, identifying and prioritizing driver candidates with putative functional significance is challenging. To address this, we implemented IW-Scoring, a new Integrative Weighted Scoring model to annotate and prioritise functionally relevant noncoding variations. We evaluate 11 scoring methods, and apply an unsupervised spectral approach for subsequent selective integration into two linear weighted functional scoring schemas for known and novel variations. IW-Scoring produces stable high-quality performance as the best predictors for three independent data sets. We demonstrate the robustness of IW-Scoring in identifying recurrent functional mutations in the TERT promoter, as well as disease SNPs in proximity to consensus motifs and with gene regulatory effects. Using follicular lymphoma as a paradigmatic cancer model, we apply IW-Scoring to locate 11 recurrently mutated noncoding regions in 14 follicular lymphoma genomes, and validate 9 of these regions in an extension cohort, including the promoter and enhancer regions of PAX5. Overall, IW-Scoring demonstrates greater versatility in identifying trait- and disease-associated noncoding variants. Scores from IW-Scoring as well as other methods are freely available from http://www.snp-nexus.org/IW-Scoring/. © The Author(s) 2018. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Increasing the reliability of the fluid/crystallized difference score from the Kaufman Adolescent and Adult Intelligence Test with reliable component analysis.

    Science.gov (United States)

    Caruso, J C

    2001-06-01

    The unreliability of difference scores is a well-documented phenomenon in the social sciences and has led researchers and practitioners to interpret differences cautiously, if at all. In the case of the Kaufman Adult and Adolescent Intelligence Test (KAIT), the unreliability of the difference between the Fluid IQ and the Crystallized IQ is due to the high correlation between the two scales. The consequences of the lack of precision with which differences are identified are wide confidence intervals and low-powered significance tests (i.e., large differences are required to be declared statistically significant). Reliable component analysis (RCA) was performed on the subtests of the KAIT in order to address these problems. RCA is a new data reduction technique that results in uncorrelated component scores with maximum proportions of reliable variance. Results indicate that the scores defined by RCA have discriminant and convergent validity (with respect to the equally weighted scores) and that differences between the scores, derived from a single testing session, were more reliable than differences derived from equal weighting for each age group (11-14 years, 15-34 years, 35-85+ years). This reliability advantage results in narrower confidence intervals around difference scores and smaller differences required for statistical significance.
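
    The underlying difficulty is visible in the classical expression for the reliability of a difference between two scores X and Y with equal variances:

        \rho_{X-Y} = \frac{\tfrac{1}{2}\left(\rho_{XX'} + \rho_{YY'}\right) - \rho_{XY}}{1 - \rho_{XY}}

    As the correlation between the Fluid and Crystallized scales approaches their average reliability, the reliability of the difference collapses toward zero; RCA counters this by building uncorrelated components that retain maximal reliable variance.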

  16. Revealing the Maximum Strength in Nanotwinned Copper

    DEFF Research Database (Denmark)

    Lu, L.; Chen, X.; Huang, Xiaoxu

    2009-01-01

    boundary–related processes. We investigated the maximum strength of nanotwinned copper samples with different twin thicknesses. We found that the strength increases with decreasing twin thickness, reaching a maximum at 15 nanometers, followed by a softening at smaller values that is accompanied by enhanced...

  17. 49 CFR 195.406 - Maximum operating pressure.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Maximum operating pressure. 195.406 Section 195.406 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS... HAZARDOUS LIQUIDS BY PIPELINE Operation and Maintenance § 195.406 Maximum operating pressure. (a) Except for...

  18. The RIPASA score for the diagnosis of acute appendicitis: A comparison with the modified Alvarado score.

    Science.gov (United States)

    Díaz-Barrientos, C Z; Aquino-González, A; Heredia-Montaño, M; Navarro-Tovar, F; Pineda-Espinosa, M A; Espinosa de Santillana, I A

    2018-02-06

    Acute appendicitis is the leading cause of surgical emergencies. It is still a difficult diagnosis to make, especially in young persons, the elderly, and women of reproductive age, in whom a series of inflammatory conditions can have signs and symptoms similar to those of acute appendicitis. Different scoring systems have been created to increase diagnostic accuracy; they are inexpensive, noninvasive, and easy to use and reproduce. The modified Alvarado score is probably the most widely used and accepted in emergency services worldwide. The RIPASA score, on the other hand, was formulated in 2010 and has greater sensitivity and specificity. Very few studies conducted in Mexico compare the different scoring systems for appendicitis. The aim of our article was to compare the modified Alvarado score and the RIPASA score in the diagnosis of patients with abdominal pain and suspected acute appendicitis. An observational, analytic, and prolective study was conducted between July 2002 and February 2014 at the Hospital Universitario de Puebla. The questionnaires used for the evaluation process were applied to the patients suspected of having appendicitis. The RIPASA score with 8.5 as the optimal cutoff value: ROC curve (area .595), sensitivity (93.3%), specificity (8.3%), PPV (91.8%), NPV (10.1%). Modified Alvarado score with 6 as the optimal cutoff value: ROC curve (area .719), sensitivity (75%), specificity (41.6%), PPV (93.7%), NPV (12.5%). The RIPASA score showed no advantages over the modified Alvarado score when applied to patients presenting with suspected acute appendicitis. Copyright © 2018 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  19. A power study of bivariate LOD score analysis of a complex trait and fear/discomfort with strangers.

    Science.gov (United States)

    Ji, Fei; Lee, Dayoung; Mendell, Nancy Role

    2005-12-30

    Complex diseases are often reported along with disease-related traits (DRT). Sometimes investigators consider both disease and DRT phenotypes separately and sometimes they consider individuals as affected if they have either the disease or the DRT, or both. We propose instead to consider the joint distribution of the disease and the DRT and do a linkage analysis assuming a pleiotropic model. We evaluated our results through analysis of the simulated datasets provided by Genetic Analysis Workshop 14. We first conducted univariate linkage analysis of the simulated disease, Kofendrerd Personality Disorder and one of its simulated associated traits, phenotype b (fear/discomfort with strangers). Subsequently, we considered the bivariate phenotype, which combined the information on Kofendrerd Personality Disorder and fear/discomfort with strangers. We developed a program to perform bivariate linkage analysis using an extension to the Elston-Stewart peeling method of likelihood calculation. Using this program we considered the microsatellites within 30 cM of the gene pleiotropic for this simulated disease and DRT. Based on 100 simulations of 300 families we observed excellent power to detect linkage within 10 cM of the disease locus using the DRT and the bivariate trait.

  20. Maximum spectral demands in the near-fault region

    Science.gov (United States)

    Huang, Yin-Nan; Whittaker, Andrew S.; Luco, Nicolas

    2008-01-01

    The Next Generation Attenuation (NGA) relationships for shallow crustal earthquakes in the western United States predict a rotated geometric mean of horizontal spectral demand, termed GMRotI50, and not maximum spectral demand. Differences between strike-normal, strike-parallel, geometric-mean, and maximum spectral demands in the near-fault region are investigated using 147 pairs of records selected from the NGA strong motion database. The selected records are for earthquakes with moment magnitude greater than 6.5 and for closest site-to-fault distance less than 15 km. Ratios of maximum spectral demand to NGA-predicted GMRotI50 for each pair of ground motions are presented. The ratio shows a clear dependence on period and the Somerville directivity parameters. Maximum demands can substantially exceed NGA-predicted GMRotI50 demands in the near-fault region, which has significant implications for seismic design, seismic performance assessment, and the next-generation seismic design maps. Strike-normal spectral demands are a significantly unconservative surrogate for maximum spectral demands for closest distance greater than 3 to 5 km. Scale factors that transform NGA-predicted GMRotI50 to a maximum spectral demand in the near-fault region are proposed.

  1. Maximum allowable load on wheeled mobile manipulators

    International Nuclear Information System (INIS)

    Habibnejad Korayem, M.; Ghariblu, H.

    2003-01-01

    This paper develops a computational technique for finding the maximum allowable load of a mobile manipulator during a given trajectory. The maximum allowable loads that can be achieved by a mobile manipulator during a given trajectory are limited by a number of factors; the dynamic properties of the mobile base and the mounted manipulator, their actuator limitations, and the additional constraints applied to resolve the redundancy are probably the most important factors. To resolve the extra degrees of freedom introduced by the base mobility, additional constraint functions are proposed directly in the task space of the mobile manipulator. Finally, in two numerical examples involving a two-link planar manipulator mounted on a differentially driven mobile base, application of the method to determining the maximum allowable load is verified. The simulation results demonstrate that the maximum allowable load on a desired trajectory does not have a unique value and depends directly on the additional constraint functions used to resolve the motion redundancy.

  2. Robust Maximum Association Estimators

    NARCIS (Netherlands)

    A. Alfons (Andreas); C. Croux (Christophe); P. Filzmoser (Peter)

    2017-01-01

    textabstractThe maximum association between two multivariate variables X and Y is defined as the maximal value that a bivariate association measure between one-dimensional projections αX and αY can attain. Taking the Pearson correlation as projection index results in the first canonical correlation

  3. Selection of the Maximum Spatial Cluster Size of the Spatial Scan Statistic by Using the Maximum Clustering Set-Proportion Statistic.

    Science.gov (United States)

    Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong

    2016-01-01

    Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.

  4. Systematic re-examination of carriers of balanced reciprocal translocations: a strategy to search for candidate regions for common and complex diseases

    DEFF Research Database (Denmark)

    Bache, Iben; Hjorth, Mads; Bugge, Merete

    2006-01-01

    linkage data and/or the translocation co-segregated with the reported phenotype, for example, we found a significant linkage (lod score=2.1) of dyslexia and a co-segregating translocation with a breakpoint in a previously confirmed locus for dyslexia. Furthermore, we identified 441 instances of at least...

  5. Commercial Gold Nanoparticles on Untreated Aluminum Foil: Versatile, Sensitive, and Cost-Effective SERS Substrate

    Directory of Open Access Journals (Sweden)

    Kristina Gudun

    2017-01-01

    Full Text Available We introduce a low-cost, tunable, hybrid SERS substrate of commercial gold nanoparticles on untreated aluminum foil (AuNPs@AlF). Two or three AuNP centrifugation/resuspension cycles are proven to be critical in the assay preparation. The limits of detection (LODs) for 4-nitrobenzenethiol (NBT) and crystal violet (CV) on this substrate are about 0.12 nM and 0.19 nM, respectively, while maximum analytical SERS enhancement factors (AEFs) are about 10⁷. In comparative assays, LODs for CV measured on AuNPs@Au film and AuNPs@glass are about 0.35 nM and 2 nM, respectively. The LOD for melamine detected on AuNPs@Al foil is 27 ppb, with a linear response range of 3 orders of magnitude. Overall, AuNPs@AlF demonstrated competitive performance in comparison with the AuNPs@Au film substrate in SERS detection of CV, NBT, and melamine. To check the versatility of the AuNPs@AlF substrate we also detected KNO3, with an LOD of 0.7 mM and a SERS EF around 2 × 10³, which is of the same order as the SERS EF reported for this compound in the literature.

  6. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood sequence detection of symbols from an alphabet of size M transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. Structures of receivers derived from a particular interpretation of the maximum-likelihood metrics. Receivers include front ends whose structures depend only on M, analogous to those in receivers for coherent CPM. Parts of the receivers following the front ends have structures whose complexity depends on N.

  7. Evaluation of modified Alvarado scoring system and RIPASA scoring system as diagnostic tools of acute appendicitis.

    Science.gov (United States)

    Shuaib, Abdullah; Shuaib, Ali; Fakhra, Zainab; Marafi, Bader; Alsharaf, Khalid; Behbehani, Abdullah

    2017-01-01

    Acute appendicitis is the most common surgical condition presenting to emergency departments worldwide. Clinical scoring systems, such as the Alvarado and modified Alvarado scoring systems, were developed with the goal of reducing the negative appendectomy rate to 5%-10%. The Raja Isteri Pengiran Anak Saleha Appendicitis (RIPASA) scoring system was established in 2008 specifically for Asian populations. The aim of this study was to compare the modified Alvarado with the RIPASA scoring system in a Kuwaiti population. This study included 180 patients who underwent appendectomies and were documented as having "acute appendicitis" or "abdominal pain" in the operating theatre logbook (unit B) from November 2014 to March 2016. The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), diagnostic accuracy, predicted negative appendectomy rate and receiver operating characteristic (ROC) curve of the modified Alvarado and RIPASA scoring systems were derived using SPSS statistical software. A total of 136 patients were included in this study according to our criteria. The cut-off threshold point of the modified Alvarado score was set at 7.0, which yielded a sensitivity of 82.8% and a specificity of 56%. The PPV was 89.3% and the NPV was 42.4%. The cut-off threshold point of the RIPASA score was set at 7.5, which yielded a 94.5% sensitivity and an 88% specificity. The PPV was 97.2% and the NPV was 78.5%. The predicted negative appendectomy rates were 10.7% and 2.2% for the modified Alvarado and RIPASA scoring systems, respectively. The negative appendectomy rate decreased significantly, from 18.4% to 10.7% for the modified Alvarado and to 2.2% for the RIPASA scoring system. The RIPASA scoring system, developed for Asian populations, consists of 14 clinical parameters that can be obtained from a good patient history, clinical examination and laboratory investigations. The RIPASA scoring system is more accurate and specific than the modified Alvarado
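
    For readers wanting to check how the reported figures interrelate, the short Python sketch below recomputes sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix; the counts used are hypothetical illustrations, not data from this study.

    # Minimal sketch: diagnostic metrics from a 2x2 confusion matrix.
    # The counts below are hypothetical and are not taken from the study.

    def diagnostic_metrics(tp, fp, fn, tn):
        """Return sensitivity, specificity, PPV and NPV as fractions."""
        sensitivity = tp / (tp + fn)   # detected among patients with appendicitis
        specificity = tn / (tn + fp)   # ruled out among patients without appendicitis
        ppv = tp / (tp + fp)           # positive predictive value
        npv = tn / (tn + fn)           # negative predictive value
        return sensitivity, specificity, ppv, npv

    if __name__ == "__main__":
        sens, spec, ppv, npv = diagnostic_metrics(tp=94, fp=4, fn=6, tn=32)
        print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")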

  8. Definition of the locus responsible for systemic carnitine deficiency within a 1.6-cM region of mouse chromosome 11 by detailed linkage analysis

    Energy Technology Data Exchange (ETDEWEB)

    Okita, Kohei; Tokino, Takashi; Nishimori, Hiroyuki [Univ. of Tokyo (Japan)] [and others]

    1996-04-15

    Carnitine is an essential cofactor for oxidation of mitochondrial fatty acids. Carnitine deficiency results in failure of energy production by mitochondria and leads to metabolic encephalopathy, lipid-storage myopathy, and cardiomyopathy. The juvenile visceral steatosis (JVS) mouse, an animal model of systemic carnitine deficiency, inherits the JVS phenotype in autosomal recessive fashion, through a mutant allele mapped to mouse chromosome 11. As a step toward identifying the gene responsible for JVS by positional cloning, we attempted to refine the jvs locus in the mouse by detailed linkage analysis with 13 microsatellite markers, using 190 backcross progeny. Among the 13 loci tested, 5 (defined by markers D11Mit24, D11Mit111, D11Nds9, D11Mit86, and D11Mit23) showed no recombination, with a maximum lod score of 52.38. Our results implied that the jvs gene can be sought on mouse chromosome 11 within a genetic distance no greater than about 1.6 cM. 21 refs., 2 figs.
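
    As background for the reported maximum lod score, the following Python sketch computes a two-point lod score for phase-known recombinant/non-recombinant counts and maximizes it over a grid of recombination fractions; the counts are a simplified illustration and do not reproduce the multipoint analysis of the study.

    import math

    def lod_score(recombinants: int, nonrecombinants: int, theta: float) -> float:
        """Two-point lod score for phase-known counts: log10 of L(theta) / L(0.5)."""
        if not 0.0 < theta <= 0.5:
            raise ValueError("theta must lie in (0, 0.5]")
        n = recombinants + nonrecombinants
        log_l_theta = (recombinants * math.log10(theta)
                       + nonrecombinants * math.log10(1.0 - theta))
        return log_l_theta - n * math.log10(0.5)

    def max_lod(recombinants: int, nonrecombinants: int):
        """Maximize the lod score over a grid of recombination fractions."""
        grid = [t / 1000.0 for t in range(1, 501)]
        return max((lod_score(recombinants, nonrecombinants, t), t) for t in grid)

    if __name__ == "__main__":
        # Illustrative fully linked marker: 0 recombinants among 100 informative meioses.
        best, theta_hat = max_lod(0, 100)
        print(f"maximum lod = {best:.2f} at theta = {theta_hat:.3f}")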

  9. Linkage analysis in a Dutch family with X-linked recessive congenital stationary night blindness (XL-CSNB).

    Science.gov (United States)

    Berger, W; van Duijnhoven, G; Pinckers, A; Smits, A; Ropers, H H; Cremers, F

    1995-01-01

    Linkage analysis has been performed in a large Dutch pedigree with X-linked recessive congenital stationary night blindness (CSNB) by utilizing 16 DNA markers from the proximal short arm of the human X chromosome (Xp21.1-11.2). Thirteen polymorphic markers are at least partially informative and have enabled pairwise and multipoint linkage analysis. For three loci, i.e. DXS228, the monoamine oxidase B gene and the Norrie disease gene (NDG), multipoint linkage studies have yielded maximum lod scores of > 3.0 at a recombination fraction of zero. Analysis of recombination events has enabled us to rule out the possibility that the underlying defect in this family is allelic to RP3; the gene defect could also be excluded from the proximal part of the region known to carry RP2. Linkage data are consistent with a possible involvement of the NDG but mutations in the open reading frame of this gene have not been found.

  10. Identification of a novel locus for a USH3 like syndrome combined with congenital cataract

    DEFF Research Database (Denmark)

    Dad, S.; Østergaard, Elsebet; Thykjær, T.

    2010-01-01

    Usher syndrome (USH) is the most common genetic disease that causes both deafness and blindness. USH is divided into three types, USH1, USH2 and USH3, depending on the age of onset, the course of the disease, and on the degree of vestibular dysfunction. By homozygosity mapping of a consanguineous...... Danish family of Dutch descent, we have identified a novel locus for a rare USH3-like syndrome. The affected family members have a unique association of retinitis pigmentosa, progressive hearing impairment, vestibular dysfunction, and congenital cataract. The phenotype is similar, but not identical...... to that of USH3 patients, as congenital cataract has not been reported for USH3. By homozygosity mapping, we identified a 7.3 Mb locus on chromosome 15q22.2-23 with a maximum multipoint LOD score of 2.0. The locus partially overlaps with the USH1 locus, USH1H, a novel unnamed USH2 locus, and the non-syndromic...

  11. The HAT Score-A Simple Risk Stratification Score for Coagulopathic Bleeding During Adult Extracorporeal Membrane Oxygenation.

    Science.gov (United States)

    Lonergan, Terence; Herr, Daniel; Kon, Zachary; Menaker, Jay; Rector, Raymond; Tanaka, Kenichi; Mazzeffi, Michael

    2017-06-01

    The study objective was to create an adult extracorporeal membrane oxygenation (ECMO) coagulopathic bleeding risk score. Secondary analysis was performed on an existing retrospective cohort. Pre-ECMO variables were tested for association with coagulopathic bleeding, and those with the strongest association were included in a multivariable model. Using this model, a risk stratification score was created. The score's utility was validated by comparing bleeding and transfusion rates between score levels. Bleeding was also examined after stratifying by nadir platelet count and overanticoagulation. Predictive power of the score was compared against the risk score for major bleeding during anti-coagulation for atrial fibrillation (HAS-BLED). Tertiary care academic medical center. The study comprised patients who received venoarterial or venovenous ECMO over a 3-year period, excluding those with an identified source of surgical bleeding during exploration. None. Fifty-three (47.3%) of 112 patients experienced coagulopathic bleeding. A 3-variable score based on hypertension, age greater than 65, and ECMO type (HAT) had fair predictive value (area under the receiver operating characteristic curve [AUC] = 0.66) and was superior to HAS-BLED (AUC = 0.64). As the HAT score increased from 0 to 3, bleeding rates also increased as follows: 30.8%, 48.7%, 63.0%, and 71.4%, respectively. Platelet and fresh frozen plasma transfusion tended to increase with the HAT score, but red blood cell transfusion did not. Nadir platelet count less than 50 × 10^3/µL and overanticoagulation during ECMO increased the AUC for the model to 0.73, suggesting additive risk. The HAT score may allow for bleeding risk stratification in adult ECMO patients. Future studies in larger cohorts are necessary to confirm these findings. Copyright © 2017 Elsevier Inc. All rights reserved.
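
    A hedged Python sketch of a HAT-style tally follows; one point per variable is consistent with the reported 0-3 range, but treating venoarterial ECMO as the risk category is an assumption made here for illustration, and the bleeding rates are simply those quoted above.

    # Hedged sketch of a HAT-style additive risk tally (hypertension, age > 65,
    # ECMO type). One point per variable matches the reported 0-3 range; treating
    # venoarterial ECMO as the risk category is an assumption for illustration.

    def hat_score(hypertension: bool, age_years: float, ecmo_type: str) -> int:
        score = 0
        score += 1 if hypertension else 0
        score += 1 if age_years > 65 else 0
        score += 1 if ecmo_type.lower() == "venoarterial" else 0  # assumed risk category
        return score

    # Coagulopathic bleeding rates by score level, as reported in the abstract.
    BLEEDING_RATE_BY_SCORE = {0: 0.308, 1: 0.487, 2: 0.630, 3: 0.714}

    if __name__ == "__main__":
        s = hat_score(hypertension=True, age_years=70, ecmo_type="venovenous")
        print(f"HAT score = {s}, reported bleeding rate ≈ {BLEEDING_RATE_BY_SCORE[s]:.0%}")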

  12. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature, with hundreds of applications pertaining to several different fields, and will also serve here as an important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
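
    To make the Mean Energy Model concrete, the Python sketch below computes the maximum-entropy distribution on a finite alphabet subject to a prescribed mean energy; the solution takes the familiar Gibbs form p_i proportional to exp(-λE_i), with the multiplier solved numerically. The energies and target mean are made-up illustration values.

    # Maximum-entropy distribution on a finite alphabet under a mean-energy
    # constraint (the Mean Energy Model). The solution has the Gibbs form
    # p_i proportional to exp(-lam * E_i); lam is solved so that the mean
    # energy matches the target. Energies and target are illustration values.
    import math
    from scipy.optimize import brentq

    def maxent_distribution(energies, target_mean):
        def mean_energy(lam):
            weights = [math.exp(-lam * e) for e in energies]
            z = sum(weights)
            return sum(w * e for w, e in zip(weights, energies)) / z

        lam = brentq(lambda l: mean_energy(l) - target_mean, -50.0, 50.0)
        weights = [math.exp(-lam * e) for e in energies]
        z = sum(weights)
        return [w / z for w in weights], lam

    if __name__ == "__main__":
        p, lam = maxent_distribution([0.0, 1.0, 2.0, 3.0], target_mean=1.0)
        entropy = -sum(pi * math.log(pi) for pi in p)
        print(f"lambda = {lam:.3f}, p = {[round(x, 3) for x in p]}, entropy = {entropy:.3f} nats")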

  13. 22 CFR 201.67 - Maximum freight charges.

    Science.gov (United States)

    2010-04-01

    ..., commodity rate classification, quantity, vessel flag category (U.S.-or foreign-flag), choice of ports, and... the United States. (2) Maximum charter rates. (i) USAID will not finance ocean freight under any... owner(s). (4) Maximum liner rates. USAID will not finance ocean freight for a cargo liner shipment at a...

  14. Instant MuseScore

    CERN Document Server

    Shinn, Maxwell

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. Instant MuseScore is written in an easy-to-follow format, packed with illustrations that will help you get started with this music composition software. This book is for musicians who would like to learn how to notate music digitally with MuseScore. Readers should already have some knowledge about musical terminology; however, no prior experience with music notation software is necessary.

  15. Relationship between Students' Scores on Research Methods and Statistics, and Undergraduate Project Scores

    Science.gov (United States)

    Ossai, Peter Agbadobi Uloku

    2016-01-01

    This study examined the relationship between students' scores on Research Methods and Statistics and their undergraduate project scores in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…

  16. Determination of the maximum-depth to potential field sources by a maximum structural index method

    Science.gov (United States)

    Fedi, M.; Florio, G.

    2013-01-01

    A simple and fast determination of the limiting depth to the sources can significantly aid data interpretation. To this end we explore the possibility of determining those source parameters shared by all the classes of models fitting the data. One approach is to determine the maximum depth-to-source compatible with the measured data, by using for example the well-known Bott-Smith rules. These rules involve only the knowledge of the field and its horizontal gradient maxima, and are independent of the density contrast. Thanks to the direct relationship between structural index and depth to sources we work out a simple and fast strategy to obtain the maximum depth by using semi-automated methods such as Euler deconvolution or the depth-from-extreme-points (DEXP) method. The proposed method consists of estimating the maximum depth as the one obtained for the highest allowable value of the structural index (Nmax). Nmax may be easily determined, since it depends only on the dimensionality of the problem (2D/3D) and on the nature of the analyzed field (e.g., gravity field or magnetic field). We tested our approach on synthetic models against the results obtained by the classical Bott-Smith formulas and the results are in fact very similar, confirming the validity of this method. However, while the Bott-Smith formulas are restricted to the gravity field only, our method is also applicable to the magnetic field and to any derivative of the gravity and magnetic field. Our method yields a useful criterion to assess the source model based on the (∂f/∂x)max/fmax ratio. The usefulness of the method in real cases is demonstrated for a salt wall in the Mississippi basin, where the estimation of the maximum depth agrees with the seismic information.

  17. 40 CFR 141.13 - Maximum contaminant levels for turbidity.

    Science.gov (United States)

    2010-07-01

    ... turbidity. 141.13 Section 141.13 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER... Maximum contaminant levels for turbidity. The maximum contaminant levels for turbidity are applicable to... part. The maximum contaminant levels for turbidity in drinking water, measured at a representative...

  18. Modelling maximum canopy conductance and transpiration in ...

    African Journals Online (AJOL)

    There is much current interest in predicting the maximum amount of water that can be transpired by Eucalyptus trees. It is possible that industrial waste water may be applied as irrigation water to eucalypts and it is important to predict the maximum transpiration rates of these plantations in an attempt to dispose of this ...

  19. 40 CFR 141.62 - Maximum contaminant levels for inorganic contaminants.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Maximum contaminant levels for inorganic contaminants. 141.62 Section 141.62 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Water Regulations: Maximum Contaminant Levels and Maximum Residual Disinfectant Levels § 141.62 Maximum...

  20. Weak scale from the maximum entropy principle

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings, are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which Big Bang nucleosynthesis starts, and M_pl is the Planck mass.

  1. Accurate modeling and maximum power point detection of ...

    African Journals Online (AJOL)

    Accurate modeling and maximum power point detection of photovoltaic ... Determination of MPP enables the PV system to deliver maximum available power. ..... adaptive artificial neural network: Proposition for a new sizing procedure.

  2. 40 CFR 141.61 - Maximum contaminant levels for organic contaminants.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Maximum contaminant levels for organic contaminants. 141.61 Section 141.61 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER... Regulations: Maximum Contaminant Levels and Maximum Residual Disinfectant Levels § 141.61 Maximum contaminant...

  3. Maximum Power Training and Plyometrics for Cross-Country Running.

    Science.gov (United States)

    Ebben, William P.

    2001-01-01

    Provides a rationale for maximum power training and plyometrics as conditioning strategies for cross-country runners, examining: an evaluation of training methods (strength training and maximum power training and plyometrics); biomechanic and velocity specificity (role in preventing injury); and practical application of maximum power training and…

  4. Breaking of scored tablets : a review

    NARCIS (Netherlands)

    van Santen, E; Barends, D M; Frijlink, H W

    The literature was reviewed regarding advantages, problems and performance indicators of score lines. Scored tablets provide dose flexibility, ease of swallowing and may reduce the costs of medication. However, many patients are confronted with scored tablets that are broken unequally and with

  5. The last glacial maximum

    Science.gov (United States)

    Clark, P.U.; Dyke, A.S.; Shakun, J.D.; Carlson, A.E.; Clark, J.; Wohlfarth, B.; Mitrovica, J.X.; Hostetler, S.W.; McCabe, A.M.

    2009-01-01

    We used 5704 14C, 10Be, and 3He ages that span the interval from 10,000 to 50,000 years ago (10 to 50 ka) to constrain the timing of the Last Glacial Maximum (LGM) in terms of global ice-sheet and mountain-glacier extent. Growth of the ice sheets to their maximum positions occurred between 33.0 and 26.5 ka in response to climate forcing from decreases in northern summer insolation, tropical Pacific sea surface temperatures, and atmospheric CO2. Nearly all ice sheets were at their LGM positions from 26.5 ka to 19 to 20 ka, corresponding to minima in these forcings. The onset of Northern Hemisphere deglaciation 19 to 20 ka was induced by an increase in northern summer insolation, providing the source for an abrupt rise in sea level. The onset of deglaciation of the West Antarctic Ice Sheet occurred between 14 and 15 ka, consistent with evidence that this was the primary source for an abrupt rise in sea level ~14.5 ka.

  6. Maximum physical capacity testing in cancer patients undergoing chemotherapy

    DEFF Research Database (Denmark)

    Knutsen, L.; Quist, M; Midtgaard, J

    2006-01-01

    BACKGROUND: Over the past few years there has been a growing interest in the field of physical exercise in rehabilitation of cancer patients, leading to requirements for objective maximum physical capacity measurement (maximum oxygen uptake (VO(2max)) and one-repetition maximum (1RM)) to determin...... early in the treatment process. However, the patients were self-referred and thus highly motivated and as such are not necessarily representative of the whole population of cancer patients treated with chemotherapy....... in performing maximum physical capacity tests as these motivated them through self-perceived competitiveness and set a standard that served to encourage peak performance. CONCLUSION: The positive attitudes in this sample towards maximum physical capacity open the possibility of introducing physical testing...

  7. The "Akopian" vault performed by elite male gymnasts: Which biomechanical variables are related to a judge's score?

    Directory of Open Access Journals (Sweden)

    Roman Farana

    2015-03-01

    Full Text Available Background: A vault is performed in a short time, and its quality is determined by a number of mechanical variables. Significant relationships between the vaulting score and specific aspects of the gymnast's vault should lead coaches to monitor these variables as part of training or routine testing. Objective: The aim of the current study was to determine the biomechanical variables that are related to a successful performance of the Akopian vault performed by top-level male gymnasts during a World Cup competition. Methods: Fifteen top-level male gymnasts participated in this study. For the 3D analysis, two digital camcorders with a frame rate of 50 Hz were used. The data were digitized with the Simi Motion software. The Hay and Reid method was used to identify the biomechanical variables that determine the linear and angular motions of the handspring and front somersault vaults. A correlation analysis was used to establish the relationship between the biomechanical variables and the judges' scores. The level of statistical significance was set at p < .05. Results: For the Akopian vaults, five of the 24 variables arising from the deterministic model showed a significant relationship with the score. A significant correlation was found for the maximum height of the body center of mass in the second flight phase, the height of the body center of mass at mat touchdown, the change of the vertical velocity during the take-off from the vaulting table, and the duration of the second flight phase. Conclusions: The results of the study suggest that a successful execution of the Akopian vault and the achievement of a higher score require: maximizing the change in vertical velocity in the table contact phase and the vertical velocity in the table take-off phase; and maximizing the amplitude of the second flight phase, which is determined by the duration of the second flight phase, by the maximum
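
    The correlation analysis described above can be reproduced in outline with a few lines of Python; the sample values below are invented for illustration and are not data from the study.

    # Outline of the correlation analysis described above: relating a single
    # biomechanical variable to the judges' scores with Pearson's r. The values
    # are invented for illustration and are not data from the study.
    from scipy.stats import pearsonr

    peak_height_m = [2.55, 2.61, 2.48, 2.70, 2.66, 2.59, 2.73, 2.52]  # hypothetical
    judge_score = [8.90, 9.05, 8.70, 9.30, 9.15, 8.95, 9.35, 8.80]    # hypothetical

    r, p_value = pearsonr(peak_height_m, judge_score)
    print(f"r = {r:.2f}, p = {p_value:.3f}, significant at 0.05: {p_value < 0.05}")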

  8. MAXIMUM PRINCIPLE FOR SUBSONIC FLOW WITH VARIABLE ENTROPY

    Directory of Open Access Journals (Sweden)

    B. Sizykh Grigory

    2017-01-01

    Full Text Available The maximum principle for subsonic flow holds for stationary irrotational subsonic gas flows. According to this principle, if the value of the velocity is not constant everywhere, then its maximum is achieved on the boundary, and only on the boundary, of the considered domain. This property is used when designing the form of an aircraft with a maximum critical value of the Mach number: it is believed that if the local Mach number is less than unity in the incoming flow and on the body surface, then the Mach number is less than unity at all points of the flow. The known proof of the maximum principle for subsonic flow is based on the assumption that in the whole considered area of the flow the pressure is a function of density. For an ideal and perfect gas (the role of diffusion is negligible, and the Mendeleev-Clapeyron law is fulfilled), the pressure is a function of density if entropy is constant in the entire considered area of the flow. We show an example of a stationary subsonic irrotational flow in which the entropy has different values on different streamlines, and the pressure is not a function of density. The application of the maximum principle for subsonic flow to such a flow would be unjustified. This example shows the relevance of the question about the location of the points of maximum velocity when the entropy is not constant. To clarify the regularities of the location of these points, an analysis of the complete Euler equations (without any simplifying assumptions) was performed in the 3-D case. A new proof of the maximum principle for subsonic flow is proposed. This proof does not rely on the assumption that the pressure is a function of density. Thus, it is shown that the maximum principle for subsonic flow is true for stationary subsonic irrotational flows of an ideal perfect gas with variable entropy.

  9. Maximum power point tracker based on fuzzy logic

    International Nuclear Information System (INIS)

    Daoud, A.; Midoun, A.

    2006-01-01

    Solar energy is used as the power source in photovoltaic power systems, and an intelligent power management system is needed to obtain the maximum power from a limited number of solar panels. Because the sun's illumination changes with the angle of incidence of solar radiation and with the temperature of the panels, a Maximum Power Point Tracker (MPPT) enables optimization of solar power generation. The MPPT is a sub-system designed to extract the maximum power from a power source. In the case of a solar-panel power source, the maximum power point varies as a result of changes in its electrical characteristics, which in turn are functions of radiation dose, temperature, ageing and other effects. The MPPT maximizes the power output from the panels for a given set of conditions by detecting the best working point of the power characteristic and then controlling the current through the panels or the voltage across them. Many MPPT methods have been reported in the literature. These MPPT techniques can be classified into three main categories: lookup table methods, hill climbing methods and computational methods. The techniques vary according to their degree of sophistication, processing time and memory requirements. The perturbation and observation algorithm (a hill climbing technique) is commonly used due to its ease of implementation and relative tracking efficiency. However, it has been shown that when the insolation changes rapidly, the perturbation and observation method is slow to track the maximum power point. In recent years, fuzzy controllers have been used for maximum power point tracking. This method only requires linguistic control rules for the maximum power point; a mathematical model is not required, and therefore this control method is easy to implement in a real control system. In this paper, we present a simple robust MPPT using fuzzy set theory, where the hardware consists of the microchip's microcontroller unit control card and
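
    The perturbation and observation (hill climbing) baseline mentioned above is easy to sketch; the Python example below tracks the maximum of a toy power-voltage curve. The panel model, starting point and step size are illustrative assumptions, and the fuzzy-logic controller of the paper is not reproduced here.

    # Minimal perturb-and-observe (hill climbing) MPPT sketch against a toy
    # power-voltage curve. The panel model, starting point and step size are
    # illustrative assumptions; the paper's fuzzy controller is not reproduced.

    def panel_power(v: float) -> float:
        """Toy P-V curve with a single maximum near 17.5 V (illustration only)."""
        return max(0.0, -0.6 * (v - 17.5) ** 2 + 120.0)

    def perturb_and_observe(v0: float = 12.0, step: float = 0.2, iterations: int = 200) -> float:
        v, p_prev, direction = v0, panel_power(v0), +1.0
        for _ in range(iterations):
            v += direction * step          # perturb the operating voltage
            p = panel_power(v)
            if p < p_prev:                 # observe: power dropped, so reverse direction
                direction = -direction
            p_prev = p
        return v

    if __name__ == "__main__":
        v_mpp = perturb_and_observe()
        print(f"tracked operating point ≈ {v_mpp:.2f} V, power ≈ {panel_power(v_mpp):.1f} W")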

  10. Facilitating the Interpretation of English Language Proficiency Scores: Combining Scale Anchoring and Test Score Mapping Methodologies

    Science.gov (United States)

    Powers, Donald; Schedl, Mary; Papageorgiou, Spiros

    2017-01-01

    The aim of this study was to develop, for the benefit of both test takers and test score users, enhanced "TOEFL ITP"® test score reports that go beyond the simple numerical scores that are currently reported. To do so, we applied traditional scale anchoring (proficiency scaling) to item difficulty data in order to develop performance…

  11. Maximum entropy methods

    International Nuclear Information System (INIS)

    Ponman, T.J.

    1984-01-01

    For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)

  12. Validity of GRE General Test scores and TOEFL scores for graduate admission to a technical university in Western Europe

    Science.gov (United States)

    Zimmermann, Judith; von Davier, Alina A.; Buhmann, Joachim M.; Heinimann, Hans R.

    2018-01-01

    Graduate admission has become a critical process in tertiary education, whereby selecting valid admissions instruments is key. This study assessed the validity of Graduate Record Examination (GRE) General Test scores for admission to Master's programmes at a technical university in Europe. We investigated the indicative value of GRE scores for the Master's programme grade point average (GGPA) with and without the addition of the undergraduate GPA (UGPA) and the TOEFL score, and of GRE scores for study completion and Master's thesis performance. GRE scores explained 20% of the variation in the GGPA, while additional 7% were explained by the TOEFL score and 3% by the UGPA. Contrary to common belief, the GRE quantitative reasoning score showed only little explanatory power. GRE scores were also weakly related to study progress but not to thesis performance. Nevertheless, GRE and TOEFL scores were found to be sensible admissions instruments. Rigorous methodology was used to obtain highly reliable results.

  13. Maximum entropy deconvolution of low count nuclear medicine images

    International Nuclear Information System (INIS)

    McGrath, D.M.

    1998-12-01

    Maximum entropy is applied to the problem of deconvolving nuclear medicine images, with special consideration for very low count data. The physics of the formation of scintigraphic images is described, illustrating the phenomena which degrade planar estimates of the tracer distribution. Various techniques which are used to restore these images are reviewed, outlining the relative merits of each. The development and theoretical justification of maximum entropy as an image processing technique is discussed. Maximum entropy is then applied to the problem of planar deconvolution, highlighting the question of the choice of error parameters for low count data. A novel iterative version of the algorithm is suggested which allows the errors to be estimated from the predicted Poisson mean values. This method is shown to produce the exact results predicted by combining Poisson statistics and a Bayesian interpretation of the maximum entropy approach. A facility for total count preservation has also been incorporated, leading to improved quantification. In order to evaluate this iterative maximum entropy technique, two comparable methods, Wiener filtering and a novel Bayesian maximum likelihood expectation maximisation technique, were implemented. The comparison of results obtained indicated that this maximum entropy approach may produce equivalent or better measures of image quality than the compared methods, depending upon the accuracy of the system model used. The novel Bayesian maximum likelihood expectation maximisation technique was shown to be preferable over many existing maximum a posteriori methods due to its simplicity of implementation. A single parameter is required to define the Bayesian prior, which suppresses noise in the solution and may reduce the processing time substantially. Finally, maximum entropy deconvolution was applied as a pre-processing step in single photon emission computed tomography reconstruction of low count data. Higher contrast results were

  14. Maximum Principles for Discrete and Semidiscrete Reaction-Diffusion Equation

    Directory of Open Access Journals (Sweden)

    Petr Stehlík

    2015-01-01

    Full Text Available We study reaction-diffusion equations with a general reaction function f on one-dimensional lattices with continuous or discrete time, u_x' (or Δ_t u_x) = k(u_{x-1} - 2u_x + u_{x+1}) + f(u_x), x ∈ Z. We prove weak and strong maximum and minimum principles for the corresponding initial-boundary value problems. Whereas the maximum principles in the semidiscrete case (continuous time) exhibit similar features to those of the fully continuous reaction-diffusion model, in the discrete case the weak maximum principle holds for a smaller class of functions and the strong maximum principle is valid in a weaker sense. We describe in detail how the validity of maximum principles depends on the nonlinearity and the time step. We illustrate our results on the Nagumo equation with the bistable nonlinearity.
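
    A minimal Python illustration of the lattice equation with the bistable Nagumo nonlinearity follows, using explicit Euler time stepping; the parameters are illustrative, and the small time step reflects the paper's point that discrete maximum principles only hold under restrictions on the nonlinearity and the time step.

    # Minimal sketch: explicit time stepping of the lattice reaction-diffusion
    # equation u_x' = k(u_{x-1} - 2u_x + u_{x+1}) + f(u_x) with the bistable
    # Nagumo nonlinearity f(u) = u(1 - u)(u - a). Parameters are illustrative;
    # the time step is kept small so the computed values stay within [0, 1].
    import numpy as np

    k, a, dt, steps = 1.0, 0.3, 0.05, 400

    def f(u):
        return u * (1.0 - u) * (u - a)

    u = np.zeros(101)
    u[:50] = 1.0                        # step-like initial datum (travelling front)

    for _ in range(steps):
        lap = np.roll(u, 1) - 2.0 * u + np.roll(u, -1)  # discrete Laplacian, periodic ends
        u = u + dt * (k * lap + f(u))

    print(f"min = {u.min():.3f}, max = {u.max():.3f}  (expected to remain within [0, 1])")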

  15. A prospective observational study comparing a physiological scoring system with time-based discharge criteria in pediatric ambulatory surgical patients.

    Science.gov (United States)

    Armstrong, James; Forrest, Helen; Crawford, Mark W

    2015-10-01

    Discharge criteria based on physiological scoring systems can be used in the postanesthesia care unit (PACU) to fast-track patients after ambulatory surgery; however, studies comparing physiological scoring systems with traditional time-based discharge criteria are lacking. The purpose of this study was to compare PACU discharge readiness times using physiological vs time-based discharge criteria in pediatric ambulatory surgical patients. We recorded physiological observations from consecutive American Society of Anesthesiologists physical status I-III patients aged 1-18 yr who were admitted to the PACU after undergoing ambulatory surgery in a tertiary academic pediatric hospital. The physiological score was a combination of the Aldrete and Chung systems. Scores were recorded every 15 min starting upon arrival in the PACU. Patients were considered fit for discharge once they attained a score ≥12 (maximum score, 14), provided no score was zero, with the time to achieve a score ≥12 defining the criteria-based discharge (CBD) time. Patients were discharged from the PACU when both the CBD and the existing time-based discharge (TBD) criteria were met. The CBD and TBD data were compared using Kaplan-Meier and log-rank analysis. Observations from 506 children are presented. Median (interquartile range [IQR]) age was 5.5 [2.8-9.9] yr. Median [IQR] CBD and TBD PACU discharge readiness times were 30 [15-45] min and 60 [45-60] min, respectively. Analysis of Kaplan-Meier curves indicated a significant difference in discharge times using the different criteria (hazard ratio, 5.43; 95% confidence interval, 4.51 to 6.53; P < 0.001). All patients were discharged home without incident. This prospective study suggests that discharge decisions based on physiological criteria have the potential for significantly speeding the transit of children through the PACU, thereby enhancing PACU efficiency and resource utilization.
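
    A small Python sketch of the criteria-based discharge rule described above follows: readiness is the first 15-minute observation at which the combined score reaches at least 12 of 14 with no component equal to zero. The number of components and their values are invented for illustration.

    # Sketch of the criteria-based discharge (CBD) rule: readiness at the first
    # 15-minute observation with a total score >= 12 (of a maximum of 14) and no
    # individual component equal to zero. Treating the scale as seven components
    # scored 0-2 is an assumption; the values below are invented.

    def cbd_time(observations, interval_min=15, threshold=12):
        """observations: list of per-visit component score lists, one every 15 min."""
        for i, components in enumerate(observations):
            if sum(components) >= threshold and min(components) > 0:
                return i * interval_min
        return None  # criteria not met during the recorded observations

    if __name__ == "__main__":
        obs = [
            [2, 1, 2, 1, 2, 1, 1],  # arrival (t = 0 min), total 10
            [2, 2, 2, 1, 2, 1, 1],  # t = 15 min, total 11
            [2, 2, 2, 2, 2, 1, 2],  # t = 30 min, total 13 -> ready for discharge
        ]
        print(f"criteria-based discharge readiness time: {cbd_time(obs)} min")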

  16. Relationship between accumulated heat stress during the dry period, body condition score, and reproduction parameters of Holstein cows in tropical conditions.

    Science.gov (United States)

    Avendaño-Reyes, Leonel; Fuquay, John W; Moore, Reuben B; Liu, Zhanglin; Clark, Bruce L; Vierhout, C

    2010-02-01

    To estimate the relationship between heat stress during the last 60 days prepartum, body condition score and certain reproductive traits in the subsequent lactation of Holstein cows, 564 multiparous cows and 290 primiparous cows from four dairy herds were used in a hot, humid region. Maximum prepartum degree days were estimated to quantify the degree of heat stress. Multiple regressions analyses and logistic regression analysis were performed to determine the effect of prepartum heat stress and body condition change on reproductive parameters, which were obtained from DHIA forms at the end of the lactation. Multiparous and primiparous cows which gained body condition score from calving to 60 d postpartum exhibited 28 and 27 fewer days open (P 0.05) of heat stress measurement on days open or services per conception in either multiparous or primiparous cows. During hotter months of calving, multiparous cows showed higher services per conception and primiparous cows showed higher days open and services per conception (P score. Multiparous cows with high body condition score at calving were 1.47 times more likely to present a very difficult calving than cows that calved in October (P reproductive performance was not affected by cumulative prepartum heat stress although it was associated with very difficult calving score.

  17. A Novel Scoring System Approach to Assess Patients with Lyme Disease (Nutech Functional Score).

    Science.gov (United States)

    Shroff, Geeta; Hopf-Seidel, Petra

    2018-01-01

    A bacterial infection by Borrelia burgdorferi, referred to as Lyme disease (LD) or borreliosis, is transmitted mostly by the bite of the tick Ixodes scapularis in the USA and Ixodes ricinus in Europe. Various tests are used for the diagnosis of LD, but their results are often unreliable. We compiled a list of clinically visible and patient-reported symptoms that are associated with LD. Based on this list, we developed a novel scoring system, the Nutech Functional Score (NFS), a 43-point positional (every symptom is subgraded and each alternative receives points according to its position) and directional (running in the direction bad to good) scoring system that assesses the patient's condition. The grades of the scoring system have been converted into numeric values for conducting probability-based studies. Each symptom is graded from 1 to 5 in the direction BAD → GOOD. NFS is a unique tool that can be used universally to assess the condition of patients with LD.

  18. Minimal length, Friedmann equations and maximum density

    Energy Technology Data Exchange (ETDEWEB)

    Awad, Adel [Center for Theoretical Physics, British University of Egypt,Sherouk City 11837, P.O. Box 43 (Egypt); Department of Physics, Faculty of Science, Ain Shams University,Cairo, 11566 (Egypt); Ali, Ahmed Farag [Centre for Fundamental Physics, Zewail City of Science and Technology,Sheikh Zayed, 12588, Giza (Egypt); Department of Physics, Faculty of Science, Benha University,Benha, 13518 (Egypt)

    2014-06-16

    Inspired by Jacobson’s thermodynamic approach, Cai et al. have shown the emergence of Friedmann equations from the first law of thermodynamics. We extend the Akbar-Cai derivation http://dx.doi.org/10.1103/PhysRevD.75.084003 of the Friedmann equations to accommodate a general entropy-area law. Studying the resulting Friedmann equations using a specific entropy-area law, which is motivated by the generalized uncertainty principle (GUP), reveals the existence of a maximum energy density close to the Planck density. Allowing for a general continuous pressure p(ρ,a) leads to bounded curvature invariants and a general nonsingular evolution. In this case, the maximum energy density is reached in a finite time and there is no cosmological evolution beyond this point, which leaves the big bang singularity inaccessible from a spacetime perspective. The existence of a maximum energy density and a general nonsingular evolution is independent of the equation of state and the spatial curvature k. As an example we study the evolution of the equation of state p=ωρ through its phase-space diagram to show the existence of a maximum energy which is reachable in a finite time.

  19. Maximum-Entropy Inference with a Programmable Annealer

    Science.gov (United States)

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-03-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition.

  20. An Objective Fluctuation Score for Parkinson's Disease

    Science.gov (United States)

    Horne, Malcolm K.; McGregor, Sarah; Bergquist, Filip

    2015-01-01

    Introduction Establishing the presence and severity of fluctuations is important in managing Parkinson’s Disease, yet there is no reliable, objective means of doing this. In this study we have evaluated a Fluctuation Score derived from variations in dyskinesia and bradykinesia scores produced by an accelerometry-based system. Methods The Fluctuation Score was produced by summing the interquartile range of bradykinesia scores and dyskinesia scores produced every 2 minutes between 0900-1800 for at least 6 days by the accelerometry-based system and expressing it as an algorithm. Results This Score could distinguish between fluctuating and non-fluctuating patients with high sensitivity and selectivity and was significantly lower following activation of deep brain stimulators. The scores following deep brain stimulation lay in a band just above the score separating fluctuators from non-fluctuators, suggesting a range representing adequate motor control. When compared with control subjects, the scores of newly diagnosed patients show a loss of fluctuation with the onset of PD. The score was calculated in subjects whose duration of disease was known, and this showed that newly diagnosed patients soon develop higher scores which either fall below or within the range representing adequate motor control or instead go on to develop more severe fluctuations. Conclusion The Fluctuation Score described here promises to be a useful tool for identifying patients whose fluctuations are progressing and may require therapeutic changes. It also shows promise as a useful research tool. Further studies are required to more accurately identify therapeutic targets and ranges. PMID:25928634
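
    The core of the Fluctuation Score as described, the sum of the interquartile ranges of the 2-minute bradykinesia and dyskinesia scores recorded between 09:00 and 18:00, can be sketched in Python as below; the input series are simulated, and the paper's final conversion of this sum into its reported score is not reproduced.

    # Minimal sketch of the Fluctuation Score core: the sum of the interquartile
    # ranges (IQR) of bradykinesia and dyskinesia scores sampled every 2 minutes
    # between 09:00 and 18:00. The series are simulated; the paper's final
    # expression of this sum "as an algorithm" is not reproduced here.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = (18 - 9) * 60 // 2      # one sample every 2 minutes, 09:00-18:00

    bradykinesia = rng.normal(loc=25.0, scale=8.0, size=n_samples)  # simulated scores
    dyskinesia = rng.normal(loc=10.0, scale=5.0, size=n_samples)    # simulated scores

    def iqr(x):
        q75, q25 = np.percentile(x, [75, 25])
        return q75 - q25

    fluctuation = iqr(bradykinesia) + iqr(dyskinesia)
    print(f"fluctuation score (raw IQR sum) = {fluctuation:.1f}")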

  1. Clinical role of pathological downgrading after radical prostatectomy in patients with biopsy-proven Gleason score 3+4 prostate cancer

    Science.gov (United States)

    Gondo, Tatsuo; Poon, Bing Ying; Matsumoto, Kazuhiro; Bernstein, Melanie; Sjoberg, Daniel D.; Eastham, James A.

    2014-01-01

    Objective To identify preoperative factors predicting Gleason score downgrading after radical prostatectomy in patients with biopsy Gleason score 3+4 prostate cancer. To determine if prediction of downgrading can identify potential candidates for active surveillance. Patients and Methods We identified 1317 patients with biopsy Gleason score 3+4 prostate cancer who underwent radical prostatectomy at Memorial Sloan-Kettering Cancer Center between 2005 and 2013. Several preoperative and biopsy characteristics were evaluated by forward selection regression, and selected predictors of downgrading were analyzed by multivariable logistic regression. Decision curve analysis was performed to evaluate the clinical utility of the multivariate model. Results Gleason score was downgraded after radical prostatectomy in 115 patients (9%). We developed a multivariable model using age, prostate specific antigen density, percent of positive cores with Gleason 4 cancer out of all cores taken, and maximum percent of cancer involvement within a positive core with Gleason 4 cancer. The area under the curve for this model was 0.75 after ten-fold cross validation. However, decision curve analysis revealed that the model was not clinically helpful in identifying patients who will downgrade at radical prostatectomy for the purpose of reassigning them to active surveillance. Conclusion While patients with pathology Gleason score 3+3 with tertiary Gleason pattern 4 or lower at radical prostatectomy in patients with biopsy Gleason score 3+4 prostate cancer may be potential candidates for active surveillance, decision curve analysis showed limited utility of our model to identify such men. Future study is needed to identify new predictors to help identify potential candidates for active surveillance among patients with biopsy-proven Gleason score 3+4 prostate cancer. PMID:24725760

  2. Combination of scoring schemes for protein docking

    Directory of Open Access Journals (Sweden)

    Schomburg Dietmar

    2007-08-01

    Full Text Available Abstract Background Docking algorithms are developed to predict in which orientation two proteins are likely to bind under natural conditions. The currently used methods usually consist of a sampling step followed by a scoring step. We developed a weighted geometric correlation based on optimised atom specific weighting factors and combined them with our previously published amino acid specific scoring and with a comprehensive SVM-based scoring function. Results The scoring with the atom specific weighting factors yields better results than the amino acid specific scoring. In combination with SVM-based scoring functions the percentage of complexes for which a near native structure can be predicted within the top 100 ranks increased from 14% with the geometric scoring to 54% with the combination of all scoring functions. Especially for the enzyme-inhibitor complexes the results of the ranking are excellent. For half of these complexes a near-native structure can be predicted within the first 10 proposed structures and for more than 86% of all enzyme-inhibitor complexes within the first 50 predicted structures. Conclusion We were able to develop a combination of different scoring schemes which considers a series of previously described and some new scoring criteria yielding a remarkable improvement of prediction quality.

  3. External validation of the NOBLADS score, a risk scoring system for severe acute lower gastrointestinal bleeding.

    Directory of Open Access Journals (Sweden)

    Tomonori Aoki

    Full Text Available We aimed to evaluate the generalizability of NOBLADS, a severe lower gastrointestinal bleeding (LGIB) prediction model which we had previously derived when working at a different institution, using an external validation cohort. NOBLADS comprises the following factors: non-steroidal anti-inflammatory drug use, no diarrhea, no abdominal tenderness, blood pressure ≤ 100 mmHg, antiplatelet drug use, albumin < 3.0 g/dL, disease score ≥ 2, and syncope. We retrospectively analyzed 511 patients emergently hospitalized for acute LGIB at the University of Tokyo Hospital, from January 2009 to August 2016. The areas under the receiver operating characteristic curves (ROC-AUCs) for severe bleeding (continuous and/or recurrent bleeding) were compared between the original derivation cohort and the external validation cohort. Severe LGIB occurred in 44% of patients. Several clinical factors were significantly different between the external and derivation cohorts (p < 0.05), including background, laboratory data, NOBLADS scores, and diagnosis. The NOBLADS score predicted the severity of LGIB with an AUC value of 0.74 in the external validation cohort and one of 0.77 in the derivation cohort. In the external validation cohort, the score predicted the risk for blood transfusion need (AUC, 0.71), but was not adequate for predicting intervention need (AUC, 0.54). The in-hospital mortality rate was higher in patients with a score ≥ 5 than in those with a score < 5 (AUC, 0.83). Although the external validation cohort clinically differed from the derivation cohort in many ways, we confirmed the moderately high generalizability of NOBLADS, a clinical risk score for severe LGIB. Appropriate triage using this score may support early decision-making in various hospitals.
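
    Since the abstract lists the eight NOBLADS factors, a simple Python tally is sketched below; scoring one point per factor present (a 0-8 total, with >= 5 flagged as the high-mortality band mentioned above) is an assumption consistent with that list rather than a statement of the published weighting.

    # Sketch of a NOBLADS-style tally: one point per risk factor present.
    # The one-point-per-factor weighting is an assumption consistent with the
    # listed factors and the >= 5 high-risk threshold, not the published weights.

    NOBLADS_FACTORS = [
        "nsaid_use",               # non-steroidal anti-inflammatory drug use
        "no_diarrhea",             # absence of diarrhea
        "no_abdominal_tenderness", # absence of abdominal tenderness
        "sbp_le_100_mmhg",         # blood pressure <= 100 mmHg
        "antiplatelet_use",        # antiplatelet drug use
        "albumin_lt_3_g_dl",       # albumin < 3.0 g/dL
        "disease_score_ge_2",      # disease score >= 2
        "syncope",                 # syncope
    ]

    def noblads_score(findings: dict) -> int:
        return sum(1 for factor in NOBLADS_FACTORS if findings.get(factor, False))

    if __name__ == "__main__":
        patient = {"nsaid_use": True, "no_diarrhea": True, "sbp_le_100_mmhg": True,
                   "albumin_lt_3_g_dl": True, "syncope": True}
        s = noblads_score(patient)
        print(f"NOBLADS = {s}; in the high-mortality band (>= 5): {s >= 5}")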

  4. Differences in distribution of T-scores and Z-scores among bone densitometry tests in postmenopausal women (a comparative study)

    International Nuclear Information System (INIS)

    Wendlova, J.

    2002-01-01

    To determine the character of T-score and Z-score value distributions in individually selected methods of bone densitometry and to compare them using statistical analysis. We examined 56 postmenopausal women aged between 43 and 68 years with osteopenia or osteoporosis according to the WHO classification. The following measurements were made in each patient: T-score and Z-score for: 1) Stiffness index (S) of the left heel bone, USM (index). 2) Bone mineral density of the left heel bone (BMDh), DEXA (g of Ca hydroxyapatite per cm^2). 3) Bone mineral density of trabecular bone of the L1 vertebra (BMDL1), QCT (mg of Ca hydroxyapatite per cm^3). The densitometers used in the study were: ultrasonometer to measure the heel bone, Achilles Plus, LUNAR, USA; DEXA to measure the heel bone, PIXI, LUNAR, USA; QCT to measure the L1 vertebra, CT, SOMATOM Plus, Siemens, Germany. Statistical analysis: differences between measured values of T-scores (Z-scores) were evaluated by parametric or non-parametric methods of determining the 95% confidence intervals (C.I.). Differences between Z-score and T-score values for the compared measurements were statistically significant; however, these differences were lower for Z-scores. The largest differences in the 95% C.I., characterizing individual measurements of T-score values (in comparison with Z-scores), were found for those densitometers whose age ranges of the young-adult reference groups differed the most, and conversely, the smallest differences in T-score values were found when the differences between the age ranges of the reference groups were smallest. The higher variation in T-score values in comparison to Z-scores is also caused by a non-standard selection of the young-adult reference groups for the QCT, PIXI and Achilles Plus densitometers used in the study. Age characteristics of the reference group for T-scores should be standardized for all types of densitometers. (author)
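
    For reference, the standard definitions behind the compared quantities are sketched in Python below: a T-score standardizes the measurement against a young-adult reference group, a Z-score against an age-matched group. The reference means and standard deviations shown are hypothetical placeholders; each densitometer ships with its own reference data, which is exactly the source of the discrepancies discussed above.

    # Standard densitometry T-score / Z-score definitions, as a sketch. The
    # reference means and standard deviations are hypothetical placeholders;
    # each densitometer ships with its own reference data, which is exactly
    # the source of the discrepancies discussed above.

    def t_score(measured, young_adult_mean, young_adult_sd):
        """Standard deviations from the young-adult reference mean."""
        return (measured - young_adult_mean) / young_adult_sd

    def z_score(measured, age_matched_mean, age_matched_sd):
        """Standard deviations from the age-matched reference mean."""
        return (measured - age_matched_mean) / age_matched_sd

    if __name__ == "__main__":
        bmd = 0.82  # hypothetical heel BMD in g of Ca hydroxyapatite per cm^2
        print(f"T = {t_score(bmd, 1.00, 0.11):+.1f}, Z = {z_score(bmd, 0.90, 0.12):+.1f}")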

  5. Probabilistic maximum-value wind prediction for offshore environments

    DEFF Research Database (Denmark)

    Staid, Andrea; Pinson, Pierre; Guikema, Seth D.

    2015-01-01

    statistical models to predict the full distribution of the maximum-value wind speeds in a 3 h interval. We take a detailed look at the performance of linear models, generalized additive models and multivariate adaptive regression splines models using meteorological covariates such as gust speed, wind speed......, convective available potential energy, Charnock, mean sea-level pressure and temperature, as given by the European Center for Medium-Range Weather Forecasts forecasts. The models are trained to predict the mean value of maximum wind speed, and the residuals from training the models are used to develop...... the full probabilistic distribution of maximum wind speed. Knowledge of the maximum wind speed for an offshore location within a given period can inform decision-making regarding turbine operations, planned maintenance operations and power grid scheduling in order to improve safety and reliability...

  6. A diagnostic scoring system for myxedema coma.

    Science.gov (United States)

    Popoveniuc, Geanina; Chandra, Tanu; Sud, Anchal; Sharma, Meeta; Blackman, Marc R; Burman, Kenneth D; Mete, Mihriye; Desale, Sameer; Wartofsky, Leonard

    2014-08-01

    To develop diagnostic criteria for myxedema coma (MC), a decompensated state of extreme hypothyroidism with a high mortality rate if untreated, in order to facilitate its early recognition and treatment. The frequencies of characteristics associated with MC were assessed retrospectively in patients from our institutions in order to derive a semiquantitative diagnostic point scale that was further applied on selected patients whose data were retrieved from the literature. Logistic regression analysis was used to test the predictive power of the score. Receiver operating characteristic (ROC) curve analysis was performed to test the discriminative power of the score. Of the 21 patients examined, 7 were reclassified as not having MC (non-MC), and they were used as controls. The scoring system included a composite of alterations of thermoregulatory, central nervous, cardiovascular, gastrointestinal, and metabolic systems, and presence or absence of a precipitating event. All 14 of our MC patients had a score of ≥60, whereas 6 of 7 non-MC patients had scores of 25 to 50. A total of 16 of 22 MC patients whose data were retrieved from the literature had a score ≥60, and 6 of 22 of these patients scored between 45 and 55. The odds ratio per each score unit increase as a continuum was 1.09 (95% confidence interval [CI], 1.01 to 1.16; P = .019); a score of 60 identified coma, with an odds ratio of 1.22. The area under the ROC curve was 0.88 (95% CI, 0.65 to 1.00), and the score of 60 had 100% sensitivity and 85.71% specificity. A score ≥60 in the proposed scoring system is potentially diagnostic for MC, whereas scores between 45 and 59 could classify patients at risk for MC.

  7. Diagnostic accuracy of Guy's Hospital stroke score (Allen score) in acute supratentorial thrombotic/haemorrhagic stroke

    International Nuclear Information System (INIS)

    Zulfiqar, A.; Toori, K. U.; Khan, S. S.; Hamza, M. I. M.; Zaman, S. U.

    2006-01-01

    A consecutive series of 103 patients, 58% male, with a mean age of 62 years (range 40-75 years), admitted with supratentorial stroke to our teaching hospital was studied. All patients had a computed tomography (CT) brain scan after clinical evaluation and application of the Allen stroke score. CT confirmed thrombotic stroke in 55 (53%) patients and haemorrhagic stroke in 48 (47%) patients. Of the 55 patients with definitive thrombotic stroke on CT, the Allen stroke score suggested infarction in 67%, haemorrhage in 6% and remained inconclusive in 27% of cases. In the 48 patients with definitive haemorrhagic stroke on CT, the Allen stroke score suggested haemorrhage in 60%, infarction in 11% and remained inconclusive in 29% of cases. The overall accuracy of the Allen stroke score was 66%. (author)

  8. Characterizing graphs of maximum matching width at most 2

    DEFF Research Database (Denmark)

    Jeong, Jisu; Ok, Seongmin; Suh, Geewon

    2017-01-01

    The maximum matching width is a width parameter that is defined on a branch-decomposition over the vertex set of a graph. The size of a maximum matching in the bipartite graph is used as a cut-function. In this paper, we characterize the graphs of maximum matching width at most 2 using the minor o...

  9. Change detection on LOD 2 building models with very high resolution spaceborne stereo imagery

    Science.gov (United States)

    Qin, Rongjun

    2014-10-01

    Due to the fast development of the urban environment, the need for efficient maintenance and updating of 3D building models is ever increasing. Change detection is an essential step to spot the changed areas for data (map/3D model) updating and urban monitoring. Traditional methods based on 2D images are no longer suitable for change detection at the building scale, owing to the increased spectral variability of building roofs and the larger perspective distortion of very high resolution (VHR) imagery. Change detection in 3D is increasingly being investigated using airborne laser scanning data or matched Digital Surface Models (DSM), but few studies have been conducted on change detection for 3D city models with VHR images, which is more informative but also more complicated. This is due to the fact that the 3D models are an abstracted geometric representation of the urban reality, while the VHR images record everything. In this paper, a novel method is proposed to detect changes directly on LOD (Level of Detail) 2 building models with VHR spaceborne stereo images from a different date, with particular focus on addressing the special characteristics of the 3D models. In the first step, the 3D building models are projected onto a raster grid, encoded with building object, terrain object, and planar faces. The DSM is extracted from the stereo imagery by hierarchical semi-global matching (SGM). In the second step, a multi-channel change indicator is extracted between the 3D models and stereo images, considering the inherent geometric consistency (IGC), height difference, and texture similarity for each planar face. Each channel of the indicator is then clustered with the Self-Organizing Map (SOM), with "change", "non-change" and "uncertain change" status labeled through a voting strategy. The "uncertain changes" are then determined with a Markov Random Field (MRF) analysis considering the geometric relationship between faces. In the third step, buildings are

  10. The Maximum Resource Bin Packing Problem

    DEFF Research Database (Denmark)

    Boyar, J.; Epstein, L.; Favrholdt, L.M.

    2006-01-01

    Usually, for bin packing problems, we try to minimize the number of bins used or in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used...... algorithms, First-Fit-Increasing and First-Fit-Decreasing for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find...

  11. Maximum entropy analysis of EGRET data

    DEFF Research Database (Denmark)

    Pohl, M.; Strong, A.W.

    1997-01-01

    EGRET data are usually analysed on the basis of the Maximum-Likelihood method \\cite{ma96} in a search for point sources in excess to a model for the background radiation (e.g. \\cite{hu97}). This method depends strongly on the quality of the background model, and thus may have high systematic unce...... uncertainties in region of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky....

  12. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  13. Comparing TACOM scores with subjective workload scores measured by NASA-TLX technique

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2006-01-01

    It is a well-known fact that a large portion of human performance related problems was attributed to the complexity of tasks. Therefore, managing the complexity of tasks is a prerequisite for safety-critical systems such as nuclear power plants (NPPs), because the consequence of a degraded human performance could be more severe than in other systems. From this concern, it is necessary to quantify the complexity of emergency tasks that are stipulated in procedures, because most tasks of NPPs have been specified in the form of procedures. For this reason, Park et al. developed a task complexity measure called TACOM. In this study, in order to confirm the validity of the TACOM measure, subjective workload scores that were measured by the NASA-TLX technique were compared with the associated TACOM scores. To do this, 23 emergency tasks of the reference NPPs were selected, and then subjective workload scores for these emergency tasks were quantified by 18 operators who had a sufficient knowledge about emergency operations

  14. Comparing TACOM scores with subjective workload scores measured by NASA-TLX technique

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    It is a well-known fact that a large portion of human performance related problems was attributed to the complexity of tasks. Therefore, managing the complexity of tasks is a prerequisite for safety-critical systems such as nuclear power plants (NPPs), because the consequence of a degraded human performance could be more severe than in other systems. From this concern, it is necessary to quantify the complexity of emergency tasks that are stipulated in procedures, because most tasks of NPPs have been specified in the form of procedures. For this reason, Park et al. developed a task complexity measure called TACOM. In this study, in order to confirm the validity of the TACOM measure, subjective workload scores that were measured by the NASA-TLX technique were compared with the associated TACOM scores. To do this, 23 emergency tasks of the reference NPPs were selected, and then subjective workload scores for these emergency tasks were quantified by 18 operators who had a sufficient knowledge about emergency operations.
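
    For reference, the NASA-TLX workload score referred to above is conventionally computed as a weighted average of six subscale ratings, with weights taken from 15 pairwise comparisons. The Python sketch below shows that standard computation with hypothetical ratings; it does not reproduce the TACOM measure or the study's data.

    # Standard NASA-TLX weighted workload (not the TACOM complexity measure).
    # Ratings are on a 0-100 scale; the 15 pairwise-comparison weights sum to 15.
    # All values below are hypothetical.
    ratings = {"mental": 70, "physical": 20, "temporal": 60,
               "performance": 40, "effort": 65, "frustration": 35}
    weights = {"mental": 5, "physical": 0, "temporal": 3,
               "performance": 2, "effort": 4, "frustration": 1}

    assert sum(weights.values()) == 15
    workload = sum(ratings[k] * weights[k] for k in ratings) / 15.0
    print(f"NASA-TLX weighted workload: {workload:.1f}")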

  15. Admixture mapping of 15,280 African Americans identifies obesity susceptibility loci on chromosomes 5 and X.

    Directory of Open Access Journals (Sweden)

    Ching-Yu Cheng

    2009-05-01

    Full Text Available The prevalence of obesity (body mass index (BMI) ≥30 kg/m(2)) is higher in African Americans than in European Americans, even after adjustment for socioeconomic factors, suggesting that genetic factors may explain some of the difference. To identify genetic loci influencing BMI, we carried out a pooled analysis of genome-wide admixture mapping scans in 15,280 African Americans from 14 epidemiologic studies. Samples were genotyped at a median of 1,411 ancestry-informative markers. After adjusting for age, sex, and study, BMI was analyzed both as a dichotomized (top 20% versus bottom 20%) and a continuous trait. We found that a higher percentage of European ancestry was significantly correlated with lower BMI (rho = -0.042, P = 1.6x10(-7)). In the dichotomized analysis, we detected two loci on chromosome X as associated with increased African ancestry: the first at Xq25 (locus-specific LOD = 5.94; genome-wide score = 3.22; case-control Z = -3.94); and the second at Xq13.1 (locus-specific LOD = 2.22; case-control Z = -4.62). Quantitative analysis identified a third locus at 5q13.3 where higher BMI was highly significantly associated with greater European ancestry (locus-specific LOD = 6.27; genome-wide score = 3.46). Further mapping studies with dense sets of markers will be necessary to identify the alleles in these regions of chromosomes X and 5 that may be associated with variation in BMI.
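
    Locus-specific LOD scores such as those above can be converted to an approximate significance level using the standard large-sample relation chi-square = 2*ln(10)*LOD. The sketch below (assuming SciPy is available) applies that rough conversion with 1 degree of freedom; it ignores the genome-wide calibration used in admixture mapping.

    # Rough conversion of a LOD score to a chi-square statistic and p-value
    # via chi2 = 2*ln(10)*LOD with 1 df (large-sample approximation only).
    import math
    from scipy import stats  # assumes SciPy is available

    def lod_to_p(lod, df=1):
        chi2 = 2.0 * math.log(10.0) * lod
        return stats.chi2.sf(chi2, df)

    for lod in (2.22, 5.94, 6.27):
        print(f"LOD={lod:4.2f}  ->  p ~ {lod_to_p(lod):.2e}")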

  16. "Score the Core" Web-based pathologist training tool improves the accuracy of breast cancer IHC4 scoring.

    Science.gov (United States)

    Engelberg, Jesse A; Retallack, Hanna; Balassanian, Ronald; Dowsett, Mitchell; Zabaglo, Lila; Ram, Arishneel A; Apple, Sophia K; Bishop, John W; Borowsky, Alexander D; Carpenter, Philip M; Chen, Yunn-Yi; Datnow, Brian; Elson, Sarah; Hasteh, Farnaz; Lin, Fritz; Moatamed, Neda A; Zhang, Yanhong; Cardiff, Robert D

    2015-11-01

    Hormone receptor status is an integral component of decision-making in breast cancer management. IHC4 score is an algorithm that combines hormone receptor, HER2, and Ki-67 status to provide a semiquantitative prognostic score for breast cancer. High accuracy and low interobserver variance are important to ensure the score is accurately calculated; however, few previous efforts have been made to measure or decrease interobserver variance. We developed a Web-based training tool, called "Score the Core" (STC) using tissue microarrays to train pathologists to visually score estrogen receptor (using the 300-point H score), progesterone receptor (percent positive), and Ki-67 (percent positive). STC used a reference score calculated from a reproducible manual counting method. Pathologists in the Athena Breast Health Network and pathology residents at associated institutions completed the exercise. By using STC, pathologists improved their estrogen receptor H score and progesterone receptor and Ki-67 proportion assessment and demonstrated a good correlation between pathologist and reference scores. In addition, we collected information about pathologist performance that allowed us to compare individual pathologists and measures of agreement. Pathologists' assessment of the proportion of positive cells was closer to the reference than their assessment of the relative intensity of positive cells. Careful training and assessment should be used to ensure the accuracy of breast biomarkers. This is particularly important as breast cancer diagnostics become increasingly quantitative and reproducible. Our training tool is a novel approach for pathologist training that can serve as an important component of ongoing quality assessment and can improve the accuracy of breast cancer prognostic biomarkers. Copyright © 2015 Elsevier Inc. All rights reserved.
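
    For context, the 300-point H score mentioned above weights the percentage of weakly, moderately and strongly staining tumour cells by 1, 2 and 3, respectively. A minimal sketch with hypothetical percentages:

    # 300-point H-score: H = 1*(% weak) + 2*(% moderate) + 3*(% strong).
    # Percentages refer to tumour cells; the values below are hypothetical.
    def h_score(pct_weak, pct_moderate, pct_strong):
        assert 0 <= pct_weak + pct_moderate + pct_strong <= 100
        return 1 * pct_weak + 2 * pct_moderate + 3 * pct_strong

    print(h_score(pct_weak=20, pct_moderate=30, pct_strong=10))  # -> 110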

  17. Antifouling paint booster biocides (Irgarol 1051 and diuron) in marinas and ports of Bushehr, Persian Gulf.

    Science.gov (United States)

    Saleh, Abolfazl; Molaei, Saeideh; Sheijooni Fumani, Neda; Abedi, Ehsan

    2016-04-15

    In the present study, antifouling paint booster biocides, Irgarol 1051 and diuron were measured in ports and marinas of Bushehr, Iran. Results showed that in seawater samples taken from ports and marinas, Irgarol was found at the range of less than LOD to 63.4ngL(-1) and diuron was found to be at the range of less than LOD to 29.1ngL(-1) (in Jalali marina). 3,4-dichloroaniline (3,4-DCA), as a degradation product of diuron, was also analyzed and its maximum concentration was 390ngL(-1). Results for analysis of Irgarol 1051 in sediments showed a maximum concentration of 35.4ngg(-1) dry weight in Bandargah marina. A comparison between the results of this study and those of other published works showed that Irgarol and diuron pollutions in ports and marinas of Bushehr located in the Persian Gulf were less than the average of reports from other parts of the world. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Development and validation of a composite scoring system for robot-assisted surgical training--the Robotic Skills Assessment Score.

    Science.gov (United States)

    Chowriappa, Ashirwad J; Shi, Yi; Raza, Syed Johar; Ahmed, Kamran; Stegemann, Andrew; Wilding, Gregory; Kaouk, Jihad; Peabody, James O; Menon, Mani; Hassett, James M; Kesavadas, Thenkurussi; Guru, Khurshid A

    2013-12-01

    A standardized scoring system does not exist in virtual reality-based assessment metrics to describe safe and crucial surgical skills in robot-assisted surgery. This study aims to develop an assessment score along with its construct validation. All subjects performed key tasks on the previously validated Fundamental Skills of Robotic Surgery curriculum, which were recorded, and metrics were stored. After an expert consensus for the purpose of content validation (Delphi), critical safety determining procedural steps were identified from the Fundamental Skills of Robotic Surgery curriculum and a hierarchical task decomposition of multiple parameters using a variety of metrics was used to develop the Robotic Skills Assessment Score (RSA-Score). Robotic Skills Assessment mainly focuses on safety in operative field, critical error, economy, bimanual dexterity, and time. Subsequently, the RSA-Score was further evaluated for construct validation and feasibility. Spearman correlation tests performed between tasks using the RSA-Scores indicate no cross correlation. Wilcoxon rank sum tests were performed between the two groups. The proposed RSA-Score was evaluated on non-robotic surgeons (n = 15) and on expert-robotic surgeons (n = 12). The expert group demonstrated significantly better performance on all four tasks in comparison to the novice group. Validation of the RSA-Score in this study was carried out on the Robotic Surgical Simulator. The RSA-Score is a valid scoring system that could be incorporated in any virtual reality-based surgical simulator to achieve standardized assessment of fundamental surgical tenets during robot-assisted surgery. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Recursive and non-linear logistic regression: moving on from the original EuroSCORE and EuroSCORE II methodologies.

    Science.gov (United States)

    Poullis, Michael

    2014-11-01

    EuroSCORE II, despite improving on the original EuroSCORE system, has not solved all the calibration and predictability issues. Recursive, non-linear and mixed recursive and non-linear regression analysis were assessed with regard to sensitivity, specificity and predictability of the original EuroSCORE and EuroSCORE II systems. The original logistic EuroSCORE, EuroSCORE II and recursive, non-linear and mixed recursive and non-linear regression analyses of these risk models were assessed via receiver operator characteristic curves (ROC) and Hosmer-Lemeshow statistic analysis with regard to the accuracy of predicting in-hospital mortality. Analysis was performed for isolated coronary artery bypass grafts (CABGs) (n = 2913), aortic valve replacement (AVR) (n = 814), mitral valve surgery (n = 340), combined AVR and CABG (n = 517), aortic (n = 350), miscellaneous cases (n = 642), and combinations of the above cases (n = 5576). The original EuroSCORE had an ROC below 0.7 for isolated AVR and combined AVR and CABG. None of the methods described increased the ROC above 0.7. The EuroSCORE II risk model had an ROC below 0.7 for isolated AVR only. Recursive regression, non-linear regression, and mixed recursive and non-linear regression all increased the ROC above 0.7 for isolated AVR. The original EuroSCORE had a Hosmer-Lemeshow statistic that was above 0.05 for all patients and the subgroups analysed. All of the techniques markedly increased the Hosmer-Lemeshow statistic. The EuroSCORE II risk model had a Hosmer-Lemeshow statistic that was significant for all patients (P linear regression failed to improve on the original Hosmer-Lemeshow statistic. The mixed recursive and non-linear regression using the EuroSCORE II risk model was the only model that produced an ROC of 0.7 or above for all patients and procedures and had a Hosmer-Lemeshow statistic that was highly non-significant. The original EuroSCORE and the EuroSCORE II risk models do not have adequate ROC and Hosmer
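
    The two model checks discussed above, discrimination via the ROC curve and calibration via the Hosmer-Lemeshow statistic, can be computed for any vector of predicted risks and observed outcomes. The sketch below uses synthetic data and assumes NumPy, SciPy and scikit-learn are available; it is not the EuroSCORE implementation.

    # ROC AUC (discrimination) and Hosmer-Lemeshow statistic (calibration)
    # for predicted risks vs observed 0/1 outcomes; data are synthetic.
    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    risk = rng.uniform(0.01, 0.4, size=2000)     # predicted in-hospital mortality
    died = rng.binomial(1, risk)                 # synthetic observed outcomes

    auc = roc_auc_score(died, risk)

    # Hosmer-Lemeshow: group by deciles of predicted risk, compare observed vs expected.
    deciles = np.quantile(risk, np.linspace(0, 1, 11))
    groups = np.clip(np.digitize(risk, deciles[1:-1]), 0, 9)
    hl = 0.0
    for g in range(10):
        idx = groups == g
        n, obs, exp = idx.sum(), died[idx].sum(), risk[idx].sum()
        hl += (obs - exp) ** 2 / (exp * (1 - exp / n))
    p_value = stats.chi2.sf(hl, df=8)            # 10 groups -> 8 degrees of freedom
    print(f"AUC={auc:.3f}  HL={hl:.2f}  p={p_value:.3f}")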

  20. The Veterans Affairs Cardiac Risk Score: Recalibrating the Atherosclerotic Cardiovascular Disease Score for Applied Use.

    Science.gov (United States)

    Sussman, Jeremy B; Wiitala, Wyndy L; Zawistowski, Matthew; Hofer, Timothy P; Bentley, Douglas; Hayward, Rodney A

    2017-09-01

    Accurately estimating cardiovascular risk is fundamental to good decision-making in cardiovascular disease (CVD) prevention, but risk scores developed in one population often perform poorly in dissimilar populations. We sought to examine whether a large integrated health system can use their electronic health data to better predict individual patients' risk of developing CVD. We created a cohort using all patients ages 45-80 who used Department of Veterans Affairs (VA) ambulatory care services in 2006 with no history of CVD, heart failure, or loop diuretics. Our outcome variable was new-onset CVD in 2007-2011. We then developed a series of recalibrated scores, including a fully refit "VA Risk Score-CVD (VARS-CVD)." We tested the different scores using standard measures of prediction quality. For the 1,512,092 patients in the study, the Atherosclerotic cardiovascular disease risk score had similar discrimination as the VARS-CVD (c-statistic of 0.66 in men and 0.73 in women), but the Atherosclerotic cardiovascular disease model had poor calibration, predicting 63% more events than observed. Calibration was excellent in the fully recalibrated VARS-CVD tool, but simpler techniques tested proved less reliable. We found that local electronic health record data can be used to estimate CVD better than an established risk score based on research populations. Recalibration improved estimates dramatically, and the type of recalibration was important. Such tools can also easily be integrated into health system's electronic health record and can be more readily updated.

  1. The FAt Spondyloarthritis Spine Score (FASSS)

    DEFF Research Database (Denmark)

    Pedersen, Susanne Juhl; Zhao, Zheng; Lambert, Robert Gw

    2013-01-01

    an important measure of treatment efficacy as well as a surrogate marker for new bone formation. The aim of this study was to develop and validate a new scoring method for fat lesions in the spine, the Fat SpA Spine Score (FASSS), which in contrast to the existing scoring method addresses the localization......Studies have shown that fat lesions follow resolution of inflammation in the spine of patients with axial spondyloarthritis (SpA). Fat lesions at vertebral corners have also been shown to predict development of new syndesmophytes. Therefore, scoring of fat lesions in the spine may constitute both...

  2. SOS score: an optimized score to screen acute stroke patients for obstructive sleep apnea.

    Science.gov (United States)

    Camilo, Millene R; Sander, Heidi H; Eckeli, Alan L; Fernandes, Regina M F; Dos Santos-Pontelli, Taiza E G; Leite, Joao P; Pontes-Neto, Octavio M

    2014-09-01

    Obstructive sleep apnea (OSA) is frequent in acute stroke patients, and has been associated with higher mortality and worse prognosis. Polysomnography (PSG) is the gold standard diagnostic method for OSA, but it is impracticable as a routine for all acute stroke patients. We evaluated the accuracy of two OSA screening tools, the Berlin Questionnaire (BQ), and the Epworth Sleepiness Scale (ESS) when administered to relatives of acute stroke patients; we also compared these tools against a combined screening score (SOS score). Ischemic stroke patients were submitted to a full PSG at the first night after onset of symptoms. OSA severity was measured by apnea-hypopnea index (AHI). BQ and ESS were administered to relatives of stroke patients before the PSG and compared to SOS score for accuracy and C-statistics. We prospectively studied 39 patients. OSA (AHI ≥10/h) was present in 76.9%. The SOS score [area under the curve (AUC): 0.812; P = 0.005] and ESS (AUC: 0.789; P = 0.009) had good predictive value for OSA. The SOS score was the only tool with significant predictive value (AUC: 0.686; P = 0.048) for severe OSA (AHI ≥30/h), when compared to ESS (P = 0.119) and BQ (P = 0.191). The threshold of SOS ≤10 showed high sensitivity (90%) and negative predictive value (96.2%) for OSA; SOS ≥20 showed high specificity (100%) and positive predictive value (92.5%) for severe OSA. The SOS score administered to relatives of stroke patients is a useful tool to screen for OSA and may decrease the need for PSG in acute stroke setting. Copyright © 2014 Elsevier B.V. All rights reserved.
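
    The screening metrics reported for the SOS cut-offs (sensitivity, specificity, PPV, NPV) follow directly from a 2x2 table against the PSG reference. A minimal sketch with hypothetical counts, not the study data:

    # Screening metrics of a score cut-off against a reference diagnosis.
    def screening_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # e.g. SOS > 10 treated as a positive screen for OSA (AHI >= 10/h)
    print(screening_metrics(tp=27, fp=4, fn=3, tn=5))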

  3. Analogue of Pontryagin's maximum principle for multiple integrals minimization problems

    OpenAIRE

    Mikhail, Zelikin

    2016-01-01

    A theorem analogous to Pontryagin's maximum principle for multiple integrals is proved. Unlike the usual maximum principle, the maximum should be taken not over all matrices, but only over matrices of rank one. Examples are given.

  4. 3D Space Shift from CityGML LoD3-Based Multiple Building Elements to a 3D Volumetric Object

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2017-01-01

    Full Text Available In contrast with photorealistic visualizations, urban landscape applications, and building information system (BIM), 3D volumetric presentations highlight specific calculations and applications of 3D building elements for 3D city planning and 3D cadastres. Knowing the precise volumetric quantities and the 3D boundary locations of 3D building spaces is a vital index which must remain constant during data processing because the values are related to space occupation, tenure, taxes, and valuation. To meet these requirements, this paper presents a five-step algorithm for performing a 3D building space shift. This algorithm is used to convert multiple building elements into a single 3D volumetric building object while maintaining the precise volume of the 3D space and without changing the 3D locations or displacing the building boundaries. As examples, this study used input data and building elements based on City Geography Markup Language (CityGML) LoD3 models. This paper presents a method for 3D urban space and 3D property management with the goal of constructing a 3D volumetric object for an integral building using CityGML objects, by fusing the geometries of various building elements. The resulting objects possess true 3D geometry that can be represented by solid geometry and saved to a CityGML file for effective use in 3D urban planning and 3D cadastres.

  5. Development of the siriraj clinical asthma score.

    Science.gov (United States)

    Vichyanond, Pakit; Veskitkul, Jittima; Rienmanee, Nuanphong; Pacharn, Punchama; Jirapongsananuruk, Orathai; Visitsunthorn, Nualanong

    2013-09-01

    Acute asthmatic attack in children commonly occurs despite the introduction of effective controllers such as inhaled corticosteroids and leukotriene modifiers. Treatment of acute asthmatic attack requires proper evaluation of attack severity and appropriate selection of medical therapy. In children, measurement of lung function is difficult during acute attack and thus clinical asthma scoring may aid physicians in making further decisions regarding treatment and admission. We enrolled 70 children with acute asthmatic attack with age range from 1 to 12 years (mean ± SD = 51.5 ± 31.8 months) into the study. Twelve selected asthma severity items were assessed by 2 independent observers prior to administration of salbutamol nebulization (up to 3 doses at 20 minutes interval). Decision for further therapy and admission was made by the emergency department physician. Three different scoring systems were constructed from items with best validity. Sensitivity, specificity and accuracy of these scores were assessed. Inter-rater reliability was assessed for each score. Review of previous scoring systems was also conducted and reported. Three severity items had poor validity, i.e., cyanosis, depressed cerebral function, and I:E ratio (p > 0.05). Three items had poor inter-rater reliability, i.e., breath sound quality, air entry, and I:E ratio. These items were omitted and three new clinical scores were constructed from the remaining items. The clinical scoring system comprising retractions, dyspnea, O2 saturation, respiratory rate and wheezing (range of score 0-10) gave the best accuracy and inter-rater variability and was chosen for clinical use: the Siriraj Clinical Asthma Score (SCAS). A Clinical Asthma Score that is simple, relatively easy to administer and with good validity and variability is essential for treatment of acute asthma in children. Several good candidate scores have been introduced in the past. We described the development of the Siriraj Clinical Asthma Score (SCAS) in

  6. A Novel Scoring System Approach to Assess Patients with Lyme Disease (Nutech Functional Score)

    Directory of Open Access Journals (Sweden)

    Geeta Shroff

    2018-01-01

    Full Text Available Introduction: A bacterial infection by Borrelia burgdorferi referred to as Lyme disease (LD) or borreliosis is transmitted mostly by a bite of the tick Ixodes scapularis in the USA and Ixodes ricinus in Europe. Various tests are used for the diagnosis of LD, but their results are often unreliable. We compiled a list of clinically visible and patient-reported symptoms that are associated with LD. Based on this list, we developed a novel scoring system. Methodology: The Nutech Functional Score (NFS) is a 43-point positional (every symptom is subgraded and each alternative gets some points according to its position) and directional (moves in the direction bad to good) scoring system that assesses the patient's condition. Results: The grades of the scoring system have been converted into numeric values for conducting probability based studies. Each symptom is graded from 1 to 5, running in the direction BAD → GOOD. Conclusion: NFS is a unique tool that can be used universally to assess the condition of patients with LD.

  7. Matching score based face recognition

    NARCIS (Netherlands)

    Boom, B.J.; Beumer, G.M.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2006-01-01

    Accurate face registration is of vital importance to the performance of a face recognition algorithm. We propose a new method: matching score based face registration, which searches for optimal alignment by maximizing the matching score output of a classifier as a function of the different

  8. Microprocessor Controlled Maximum Power Point Tracker for Photovoltaic Application

    International Nuclear Information System (INIS)

    Jiya, J. D.; Tahirou, G.

    2002-01-01

    This paper presents a microprocessor controlled maximum power point tracker for a photovoltaic module. Input current and voltage are measured and multiplied within the microprocessor, which contains an algorithm to seek the maximum power point. The duty cycle of the DC-DC converter at which the maximum power occurs is obtained, noted and adjusted. The microprocessor constantly seeks to improve the obtained power by varying the duty cycle.
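
    The algorithm described above is essentially the classic perturb-and-observe approach: measure voltage and current, compute power, and keep nudging the duty cycle in whichever direction last increased power. The Python sketch below illustrates the idea with a toy power curve standing in for real ADC readings; it is not the authors' microprocessor code.

    # Perturb-and-observe MPPT sketch with a toy PV power curve.
    import random

    def measure_power(duty):
        # toy stand-in for V*I measurements, with its maximum near duty = 0.62
        return 100.0 - 400.0 * (duty - 0.62) ** 2 + random.uniform(-0.2, 0.2)

    def perturb_and_observe(duty=0.5, step=0.01, iterations=200):
        direction, prev_power = +1, measure_power(duty)
        for _ in range(iterations):
            duty = min(max(duty + direction * step, 0.05), 0.95)
            power = measure_power(duty)
            if power < prev_power:      # last perturbation moved away from the MPP
                direction = -direction
            prev_power = power
        return duty

    print(f"duty cycle after tracking: {perturb_and_observe():.2f}")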

  9. A scoring system for ascertainment of incident stroke; the Risk Index Score (RISc).

    Science.gov (United States)

    Kass-Hout, T A; Moyé, L A; Smith, M A; Morgenstern, L B

    2006-01-01

    The main objective of this study was to develop and validate a computer-based statistical algorithm that could be translated into a simple scoring system in order to ascertain incident stroke cases using hospital admission medical records data. The Risk Index Score (RISc) algorithm was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project, 2000. The validity of RISc was evaluated by estimating the concordance of scoring system stroke ascertainment to stroke ascertainment by physician and/or abstractor review of hospital admission records. RISc was developed on 1718 randomly selected patients (training set) and then statistically validated on an independent sample of 858 patients (validation set). A multivariable logistic model was used to develop RISc and subsequently evaluated by goodness-of-fit and receiver operating characteristic (ROC) analyses. The higher the value of RISc, the higher the patient's risk of potential stroke. The study showed RISc was well calibrated and discriminated those who had potential stroke from those that did not on initial screening. In this study we developed and validated a rapid, easy, efficient, and accurate method to ascertain incident stroke cases from routine hospital admission records for epidemiologic investigations. Validation of this scoring system was achieved statistically; however, clinical validation in a community hospital setting is warranted.

  10. Maximum entropy reconstructions for crystallographic imaging; Cristallographie et reconstruction d'images par maximum d'entropie

    Energy Technology Data Exchange (ETDEWEB)

    Papoular, R

    1997-07-01

    The Fourier Transform is of central importance to Crystallography since it allows the visualization in real space of tridimensional scattering densities pertaining to physical systems from diffraction data (powder or single-crystal diffraction, using x-rays, neutrons, electrons or others). In turn, this visualization makes it possible to model and parametrize these systems, the crystal structures of which are eventually refined by Least-Squares techniques (e.g., the Rietveld method in the case of Powder Diffraction). The Maximum Entropy Method (sometimes called MEM or MaxEnt) is a general imaging technique, related to solving ill-conditioned inverse problems. It is ideally suited for tackling underdetermined systems of linear equations (for which the number of variables is much larger than the number of equations). It is already being applied successfully in Astronomy, Radioastronomy and Medical Imaging. The advantages of using Maximum Entropy over conventional Fourier and 'difference Fourier' syntheses stem from the following facts: MaxEnt takes the experimental error bars into account; MaxEnt incorporates Prior Knowledge (e.g., the positivity of the scattering density in some instances); MaxEnt allows density reconstructions from incompletely phased data, as well as from overlapping Bragg reflections; MaxEnt substantially reduces truncation errors to which conventional experimental Fourier reconstructions are usually prone. The principles of Maximum Entropy imaging as applied to Crystallography are first presented. The method is then illustrated by a detailed example specific to Neutron Diffraction: the search for protons in solids. (author). 17 refs.

  11. Molecular analysis and test of linkage between the FMR-I gene and infantile autism in multiplex families

    Energy Technology Data Exchange (ETDEWEB)

    Hallmayer, J.; Pintado, E.; Lotspeich, L.; Spiker, D.; Kraemer, H.C.; Lee Wong, D.; Lin, A.; Herbert, J.; Cavalli-Sforza, L.L.; Ciaranello, R.D. [Stanford Univ., CA (United States)] [and others]

    1994-11-01

    Approximately 2%-5% of autistic children show cytogenetic evidence of the fragile X syndrome. This report tests whether infantile autism in multiplex autism families arises from an unusual manifestation of the fragile X syndrome. This could arise either from expansion of the (CGG)n trinucleotide repeat in FMR-1 or from a mutation elsewhere in the gene. We studied 35 families that met stringent criteria for multiplex autism. Amplification of the trinucleotide repeat and analysis of methylation status were performed in 79 autistic children and in 31 of their unaffected siblings by Southern blot analysis. No examples of amplified repeats were seen in the autistic or control children or in their parents or grandparents. We next examined the hypothesis that there was a mutation elsewhere in the FMR-1 gene, by linkage analysis in 32 of these families. We tested four different dominant models and a recessive model. Linkage to FMR-1 could be excluded (lod score between -24 and -62) in all models by using probes DXS548, FRAXAC1, and FRAXAC2 and the CGG repeat itself. Tests for heterogeneity in this sample were negative, and the occurrence of positive lod scores in this data set could be attributed to chance. Analysis of the data by the affected-sib method also did not show evidence for linkage of any marker to autism. These results enable us to reject the hypothesis that multiplex autism arises from expansion of the (CGG)n trinucleotide repeat in FMR-1. Further, because the overall lod scores for all probes in all models tested were highly negative, linkage to FMR-1 can also be ruled out in multiplex autistic families. 35 refs., 2 figs., 5 tabs.

  12. Genomic screening for dissection of a complex disease: The multiple sclerosis phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Haines, J.L.; Bazyk, A.; Gusella, J.F. [Massachusetts General Hospital, Boston, MA (United States)] [and others]

    1994-09-01

    Application of positional cloning to diseases with a complex etiology is fraught with problems. These include undefined modes of inheritance, heterogeneity, and epistasis. Although microsatellite markers now make genotyping the genome a straightforward task, no single analytical method is available to efficiently and accurately use these data for a complex disease. We have developed a multi-stage genomic screening strategy which uses a combination of non-parametric approaches (Affected Pedigree Member (APM) linkage analysis and robust sib pair analysis (SP)), and the parametric lod score approach (using four different genetic models). To warrant follow-up, a marker must have two or more of: a nominal P value of 0.05 or less on the non-parametric tests, or a lod score greater than 1.0 for any model. Two adjacent markers each fulfilling one criterion are also considered for follow-up. These criteria were determined both by simulation studies and our empirical experience in screening a large number of other disorders. We applied this approach to multiple sclerosis (MS), a complex neurological disorder with a strong but ill-defined genetic component. Analysis of the first 91 markers from our screen of 55 multiplex families found 5 markers which met the SP criteria, 13 markers which met the APM criteria, and 8 markers which met the lod score criteria. Five regions (on chromosomes 2, 4, 7, 14, and 19) met our overall criteria. However, no single method identified all of these regions, suggesting that each method is sensitive to various (unknown) influences. The chromosome 14 results were not supported by follow-up typing and analysis of markers in that region, but the chromosome 19 results remain well supported. Updated screening results will be presented.
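
    The marker follow-up rule quoted above can be expressed as a small filter: a marker qualifies if it meets two or more of the stated criteria (a nominal P of 0.05 or less on either non-parametric test, or a lod score above 1.0 under any parametric model). The sketch below uses hypothetical marker records and omits the additional adjacent-marker rule mentioned in the abstract.

    # Follow-up filter for the multi-stage screen; marker records are hypothetical.
    def follow_up(marker):
        hits = 0
        hits += marker["apm_p"] <= 0.05        # Affected Pedigree Member test
        hits += marker["sp_p"] <= 0.05         # robust sib-pair test
        hits += max(marker["lods"]) > 1.0      # best lod over the four models
        return hits >= 2

    markers = [
        {"name": "D2S123", "apm_p": 0.03, "sp_p": 0.20, "lods": [0.4, 1.3, 0.9, 0.2]},
        {"name": "D14S55", "apm_p": 0.30, "sp_p": 0.40, "lods": [0.1, 0.5, 0.2, 0.0]},
    ]
    print([m["name"] for m in markers if follow_up(m)])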

  13. Comprehensive performance analyses and optimization of the irreversible thermodynamic cycle engines (TCE) under maximum power (MP) and maximum power density (MPD) conditions

    International Nuclear Information System (INIS)

    Gonca, Guven; Sahin, Bahri; Ust, Yasin; Parlak, Adnan

    2015-01-01

    This paper presents comprehensive performance analyses and comparisons for air-standard irreversible thermodynamic cycle engines (TCE) based on the power output, power density, thermal efficiency, maximum dimensionless power output (MP), maximum dimensionless power density (MPD) and maximum thermal efficiency (MEF) criteria. Internal irreversibility of the cycles occurring during the irreversible-adiabatic processes is considered by using isentropic efficiencies of the compression and expansion processes. The performances of the cycles are obtained by using engine design parameters such as isentropic temperature ratio of the compression process, pressure ratio, stroke ratio, cut-off ratio, Miller cycle ratio, exhaust temperature ratio, cycle temperature ratio and cycle pressure ratio. The effects of engine design parameters on the maximum and optimal performances are investigated. - Highlights: • Performance analyses are conducted for irreversible thermodynamic cycle engines. • Comprehensive computations are performed. • Maximum and optimum performances of the engines are shown. • The effects of design parameters on performance and power density are examined. • The results obtained may be guidelines for engine designers

  14. Comparison of the Classifier Oriented Gait Score and the Gait Profile Score based on imitated gait impairments.

    Science.gov (United States)

    Christian, Josef; Kröll, Josef; Schwameder, Hermann

    2017-06-01

    Common summary measures of gait quality such as the Gait Profile Score (GPS) are based on the principle of measuring a distance from the mean pattern of a healthy reference group in a gait pattern vector space. The recently introduced Classifier Oriented Gait Score (COGS) is a pathology specific score that measures this distance in a unique direction, which is indicated by a linear classifier. This approach has potentially improved the discriminatory power to detect subtle changes in gait patterns but does not incorporate a profile of interpretable sub-scores like the GPS. The main aims of this study were to extend the COGS by decomposing it into interpretable sub-scores as realized in the GPS and to compare the discriminative power of the GPS and COGS. Two types of gait impairments were imitated to enable a high level of control of the gait patterns. Imitated impairments were realized by restricting knee extension and inducing leg length discrepancy. The results showed increased discriminatory power of the COGS for differentiating diverse levels of impairment. Comparison of the GPS and COGS sub-scores and their ability to indicate changes in specific variables supports the validity of both scores. The COGS is an overall measure of gait quality with increased power to detect subtle changes in gait patterns and might be well suited for tracing the effect of a therapeutic treatment over time. The newly introduced sub-scores improved the interpretability of the COGS, which is helpful for practical applications. Copyright © 2017 Elsevier B.V. All rights reserved.
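
    The geometric idea separating the two scores can be stated compactly: a GPS-like score is the RMS (Euclidean) distance of a gait-pattern vector from the healthy reference mean, whereas a COGS-like score is the distance measured along the single direction given by a linear classifier. The sketch below uses synthetic vectors and weights; it is not the published scoring code.

    # GPS-like vs COGS-like distances in a gait-pattern vector space (synthetic data).
    import numpy as np

    rng = np.random.default_rng(1)
    healthy_mean = rng.normal(size=50)        # reference gait pattern vector
    subject = healthy_mean + rng.normal(scale=0.3, size=50)
    w = rng.normal(size=50)                   # stand-in linear classifier weights
    w /= np.linalg.norm(w)

    gps_like = np.sqrt(np.mean((subject - healthy_mean) ** 2))  # RMS distance
    cogs_like = np.dot(subject - healthy_mean, w)               # signed distance along w
    print(f"GPS-like: {gps_like:.3f}   COGS-like: {cogs_like:.3f}")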

  15. Scoring System Improvements to Three Leadership Predictors

    National Research Council Canada - National Science Library

    Dela

    1997-01-01

    .... The modified scoring systems were evaluated by rescoring responses randomly selected from the sample which had been scored according to the scoring systems originally developed for the leadership research...

  16. Shower maximum detector for SDC calorimetry

    International Nuclear Information System (INIS)

    Ernwein, J.

    1994-01-01

    A prototype for the SDC end-cap (EM) calorimeter complete with a pre-shower and a shower maximum detector was tested in beams of electrons and π's at CERN by an SDC subsystem group. The prototype was manufactured from scintillator tiles and strips read out with 1 mm diameter wave-length shifting fibers. The design and construction of the shower maximum detector is described, and results of laboratory tests on light yield and performance of the scintillator-fiber system are given. Preliminary results on energy and position measurements with the shower max detector in the test beam are shown. (authors). 4 refs., 5 figs

  17. Modelling sequentially scored item responses

    NARCIS (Netherlands)

    Akkermans, W.

    2000-01-01

    The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is

  18. The APPLE Score - A Novel Score for the Prediction of Rhythm Outcomes after Repeat Catheter Ablation of Atrial Fibrillation.

    Directory of Open Access Journals (Sweden)

    Jelena Kornej

    Full Text Available Arrhythmia recurrences after catheter ablation occur in up to 50% within one year but their prediction remains challenging. Recently, we developed a novel score for the prediction of rhythm outcomes after single AF ablation demonstrating superiority to other scores. The current study was performed to 1) prove the predictive value of the APPLE score in patients undergoing repeat AF ablation and 2) compare it with the CHADS2 and CHA2DS2-VASc scores. Rhythm outcomes between 3 and 12 months after AF ablation were documented. The APPLE score (one point for Age >65 years, Persistent AF, imPaired eGFR (<60 ml/min/1.73m2), LA diameter ≥43 mm, EF <50%) was calculated in every patient before the procedure. 379 consecutive patients from The Leipzig Heart Center AF Ablation Registry (60±10 years, 65% male, 70% paroxysmal AF) undergoing repeat AF catheter ablation were included. Arrhythmia recurrences were observed in 133 patients (35%). While the CHADS2 (AUC 0.577, p = 0.037) and CHA2DS2-VASc scores (AUC 0.590, p = 0.015) demonstrated low predictive value, the APPLE score showed better prediction of arrhythmia recurrences (AUC 0.617, p = 0.002) than the other scores (both p<0.001). Compared to patients with an APPLE score of 0, the risk (OR) for arrhythmia recurrences was 2.9, 3.0 and 6.0 (all p<0.01) for APPLE scores of 1, 2, or ≥3, respectively. The novel APPLE score is superior to the CHADS2 and CHA2DS2-VASc scores for prediction of rhythm outcomes after repeat AF catheter ablation. It may be helpful to identify patients with low, intermediate or high risk for recurrences after a repeat procedure.

  19. A comparative study of MR imaging scores and MR perfusion imaging in pre-operative grading of intracranial gliomas

    International Nuclear Information System (INIS)

    Wu Honglin; Chen Junkun; Zhang Zongjun; Lu Guangming; Chen Ziqian; Wang Wei; Ji Xueman; Tang Xiaojun; Li Lin

    2005-01-01

    Objective: To compare the accuracy of MR imaging scores with MR perfusion imaging in pre-operative grading of intracranial gliomas. Methods: Thirty patients with intracranial gliomas (8 low-grade and 22 high-grade, according to WHO criteria) were examined with MR perfusion imaging pre-operatively. The lesions were evaluated by using an MR imaging score based on nine criteria. The rCBV of the lesions was calculated by comparing the CBV of the lesion and that of contralateral normal white matter. The scores and ratios in high-grade and low-grade tumours were compared. Results: The MR imaging score of low grade (grades I and II) gliomas (0.67±0.29) was significantly lower than that of grades III (1.32 ± 0.47) (t=-3.48, P=0.003) and IV (1.56 ± 0.20) (t=-7.36, P=0.000) gliomas. There was no statistical difference when MR imaging scores of grades III and IV gliomas (t=-1.39, P=0.182) were compared. The maximum rCBV ratio of low grade (grades I and II) gliomas (2.38 ± 0.66) was significantly lower than that of grades III (5.81 ± 3.20) (t=-3.57, P=0.003) and IV (6.99 ± 2.47) (t=-5.09, P=0.001). There was no statistical difference when rCBV ratios of grades III and IV (t =-0.93, P=0.365) gliomas were compared. The accuracy of MR imaging scores in the noninvasive grading of untreated gliomas was almost the same as that of MR perfusion imaging (90.00% vs 89.29%). Conclusion: The MR imaging scores and MR perfusion imaging are two very useful tools in the evaluation of the histopathologic grade of cerebral gliomas. The overall accuracy in the noninvasive grading of gliomas may be improved if MR imaging scores and MR perfusion imaging are combined. (authors)

  20. Maximum Power Point Tracking in Variable Speed Wind Turbine Based on Permanent Magnet Synchronous Generator Using Maximum Torque Sliding Mode Control Strategy

    Institute of Scientific and Technical Information of China (English)

    Esmaeil Ghaderi; Hossein Tohidi; Behnam Khosrozadeh

    2017-01-01

    The present study was carried out in order to track the maximum power point in a variable speed turbine by minimizing electromechanical torque changes using a sliding mode control strategy. In this strategy, first, the rotor speed is set at an optimal point for different wind speeds. As a result, the tip speed ratio reaches an optimal point, the mechanical power coefficient is maximized, and the wind turbine produces its maximum power and mechanical torque. Then, the maximum mechanical torque is tracked using electromechanical torque. In this technique, the tracking error integral of the maximum mechanical torque, the error, and the derivative of the error are used as state variables. During changes in wind speed, sliding mode control is designed to absorb the maximum energy from the wind and minimize the response time of maximum power point tracking (MPPT). In this method, the actual control input signal is formed from a second order integral operation of the original sliding mode control input signal. The result of the second order integral in this model includes control signal integrity, full chattering attenuation, and prevention of large fluctuations in the power generator output. The simulation results, calculated using MATLAB/m-file software, have shown the effectiveness of the proposed control strategy for wind energy systems based on the permanent magnet synchronous generator (PMSG).

  1. Combining Teacher Assessment Scores with External Examination ...

    African Journals Online (AJOL)

    Combining Teacher Assessment Scores with External Examination Scores for Certification: Comparative Study of Four Statistical Models. ... University entrance examination scores in mathematics were obtained for a subsample of 115 ...

  2. Hereditary motor and sensory neuropathy with proximal dominancy in the lower extremities, urinary disturbance, and paroxysmal dry cough.

    Science.gov (United States)

    Miura, Shiroh; Shibata, Hiroki; Kida, Hiroshi; Noda, Kazuhito; Tomiyasu, Katsuro; Yamamoto, Ken; Iwaki, Akiko; Ayabe, Mitsuyoshi; Aizawa, Hisamichi; Taniwaki, Takayuki; Fukumaki, Yasuyuki

    2008-10-15

    We studied a four-generation pedigree of a Japanese family with hereditary neuropathy to elucidate the genetic basis of this disease. Twelve members of the family were enrolled in this study. The clinical features were neurogenic muscle weakness with proximal dominancy in the lower extremities, sensory involvement, areflexia, fine postural tremors, painful muscle cramps, elevated creatine kinase levels, recurrent paroxysmal dry cough, and neurogenic bladder. We performed a genome-wide search using genetic loci spaced at about 13 Mb intervals. Although nine chromosomes (1, 3, 4, 5, 6, 10, 17, 19, and 22) had at least one region in which the logarithm of odds (LOD) score was over 1.0, no loci fulfilled the criteria for significant evidence of linkage. Moreover, we analyzed an extra 14 markers on 3p12-q13 (the locus of hereditary motor and sensory neuropathy, proximal dominant form) and an extra five markers on 3p22-p24 (the locus of hereditary sensory neuropathy with chronic cough) and observed LOD scores of hereditary motor and sensory neuropathy with autosomal dominant inheritance.

  3. A locus identified on chromosome18p11.31 is associated with hippocampal abnormalities in a family with mesial temporal lobe epilepsy

    Directory of Open Access Journals (Sweden)

    Claudia Vianna Maurer-Morelli

    2012-08-01

    Full Text Available We aimed to identify the region harboring a putative candidate gene associated with hippocampal abnormalities (HAb) in a family with mesial temporal lobe epilepsy (MTLE). Genome-wide scan was performed in one large kindred with MTLE using a total of 332 microsatellite markers at ~12cM intervals. An additional 13 markers were genotyped in the candidate region. Phenotypic classes were defined according to the presence of hippocampal atrophy and/or hyperintense hippocampal T2 signal detected on magnetic resonance imaging. We identified a significant positive LOD score on chromosome 18p11.31 with a Zmax of 3.12 at D18S452. Multipoint LOD scores and haplotype analyses localized the candidate locus within a 6cM interval flanked by D18S976 and D18S967. We present here evidence that HAb, which were previously related mainly to environmental risk factors, may be influenced by genetic predisposition. This finding may have major impact in the study of the mechanisms underlying abnormalities in mesial temporal lobe structures and their relationship with MTLE.

  4. Technology Performance Level (TPL) Scoring Tool

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States); Roberts, Jesse D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Costello, Ronan [Wave Venture, Penstraze (United Kingdom); Bull, Diana L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Babarit, Aurelien [Ecole Centrale de Nantes (France). Lab. of Research in Hydrodynamics, Energetics, and Atmospheric Environment (LHEEA); Neilson, Kim [Ramboll, Copenhagen (Denmark); Bittencourt, Claudio [DNV GL, London (United Kingdom); Kennedy, Ben [Wave Venture, Penstraze (United Kingdom)

    2016-09-01

    Three different ways of combining scores are used in the revised formulation. These are arithmetic mean, geometric mean and multiplication with normalisation. Arithmetic mean is used when combining scores that measure similar attributes, e.g. for combining costs. The arithmetic mean has the property that it is similar to a logical OR, e.g. when combining costs it does not matter what the individual costs are, only what the combined cost is. Geometric mean and multiplication are used when combining scores that measure disparate attributes. Multiplication is similar to a logical AND; it is used to combine ‘must haves.’ As a result, this method is more punitive than the geometric mean; to get a good score in the combined result it is necessary to have a good score in ALL of the inputs, e.g. the different types of survivability are ‘must haves.’ On balance, the revised TPL is probably less punitive than the previous spreadsheet, since multiplication is used sparingly as a method of combining scores. This is in line with the feedback of the Wave Energy Prize judges.
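
    A minimal sketch of the three combination rules described above, for scores assumed to lie on a common 0-1 scale (the TPL tool's actual scales and normalisation are not reproduced here):

    # Three ways of combining sub-scores: OR-like arithmetic mean, and the more
    # punitive, AND-like geometric mean and multiplication. Inputs are hypothetical.
    import math

    def arithmetic_mean(scores):
        return sum(scores) / len(scores)

    def geometric_mean(scores):
        return math.prod(scores) ** (1.0 / len(scores))

    def multiplicative(scores):
        return math.prod(scores)

    costs = [0.6, 0.9, 0.3]          # similar attributes -> arithmetic mean
    must_haves = [0.9, 0.4, 0.8]     # e.g. survivability types -> multiplicative
    print(f"costs combined:      {arithmetic_mean(costs):.2f}")
    print(f"must-haves combined: {multiplicative(must_haves):.2f} "
          f"(geometric mean would give {geometric_mean(must_haves):.2f})")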

  5. 32 CFR 842.35 - Depreciation and maximum allowances.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Depreciation and maximum allowances. 842.35... LITIGATION ADMINISTRATIVE CLAIMS Personnel Claims (31 U.S.C. 3701, 3721) § 842.35 Depreciation and maximum allowances. The military services have jointly established the “Allowance List-Depreciation Guide” to...

  6. Do medical students’ scores using different assessment instruments predict their scores in clinical reasoning using a computer-based simulation?

    Directory of Open Access Journals (Sweden)

    Fida M

    2015-02-01

    Full Text Available Mariam Fida,1 Salah Eldin Kassab2 1Department of Molecular Medicine, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 2Department of Medical Education, Faculty of Medicine, Suez Canal University, Ismailia, Egypt Purpose: The development of clinical problem-solving skills evolves over time and requires structured training and background knowledge. Computer-based case simulations (CCS) have been used for teaching and assessment of clinical reasoning skills. However, previous studies examining the psychometric properties of CCS as an assessment tool have been controversial. Furthermore, studies reporting the integration of CCS into problem-based medical curricula have been limited. Methods: This study examined the psychometric properties of using CCS software (DxR Clinician) for assessment of medical students (n=130) studying in a problem-based, integrated multisystem module (Unit IX) during the academic year 2011–2012. Internal consistency reliability of CCS scores was calculated using Cronbach's alpha statistics. The relationships between students' scores in CCS components (clinical reasoning, diagnostic performance, and patient management) and their scores in other examination tools at the end of the unit including multiple-choice questions, short-answer questions, objective structured clinical examination (OSCE), and real patient encounters were analyzed using stepwise hierarchical linear regression. Results: Internal consistency reliability of CCS scores was high (α=0.862). Inter-item correlations between students' scores in different CCS components and their scores in CCS and other test items were statistically significant. Regression analysis indicated that OSCE scores predicted 32.7% and 35.1% of the variance in clinical reasoning and patient management scores, respectively (P<0.01). Multiple-choice question scores, however, predicted only 15.4% of the variance in diagnostic performance scores (P<0.01), while

  7. External validation of the simple clinical score and the HOTEL score, two scores for predicting short-term mortality after admission to an acute medical unit.

    Science.gov (United States)

    Stræde, Mia; Brabrand, Mikkel

    2014-01-01

    Clinical scores can be of aid to predict early mortality after admission to a medical admission unit. A developed scoring system needs to be externally validated to minimise the risk of the discriminatory power and calibration to be falsely elevated. We performed the present study with the objective of validating the Simple Clinical Score (SCS) and the HOTEL score, two existing risk stratification systems that predict mortality for medical patients based solely on clinical information, but not only vital signs. Pre-planned prospective observational cohort study. Danish 460-bed regional teaching hospital. We included 3046 consecutive patients from 2 October 2008 until 19 February 2009. 26 (0.9%) died within one calendar day and 196 (6.4%) died within 30 days. We calculated SCS for 1080 patients. We found an AUROC of 0.960 (95% confidence interval [CI], 0.932 to 0.988) for 24-hours mortality and 0.826 (95% CI, 0.774-0.879) for 30-day mortality, and goodness-of-fit test, χ(2) = 2.68 (10 degrees of freedom), P = 0.998 and χ(2) = 4.00, P = 0.947, respectively. We included 1470 patients when calculating the HOTEL score. Discriminatory power (AUROC) was 0.931 (95% CI, 0.901-0.962) for 24-hours mortality and goodness-of-fit test, χ(2) = 5.56 (10 degrees of freedom), P = 0.234. We find that both the SCS and HOTEL scores showed an excellent to outstanding ability in identifying patients at high risk of dying with good or acceptable precision.

  8. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    Science.gov (United States)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the most vulnerable deltas to the occurrence of extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best fit distribution models for annual, seasonal and monthly time series based on maximum rank with minimum value of test statistics, three statistical goodness of fit tests, viz. Kolmogorov-Smirnov test (K-S), Anderson-Darling test (A2) and Chi-Square test (X2), were employed. The best fit probability distribution was identified from the highest overall score obtained from the three goodness of fit tests. Results revealed that the normal probability distribution was best fitted for annual, post-monsoon and summer seasons MDR, while Lognormal, Weibull and Pearson 5 were best fitted for pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probability of getting an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 03 % level of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85 %) for MDR of >100 mm and moderate probabilities (37 to 46 %) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
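
    The distribution-fitting step described above can be sketched with SciPy: fit several candidate distributions to a series of annual maximum daily rainfall values, compare them with a goodness-of-fit test, and read off return levels from the chosen fit. The data below are synthetic, not the Sagar Island record, and only the Kolmogorov-Smirnov check is shown; Anderson-Darling and chi-square checks would be added analogously.

    # Fit candidate distributions to annual maximum daily rainfall and compare fits.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    mdr = rng.lognormal(mean=4.0, sigma=0.35, size=29)   # 29 synthetic annual MDR values (mm)

    candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
                  "weibull": stats.weibull_min, "gumbel": stats.gumbel_r}
    for name, dist in candidates.items():
        params = dist.fit(mdr)
        ks_stat, ks_p = stats.kstest(mdr, dist.cdf, args=params)
        print(f"{name:10s}  KS={ks_stat:.3f}  p={ks_p:.3f}")

    # Return level for a T-year return period from one fitted distribution
    T = 25
    params = stats.lognorm.fit(mdr)
    print(f"{T}-year MDR estimate: {stats.lognorm.ppf(1 - 1/T, *params):.0f} mm")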

  9. Quadratic prediction of factor scores

    NARCIS (Netherlands)

    Wansbeek, T

    1999-01-01

    Factor scores are naturally predicted by means of their conditional expectation given the indicators y. Under normality this expectation is linear in y but in general it is an unknown function of y. It is discussed that under nonnormality factor scores can be more precisely predicted by a quadratic

  10. Citation analytics: Data exploration and comparative analyses of CiteScores of Open Access and Subscription-Based publications indexed in Scopus (2014-2016).

    Science.gov (United States)

    Atayero, Aderemi A; Popoola, Segun I; Egeonu, Jesse; Oludayo, Olumuyiwa

    2018-08-01

    Citation is one of the important metrics that are used in measuring the relevance and the impact of research publications. The potentials of citation analytics may be exploited to understand the gains of publishing scholarly peer-reviewed research outputs in either Open Access (OA) sources or Subscription-Based (SB) sources in the bid to increase citation impact. However, relevant data required for such comparative analysis must be freely accessible for evidence-based findings and conclusions. In this data article, citation scores (CiteScores) of 2542 OA sources and 15,040 SB sources indexed in Scopus from 2014 to 2016 were presented and analyzed based on a set of five inclusion criteria. A robust dataset, which contains the CiteScores of OA and SB publication sources included, is attached as supplementary material to this data article to facilitate further reuse. Descriptive statistics and frequency distributions of OA CiteScores and SB CiteScores are presented in tables. Boxplot representations and scatter plots are provided to show the statistical distributions of OA CiteScores and SB CiteScores across the three sub-categories (Book Series, Journal, and Trade Journal). Correlation coefficient and p-value matrices are made available within the data article. In addition, Probability Density Functions (PDFs) and Cumulative Distribution Functions (CDFs) of OA CiteScores and SB CiteScores are computed and the results are presented using tables and graphs. Furthermore, Analysis of Variance (ANOVA) and multiple comparison post-hoc tests are conducted to understand the statistical difference (and its significance, if any) in the citation impact of OA publication sources and SB publication sources based on CiteScore. In the long run, the data provided in this article will help policy makers and researchers in Higher Education Institutions (HEIs) to identify the appropriate publication source type and category for dissemination of scholarly research findings with

  11. The impact of CT radiation dose reduction and iterative reconstruction algorithms from four different vendors on coronary calcium scoring

    Energy Technology Data Exchange (ETDEWEB)

    Willemink, Martin J.; Takx, Richard A.P.; Jong, Pim A. de; Budde, Ricardo P.J.; Schilham, Arnold M.R.; Leiner, Tim [Utrecht University Medical Center, Department of Radiology, Utrecht (Netherlands); Bleys, Ronald L.A.W. [Utrecht University Medical Center, Department of Anatomy, Utrecht (Netherlands); Das, Marco; Wildberger, Joachim E. [Maastricht University Medical Center, Department of Radiology, Maastricht (Netherlands); Prokop, Mathias [Radboud University Nijmegen Medical Center, Department of Radiology, Nijmegen (Netherlands); Buls, Nico; Mey, Johan de [UZ Brussel, Department of Radiology, Brussels (Belgium)

    2014-09-15

    To analyse the effects of radiation dose reduction and iterative reconstruction (IR) algorithms on coronary calcium scoring (CCS). Fifteen ex vivo human hearts were examined in an anthropomorphic chest phantom using computed tomography (CT) systems from four vendors and examined at four dose levels using unenhanced prospectively ECG-triggered protocols. Tube voltage was 120 kV and tube current differed between protocols. CT data were reconstructed with filtered back projection (FBP) and reduced dose CT data with IR. CCS was quantified with Agatston scores, calcification mass and calcification volume. Differences were analysed with the Friedman test. Fourteen hearts showed coronary calcifications. Dose reduction with FBP did not significantly change Agatston scores, calcification volumes and calcification masses (P > 0.05). Maximum differences in Agatston scores were 76, 26, 51 and 161 units, in calcification volume 97, 27, 42 and 162 mm³, and in calcification mass 23, 23, 20 and 48 mg, respectively. IR resulted in a trend towards lower Agatston scores and calcification volumes with significant differences for one vendor (P < 0.05). Median relative differences between reference FBP and reduced dose IR for Agatston scores remained within 2.0-4.6 %, 1.0-5.3 %, 1.2-7.7 % and 2.6-4.5 %, for calcification volumes within 2.4-3.9 %, 1.0-5.6 %, 1.1-6.4 % and 3.7-4.7 %, for calcification masses within 1.9-4.1 %, 0.9-7.8 %, 2.9-4.7 % and 2.5-3.9 %, respectively. IR resulted in increased, decreased or similar calcification masses. CCS derived from standard FBP acquisitions was not affected by radiation dose reductions up to 80 %. IR resulted in a trend towards lower Agatston scores and calcification volumes. (orig.)

  12. Clinical role of pathological downgrading after radical prostatectomy in patients with biopsy confirmed Gleason score 3 + 4 prostate cancer.

    Science.gov (United States)

    Gondo, Tatsuo; Poon, Bing Ying; Matsumoto, Kazuhiro; Bernstein, Melanie; Sjoberg, Daniel D; Eastham, James A

    2015-01-01

    To identify preoperative factors predicting Gleason score downgrading after radical prostatectomy (RP) in patients with biopsy Gleason score 3 + 4 prostate cancer and to determine if prediction of downgrading can identify potential candidates for active surveillance (AS). We identified 1317 patients with biopsy Gleason score 3 + 4 prostate cancer who underwent RP at the Memorial Sloan-Kettering Cancer Center between 2005 and 2013. Several preoperative and biopsy characteristics were evaluated by forward selection regression, and selected predictors of downgrading were analysed by multivariable logistic regression. Decision curve analysis was used to evaluate the clinical utility of the multivariable model. Gleason score was downgraded after RP in 115 patients (9%). We developed a multivariable model using age, prostate-specific antigen density, percentage of positive cores with Gleason pattern 4 cancer out of all cores taken, and maximum percentage of cancer involvement within a positive core with Gleason pattern 4 cancer. The area under the curve for this model was 0.75 after 10-fold cross-validation. However, decision curve analysis revealed that the model was not clinically helpful in identifying patients who would be downgraded at RP for the purpose of reassigning them to AS. While patients with biopsy Gleason score 3 + 4 prostate cancer who are downgraded at RP to pathological Gleason score 3 + 3 with tertiary Gleason pattern ≤4 may be potential candidates for AS, decision curve analysis showed limited utility of our model in identifying such men. Future studies are needed to identify new predictors that help identify potential candidates for AS among patients with biopsy-confirmed Gleason score 3 + 4 prostate cancer. © 2014 The Authors. BJU International © 2014 BJU International.

  13. siMS Score: Simple Method for Quantifying Metabolic Syndrome.

    Science.gov (United States)

    Soldatovic, Ivan; Vukovic, Rade; Culafic, Djordje; Gajic, Milan; Dimitrijevic-Sreckovic, Vesna

    2016-01-01

    To evaluate the siMS score and siMS risk score, novel continuous metabolic syndrome scores, as methods for quantification of metabolic status and risk. The siMS score was calculated using the formula: siMS score = 2*Waist/Height + Gly/5.6 + Tg/1.7 + TAsystolic/130 - HDL/1.02 (for male subjects) or HDL/1.28 (for female subjects). The siMS risk score was calculated using the formula: siMS risk score = siMS score * age/45 (males) or age/50 (females) * family history of cardio/cerebrovascular events (event = 1.2, no event = 1). A sample of 528 obese and non-obese participants was used to validate the siMS score and siMS risk score. Scores calculated as the sum of z-scores (each component of metabolic syndrome regressed with age and gender) and the sum of scores derived from principal component analysis (PCA) were used for evaluation of the siMS score. Variants were made by replacing glucose with HOMA in the calculations. The Framingham score was used for evaluation of the siMS risk score. Correlation of the siMS score with the sum of z-scores and with the weighted sum of PCA factors was high (r = 0.866 and r = 0.822, respectively). Correlation between the siMS risk score and the log-transformed Framingham score was medium to high for age groups 18+, 30+ and 35+ (0.835, 0.707 and 0.667, respectively). The siMS score and siMS risk score showed high correlation with more complex scores. The demonstrated accuracy together with superior simplicity and the ability to evaluate and follow up individual patients makes the siMS and siMS risk scores very convenient for use in clinical practice and research.
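    A direct transcription of the two published formulas quoted above can look like the following sketch. The variable names and units (glucose, triglycerides and HDL in mmol/L, systolic blood pressure in mmHg, waist and height in the same length unit) are assumptions made for illustration, not specifications from the paper.

```python
# Minimal sketch of the siMS formulas as quoted in the abstract; variable
# names and units are assumptions (glucose/Tg/HDL in mmol/L, BP in mmHg).
def sims_score(waist, height, glucose, triglycerides, systolic_bp, hdl, male):
    hdl_ref = 1.02 if male else 1.28
    return (2.0 * waist / height
            + glucose / 5.6
            + triglycerides / 1.7
            + systolic_bp / 130.0
            - hdl / hdl_ref)

def sims_risk_score(sims, age, male, family_history_cv_event):
    age_ref = 45.0 if male else 50.0
    family_factor = 1.2 if family_history_cv_event else 1.0
    return sims * (age / age_ref) * family_factor

# Example: 40-year-old man, waist 102 cm, height 178 cm, glucose 5.8 mmol/L,
# triglycerides 2.1 mmol/L, systolic BP 135 mmHg, HDL 1.0 mmol/L, no family history.
s = sims_score(102, 178, 5.8, 2.1, 135, 1.0, male=True)
print(round(s, 2), round(sims_risk_score(s, 40, True, False), 2))
```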

  14. Trends in Classroom Observation Scores

    Science.gov (United States)

    Casabianca, Jodi M.; Lockwood, J. R.; McCaffrey, Daniel F.

    2015-01-01

    Observations and ratings of classroom teaching and interactions collected over time are susceptible to trends in both the quality of instruction and rater behavior. These trends have potential implications for inferences about teaching and for study design. We use scores on the Classroom Assessment Scoring System-Secondary (CLASS-S) protocol from…

  15. [The diagnostic scores for deep venous thrombosis].

    Science.gov (United States)

    Junod, A

    2015-08-26

    Seven diagnostic scores for deep venous thrombosis (DVT) of the lower limbs are analyzed and compared. Two features make this exercise difficult: the problem of distal DVT and of its proximal extension, and the status of patients, whether out- or in-patients. The most popular score is the Wells score (1997), modified in 2003. It includes one subjective element based on clinical judgment. The Primary Care score (2005), less well known, has similar properties but uses only objective data. The present trend is to combine clinical scores with D-dimer measurement to rule out DVT with good sensitivity. For upper-limb DVT, the Constans score (2008) is available, which can also be coupled with D-dimer testing (Kleinjan).

  16. 40 CFR 141.63 - Maximum contaminant levels (MCLs) for microbiological contaminants.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Maximum contaminant levels (MCLs) for microbiological contaminants. 141.63 Section 141.63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Water Regulations: Maximum Contaminant Levels and Maximum Residual Disinfectant Levels § 141.63 Maximum...

  17. Credit scoring for individuals

    Directory of Open Access Journals (Sweden)

    Maria DIMITRIU

    2010-12-01

    Full Text Available Lending money to different borrowers is profitable, but risky. The profits come from the interest rate and the fees earned on the loans. Banks do not want to make loans to borrowers who cannot repay them. Even if a bank does not intend to make bad loans, over time some of them can become bad. For instance, as a result of the recent financial crisis, the capability of many borrowers to repay their loans was affected, many of them going into default. That is why it is important for banks to monitor their loans. The purpose of this paper is to focus on the main issues of credit scoring. We therefore present the scoring model of an important Romanian bank. Based on this credit scoring model and taking into account the latest lending requirements of the National Bank of Romania, we developed an assessment tool in Excel for retail loans, which is presented in the case study.

  18. Quality scores for 32,000 genomes

    DEFF Research Database (Denmark)

    Land, Miriam L.; Hyatt, Doug; Jun, Se-Ran

    2014-01-01

    Background More than 80% of the microbial genomes in GenBank are of ‘draft’ quality (12,553 draft vs. 2,679 finished, as of October, 2013). We have examined all the microbial DNA sequences available for complete, draft, and Sequence Read Archive genomes in GenBank as well as three other major...... public databases, and assigned quality scores for more than 30,000 prokaryotic genome sequences. Results Scores were assigned using four categories: the completeness of the assembly, the presence of full-length rRNA genes, tRNA composition and the presence of a set of 102 conserved genes in prokaryotes....... Most (~88%) of the genomes had quality scores of 0.8 or better and can be safely used for standard comparative genomics analysis. We compared genomes across factors that may influence the score. We found that although sequencing depth coverage of over 100x did not ensure a better score, sequencing read...

  19. Particle Swarm Optimization Based of the Maximum Photovoltaic ...

    African Journals Online (AJOL)

    Photovoltaic electricity is seen as an important source of renewable energy. The photovoltaic array is an unstable source of power since the peak power point depends on the temperature and the irradiation level. A maximum peak power point tracking is then necessary for maximum efficiency. In this work, a Particle Swarm ...
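    The abstract is truncated, but the idea it describes (using particle swarm optimization to track the maximum power point of a P-V curve that shifts with temperature and irradiation) can be sketched as below. This is not the authors' controller; the P-V curve, swarm size and coefficients are all illustrative assumptions.

```python
# Minimal sketch, not the authors' controller: particle swarm optimization
# searching the operating voltage that maximizes a synthetic PV power curve.
import numpy as np

rng = np.random.default_rng(0)

def pv_power(v, v_oc=40.0, i_sc=8.0):
    """Crude P-V curve used only for illustration (single interior maximum)."""
    v = np.clip(v, 0.0, v_oc)
    i = i_sc * (1.0 - np.exp((v - v_oc) / 3.0))     # current collapses near V_oc
    return v * np.maximum(i, 0.0)

n, iters = 10, 40
pos = rng.uniform(0.0, 40.0, n)                     # candidate operating voltages
vel = np.zeros(n)
pbest = pos.copy()
pbest_val = pv_power(pbest)
gbest = pbest[np.argmax(pbest_val)]

w, c1, c2 = 0.7, 1.5, 1.5                           # standard PSO coefficients
for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 40.0)
    val = pv_power(pos)
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmax(pbest_val)]

print(f"estimated MPP: V = {gbest:.2f} V, P = {pv_power(gbest):.1f} W")
```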

  20. 40 CFR 1045.140 - What is my engine's maximum engine power?

    Science.gov (United States)

    2010-07-01

    ...) Maximum engine power for an engine family is generally the weighted average value of maximum engine power... engine family's maximum engine power apply in the following circumstances: (1) For outboard or personal... value for maximum engine power from all the different configurations within the engine family to...

  1. A prospective study of calf factors affecting age, body size, and body condition score at first calving of holstein dairy heifers.

    Science.gov (United States)

    Heinrichs, A J; Heinrichs, B S; Harel, O; Rogers, G W; Place, N T

    2005-08-01

    Data were collected prospectively on parameters related to first calving on 18 farms located in Northeastern Pennsylvania. This project was designed to study possible residual effects of calf management practices and events occurring during the first 16 wk of life on age, BW, skeletal growth, and body condition score at first calving. Multiple imputation method for handling missing data was incorporated in these analyses. This method has the advantage over ad hoc single imputations because the appropriate error structure is maintained. Much similarity was found between the multiple imputation method and a traditional mixed model analysis, except that some estimates from the multiple imputation method seemed more logical in their effects on the parameter measured. Factors related to increased age at first calving were increased difficulty of delivery, antibiotic treatment of sick calves, increased amount of milk or milk replacer fed before weaning, reduced quality of forage fed to weaned calves, maximum humidity, mean daily temperature, and maximum ammonia levels in calf housing areas. Body weight at calving tended to increase with parity of the dam, increased amount of grain fed to calves, increased ammonia levels, and increased mean temperature of the calf housing area. Body condition score at calving tended to be positively influenced by delivery score at first calving, dam parity, and milk or milk replacer dry matter intake. Withers height at calving was positively affected by treatment of animals with antibiotics and increased mean temperature in the calf area. This study demonstrated that nutrition, housing, and management factors that affect health and growth of calves have long-term effects on the animal at least through first calving.

  2. 42 CFR 409.62 - Lifetime maximum on inpatient psychiatric care.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Lifetime maximum on inpatient psychiatric care. 409....62 Lifetime maximum on inpatient psychiatric care. There is a lifetime maximum of 190 days on inpatient psychiatric hospital services available to any beneficiary. Therefore, once an individual receives...

  3. Prognostic validation of a 17-segment score derived from a 20-segment score for myocardial perfusion SPECT interpretation.

    Science.gov (United States)

    Berman, Daniel S; Abidov, Aiden; Kang, Xingping; Hayes, Sean W; Friedman, John D; Sciammarella, Maria G; Cohen, Ishac; Gerlach, James; Waechter, Parker B; Germano, Guido; Hachamovitch, Rory

    2004-01-01

    Recently, a 17-segment model of the left ventricle has been recommended as an optimally weighted approach for interpreting myocardial perfusion single photon emission computed tomography (SPECT). Methods to convert databases from previous 20- to new 17-segment data and criteria for abnormality for the 17-segment scores are needed. Initially, for derivation of the conversion algorithm, 65 patients were studied (algorithm population) (pilot group, n = 28; validation group, n = 37). Three conversion algorithms were derived: algorithm 1, which used mid, distal, and apical scores; algorithm 2, which used distal and apical scores alone; and algorithm 3, which used maximal scores of the distal septal, lateral, and apical segments in the 20-segment model for 3 corresponding segments of the 17-segment model. The prognosis population comprised 16,020 consecutive patients (mean age, 65 +/- 12 years; 41% women) who had exercise or vasodilator stress technetium 99m sestamibi myocardial perfusion SPECT and were followed up for 2.1 +/- 0.8 years. In this population, 17-segment scores were derived from 20-segment scores by use of algorithm 2, which demonstrated the best agreement with expert 17-segment reading in the algorithm population. The prognostic value of the 20- and 17-segment scores was compared by converting the respective summed scores into percent myocardium abnormal. Conversion algorithm 2 was found to be highly concordant with expert visual analysis by the 17-segment model (r = 0.982; kappa = 0.866) in the algorithm population. In the prognosis population, 456 cardiac deaths occurred during follow-up. When the conversion algorithm was applied, extent and severity of perfusion defects were nearly identical by 20- and derived 17-segment scores. The receiver operating characteristic curve areas by 20- and 17-segment perfusion scores were identical for predicting cardiac death (both 0.77 +/- 0.02, P = not significant). The optimal prognostic cutoff value for either 20

  4. Maximum-entropy description of animal movement.

    Science.gov (United States)

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.

  5. Blind Grid Scoring Record No. 290

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  6. Blind Grid Scoring Record No. 293

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George; Archiable, Robert; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  7. Open Field Scoring Record No. 298

    National Research Council Canada - National Science Library

    Overbay, Jr., Larry; Robitaille, George; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  8. Open Field Scoring Record No. 299

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the standardized UXO Technology Demonstration Site Scoring Committee...

  9. Extreme Maximum Land Surface Temperatures.

    Science.gov (United States)

    Garratt, J. R.

    1992-09-01

    There are numerous reports in the literature of observations of land surface temperatures. Some of these, almost all made in situ, reveal maximum values in the 50°-70°C range, with a few, made in desert regions, near 80°C. Consideration of a simplified form of the surface energy balance equation, utilizing likely upper values of absorbed shortwave flux (1000 W m⁻²) and screen air temperature (55°C), suggests that surface temperatures in the vicinity of 90°-100°C may occur for dry, darkish soils of low thermal conductivity (0.1-0.2 W m⁻¹ K⁻¹). Numerical simulations confirm this and suggest that temperature gradients in the first few centimeters of soil may reach 0.5°-1°C mm⁻¹ under these extreme conditions. The study bears upon the intrinsic interest of identifying extreme maximum temperatures and yields interesting information regarding the comfort zone of animals (including man).
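    The back-of-the-envelope reasoning in the abstract can be reproduced with a simplified daytime energy balance. The sketch below is not the paper's model: the incoming longwave flux, emissivity and sensible-heat coefficient are assumed values chosen only to illustrate how very high equilibrium skin temperatures emerge once evaporation and ground heat flux are negligible.

```python
# Minimal sketch (assumed parameter values, not the paper's model): solve a
# simplified dry-surface energy balance
#   absorbed_shortwave + incoming_longwave = emitted_longwave + sensible_heat
# for the equilibrium surface temperature, neglecting evaporation and soil heat flux.
from scipy.optimize import brentq

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
S_abs = 1000.0           # absorbed shortwave flux, W m^-2 (upper value from the abstract)
L_down = 400.0           # assumed incoming longwave, W m^-2
eps = 0.95               # assumed surface emissivity
h = 15.0                 # assumed bulk sensible-heat coefficient, W m^-2 K^-1
T_air = 55.0 + 273.15    # screen air temperature from the abstract, K

def residual(T_s):
    return S_abs + eps * L_down - eps * SIGMA * T_s**4 - h * (T_s - T_air)

T_surface = brentq(residual, 300.0, 450.0)
print(f"equilibrium surface temperature ~ {T_surface - 273.15:.0f} deg C")
```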

  10. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Full Text Available Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is not set by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases and discuss its application to a real dataset.

  11. THE EFFICIENCY OF TENNIS DOUBLES SCORING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Geoff Pollard

    2010-09-01

    Full Text Available In this paper a family of scoring systems for tennis doubles, for testing the hypothesis that pair A is better than pair B against the alternative hypothesis that pair B is better than A, is established. This family of scoring systems can be used as a benchmark against which the efficiency of any doubles scoring system can be assessed. Thus, the formula for the efficiency of any doubles scoring system is derived. As in tennis singles, one scoring system based on the play-the-loser structure is shown to be more efficient than the benchmark systems. An expression for the relative efficiency of two doubles scoring systems is derived. Thus, the relative efficiency of the various scoring systems presently used in doubles can be assessed. The methods of this paper can be extended to a match between two teams of 2, 4, 8, … doubles pairs, so that it is possible to establish a measure for the relative efficiency of the various systems used for tennis contests between teams of players.

  12. Fractal Dimension and Maximum Sunspot Number in Solar Cycle

    Directory of Open Access Journals (Sweden)

    R.-S. Kim

    2006-09-01

    Full Text Available The fractal dimension is a quantitative parameter describing the characteristics of irregular time series. In this study, we use this parameter to analyze the irregular aspects of solar activity and to predict the maximum sunspot number in the following solar cycle by examining time series of the sunspot number. For this, we considered the daily sunspot number since 1850 from SIDC (Solar Influences Data analysis Center) and then estimated the cycle variation of the fractal dimension by using Higuchi's method. We examined the relationship between this fractal dimension and the maximum monthly sunspot number in each solar cycle. As a result, we found that there is a strong inverse relationship between the fractal dimension and the maximum monthly sunspot number. Using this relation we predicted the maximum sunspot number in the solar cycle from the fractal dimension of the sunspot numbers during the solar activity increasing phase. The successful prediction is proven by a good correlation (r=0.89) between the observed and predicted maximum sunspot numbers in the solar cycles.
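    For reference, Higuchi's estimator named in the abstract can be sketched as follows on synthetic series; the choice of k_max = 10 is an assumption, and the sunspot data themselves are not reproduced here.

```python
# Minimal sketch of Higuchi's method for the fractal dimension of a time
# series (the estimator named in the abstract); k_max = 10 is an assumed choice.
import numpy as np

def higuchi_fd(x, k_max=10):
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_k, log_L = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # normalized curve length of the sub-series starting at m with step k
            Lmk = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / (len(idx) - 1) / k
            lengths.append(Lmk / k)
        log_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_k, log_L, 1)   # L(k) ~ k^(-D)  =>  slope estimates D
    return slope

rng = np.random.default_rng(1)
white_noise = rng.standard_normal(2000)       # expected D close to 2
random_walk = np.cumsum(white_noise)          # expected D close to 1.5
print(higuchi_fd(white_noise), higuchi_fd(random_walk))
```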

  13. Automatic maximum entropy spectral reconstruction in NMR

    International Nuclear Information System (INIS)

    Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.

    2007-01-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system

  14. Stationary neutrino radiation transport by maximum entropy closure

    International Nuclear Information System (INIS)

    Bludman, S.A.

    1994-11-01

    The authors obtain the angular distributions that maximize the entropy functional for Maxwell-Boltzmann (classical), Bose-Einstein, and Fermi-Dirac radiation. In the low and high occupancy limits, the maximum entropy closure is bounded by previously known variable Eddington factors that depend only on the flux. For intermediate occupancy, the maximum entropy closure depends on both the occupation density and the flux. The Fermi-Dirac maximum entropy variable Eddington factor shows a scale invariance, which leads to a simple, exact analytic closure for fermions. This two-dimensional variable Eddington factor gives results that agree well with exact (Monte Carlo) neutrino transport calculations out of a collapse residue during early phases of hydrostatic neutron star formation

  15. Maximum power per VA control of vector controlled interior ...

    Indian Academy of Sciences (India)

    Thakur Sumeet Singh

    2018-04-11

    Apr 11, 2018 ... Department of Electrical Engineering, Indian Institute of Technology Delhi, New ... The MPVA operation allows maximum-utilization of the drive-system. ... Permanent magnet motor; unity power factor; maximum VA utilization; ...

  16. Application of maximum entropy to neutron tunneling spectroscopy

    International Nuclear Information System (INIS)

    Mukhopadhyay, R.; Silver, R.N.

    1990-01-01

    We demonstrate the maximum entropy method for the deconvolution of high resolution tunneling data acquired with a quasielastic spectrometer. Given a precise characterization of the instrument resolution function, a maximum entropy analysis of lutidine data obtained with the IRIS spectrometer at ISIS results in an effective factor of three improvement in resolution. 7 refs., 4 figs

  17. The maximum economic depth of groundwater abstraction for irrigation

    Science.gov (United States)

    Bierkens, M. F.; Van Beek, L. P.; de Graaf, I. E. M.; Gleeson, T. P.

    2017-12-01

    Over recent decades, groundwater has become increasingly important for agriculture. Irrigation accounts for 40% of the global food production and its importance is expected to grow further in the near future. Already, about 70% of the globally abstracted water is used for irrigation, and nearly half of that is pumped groundwater. In many irrigated areas where groundwater is the primary source of irrigation water, groundwater abstraction is larger than recharge and we see massive groundwater head decline in these areas. An important question then is: to what maximum depth can groundwater be pumped for it to be still economically recoverable? The objective of this study is therefore to create a global map of the maximum depth of economically recoverable groundwater when used for irrigation. The maximum economic depth is the maximum depth at which revenues are still larger than pumping costs or the maximum depth at which initial investments become too large compared to yearly revenues. To this end we set up a simple economic model where costs of well drilling and the energy costs of pumping, which are a function of well depth and static head depth respectively, are compared with the revenues obtained for the irrigated crops. Parameters for the cost sub-model are obtained from several US-based studies and applied to other countries based on GDP/capita as an index of labour costs. The revenue sub-model is based on gross irrigation water demand calculated with a global hydrological and water resources model, areal coverage of crop types from MIRCA2000 and FAO-based statistics on crop yield and market price. We applied our method to irrigated areas in the world overlying productive aquifers. Estimated maximum economic depths range between 50 and 500 m. Most important factors explaining the maximum economic depth are the dominant crop type in the area and whether or not initial investments in well infrastructure are limiting. In subsequent research, our estimates of
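    The cost-revenue comparison described above can be illustrated with a toy calculation: depth enters both the energy cost of lifting the water and the amortized drilling cost, and the maximum economic depth is where the net benefit crosses zero. Every parameter value in the sketch is an illustrative assumption, not a number from the study.

```python
# Toy sketch of the cost-revenue comparison described in the abstract; every
# parameter value below is an illustrative assumption, not from the study.
RHO_G = 1000.0 * 9.81          # water density * gravity: Pa per metre of lift
PUMP_EFFICIENCY = 0.6
ENERGY_PRICE = 0.10            # $ per kWh
J_PER_KWH = 3.6e6

annual_volume = 5.0e5          # m^3 of irrigation water pumped per year
annual_revenue = 60_000.0      # $ per year from the irrigated crop
well_cost_per_m = 150.0        # $ per metre drilled
amortization_years = 20.0

def annual_net_benefit(depth_m):
    energy_j = RHO_G * depth_m * annual_volume / PUMP_EFFICIENCY
    pumping_cost = energy_j / J_PER_KWH * ENERGY_PRICE
    drilling_cost = well_cost_per_m * depth_m / amortization_years
    return annual_revenue - pumping_cost - drilling_cost

# Maximum economic depth: the largest depth with non-negative net benefit.
depth = 0.0
while annual_net_benefit(depth + 1.0) >= 0.0:
    depth += 1.0
print(f"maximum economic depth ~ {depth:.0f} m")
```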

  18. What controls the maximum magnitude of injection-induced earthquakes?

    Science.gov (United States)

    Eaton, D. W. S.

    2017-12-01

    Three different approaches for estimation of maximum magnitude are considered here, along with their implications for managing risk. The first approach is based on a deterministic limit for seismic moment proposed by McGarr (1976), which was originally designed for application to mining-induced seismicity. This approach has since been reformulated for earthquakes induced by fluid injection (McGarr, 2014). In essence, this method assumes that the upper limit for seismic moment release is constrained by the pressure-induced stress change. A deterministic limit is given by the product of shear modulus and the net injected fluid volume. This method is based on the assumptions that the medium is fully saturated and in a state of incipient failure. An alternative geometrical approach was proposed by Shapiro et al. (2011), who postulated that the rupture area for an induced earthquake falls entirely within the stimulated volume. This assumption reduces the maximum-magnitude problem to one of estimating the largest potential slip surface area within a given stimulated volume. Finally, van der Elst et al. (2016) proposed that the maximum observed magnitude, statistically speaking, is the expected maximum value for a finite sample drawn from an unbounded Gutenberg-Richter distribution. These three models imply different approaches for risk management. The deterministic method proposed by McGarr (2014) implies that a ceiling on the maximum magnitude can be imposed by limiting the net injected volume, whereas the approach developed by Shapiro et al. (2011) implies that the time-dependent maximum magnitude is governed by the spatial size of the microseismic event cloud. Finally, the sample-size hypothesis of Van der Elst et al. (2016) implies that the best available estimate of the maximum magnitude is based upon observed seismicity rate. The latter two approaches suggest that real-time monitoring is essential for effective management of risk. A reliable estimate of maximum
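    The first (deterministic) approach lends itself to a small worked example: the abstract states that the upper limit for seismic moment is the product of shear modulus and net injected fluid volume (McGarr, 2014), which converts to a moment magnitude with the standard Hanks-Kanamori relation. The shear modulus value and injected volumes below are illustrative assumptions.

```python
# Worked sketch of the deterministic bound described in the abstract
# (McGarr, 2014): maximum seismic moment = shear modulus * net injected volume,
# converted to moment magnitude with the standard Hanks-Kanamori relation.
import math

def mcgarr_max_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    m0_max = shear_modulus_pa * injected_volume_m3           # seismic moment, N*m
    return (2.0 / 3.0) * (math.log10(m0_max) - 9.1)          # moment magnitude

for volume in (1e4, 1e5, 1e6):    # net injected volumes in m^3 (illustrative)
    print(f"V = {volume:8.0e} m^3  ->  Mw_max ~ {mcgarr_max_magnitude(volume):.1f}")
```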

  19. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  20. Interobserver variability of the neurological optimality score

    NARCIS (Netherlands)

    Monincx, W. M.; Smolders-de Haas, H.; Bonsel, G. J.; Zondervan, H. A.

    1999-01-01

    To assess the interobserver reliability of the neurological optimality score. The neurological optimality score of 21 full-term healthy, neurologically normal newborn infants was determined by two well-trained observers. The intraclass correlation coefficient was 0.31. Kappa for optimality (score of

  1. [The use of scores in general medicine].

    Science.gov (United States)

    Huber, Ursula; Rösli, Andreas; Ballmer, Peter E; Rippin, Sarah Jane

    2013-10-01

    Scores are tools to combine complex information into a numerical value. In General Medicine, there are scores to assist in making diagnoses and prognoses, scores to assist therapeutic decision making and to evaluate therapeutic results, and scores to help physicians when informing and advising patients. We review six of the scoring systems that have the greatest utility for the General Physician in hospital-based care and in General Practice. The Nutritional Risk Screening (NRS 2002) tool is designed to identify hospital patients in danger of malnutrition. The aim is to improve the nutritional status of these patients. The CURB-65 score predicts 30-day mortality in patients with community-acquired pneumonia. Patients with a low score can be considered for home treatment, patients with an elevated score require hospitalisation, and those with a high score should be treated as having severe pneumonia; treatment in the intensive care unit should be considered. The IAS-AGLA score of the Working Group on Lipids and Atherosclerosis of the Swiss Society of Cardiology calculates the 10-year risk of a myocardial infarction for people living in Switzerland. The working group makes recommendations for preventive treatment according to the calculated risk status. The Body Mass Index, calculated by dividing body weight in kilograms by the square of height in metres and then assigned to weight categories, is used to classify people as underweight, of normal weight, overweight or obese. The prognostic value of this classification is discussed. The Mini-Mental State Examination allows the physician to assess important cognitive functions in a simple and standardised form. The Glasgow Coma Scale is used to classify the level of consciousness in patients with head injury. It can be used for triage and correlates with prognosis.
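    Of the scores listed, the BMI is the only one whose formula is given in the text, so a small sketch is shown below. The category cut-offs (18.5, 25 and 30 kg/m²) follow the usual WHO convention and are an assumption here, since the abstract does not state them.

```python
# Small sketch of the BMI calculation described in the text; the cut-offs
# (WHO convention: 18.5, 25, 30 kg/m^2) are assumed, since the abstract
# does not state them explicitly.
def bmi_category(weight_kg, height_m):
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        label = "underweight"
    elif bmi < 25.0:
        label = "normal weight"
    elif bmi < 30.0:
        label = "overweight"
    else:
        label = "obese"
    return bmi, label

print(bmi_category(85.0, 1.78))   # e.g. (26.8..., 'overweight')
```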

  2. Maximum vehicle cabin temperatures under different meteorological conditions

    Science.gov (United States)

    Grundstein, Andrew; Meentemeyer, Vernon; Dowd, John

    2009-05-01

    A variety of studies have documented the dangerously high temperatures that may occur within the passenger compartment (cabin) of cars under clear sky conditions, even at relatively low ambient air temperatures. Our study, however, is the first to examine cabin temperatures under variable weather conditions. It uses a unique maximum vehicle cabin temperature dataset in conjunction with directly comparable ambient air temperature, solar radiation, and cloud cover data collected from April through August 2007 in Athens, GA. Maximum cabin temperatures, ranging from 41-76°C, varied considerably depending on the weather conditions and the time of year. Clear days had the highest cabin temperatures, with average values of 68°C in the summer and 61°C in the spring. Cloudy days in both the spring and summer were on average approximately 10°C cooler. Our findings indicate that even on cloudy days with lower ambient air temperatures, vehicle cabin temperatures may reach deadly levels. Additionally, two predictive models of maximum daily vehicle cabin temperatures were developed using commonly available meteorological data. One model uses maximum ambient air temperature and average daily solar radiation while the other uses cloud cover percentage as a surrogate for solar radiation. From these models, two maximum vehicle cabin temperature indices were developed to assess the level of danger. The models and indices may be useful for forecasting hazardous conditions, promoting public awareness, and to estimate past cabin temperatures for use in forensic analyses.

  3. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy … in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.
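    The core maximum entropy idea discussed here, minimally perturbing a simulated ensemble so that it reproduces an experimental average, reduces in the simplest case to exponential reweighting with a single Lagrange multiplier. The sketch below uses a synthetic "simulation" and a synthetic experimental target; it is an illustration of the general technique, not the procedure of any of the cited papers.

```python
# Minimal sketch of maximum-entropy reweighting of simulation frames so that
# the ensemble average of one observable matches an experimental value.
# The "simulation" data and the experimental target are synthetic.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(2)
obs = rng.normal(loc=5.0, scale=1.0, size=10_000)   # observable per simulation frame
target = 5.4                                        # "experimental" ensemble average

def weights(lam):
    # Maximum-entropy weights relative to uniform: w_i proportional to exp(lam * s_i)
    a = lam * obs
    w = np.exp(a - a.max())                         # shift for numerical stability
    return w / w.sum()

def weighted_mean(lam):
    return np.sum(weights(lam) * obs)

# Solve for the Lagrange multiplier that reproduces the experimental average.
lam_star = brentq(lambda lam: weighted_mean(lam) - target, -10.0, 10.0)
print(f"lambda = {lam_star:.3f}, reweighted mean = {weighted_mean(lam_star):.3f}")
```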

  4. Interpreting force concept inventory scores: Normalized gain and SAT scores

    Directory of Open Access Journals (Sweden)

    Jeffrey J. Steinert

    2007-05-01

    Full Text Available Preinstruction SAT scores and normalized gains (G) on the force concept inventory (FCI) were examined for individual students in interactive engagement (IE) courses in introductory mechanics at one high school (N=335) and one university (N=292), and strong, positive correlations were found for both populations (r=0.57 and r=0.46, respectively). These correlations are likely due to the importance of cognitive skills and abstract reasoning in learning physics. The larger correlation coefficient for the high school population may be a result of the much shorter time interval between taking the SAT and studying mechanics, because the SAT may provide a more current measure of abilities when high school students begin the study of mechanics than it does for college students, who begin mechanics years after the test is taken. In prior research a strong correlation between FCI G and scores on Lawson’s Classroom Test of Scientific Reasoning for students from the same two schools was observed. Our results suggest that, when interpreting class average normalized FCI gains and comparing different classes, it is important to take into account the variation of students’ cognitive skills, as measured either by the SAT or by Lawson’s test. While Lawson’s test is not commonly given to students in most introductory mechanics courses, SAT scores provide a readily available alternative means of taking account of students’ reasoning abilities. Knowing the students’ cognitive level before instruction also allows one to alter instruction or to use an intervention designed to improve students’ cognitive level.
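    The quantities behind the reported correlations are simple to compute: the normalized gain is G = (post − pre) / (100 − pre) on percentage FCI scores, and the reported r values are Pearson correlations of G with SAT. The sketch below uses synthetic data purely to show the computation, not the study's dataset.

```python
# Small sketch of the quantities behind the reported correlations: the
# normalized FCI gain G = (post - pre) / (100 - pre) and its Pearson
# correlation with SAT scores, on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
sat = rng.integers(900, 1600, size=300).astype(float)
pre = np.clip(0.04 * sat + rng.normal(0, 8, 300), 5, 85)       # pre-test FCI (%)
post = np.clip(pre + (100 - pre) * (0.0004 * sat + rng.normal(0, 0.1, 300)), pre, 100)

gain = (post - pre) / (100.0 - pre)
r = np.corrcoef(sat, gain)[0, 1]
print(f"Pearson r between SAT and normalized gain: {r:.2f}")
```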

  5. Interpreting force concept inventory scores: Normalized gain and SAT scores

    Directory of Open Access Journals (Sweden)

    Vincent P. Coletta

    2007-05-01

    Full Text Available Preinstruction SAT scores and normalized gains (G) on the force concept inventory (FCI) were examined for individual students in interactive engagement (IE) courses in introductory mechanics at one high school (N=335) and one university (N=292), and strong, positive correlations were found for both populations (r=0.57 and r=0.46, respectively). These correlations are likely due to the importance of cognitive skills and abstract reasoning in learning physics. The larger correlation coefficient for the high school population may be a result of the much shorter time interval between taking the SAT and studying mechanics, because the SAT may provide a more current measure of abilities when high school students begin the study of mechanics than it does for college students, who begin mechanics years after the test is taken. In prior research a strong correlation between FCI G and scores on Lawson’s Classroom Test of Scientific Reasoning for students from the same two schools was observed. Our results suggest that, when interpreting class average normalized FCI gains and comparing different classes, it is important to take into account the variation of students’ cognitive skills, as measured either by the SAT or by Lawson’s test. While Lawson’s test is not commonly given to students in most introductory mechanics courses, SAT scores provide a readily available alternative means of taking account of students’ reasoning abilities. Knowing the students’ cognitive level before instruction also allows one to alter instruction or to use an intervention designed to improve students’ cognitive level.

  6. Semiparametric score level fusion: Gaussian copula approach

    NARCIS (Netherlands)

    Susyanyo, N.; Klaassen, C.A.J.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    Score level fusion is an appealing method for combining multi-algorithm, multi-representation, and multi-modality biometrics due to its simplicity. Often, scores are assumed to be independent, but even for dependent scores, according to the Neyman-Pearson lemma, the likelihood ratio is the

  7. Examining the reliability of ADAS-Cog change scores.

    Science.gov (United States)

    Grochowalski, Joseph H; Liu, Ying; Siedlecki, Karen L

    2016-09-01

    The purpose of this study was to estimate and examine ways to improve the reliability of change scores on the Alzheimer's Disease Assessment Scale, Cognitive Subtest (ADAS-Cog). The sample, provided by the Alzheimer's Disease Neuroimaging Initiative, included individuals with Alzheimer's disease (AD) (n = 153) and individuals with mild cognitive impairment (MCI) (n = 352). All participants were administered the ADAS-Cog at baseline and 1 year, and change scores were calculated as the difference in scores over the 1-year period. Three types of change score reliabilities were estimated using multivariate generalizability. Two methods to increase change score reliability were evaluated: reweighting the subtests of the scale and adding more subtests. Reliability of ADAS-Cog change scores over 1 year was low for both the AD sample (ranging from .53 to .64) and the MCI sample (.39 to .61). Reweighting the change scores from the AD sample improved reliability (.68 to .76), but lengthening provided no useful improvement for either sample. The MCI change scores had low reliability, even with reweighting and adding additional subtests. The ADAS-Cog scores had low reliability for measuring change. Researchers using the ADAS-Cog should estimate and report reliability for their use of the change scores. The ADAS-Cog change scores are not recommended for assessment of meaningful clinical change.

  8. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore

  9. Conditional Standard Errors of Measurement for Scale Scores.

    Science.gov (United States)

    Kolen, Michael J.; And Others

    1992-01-01

    A procedure is described for estimating the reliability and conditional standard errors of measurement of scale scores incorporating the discrete transformation of raw scores to scale scores. The method is illustrated using a strong true score model, and practical applications are described. (SLD)

  10. The high-density lipoprotein-adjusted SCORE model worsens SCORE-based risk classification in a contemporary population of 30 824 Europeans

    DEFF Research Database (Denmark)

    Mortensen, Martin B; Afzal, Shoaib; Nordestgaard, Børge G

    2015-01-01

    .8 years of follow-up, 339 individuals died of CVD. In the SCORE target population (age 40-65; n = 30,824), fewer individuals were at baseline categorized as high risk (≥5% 10-year risk of fatal CVD) using SCORE-HDL compared with SCORE (10 vs. 17% in men, 1 vs. 3% in women). SCORE-HDL did not improve...... with SCORE, but deteriorated risk classification based on NRI. Future guidelines should consider lower decision thresholds and prioritize CVD morbidity and people above age 65....

  11. A Maximum Radius for Habitable Planets.

    Science.gov (United States)

    Alibert, Yann

    2015-09-01

    We compute the maximum radius a planet can have in order to fulfill two constraints that are likely necessary conditions for habitability: 1- a surface temperature and pressure compatible with the existence of liquid water, and 2- no ice layer at the bottom of a putative global ocean, which would prevent the geologic carbon cycle from operating. We demonstrate that, above a given radius, these two constraints cannot be met: in the Super-Earth mass range (1-12 Mearth), the maximum radius a planet can have varies between 1.8 and 2.3 Rearth. This radius is reduced when considering planets with higher Fe/Si ratios, and taking into account irradiation effects on the structure of the gas envelope.

  12. External Validation of the Simple Clinical Score and the HOTEL Score, Two Scores for Predicting Short-Term Mortality after Admission to an Acute Medical Unit

    DEFF Research Database (Denmark)

    Stræde, Mia; Brabrand, Mikkel

    2014-01-01

    with the objective of validating the Simple Clinical Score (SCS) and the HOTEL score, two existing risk stratification systems that predict mortality for medical patients based solely on clinical information, but not only vital signs. METHODS: Pre-planned prospective observational cohort study. SETTING: Danish 460.......932 to 0.988) for 24-hours mortality and 0.826 (95% CI, 0.774-0.879) for 30-day mortality, and goodness-of-fit test, χ2 = 2.68 (10 degrees of freedom), P = 0.998 and χ2 = 4.00, P = 0.947, respectively. We included 1470 patients when calculating the HOTEL score. Discriminatory power (AUROC) was 0.931 (95......% CI, 0.901-0.962) for 24-hours mortality and goodness-of-fit test, χ2 = 5.56 (10 degrees of freedom), P = 0.234. CONCLUSION: We find that both the SCS and HOTEL scores showed an excellent to outstanding ability in identifying patients at high risk of dying with good or acceptable precision....

  13. Validity and reliability of a novel immunosuppressive adverse effects scoring system in renal transplant recipients.

    Science.gov (United States)

    Meaney, Calvin J; Arabi, Ziad; Venuto, Rocco C; Consiglio, Joseph D; Wilding, Gregory E; Tornatore, Kathleen M

    2014-06-12

    After renal transplantation, many patients experience adverse effects from maintenance immunosuppressive drugs. When these adverse effects occur, patient adherence with immunosuppression may be reduced and impact allograft survival. If these adverse effects could be prospectively monitored in an objective manner and possibly prevented, adherence to immunosuppressive regimens could be optimized and allograft survival improved. Prospective, standardized clinical approaches to assess immunosuppressive adverse effects by health care providers are limited. Therefore, we developed and evaluated the application, reliability and validity of a novel adverse effects scoring system in renal transplant recipients receiving calcineurin inhibitor (cyclosporine or tacrolimus) and mycophenolic acid based immunosuppressive therapy. The scoring system included 18 non-renal adverse effects organized into gastrointestinal, central nervous system and aesthetic domains developed by a multidisciplinary physician group. Nephrologists employed this standardized adverse effect evaluation in stable renal transplant patients using physical exam, review of systems, recent laboratory results, and medication adherence assessment during a clinic visit. Stable renal transplant recipients in two clinical studies were evaluated and received immunosuppressive regimens comprised of either cyclosporine or tacrolimus with mycophenolic acid. Face, content, and construct validity were assessed to document these adverse effect evaluations. Inter-rater reliability was determined using the Kappa statistic and intra-class correlation. A total of 58 renal transplant recipients were assessed using the adverse effects scoring system confirming face validity. Nephrologists (subject matter experts) rated the 18 adverse effects as: 3.1 ± 0.75 out of 4 (maximum) regarding clinical importance to verify content validity. The adverse effects scoring system distinguished 1.75-fold increased gastrointestinal adverse

  14. On semidefinite programming relaxations of maximum k-section

    NARCIS (Netherlands)

    de Klerk, E.; Pasechnik, D.V.; Sotirov, R.; Dobre, C.

    2012-01-01

    We derive a new semidefinite programming bound for the maximum k-section problem. For k=2 (i.e. for maximum bisection), the new bound is at least as strong as a well-known bound by Poljak and Rendl (SIAM J Optim 5(3):467–487, 1995). For k ≥ 3 the new bound dominates a bound of Karisch and Rendl

  15. MODIFIED ALVARADO SCORING IN ACUTE APPENDICITIS

    Directory of Open Access Journals (Sweden)

    Varadarajan Sujath

    2016-12-01

    Full Text Available BACKGROUND Acute appendicitis is one of the most common surgical emergencies, with a lifetime presentation of approximately 1 in 7. Its incidence is 1.5-1.9/1000 in males and females. Surgery for acute appendicitis is based on history, clinical examination and laboratory investigations (e.g. WBC count). Imaging techniques add very little to the efficacy of the diagnosis of appendicitis. A negative appendicectomy rate of 20-40% has been reported in the literature. Difficulty in diagnosis is experienced in very young patients and females of reproductive age. The diagnostic accuracy in assessing acute appendicitis has not improved in spite of rapid advances in management. MATERIALS AND METHODS The modified Alvarado score was applied and assessed for its accuracy in the preoperative diagnosis of acute appendicitis in 50 patients. The aim of our study is to understand the various presentations of acute appendicitis, including the age and gender incidence, the application of the modified Alvarado scoring system in our hospital setup and the assessment of the efficacy of the score. RESULTS Our study shows that the most involved age group is the 3rd decade, with male preponderance. On application of the Alvarado score, nausea and vomiting were present in 50% and anorexia in 30%; leucocytosis was found in 75% of cases. Sensitivity and specificity of our study were 65% and 40% respectively, with a positive predictive value of 85% and a negative predictive value of 15%. CONCLUSION This study showed that a clinical score like the Alvarado score can be a cheap and quick tool to apply in emergency departments to rule out acute appendicitis. The implementation of the modified Alvarado score is simple and cost effective.
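    For orientation, the modified Alvarado score named in the abstract is an additive clinical score; the sketch below uses the commonly published component weights (maximum 9 points, i.e. the original Alvarado score without the "shift to the left" item). Treat the exact point values and the decision threshold as assumptions, since the abstract does not list them.

```python
# Sketch of the modified Alvarado score (maximum 9 points) as commonly
# published; the point values and the >=7 threshold are assumptions here.
def modified_alvarado(migratory_rif_pain, anorexia, nausea_vomiting,
                      rif_tenderness, rebound_tenderness,
                      elevated_temperature, leukocytosis):
    score = 0
    score += 1 if migratory_rif_pain else 0
    score += 1 if anorexia else 0
    score += 1 if nausea_vomiting else 0
    score += 2 if rif_tenderness else 0          # right iliac fossa tenderness
    score += 1 if rebound_tenderness else 0
    score += 1 if elevated_temperature else 0
    score += 2 if leukocytosis else 0
    return score

# Example: a patient with RIF tenderness, nausea, fever and leukocytosis.
s = modified_alvarado(False, False, True, True, False, True, True)
print(s, "-> appendicitis likely" if s >= 7 else "-> equivocal/observe")
```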

  16. Modeling multisite streamflow dependence with maximum entropy copula

    Science.gov (United States)

    Hao, Z.; Singh, V. P.

    2013-10-01

    Synthetic streamflows at different sites in a river basin are needed for planning, operation, and management of water resources projects. Modeling the temporal and spatial dependence structure of monthly streamflow at different sites is generally required. In this study, the maximum entropy copula method is proposed for multisite monthly streamflow simulation, in which the temporal and spatial dependence structure is imposed as constraints to derive the maximum entropy copula. The monthly streamflows at different sites are then generated by sampling from the conditional distribution. A case study for the generation of monthly streamflow at three sites in the Colorado River basin illustrates the application of the proposed method. Simulated streamflow from the maximum entropy copula is in satisfactory agreement with observed streamflow.

  17. Combination of STOP-Bang Score with Mallampati Score fails to improve specificity in the prediction of sleep-disordered breathing.

    Science.gov (United States)

    Dette, Frank G; Graf, Juergen; Cassel, Werner; Lloyd-Jones, Carla; Boehm, Stefan; Zoremba, Martin; Schramm, Patrick; Pestel, Gunther; Thal, Serge C

    2016-06-01

    Sleep-disordered breathing (SDB) is closely associated with perioperative complications. STOP-Bang score was validated for preoperative screening of SDB. However, STOP-Bang Score lacks adequately high specificity. We aimed to improve it by combining it with the Mallampati Score. The study included 347 patients, in which we assessed both STOP-Bang and Mallampati scores. Overnight oxygen saturation was measured to calculate ODI4%. We calculated the sensitivity and specificity for AHI and ODI4% of both scores separately and in combination. We found that STOP-Bang Score ≥3 was present in 71%, ODI≥5/h (AHI ≥5/h) in 42.6% (39.3%) and ODI≥15/h (AHI ≥15/h) in 13.5% (17.8%). For ODI4%≥5/h (AHI ≥5/h) we observed in men a response rate for sensitivity and specificity of STOP-Bang of 94.5% and 17.1% (90.9% and 12.5%) and in women 66% and 51% (57.8% and 46.9%). For ODI4%≥15/h (AHI≥15/h) it was 92% and 12% (84.6% and 10.3%) and 93% and 49% (75% and 49.2%). For ODI4%≥5 (AHI≥5) sensitivity and specificity of Mallampati score were in men 38.4% and 78.6% (27.3% and 68.2%) and in women 25% and 82.7% (21.9% and 81.3%), for ODI≥15 (AHI ≥15/h) 38.5% and 71.8% (26.9% and 69.2%) and 33.3% and 81.4% (17.9% and 79.6%). In combination, for ODI4%≥15/h, we found sensitivity in men to be 92.3% and in women 93.3%, specificity 10.3% and 41.4%. STOP-Bang Score combined with Mallampati Score fails to increase specificity. Low specificity should be considered when using both scores for preoperative screening of SDB.

  18. The Machine Scoring of Writing

    Science.gov (United States)

    McCurry, Doug

    2010-01-01

    This article provides an introduction to the kind of computer software that is used to score student writing in some high stakes testing programs, and that is being promoted as a teaching and learning tool to schools. It sketches the state of play with machines for the scoring of writing, and describes how these machines work and what they do.…

  19. Scoring an Abstract Contemporary Silent Film

    OpenAIRE

    Frost, Crystal

    2014-01-01

    I composed an original digital audio film score with full sound design for a contemporary silent film called Apple Tree. The film is highly conceptual and interpretive and required a very involved, intricate score to successfully tell the story. In the process of scoring this film, I learned new ways to convey an array of contrasting emotions through music and sound. After analyzing the film's emotional journey, I determined that six defining emotions were the foundation on which to build an ...

  20. Maximum stellar iron core mass

    Indian Academy of Sciences (India)

    — journal of physics, Vol. 60, No. 3, March 2003, pp. 415–422. Maximum stellar iron core mass. F W GIACOBBE, Chicago Research Center/American Air Liquide. … iron core compression due to the weight of non-ferrous matter overlying the iron cores within large … thermal equilibrium velocities will tend to be non-relativistic.

  1. Risk scores-the modern Oracle of Delphi?

    Science.gov (United States)

    Kronenberg, Florian; Schwaiger, Johannes P

    2017-03-01

    Recently, 4 new risk scores for the prediction of mortality and cardiovascular events were especially tailored for hemodialysis patients; these scores performed much better than previous scores. Tripepi et al. found that these risk scores were even more predictive for all-cause and cardiovascular death than the measurement of the left ventricular mass index was. Nevertheless, the investigation of left ventricular mass and function has its own place for other reasons. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  2. Methods and statistics for combining motif match scores.

    Science.gov (United States)

    Bailey, T L; Gribskov, M

    1998-01-01

    Position-specific scoring matrices are useful for representing and searching for protein sequence motifs. A sequence family can often be described by a group of one or more motifs, and an effective search must combine the scores for matching a sequence to each of the motifs in the group. We describe three methods for combining match scores and estimating the statistical significance of the combined scores and evaluate the search quality (classification accuracy) and the accuracy of the estimate of statistical significance of each. The three methods are: 1) sum of scores, 2) sum of reduced variates, 3) product of score p-values. We show that method 3) is superior to the other two methods in both regards, and that combining motif scores indeed gives better search accuracy. The MAST sequence homology search algorithm utilizing the product of p-values scoring method is available for interactive use and downloading at URL http://www.sdsc.edu/MEME.
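    Method 3 (product of score p-values) has a closed-form significance for independent p-values: the probability that the product of n independent uniform p-values is at most x equals x multiplied by the truncated series of powers of -ln x. The sketch below illustrates that calculation with arbitrary example p-values; it is not the MAST implementation itself.

```python
# Sketch of method 3 from the abstract (product of score p-values): the
# significance of a product x of n independent p-values is
#   P(product <= x) = x * sum_{k=0}^{n-1} (-ln x)^k / k!
import math

def combined_p_value(p_values):
    x = math.prod(p_values)
    n = len(p_values)
    log_term = -math.log(x)
    return x * sum(log_term ** k / math.factorial(k) for k in range(n))

# Example: p-values for matching one sequence against three motifs.
print(combined_p_value([0.01, 0.2, 0.05]))   # joint significance of the product
```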

  3. siMS Score: Simple Method for Quantifying Metabolic Syndrome

    OpenAIRE

    Soldatovic, Ivan; Vukovic, Rade; Culafic, Djordje; Gajic, Milan; Dimitrijevic-Sreckovic, Vesna

    2016-01-01

    Objective To evaluate the siMS score and siMS risk score, novel continuous metabolic syndrome scores, as methods for quantification of metabolic status and risk. Materials and Methods The siMS score was calculated using the formula: siMS score = 2*Waist/Height + Gly/5.6 + Tg/1.7 + TAsystolic/130 − HDL/1.02 or 1.28 (for male or female subjects, respectively). The siMS risk score was calculated using the formula: siMS risk score = siMS score * age/45 or 50 (for male or female subjects, respectively) * famil...
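
    The quoted formula maps directly to a small function. A sketch under the formula as given above (glycemia, triglycerides and HDL assumed to be in mmol/L and systolic pressure in mmHg; the truncated family-history factor of the siMS risk score is omitted):

    ```python
    # Sketch of the siMS score as quoted above; input units are assumptions noted in the text.
    def sims_score(waist, height, glycemia, triglycerides, systolic, hdl, male=True):
        hdl_divisor = 1.02 if male else 1.28            # sex-specific HDL divisor
        return (2 * waist / height
                + glycemia / 5.6
                + triglycerides / 1.7
                + systolic / 130
                - hdl / hdl_divisor)

    # Example with plausible (non-study) values.
    print(round(sims_score(waist=94, height=178, glycemia=5.4,
                           triglycerides=1.5, systolic=128, hdl=1.1), 2))
    ```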

  4. Validation of Automated Scoring of Science Assessments

    Science.gov (United States)

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  5. The regulation of starch accumulation in Panicum maximum Jacq ...

    African Journals Online (AJOL)

    ... decrease the starch level. These observations are discussed in relation to the photosynthetic characteristics of P. maximum. Keywords: accumulation; botany; carbon assimilation; co2 fixation; growth conditions; mesophyll; metabolites; nitrogen; nitrogen levels; nitrogen supply; panicum maximum; plant physiology; starch; ...

  6. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore. 11 refs., 4 figs

  7. A portable storage maximum thermometer

    International Nuclear Information System (INIS)

    Fayart, Gerard.

    1976-01-01

    A clinical thermometer storing the voltage corresponding to the maximum temperature in an analog memory is described. The end of the measurement is indicated by a lamp switching off. The measurement time is shortened by means of a low-thermal-inertia platinum probe. This portable thermometer is fitted with a cell-test and calibration system.

  8. A Maximum Resonant Set of Polyomino Graphs

    Directory of Open Access Journals (Sweden)

    Zhang Heping

    2016-05-01

    A polyomino graph P is a connected finite subgraph of the infinite plane grid such that each finite face is surrounded by a regular square of side length one and each edge belongs to at least one square. A dimer covering of P corresponds to a perfect matching. Different dimer coverings can interact via an alternating cycle (or square) with respect to them. A set of disjoint squares of P is a resonant set if P has a perfect matching M so that each one of those squares is M-alternating. In this paper, we show that if K is a maximum resonant set of P, then P − K has a unique perfect matching. We further prove that the maximum forcing number of a polyomino graph is equal to the cardinality of a maximum resonant set. This confirms a conjecture of Xu et al. [26]. We also show that if K is a maximal alternating set of P, then P − K has a unique perfect matching.

  9. The sequential trauma score - a new instrument for the sequential mortality prediction in major trauma*

    Directory of Open Access Journals (Sweden)

    Huber-Wagner S

    2010-05-01

    Abstract Background There are several well-established scores for the assessment of the prognosis of major trauma patients, all of which have in common that they can be calculated at the earliest during the intensive care unit stay. We intended to develop a sequential trauma score (STS) that allows prognosis at several early stages based on the information that is available at a particular time. Study design In a retrospective, multicenter study using data derived from the Trauma Registry of the German Trauma Society (2002-2006), we identified the most relevant prognostic factors from the patients' basic data (P), the prehospital phase (A), and the early (B1) and late (B2) trauma room phases. Univariate and logistic regression models as well as score quality criteria and the explanatory power have been calculated. Results A total of 2,354 patients with complete data were identified. From the patients' basic data (P), logistic regression showed that age was a significant predictor of survival (AUC model P, area under the curve = 0.63). Logistic regression of the prehospital data (A) showed that blood pressure, pulse rate, Glasgow coma scale (GCS), and anisocoria were significant predictors (AUC model A = 0.76; AUC model P + A = 0.82). Logistic regression of the early trauma room phase (B1) showed peripheral oxygen saturation, GCS, anisocoria, base excess, and thromboplastin time to be significant predictors of survival (AUC model B1 = 0.78; AUC model P + A + B1 = 0.85). Multivariate analysis of the late trauma room phase (B2) detected cardiac massage, an abbreviated injury score (AIS) of the head ≥ 3, the maximum AIS, and the need for transfusion or massive blood transfusion to be the most important predictors (AUC model B2 = 0.84; AUC final model P + A + B1 + B2 = 0.90). The explanatory power - a tool for the assessment of the relative impact of each segment on mortality - is 25% for P, 7% for A, 17% for B1 and 51% for B2. A spreadsheet for the easy calculation of the sequential trauma
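
    The phase-wise modelling described above amounts to fitting a logistic regression on a cumulative set of predictors (P, then P+A, then P+A+B1, and so on) and checking the AUC at each stage. A minimal sketch with synthetic data; the feature groups only mimic the P/A/B1/B2 structure and are not registry variables.

    ```python
    # Sketch: sequential (phase-wise) logistic regression with an AUC per stage.
    # Synthetic data stands in for the trauma registry.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 1000
    X = rng.normal(size=(n, 8))                                   # 8 hypothetical predictors
    y = (X[:, 0] + 0.5 * X[:, 3] + 0.8 * X[:, 6]
         + rng.normal(size=n) > 0).astype(int)                    # synthetic survival outcome

    phases = {"P": [0], "A": [1, 2], "B1": [3, 4, 5], "B2": [6, 7]}
    used = []
    for name, cols in phases.items():
        used += cols                                              # cumulative feature set
        model = LogisticRegression(max_iter=1000).fit(X[:, used], y)
        auc = roc_auc_score(y, model.predict_proba(X[:, used])[:, 1])
        print(f"model up to phase {name}: AUC = {auc:.2f}")
    ```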

  10. Spatio-temporal observations of the tertiary ozone maximum

    Directory of Open Access Journals (Sweden)

    V. F. Sofieva

    2009-07-01

    We present spatio-temporal distributions of the tertiary ozone maximum (TOM), based on GOMOS (Global Ozone Monitoring by Occultation of Stars) ozone measurements in 2002–2006. The tertiary ozone maximum is typically observed in the high-latitude winter mesosphere at an altitude of ~72 km. Although the explanation for this phenomenon has been found recently – low concentrations of odd hydrogen cause the subsequent decrease in odd-oxygen losses – models had significant deviations from existing observations until recently. Good coverage of polar night regions by GOMOS data has allowed us, for the first time, to obtain observational spatial and temporal distributions of the night-time ozone mixing ratio in the mesosphere.

    The distributions obtained from GOMOS data have specific features, which vary from year to year. In particular, owing to the long lifetime of ozone in polar night conditions, the downward transport of polar air by the meridional circulation is clearly observed in the tertiary ozone maximum time series. Although the maximum tertiary ozone mixing ratio is achieved close to the polar night terminator (as predicted by theory), TOM can also be observed at very high latitudes, not only at the beginning and end of winter but also in mid-winter. We have compared the observational spatio-temporal distributions of the tertiary ozone maximum with those obtained using WACCM (Whole Atmosphere Community Climate Model) and found that the specific features are reproduced satisfactorily by the model.

    Since ozone in the mesosphere is very sensitive to HOx concentrations, energetic particle precipitation can significantly modify the shape of the ozone profiles. In particular, GOMOS observations have shown that the tertiary ozone maximum was temporarily destroyed during the January 2005 and December 2006 solar proton events as a result of the HOx enhancement from the increased ionization.

  11. Volleyball Scoring Systems.

    Science.gov (United States)

    Calhoun, William; Dargahi-Noubary, G. R.; Shi, Yixun

    2002-01-01

    The widespread interest in sports in our culture provides an excellent opportunity to catch students' attention in mathematics and statistics classes. One mathematically interesting aspect of volleyball, which can be used to motivate students, is the scoring system. (MM)

  12. How to calculate an MMSE score from a MODA score (and vice versa) in patients with Alzheimer's disease.

    Science.gov (United States)

    Cazzaniga, R; Francescani, A; Saetti, C; Spinnler, H

    2003-11-01

    The aim of the present study was to provide a statistically sound way of reciprocally converting scores of the mini-mental state examination (MMSE) and the Milan overall dementia assessment (MODA). A consecutive series of 182 patients with "probable" Alzheimer's disease was examined with both tests. MODA and MMSE scores proved to be highly correlated. A formula for converting MODA and MMSE scores was generated.
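
    The abstract does not quote the conversion formula itself. For paired, highly correlated scores, such a formula is typically obtained by simple linear regression on the paired measurements; the sketch below uses synthetic paired scores purely to illustrate that procedure, not the study's actual coefficients.

    ```python
    # Sketch: deriving a score-conversion equation from paired test scores.
    # The paired values are synthetic; the published formula comes from the study's data.
    import numpy as np

    rng = np.random.default_rng(1)
    moda = rng.uniform(40, 100, size=182)                   # hypothetical MODA scores
    mmse = 0.28 * moda + 2.0 + rng.normal(0, 1.5, 182)      # hypothetical correlated MMSE scores

    slope, intercept = np.polyfit(moda, mmse, deg=1)        # least-squares line MMSE ~ MODA
    print(f"MMSE ~= {slope:.2f} * MODA + {intercept:.2f}")
    print(f"MODA ~= (MMSE - {intercept:.2f}) / {slope:.2f}")  # inverted for the reverse conversion
    ```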

  13. Translation and validation of the new version of the Knee Society Score - The 2011 KS Score - into Brazilian Portuguese.

    Science.gov (United States)

    Silva, Adriana Lucia Pastore E; Croci, Alberto Tesconi; Gobbi, Riccardo Gomes; Hinckel, Betina Bremer; Pecora, José Ricardo; Demange, Marco Kawamura

    2017-01-01

    Translation, cultural adaptation, and validation of the new version of the Knee Society Score - The 2011 KS Score - into Brazilian Portuguese and verification of its measurement properties, reproducibility, and validity. In 2012, the new version of the Knee Society Score was developed and validated. This scale comprises four separate subscales: (a) objective knee score (seven items: 100 points); (b) patient satisfaction score (five items: 40 points); (c) patient expectations score (three items: 15 points); and (d) functional activity score (19 items: 100 points). A total of 90 patients aged 55-85 years were evaluated in a clinical cross-sectional study. The pre-operative translated version was applied to patients referred for total knee arthroplasty (TKA), and the post-operative translated version was applied to patients who had undergone TKA. Each patient answered the same questionnaire twice and was evaluated by two experts in orthopedic knee surgery. Evaluations were performed pre-operatively and three, six, or 12 months post-operatively. The reliability of the questionnaire was evaluated using the intraclass correlation coefficient (ICC) between the two applications, and internal consistency was evaluated using Cronbach's alpha. The ICC showed no difference between the means of the pre-operative, three-month, and six-month post-operative evaluations across the sub-scale items. The Brazilian Portuguese version of The 2011 KS Score is a valid and reliable instrument for objective and subjective evaluation of the functionality of Brazilian patients who undergo TKA and revision TKA.

  14. The BRICS (Bronchiectasis Radiologically Indexed CT Score): A Multicenter Study Score for Use in Idiopathic and Postinfective Bronchiectasis.

    Science.gov (United States)

    Bedi, Pallavi; Chalmers, James D; Goeminne, Pieter C; Mai, Cindy; Saravanamuthu, Pira; Velu, Prasad Palani; Cartlidge, Manjit K; Loebinger, Michael R; Jacob, Joe; Kamal, Faisal; Schembri, Nicola; Aliberti, Stefano; Hill, Uta; Harrison, Mike; Johnson, Christopher; Screaton, Nicholas; Haworth, Charles; Polverino, Eva; Rosales, Edmundo; Torres, Antoni; Benegas, Michael N; Rossi, Adriano G; Patel, Dilip; Hill, Adam T

    2018-05-01

    The goal of this study was to develop a simplified radiological score that could assess clinical disease severity in bronchiectasis. The Bronchiectasis Radiologically Indexed CT Score (BRICS) was devised based on a multivariable analysis of the Bhalla score and its ability to predict clinical parameters of severity. The score was then externally validated in six centers in 302 patients. A total of 184 high-resolution CT scans were scored for the validation cohort. In a multiple logistic regression model, the disease severity markers significantly associated with the Bhalla score were percent predicted FEV1, sputum purulence, and exacerbations requiring hospital admission. The components of the Bhalla score that were significantly associated with these disease severity markers were bronchial dilatation and the number of bronchopulmonary segments with emphysema. The BRICS was developed with these two parameters. The receiver operating characteristic (ROC) curve values for BRICS in the derivation cohort were 0.79 for percent predicted FEV1, 0.71 for sputum purulence, and 0.75 for hospital admissions per year; these values were 0.81, 0.70, and 0.70, respectively, in the validation cohort. Free neutrophil elastase activity in sputum was significantly elevated in the group with emphysema on CT imaging. A simplified CT scoring system can be used as an adjunct to clinical parameters to predict disease severity in patients with idiopathic and postinfective bronchiectasis. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
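
    The derivation- and validation-cohort figures above are areas under the ROC curve for the fixed BRICS value against each dichotomized severity marker. A minimal sketch of that validation computation with synthetic scores and outcomes (not the BRICS cohorts):

    ```python
    # Sketch: ROC AUC of a fixed radiological score against a dichotomized severity marker.
    # Scores and outcomes are synthetic and only illustrate the validation computation.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    ct_score = rng.integers(0, 5, size=302)                          # hypothetical 0-4 CT score
    admitted = (ct_score + rng.normal(0, 1.2, size=302) > 2.5).astype(int)

    print(f"AUC for hospital admission: {roc_auc_score(admitted, ct_score):.2f}")
    ```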

  15. Nutech functional score: A novel scoring system to assess spinal cord injury patients.

    Science.gov (United States)

    Shroff, Geeta; Barthakur, Jitendra Kumar

    2017-06-26

    To develop a new scoring system, the Nutech functional score (NFS), for assessing patients with spinal cord injury (SCI). The conventional scale, the American Spinal Injury Association (ASIA) impairment scale, is a measure that precisely describes the severity of the SCI. However, it has various limitations which lead to incomplete assessment of SCI patients. We have therefore developed a 63-point scoring system, the NFS, for patients suffering from SCI. A list of symptoms, either common or rare, that were found to be associated with SCI was recorded for each patient. These lists served as the basis for the NFS, a 63-point positional (each symptom is sub-graded and gets points based on position) and directional (moving in the direction BAD → GOOD) scoring system. For non-progressive diseases, 1, 2, 3, 4, 5 denote worst, bad, moderate, good and best (normal), respectively. The NFS for SCI is divided into groups based on the affected part of the body being assessed, i.e., motor assessment (shoulders, elbow, wrist, fingers-grasp, fingers-release, hip, knee, ankle and toe), sensory assessment, autonomic assessment, bed sore assessment and general assessment. As probability-based studies require a range of (-1, 1), or at least (0, 1), to be useful for real-world analysis, the grades were converted to respective numeric values. The NFS can be considered a unique tool to assess improvement in patients with SCI as it overcomes the limitations of the ASIA impairment scale.

  16. Renal dysfunction in liver cirrhosis and its correlation with Child-Pugh score and MELD score

    Science.gov (United States)

    Siregar, G. A.; Gurning, M.

    2018-03-01

    Renal dysfunction (RD) is a serious and common complication in patients with liver cirrhosis and carries a poor prognosis. The aim of our study was to evaluate renal function in liver cirrhosis and to determine its correlation with the grade of liver disease as assessed by the Child-Pugh score (CPS) and the MELD score. This was a cross-sectional study of patients with liver cirrhosis admitted to Adam Malik Hospital Medan in June - August 2016. We divided them into two groups, those without renal dysfunction (serum creatinine ...) and those with renal dysfunction. SPSS 22.0 was used for the analysis. Statistical methods used: Chi-square, Fisher exact, one-way ANOVA, Kruskal-Wallis test and Pearson coefficient of correlation. The level of significance was p<0.05. Of the 55 patients included, 16 (29.1%) presented with renal dysfunction. There was a statistically significant inverse correlation between GFR and CPS (r = -0.308) and between GFR and MELD score (r = -0.278), and a statistically significant positive correlation between creatinine and MELD score (r = 0.359) and between creatinine and CPS (r = 0.382). The increase in the degree of liver damage is related to an increase in renal dysfunction.
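
    The correlation analysis reported above is a plain Pearson coefficient between renal function and each severity score. A minimal sketch with synthetic values (the study's actual estimates are r = -0.278 for GFR vs. MELD and r = -0.308 for GFR vs. CPS):

    ```python
    # Sketch: Pearson correlation between GFR and a liver-severity score (e.g., MELD).
    # Values are synthetic and only illustrate the computation.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(3)
    meld = rng.uniform(6, 40, size=55)                      # hypothetical MELD scores
    gfr = 120 - 1.1 * meld + rng.normal(0, 25, size=55)     # weak inverse relation with GFR

    r, p = pearsonr(gfr, meld)
    print(f"r = {r:.3f}, p = {p:.3f}")
    ```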

  17. Association between spirometry controlled chest CT scores using computer-animated biofeedback and clinical markers of lung disease in children with cystic fibrosis

    DEFF Research Database (Denmark)

    Kongstad, Thomas; Green, Kent; Buchvald, Frederik

    2017-01-01

    Background: Computed tomography (CT) of the lungs is the gold standard for assessing the extent of structural changes in the lungs. Spirometry-controlled chest CT (SCCCT) has improved the usefulness of CT by standardising inspiratory and expiratory lung volumes during imaging. This was a single-centre cross-sectional study in children with cystic fibrosis (CF). Using SCCCT we wished to investigate the association between the quantity and extent of structural lung changes and pulmonary function outcomes, and the prevalence of known CF lung pathogens. Methods: CT images were analysed by CF-CT scoring (expressed as % of maximum score) to quantify different aspects of structural lung changes, including bronchiectasis, airway wall thickening, mucus plugging, opacities, cysts, bullae and gas trapping. Clinical markers consisted of outcomes from pulmonary function tests and microbiological cultures from sputum...

  18. Direct maximum parsimony phylogeny reconstruction from genotype data

    OpenAIRE

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-01-01

    Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data more commonly is available in the form of ge...
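
    For readers unfamiliar with the parsimony criterion the abstract optimizes, the classical Fitch count on a fixed tree is a useful reference point: it scores one candidate topology for one character, and a maximum parsimony search minimizes this score over topologies. The sketch below is that standard count on haplotype-like character states; it does not attempt the paper's harder problem of working directly from unphased genotype data.

    ```python
    # Sketch: Fitch small-parsimony count of character changes on a fixed rooted binary tree.
    def fitch(tree, states):
        """tree: nested 2-tuples with leaf-name strings; states: dict leaf -> character."""
        changes = 0
        def visit(node):
            nonlocal changes
            if isinstance(node, str):               # leaf: singleton state set
                return {states[node]}
            left, right = node
            a, b = visit(left), visit(right)
            if a & b:                               # intersection: no substitution needed here
                return a & b
            changes += 1                            # disjoint sets: count one substitution
            return a | b
        visit(tree)
        return changes

    # Hypothetical 4-taxon topology and states at a single site.
    tree = (("t1", "t2"), ("t3", "t4"))
    states = {"t1": "A", "t2": "A", "t3": "G", "t4": "A"}
    print(fitch(tree, states))                      # -> 1 change under this topology
    ```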

  19. Validating the Interpretations and Uses of Test Scores

    Science.gov (United States)

    Kane, Michael T.

    2013-01-01

    To validate an interpretation or use of test scores is to evaluate the plausibility of the claims based on the scores. An argument-based approach to validation suggests that the claims based on the test scores be outlined as an argument that specifies the inferences and supporting assumptions needed to get from test responses to score-based…

  20. A Novel Scoring System Approach to Assess Patients with Lyme Disease (Nutech Functional Score)

    OpenAIRE

    Geeta Shroff; Petra Hopf-Seidel

    2018-01-01

    Introduction: A bacterial infection by Borrelia burgdorferi referred to as Lyme disease (LD) or borreliosis is transmitted mostly by a bite of the tick Ixodes scapularis in the USA and Ixodes ricinus in Europe. Various tests are used for the diagnosis of LD, but their results are often unreliable. We compiled a list of clinically visible and patient-reported symptoms that are associated with LD. Based on this list, we developed a novel scoring system. Methodology: Nutech functional Score (NF...