WorldWideScience

Sample records for maximum parsimonious reconstruction

  1. Direct maximum parsimony phylogeny reconstruction from genotype data

    OpenAIRE

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-01-01

    Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of ge...

  2. Direct maximum parsimony phylogeny reconstruction from genotype data.

    Science.gov (United States)

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-12-05

    Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data. Phylogenetic applications for autosomal data must therefore rely on other methods for first computationally inferring haplotypes from genotypes. In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes. Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower bound on the number of mutations that the genetic region has undergone.
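
    To make the phasing ambiguity concrete, the sketch below (an editorial illustration, not the authors' algorithm) enumerates every haplotype pair consistent with a single unphased genotype; the direct method described above avoids committing to any one of these expansions. The 0/1/2 site coding is an assumption of the example.

```python
from itertools import product

def haplotype_expansions(genotype):
    """Enumerate all haplotype pairs consistent with an unphased genotype.

    Sites are coded 0 (homozygous reference), 1 (homozygous alternate)
    or 2 (heterozygous); each heterozygous site can be phased two ways.
    """
    het_sites = [i for i, g in enumerate(genotype) if g == 2]
    for bits in product((0, 1), repeat=len(het_sites)):
        h1, h2 = list(genotype), list(genotype)
        for site, b in zip(het_sites, bits):
            h1[site], h2[site] = b, 1 - b
        yield tuple(h1), tuple(h2)

# Two heterozygous sites -> 2**2 = 4 candidate phasings (2 up to
# swapping h1 and h2); the count grows exponentially with het sites.
for pair in haplotype_expansions([0, 2, 1, 2]):
    print(pair)
```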

  3. Direct maximum parsimony phylogeny reconstruction from genotype data

    Directory of Open Access Journals (Sweden)

    Ravi R

    2007-12-01

    Full Text Available Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data. Phylogenetic applications for autosomal data must therefore rely on other methods for first computationally inferring haplotypes from genotypes. Results In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes. Conclusion Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower bound on the number of mutations that the genetic region has undergone.

  4. Reconstructing phylogenetic networks using maximum parsimony.

    Science.gov (United States)

    Nakhleh, Luay; Jin, Guohua; Zhao, Fengmei; Mellor-Crummey, John

    2005-01-01

    Phylogenies - the evolutionary histories of groups of organisms - are one of the most widely used tools throughout the life sciences, as well as objects of research within systematics, evolutionary biology, epidemiology, etc. Almost every tool devised to date to reconstruct phylogenies produces trees; yet it is widely understood and accepted that trees oversimplify the evolutionary histories of many groups of organisms, most prominently bacteria (because of horizontal gene transfer) and plants (because of hybrid speciation). Various methods and criteria have been introduced for phylogenetic tree reconstruction. Parsimony is one of the most widely used and studied criteria, and various accurate and efficient heuristics for reconstructing trees based on parsimony have been devised. Jotun Hein suggested a straightforward extension of the parsimony criterion to phylogenetic networks. In this paper we formalize this concept, and provide the first experimental study of the quality of parsimony as a criterion for constructing and evaluating phylogenetic networks. Our results show that, when extended to phylogenetic networks, the parsimony criterion produces promising results. In a great majority of the cases in our experiments, the parsimony criterion accurately predicts the numbers and placements of non-tree events.

  5. Ancestral sequence reconstruction with Maximum Parsimony

    OpenAIRE

    Herbst, Lina; Fischer, Mareike

    2017-01-01

    One of the main aims in phylogenetics is the estimation of ancestral sequences based on present-day data like, for instance, DNA alignments. One way to estimate the data of the last common ancestor of a given set of species is to first reconstruct a phylogenetic tree with some tree inference method and then to use some method of ancestral state inference based on that tree. One of the best-known methods both for tree inference as well as for ancestral sequence inference is Maximum Parsimony (...

  6. FPGA Hardware Acceleration of a Phylogenetic Tree Reconstruction with Maximum Parsimony Algorithm

    OpenAIRE

    BLOCK, Henry; MARUYAMA, Tsutomu

    2017-01-01

    In this paper, we present an FPGA hardware implementation for a phylogenetic tree reconstruction with a maximum parsimony algorithm. We base our approach on a particular stochastic local search algorithm that uses the Progressive Neighborhood and the Indirect Calculation of Tree Lengths method. This method is widely used for the acceleration of the phylogenetic tree reconstruction algorithm in software. In our implementation, we define a tree structure and accelerate the search by parallel an...

  7. Improved Maximum Parsimony Models for Phylogenetic Networks.

    Science.gov (United States)

    Van Iersel, Leo; Jones, Mark; Scornavacca, Celine

    2018-05-01

    Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that makes it possible to model biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, the "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.

  8. Maximum parsimony on subsets of taxa.

    Science.gov (United States)

    Fischer, Mareike; Thatte, Bhalchandra D

    2009-09-21

    In this paper we investigate mathematical questions concerning the reliability (reconstruction accuracy) of Fitch's maximum parsimony algorithm for reconstructing the ancestral state given a phylogenetic tree and a character. In particular, we consider the question whether the maximum parsimony method applied to a subset of taxa can reconstruct the ancestral state of the root more accurately than when applied to all taxa, and we give an example showing that this indeed is possible. A surprising feature of our example is that ignoring a taxon closer to the root improves the reliability of the method. On the other hand, in the case of the two-state symmetric substitution model, we answer affirmatively a conjecture of Li, Steel and Zhang which states that under a molecular clock the probability that the state at a single taxon is a correct guess of the ancestral state is a lower bound on the reconstruction accuracy of Fitch's method applied to all taxa.
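
    For context, the Fitch algorithm analyzed above can be sketched in a few lines; this is a generic textbook implementation, not code from the paper, and the dict-based tree encoding is this example's own convention.

```python
def fitch_root_states(tree, leaf_state):
    """Fitch's small-parsimony pass on a rooted binary tree.

    `tree` maps each internal node to its pair of children and each
    leaf to None; returns (set of optimal root states, parsimony score).
    """
    score = 0

    def state_set(node):
        nonlocal score
        if tree[node] is None:            # leaf: its observed state
            return {leaf_state[node]}
        left, right = (state_set(c) for c in tree[node])
        if left & right:                  # children agree: intersect
            return left & right
        score += 1                        # children disagree: one change
        return left | right

    root_set = state_set("root")
    return root_set, score

tree = {"root": ("u", "v"), "u": ("A", "B"), "v": ("w", "C"),
        "w": ("D", "E"),
        "A": None, "B": None, "C": None, "D": None, "E": None}
states = {"A": "a", "B": "a", "C": "a", "D": "b", "E": "b"}
print(fitch_root_states(tree, states))   # ({'a'}, 1)
```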

  9. Ancestral Sequence Reconstruction with Maximum Parsimony.

    Science.gov (United States)

    Herbst, Lina; Fischer, Mareike

    2017-12-01

    One of the main aims in phylogenetics is the estimation of ancestral sequences based on present-day data like, for instance, DNA alignments. One way to estimate the data of the last common ancestor of a given set of species is to first reconstruct a phylogenetic tree with some tree inference method and then to use some method of ancestral state inference based on that tree. One of the best-known methods both for tree inference and for ancestral sequence inference is Maximum Parsimony (MP). In this manuscript, we focus on this method and on ancestral state inference for fully bifurcating trees. In particular, we investigate a conjecture published by Charleston and Steel in 1995 concerning the number of species which need to have a particular state, say a, at a particular site in order for MP to unambiguously return a as an estimate for the state of the last common ancestor. We prove the conjecture for all even numbers of character states, which is the most relevant case in biology. We also show that the conjecture does not hold in general for odd numbers of character states, but also present some positive results for this case.

  10. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  11. On the quirks of maximum parsimony and likelihood on phylogenetic networks.

    Science.gov (United States)

    Bryant, Christopher; Fischer, Mareike; Linz, Simone; Semple, Charles

    2017-03-21

    Maximum parsimony is one of the most frequently-discussed tree reconstruction methods in phylogenetic estimation. However, in recent years it has become more and more apparent that phylogenetic trees are often not sufficient to describe evolution accurately. For instance, processes like hybridization or lateral gene transfer that are commonplace in many groups of organisms and result in mosaic patterns of relationships cannot be represented by a single phylogenetic tree. This is why phylogenetic networks, which can display such events, are becoming of more and more interest in phylogenetic research. It is therefore necessary to extend concepts like maximum parsimony from phylogenetic trees to networks. Several suggestions for possible extensions can be found in recent literature, for instance the softwired and the hardwired parsimony concepts. In this paper, we analyze the so-called big parsimony problem under these two concepts, i.e. we investigate maximum parsimonious networks and analyze their properties. In particular, we show that finding a softwired maximum parsimony network is possible in polynomial time. We also show that the set of maximum parsimony networks for the hardwired definition always contains at least one phylogenetic tree. Lastly, we investigate some parallels of parsimony to different likelihood concepts on phylogenetic networks.

  12. Maximum Parsimony on Phylogenetic networks

    Science.gov (United States)

    2012-01-01

    Background Phylogenetic networks are generalizations of phylogenetic trees that are used to model evolutionary events in various contexts. Several different methods and criteria have been introduced for reconstructing phylogenetic trees. Maximum Parsimony is a character-based approach that infers a phylogenetic tree by minimizing the total number of evolutionary steps required to explain a given set of data assigned on the leaves. Exact solutions for optimizing parsimony scores on phylogenetic trees have been introduced in the past. Results In this paper, we define the parsimony score on networks as the sum of the substitution costs along all the edges of the network; and show that certain well-known algorithms that calculate the optimum parsimony score on trees, such as the Sankoff and Fitch algorithms, extend naturally to networks, barring conflicting assignments at the reticulate vertices. We provide heuristics for finding the optimum parsimony scores on networks. Our algorithms can be applied for any cost matrix that may contain unequal substitution costs of transforming between different characters along different edges of the network. We analyzed this for experimental data on 10 leaves or fewer with at most 2 reticulations and found that for almost all networks, the bounds returned by the heuristics matched with the exhaustively determined optimum parsimony scores. Conclusion The parsimony score we define here does not directly reflect the cost of the best tree in the network that displays the evolution of the character. However, when searching for the most parsimonious network that describes a collection of characters, it becomes necessary to add additional cost considerations to prefer simpler structures, such as trees over networks. The parsimony score on a network that we describe here takes into account the substitution costs along the additional edges incident on each reticulate vertex, in addition to the substitution costs along the other edges which are ...
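
    A minimal sketch of the Sankoff dynamic program on a rooted tree with an arbitrary cost matrix, i.e. the tree case that the heuristics above extend to networks; the transition/transversion costs below are illustrative assumptions, not data from the paper.

```python
import math

def sankoff(tree, leaf_state, states, cost):
    """Minimum total substitution cost over all internal labelings of a
    rooted tree, for an arbitrary cost matrix cost[x][y] (Sankoff)."""
    def node_costs(node):
        if tree[node] is None:            # leaf: 0 for the observed state
            return {s: (0.0 if s == leaf_state[node] else math.inf)
                    for s in states}
        tables = [node_costs(c) for c in tree[node]]
        return {s: sum(min(t[x] + cost[s][x] for x in states)
                       for t in tables)
                for s in states}
    return min(node_costs("root").values())

# Transversions cost twice as much as transitions (an assumed weighting);
# with all off-diagonal costs equal, this reduces to Fitch counting.
states = "ACGT"
transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
cost = {x: {y: 0 if x == y else (1 if (x, y) in transitions else 2)
            for y in states} for x in states}
tree = {"root": ("u", "v"), "u": ("t1", "t2"), "v": ("t3", "t4"),
        "t1": None, "t2": None, "t3": None, "t4": None}
obs = {"t1": "A", "t2": "G", "t3": "C", "t4": "C"}
print(sankoff(tree, obs, states, cost))   # 3 for this example
```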

  13. PTree: pattern-based, stochastic search for maximum parsimony phylogenies

    OpenAIRE

    Gregor, Ivan; Steinbrück, Lars; McHardy, Alice C.

    2013-01-01

    Phylogenetic reconstruction is vital to analyzing the evolutionary relationship of genes within and across populations of different species. Nowadays, with next generation sequencing technologies producing sets comprising thousands of sequences, robust identification of the tree topology, which is optimal according to standard criteria such as maximum parsimony, maximum likelihood or posterior probability, with phylogenetic inference methods is a computationally very demanding task. Here, we ...

  14. Bootstrap-based Support of HGT Inferred by Maximum Parsimony

    Directory of Open Access Journals (Sweden)

    Nakhleh Luay

    2010-05-01

    Full Text Available Abstract Background Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples are generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.

  15. Bootstrap-based support of HGT inferred by maximum parsimony.

    Science.gov (United States)

    Park, Hyun Jung; Jin, Guohua; Nakhleh, Luay

    2010-05-05

    Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples are generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
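
    The resampling loop described above is straightforward to sketch. In the following generic illustration, `infer_events` is a stand-in for the actual maximum parsimony network inference (NEPAL's internals are not reproduced here), and the toy event detector is invented purely to make the example run.

```python
import random

def bootstrap_support(alignment, infer_events, n_replicates=100, seed=0):
    """Resample alignment columns with replacement, re-run inference on
    each replicate, and report how often each event recurs."""
    rng = random.Random(seed)
    length = len(next(iter(alignment.values())))
    counts = {}
    for _ in range(n_replicates):
        cols = [rng.randrange(length) for _ in range(length)]
        resampled = {taxon: "".join(seq[i] for i in cols)
                     for taxon, seq in alignment.items()}
        for event in infer_events(resampled):
            counts[event] = counts.get(event, 0) + 1
    return {event: c / n_replicates for event, c in counts.items()}

# Toy stand-in for the real inference: "detect" an event for every pair
# of taxa sharing the derived state "1" at more than two sites.
def toy_events(aln):
    return {frozenset((a, b)) for a in aln for b in aln if a < b
            and sum(x == y == "1" for x, y in zip(aln[a], aln[b])) > 2}

aln = {"t1": "111000", "t2": "111100", "t3": "000111", "t4": "000011"}
print(bootstrap_support(aln, toy_events, n_replicates=50))
```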

  16. Time-Dependent-Asymmetric-Linear-Parsimonious Ancestral State Reconstruction.

    Science.gov (United States)

    Didier, Gilles

    2017-10-01

    The time-dependent-asymmetric-linear parsimony is an ancestral state reconstruction method which extends the standard linear parsimony (a.k.a. Wagner parsimony) approach by taking into account both branch lengths and asymmetric evolutionary costs for reconstructing quantitative characters (asymmetric costs amount to assuming an evolutionary trend toward the direction with the lowest cost). A formal study of the influence of the asymmetry parameter shows that the time-dependent-asymmetric-linear parsimony infers states which are all taken among the known states, except for some degenerate cases corresponding to special values of the asymmetry parameter. This remarkable property holds in particular for the Wagner parsimony. This study leads to a polynomial algorithm which determines, and provides a compact representation of, the parametric reconstruction of a phylogenetic tree, that is, for all the unknown nodes, the set of all the possible reconstructed states associated with the asymmetry parameters leading to them. The time-dependent-asymmetric-linear parsimony is finally illustrated with the parametric reconstruction of the body size of cetaceans.
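
    For orientation, the classical symmetric special case (Wagner parsimony, which the article generalizes with branch lengths and asymmetric costs) can be computed bottom-up with Farris intervals; the body-size numbers below are invented for illustration.

```python
def wagner_parsimony(tree, leaf_value):
    """Symmetric linear (Wagner) parsimony for a quantitative character
    on a rooted binary tree, via bottom-up Farris intervals.

    Returns (interval of optimal root states, minimum total cost)."""
    cost = 0.0

    def interval(node):
        nonlocal cost
        if tree[node] is None:
            v = leaf_value[node]
            return (v, v)
        (l1, u1), (l2, u2) = (interval(c) for c in tree[node])
        lo, hi = max(l1, l2), min(u1, u2)
        if lo <= hi:              # child intervals overlap: no extra cost
            return (lo, hi)
        cost += lo - hi           # otherwise pay the gap between them
        return (hi, lo)

    return interval("root"), cost

tree = {"root": ("u", "C"), "u": ("A", "B"),
        "A": None, "B": None, "C": None}
values = {"A": 2.0, "B": 6.0, "C": 10.0}
print(wagner_parsimony(tree, values))     # ((6.0, 10.0), 8.0)
```

    Note that in this example the endpoints of the root interval are themselves observed leaf values, in line with the "states taken among the known states" property discussed in the abstract.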

  17. PTree: pattern-based, stochastic search for maximum parsimony phylogenies

    Directory of Open Access Journals (Sweden)

    Ivan Gregor

    2013-06-01

    Full Text Available Phylogenetic reconstruction is vital to analyzing the evolutionary relationship of genes within and across populations of different species. Nowadays, with next generation sequencing technologies producing sets comprising thousands of sequences, robust identification of the tree topology, which is optimal according to standard criteria such as maximum parsimony, maximum likelihood or posterior probability, with phylogenetic inference methods is a computationally very demanding task. Here, we describe a stochastic search method for a maximum parsimony tree, implemented in a software package we named PTree. Our method is based on a new pattern-based technique that enables us to infer intermediate sequences efficiently where the incorporation of these sequences in the current tree topology yields a phylogenetic tree with a lower cost. Evaluation across multiple datasets showed that our method is comparable to the algorithms implemented in PAUP* or TNT, which are widely used by the bioinformatics community, in terms of topological accuracy and runtime. We show that our method can process large-scale datasets of 1,000–8,000 sequences. We believe that our novel pattern-based method enriches the current set of tools and methods for phylogenetic tree inference. The software is available under: http://algbio.cs.uni-duesseldorf.de/webapps/wa-download/.

  18. PTree: pattern-based, stochastic search for maximum parsimony phylogenies.

    Science.gov (United States)

    Gregor, Ivan; Steinbrück, Lars; McHardy, Alice C

    2013-01-01

    Phylogenetic reconstruction is vital to analyzing the evolutionary relationship of genes within and across populations of different species. Nowadays, with next generation sequencing technologies producing sets comprising thousands of sequences, robust identification of the tree topology, which is optimal according to standard criteria such as maximum parsimony, maximum likelihood or posterior probability, with phylogenetic inference methods is a computationally very demanding task. Here, we describe a stochastic search method for a maximum parsimony tree, implemented in a software package we named PTree. Our method is based on a new pattern-based technique that enables us to infer intermediate sequences efficiently where the incorporation of these sequences in the current tree topology yields a phylogenetic tree with a lower cost. Evaluation across multiple datasets showed that our method is comparable to the algorithms implemented in PAUP* or TNT, which are widely used by the bioinformatics community, in terms of topological accuracy and runtime. We show that our method can process large-scale datasets of 1,000-8,000 sequences. We believe that our novel pattern-based method enriches the current set of tools and methods for phylogenetic tree inference. The software is available under: http://algbio.cs.uni-duesseldorf.de/webapps/wa-download/.

  19. On the Quirks of Maximum Parsimony and Likelihood on Phylogenetic Networks

    OpenAIRE

    Bryant, Christopher; Fischer, Mareike; Linz, Simone; Semple, Charles

    2015-01-01

    Maximum parsimony is one of the most frequently-discussed tree reconstruction methods in phylogenetic estimation. However, in recent years it has become more and more apparent that phylogenetic trees are often not sufficient to describe evolution accurately. For instance, processes like hybridization or lateral gene transfer that are commonplace in many groups of organisms and result in mosaic patterns of relationships cannot be represented by a single phylogenetic tree. This is why phylogene...

  20. Efficient parsimony-based methods for phylogenetic network reconstruction.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2007-01-15

    Phylogenies, the evolutionary histories of groups of organisms, play a major role in representing relationships among biological entities. Although many biological processes can be effectively modeled as tree-like relationships, others, such as hybrid speciation and horizontal gene transfer (HGT), result in networks, rather than trees, of relationships. Hybrid speciation is a significant evolutionary mechanism in plants, fish and other groups of species. HGT plays a major role in bacterial genome diversification and is a significant mechanism by which bacteria develop resistance to antibiotics. Maximum parsimony is one of the most commonly used criteria for phylogenetic tree inference. Roughly speaking, inference based on this criterion seeks the tree that minimizes the amount of evolution. In 1990, Jotun Hein proposed using this criterion for inferring the evolution of sequences subject to recombination. Preliminary results on small synthetic datasets (Nakhleh et al., 2005) demonstrated the criterion's application to phylogenetic network reconstruction in general and HGT detection in particular. However, the naive algorithms used by the authors are inapplicable to large datasets due to their demanding computational requirements. Further, no rigorous theoretical analysis of computing the criterion was given, nor was it tested on biological data. In the present work we prove that the problem of scoring the parsimony of a phylogenetic network is NP-hard and provide an improved fixed parameter tractable algorithm for it. Further, we devise efficient heuristics for parsimony-based reconstruction of phylogenetic networks. We test our methods on both synthetic and biological data (rbcL gene in bacteria) and obtain very promising results.

  1. Maximum parsimony, substitution model, and probability phylogenetic trees.

    Science.gov (United States)

    Weng, J F; Thomas, D A; Mareels, I

    2011-01-01

    The problem of inferring phylogenies (phylogenetic trees) is one of the main problems in computational biology. There are three main methods for inferring phylogenies: Maximum Parsimony (MP), Distance Matrix (DM) and Maximum Likelihood (ML), of which the MP method is the most well-studied and popular method. In the MP method the optimization criterion is the number of substitutions of the nucleotides computed by the differences in the investigated nucleotide sequences. However, the MP method is often criticized as it only counts the substitutions observable at the current time and all the unobservable substitutions that really occur in the evolutionary history are omitted. In order to take into account the unobservable substitutions, some substitution models have been established and they are now widely used in the DM and ML methods but these substitution models cannot be used within the classical MP method. Recently the authors proposed a probability representation model for phylogenetic trees and the reconstructed trees in this model are called probability phylogenetic trees. One of the advantages of the probability representation model is that it can include a substitution model to infer phylogenetic trees based on the MP principle. In this paper we explain how to use a substitution model in the reconstruction of probability phylogenetic trees and show the advantage of this approach with examples.
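
    As a concrete instance of the "unobservable substitutions" point, the Jukes-Cantor model corrects an observed proportion p of differing sites to an expected number d = -(3/4) ln(1 - 4p/3) of substitutions per site; a quick numeric check follows (the generic textbook correction, not the authors' probability model).

```python
import math

def jc69_distance(p):
    """Jukes-Cantor corrected distance for an observed proportion p of
    differing sites (requires p < 0.75)."""
    return -0.75 * math.log(1 - 4.0 * p / 3.0)

for p in (0.05, 0.20, 0.40):
    print(p, round(jc69_distance(p), 3))
# 0.05 -> 0.052, 0.20 -> 0.233, 0.40 -> 0.572: the correction grows as
# multiple hits at the same site become more likely.
```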

  2. Mixed integer linear programming for maximum-parsimony phylogeny inference.

    Science.gov (United States)

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2008-01-01

    Reconstruction of phylogenetic trees is a fundamental problem in computational biology. While excellent heuristic methods are available for many variants of this problem, new advances in phylogeny inference will be required if we are to be able to continue to make effective use of the rapidly growing stores of variation data now being gathered. In this paper, we present two integer linear programming (ILP) formulations to find the most parsimonious phylogenetic tree from a set of binary variation data. One method uses a flow-based formulation that can produce exponential numbers of variables and constraints in the worst case. The method has, however, proven extremely efficient in practice on datasets that are well beyond the reach of the available provably efficient methods, solving several large mtDNA and Y-chromosome instances within a few seconds and giving provably optimal results in times competitive with fast heuristics that cannot guarantee optimality. An alternative formulation establishes that the problem can be solved with a polynomial-sized ILP. We further present a web server developed based on the exponential-sized ILP that performs fast maximum parsimony inferences and serves as a front end to a database of precomputed phylogenies spanning the human genome.
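
    For scale, here is a brute-force reference for what these ILP formulations optimize: enumerate every unrooted topology by stepwise insertion and score each with Fitch. This editorial sketch is feasible only for a handful of taxa (3, 15, 105, ... topologies), which is precisely why ILP and heuristic methods matter.

```python
from itertools import count

def all_unrooted_trees(taxa):
    """Yield every unrooted binary topology on `taxa` as a set of edges,
    built by inserting taxa one at a time into every existing edge."""
    ids = count()

    def build(edges, remaining):
        if not remaining:
            yield edges
            return
        taxon, rest = remaining[0], remaining[1:]
        for edge in list(edges):
            mid = next(ids)
            grown = (edges - {edge}) | {(edge[0], mid), (mid, edge[1]),
                                        (mid, taxon)}
            yield from build(grown, rest)

    hub = next(ids)
    yield from build({(taxa[0], hub), (taxa[1], hub), (taxa[2], hub)},
                     taxa[3:])

def fitch_length(edges, data):
    """Parsimony length of one topology: Fitch applied per site, with
    the tree rooted (arbitrarily) at the first leaf."""
    adj = {}
    for x, y in edges:
        adj.setdefault(x, []).append(y)
        adj.setdefault(y, []).append(x)
    root, length = next(iter(data)), 0
    for site in range(len(next(iter(data.values())))):
        def state_set(node, parent):
            nonlocal length
            s = {data[node][site]} if node in data else None
            for child in adj[node]:
                if child == parent:
                    continue
                cs = state_set(child, node)
                if s is None:
                    s = cs
                elif s & cs:
                    s = s & cs
                else:
                    length += 1
                    s = s | cs
            return s
        state_set(root, None)
    return length

data = {"A": "00", "B": "00", "C": "11", "D": "11"}
best = min(all_unrooted_trees(list(data)), key=lambda t: fitch_length(t, data))
print(fitch_length(best, data))    # 2: the split AB|CD is optimal
```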

  3. MPBoot: fast phylogenetic maximum parsimony tree inference and bootstrap approximation.

    Science.gov (United States)

    Hoang, Diep Thi; Vinh, Le Sy; Flouri, Tomáš; Stamatakis, Alexandros; von Haeseler, Arndt; Minh, Bui Quang

    2018-02-02

    The nonparametric bootstrap is widely used to measure the branch support of phylogenetic trees. However, bootstrapping is computationally expensive and remains a bottleneck in phylogenetic analyses. Recently, an ultrafast bootstrap approximation (UFBoot) approach was proposed for maximum likelihood analyses. However, such an approach is still missing for maximum parsimony. To close this gap we present MPBoot, an adaptation and extension of UFBoot to compute branch supports under the maximum parsimony principle. MPBoot works for both uniform and non-uniform cost matrices. Our analyses on biological DNA and protein data showed that under uniform cost matrices, MPBoot runs on average 4.7 (DNA) to 7 times (protein data) (range: 1.2-20.7) faster than the standard parsimony bootstrap implemented in PAUP*; but 1.6 (DNA) to 4.1 times (protein data) slower than the standard bootstrap with a fast search routine in TNT (fast-TNT). However, for non-uniform cost matrices MPBoot is 5 (DNA) to 13 times (protein data) (range: 0.3-63.9) faster than fast-TNT. We note that MPBoot achieves better scores more frequently than PAUP* and fast-TNT. However, this effect is less pronounced if an intensive but slower search in TNT is invoked. Moreover, experiments on large-scale simulated data show that while both PAUP* and TNT bootstrap estimates are too conservative, MPBoot bootstrap estimates appear more unbiased. MPBoot provides an efficient alternative to the standard maximum parsimony bootstrap procedure. It shows favorable performance in terms of run time, the capability of finding a maximum parsimony tree, and high bootstrap accuracy on simulated as well as empirical data sets. MPBoot is easy-to-use, open-source and available at http://www.cibiv.at/software/mpboot .

  4. Systematics and morphological evolution within the moss family Bryaceae: a comparison between parsimony and Bayesian methods for reconstruction of ancestral character states.

    Science.gov (United States)

    Pedersen, Niklas; Holyoak, David T; Newton, Angela E

    2007-06-01

    The Bryaceae are a large cosmopolitan moss family including genera of significant morphological and taxonomic complexity. Phylogenetic relationships within the Bryaceae were reconstructed based on DNA sequence data from all three genomic compartments. In addition, maximum parsimony and Bayesian inference were employed to reconstruct ancestral character states of 38 morphological plus four habitat characters and eight insertion/deletion events. The recovered phylogenetic patterns are generally in accord with previous phylogenies based on chloroplast DNA sequence data and three major clades are identified. The first clade comprises Bryum bornholmense, B. rubens, B. caespiticium, and Plagiobryum. This corroborates the hypothesis suggested by previous studies that several Bryum species are more closely related to Plagiobryum than to the core Bryum species. The second clade includes Acidodontium, Anomobryum, and Haplodontium, while the third clade contains the core Bryum species plus Imbribryum. Within the latter clade, B. subapiculatum and B. tenuisetum form the sister clade to Imbribryum. Reconstructions of ancestral character states under maximum parsimony and Bayesian inference suggest fourteen morphological synapomorphies for the ingroup and synapomorphies are detected for most clades within the ingroup. Maximum parsimony and Bayesian reconstructions of ancestral character states are mostly congruent although Bayesian inference shows that the posterior probability of ancestral character states may decrease dramatically when node support is taken into account. Bayesian inference also indicates that reconstructions may be ambiguous at internal nodes for highly polymorphic characters.

  5. The worst case complexity of maximum parsimony.

    Science.gov (United States)

    Carmel, Amir; Musa-Lempel, Noa; Tsur, Dekel; Ziv-Ukelson, Michal

    2014-11-01

    One of the core classical problems in computational biology is that of constructing the most parsimonious phylogenetic tree interpreting an input set of sequences from the genomes of evolutionarily related organisms. We reexamine the classical maximum parsimony (MP) optimization problem for the general (asymmetric) scoring matrix case, where rooted phylogenies are implied, and analyze the worst case bounds of three approaches to MP: The approach of Cavalli-Sforza and Edwards, the approach of Hendy and Penny, and a new agglomerative, "bottom-up" approach we present in this article. We show that the second and third approaches are faster than the first one by a factor of Θ(√n) and Θ(n), respectively, where n is the number of species.

  6. A unifying model of genome evolution under parsimony.

    Science.gov (United States)

    Paten, Benedict; Zerbino, Daniel R; Hickey, Glenn; Haussler, David

    2014-06-19

    Parsimony and maximum likelihood methods of phylogenetic tree estimation and parsimony methods for genome rearrangements are central to the study of genome evolution, yet to date they have largely been pursued in isolation. We present a data structure called a history graph that offers a practical basis for the analysis of genome evolution. It conceptually simplifies the study of parsimonious evolutionary histories by representing both substitutions and double cut and join (DCJ) rearrangements in the presence of duplications. The problem of constructing parsimonious history graphs thus subsumes related maximum parsimony problems in the fields of phylogenetic reconstruction and genome rearrangement. We show that tractable functions can be used to define upper and lower bounds on the minimum number of substitutions and DCJ rearrangements needed to explain any history graph. These bounds become tight for a special type of unambiguous history graph called an ancestral variation graph (AVG), which constrains in its combinatorial structure the number of operations required. We finally demonstrate that for a given history graph G, a finite set of AVGs describe all parsimonious interpretations of G, and this set can be explored with a few sampling moves. This theoretical study describes a model in which the inference of genome rearrangements and phylogeny can be unified under parsimony.

  7. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history.

    Science.gov (United States)

    Cherry, Joshua L

    2017-02-23

    Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data. The algorithm is applied to bacterial data sets containing up to nearly 2000 genomes with several thousand variable nucleotide sites. Run times are several seconds or less. Computational experiments show that maximum compatibility is less sensitive than maximum parsimony to the inclusion of nucleotide data that, though derived from actual sequence reads, has been identified as likely to be misleading. Maximum compatibility is a useful tool for certain phylogenetic problems, such as inferring the relationships among closely-related bacteria from whole-genome sequence data. The algorithm presented here rapidly solves fairly large problems of this type, and provides robustness against misleading characters that can pollute large-scale sequencing data.
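
    For binary characters, pairwise compatibility reduces to the classical four-gamete test, sketched below; note that the algorithm described above additionally handles ambiguous states and solves a maximum clique problem, neither of which this illustration attempts.

```python
from itertools import combinations

def compatible(char_i, char_j):
    """Four-gamete test: two binary characters can evolve on a common
    tree with one change each iff not all of 00, 01, 10, 11 occur."""
    return len(set(zip(char_i, char_j))) < 4

def pairwise_compatibility(matrix):
    """Compatibility of every pair of columns of a 0/1 matrix
    (rows are taxa, columns are characters)."""
    cols = list(zip(*matrix))
    return {(i, j): compatible(cols[i], cols[j])
            for i, j in combinations(range(len(cols)), 2)}

matrix = [(0, 0, 0),   # taxon 1
          (0, 0, 1),   # taxon 2
          (1, 1, 0),   # taxon 3
          (1, 1, 1)]   # taxon 4
print(pairwise_compatibility(matrix))
# {(0, 1): True, (0, 2): False, (1, 2): False}
```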

  8. On the Accuracy of Ancestral Sequence Reconstruction for Ultrametric Trees with Parsimony.

    Science.gov (United States)

    Herbst, Lina; Fischer, Mareike

    2018-04-01

    We examine a mathematical question concerning the reconstruction accuracy of the Fitch algorithm for reconstructing the ancestral sequence of the most recent common ancestor given a phylogenetic tree and sequence data for all taxa under consideration. In particular, for the symmetric four-state substitution model which is also known as Jukes-Cantor model, we answer affirmatively a conjecture of Li, Steel and Zhang which states that for any ultrametric phylogenetic tree and a symmetric model, the Fitch parsimony method using all terminal taxa is more accurate, or at least as accurate, for ancestral state reconstruction than using any particular terminal taxon or any particular pair of taxa. This conjecture had so far only been answered for two-state data by Fischer and Thatte. Here, we focus on answering the biologically more relevant case with four states, which corresponds to ancestral sequence reconstruction from DNA or RNA data.

  9. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were ...

  10. Assessing the accuracy of ancestral protein reconstruction methods.

    Directory of Open Access Journals (Sweden)

    Paul D Williams

    2006-06-01

    Full Text Available The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.

  11. Assessing the accuracy of ancestral protein reconstruction methods.

    Science.gov (United States)

    Williams, Paul D; Pollock, David D; Blackburne, Benjamin P; Goldstein, Richard A

    2006-06-23

    The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.
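
    The direction of the reported bias can be seen with one line of arithmetic. Suppose (numbers invented for illustration) a stabilizing residue contributes 1.0 units of stability and a detrimental variant 0.2, and the variant was truly present at 30% of such sites: an argmax reconstruction reports the stabilizing residue everywhere, while posterior averaging does not.

```python
# Hypothetical per-site posterior and stability contributions.
p_best, s_best, s_variant = 0.7, 1.0, 0.2
best_guess_estimate = s_best                    # argmax always picks the modal residue
posterior_estimate = p_best * s_best + (1 - p_best) * s_variant
print(best_guess_estimate, posterior_estimate)  # 1.0 vs 0.76: argmax biased upward
```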

  12. Inferring phylogenetic networks by the maximum parsimony criterion: a case study.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2007-01-01

    Horizontal gene transfer (HGT) may result in genes whose evolutionary histories disagree with each other, as well as with the species tree. In this case, reconciling the species and gene trees results in a network of relationships, known as the "phylogenetic network" of the set of species. A phylogenetic network that incorporates HGT consists of an underlying species tree that captures vertical inheritance and a set of edges which model the "horizontal" transfer of genetic material. In a series of papers, Nakhleh and colleagues have recently formulated a maximum parsimony (MP) criterion for phylogenetic networks, provided an array of computationally efficient algorithms and heuristics for computing it, and demonstrated its plausibility on simulated data. In this article, we study the performance and robustness of this criterion on biological data. Our findings indicate that MP is very promising when its application is extended to the domain of phylogenetic network reconstruction and HGT detection. In all cases we investigated, the MP criterion detected the correct number of HGT events required to map the evolutionary history of a gene data set onto the species phylogeny. Furthermore, our results indicate that the criterion is robust with respect to both incomplete taxon sampling and the use of different site substitution matrices. Finally, our results show that the MP criterion is very promising in detecting HGT in chimeric genes, whose evolutionary histories are a mix of vertical and horizontal evolution. Besides the performance analysis of MP, our findings offer new insights into the evolution of 4 biological data sets and new possible explanations of HGT scenarios in their evolutionary history.

  13. Dirichlet Process Parsimonious Mixtures for clustering

    OpenAIRE

    Chamroukhi, Faicel; Bartcus, Marius; Glotin, Hervé

    2015-01-01

    The parsimonious Gaussian mixture models, which exploit an eigenvalue decomposition of the group covariance matrices of the Gaussian mixture, have shown their success in particular in cluster analysis. Their estimation is in general performed by maximum likelihood estimation and has also been considered from a parametric Bayesian perspective. We propose new Dirichlet Process Parsimonious mixtures (DPPM) which represent a Bayesian nonparametric formulation of these parsimonious Gaussian mixtur...

  14. Bayesian, Maximum Parsimony and UPGMA Models for Inferring the Phylogenies of Antelopes Using Mitochondrial Markers

    OpenAIRE

    Khan, Haseeb A.; Arif, Ibrahim A.; Bahkali, Ali H.; Al Farhan, Ahmad H.; Al Homaidan, Ali A.

    2008-01-01

    This investigation aimed to compare the inference of antelope phylogenies resulting from the 16S rRNA, cytochrome-b (cyt-b) and d-loop segments of mitochondrial DNA using three different computational models including Bayesian (BA), maximum parsimony (MP) and unweighted pair group method with arithmetic mean (UPGMA). The respective nucleotide sequences of three Oryx species (Oryx leucoryx, Oryx dammah and Oryx gazella) and an out-group (Addax nasomaculatus) were aligned and subjected to B...
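
    Of the three approaches compared, UPGMA is simple enough to sketch directly (BA and MP need substantially more machinery); the distance matrix below is invented for illustration, not the study's mitochondrial data.

```python
def upgma(dist, names):
    """UPGMA: repeatedly merge the two closest clusters, averaging their
    distances to the others weighted by cluster size."""
    clusters = {i: (names[i], 1) for i in range(len(names))}
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    fresh = len(names)                 # label for the next merged cluster
    while len(clusters) > 1:
        a, b = min(d, key=d.get)
        (ta, na), (tb, nb) = clusters.pop(a), clusters.pop(b)
        new = {(c, fresh): (na * d[(min(a, c), max(a, c))]
                            + nb * d[(min(b, c), max(b, c))]) / (na + nb)
               for c in clusters}
        d = {k: v for k, v in d.items() if a not in k and b not in k}
        d.update(new)
        clusters[fresh] = ((ta, tb), na + nb)
        fresh += 1
    return next(iter(clusters.values()))[0]

names = ["leucoryx", "dammah", "gazella", "addax"]
D = [[0, 2, 4, 8],
     [2, 0, 4, 8],
     [4, 4, 0, 8],
     [8, 8, 8, 0]]
print(upgma(D, names))
# ('addax', ('gazella', ('leucoryx', 'dammah')))
```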

  15. A Maximum Parsimony Model to Reconstruct Phylogenetic Network in Honey Bee Evolution

    OpenAIRE

    Usha Chouhan; K. R. Pardasani

    2007-01-01

    Phylogenies ; The evolutionary histories of groups of species are one of the most widely used tools throughout the life sciences, as well as objects of research with in systematic, evolutionary biology. In every phylogenetic analysis reconstruction produces trees. These trees represent the evolutionary histories of many groups of organisms, bacteria due to horizontal gene transfer and plants due to process of hybridization. The process of gene transfer in bacteria and hyb...

  16. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  17. A Short Proof that Phylogenetic Tree Reconstruction by Maximum Likelihood is Hard

    OpenAIRE

    Roch, S.

    2005-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  18. REGEN: Ancestral Genome Reconstruction for Bacteria.

    Science.gov (United States)

    Yang, Kuan; Heath, Lenwood S; Setubal, João C

    2012-07-18

    Ancestral genome reconstruction can be understood as a phylogenetic study with more details than a traditional phylogenetic tree reconstruction. We present a new computational system called REGEN for ancestral bacterial genome reconstruction at both the gene and replicon levels. REGEN reconstructs gene content, contiguous gene runs, and replicon structure for each ancestral genome. Along each branch of the phylogenetic tree, REGEN infers evolutionary events, including gene creation and deletion and replicon fission and fusion. The reconstruction can be performed by either a maximum parsimony or a maximum likelihood method. Gene content reconstruction is based on the concept of neighboring gene pairs. REGEN was designed to be used with any set of genomes that are sufficiently related, which will usually be the case for bacteria within the same taxonomic order. We evaluated REGEN using simulated genomes and genomes in the Rhizobiales order.

  19. A mixed integer linear programming model to reconstruct phylogenies from single nucleotide polymorphism haplotypes under the maximum parsimony criterion

    Science.gov (United States)

    2013-01-01

    Background Phylogeny estimation from aligned haplotype sequences has attracted more and more attention in recent years due to its importance in the analysis of many fine-scale genetic data. Its application fields range from medical research, to drug discovery, to epidemiology, to population dynamics. The literature on molecular phylogenetics proposes a number of criteria for selecting a phylogeny from among plausible alternatives. Usually, such criteria can be expressed by means of objective functions, and the phylogenies that optimize them are referred to as optimal. One of the most important estimation criteria is parsimony, which states that the optimal phylogeny T∗ for a set H of n haplotype sequences over a common set of variable loci is the one that satisfies the following requirements: (i) it has the shortest length and (ii) it is such that, for each pair of distinct haplotypes hi, hj ∈ H, the sum of the edge weights belonging to the path from hi to hj in T∗ is not smaller than the observed number of changes between hi and hj. Finding the most parsimonious phylogeny for H involves solving an optimization problem, called the Most Parsimonious Phylogeny Estimation Problem (MPPEP), which is NP-hard in many of its versions. Results In this article we investigate a recent version of the MPPEP that arises when input data consist of single nucleotide polymorphism haplotypes extracted from a population of individuals on a common genomic region. Specifically, we explore the prospects for improving on the implicit enumeration strategy used in previous work, using a novel problem formulation and a series of strengthening valid inequalities and preliminary symmetry breaking constraints to more precisely bound the solution space and accelerate implicit enumeration of possible optimal phylogenies. We present the basic formulation and then introduce a series of provable valid constraints to reduce the solution space. We then prove ...
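
    Requirement (ii) above is easy to state as code; the checker below verifies it for a candidate weighted tree (an illustration of the criterion only, not of the MILP formulation; the encoding and example values are this sketch's own).

```python
from collections import deque

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

def satisfies_path_condition(adj, haplotypes):
    """True iff, for every pair of input haplotypes, the weighted path
    between them in the tree is at least their Hamming distance.

    `adj` maps node -> list of (neighbor, edge weight); `haplotypes`
    maps the observed nodes to their sequences."""
    def dists_from(src):              # paths in a tree are unique: BFS suffices
        dist, queue = {src: 0}, deque([src])
        while queue:
            u = queue.popleft()
            for v, w in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + w
                    queue.append(v)
        return dist

    nodes = list(haplotypes)
    for i, u in enumerate(nodes):
        d = dists_from(u)
        if any(d[v] < hamming(haplotypes[u], haplotypes[v])
               for v in nodes[i + 1:]):
            return False
    return True

adj = {"c": [("h1", 1), ("h2", 1), ("h3", 2)],
       "h1": [("c", 1)], "h2": [("c", 1)], "h3": [("c", 2)]}
H = {"h1": "000", "h2": "011", "h3": "110"}
print(satisfies_path_condition(adj, H))   # True: all paths are long enough
```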

  20. Automatic maximum entropy spectral reconstruction in NMR

    International Nuclear Information System (INIS)

    Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.

    2007-01-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system
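
    A toy 1D analogue of the reconstruction problem may help fix ideas: recover a nonnegative "spectrum" from nonuniformly sampled time-domain points by trading entropy against data fidelity. Real MaxEnt NMR treats the fidelity term as a hard constraint; the penalty weight `lam` below is a crude stand-in, and it is exactly the kind of non-intuitive parameter the automated system described above chooses for the user.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 64
true = np.zeros(n)
true[[10, 30]] = [1.0, 0.6]                      # two spectral peaks
idft = np.fft.ifft(np.eye(n), axis=0)            # frequency -> time
schedule = np.sort(rng.choice(n, size=24, replace=False))  # nonuniform
A, y = idft[schedule], idft[schedule] @ true     # sampled time data

lam = 100.0                                      # fidelity weight (ad hoc)
def objective(x):
    entropy = -np.sum(x * np.log(x + 1e-12))
    chi2 = np.sum(np.abs(A @ x - y) ** 2)
    return -entropy + lam * chi2                 # maximize entropy s.t. fit

res = minimize(objective, np.full(n, 0.1),
               bounds=[(0, None)] * n, method="L-BFGS-B")
print(np.argsort(res.x)[-2:])                    # expected near [30 10]
```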

  1. REGEN: Ancestral Genome Reconstruction for Bacteria

    Directory of Open Access Journals (Sweden)

    João C. Setubal

    2012-07-01

    Full Text Available Ancestral genome reconstruction can be understood as a phylogenetic study with more details than a traditional phylogenetic tree reconstruction. We present a new computational system called REGEN for ancestral bacterial genome reconstruction at both the gene and replicon levels. REGEN reconstructs gene content, contiguous gene runs, and replicon structure for each ancestral genome. Along each branch of the phylogenetic tree, REGEN infers evolutionary events, including gene creation and deletion and replicon fission and fusion. The reconstruction can be performed by either a maximum parsimony or a maximum likelihood method. Gene content reconstruction is based on the concept of neighboring gene pairs. REGEN was designed to be used with any set of genomes that are sufficiently related, which will usually be the case for bacteria within the same taxonomic order. We evaluated REGEN using simulated genomes and genomes in the Rhizobiales order.

  2. Bayesian methods outperform parsimony but at the expense of precision in the estimation of phylogeny from discrete morphological data.

    Science.gov (United States)

    O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J

    2016-04-01

    Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal and implied-weight implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, and implied-weights performs the most poorly. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogeny reconstruction. © 2016 The Authors.

  3. Ancestral sequence reconstruction in primate mitochondrial DNA: compositional bias and effect on functional inference.

    Science.gov (United States)

    Krishnan, Neeraja M; Seligmann, Hervé; Stewart, Caro-Beth; De Koning, A P Jason; Pollock, David D

    2004-10-01

    Reconstruction of ancestral DNA and amino acid sequences is an important means of inferring information about past evolutionary events. Such reconstructions suggest changes in molecular function and evolutionary processes over the course of evolution and are used to infer adaptation and convergence. Maximum likelihood (ML) is generally thought to provide relatively accurate reconstructed sequences compared to parsimony, but both methods lead to the inference of multiple directional changes in nucleotide frequencies in primate mitochondrial DNA (mtDNA). To better understand this surprising result, as well as to better understand how parsimony and ML differ, we constructed a series of computationally simple "conditional pathway" methods that differed in the number of substitutions allowed per site along each branch, and we also evaluated the entire Bayesian posterior frequency distribution of reconstructed ancestral states. We analyzed primate mitochondrial cytochrome b (Cyt-b) and cytochrome oxidase subunit I (COI) genes and found that ML reconstructs ancestral frequencies that are often more different from tip sequences than are parsimony reconstructions. In contrast, frequency reconstructions based on the posterior ensemble more closely resemble extant nucleotide frequencies. Simulations indicate that these differences in ancestral sequence inference are probably due to deterministic bias caused by high uncertainty in the optimization-based ancestral reconstruction methods (parsimony, ML, Bayesian maximum a posteriori). In contrast, ancestral nucleotide frequencies based on an average of the Bayesian set of credible ancestral sequences are much less biased. The methods involving simpler conditional pathway calculations have slightly reduced likelihood values compared to full likelihood calculations, but they can provide fairly unbiased nucleotide reconstructions and may be useful in more complex phylogenetic analyses than considered here due to their speed and

  4. Advanced methods for solving the maximum parsimony problem

    OpenAIRE

    Vazquez Ortiz, Karla Esmeralda

    2016-01-01

    Phylogenetic reconstruction is considered a central underpinning of diverse fields like ecology, molecular biology and physiology, where genealogical relationships of species or gene sequences represented as trees can provide the most meaningful insights into biology. Maximum Parsimony (MP) is an important approach to phylogenetic reconstruction, based on an optimality criterion under which the tree that minimizes the total number of genetic transformations is preferred. In this thesis...
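
    The MP criterion is easy to make concrete: for a fixed tree, the classic Fitch algorithm computes the minimum number of state changes per site. A minimal Python sketch follows; the tree encoding and the site pattern are illustrative placeholders, not material from the thesis.

      # Minimal Fitch small-parsimony scoring for one site on a rooted binary
      # tree. A tree is either a leaf state such as "A" or a (left, right) pair.
      def fitch(tree):
          """Return (candidate state set, minimum number of changes)."""
          if isinstance(tree, str):            # leaf: observed nucleotide
              return {tree}, 0
          (ls, lc), (rs, rc) = fitch(tree[0]), fitch(tree[1])
          common = ls & rs
          if common:                           # children can agree on a state
              return common, lc + rc
          return ls | rs, lc + rc + 1          # disagreement costs one change

      # Example: the site pattern ((A,C),(A,A)) needs exactly one change.
      print(fitch((("A", "C"), ("A", "A"))))   # ({'A'}, 1)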

  5. Simultaneous maximum a posteriori longitudinal PET image reconstruction

    Science.gov (United States)

    Ellis, Sam; Reader, Andrew J.

    2017-09-01

    Positron emission tomography (PET) is frequently used to monitor functional changes that occur over extended time scales, for example in longitudinal oncology PET protocols that include routine clinical follow-up scans to assess the efficacy of a course of treatment. In these contexts PET datasets are currently reconstructed into images using single-dataset reconstruction methods. Inspired by recently proposed joint PET-MR reconstruction methods, we propose to reconstruct longitudinal datasets simultaneously by using a joint penalty term in order to exploit the high degree of similarity between longitudinal images. We achieved this by penalising voxel-wise differences between pairs of longitudinal PET images in a one-step-late maximum a posteriori (MAP) fashion, resulting in the MAP simultaneous longitudinal reconstruction (SLR) method. The proposed method reduced reconstruction errors and visually improved images relative to standard maximum likelihood expectation-maximisation (ML-EM) in simulated 2D longitudinal brain tumour scans. In reconstructions of split real 3D data with inserted simulated tumours, noise across images reconstructed with MAP-SLR was reduced to levels equivalent to doubling the number of detected counts when using ML-EM. Furthermore, quantification of tumour activities was largely preserved over a variety of longitudinal tumour changes, including changes in size and activity, with larger changes inducing larger biases relative to standard ML-EM reconstructions. Similar improvements were observed for a range of counts levels, demonstrating the robustness of the method when used with a single penalty strength. The results suggest that longitudinal regularisation is a simple but effective method of improving reconstructed PET images without using resolution degrading priors.
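
    The coupling idea can be sketched as a one-step-late MAP-EM update in which the gradient of a quadratic voxel-wise difference penalty is evaluated at the current estimate. The following two-scan sketch uses placeholder inputs (system matrix A, sinograms y1 and y2, penalty strength beta) and is a schematic illustration of the idea, not the authors' implementation.

      import numpy as np

      def osl_map_slr(A, y1, y2, beta, n_iter=50):
          """One-step-late MAP-EM for two coupled longitudinal datasets.
          A: (m, n) system matrix; y1, y2: (m,) sinograms; beta: coupling
          strength (kept small so the modified denominator stays positive)."""
          x1 = np.ones(A.shape[1])
          x2 = np.ones(A.shape[1])
          sens = A.T @ np.ones(A.shape[0])          # sensitivity image
          for _ in range(n_iter):
              g1 = beta * (x1 - x2)                 # penalty gradients, evaluated
              g2 = beta * (x2 - x1)                 # "one step late"
              x1 = x1 / (sens + g1) * (A.T @ (y1 / (A @ x1 + 1e-12)))
              x2 = x2 / (sens + g2) * (A.T @ (y2 / (A @ x2 + 1e-12)))
          return x1, x2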

  6. Maximum entropy reconstructions for crystallographic imaging; Cristallographie et reconstruction d'images par maximum d'entropie

    Energy Technology Data Exchange (ETDEWEB)

    Papoular, R

    1997-07-01

    The Fourier Transform is of central importance to Crystallography, since it allows the visualization in real space of three-dimensional scattering densities pertaining to physical systems from diffraction data (powder or single-crystal diffraction, using x-rays, neutrons, electrons or other probes). In turn, this visualization makes it possible to model and parametrize these systems, the crystal structures of which are eventually refined by Least-Squares techniques (e.g., the Rietveld method in the case of Powder Diffraction). The Maximum Entropy Method (sometimes called MEM or MaxEnt) is a general imaging technique, related to solving ill-conditioned inverse problems. It is ideally suited for tackling underdetermined systems of linear equations (in which the number of variables is much larger than the number of equations). It is already being applied successfully in Astronomy, Radioastronomy and Medical Imaging. The advantages of using Maximum Entropy over conventional Fourier and 'difference Fourier' syntheses stem from the following facts: MaxEnt takes the experimental error bars into account; MaxEnt incorporates prior knowledge (e.g., the positivity of the scattering density in some instances); MaxEnt allows density reconstructions from incompletely phased data, as well as from overlapping Bragg reflections; MaxEnt substantially reduces the truncation errors to which conventional experimental Fourier reconstructions are usually prone. The principles of Maximum Entropy imaging as applied to Crystallography are first presented. The method is then illustrated by a detailed example specific to Neutron Diffraction: the search for protons in solids. (author). 17 refs.

  7. Parsimonious relevance models

    NARCIS (Netherlands)

    Meij, E.; Weerkamp, W.; Balog, K.; de Rijke, M.; Myang, S.-H.; Oard, D.W.; Sebastiani, F.; Chua, T.-S.; Leong, M.-K.

    2008-01-01

    We describe a method for applying parsimonious language models to re-estimate the term probabilities assigned by relevance models. We apply our method to six topic sets from test collections in five different genres. Our parsimonious relevance models (i) improve retrieval effectiveness in terms of

  8. Reconstruction of ancestral RNA sequences under multiple structural constraints.

    Science.gov (United States)

    Tremblay-Savard, Olivier; Reinharz, Vladimir; Waldispühl, Jérôme

    2016-11-11

    Secondary structures form the scaffold of multiple sequence alignment of non-coding RNA (ncRNA) families. An accurate reconstruction of ancestral ncRNAs must use this structural signal. However, the inference of ancestors of a single ncRNA family with a single consensus structure may bias the results towards sequences with high affinity to this structure, which are far from the true ancestors. In this paper, we introduce achARNement, a maximum parsimony approach that, given two alignments of homologous ncRNA families with consensus secondary structures and a phylogenetic tree, simultaneously calculates ancestral RNA sequences for these two families. We test our methodology on simulated data sets, and show that achARNement outperforms classical maximum parsimony approaches in terms of accuracy, but also reduces by several orders of magnitude the number of candidate sequences. To conclude this study, we apply our algorithms on the Glm clan and the FinP-traJ clan from the Rfam database. Our results show that our methods reconstruct small sets of high-quality candidate ancestors with better agreement to the two target structures than with classical approaches. Our program is freely available at: http://csb.cs.mcgill.ca/acharnement .

  9. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.
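
    The conventional reconstruction that the paper analyzes is dictionary matching: each voxel's time course is assigned the tissue parameters of the dictionary atom with the largest normalized inner product. A toy numpy sketch with a random, made-up dictionary (not the authors' code):

      import numpy as np

      def mrf_dictionary_match(signals, dictionary, params):
          """Assign each voxel the parameters of the best-matching atom.
          signals: (n_vox, T) complex; dictionary: (n_atoms, T); params: (n_atoms, p)."""
          d_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
          corr = np.abs(signals @ d_norm.conj().T)   # matched-filter scores
          return params[corr.argmax(axis=1)]

      # Toy usage: scaled copies of two atoms are matched back to themselves.
      rng = np.random.default_rng(0)
      D = rng.standard_normal((100, 50)) + 1j * rng.standard_normal((100, 50))
      theta = rng.uniform(0.1, 3.0, (100, 2))        # e.g. (T1, T2) per atom
      vox = D[[3, 42]] * 2.0
      print(np.allclose(mrf_dictionary_match(vox, D, theta), theta[[3, 42]]))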

  10. Maximum entropy method in momentum density reconstruction

    International Nuclear Information System (INIS)

    Dobrzynski, L.; Holas, A.

    1997-01-01

    The Maximum Entropy Method (MEM) is applied to the reconstruction of three-dimensional electron momentum density distributions observed through sets of Compton profiles measured along various crystallographic directions. It is shown that the reconstruction of electron momentum density may be reliably carried out with the aid of a simple iterative algorithm originally suggested by Collins. A number of distributions have been simulated in order to check the performance of MEM. It is shown that MEM can be recommended as a model-free approach. (author). 13 refs, 1 fig
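
    The flavour of such iterative MEM schemes can be conveyed by a generic multiplicative (exponentiated-gradient) update that preserves positivity while driving the data residuals to zero. This is a schematic stand-in with a hand-picked step size, not Collins' published algorithm:

      import numpy as np

      def mem_reconstruct(A, g, lam=0.05, n_iter=2000):
          """Generic maximum-entropy fit of a positive density f to data g ~ A f.
          The multiplicative step keeps f positive; lam is a hand-picked step
          size standing in for the entropy/data Lagrange multiplier."""
          f = np.full(A.shape[1], g.sum() / A.sum())   # flat, maximum-entropy start
          for _ in range(n_iter):
              f = f * np.exp(lam * (A.T @ (g - A @ f)))   # push residuals to zero
          return f

      # Toy usage: recover two peaks seen through a Gaussian smearing matrix.
      n = 32
      i = np.arange(n)
      A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 2.0) ** 2)
      f_true = np.zeros(n); f_true[10], f_true[20] = 1.0, 0.5
      f_hat = mem_reconstruct(A, A @ f_true)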

  11. Reconstruction of ancestral RNA sequences under multiple structural constraints

    Directory of Open Access Journals (Sweden)

    Olivier Tremblay-Savard

    2016-11-01

    Full Text Available Abstract Background Secondary structures form the scaffold of multiple sequence alignment of non-coding RNA (ncRNA families. An accurate reconstruction of ancestral ncRNAs must use this structural signal. However, the inference of ancestors of a single ncRNA family with a single consensus structure may bias the results towards sequences with high affinity to this structure, which are far from the true ancestors. Methods In this paper, we introduce achARNement, a maximum parsimony approach that, given two alignments of homologous ncRNA families with consensus secondary structures and a phylogenetic tree, simultaneously calculates ancestral RNA sequences for these two families. Results We test our methodology on simulated data sets, and show that achARNement outperforms classical maximum parsimony approaches in terms of accuracy, but also reduces by several orders of magnitude the number of candidate sequences. To conclude this study, we apply our algorithms on the Glm clan and the FinP-traJ clan from the Rfam database. Conclusions Our results show that our methods reconstruct small sets of high-quality candidate ancestors with better agreement to the two target structures than with classical approaches. Our program is freely available at: http://csb.cs.mcgill.ca/acharnement .

  12. Choosing the best ancestral character state reconstruction method.

    Science.gov (United States)

    Royer-Carenzi, Manuela; Pontarotti, Pierre; Didier, Gilles

    2013-03-01

    Despite its intrinsic difficulty, ancestral character state reconstruction is an essential tool for testing evolutionary hypotheses. Two major classes of approaches to this question can be distinguished: parsimony-based and likelihood-based approaches. We focus here on the second class of methods, more specifically on approaches based on continuous-time Markov modeling of character evolution. Among them, we consider the most-likely-ancestor reconstruction, the posterior-probability reconstruction, the likelihood-ratio method, and the Bayesian approach. We discuss and compare the above-mentioned methods over several phylogenetic trees, adding the maximum-parsimony method's performance to the comparison. Under the assumption that the character evolves according to a continuous-time Markov process, we compute and compare the expectations of success of each method for a broad range of model parameter values. Moreover, we show how knowledge of the evolution model parameters allows one to compute upper bounds on reconstruction performance, which are provided as references. The results of all these reconstruction methods are quite close to one another, and the expectations of success are not far from their theoretical upper bounds. But the performance ranking heavily depends on the topology of the studied tree, on the ancestral node that is to be inferred and on the parameter values. Consequently, we propose a protocol providing, for each parameter value, the best method in terms of expectation of success, with regard to the phylogenetic tree and the ancestral node to infer. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. The Impact of Reconstruction Methods, Phylogenetic Uncertainty and Branch Lengths on Inference of Chromosome Number Evolution in American Daisies (Melampodium, Asteraceae).

    Science.gov (United States)

    McCann, Jamie; Schneeweiss, Gerald M; Stuessy, Tod F; Villaseñor, Jose L; Weiss-Schneeweiss, Hanna

    2016-01-01

    Chromosome number change (polyploidy and dysploidy) plays an important role in plant diversification and speciation. Investigating chromosome number evolution commonly entails ancestral state reconstruction performed within a phylogenetic framework, which is, however, prone to uncertainty, whose effects on evolutionary inferences are insufficiently understood. Using the chromosomally diverse plant genus Melampodium (Asteraceae) as model group, we assess the impact of reconstruction method (maximum parsimony, maximum likelihood, Bayesian methods), branch length model (phylograms versus chronograms) and phylogenetic uncertainty (topological and branch length uncertainty) on the inference of chromosome number evolution. We also address the suitability of the maximum clade credibility (MCC) tree as single representative topology for chromosome number reconstruction. Each of the listed factors causes considerable incongruence among chromosome number reconstructions. Discrepancies between inferences on the MCC tree from those made by integrating over a set of trees are moderate for ancestral chromosome numbers, but severe for the difference of chromosome gains and losses, a measure of the directionality of dysploidy. Therefore, reliance on single trees, such as the MCC tree, is strongly discouraged and model averaging, taking both phylogenetic and model uncertainty into account, is recommended. For studying chromosome number evolution, dedicated models implemented in the program ChromEvol and ordered maximum parsimony may be most appropriate. Chromosome number evolution in Melampodium follows a pattern of bidirectional dysploidy (starting from x = 11 to x = 9 and x = 14, respectively) with no prevailing direction.

  14. The Impact of Reconstruction Methods, Phylogenetic Uncertainty and Branch Lengths on Inference of Chromosome Number Evolution in American Daisies (Melampodium, Asteraceae).

    Directory of Open Access Journals (Sweden)

    Jamie McCann

    Full Text Available Chromosome number change (polyploidy and dysploidy) plays an important role in plant diversification and speciation. Investigating chromosome number evolution commonly entails ancestral state reconstruction performed within a phylogenetic framework, which is, however, prone to uncertainty, whose effects on evolutionary inferences are insufficiently understood. Using the chromosomally diverse plant genus Melampodium (Asteraceae) as model group, we assess the impact of reconstruction method (maximum parsimony, maximum likelihood, Bayesian methods), branch length model (phylograms versus chronograms) and phylogenetic uncertainty (topological and branch length uncertainty) on the inference of chromosome number evolution. We also address the suitability of the maximum clade credibility (MCC) tree as single representative topology for chromosome number reconstruction. Each of the listed factors causes considerable incongruence among chromosome number reconstructions. Discrepancies between inferences on the MCC tree from those made by integrating over a set of trees are moderate for ancestral chromosome numbers, but severe for the difference of chromosome gains and losses, a measure of the directionality of dysploidy. Therefore, reliance on single trees, such as the MCC tree, is strongly discouraged and model averaging, taking both phylogenetic and model uncertainty into account, is recommended. For studying chromosome number evolution, dedicated models implemented in the program ChromEvol and ordered maximum parsimony may be most appropriate. Chromosome number evolution in Melampodium follows a pattern of bidirectional dysploidy (starting from x = 11 to x = 9 and x = 14, respectively) with no prevailing direction.

  15. Seeking parsimony in hydrology and water resources technology

    Science.gov (United States)

    Koutsoyiannis, D.

    2009-04-01

    systems to single numbers (a probability or an expected value), and statistics provides the empirical basis of summarizing data, making inference from them, and supporting decision making in water resource management. Unfortunately, the current state of the art in probability, statistics and their union, often called stochastics, is not fully satisfactory for the needs of modelling of hydrological and water resource systems. A first problem is that stochastic modelling has traditionally relied on classical statistics, which is based on the independent "coin-tossing" prototype, rather than on the study of real-world systems whose behaviour is very different from the classical prototype. A second problem is that the stochastic models (particularly the multivariate ones) are often not parsimonious themselves. Therefore, substantial advancement of stochastics is necessary in a new paradigm of parsimonious hydrological modelling. These ideas are illustrated using several examples, namely: (a) hydrological modelling of a karst system in Bosnia and Herzegovina using three different approaches ranging from parsimonious to detailed "physically-based"; (b) parsimonious modelling of a peculiar modified catchment in Greece; (c) a stochastic approach that can replace parameter-excessive ARMA-type models with a generalized algorithm that produces any shape of autocorrelation function (consistent with the accuracy provided by the data) using a couple of parameters; (d) a multivariate stochastic approach which replaces a huge number of parameters estimated from data with coefficients estimated by the principle of maximum entropy; and (e) a parsimonious approach for decision making in multi-reservoir systems using a handful of parameters instead of thousands of decision variables.

  16. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history

    OpenAIRE

    Cherry, Joshua L.

    2017-01-01

    Background Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Results Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data....

  17. Reconstruction of ancestral RNA sequences under multiple structural constraints

    OpenAIRE

    Tremblay-Savard, Olivier; Reinharz, Vladimir; Waldispühl, Jérôme

    2016-01-01

    Background Secondary structures form the scaffold of multiple sequence alignment of non-coding RNA (ncRNA) families. An accurate reconstruction of ancestral ncRNAs must use this structural signal. However, the inference of ancestors of a single ncRNA family with a single consensus structure may bias the results towards sequences with high affinity to this structure, which are far from the true ancestors. Methods In this paper, we introduce achARNement, a maximum parsimony approach that, given...

  18. Peyronie's Reconstruction for Maximum Length and Girth Gain: Geometrical Principles

    Directory of Open Access Journals (Sweden)

    Paulo H. Egydio

    2008-01-01

    Full Text Available Peyronie's disease has been associated with penile shortening and some degree of erectile dysfunction. Surgical reconstruction should aim at a functional penis, that is, a straightened penis with rigidity sufficient for sexual intercourse. The procedure should be discussed preoperatively in terms of length and girth reconstruction in order to improve patient satisfaction. Tunical reconstruction for maximum restoration of penile length and girth should be based on dissecting the neurovascular bundle to the maximum possible length and on applying geometrical principles to define the precise site and size of the tunical incision and grafting procedure. As penile rectification and rigidity are required to achieve complete functional restoration of the penis, and as 20 to 54% of patients experience associated erectile dysfunction, penile straightening alone may not be enough to provide complete functional restoration. Therefore, phosphodiesterase inhibitors, self-injection, or a penile prosthesis may need to be added in some cases.

  19. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
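
    The projected-gradient idea is simple to sketch: ascend the log-likelihood sum_i f_i log tr(E_i rho), then project back onto the set of density matrices (positive semidefinite, unit trace) by projecting the eigenvalues onto the probability simplex. The following is a plain, unaccelerated sketch with made-up qubit measurements, not the authors' accelerated algorithm:

      import numpy as np

      def project_density(H):
          """Nearest density matrix (PSD, unit trace) in Frobenius norm: project
          the eigenvalues of the Hermitian matrix H onto the probability simplex."""
          w, V = np.linalg.eigh(H)
          u = np.sort(w)[::-1]
          k = np.nonzero(u + (1 - np.cumsum(u)) / np.arange(1, len(u) + 1) > 0)[0][-1]
          tau = (1 - np.cumsum(u)[k]) / (k + 1)
          return (V * np.maximum(w + tau, 0)) @ V.conj().T

      def mle_tomography(ops, freqs, n_iter=300, step=0.5):
          """Plain projected-gradient ascent on sum_i f_i * log tr(E_i rho)."""
          d = ops[0].shape[0]
          rho = np.eye(d) / d
          for _ in range(n_iter):
              probs = [np.trace(E @ rho).real for E in ops]
              grad = sum(f / p * E for f, p, E in zip(freqs, probs, ops))
              rho = project_density(rho + step * grad)
          return rho

      # Toy qubit run: projectors onto |0>, |1>, |+>, |-> with made-up frequencies.
      P0 = np.array([[1, 0], [0, 0]], complex); P1 = np.eye(2) - P0
      Pp = 0.5 * np.ones((2, 2), complex);      Pm = np.eye(2) - Pp
      rho_hat = mle_tomography([P0, P1, Pp, Pm], [0.8, 0.2, 0.6, 0.4])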

  20. A comparison of ancestral state reconstruction methods for quantitative characters.

    Science.gov (United States)

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-07

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. Copyright © 2016 Elsevier Ltd. All rights reserved.
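
    Of the methods compared, squared-change parsimony has a particularly transparent form: the ancestral values minimize the sum of squared changes over the edges, so each internal node sits at the mean of its neighbours and the whole problem is a small linear (graph-Laplacian) system. A minimal sketch on a hypothetical four-leaf tree with made-up trait values:

      import numpy as np

      # Squared-change parsimony: ancestral values minimize the sum over edges
      # of (x_parent - x_child)^2; stationarity gives one linear row per ancestor.
      edges = [(4, 0), (4, 1), (5, 2), (5, 3), (6, 4), (6, 5)]   # hypothetical tree
      leaf_vals = {0: 1.0, 1: 2.0, 2: 7.0, 3: 9.0}               # observed at leaves
      internal = [4, 5, 6]                                       # unknown ancestors
      idx = {n: i for i, n in enumerate(internal)}

      L = np.zeros((3, 3)); b = np.zeros(3)
      for u, v in edges:
          for a, c in ((u, v), (v, u)):
              if a in idx:                     # node a must equal the mean of its
                  L[idx[a], idx[a]] += 1.0     # neighbours: build Laplacian row
                  if c in idx:
                      L[idx[a], idx[c]] -= 1.0
                  else:
                      b[idx[a]] += leaf_vals[c]
      print(dict(zip(internal, np.linalg.solve(L, b).round(3))))
      # {4: 2.583, 5: 6.917, 6: 4.75}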

  1. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  2. Parsimonious Language Models for Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo

    We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such,

  3. LensEnt2: Maximum-entropy weak lens reconstruction

    Science.gov (United States)

    Marshall, P. J.; Hobson, M. P.; Gull, S. F.; Bridle, S. L.

    2013-08-01

    LensEnt2 is a maximum entropy reconstructor of weak lensing mass maps. The method takes each galaxy shape as an independent estimator of the reduced shear field and incorporates an intrinsic smoothness, determined by Bayesian methods, into the reconstruction. The uncertainties from both the intrinsic distribution of galaxy shapes and galaxy shape estimation are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures is calculated with corresponding uncertainties. The input is a galaxy ellipticity catalog with each measured galaxy shape treated as a noisy tracer of the reduced shear field, which is inferred on a fine pixel grid assuming positivity, and smoothness on scales of w arcsec, where w is an input parameter. The ICF width w can be chosen by computing the evidence for it.

  4. [Reconstruction of the phylogenetic position of larch (Larix sukaczewii Dylis) by sequencing data for the trnK intron of chloroplast DNA].

    Science.gov (United States)

    Bashalkhanov, S I; Konstantinov, Iu M; Verbitskiĭ, D S; Kobzev, V F

    2003-10-01

    To reconstruct the systematic relationships of the larch Larix sukaczewii, we used the chloroplast trnK intron sequences of L. decidua, L. sukaczewii, L. sibirica, L. czekanovskii, and L. gmelinii. Analysis of phylogenetic trees constructed using the maximum parsimony and maximum likelihood methods showed a clear divergence of the trnK intron sequences between L. sukaczewii and L. sibirica. This divergence reaches the interspecific level, which supports a previously published hypothesis on the taxonomic isolation of L. sukaczewii.

  5. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: there are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: for some simple, four-taxa (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero, and are not adjacent, so this resolved tree is in fact a simple path. While for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  6. Bayesian, maximum parsimony and UPGMA models for inferring the phylogenies of antelopes using mitochondrial markers.

    Science.gov (United States)

    Khan, Haseeb A; Arif, Ibrahim A; Bahkali, Ali H; Al Farhan, Ahmad H; Al Homaidan, Ali A

    2008-10-06

    This investigation aimed to compare the inference of antelope phylogenies resulting from the 16S rRNA, cytochrome-b (cyt-b) and d-loop segments of mitochondrial DNA using three different computational models: Bayesian (BA), maximum parsimony (MP) and the unweighted pair group method with arithmetic mean (UPGMA). The respective nucleotide sequences of three Oryx species (Oryx leucoryx, Oryx dammah and Oryx gazella) and an out-group (Addax nasomaculatus) were aligned and subjected to the BA, MP and UPGMA models for comparing the topologies of the respective phylogenetic trees. The 16S rRNA region possessed the highest frequency of conserved sequences (97.65%) followed by cyt-b (94.22%) and d-loop (87.29%). Across the four taxa, there were few transitions (2.35%) and no transversions in 16S rRNA, as compared to cyt-b (5.61% transitions and 0.17% transversions) and d-loop (11.57% transitions and 1.14% transversions). All three mitochondrial segments clearly differentiated the genus Addax from Oryx under the BA or UPGMA models. The topologies of all the gamma-corrected Bayesian trees were identical irrespective of the marker type. The UPGMA trees resulting from 16S rRNA and d-loop sequences were also identical to the Bayesian trees (Oryx dammah grouped with Oryx leucoryx), except that the UPGMA tree based on cyt-b showed a slightly different phylogeny (Oryx dammah grouped with Oryx gazella) with low bootstrap support. However, the MP model failed to differentiate the genus Addax from Oryx. These findings demonstrate the efficiency and robustness of the BA and UPGMA methods for phylogenetic analysis of antelopes using mitochondrial markers.
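
    UPGMA itself is a few lines of code: repeatedly merge the two closest clusters and update distances by size-weighted averaging. In the sketch below the distance values are placeholders, not values computed from the antelope sequences:

      # Minimal UPGMA: merge the closest clusters, average linkage weighted by size.
      def upgma(names, D):
          """names: list of taxa; D: dict with D[i, j] = distance for i < j."""
          clusters = {i: (name, 1) for i, name in enumerate(names)}  # label, size
          nxt = len(names)
          while len(clusters) > 1:
              i, j = min((k for k in D if k[0] in clusters and k[1] in clusters),
                         key=D.get)
              (la, na), (lb, nb) = clusters.pop(i), clusters.pop(j)
              for k in list(clusters):
                  a, b = (min(i, k), max(i, k)), (min(j, k), max(j, k))
                  D[(k, nxt)] = (na * D[a] + nb * D[b]) / (na + nb)
              clusters[nxt] = (f"({la},{lb})", na + nb)
              nxt += 1
          return next(iter(clusters.values()))[0]

      # Hypothetical distances: leucoryx and dammah closest, Addax the out-group.
      names = ["leucoryx", "dammah", "gazella", "Addax"]
      D = {(0, 1): 2.0, (0, 2): 4.0, (1, 2): 4.0,
           (0, 3): 8.0, (1, 3): 8.0, (2, 3): 8.0}
      print(upgma(names, D))   # (Addax,(gazella,(leucoryx,dammah)))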

  7. Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography

    International Nuclear Information System (INIS)

    Brendel, Bernhard; Teuffenbach, Maximilian von; Noël, Peter B.; Pfeiffer, Franz; Koehler, Thomas

    2016-01-01

    Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need of phase retrieval, and to examine its properties. Furthermore, this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with continuously rotating gantry (sliding window acquisition), overcoming the severe smearing in noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct attenuation, phase, and scatter images from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron, and the results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. It was further illustrated that the iterative reconstruction of a sliding window acquisition results in images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with fewer aliasing artifacts and fewer streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.

  8. Maximum Gene-Support Tree

    Directory of Open Access Journals (Sweden)

    Yunfeng Shan

    2008-01-01

    Full Text Available Genomes and genes diversify during evolution; however, it is unclear to what extent genes still retain the relationship among species. Model species for molecular phylogenetic studies include yeasts and viruses whose genomes have been sequenced, as well as plants for which fossil-supported true phylogenetic trees are available. In this study, we generated single-gene trees of seven yeast species as well as single-gene trees of nine baculovirus species using all the orthologous genes among the species compared. Homologous genes among seven known plants were used for validation of the finding. Four algorithms—maximum parsimony (MP), minimum evolution (ME), maximum likelihood (ML), and neighbor-joining (NJ)—were used. Trees were reconstructed before and after weighting the DNA and protein sequence lengths among genes. Rarely can a single gene generate the “true tree” under all four algorithms. However, the most frequent gene tree, termed the “maximum gene-support tree” (MGS tree, or WMGS tree for the weighted one), in yeasts, baculoviruses, or plants was consistently found to be the “true tree” among the species. The results provide insights into the overall degree of divergence of orthologous genes of the genomes analyzed and suggest the following: (1) the true tree relationship among the species studied is still maintained by the largest group of orthologous genes; (2) there are usually more orthologous genes with higher similarities between genetically closer species than between genetically more distant ones; and (3) the maximum gene-support tree reflects the phylogenetic relationship among the species compared.

  9. Comparison of tomography reconstruction by maximum entropy and filtered backprojection

    International Nuclear Information System (INIS)

    Abdala, F.J.P.; Simpson, D.M.; Roberty, N.C.

    1992-01-01

    Tomographic reconstruction with few projections is studied, comparing the maximum entropy method with filtered backprojection. Simulations with and without noise, and also with an object of high density inside the skull, are shown. (C.G.C.)

  10. Parsimonious Wavelet Kernel Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Wang Qin

    2015-11-01

    Full Text Available In this study, a parsimonious scheme for wavelet kernel extreme learning machine (named PWKELM) was introduced by combining wavelet theory and a parsimonious algorithm with kernel extreme learning machine (KELM). Wavelet analysis uses bases that are localized in time and frequency to represent various signals effectively, so the wavelet kernel extreme learning machine (WELM) maximizes the capability to capture the essential features in “frequency-rich” signals. The proposed parsimonious algorithm selects significant wavelet kernel functions iteratively by means of the Householder matrix, thus producing a sparse solution that eases the computational burden and improves numerical stability. The experimental results achieved on a synthetic dataset and a gas furnace instance demonstrate that the proposed PWKELM is efficient and feasible in terms of improving generalization accuracy and real-time performance.
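
    A minimal version of the underlying machinery can be sketched by pairing a commonly used translation-invariant Morlet-style wavelet kernel with the standard KELM closed-form solution; the kernel form, parameter values, and toy data here are assumptions for illustration, not taken from the paper:

      import numpy as np

      def wavelet_kernel(X, Z, a=1.0):
          """Morlet-style wavelet kernel (a common WELM choice):
          K(x, z) = prod_d cos(1.75*(x_d - z_d)/a) * exp(-(x_d - z_d)^2/(2*a^2))."""
          diff = X[:, None, :] - Z[None, :, :]
          return np.prod(np.cos(1.75 * diff / a) * np.exp(-diff**2 / (2 * a**2)),
                         axis=2)

      def kelm_fit_predict(X, y, X_new, C=100.0, a=1.0):
          """Kernel ELM closed form: beta = (I/C + K)^-1 y, f(x) = K(x, X) beta."""
          K = wavelet_kernel(X, X, a)
          beta = np.linalg.solve(np.eye(len(X)) / C + K, y)
          return wavelet_kernel(X_new, X, a) @ beta

      # Toy regression on a noisy 1D sine; the split and noise level are arbitrary.
      rng = np.random.default_rng(0)
      X = rng.uniform(-3, 3, (80, 1))
      y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
      Xt = np.linspace(-3, 3, 50)[:, None]
      pred = kelm_fit_predict(X, y, Xt)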

  11. B-Spline potential function for maximum a-posteriori image reconstruction in fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Shilpa Dilipkumar

    2015-03-01

    Full Text Available An iterative image reconstruction technique employing a B-spline potential function in a Bayesian framework is proposed for fluorescence microscopy images. B-splines are piecewise polynomials with smooth transitions and compact support, and are the shortest polynomial splines. Incorporation of the B-spline potential function in the maximum-a-posteriori reconstruction technique resulted in improved contrast, enhanced resolution and substantial background reduction. The proposed technique is validated on simulated data as well as on images acquired from fluorescence microscopes (widefield, confocal laser scanning fluorescence and super-resolution 4Pi microscopy). A comparative study of the proposed technique with the state-of-the-art maximum likelihood (ML) and maximum-a-posteriori (MAP, with quadratic potential function) approaches shows its superiority over the others. The B-spline MAP technique can find applications in several imaging modalities of fluorescence microscopy like selective plane illumination microscopy, localization microscopy and STED.

  12. Bias in phylogenetic reconstruction of vertebrate rhodopsin sequences.

    Science.gov (United States)

    Chang, B S; Campbell, D L

    2000-08-01

    Two spurious nodes were found in phylogenetic analyses of vertebrate rhodopsin sequences in comparison with well-established vertebrate relationships. These spurious reconstructions were well supported in bootstrap analyses and occurred independently of the method of phylogenetic analysis used (parsimony, distance, or likelihood). Use of this data set of vertebrate rhodopsin sequences allowed us to exploit established vertebrate relationships, as well as the considerable amount known about the molecular evolution of this gene, in order to identify important factors contributing to the spurious reconstructions. Simulation studies using parametric bootstrapping indicate that it is unlikely that the spurious nodes in the parsimony analyses are due to long branches or other topological effects. Rather, they appear to be due to base compositional bias at third positions, codon bias, and convergent evolution at nucleotide positions encoding the hydrophobic residues isoleucine, leucine, and valine. LogDet distance methods, as well as maximum-likelihood methods which allow for nonstationary changes in base composition, reduce but do not entirely eliminate support for the spurious resolutions. Inclusion of five additional rhodopsin sequences in the phylogenetic analyses largely corrected one of the spurious reconstructions while leaving the other unaffected. The additional sequences not only were more proximal to the corrected node, but were also found to have intermediate levels of base composition and codon bias as compared with neighboring sequences on the tree. This study shows that the spurious reconstructions can be corrected either by excluding third positions, as well as those encoding the amino acids Ile, Val, and Leu (which may not be ideal, as these sites can contain useful phylogenetic signal for other parts of the tree), or by the addition of sequences that reduce problems associated with convergent evolution.
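
    The LogDet (paralinear) distance mentioned above is designed to remain additive under exactly this kind of compositional drift. One common form, sketched here with a hypothetical 4x4 joint divergence matrix (the matrix values are made up, not from the rhodopsin data):

      import numpy as np

      def logdet_distance(F):
          """LogDet/paralinear distance from a 4x4 joint divergence matrix F,
          whose entry (i, j) is the proportion of sites with base i in sequence
          x and base j in sequence y (in the spirit of Lake 1994 and Lockhart
          et al. 1994)."""
          fx = F.sum(axis=1)            # marginal base composition of x
          fy = F.sum(axis=0)            # marginal base composition of y
          return -0.25 * (np.log(np.linalg.det(F))
                          - 0.5 * (np.log(fx).sum() + np.log(fy).sum()))

      # Hypothetical divergence matrix: mostly identical sites, slight drift.
      F = np.array([[0.24, 0.01, 0.01, 0.01],
                    [0.01, 0.24, 0.01, 0.01],
                    [0.01, 0.01, 0.20, 0.01],
                    [0.01, 0.01, 0.01, 0.20]])
      print(logdet_distance(F))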

  13. Last Glacial Maximum Salinity Reconstruction

    Science.gov (United States)

    Homola, K.; Spivack, A. J.

    2016-12-01

    It has been previously demonstrated that salinity can be reconstructed from sediment porewater. The goal of our study is to reconstruct high-precision salinity during the Last Glacial Maximum (LGM). Salinity is usually determined at high precision via conductivity, which requires a larger volume of water than can be extracted from a sediment core, or via chloride titration, which yields lower than ideal precision. It has been demonstrated for water column samples that high-precision density measurements can be used to determine salinity at the precision of a conductivity measurement using the equation of state of seawater. However, water column seawater has a relatively constant composition, in contrast to porewater, where variations from standard seawater composition occur. These deviations, which affect the equation of state, must be corrected for through precise measurements of each ion's concentration and knowledge of apparent partial molar density in seawater. We have developed a density-based method for determining porewater salinity that requires only 5 mL of sample, achieving density precisions of 10-6 g/mL. We have applied this method to porewater samples extracted from long cores collected along a N-S transect across the western North Atlantic (R/V Knorr cruise KN223). Density was determined to a precision of 2.3x10-6 g/mL, which translates to a salinity uncertainty of 0.002 g/kg if the effect of differences in composition is well constrained. Concentrations of anions (Cl- and SO42-) and cations (Na+, Mg2+, Ca2+, and K+) were measured. To correct salinities at the precision required to unravel LGM Meridional Overturning Circulation, our ion precisions must be better than 0.1% for SO42-/Cl- and Mg2+/Na+, and 0.4% for Ca2+/Na+ and K+/Na+. Alkalinity, pH and Dissolved Inorganic Carbon of the porewater were determined to precisions better than 4% when ratioed to Cl-, and used to calculate HCO3- and CO32-. Apparent partial molar densities in seawater were
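
    As a toy illustration of the inversion step only, a linearized equation of state can be solved directly for salinity; the coefficients below are rough textbook values, not the study's full equation of state or its composition corrections:

      # Toy inversion of a *linearized* equation of state for salinity:
      # rho = RHO0 * (1 - ALPHA*(T - T0) + BETA*(S - S0)); illustrative values.
      RHO0, T0, S0 = 1025.0, 10.0, 35.0      # kg/m3, degC, g/kg (reference state)
      ALPHA, BETA = 1.7e-4, 7.6e-4           # thermal expansion, haline contraction

      def salinity_from_density(rho, T):
          """Invert the linear EOS for S given measured density and temperature."""
          return S0 + ((rho / RHO0 - 1.0) + ALPHA * (T - T0)) / BETA

      # A density precision of 2.3e-6 g/mL (2.3e-3 kg/m3) maps through BETA to
      # a salinity precision of roughly 2.3e-3 / (RHO0 * BETA) ~ 0.003 g/kg.
      print(salinity_from_density(1027.0, 4.0))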

  14. Preliminary attempt on maximum likelihood tomosynthesis reconstruction of DEI data

    International Nuclear Information System (INIS)

    Wang Zhentian; Huang Zhifeng; Zhang Li; Kang Kejun; Chen Zhiqiang; Zhu Peiping

    2009-01-01

    Tomosynthesis is a three-dimensional reconstruction method that can remove the effect of superimposition using limited-angle projections. It is especially promising in mammography, where radiation dose is a concern. In this paper, we propose a maximum likelihood tomosynthesis reconstruction algorithm (ML-TS) for the apparent absorption data of diffraction enhanced imaging (DEI). The motivation of this contribution is to develop a tomosynthesis algorithm for low-dose or noisy circumstances and bring DEI closer to clinical application. The statistical models of DEI data are analyzed theoretically, and the proposed algorithm is validated with experimental data acquired at the Beijing Synchrotron Radiation Facility (BSRF). The results of ML-TS show better contrast than those of the well-known 'shift-and-add' algorithm and the FBP algorithm. (authors)

  15. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.
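
    In practice the parsimony trade-off is usually navigated with an information criterion that penalizes parameter count. A hypothetical sketch using statsmodels, with a simulated series and made-up candidate orders:

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Compare a parsimonious mixed ARMA(1,1) against a pure AR(5) on a toy
      # series; AIC penalizes the extra parameters of the long AR expansion.
      rng = np.random.default_rng(1)
      e = rng.standard_normal(500)
      y = np.zeros(500)
      for t in range(1, 500):                 # simulate an ARMA(1,1) process
          y[t] = 0.7 * y[t - 1] + e[t] + 0.4 * e[t - 1]

      for order in [(1, 0, 1), (5, 0, 0)]:
          fit = ARIMA(y, order=order).fit()
          print(order, round(fit.aic, 1))     # ARMA(1,1) typically wins on AIC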

  16. A maximum-likelihood reconstruction algorithm for tomographic gamma-ray nondestructive assay

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Estep, R.J.; Cole, R.A.; Sheppard, G.A.

    1994-01-01

    A new tomographic reconstruction algorithm for nondestructive assay with high resolution gamma-ray spectroscopy (HRGS) is presented. The reconstruction problem is formulated using a maximum-likelihood approach in which the statistical structure of both the gross and continuum measurements used to determine the full-energy response in HRGS is precisely modeled. An accelerated expectation-maximization algorithm is used to determine the optimal solution. The algorithm is applied to safeguards and environmental assays of large samples (for example, 55-gal. drums) in which high continuum levels caused by Compton scattering are routinely encountered. Details of the implementation of the algorithm and a comparative study of the algorithm's performance are presented

  17. Parsimonious Surface Wave Interferometry

    KAUST Repository

    Li, Jing

    2017-10-24

    To decrease the recording time of a 2D seismic survey from a few days to one hour or less, we present a parsimonious surface-wave interferometry method. Interferometry allows for the creation of a large number of virtual shot gathers from just two reciprocal shot gathers by crosscoherence of trace pairs, where the virtual surface waves can be inverted for the S-wave velocity model by wave-equation dispersion inversion (WD). Synthetic and field data tests suggest that parsimonious wave-equation dispersion inversion (PWD) gives S-velocity tomograms that are comparable to those obtained from a full survey with a shot at each receiver. The limitation of PWD is that the virtual data lose some information so that the resolution of the S-velocity tomogram can be modestly lower than that of the S-velocity tomogram inverted from a conventional survey.

  18. Parsimonious Surface Wave Interferometry

    KAUST Repository

    Li, Jing; Hanafy, Sherif; Schuster, Gerard T.

    2017-01-01

    To decrease the recording time of a 2D seismic survey from a few days to one hour or less, we present a parsimonious surface-wave interferometry method. Interferometry allows for the creation of a large number of virtual shot gathers from just two reciprocal shot gathers by crosscoherence of trace pairs, where the virtual surface waves can be inverted for the S-wave velocity model by wave-equation dispersion inversion (WD). Synthetic and field data tests suggest that parsimonious wave-equation dispersion inversion (PWD) gives S-velocity tomograms that are comparable to those obtained from a full survey with a shot at each receiver. The limitation of PWD is that the virtual data lose some information so that the resolution of the S-velocity tomogram can be modestly lower than that of the S-velocity tomogram inverted from a conventional survey.
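
    The crosscoherence that builds each virtual trace is a normalized spectral product; a toy numpy sketch with synthetic spike traces (not the authors' field data):

      import numpy as np

      def crosscoherence(a, b, eps=1e-3):
          """Virtual trace between receivers a and b: cross-spectrum normalized
          by the magnitude spectra (crosscoherence), back in the time domain."""
          A, B = np.fft.rfft(a), np.fft.rfft(b)
          C = (A * np.conj(B)) / (np.abs(A) * np.abs(B) + eps)
          return np.fft.irfft(C, n=len(a))

      # Toy check: the same arrival recorded 15 samples later at receiver a
      # yields a virtual trace peaking at lag 15, as if a source sat at b.
      n = 256
      w = np.zeros(n); w[40] = 1.0
      a = np.roll(w, 15)
      print(np.argmax(crosscoherence(a, w)))   # 15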

  19. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    International Nuclear Information System (INIS)

    Bilsky, A V; Lozhkin, V A; Markovich, D M; Tokarev, M P

    2013-01-01

    This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents a theoretical comparison of computational performance for MENT, SMART and MART, followed by validation on synthetic particle images. Both the theoretical assessment and the validation on synthetic images demonstrate a significant reduction in computational time. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART. (paper)

  20. A Multi-Criterion Evolutionary Approach Applied to Phylogenetic Reconstruction

    OpenAIRE

    Cancino, W.; Delbem, A.C.B.

    2010-01-01

    In this paper, we propose a multi-objective evolutionary algorithm (MOEA) approach, called PhyloMOEA, which solves the phylogenetic inference problem using both the maximum parsimony and maximum likelihood criteria. PhyloMOEA's development was motivated by several studies in the literature (Huelsenbeck, 1995; Jin & Nei, 1990; Kuhner & Felsenstein, 1994; Tateno et al., 1994), which point out that various phylogenetic inference methods lead to inconsistent solutions. Techniques using parsimony and likelihood criteria yield different tr...

  1. A large version of the small parsimony problem

    DEFF Research Database (Denmark)

    Fredslund, Jakob; Hein, Jotun; Scharling, Tejs

    2003-01-01

    Given a multiple alignment over $k$ sequences, an evolutionary tree relating the sequences, and a subadditive gap penalty function (e.g. an affine function), we reconstruct the internal nodes of the tree optimally: we find the optimal explanation in terms of indels of the observed gaps and find the most parsimonious assignment of nucleotides. The gaps of the alignment are represented in a so-called gap graph, and through theoretically sound preprocessing the graph is reduced to pave the way for a running time which in all but the most pathological examples is far better than the exponential worst case time. E.g. for a tree with nine leaves and a random alignment of length 10,000 with 60% gaps, the running time is on average around 45 seconds. For a real alignment of length 9868 of nine HIV-1 sequences, the running time is less than one second.

  2. Maximum-entropy networks pattern detection, network reconstruction and graph combinatorics

    CERN Document Server

    Squartini, Tiziano

    2017-01-01

    This book is an introduction to maximum-entropy models of random graphs with given topological properties and their applications. Its original contribution is the reformulation of many seemingly different problems in the study of both real networks and graph theory within the unified framework of maximum entropy. Particular emphasis is put on the detection of structural patterns in real networks, on the reconstruction of the properties of networks from partial information, and on the enumeration and sampling of graphs with given properties.  After a first introductory chapter explaining the motivation, focus, aim and message of the book, chapter 2 introduces the formal construction of maximum-entropy ensembles of graphs with local topological constraints. Chapter 3 focuses on the problem of pattern detection in real networks and provides a powerful way to disentangle nontrivial higher-order structural features from those that can be traced back to simpler local constraints. Chapter 4 focuses on the problem o...
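
    For the canonical example of local constraints, the degree sequence, the maximum-entropy ensemble has independent edge probabilities p_ij = x_i*x_j/(1 + x_i*x_j), with the fitnesses x_i fixed by matching expected to target degrees. A small damped fixed-point sketch with made-up target degrees, intended only to illustrate the construction:

      import numpy as np

      def fit_maxent_degrees(k, n_iter=2000):
          """Fitnesses x_i such that p_ij = x_i*x_j/(1 + x_i*x_j) reproduces the
          expected degrees k_i, via the fixed point
          x_i = k_i / sum_{j != i} x_j / (1 + x_i*x_j)."""
          x = np.asarray(k, float) / np.sqrt(k.sum())     # Chung-Lu-style start
          for _ in range(n_iter):
              denom = x[None, :] / (1.0 + np.outer(x, x))
              np.fill_diagonal(denom, 0.0)                # no self-loops
              x = 0.5 * x + 0.5 * k / denom.sum(axis=1)   # damped update
          p = np.outer(x, x) / (1.0 + np.outer(x, x))
          np.fill_diagonal(p, 0.0)
          return x, p

      k = np.array([3.0, 2.0, 2.0, 1.0, 1.0, 1.0])        # made-up target degrees
      x, p = fit_maxent_degrees(k)
      print(p.sum(axis=1).round(3))                       # ~= k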

  3. Penalised Maximum Likelihood Simultaneous Longitudinal PET Image Reconstruction with Difference-Image Priors.

    Science.gov (United States)

    Ellis, Sam; Reader, Andrew J

    2018-04-26

    Many clinical contexts require the acquisition of multiple positron emission tomography (PET) scans of a single subject, for example to observe and quantify changes in functional behaviour in tumours after treatment in oncology. Typically, the datasets from each of these scans are reconstructed individually, without exploiting the similarities between them. We have recently shown that sharing information between longitudinal PET datasets by penalising voxel-wise differences during image reconstruction can improve reconstructed images by reducing background noise and increasing the contrast-to-noise ratio of high activity lesions. Here we present two additional novel longitudinal difference-image priors and evaluate their performance using 2D simulation studies and a 3D real dataset case study. We have previously proposed a simultaneous difference-image-based penalised maximum likelihood (PML) longitudinal image reconstruction method that encourages sparse difference images (DS-PML), and in this work we propose two further novel prior terms. The priors are designed to encourage longitudinal images with corresponding differences which have i) low entropy (DE-PML), and ii) high sparsity in their spatial gradients (DTV-PML). These two new priors and the originally proposed longitudinal prior were applied to 2D simulated treatment response [18F]fluorodeoxyglucose (FDG) brain tumour datasets and compared to standard maximum likelihood expectation-maximisation (MLEM) reconstructions. These 2D simulation studies explored the effects of penalty strengths, tumour behaviour, and inter-scan coupling on reconstructed images. Finally, a real two-scan longitudinal data series acquired from a head and neck cancer patient was reconstructed with the proposed methods and the results compared to standard reconstruction methods. Using any of the three priors with an appropriate penalty strength produced images with noise levels equivalent to those seen when using standard

  4. Maximum entropy reconstruction of poloidal magnetic field and radial electric field profiles in tokamaks

    Science.gov (United States)

    Chen, Yihang; Xiao, Chijie; Yang, Xiaoyi; Wang, Tianbo; Xu, Tianchao; Yu, Yi; Xu, Min; Wang, Long; Lin, Chen; Wang, Xiaogang

    2017-10-01

    The laser-driven ion beam trace probe (LITP) is a new diagnostic method for measuring the poloidal magnetic field (Bp) and the radial electric field (Er) in tokamaks. LITP injects a laser-driven ion beam into the tokamak, and Bp and Er profiles can be reconstructed using tomography methods. A reconstruction code has been developed to validate the LITP theory, and both 2D reconstruction of Bp and simultaneous reconstruction of Bp and Er have been attained. To reconstruct from experimental data with noise, Maximum Entropy and Gaussian-Bayesian tomography methods were applied and improved according to the characteristics of the LITP problem. With these improved methods, a reconstruction error level below 15% has been attained with a data noise level of 10%. These methods will be further tested and applied in future LITP experiments. Supported by the ITER-CHINA program 2015GB120001, CHINA MOST under 2012YQ030142 and the National Natural Science Foundation of China under 11575014 and 11375053.

  5. Dynamical reconstruction of the global ocean state during the Last Glacial Maximum

    Science.gov (United States)

    Kurahashi-Nakamura, Takasumi; Paul, André; Losch, Martin

    2017-04-01

    The global ocean state for the modern age and for the Last Glacial Maximum (LGM) was dynamically reconstructed with a sophisticated data assimilation technique. A substantial amount of data, including global seawater temperature, salinity (only for the modern estimate), and the isotopic composition of oxygen and carbon (only in the Atlantic for the LGM), was integrated into an ocean general circulation model with the help of the adjoint method, whereby the model was optimized to reconstruct plausible continuous fields of tracers, overturning circulation and water mass distribution. The adjoint-based LGM state estimation of this study represents the state of the art in terms of the length of forward model runs, the number of observations assimilated, and the model domain. Compared to the modern state, the reconstructed continuous sea-surface temperature field for the LGM shows a global-mean cooling of 2.2 K, and the reconstructed LGM ocean has a more vigorous Atlantic meridional overturning circulation, a shallower North Atlantic Deep Water (NADW) equivalent, stronger stratification, and more saline deep water.
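
    Schematically, adjoint-method state estimation of this kind minimises a least-squares misfit functional whose gradient is supplied by the adjoint model. A generic 4D-Var-style sketch (the actual control variables, observation operators and weights are those specified in the study) is

    ```latex
    J(\mathbf u) \;=\; \tfrac{1}{2}\sum_{i}
      \big( H_i(\mathbf x(\mathbf u)) - \mathbf y_i \big)^{\top}
      \mathbf R_i^{-1}
      \big( H_i(\mathbf x(\mathbf u)) - \mathbf y_i \big)
      \;+\; \tfrac{1}{2}\,\mathbf u^{\top}\mathbf B^{-1}\mathbf u ,
    ```

    where u are the control variables (e.g. initial conditions and surface forcing), x(u) the model trajectory, H_i the observation operators mapping the state to the assimilated data y_i, and R_i and B the observation- and background-error covariances.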

  6. Preconditioned alternating projection algorithms for maximum a posteriori ECT reconstruction

    International Nuclear Information System (INIS)

    Krol, Andrzej; Li, Si; Shen, Lixin; Xu, Yuesheng

    2012-01-01

    We propose a preconditioned alternating projection algorithm (PAPA) for solving the maximum a posteriori (MAP) emission computed tomography (ECT) reconstruction problem. Specifically, we formulate the reconstruction problem as a constrained convex optimization problem with total variation (TV) regularization. We then characterize the solution of the constrained convex optimization problem and show that it satisfies a system of fixed-point equations defined in terms of two proximity operators arising from the convex functions that define the TV-norm and the constraint involved in the problem. This characterization of the solution via proximity operators that define two projection operators naturally leads to an alternating projection algorithm for finding it. For efficient numerical computation, we introduce into the alternating projection algorithm a preconditioning matrix (the EM-preconditioner) for the dense system matrix involved in the optimization problem. We prove convergence of the PAPA theoretically. In numerical experiments, the performance of our algorithm, with an appropriately selected preconditioning matrix, is compared with that of the conventional MAP expectation-maximization (MAP-EM) algorithm with a TV regularizer (EM-TV) and that of the recently developed nested EM-TV algorithm for ECT reconstruction. Based on the numerical experiments performed in this work, we observe that the alternating projection algorithm with the EM-preconditioner significantly outperforms EM-TV in all aspects, including convergence speed, noise in the reconstructed images, and image quality. It also outperforms nested EM-TV in convergence speed while providing comparable image quality. (paper)
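
    For orientation, a MAP-ECT objective of the kind characterized above combines the Poisson log-likelihood with a TV penalty; a generic sketch (with A the system matrix, y the measured counts and λ the regularization weight; the exact constraint set is as defined in the paper) is

    ```latex
    \min_{x \ge 0} \;\; \sum_{i} \Big[ (Ax)_i - y_i \ln (Ax)_i \Big]
      \;+\; \lambda \, \| x \|_{\mathrm{TV}} .
    ```

    The fixed-point characterization then expresses the minimizer through the proximity operators of the TV term and of the indicator function of the constraint set.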

  7. Parsimonious Refraction Interferometry and Tomography

    KAUST Repository

    Hanafy, Sherif; Schuster, Gerard T.

    2017-01-01

    We present parsimonious refraction interferometry and tomography where a densely populated refraction data set can be obtained from two reciprocal and several infill shot gathers. The assumptions are that the refraction arrivals are head waves

  8. Mandibular kinematics and maximum voluntary bite force following segmental resection of the mandible without or with reconstruction.

    Science.gov (United States)

    Linsen, Sabine S; Oikonomou, Annina; Martini, Markus; Teschke, Marcus

    2018-05-01

    The purpose was to analyze mandibular kinematics and maximum voluntary bite force in patients following segmental resection of the mandible without and with reconstruction (autologous bone, alloplastic total temporomandibular joint replacement (TMJ TJR)). Subjects operated on between April 2002 and August 2014 were enrolled in the study. Condylar (CRoM) and incisal (InRoM) range of motion and deflection during opening, condylar retrusion, incisal lateral excursion, mandibular rotation angle during opening, and maximum voluntary bite force were determined on the non-affected side and compared between groups. The influence of co-factors (defect size, soft tissue deficit, neck dissection, radiotherapy, occlusal contact zones (OCZ), and time) was determined. Twelve non-reconstructed and 26 reconstructed patients (13 autologous, 13 TMJ TJR) were included in the study. InRoM opening and bite force were significantly higher (P ≤ .024), and both condylar and incisal deflection during opening significantly lower (P ≤ .027), in reconstructed patients compared with non-reconstructed patients. Differences between the autologous and the TMJ TJR group were not statistically significant. The co-factors defect size, soft tissue deficit, and neck dissection had the greatest impact on kinematics, and the number of OCZs the greatest impact on bite force. Reconstructed patients (both autologous and TMJ TJR) have better overall function than non-reconstructed patients. Reconstruction after segmental mandibular resection has positive effects on mandibular function. TMJ TJR seems to be a suitable technique for the reconstruction of mandibular defects that include the TMJ complex.

  9. On simulated annealing phase transitions in phylogeny reconstruction.

    Science.gov (United States)

    Strobl, Maximilian A R; Barker, Daniel

    2016-08-01

    Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic of simulated annealing applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, whether for phylogeny reconstruction or for other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. Just as melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss applications in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
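
    A minimal, generic sketch of how a specific-heat profile can be monitored during simulated annealing, run here on a toy double-well landscape rather than on tree space; the estimator C(T) = Var(E)/T² follows the usual statistical-mechanics analogy and is an assumption about, not a copy of, the authors' setup.

    ```python
    import math
    import random

    def specific_heat_profile(energy, propose, x0, temps, steps_per_temp=2000):
        """Anneal over a temperature schedule and estimate the specific heat
        C(T) = Var(E) / T^2 at each temperature from the sampled energies."""
        x, profile = x0, []
        for T in temps:
            energies = []
            for _ in range(steps_per_temp):
                cand = propose(x)
                d_e = energy(cand) - energy(x)
                if d_e <= 0 or random.random() < math.exp(-d_e / T):
                    x = cand  # Metropolis acceptance rule
                energies.append(energy(x))
            mean = sum(energies) / len(energies)
            var = sum((e - mean) ** 2 for e in energies) / len(energies)
            profile.append((T, var / T**2))  # peaks mark "phase transitions"
        return profile

    # Toy landscape standing in for tree space: a double well.
    energy = lambda x: (x**2 - 1.0) ** 2
    propose = lambda x: x + random.gauss(0.0, 0.3)
    temps = [2.0 * 0.9**k for k in range(40)]  # geometric cooling schedule
    for T, C in specific_heat_profile(energy, propose, 1.5, temps)[::8]:
        print(f"T={T:.3f}  C(T)={C:.3f}")
    ```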

  10. Phylogenetic reconstruction of the family Arcypteridae (Orthoptera: Acridoidea) based on the mitochondrial cytochrome b gene.

    Science.gov (United States)

    Huo, Guangming; Jiang, Guofang; Sun, Zhengli; Liu, Dianfeng; Zhang, Yalin; Lu, Lin

    2007-04-01

    Sequences from the mitochondrial cytochrome b gene (Cyt b) were determined for 25 species from the superfamily Acridoidea, and homologous sequences of 19 species of grasshoppers were downloaded from the GenBank data library. The purpose was to develop a molecular phylogeny of the Arcypteridae, and to interpret the phylogenetic position of the family within the superfamily Acridoidea. Phylogeny was reconstructed under maximum parsimony (MP) and Bayesian criteria using Yunnanites coriacea and Tagasta marginella as outgroups. The alignment length of the fragments was 384 bp after excluding ambiguous sites, including 167 parsimony-informative sites. In the fragments, the percentages of A + T and G + C were 70.7% and 29.3%, respectively. The monophyly of Arcypteridae is not supported by the phylogenetic trees. Within the Arcypteridae, neither Arcypterinae nor Ceracrinae is supported as a monophyletic group. The current genus Chorthippus is not monophyletic and appears instead to be polyphyletic. The present results differ significantly from the morphology-based classification scheme of Arcypteridae.

  11. High-Performance Phylogeny Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Tiffani L. Williams

    2004-11-10

    Under the Alfred P. Sloan Fellowship in Computational Biology, I have been afforded the opportunity to study phylogenetics--one of the most important and exciting disciplines in computational biology. A phylogeny depicts an evolutionary relationship among a set of organisms (or taxa). Typically, a phylogeny is represented by a binary tree, where modern organisms are placed at the leaves and ancestral organisms occupy internal nodes, with the edges of the tree denoting evolutionary relationships. The task of phylogenetics is to infer this tree from observations upon present-day organisms. Reconstructing phylogenies is a major component of modern research programs in many areas of biology and medicine, but it is enormously expensive. The most commonly used techniques attempt to solve NP-hard problems such as maximum likelihood and maximum parsimony, typically by bounded searches through an exponentially-sized tree-space. For example, there are over 13 billion possible trees for 13 organisms. Phylogenetic heuristics that quickly and accurately analyze large amounts of data will revolutionize the biological field. This final report highlights my activities in phylogenetics during the two-year postdoctoral period at the University of New Mexico under Prof. Bernard Moret. Specifically, it describes my scientific, community and professional activities as an Alfred P. Sloan Postdoctoral Fellow in Computational Biology.
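
    The "over 13 billion" figure can be checked directly: the number of distinct unrooted binary trees on n labelled leaves is the double factorial (2n − 5)!!. A minimal verification in Python:

    ```python
    def num_unrooted_binary_trees(n):
        """Number of unrooted binary trees on n labelled leaves: (2n - 5)!!"""
        count = 1
        for k in range(3, 2 * n - 4, 2):  # product 3 * 5 * ... * (2n - 5)
            count *= k
        return count

    print(num_unrooted_binary_trees(13))  # 13749310575, i.e. over 13 billion
    ```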

  12. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    Science.gov (United States)

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
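
    The indirect route mentioned above rests on the Patlak graphical model, C_T(t)/C_p(t) = K_i · (∫₀ᵗ C_p(τ) dτ)/C_p(t) + V, fitted by linear regression on each voxel's TAC once the plot has become linear. A minimal single-TAC sketch (the function name, the trapezoidal integration and the t* cutoff are illustrative assumptions):

    ```python
    import numpy as np

    def patlak_fit(t, c_tissue, c_plasma, t_star=20.0):
        """Ordinary least-squares Patlak fit for one time-activity curve.
        Fits C_T/C_p = Ki * (integral of C_p)/C_p + V on frames with
        t >= t_star. Returns (Ki, V)."""
        # Cumulative trapezoidal integral of the plasma input function.
        integral = np.concatenate(
            ([0.0], np.cumsum(np.diff(t) * 0.5 * (c_plasma[1:] + c_plasma[:-1]))))
        x = integral / c_plasma  # "Patlak time"
        y = c_tissue / c_plasma
        mask = t >= t_star       # keep only the linear portion of the plot
        ki, v = np.polyfit(x[mask], y[mask], 1)
        return ki, v

    # Toy TAC with known Ki = 0.05 and V = 0.3.
    t = np.linspace(0.5, 60.0, 120)              # minutes
    c_p = 10.0 * np.exp(-0.1 * t) + 1.0          # synthetic plasma input
    cum = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (c_p[1:] + c_p[:-1]))))
    c_t = 0.05 * cum + 0.3 * c_p
    print(patlak_fit(t, c_t, c_p))               # approximately (0.05, 0.3)
    ```

    The direct methods studied in the paper fold this linear model into the reconstruction itself rather than applying it after the fact.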

  13. A new global reconstruction of temperature changes at the Last Glacial Maximum

    Directory of Open Access Journals (Sweden)

    J. D. Annan

    2013-02-01

    Full Text Available Some recent compilations of proxy data both on land and ocean (MARGO Project Members, 2009; Bartlein et al., 2011; Shakun et al., 2012) have provided a new opportunity for an improved assessment of the overall climatic state of the Last Glacial Maximum. In this paper, we combine these proxy data with the ensemble of structurally diverse state-of-the-art climate models which participated in the PMIP2 project (Braconnot et al., 2007) to generate a spatially complete reconstruction of surface air (and sea surface) temperatures. We test a variety of approaches, and show that multiple linear regression performs well for this application. Our reconstruction is significantly different from, and more accurate than, previous approaches, and we obtain an estimated global mean cooling of 4.0 ± 0.8 °C (95% CI).

  14. Reconstruction of the electron momentum density distribution by the maximum entropy method

    International Nuclear Information System (INIS)

    Dobrzynski, L.

    1996-01-01

    The application of the Maximum Entropy Algorithm to the analysis of Compton profiles is discussed. It is shown that the reconstruction of the electron momentum density may be reliably carried out. However, a number of technical problems have to be overcome in order to produce trustworthy results. In particular, one needs experimental Compton profiles measured for many directions, as well as efficient computational resources. The use of various cross-checks is recommended. (orig.)

  15. Rapid maximum likelihood ancestral state reconstruction of continuous characters: A rerooting-free algorithm.

    Science.gov (United States)

    Goolsby, Eric W

    2017-04-01

    Ancestral state reconstruction is a method used to study the evolutionary trajectories of quantitative characters on phylogenies. Although efficient methods for univariate ancestral state reconstruction under a Brownian motion model have been described for at least 25 years, to date no generalization has been described to allow more complex evolutionary models, such as multivariate trait evolution, non-Brownian models, missing data, and within-species variation. Furthermore, even for simple univariate Brownian motion models, most phylogenetic comparative R packages compute ancestral states via inefficient tree rerooting and full tree traversals at each tree node, making ancestral state reconstruction extremely time-consuming for large phylogenies. Here, a computationally efficient method for fast maximum likelihood ancestral state reconstruction of continuous characters is described. The algorithm has linear complexity relative to the number of species and outperforms the fastest existing R implementations by several orders of magnitude. The described algorithm is capable of performing ancestral state reconstruction on a 1,000,000-species phylogeny in fewer than 2 s using a standard laptop, whereas the next fastest R implementation would take several days to complete. The method is generalizable to more complex evolutionary models, such as phylogenetic regression, within-species variation, non-Brownian evolutionary models, and multivariate trait evolution. Because this method enables fast repeated computations on phylogenies of virtually any size, implementation of the described algorithm can drastically alleviate the computational burden of many otherwise prohibitively time-consuming tasks requiring reconstruction of ancestral states, such as phylogenetic imputation of missing data, bootstrapping procedures, Expectation-Maximization algorithms, and Bayesian estimation. The described ancestral state reconstruction algorithm is implemented in the Rphylopars R package.
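
    To convey the flavour of such a single-pass algorithm, here is a minimal post-order pruning that returns the maximum likelihood root-state estimate under univariate Brownian motion. This is a textbook Felsenstein-style contraction, not the Rphylopars implementation, and the nested-tuple tree encoding is purely illustrative.

    ```python
    def bm_root_estimate(tree):
        """ML root state under Brownian motion via one post-order pass.
        A node is (payload, branch_length); a leaf's payload is its trait
        value, an internal node's payload is a (left, right) child pair.
        Returns (estimate, variance), variance in units of the BM rate."""
        payload, bl = tree
        if not isinstance(payload, tuple):  # leaf: observed trait value
            return payload, bl
        left, right = payload
        x1, v1 = bm_root_estimate(left)
        x2, v2 = bm_root_estimate(right)
        # Precision-weighted combination of the daughter estimates, then
        # add this node's own branch length to the accumulated variance.
        xhat = (x1 / v1 + x2 / v2) / (1.0 / v1 + 1.0 / v2)
        pooled = 1.0 / (1.0 / v1 + 1.0 / v2)
        return xhat, pooled + bl

    # Tree ((A:1, B:1):0.5, C:2) with trait values A=1.0, B=3.0, C=10.0.
    tree = (
        (
            (((1.0, 1.0), (3.0, 1.0)), 0.5),  # clade (A:1, B:1):0.5
            (10.0, 2.0),                      # leaf C on a branch of length 2
        ),
        0.0,                                  # zero-length root stem
    )
    print(bm_root_estimate(tree))  # approximately (4.667, 0.667)
    ```

    Each node is visited once, which is what gives the linear scaling in the number of species.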

  16. Reconstructible phylogenetic networks: do not distinguish the indistinguishable.

    Science.gov (United States)

    Pardi, Fabio; Scornavacca, Celine

    2015-04-01

    Phylogenetic networks represent the evolution of organisms that have undergone reticulate events, such as recombination, hybrid speciation or lateral gene transfer. An important way to interpret a phylogenetic network is in terms of the trees it displays, which represent all the possible histories of the characters carried by the organisms in the network. Interestingly, however, different networks may display exactly the same set of trees, an observation that poses a problem for network reconstruction: from the perspective of many inference methods such networks are "indistinguishable". This is true for all methods that evaluate a phylogenetic network solely on the basis of how well the displayed trees fit the available data, including all methods based on input data consisting of clades, triples, quartets, or trees with any number of taxa, and also sequence-based approaches such as popular formalisations of maximum parsimony and maximum likelihood for networks. This identifiability problem is partially solved by accounting for branch lengths, although this merely reduces the frequency of the problem. Here we propose that network inference methods should only attempt to reconstruct what they can uniquely identify. To this end, we introduce a novel definition of what constitutes a uniquely reconstructible network. For any given set of indistinguishable networks, we define a canonical network that, under mild assumptions, is unique and thus representative of the entire set. Given data that underwent reticulate evolution, only the canonical form of the underlying phylogenetic network can be uniquely reconstructed. While on the methodological side this will imply a drastic reduction of the solution space in network inference, for the study of reticulate evolution this is a fundamental limitation that will require an important change of perspective when interpreting phylogenetic networks.

  17. Reconstructible phylogenetic networks: do not distinguish the indistinguishable.

    Directory of Open Access Journals (Sweden)

    Fabio Pardi

    2015-04-01

    Full Text Available Phylogenetic networks represent the evolution of organisms that have undergone reticulate events, such as recombination, hybrid speciation or lateral gene transfer. An important way to interpret a phylogenetic network is in terms of the trees it displays, which represent all the possible histories of the characters carried by the organisms in the network. Interestingly, however, different networks may display exactly the same set of trees, an observation that poses a problem for network reconstruction: from the perspective of many inference methods such networks are "indistinguishable". This is true for all methods that evaluate a phylogenetic network solely on the basis of how well the displayed trees fit the available data, including all methods based on input data consisting of clades, triples, quartets, or trees with any number of taxa, and also sequence-based approaches such as popular formalisations of maximum parsimony and maximum likelihood for networks. This identifiability problem is partially solved by accounting for branch lengths, although this merely reduces the frequency of the problem. Here we propose that network inference methods should only attempt to reconstruct what they can uniquely identify. To this end, we introduce a novel definition of what constitutes a uniquely reconstructible network. For any given set of indistinguishable networks, we define a canonical network that, under mild assumptions, is unique and thus representative of the entire set. Given data that underwent reticulate evolution, only the canonical form of the underlying phylogenetic network can be uniquely reconstructed. While on the methodological side this will imply a drastic reduction of the solution space in network inference, for the study of reticulate evolution this is a fundamental limitation that will require an important change of perspective when interpreting phylogenetic networks.

  18. Wobbling and LSF-based maximum likelihood expectation maximization reconstruction for wobbling PET

    International Nuclear Information System (INIS)

    Kim, Hang-Keun; Son, Young-Don; Kwon, Dae-Hyuk; Joo, Yohan; Cho, Zang-Hee

    2016-01-01

    Positron emission tomography (PET) is a widely used imaging modality; however, PET spatial resolution is not yet satisfactory for precise anatomical localization of molecular activities. Detector size is the most important factor because it determines the intrinsic resolution, which is approximately half the detector size and determines the ultimate PET resolution. Detector size, however, cannot be made too small because both the decreased detection efficiency and the increased septal penetration effect degrade the image quality. A wobbling and line spread function (LSF)-based maximum likelihood expectation maximization (WL-MLEM) algorithm, which combined the MLEM iterative reconstruction algorithm with wobbled sampling and LSF-based deconvolution using the system matrix, was proposed for improving the spatial resolution of PET without reducing the scintillator or detector size. The new algorithm was evaluated using a simulation, and its performance was compared with that of existing algorithms, such as conventional MLEM and LSF-based MLEM. Simulations demonstrated that the WL-MLEM algorithm yielded higher spatial resolution and image quality than the existing algorithms. The WL-MLEM algorithm with wobbling PET yielded substantially improved resolution compared with conventional algorithms with stationary PET. The algorithm can be easily extended to other iterative reconstruction algorithms, such as maximum a posteriori (MAP) and ordered subset expectation maximization (OSEM). The WL-MLEM algorithm with wobbling PET may offer improvements in both sensitivity and resolution, the two most sought-after features in PET design. - Highlights: • This paper proposed the WL-MLEM algorithm for PET and demonstrated its performance. • The WL-MLEM algorithm effectively combined wobbling and LSF-based MLEM. • WL-MLEM provided improvements in spatial resolution and PET image quality. • WL-MLEM can be easily extended to other iterative reconstruction algorithms.

  19. Parsimonious Refraction Interferometry and Tomography

    KAUST Repository

    Hanafy, Sherif

    2017-02-04

    We present parsimonious refraction interferometry and tomography where a densely populated refraction data set can be obtained from two reciprocal and several infill shot gathers. The assumptions are that the refraction arrivals are head waves, and a pair of reciprocal shot gathers and several infill shot gathers are recorded over the line of interest. Refraction traveltimes from these shot gathers are picked and spawned into O(N²) virtual refraction traveltimes generated by N virtual sources, where N is the number of geophones in the 2D survey. The virtual traveltimes can be inverted to give the velocity tomogram. This enormous increase in the number of traveltime picks and associated rays, compared to the many fewer traveltimes from the reciprocal and infill shot gathers, allows for increased model resolution and a better condition number with the system of normal equations. A significant benefit is that the parsimonious survey and the associated traveltime picking are far less time consuming than those for a standard refraction survey with a dense distribution of sources.
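
    One way to read the O(N²) expansion: under the head-wave assumption, path segments shared with the two reciprocal shots cancel, so a virtual head-wave traveltime between any geophone pair (i, j) can be assembled from the picks as t(s2→i) + t(s1→j) − t(s1→s2). The sketch below applies this crossing-ray identity, stated here from general refraction-interferometry practice with illustrative variable names rather than the paper's notation.

    ```python
    import numpy as np

    def virtual_refraction_traveltimes(t_shot1, t_shot2, t_reciprocal):
        """Build the N x N matrix of virtual head-wave traveltimes between
        all geophone pairs (i, j) from a pair of reciprocal shot gathers.

        t_shot1[j]   -- picked refraction traveltime from shot 1 to geophone j
        t_shot2[i]   -- picked refraction traveltime from shot 2 to geophone i
        t_reciprocal -- traveltime between the two reciprocal shot positions
        """
        t1 = np.asarray(t_shot1)
        t2 = np.asarray(t_shot2)
        # Crossing-ray combination: shared path segments cancel, leaving the
        # virtual geophone-to-geophone head-wave traveltime.
        return t2[:, None] + t1[None, :] - t_reciprocal

    # Toy example: N = 5 geophones, picks in seconds.
    t1 = np.array([0.10, 0.14, 0.18, 0.22, 0.26])
    t2 = t1[::-1]  # mirrored picks for the reciprocal geometry
    print(virtual_refraction_traveltimes(t1, t2, t_reciprocal=0.30))
    ```

    The N² entries produced from O(N) picks are what feed the traveltime tomography with its improved ray coverage.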

  20. A parallel implementation of a maximum entropy reconstruction algorithm for PET images in a visual language

    International Nuclear Information System (INIS)

    Bastiens, K.; Lemahieu, I.

    1994-01-01

    The application of a maximum entropy reconstruction algorithm to PET images requires substantial computing resources. A parallel implementation can substantially reduce the execution time. However, programming a parallel application is still a non-trivial task requiring specialized expertise. In this paper, a programming environment based on a visual programming language is used for a parallel implementation of the reconstruction algorithm. This programming environment allows less experienced programmers to exploit the performance of multiprocessor systems. (authors)

  1. A parallel implementation of a maximum entropy reconstruction algorithm for PET images in a visual language

    Energy Technology Data Exchange (ETDEWEB)

    Bastiens, K; Lemahieu, I [University of Ghent - ELIS Department, St. Pietersnieuwstraat 41, B-9000 Ghent (Belgium)

    1994-12-31

    The application of a maximum entropy reconstruction algorithm to PET images requires substantial computing resources. A parallel implementation can substantially reduce the execution time. However, programming a parallel application is still a non-trivial task requiring specialized expertise. In this paper, a programming environment based on a visual programming language is used for a parallel implementation of the reconstruction algorithm. This programming environment allows less experienced programmers to exploit the performance of multiprocessor systems. (authors). 8 refs, 3 figs, 1 tab.

  2. Maximum a posteriori reconstruction of the Patlak parametric image from sinograms in dynamic PET

    International Nuclear Information System (INIS)

    Wang Guobao; Fu Lin; Qi Jinyi

    2008-01-01

    Parametric imaging using the Patlak graphical method has been widely used to analyze dynamic PET data. Conventionally, a Patlak parametric image is generated by first reconstructing a sequence of dynamic images and then performing Patlak graphical analysis on the time-activity curves pixel-by-pixel. However, because it is rather difficult to model the noise distribution in reconstructed images, the spatially variant noise correlation is simply ignored in the Patlak analysis, which leads to sub-optimal results. In this paper we present a Bayesian method for reconstructing Patlak parametric images directly from raw sinogram data by incorporating the Patlak plot model into the image reconstruction procedure. A preconditioned conjugate gradient algorithm is used to find the maximum a posteriori solution. The proposed direct method is statistically more efficient than the conventional indirect approach because the Poisson noise distribution in PET data can be accurately modeled in the direct reconstruction. The computation cost of the direct method is similar to the reconstruction time of two dynamic frames. Therefore, when more than two dynamic frames are used in the Patlak analysis, the direct method is faster than the conventional indirect approach. We conduct computer simulations to validate the proposed direct method. Comparisons with the conventional indirect approach show that the proposed method results in a more accurate estimate of the parametric image. The proposed method has been applied to dynamic fully 3D PET data from a microPET scanner.

  3. Maximum entropy based reconstruction of soft X ray emissivity profiles in W7-AS

    International Nuclear Information System (INIS)

    Ertl, K.; Linden, W. von der; Dose, V.; Weller, A.

    1996-01-01

    The reconstruction of 2-D emissivity profiles from soft X ray tomography measurements constitutes a highly underdetermined and ill-posed inversion problem, because of the restricted viewing access, the number of chords and the increased noise level in most plasma devices. An unbiased and consistent probabilistic approach within the framework of Bayesian inference is provided by the maximum entropy method, which is independent of model assumptions, but allows any prior knowledge available to be incorporated. The formalism is applied to the reconstruction of emissivity profiles in an NBI heated plasma discharge to determine the dependence of the Shafranov shift on β, the reduction of which was a particular objective in designing the advanced W7-AS stellarator. (author). 40 refs, 7 figs

  4. Including RNA secondary structures improves accuracy and robustness in reconstruction of phylogenetic trees.

    Science.gov (United States)

    Keller, Alexander; Förster, Frank; Müller, Tobias; Dandekar, Thomas; Schultz, Jörg; Wolf, Matthias

    2010-01-15

    In several studies, secondary structures of ribosomal genes have been used to improve the quality of phylogenetic reconstructions. An extensive evaluation of the benefits of secondary structure, however, is lacking. This is the first study to counter this deficiency. We inspected the accuracy and robustness of phylogenetics with individual secondary structures by simulation experiments for artificial tree topologies with up to 18 taxa and for divergence levels in the range of typical phylogenetic studies. We chose the internal transcribed spacer 2 of the ribosomal cistron as an exemplary marker region. Simulation integrated the coevolution process of sequences with secondary structures. Additionally, the phylogenetic power of marker size duplication was investigated and compared with sequence and sequence-structure reconstruction methods. The results clearly show that the accuracy and robustness of Neighbor Joining trees are largely improved by structural information in contrast to sequence-only data, whereas a doubled marker size only accounts for robustness. Individual secondary structures of ribosomal RNA sequences provide a valuable gain of information content that is useful for phylogenetics. Thus, the usage of the ITS2 sequence together with its secondary structure for taxonomic inferences is recommended. Other reconstruction methods, such as maximum likelihood, Bayesian inference, or maximum parsimony, may profit equally from secondary structure inclusion. This article was reviewed by Shamil Sunyaev, Andrea Tanzer (nominated by Frank Eisenhaber) and Eugene V. Koonin. For the full reviews, please go to the Reviewers' comments section.

  5. Reconstruction of North American drainage basins and river discharge since the Last Glacial Maximum

    Directory of Open Access Journals (Sweden)

    A. D. Wickert

    2016-11-01

    Full Text Available Over the last glacial cycle, ice sheets and the resultant glacial isostatic adjustment (GIA rearranged river systems. As these riverine threads that tied the ice sheets to the sea were stretched, severed, and restructured, they also shrank and swelled with the pulse of meltwater inputs and time-varying drainage basin areas, and sometimes delivered enough meltwater to the oceans in the right places to influence global climate. Here I present a general method to compute past river flow paths, drainage basin geometries, and river discharges, by combining models of past ice sheets, glacial isostatic adjustment, and climate. The result is a time series of synthetic paleohydrographs and drainage basin maps from the Last Glacial Maximum to present for nine major drainage basins – the Mississippi, Rio Grande, Colorado, Columbia, Mackenzie, Hudson Bay, Saint Lawrence, Hudson, and Susquehanna/Chesapeake Bay. These are based on five published reconstructions of the North American ice sheets. I compare these maps with drainage reconstructions and discharge histories based on a review of observational evidence, including river deposits and terraces, isotopic records, mineral provenance markers, glacial moraine histories, and evidence of ice stream and tunnel valley flow directions. The sharp boundaries of the reconstructed past drainage basins complement the flexurally smoothed GIA signal that is more often used to validate ice-sheet reconstructions, and provide a complementary framework to reduce nonuniqueness in model reconstructions of the North American ice-sheet complex.

  6. Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum

    Science.gov (United States)

    Weitzel, Nils; Hense, Andreas; Ohlwein, Christian

    2017-04-01

    Spatio-temporal reconstructions of past climate are important for the understanding of the long term behavior of the climate system and the sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, have to deal with a complex proxy-climate structure, and a physically reasonable interpolation between the sparse proxy observations is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models that is well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often named transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ~21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were
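
    Schematically, the three stages combine through Bayes' theorem in the usual way; a generic sketch of the factorisation, with C the climate field of interest, y the proxy data and θ the model parameters, is

    ```latex
    p(C, \theta \mid y) \;\propto\;
      \underbrace{p(y \mid C, \theta)}_{\text{data stage}}\;
      \underbrace{p(C \mid \theta)}_{\text{process stage}}\;
      \underbrace{p(\theta)}_{\text{prior stage}} .
    ```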

  7. Type I STS markers are more informative than cytochrome B in phylogenetic reconstruction of the Mustelidae (Mammalia: Carnivora).

    Science.gov (United States)

    Koepfli, Klaus-Peter; Wayne, Robert K

    2003-10-01

    We compared the utility of five nuclear gene segments amplified with type I sequence-tagged site (STS) primers versus the complete mitochondrial cytochrome b (cyt b) gene in resolving phylogenetic relationships within the Mustelidae, a large and ecomorphologically diverse family of mammalian carnivores. Maximum parsimony and likelihood analyses of separate and combined data sets were used to address questions regarding the levels of homoplasy, incongruence, and information content within and among loci. All loci showed limited resolution in the separate analyses because of either a low amount of informative variation (nuclear genes) or high levels of homoplasy (cyt b). Individually or combined, the nuclear gene sequences had less homoplasy, retained more signal, and were more decisive, even though cyt b contained more potentially informative variation than all the nuclear sequences combined. We obtained a well-resolved and supported phylogeny when the nuclear sequences were combined. Maximum likelihood and Bayesian phylogenetic analyses of the total combined data (nuclear and mitochondrial DNA sequences) were able to better accommodate the high levels of homoplasy in the cyt b data than was an equally weighted maximum parsimony analysis. Furthermore, partition Bremer support analyses of the total combined tree showed that the relative support of the nuclear and mitochondrial genes differed according to whether or not the homoplasy in the cyt b gene was downweighted. Although the cyt b gene contributed phylogenetic signal for most major groupings, the nuclear gene sequences were more effective in reconstructing the deeper nodes of the combined tree in the equally weighted parsimony analysis, as judged by the variable-length bootstrap method. The total combined data supported the monophyly of the Lutrinae (otters), whereas the Melinae (badgers) and Mustelinae (weasels, martens) were both paraphyletic. The American badger, Taxidea taxus (Taxidiinae), was the most

  8. Morphological character evolution of Amorphophallus (Araceae) based on a combined phylogenetic analysis of trnL, rbcL, and LEAFY second intron sequences

    NARCIS (Netherlands)

    Sedayu, A.; Eurlings, M.C.M.; Gravendeel, B.; Hetterscheid, W.L.A.

    2010-01-01

    Sequences of three different genes in 69 taxa of Amorphophallus were combined to reconstruct the molecular phylogeny of this species-rich aroid genus. The data set was analyzed by three different methods (Maximum Parsimony, Maximum Likelihood, and Bayesian analysis), producing slightly different trees.

  9. Temperature reconstruction and volcanic eruption signal from tree-ring width and maximum latewood density over the past 304 years in the southeastern Tibetan Plateau.

    Science.gov (United States)

    Li, Mingqi; Huang, Lei; Yin, Zhi-Yong; Shao, Xuemei

    2017-11-01

    This study presents a 304-year mean July-October maximum temperature reconstruction for the southeastern Tibetan Plateau based on both tree-ring width and maximum latewood density data. The reconstruction explained 58% of the variance in July-October maximum temperature during the calibration period (1958-2005). On the decadal scale, we identified two prominent cold periods during AD 1801-1833 and 1961-2003 and two prominent warm periods during AD 1730-1800 and 1928-1960, which are consistent with other reconstructions from the nearby region. Based on the reconstructed temperature series and volcanic eruption chronology, we found that most extreme cold years were in good agreement with major volcanic eruptions, such as 1816 after the Tambora eruption in 1815. Also, clusters of volcanic eruptions probably made the 1810s the coldest decade in the past 300 years. Our results indicated that fingerprints of major volcanic eruptions can be found in the reconstructed temperature records, while the responses of regional climate to these eruption events varied in space and time in the southeastern Tibetan Plateau.

  10. Including RNA secondary structures improves accuracy and robustness in reconstruction of phylogenetic trees

    Directory of Open Access Journals (Sweden)

    Dandekar Thomas

    2010-01-01

    Full Text Available Abstract Background In several studies, secondary structures of ribosomal genes have been used to improve the quality of phylogenetic reconstructions. An extensive evaluation of the benefits of secondary structure, however, is lacking. Results This is the first study to counter this deficiency. We inspected the accuracy and robustness of phylogenetics with individual secondary structures by simulation experiments for artificial tree topologies with up to 18 taxa and for divergence levels in the range of typical phylogenetic studies. We chose the internal transcribed spacer 2 of the ribosomal cistron as an exemplary marker region. Simulation integrated the coevolution process of sequences with secondary structures. Additionally, the phylogenetic power of marker size duplication was investigated and compared with sequence and sequence-structure reconstruction methods. The results clearly show that the accuracy and robustness of Neighbor Joining trees are largely improved by structural information in contrast to sequence-only data, whereas a doubled marker size only accounts for robustness. Conclusions Individual secondary structures of ribosomal RNA sequences provide a valuable gain of information content that is useful for phylogenetics. Thus, the usage of the ITS2 sequence together with its secondary structure for taxonomic inferences is recommended. Other reconstruction methods, such as maximum likelihood, Bayesian inference, or maximum parsimony, may profit equally from secondary structure inclusion. Reviewers This article was reviewed by Shamil Sunyaev, Andrea Tanzer (nominated by Frank Eisenhaber) and Eugene V. Koonin. For the full reviews, please go to the Reviewers' comments section.

  11. Maximum likelihood inference of small trees in the presence of long branches.

    Science.gov (United States)

    Parks, Sarah L; Goldman, Nick

    2014-09-01

    The statistical basis of maximum likelihood (ML), its robustness, and the fact that it appears to suffer less from biases lead to it being one of the most popular methods for tree reconstruction. Despite its popularity, very few analytical solutions for ML exist, so biases suffered by ML are not well understood. One possible bias is long branch attraction (LBA), a regularly cited term generally used to describe a propensity for long branches to be joined together in estimated trees. Although initially mentioned in connection with inconsistency of parsimony, LBA has been claimed to affect all major phylogenetic reconstruction methods, including ML. Despite the widespread use of this term in the literature, exactly what LBA is and what may be causing it is poorly understood, even for simple evolutionary models and small model trees. Studies looking at LBA have focused on the effect of two long branches on tree reconstruction. However, to understand the effect of two long branches it is also important to understand the effect of just one long branch. If ML struggles to reconstruct one long branch, then this may have an impact on LBA. In this study, we look at the effect of one long branch on three-taxon tree reconstruction. We show that, counterintuitively, long branches are preferentially placed at the tips of the tree. This can be understood through the use of analytical solutions to the ML equation and distance matrix methods. We go on to look at the placement of two long branches on four-taxon trees, showing that there is no attraction between long branches, but that for extreme branch lengths long branches are joined together disproportionally often. These results illustrate that even small model trees are still interesting to help understand how ML phylogenetic reconstruction works, and that LBA is a complicated phenomenon that deserves further study. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  12. Fast Construction of Near Parsimonious Hybridization Networks for Multiple Phylogenetic Trees.

    Science.gov (United States)

    Mirzaei, Sajad; Wu, Yufeng

    2016-01-01

    Hybridization networks represent plausible evolutionary histories of species that are affected by reticulate evolutionary processes. An established computational problem on hybridization networks is constructing the most parsimonious hybridization network such that each of the given phylogenetic trees (called gene trees) is "displayed" in the network. There have been several previous approaches, including an exact method and several heuristics, for this NP-hard problem. However, the exact method is only applicable to a limited range of data, and heuristic methods can be less accurate and also slow sometimes. In this paper, we develop a new algorithm for constructing near parsimonious networks for multiple binary gene trees. This method is more efficient for large numbers of gene trees than previous heuristics. This new method also produces more parsimonious results on many simulated datasets as well as a real biological dataset than a previous method. We also show that our method produces topologically more accurate networks for many datasets.

  13. Parsimonious refraction interferometry

    KAUST Repository

    Hanafy, Sherif

    2016-09-06

    We present parsimonious refraction interferometry where a densely populated refraction data set can be obtained from just two shot gathers. The assumptions are that the first arrivals are comprised of head waves and direct waves, and a pair of reciprocal shot gathers is recorded over the line of interest. The refraction traveltimes from these reciprocal shot gathers can be picked and decomposed into O(N²) refraction traveltimes generated by N virtual sources, where N is the number of geophones in the 2D survey. This enormous increase in the number of virtual traveltime picks and associated rays, compared to the 2N traveltimes from the two reciprocal shot gathers, allows for increased model resolution and better condition numbers in the normal equations. Also, a reciprocal survey is far less time consuming than a standard refraction survey with a dense distribution of sources.

  14. Parsimonious refraction interferometry

    KAUST Repository

    Hanafy, Sherif; Schuster, Gerard T.

    2016-01-01

    We present parsimonious refraction interferometry where a densely populated refraction data set can be obtained from just two shot gathers. The assumptions are that the first arrivals are comprised of head waves and direct waves, and a pair of reciprocal shot gathers is recorded over the line of interest. The refraction traveltimes from these reciprocal shot gathers can be picked and decomposed into O(N²) refraction traveltimes generated by N virtual sources, where N is the number of geophones in the 2D survey. This enormous increase in the number of virtual traveltime picks and associated rays, compared to the 2N traveltimes from the two reciprocal shot gathers, allows for increased model resolution and better condition numbers in the normal equations. Also, a reciprocal survey is far less time consuming than a standard refraction survey with a dense distribution of sources.

  15. A simplified parsimonious higher order multivariate Markov chain model

    Science.gov (United States)

    Wang, Chao; Yang, Chuan-sheng

    2017-09-01

    In this paper, a simplified parsimonious higher-order multivariate Markov chain model (SPHOMMCM) is presented. Moreover, a parameter estimation method for SPHOMMCM is given. Numerical experiments show the effectiveness of SPHOMMCM.

  16. A tridiagonal parsimonious higher order multivariate Markov chain model

    Science.gov (United States)

    Wang, Chao; Yang, Chuan-sheng

    2017-09-01

    In this paper, we present a tridiagonal parsimonious higher-order multivariate Markov chain model (TPHOMMCM). Moreover, an estimation method for the parameters in TPHOMMCM is given. Numerical experiments illustrate the effectiveness of TPHOMMCM.
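
    Both this and the preceding record build on the parsimonious higher-order multivariate Markov chain framework, in which the next distribution of each series is a convex combination of transition matrices applied to lagged distributions of all series. The following sketch shows a generic prediction step in that style; it follows the widely used Ching-type formulation rather than the specific SPHOMMCM or TPHOMMCM parameterisations, and all names are illustrative.

    ```python
    import numpy as np

    def hommc_step(history, P, lam):
        """One prediction step of a higher-order multivariate Markov chain.

        history[k][h] -- probability vector of series k at lag h+1
        P[j][k][h]    -- column-stochastic transition matrix from series k
                         at lag h+1 to series j
        lam[j][k][h]  -- nonnegative weights, summing to 1 over (k, h)
                         for each target series j
        Returns the next-step probability vector of each series.
        """
        s, n = len(history), len(history[0])
        return [
            sum(lam[j][k][h] * (P[j][k][h] @ history[k][h])
                for k in range(s) for h in range(n))
            for j in range(s)
        ]

    # Toy example: s = 2 series, order n = 1, two states each.
    P = [[[np.array([[0.9, 0.2], [0.1, 0.8]])], [np.array([[0.5, 0.5], [0.5, 0.5]])]],
         [[np.array([[0.3, 0.7], [0.7, 0.3]])], [np.array([[0.6, 0.4], [0.4, 0.6]])]]]
    lam = [[[0.7], [0.3]], [[0.4], [0.6]]]
    history = [[np.array([1.0, 0.0])], [np.array([0.5, 0.5])]]
    print(hommc_step(history, P, lam))
    ```

    Parsimonious variants such as those above reduce the number of free parameters by restricting the structure of the transition matrices and weights.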

  17. IRT Item Parameter Recovery with Marginal Maximum Likelihood Estimation Using Loglinear Smoothing Models

    Science.gov (United States)

    Casabianca, Jodi M.; Lewis, Charles

    2015-01-01

    Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…

  18. Electron density profile reconstruction by maximum entropy method with multichannel HCN laser interferometer system on SPAC VII

    International Nuclear Information System (INIS)

    Kubo, S.; Narihara, K.; Tomita, Y.; Hasegawa, M.; Tsuzuki, T.; Mohri, A.

    1988-01-01

    A multichannel HCN laser interferometer system has been developed to investigate the plasma electron confinement properties in SPAC VII device. Maximum entropy method is applied to reconstruct the electron density profile from measured line integrated data. Particle diffusion coefficient in the peripheral region of the REB ring core spherator was obtained from the evolution of the density profile. (author)

  19. A community-based geological reconstruction of Antarctic Ice Sheet deglaciation since the Last Glacial Maximum

    Science.gov (United States)

    Bentley, Michael J.; Ó Cofaigh, Colm; Anderson, John B.; Conway, Howard; Davies, Bethan; Graham, Alastair G. C.; Hillenbrand, Claus-Dieter; Hodgson, Dominic A.; Jamieson, Stewart S. R.; Larter, Robert D.; Mackintosh, Andrew; Smith, James A.; Verleyen, Elie; Ackert, Robert P.; Bart, Philip J.; Berg, Sonja; Brunstein, Daniel; Canals, Miquel; Colhoun, Eric A.; Crosta, Xavier; Dickens, William A.; Domack, Eugene; Dowdeswell, Julian A.; Dunbar, Robert; Ehrmann, Werner; Evans, Jeffrey; Favier, Vincent; Fink, David; Fogwill, Christopher J.; Glasser, Neil F.; Gohl, Karsten; Golledge, Nicholas R.; Goodwin, Ian; Gore, Damian B.; Greenwood, Sarah L.; Hall, Brenda L.; Hall, Kevin; Hedding, David W.; Hein, Andrew S.; Hocking, Emma P.; Jakobsson, Martin; Johnson, Joanne S.; Jomelli, Vincent; Jones, R. Selwyn; Klages, Johann P.; Kristoffersen, Yngve; Kuhn, Gerhard; Leventer, Amy; Licht, Kathy; Lilly, Katherine; Lindow, Julia; Livingstone, Stephen J.; Massé, Guillaume; McGlone, Matt S.; McKay, Robert M.; Melles, Martin; Miura, Hideki; Mulvaney, Robert; Nel, Werner; Nitsche, Frank O.; O'Brien, Philip E.; Post, Alexandra L.; Roberts, Stephen J.; Saunders, Krystyna M.; Selkirk, Patricia M.; Simms, Alexander R.; Spiegel, Cornelia; Stolldorf, Travis D.; Sugden, David E.; van der Putten, Nathalie; van Ommen, Tas; Verfaillie, Deborah; Vyverman, Wim; Wagner, Bernd; White, Duanne A.; Witus, Alexandra E.; Zwartz, Dan

    2014-09-01

    A robust understanding of Antarctic Ice Sheet deglacial history since the Last Glacial Maximum is important in order to constrain ice sheet and glacial-isostatic adjustment models, and to explore the forcing mechanisms responsible for ice sheet retreat. Such understanding can be derived from a broad range of geological and glaciological datasets, and recent decades have seen an upsurge in such data gathering around the continent and Sub-Antarctic islands. Here, we report a new synthesis of those datasets, based on an accompanying series of reviews of the geological data, organised by sector. We present a series of timeslice maps for 20 ka, 15 ka, 10 ka and 5 ka, including grounding line position and ice sheet thickness changes, along with a clear assessment of levels of confidence. The reconstruction shows that the Antarctic Ice Sheet did not reach the continental shelf edge everywhere at its maximum, that initial retreat was asynchronous, and that the spatial pattern of deglaciation was highly variable, particularly on the inner shelf. The deglacial reconstruction is consistent with a moderate overall excess ice volume and with a relatively small Antarctic contribution to meltwater pulse 1a. We discuss key areas of uncertainty both around the continent and by time interval, and we highlight potential priorities for future work. The synthesis is intended to be a resource for the modelling and glacial geological community.

  20. Comparative analysis of maximum renal longitudinal length with positional changes on ultrasound with multiplanar reconstructed MR image in Korea Adults

    International Nuclear Information System (INIS)

    Jang, Yun Hee; Cho, Bum Sang; Kang, Min Ho; Kang, Woo Young; Lee, Jisun; Kim, Yook; Lee, Soo Hyun; Lee, Soo Jung; Lee, Jin Yong

    2016-01-01

    The purpose of this study was to determine a suitable position in which the length measured on ultrasound is close to the true renal length obtained through a multiplanar reconstructed MR image. A total of 33 individuals (males: 15, females: 18) without any underlying renal disease were included in the present study. Renal length was measured as the longest axis at the level of the renal hilum in three positions: supine, lateral decubitus, and prone. With a 3.0 T MR scanner, 3D eTHRIVE images were acquired. Subsequently, the maximum longitudinal length of both kidneys was measured on multiplanar reconstructed MR images. A paired t-test was used to compare the renal length obtained from ultrasonographic measurement with the length obtained through multiplanar reconstructed MR images. Our study demonstrated a significant difference between sonographic renal length in the three positions and renal length through MRI (p < 0.001). However, the longest longitudinal length of the right kidney among the three values measured by ultrasound was statistically similar to the renal length measured on the reconstructed MR image. Among the positions, the lateral decubitus showed the strongest correlation with true renal length (right: 0.887; left: 0.849). We recommend measurement of the maximum renal longitudinal length in all possible positions on ultrasonography. If that is not possible, the best measurement is in the lateral decubitus position, which shows the strongest correlation coefficient with true renal length.

  1. Comparative analysis of maximum renal longitudinal length with positional changes on ultrasound with multiplanar reconstructed MR image in Korea Adults

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Yun Hee; Cho, Bum Sang; Kang, Min Ho; Kang, Woo Young; Lee, Jisun; Kim, Yook; Lee, Soo Hyun; Lee, Soo Jung [Dept. of Radiology, Chungbuk National University Hospital, Cheongju (Korea, Republic of); Lee, Jin Yong [Public Health Medical Service, Seoul National University Boramae Medical Center, Seoul (Korea, Republic of)

    2016-07-15

    The purpose of this study was to determine a suitable position in which the length measured on ultrasound is close to the true renal length obtained through a multiplanar reconstructed MR image. A total of 33 individuals (males: 15, females: 18) without any underlying renal disease were included in the present study. Renal length was measured as the longest axis at the level of the renal hilum in three positions: supine, lateral decubitus, and prone. With a 3.0 T MR scanner, 3D eTHRIVE images were acquired. Subsequently, the maximum longitudinal length of both kidneys was measured on multiplanar reconstructed MR images. A paired t-test was used to compare the renal length obtained from ultrasonographic measurement with the length obtained through multiplanar reconstructed MR images. Our study demonstrated a significant difference between sonographic renal length in the three positions and renal length through MRI (p < 0.001). However, the longest longitudinal length of the right kidney among the three values measured by ultrasound was statistically similar to the renal length measured on the reconstructed MR image. Among the positions, the lateral decubitus showed the strongest correlation with true renal length (right: 0.887; left: 0.849). We recommend measurement of the maximum renal longitudinal length in all possible positions on ultrasonography. If that is not possible, the best measurement is in the lateral decubitus position, which shows the strongest correlation coefficient with true renal length.

  2. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one with the highest prior (intrinsic) probability. Considering all the points of the map as equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, some knowledge about the distribution under investigation exists before the measurements are performed. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases the uniform prior, which considers all pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the maximum entropy formalism through a model m(r), via a new definition of the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing.
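
    The non-uniform-prior entropy referred to here is commonly written in Skilling's form; quoting the general MaxEnt literature (the paper's exact definition may differ in normalisation):

    ```latex
    S[\rho, m] \;=\; \int \Big[\, \rho(\vec r) \,-\, m(\vec r)
      \,-\, \rho(\vec r)\,\ln\frac{\rho(\vec r)}{m(\vec r)} \,\Big]\, d^{3}r ,
    ```

    which vanishes, and is maximal, at ρ = m, so that in the absence of data the reconstruction reproduces the model.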

  3. Principle of Parsimony, Fake Science, and Scales

    Science.gov (United States)

    Yeh, T. C. J.; Wan, L.; Wang, X. S.

    2017-12-01

    Considering the difficulty of predicting the exact motions of water molecules, and the scale of our interests (bulk behaviors of many molecules), Fick's law (the diffusion concept) was created to predict solute diffusion processes in space and time. G.I. Taylor (1921) demonstrated that the random motion of molecules reaches the Fickian regime in less than a second if our sampling scale is large enough to reach the ergodic condition. Fick's law is widely accepted for describing molecular diffusion as such. This fits the definition of the parsimony principle at the scale of our concern. Similarly, the advection-dispersion or convection-dispersion equation (ADE or CDE) has been found quite satisfactory for analyzing concentration breakthroughs of solute transport in uniformly packed soil columns. This is attributed to the fact that the solute is often released over the entire cross-section of the column, which samples many pore-scale heterogeneities and thus meets the ergodicity assumption. Further, the uniformly packed column contains a large number of stationary pore-size heterogeneities, so the solute reaches the Fickian regime after traveling a short distance along the column. Moreover, breakthrough curves are concentrations integrated over the column cross-section (the scale of our interest), and they meet the ergodicity assumption embedded in the ADE and CDE. To the contrary, the scales of heterogeneity in most groundwater pollution problems evolve as contaminants travel. They are much larger than the scales of our observations and our interests, so that the ergodic and Fickian conditions are difficult to meet. Upscaling Fick's law for solute dispersion, and deriving universal rules of dispersion for field- or basin-scale pollution migration, is merely a misuse of the parsimony principle and leads to fake science (i.e., the development of theories for predicting processes that cannot be observed). The appropriate principle of parsimony for these situations dictates mapping of large

  4. Parsimonious wave-equation travel-time inversion for refraction waves

    KAUST Repository

    Fu, Lei; Hanafy, Sherif M.; Schuster, Gerard T.

    2017-01-01

    We present a parsimonious wave-equation travel-time inversion technique for refraction waves. A dense virtual refraction dataset can be generated from just two reciprocal shot gathers for the sources at the endpoints of the survey line, with N geophones evenly deployed along the line.

  5. Direct reconstruction of the source intensity distribution of a clinical linear accelerator using a maximum likelihood expectation maximization algorithm.

    Science.gov (United States)

    Papaconstadopoulos, P; Levesque, I R; Maglieri, R; Seuntjens, J

    2016-02-07

    Direct determination of the source intensity distribution of clinical linear accelerators is still a challenging problem for small-field beam modeling. Current techniques most often involve special equipment and are difficult to implement in the clinic. In this work we present a maximum-likelihood expectation-maximization (MLEM) approach to the source reconstruction problem utilizing small fields and a simple experimental set-up. The MLEM algorithm iteratively ray-traces photons from the source plane to the exit plane and extracts corrections based on photon fluence profile measurements. The photon fluence profiles were determined by dose profile film measurements in air, using a high-density thin foil as build-up material and an appropriate point spread function (PSF). The effect of other beam parameters and scatter sources was minimized by using the smallest field size ([Formula: see text] cm²). The source occlusion effect was reproduced by estimating the position of the collimating jaws during this process. The method was first benchmarked against simulations for a range of typical accelerator source sizes. The sources were reconstructed with an accuracy better than 0.12 mm in full width at half maximum (FWHM) relative to the respective electron sources incident on the target. The estimated jaw positions agreed within 0.2 mm with the expected values. The reconstruction technique was also tested against measurements on a Varian Novalis Tx linear accelerator and compared to a previously commissioned Monte Carlo model. The reconstructed FWHM of the source agreed within 0.03 mm and 0.11 mm with the commissioned electron source in the crossplane and inplane orientations, respectively. The impact of jaw positioning, experimental, and PSF uncertainties on the reconstructed source distribution was evaluated, with the former presenting the dominant effect.
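    The abstract does not give the update rule, but a minimal sketch of a generic MLEM iteration of the kind described — forward-project the current source estimate, compare with the measured fluence profile, and back-project the ratio as a multiplicative correction — might look as follows; the matrix A standing in for the ray-tracing operator and all variable names are illustrative assumptions:

        import numpy as np

        def mlem(A, y, n_iter=50, eps=1e-12):
            """Generic MLEM iteration. A is the (n_detector x n_source)
            system matrix obtained by ray-tracing from the source plane
            to the exit plane; y is the measured photon fluence profile."""
            x = np.ones(A.shape[1])            # flat initial source estimate
            sens = A.T @ np.ones(A.shape[0])   # sensitivity (column sums)
            for _ in range(n_iter):
                proj = A @ x                   # forward projection
                ratio = y / np.maximum(proj, eps)
                x *= (A.T @ ratio) / np.maximum(sens, eps)  # multiplicative update
            return x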

  6. A class representative model for Pure Parsimony Haplotyping under uncertain data.

    Directory of Open Access Journals (Sweden)

    Daniele Catanzaro

    Full Text Available The Pure Parsimony Haplotyping (PPH) problem is an NP-hard combinatorial optimization problem that consists of finding the minimum number of haplotypes necessary to explain a given set of genotypes. PPH has attracted more and more attention in recent years due to its importance in the analysis of fine-scale genetic data. Its applications range from mapping complex disease genes and inferring population histories to drug design, functional genomics, and pharmacogenetics. In this article we investigate, for the first time, a recent version of PPH called the Pure Parsimony Haplotyping problem under Uncertain Data (PPH-UD). This version mainly arises when the input genotypes are not accurate, i.e., when some single nucleotide polymorphisms are missing or affected by errors. We propose an exact approach to the solution of PPH-UD based on an extended version of the class representative model of Catanzaro et al. [1], currently the state-of-the-art integer programming model for PPH. The model is efficient, accurate, compact, polynomial-sized, easy to implement, solvable with any solver for mixed integer programming, and usable in all those cases for which the parsimony criterion is well suited for haplotype estimation.

  7. Parsimonious wave-equation travel-time inversion for refraction waves

    KAUST Repository

    Fu, Lei

    2017-02-14

    We present a parsimonious wave-equation travel-time inversion technique for refraction waves. A dense virtual refraction dataset can be generated from just two reciprocal shot gathers for the sources at the endpoints of the survey line, with N geophones evenly deployed along the line. These two reciprocal shots contain approximately 2N refraction travel times, which can be spawned into O(N²) refraction travel times by an interferometric transformation. Then, these virtual refraction travel times are used with a source wavelet to create N virtual refraction shot gathers, which are the input data for wave-equation travel-time inversion. Numerical results show that the parsimonious wave-equation travel-time tomogram has about the same accuracy as the tomogram computed by standard wave-equation travel-time inversion. The most significant benefit is that a reciprocal survey is far less time-consuming than the standard refraction survey, where a source is excited at each geophone location.
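    As a rough illustration of the interferometric bookkeeping (the paper's exact transformation is not reproduced in the abstract), one common form of refraction interferometry combines the two reciprocal travel-time curves and the reciprocal time between the end sources into a virtual time for every geophone pair, which is how roughly 2N picks spawn O(N²) data; the function name and sign convention below are assumptions:

        import numpy as np

        def virtual_refraction_times(tA, tB, tAB):
            """tA[i]: refraction travel time from end source A to geophone i;
            tB[i]: the same from the opposite end source B; tAB: reciprocal
            time between A and B. Returns an N x N matrix of virtual
            refraction travel times for all geophone pairs."""
            tA, tB = np.asarray(tA), np.asarray(tB)
            # crossing the two reciprocal gathers spawns ~N^2 virtual times
            return tA[None, :] + tB[:, None] - tAB

        # toy example with N = 5 geophones between sources A and B
        t_virtual = virtual_refraction_times([1.0, 1.2, 1.5, 1.8, 2.0],
                                             [2.0, 1.8, 1.5, 1.2, 1.0], 2.4)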

  8. On the Five-Moment Hamburger Maximum Entropy Reconstruction

    Science.gov (United States)

    Summy, D. P.; Pullin, D. I.

    2018-05-01

    We consider the Maximum Entropy Reconstruction (MER) as a solution to the five-moment truncated Hamburger moment problem in one dimension. In the case of five monomial moment constraints, the probability density function (PDF) of the MER takes the form of the exponential of a quartic polynomial. This implies a possible bimodal structure in regions of moment space. An analytical model is developed for the MER PDF applicable near a known singular line in a centered, two-component, third- and fourth-order moment (μ₃, μ₄) space, consistent with the general problem of five moments. The model consists of the superposition of a perturbed, centered Gaussian PDF and a small-amplitude packet of PDF-density, called the outlying moment packet (OMP), sitting far from the mean. Asymptotic solutions are obtained which predict the shape of the perturbed Gaussian and both the amplitude and position on the real line of the OMP. The asymptotic solutions show that the presence of the OMP gives rise to an MER solution that is singular along a line in (μ₃, μ₄) space emanating from, but not including, the point representing a standard normal distribution, or thermodynamic equilibrium. We use this analysis of the OMP to develop a numerical regularization of the MER, creating a procedure we call the Hybrid MER (HMER). Compared with the MER, the HMER is a significant improvement in terms of robustness and efficiency while preserving accuracy in its prediction of other important distribution features, such as higher order moments.
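    Concretely, with Lagrange multipliers λ_k for the five monomial moment constraints, the MER PDF stated above takes the form

        \[ f(x) = \exp\!\Big( \sum_{k=0}^{4} \lambda_k x^k \Big), \qquad \int_{-\infty}^{\infty} x^k f(x)\, dx = \mu_k, \quad k = 0,\dots,4, \]

    with λ₄ < 0 required for normalizability; the possible bimodal structure arises when the quartic exponent has two local maxima.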

  9. Maximum likelihood pedigree reconstruction using integer linear programming.

    Science.gov (United States)

    Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A

    2013-01-01

    Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible. © 2012 Wiley Periodicals, Inc.
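    Schematically — an illustrative rendering, not the paper's exact formulation — such an ILP assigns each individual i a parent set P drawn from a candidate family, maximizing the total log-likelihood subject to one choice per individual:

        \[ \max \sum_i \sum_{P \in \mathcal{P}_i} w_{iP}\, x_{iP} \quad \text{s.t.} \quad \sum_{P \in \mathcal{P}_i} x_{iP} = 1 \ \ \forall i, \qquad x_{iP} \in \{0,1\}, \]

    where w_{iP} is the log-likelihood contribution of giving individual i the parent set P, and additional linear constraints (the "appropriate constraints" of the abstract) rule out invalid structures such as an individual being its own ancestor.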

  10. Phylogenetic Trees From Sequences

    Science.gov (United States)

    Ryvkin, Paul; Wang, Li-San

    In this chapter, we review important concepts and approaches for phylogeny reconstruction from sequence data. We first cover some basic definitions and properties of phylogenetics, and briefly explain how scientists model sequence evolution and measure sequence divergence. We then discuss three major approaches for phylogenetic reconstruction: distance-based phylogenetic reconstruction, maximum parsimony, and maximum likelihood. In the third part of the chapter, we review how multiple phylogenies are compared by consensus methods and how to assess confidence using bootstrapping. At the end of the chapter are two sections that list popular software packages and additional reading.
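    As a concrete illustration of the maximum-parsimony criterion reviewed in the chapter (not code from the chapter itself), the classic Fitch algorithm counts the minimum number of state changes one site requires on a given rooted binary tree:

        def fitch_count(tree, states, root='root'):
            """Minimum number of changes for one site under parsimony.
            `tree` maps each internal node to its (left, right) children;
            `states` maps each leaf to its observed character state."""
            changes = 0
            def state_set(node):
                nonlocal changes
                if node in states:                 # leaf
                    return {states[node]}
                left, right = tree[node]
                s1, s2 = state_set(left), state_set(right)
                if s1 & s2:                        # intersection: no change needed
                    return s1 & s2
                changes += 1                       # disjoint sets: one change below
                return s1 | s2
            state_set(root)
            return changes

        # ((A,C),(C,C)) needs a single change under parsimony
        tree = {'root': ('n1', 'n2'), 'n1': ('t1', 't2'), 'n2': ('t3', 't4')}
        print(fitch_count(tree, {'t1': 'A', 't2': 'C', 't3': 'C', 't4': 'C'}))  # 1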

  11. Implementation of non-linear filters for iterative penalized maximum likelihood image reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.; Gilland, D.; Jaszczak, R.; Coleman, R.

    1990-01-01

    In this paper, the authors report on the implementation of six edge-preserving, noise-smoothing, non-linear filters applied in image space for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The non-linear smoothing filters implemented were the median filter, the E6 filter, the sigma filter, the edge-line filter, the gradient-inverse filter, and the 3-point edge filter with gradient-inverse weight. A 3 x 3 window was used for all these filters. The best image obtained, judged by viewing the profiles through the image in terms of noise-smoothing, edge-sharpening, and contrast, was the one smoothed with the 3-point edge filter. The computation time for the smoothing was less than 1% of one iteration, and the memory space for the smoothing was negligible. These images were compared with the results obtained using Bayesian analysis.
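    A minimal sketch of the scheme described — a multiplicative ML-EM update followed by image-space smoothing, with the 3 x 3 median filter (the simplest of the six) standing in for any of the filters compared — under a toy projection model; the system matrix and sizes are illustrative assumptions:

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(0)
        shape, n_bins = (16, 16), 400
        R = rng.random((n_bins, shape[0] * shape[1]))  # toy non-negative system matrix

        def forward(img):
            return R @ img.ravel()

        def back(sino):
            return (R.T @ sino).reshape(shape)

        def ml_step_with_median(x, y, eps=1e-12):
            """One ML-EM update, then 3x3 median smoothing in image space."""
            ratio = y / np.maximum(forward(x), eps)
            x = x * back(ratio) / np.maximum(back(np.ones(n_bins)), eps)
            return median_filter(x, size=3)

        x = np.ones(shape)                                       # initial estimate
        y = rng.poisson(forward(np.ones(shape))).astype(float)   # toy projections
        for _ in range(10):
            x = ml_step_with_median(x, y)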

  12. Implementation of linear filters for iterative penalized maximum likelihood SPECT reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.

    1991-01-01

    This paper reports on six low-pass linear filters applied in frequency space implemented for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The filters implemented were the Shepp-Logan filter, the Butterworth filter, the Gaussian filter, the Hann filter, the Parzen filter, and the Lagrange filter. The low-pass filtering was applied in frequency space to the projection data for the initial estimate, and to the difference between the projection data and the reprojected data for higher order approximations. The projection data were acquired experimentally from a chest phantom consisting of non-uniform attenuating media. All the filters could effectively remove the noise and edge artifacts associated with the ML approach if the frequency cutoff was properly chosen. Improved performance of the Parzen and Lagrange filters relative to the others was observed. The best image, judged by viewing its profiles in terms of noise-smoothing, edge-sharpening, and contrast, was the one obtained with the Parzen filter. However, the Lagrange filter has the potential to take into account the characteristics of the detector response function.
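    For instance, the Butterworth filter named above has the standard frequency response H(f) = 1 / (1 + (f/f_c)^(2n)); a sketch of applying such a low-pass to one projection profile in frequency space, with illustrative cutoff and order:

        import numpy as np

        def butterworth_lowpass(proj, cutoff=0.25, order=4):
            """Low-pass one projection (1D profile) in frequency space.
            `cutoff` is the cutoff frequency in cycles/sample."""
            f = np.fft.rfftfreq(proj.size)
            H = 1.0 / (1.0 + (f / cutoff) ** (2 * order))
            return np.fft.irfft(np.fft.rfft(proj) * H, n=proj.size)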

  13. Live phylogeny with polytomies: Finding the most compact parsimonious trees.

    Science.gov (United States)

    Papamichail, D; Huang, A; Kennedy, E; Ott, J-L; Miller, A; Papamichail, G

    2017-08-01

    Construction of phylogenetic trees has traditionally focused on binary trees where all species appear on leaves, a problem for which numerous efficient solutions have been developed. Certain application domains though, such as viral evolution and transmission, paleontology, linguistics, and phylogenetic stemmatics, often require phylogeny inference that involves placing input species on ancestral tree nodes (live phylogeny), and polytomies. These requirements, despite their prevalence, lead to computationally harder algorithmic solutions and have been sparsely examined in the literature to date. In this article we prove some unique properties of most parsimonious live phylogenetic trees with polytomies, and their mapping to traditional binary phylogenetic trees. We show that our problem reduces to finding the most compact parsimonious tree for n species, and describe a novel efficient algorithm to find such trees without resorting to exhaustive enumeration of all possible tree topologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Phylogenetic tree reconstruction accuracy and model fit when proportions of variable sites change across the tree.

    Science.gov (United States)

    Shavit Grievink, Liat; Penny, David; Hendy, Michael D; Holland, Barbara R

    2010-05-01

    Commonly used phylogenetic models assume a homogeneous process through time in all parts of the tree. However, it is known that these models can be too simplistic, as they do not account for nonhomogeneous lineage-specific properties. In particular, it is now widely recognized that, as constraints on sequences evolve, the proportion and positions of variable sites can vary between lineages, causing heterotachy. The extent to which this model misspecification affects tree reconstruction is still unknown. Here, we evaluate the effect of changes in the proportions and positions of variable sites on model fit and tree estimation. We consider 5 current models of nucleotide sequence evolution in a Bayesian Markov chain Monte Carlo framework as well as maximum parsimony (MP). We show that, for a tree with 4 lineages where 2 nonsister taxa undergo a change in the proportion of variable sites, tree reconstruction under the best-fitting model, which is chosen using a relative test, often results in the wrong tree. In this case, we found that an absolute test of model fit is a better predictor of tree estimation accuracy. We also found further evidence that MP is not immune to heterotachy. In addition, we show that increased sampling of taxa that have undergone a change in proportion and positions of variable sites is critical for accurate tree reconstruction.

  15. Phylogeny of Salsoleae s.l. (Chenopodiaceae) based on DNA sequence data from ITS, psbB-psbH, and rbcL, with emphasis on taxa of northwestern China

    Science.gov (United States)

    Zhi-Bin Wen; Ming-Li Zhang; Ge-Lin Zhu; Stewart C. Sanderson

    2010-01-01

    To reconstruct phylogeny and verify the monophyly of major subgroups, a total of 52 species representing almost all species of Salsoleae s.l. in China were sampled, with analysis based on three molecular markers (nrDNA ITS, cpDNA psbB-psbH and rbcL), using maximum parsimony, maximum likelihood, and Bayesian inference methods. Our molecular evidence provides strong...

  16. EREM: Parameter Estimation and Ancestral Reconstruction by Expectation-Maximization Algorithm for a Probabilistic Model of Genomic Binary Characters Evolution.

    Science.gov (United States)

    Carmel, Liran; Wolf, Yuri I; Rogozin, Igor B; Koonin, Eugene V

    2010-01-01

    Evolutionary binary characters are features of species or genes, indicating the absence (value zero) or presence (value one) of some property. Examples include eukaryotic gene architecture (the presence or absence of an intron in a particular locus), gene content, and morphological characters. In many studies, the acquisition of such binary characters is assumed to represent a rare evolutionary event, and consequently, their evolution is analyzed using various flavors of parsimony. However, when gain and loss of the character are not rare enough, a probabilistic analysis becomes essential. Here, we present a comprehensive probabilistic model to describe the evolution of binary characters on a bifurcating phylogenetic tree. A fast software tool, EREM, is provided, using maximum likelihood to estimate the parameters of the model and to reconstruct ancestral states (presence and absence in internal nodes) and events (gain and loss events along branches).
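    The abstract does not spell the model out; the standard continuous-time two-state Markov model of this general kind, with gain rate g (0 → 1) and loss rate l (1 → 0), has

        \[ Q = \begin{pmatrix} -g & g \\ l & -l \end{pmatrix}, \qquad P(t) = e^{Qt} = \frac{1}{g+l} \begin{pmatrix} l + g\,e^{-(g+l)t} & g\,(1 - e^{-(g+l)t}) \\ l\,(1 - e^{-(g+l)t}) & g + l\,e^{-(g+l)t} \end{pmatrix} , \]

    and maximum likelihood estimates the rates from the observed leaf patterns, with ancestral presence/absence posteriors following from the usual pruning recursions; EREM's exact parameterization (e.g., branch-specific rates) may differ.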

  17. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)

  18. Calculating the probability of multitaxon evolutionary trees: bootstrappers Gambit.

    OpenAIRE

    Lake, J A

    1995-01-01

    The reconstruction of multitaxon trees from molecular sequences is confounded by the variety of algorithms and criteria used to evaluate trees, making it difficult to compare the results of different analyses. A global method of multitaxon phylogenetic reconstruction described here, Bootstrappers Gambit, can be used with any four-taxon algorithm, including distance, maximum likelihood, and parsimony methods. It incorporates a Bayesian-Jeffreys'-bootstrap analysis to provide a uniform probabil...

  19. Phylogenetic relationships among populations of Pristurus rupestris Blanford, 1874 (Sauria: Sphaerodactylidae) in southern Iran

    OpenAIRE

    YOUSOFI, SUGOL; POUYANI, ESKANDAR RASTEGAR; HOJATI, VIDA

    2015-01-01

    We examined intraspecific relationships of the subspecies Pristurus rupestris iranicus from the northern Persian Gulf area (Hormozgan, Bushehr, and Sistan and Baluchestan provinces). Phylogenetic relationships among these samples were estimated based on the mitochondrial cytochrome b gene. We used three methods of phylogenetic tree reconstruction (maximum likelihood, maximum parsimony, and Bayesian inference). The sampled populations were divided into 5 clades but exhibit little genetic diver...

  20. Evaluation of robustness of maximum likelihood cone-beam CT reconstruction with total variation regularization

    International Nuclear Information System (INIS)

    Stsepankou, D; Arns, A; Hesser, J; Ng, S K; Zygmanski, P

    2012-01-01

    The objective of this paper is to evaluate an iterative maximum likelihood (ML) cone-beam computed tomography (CBCT) reconstruction with total variation (TV) regularization with respect to the robustness of the algorithm against data inconsistencies. Three different and (for clinical application) typical classes of errors are considered for simulated phantom and measured projection data: quantum noise, defect detector pixels, and projection matrix errors. To quantify those errors we apply error measures like mean square error, signal-to-noise ratio, contrast-to-noise ratio, and a streak indicator. These measures are derived from linear signal theory, generalized, and applied to nonlinear signal reconstruction. For the quality check, we focus on resolution and CT-number linearity based on a Catphan phantom. All comparisons are made against the clinical standard, the filtered backprojection algorithm (FBP). In our results, we confirm and substantially extend previous results on iterative reconstruction, such as massive undersampling of the number of projections. Projection matrix errors of up to 1° of projection angle deviation are still within the tolerance level. Single defect pixels exhibit ring artifacts for each method; however, defect pixel compensation allows up to 40% defect pixels while still passing the standard clinical quality check. Further, the iterative algorithm is extraordinarily robust in the low-photon regime (down to 0.05 mAs) when compared to FBP, allowing for extremely low-dose image acquisitions, a substantial issue when considering daily CBCT imaging for position correction in radiotherapy. We conclude that the ML method studied herein is robust under clinical quality assurance conditions. Consequently, low-dose imaging, especially for daily patient localization in radiation therapy, is possible without changing the current hardware of the imaging system. (paper)
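    The abstract does not state the objective explicitly; a common way to write ML reconstruction with TV regularization, of the kind evaluated here, is

        \[ \hat{x} = \arg\min_{x \ge 0} \; \big[ -\ln L(y \mid x) + \lambda\, \mathrm{TV}(x) \big], \qquad \mathrm{TV}(x) = \sum_i \big\| (\nabla x)_i \big\|_2 , \]

    where L is the likelihood of the projection data y given the image x and λ trades data fidelity against smoothness.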

  1. The performance of phylogenetic algorithms in estimating haplotype genealogies with migration.

    Science.gov (United States)

    Salzburger, Walter; Ewing, Greg B; Von Haeseler, Arndt

    2011-05-01

    Genealogies estimated from haplotypic genetic data play a prominent role in various biological disciplines in general and in phylogenetics, population genetics and phylogeography in particular. Several software packages have specifically been developed for the purpose of reconstructing genealogies from closely related, and hence, highly similar haplotype sequence data. Here, we use simulated data sets to test the performance of traditional phylogenetic algorithms, neighbour-joining, maximum parsimony and maximum likelihood in estimating genealogies from nonrecombining haplotypic genetic data. We demonstrate that these methods are suitable for constructing genealogies from sets of closely related DNA sequences with or without migration. As genealogies based on phylogenetic reconstructions are fully resolved, but not necessarily bifurcating, and without reticulations, these approaches outperform widespread 'network' constructing methods. In our simulations of coalescent scenarios involving panmictic, symmetric and asymmetric migration, we found that phylogenetic reconstruction methods performed well, while the statistical parsimony approach as implemented in TCS performed poorly. Overall, parsimony as implemented in the PHYLIP package performed slightly better than other methods. We further point out that we are not making the case that widespread 'network' constructing methods are bad, but that traditional phylogenetic tree finding methods are applicable to haplotypic data and exhibit reasonable performance with respect to accuracy and robustness. We also discuss some of the problems of converting a tree to a haplotype genealogy, in particular that it is nonunique. © 2011 Blackwell Publishing Ltd.

  2. Algorithms for computing parsimonious evolutionary scenarios for genome evolution, the last universal common ancestor and dominance of horizontal gene transfer in the evolution of prokaryotes

    Directory of Open Access Journals (Sweden)

    Galperin Michael Y

    2003-01-01

    Full Text Available Abstract Background Comparative analysis of sequenced genomes reveals numerous instances of apparent horizontal gene transfer (HGT), at least in prokaryotes, and indicates that lineage-specific gene loss might have been even more common in evolution. This complicates the notion of a species tree, which needs to be re-interpreted as a prevailing evolutionary trend rather than the full depiction of evolution, and makes reconstruction of ancestral genomes a non-trivial task. Results We addressed the problem of constructing parsimonious scenarios for individual sets of orthologous genes given a species tree. The orthologous sets were taken from the database of Clusters of Orthologous Groups of proteins (COGs). We show that the phyletic patterns (patterns of presence-absence in completely sequenced genomes) of almost 90% of the COGs are inconsistent with the hypothetical species tree. Algorithms were developed to reconcile the phyletic patterns with the species tree by postulating gene loss, COG emergence, and HGT (the latter two classes of events were collectively treated as gene gains). We prove that each of these algorithms produces a parsimonious evolutionary scenario, which can be represented as a mapping of loss and gain events on the species tree. The distribution of the evolutionary events among the tree nodes substantially depends on the underlying assumptions of the reconciliation algorithm, e.g. whether or not independent gene gains (gain after loss after gain) are permitted. Biological considerations suggest that, on average, gene loss might be a more likely event than gene gain. Therefore different gain penalties were used and the resulting series of reconstructed gene sets for the last universal common ancestor (LUCA) of the extant life forms were analysed. The number of genes in the reconstructed LUCA gene sets grows as the gain penalty increases. However, qualitative examination of the LUCA versions reconstructed with different gain penalties
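    A minimal sketch of reconciling one phyletic pattern with a species tree under weighted gain/loss parsimony of the kind described — a Sankoff-style dynamic program in which `gain_penalty` plays the role of the paper's variable gain penalty; all names and the toy tree are illustrative:

        def min_cost_scenario(tree, pattern, gain_penalty=2.0, loss_cost=1.0, root='root'):
            """Minimum-cost gain/loss scenario for one phyletic pattern.
            `tree` maps each internal node to its (left, right) children;
            `pattern` maps each leaf (genome) to 0/1 (absence/presence)."""
            def cost(node):
                if node in pattern:                     # leaf: state is fixed
                    p = pattern[node]
                    return {0: 0.0 if p == 0 else float('inf'),
                            1: 0.0 if p == 1 else float('inf')}
                children = [cost(c) for c in tree[node]]
                out = {}
                for s in (0, 1):
                    total = 0.0
                    for child in children:
                        # a 0 -> 1 edge is a gain, a 1 -> 0 edge is a loss
                        total += min(child[0] + (loss_cost if s == 1 else 0.0),
                                     child[1] + (gain_penalty if s == 0 else 0.0))
                    out[s] = total
                return out
            c = cost(root)
            # presence at the root is itself charged as one gain (emergence)
            return min(c[0], c[1] + gain_penalty)

        tree = {'root': ('n1', 'C'), 'n1': ('A', 'B')}
        print(min_cost_scenario(tree, {'A': 1, 'B': 0, 'C': 1}))  # gain at root + loss on B -> 3.0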

  3. Entropy and transverse section reconstruction

    International Nuclear Information System (INIS)

    Gullberg, G.T.

    1976-01-01

    A new approach to the reconstruction of a transverse section using projection data from multiple views incorporates the concept of maximum entropy. The principle of maximizing information entropy embodies the assurance of minimizing bias or prejudice in the reconstruction. Maximizing entropy is a necessary condition for the reconstructed image. This entropy criterion is most appropriate for 3-D reconstruction of objects from projections where the system is underdetermined or the data are limited statistically. This is the case in nuclear medicine, where time limitations in patient studies do not yield sufficient projections.

  4. Time-Lapse Monitoring of Subsurface Fluid Flow using Parsimonious Seismic Interferometry

    KAUST Repository

    Hanafy, Sherif; Li, Jing; Schuster, Gerard T.

    2017-01-01

    of parsimonious seismic interferometry with the time-lapse monitoring idea with field examples, where we were able to record 30 different data sets within a 2-hour period. The recorded data are then processed to generate 30 snapshots that show the spread of water

  5. DupTree: a program for large-scale phylogenetic analyses using gene tree parsimony.

    Science.gov (United States)

    Wehe, André; Bansal, Mukul S; Burleigh, J Gordon; Eulenstein, Oliver

    2008-07-01

    DupTree is a new software program for inferring rooted species trees from collections of gene trees using the gene tree parsimony approach. The program implements a novel algorithm that significantly improves upon the run time of standard search heuristics for gene tree parsimony, and enables the first truly genome-scale phylogenetic analyses. In addition, DupTree allows users to examine alternate rootings and to weight the reconciliation costs for gene trees. DupTree is an open source project written in C++. DupTree for Mac OS X, Windows, and Linux along with a sample dataset and an on-line manual are available at http://genome.cs.iastate.edu/CBL/DupTree

  6. EREM: Parameter Estimation and Ancestral Reconstruction by Expectation-Maximization Algorithm for a Probabilistic Model of Genomic Binary Characters Evolution

    Directory of Open Access Journals (Sweden)

    Liran Carmel

    2010-01-01

    Full Text Available Evolutionary binary characters are features of species or genes, indicating the absence (value zero) or presence (value one) of some property. Examples include eukaryotic gene architecture (the presence or absence of an intron in a particular locus), gene content, and morphological characters. In many studies, the acquisition of such binary characters is assumed to represent a rare evolutionary event, and consequently, their evolution is analyzed using various flavors of parsimony. However, when gain and loss of the character are not rare enough, a probabilistic analysis becomes essential. Here, we present a comprehensive probabilistic model to describe the evolution of binary characters on a bifurcating phylogenetic tree. A fast software tool, EREM, is provided, using maximum likelihood to estimate the parameters of the model and to reconstruct ancestral states (presence and absence in internal nodes) and events (gain and loss events along branches).

  7. Phylogenetic inferences of Nepenthes species in Peninsular Malaysia revealed by chloroplast (trnL intron) and nuclear (ITS) DNA sequences

    OpenAIRE

    Bunawan, Hamidun; Yen, Choong Chee; Yaakop, Salmah; Noor, Normah Mohd

    2017-01-01

    Background The chloroplastic trnL intron and the nuclear internal transcribed spacer (ITS) region were sequenced for 11 Nepenthes species recorded in Peninsular Malaysia to examine their phylogenetic relationship and to evaluate the usage of trnL intron and ITS sequences for phylogenetic reconstruction of this genus. Results Phylogeny reconstruction was carried out using neighbor-joining, maximum parsimony and Bayesian analyses. All the trees revealed two major clusters, a lowland group consi...

  8. Maximum principle and convergence of central schemes based on slope limiters

    KAUST Repository

    Mehmetoglu, Orhan; Popov, Bojan

    2012-01-01

    A maximum principle and convergence of second-order central schemes are proven for scalar conservation laws in one dimension. It is well known that to establish a maximum principle a nonlinear piecewise linear reconstruction is needed, and a typical choice is the minmod limiter. Unfortunately, this implies that the scheme uses a first-order reconstruction at local extrema. The novelty here is that we allow local nonlinear reconstructions which do not reduce to first order at local extrema and still prove the maximum principle and convergence. © 2011 American Mathematical Society.
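    For reference, the minmod limiter discussed above, in the textbook piecewise-linear reconstruction of interface values (this is the standard construction that reduces to first order at extrema, not the paper's modified reconstruction):

        import numpy as np

        def minmod(a, b):
            """Pick the smaller slope when the signs agree, zero otherwise."""
            return np.where(a * b > 0.0,
                            np.where(np.abs(a) < np.abs(b), a, b), 0.0)

        def reconstruct_interfaces(u):
            """Piecewise-linear reconstruction of left/right cell-edge values
            from cell averages u (periodic boundaries for simplicity)."""
            slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
            return u - 0.5 * slope, u + 0.5 * slope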

  9. Reconstruction algorithm in compressed sensing based on maximum a posteriori estimation

    International Nuclear Information System (INIS)

    Takeda, Koujin; Kabashima, Yoshiyuki

    2013-01-01

    We propose a systematic method for constructing a sparse data reconstruction algorithm in compressed sensing at relatively low computational cost for a general observation matrix. It is known that the cost of ℓ₁-norm minimization using a standard linear programming algorithm is O(N³). We show that this cost can be reduced to O(N²) by applying the approach of posterior maximization. Furthermore, in principle, the algorithm from our approach is expected to achieve the widest successful reconstruction region, as evaluated from theoretical arguments. We also discuss the relation between the belief propagation-based reconstruction algorithm introduced in preceding works and our approach.
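    In the usual notation (an assumption here, since the abstract omits formulas), the ℓ₁ reconstruction and its posterior-maximization reading are

        \[ \hat{x} = \arg\min_x \; \tfrac{1}{2}\, \| y - A x \|_2^2 + \lambda \| x \|_1 , \]

    which, up to a rescaling of λ, is the maximum a posteriori estimator under Gaussian observation noise y = Ax + ξ and a Laplace prior p(x) ∝ exp(−λ‖x‖₁).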

  10. Potentials and limitations of histone repeat sequences for phylogenetic reconstruction of Sophophora.

    Science.gov (United States)

    Baldo, A M; Les, D H; Strausbaugh, L D

    1999-11-01

    Simplified DNA sequence acquisition has provided many new data sets that are useful for phylogenetic reconstruction, including single- and multiple-copy nuclear and organellar genes. Although transcribed regions receive much attention, nontranscribed regions have recently been added to the repertoire of sequences suitable for phylogenetic studies, especially for closely related taxa. We evaluated the efficacy of a small portion of the histone repeat for phylogenetic reconstruction among Drosophila species. Histone repeats in invertebrates offer distinct advantages similar to those of widely used ribosomal repeats. First, the units are tandemly repeated and undergo concerted evolution. Second, histone repeats include both highly conserved coding and variable intergenic regions. This composition facilitates application of "universal" primers spanning potentially informative sites. We examined a small region of the histone repeat, including the intergenic spacer segments of coding regions from the divergently transcribed H2A and H2B histone genes. The spacer (about 230 bp) exists as a mosaic with highly conserved functional motifs interspersed with rapidly diverging regions; the former aid in alignment of the spacer. There are no ambiguities in alignment of coding regions. Coding and noncoding regions were analyzed together and separately for phylogenetic information. Parsimony, distance, and maximum-likelihood methods successfully retrieve the corroborated phylogeny for the taxa examined. This study demonstrates the resolving power of a small histone region which may now be added to the growing collection of phylogenetically useful DNA sequences.

  11. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  12. Regional compensation for statistical maximum likelihood reconstruction error of PET image pixels

    International Nuclear Information System (INIS)

    Forma, J; Ruotsalainen, U; Niemi, J A

    2013-01-01

    In positron emission tomography (PET), there is increasing interest in studying not only the regional mean tracer concentration, but also its variation arising from local differences in physiology, the tissue heterogeneity. However, in reconstructed images this physiological variation is shadowed by a large reconstruction error, which is caused by noisy data and the inversion of the tomographic problem. We present a new procedure which can quantify the error variation in regional reconstructed values for a given PET measurement, and reveal the remaining tissue heterogeneity. The error quantification is made by creating and reconstructing noise realizations of virtual sinograms, which are statistically similar to the measured sinogram. Tests with physical phantom data show that characterization of the error variation and the true heterogeneity is possible, despite the existing model error when a real measurement is considered. (paper)
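    A schematic version of the procedure as described — generate statistically similar virtual sinograms, reconstruct each, and take the regional spread as the error estimate; `reconstruct` stands in for the ML reconstruction and all names are assumptions:

        import numpy as np

        def regional_error(sinogram, reconstruct, region_mask, n_real=50, seed=0):
            """Spread of the regional mean over reconstructions of Poisson
            noise realizations of virtual sinograms that are statistically
            similar to the measured one."""
            rng = np.random.default_rng(seed)
            means = []
            for _ in range(n_real):
                virtual = rng.poisson(sinogram).astype(float)  # virtual sinogram
                means.append(reconstruct(virtual)[region_mask].mean())
            return np.std(means)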

  13. Molecular phylogeny of the neritidae (Gastropoda: Neritimorpha) based on the mitochondrial genes cytochrome oxidase I (COI) and 16S rRNA

    International Nuclear Information System (INIS)

    Quintero Galvis, Julian Fernando; Castro, Lyda Raquel

    2013-01-01

    The family Neritidae has representatives in tropical and subtropical regions that occur in a variety of environments, and its known fossil record dates back to the late Cretaceous. However there have been few studies of molecular phylogeny in this family. We performed a phylogenetic reconstruction of the family Neritidae using the COI (722 bp) and the 16S rRNA (559 bp) regions of the mitochondrial genome. Neighbor-joining, maximum parsimony and Bayesian inference were performed. The best phylogenetic reconstruction was obtained using the COI region, and we consider it an appropriate marker for phylogenetic studies within the group. Consensus analysis (COI +16S rRNA) generally obtained the same tree topologies and confirmed that the genus Nerita is monophyletic. The consensus analysis using parsimony recovered a monophyletic group consisting of the genera Neritina, Septaria, Theodoxus, Puperita, and Clithon, while in the Bayesian analyses Theodoxus is separated from the other genera. The phylogenetic status of the species from the genus Nerita from the Colombian Caribbean generated in this study was consistent with that reported for the genus in previous studies. In the resulting consensus tree obtained using maximum parsimony, we included information on habitat type for each species, to map the evolution by habitat. Species of the family Neritidae possibly have their origin in marine environments, which is consistent with conclusions from previous reports based on anatomical studies.

  14. Consequence Valuing as Operation and Process: A Parsimonious Analysis of Motivation

    Science.gov (United States)

    Whelan, Robert; Barnes-Holmes, Dermot

    2010-01-01

    The concept of the motivating operation (MO) has been subject to 3 criticisms: (a) the terms and concepts employed do not always overlap with traditional behavior-analytic verbal practices; (b) the dual nature of the MO is unclear; and (c) there is a lack of adequate contact with empirical data. We offer a more parsimonious approach to motivation,…

  15. A simplified parsimonious higher order multivariate Markov chain model with new convergence condition

    Science.gov (United States)

    Wang, Chao; Yang, Chuan-sheng

    2017-09-01

    In this paper, we present a simplified parsimonious higher-order multivariate Markov chain model with a new convergence condition (TPHOMMCM-NCC). Moreover, an estimation method for the parameters in TPHOMMCM-NCC is given. Numerical experiments illustrate the effectiveness of TPHOMMCM-NCC.

  16. A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Cyrino Oliveira

    2014-01-01

    Full Text Available The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. In this paper we show the problems in fitting these models under the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. We then propose a new approach to setting both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long run, lead to a better use of water resources in energy operation planning.
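    A toy sketch of the bootstrap idea used here — resampling residuals of a fitted autoregressive model to obtain confidence intervals for its coefficient; order selection and the full PAR(p) machinery are omitted, and the AR(1) restriction is an illustrative simplification:

        import numpy as np

        def ar1_fit(x):
            """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t."""
            return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

        def bootstrap_ci(x, n_boot=1000, alpha=0.05, seed=0):
            """Residual-bootstrap confidence interval for the AR(1) coefficient."""
            rng = np.random.default_rng(seed)
            phi = ar1_fit(x)
            resid = x[1:] - phi * x[:-1]
            estimates = []
            for _ in range(n_boot):
                e = rng.choice(resid, size=resid.size, replace=True)
                xb = np.empty_like(x)
                xb[0] = x[0]
                for t in range(1, x.size):
                    xb[t] = phi * xb[t - 1] + e[t - 1]
                estimates.append(ar1_fit(xb))
            lo, hi = np.quantile(estimates, [alpha / 2, 1 - alpha / 2])
            return phi, (lo, hi)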

  17. phangorn: phylogenetic analysis in R.

    Science.gov (United States)

    Schliep, Klaus Peter

    2011-02-15

    phangorn is a package for phylogenetic reconstruction and analysis in the R language. Previously, it was only possible to estimate phylogenetic trees with distance methods in R. phangorn now offers the possibility of reconstructing phylogenies with distance-based methods, maximum parsimony, or maximum likelihood (ML), and of performing Hadamard conjugation. Extending the general ML framework, this package provides the possibility of estimating mixture and partition models. Furthermore, phangorn offers several functions for comparing trees, phylogenetic models, or splits, simulating character data, and performing congruence analyses. phangorn can be obtained through the CRAN homepage http://cran.r-project.org/web/packages/phangorn/index.html. phangorn is licensed under GPL 2.

  18. The plunge in German electricity futures prices – Analysis using a parsimonious fundamental model

    International Nuclear Information System (INIS)

    Kallabis, Thomas; Pape, Christian; Weber, Christoph

    2016-01-01

    The German market has seen a plunge in wholesale electricity prices from 2007 until 2014, with base futures prices dropping by more than 40%. This is frequently attributed to the unexpectedly high increase in renewable power generation. Using a parsimonious fundamental model, we determine the respective impact of supply and demand shocks on electricity futures prices. The methodology is based on a piecewise linear approximation of the supply stack and time-varying, price-inelastic demand. This parsimonious model is able to replicate electricity futures prices and discover non-linear dependencies in futures price formation. We show that emission prices have a higher impact on power prices than renewable penetration. Changes in renewables, demand, and installed capacities turn out to be similarly important for explaining the decrease in operating margins of conventional power plants. We thus argue for the establishment of an independent authority to stabilize emission prices. - Highlights: •We build a parsimonious fundamental model based on a piecewise linear bid stack. •We use the model to investigate impact factors for the plunge in German futures prices. •Largest impact by CO₂ price developments, followed by demand and renewable feed-in. •Power plant operating profits strongly affected by demand and renewables. •We argue that stabilizing CO₂ emission prices could provide better market signals.
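    A toy version of the price-formation mechanism described — a step-function merit order standing in for the paper's piecewise linear supply stack, intersected with price-inelastic demand; all numbers are illustrative:

        import numpy as np

        def stack_price(capacities, marginal_costs, demand):
            """Merit-order price: sort plants by marginal cost, fill demand,
            and return the marginal cost of the last unit dispatched."""
            order = np.argsort(marginal_costs)
            cumulative = np.cumsum(np.asarray(capacities, float)[order])
            marginal = np.asarray(marginal_costs, float)[order]
            idx = int(np.searchsorted(cumulative, demand))
            if idx == marginal.size:
                raise ValueError("demand exceeds installed capacity")
            return marginal[idx]

        # more zero-marginal-cost renewables shift the stack right and
        # lower the clearing price for the same demand
        print(stack_price([20, 30, 25, 15], [0.0, 20.0, 35.0, 60.0], demand=60))  # 35.0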

  19. Phylogenetic reconstruction of South American felids defined by protein electrophoresis.

    Science.gov (United States)

    Slattery, J P; Johnson, W E; Goldman, D; O'Brien, S J

    1994-09-01

    Phylogenetic associations among six closely related South American felid species were defined by changes in protein-encoding gene loci. We analyzed proteins isolated from skin fibroblasts using two-dimensional electrophoresis and allozymes extracted from blood cells. Genotypes were determined for multiple individuals of ocelot, margay, tigrina, Geoffroy's cat, kodkod, and pampas cat at 548 loci resolved by two-dimensional electrophoresis and 44 allozyme loci. Phenograms were constructed using the methods of Fitch-Margoliash and neighbor-joining on a matrix of Nei's unbiased genetic distances for all pairs of species. Results of a relative-rate test indicate changes in two-dimensional electrophoresis data are constant among all South American felids with respect to a hyena outgroup. Allelic frequencies were transformed to discrete character states for maximum parsimony analysis. Phylogenetic reconstruction indicates a major split occurred approximately 5-6 million years ago, leading to three groups within the ocelot lineage. The earliest divergence led to Leopardus tigrina, followed by a split between an ancestor of an unresolved trichotomy of three species (Oncifelis guigna, O. geoffroyi, and Lynchailuris colocolo) and a recent common ancestor of Leopardus pardalis and L. wiedii. The results suggest that modern South American felids are monophyletic and evolved rapidly after the formation of the Panama land bridge between North and South America.

  20. ROC [Receiver Operating Characteristics] study of maximum likelihood estimator human brain image reconstructions in PET [Positron Emission Tomography] clinical practice

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.; Nolan, D.; Grafton, S.T.; Mazziotta, J.C.; Hawkins, R.A.; Hoh, C.K.; Hoffman, E.J.

    1990-10-01

    This paper will report on the progress to date in carrying out Receiver Operating Characteristics (ROC) studies comparing Maximum Likelihood Estimator (MLE) and Filtered Backprojection (FBP) reconstructions of normal and abnormal human brain PET data in a clinical setting. A previous statistical study of reconstructions of the Hoffman brain phantom with real data indicated that the pixel-to-pixel standard deviation in feasible MLE images is approximately proportional to the square root of the number of counts in a region, as opposed to a standard deviation which is high and largely independent of the number of counts in FBP. A preliminary ROC study carried out with 10 non-medical observers performing a relatively simple detectability task indicates that, for the majority of observers, lower standard deviation translates itself into a statistically significant detectability advantage in MLE reconstructions. The initial results of ongoing tests with four experienced neurologists/nuclear medicine physicians are presented. Normal cases of ¹⁸F-fluorodeoxyglucose (FDG) cerebral metabolism studies and abnormal cases in which a variety of lesions have been introduced into normal data sets have been evaluated. We report on the results of reading the reconstructions of 90 data sets, each corresponding to a single brain slice. It has become apparent that the design of the study based on reading single brain slices is too insensitive and we propose a variation based on reading three consecutive slices at a time, rating only the center slice. 9 refs., 2 figs., 1 tab

  1. Philosophy and phylogenetic inference: a comparison of likelihood and parsimony methods in the context of Karl Popper's writings on corroboration.

    Science.gov (United States)

    de Queiroz, K; Poe, S

    2001-06-01

    Advocates of cladistic parsimony methods have invoked the philosophy of Karl Popper in an attempt to argue for the superiority of those methods over phylogenetic methods based on Ronald Fisher's statistical principle of likelihood. We argue that the concept of likelihood in general, and its application to problems of phylogenetic inference in particular, are highly compatible with Popper's philosophy. Examination of Popper's writings reveals that his concept of corroboration is, in fact, based on likelihood. Moreover, because probabilistic assumptions are necessary for calculating the probabilities that define Popper's corroboration, likelihood methods of phylogenetic inference--with their explicit probabilistic basis--are easily reconciled with his concept. In contrast, cladistic parsimony methods, at least as described by certain advocates of those methods, are less easily reconciled with Popper's concept of corroboration. If those methods are interpreted as lacking probabilistic assumptions, then they are incompatible with corroboration. Conversely, if parsimony methods are to be considered compatible with corroboration, then they must be interpreted as carrying implicit probabilistic assumptions. Thus, the non-probabilistic interpretation of cladistic parsimony favored by some advocates of those methods is contradicted by an attempt by the same authors to justify parsimony methods in terms of Popper's concept of corroboration. In addition to being compatible with Popperian corroboration, the likelihood approach to phylogenetic inference permits researchers to test the assumptions of their analytical methods (models) in a way that is consistent with Popper's ideas about the provisional nature of background knowledge.

  2. Molecular phylogenetic reconstruction of the endemic Asian salamander family Hynobiidae (Amphibia, Caudata).

    Science.gov (United States)

    Weisrock, David W; Macey, J Robert; Matsui, Masafumi; Mulcahy, Daniel G; Papenfuss, Theodore J

    2013-01-01

    The salamander family Hynobiidae contains over 50 species and has been the subject of a number of molecular phylogenetic investigations aimed at reconstructing branches across the entire family. In general, studies using the greatest amount of sequence data have used reduced taxon sampling, while the study with the greatest taxon sampling has used a limited sequence data set. Here, we provide insights into the phylogenetic history of the Hynobiidae using both dense taxon sampling and a large mitochondrial DNA sequence data set. We report exclusively new mitochondrial DNA data of 2566 aligned bases (151 sites excluded; of the included sites, 1157 are variable and 957 parsimony-informative). This is sampled from two genic regions encoding a 12S-16S region (the 3' end of 12S rRNA, tRNA(Val), and the 5' end of 16S rRNA), and an ND2-COI region (ND2, tRNA(Trp), tRNA(Ala), tRNA(Asn), the origin for light-strand replication--O(L), tRNA(Cys), tRNA(Tyr), and the 5' end of COI). Analyses using parsimony, Bayesian, and maximum likelihood optimality criteria produce similar phylogenetic trees, with discordant branches generally receiving low levels of branch support. Monophyly of the Hynobiidae is strongly supported across all analyses, as is the sister relationship and deep divergence between the genus Onychodactylus and all remaining hynobiids. Within this latter grouping, our phylogenetic results identify six clades that are relatively divergent from one another, but for which there is minimal support for their phylogenetic placement. This includes the genus Batrachuperus, the genus Hynobius, the genus Pachyhynobius, the genus Salamandrella, a clade containing the genera Ranodon and Paradactylodon, and a clade containing the genera Liua and Pseudohynobius. This latter clade receives low bootstrap support in the parsimony analysis, but is consistent across all three analytical methods. Our results also clarify a number of well-supported relationships within the larger

  3. constNJ: an algorithm to reconstruct sets of phylogenetic trees satisfying pairwise topological constraints.

    Science.gov (United States)

    Matsen, Frederick A

    2010-06-01

    This article introduces constNJ (constrained neighbor-joining), an algorithm for phylogenetic reconstruction of sets of trees with constrained pairwise rooted subtree-prune-regraft (rSPR) distance. We are motivated by the problem of constructing sets of trees that must fit into a recombination, hybridization, or similar network. Rather than first finding a set of trees that are optimal according to a phylogenetic criterion (e.g., likelihood or parsimony) and then attempting to fit them into a network, constNJ estimates the trees while enforcing specified rSPR distance constraints. The primary input for constNJ is a collection of distance matrices derived from sequence blocks which are assumed to have evolved in a tree-like manner, such as blocks of an alignment which do not contain any recombination breakpoints. The other input is a set of rSPR constraint inequalities for any set of pairs of trees. constNJ is consistent and a strict generalization of the neighbor-joining algorithm; it uses the new notion of maximum agreement partitions (MAPs) to assure that the resulting trees satisfy the given rSPR distance constraints.
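    For orientation, the neighbor-joining step that constNJ generalizes joins the pair of taxa minimizing the standard Q-criterion; a sketch of that criterion alone, without the rSPR-constraint machinery:

        import numpy as np

        def q_matrix(D):
            """Neighbor-joining Q-criterion for a distance matrix D;
            the pair (i, j) minimizing Q is joined next."""
            n = D.shape[0]
            row = D.sum(axis=1)
            Q = (n - 2) * D - row[:, None] - row[None, :]
            np.fill_diagonal(Q, np.inf)
            return Q

        D = np.array([[0, 5, 9, 9], [5, 0, 10, 10],
                      [9, 10, 0, 8], [9, 10, 8, 0]], float)
        i, j = np.unravel_index(np.argmin(q_matrix(D)), D.shape)  # joins taxa 0 and 1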

  4. Quality Assurance Based on Descriptive and Parsimonious Appearance Models

    DEFF Research Database (Denmark)

    Nielsen, Jannik Boll; Eiríksson, Eyþór Rúnar; Kristensen, Rasmus Lyngby

    2015-01-01

    In this positional paper, we discuss the potential benefits of using appearance models in additive manufacturing, metal casting, wind turbine blade production, and 3D content acquisition. The current state of the art in acquisition and rendering of appearance cannot easily be used for quality assurance in these areas. The common denominator is the need for descriptive and parsimonious appearance models. By 'parsimonious' we mean with few parameters, so that a model is useful both for fast acquisition, robust fitting, and fast rendering of appearance. The word 'descriptive' refers to the fact that a model should

  5. Vector Autoregressions with Parsimoniously Time Varying Parameters and an Application to Monetary Policy

    DEFF Research Database (Denmark)

    Callot, Laurent; Kristensen, Johannes Tang

    the monetary policy response to inflation and business cycle fluctuations in the US by estimating a parsimoniously time-varying parameter Taylor rule. We document substantial changes in the policy response of the Fed in the 1970s and 1980s, and since 2007, but also document the stability of this response

  6. Hypothesis of the Disappearance of the Limits of Improvidence and Parsimony in the Function of Consumption in an Islamic Economy

    Directory of Open Access Journals (Sweden)

    محمد أحمد حسن الأفندي

    2018-04-01

    There is a rich literature about the analysis of consumption behavior from the perspective of Islamic economy. The focus of such literature has been on the incorporation of the effect of moral values on individuals’ consumption behavior. However, studies on consumption did not pay enough heed to the analysis of the ultimate effect of faith values on the track of consumption behavior over time. This desired track of consumption involves showing certain hypotheses and probabilities. This study suggests a normative statement which includes the gradual disappearance of parsimony and improvidence over time. This disappearance would correct the deviation of actual consumption of society members from the desired moderate consumption level, so as to make households’ consumption behavior at the desired level which is consistent with Islamic Sharia. The study emphasizes the need to develop analysis and research in two integrated directions: (i) conducting more empirical studies to examine the consistency of the normative statement with evidence from real situations, and (ii) conducting more analysis to develop a specific measure for the desired consumption levels as well as the limits of parsimony and improvidence. Keywords: Disappearance of improvidence and parsimony limits, Desired moderate consumption level, Actual consumption, Improvidence and parsimony consumption levels, Track of households’ consumption behavior.

  7. Improving parallel imaging by jointly reconstructing multi-contrast data.

    Science.gov (United States)

    Bilgic, Berkin; Kim, Tae Hyung; Liao, Congyu; Manhard, Mary Kate; Wald, Lawrence L; Haldar, Justin P; Setsompop, Kawin

    2018-08-01

    To develop parallel imaging techniques that simultaneously exploit coil sensitivity encoding, image phase prior information, similarities across multiple images, and complementary k-space sampling for highly accelerated data acquisition. We introduce joint virtual coil (JVC)-generalized autocalibrating partially parallel acquisitions (GRAPPA) to jointly reconstruct data acquired with different contrast preparations, and show its application in 2D, 3D, and simultaneous multi-slice (SMS) acquisitions. We extend the joint parallel imaging concept to exploit limited support and smooth phase constraints through Joint (J-) LORAKS formulation. J-LORAKS allows joint parallel imaging from limited autocalibration signal region, as well as permitting partial Fourier sampling and calibrationless reconstruction. We demonstrate highly accelerated 2D balanced steady-state free precession with phase cycling, SMS multi-echo spin echo, 3D multi-echo magnetization-prepared rapid gradient echo, and multi-echo gradient recalled echo acquisitions in vivo. Compared to conventional GRAPPA, proposed joint acquisition/reconstruction techniques provide more than 2-fold reduction in reconstruction error. JVC-GRAPPA takes advantage of additional spatial encoding from phase information and image similarity, and employs different sampling patterns across acquisitions. J-LORAKS achieves a more parsimonious low-rank representation of local k-space by considering multiple images as additional coils. Both approaches provide dramatic improvement in artifact and noise mitigation over conventional single-contrast parallel imaging reconstruction. Magn Reson Med 80:619-632, 2018. © 2018 International Society for Magnetic Resonance in Medicine.

  8. Phylogenetic relationships of click beetles (Coleoptera: Elateridae) inferred from 28S ribosomal DNA: insights into the evolution of bioluminescence in Elateridae.

    Science.gov (United States)

    Sagegami-Oba, Reiko; Oba, Yuichi; Ohira, Hitoo

    2007-02-01

    Although the taxonomy of click beetles (family Elateridae) has been studied extensively, inconsistencies remain. We examine here the relationships between species of Elateridae based on partial sequences of nuclear 28S ribosomal DNA. Specimens were collected primarily from Japan, while luminous click beetles were also sampled from Central and South America to investigate the origins of bioluminescence in Elateridae. Neighbor-joining, maximum-parsimony, and maximum-likelihood analyses produced a consistent basal topology with high statistical support that is partially congruent with the results of previous investigations based on the morphological characteristics of larvae and adults. The most parsimonious reconstruction of the "luminous" and "nonluminous" states, based on the present molecular phylogeny, indicates that the ancestral state of Elateridae was nonluminous. This suggests that bioluminescence in click beetles evolved independently of that of other luminous beetles, such as Lampyridae, despite their common mechanisms of bioluminescence.
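
    The "most parsimonious reconstruction" of a two-state character such as luminous/nonluminous can be illustrated with Fitch's small-parsimony pass, which counts the minimum number of state changes a tree requires. The Python sketch below uses an invented four-taxon tree, not the Elateridae data of this study.

        # Fitch small-parsimony sketch for a binary character; the tree and
        # tip states are invented for illustration only.
        def fitch(tree, states):
            """tree: nested tuples/tip names; states: tip -> set of states.
            Returns (candidate state set at this node, minimum changes)."""
            if isinstance(tree, str):                  # a tip
                return states[tree], 0
            (ls, lc), (rs, rc) = (fitch(child, states) for child in tree)
            if ls & rs:                                # children agree
                return ls & rs, lc + rc
            return ls | rs, lc + rc + 1                # one extra change

        tips = {"firefly": {"lum"}, "click_A": {"lum"},
                "click_B": {"non"}, "click_C": {"non"}}
        tree = (("click_A", ("click_B", "click_C")), "firefly")
        print(fitch(tree, tips))    # ({'lum'}, 1): one gain or loss suffices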

  9. A parsimonious dynamic model for river water quality assessment.

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types ranging from detailed physical models to simplified conceptual models are available. A possible middle ground between detailed and simplified models is the parsimonious model, which represents the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is mandatory to focus on a simple river water quality model rather than a detailed one. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: one for quantity and one for quality. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model input or parameters was done based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
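
    The quantity sub-model's building block, a linear channel (a pure delay) feeding a linear reservoir (storage proportional to outflow, S = kQ), is easy to sketch. The routine below is an illustrative explicit-Euler discretisation with invented names and parameter values, not the authors' code.

        # One river stretch: delay the wave (linear channel), then let a
        # linear reservoir (dS/dt = q_in - q, with S = k*q) disperse it.
        import numpy as np

        def route_stretch(inflow, delay_steps, k, dt=1.0):
            delayed = np.concatenate([np.zeros(delay_steps),
                                      inflow])[:len(inflow)]
            out = np.zeros_like(delayed)
            q = 0.0
            for t, q_in in enumerate(delayed):
                q += dt * (q_in - q) / k        # explicit-Euler update
                out[t] = q
            return out

        pulse = np.zeros(50)
        pulse[5:10] = 10.0                      # an upstream pollution wave
        downstream = route_stretch(pulse, delay_steps=3, k=4.0)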

  10. Beyond technology acceptance to effective technology use: a parsimonious and actionable model.

    Science.gov (United States)

    Holahan, Patricia J; Lesselroth, Blake J; Adams, Kathleen; Wang, Kai; Church, Victoria

    2015-05-01

    To develop and test a parsimonious and actionable model of effective technology use (ETU). Cross-sectional survey of primary care providers (n = 53) in a large integrated health care organization that recently implemented new medication reconciliation technology. Surveys assessed 5 technology-related perceptions (compatibility with work values, implementation climate, compatibility with work processes, perceived usefulness, and ease of use) and 1 outcome variable, ETU. ETU was measured as both consistency and quality of technology use. Compatibility with work values and implementation climate were found to have differential effects on consistency and quality of use. When implementation climate was strong, consistency of technology use was high. However, quality of technology use was high only when implementation climate was strong and values compatibility was high. This is an important finding and highlights the importance of users' workplace values as a key determinant of quality of use. To extend our effectiveness in implementing new health care information technology, we need parsimonious models that include actionable determinants of ETU and account for the differential effects of these determinants on the multiple dimensions of ETU. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. A framelet-based iterative maximum-likelihood reconstruction algorithm for spectral CT

    Science.gov (United States)

    Wang, Yingmei; Wang, Ge; Mao, Shuwei; Cong, Wenxiang; Ji, Zhilong; Cai, Jian-Feng; Ye, Yangbo

    2016-11-01

    Standard computed tomography (CT) cannot reproduce spectral information of an object. Hardware solutions include dual-energy CT, which scans the object twice at different x-ray energy levels, and energy-discriminative detectors, which can separate lower and higher energy levels from a single x-ray scan. In this paper, we propose a software solution and give an iterative algorithm that reconstructs an image with spectral information from just one scan with a standard energy-integrating detector. The spectral information obtained can be used to produce color CT images, spectral curves of the attenuation coefficient μ(r,E) at points inside the object, and photoelectric images, which are all valuable imaging tools in cancer diagnosis. Our software solution requires no change to the hardware of a CT machine. With the Shepp-Logan phantom, we have found that although the photoelectric and Compton components were not perfectly reconstructed, their composite effect was very accurately reconstructed as compared to the ground truth and the dual-energy CT counterpart. This means that our proposed method has an intrinsic benefit in beam hardening correction and metal artifact reduction. The algorithm is based on a nonlinear polychromatic acquisition model for x-ray CT. The key technique is a sparse representation of iterations in a framelet system. Convergence of the algorithm is studied. This is believed to be the first application of framelet imaging tools to a nonlinear inverse problem.

  12. Permutationally invariant state reconstruction

    DEFF Research Database (Denmark)

    Moroder, Tobias; Hyllus, Philipp; Tóth, Géza

    2012-01-01

    Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, maximum likelihood, and relies on an optimization which has clear advantages regarding speed, control and accuracy in comparison to commonly employed numerical routines. First prototype implementations easily allow reconstruction of a state of 20 qubits in a few minutes on a standard computer.

  13. Assessing internet addiction using the parsimonious internet addiction components model—A preliminary study.

    OpenAIRE

    Kuss, D.J.; Shorter, G.W.; Rooij, A.J. van; Griffiths, M.D.; Schoenmakers, T.M.

    2014-01-01

    Internet usage has grown exponentially over the last decade. Research indicates that excessive Internet use can lead to symptoms associated with addiction. To date, assessment of potential Internet addiction has varied regarding populations studied and instruments used, making reliable prevalence estimations difficult. To overcome the present problems a preliminary study was conducted testing a parsimonious Internet addiction components model based on Griffiths’ addiction components (Journal ...

  14. 40 CFR 63.43 - Maximum achievable control technology (MACT) determinations for constructed and reconstructed...

    Science.gov (United States)

    2010-07-01

    ... Administrator, and shall provide a summary in a compatible electronic format for inclusion in the MACT data base... paragraph (d) of this section. (2) In each instance where a constructed or reconstructed major source would...) In each instance where the owner or operator contends that a constructed or reconstructed major...

  15. Phylogenetic Reconstruction, Morphological Diversification and Generic Delimitation of Disepalum (Annonaceae).

    Science.gov (United States)

    Li, Pui-Sze; Thomas, Daniel C; Saunders, Richard M K

    2015-01-01

    Taxonomic delimitation of Disepalum (Annonaceae) is contentious, with some researchers favoring a narrow circumscription following segregation of the genus Enicosanthellum. We reconstruct the phylogeny of Disepalum and related taxa based on four chloroplast and two nuclear DNA regions as a framework for clarifying taxonomic delimitation and assessing evolutionary transitions in key morphological characters. Maximum parsimony, maximum likelihood and Bayesian methods resulted in a consistent, well-resolved and strongly supported topology. Disepalum s.l. is monophyletic and strongly supported, with Disepalum s.str. and Enicosanthellum retrieved as sister groups. Although this topology is consistent with both taxonomic delimitations, the distribution of morphological synapomorphies provides greater support for the inclusion of Enicosanthellum within Disepalum s.l. We propose a novel infrageneric classification with two subgenera. Subgen. Disepalum (= Disepalum s.str.) is supported by numerous synapomorphies, including the reduction of the calyx to two sepals and connation of petals. Subgen. Enicosanthellum lacks obvious morphological synapomorphies, but possesses several diagnostic characters (symplesiomorphies), including a trimerous calyx and free petals in two whorls. We evaluate changes in petal morphology in relation to hypotheses of the genetic control of floral development and suggest that the compression of two petal whorls into one and the associated fusion of contiguous petals may be associated with the loss of the pollination chamber, which in turn may be associated with a shift in primary pollinator. We also suggest that the formation of pollen octads may be selectively advantageous when pollinator visits are infrequent, although this would only be applicable if multiple ovules could be fertilized by each octad; since the flowers are apocarpous, this would require an extragynoecial compitum to enable intercarpellary growth of pollen tubes. We furthermore

  16. Things fall apart: biological species form unconnected parsimony networks.

    Science.gov (United States)

    Hart, Michael W; Sunday, Jennifer

    2007-10-22

    The generality of operational species definitions is limited by problematic definitions of between-species divergence. A recent phylogenetic species concept based on a simple objective measure of statistically significant genetic differentiation uses between-species application of statistical parsimony networks that are typically used for population genetic analysis within species. Here we review recent phylogeographic studies and reanalyse several mtDNA barcoding studies using this method. We found that (i) alignments of DNA sequences typically fall apart into a separate subnetwork for each Linnean species (but with a higher rate of true positives for mtDNA data) and (ii) DNA sequences from single species typically stick together in a single haplotype network. Departures from these patterns are usually consistent with hybridization or cryptic species diversity.
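
    The "falling apart" test can be made concrete with a toy network builder: link haplotypes whose pairwise mutational distance is within a connection limit and report the resulting subnetworks. In the sketch below the limit is hard-coded for brevity; statistical parsimony instead derives a 95% connection limit from the data.

        # Toy parsimony-network construction: sequences are joined when their
        # pairwise distance is within `limit`; the statistical 95% limit of
        # TCS-style analyses is replaced by a fixed threshold here.
        from itertools import combinations

        def subnetworks(seqs, limit):
            """seqs: name -> aligned sequence; returns connected components."""
            parent = {name: name for name in seqs}
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for a, b in combinations(seqs, 2):
                if sum(x != y for x, y in zip(seqs[a], seqs[b])) <= limit:
                    parent[find(a)] = find(b)
            groups = {}
            for name in seqs:
                groups.setdefault(find(name), []).append(name)
            return list(groups.values())

        haps = {"sp1_a": "AAGT", "sp1_b": "AAGA",
                "sp2_a": "CCGT", "sp2_b": "CCGA"}
        print(subnetworks(haps, limit=1))
        # [['sp1_a', 'sp1_b'], ['sp2_a', 'sp2_b']] -- two subnetworks,
        # one per "species", matching pattern (i) above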

  17. Comparison of Boolean analysis and standard phylogenetic methods using artificially evolved and natural mt-tRNA sequences from great apes.

    Science.gov (United States)

    Ari, Eszter; Ittzés, Péter; Podani, János; Thi, Quynh Chi Le; Jakó, Eena

    2012-04-01

    Boolean analysis (or BOOL-AN; Jakó et al., 2009. BOOL-AN: A method for comparative sequence analysis and phylogenetic reconstruction. Mol. Phylogenet. Evol. 52, 887-97.), a recently developed method for sequence comparison, uses the Iterative Canonical Form of Boolean functions. It considers sequence information in a way entirely different from standard phylogenetic methods (i.e. Maximum Parsimony, Maximum-Likelihood, Neighbor-Joining, and Bayesian analysis). The performance and reliability of Boolean analysis were tested and compared with the standard phylogenetic methods, using artificially evolved - simulated - nucleotide sequences and the 22 mitochondrial tRNA genes of the great apes. At the outset, we assumed that the phylogeny of Hominidae is generally well established, and the guide tree of artificial sequence evolution can also be used as a benchmark. These offer a possibility to compare and test the performance of different phylogenetic methods. Trees were reconstructed by each method from 2500 simulated sequences and 22 mitochondrial tRNA sequences. We also introduced a special re-sampling method for Boolean analysis on permuted sequence sites, the P-BOOL-AN procedure. Considering the reliability values (branch support values of consensus trees and Robinson-Foulds distances) we used for simulated sequence trees produced by different phylogenetic methods, BOOL-AN appeared as the most reliable method. Although the mitochondrial tRNA sequences of great apes are relatively short (59-75 bases long) and the ratio of their constant characters is about 75%, BOOL-AN, P-BOOL-AN and the Bayesian approach produced the same tree-topology as the established phylogeny, while the outcomes of Maximum Parsimony, Maximum-Likelihood and Neighbor-Joining methods were equivocal. We conclude that Boolean analysis is a promising alternative to existing methods of sequence comparison for phylogenetic reconstruction and congruence analysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Reconstructing temperatures in the Maritime Alps, Italy, since the Last Glacial Maximum using cosmogenic noble gas paleothermometry

    Science.gov (United States)

    Tremblay, Marissa; Spagnolo, Matteo; Ribolini, Adriano; Shuster, David

    2016-04-01

    The Gesso Valley, located in the southwestern-most, Maritime portion of the European Alps, contains an exceptionally well-preserved record of glacial advances during the late Pleistocene and Holocene. Detailed geomorphic mapping, geochronology of glacial deposits, and glacier reconstructions indicate that glaciers in this Mediterranean region responded to millennial scale climate variability differently than glaciers in the interior of the European Alps. This suggests that the Mediterranean Sea somehow modulated the climate of this region. However, since glaciers respond to changes in temperature and precipitation, both variables were potentially influenced by proximity to the Sea. To disentangle the competing effects of temperature and precipitation changes on glacier size, we are constraining past temperature variations in the Gesso Valley since the Last Glacial Maximum (LGM) using cosmogenic noble gas paleothermometry. The cosmogenic noble gases 3He and 21Ne experience diffusive loss from common minerals like quartz and feldspars at Earth surface temperatures. Cosmogenic noble gas paleothermometry utilizes this open-system behavior to quantitatively constrain thermal histories of rocks during exposure to cosmic ray particles at the Earth's surface. We will present measurements of cosmogenic 3He in quartz sampled from moraines in the Gesso Valley with LGM, Bühl stadial, and Younger Dryas ages. With these 3He measurements and experimental data quantifying the diffusion kinetics of 3He in quartz, we will provide a preliminary temperature reconstruction for the Gesso Valley since the LGM. Future work on samples from younger moraines in the valley system will be used to fill in details of the more recent temperature history.

  19. Three-dimensional sheaf of ultrasound planes reconstruction (SOUPR) of ablated volumes.

    Science.gov (United States)

    Ingle, Atul; Varghese, Tomy

    2014-08-01

    This paper presents an algorithm for 3-D reconstruction of tumor ablations using ultrasound shear wave imaging with electrode vibration elastography. Radio-frequency ultrasound data frames are acquired over imaging planes that form a subset of a sheaf of planes sharing a common axis of intersection. Shear wave velocity is estimated separately on each imaging plane using a piecewise linear function fitting technique with a fast optimization routine. An interpolation algorithm then computes velocity maps on a fine grid over a set of C-planes that are perpendicular to the axis of the sheaf. A full 3-D rendering of the ablation can then be created from this stack of C-planes; hence the name "Sheaf Of Ultrasound Planes Reconstruction" or SOUPR. The algorithm is evaluated through numerical simulations and also using data acquired from a tissue mimicking phantom. Reconstruction quality is gauged using contrast and contrast-to-noise ratio measurements and changes in quality from using increasing number of planes in the sheaf are quantified. The highest contrast of 5 dB is seen between the stiffest and softest regions of the phantom. Under certain idealizing assumptions on the true shape of the ablation, good reconstruction quality while maintaining fast processing rate can be obtained with as few as six imaging planes suggesting that the method is suited for parsimonious data acquisitions with very few sparsely chosen imaging planes.

  20. Ancestral sequence alignment under optimal conditions

    Directory of Open Access Journals (Sweden)

    Brown Daniel G

    2005-11-01

    Background: Multiple genome alignment is an important problem in bioinformatics. An important subproblem used by many multiple alignment approaches is that of aligning two multiple alignments. Many popular alignment algorithms for DNA use the sum-of-pairs heuristic, where the score of a multiple alignment is the sum of its induced pairwise alignment scores. However, the biological meaning of the sum-of-pairs heuristic is not obvious. Additionally, many algorithms based on the sum-of-pairs heuristic are complicated and slow, compared to pairwise alignment algorithms. An alternative approach to aligning alignments is to first infer ancestral sequences for each alignment, and then align the two ancestral sequences. In addition to being fast, this method has a clear biological basis that takes into account the evolution implied by an underlying phylogenetic tree. In this study we explore the accuracy of aligning alignments by ancestral sequence alignment. We examine the use of both maximum likelihood and parsimony to infer ancestral sequences. Additionally, we investigate the effect on accuracy of allowing ambiguity in our ancestral sequences. Results: We use synthetic sequence data that we generate by simulating evolution on a phylogenetic tree. We use two different types of phylogenetic trees: trees with a period of rapid growth followed by a period of slow growth, and trees with a period of slow growth followed by a period of rapid growth. We examine the alignment accuracy of four ancestral sequence reconstruction and alignment methods: parsimony, maximum likelihood, ambiguous parsimony, and ambiguous maximum likelihood. Additionally, we compare against the alignment accuracy of two sum-of-pairs algorithms: ClustalW and the heuristic of Ma, Zhang, and Wang. Conclusion: We find that allowing ambiguity in ancestral sequences does not lead to better multiple alignments. Regardless of whether we use parsimony or maximum likelihood, the

  1. Reconstructing Oceanographic Conditions From the Holocene to the Last Glacial Maximum in the Bay of Bengal

    Science.gov (United States)

    Miller, J.; Dekens, P. S.; Weber, M. E.; Spiess, V.; France-Lanord, C.

    2015-12-01

    The International Ocean Discovery Program (IODP) Expedition 354 drilled 7 sites in the Bay of Bengal, providing a unique opportunity to improve our understanding of the link between glacial cycles, tropical oceanographic changes, and monsoon strength. Deep-sea sediment cores of the Bengal Fan fluctuate between sand, hemipelagic, and terrestrial sediment layers. All but one of the sites (U1454) contain a layer of calcareous clay in the uppermost part of the core that is late Pleistocene in age. During Expedition 354 site U1452C was sampled at high resolution (every 2 cm) by a broad group of collaborators with the goal of reconstructing monsoon strength and oceanographic conditions using a variety of proxies. The top 480 cm of site U1452C (8ºN, 87ºE, 3671 m water depth) contains primarily nannofossil-rich calcareous clay. The relatively high abundance of foraminifera will allow us to generate a high-resolution record of sea surface temperature (SST) and sea surface salinity (SSS) using standard foraminifera proxies. We will present oxygen isotope (δ18O) and Mg/Ca data of mixed-layer planktonic foraminifera from the top 70 cm of the core, representing the Holocene to the last glacial maximum. δ18O of planktonic foraminifera records global ice volume and local SST and SSS, while Mg/Ca of foraminifera is a proxy for SST. The paired Mg/Ca and δ18O measurements on the same samples of foraminifera, together with published estimates of global ocean δ18O, can be used to reconstruct both SST and local δ18O of seawater, which is a function of the evaporation/precipitation balance. In future work, the local SSS and SST during the LGM will be paired with terrestrial and other oceanic proxies to increase our understanding of how global climate is connected to monsoon strength.

  2. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets
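
    For reference, the classical maximum-entropy functional that the quantum entropy generalizes has the textbook form below; the notation is generic rather than the paper's, with m(x) the default model toward which the estimate is smoothed.

        % Classical maximum-entropy density estimation (generic notation)
        \hat{p} = \arg\max_{p \ge 0} \; S[p]
        \quad \text{subject to} \quad \int p(x)\,dx = 1
        \quad \text{and data constraints,}
        \qquad
        S[p] = -\int p(x)\,\ln\!\frac{p(x)}{m(x)}\,dx .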

  3. Comprehensive Phylogenetic Analysis of Bovine Non-aureus Staphylococci Species Based on Whole-Genome Sequencing

    Science.gov (United States)

    Naushad, Sohail; Barkema, Herman W.; Luby, Christopher; Condas, Larissa A. Z.; Nobrega, Diego B.; Carson, Domonique A.; De Buck, Jeroen

    2016-01-01

    Non-aureus staphylococci (NAS), a heterogeneous group of a large number of species and subspecies, are the most frequently isolated pathogens from intramammary infections in dairy cattle. Phylogenetic relationships among bovine NAS species are controversial and have mostly been determined based on single-gene trees. Herein, we analyzed phylogeny of bovine NAS species using whole-genome sequencing (WGS) of 441 distinct isolates. In addition, evolutionary relationships among bovine NAS were estimated from multilocus data of 16S rRNA, hsp60, rpoB, sodA, and tuf genes and sequences from these and numerous other single genes/proteins. All phylogenies were created with FastTree, Maximum-Likelihood, Maximum-Parsimony, and Neighbor-Joining methods. Regardless of methodology, WGS-trees clearly separated bovine NAS species into five monophyletic coherent clades. Furthermore, there were consistent interspecies relationships within clades in all WGS phylogenetic reconstructions. Except for the Maximum-Parsimony tree, multilocus data analysis similarly produced five clades. There were large variations in determining clades and interspecies relationships in single gene/protein trees, under different methods of tree constructions, highlighting limitations of using single genes for determining bovine NAS phylogeny. However, based on WGS data, we established a robust phylogeny of bovine NAS species, unaffected by method or model of evolutionary reconstructions. Therefore, it is now possible to determine associations between phylogeny and many biological traits, such as virulence, antimicrobial resistance, environmental niche, geographical distribution, and host specificity. PMID:28066335

  4. Assessing Internet addiction using the parsimonious Internet addiction components model - a preliminary study [forthcoming

    OpenAIRE

    Kuss, DJ; Shorter, GW; Van Rooij, AJ; Griffiths, MD; Schoenmakers, T

    2014-01-01

    Internet usage has grown exponentially over the last decade. Research indicates that excessive Internet use can lead to symptoms associated with addiction. To date, assessment of potential Internet addiction has varied regarding populations studied and instruments used, making reliable prevalence estimations difficult. To overcome the present problems a preliminary study was conducted testing a parsimonious Internet addiction components model based on Griffiths’ addiction components (2005), i...

  5. Seeing the elephant: Parsimony, functionalism, and the emergent design of contempt and other sentiments.

    Science.gov (United States)

    Gervais, Matthew M; Fessler, Daniel M T

    2017-01-01

    The target article argues that contempt is a sentiment, and that sentiments are the deep structure of social affect. The 26 commentaries meet these claims with a range of exciting extensions and applications, as well as critiques. Most significantly, we reply that construction and emergence are necessary for, not incompatible with, evolved design, while parsimony requires explanatory adequacy and predictive accuracy, not mere simplicity.

  6. Molecular Phylogenetics: Mathematical Framework and Unsolved Problems

    Science.gov (United States)

    Xia, Xuhua

    Phylogenetic relationship is essential in dating evolutionary events, reconstructing ancestral genes, predicting sites that are important to natural selection, and, ultimately, understanding genomic evolution. Three categories of phylogenetic methods are currently used: the distance-based, the maximum parsimony, and the maximum likelihood method. Here, I present the mathematical framework of these methods and their rationales, provide computational details for each of them, illustrate analytically and numerically the potential biases inherent in these methods, and outline computational challenges and unresolved problems. This is followed by a brief discussion of the Bayesian approach that has been recently used in molecular phylogenetics.
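
    As a concrete anchor for the likelihood method mentioned above, the quantity computed at each internal node v is the conditional likelihood of Felsenstein's pruning recursion (standard textbook form, not notation specific to this chapter), where P_{ss'}(t_c) is the substitution probability along the branch of length t_c to child c and the tip likelihoods are indicators of the observed states:

        % Felsenstein's pruning recursion (textbook form)
        L_v(s) = \prod_{c \in \mathrm{children}(v)} \; \sum_{s'} P_{s s'}(t_c)\, L_c(s')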

  7. The maximum likelihood estimator method of image reconstruction: Its fundamental characteristics and their origin

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.

    1987-05-01

    We review our recent work characterizing the image reconstruction properties of the MLE algorithm. We studied its convergence properties and confirmed the onset of image deterioration, which is a function of the number of counts in the source. By modulating the weight given to projection tubes with high numbers of counts with respect to those with low numbers of counts in the reconstruction process, we have confirmed that image deterioration is due to an attempt by the algorithm to match projection data tubes with high numbers of counts too closely to the iterative image projections. We developed a stopping rule for the algorithm that tests the hypothesis that a reconstructed image could have given the initial projection data in a manner consistent with the underlying assumption of Poisson distributed variables. The rule was applied to two mathematically generated phantoms with success and to a third phantom with exact (no statistical fluctuations) projection data. We conclude that the behavior of the target functions whose extrema are sought in iterative schemes is more important in the early stages of the reconstruction than in the later stages, when the extrema are being approached only to within the limits set by the Poisson nature of the measurement. 11 refs., 14 figs.
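
    The MLE iteration characterized here is what is now usually called the multiplicative MLEM update. The numpy sketch below is a generic stand-in (the system matrix A and count data y are placeholders) and deliberately omits the hypothesis-testing stopping rule developed in the paper.

        # Generic MLEM update for emission tomography; A and y are stand-ins.
        import numpy as np

        def mlem(A, y, n_iter=50, eps=1e-12):
            """A: (tubes x voxels) system matrix, y: counts per tube."""
            x = np.ones(A.shape[1])                # flat initial image
            sens = np.maximum(A.sum(axis=0), eps)  # per-voxel sensitivity
            for _ in range(n_iter):
                proj = np.maximum(A @ x, eps)      # forward projection
                x *= (A.T @ (y / proj)) / sens     # multiplicative update
            return x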

  8. Mitochondrial phylogeny of the Chrysisignita (Hymenoptera: Chrysididae) species group based on simultaneous Bayesian alignment and phylogeny reconstruction.

    Science.gov (United States)

    Soon, Villu; Saarma, Urmas

    2011-07-01

    The ignita species group within the genus Chrysis includes over 100 cuckoo wasp species, which all lead a parasitic lifestyle and exhibit very similar morphology. The lack of robust, diagnostic morphological characters has hindered phylogenetic reconstructions and contributed to frequent misidentification and inconsistent interpretations of species in this group. Therefore, molecular phylogenetic analysis is the most suitable approach for resolving the phylogeny and taxonomy of this group. We present a well-resolved phylogeny of the Chrysis ignita species group based on mitochondrial sequence data from 41 ingroup and six outgroup taxa. Although our emphasis was on European taxa, we included samples from most of the distribution range of the C. ignita species group to test for monophyly. We used a continuous mitochondrial DNA sequence consisting of 16S rRNA, tRNA(Val), 12S rRNA and ND4. The location of the ND4 gene at the 3' end of this continuous sequence, following 12S rRNA, represents a novel mitochondrial gene arrangement for insects. Due to difficulties in aligning rRNA genes, two different Bayesian approaches were employed to reconstruct phylogeny: (1) using a reduced data matrix including only those positions that could be aligned with confidence; or (2) using the full sequence dataset while estimating alignment and phylogeny simultaneously. In addition maximum-parsimony and maximum-likelihood analyses were performed to test the robustness of the Bayesian approaches. Although all approaches yielded trees with similar topology, considerably more nodes were resolved with analyses using the full data matrix. Phylogenetic analysis supported the monophyly of the C. ignita species group and divided its species into well-supported clades. The resultant phylogeny was only partly in accordance with published subgroupings based on morphology. Our results suggest that several taxa currently treated as subspecies or names treated as synonyms may in fact constitute

  9. A Parsimonious Instrument for Predicting Students' Intent to Pursue a Sales Career: Scale Development and Validation

    Science.gov (United States)

    Peltier, James W.; Cummins, Shannon; Pomirleanu, Nadia; Cross, James; Simon, Rob

    2014-01-01

    Students' desire and intention to pursue a career in sales continue to lag behind industry demand for sales professionals. This article develops and validates a reliable and parsimonious scale for measuring and predicting student intention to pursue a selling career. The instrument advances previous scales in three ways. The instrument is…

  10. Phylogeny reconstruction and hybrid analysis of populus (Salicaceae) based on nucleotide sequences of multiple single-copy nuclear genes and plastid fragments.

    Science.gov (United States)

    Wang, Zhaoshan; Du, Shuhui; Dayanandan, Selvadurai; Wang, Dongsheng; Zeng, Yanfei; Zhang, Jianguo

    2014-01-01

    Populus (Salicaceae) is one of the most economically and ecologically important genera of forest trees. The complex reticulate evolution and lack of highly variable orthologous single-copy DNA markers have posed difficulties in resolving the phylogeny of this genus. Based on a large data set of nuclear and plastid DNA sequences, we reconstructed robust phylogeny of Populus using parsimony, maximum likelihood and Bayesian inference methods. The resulting phylogenetic trees showed better resolution at both inter- and intra-sectional level than previous studies. The results revealed that (1) the plastid-based phylogenetic tree resulted in two main clades, suggesting an early divergence of the maternal progenitors of Populus; (2) three advanced sections (Populus, Aigeiros and Tacamahaca) are of hybrid origin; (3) species of the section Tacamahaca could be divided into two major groups based on plastid and nuclear DNA data, suggesting a polyphyletic nature of the section; and (4) many species proved to be of hybrid origin based on the incongruence between plastid and nuclear DNA trees. Reticulate evolution may have played a significant role in the evolution history of Populus by facilitating rapid adaptive radiations into different environments.

  11. Phylogeny reconstruction and hybrid analysis of populus (Salicaceae based on nucleotide sequences of multiple single-copy nuclear genes and plastid fragments.

    Directory of Open Access Journals (Sweden)

    Zhaoshan Wang

    Populus (Salicaceae) is one of the most economically and ecologically important genera of forest trees. The complex reticulate evolution and lack of highly variable orthologous single-copy DNA markers have posed difficulties in resolving the phylogeny of this genus. Based on a large data set of nuclear and plastid DNA sequences, we reconstructed robust phylogeny of Populus using parsimony, maximum likelihood and Bayesian inference methods. The resulting phylogenetic trees showed better resolution at both inter- and intra-sectional level than previous studies. The results revealed that (1) the plastid-based phylogenetic tree resulted in two main clades, suggesting an early divergence of the maternal progenitors of Populus; (2) three advanced sections (Populus, Aigeiros and Tacamahaca) are of hybrid origin; (3) species of the section Tacamahaca could be divided into two major groups based on plastid and nuclear DNA data, suggesting a polyphyletic nature of the section; and (4) many species proved to be of hybrid origin based on the incongruence between plastid and nuclear DNA trees. Reticulate evolution may have played a significant role in the evolution history of Populus by facilitating rapid adaptive radiations into different environments.

  12. Reconstruction of electrical impedance tomography (EIT) images based on the expectation maximum (EM) method.

    Science.gov (United States)

    Wang, Qi; Wang, Huaxiang; Cui, Ziqiang; Yang, Chengyi

    2012-11-01

    Electrical impedance tomography (EIT) calculates the internal conductivity distribution within a body using electrical contact measurements. The image reconstruction for EIT is an inverse problem, which is both non-linear and ill-posed. The traditional regularization method cannot avoid introducing negative values in the solution. The negativity of the solution produces artifacts in reconstructed images in presence of noise. A statistical method, namely, the expectation maximization (EM) method, is used to solve the inverse problem for EIT in this paper. The mathematical model of EIT is transformed to the non-negatively constrained likelihood minimization problem. The solution is obtained by the gradient projection-reduced Newton (GPRN) iteration method. This paper also discusses the strategies of choosing parameters. Simulation and experimental results indicate that the reconstructed images with higher quality can be obtained by the EM method, compared with the traditional Tikhonov and conjugate gradient (CG) methods, even with non-negative processing. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Bayesian image reconstruction for improving detection performance of muon tomography.

    Science.gov (United States)

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
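
    The iterate-then-shrink structure described above can be illustrated with a generic sparse-reconstruction loop: take an unregularized ML-style step, then shrink it. Soft-thresholding, the shrinkage for a plain Laplacian prior under a Gaussian likelihood, stands in for the paper's inverse quadratic and inverse cubic shrinkage functions; A, y and all parameter values are placeholders.

        # ISTA-style sketch of "MAP update = shrink(ML update)"; this is a
        # generic illustration, not the muon-tomography algorithm itself.
        import numpy as np

        def soft_threshold(v, lam):
            return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

        def shrinkage_recon(A, y, lam=0.1, n_iter=200):
            """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by shrinkage steps."""
            step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                x_ml = x - step * (A.T @ (A @ x - y))  # unregularized step
                x = soft_threshold(x_ml, step * lam)   # shrink toward prior
            return x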

  14. Noise and physical limits to maximum resolution of PET images

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L.; Espana, S. [Dpto. Fisica Atomica, Molecular y Nuclear, Facultad de Ciencias Fisicas, Universidad Complutense de Madrid, Avda. Complutense s/n, E-28040 Madrid (Spain); Vicente, E.; Vaquero, J.J.; Desco, M. [Unidad de Medicina y Cirugia Experimental, Hospital GU 'Gregorio Maranon', E-28007 Madrid (Spain); Udias, J.M. [Dpto. Fisica Atomica, Molecular y Nuclear, Facultad de Ciencias Fisicas, Universidad Complutense de Madrid, Avda. Complutense s/n, E-28040 Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es

    2007-10-01

    In this work we show that there is a limit for the maximum resolution achievable with a high resolution PET scanner, as well as for the best signal-to-noise ratio, which are ultimately related to the physical effects involved in the emission and detection of the radiation and thus they cannot be overcome with any particular reconstruction method. These effects prevent the spatial high frequency components of the imaged structures to be recorded by the scanner. Therefore, the information encoded in these high frequencies cannot be recovered by any reconstruction technique. Within this framework, we have determined the maximum resolution achievable for a given acquisition as a function of data statistics and scanner parameters, like the size of the crystals or the inter-crystal scatter. In particular, the noise level in the data as a limitation factor to yield high-resolution images in tomographs with small crystal sizes is outlined. These results have implications regarding how to decide the optimal number of voxels of the reconstructed image or how to design better PET scanners.

  15. Noise and physical limits to maximum resolution of PET images

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Vicente, E.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2007-01-01

    In this work we show that there is a limit for the maximum resolution achievable with a high resolution PET scanner, as well as for the best signal-to-noise ratio, which are ultimately related to the physical effects involved in the emission and detection of the radiation and thus they cannot be overcome with any particular reconstruction method. These effects prevent the spatial high frequency components of the imaged structures to be recorded by the scanner. Therefore, the information encoded in these high frequencies cannot be recovered by any reconstruction technique. Within this framework, we have determined the maximum resolution achievable for a given acquisition as a function of data statistics and scanner parameters, like the size of the crystals or the inter-crystal scatter. In particular, the noise level in the data as a limitation factor to yield high-resolution images in tomographs with small crystal sizes is outlined. These results have implications regarding how to decide the optimal number of voxels of the reconstructed image or how to design better PET scanners

  16. Reconstructing the ups and downs of primate brain evolution: implications for adaptive hypotheses and Homo floresiensis

    Directory of Open Access Journals (Sweden)

    Barton Robert A

    2010-01-01

    Background: Brain size is a key adaptive trait. It is often assumed that increasing brain size was a general evolutionary trend in primates, yet recent fossil discoveries have documented brain size decreases in some lineages, raising the question of how general a trend there was for brains to increase in mass over evolutionary time. We present the first systematic phylogenetic analysis designed to answer this question. Results: We performed ancestral state reconstructions of three traits (absolute brain mass, absolute body mass, relative brain mass) using 37 extant and 23 extinct primate species and three approaches to ancestral state reconstruction: parsimony, maximum likelihood and Bayesian Markov-chain Monte Carlo. Both absolute and relative brain mass generally increased over evolutionary time, but body mass did not. Nevertheless both absolute and relative brain mass decreased along several branches. Applying these results to the contentious case of Homo floresiensis, we find a number of scenarios under which the proposed evolution of Homo floresiensis' small brain appears to be consistent with patterns observed along other lineages, dependent on body mass and phylogenetic position. Conclusions: Our results confirm that brain expansion began early in primate evolution and show that increases occurred in all major clades. Only in terms of an increase in absolute mass does the human lineage appear particularly striking, with both the rate of proportional change in mass and relative brain size having episodes of greater expansion elsewhere on the primate phylogeny. However, decreases in brain mass also occurred along branches in all major clades, and we conclude that, while selection has acted to enlarge primate brains, in some lineages this trend has been reversed. Further analyses of the phylogenetic position of Homo floresiensis and better body mass estimates are required to confirm the plausibility of the evolution of its small brain

  17. The conquering of North America: dated phylogenetic and biogeographic inference of migratory behavior in bee hummingbirds.

    Science.gov (United States)

    Licona-Vera, Yuyini; Ornelas, Juan Francisco

    2017-06-05

    Geographical and temporal patterns of diversification in bee hummingbirds (Mellisugini) were assessed with respect to the evolution of migration, critical for colonization of North America. We generated a dated multilocus phylogeny of the Mellisugini based on a dense sampling using Bayesian inference, maximum-likelihood and maximum parsimony methods, and reconstructed the ancestral states of distributional areas in a Bayesian framework and migratory behavior using maximum parsimony, maximum-likelihood and re-rooting methods. All phylogenetic analyses confirmed monophyly of the Mellisugini and the inclusion of Atthis, Calothorax, Doricha, Eulidia, Mellisuga, Microstilbon, Myrmia, Tilmatura, and Thaumastura. Mellisugini consists of two clades: (1) South American species (including Tilmatura dupontii), and (2) species distributed in North and Central America and the Caribbean islands. The second clade consists of four subclades: Mexican (Calothorax, Doricha) and Caribbean (Archilochus, Calliphlox, Mellisuga) sheartails, Calypte, and Selasphorus (incl. Atthis). Coalescent-based dating places the origin of the Mellisugini in the mid-to-late Miocene, with crown ages of most subclades in the early Pliocene, and subsequent species splits in the Pleistocene. Bee hummingbirds reached western North America by the end of the Miocene and the ancestral mellisuginid (bee hummingbirds) was reconstructed as sedentary, with four independent gains of migratory behavior during the evolution of the Mellisugini. Early colonization of North America and subsequent evolution of migration best explained biogeographic and diversification patterns within the Mellisugini. The repeated evolution of long-distance migration by different lineages was critical for the colonization of North America, contributing to the radiation of bee hummingbirds. Comparative phylogeography is needed to test whether the repeated evolution of migration resulted from northward expansion of southern sedentary

  18. Evidence of shallow positron traps in ion-implanted InP observed by maximum entropy reconstruction of positron lifetime distribution: a test of MELT

    International Nuclear Information System (INIS)

    Chen, Z.Q.; Wang, S.J.

    1999-01-01

    A newly developed maximum entropy method, which was realized by the computer program MELT introduced by Shukla et al., was used to analyze positron lifetime spectra measured in semiconductors. Several simulation studies were done to test the performance of this algorithm. Reliable reconstruction of positron lifetime distributions can be extracted at relatively lower counts, which shows the applicability and superiority of this method. Two positron lifetime spectra measured in ion-implanted p-InP(Zn) at 140 and 280 K, respectively were analyzed by this program. The lifetime distribution differed greatly for the two temperatures, giving direct evidence of the existence of shallow positron traps at low temperature

  19. Molecular phylogeny and larval morphological diversity of the lanternfish genus Hygophum (Teleostei: Myctophidae).

    Science.gov (United States)

    Yamaguchi, M; Miya, M; Okiyama, M; Nishida, M

    2000-04-01

    Larvae of the deep-sea lanternfish genus Hygophum (Myctophidae) exhibit a remarkable morphological diversity that is quite unexpected, considering their homogeneous adult morphology. In an attempt to elucidate the evolutionary patterns of such larval morphological diversity, nucleotide sequences of a portion of the mitochondrially encoded 16S ribosomal RNA gene were determined for seven Hygophum species and three outgroup taxa. Secondary structure-based alignment resulted in a character matrix consisting of 1172 bp of unambiguously aligned sequences, which were subjected to phylogenetic analyses using maximum-parsimony, maximum-likelihood, and neighbor-joining methods. The resultant tree topologies from the three methods were congruent, with most nodes, including that of the genus Hygophum, being strongly supported by various tree statistics. The most parsimonious reconstruction of the three previously recognized, distinct larval morphs onto the molecular phylogeny revealed that one of the morphs had originated as the common ancestor of the genus, the other two having diversified separately in two subsequent major clades. The patterns of such diversification are discussed in terms of the unusual larval eye morphology and geographic distribution. Copyright 2000 Academic Press.

  20. Quantifying the Impact of Immediate Reconstruction in Postmastectomy Radiation: A Large, Dose-Volume Histogram-Based Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ohri, Nisha [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Cordeiro, Peter G. [Department of Plastic Surgery, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Keam, Jennifer [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Ballangrud, Ase [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Shi Weiji; Zhang Zhigang [Department of Biostatistics and Epidemiology, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Nerbun, Claire T.; Woch, Katherine M.; Stein, Nicholas F.; Zhou Ying [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); McCormick, Beryl; Powell, Simon N. [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Ho, Alice Y., E-mail: HoA1234@mskcc.org [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, New York (United States)

    2012-10-01

    Purpose: To assess the impact of immediate breast reconstruction on postmastectomy radiation (PMRT) using dose-volume histogram (DVH) data. Methods and Materials: Two hundred forty-seven women underwent PMRT at our center, 196 with implant reconstruction and 51 without reconstruction. Patients with reconstruction were treated with tangential photons, and patients without reconstruction were treated with en-face electron fields and customized bolus. Twenty percent of patients received internal mammary node (IMN) treatment. The DVH data were compared between groups. Ipsilateral lung parameters included V20 (% volume receiving 20 Gy), V40 (% volume receiving 40 Gy), mean dose, and maximum dose. Heart parameters included V25 (% volume receiving 25 Gy), mean dose, and maximum dose. IMN coverage was assessed when applicable. Chest wall coverage was assessed in patients with reconstruction. Propensity-matched analysis adjusted for potential confounders of laterality and IMN treatment. Results: Reconstruction was associated with lower lung V20, mean dose, and maximum dose compared with no reconstruction (all P<.0001). These associations persisted on propensity-matched analysis (all P<.0001). Heart doses were similar between groups (P=NS). Ninety percent of patients with reconstruction had excellent chest wall coverage (D95 >98%). IMN coverage was superior in patients with reconstruction (D95 >92.0 vs 75.7%, P<.001). IMN treatment significantly increased lung and heart parameters in patients with reconstruction (all P<.05) but minimally affected those without reconstruction (all P>.05). Among IMN-treated patients, only lower lung V20 in those without reconstruction persisted (P=.022), and mean and maximum heart doses were higher than in patients without reconstruction (P=.006, P=.015, respectively). Conclusions: Implant reconstruction does not compromise the technical quality of PMRT when the IMNs are untreated. Treatment technique, not reconstruction, is the primary

  1. Parsimonious Ways to Use Vision for Navigation

    Directory of Open Access Journals (Sweden)

    Paul Graham

    2012-05-01

    The use of visual information for navigation appears to be a universal strategy for sighted animals, amongst which one particular group of expert navigators is the ants. The broad interest in studies of ant navigation is in part due to their small brains: biomimetic engineers expect to be impressed by elegant control solutions, and psychologists might hope for a description of the minimal cognitive requirements for complex spatial behaviours. In this spirit, we have been taking an interdisciplinary approach to the visually guided navigation of ants in their natural habitat. Behavioural experiments and natural image statistics show that visual navigation need not depend on the remembering or recognition of objects. Further modelling work suggests how simple behavioural routines might enable navigation using familiarity detection rather than explicit recall, and we present a proof of concept that visual navigation using familiarity can be achieved without specifying when or what to learn, nor separating routes into sequences of waypoints. We suggest that our current model represents the only detailed and complete model of insect route guidance to date. What's more, we believe the suggested mechanisms represent useful parsimonious hypotheses for visually guided navigation in larger-brained animals.

  2. A clinical perspective of accelerated statistical reconstruction

    International Nuclear Information System (INIS)

    Hutton, B.F.; Hudson, H.M.; Beekman, F.J.

    1997-01-01

    Although the potential benefits of maximum likelihood reconstruction have been recognised for many years, the technique has only recently found widespread popularity in clinical practice. Factors which have contributed to the wider acceptance include improved models for the emission process, better understanding of the properties of the algorithm and, not least, the practicality of application with the development of acceleration schemes and the improved speed of computers. The objective in this article is to present a framework for applying maximum likelihood reconstruction for a wide range of clinically based problems. The article draws particularly on the experience of the three authors in applying an acceleration scheme involving use of ordered subsets to a range of applications. The potential advantages of statistical reconstruction techniques include: (a) the ability to better model the emission and detection process, in order to make the reconstruction converge to a quantitative image, (b) the inclusion of a statistical noise model which results in better noise characteristics, and (c) the possibility to incorporate prior knowledge about the distribution being imaged. The great flexibility in adapting the reconstruction for a specific model results in these techniques having wide applicability to problems in clinical nuclear medicine. (orig.). With 8 figs., 1 tab
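
    The ordered-subsets acceleration referred to above replaces each full EM pass with a cycle of sub-iterations, each applying the multiplicative update with only one subset of the projections. A minimal numpy sketch, with a generic system matrix A and data y standing in for a real scanner model:

        # Ordered-subsets EM (OSEM) sketch; subsets are taken as contiguous
        # row blocks here purely for brevity.
        import numpy as np

        def osem(A, y, n_subsets=4, n_passes=5, eps=1e-12):
            x = np.ones(A.shape[1])
            subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
            for _ in range(n_passes):
                for idx in subsets:                # one sub-iteration each
                    As, ys = A[idx], y[idx]
                    proj = np.maximum(As @ x, eps)
                    x *= (As.T @ (ys / proj)) / np.maximum(As.sum(axis=0), eps)
            return x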

  3. The scenario on the origin of translation in the RNA world: in principle of replication parsimony

    Directory of Open Access Journals (Sweden)

    Ma Wentao

    2010-11-01

    Full Text Available Abstract Background It is now believed that in the origin of life, proteins must have been "invented" in an RNA world. However, due to the complexity of a possible RNA-based proto-translation system, this evolutionary process seems quite complicated and the associated scenario remains very blurry. Considering that RNA can bind amino acids with specificity, it has been reasonably supposed that initial peptides might have been synthesized on "RNA templates" containing multiple amino acid binding sites. This "Direct RNA Template (DRT)" mechanism is attractive because it should be the simplest mechanism for RNA to synthesize peptides, and thus very likely to have been adopted initially in the RNA world. How this mechanism could then develop into a proto-translation system is an interesting problem. Presentation of the hypothesis Here an explanation of this problem is presented, based on the principle of "replication parsimony": genetic information tends to be utilized in a parsimonious way under selection pressure, due to its replication cost (in the RNA world, e.g., nucleotides and ribozymes for RNA replication). Because a DRT would be quite long even for a short peptide, its replication cost would be great. Thus the diversity and the length of functional peptides synthesized by the DRT mechanism would be seriously limited. Adaptors (proto-tRNAs) would arise to allow a DRT's complementary strand (called "C-DRT" here) to direct the synthesis of the same peptide synthesized by the DRT itself. Because the C-DRT is a necessary part of the DRT's replication, fewer rounds of the DRT's replication would be needed to synthesize a given number of copies of the functional peptide, thus saving replication cost. Acting through adaptors, C-DRTs could transform into much shorter templates (called "proto-mRNAs" here) and take over the role of DRTs, thus significantly saving replication cost. A proto-rRNA corresponding to the small subunit rRNA would then emerge.

  4. A new mathematical modeling for pure parsimony haplotyping problem.

    Science.gov (United States)

    Feizabadi, R; Bagherian, M; Vaziri, H R; Salahi, M

    2016-11-01

    The pure parsimony haplotyping (PPH) problem is important in bioinformatics because rational haplotype inference plays an important role in the analysis of genetic data and in mapping complex genetic diseases such as Alzheimer's disease and heart disorders. Haplotypes and genotypes are m-length sequences. Although several integer programming models have already been presented for the PPH problem, its NP-hardness made those models ineffective on real instances, especially instances with many heterozygous sites. In this paper, we assign a corresponding number to each haplotype and genotype and, based on those numbers, set up a mixed integer programming model. Using numbers instead of sequences leads to a model that is less complex than previous ones, in that it contains neither constraints nor variables corresponding to heterozygous nucleotide sites. Experimental results confirm the efficiency of the new model in producing better solutions than two state-of-the-art haplotyping approaches.
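
    To make the optimization target concrete, the following exhaustive Python sketch solves tiny pure-parsimony instances directly on 0/1/2-coded genotypes (2 marking a heterozygous site); it runs in exponential time and is an illustration of the objective only, not the authors' mixed integer programming model.

    ```python
    from itertools import combinations, product

    def expansions(g):
        """All haplotype pairs consistent with genotype g (2 = heterozygous)."""
        het = [i for i, s in enumerate(g) if s == 2]
        for bits in product((0, 1), repeat=len(het)):
            h1, h2 = list(g), list(g)
            for i, b in zip(het, bits):
                h1[i], h2[i] = b, 1 - b
            yield tuple(h1), tuple(h2)

    def pure_parsimony(genotypes):
        """Smallest haplotype set explaining all genotypes (exhaustive search)."""
        candidates = {h for g in genotypes for pair in expansions(g) for h in pair}
        for k in range(1, len(candidates) + 1):
            for H in combinations(candidates, k):
                Hs = set(H)
                if all(any(h1 in Hs and h2 in Hs for h1, h2 in expansions(g))
                       for g in genotypes):
                    return Hs

    # example: two genotypes over three sites share haplotypes in the optimum
    print(pure_parsimony([(2, 0, 0), (2, 2, 0)]))
    ```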

  5. The Red Sea during the Last Glacial Maximum: implications for sea level reconstructions

    Science.gov (United States)

    Gildor, H.; Biton, E.; Peltier, W. R.

    2006-12-01

    The Red Sea (RS) is a semi-enclosed basin connected to the Indian Ocean via a narrow and shallow strait; it is surrounded by arid areas and exhibits high sensitivity to atmospheric changes and sea level reduction. We used the MIT GCM to investigate the changes in the hydrography and circulation of the RS in response to reduced sea level, variability in the Indian monsoons, and the changes in atmospheric temperature and humidity that occurred during the Last Glacial Maximum (LGM). The model results show high sensitivity to sea level reduction, especially in the salinity field (which increases with the reduction in sea level), together with a mild atmospheric impact. Sea level reduction decreases the stratification, increases subsurface temperatures, and alters the circulation pattern at the Strait of Bab el Mandeb, which experiences a transition from submaximal to maximal flow. The reduction in sea level at the LGM also shifts the location of deep water formation to an open-sea convective site in the northern part of the RS, in contrast to the present-day situation in which deep water is formed from the Gulf of Suez outflow. Our main result, based both on the GCM and on a simple hydraulic control model that takes mixing processes at the Strait of Bab el Mandeb into account, is that sea level was reduced by only ~100 m in the Bab el Mandeb region during the LGM, i.e., the water depth at the Hanish sill (the shallowest part of the Strait of Bab el Mandeb) was around 34 m. This result agrees with the recent reconstruction of the LGM low stand of the sea in this region based upon the ICE-5G (VM2) model of Peltier (2004).

  6. Phylogenetic Reconstruction, Morphological Diversification and Generic Delimitation of Disepalum (Annonaceae).

    Directory of Open Access Journals (Sweden)

    Pui-Sze Li

    Full Text Available Taxonomic delimitation of Disepalum (Annonaceae) is contentious, with some researchers favoring a narrow circumscription following segregation of the genus Enicosanthellum. We reconstruct the phylogeny of Disepalum and related taxa based on four chloroplast and two nuclear DNA regions as a framework for clarifying taxonomic delimitation and assessing evolutionary transitions in key morphological characters. Maximum parsimony, maximum likelihood and Bayesian methods resulted in a consistent, well-resolved and strongly supported topology. Disepalum s.l. is monophyletic and strongly supported, with Disepalum s.str. and Enicosanthellum retrieved as sister groups. Although this topology is consistent with both taxonomic delimitations, the distribution of morphological synapomorphies provides greater support for the inclusion of Enicosanthellum within Disepalum s.l. We propose a novel infrageneric classification with two subgenera. Subgen. Disepalum (= Disepalum s.str.) is supported by numerous synapomorphies, including the reduction of the calyx to two sepals and connation of petals. Subgen. Enicosanthellum lacks obvious morphological synapomorphies, but possesses several diagnostic characters (symplesiomorphies), including a trimerous calyx and free petals in two whorls. We evaluate changes in petal morphology in relation to hypotheses of the genetic control of floral development and suggest that the compression of two petal whorls into one and the associated fusion of contiguous petals may be associated with the loss of the pollination chamber, which in turn may be associated with a shift in primary pollinator. We also suggest that the formation of pollen octads may be selectively advantageous when pollinator visits are infrequent, although this would only be applicable if multiple ovules could be fertilized by each octad; since the flowers are apocarpous, this would require an extragynoecial compitum to enable intercarpellary growth of pollen tubes.

  7. Mosasauroid phylogeny under multiple phylogenetic methods provides new insights on the evolution of aquatic adaptations in the group.

    Directory of Open Access Journals (Sweden)

    Tiago R Simões

    Full Text Available Mosasauroids were a successful lineage of squamate reptiles (lizards and snakes) that radiated during the Late Cretaceous (95-66 million years ago). They can be considered one of the few lineages in the evolutionary history of tetrapods to have acquired a fully aquatic lifestyle, similarly to whales, ichthyosaurs and plesiosaurs. Despite a long history of research on this group, their phylogenetic relationships have so far been tested only using traditional (unweighted) maximum parsimony. However, hypotheses of mosasauroid relationships and the recently proposed multiple origins of aquatically adapted pelvic and pedal features in this group can be more thoroughly tested by methods that take into account variation in branch lengths and evolutionary rates. In this study, we present the first mosasauroid phylogenetic analysis performed under different analytical methods, including maximum likelihood, Bayesian inference, and implied weighting maximum parsimony. The results indicate a lack of congruence in the topological position of halisaurines and Dallasaurus. Additionally, the genus Prognathodon is paraphyletic under all hypotheses. Interestingly, a number of traditional mosasauroid clades become weakly supported, or unresolved, under Bayesian analyses. The reduced resolution in some consensus trees creates ambiguities concerning the evolution of fully aquatic pelvic/pedal conditions under many analyses. However, when enough resolution was obtained, reversals of the pelvic/pedal conditions were favoured by parsimony and likelihood ancestral state reconstructions instead of independent origins of aquatic features in mosasauroids. It is concluded that most of the observed discrepancies among the results can be associated with different analytical procedures, but also with limited postcranial data on halisaurines, yaguarasaurines and Dallasaurus.

  8. Mosasauroid phylogeny under multiple phylogenetic methods provides new insights on the evolution of aquatic adaptations in the group

    Science.gov (United States)

    Vernygora, Oksana; Paparella, Ilaria; Jimenez-Huidobro, Paulina; Caldwell, Michael W.

    2017-01-01

    Mosasauroids were a successful lineage of squamate reptiles (lizards and snakes) that radiated during the Late Cretaceous (95–66 million years ago). They can be considered one of the few lineages in the evolutionary history of tetrapods to have acquired a fully aquatic lifestyle, similarly to whales, ichthyosaurs and plesiosaurs. Despite a long history of research on this group, their phylogenetic relationships have so far been tested only using traditional (unweighted) maximum parsimony. However, hypotheses of mosasauroid relationships and the recently proposed multiple origins of aquatically adapted pelvic and pedal features in this group can be more thoroughly tested by methods that take into account variation in branch lengths and evolutionary rates. In this study, we present the first mosasauroid phylogenetic analysis performed under different analytical methods, including maximum likelihood, Bayesian inference, and implied weighting maximum parsimony. The results indicate a lack of congruence in the topological position of halisaurines and Dallasaurus. Additionally, the genus Prognathodon is paraphyletic under all hypotheses. Interestingly, a number of traditional mosasauroid clades become weakly supported, or unresolved, under Bayesian analyses. The reduced resolution in some consensus trees creates ambiguities concerning the evolution of fully aquatic pelvic/pedal conditions under many analyses. However, when enough resolution was obtained, reversals of the pelvic/pedal conditions were favoured by parsimony and likelihood ancestral state reconstructions instead of independent origins of aquatic features in mosasauroids. It is concluded that most of the observed discrepancies among the results can be associated with different analytical procedures, but also with limited postcranial data on halisaurines, yaguarasaurines and Dallasaurus. PMID:28467456

  9. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    Science.gov (United States)

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase-sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/.

  10. Hide and vanish: data sets where the most parsimonious tree is known but hard to find, and their implications for tree search methods.

    Science.gov (United States)

    Goloboff, Pablo A

    2014-10-01

    Three different types of data sets, for which the uniquely most parsimonious tree can be known exactly but is hard to find with heuristic tree search methods, are studied. Tree searches are complicated more by the shape of the tree landscape (i.e. the distribution of homoplasy on different trees) than by the sheer abundance of homoplasy or character conflict. Data sets of Type 1 are those constructed by Radel et al. (2013). Data sets of Type 2 present a very rugged landscape, with narrow peaks and valleys, but relatively low amounts of homoplasy. For such a tree landscape, subjecting the trees to TBR and saving suboptimal trees produces much better results when the sequence of clipping for the tree branches is randomized instead of fixed. An unexpected finding for data sets of Types 1 and 2 is that starting a search from a random tree instead of a random addition sequence Wagner tree may increase the probability that the search finds the most parsimonious tree; a small artificial example where these probabilities can be calculated exactly is presented. Data sets of Type 3, the most difficult data sets studied here, comprise only congruent characters, and a single island with only one most parsimonious tree. Even if there is a single island, missing entries create a very flat landscape which is difficult to traverse with tree search algorithms because the number of equally parsimonious trees that need to be saved and swapped to effectively move around the plateaus is too large. Minor modifications of the parameters of tree drifting, ratchet, and sectorial searches allow travelling around these plateaus much more efficiently than saving and swapping large numbers of equally parsimonious trees with TBR. For these data sets, two new related criteria for selecting taxon addition sequences in Wagner trees (the "selected" and "informative" addition sequences) produce much better results than the standard random or closest addition sequences. These new methods for Wagner

  11. SEAPODYM-LTL: a parsimonious zooplankton dynamic biomass model

    Science.gov (United States)

    Conchon, Anna; Lehodey, Patrick; Gehlen, Marion; Titaud, Olivier; Senina, Inna; Séférian, Roland

    2017-04-01

    Mesozooplankton organisms are of critical importance for understanding the early life history of most fish stocks, as well as nutrient cycles in the ocean. Ongoing climate change and the need for improved approaches to the management of living marine resources have driven recent advances in zooplankton modelling. The classical modelling approach tends to describe the whole biogeochemical and plankton cycle with increasing complexity. We propose here a different and parsimonious zooplankton dynamic biomass model (SEAPODYM-LTL) that is cost-efficient and can be advantageously coupled with primary production estimated either from satellite-derived ocean color data or from biogeochemical models. In addition, the adjoint code of the model has been developed, allowing a robust optimization approach for estimating the few parameters of the model. In this study, we run the first optimization experiments using a global database of climatological zooplankton biomass data, and we make a comparative analysis to assess the importance of resolution and primary production inputs on the model's fit to observations. We also compare SEAPODYM-LTL outputs to those produced by a more complex biogeochemical model (PISCES) sharing the same physical forcings.

  12. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in fast and accurate estimation of low-dimensional stochastic models when the observed time series is short and the noise process is (perhaps) non-Gaussian. The inference procedure is demonstrated for polynomial maps of arbitrary degree, with a Geometric Stick Breaking mixture process prior over the space of densities applied to the additive errors. Our method is parsimonious compared with Bayesian nonparametric techniques based on Dirichlet process mixtures, as well as flexible and general. Simulations based on synthetic time series are presented.
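
    As a toy illustration of reconstructing a polynomial map from a short time series, here is a random-walk Metropolis sketch that assumes Gaussian additive errors with a known scale; the paper's actual contribution, a Geometric Stick Breaking mixture prior handling non-Gaussian errors, is not reproduced here, and the step sizes and iteration counts are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # synthetic data: noisy logistic map x_{t+1} = a x_t (1 - x_t) + e_t
    a_true, n, sigma = 3.8, 300, 0.005
    x = np.empty(n); x[0] = 0.4
    for t in range(n - 1):
        x[t + 1] = a_true * x[t] * (1 - x[t]) + rng.normal(0, sigma)

    def log_post(theta):
        """Flat-prior log posterior for x_{t+1} = c0 + c1 x_t + c2 x_t^2."""
        pred = theta[0] + theta[1] * x[:-1] + theta[2] * x[:-1] ** 2
        return -0.5 * np.sum((x[1:] - pred) ** 2) / sigma ** 2

    # start at the least-squares fit, then explore with random-walk Metropolis
    X = np.column_stack([np.ones(n - 1), x[:-1], x[:-1] ** 2])
    theta = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    samples = []
    for _ in range(5000):
        prop = theta + rng.normal(0, 2e-3, 3)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)
    post_mean = np.mean(samples[1000:], axis=0)  # close to (0, 3.8, -3.8)
    ```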

  13. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems.

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in fast and accurate estimation of low-dimensional stochastic models when the observed time series is short and the noise process is (perhaps) non-Gaussian. The inference procedure is demonstrated for polynomial maps of arbitrary degree, with a Geometric Stick Breaking mixture process prior over the space of densities applied to the additive errors. Our method is parsimonious compared with Bayesian nonparametric techniques based on Dirichlet process mixtures, as well as flexible and general. Simulations based on synthetic time series are presented.

  14. Data driven discrete-time parsimonious identification of a nonlinear state-space model for a weakly nonlinear system with short data record

    Science.gov (United States)

    Relan, Rishi; Tiels, Koen; Marconato, Anna; Dreesen, Philippe; Schoukens, Johan

    2018-05-01

    Many real-world systems exhibit quasi-linear or weakly nonlinear behavior during normal operation, and a hard saturation effect for high peaks of the input signal. In this paper, a methodology is proposed to identify a parsimonious discrete-time nonlinear state-space (NLSS) model of such a nonlinear dynamical system from a relatively short data record. The capability of the NLSS model structure is demonstrated by introducing two different initialisation schemes, one of them using multivariate polynomials. In addition, a method using first-order information of the multivariate polynomials and tensor decomposition is employed to obtain a parsimonious decoupled representation of the set of multivariate real polynomials estimated during the identification of the NLSS model. Finally, the model structure is verified experimentally on the cascaded water tanks benchmark identification problem.

  15. Three Dimensional Sheaf of Ultrasound Planes Reconstruction (SOUPR) of Ablated Volumes

    Science.gov (United States)

    Ingle, Atul; Varghese, Tomy

    2014-01-01

    This paper presents an algorithm for three-dimensional reconstruction of tumor ablations using ultrasound shear wave imaging with electrode vibration elastography. Radiofrequency ultrasound data frames are acquired over imaging planes that form a subset of a sheaf of planes sharing a common axis of intersection. Shear wave velocity is estimated separately on each imaging plane using a piecewise-linear function fitting technique with a fast optimization routine. An interpolation algorithm then computes velocity maps on a fine grid over a set of C-planes that are perpendicular to the axis of the sheaf. A full three-dimensional rendering of the ablation can then be created from this stack of C-planes; hence the name "Sheaf Of Ultrasound Planes Reconstruction" (SOUPR). The algorithm is evaluated through numerical simulations and using data acquired from a tissue-mimicking phantom. Reconstruction quality is gauged using contrast and contrast-to-noise ratio measurements, and the change in quality with an increasing number of planes in the sheaf is quantified. The highest contrast, 5 dB, is seen between the stiffest and softest regions of the phantom. Under certain idealizing assumptions about the true shape of the ablation, good reconstruction quality at a fast processing rate can be obtained with as few as 6 imaging planes, suggesting that the method is suited to parsimonious data acquisitions with very few, sparsely chosen imaging planes. PMID:24808405

  16. Quantitative tomography simulations and reconstruction algorithms

    International Nuclear Information System (INIS)

    Martz, H.E.; Aufderheide, M.B.; Goodman, D.; Schach von Wittenau, A.; Logan, C.; Hall, J.; Jackson, J.; Slone, D.

    2000-01-01

    X-ray, neutron and proton transmission radiography and computed tomography (CT) are important diagnostic tools that are at the heart of LLNL's effort to meet the goals of the DOE's Advanced Radiography Campaign. This campaign seeks to improve radiographic simulation and analysis so that radiography can be a useful quantitative diagnostic tool for stockpile stewardship. Current radiographic accuracy does not allow satisfactory separation of experimental effects from the true features of an object's tomographically reconstructed image, which can lead to difficult and sometimes incorrect interpretation of the results. By improving our ability to simulate the whole radiographic and CT system, it will be possible to examine the contribution of system components to various experimental effects, with the goal of removing or reducing them. In this project, we are merging this simulation capability with a maximum-likelihood reconstruction technique based on a constrained conjugate gradient (CCG) search, yielding a physics-based, forward-model image-reconstruction code. In addition, we seek to improve the accuracy of computed tomography from transmission radiographs by studying what physics is needed in the forward model. During FY 2000, an improved version of the LLNL ray-tracing code called HADES was coupled with the recently developed LLNL CT algorithm known as CCG. The problem of image reconstruction is expressed as a large matrix equation relating a model for the object being reconstructed to its projections (radiographs). Using the constrained-conjugate-gradient search algorithm, a maximum likelihood solution is sought. This search continues until the difference between the measured radiographs (projections) and the simulated (calculated) projections is satisfactorily small.

  17. Constructing valid density matrices on an NMR quantum information processor via maximum likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Harpreet; Arvind; Dorai, Kavita, E-mail: kavita@iisermohali.ac.in

    2016-09-07

    Estimation of quantum states is an important step in any quantum information processing experiment. A naive reconstruction of the density matrix from experimental measurements can often give density matrices that are not positive, and hence not physically acceptable. How do we ensure that at all stages of reconstruction we keep the density matrix positive? Recently a method has been suggested based on maximum likelihood estimation, wherein the density matrix is guaranteed to be positive definite. We experimentally implement this protocol on an NMR quantum information processor. We discuss several examples and compare with the standard method of state estimation. Highlights: (1) State estimation using the maximum likelihood method was performed on an NMR quantum information processor. (2) Physically valid density matrices were obtained every time, in contrast to standard quantum state tomography. (3) Density matrices of several different entangled and separable states were reconstructed for two and three qubits.
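
    A minimal single-qubit sketch of the general idea, assuming hypothetical measured +1-outcome frequencies for the three Pauli observables: a Cholesky-style parameterization keeps the density matrix positive by construction while a standard optimizer maximizes the likelihood. This illustrates the approach, not the authors' NMR protocol.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # single-qubit Pauli operators
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.diag([1.0, -1.0]).astype(complex)

    def rho_from_params(t):
        """Cholesky-like parameterization guaranteeing a positive density matrix."""
        T = np.array([[t[0], 0], [t[2] + 1j * t[3], t[1]]])
        R = T.conj().T @ T
        return R / np.trace(R).real

    def neg_log_likelihood(t, ops, freqs):
        rho = rho_from_params(t)
        nll = 0.0
        for op, f in zip(ops, freqs):
            # probability of the +1 outcome of each Pauli measurement
            p = 0.5 * (1 + np.trace(rho @ op).real)
            p = min(max(p, 1e-9), 1 - 1e-9)
            nll -= f * np.log(p) + (1 - f) * np.log(1 - p)
        return nll

    # hypothetical measured +1 frequencies for X, Y, Z
    ops, freqs = [X, Y, Z], [0.71, 0.52, 0.68]
    res = minimize(neg_log_likelihood, x0=[1, 1, 0, 0], args=(ops, freqs))
    rho_hat = rho_from_params(res.x)  # positive by construction
    ```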

  18. Method for positron emission mammography image reconstruction

    Science.gov (United States)

    Smith, Mark Frederick

    2004-10-12

    An image reconstruction method comprising accepting coincidence data from either a data file or in real time from a pair of detector heads, culling event data that are outside a desired energy range, optionally saving the desired data for each detector position or for each pair of detector pixels on the two detector heads, and then reconstructing the image either by backprojection image reconstruction or by iterative image reconstruction. In the backprojection image reconstruction mode, rays are traced between centers of lines of response (LORs), counts are either allocated by nearest-pixel interpolation or allocated by an overlap method and then corrected for geometric effects and attenuation, and the data file is updated. If the iterative image reconstruction option is selected, one implementation is to compute a grid-based Siddon ray tracing and to perform maximum likelihood expectation maximization (MLEM), computed by either: a) tracing parallel rays between subpixels on opposite detector heads; or b) tracing rays between randomized endpoint locations on opposite detector heads.
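
    A minimal numpy sketch of the nearest-pixel allocation step in the backprojection mode: each line of response (LOR) deposits its count along the ray joining its endpoints. Endpoint coordinates in pixel units, uniform ray sampling, and the absence of geometric and attenuation corrections are simplifying assumptions, not the patent's specification.

    ```python
    import numpy as np

    def backproject_lors(lors, shape, n_samples=200):
        """Nearest-pixel backprojection: spread each coincidence count
        along the ray between its two detector endpoints."""
        img = np.zeros(shape)
        for (x0, y0), (x1, y1) in lors:
            t = np.linspace(0.0, 1.0, n_samples)
            xs = np.rint(x0 + t * (x1 - x0)).astype(int)
            ys = np.rint(y0 + t * (y1 - y0)).astype(int)
            keep = (xs >= 0) & (xs < shape[0]) & (ys >= 0) & (ys < shape[1])
            img[xs[keep], ys[keep]] += 1.0 / n_samples
        return img

    # example: two crossing LORs on a 64x64 grid
    image = backproject_lors([((0, 0), (63, 63)), ((0, 63), (63, 0))], (64, 64))
    ```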

  19. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time-consuming. Building on recent progress in the theoretical analysis of the image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations, and the comparisons show good agreement between the theoretical predictions and the Monte Carlo results.

  20. Pholcid spider molecular systematics revisited, with new insights into the biogeography and the evolution of the group

    DEFF Research Database (Denmark)

    Dimitrov, Dimitar Stefanov; Astrin, Jonas J.; Huber, Bernhard A.

    2013-01-01

    We analysed seven genetic markers sampled from 165 pholcids and 34 outgroups in order to test and improve the recently revised classification of the family. Our results are based on the largest and most comprehensive set of molecular data so far to study pholcid relationships. The data were analysed using parsimony, maximum-likelihood and Bayesian methods for phylogenetic reconstruction. We show that in several previously problematic cases molecular and morphological data are converging towards a single hypothesis. This is also the first study that explicitly addresses the age of pholcid ...

  1. A Practical pedestrian approach to parsimonious regression with inaccurate inputs

    Directory of Open Access Journals (Sweden)

    Seppo Karrila

    2014-04-01

    Full Text Available A measurement result often dictates an interval containing the correct value. Interval data are also created by roundoff, truncation, and binning. We focus on such common interval uncertainty in data. Inaccuracy in model inputs is typically ignored in model fitting. We provide a practical approach for regression with inaccurate data: the mathematics is easy, and the linear programming formulations are simple to use, even in a spreadsheet. This self-contained elementary presentation introduces interval linear systems and requires only basic knowledge of algebra. Feature selection is automatic, but can be controlled to find only the few most relevant inputs, and joint feature selection is enabled for multiple modeled outputs. With more features than cases, a novel connection to compressed sensing emerges: robustness against interval errors-in-variables implies model parsimony, and the input inaccuracies determine the regularization term. A small numerical example highlights counterintuitive results and a dramatic difference from total least squares.
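
    A sketch of one plausible LP formulation, under the assumption that each case i constrains the prediction to an interval [lo_i, up_i]: minimizing the L1 norm of the coefficients subject to those interval constraints yields the kind of sparse, parsimonious fit the abstract alludes to.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def interval_l1_regression(X, lo, up):
        """Find sparse weights w with lo_i <= X_i . w <= up_i for every case,
        minimizing ||w||_1 (w is split into positive and negative parts)."""
        m, n = X.shape
        c = np.ones(2 * n)                     # minimize sum(w+) + sum(w-)
        A_ub = np.vstack([np.hstack([X, -X]),  #  X w <= up
                          np.hstack([-X, X])]) # -X w <= -lo
        b_ub = np.concatenate([up, -lo])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * n))
        return res.x[:n] - res.x[n:] if res.success else None
    ```

    Infeasible instances (no single linear model passing through every interval) simply return None here; the paper's formulations handle misfit more gracefully.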

  2. Joint reconstruction of activity and attenuation in Time-of-Flight PET: A Quantitative Analysis.

    Science.gov (United States)

    Rezaei, Ahmadreza; Deroose, Christophe M; Vahle, Thomas; Boada, Fernando; Nuyts, Johan

    2018-03-01

    Joint activity and attenuation reconstruction methods for time-of-flight (TOF) positron emission tomography (PET) data provide an effective solution for attenuation correction when no (or incomplete/inaccurate) information on the attenuation is available. One of the main barriers limiting their use in clinical practice is the lack of validation of these methods on a relatively large patient database. In this contribution, we aim at validating the activity reconstructions of the maximum likelihood activity reconstruction and attenuation registration (MLRR) algorithm on a whole-body patient data set. Furthermore, a partial validation (since the scale problem of the algorithm is avoided for now) of the maximum likelihood activity and attenuation reconstruction (MLAA) algorithm is also provided. We present a quantitative comparison of the joint reconstructions with the current clinical gold standard, maximum likelihood expectation maximization (MLEM) reconstruction with CT-based attenuation correction. Methods: The whole-body TOF-PET emission data of each patient data set are processed as a whole to reconstruct an activity volume covering all the acquired bed positions, which reduces the problem of a scale per bed position in MLAA to a global scale for the entire activity volume. Three reconstruction algorithms are used: MLEM, MLRR and MLAA. A maximum likelihood (ML) scaling of the single scatter simulation (SSS) estimate to the emission data is used for scatter correction. The reconstruction results are then analyzed in different regions of interest. Results: The joint reconstructions of the whole-body patient data set provide better quantification in case of PET and CT misalignments caused by patient and organ motion. Our quantitative analysis shows differences of -4.2% (±2.3%) and -7.5% (±4.6%) between the joint reconstructions of MLRR and MLAA, respectively, and MLEM, averaged over all regions of interest. Conclusion: Joint activity and attenuation

  3. Sequencing of whole plastid genomes and nuclear ribosomal DNA of Diospyros species (Ebenaceae) endemic to New Caledonia: many species, little divergence.

    Science.gov (United States)

    Turner, Barbara; Paun, Ovidiu; Munzinger, Jérôme; Chase, Mark W; Samuel, Rosabelle

    2016-06-01

    Some plant groups, especially on islands, have been shaped by strong ancestral bottlenecks and rapid, recent radiation of phenotypic characters. Single molecular markers are often not informative enough for phylogenetic reconstruction in such plant groups. Whole plastid genomes and nuclear ribosomal DNA (nrDNA) are viewed by many researchers as sources of information for phylogenetic reconstruction of groups in which the expected levels of divergence in standard markers are low. Here we evaluate the usefulness of these data types for resolving phylogenetic relationships among closely related Diospyros species. Twenty-two closely related Diospyros species from New Caledonia were investigated using whole plastid genomes and nrDNA data from low-coverage next-generation sequencing (NGS). Phylogenetic trees were inferred using maximum parsimony, maximum likelihood and Bayesian inference on separate plastid and nrDNA matrices and on a combined matrix. The plastid and nrDNA sequences were, singly and together, unable to provide well-supported phylogenetic relationships among the closely related New Caledonian Diospyros species. The nrDNA showed a six-fold greater percentage of parsimony-informative characters than the plastid DNA, but the total number of informative sites was greater for the much larger plastid genomes. Combining the plastid and nuclear data improved resolution. The plastid results showed a trend towards geographical clustering of accessions rather than grouping by taxonomic species. In plant groups for which multiple plastid markers are not sufficiently informative, investigation at the level of the entire plastid genome may also be insufficient for detailed phylogenetic reconstruction. Sequencing of complete plastid genomes and nrDNA repeats seems to clarify some relationships among the New Caledonian Diospyros species, but the higher percentage of parsimony-informative characters in nrDNA compared with plastid DNA did not help to resolve the phylogenetic tree.

  4. Time-Lapse Monitoring of Subsurface Fluid Flow using Parsimonious Seismic Interferometry

    KAUST Repository

    Hanafy, Sherif

    2017-04-21

    A typical small-scale seismic survey (such as 240 shot gathers) takes at least 16 working hours to complete, which is a major obstacle for time-lapse monitoring experiments. This is especially true if the target that needs to be monitored is changing rapidly. In this work, we discuss how to decrease the recording time from 16 working hours to less than one hour, where the virtual data have the same accuracy as the conventional data. We validate the efficacy of parsimonious seismic interferometry for time-lapse monitoring with field examples, in which we were able to record 30 different data sets within a 2-hour period. The recorded data are then processed to generate 30 snapshots that show the spread of water from the ground surface down to a few meters.

  5. EM for phylogenetic topology reconstruction on nonhomogeneous data.

    Science.gov (United States)

    Ibáñez-Marcelo, Esther; Casanellas, Marta

    2014-06-17

    The reconstruction of the phylogenetic tree topology for four taxa is, still nowadays, one of the main challenges in phylogenetics. The difficulty lies in considering evolutionary models that are not too restrictive while correctly dealing with the long-branch attraction problem. The correct reconstruction of 4-taxon trees is crucial for making quartet-based methods work and for being able to recover large phylogenies. We adapt the well-known expectation-maximization algorithm to evolutionary Markov models on phylogenetic 4-taxon trees, and use this algorithm to estimate the substitution parameters, compute the corresponding likelihood, and infer the most likely quartet. In this paper we consider an expectation-maximization method for maximizing the likelihood of (time-nonhomogeneous) evolutionary Markov models on trees. We study its success in reconstructing 4-taxon topologies and its performance as an input method in quartet-based phylogenetic reconstruction methods such as QFIT and QuartetSuite. Our results show that the method proposed here outperforms neighbor-joining and the usual (time-homogeneous, continuous-time) maximum likelihood methods on 4-leaved trees with among-lineage instantaneous rate heterogeneity, and performs similarly to the usual continuous-time maximum likelihood when the data satisfy the assumptions of both methods. The method presented in this paper is well suited for reconstructing the topology of any number of taxa via quartet-based methods and is highly accurate, especially for largely divergent trees and time-nonhomogeneous data.
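
    For orientation, quartet inference ultimately scores the three possible unrooted 4-taxon topologies and keeps the best. The sketch below does this by likelihood under the homogeneous Jukes-Cantor model with all branch lengths fixed; both are simplifying assumptions relative to the nonhomogeneous models and EM-estimated parameters of the paper.

    ```python
    import numpy as np

    def jc69(t):
        """JC69 transition-probability matrix for branch length t."""
        e = np.exp(-4.0 * t / 3.0)
        return np.full((4, 4), 0.25 * (1 - e)) + np.eye(4) * e

    def quartet_loglik(seqs, topology, t=0.1):
        """Log-likelihood of a 4-taxon tree by Felsenstein pruning (JC69,
        all branch lengths fixed at t). topology = ((i, j), (k, l))."""
        P = jc69(t)
        (i, j), (k, l) = topology
        logL = 0.0
        for site in zip(*seqs):
            up_u = P[:, site[i]] * P[:, site[j]]  # partial likelihoods at node u
            up_v = P[:, site[k]] * P[:, site[l]]  # partial likelihoods at node v
            logL += np.log(0.25 * up_u @ P @ up_v)
        return logL

    # toy alignment, states coded 0..3 (A, C, G, T)
    seqs = [[0, 1, 2, 3, 0], [0, 1, 2, 3, 1], [2, 1, 0, 3, 0], [2, 1, 0, 3, 1]]
    topologies = [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]
    best = max(topologies, key=lambda T: quartet_loglik(seqs, T))
    ```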

  6. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1989-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. The author reviews the need for such methods in data analysis and shows, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. He concludes with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution.
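
    A crude numerical sketch of the MaxEnt idea reviewed here: balance a chi-squared misfit against the Skilling entropy relative to a flat default model. Using a fixed trade-off parameter alpha is a simplification; full MaxEnt treatments adjust it so the misfit matches its expected value.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def maxent_reconstruct(R, d, sigma, alpha=0.1, m=1.0):
        """Minimize chi^2/2 - alpha * S(f), with S the Skilling entropy
        S = sum(f - m - f log(f/m)) relative to a flat default model m."""
        n = R.shape[1]

        def objective(f):
            chi2 = 0.5 * np.sum(((R @ f - d) / sigma) ** 2)
            S = np.sum(f - m - f * np.log(np.maximum(f, 1e-12) / m))
            return chi2 - alpha * S

        res = minimize(objective, x0=np.full(n, m),
                       bounds=[(1e-9, None)] * n, method="L-BFGS-B")
        return res.x
    ```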

  7. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1988-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. We review the need for such methods in data analysis and show, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. We conclude with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution.

  8. Composite scores in comparative effectiveness research: counterbalancing parsimony and dimensionality in patient-reported outcomes.

    Science.gov (United States)

    Schwartz, Carolyn E; Patrick, Donald L

    2014-07-01

    When planning a comparative effectiveness study comparing disease-modifying treatments, competing demands influence the choice of outcomes. Current practice emphasizes parsimony, although understanding multidimensional treatment impact can help to personalize medical decision-making. We discuss both sides of this 'tug of war': the assumptions, advantages and drawbacks of composite scores and of multidimensional outcomes. We describe possible solutions to the multiple-comparison problem, including conceptual hierarchy distinctions, statistical approaches, 'real-world' benchmarks of effectiveness, and subgroup analysis. We conclude that comparative effectiveness research should consider multiple outcome dimensions and compare different approaches that fit the individual context of study objectives.

  9. Sparse representation and dictionary learning penalized image reconstruction for positron emission tomography

    International Nuclear Information System (INIS)

    Chen, Shuhang; Liu, Huafeng; Shi, Pengcheng; Chen, Yunmei

    2015-01-01

    Accurate and robust reconstruction of the radioactivity concentration is of great importance in positron emission tomography (PET) imaging. Given the Poisson nature of photon-counting measurements, we present a reconstruction framework that integrates a sparsity penalty on a dictionary into a maximum likelihood estimator. Patch sparsity on a dictionary provides the regularization, and iterative procedures are used to solve the maximum likelihood function formulated on Poisson statistics. Specifically, in our formulation, a dictionary can be trained on CT images to provide intrinsic anatomical structures for the reconstructed images, or adaptively learned from the noisy PET measurements. The accuracy of the strategy is demonstrated, with very promising results, on Monte Carlo simulations and real data.

  10. Pinhole single-photon emission tomography reconstruction based on median root prior

    International Nuclear Information System (INIS)

    Sohlberg, Antti; Kuikka, Jyrki T.; Ruotsalainen, Ulla

    2003-01-01

    The maximum likelihood expectation maximisation (ML-EM) algorithm can be used to reduce reconstruction artefacts produced by filtered backprojection (FBP) methods in pinhole single-photon emission tomography (SPET). However, ML-EM suffers from noise propagation along iterations, which leads to quantitatively poor reconstruction results. To avoid this increase in noise, the median root prior (MRP) algorithm was implemented for pinhole SPET. Projection data of a line source and Picker's thyroid phantom were collected using a single-head gamma camera with a pinhole collimator. MRP was added to the existing pinhole ML-EM reconstruction algorithm, and the phantom studies were reconstructed using MRP, ML-EM and FBP for comparison. Coefficients of variation, contrasts and full-widths at half-maximum were calculated and showed a clear reduction in noise, without significant loss of resolution or decrease in contrast, when MRP was applied. MRP also produced visually pleasing images even with high iteration numbers, free of the checkerboard-type noise patterns that are typical of ML-EM images.
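
    The MRP update is simple enough to sketch in a few lines: after each ML-EM step, the estimate is penalized by its deviation from the median of a local neighbourhood. The dense system matrix and the 3x3 neighbourhood below are illustrative choices, not the authors' pinhole geometry.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    def mlem_mrp(A, y, shape, beta=0.3, n_iter=30):
        """ML-EM with a median root prior: each EM update is pulled toward
        the median of its local neighbourhood (Alenius-style penalty)."""
        x = np.ones(np.prod(shape))
        sens = np.maximum(A.sum(axis=0), 1e-12)
        for _ in range(n_iter):
            proj = np.maximum(A @ x, 1e-12)
            x_em = x * (A.T @ (y / proj)) / sens          # plain ML-EM step
            med = median_filter(x_em.reshape(shape), size=3).ravel()
            x = x_em / (1.0 + beta * (x_em - med) / np.maximum(med, 1e-12))
        return x.reshape(shape)
    ```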

  11. Analysis on the reconstruction accuracy of the Fitch method for inferring ancestral states

    Directory of Open Access Journals (Sweden)

    Grünewald Stefan

    2011-01-01

    Full Text Available Abstract Background As one of the most widely used parsimony methods for ancestral reconstruction, the Fitch method minimizes the total number of hypothetical substitutions along all branches of a tree to explain the evolution of a character. Given the extensive usage of this method, studying its reconstruction accuracy has become a scientific endeavor in recent years. However, most studies are restricted to 2-state evolutionary models, and a study for higher-state models is needed, since DNA sequences are 4-state series and protein sequences even have 20 states. Results In this paper, the ambiguous and unambiguous reconstruction accuracies of the Fitch method are studied for N-state evolutionary models. Given an arbitrary phylogenetic tree, a recurrence system is first presented to calculate the two accuracies iteratively. As the complete binary tree and the comb-shaped tree are the two extremal evolutionary tree topologies with respect to balance, we focus on the reconstruction accuracies on these two topologies and analyze their asymptotic properties. Then, 1000 Yule trees with 1024 leaves are generated and analyzed to simulate real evolutionary scenarios. It is known that more taxa do not necessarily increase the reconstruction accuracies under 2-state models; this result is also tested under N-state models. Conclusions In a large tree with many leaves, the reconstruction accuracies of using all taxa are sometimes less than those of using a leaf subset under N-state models. For complete binary trees, there always exists an equilibrium interval [a, b] of conservation probability, in which the limiting ambiguous reconstruction accuracy equals the probability of randomly picking a state. The value b decreases as the number of states increases, and it seems to converge. When the conservation probability is greater than b, the reconstruction accuracies of the Fitch method increase rapidly. The reconstruction
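
    For reference, the bottom-up pass of the Fitch method is only a few lines of Python: each internal node takes the intersection of its children's state sets when it is nonempty (no substitution) and the union otherwise (one substitution). The nested-tuple tree encoding is an illustrative choice.

    ```python
    def fitch(tree, leaf_states):
        """Fitch small parsimony: return (state set at root, substitution count).
        tree: nested tuples of leaf names, e.g. (("a", "b"), ("c", "d"))."""
        if isinstance(tree, str):                 # leaf
            return {leaf_states[tree]}, 0
        (ls, lc), (rs, rc) = (fitch(t, leaf_states) for t in tree)
        inter = ls & rs
        if inter:                                 # agreement: no extra change
            return inter, lc + rc
        return ls | rs, lc + rc + 1               # union: one substitution

    # 4-state (DNA) example on a balanced 4-leaf tree
    states = {"a": "A", "b": "C", "c": "A", "d": "A"}
    root_set, changes = fitch((("a", "b"), ("c", "d")), states)  # changes == 1
    ```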

  12. Hyainailourine and teratodontine cranial material from the late Eocene of Egypt and the application of parsimony and Bayesian methods to the phylogeny and biogeography of Hyaenodonta (Placentalia, Mammalia).

    Science.gov (United States)

    Borths, Matthew R; Holroyd, Patricia A; Seiffert, Erik R

    2016-01-01

    recovered from each phylogenetic method, we reconstructed the biogeographic history of Hyaenodonta using parsimony optimization (PO), likelihood optimization (LO), and Bayesian binary Markov chain Monte Carlo (MCMC) to examine support for the Afro-Arabian origin of Hyaenodonta. Across all analyses, we found that Hyaenodonta most likely originated in Europe rather than Afro-Arabia. The clade is estimated by tip-dating analysis to have undergone a rapid radiation in the Late Cretaceous and Paleocene, a radiation currently not documented by fossil evidence. During the Paleocene, lineages are reconstructed as dispersing to Asia, Afro-Arabia, and North America. The place of origin of Hyainailouroidea is likely Afro-Arabia according to the Bayesian topologies, but it is ambiguous using parsimony. All topologies support the constituent clades (Hyainailourinae, Apterodontinae, and Teratodontinae) as Afro-Arabian, and tip dating estimates that each clade was established in Afro-Arabia by the middle Eocene.

  13. Molecular phylogenetic trees - On the validity of the Goodman-Moore augmentation algorithm

    Science.gov (United States)

    Holmquist, R.

    1979-01-01

    A response is made to the reply of Nei and Tateno (1979) to the letter of Holmquist (1978) supporting the validity of the augmentation algorithm of Moore (1977) in reconstructions of nucleotide substitutions by means of the maximum parsimony principle. It is argued that the overestimation of the augmented numbers of nucleotide substitutions (augmented distances) found by Tateno and Nei (1978) is due to an unrepresentative data sample and that it is only necessary that evolution be stochastically uniform in different regions of the phylogenetic network for the augmentation method to be useful. The importance of the average value of the true distance over all links is explained, and the relative variances of the true and augmented distances are calculated to be almost identical. The effects of topological changes in the phylogenetic tree on the augmented distance and the question of the correctness of ancestral sequences inferred by the method of parsimony are also clarified.

  14. Integrating Leadership Models Toward a More Comprehensive and Parsimonious Model

    Directory of Open Access Journals (Sweden)

    Miswanto Miswanti

    2016-06-01

    Full Text Available ABSTRACT Through the leadership model offered by Locke et al. (1991), we can say that whether a leader's vision in the organization is good depends on whether the leader's motives and traits, knowledge, skills, and abilities are good. In turn, whether the implementation of the vision by the leader is good depends on whether the leader's motives and traits, knowledge, skills, abilities, and vision are good. Strategic Leadership by Davies (1991) states that when the vision is implemented using strategic leadership, the meaning is much more complete than what Locke et al. describe in the fourth stage of leadership. Thus, the vision-implementation aspect in Locke et al. (1991) is incomplete compared with the implementation of the vision according to Davies (1991). With the above considerations in mind, this article attempts to combine the leadership model of Locke et al. with the strategic leadership of Davies. With this modification, an improved leadership model that is more comprehensive and parsimonious is expected.

  15. Maximum likelihood phylogenetic reconstruction from high-resolution whole-genome data and a tree of 68 eukaryotes.

    Science.gov (United States)

    Lin, Yu; Hu, Fei; Tang, Jijun; Moret, Bernard M E

    2013-01-01

    The rapid accumulation of whole-genome data has renewed interest in the study of the evolution of genomic architecture under events such as rearrangements, duplications, and losses. Comparative genomics, evolutionary biology, and cancer research all require tools to elucidate the mechanisms, history, and consequences of those evolutionary events, while phylogenetics could use whole-genome data to enhance its picture of the Tree of Life. Current approaches in the area of phylogenetic analysis are limited to very small collections of closely related genomes using low-resolution data (typically a few hundred syntenic blocks); moreover, these approaches typically do not include duplication and loss events. We describe a maximum likelihood (ML) approach for phylogenetic analysis that takes into account genome rearrangements as well as duplications, insertions, and losses. Our approach can handle high-resolution genomes (with 40,000 or more markers) and can use, in the same analysis, genomes with very different numbers of markers. Because our approach uses a standard ML reconstruction program (RAxML), it scales up to large trees. We present the results of extensive testing on both simulated and real data, showing that our approach returns very accurate results very quickly. In particular, we analyze a dataset of 68 high-resolution eukaryotic genomes, with 3,000 to 42,000 genes, from the eGOB database; the analysis, including bootstrapping, takes just 3 hours on a desktop system and returns a tree in agreement with all well-supported branches, while also suggesting resolutions for some disputed placements.

  16. Edge Artifacts in Point Spread Function-based PET Reconstruction in Relation to Object Size and Reconstruction Parameters

    Directory of Open Access Journals (Sweden)

    Yuji Tsutsui

    2017-06-01

    Full Text Available Objective(s): We evaluated edge artifacts in relation to phantom diameter and reconstruction parameters in point spread function (PSF)-based positron emission tomography (PET) image reconstruction. Methods: PET data were acquired from an original cone-shaped phantom filled with 18F solution (21.9 kBq/mL) for 10 min using a Biograph mCT scanner. The images were reconstructed using the baseline ordered-subsets expectation maximization (OSEM) algorithm and OSEM with a PSF correction model. The reconstruction parameters included a pixel size of 1.0, 2.0, or 3.0 mm; 1-12 iterations; 24 subsets; and a full width at half maximum (FWHM) of the post-filter Gaussian of 1.0, 2.0, or 3.0 mm. We compared both the maximum recovery coefficient (RCmax) and the mean recovery coefficient (RCmean) in the phantom at different diameters. Results: The OSEM images had no edge artifacts, but the OSEM-with-PSF images had a dense edge delineating the hot phantom at diameters of 10 mm or more and a dense spot at the center at diameters of 8 mm or less. The dense edge was clearly observed in images with a small pixel size, a Gaussian filter with a small FWHM, and a high number of iterations. At a phantom diameter of 6-7 mm, the RCmax for the OSEM and OSEM-with-PSF images was 60% and 140%, respectively (pixel size: 1.0 mm; FWHM of the Gaussian filter: 2.0 mm; iterations: 2). The RCmean of the OSEM-with-PSF images did not exceed 100%. Conclusion: PSF-based image reconstruction resulted in edge artifacts, the degree of which depends on the pixel size, number of iterations, FWHM of the Gaussian filter, and object size.

  17. Selected event reconstruction algorithms for the CBM experiment at FAIR

    International Nuclear Information System (INIS)

    Lebedev, Semen; Höhne, Claudia; Lebedev, Andrey; Ososkov, Gennady

    2014-01-01

    The development of fast and efficient event reconstruction algorithms is an important and challenging task in the Compressed Baryonic Matter (CBM) experiment at the future FAIR facility. The event reconstruction algorithms have to process terabytes of input data produced in particle collisions. In this contribution, several event reconstruction algorithms are presented, and their optimization for the following CBM detectors is discussed: the Ring Imaging Cherenkov (RICH) detector, the Transition Radiation Detectors (TRD) and the Muon Chamber (MUCH). The ring reconstruction algorithm in the RICH is discussed. In the TRD and MUCH, track reconstruction algorithms are based on track following and Kalman filter methods. All algorithms were significantly optimized to achieve maximum speed-up and minimum memory consumption. The obtained results show that a significant speed-up factor was achieved for all algorithms while the reconstruction efficiency stays at a high level.
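
    To illustrate the Kalman filter component mentioned for TRD and MUCH track reconstruction, here is a toy fit of a straight 2D track measured on a series of detector planes; the two-component state vector, the noise levels, and the absence of material effects (multiple scattering) are simplifying assumptions, not the CBM implementation.

    ```python
    import numpy as np

    def kalman_track_fit(z_planes, hits, meas_sigma=0.1):
        """Minimal Kalman filter for a straight track: state = (x, slope),
        propagated plane to plane and updated with each measured hit."""
        x = np.array([hits[0], 0.0])          # initial state guess
        P = np.diag([1.0, 1.0])               # large initial covariance
        H = np.array([[1.0, 0.0]])            # we measure position only
        R = np.array([[meas_sigma ** 2]])
        for k in range(1, len(z_planes)):
            dz = z_planes[k] - z_planes[k - 1]
            F = np.array([[1.0, dz], [0.0, 1.0]])   # straight-line propagation
            x, P = F @ x, F @ P @ F.T
            resid = hits[k] - H @ x                 # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            x = x + K @ resid
            P = (np.eye(2) - K @ H) @ P
        return x, P

    # example: five planes, hits from a track with slope 0.5 plus noise
    z = np.arange(5.0)
    state, cov = kalman_track_fit(z, 0.5 * z + np.random.normal(0, 0.1, 5))
    ```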

  18. Phylogenetic reconstruction of Mycobacterium tuberculosis within four settings of the Caribbean region: tree comparative analyses and a first appraisal of their phylogeography.

    Science.gov (United States)

    Duchêne, Véronique; Ferdinand, Séverine; Filliol, Ingrid; Guégan, Jean François; Rastogi, Nalin; Sola, Christophe

    2004-03-01

    In order to compare phylogenetic methods and to reconstruct the evolutionary history of the tubercle bacilli, a set of macro-array-based genotyping data from Mycobacterium tuberculosis clinical isolates (spoligotyping, for spacer oligonucleotide typing, which assays the variability of the Direct Repeat (DR) locus) was analyzed in four settings of the Caribbean region (Guadeloupe, Martinique, Cuba and Haiti). A set of 47 alleles (26 shared and 21 unique), representative of 321 individual M. tuberculosis clinical isolates from patients residing in the above regions, was studied. The following methods (and software, in brackets) were investigated: numerical taxonomy distance methods (TAXOTRON), the maximum parsimony procedure (PAUP), median-joining networks (NETWORK), and nested clade analysis (GEODIS). Results using these methods were analyzed, compared and discussed. The latter method (GEODIS) was investigated in detail by combining geographical data with genetic variability results to detect a link between population structure and population history, and to test the null hypothesis of no association between geography and genotypes. Irrespective of the method used, our findings demonstrate that a core structure of four families (or clades) of M. tuberculosis strains is highly prevalent within the islands studied, indirectly reflecting the past colonization history of these different settings. The specificity of M. tuberculosis genotypes in each of the islands is discussed in the light of their respective colonial and contemporary histories.

  19. Interval-based reconstruction for uncertainty quantification in PET

    Science.gov (United States)

    Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis

    2018-02-01

    A new directed interval-based tomographic reconstruction algorithm, called non-additive interval-based expectation maximization (NIBEM), is presented. It uses non-additive modeling of the forward operator, which provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum likelihood expectation maximization algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval-valued reconstruction.

  20. Photoelectron holography with improved image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Matsushita, Tomohiro, E-mail: matusita@spring8.or.j [Japan Synchrotron Radiation Research Institute (JASRI), SPring-8, 1-1-1 Kouto, Sayo-cho, Sayo-gun Hyogo 679-5198 (Japan); Matsui, Fumihiko; Daimon, Hiroshi [Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192 (Japan); Hayashi, Kouichi [Institute for Materials Research, Tohoku University, Sendai 980-8577 (Japan)

    2010-05-15

    Electron holography is a type of atomic structural analysis, and it has unique features such as element selectivity and the ability to analyze the structure around an impurity in a crystal. In this paper, we introduce the measurement system, electron holograms, a theory for the recording process of an electron hologram, and a theory for the reconstruction algorithm. We describe photoelectron holograms, Auger electron holograms, and the inverse mode of an electron hologram. The reconstruction algorithm, scattering pattern extraction algorithm (SPEA), the SPEA with maximum entropy method (SPEA-MEM), and SPEA-MEM with translational operation are also described.

  1. Photoelectron holography with improved image reconstruction

    International Nuclear Information System (INIS)

    Matsushita, Tomohiro; Matsui, Fumihiko; Daimon, Hiroshi; Hayashi, Kouichi

    2010-01-01

    Electron holography is a type of atomic structural analysis, and it has unique features such as element selectivity and the ability to analyze the structure around an impurity in a crystal. In this paper, we introduce the measurement system, electron holograms, a theory for the recording process of an electron hologram, and a theory for the reconstruction algorithm. We describe photoelectron holograms, Auger electron holograms, and the inverse mode of an electron hologram. The reconstruction algorithm, scattering pattern extraction algorithm (SPEA), the SPEA with maximum entropy method (SPEA-MEM), and SPEA-MEM with translational operation are also described.

  2. Reconstructing Historical VOC Concentrations in Drinking Water for Epidemiological Studies at a U.S. Military Base: Summary of Results

    Directory of Open Access Journals (Sweden)

    Morris L. Maslia

    2016-10-01

    Full Text Available A U.S. government health agency conducted epidemiological studies to evaluate whether exposures to drinking water contaminated with volatile organic compounds (VOCs) at U.S. Marine Corps Base Camp Lejeune, North Carolina, were associated with increased health risks to children and adults. These health studies required knowledge of contaminant concentrations in drinking water, at monthly intervals, delivered to family housing, barracks, and other facilities within the study area. Because concentration data were limited or unavailable during much of the period of contamination (1950s–1985), the historical reconstruction process was used to quantify estimates of monthly mean contaminant-specific concentrations. This paper integrates many efforts, reports, and papers into a synthesis of the overall approach to, and results from, a drinking-water historical reconstruction study. Results show that at the Tarawa Terrace water treatment plant (WTP), reconstructed (simulated) tetrachloroethylene (PCE) concentrations reached a maximum monthly average value of 183 micrograms per liter (μg/L), compared to a one-time maximum measured value of 215 μg/L, and exceeded the U.S. Environmental Protection Agency's current maximum contaminant level (MCL) of 5 μg/L during the period November 1957–February 1987. At the Hadnot Point WTP, reconstructed trichloroethylene (TCE) concentrations reached a maximum monthly average value of 783 μg/L, compared to a one-time maximum measured value of 1400 μg/L, during the period August 1953–December 1984. The Hadnot Point WTP also provided contaminated drinking water to the Holcomb Boulevard housing area continuously prior to June 1972, when the Holcomb Boulevard WTP came on line (maximum reconstructed TCE concentration of 32 μg/L), and intermittently during the period June 1972–February 1985 (maximum reconstructed TCE concentration of 66 μg/L). Applying the historical reconstruction process to quantify contaminant

  3. Neural network algorithm for image reconstruction using the grid friendly projections

    International Nuclear Information System (INIS)

    Cierniak, R.

    2011-01-01

    Full text: The presented paper describes the development of an original approach to the reconstruction problem using a recurrent neural network. In particular, the 'grid-friendly' angles of the performed projections are selected according to the discrete Radon transform (DRT) concept to decrease the number of projections required. The methodology of our approach is consistent with analytical reconstruction algorithms. The reconstruction problem is reformulated in our approach as an optimization problem, which is solved in the present concept using a method based on the maximum likelihood methodology. The reconstruction algorithm proposed in this work is subsequently adapted to the more practical case of discrete fan-beam projections. Computer simulation results show that the neural network reconstruction algorithm designed to work in this way outperforms conventional methods in reconstructed image quality. (author)

  4. An investigation of temporal regularization techniques for dynamic PET reconstructions using temporal splines

    International Nuclear Information System (INIS)

    Verhaeghe, Jeroen; D'Asseler, Yves; Vandenberghe, Stefaan; Staelens, Steven; Lemahieu, Ignace

    2007-01-01

    The use of a temporal B-spline basis for the reconstruction of dynamic positron emission tomography data was investigated. Maximum likelihood (ML) reconstructions using an expectation maximization framework and maximum a posteriori (MAP) reconstructions using the generalized expectation maximization framework were evaluated. Different parameters of the B-spline basis, such as order, number of basis functions and knot placing, were investigated in a reconstruction task using simulated dynamic list-mode data. We found that a higher-order basis reduced both the bias and the variance. Using a higher number of basis functions in the modeling of the time activity curves (TACs) allowed the algorithm to model faster changes of the TACs; however, the TACs became noisier. We have compared ML, Gaussian postsmoothed ML and MAP reconstructions. The noise level in the ML reconstructions was controlled by varying the number of basis functions. The MAP algorithm penalized the integrated squared curvature of the reconstructed TAC. The postsmoothed ML was always outperformed in terms of bias and variance properties by the MAP and ML reconstructions. A simple adaptive knot-placing strategy, based on an arc-length redistribution scheme applied during the reconstruction, was also developed and evaluated. The free-knot reconstruction allowed a more accurate reconstruction while reducing the noise level, especially for fast-changing TACs such as blood input functions. Limiting the number of temporal basis functions combined with the adaptive knot-placing strategy is in this case advantageous for regularization purposes when compared to the other regularization techniques.
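
    The temporal modeling step lends itself to a short sketch: a least-squares fit of a noisy time activity curve on a cubic B-spline basis, standing in for the ML fit described above. The frame times, knot positions and synthetic TAC are all invented, and BSpline.design_matrix assumes SciPy 1.8 or newer.

        import numpy as np
        from scipy.interpolate import BSpline  # design_matrix needs SciPy >= 1.8

        # Hypothetical frame mid-times (minutes) and a noisy synthetic TAC.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 60.0, 40)
        tac = t * np.exp(-t / 20.0) + rng.normal(0.0, 0.4, t.size)

        # Cubic basis (order k = 3) with uniform interior knots; repeated
        # boundary knots clamp the spline at both ends.
        k = 3
        knots = np.concatenate([[t[0]] * (k + 1),
                                np.linspace(8.0, 52.0, 6),
                                [t[-1]] * (k + 1)])

        B = BSpline.design_matrix(t, knots, k).toarray()  # frames x basis fns
        coeff, *_ = np.linalg.lstsq(B, tac, rcond=None)   # LSQ stand-in for ML
        tac_fit = B @ coeff                               # smooth reconstructed TAC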

  5. The inverse Fourier problem in the case of poor resolution in one given direction: the maximum-entropy solution

    International Nuclear Information System (INIS)

    Papoular, R.J.; Zheludev, A.; Ressouche, E.; Schweizer, J.

    1995-01-01

    When density distributions in crystals are reconstructed from 3D diffraction data, a problem sometimes occurs when the spatial resolution in one given direction is very poor compared to that in the perpendicular directions. In this case, a 2D projected density is usually reconstructed. For this task, the conventional Fourier inversion method only makes use of those structure factors measured in the projection plane; all the other structure factors contribute zero to the reconstruction of a projected density. On the contrary, the maximum-entropy method uses all the 3D data to yield 3D-enhanced 2D projected density maps. It is even possible to reconstruct a projection in the extreme case when not a single structure factor in the plane of projection is known. In the case of poor resolution along one given direction, a Fourier inversion reconstruction gives very low-quality 3D densities 'smeared' in the third dimension. The application of the maximum-entropy procedure reduces the smearing significantly, and reasonably well-resolved projections along most directions can then be obtained from the MaxEnt 3D density. To illustrate these two ideas, particular examples based on real polarized neutron diffraction data sets are presented. (orig.)

  6. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection are described, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The probability distribution of charged particle scattering is determined using a statistical multiple scattering model, and a substantially maximum likelihood estimate of the object volume scattering density is computed using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.

  7. Research of the system response of neutron double scatter imaging for MLEM reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, M., E-mail: wyj2013@163.com [Northwest Institute of Nuclear Technology, Xi’an 710024 (China); State Key Laboratory of Intense Pulsed Radiation-Simulation and Effect, Xi’an 710024 (China); Peng, B.D.; Sheng, L.; Li, K.N.; Zhang, X.P.; Li, Y.; Li, B.K.; Yuan, Y.; Wang, P.W.; Zhang, X.D.; Li, C.H. [Northwest Institute of Nuclear Technology, Xi’an 710024 (China); State Key Laboratory of Intense Pulsed Radiation-Simulation and Effect, Xi’an 710024 (China)

    2015-03-01

    A maximum likelihood image reconstruction technique has been applied to neutron scatter imaging. The response function of the imaging system can be obtained by Monte Carlo simulation, which is very time-consuming if the number of image pixels and particles is large. In this work, to improve time efficiency, an analytical approach based on the probability of neutron interaction and transport in the detector is developed to calculate the system response function. The response function was applied to calculate the relative efficiency of the neutron scatter imaging system as a function of the incident neutron energy, and the calculated results agreed with simulations by the MCNP5 software. Then the maximum likelihood expectation maximization (MLEM) reconstruction method with the system response function was used to reconstruct data simulated by the Monte Carlo method. The results showed good consistency between the reconstructed positions and the true positions. Compared with back-projection reconstruction, the improvement in image quality was obvious, and the locations of multiple radiation point sources could be discerned easily.

  8. Molecular Phylogeny of the Bamboo Sharks (Chiloscyllium spp.)

    Directory of Open Access Journals (Sweden)

    Noor Haslina Masstor

    2014-01-01

    Full Text Available Chiloscyllium, commonly called the bamboo shark, can be found inhabiting the waters of the Indo-West Pacific around East Asian countries such as Malaysia, Myanmar, Thailand, Singapore, and Indonesia. The International Union for Conservation of Nature (IUCN) Red List has categorized them as near threatened sharks owing to their declining population status caused by overexploitation. A molecular study was carried out to portray the systematic relationships within Chiloscyllium species using 12S rRNA and cytochrome b gene sequences. Maximum parsimony and Bayesian inference were used to reconstruct their phylogenetic trees. A total of 381 bp of sequence was successfully aligned in the 12S rRNA region, with 41 sites being parsimony-informative. In the cytochrome b region, a total of 1120 bp sites were aligned, with 352 parsimony-informative characters. All analyses yielded phylogenetic trees in which C. indicum has close relationships with C. plagiosum. C. punctatum is sister taxon to both C. indicum and C. plagiosum, while C. griseum and C. hasseltii formed their own clade as sister taxa. These Chiloscyllium classifications are supported by several morphological characters (lateral dermal ridges on the body, coloring patterns, and the appearance of the hypobranchials and basibranchial plate) that can clearly be used to differentiate each species.
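
    For readers unfamiliar with the parsimony criterion used here, the following is a minimal sketch of the Fitch small-parsimony count on a single fixed topology; real analyses such as the one above search over many topologies with dedicated software. The toy tree and site data below are invented.

        def fitch_site(tree, states):
            """Fitch (1971) small-parsimony count for one site on a rooted
            binary tree given as nested tuples of leaf names."""
            if isinstance(tree, str):                # leaf: its observed state
                return {states[tree]}, 0
            (ls, lc), (rs, rc) = (fitch_site(child, states) for child in tree)
            if ls & rs:                              # children agree: no change
                return ls & rs, lc + rc
            return ls | rs, lc + rc + 1              # disagreement costs one step

        # Toy 4-taxon topology and one aligned site per taxon.
        tree = (("A", "B"), ("C", "D"))
        site = {"A": "G", "B": "G", "C": "T", "D": "G"}
        _, score = fitch_site(tree, site)            # parsimony score: 1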

  9. Application of three-dimensional CT reconstruction technology on inferior oblique muscle in congenital superior oblique palsy

    Directory of Open Access Journals (Sweden)

    Yang Zhang

    2014-05-01

    Full Text Available AIM: To investigate the viability of observing the morphology of the inferior oblique muscle stereoscopically using a 3-dimensional CT reconstruction technique. METHODS: This control study included 29 cases clinically diagnosed with monocular congenital superior oblique palsy and examined by 3-dimensional CT. The images of the inferior oblique muscle were reconstructed with the Mimics software, and 3D digital images were established on the basis of the CT scanning data of the individuals. Observing the morphology of the binocular inferior oblique muscles in a self-controlled design, we compared the maximum transverse diameter of the inferior oblique muscle of the paralyzed eye with that of the non-paralyzed one, choosing 5% as the significance level. RESULTS: The reconstructed 3-dimensional CT scans showed that not all of the inferior oblique muscle bellies of the paralyzed eyes were thinner than those of the non-paralyzed eyes in the maximum transverse diameter of the cross-sectional area. The average maximum transverse diameter was 6.797±1.083 mm for the paralyzed eye and 6.507±0.848 mm for the non-paralyzed eye; the difference was not statistically significant (P>0.05). CONCLUSION: The three-dimensional CT reconstruction technology can be used for preoperative evaluation of the morphology of the inferior oblique muscle.

  10. Ray tracing reconstruction investigation for C-arm tomosynthesis

    Science.gov (United States)

    Malalla, Nuhad A. Y.; Chen, Ying

    2016-04-01

    C-arm tomosynthesis is a three-dimensional imaging technique. Both the x-ray source and the detector are mounted on a C-arm wheeled structure to provide a wide variety of movements around the object. In this paper, C-arm tomosynthesis was introduced to provide three-dimensional information over a limited view angle (less than 180°) to reduce radiation exposure and examination time. Reconstruction algorithms based on the ray tracing method, such as ray tracing back projection (BP), the simultaneous algebraic reconstruction technique (SART) and maximum likelihood expectation maximization (MLEM), were developed for C-arm tomosynthesis. C-arm tomosynthesis projection images of a simulated spherical object were generated with a virtual geometric configuration with a total view angle of 40 degrees. This study demonstrated the sharpness of the in-plane reconstructed structure and the effectiveness of removing out-of-plane blur for each reconstruction algorithm. Results showed the ability of ray-tracing-based reconstruction algorithms to provide three-dimensional information with limited-angle C-arm tomosynthesis.
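
    Of the three algorithms compared above, SART admits a particularly compact sketch. The following is an illustrative dense-matrix version, not the authors' ray-tracing implementation; the system matrix A is assumed to be precomputed.

        import numpy as np

        def sart(A, y, n_iter=20, relax=0.5):
            """SART update: x <- x + relax * V^-1 A^T W (y - A x),
            with per-ray (W) and per-pixel (V) sum normalizers."""
            row_sum = np.maximum(A.sum(axis=1), 1e-12)  # per-ray normalization
            col_sum = np.maximum(A.sum(axis=0), 1e-12)  # per-pixel normalization
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                resid = (y - A @ x) / row_sum
                x += relax * (A.T @ resid) / col_sum
            return x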

  11. Tibiofemoral joint contact area and pressure after single- and double-bundle anterior cruciate ligament reconstruction.

    Science.gov (United States)

    Morimoto, Yusuke; Ferretti, Mario; Ekdahl, Max; Smolinski, Patrick; Fu, Freddie H

    2009-01-01

    The purpose of this study was to compare the tibiofemoral contact area and pressure after single-bundle (SB) and double-bundle (DB) anterior cruciate ligament (ACL) reconstruction by use of 2 femoral and 2 tibial tunnels in intact cadaveric knees. Tibiofemoral contact area and mean and maximum pressures were measured by pressure-sensitive film (Fujifilm, Valhalla, NY) inserted between the tibia and femur. The knee was subjected to a 1,000-N axial load by use of a uniaxial testing machine at 0 degrees, 15 degrees, 30 degrees, and 45 degrees of flexion. Three conditions were evaluated: (1) intact ACL, (2) SB ACL reconstruction (n = 10 knees), and (3) DB ACL reconstruction (n = 9 knees). When compared with the intact knee, DB ACL reconstruction showed no significant difference in tibiofemoral contact area or mean and maximum pressures. SB ACL reconstruction had a significantly smaller contact area on the lateral and medial tibiofemoral joints at 30 degrees and 15 degrees of flexion. SB ACL reconstruction also had significantly higher mean pressures at 15 degrees of flexion on the medial tibiofemoral joint and at 0 degrees and 15 degrees of flexion on the lateral tibiofemoral joint, as well as significantly higher maximum pressures at 15 degrees of flexion on the lateral tibiofemoral joint. SB ACL reconstruction resulted in a significantly smaller tibiofemoral contact area and higher pressures. DB ACL reconstruction more closely restores the normal contact area and pressure, mainly at low flexion angles. Our findings suggest that the changes in contact area and pressures after SB ACL reconstruction may be one of the causes of osteoarthritis at long-term follow-up. DB ACL reconstruction may reduce the incidence of osteoarthritis by closely restoring contact area and pressure.

  12. A comparison of PMIP2 model simulations and the MARGO proxy reconstruction for tropical sea surface temperatures at last glacial maximum

    Energy Technology Data Exchange (ETDEWEB)

    Otto-Bliesner, Bette L.; Brady, E.C. [National Center for Atmospheric Research, Climate and Global Dynamics Division, Boulder, CO (United States); Schneider, Ralph; Weinelt, M. [Christian-Albrechts Universitaet, Institut fuer Geowissenschaften, Kiel (Germany); Kucera, M. [Eberhard-Karls Universitaet Tuebingen, Institut fuer Geowissenschaften, Tuebingen (Germany); Abe-Ouchi, A. [The University of Tokyo, Center for Climate System Research, Kashiwa (Japan); Bard, E. [CEREGE, College de France, CNRS, Universite Aix-Marseille, Aix-en-Provence (France); Braconnot, P.; Kageyama, M.; Marti, O.; Waelbroeck, C. [Unite mixte CEA-CNRS-UVSQ, Laboratoire des Sciences du Climat et de l'Environnement, Gif-sur-Yvette Cedex (France); Crucifix, M. [Universite Catholique de Louvain, Institut d'Astronomie et de Geophysique Georges Lemaitre, Louvain-la-Neuve (Belgium); Hewitt, C.D. [Met Office Hadley Centre, Exeter (United Kingdom); Paul, A. [Bremen University, Department of Geosciences, Bremen (Germany); Rosell-Mele, A. [Universitat Autonoma de Barcelona, ICREA and Institut de Ciencia i Tecnologia Ambientals, Barcelona (Spain); Weber, S.L. [Royal Netherlands Meteorological Institute (KNMI), De Bilt (Netherlands); Yu, Y. [Chinese Academy of Sciences, LASG, Institute of Atmospheric Physics, Beijing (China)

    2009-05-15

    Results from multiple model simulations are used to understand the tropical sea surface temperature (SST) response to the reduced greenhouse gas concentrations and large continental ice sheets of the last glacial maximum (LGM). We present LGM simulations from the Paleoclimate Modelling Intercomparison Project, Phase 2 (PMIP2) and compare these simulations to proxy data collated and harmonized within the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface Project (MARGO). Five atmosphere-ocean coupled climate models (AOGCMs) and one coupled model of intermediate complexity have PMIP2 ocean results available for the LGM. The models give a range of tropical (defined for this paper as 15°S-15°N) SST cooling of 1.0-2.4°C, comparable to the MARGO estimate of annual cooling of 1.7±1°C. The models simulate greater SST cooling in the tropical Atlantic than in the tropical Pacific, but interbasin and intrabasin variations of cooling are much smaller than those found in the MARGO reconstruction. The simulated tropical coolings are relatively insensitive to season, a feature also present in the MARGO transfer-based estimates calculated from planktonic foraminiferal assemblages for the Indian and Pacific Oceans. These assemblages indicate seasonality in cooling in the Atlantic basin, with greater cooling in northern summer than northern winter, not captured by the model simulations. Biases in the simulations of the tropical upwelling and thermocline found in the preindustrial control simulations remain for the LGM simulations and are partly responsible for the more homogeneous spatial and temporal LGM tropical cooling simulated by the models. The PMIP2 LGM simulations give estimates for the climate sensitivity parameter of 0.67-0.83°C per W m⁻², which translates to an equilibrium climate sensitivity for doubling of atmospheric CO₂ of 2.6-3.1°C. (orig.)

  13. Singular Spectrum Analysis for Astronomical Time Series: Constructing a Parsimonious Hypothesis Test

    Science.gov (United States)

    Greco, G.; Kondrashov, D.; Kobayashi, S.; Ghil, M.; Branchesi, M.; Guidorzi, C.; Stratta, G.; Ciszak, M.; Marino, F.; Ortolan, A.

    We present a data-adaptive spectral method - Monte Carlo Singular Spectrum Analysis (MC-SSA) - and its modification to tackle astrophysical problems. Through numerical simulations we show the ability of MC-SSA to deal with 1/f^β power-law noise affected by photon counting statistics. Such a noise process is simulated by a first-order autoregressive AR(1) process corrupted by intrinsic Poisson noise. In doing so, we statistically estimate a basic stochastic variation of the source and the corresponding fluctuations due to the quantum nature of light. In addition, the MC-SSA test retains its effectiveness even when a significant percentage of the signal falls below a certain level of detection, e.g., caused by the instrument sensitivity. The parsimonious approach presented here may be broadly applied, from the search for extrasolar planets to the extraction of low-intensity coherent phenomena possibly hidden in high-energy transients.
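
    The noise model described above (an AR(1) "red noise" intensity observed through photon counting) can be sketched directly; all parameter values below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)

        # AR(1) red-noise process standing in for the intrinsic source variation.
        n, phi = 4096, 0.9
        x = np.empty(n)
        x[0] = 0.0
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()

        # Map to a positive mean count rate, then add intrinsic Poisson
        # (photon counting) noise, as in the simulated light curves above.
        rate = 50.0 * np.exp(0.1 * (x - x.mean()) / x.std())
        counts = rng.poisson(rate)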

  14. A phylogenetic reconstruction and emendation of Agaricus section Duploannulatae.

    Science.gov (United States)

    Challen, Michael P; Kerrigan, Richard W; Callac, Philippe

    2003-01-01

    Agaricus section Duploannulatae comprises the group of species allied with A. bisporus and A. bitorquis. Disagreement exists in the literature regarding the composition of this group. We used DNA sequence data from the ITS segments of the nuclear ribosomal DNA region, in a sample of European and North American isolates, to identify characters shared by this group, to further delimit species-level taxa within the section, and to develop a phylogenetic hypothesis. Shared polymorphisms that suggest a natural limit for section Duploannulatae were found. ITS1 data were assessed using parsimony, distance and maximum likelihood methods of phylogenetic inference. The section Duploannulatae comprised six robust clades. Five clades corresponded to well-characterized species from the temperate Northern Hemisphere (A. bisporus, A. subfloccosus, A. bitorquis, A. vaporarius, A. cupressicola). The sixth clade encompassed an A. devoniensis complex. Species concepts, nomenclature, and relationships are discussed and compared with prior reports.

  15. Pollen-based biome reconstruction for southern Europe and Africa 18,000 yr BP

    NARCIS (Netherlands)

    Elenga, H; Peyron, O; Bonnefille, R; Jolly, D; Cheddadi, R; Guiot, J; Andrieu; Bottema, S; Buchet, G; de Beaulieu, JL; Hamilton, AC; Maley, J; Marchant, R; Perez-Obiol, R; Reille, M; Riollet, G; Scott, L; Straka, H; Taylor, D; Van Campo, E; Vincens, A; Laarif, F; Jonson, H

    Pollen data from 18,000 14C yr BP were compiled in order to reconstruct biome distributions at the last glacial maximum in southern Europe and Africa. Biome reconstructions were made using the objective biomization method applied to pollen counts using a complete list of dryland taxa wherever

  16. On-line reconstruction of in-core power distribution by harmonics expansion method

    International Nuclear Information System (INIS)

    Wang Changhui; Wu Hongchun; Cao Liangzhi; Yang Ping

    2011-01-01

    Highlights: → A harmonics expansion method for on-line in-core power reconstruction is proposed. → A harmonics data library is pre-generated off-line and a code named COMS is developed. → Numerical results show that the maximum relative error of the reconstruction is less than 5.5%. → This method has a high computational speed compared to traditional methods. - Abstract: Fixed in-core detectors are most suitable for real-time response to in-core power distributions in pressurized water reactors (PWRs). In this paper, a harmonics expansion method is used to reconstruct the in-core power distribution of a PWR on-line. In this method, the in-core power distribution is expanded in the harmonics of one reference case, and the expansion coefficients are calculated using signals provided by the fixed in-core detectors. To save computing time and improve reconstruction precision, a harmonics data library containing the harmonics of different reference cases is constructed. For on-line reconstruction of the in-core power distribution, the two closest reference cases are retrieved from the harmonics data library and their harmonics are interpolated. The Unit 1 reactor of the Daya Bay Nuclear Power Plant (Daya Bay NPP) in China is considered for verification. The maximum relative error between the measurement and reconstruction results is less than 5.5%, and the computing time is about 0.53 s for a single reconstruction, indicating that this method is suitable for the on-line monitoring of PWRs.
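
    A minimal sketch of the expansion-coefficient step: detector signals are modeled as samples of a harmonics expansion, and the coefficients are recovered by least squares. All dimensions, the random "library" and the one-node-per-detector sampling model are hypothetical stand-ins for the COMS implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        n_nodes, n_det, n_h = 1000, 40, 8       # hypothetical problem sizes

        Phi = rng.random((n_nodes, n_h))        # harmonics of a reference case
        D = np.zeros((n_det, n_nodes))          # detector response: one node each
        D[np.arange(n_det), rng.choice(n_nodes, n_det, replace=False)] = 1.0

        true_c = rng.random(n_h)
        y = D @ Phi @ true_c                    # signals from fixed in-core detectors

        # Least-squares estimate of the expansion coefficients, then the
        # reconstructed in-core power distribution on all nodes.
        c_hat, *_ = np.linalg.lstsq(D @ Phi, y, rcond=None)
        power = Phi @ c_hat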

  17. An integrative systematic framework helps to reconstruct skeletal evolution of glass sponges (Porifera, Hexactinellida).

    Science.gov (United States)

    Dohrmann, Martin; Kelley, Christopher; Kelly, Michelle; Pisera, Andrzej; Hooper, John N A; Reiswig, Henry M

    2017-01-01

    Glass sponges (Class Hexactinellida) are important components of deep-sea ecosystems and are of interest from geological and materials science perspectives. The reconstruction of their phylogeny with molecular data has only recently begun and shows a better agreement with morphology-based systematics than is typical for other sponge groups, likely because of a greater number of informative morphological characters. However, inconsistencies remain that have far-reaching implications for hypotheses about the evolution of their major skeletal construction types (body plans). Furthermore, less than half of all described extant genera have been sampled for molecular systematics, and several taxa important for understanding skeletal evolution are still missing. Increased taxon sampling for molecular phylogenetics of this group is therefore urgently needed. However, due to their remote habitat and often poorly preserved museum material, sequencing all 126 currently recognized extant genera will be difficult to achieve. Utilizing morphological data to incorporate unsequenced taxa into an integrative systematics framework therefore holds great promise, but it is unclear which methodological approach best suits this task. Here, we increase the taxon sampling of four previously established molecular markers (18S, 28S, and 16S ribosomal DNA, as well as cytochrome oxidase subunit I) by 12 genera, for the first time including representatives of the order Aulocalycoida and the type genus of Dactylocalycidae, taxa that are key to understanding hexactinellid body plan evolution. Phylogenetic analyses suggest that Aulocalycoida is diphyletic and provide further support for the paraphyly of order Hexactinosida; hence these orders are abolished from the Linnean classification. We further assembled morphological character matrices to integrate so far unsequenced genera into phylogenetic analyses in maximum parsimony (MP), maximum likelihood (ML), Bayesian, and morphology-based binning

  18. Phylogenetic reconstruction methods: an overview.

    Science.gov (United States)

    De Bruyn, Alexandre; Martin, Darren P; Lefeuvre, Pierre

    2014-01-01

    Initially designed to infer evolutionary relationships based on morphological and physiological characters, phylogenetic reconstruction methods have greatly benefited from recent developments in molecular biology and sequencing technologies with a number of powerful methods having been developed specifically to infer phylogenies from macromolecular data. This chapter, while presenting an overview of basic concepts and methods used in phylogenetic reconstruction, is primarily intended as a simplified step-by-step guide to the construction of phylogenetic trees from nucleotide sequences using fairly up-to-date maximum likelihood methods implemented in freely available computer programs. While the analysis of chloroplast sequences from various Vanilla species is used as an illustrative example, the techniques covered here are relevant to the comparative analysis of homologous sequences datasets sampled from any group of organisms.

  19. Patterns and effects of GC3 heterogeneity and parsimony informative sites on the phylogenetic tree of genes.

    Science.gov (United States)

    Ma, Shuai; Wu, Qi; Hu, Yibo; Wei, Fuwen

    2018-05-20

    The explosive growth in genomic data has provided novel insights into the conflicting signals hidden in phylogenetic trees. Although some studies have explored the effects of the GC content and parsimony informative sites (PIS) on the phylogenetic tree, the effect of the heterogeneity of the GC content at the first/second/third codon position at parsimony informative sites (GC1/2/3-PIS) among different species and the effect of PIS on phylogenetic tree construction remain largely unexplored. Here, we used two different mammal genomic datasets to explore the patterns of GC1/2/3-PIS heterogeneity and the effect of PIS on the phylogenetic tree of genes: (i) all GC1/2/3-PIS have obvious heterogeneity between different mammals, and the levels of heterogeneity are GC3-PIS > GC2-PIS > GC1-PIS; (ii) the number of PIS is positively correlated with the metrics of "good" gene tree topologies, and excluding the third codon position (C3) decreases the quality of gene trees by removing too many PIS. These results provide novel insights into the heterogeneity pattern of GC1/2/3-PIS in mammals and the relationship between GC3/PIS and gene trees. Additionally, it is necessary to carefully consider whether to exclude C3 to improve the quality of gene trees, especially in the super-tree method. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Image-reconstruction algorithms for positron-emission tomography systems

    International Nuclear Information System (INIS)

    Cheng, S.N.C.

    1982-01-01

    The positional uncertainty in the time-of-flight measurement of a positron-emission tomography system is modelled as a Gaussian distributed random variable and the image is assumed to be piecewise constant on a rectilinear lattice. A reconstruction algorithm using maximum-likelihood estimation is derived for the situation in which time-of-flight data are sorted as the most-likely-position array. The algorithm is formulated as a linear system described by a nonseparable, block-banded, Toeplitz matrix, and a sine-transform technique is used to implement this algorithm efficiently. The reconstruction algorithms for both the most-likely-position array and the confidence-weighted array are described by similar equations, hence similar linear systems can be used to describe the reconstruction algorithm for a discrete, confidence-weighted array when the matrix and the entries in the data array are properly identified. It is found that the mean square error depends on the ratio of the full width at half maximum (FWHM) of the time-of-flight measurement to the size of a pixel. When other parameters are fixed, the larger the pixel size, the smaller the mean square error. In the study of resolution, parameters that affect the impulse response of time-of-flight reconstruction algorithms are identified. It is found that the larger the pixel size, the larger the standard deviation of the impulse response. This shows that small mean square error and fine resolution are two contradictory requirements.

  1. Accelerated Compressed Sensing Based CT Image Reconstruction.

    Science.gov (United States)

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier-based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
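
    A generic weighted-CS reconstruction step can be sketched with plain iterative soft-thresholding; this illustrates the weighted formulation only, not the authors' pseudopolar Fourier implementation. The system matrix A and per-measurement weights w are assumed given.

        import numpy as np

        def soft(v, thresh):
            return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

        def weighted_cs_ista(A, y, w, lam=0.1, step=None, n_iter=200):
            """Minimize (1/2)||W^(1/2)(A x - y)||^2 + lam * ||x||_1 by
            iterative soft-thresholding; W = diag(w) holds the statistical
            weights of the measurements."""
            Aw = A * w[:, None]                          # rows scaled by weights
            if step is None:
                step = 1.0 / np.linalg.norm(Aw.T @ A, 2) # 1/L, L = ||A^T W A||
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (w * (A @ x - y))           # gradient of data term
                x = soft(x - step * grad, step * lam)    # proximal L1 step
            return x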

  2. Accelerated Compressed Sensing Based CT Image Reconstruction

    Directory of Open Access Journals (Sweden)

    SayedMasoud Hashemi

    2015-01-01

    Full Text Available In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier-based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  3. Climatic reconstruction in Europe for 18,000 yr B.P. from pollen data

    NARCIS (Netherlands)

    Peyron, O; Guiot, J; Cheddadi, R; Tarasov, P; Reille, M; de Beaulieu, JL; Bottema, S; Andrieu

    An improved concept of the best analogs method is used to reconstruct the climate of the last glacial maximum from pollen data in Europe. In order to deal with the lack of perfect analogs of fossil assemblages and therefore to obtain a more accurate climate reconstruction, we used a combination of

  4. Balancing practicality and hydrologic realism: a parsimonious approach for simulating rapid groundwater recharge via unsaturated-zone preferential flow

    Science.gov (United States)

    Mirus, Benjamin B.; Nimmo, J.R.

    2013-01-01

    The impact of preferential flow on recharge and contaminant transport poses a considerable challenge to water-resources management. Typical hydrologic models require extensive site characterization, but can underestimate fluxes when preferential flow is significant. A recently developed source-responsive model incorporates film-flow theory with conservation of mass to estimate unsaturated-zone preferential fluxes with readily available data. The term 'source-responsive' describes the sensitivity of preferential flow in response to water availability at the source of input. We present the first rigorous tests of a parsimonious formulation for simulating water table fluctuations using two case studies, both in arid regions with thick unsaturated zones of fractured volcanic rock. Diffuse flow theory cannot adequately capture the observed water table responses at both sites; the source-responsive model is a viable alternative. We treat the active area fraction of preferential flow paths as a scaled function of water inputs at the land surface, then calibrate the macropore density to fit observed water table rises. Unlike previous applications, we allow the characteristic film-flow velocity to vary, reflecting the lag time between source and deep water table responses. Analysis of model performance and parameter sensitivity for the two case studies underscores the importance of identifying thresholds for initiation of film flow in unsaturated rocks, and suggests that this parsimonious approach is potentially of great practical value.

  5. More quality measures versus measuring what matters: a call for balance and parsimony.

    Science.gov (United States)

    Meyer, Gregg S; Nelson, Eugene C; Pryor, David B; James, Brent; Swensen, Stephen J; Kaplan, Gary S; Weissberg, Jed I; Bisognano, Maureen; Yates, Gary R; Hunt, Gordon C

    2012-11-01

    External groups requiring measures now include public and private payers, regulators, accreditors and others that certify performance levels for consumers, patients and payers. Although benefits have accrued from the growth in quality measurement, the recent explosion in the number of measures threatens to shift resources from improving quality to covering a plethora of quality-performance metrics that may have a limited impact on the things that patients and payers want and need (i.e., better outcomes, better care, and lower per capita costs). Here we propose a policy that quality measurement should be: balanced, to meet the need of end users to judge quality and cost performance and the need of providers to continuously improve the quality, outcomes and costs of their services; and parsimonious, to measure quality, outcomes and costs with appropriate metrics that are selected based on end-user needs.

  6. Twenty-five years of maximum-entropy principle

    Science.gov (United States)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.

  7. Comparison of maximum intensity projection and digitally reconstructed radiographic projection for carotid artery stenosis measurement

    International Nuclear Information System (INIS)

    Hyde, Derek E.; Habets, Damiaan F.; Fox, Allan J.; Gulka, Irene; Kalapos, Paul; Lee, Don H.; Pelz, David M.; Holdsworth, David W.

    2007-01-01

    Digital subtraction angiography is being supplanted by three-dimensional imaging techniques in many clinical applications, leading to extensive use of maximum intensity projection (MIP) images to depict volumetric vascular data. The MIP algorithm produces intensity profiles that are different from conventional angiograms, and can also increase the vessel-to-tissue contrast-to-noise ratio. We evaluated the effect of the MIP algorithm in a clinical application where quantitative vessel measurement is important: internal carotid artery stenosis grading. Three-dimensional computed rotational angiography (CRA) was performed on 26 consecutive symptomatic patients to verify an internal carotid artery stenosis originally found using duplex ultrasound. These volumes of data were visualized using two different postprocessing projection techniques: MIP and digitally reconstructed radiographic (DRR) projection. A DRR is a radiographic image simulating a conventional digitally subtracted angiogram, but it is derived computationally from the same CRA dataset as the MIP. By visualizing a single volume with two different projection techniques, the postprocessing effect of the MIP algorithm is isolated. Vessel measurements were made according to the NASCET guidelines, and percentage stenosis grades were calculated. The paired t-test was used to determine whether the measurement difference between the two techniques was statistically significant. The CRA technique provided an isotropic voxel spacing of 0.38 mm. The MIPs and DRRs had a mean signal-difference-to-noise ratio of 30:1 and 26:1, respectively. Vessel measurements from MIPs were, on average, 0.17 mm larger than those from DRRs (P<0.0001). The NASCET-type stenosis grades tended to be underestimated on average by 2.4% with the MIP algorithm, although this was not statistically significant (P=0.09). The mean interobserver variability (standard deviation) of both the MIP and DRR images was 0.35 mm. It was concluded that the MIP
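
    The core difference between the two projection types reduces to a one-line operation each; a toy numpy sketch with an invented volume:

        import numpy as np

        # Toy volume with isotropic voxels and one bright "vessel" along z.
        vol = np.zeros((64, 64, 64))
        vol[:, 30:34, 30:34] = 1.0
        vol += np.random.default_rng(1).normal(0.0, 0.05, vol.shape)  # noise

        mip = vol.max(axis=0)   # maximum intensity projection along the view axis
        drr = vol.sum(axis=0)   # ray-sum projection, analogous to a DRR

        # The MIP keeps the single brightest voxel per ray (high vessel
        # contrast); the DRR accumulates along the ray, giving intensity
        # profiles closer to a conventional subtracted angiogram.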

  8. Reconstructing mass balance of Garabashi Glacier (1800–2005) using dendrochronological data

    Directory of Open Access Journals (Sweden)

    E. A. Dolgova

    2013-01-01

    Full Text Available The main goal of this research was to explore whether tree-ring data can be effectively applied to mass balance reconstruction in the Caucasus. Tree-ring width and maximum density chronologies of pine (Pinus sylvestris L.) at seven high-elevation sites in the Northern Caucasus were explored for this purpose. As in other places of the temperate zone, tree-ring width carries a complex climate signal controlled by both temperature and precipitation. Instrumental mass balance records of Garabashi Glacier started in 1983. It is well known that Caucasus glaciers have retreated intensively in recent decades, and according to instrumental data mass balance variations are mostly controlled by ablation, i.e. summer temperature variations. The maximum density chronology has a statistically significant correlation with mass balance due to its summer temperature sensitivity and the great input of ablation to total mass balance variations. To include different climatically sensitive parameters in our reconstruction, a stepwise multiple regression model was used. The strongest relation (r = 0.88; r² = 0.78; p < 0.05) was identified between two ring-width and one maximum density chronologies. A cross-validation test (r = 0.79; r² = 0.62; p < 0.05) confirmed model adequacy and allowed us to reconstruct the Garabashi Glacier mass balance for 1800–2005. Reconstructed and instrumental mass balance values coincide well except in the most recent period in the 2000s, when the reconstructed mass balance slightly underestimated the real values. However, even in this period it remained negative, as did the instrumental records. The bias can be explained by the weak sensitivity of the chronologies to winter precipitation (i.e. accumulation). The tree-ring based mass balance reconstruction was compared with one based on meteorological data (since 1905). Both reconstructions have good interannual agreement (r = 0.53; p < 0.05), particularly for the period between 1975 and 2005. According to the

  9. Evaluation of few-view reconstruction parameters for illicit substance detection using fast-neutron transmission spectroscopy

    International Nuclear Information System (INIS)

    Fink, C.L.; Humm, P.G.; Martin, M.M.; Micklich, B.J.

    1996-01-01

    The authors have evaluated the performance of an illicit substance detection system that performs image reconstruction using the Maximum Likelihood algebraic reconstruction algorithm, a few projections, and relatively coarse projection and pixel resolution. This evaluation was done using receiver operating characteristic curves and simulated data from the fast-neutron transmission spectroscopy system operated in a mode to detect explosives in luggage. The results show that increasing the number of projection angles is more important than increasing the projection resolution, the reconstructed pixel resolution, or the number of iterations in the Maximum Likelihood algorithm. A 100% detection efficiency with essentially no false positives is possible for a square block of RDX explosive, a projection resolution of 2 cm, a reconstructed pixel size of 2x2 cm, and five projection angles. For rectangular-shaped explosives more angles are required to obtain the same system performance.

  10. Inference of the ancestral vertebrate phenotype through vestiges of the whole-genome duplications.

    Science.gov (United States)

    Onimaru, Koh; Kuraku, Shigehiro

    2018-03-16

    Inferring the phenotype of the last common ancestor of living vertebrates is a challenging problem because of several unresolvable factors. These include the lack of reliable out-groups of living vertebrates, poor information about less fossilizable organs, and the specialized traits of phylogenetically important species, such as lampreys and hagfishes (e.g. the secondary loss of vertebrae in adult hagfishes). These factors undermine the reliability of ancestral reconstruction by traditional character mapping approaches based on maximum parsimony. In this article, we formulate an approach to hypothesizing ancestral vertebrate phenotypes using information from the phylogenetic and functional properties of genes duplicated by genome expansions in early vertebrate evolution. We name this conjecture 'chronological reconstruction of ohnolog functions' (CHROF). This CHROF conjecture raises the possibility that the last common ancestor of living vertebrates may have had more complex traits than currently thought.

  11. Statistical reconstruction for cosmic ray muon tomography.

    Science.gov (United States)

    Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J

    2007-08-01

    Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm2 per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we have described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictates differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and events departing from the statistical model.

  12. First results of genetic algorithm application in ML image reconstruction in emission tomography

    International Nuclear Information System (INIS)

    Smolik, W.

    1999-01-01

    This paper concerns the application of a genetic algorithm to maximum likelihood image reconstruction in emission tomography. An example of a genetic algorithm for image reconstruction is presented. The genetic algorithm was based on the typical genetic scheme, modified to suit the nature of the problem being solved. The convergence of the algorithm was examined, and different adaptation functions and selection and crossover methods were verified. The algorithm was tested on simulated SPECT data. The obtained image reconstruction results are discussed. (author)
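
    A skeleton of such a genetic scheme, with a Poisson log-likelihood fitness, truncation selection, uniform crossover and Gaussian mutation, might look as follows. This is an illustrative sketch, not the author's implementation; the projector A and projection data y are assumed given.

        import numpy as np

        rng = np.random.default_rng(0)

        def poisson_loglik(x, A, y):
            """Fitness: Poisson log-likelihood of image x given data y ~ A x."""
            lam = np.maximum(A @ x, 1e-12)
            return np.sum(y * np.log(lam) - lam)

        def ga_reconstruct(A, y, pop_size=40, n_gen=200, sigma=0.05):
            n = A.shape[1]
            pop = rng.random((pop_size, n))               # random initial images
            for _ in range(n_gen):
                fit = np.array([poisson_loglik(p, A, y) for p in pop])
                order = np.argsort(fit)[::-1]
                parents = pop[order[: pop_size // 2]]     # truncation selection
                children = []
                for _ in range(pop_size - len(parents)):
                    a, b = parents[rng.choice(len(parents), 2, replace=False)]
                    mask = rng.random(n) < 0.5            # uniform crossover
                    child = np.where(mask, a, b)
                    child = np.maximum(child + rng.normal(0, sigma, n), 0)  # mutate
                    children.append(child)
                pop = np.vstack([parents, children])
            fit = np.array([poisson_loglik(p, A, y) for p in pop])
            return pop[np.argmax(fit)]                    # best image found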

  13. Reconstruction of reflectance data using an interpolation technique.

    Science.gov (United States)

    Abed, Farhad Moghareh; Amirshahi, Seyed Hossein; Abed, Mohammad Reza Moghareh

    2009-03-01

    A linear interpolation method is applied for reconstruction of reflectance spectra of Munsell as well as ColorChecker SG color chips from the corresponding colorimetric values under a given set of viewing conditions. Hence, different types of lookup tables (LUTs) have been created to connect the colorimetric and spectrophotometric data as the source and destination spaces in this approach. To optimize the algorithm, different color spaces and light sources have been used to build different types of LUTs. The effects of the applied color datasets as well as the employed color spaces are investigated. Results of recovery are evaluated by the mean and the maximum color difference values under other sets of standard light sources. The mean and the maximum values of root mean square (RMS) error between the reconstructed and the actual spectra are also calculated. Since the speed of reflectance reconstruction is a key point in the LUT algorithm, the processing time spent for interpolation of spectral data has also been measured for each model. Finally, the performance of the suggested interpolation technique is compared with that of the common principal component analysis method. According to the results, using the CIEXYZ tristimulus values as a source space shows priority over the CIELAB color space. Besides, the colorimetric position of a desired sample is a key point that indicates the success of the approach. In fact, because of the nature of the interpolation technique, the colorimetric position of the desired samples should be located inside the color gamut of available samples in the dataset. The resultant spectra that have been reconstructed by this technique show considerable improvement in terms of RMS error between the actual and the reconstructed reflectance spectra as well as CIELAB color differences under the other light source in comparison with those obtained from the standard PCA technique.
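
    A vector-valued linear interpolant over the colorimetric source space captures the essence of the LUT approach. The sketch below uses invented training data; SciPy's LinearNDInterpolator returns NaN for queries outside the convex hull of the available samples, matching the gamut limitation noted above.

        import numpy as np
        from scipy.interpolate import LinearNDInterpolator

        # Hypothetical training set: XYZ tristimulus values of color chips and
        # their reflectance spectra sampled at 31 wavelengths (400-700 nm).
        rng = np.random.default_rng(0)
        xyz_train = rng.random((500, 3))        # source space (CIEXYZ)
        refl_train = rng.random((500, 31))      # destination space (reflectance)

        # Piecewise-linear interpolant over the tetrahedralized XYZ gamut;
        # the vector-valued output reconstructs all 31 samples at once.
        lut = LinearNDInterpolator(xyz_train, refl_train)

        xyz_query = np.array([[0.4, 0.5, 0.3]]) # must lie inside the gamut
        refl_hat = lut(xyz_query)               # NaN if outside the hull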

  14. The tempo and mode of New World monkey evolution and biogeography in the context of phylogenomic analysis.

    Science.gov (United States)

    Jameson Kiesling, Natalie M; Yi, Soojin V; Xu, Ke; Gianluca Sperone, F; Wildman, Derek E

    2015-01-01

    The development and evolution of organisms is heavily influenced by their environment. Thus, understanding the historical biogeography of taxa can provide insights into their evolutionary history, adaptations and trade-offs realized throughout time. In the present study we have taken a phylogenomic approach to infer New World monkey phylogeny, upon which we have reconstructed the biogeographic history of extant platyrrhines. In order to generate sufficient phylogenetic signal within the New World monkey clade, we carried out a large-scale phylogenetic analysis of approximately 40 kb of non-genic genomic DNA sequence in a 36 species subset of extant New World monkeys. Maximum parsimony, maximum likelihood and Bayesian inference analysis all converged on a single optimal tree topology. Divergence dating and biogeographic analysis reconstruct the timing and geographic location of divergence events. The ancestral area reconstruction describes the geographic locations of the last common ancestor of extant platyrrhines and provides insight into key biogeographic events occurring during platyrrhine diversification. Through these analyses we conclude that the diversification of the platyrrhines took place concurrently with the establishment and diversification of the Amazon rainforest. This suggests that an expanding rainforest environment rather than geographic isolation drove platyrrhine diversification. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Maximum likelihood reconstruction in fully 3D PET via the SAGE algorithm

    International Nuclear Information System (INIS)

    Ollinger, J.M.; Goggin, A.S.

    1996-01-01

    The SAGE and ordered subsets algorithms have been proposed as fast methods to compute penalized maximum likelihood estimates in PET. We have implemented both for use in fully 3D PET and completed a preliminary evaluation. The technique used to compute the transition matrix is fully described. The evaluation suggests that the ordered subsets algorithm converges much faster than SAGE, but that it stops short of the optimal solution.
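
    For reference, the ordered-subsets idea amounts to applying an MLEM-style update to one subset of projections at a time. A minimal dense-matrix sketch follows; subset partitioning by striding is one common choice, not necessarily the one used here.

        import numpy as np

        def osem(A, y, n_subsets=4, n_iter=5):
            """Ordered-subsets EM: one multiplicative update per projection
            subset, cycling through all subsets each iteration."""
            m, n = A.shape
            x = np.ones(n)
            subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
            for _ in range(n_iter):
                for idx in subsets:
                    As, ys = A[idx], y[idx]
                    ratio = ys / np.maximum(As @ x, 1e-12)
                    x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
            return x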

  16. 3D Tomographic Image Reconstruction using CUDA C

    International Nuclear Information System (INIS)

    Dominguez, J. S.; Assis, J. T.; Oliveira, L. F. de

    2011-01-01

    This paper presents the study and implementation of software for the three-dimensional reconstruction of images obtained with a tomographic system, using the capabilities of Graphics Processing Units (GPUs). Reconstruction by the filtered back-projection method was developed using CUDA C, for maximum utilization of the processing capabilities of GPUs in solving computational problems with a large computational cost that are highly parallelizable. The potential of GPUs is discussed and their advantages for solving this kind of problem are shown. The results in terms of runtime are compared with non-parallelized implementations and show a great reduction of processing time. (Author)

  17. Maximum entropy reconstruction of the configurational density of states from microcanonical simulations

    International Nuclear Information System (INIS)

    Davis, Sergio

    2013-01-01

    In this work we develop a method for inferring the underlying configurational density of states of a molecular system by combining information from several microcanonical molecular dynamics or Monte Carlo simulations at different energies. This method is based on Jaynes' Maximum Entropy formalism (MaxEnt) for Bayesian statistical inference under known expectation values. We present results of its application to measure thermodynamic entropy and free energy differences in embedded-atom models of metals.

  18. Muon track reconstruction and data selection techniques in AMANDA

    International Nuclear Information System (INIS)

    Ahrens, J.; Bai, X.; Bay, R.; Barwick, S.W.; Becka, T.; Becker, J.K.; Becker, K.-H.; Bernardini, E.; Bertrand, D.; Biron, A.; Boersma, D.J.; Boeser, S.; Botner, O.; Bouchta, A.; Bouhali, O.; Burgess, T.; Carius, S.; Castermans, T.; Chirkin, D.; Collin, B.; Conrad, J.; Cooley, J.; Cowen, D.F.; Davour, A.; De Clercq, C.; DeYoung, T.; Desiati, P.; Dewulf, J.-P.; Ekstroem, P.; Feser, T.; Gaug, M.; Gaisser, T.K.; Ganugapati, R.; Geenen, H.; Gerhardt, L.; Gross, A.; Goldschmidt, A.; Hallgren, A.; Halzen, F.; Hanson, K.; Hardtke, R.; Harenberg, T.; Hauschildt, T.; Helbing, K.; Hellwig, M.; Herquet, P.; Hill, G.C.; Hubert, D.; Hughey, B.; Hulth, P.O.; Hultqvist, K.; Hundertmark, S.; Jacobsen, J.; Karle, A.; Kestel, M.; Koepke, L.; Kowalski, M.; Kuehn, K.; Lamoureux, J.I.; Leich, H.; Leuthold, M.; Lindahl, P.; Liubarsky, I.; Madsen, J.; Marciniewski, P.; Matis, H.S.; McParland, C.P.; Messarius, T.; Minaeva, Y.; Miocinovic, P.; Mock, P.C.; Morse, R.; Muenich, K.S.; Nam, J.; Nahnhauer, R.; Neunhoeffer, T.; Niessen, P.; Nygren, D.R.; Oegelman, H.; Olbrechts, Ph.; Perez de los Heros, C.; Pohl, A.C.; Porrata, R.; Price, P.B.; Przybylski, G.T.; Rawlins, K.; Resconi, E.; Rhode, W.; Ribordy, M.; Richter, S.; Rodriguez Martino, J.; Ross, D.; Sander, H.-G.; Schinarakis, K.; Schlenstedt, S.; Schmidt, T.; Schneider, D.; Schwarz, R.; Silvestri, A.; Solarz, M.; Spiczak, G.M.; Spiering, C.; Stamatikos, M.; Steele, D.; Steffen, P.; Stokstad, R.G.; Sulanke, K.-H.; Streicher, O.; Taboada, I.; Thollander, L.; Tilav, S.; Wagner, W.; Walck, C.; Wang, Y.-R.; Wiebusch, C.H.; Wiedemann, C.; Wischnewski, R.; Wissing, H.; Woschnagg, K.; Yodh, G.

    2004-01-01

    The Antarctic Muon And Neutrino Detector Array (AMANDA) is a high-energy neutrino telescope operating at the geographic South Pole. It is a lattice of photo-multiplier tubes buried deep in the polar ice between 1500 and 2000 m. The primary goal of this detector is to discover astrophysical sources of high-energy neutrinos. A high-energy muon neutrino coming through the earth from the Northern Hemisphere can be identified by the secondary muon moving upward through the detector. The muon tracks are reconstructed with a maximum likelihood method, which models the arrival times and amplitudes of Cherenkov photons registered by the photo-multipliers. This paper describes the different methods of reconstruction that have been successfully implemented within AMANDA. Strategies for optimizing the reconstruction performance and rejecting background are presented. For a typical analysis procedure the directions of tracks are reconstructed with about 2 degree accuracy.

  19. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy incorporating a nonlocal means induced (NLMi) prior (NLMi-MAP), which exploits global similarity information in the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and several iterative reconstruction methods clearly demonstrate that the NLMi-MAP method performs better in lowering noise, preserving image edges, and achieving a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
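
    The feedback construction described above amounts to approximating the prior-energy gradient by the NLM residual, grad U(x) ≈ x - NLM(x). A minimal sketch of how such a term could enter a one-step-late MAP-EM update follows; the one-step-late form, the use of scikit-image's denoise_nl_means as the NLM filter, and all hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from skimage.restoration import denoise_nl_means

def nlmi_map_em_step(x, y, A, beta=0.1):
    """One MAP-EM iteration with an NLM-induced prior.
    x: current 2D float image, y: measured counts (1D),
    A: dense system matrix mapping x.ravel() to y."""
    # NLM residual as a surrogate for the prior-energy gradient.
    prior_grad = x - denoise_nl_means(x, patch_size=3, patch_distance=5, h=0.05)
    sens = A.sum(axis=0).reshape(x.shape)                 # sensitivity A^T 1
    ratio = (A.T @ (y / (A @ x.ravel() + 1e-12))).reshape(x.shape)
    # One-step-late: the prior gradient is evaluated at the current estimate.
    return x * ratio / (sens + beta * prior_grad + 1e-12)
```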

  20. Blind spectrum reconstruction algorithm with L0-sparse representation

    International Nuclear Information System (INIS)

    Liu, Hai; Zhang, Zhaoli; Liu, Sanyan; Shu, Jiangbo; Liu, Tingting; Zhang, Tianxu

    2015-01-01

    Raman spectra often suffer from band overlap and Poisson noise. This paper presents a new blind Poissonian Raman spectrum reconstruction method, which incorporates the L0-sparse prior together with the total variation constraint into the maximum a posteriori framework. Furthermore, the greedy analysis pursuit algorithm is adopted to solve the L0-based minimization problem. Simulated and real spectrum experimental results show that the proposed method can effectively preserve spectral structure and suppress noise. The reconstructed Raman spectra are easily used for interpreting unknown chemical mixtures. (paper)
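
    The paper's solver is greedy analysis pursuit, which is too involved to reproduce here; as a minimal runnable point of reference, the sketch below implements only the Poisson-likelihood part of such a reconstruction, i.e. Richardson-Lucy deconvolution of an overlapped spectrum. The L0 and total-variation terms of the paper are omitted, and the kernel and iteration count are illustrative.

```python
import numpy as np

def richardson_lucy(y, kernel, n_iter=50):
    """Poisson-likelihood deconvolution of spectrum y by a known kernel."""
    x = np.full(y.shape, float(np.mean(y)))   # flat initial estimate
    k_rev = kernel[::-1]                      # adjoint of the convolution
    for _ in range(n_iter):
        conv = np.convolve(x, kernel, mode='same') + 1e-12
        x = x * np.convolve(y / conv, k_rev, mode='same')  # RL multiplicative update
    return x
```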

  1. Development of Image Reconstruction Algorithms in electrical Capacitance Tomography

    International Nuclear Information System (INIS)

    Fernandez Marron, J. L.; Alberdi Primicia, J.; Barcala Riveira, J. M.

    2007-01-01

    Electrical Capacitance Tomography (ECT) has not yet matured enough to be used at the industrial level. This is due, first, to the difficulty of measuring very small capacitances (in the range of femtofarads) and, second, to the problem of on-line image reconstruction. The latter problem also stems from the small number of electrodes (at most 16), which causes the usual reconstruction algorithms to produce many errors. This work describes a new, purely geometrical method that could be used for this purpose. (Author) 4 refs

  2. The dynamic effect of exchange-rate volatility on Turkish exports: Parsimonious error-correction model approach

    Directory of Open Access Journals (Sweden)

    Demirhan Erdal

    2015-01-01

    Full Text Available This paper aims to investigate the effect of exchange-rate stability on real export volume in Turkey, using monthly data for the period February 2001 to January 2010. The Johansen multivariate cointegration method and the parsimonious error-correction model are applied to determine long-run and short-run relationships between real export volume and its determinants. In this study, the conditional variance of the GARCH(1,1) model is taken as a proxy for exchange-rate stability, and generalized impulse-response functions and variance-decomposition analyses are applied to analyze the dynamic effects of variables on real export volume. The empirical findings suggest that exchange-rate stability has a significant positive effect on real export volume, both in the short and the long run.
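
    For reference, the GARCH(1,1) conditional variance used as the volatility proxy follows the recursion sigma2_t = omega + alpha*eps_{t-1}^2 + beta*sigma2_{t-1}. A minimal sketch of that recursion; the coefficients and toy data are illustrative, not the paper's estimates.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) model."""
    eps = returns - returns.mean()
    sigma2 = np.empty_like(eps)
    sigma2[0] = eps.var()                  # a common initialization choice
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]
    return sigma2

vol_proxy = garch11_variance(np.random.randn(120), omega=0.1, alpha=0.1, beta=0.8)
```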

  3. 3D Global Coronal Density Structure and Associated Magnetic Field near Solar Maximum

    Energy Technology Data Exchange (ETDEWEB)

    Kramar, Maxim [Physics Department, The Catholic University of America, Washington, DC (United States); Airapetian, Vladimir [Department of Physics and Astronomy, George Mason University, Fairfax, VA (United States); NASA/Goddard Space Flight Center, Code 671, Greenbelt, MD (United States); Lin, Haosheng, E-mail: vladimir.airapetian@nasa.gov [College of Natural Sciences, Institute for Astronomy, University of Hawaii at Manoa, Pukalani, HI (United States)

    2016-08-09

    Measurement of the coronal magnetic field is a crucial ingredient in understanding the nature of solar coronal dynamic phenomena at all scales. We employ STEREO/COR1 data obtained near maximum of solar activity in December 2012 (Carrington rotation, CR 2131) to retrieve and analyze the three-dimensional (3D) coronal electron density in the range of heights from 1.5 to 4 R{sub ⊙} using a tomography method and qualitatively deduce structures of the coronal magnetic field. The 3D electron density analysis is complemented by the 3D STEREO/EUVI emissivity in 195 Å band obtained by tomography for the same CR period. We find that the magnetic field configuration during CR 2131 has a tendency to become radially open at heliocentric distances below ~2.5 R{sub ⊙}. We compared the reconstructed 3D coronal structures over the CR near the solar maximum to the one at deep solar minimum. Results of our 3D density reconstruction will help to constrain solar coronal field models and test the accuracy of the magnetic field approximations for coronal modeling.

  4. 3D Global Coronal Density Structure and Associated Magnetic Field near Solar Maximum

    Directory of Open Access Journals (Sweden)

    Maxim Kramar

    2016-08-01

    Full Text Available Measurement of the coronal magnetic field is a crucial ingredient in understanding the nature of solar coronal dynamic phenomena at all scales. We employ STEREO/COR1 data obtained near maximum of solar activity in December 2012 (Carrington rotation, CR 2131) to retrieve and analyze the three-dimensional (3D) coronal electron density in the range of heights from $1.5$ to $4\,R_\odot$ using a tomography method and qualitatively deduce structures of the coronal magnetic field. The 3D electron density analysis is complemented by the 3D STEREO/EUVI emissivity in the 195 Å band obtained by tomography for the same CR period. We find that the magnetic field configuration during CR 2131 has a tendency to become radially open at heliocentric distances below $\sim 2.5\,R_\odot$. We compared the reconstructed 3D coronal structures over the CR near the solar maximum to the one at deep solar minimum. Results of our 3D density reconstruction will help to constrain solar coronal field models and test the accuracy of the magnetic field approximations for coronal modeling.

  5. Maximum Entropy Method in Moessbauer Spectroscopy - a Problem of Magnetic Texture

    International Nuclear Information System (INIS)

    Satula, D.; Szymanski, K.; Dobrzynski, L.

    2011-01-01

    A reconstruction of the three-dimensional distribution of the hyperfine magnetic field, isomer shift and texture parameter z from Moessbauer spectra by the maximum entropy method is presented. The method was tested on a simulated spectrum consisting of two Gaussian hyperfine field distributions with different values of the texture parameters. It is shown that a proper prior has to be chosen in order to arrive at physically meaningful results. (authors)

  6. 3.5D dynamic PET image reconstruction incorporating kinetics-based clusters

    International Nuclear Information System (INIS)

    Lu Lijun; Chen Wufan; Karakatsanis, Nicolas A; Rahmim, Arman; Tang Jing

    2012-01-01

    Standard 3D dynamic positron emission tomographic (PET) imaging consists of independent image reconstructions of individual frames followed by application of an appropriate kinetic model to the time-activity curves at the voxel or region-of-interest (ROI) level. The emerging field of 4D PET reconstruction, by contrast, seeks to move beyond this scheme and incorporate information from multiple frames within the image reconstruction task. Here we propose a novel reconstruction framework aiming to enhance the quantitative accuracy of parametric images via the introduction of priors based on voxel kinetics, generated via clustering of preliminary reconstructed dynamic images to define clustered neighborhoods of voxels with similar kinetics. This is then followed by straightforward maximum a posteriori (MAP) 3D PET reconstruction applied to individual frames; as such, the method is labeled ‘3.5D’ image reconstruction. The use of cluster-based priors has the advantage of further enhancing quantitative performance in dynamic PET imaging, because (a) there are typically more voxels in clusters than in conventional local neighborhoods, and (b) neighboring voxels with distinct kinetics are less likely to be clustered together. Using realistic simulated 11C-raclopride dynamic PET data, the quantitative performance of the proposed method was investigated. Parametric distribution-volume (DV) and DV ratio (DVR) images were estimated from dynamic image reconstructions using (a) maximum-likelihood expectation maximization (MLEM), and MAP reconstructions using (b) the quadratic prior (QP-MAP), (c) the Green prior (GP-MAP) and (d, e) two proposed cluster-based priors (CP-U-MAP and CP-W-MAP), followed by graphical modeling, and were qualitatively and quantitatively compared for 11 ROIs. Overall, the proposed dynamic PET reconstruction methodology resulted in substantial visual as well as quantitative accuracy improvements (in terms of noise versus bias performance) for parametric DV
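
    The clustering step that defines the prior neighborhoods can be sketched directly: cluster voxel time-activity curves from a preliminary dynamic reconstruction, and let voxels sharing a label form the MAP prior's neighborhood. The use of scikit-learn's KMeans and the cluster count below are illustrative choices, not the authors' clustering procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def kinetic_clusters(dynamic_images, n_clusters=8):
    """dynamic_images: (n_frames, nx, ny) preliminary reconstructions.
    Returns an (nx, ny) map of cluster labels; voxels sharing a label
    form the clustered neighborhood used by the MAP prior."""
    n_frames = dynamic_images.shape[0]
    tacs = dynamic_images.reshape(n_frames, -1).T   # one TAC per voxel
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(tacs)
    return labels.reshape(dynamic_images.shape[1:])
```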

  7. Image reconstruction methods for the PBX-M pinhole camera

    International Nuclear Information System (INIS)

    Holland, A.; Powell, E.T.; Fonck, R.J.

    1990-03-01

    This paper describes two methods which have been used to reconstruct the soft x-ray emission profile of the PBX-M tokamak from the projected images recorded by the PBX-M pinhole camera. Both methods must accurately represent the shape of the reconstructed profile while also providing a degree of immunity to noise in the data. The first method is a simple least squares fit to the data. This has the advantage of being fast and small, and thus easily implemented on the PDP-11 computer used to control the video digitizer for the pinhole camera. The second method involves the application of a maximum entropy algorithm to an overdetermined system. This has the advantage of allowing the use of a default profile. This profile contains additional knowledge about the plasma shape which can be obtained from equilibrium fits to the external magnetic measurements. Additionally the reconstruction is guaranteed positive, and the fit to the data can be relaxed by specifying both the amount and distribution of noise in the image. The algorithm described has the advantage of being considerably faster, for an overdetermined system, than the usual Lagrange multiplier approach to finding the maximum entropy solution. 13 refs., 24 figs
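
    The first of the two methods, a linear least-squares fit, can be sketched in a few lines; a small Tikhonov (ridge) term is added here as a stand-in for the noise immunity the text calls for. The matrix G mapping profile coefficients to predicted camera data is assumed given; names and the regularization weight are illustrative, not the PBX-M code.

```python
import numpy as np

def fit_profile(G, d, lam=1e-3):
    """Ridge-regularized least squares: solve (G^T G + lam I) a = G^T d
    for the profile coefficients a given projected data d."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)
```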

  8. Truncation artifact suppression in cone-beam radionuclide transmission CT using maximum likelihood techniques: evaluation with human subjects

    International Nuclear Information System (INIS)

    Manglos, S.H.

    1992-01-01

    Transverse image truncation can be a serious problem for human imaging using cone-beam transmission CT (CB-CT) implemented on a conventional rotating gamma camera. This paper presents a reconstruction method to reduce or eliminate the artifacts resulting from the truncation. The method uses a previously published transmission maximum likelihood EM algorithm, adapted to the cone-beam geometry. The reconstruction method is evaluated qualitatively using three human subjects of various dimensions and various degrees of truncation. (author)

  9. Anelastic sensitivity kernels with parsimonious storage for adjoint tomography and full waveform inversion

    KAUST Repository

    Komatitsch, Dimitri; Xie, Zhinan; Bozdağ, Ebru; de Andrade, Elliott Sales; Peter, Daniel; Liu, Qinya; Tromp, Jeroen

    2016-01-01

    We introduce a technique to compute exact anelastic sensitivity kernels in the time domain using parsimonious disk storage. The method is based on a reordering of the time loop of time-domain forward/adjoint wave propagation solvers combined with the use of a memory buffer. It avoids instabilities that occur when time-reversing dissipative wave propagation simulations. The total number of required time steps is unchanged compared to usual acoustic or elastic approaches. The cost is reduced by a factor of 4/3 compared to the case in which anelasticity is partially accounted for by accommodating the effects of physical dispersion. We validate our technique by performing a test in which we compare the Kα sensitivity kernel to the exact kernel obtained by saving the entire forward calculation. This benchmark confirms that our approach is also exact. We illustrate the importance of including full attenuation in the calculation of sensitivity kernels by showing significant differences with physical-dispersion-only kernels.

  10. Anelastic sensitivity kernels with parsimonious storage for adjoint tomography and full waveform inversion

    KAUST Repository

    Komatitsch, Dimitri

    2016-06-13

    We introduce a technique to compute exact anelastic sensitivity kernels in the time domain using parsimonious disk storage. The method is based on a reordering of the time loop of time-domain forward/adjoint wave propagation solvers combined with the use of a memory buffer. It avoids instabilities that occur when time-reversing dissipative wave propagation simulations. The total number of required time steps is unchanged compared to usual acoustic or elastic approaches. The cost is reduced by a factor of 4/3 compared to the case in which anelasticity is partially accounted for by accommodating the effects of physical dispersion. We validate our technique by performing a test in which we compare the Kα sensitivity kernel to the exact kernel obtained by saving the entire forward calculation. This benchmark confirms that our approach is also exact. We illustrate the importance of including full attenuation in the calculation of sensitivity kernels by showing significant differences with physical-dispersion-only kernels.

  11. Anelastic sensitivity kernels with parsimonious storage for adjoint tomography and full waveform inversion

    Science.gov (United States)

    Komatitsch, Dimitri; Xie, Zhinan; Bozdaǧ, Ebru; Sales de Andrade, Elliott; Peter, Daniel; Liu, Qinya; Tromp, Jeroen

    2016-09-01

    We introduce a technique to compute exact anelastic sensitivity kernels in the time domain using parsimonious disk storage. The method is based on a reordering of the time loop of time-domain forward/adjoint wave propagation solvers combined with the use of a memory buffer. It avoids instabilities that occur when time-reversing dissipative wave propagation simulations. The total number of required time steps is unchanged compared to usual acoustic or elastic approaches. The cost is reduced by a factor of 4/3 compared to the case in which anelasticity is partially accounted for by accommodating the effects of physical dispersion. We validate our technique by performing a test in which we compare the Kα sensitivity kernel to the exact kernel obtained by saving the entire forward calculation. This benchmark confirms that our approach is also exact. We illustrate the importance of including full attenuation in the calculation of sensitivity kernels by showing significant differences with physical-dispersion-only kernels.

  12. MLE [Maximum Likelihood Estimator] reconstruction of a brain phantom using a Monte Carlo transition matrix and a statistical stopping rule

    International Nuclear Information System (INIS)

    Veklerov, E.; Llacer, J.; Hoffman, E.J.

    1987-10-01

    In order to study properties of the Maximum Likelihood Estimator (MLE) algorithm for image reconstruction in Positron Emission Tomography (PET), the algorithm is applied to data obtained by the ECAT-III tomograph from a brain phantom. The procedure for subtracting accidental coincidences from the data stream generated by this physical phantom is such that the resultant data are not Poisson distributed. This makes the present investigation different from other investigations based on computer-simulated phantoms. It is shown that the MLE algorithm is robust enough to yield comparatively good images, especially when the phantom is in the periphery of the field of view, even though the underlying assumption of the algorithm is violated. Two transition matrices are utilized. The first uses geometric considerations only. The second is derived by a Monte Carlo simulation which takes into account Compton scattering in the detectors, positron range, etc. It is demonstrated that the images obtained from the Monte Carlo matrix are superior in some specific ways. A stopping rule derived earlier, allowing the user to stop the iterative process before the images begin to deteriorate, is tested. Since the rule is based on the Poisson assumption, it does not work well with the presently available data, although it is successful with computer-simulated Poisson data

  13. Glacial evolution in King George and Livingston Islands (Antarctica) since the Last Glacial Maximum based on cosmogenic nuclide dating and glacier surface reconstruction - CRONOANTAR project

    Science.gov (United States)

    Ruiz Fernández, Jesús; Oliva, Marc; Fernández Menéndez, Susana del Carmen; García Hernández, Cristina; Menéndez Duarte, Rosa Ana; Pellitero Ondicol, Ramón; Pérez Alberti, Augusto; Schimmelpfennig, Irene

    2017-04-01

    CRONOANTAR brings together researchers from Spain, Portugal, France and the United Kingdom with the objective of spatially and temporally reconstructing the deglaciation of the two largest islands in the South Shetlands Archipelago (Maritime Antarctica) since the Global Last Glacial Maximum. Glacier retreat in polar areas has major implications at local, regional and even planetary scales. Global average sea-level rise is the most obvious and socio-economically relevant, but there are others, such as the arrival of new fauna to deglaciated areas, plant colonisation, and permafrost formation and degradation. This project will study the ice-free areas in Byers and Hurd peninsulas (Livingston Island) and Fildes and Potter peninsulas (King George Island). The chronology of ice-cap glacier retreat will be revealed by the use of cosmogenic isotopes (mainly 36Cl) on sedimentary and erosional records of glacial origin. Cosmogenic dating will be complemented by other dating methods (14C and OSL), which will permit the validation of these methods in regions with cold-based glaciers. Given the geomorphological evidence and the obtained ages, a deglaciation calendar will be proposed, and we will use a GIS methodology to reconstruct the glacier extent and ice thickness. The results emerging from this project will allow us to assess whether the high glacier retreat rates observed during the last decades were also registered in the past, or whether they are conversely the consequence (and evidence) of Global Change in Antarctica. Acknowledgements This work has been funded by the Spanish Ministry of Economy, Industry and Competitiveness (Reference: CTM2016-77878-P).

  14. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (123I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using 123I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired of a phantom containing a hot sphere of 123I activity in a lower-level background 123I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes

  15. Matrix-based image reconstruction methods for tomography

    International Nuclear Information System (INIS)

    Llacer, J.; Meng, J.D.

    1984-10-01

    Matrix methods of image reconstruction have not been used, in general, because of the large size of practical matrices, ill-conditioning upon inversion, and the success of Fourier-based techniques. An exception is the work that has been done at the Lawrence Berkeley Laboratory for imaging with accelerated radioactive ions. An extension of that work into more general imaging problems shows that, with a correct formulation of the problem, positron tomography with ring geometries results in well-behaved matrices which can be used for image reconstruction with no distortion of the point response in the field of view and flexibility in the design of the instrument. Maximum Likelihood Estimator methods of reconstruction, which use system matrices tailored to specific instruments and do not need matrix inversion, are shown to result in good preliminary images. A parallel processing computer structure based on multiple inexpensive microprocessors is proposed as a system to implement the matrix-MLE methods. 14 references, 7 figures
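
    The MLE reconstruction described here works directly with the stored system matrix and requires no inversion. A minimal ML-EM sketch for a dense system matrix A (rows indexing detector pairs, columns indexing pixels) is given below; shapes, initialization, and iteration count are illustrative assumptions.

```python
import numpy as np

def ml_em(A, y, n_iter=30):
    """ML-EM: x <- x * A^T(y / Ax) / A^T 1, elementwise."""
    x = np.ones(A.shape[1])                # uniform start
    sens = A.sum(axis=0)                   # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x + 1e-12               # forward projection
        x *= (A.T @ (y / proj)) / (sens + 1e-12)
    return x
```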

  16. Image reconstruction using three-dimensional compound Gauss-Markov random field in emission computed tomography

    International Nuclear Information System (INIS)

    Watanabe, Shuichi; Kudo, Hiroyuki; Saito, Tsuneo

    1993-01-01

    In this paper, we propose a new reconstruction algorithm based on the MAP (maximum a posteriori probability) estimation principle for emission tomography. To improve the noise suppression properties of the conventional ML-EM (maximum likelihood expectation maximization) algorithm, direct three-dimensional reconstruction that utilizes intensity correlations between adjacent transaxial slices is introduced. Moreover, to avoid oversmoothing of edges, a priori knowledge of the RI (radioisotope) distribution is represented using a doubly stochastic image model called the compound Gauss-Markov random field. The a posteriori probability is maximized using the iterative GEM (generalized EM) algorithm. Computer simulation results are shown to demonstrate the validity of the proposed algorithm. (author)

  17. Phylogenetic inferences of Nepenthes species in Peninsular Malaysia revealed by chloroplast (trnL intron) and nuclear (ITS) DNA sequences.

    Science.gov (United States)

    Bunawan, Hamidun; Yen, Choong Chee; Yaakop, Salmah; Noor, Normah Mohd

    2017-01-26

    The chloroplastic trnL intron and the nuclear internal transcribed spacer (ITS) region were sequenced for 11 Nepenthes species recorded in Peninsular Malaysia to examine their phylogenetic relationships and to evaluate the utility of trnL intron and ITS sequences for phylogenetic reconstruction of this genus. Phylogeny reconstruction was carried out using neighbor-joining, maximum parsimony and Bayesian analyses. All the trees revealed two major clusters: a lowland group consisting of N. ampullaria, N. mirabilis, N. gracilis and N. rafflesiana, and another containing both intermediately distributed species (N. albomarginata and N. benstonei) and four highland species (N. sanguinea, N. macfarlanei, N. ramispina and N. alba). The trnL intron and ITS sequences proved to provide phylogenetically informative characters for deriving a phylogeny of Nepenthes species in Peninsular Malaysia. To our knowledge, this is the first molecular phylogenetic study of Nepenthes species occurring along an altitudinal gradient in Peninsular Malaysia.
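
    Since maximum parsimony is one of the criteria used above (and throughout these records), a minimal Fitch small-parsimony sketch is worth having at hand: it counts the minimum number of character changes a single aligned site requires on a given rooted binary tree. The nested-tuple tree encoding is an assumption for illustration.

```python
def fitch(tree, states):
    """tree: nested tuples of leaf names, e.g. ((('A','B'),'C'),'D');
    states: dict leaf name -> character state.
    Returns (candidate state set, minimum number of changes)."""
    if isinstance(tree, str):               # leaf
        return {states[tree]}, 0
    (left, lcost), (right, rcost) = fitch(tree[0], states), fitch(tree[1], states)
    inter = left & right
    if inter:                               # children agree: no extra change
        return inter, lcost + rcost
    return left | right, lcost + rcost + 1  # disagreement: one substitution

sets, changes = fitch((('t1', 't2'), ('t3', 't4')),
                      {'t1': 'A', 't2': 'A', 't3': 'G', 't4': 'G'})
# changes == 1: a single mutation explains this site on this topology.
```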

  18. Compton scatter and randoms corrections for origin ensembles 3D PET reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, Arkadiusz [Harvard Medical School, Boston, MA (United States). Dept. of Radiology; Brigham and Women's Hospital, Boston, MA (United States); Kadrmas, Dan J. [Utah Univ., Salt Lake City, UT (United States). Utah Center for Advanced Imaging Research (UCAIR)

    2011-07-01

    In this work we develop a novel approach to correcting for scatter and randoms in the reconstruction of data acquired by 3D positron emission tomography (PET), applicable to tomographic reconstruction performed with the origin ensemble (OE) approach. Statistical image reconstruction using OE is based on calculating the expectations of the numbers of emitted events per voxel over the complete-data space. Since OE estimation is fundamentally different from standard statistical estimators, such as those based on maximum likelihood, the usual implementations of scatter and randoms corrections cannot be used. Based on prompt, scatter, and random rates, each detected event is graded in terms of its probability of being a true event. These grades are utilized by the Markov chain Monte Carlo (MCMC) algorithm used in the OE approach to calculate the expectation, over the complete-data space, of the number of emitted events per voxel (the OE estimator). We show that the results obtained with OE are almost identical to those obtained with the maximum likelihood-expectation maximization (ML-EM) algorithm for experimental phantom data acquired using a Siemens Biograph mCT 3D PET/CT scanner. The developed correction removes artifacts due to scatter and randoms in the investigated 3D PET datasets. (orig.)

  19. Reconstruction of far-field tsunami amplitude distributions from earthquake sources

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2016-01-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distribution for all subduction zones to an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparisons to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases, the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.
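
    The empirical building block of this procedure is a maximum likelihood fit of a tapered Pareto distribution to observed amplitudes. A minimal sketch follows, using the usual tapered-Pareto density with power-law exponent beta, corner parameter theta and lower cutoff a; the variable names and toy data are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

def tapered_pareto_nll(params, x, a):
    """Negative log-likelihood of the tapered Pareto:
    f(x) = (beta/x + 1/theta) * (a/x)**beta * exp((a - x)/theta), x >= a."""
    beta, theta = params
    if beta <= 0 or theta <= 0:
        return np.inf
    logf = np.log(beta / x + 1.0 / theta) + beta * np.log(a / x) + (a - x) / theta
    return -logf.sum()

x = np.array([1.2, 1.5, 2.1, 3.0, 4.8, 7.5])   # toy amplitudes, cutoff a = 1
fit = minimize(tapered_pareto_nll, x0=[1.0, 5.0], args=(x, 1.0),
               method='Nelder-Mead')
beta_hat, theta_hat = fit.x
```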

  20. Maximum likelihood reconstruction for pinhole SPECT with a displaced center-of-rotation

    International Nuclear Information System (INIS)

    Li, J.; Jaszczak, R.J.; Coleman, R.E.

    1995-01-01

    In this paper, the authors describe the implementation of a maximum likelihood (ML) algorithm using expectation maximization (EM) for pinhole SPECT with a displaced center-of-rotation. A ray-tracing technique is used in implementing the ML-EM algorithm. The proposed ML-EM algorithm is able to correct the center-of-rotation displacement, which can be characterized by two orthogonal components. The algorithm is tested using experimentally acquired data, and the results demonstrate that the pinhole ML-EM algorithm is able to correct artifacts associated with the center-of-rotation displacement

  1. Reconstructing Atmospheric CO2 Through The Paleocene-Eocene Thermal Maximum Using Stomatal Index and Stomatal Density Values From Ginkgo adiantoides

    Science.gov (United States)

    Barclay, R. S.; Wing, S. L.

    2013-12-01

    The Paleocene-Eocene Thermal Maximum (PETM) was a geologically brief interval of intense global warming 56 million years ago. It is arguably the best geological analog for a worst-case scenario of anthropogenic carbon emissions. The PETM is marked by a ~4-6‰ negative carbon isotope excursion (CIE) and extensive marine carbonate dissolution, which together are powerful evidence for a massive addition of carbon to the oceans and atmosphere. In spite of broad agreement that the PETM reflects a large carbon cycle perturbation, atmospheric concentrations of CO2 (pCO2) during the event are not well constrained. The goal of this study is to produce a high resolution reconstruction of pCO2 using stomatal frequency proxies (both stomatal index and stomatal density) before, during, and after the PETM. These proxies rely upon a genetically controlled mechanism whereby plants decrease the proportion of gas-exchange pores (stomata) in response to increased pCO2. Terrestrial sections in the Bighorn Basin, Wyoming, contain macrofossil plants with cuticle immediately bracketing the PETM, as well as dispersed plant cuticle from within the body of the CIE. These fossils allow for the first stomatal-based reconstruction of pCO2 near the Paleocene-Eocene boundary; we also use them to determine the relative timing of pCO2 change in relation to the CIE that defines the PETM. Preliminary results come from macrofossil specimens of Ginkgo adiantoides, collected from an ~200ka interval prior to the onset of the CIE (~230-30ka before), and just after the 'recovery interval' of the CIE. Stomatal index values decreased by 37% within an ~70ka time interval at least 100ka prior to the onset of the CIE. The decrease in stomatal index is interpreted as a significant increase in pCO2, and has a magnitude equivalent to the entire range of stomatal index adjustment observed in modern Ginkgo biloba during the anthropogenic CO2 rise during the last 150 years. The inferred CO2 increase prior to the

  2. The efficiency of different search strategies in estimating parsimony jackknife, bootstrap, and Bremer support

    Directory of Open Access Journals (Sweden)

    Müller Kai F

    2005-10-01

    Full Text Available Abstract Background For parsimony analyses, the most common way to estimate confidence is by resampling plans (nonparametric bootstrap, jackknife) and Bremer support (decay indices). The recent literature reveals that parameter settings that are quite commonly employed are not those that are recommended by theoretical considerations and by previous empirical studies. The optimal search strategy to be applied during resampling was previously addressed solely via standard search strategies available in PAUP*. The question of a compromise between search extensiveness and improved support accuracy for Bremer support received even less attention. A set of experiments was conducted on different datasets to find an empirical cut-off point at which increased search extensiveness does not significantly change Bremer support and jackknife or bootstrap proportions any more. Results For the number of replicates needed for accurate estimates of support in resampling plans, a diagram is provided that helps to address the question whether apparently different support values really differ significantly. It is shown that the use of random addition cycles and parsimony ratchet iterations during bootstrapping does not translate into higher support, nor does any extension of the search extensiveness beyond the rather moderate effort of TBR (tree bisection and reconnection) branch swapping plus saving one tree per replicate. Instead, in case of very large matrices, saving more than one shortest tree per iteration and using a strict consensus tree of these yields decreased support compared to saving only one tree. This can be interpreted as a small risk of overestimating support but should be more than compensated by other factors that counteract an enhanced type I error. With regard to Bremer support, a rule of thumb can be derived stating that not much is gained relative to the surplus computational effort when searches are extended beyond 20 ratchet iterations per

  3. CAT reconstruction and potting comparison of a LMFBR fuel bundle

    International Nuclear Information System (INIS)

    Betten, P.R.; Tow, D.M.

    1984-04-01

    A standard Liquid Metal Fast Breeder Reactor (LMFBR) subassembly used in the Experimental Breeder Reactor II (EBR-II) was investigated, by remote techniques, for fuel bundle distortion by both nondestructive and destructive methods, and the results from both methods were compared. The non-destructive method employed neutron tomography to reconstruct the locations of fuel elements through the use of a maximum entropy reconstruction algorithm known as MENT. The destructive method consisted of ''potting'' (a technique that embeds and permanently fixes the fuel elements in a solid matrix) the subassembly, and then cutting and polishing the individual sections. The comparison indicated that the tomography reconstruction provided good results in describing the bundle geometry and spacer-wire locations, with the overall resolution being on the order of a spacer-wire diameter. A dimensional consistency check indicated that the element and spacer-wire dimensions were accurately reproduced in the reconstruction

  4. A multicenter evaluation of seven commercial ML-EM algorithms for SPECT image reconstruction using simulation data

    International Nuclear Information System (INIS)

    Matsumoto, Keiichi; Ohnishi, Hideo; Niida, Hideharu; Nishimura, Yoshihiro; Wada, Yasuhiro; Kida, Tetsuo

    2003-01-01

    The maximum likelihood expectation maximization (ML-EM) algorithm has become available as an alternative to filtered back projection in SPECT. The actual physical performance may differ depending on the manufacturer and model because of differences in computational details. The purpose of this study was to investigate the characteristics of seven different ML-EM algorithms using simple simulation data. Seven ML-EM algorithm programs were used: Genie (GE), esoft (Siemens), HARP-III (Hitachi), GMS-5500UI (Toshiba), Pegasys (ADAC), ODYSSEY-FX (Marconi), and Windows-PC (original software). Projection data of a 2-pixel-wide line source in the center of the field of view were simulated without attenuation or scatter. Images were reconstructed with ML-EM by changing the number of iterations from 1 to 45 for each algorithm. Image quality was evaluated after reconstruction using the full width at half maximum (FWHM), the full width at tenth maximum (FWTM), and the total counts of the reconstructed images. At the maximum number of iterations, the difference in FWHM values was up to 1.5 pixels, and that in FWTM values no less than 2.0 pixels. The total counts of the reconstructed images in the initial few iterations were larger or smaller than the converged value, depending on the initial values. Our results for even the simplest simulation data suggest that each ML-EM implementation yields its own characteristic image. We should keep in mind which algorithm is being used and its computational details when physical and clinical usefulness are compared. (author)
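
    The resolution measures compared across the seven implementations are easy to state precisely: FWHM and FWTM are the widths of the reconstructed line-spread profile at 50% and 10% of its maximum, found here by linear interpolation at the crossings. A small sketch with an illustrative Gaussian profile:

```python
import numpy as np

def full_width(profile, fraction):
    """Width (in pixels) of `profile` at `fraction` of its maximum."""
    level = fraction * profile.max()
    above = np.where(profile >= level)[0]
    lo, hi = above[0], above[-1]
    # Interpolate the crossings on each side for sub-pixel width.
    left = lo - 1 + (level - profile[lo-1]) / (profile[lo] - profile[lo-1])
    right = hi + (profile[hi] - level) / (profile[hi] - profile[hi+1])
    return right - left

profile = np.exp(-0.5 * ((np.arange(64) - 32) / 2.5) ** 2)
fwhm = full_width(profile, 0.5)     # ~ 2.355 * 2.5 pixels for a Gaussian
fwtm = full_width(profile, 0.1)
```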

  5. Anatomically-aided PET reconstruction using the kernel method.

    Science.gov (United States)

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T; Catana, Ciprian; Qi, Jinyi

    2016-09-21

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization algorithm.

  6. Iterative reconstruction with attenuation compensation from cone-beam projections acquired via nonplanar orbits

    International Nuclear Information System (INIS)

    Zeng, G.L.; Weng, Y.; Gullberg, G.T.

    1997-01-01

    Single photon emission computed tomography (SPECT) imaging with cone-beam collimators provides improved sensitivity and spatial resolution for imaging small objects with large field-of-view detectors. It is known that Tuy's cone-beam data sufficiency condition must be met to obtain artifact-free reconstructions. Even though Tuy's condition was derived for an attenuation-free situation, the authors hypothesize that an artifact-free reconstruction can be obtained even if the cone-beam data are attenuated, provided the imaging orbit satisfies Tuy's condition and the exact attenuation map is known. In the authors' studies, emission data are acquired using nonplanar circle-and-line orbits to acquire cone-beam data for tomographic reconstructions. An extended iterative ML-EM (maximum likelihood-expectation maximization) reconstruction algorithm is derived and used to reconstruct projection data with either a pre-acquired or assumed attenuation map. Quantitative accuracy of the attenuation corrected emission reconstruction is significantly improved

  7. Multiple sequence alignment accuracy and phylogenetic inference.

    Science.gov (United States)

    Ogden, T Heath; Rosenberg, Michael S

    2006-04-01

    Phylogenies are often thought to be more dependent upon the specifics of the sequence alignment rather than on the method of reconstruction. Simulation of sequences containing insertion and deletion events was performed in order to determine the role that alignment accuracy plays during phylogenetic inference. Data sets were simulated for pectinate, balanced, and random tree shapes under different conditions (ultrametric equal branch length, ultrametric random branch length, nonultrametric random branch length). Comparisons between hypothesized alignments and true alignments enabled determination of two measures of alignment accuracy, that of the total data set and that of individual branches. In general, our results indicate that as alignment error increases, topological accuracy decreases. This trend was much more pronounced for data sets derived from more pectinate topologies. In contrast, for balanced, ultrametric, equal branch length tree shapes, alignment inaccuracy had little average effect on tree reconstruction. These conclusions are based on average trends of many analyses under different conditions, and any one specific analysis, independent of the alignment accuracy, may recover very accurate or inaccurate topologies. Maximum likelihood and Bayesian, in general, outperformed neighbor joining and maximum parsimony in terms of tree reconstruction accuracy. Results also indicated that as the length of the branch and of the neighboring branches increase, alignment accuracy decreases, and the length of the neighboring branches is the major factor in topological accuracy. Thus, multiple-sequence alignment can be an important factor in downstream effects on topological reconstruction.

  8. Time-of-flight PET image reconstruction using origin ensembles

    Science.gov (United States)

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-01

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  9. A new iterative algorithm to reconstruct the refractive index.

    Science.gov (United States)

    Liu, Y J; Zhu, P P; Chen, B; Wang, J Y; Yuan, Q X; Huang, W X; Shu, H; Li, E R; Liu, X S; Zhang, K; Ming, H; Wu, Z Y

    2007-06-21

    The latest developments in x-ray imaging are associated with techniques based on phase contrast. However, the image reconstruction procedures demand significant improvements of the traditional methods, and/or new algorithms have to be introduced to take advantage of the high contrast and sensitivity of the new experimental techniques. In this letter, an improved iterative reconstruction algorithm based on the maximum likelihood expectation maximization technique is presented and discussed in order to reconstruct the distribution of the refractive index from data collected by an analyzer-based imaging setup. The technique considered probes the partial derivative of the refractive index with respect to an axis lying in the meridional plane and perpendicular to the propagation direction. Computer simulations confirm the reliability of the proposed algorithm. In addition, the comparison between an analytical reconstruction algorithm and the iterative method is also discussed, together with the convergence characteristics of the latter algorithm. Finally, we show how the proposed algorithm may be applied to reconstruct the distribution of the refractive index of an epoxy cylinder containing small air bubbles of about 300 μm in diameter.

  10. An attempt to reconstruct phylogenetic relationships within Caribbean nummulitids: simulating relationships and tracing character evolution

    Science.gov (United States)

    Eder, Wolfgang; Ives Torres-Silva, Ana; Hohenegger, Johann

    2017-04-01

    Phylogenetic analyses and trees based on molecular data are broadly applied and used to infer genetic and biogeographic relationships in recent larger foraminifera. Molecular phylogenetics is used intensively within recent nummulitids; for fossil representatives, however, such trees are of only minor informational value. Hence, within paleontological studies a phylogenetic approach through morphometric analysis is of much higher value. To tackle phylogenetic relationships within the nummulitid family, a much larger number of morphological characters must be measured than is commonly used in biometric studies, where mostly parameters describing embryonic size (e.g., proloculus diameter, deuteroloculus diameter) and/or the marginal spiral (e.g., spiral diagrams, spiral indices) are studied. For this purpose, 11 growth-independent and/or growth-invariant characters have been used to describe the morphological variability of equatorial thin sections of seven Caribbean nummulitid taxa (Nummulites striatoreticulatus, N. macgillavry, Palaeonummulites willcoxi, P. floridensis, P. soldadensis, P. trinitatensis and P. ocalanus) and one outgroup taxon (Ranikothalia bermudezi). Using these characters, phylogenetic trees were calculated using a restricted maximum likelihood algorithm (REML), and the results were cross-checked by ordination and cluster analysis. The square-change parsimony method was run to reconstruct ancestral states, as well as to simulate the evolution of the chosen characters along the calculated phylogenetic tree, and independent-contrast analysis was used to estimate confidence intervals. Based on these simulations, phylogenetic tendencies proposed for certain nummulitid characters (e.g., Cope's rule or nepionic acceleration) can be tested as to whether they are valid for the whole family or only for certain clades. At least, within the Caribbean nummulitids, phylogenetic trends along some growth-independent characters of the embryo (e.g., first

  11. Complete mitochondrial genomes reveal phylogeny relationship and evolutionary history of the family Felidae.

    Science.gov (United States)

    Zhang, W Q; Zhang, M H

    2013-09-03

    Many mitochondrial DNA sequences are used to estimate phylogenetic relationships among animal taxa and to perform molecular phylogenetic evolution analysis. With the continuous development of sequencing technology, numerous mitochondrial sequences have been released in public databases, especially complete mitochondrial DNA sequences. Using multiple sequences is better than using single sequences for phylogenetic analysis of animals because multiple sequences carry sufficient information for reconstructing the evolutionary process. Therefore, we performed phylogenetic analyses of 14 species of Felidae based on complete mitochondrial genome sequences, with Canis familiaris as an outgroup, using neighbor-joining, maximum likelihood, maximum parsimony, and Bayesian inference methods. The consensus phylogenetic trees supported the monophyly of Felidae, and the family could be divided into two subfamilies, Felinae and Pantherinae. The genus Panthera and the species tigris were also studied in detail. Meanwhile, the divergence of this family was estimated by phylogenetic analysis using the Bayesian method with a relaxed molecular clock, and the results were consistent with previous studies. In summary, the evolution of Felidae was reconstructed by phylogenetic analysis based on mitochondrial genome sequences. The described method may be broadly applicable for phylogenetic analyses of animal taxa.

  12. Molecular phylogeny of the spoonbills (Aves: Threskiornithidae) based on mitochondrial DNA

    Science.gov (United States)

    Chesser, R. Terry; Yeung, Carol K.L.; Yao, Cheng-Te; Tian, Xiu-Hua; Li, Shou-Hsien

    2010-01-01

    Spoonbills (genus Platalea) are a small group of wading birds, generally considered to constitute the subfamily Plataleinae (Aves: Threskiornithidae). We reconstructed phylogenetic relationships among the six species of spoonbills using variation in sequences of the mitochondrial genes ND2 and cytochrome b (total 1796 bp). Topologies of phylogenetic trees reconstructed using maximum likelihood, maximum parsimony, and Bayesian analyses were virtually identical and supported monophyly of the spoonbills. Most relationships within Platalea received strong support: P. minor and P. regia were closely related sister species, P. leucorodia was sister to the minor-regia clade, and P. alba was sister to the minor-regia-leucorodia clade. Relationships of P. flavipes and P. ajaja were less well resolved: these species either formed a clade that was sister to the four-species clade, or were successive sisters to this clade. This phylogeny is consistent with ideas of relatedness derived from spoonbill morphology. Our limited sampling of the Threskiornithinae (ibises), the putative sister group to the spoonbills, indicated that this group is paraphyletic, in agreement with previous molecular data; this suggests that separation of the Threskiornithidae into subfamilies Plataleinae and Threskiornithinae may not be warranted.

  13. Analysis of Acorus calamus chloroplast genome and its phylogenetic implications.

    Science.gov (United States)

    Goremykin, Vadim V; Holland, Barbara; Hirsch-Ernst, Karen I; Hellwig, Frank H

    2005-09-01

    Determining the phylogenetic relationships among the major lines of angiosperms is a long-standing problem, yet the uncertainty as to the phylogenetic affinity of these lines persists. While a number of studies have suggested that the ANITA (Amborella-Nymphaeales-Illiciales-Trimeniales-Aristolochiales) grade is basal within angiosperms, studies of complete chloroplast genome sequences also suggested an alternative tree, wherein the line leading to the grasses branches first among the angiosperms. To improve taxon sampling in the existing chloroplast genome data, we sequenced the chloroplast genome of the monocot Acorus calamus. We generated a concatenated alignment (89,436 positions for 15 taxa), encompassing almost all sequences usable for phylogeny reconstruction within spermatophytes. The data still contain support for both the ANITA-basal and grasses-basal hypotheses. Using simulations we can show that were the ANITA-basal hypothesis true, parsimony (and distance-based methods with many models) would be expected to fail to recover it. The self-evident explanation for this failure appears to be a long-branch attraction (LBA) between the clade of grasses and the out-group. However, this LBA cannot explain the discrepancies observed between tree topology recovered using the maximum likelihood (ML) method and the topologies recovered using the parsimony and distance-based methods when grasses are deleted. Furthermore, the fact that neither maximum parsimony nor distance methods consistently recover the ML tree, when according to the simulations they would be expected to, when the out-group (Pinus) is deleted, suggests that either the generating tree is not correct or the best symmetric model is misspecified (or both). We demonstrate that the tree recovered under ML is extremely sensitive to model specification and that the best symmetric model is misspecified. Hence, we remain agnostic regarding phylogenetic relationships among basal angiosperm lineages.

  14. Prediction of dissolved reactive phosphorus losses from small agricultural catchments: calibration and validation of a parsimonious model

    Directory of Open Access Journals (Sweden)

    C. Hahn

    2013-10-01

    Full Text Available Eutrophication of surface waters due to diffuse phosphorus (P) losses continues to be a severe water quality problem worldwide, causing the loss of ecosystem functions of the respective water bodies. Phosphorus in runoff often originates from only a small fraction of a catchment. Targeting mitigation measures to these critical source areas (CSAs) is expected to be most efficient and cost-effective, but requires suitable tools. Here we investigated the capability of the parsimonious Rainfall-Runoff-Phosphorus (RRP) model to identify CSAs in grassland-dominated catchments based on readily available soil and topographic data. After simultaneous calibration on runoff data from four small hilly catchments on the Swiss Plateau, the model was validated on a different catchment in the same region without further calibration. The RRP model adequately simulated the discharge and dissolved reactive P (DRP) export from the validation catchment. Sensitivity analysis showed that the model predictions were robust with respect to the classification of soils into "poorly drained" and "well drained", based on the available soil map. Comparing spatial hydrological model predictions with field data from the validation catchment provided further evidence that the assumptions underlying the model are valid and that the model adequately accounts for the dominant P export processes in the target region. Thus, the parsimonious RRP model is a valuable tool that can be used to determine CSAs. Despite the considerable predictive uncertainty regarding the spatial extent of CSAs, the RRP model can provide guidance for the implementation of mitigation measures. The model helps to identify those parts of a catchment where high DRP losses are expected or can be excluded with high confidence. Legacy P was predicted to be the dominant source of DRP losses and thus, in combination with hydrologically active areas, poses a high risk to water quality.

  15. Accurate phylogenetic tree reconstruction from quartets: a heuristic approach.

    Science.gov (United States)

    Reaz, Rezwana; Bayzid, Md Shamsuzzoha; Rahman, M Sohel

    2014-01-01

    Supertree methods construct trees on a set of taxa (species) by combining many smaller trees on overlapping subsets of the entire set of taxa. A 'quartet' is an unrooted tree over 4 taxa; hence quartet-based supertree methods combine many 4-taxon unrooted trees into a single coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have been receiving considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets.
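
    The core bookkeeping of any quartet-based method, checking whether a candidate tree satisfies a given quartet ab|cd, reduces to finding a clade that contains one pair and excludes the other. A tiny sketch using the same nested-tuple tree encoding as the parsimony example earlier; this is generic quartet accounting, not the authors' algorithm.

```python
def leaves(t):
    """Set of leaf names in a nested-tuple tree."""
    return {t} if isinstance(t, str) else leaves(t[0]) | leaves(t[1])

def clades(t, out=None):
    """Collect the leaf set of every subtree (each corresponds to an edge)."""
    if out is None:
        out = set()
    if not isinstance(t, str):
        for child in t:
            out.add(frozenset(leaves(child)))
            clades(child, out)
    return out

def satisfies(tree, quartet):
    """True if the tree induces quartet ((a, b), (c, d)), i.e. ab|cd."""
    (a, b), (c, d) = quartet
    return any(({a, b} <= s and not ({c, d} & s)) or
               ({c, d} <= s and not ({a, b} & s))
               for s in clades(tree))

print(satisfies(((('A', 'B'), 'C'), 'D'), (('A', 'B'), ('C', 'D'))))  # True
```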

  16. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    Science.gov (United States)

    Pereira, N. F.; Sitek, A.

    2010-09-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  17. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    International Nuclear Information System (INIS)

    Pereira, N F; Sitek, A

    2010-01-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  18. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, N F; Sitek, A, E-mail: nfp4@bwh.harvard.edu, E-mail: asitek@bwh.harvard.edu [Department of Radiology, Brigham and Women's Hospital-Harvard Medical School, Boston, MA (United States)

    2010-09-21

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  19. Variance-based Salt Body Reconstruction

    KAUST Repository

    Ovcharenko, Oleg

    2017-05-26

    Seismic inversions of salt bodies are challenging when updating velocity models based on Born-approximation-inspired gradient methods. We propose a variance-based method for velocity model reconstruction in regions complicated by massive salt bodies. The novel idea lies in retrieving useful information from simultaneous updates corresponding to different single frequencies. Instead of the commonly used averaging of single-iteration monofrequency gradients, our algorithm iteratively reconstructs salt bodies in an outer loop based on updates from a set of multiple frequencies after a few iterations of full-waveform inversion. The variance among these updates is used to identify areas where considerable cycle-skipping occurs. In such areas, we update velocities by interpolating maximum velocities within a certain region. The result of several recursive interpolations is later used as a new starting model to improve results of conventional full-waveform inversion. An application on part of the BP 2004 model highlights the evolution of the proposed approach and demonstrates its effectiveness.
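
    A minimal sketch of the variance step described above, assuming the single-frequency model updates have already been stacked into a NumPy array; the quantile threshold is an illustrative choice, not the authors' criterion.

        import numpy as np

        def high_variance_mask(updates, q=0.9):
            # updates: (n_freq, nz, nx) stack of single-frequency model updates.
            # Cells where the monofrequency updates disagree strongly are flagged
            # as likely cycle-skipped and become candidates for velocity
            # interpolation in the outer loop.
            var = updates.var(axis=0)
            return var > np.quantile(var, q)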

  20. Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty

    International Nuclear Information System (INIS)

    Ensslin, Torsten A.; Frommert, Mona

    2011-01-01

    The optimal reconstruction of cosmic metric perturbations and other signals requires knowledge of their power spectra and other parameters. If these are not known a priori, they have to be measured simultaneously from the same data used for the signal reconstruction. We formulate the general problem of signal inference in the presence of unknown parameters within the framework of information field theory. To solve this, we develop a generic parameter-uncertainty renormalized estimation (PURE) technique. As a concrete application, we address the problem of reconstructing Gaussian signals with an unknown power spectrum using five different approaches: (i) separate maximum-a-posteriori power-spectrum measurement and subsequent reconstruction, (ii) maximum-a-posteriori reconstruction with marginalized power-spectrum, (iii) maximizing the joint posterior of signal and spectrum, (iv) guessing the spectrum from the variance in the Wiener-filter map, and (v) renormalization flow analysis of the field-theoretical problem providing the PURE filter. In all cases, the reconstruction can be described or approximated as a Wiener-filter operation with an assumed signal spectrum derived from the data according to the same recipe, but with differing coefficients. All of these filters, except the renormalized one, exhibit a perception threshold in the case of a Jeffreys prior for the unknown spectrum. Data modes with variance below this threshold do not affect the signal reconstruction at all. Filter (iv) appears similar to the Karhunen-Loève and Feldman-Kaiser-Peacock estimators for galaxy power spectra used in cosmology, which should therefore also exhibit a marginal perception threshold if correctly implemented. We present statistical performance tests and show that the PURE filter is superior to the others, especially if the post-Wiener-filter corrections are included or in case an additional scale-independent spectral smoothness prior can be adopted.
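
    Filter (iv) admits a compact toy illustration: take the signal spectrum to be the power of the current Wiener-filter map and iterate to a fixed point. In this hypothetical 1D sketch, d holds the data's Fourier modes and N the per-mode noise power; modes whose data power is too low collapse to zero, reproducing the perception threshold the record describes.

        import numpy as np

        def wiener_fixed_point(d, N, n_iter=30, eps=1e-20):
            # Toy version of approach (iv): the assumed signal spectrum S is
            # re-read from the power of the current Wiener-filter map m.
            S = np.maximum(np.abs(d) ** 2 - N, eps)    # initial spectrum guess
            for _ in range(n_iter):
                m = S / (S + N) * d                    # Wiener-filter map
                S = np.maximum(np.abs(m) ** 2, eps)    # spectrum from the map
            return m, S
            # Modes with |d|^2 below a few times N are driven to S -> 0 and
            # drop out of the reconstruction: a perception threshold.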

  1. Maximum entropy deconvolution of low count nuclear medicine images

    International Nuclear Information System (INIS)

    McGrath, D.M.

    1998-12-01

    Maximum entropy is applied to the problem of deconvolving nuclear medicine images, with special consideration for very low count data. The physics of the formation of scintigraphic images is described, illustrating the phenomena which degrade planar estimates of the tracer distribution. Various techniques which are used to restore these images are reviewed, outlining the relative merits of each. The development and theoretical justification of maximum entropy as an image processing technique is discussed. Maximum entropy is then applied to the problem of planar deconvolution, highlighting the question of the choice of error parameters for low count data. A novel iterative version of the algorithm is suggested which allows the errors to be estimated from the predicted Poisson mean values. This method is shown to produce the exact results predicted by combining Poisson statistics and a Bayesian interpretation of the maximum entropy approach. A facility for total count preservation has also been incorporated, leading to improved quantification. In order to evaluate this iterative maximum entropy technique, two comparable methods, Wiener filtering and a novel Bayesian maximum likelihood expectation maximisation technique, were implemented. The comparison of results obtained indicated that this maximum entropy approach may produce equivalent or better measures of image quality than the compared methods, depending upon the accuracy of the system model used. The novel Bayesian maximum likelihood expectation maximisation technique was shown to be preferable over many existing maximum a posteriori methods due to its simplicity of implementation. A single parameter is required to define the Bayesian prior, which suppresses noise in the solution and may reduce the processing time substantially. Finally, maximum entropy deconvolution was applied as a pre-processing step in single photon emission computed tomography reconstruction of low count data. Higher contrast results were

  2. Catchment legacies and time lags: a parsimonious watershed model to predict the effects of legacy storage on nitrogen export.

    Directory of Open Access Journals (Sweden)

    Kimberly J Van Meter

    Full Text Available Nutrient legacies in anthropogenic landscapes, accumulated over decades of fertilizer application, lead to time lags between implementation of conservation measures and improvements in water quality. Quantification of such time lags has remained difficult, however, due to an incomplete understanding of controls on nutrient depletion trajectories after changes in land-use or management practices. In this study, we have developed a parsimonious watershed model for quantifying catchment-scale time lags based on both soil nutrient accumulations (biogeochemical legacy and groundwater travel time distributions (hydrologic legacy. The model accurately predicted the time lags observed in an Iowa watershed that had undergone a 41% conversion of area from row crop to native prairie. We explored the time scales of change for stream nutrient concentrations as a function of both natural and anthropogenic controls, from topography to spatial patterns of land-use change. Our results demonstrate that the existence of biogeochemical nutrient legacies increases time lags beyond those due to hydrologic legacy alone. In addition, we show that the maximum concentration reduction benefits vary according to the spatial pattern of intervention, with preferential conversion of land parcels having the shortest catchment-scale travel times providing proportionally greater concentration reductions as well as faster response times. In contrast, a random pattern of conversion results in a 1:1 relationship between percent land conversion and percent concentration reduction, irrespective of denitrification rates within the landscape. Our modeling framework allows for the quantification of tradeoffs between costs associated with implementation of conservation measures and the time needed to see the desired concentration reductions, making it of great value to decision makers regarding optimal implementation of watershed conservation measures.
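
    The hydrologic-legacy part of such a model amounts to convolving the nutrient input history with a groundwater travel-time distribution. The sketch below is a minimal illustration under that assumption; the exponential travel-time distribution and its 10-year mean are illustrative choices, not the calibrated Iowa values.

        import numpy as np

        def stream_load(inputs, mean_tt=10.0):
            # Convolve annual nutrient inputs with an exponential groundwater
            # travel-time distribution (hydrologic legacy); the tail of the
            # distribution is what produces multi-decade time lags.
            n = len(inputs)
            pdf = np.exp(-np.arange(n) / mean_tt)
            pdf /= pdf.sum()
            return np.convolve(inputs, pdf)[:n]

        # Inputs that stop after year 20 still yield decades of declining export.
        load = stream_load(np.r_[np.ones(20), np.zeros(40)] * 100.0)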

  3. Reconstructing perceived faces from brain activations with deep adversarial neural decoding

    NARCIS (Netherlands)

    Güçlütürk, Y.; Güçlü, U.; Seeliger, K.; Bosch, S.E.; Lier, R.J. van; Gerven, M.A.J. van; Guyon, I.; Luxburg, U.V.; Bengio, S.; Wallach, H.; Fergus, R.; Vishwanathan, S.; Garnett, R.

    2017-01-01

    Here, we present a novel approach to solve the problem of reconstructing perceived stimuli from brain responses by combining probabilistic inference with deep learning. Our approach first inverts the linear transformation from latent features to brain responses with maximum a posteriori estimation

  4. Adaptive multiresolution method for MAP reconstruction in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Acar, Erman, E-mail: erman.acar@tut.fi [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland); Peltonen, Sari; Ruotsalainen, Ulla [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland)

    2016-11-15

    3D image reconstruction in electron tomography is challenging due to the severely limited range of projection angles and the low signal-to-noise ratio of the acquired projection images. Maximum a posteriori (MAP) reconstruction methods have been successful in compensating for the missing information and suppressing noise with their intrinsic regularization techniques. There are two major problems in MAP reconstruction methods: (1) selection of the regularization parameter that controls the balance between data fidelity and prior information, and (2) long computation time. One aim of this study is to provide an adaptive solution to the regularization parameter selection problem without requiring additional knowledge about the imaging environment and the sample. The other aim is to realize the reconstruction using sequences of resolution levels to shorten the computation time. The reconstructions were analyzed in terms of accuracy and computational efficiency using a simulated biological phantom and publicly available experimental datasets of electron tomography. The numerical and visual evaluations of the experiments show that the adaptive multiresolution method can provide more accurate results than weighted back projection (WBP), the simultaneous iterative reconstruction technique (SIRT), and the sequential MAP expectation maximization (sMAPEM) method. The method is also superior to sMAPEM in terms of computation time and usability, since it can reconstruct 3D images significantly faster without requiring any parameter to be set by the user. Highlights: • An adaptive multiresolution reconstruction method is introduced for electron tomography. • The method provides more accurate results than conventional reconstruction methods. • The missing wedge and noise problems are compensated efficiently by the method.

  5. Genome rearrangements and phylogeny reconstruction in Yersinia pestis.

    Science.gov (United States)

    Bochkareva, Olga O; Dranenko, Natalia O; Ocheredko, Elena S; Kanevsky, German M; Lozinsky, Yaroslav N; Khalaycheva, Vera A; Artamonova, Irena I; Gelfand, Mikhail S

    2018-01-01

    Genome rearrangements have played an important role in the evolution of Yersinia pestis from its progenitor Yersinia pseudotuberculosis. Traditional phylogenetic trees for Y. pestis based on sequence comparison have short internal branches and low bootstrap support, as only a small number of nucleotide substitutions have occurred. On the other hand, even a small number of genome rearrangements may resolve topological ambiguities in a phylogenetic tree. We reconstructed phylogenetic trees based on genome rearrangements using several popular approaches such as Maximum Likelihood for Gene Order and the Bayesian model of genome rearrangements by inversions. We also reconciled phylogenetic trees for each of the three CRISPR loci to obtain an integrated scenario of CRISPR cassette evolution. Analysis of contradictions between the obtained evolutionary trees yielded numerous parallel inversions and gain/loss events. Our data indicate that an integrated analysis of sequence-based and inversion-based trees enhances the resolution of phylogenetic reconstruction. In contrast, reconstructions of strain relationships based solely on CRISPR loci may not be reliable, as the history is obscured by large deletions, obliterating the order of spacer gains. Similarly, numerous parallel gene losses preclude reconstruction of phylogeny based on gene content.

  6. Modeling Mediterranean Ocean climate of the Last Glacial Maximum

    Directory of Open Access Journals (Sweden)

    U. Mikolajewicz

    2011-03-01

    Full Text Available A regional ocean general circulation model of the Mediterranean is used to study the climate of the Last Glacial Maximum. The atmospheric forcing for these simulations has been derived from simulations with an atmospheric general circulation model, which in turn was forced with surface conditions from a coarse resolution earth system model. The model is successful in reproducing the general patterns of reconstructed sea surface temperature anomalies with the strongest cooling in summer in the northwestern Mediterranean and weak cooling in the Levantine, although the model underestimates the extent of the summer cooling in the western Mediterranean. However, there is a strong vertical gradient associated with this pattern of summer cooling, which makes the comparison with reconstructions complicated. The exchange with the Atlantic is decreased to roughly one half of its present value, which can be explained by the shallower Strait of Gibraltar as a consequence of lower global sea level. This reduced exchange causes a strong increase of salinity in the Mediterranean in spite of reduced net evaporation.

  7. Direct 4D reconstruction of parametric images incorporating anato-functional joint entropy.

    Science.gov (United States)

    Tang, Jing; Kuwabara, Hiroto; Wong, Dean F; Rahmim, Arman

    2010-08-07

    We developed an anatomy-guided 4D closed-form algorithm to directly reconstruct parametric images from projection data for (nearly) irreversible tracers. Conventional methods consist of individually reconstructing 2D/3D PET data, followed by graphical analysis on the sequence of reconstructed image frames. The proposed direct reconstruction approach maintains the simplicity and accuracy of the expectation-maximization (EM) algorithm by extending the system matrix to include the relation between the parametric images and the measured data. A closed-form solution was achieved using a different hidden complete-data formulation within the EM framework. Furthermore, the proposed method was extended to maximum a posterior reconstruction via incorporation of MR image information, taking the joint entropy between MR and parametric PET features as the prior. Using realistic simulated noisy [(11)C]-naltrindole PET and MR brain images/data, the quantitative performance of the proposed methods was investigated. Significant improvements in terms of noise versus bias performance were demonstrated when performing direct parametric reconstruction, and additionally upon extending the algorithm to its Bayesian counterpart using the MR-PET joint entropy measure.

  8. Cophylogeny reconstruction via an approximate Bayesian computation.

    Science.gov (United States)

    Baudet, C; Donati, B; Sinaimeri, B; Crescenzi, P; Gautier, C; Matias, C; Sagot, M-F

    2015-05-01

    Despite an increasingly vast literature on cophylogenetic reconstructions for studying host-parasite associations, understanding the common evolutionary history of such systems remains a problem that is far from being solved. Most algorithms for host-parasite reconciliation use an event-based model, where the events include in general (a subset of) cospeciation, duplication, loss, and host switch. All known parsimonious event-based methods then assign a cost to each type of event in order to find a reconstruction of minimum cost. The main problem with this approach is that the cost of the events strongly influences the reconciliation obtained. Some earlier approaches attempt to avoid this problem by finding a Pareto set of solutions and hence by considering event costs under some minimization constraints. To deal with this problem, we developed an algorithm, called Coala, for estimating the frequency of the events based on an approximate Bayesian computation approach. The benefits of this method are 2-fold: (i) it provides more confidence in the set of costs to be used in a reconciliation, and (ii) it allows estimation of the frequency of the events in cases where the data set consists of trees with a large number of taxa. We evaluate our method on simulated and on biological data sets. We show that in both cases, for the same pair of host and parasite trees, different sets of frequencies for the events lead to equally probable solutions. Moreover, often these solutions differ greatly in terms of the number of inferred events. It appears crucial to take this into account before attempting any further biological interpretation of such reconciliations. More generally, we also show that the set of frequencies can vary widely depending on the input host and parasite trees. Indiscriminately applying a standard vector of costs may thus not be a good strategy.
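
    The approximate Bayesian computation idea underlying such an estimator fits in a few lines. This generic rejection sketch assumes user-supplied prior_sample and simulate functions and a scalar summary distance; it illustrates ABC in general, not the Coala implementation.

        import numpy as np

        def abc_rejection(obs_stat, prior_sample, simulate, n=100_000, tol=0.05):
            # Keep parameter draws whose simulated summary statistic lands
            # within tol of the observed one; the retained draws approximate
            # the posterior (here, e.g., over event frequencies).
            kept = [theta for theta in (prior_sample() for _ in range(n))
                    if abs(simulate(theta) - obs_stat) < tol]
            return np.asarray(kept)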

  9. A heuristic statistical stopping rule for iterative reconstruction in emission tomography

    International Nuclear Information System (INIS)

    Ben Bouallegue, F.; Mariano-Goulart, D.; Crouzet, J.F.

    2013-01-01

    We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for maximum likelihood expectation maximization (MLEM) reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidian distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the Geant4 application in emission tomography (GATE) platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
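
    For context, the kind of baseline such a rule competes with can be sketched as follows: run MLEM and stop once the Poisson log-likelihood gain per iteration falls below a relative tolerance. This is a generic illustration with a hypothetical dense system matrix A, not the authors' statistical criterion.

        import numpy as np

        def poisson_loglik(y, proj, eps=1e-12):
            proj = np.maximum(proj, eps)
            return np.sum(y * np.log(proj) - proj)

        def mlem_with_stop(A, y, tol=1e-4, max_iter=200, eps=1e-12):
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)
            ll_old = -np.inf
            for k in range(max_iter):
                proj = A @ x
                x *= (A.T @ (y / np.maximum(proj, eps))) / np.maximum(sens, eps)
                ll = poisson_loglik(y, A @ x)
                if ll - ll_old < tol * abs(ll):   # relative log-likelihood gain
                    return x, k
                ll_old = ll
            return x, max_iter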

  10. Using MOEA with Redistribution and Consensus Branches to Infer Phylogenies.

    Science.gov (United States)

    Min, Xiaoping; Zhang, Mouzhao; Yuan, Sisi; Ge, Shengxiang; Liu, Xiangrong; Zeng, Xiangxiang; Xia, Ningshao

    2017-12-26

    In recent years, more and more research on inferring phylogenies, which is an NP-hard problem, has focused on metaheuristics. Maximum parsimony and maximum likelihood are two effective optimality criteria for conducting such inference. Based on these criteria, various kinds of multi-objective metaheuristics have been used to reconstruct phylogenies. However, combining these two time-consuming objectives makes multi-objective metaheuristics slower than single-objective ones. Therefore, we propose a novel multi-objective optimization algorithm, MOEA-RC, to accelerate the process of rebuilding phylogenies using structural information from elites in the current population. We compare MOEA-RC with two representative multi-objective algorithms, MOEA/D and NSGA-II, and with a non-consensus version of MOEA-RC, on three real-world datasets. Within a given number of iterations, MOEA-RC achieves better solutions than the other algorithms.

  11. Molecular characterization and phylogeny of some mazocraeidean monogeneans from carangid fish.

    Science.gov (United States)

    Tambireddy, Neeraja; Gayatri, Tripathi; Gireesh-Babu, Pathakota; Pavan-Kumar, Annam

    2016-03-01

    Polyopisthocotylean monogenean parasites of fishes are highly host specific and have been used as an appropriate model to study host-parasite co-evolution. In the present study, eight monogeneans of the order Mazocraeidea were characterized by nuclear 28S rDNA sequences and their phylogenetic relationship with other polyopisthocotylean species was investigated. Neighbour-joining, maximum parsimony, maximum likelihood and Bayesian inference methods were used for phylogenetic reconstruction. The topology supported by high bootstrap values was: (((Hexabothriidae (Mazocraeidae (Discocotylidae (Diplozoidae (Diclidophoridae (Plectanocotylidae (Heteromicrocotylidae (Microcotylidae (Heteraxinidae), (Thoracocotylidae, Gotocotylidae (Gastrocoylidae (Allodiscocotylidae: Protomicrocotylidae))). In addition, we developed DNA barcodes (COI sequences) for six species, and the barcodes clearly discriminated all the species. The polytomy within the family Protomicrocotylidae is resolved in this study for the first time; within this family, Bilaterocotyloides species appear basal relative to Neomicrocotyle and Lethacotyle species, the latter being more derived.

  12. Molecular phylogeny of mitochondrial cytochrome b and 12S rRNA sequences in the Felidae: ocelot and domestic cat lineages.

    Science.gov (United States)

    Masuda, R; Lopez, J V; Slattery, J P; Yuhki, N; O'Brien, S J

    1996-12-01

    Molecular phylogeny of the cat family Felidae is derived using two mitochondrial genes, cytochrome b and 12S rRNA. Phylogenetic methods of weighted maximum parsimony and minimum evolution estimated by neighbor-joining are employed to reconstruct topologies among 20 extant felid species. Sequence analyses of 363 bp of the cytochrome b and 376 bp of the 12S rRNA genes yielded average pair-wise similarity values between felids ranging from 94 to 99% and from 85 to 99%, respectively. Phylogenetic reconstruction supports more recent, intralineage associations but fails to completely resolve interlineage relationships. Both genes produce a monophyletic group of Felis species but vary in the placement of Pallas's cat. The ocelot lineage represents an early divergence within the Felidae, with strong associations between ocelot and margay, Geoffroy's cat and kodkod, and pampas cat and tigrina. Implications of the relative recency of felid evolution, the presence of ancestral polymorphisms, and the influence of outgroups on placement of the topological root are discussed.

  13. Reconstruction of Laser-Induced Surface Topography from Electron Backscatter Diffraction Patterns.

    Science.gov (United States)

    Callahan, Patrick G; Echlin, McLean P; Pollock, Tresa M; De Graef, Marc

    2017-08-01

    We demonstrate that the surface topography of a sample can be reconstructed from electron backscatter diffraction (EBSD) patterns collected with a commercial EBSD system. This technique combines the location of the maximum background intensity with a correction from Monte Carlo simulations to determine the local surface normals at each point in an EBSD scan. A surface height map is then reconstructed from the local surface normals. In this study, a Ni sample was machined with a femtosecond laser, which causes the formation of a laser-induced periodic surface structure (LIPSS). The topography of the LIPSS was analyzed using atomic force microscopy (AFM) and reconstructions from EBSD patterns collected at 5 and 20 kV. The LIPSS consisted of a combination of low frequency waviness due to curtaining and high frequency ridges. The morphology of the reconstructed low frequency waviness and high frequency ridges matched the AFM data. The reconstruction technique does not require any modification to existing EBSD systems and so can be particularly useful for measuring topography and its evolution during in situ experiments.
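
    The final step, integrating local surface normals into a height map, can be prototyped with a standard Fourier-domain least-squares integrator. The sketch below assumes slope maps p = dz/dx and q = dz/dy have already been derived from the EBSD background intensities; the integrator is the generic Frankot-Chellappa method, not necessarily the authors' exact implementation.

        import numpy as np

        def height_from_slopes(p, q):
            # Least-squares integration of a gradient field into a height map
            # (Frankot-Chellappa): solve min ||grad(z) - (p, q)||^2 in Fourier space.
            ny, nx = p.shape
            wx = np.fft.fftfreq(nx) * 2.0 * np.pi
            wy = np.fft.fftfreq(ny) * 2.0 * np.pi
            WX, WY = np.meshgrid(wx, wy)
            P, Q = np.fft.fft2(p), np.fft.fft2(q)
            denom = WX ** 2 + WY ** 2
            denom[0, 0] = 1.0                  # avoid 0/0 at the DC term
            Z = (-1j * WX * P - 1j * WY * Q) / denom
            Z[0, 0] = 0.0                      # mean height is unconstrained
            return np.real(np.fft.ifft2(Z))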

  14. Pollen-based biome reconstructions for Colombia at 3000, 6000, 15 000 and 18 000 14C yr ago : Late Quaternary tropical vegetation dynamics

    NARCIS (Netherlands)

    Marchant, R.; Behling, H.; Berrío, J.C.; Cleef, A.M.; Duivenvoorden, J.; Hooghiemstra, H.; Kuhry, P.; Melief, B.; Schreve-Brinkman, E.; Geel, van B.; Hammen, van der T.; Reenen, van G.

    2002-01-01

    Colombian biomes are reconstructed at 45 sites from the modern period extending to the Last Glacial Maximum (LGM). The basis for our reconstruction is pollen data assigned to plant functional types and biomes at six 3000-yr intervals. A reconstruction of modern biomes is used to check the treatment

  15. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results

  16. An integer programming formulation of the parsimonious loss of heterozygosity problem.

    Science.gov (United States)

    Catanzaro, Daniele; Labbé, Martine; Halldórsson, Bjarni V

    2013-01-01

    A loss of heterozygosity (LOH) event occurs when, by the laws of Mendelian inheritance, an individual should be heterozygote at a given site but, due to a deletion polymorphism, is not. Deletions play an important role in human disease and their detection could provide fundamental insights for the development of new diagnostics and treatments. In this paper, we investigate the parsimonious loss of heterozygosity problem (PLOHP), i.e., the problem of partitioning suspected polymorphisms from a set of individuals into a minimum number of deletion areas. Specifically, we generalize Halldórsson et al.'s work by providing a more general formulation of the PLOHP and by showing how one can incorporate different recombination rates and prior knowledge about the locations of deletions. Moreover, we show that the PLOHP can be formulated as a specific version of the clique partition problem in a particular class of graphs called undirected catch-point interval graphs and we prove its general NP-hardness. Finally, we provide a state-of-the-art integer programming (IP) formulation and strengthening valid inequalities to exactly solve real instances of the PLOHP containing up to 9,000 individuals and 3,000 SNPs. Our results give perspectives on the mathematics of the PLOHP and suggest new directions on the development of future efficient exact solution approaches.

  17. Climatic changes in Eurasia and Africa at the last glacial maximum and mid-Holocene: reconstruction from pollen data using inverse vegetation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Haibin [Chinese Academy of Sciences, SKLLQ, Institute of Earth Environment, Xi'an (China); CEREGE, UMR 6635, CNRS/Universite Paul Cezanne, CEREGE BP 80, Europole Mediterraneen de l'Arbois, Aix-en-Provence Cedex 4 (France); Guiot, Joel; Brewer, Simon [CEREGE, UMR 6635, CNRS/Universite Paul Cezanne, CEREGE BP 80, Europole Mediterraneen de l'Arbois, Aix-en-Provence Cedex 4 (France); Guo, Zhengtang [Chinese Academy of Sciences, SKLLQ, Institute of Earth Environment, Xi'an (China); Chinese Academy of Sciences, Institute of Geology and Geophysics, P.O. Box 9825, Beijing (China)

    2007-08-15

    In order to improve the reliability of climate reconstruction, especially for climatologies outside the modern observed climate space, an improved inverse vegetation model using a recent version of BIOME4 has been designed to quantitatively reconstruct past climates, based on pollen biome scores from the BIOME6000 project. The method has been validated with surface pollen spectra from Eurasia and Africa, and applied to palaeoclimate reconstruction. At 6 cal ka BP (calendar years), the climate was generally wetter than today in southern Europe and northern Africa, especially in summer. Winter temperatures were higher (1-5 °C) than present in southern Scandinavia, northeastern Europe, and southern Africa, but cooler in southern Eurasia and in tropical Africa, especially in Mediterranean regions. Summer temperatures were generally higher than today in most of Eurasia and Africa, with a significant warming of ~3-5 °C over northwestern and southern Europe, southern Africa, and eastern Africa. In contrast, summers were 1-3 °C cooler than present in the Mediterranean lowlands and in a band from the eastern Black Sea to Siberia. At 21 cal ka BP, a marked hydrological change can be seen in the tropical zone, where annual precipitation was ~200-1,000 mm/year lower than today in equatorial East Africa. A robust inverse relationship is shown between precipitation change and elevation in Africa. This relationship indicates that precipitation likely played an important role in controlling equilibrium-line altitude (ELA) changes in the tropics during the LGM period. In Eurasia, hydrological decreases follow a longitudinal gradient from Europe to Siberia. Winter temperatures were ~10-17 °C lower than today in Eurasia, with a more significant decrease in northern regions. In Africa, winter temperature was ~10-15 °C lower than present in the south, while it was only reduced by ~0

  18. A three-step reconstruction method for fluorescence molecular tomography based on compressive sensing

    DEFF Research Database (Denmark)

    Zhu, Yansong; Jha, Abhinav K.; Dreyer, Jakob K.

    2017-01-01

    Fluorescence molecular tomography (FMT) is a promising tool for real time in vivo quantification of neurotransmission (NT) as we pursue in our BRAIN initiative effort. However, the acquired image data are noisy and the reconstruction problem is ill-posed. Further, while spatial sparsity of the NT...... matrix coherence. The resultant image data are input to a homotopy-based reconstruction strategy that exploits sparsity via ℓ1 regularization. The reconstructed image is then input to a maximum-likelihood expectation maximization (MLEM) algorithm that retains the sparseness of the input estimate...... and improves upon the quantitation by accurate Poisson noise modeling. The proposed reconstruction method was evaluated in a three-dimensional simulated setup with fluorescent sources in a cuboidal scattering medium with optical properties simulating human brain cortex (reduced scattering coefficient: 9.2 cm-1...

  19. Experimental reconstruction of a highly reflecting fiber Bragg grating by using spectral regularization and inverse scattering.

    Science.gov (United States)

    Rosenthal, Amir; Horowitz, Moshe; Kieckbusch, Sven; Brinkmeyer, Ernst

    2007-10-01

    We demonstrate experimentally, for the first time to our knowledge, a reconstruction of a highly reflecting fiber Bragg grating from its complex reflection spectrum by using a regularization algorithm. The regularization method is based on correcting the measured reflection spectrum at the Bragg zone frequencies and enables the reconstruction of the grating profile using the integral-layer-peeling algorithm. A grating with an approximately uniform profile and with a maximum reflectivity of 99.98% was accurately reconstructed by measuring only its complex reflection spectrum.

  20. Measuring stone volume - three-dimensional software reconstruction or an ellipsoid algebra formula?

    Science.gov (United States)

    Finch, William; Johnston, Richard; Shaida, Nadeem; Winterbottom, Andrew; Wiseman, Oliver

    2014-04-01

    To determine the optimal method for assessing stone volume, and thus stone burden, by comparing the accuracy of scalene, oblate, and prolate ellipsoid volume equations with three-dimensional (3D)-reconstructed stone volume. Kidney stone volume may be helpful in predicting treatment outcome for renal stones. While the precise measurement of stone volume by 3D reconstruction can be accomplished using modern computed tomography (CT) scanning software, this technique is not available in all hospitals or with routine acute colic scanning protocols. Therefore, maximum diameters as measured by either X-ray or CT are used in the calculation of stone volume based on a scalene ellipsoid formula, as recommended by the European Association of Urology. In all, 100 stones with both X-ray and CT (1-2-mm slices) were reviewed. Complete and partial staghorn stones were excluded. Stone volume was calculated using software designed to measure tissue density of a certain range within a specified region of interest. Correlation coefficients among all measured outcomes were compared. Stone volumes were analysed to determine the average 'shape' of the stones. The maximum stone diameter on X-ray was 3-25 mm and on CT was 3-36 mm, with a reasonable correlation (r = 0.77). Smaller stones were closer to prolate or oblate ellipsoids, with larger stones (>15 mm) tending towards scalene ellipsoids. There was no difference in stone shape by location within the kidney. As the average shape of renal stones changes with diameter, no single equation for estimating stone volume can be recommended. As the maximum diameter increases, calculated stone volume becomes less accurate, suggesting that larger stones have more asymmetric shapes. We recommend that research looking at stone clearance rates should use 3D-reconstructed stone volumes when available, followed by prolate, oblate, or scalene ellipsoid formulas depending on the maximum stone diameter.
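
    The competing volume estimates all reduce to one formula with different diameter choices, V = (pi/6) x d1 x d2 x d3. A small sketch (the diameters in mm are illustrative; the oblate and prolate variants simply repeat a diameter):

        from math import pi

        def ellipsoid_volume(d1, d2, d3):
            # V = (pi/6) * d1 * d2 * d3, from V = (4/3) * pi * a * b * c
            # with semi-axes a = d1/2, b = d2/2, c = d3/2.
            return pi / 6.0 * d1 * d2 * d3

        scalene = ellipsoid_volume(12, 9, 6)    # three distinct diameters
        oblate  = ellipsoid_volume(12, 12, 6)   # two long axes equal
        prolate = ellipsoid_volume(12, 6, 6)    # two short axes equal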

  1. Invariant Versus Classical Quartet Inference When Evolution is Heterogeneous Across Sites and Lineages.

    Science.gov (United States)

    Fernández-Sánchez, Jesús; Casanellas, Marta

    2016-03-01

    One reason why classical phylogenetic reconstruction methods fail to correctly infer the underlying topology is that they assume oversimplified models. In this article, we propose a quartet reconstruction method consistent with the most general Markov model of nucleotide substitution, which can also deal with data coming from mixtures on the same topology. Our proposed method uses phylogenetic invariants and provides a system of weights that can be used as input for quartet-based methods. We study its performance on real data and on a wide range of simulated 4-taxon data (both time-homogeneous and nonhomogeneous, with or without among-site rate heterogeneity, and with different branch length settings). We compare it to the classical methods of neighbor-joining (with paralinear distance), maximum likelihood (with different underlying models), and maximum parsimony. Our results show that this method is accurate and robust, has a performance similar to maximum likelihood when data satisfy the assumptions of both methods, and outperforms the other methods when these are based on inappropriate substitution models. If alignments are long enough, it also outperforms other methods when some of its assumptions are violated.

  2. PET reconstruction

    International Nuclear Information System (INIS)

    O'Sullivan, F.; Pawitan, Y.; Harrison, R.L.; Lewellen, T.K.

    1990-01-01

    In statistical terms, filtered backprojection can be viewed as smoothed Least Squares (LS). In this paper, the authors report on improvement in LS resolution by: incorporating locally adaptive smoothers, imposing positivity and using statistical methods for optimal selection of the resolution parameter. The resulting algorithm has high computational efficiency relative to more elaborate Maximum Likelihood (ML) type techniques (i.e. EM with sieves). Practical aspects of the procedure are discussed in the context of PET and illustrations with computer simulated and real tomograph data are presented. The relative recovery coefficients for a 9mm sphere in a computer simulated hot-spot phantom range from .3 to .6 when the number of counts ranges from 10,000 to 640,000 respectively. The authors will also present results illustrating the relative efficacy of ML and LS reconstruction techniques

  3. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
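
    The selection rule rests on the decomposition EMSE = bias^2 + variance over noise realizations. A brute-force sketch of that quantity is given below for orientation, assuming an ensemble of ROI reconstructions per smoothing parameter is available; the paper's theoretical expressions exist precisely to avoid this ensemble loop.

        import numpy as np

        def roi_emse(recons, truth):
            # recons: (n_realizations, n_roi_voxels) ROI values reconstructed
            # with one smoothing parameter; truth: noiseless ROI values.
            mean_est = recons.mean(axis=0)
            bias2 = (mean_est - truth) ** 2
            var = recons.var(axis=0)
            return float((bias2 + var).mean())

        def best_beta(recons_by_beta, truth):
            # Pick the smoothing parameter minimizing the ensemble MSE.
            return min(recons_by_beta, key=lambda b: roi_emse(recons_by_beta[b], truth))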

  4. Athletic Performance at the National Basketball Association Combine After Anterior Cruciate Ligament Reconstruction.

    Science.gov (United States)

    Mehran, Nima; Williams, Phillip N; Keller, Robert A; Khalil, Lafi S; Lombardo, Stephen J; Kharrazi, F Daniel

    2016-05-01

    Anterior cruciate ligament (ACL) injuries are significant injuries in elite-level basketball players. In-game statistical performance after ACL reconstruction has been demonstrated; however, few studies have reviewed functional performance in National Basketball Association (NBA)-caliber athletes after ACL reconstruction. To compare NBA Combine performance of athletes after ACL reconstruction with an age-, size-, and position-matched control group of players with no previous reported knee injury requiring surgery. We hypothesized that there is no difference between the 2 groups in functional performance. Cross-sectional study; Level of evidence, 3. A total of 1092 NBA-caliber players who participated in the NBA Combine between 2000 and 2015 were reviewed. Twenty-one athletes were identified as having had primary ACL reconstruction prior to participation in the combine. This study group was compared with an age-, size-, and position-matched control group in objective functional performance testing, including the shuttle run test, lane agility test, three-quarter court sprint, vertical jump (no step), and maximum vertical jump (running start). With regard to quickness and agility, both ACL-reconstructed athletes and controls scored an average of 11.5 seconds in the lane agility test and 3.1 seconds in the shuttle run test (P = .745 and .346, respectively). Speed and acceleration were measured by the three-quarter court sprint, in which both the study group and the control group averaged 3.3 seconds (P = .516). In the maximum vertical jump, which demonstrates an athlete's jumping ability with a running start, the ACL reconstruction group had an average height of 33.6 inches while the controls averaged 33.9 inches (P = .548). In the standing vertical jump, the ACL reconstruction group averaged 28.2 inches while the control group averaged 29.2 inches (P = .067). In athletes who are able to return to sport and compete at a high level such as the NBA Combine, there is no significant difference in functional performance between those with ACL reconstruction and matched controls.

  5. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    International Nuclear Information System (INIS)

    Tang Jing; Rahmim, Arman

    2009-01-01

    We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had an improved contrast versus noise tradeoff.
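
    The regularizer itself is straightforward to state: the joint entropy of the two images, estimated from their joint density. A minimal histogram-based sketch is given below; the paper uses a non-parametric (and differentiable) density estimate rather than a hard histogram, so this is only illustrative.

        import numpy as np

        def joint_entropy(pet, mr, bins=64):
            # Joint entropy H(PET, MR) from the normalized joint histogram;
            # lower values indicate better anato-functional correspondence.
            h, _, _ = np.histogram2d(pet.ravel(), mr.ravel(), bins=bins)
            p = h / h.sum()
            p = p[p > 0]                      # drop empty bins
            return float(-(p * np.log(p)).sum())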

  6. A human genome-wide library of local phylogeny predictions for whole-genome inference problems

    Directory of Open Access Journals (Sweden)

    Schwartz Russell

    2008-08-01

    Full Text Available Abstract Background Many common inference problems in computational genetics depend on inferring aspects of the evolutionary history of a data set given a set of observed modern sequences. Detailed predictions of the full phylogenies are therefore of value in improving our ability to make further inferences about population history and sources of genetic variation. Making phylogenetic predictions on the scale needed for whole-genome analysis is, however, extremely computationally demanding. Results In order to facilitate phylogeny-based predictions on a genomic scale, we develop a library of maximum parsimony phylogenies within local regions spanning all autosomal human chromosomes based on Haplotype Map variation data. We demonstrate the utility of this library for population genetic inferences by examining a tree statistic we call 'imperfection,' which measures the reuse of variant sites within a phylogeny. This statistic is significantly predictive of recombination rate, shows additional regional and population-specific conservation, and allows us to identify outlier genes likely to have experienced unusual amounts of variation in recent human history. Conclusion Recent theoretical advances in algorithms for phylogenetic tree reconstruction have made it possible to perform large-scale inferences of local maximum parsimony phylogenies from single nucleotide polymorphism (SNP data. As results from the imperfection statistic demonstrate, phylogeny predictions encode substantial information useful for detecting genomic features and population history. This data set should serve as a platform for many kinds of inferences one may wish to make about human population history and genetic variation.

  7. Applying Bayesian neural networks to event reconstruction in reactor neutrino experiments

    International Nuclear Information System (INIS)

    Xu Ye; Xu Weiwei; Meng Yixiong; Zhu Kaien; Xu Wei

    2008-01-01

    In this paper, a toy detector has been designed to simulate central detectors in reactor neutrino experiments. Electron samples from the Monte Carlo simulation of the toy detector were reconstructed with the method of Bayesian neural networks (BNNs) and with the standard algorithm, a maximum likelihood method (MLD), respectively. The results of the event reconstruction using BNN were compared with those using MLD. Compared to MLD, the uncertainties of the electron vertex are not improved, but the energy resolutions are significantly improved using BNN, and the improvement is more pronounced for high-energy electrons than for low-energy ones.

  8. 3D Surface Reconstruction for Lower Limb Prosthetic Model using Radon Transform

    Science.gov (United States)

    Sobani, S. S. Mohd; Mahmood, N. H.; Zakaria, N. A.; Razak, M. A. Abdul

    2018-03-01

    This paper describes an approach to realizing three-dimensional surfaces of objects with cylinder-based shapes, covering the techniques adopted and the strategy developed for non-rigid three-dimensional surface reconstruction of an object from uncalibrated two-dimensional image sequences using a multiple-view digital camera and turntable setup. The surface of an object is reconstructed based on the concept of tomography, by applying several digital image processing algorithms to the two-dimensional images captured by a digital camera in thirty-six different projections, and the three-dimensional structure of the surface is analysed. Four different objects are used as experimental models in the reconstructions, each placed on a manually rotated turntable. The results show that the proposed method successfully reconstructs the three-dimensional surfaces of the objects and is practicable. The shape and size of the reconstructed three-dimensional objects are recognizable and distinguishable. The reconstructions are supported by an error analysis, in which the maximum percent error is approximately 1.4% for the height and 4.0%, 4.79%, and 4.7% for the diameters at three specific heights of the objects.
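
    The slice-wise tomographic step can be prototyped with an off-the-shelf Radon-transform pair. The sketch below uses scikit-image on a synthetic circular cross-section with 36 projection angles, mirroring the thirty-six projections above; the actual pipeline extracts projections from camera images rather than simulating them.

        import numpy as np
        from skimage.transform import radon, iradon

        # Hypothetical cross-section: a filled disc standing in for one slice
        # of a cylinder-like object.
        img = np.zeros((128, 128))
        yy, xx = np.mgrid[:128, :128]
        img[(xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2] = 1.0

        theta = np.arange(0.0, 360.0, 10.0)     # 36 projection angles
        sino = radon(img, theta=theta)          # forward projection (sinogram)
        recon = iradon(sino, theta=theta)       # filtered back-projection, per slice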

  9. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility, new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.

  10. Intensity-based bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the ℓ1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)

  11. Improving Phylogeny Reconstruction at the Strain Level Using Peptidome Datasets.

    Directory of Open Access Journals (Sweden)

    Aitor Blanco-Míguez

    2016-12-01

    Full Text Available Typical bacterial strain differentiation methods are often challenged by high genetic similarity between strains. To address this problem, we introduce a novel in silico peptide fingerprinting method based on conventional wet-lab protocols that enables the identification of potential strain-specific peptides. These can be further investigated using in vitro approaches, laying a foundation for the development of biomarker detection and application-specific methods. This novel method aims at reducing large amounts of comparative peptide data to binary matrices while maintaining a high phylogenetic resolution. The underlying case study concerns the Bacillus cereus group, namely the differentiation of Bacillus thuringiensis, Bacillus anthracis and Bacillus cereus strains. Results show that trees based on cytoplasmic and extracellular peptidomes are only marginally in conflict with those based on whole proteomes, as inferred by the established Genome-BLAST Distance Phylogeny (GBDP) method. Hence, these results indicate that the two approaches can most likely be used complementarily even in other organismal groups. The obtained results confirm previous reports about the misclassification of many strains within the B. cereus group. Moreover, our method was able to separate the B. anthracis strains with high resolution, similarly to the GBDP results as benchmarked via Bayesian inference and both Maximum Likelihood and Maximum Parsimony. In addition to the presented phylogenomic applications, whole-peptide fingerprinting might also become a valuable complementary technique to digital DNA-DNA hybridization, notably for bacterial classification at the species and subspecies level in the future.

  12. Improving Phylogeny Reconstruction at the Strain Level Using Peptidome Datasets.

    Science.gov (United States)

    Blanco-Míguez, Aitor; Meier-Kolthoff, Jan P; Gutiérrez-Jácome, Alberto; Göker, Markus; Fdez-Riverola, Florentino; Sánchez, Borja; Lourenço, Anália

    2016-12-01

    Typical bacterial strain differentiation methods are often challenged by high genetic similarity between strains. To address this problem, we introduce a novel in silico peptide fingerprinting method based on conventional wet-lab protocols that enables the identification of potential strain-specific peptides. These can be further investigated using in vitro approaches, laying a foundation for the development of biomarker detection and application-specific methods. This novel method aims at reducing large amounts of comparative peptide data to binary matrices while maintaining a high phylogenetic resolution. The underlying case study concerns the Bacillus cereus group, namely the differentiation of Bacillus thuringiensis, Bacillus anthracis and Bacillus cereus strains. Results show that trees based on cytoplasmic and extracellular peptidomes are only marginally in conflict with those based on whole proteomes, as inferred by the established Genome-BLAST Distance Phylogeny (GBDP) method. Hence, these results indicate that the two approaches can most likely be used complementarily even in other organismal groups. The obtained results confirm previous reports about the misclassification of many strains within the B. cereus group. Moreover, our method was able to separate the B. anthracis strains with high resolution, similarly to the GBDP results as benchmarked via Bayesian inference and both Maximum Likelihood and Maximum Parsimony. In addition to the presented phylogenomic applications, whole-peptide fingerprinting might also become a valuable complementary technique to digital DNA-DNA hybridization, notably for bacterial classification at the species and subspecies level in the future.

  13. Evaluation of bias and variance in low-count OSEM list mode reconstruction

    International Nuclear Information System (INIS)

    Jian, Y; Carson, R E; Planeta, B

    2015-01-01

    Statistical algorithms have been widely used in PET image reconstruction. The maximum likelihood expectation maximization reconstruction has been shown to produce bias in applications where images are reconstructed from a relatively small number of counts. In this study, image bias and variability in low-count OSEM reconstruction are investigated on images reconstructed with the MOLAR (motion-compensation OSEM list-mode algorithm for resolution-recovery reconstruction) platform. A human brain ([11C]AFM) and a NEMA phantom are used in the simulation and real experiments, respectively, for the HRRT and Biograph mCT. Image reconstructions were repeated with different combinations of subsets and iterations. Regions of interest were defined on low-activity and high-activity regions to evaluate the bias and noise at matched effective iteration numbers (iterations × subsets). Minimal negative biases and no positive biases were found at moderate count levels, and less than 5% negative bias was found using extremely low levels of counts (0.2 M NEC). At any given count level, other factors, such as subset numbers and frame-based scatter correction, may introduce small biases (1–5%) in the reconstructed images. The observed bias was substantially lower than that reported in the literature, perhaps due to the use of the point spread function and/or other implementation methods in MOLAR. (paper)

  14. Accurate and robust phylogeny estimation based on profile distances: a study of the Chlorophyceae (Chlorophyta)

    Directory of Open Access Journals (Sweden)

    Rahmann Sven

    2004-06-01

    Full Text Available Abstract Background In phylogenetic analysis we face the problem that several subclade topologies are known or easily inferred and well supported by bootstrap analysis, but basal branching patterns cannot be unambiguously estimated by the usual methods (maximum parsimony (MP), neighbor-joining (NJ), or maximum likelihood (ML)), nor are they well supported. We represent each subclade by a sequence profile and estimate evolutionary distances between profiles to obtain a matrix of distances between subclades. Results Our estimator of profile distances generalizes the maximum likelihood estimator of sequence distances. The basal branching pattern can be estimated by any distance-based method, such as neighbor-joining. Our method (profile neighbor-joining, PNJ) then inherits the accuracy and robustness of profiles and the time efficiency of neighbor-joining. Conclusions Phylogenetic analysis of Chlorophyceae with traditional methods (MP, NJ, ML, and MrBayes) reveals seven well-supported subclades, but the methods disagree on the basal branching pattern. The tree reconstructed by our method is better supported and can be confirmed by known morphological characters. Moreover, the accuracy is significantly improved, as shown by parametric bootstrap.
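
    To make the distance-based step concrete, here is a small Biopython sketch of neighbor-joining on a matrix of profile distances; the matrix values are toy assumptions, and PNJ itself differs in how the distances are estimated.

        from Bio import Phylo
        from Bio.Phylo.TreeConstruction import (DistanceMatrix,
                                                DistanceTreeConstructor)

        # Toy lower-triangular distance matrix between four subclade profiles
        dm = DistanceMatrix(names=["A", "B", "C", "D"],
                            matrix=[[0],
                                    [0.3, 0],
                                    [0.5, 0.4, 0],
                                    [0.6, 0.5, 0.2, 0]])
        tree = DistanceTreeConstructor().nj(dm)  # basal pattern via NJ
        Phylo.draw_ascii(tree)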

  15. Savannah River Site radioiodine atmospheric releases and offsite maximum doses

    International Nuclear Information System (INIS)

    Marter, W.L.

    1990-01-01

    Radioisotopes of iodine have been released to the atmosphere from the Savannah River Site since 1955. The releases, mostly from the 200-F and 200-H Chemical Separations areas, consist of the isotopes I-129 and I-131. Small amounts of I-131 and I-133 have also been released from reactor facilities and the Savannah River Laboratory. This reference memorandum was issued to summarize our current knowledge of releases of radioiodines and resultant maximum offsite doses. This memorandum supplements the reference memorandum by providing more detailed supporting technical information. Doses reported in this memorandum from consumption of the milk containing the highest I-131 concentration following the 1961 I-131 release incident are about 1% higher than reported in the reference memorandum. This is the result of using unrounded concentrations of I-131 in milk in this memo. It is emphasized here that this technical report does not constitute a dose reconstruction in the same sense as the dose reconstruction effort currently underway at Hanford. This report uses existing published data for radioiodine releases and existing transport and dosimetry models

  16. Spectral reconstruction for shifted-excitation Raman difference spectroscopy (SERDS).

    Science.gov (United States)

    Guo, Shuxia; Chernavskaia, Olga; Popp, Jürgen; Bocklitz, Thomas

    2018-08-15

    Fluorescence emission is one of the major obstacles to applying Raman spectroscopy in biological investigations. It is usually several orders of magnitude more intense than Raman scattering and hampers further analysis. In cases where the fluorescence emission is too intense to be efficiently removed via routine mathematical baseline correction algorithms, an alternative approach is needed. One alternative approach is shifted-excitation Raman difference spectroscopy (SERDS), where two Raman spectra are recorded with two slightly different excitation wavelengths. Ideally, the fluorescence emission at the two excitations does not change while the Raman spectrum shifts according to the excitation wavelength. Hence the fluorescence is removed in the difference of the two recorded Raman spectra. For better interpretability, a spectral reconstruction procedure is necessary to recover the fluorescence-free Raman spectrum. This is challenging due to the intensity variations between the two recorded Raman spectra caused by unavoidable experimental changes, as well as the presence of noise. Existing approaches suffer from drawbacks such as loss of spectral resolution, residual fluorescence, and artefacts. In this contribution, we propose a reconstruction method based on non-negative least squares (NNLS), where the intensity variations between the two measurements are utilized in the reconstruction model. The method achieved fluorescence-free reconstruction on three real-world SERDS datasets without significant information loss. Thereafter, we quantified the performance of the reconstruction on artificial datasets in four respects: reconstructed spectral resolution, precision of reconstruction, signal-to-noise ratio (SNR), and residual fluorescence. The artificial datasets were constructed with varied Raman to fluorescence intensity ratio (RFIR), SNR, full-width at half-maximum (FWHM), excitation wavelength shift, and fluorescence variation between the two spectra. It was demonstrated that
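
    A minimal sketch of difference-spectrum reconstruction via non-negative least squares is given below, using scipy.optimize.nnls; it assumes a uniform shift of shift_px detector pixels and identical fluorescence in both acquisitions, which is a simplification of the model in the record above.

        import numpy as np
        from scipy.optimize import nnls

        def reconstruct_serds(s1, s2, shift_px):
            # s1, s2: spectra recorded at the two excitation wavelengths.
            n = len(s1)
            A1 = np.eye(n)              # identity for the first spectrum
            A2 = np.eye(n, k=shift_px)  # same Raman signal, shifted in pixels
            d = s1 - s2                 # fluorescence cancels in the difference
            x, _ = nnls(A1 - A2, d)     # non-negative Raman spectrum estimate
            return x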

  17. Super-Resolution Image Reconstruction Applied to Medical Ultrasound

    Science.gov (United States)

    Ellis, Michael

    Ultrasound is the preferred imaging modality for many diagnostic applications due to its real-time image reconstruction and low cost. Nonetheless, conventional ultrasound is not used in many applications because of limited spatial resolution and soft tissue contrast. Most commercial ultrasound systems reconstruct images using a simple delay-and-sum architecture on receive, which is fast and robust but does not utilize all information available in the raw data. Recently, more sophisticated image reconstruction methods have been developed that make use of far more information in the raw data to improve resolution and contrast. One such method is the Time-Domain Optimized Near-Field Estimator (TONE), which employs maximum a posteriori estimation to solve a highly underdetermined problem, given a well-defined system model. TONE has been shown to significantly improve both the contrast and resolution of ultrasound images when compared to conventional methods. However, TONE's lack of robustness to variations from the system model and extremely high computational cost hinder it from being readily adopted in clinical scanners. This dissertation aims to reduce the impact of TONE's shortcomings, transforming it from an academic construct to a clinically viable image reconstruction algorithm. By altering the system model from a collection of individual hypothetical scatterers to a collection of weighted, diffuse regions, dTONE is able to achieve much greater robustness to modeling errors. A method for efficient parallelization of dTONE is presented that reduces reconstruction time by more than an order of magnitude with little loss in image fidelity. An alternative reconstruction algorithm, called qTONE, is also developed and is able to reduce reconstruction times by another two orders of magnitude while simultaneously improving image contrast. Each of these methods for improving TONE are presented, their limitations are explored, and all are used in concert to reconstruct in

  18. Phylogenetic relationships of Malayan gaur with other species of the genus Bos based on cytochrome b gene DNA sequences.

    Science.gov (United States)

    Rosli, M K A; Zakaria, S S; Syed-Shabthar, S M F; Zainal, Z Z; Shukor, M N; Mahani, M C; Abas-Mazni, O; Md-Zain, B M

    2011-03-22

    The Malayan gaur (Bos gaurus hubbacki) is one of the three subspecies of gaurs that can be found in Malaysia. We examined the phylogenetic relationships of this subspecies with other species of the genus Bos (B. javanicus, B. indicus, B. taurus, and B. grunniens). The sequence of a key gene, cytochrome b, was compared among 20 Bos species and the bongo antelope, used as an outgroup. Phylogenetic reconstruction was employed using neighbor joining and maximum parsimony in PAUP and Bayesian inference in MrBayes 3.1. All tree topologies indicated that the Malayan gaur is in its own monophyletic clade, distinct from other species of the genus Bos. We also found significant branching differences in the tree topologies between wild and domestic cattle.

  19. Stability indicators in network reconstruction.

    Directory of Open Access Journals (Sweden)

    Michele Filosi

    Full Text Available The number of available algorithms to infer a biological network from a dataset of high-throughput measurements is overwhelming and keeps growing. However, evaluating their performance is unfeasible unless a 'gold standard' is available to measure how close the reconstructed network is to the ground truth. One measure of this is the stability of these predictions to data resampling approaches. We introduce NetSI, a family of Network Stability Indicators, to assess quantitatively the stability of a reconstructed network in terms of inference variability due to data subsampling. In order to evaluate network stability, the main NetSI methods use a global/local network metric in combination with a resampling (bootstrap or cross-validation) procedure. In addition, we provide two normalized variability scores over data resampling to measure edge weight stability and node degree stability, and then introduce a stability ranking for edges and nodes. A complete implementation of the NetSI indicators, including the Hamming-Ipsen-Mikhailov (HIM) network distance adopted in this paper, is available with the R package nettools. We demonstrate the use of the NetSI family by measuring network stability on four datasets against alternative network reconstruction methods. First, the effect of sample size on the stability of inferred networks is studied in a gold standard framework on yeast-like data from the GeneNetWeaver simulator. We also consider the impact of varying modularity on a set of structurally different networks (50 nodes, from 2 to 10 modules), and then of complex feature covariance structure, showing the different behaviours of standard reconstruction methods based on Pearson correlation, Maximum Information Coefficient (MIC), and the False Discovery Rate (FDR) strategy. Finally, we demonstrate a strong combined effect of different reconstruction methods and phenotype subgroups on a hepatocellular carcinoma miRNA microarray dataset (240 subjects), and we

  20. Comparison of 3D Maximum A Posteriori and Filtered Backprojection algorithms for high resolution animal imaging in microPET

    International Nuclear Information System (INIS)

    Chatziioannou, A.; Qi, J.; Moore, A.; Annala, A.; Nguyen, K.; Leahy, R.M.; Cherry, S.R.

    2000-01-01

    We have evaluated the performance of two three-dimensional reconstruction algorithms with data acquired from microPET, a high resolution tomograph dedicated to small animal imaging. The first was a linear filtered-backprojection algorithm (FBP) with reprojection of the missing data, and the second was a statistical maximum a posteriori probability algorithm (MAP). The two algorithms were evaluated in terms of their resolution performance, both in phantoms and in vivo. Sixty independent realizations of a phantom simulating the brain of a baby monkey were acquired, each containing 3 million counts. Each of these realizations was reconstructed independently with both algorithms. The ensemble of the sixty reconstructed realizations was used to estimate the standard deviation as a measure of the noise for each reconstruction algorithm. More detail was recovered in the MAP reconstruction without an increase in noise relative to FBP. Studies in a simple cylindrical compartment phantom demonstrated improved recovery of known activity ratios with MAP. Finally, in vivo studies also demonstrated a clear improvement in spatial resolution using the MAP algorithm. The quantitative accuracy of the MAP reconstruction was also evaluated by comparison with autoradiography and direct well counting of tissue samples and was shown to be superior.

  1. Evaluation of two methods for using MR information in PET reconstruction

    International Nuclear Information System (INIS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-01-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information were evaluated and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information is using boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries. CV values are 10% lower with the Bowsher prior than with boundaries. Both methods performed better, in terms of MSE and CV, than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomic information. In conclusion, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms were again shown to be effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image has, however, still to be assessed.

  2. A Taxonomic Reduced-Space Pollen Model for Paleoclimate Reconstruction

    Science.gov (United States)

    Wahl, E. R.; Schoelzel, C.

    2010-12-01

    Paleoenvironmental reconstruction from fossil pollen often attempts to take advantage of the rich taxonomic diversity in such data. Here, a taxonomically "reduced-space" reconstruction model is explored that would be parsimonious in introducing parameters needing to be estimated within a Bayesian Hierarchical Modeling context. This work involves a refinement of the traditional pollen ratio method. This method is useful when one (or a few) dominant pollen type(s) in a region have a strong positive correlation with a climate variable of interest and another (or a few) dominant pollen type(s) have a strong negative correlation. When, e.g., counts of pollen taxa a and b (r > 0) are combined with pollen types c and d (r < 0), the resulting ratio can be modeled with a binomial logistic generalized linear model (GLM). The GLM can readily model this relationship in the forward form, pollen = g(climate), which is more physically realistic than inverse models often used in paleoclimate reconstruction [climate = f(pollen)]. The specification of the model is: rnum ~ Bin(n, p), where E(r|T) = p = exp(η)/[1 + exp(η)] and η = α + βT; r is the pollen ratio formed as above, rnum is the ratio numerator, n is the ratio denominator (i.e., the sum of pollen counts), the denominator-specific count is (n - rnum), and T is the temperature at each site corresponding to a specific value of r. Ecological and empirical screening identified the model (Spruce+Birch) / (Spruce+Birch+Oak+Hickory) for use in temperate eastern N. America. α and β were estimated using both "traditional" and Bayesian GLM algorithms (in R). Although it includes only four pollen types, the ratio model yields more explained variation (~80%) in the pollen-temperature relationship of the study region than a 64-taxon modern analog technique (MAT). Thus, the new pollen ratio method represents an information-rich, reduced-space data model that can be efficiently employed in a BHM framework. The ratio model can directly reconstruct past temperature by solving the GLM equations
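
    As an illustration of the forward model, the following sketch fits the binomial logistic GLM in Python with statsmodels rather than the R tooling used in the record; the counts and temperatures are made-up values, not data from the study.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical counts: rnum = Spruce+Birch, n = sum of all four taxa
        rnum = np.array([80, 55, 30, 12])
        n = np.array([100, 100, 100, 100])
        T = np.array([2.0, 6.0, 10.0, 14.0])    # site temperatures

        X = sm.add_constant(T)                  # eta = alpha + beta*T
        glm = sm.GLM(np.column_stack([rnum, n - rnum]), X,
                     family=sm.families.Binomial())
        res = glm.fit()
        print(res.params)                       # estimates of alpha, beta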

  3. Molecular diversification of Trichuris spp. from Sigmodontinae (Cricetidae) rodents from Argentina based on mitochondrial DNA sequences.

    Science.gov (United States)

    Callejón, Rocío; Robles, María Del Rosario; Panei, Carlos Javier; Cutillas, Cristina

    2016-08-01

    A molecular phylogenetic hypothesis is presented for the genus Trichuris based on sequence data from mitochondrial cytochrome c oxidase 1 (cox1) and cytochrome b (cob). The taxa consisted of nine populations of whipworm from five species of Sigmodontinae rodents from Argentina. Bayesian Inference, Maximum Parsimony, and Maximum Likelihood methods were used to infer phylogenies for each gene separately, but also for the combined mitochondrial data and the combined mitochondrial and nuclear dataset. Phylogenetic results based on cox1 and cob mitochondrial DNA (mtDNA) revealed three strongly resolved clades corresponding to three different species (Trichuris navonae, Trichuris bainae, and Trichuris pardinasi) showing phylogeographic variation, but relationships among Trichuris species were poorly resolved. Phylogenetic reconstruction based on concatenated sequences had greater resolution for delimiting species and intraspecific populations of Trichuris than reconstructions based on the individual genes. Thus, populations of T. bainae and T. pardinasi could be affected by geographical factors and parasite-host co-divergence.

  4. Evolutionary position of Peruvian land snails (Orthalicidae) among Stylommatophora (Mollusca: Gastropoda)

    Directory of Open Access Journals (Sweden)

    Jorge Ramirez

    2011-07-01

    Full Text Available The genera Bostryx and Scutalus (Orthalicidae: Bulimulinae) are endemic to South America. They are mainly distributed on the western slopes of the Peruvian Andes. The goal of the present work was to assess their evolutionary position among the stylommatophoran gastropods based on the 16S rRNA mitochondrial marker. Four sequences were obtained and, along with 28 sequences of other Stylommatophora retrieved from GenBank, were aligned with ClustalX. The phylogenetic reconstruction was carried out using the methods of Neighbor-Joining, Maximum Parsimony, Maximum Likelihood and Bayesian inference. The multiple sequence alignment had 371 sites, with indels. The two genera of the family Orthalicidae included for the first time in a molecular phylogeny (Bostryx and Scutalus) formed a monophyletic group along with another member of the superfamily Orthalicoidea (Placostylus), a result that is comparable with that obtained with nuclear markers. Their evolutionary relationship with other land snails is also discussed.

  5. Connectivity dynamics since the Last Glacial Maximum in the northern Andes: a pollen-driven framework to assess potential migration

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; van Boxel, J.H.; Cabrera, M.; González-Carranza, Z.; González-Arango, C.; Stevens, W.D.; Montiel, O.M.; Raven, P.H.

    2014-01-01

    We provide an innovative pollen-driven connectivity framework of the dynamic altitudinal distribution of North Andean biomes since the Last Glacial Maximum (LGM). Altitudinally changing biome distributions reconstructed from a pollen record from Lake La Cocha (2780 m) are assessed in terms of their

  6. On a full Bayesian inference for force reconstruction problems

    Science.gov (United States)

    Aucejo, M.; De Smet, O.

    2018-05-01

    In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time-invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability to account mathematically for the experimenter's prior knowledge. However, since only the maximum a posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To answer this legitimate question, this paper fully exploits the Bayesian framework to provide, from a Markov chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
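
    The record does not specify the sampler, so the sketch below shows a generic random-walk Metropolis scheme from which credible intervals can be read off; log_post is an assumed handle for the unnormalized log posterior of the force parameters, not the paper's own implementation.

        import numpy as np

        def metropolis(log_post, x0, n_samples=5000, step=0.1, seed=0):
            # Generic random-walk Metropolis sampler (illustrative only).
            rng = np.random.default_rng(seed)
            x = np.asarray(x0, dtype=float)
            lp = log_post(x)
            out = []
            for _ in range(n_samples):
                prop = x + step * rng.standard_normal(x.shape)
                lp_prop = log_post(prop)
                if np.log(rng.uniform()) < lp_prop - lp:
                    x, lp = prop, lp_prop       # accept the proposal
                out.append(x.copy())
            # e.g. 95% credible interval: np.percentile(out, [2.5, 97.5], axis=0)
            return np.array(out)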

  7. Primary Vertex Reconstruction for Upgrade at LHCb

    CERN Document Server

    Wanczyk, Joanna

    2016-01-01

    The aim of the LHCb experiment is the study of beauty and charm hadron decays, with the main focus on CP-violating phenomena and searches for physics beyond the Standard Model through rare decays. At present, the second data-taking period, called Run II, is ongoing. After 2018, during the long shutdown, the replacement of significant parts of the LHCb detector is planned. One of the main changes is the upgrade of the present hardware and software trigger to a more rapid full-software trigger. Primary Vertex (PV) reconstruction is the basis for further tracking, and it is sensitive to the LHC running conditions, which are going to change for the Upgrade. In particular, the center-of-mass collision energy should reach the maximum value of 14 TeV. As a result, the quality of the reconstruction has to be studied and the reconstruction algorithms have to be optimized.

  8. Innovative Bayesian and Parsimony Phylogeny of Dung Beetles (Coleoptera, Scarabaeidae, Scarabaeinae) Enhanced by Ontology-Based Partitioning of Morphological Characters

    Science.gov (United States)

    Tarasov, Sergei; Génier, François

    2015-01-01

    Scarabaeine dung beetles are the dominant dung feeding group of insects and are widely used as model organisms in conservation, ecology and developmental biology. Due to the conflicts among 13 recently published phylogenies dealing with the higher-level relationships of dung beetles, the phylogeny of this lineage remains largely unresolved. In this study, we conduct rigorous phylogenetic analyses of dung beetles, based on an unprecedented taxon sample (110 taxa) and detailed investigation of morphology (205 characters). We provide the description of morphology and thoroughly illustrate the used characters. Along with parsimony, traditionally used in the analysis of morphological data, we also apply the Bayesian method with a novel approach that uses anatomy ontology for matrix partitioning. This approach allows for heterogeneity in evolutionary rates among characters from different anatomical regions. Anatomy ontology generates a number of parameter-partition schemes which we compare using Bayes factor. We also test the effect of inclusion of autapomorphies in the morphological analysis, which hitherto has not been examined. Generally, schemes with more parameters were favored in the Bayesian comparison suggesting that characters located on different body regions evolve at different rates and that partitioning of the data matrix using anatomy ontology is reasonable; however, trees from the parsimony and all the Bayesian analyses were quite consistent. The hypothesized phylogeny reveals many novel clades and provides additional support for some clades recovered in previous analyses. Our results provide a solid basis for a new classification of dung beetles, in which the taxonomic limits of the tribes Dichotomiini, Deltochilini and Coprini are restricted and many new tribes must be described. Based on the consistency of the phylogeny with biogeography, we speculate that dung beetles may have originated in the Mesozoic contrary to the traditional view pointing to a

  9. Revealing pancrustacean relationships: phylogenetic analysis of ribosomal protein genes places Collembola (springtails) in a monophyletic Hexapoda and reinforces the discrepancy between mitochondrial and nuclear DNA markers.

    Science.gov (United States)

    Timmermans, M J T N; Roelofs, D; Mariën, J; van Straalen, N M

    2008-03-12

    In recent years, several new hypotheses on phylogenetic relations among arthropods have been proposed on the basis of DNA sequences. One of the challenged hypotheses is the monophyly of hexapods. This discussion originated from analyses based on mitochondrial DNA datasets that, due to an unusual positioning of Collembola, suggested that the hexapod body plan evolved at least twice. Here, we re-evaluate the position of Collembola using ribosomal protein gene sequences. In total 48 ribosomal proteins were obtained for the collembolan Folsomia candida. These 48 sequences were aligned with sequence data on 35 other ecdysozoans. Each ribosomal protein gene was available for 25% to 86% of the taxa. However, the total sequence information was unequally distributed over the taxa and ranged between 4% and 100%. A concatenated dataset was constructed (5034 inferred amino acids in length), of which ~66% of the positions were filled. Phylogenetic tree reconstructions, using Maximum Likelihood, Maximum Parsimony, and Bayesian methods, resulted in a topology that supports monophyly of Hexapoda. Although ribosomal proteins in general may not evolve independently, they once more appear highly valuable for phylogenetic reconstruction. Our analyses clearly suggest that Hexapoda is monophyletic. This underpins the inconsistency between nuclear and mitochondrial datasets when analyzing pancrustacean relationships. Caution is needed when applying mitochondrial markers in deep phylogeny.

  10. Homoplastic evolution and host association of Eriophyoidea (Acari, Prostigmata) conflict with the morphological-based taxonomic system.

    Science.gov (United States)

    Li, Hao-Sen; Xue, Xiao-Feng; Hong, Xiao-Yue

    2014-09-01

    The superfamily Eriophyoidea is exceptionally diverse and its members are highly host-specific. Currently, the taxonomy of this group is based on morphology only. However, phylogenetic relationships in this group could be incorrect if the diagnostic morphological characters are homoplastic. Therefore, the phylogeny of 112 representative taxa of Eriophyoidea from China was determined using 18S, 28S D2-5 and D9-10 rRNA. Phylogenetic relationships were inferred through Bayesian, maximum likelihood and maximum parsimony methods, and then a number of clades or major clades were defined according to robust phylogenetic topologies combined with morphological comparison. Tests of monophyly showed that two of three families of Eriophyoidea as well as one subfamily and four tribes were not monophyletic. Ancestral character state reconstruction (ACSR) showed that five diagnostic morphological characters evolved several times, confounding the current taxonomy. Additionally, reconstruction of the history of host plant colonization suggested host switching occurred in a limited range of host plants. The host association data made it possible to determine taxonomic relationships more accurately. These results show that by integrating morphological and molecular information and host plant choice, it is possible to obtain a more accurate taxonomy and a deeper phylogenetic understanding of Eriophyoidea. Copyright © 2014 Elsevier Inc. All rights reserved.
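
    Since ancestral character state reconstruction under parsimony underlies analyses like the one above, a minimal Fitch-algorithm sketch follows; the nested-tuple tree and single binary character are toy assumptions, not the Eriophyoidea data.

        def fitch(node):
            # Bottom-up Fitch pass: leaves are state sets, e.g. {'0'};
            # returns (candidate ancestral states, minimum change count).
            if isinstance(node, set):
                return node, 0
            left, lc = fitch(node[0])
            right, rc = fitch(node[1])
            common = left & right
            if common:                        # agreement: no extra change
                return common, lc + rc
            return left | right, lc + rc + 1  # conflict costs one change

        tree = (({'0'}, {'1'}), ({'1'}, {'1'}))   # ((A,B),(C,D)) tip states
        states, changes = fitch(tree)
        print(states, changes)                    # {'1'} 1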

  11. Statistical analysis of maximum likelihood estimator images of human brain FDG PET studies

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.; Hoffman, E.J.; Nunez, J.; Coakley, K.J.

    1993-01-01

    The work presented in this paper evaluates the statistical characteristics of regional bias and expected error in reconstructions of real PET data from human brain fluorodeoxyglucose (FDG) studies carried out by the maximum likelihood estimator (MLE) method with a robust stopping rule, and compares them with the results of filtered backprojection (FBP) reconstructions and with the method of sieves. The task that the authors have investigated is that of quantifying radioisotope uptake in regions of interest (ROIs). They first describe a robust methodology for the use of the MLE method with clinical data which contains only one adjustable parameter: the kernel size for a Gaussian filtering operation that determines final resolution and expected regional error. Simulation results are used to establish the fundamental characteristics of the reconstructions obtained by the proposed methodology, corresponding to the case in which the transition matrix is perfectly known. Then, data from 72 independent human brain FDG scans from four patients are used to show that the results obtained from real data are consistent with the simulation, although the quality of the data and of the transition matrix have an effect on the final outcome

  12. Simultaneous reconstruction of outer boundary shape and conductivity distribution in electrical impedance tomography

    KAUST Repository

    Hyvönen, Nuutti

    2016-01-05

    The simultaneous retrieval of the exterior boundary shape and the interior admittivity distribution of an examined body in electrical impedance tomography is considered. The reconstruction method is built for the complete electrode model and it is based on the Fréchet derivative of the corresponding current-to-voltage map with respect to the body shape. The reconstruction problem is cast into the Bayesian framework, and maximum a posteriori estimates for the admittivity and the boundary geometry are computed. The feasibility of the approach is evaluated by experimental data from water tank measurements.

  13. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but limits are imposed by noise and compression ratio on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem for compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficient for different regularizations and frames is resolved by the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  14. Plethodontid salamander mitochondrial genomics: A parsimony evaluation of character conflict and implications for historical biogeography

    Energy Technology Data Exchange (ETDEWEB)

    Macey, J. Robert

    2005-01-19

    A new parsimony analysis of 27 complete mitochondrial genomic sequences is conducted to investigate the phylogenetic relationships of plethodontid salamanders. This analysis focuses on the amount of character conflict between phylogenetic trees recovered from newly conducted parsimony searches and the Bayesian and maximum likelihood topology reported by Mueller et al. (2004, PNAS, 101, 13820-13825). Strong support for Hemidactylium as the sister taxon to all other plethodontids is recovered from parsimony analyses. Plotting area relationships on the most parsimonious phylogenetic tree suggests that eastern North America is the origin of the family Plethodontidae, supporting the "Out of Appalachia" hypothesis. A new taxonomy that recognizes clades recovered from phylogenetic analyses is proposed.

  15. Phylogenomics and evolution of floral traits in the Neotropical tribe Malmeeae (Annonaceae).

    Science.gov (United States)

    Lopes, J C; Chatrou, L W; Mello-Silva, R; Rudall, P J; Sajo, M G

    2018-01-01

    Androdioecy is the rarest sexual system among plants. The majority of androdioecious species are herbaceous plants that have evolved from dioecious ancestors. Nevertheless, some woody and androdioecious plants have hermaphrodite ancestors, as in the Annonaceae, where androdioecious genera have arisen several times in different lineages. The majority of androdioecious species of Annonaceae belong to the Neotropical tribe Malmeeae. In addition to these species, Pseudoxandra spiritus-sancti was recently confirmed to be androdioecious. Here, we describe the morphology of male and bisexual flowers of Pseudoxandra spiritus-sancti, and investigate the evolution of androdioecy in Malmeeae. The phylogeny of tribe Malmeeae was reconstructed for 32 taxa using Bayesian inference, maximum parsimony, and maximum likelihood, based on DNA sequences of 66 molecular markers of the chloroplast genome obtained by next-generation sequencing. The reconstruction of ancestral states was performed for characters associated with sexual systems and floral morphology. The phylogenetic analyses recovered three main groups in Malmeeae, (Malmea (Cremastosperma, Pseudoxandra)) sister to the rest of the tribe, and (Unonopsis (Bocageopsis, Onychopetalum)) sister to (Mosannona, Ephedranthus, Klarobelia, Oxandra, Pseudephedranthus fragrans, Pseudomalmea, Ruizodendron ovale). Hermaphroditism is plesiomorphic in the tribe, with four independent origins of androdioecy, which represents a synapomorphy of two groups, one that includes three genera and 14 species, the other a single genus of seven species. Male flowers are unisexual from inception and bisexual flowers possess staminodes and functional stamens. Pseudoxandra spiritus-sancti is structurally androdioecious. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Model-based respiratory motion compensation for emission tomography image reconstruction

    International Nuclear Information System (INIS)

    Reyes, M; Malandain, G; Koulibaly, P M; Gonzalez-Ballester, M A; Darcourt, J

    2007-01-01

    In emission tomography imaging, respiratory motion causes artifacts in reconstructed images of the lungs and heart, which lead to misinterpretation, imprecise diagnosis, impaired fusion with other modalities, etc. Solutions such as respiratory gating, correlated dynamic PET techniques, list-mode-data-based techniques and others have been tested; these lead to improvements in the spatial activity distribution of lung lesions, but have the disadvantages of requiring additional instrumentation or discarding part of the projection data used for reconstruction. The objective of this study is to incorporate respiratory motion compensation directly into the image reconstruction process, without any additional acquisition protocol considerations. To this end, we propose an extension to the maximum likelihood expectation maximization (MLEM) algorithm that includes a respiratory motion model, which takes into account the displacements and volume deformations produced by respiratory motion during the data acquisition process. We present results from synthetic simulations incorporating real respiratory motion as well as from phantom and patient data

  17. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  18. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  19. A Novel Kernel-Based Regularization Technique for PET Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Abdelwahhab Boudjelal

    2017-06-01

    Full Text Available Positron emission tomography (PET) is an imaging technique that generates 3D detail of physiological processes at the cellular level. The technique requires a radioactive tracer, which decays and releases a positron that collides with an electron; consequently, annihilation photons are emitted, which can be measured. The purpose of PET is to use the measurement of photons to reconstruct the distribution of radioisotopes in the body. Currently, PET is undergoing a revamp, with advancements in data measurement instruments and in the computing methods used to create the images. These computational methods are required to solve the inverse problem of “image reconstruction from projection”. This paper proposes a novel kernel-based regularization technique for maximum-likelihood expectation-maximization (κ-MLEM) to reconstruct the image. Compared to standard MLEM, the proposed algorithm is more robust and more effective in removing background noise while preserving the edges; this suppresses image artifacts, such as out-of-focus slice blur.

  20. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  1. Kinematic analysis of anterior cruciate ligament reconstruction in total knee arthroplasty

    Science.gov (United States)

    Liu, Hua-Wei; Ni, Ming; Zhang, Guo-Qiang; Li, Xiang; Chen, Hui; Zhang, Qiang; Chai, Wei; Zhou, Yong-Gang; Chen, Ji-Ying; Liu, Yu-Liang; Cheng, Cheng-Kung; Wang, Yan

    2016-01-01

    Background: This study aims to retain normal knee kinematics after knee replacement surgery by reconstructing the anterior cruciate ligament during total knee arthroplasty. Method: We used computational simulation tools to establish four dynamic knee models: a normal knee model, a posterior cruciate ligament-retaining knee model, a posterior cruciate ligament-substituting knee model, and an anterior cruciate ligament-reconstructing knee model. Our proposed method utilizes magnetic resonance images to reconstruct solid bones and ligament attachments, and assembles the femoral and tibial components according to representative literature and operative specifications. Dynamic data on axial tibial rotation and femoral translation from full extension to 135° of flexion were measured to analyze the motion of the knee models. Findings: The computational simulation results show that, compared with the posterior cruciate ligament-retained and posterior cruciate ligament-substituted knee models, reconstructing the anterior cruciate ligament improves the posterior movement of the lateral condyle and medial condyle and the tibial internal rotation through a full range of flexion. The maximum posterior translations of the lateral condyle and medial condyle and the maximum tibial internal rotation of the anterior cruciate ligament-reconstructed knee are 15.3 mm, 4.6 mm and 20.6° at 135° of flexion. Interpretation: Reconstructing the anterior cruciate ligament in total knee arthroplasty was shown to be a more effective way of maintaining normal knee kinematics than posterior cruciate ligament-retained and posterior cruciate ligament-substituted total knee arthroplasty. PMID:27347334

  2. Inverse Monte Carlo: a unified reconstruction algorithm for SPECT

    International Nuclear Information System (INIS)

    Floyd, C.E.; Coleman, R.E.; Jaszczak, R.J.

    1985-01-01

    Inverse Monte Carlo (IMOC) is presented as a unified reconstruction algorithm for Emission Computed Tomography (ECT), providing simultaneous compensation for scatter, attenuation, and the variation of collimator resolution with depth. The technique of inverse Monte Carlo is used to find an inverse solution to the photon transport equation (an integral equation for photon flux from a specified source) for a parameterized source and specific boundary conditions. The system of linear equations so formed is solved to yield the source activity distribution for a set of acquired projections. For the studies presented here, the equations are solved using the EM (Maximum Likelihood) algorithm, although other solution algorithms, such as Least Squares, could be employed. While the present results specifically consider the reconstruction of camera-based Single Photon Emission Computed Tomography (SPECT) images, the technique is equally valid for Positron Emission Tomography (PET) if a Monte Carlo model of such a system is used. As a preliminary evaluation, experimentally acquired SPECT phantom studies for imaging Tc-99m (140 keV) are presented which demonstrate quantitative compensation for scatter and attenuation for a two-dimensional (single-slice) reconstruction. The algorithm may be expanded in a straightforward manner to full three-dimensional reconstruction, including compensation for out-of-plane scatter

  3. Mastectomy Skin Necrosis After Breast Reconstruction: A Comparative Analysis Between Autologous Reconstruction and Implant-Based Reconstruction.

    Science.gov (United States)

    Sue, Gloria R; Lee, Gordon K

    2018-05-01

    Mastectomy skin necrosis is a significant problem after breast reconstruction. We sought to perform a comparative analysis on this complication between patients undergoing autologous breast reconstruction and patients undergoing 2-stage expander implant breast reconstruction. A retrospective review was performed on consecutive patients undergoing autologous breast reconstruction or 2-stage expander implant breast reconstruction by the senior author from 2006 through 2015. Patient demographic factors including age, body mass index, history of diabetes, history of smoking, and history of radiation to the breast were collected. Our primary outcome measure was mastectomy skin necrosis. Fisher exact test was used for statistical analysis between the 2 patient cohorts. The treatment patterns of mastectomy skin necrosis were then analyzed. We identified 204 patients who underwent autologous breast reconstruction and 293 patients who underwent 2-stage expander implant breast reconstruction. Patients undergoing autologous breast reconstruction were older, heavier, more likely to have diabetes, and more likely to have had prior radiation to the breast compared with patients undergoing implant-based reconstruction. The incidence of mastectomy skin necrosis was 30.4% of patients in the autologous group compared with only 10.6% of patients in the tissue expander group (P care in the autologous group, only 3.2% were treated with local wound care in the tissue expander group (P skin necrosis is significantly more likely to occur after autologous breast reconstruction compared with 2-stage expander implant-based breast reconstruction. Patients with autologous reconstructions are more readily treated with local wound care compared with patients with tissue expanders, who tended to require operative treatment of this complication. Patients considering breast reconstruction should be counseled appropriately regarding the differences in incidence and management of mastectomy skin

  4. Similarity-regulation of OS-EM for accelerated SPECT reconstruction

    Science.gov (United States)

    Vaissier, P. E. B.; Beekman, F. J.; Goorden, M. C.

    2016-06-01

    Ordered subsets expectation maximization (OS-EM) is widely used to accelerate image reconstruction in single photon emission computed tomography (SPECT). Speedup of OS-EM over maximum likelihood expectation maximization (ML-EM) is close to the number of subsets used. Although a high number of subsets can shorten reconstruction times significantly, it can also cause severe image artifacts such as improper erasure of reconstructed activity if projections contain few counts. We recently showed that such artifacts can be prevented by using a count-regulated OS-EM (CR-OS-EM) algorithm which automatically adapts the number of subsets for each voxel based on the estimated number of counts that the voxel contributed to the projections. While CR-OS-EM reached high speed-up over ML-EM in high-activity regions of images, speed in low-activity regions could still be very slow. In this work we propose similarity-regulated OS-EM (SR-OS-EM) as a much faster alternative to CR-OS-EM. SR-OS-EM also automatically and locally adapts the number of subsets, but it uses a different criterion for subset regulation: the number of subsets that is used for updating an individual voxel depends on how similar the reconstruction algorithm would update the estimated activity in that voxel with different subsets. Reconstructions of an image quality phantom and in vivo scans show that SR-OS-EM retains all of the favorable properties of CR-OS-EM, while reconstruction speed can be up to an order of magnitude higher in low-activity regions. Moreover our results suggest that SR-OS-EM can be operated with identical reconstruction parameters (including the number of iterations) for a wide range of count levels, which can be an additional advantage from a user perspective since users would only have to post-filter an image to present it at an appropriate noise level.
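
    For reference, a plain OS-EM sketch (not SR-OS-EM itself) is shown below; A, y, and the subset split are assumed stand-ins for the system matrix, measured projections, and ordered-subset partition.

        import numpy as np

        def os_em(A, y, n_subsets=8, n_iter=5):
            # Each sub-iteration updates the image with one projection subset,
            # giving roughly n_subsets-fold speedup over ML-EM.
            x = np.ones(A.shape[1])
            subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
            for _ in range(n_iter):
                for s in subsets:
                    As, ys = A[s], y[s]
                    proj = np.maximum(As @ x, 1e-12)
                    sens = np.maximum(As.sum(axis=0), 1e-12)
                    x *= (As.T @ (ys / proj)) / sens
            return x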

  5. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well
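
    The bias-variance decomposition behind the EMSE criterion can be stated compactly; this is the standard identity for an estimator of a fixed quantity, not a formula quoted from the paper:

        % EMSE of the ROI activity estimate decomposes into bias and variance
        \mathrm{EMSE}(\hat{\theta})
          = \mathbb{E}\bigl[(\hat{\theta} - \theta)^2\bigr]
          = \mathrm{bias}(\hat{\theta})^2 + \mathrm{Var}(\hat{\theta})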

  6. Simultaneous reconstruction, segmentation, and edge enhancement of relatively piecewise continuous images with intensity-level information

    International Nuclear Information System (INIS)

    Liang, Z.; Jaszczak, R.; Coleman, R.; Johnson, V.

    1991-01-01

    A multinomial image model is proposed which uses intensity-level information for reconstruction of contiguous image regions. The intensity-level information assumes that image intensities are relatively constant within contiguous regions over the image-pixel array and that the intensity levels of these regions are determined either empirically or theoretically by information criteria. These conditions may be valid, for example, for cardiac blood-pool imaging, where the intensity levels (or radionuclide activities) of myocardium, blood-pool, and background regions are distinct and the activities within each region of muscle, blood, or background are relatively uniform. To test the model, a mathematical phantom over a 64×64 array was constructed. The phantom had three contiguous regions, each with a different intensity level. Measurements from the phantom were simulated using an emission-tomography geometry. Fifty projections were generated over 180°, with 64 equally spaced parallel rays per projection. Projection data were randomized to contain Poisson noise. Image reconstructions were performed using an iterative maximum a posteriori probability procedure. The contiguous regions corresponding to the three intensity levels were automatically segmented. Simultaneously, the edges of the regions were sharpened. Noise in the reconstructed images was significantly suppressed. Convergence of the iterative procedure to the phantom was observed. Compared with maximum likelihood and filtered-backprojection approaches, the results obtained using the maximum a posteriori probability procedure with the intensity-level information demonstrated qualitative and quantitative improvement in localizing the regions of varying intensities

  7. Voting based object boundary reconstruction

    Science.gov (United States)

    Tian, Qi; Zhang, Like; Ma, Jingsheng

    2005-07-01

    A voting-based object boundary reconstruction approach is proposed in this paper. Morphological techniques have been adopted in many video object extraction applications to reconstruct missing pixels. However, when the missing areas become large, morphological processing cannot produce good results. Recently, tensor voting has attracted attention and can be used for boundary estimation on curves or irregular trajectories, but the complexity of saliency tensor creation limits its application in real-time systems. An alternative approach based on tensor voting is introduced in this paper. Rather than creating saliency tensors, we use a "2-pass" method for orientation estimation. In the first pass, a Sobel detector is applied to a coarse boundary image to obtain the gradient map. In the second pass, each pixel casts weights that decrease with distance, based on its gradient information, and the direction with the maximum weight sum is selected as the orientation of the pixel. After the orientation map is obtained, pixels are linked into edges or intersections along their directions. The approach is applied to various video surveillance clips under different conditions, and the experimental results demonstrate a significant improvement in the accuracy of the final extracted objects.
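
    The two-pass orientation estimation can be sketched as follows: a Sobel pass yields the gradient map, and a voting pass accumulates weights that decay with distance over a small set of quantized directions, keeping for each pixel the direction with the maximum weight sum. This is a simplified reading of the idea rather than the authors' exact scheme; the Gaussian decay of the neighborhood weights is an assumption.

```python
import numpy as np
from scipy import ndimage

def estimate_orientations(boundary, n_dirs=8, sigma=2.0):
    """Two-pass sketch: Sobel gradients, then distance-weighted voting
    over quantized edge directions (argmax of the accumulated weights)."""
    gx = ndimage.sobel(boundary, axis=1)
    gy = ndimage.sobel(boundary, axis=0)
    mag = np.hypot(gx, gy)
    # edge direction is perpendicular to the gradient direction
    theta = (np.arctan2(gy, gx) + np.pi / 2) % np.pi
    bins = (theta / np.pi * n_dirs).astype(int) % n_dirs
    votes = np.zeros((n_dirs,) + boundary.shape)
    for d in range(n_dirs):
        # neighbors vote for their direction with weights decreasing
        # with distance (Gaussian decay assumed here)
        votes[d] = ndimage.gaussian_filter(np.where(bins == d, mag, 0.0), sigma)
    return np.argmax(votes, axis=0) * np.pi / n_dirs  # radians per pixel
```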

  8. An objective and parsimonious approach for classifying natural flow regimes at a continental scale

    Science.gov (United States)

    Archfield, S. A.; Kennen, J.; Carlisle, D.; Wolock, D.

    2013-12-01

    Hydroecological stream classification--the process of grouping streams by similar hydrologic responses and, thereby, similar aquatic habitat--has been widely accepted and is often one of the first steps towards developing ecological flow targets. Despite its importance, the last national classification of streamgauges was completed about 20 years ago. A new classification of 1,534 streamgauges in the contiguous United States is presented using a novel and parsimonious approach to understand similarity in ecological streamflow response. This new classification approach uses seven fundamental daily streamflow statistics (FDSS) rather than winnowing down an uncorrelated subset from 200 or more ecologically relevant streamflow statistics (ERSS) commonly used in hydroecological classification studies. The results of this investigation demonstrate that the distributions of 33 tested ERSS are consistently different among the classes derived from the seven FDSS. It is further shown that classification based solely on the 33 ERSS generally does a poorer job in grouping similar streamgauges than the classification based on the seven FDSS. This new classification approach has the additional advantages of overcoming some of the subjectivity associated with the selection of the classification variables and provides a set of robust continental-scale classes of US streamgauges.

  9. Agaricus section Xanthodermatei: a phylogenetic reconstruction with commentary on taxa.

    Science.gov (United States)

    Kerrigan, Richard W; Callac, Philippe; Guinberteau, Jacques; Challen, Michael P; Parra, Luis A

    2005-01-01

    Agaricus section Xanthodermatei comprises a group of species allied to A. xanthodermus and generally characterized by basidiomata having phenolic odors, transiently yellowing discolorations in some parts of the basidiome, a negative Schaeffer's reaction, and mild to substantial toxicity. The section has a global distribution, while most included species have distributions restricted to regions of single continents. Using specimens and cultures from Europe, North America, and Hawaii, we analyzed DNA sequences from the ITS1+2 region of the nuclear rDNA to identify and characterize phylogenetically distinct entities and to construct a hypothesis of relationships, both among members of the section and with representative taxa from other sections of the genus. Sixty-one sequences from affiliated taxa, plus 20 from six (or seven) other sections of Agaricus, and one Micropsalliota sequence, were evaluated under distance, maximum parsimony and maximum likelihood methods. We recognized 21 discrete entities in Xanthodermatei, including 14 established species and seven new ones, three of which are described elsewhere. Four species from California, New Mexico, and France deserve further study before they are described. Type studies of American taxa are particularly emphasized, and a lectotype is designated for A. californicus. Section Xanthodermatei formed a single clade in most analyses, indicating that the traditional sectional characters noted above are good unifying characters that appear to have arisen only once within Agaricus. Deep divisions within the sequence-derived structure of the section could be interpreted as subsections of Xanthodermatei; however, various considerations led us to refrain from proposing new supraspecific taxa. The nearest neighbors of section Xanthodermatei are putatively in section Duploannulati.
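
    As a minimal illustration of the distance-based part of such an analysis, the Biopython sketch below builds a neighbor-joining tree from an aligned set of ITS sequences; the input file name and the simple identity distance model are assumptions for illustration.

```python
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# hypothetical input: a pre-aligned FASTA of ITS1+2 sequences
aln = AlignIO.read("xanthodermatei_its_aligned.fasta", "fasta")
dm = DistanceCalculator("identity").get_distance(aln)  # pairwise distances
tree = DistanceTreeConstructor().nj(dm)                # neighbor-joining tree
Phylo.draw_ascii(tree)
```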

  10. Fibrotic changes after postmastectomy radiotherapy and reconstructive surgery in breast cancer. A retrospective analysis in 109 patients

    International Nuclear Information System (INIS)

    Classen, Johannes; St. Vincentius-Kliniken, Karlsruhe; Nitzsche, Sibille; Wallwiener, Diethelm; Brucker, Sara; Kristen, Peter; Souchon, Rainer; Bamberg, Michael

    2010-01-01

    The purpose of this study was to analyze the probability and time course of fibrotic changes in breast reconstruction before or after postmastectomy radiotherapy (PMRT). Between 1995 and 2004, 109 patients were treated with PMRT at Tuebingen University and underwent heterologous (HL) or autologous (AL) breast reconstruction prior or subsequent to radiation therapy. Fibrosis of the reconstructed breast after radiotherapy was assessed using the Baker score for HL reconstructions and the Common Terminology Criteria for Adverse Events (CTCAE) for all patients. Actuarial rates of fibrosis were calculated for the maximum degree acquired during follow-up and at the last follow-up visit documented. Median time to follow-up was 34 months (3-227 months). Radiotherapy was applied with a median total dose of 50.4 Gy. A total of 44 patients (40.4%) received a boost treatment with a median dose of 10 Gy. Breast reconstruction was performed with AL, HL, or combined techniques in 20, 82, and 7 patients, respectively. The 3-year incidence of ≥ grade III maximum fibrosis was 20% and 43% for Baker and CTCAE scores, respectively. The corresponding figures for fibrosis at last follow-up visit were 18% and 2%. The 3-year rate of surgical correction of the contralateral breast was 30%. Initially unplanned surgery of the reconstructed breast was performed in 39 patients (35.8%). Boost treatment and type of cosmetic surgery (HL vs. AL) were not significantly associated with the incidence of fibrosis. We found severe fibrosis to be a frequent complication after PMRT and breast reconstruction. However, surgical intervention can ameliorate the majority of high-grade fibrotic events, leading to acceptable long-term results. No treatment parameters associated with the rate of fibrosis could be identified. (orig.)

  11. Assessment of phylogenetic sensitivity for reconstructing HIV-1 epidemiological relationships.

    Science.gov (United States)

    Beloukas, Apostolos; Magiorkinis, Emmanouil; Magiorkinis, Gkikas; Zavitsanou, Asimina; Karamitros, Timokratis; Hatzakis, Angelos; Paraskevis, Dimitrios

    2012-06-01

    Phylogenetic analysis has been extensively used as a tool for the reconstruction of epidemiological relations for research or for forensic purposes. Our objective was to assess the sensitivity of different phylogenetic methods and programs for reconstructing epidemiological links among HIV-1 infected patients, that is, the probability of revealing a true transmission relationship. Multiple datasets (90) were prepared consisting of HIV-1 sequences in protease (PR) and partial reverse transcriptase (RT) sampled from patients with a documented epidemiological relationship (target population), and from unrelated individuals (control population) belonging to the same HIV-1 subtype as the target population. Each dataset varied regarding the number, the geographic origin and the transmission risk groups of the sequences in the control population. Phylogenetic trees were inferred by neighbor-joining (NJ), maximum likelihood heuristics (hML) and Bayesian methods. All clusters of sequences belonging to the target population were correctly reconstructed by NJ and Bayesian methods, receiving high bootstrap and posterior probability (PP) support, respectively. On the other hand, TreePuzzle failed to reconstruct or provide significant support for several clusters; high puzzling step support was associated with the inclusion of control sequences from the same geographic area as the target population. In contrast, all clusters were correctly reconstructed by hML as implemented in PhyML 3.0, receiving high bootstrap support. We report that, under the conditions of our study, hML using PhyML, NJ and Bayesian methods were the most sensitive for the reconstruction of epidemiological links, mostly from sexually infected individuals. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Evaluating low pass filters on SPECT reconstructed cardiac orientation estimation

    Science.gov (United States)

    Dwivedi, Shekhar

    2009-02-01

    Low pass filters can affect the quality of clinical SPECT images by smoothing. Appropriate filter and parameter selection leads to optimum smoothing, better quantification, and hence correct diagnosis and accurate interpretation by the physician. This study aims at evaluating low pass filters in SPECT reconstruction; the filters are evaluated by estimating the azimuth and elevation angles of the SPECT-reconstructed cardiac orientation. The low pass filters studied are Butterworth, Gaussian, Hamming, Hanning and Parzen. Experiments are conducted using three reconstruction algorithms, FBP (filtered back projection), MLEM (maximum likelihood expectation maximization) and OSEM (ordered subsets expectation maximization), on four gated cardiac patient projections (two patients, each with stress and rest projections). Each filter is applied with varying cutoff and order for each reconstruction algorithm (only Butterworth is used for MLEM and OSEM). The azimuth and elevation angles are calculated from the reconstructed volume, and the variation observed in the angles with varying filter parameters is reported. Our results demonstrate that the behavior of the Hamming, Hanning and Parzen filters (used with FBP) with varying cutoff is similar for all the datasets. The Butterworth filter (cutoff > 0.4) behaves similarly for all the datasets using all the algorithms, whereas with OSEM for a cutoff < 0.4 it fails to generate the cardiac orientation due to oversmoothing, and it gives an unstable response with FBP and MLEM. This study of the effect of low pass filter cutoff and order on cardiac orientation using three different reconstruction algorithms provides an interesting insight into optimal selection of filter parameters.
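
    The Butterworth filter evaluated here has the transfer function H(f) = 1 / (1 + (f/fc)^(2n)), with cutoff fc and order n. A minimal 2D frequency-domain implementation for smoothing a projection or slice, assuming the cutoff is expressed in cycles per pixel (Nyquist = 0.5; conventions vary between systems):

```python
import numpy as np

def butterworth_lowpass(image, cutoff=0.4, order=5):
    """Apply a 2D Butterworth low-pass filter in the frequency domain.

    cutoff is assumed to be in cycles/pixel (Nyquist = 0.5), so values
    such as 0.4 correspond to the cutoffs discussed in the abstract.
    """
    fy = np.fft.fftfreq(image.shape[0])
    fx = np.fft.fftfreq(image.shape[1])
    f = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))  # radial frequency
    h = 1.0 / (1.0 + (f / cutoff) ** (2 * order))      # Butterworth window
    return np.real(np.fft.ifft2(np.fft.fft2(image) * h))
```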

  13. Additions, losses, and rearrangements on the evolutionary route from a reconstructed ancestor to the modern Saccharomyces cerevisiae genome.

    Directory of Open Access Journals (Sweden)

    Jonathan L Gordon

    2009-05-01

    Comparative genomics can be used to infer the history of genomic rearrangements that occurred during the evolution of a species. We used the principle of parsimony, applied to aligned synteny blocks from 11 yeast species, to infer the gene content and gene order that existed in the genome of an extinct ancestral yeast about 100 Mya, immediately before it underwent whole-genome duplication (WGD). The reconstructed ancestral genome contains 4,703 ordered loci on eight chromosomes. The reconstruction is complete except for the subtelomeric regions. We then inferred the series of rearrangement steps that led from this ancestor to the current Saccharomyces cerevisiae genome; relative to the ancestral genome we observe 73 inversions, 66 reciprocal translocations, and five translocations involving telomeres. Some fragile chromosomal sites were reused as evolutionary breakpoints multiple times. We identified 124 genes that have been gained by S. cerevisiae in the time since the WGD, including one that is derived from a hAT family transposon, and 88 ancestral loci at which S. cerevisiae did not retain either of the gene copies that were formed by WGD. Sites of gene gain and evolutionary breakpoints both tend to be associated with tRNA genes and, to a lesser extent, with origins of replication. Many of the gained genes in S. cerevisiae have functions associated with ethanol production, growth in hypoxic environments, or the uptake of alternative nutrient sources.

  14. Gender differences in the knee adduction moment after anterior cruciate ligament reconstruction surgery.

    Science.gov (United States)

    Webster, Kate E; McClelland, Jodie A; Palazzolo, Simon E; Santamaria, Luke J; Feller, Julian A

    2012-04-01

    The external knee adduction moment during gait has previously been associated with knee pain and osteoarthritis (OA). Recently, the knee adduction moment has been shown to be increased following anterior cruciate ligament (ACL) reconstruction surgery and has been suggested as a potential mechanism for the progression of early onset knee OA in this population. No study has investigated gender differences in gait biomechanics following ACL reconstruction. The aim of this study was therefore to examine gender differences in gait biomechanics following ACL reconstruction surgery. Thirty-six subjects (18 females, 18 males) who had previously undergone ACL reconstruction surgery (mean time since surgery 20 months) underwent gait analysis at a self-selected walking speed. Males and females were well matched for age, time since surgery and walking speed. Maximum flexion and adduction angles and moments were recorded during the stance phase of level walking and compared between the male and female groups. The knee adduction moment was 23% greater in the female than in the male ACL group. No gender differences were seen in the sagittal plane. No differences were seen between the reconstructed and contralateral limbs. The higher knee adduction moment seen in females may suggest an increased risk for the development of OA in ACL-reconstructed females.

  15. Virtual reconstruction of modern and fossil hominoid crania: consequences of reference sample choice.

    Science.gov (United States)

    Senck, Sascha; Bookstein, Fred L; Benazzi, Stefano; Kastner, Johann; Weber, Gerhard W

    2015-05-01

    Most hominin cranial fossils are incomplete and require reconstruction prior to subsequent analyses. Missing data can be estimated by geometric morphometrics using information from complete specimens, for example, by using thin-plate splines. In this study, we estimate missing data in several virtually fragmented models of hominoid crania (Homo, Pan, Pongo) and fossil hominins (e.g., Australopithecus africanus, Homo heidelbergensis). The aim is to investigate in which way different references influence estimations of cranial shape and how this information can be employed in the reconstruction of fossils. We used a sample of 64 three-dimensional digital models of complete human, chimpanzee, and orangutan crania and a set of 758 landmarks and semilandmarks. The virtually knocked-out neurocranial and facial areas that were reconstructed corresponded to those of a real case found in the A.L. 444-2 (A. afarensis) cranium. Accuracy of multiple intraspecies and interspecies reconstructions was computed as the maximum square root of the mean squared difference between the original and the reconstruction (root mean square). The results show that the uncertainty in reconstructions is a function of both the geometry of the knockout area and the dissimilarity between the reference sample and the specimen(s) undergoing reconstruction. We suggest that it is possible to estimate large missing cranial areas if the shape of the reference is similar enough to the shape of the specimen being reconstructed, though caution must be exercised when employing these reconstructions in subsequent analyses. We provide a potential guide for the choice of the reference by means of bending energy. © 2015 Wiley Periodicals, Inc.
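
    The thin-plate spline estimation of missing landmarks mentioned above can be sketched with SciPy: fit a TPS map from the reference specimen's preserved landmarks to the corresponding landmarks of the fragmentary target, then warp the reference's remaining landmarks into the target's space. A minimal sketch, assuming landmarks stored as NumPy arrays with known index sets:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def estimate_missing_landmarks(ref, target_present, present_idx, missing_idx):
    """ref: (n, 3) landmarks of a complete reference cranium;
    target_present: (k, 3) landmarks preserved on the fragmentary specimen;
    present_idx/missing_idx: indices of preserved/missing landmarks in ref.
    Returns TPS-based estimates of the missing (m, 3) landmarks."""
    tps = RBFInterpolator(ref[present_idx], target_present,
                          kernel="thin_plate_spline")
    return tps(ref[missing_idx])
```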

  16. Past climate reconstruction: a tool for assessing site suitability

    International Nuclear Information System (INIS)

    Potter, G.

    1978-01-01

    Reconstructing past climatic variations can lead to a better understanding of possible future precipitation and groundwater recharge patterns. Work so far has led to several new insights into past climate variability and will provide input into the hydrologic modeling effort in progress for the Waste Management Program. Short-term reconstructions (0 to 350 y) suggest that the basin and range of the southwestern United States have the driest, least variable precipitation record. The Pacific Northwest shows higher variability and several trends lasting for more than 25 y. The Southern High Plains have even more variability, but the upper Midwest and Southwest vary most and have the highest precipitation amounts. Pollen and lake level data from the literature suggest that the Southwest was wetter during at least part of the last glacial maximum than it is today.

  17. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness

  18. A temporal interpolation approach for dynamic reconstruction in perfusion CT

    International Nuclear Information System (INIS)

    Montes, Pau; Lauritsch, Guenter

    2007-01-01

    This article presents a dynamic CT reconstruction algorithm for objects with a time-dependent attenuation coefficient. Projection data acquired over several rotations are interpreted as samples of a continuous signal. Based on this idea, a temporal interpolation approach is proposed which provides the maximum temporal resolution for a given rotational speed of the CT scanner. Interpolation is performed using polynomial splines. The algorithm can be adapted to slow signals, reducing the amount of data acquired and the computational cost. A theoretical analysis of the approximations made by the algorithm is provided. In simulation studies, the temporal interpolation approach is compared with three other dynamic reconstruction algorithms based on linear regression, linear interpolation, and generalized Parker weighting. The presented algorithm exhibits the highest temporal resolution for a given sampling interval. Hence, our approach needs less input data to achieve a certain quality in the reconstruction than the other algorithms discussed or, equivalently, less x-ray exposure and computational complexity. The proposed algorithm additionally allows the possibility of using slow rotating scanners for perfusion imaging purposes.
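
    Treating each detector reading as a time series sampled once per rotation, the interpolation step can be sketched with cubic splines (standing in for the paper's polynomial splines), under the simplifying assumption that all views of a rotation share a single time stamp:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def interpolate_projections(proj, t_rot, t_query):
    """proj: (n_rotations, n_angles, n_rays) samples, one per rotation;
    t_rot: (n_rotations,) acquisition times of the rotations; returns a
    temporally consistent (n_angles, n_rays) projection set at t_query."""
    spline = CubicSpline(t_rot, proj, axis=0)  # per-(angle, ray) splines
    return spline(t_query)
```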

  19. Quantifying Regional Vegetation Changes in China During Three Contrasting Warming Intervals since the Last Glacial Maximum

    Science.gov (United States)

    Li, Q.; Wu, H.; Yu, Y.; Sun, A.; Luo, Y.

    2017-12-01

    Reconstructing patterns of past vegetation change on a large scale facilitates a better understanding of the interactions and feedbacks between climate change and the terrestrial biosphere. In addition, the need to reduce the uncertainty in predictions of vegetation change under global warming highlights the importance of reconstructing vegetation patterns during past warming intervals. Here, we present a quantitative regional vegetation reconstruction for China during three intervals: the Last Glacial Maximum (LGM, 18±2 14C kyr B.P.), the early Holocene (8.5±0.5 14C kyr B.P.), and the mid-Holocene (6±0.5 14C kyr B.P.). The biomization method, based on 249 pollen records, was used for the reconstructions. The results demonstrate that during the LGM, steppe and desert expanded eastwards and southwards, reaching the present-day temperate deciduous forest (TEDE) zone, and dominated northern China. In contrast, the forest in Eastern China underwent a substantial southwards retreat and the percentage of forest-type sites was at a minimum. In addition, the warm mixed forest (WAMF) and TEDE shifted southwards by about 10° relative to the present day, and tropical seasonal rain forest (TSFO) was almost absent. At the same time, the forest-steppe boundary shifted southwards to near the middle and lower reaches of the Yangtze River. For the early Holocene and mid-Holocene, the TSFO, WAMF, and TEDE shifted northwards by 2-5° relative to today, and the percentage of forest sites increased and reached a maximum in the mid-Holocene. The slight expansion of forest from the early Holocene to the mid-Holocene caused the forest-steppe boundary to shift northwestwards to near the present-day 300 mm isohyet by the mid-Holocene. Our results also indicate that climatic warming since the LGM, which strengthened the East Asian summer monsoon, favored the development of forest in China. This is potentially an important finding for evaluating the possible response of forest in China to future global warming.

  20. Stochastic rainfall modeling in West Africa: Parsimonious approaches for domestic rainwater harvesting assessment

    Science.gov (United States)

    Cowden, Joshua R.; Watkins, David W., Jr.; Mihelcic, James R.

    2008-10-01

    Several parsimonious stochastic rainfall models are developed and compared for application to domestic rainwater harvesting (DRWH) assessment in West Africa. Worldwide, improved water access rates are lowest for Sub-Saharan Africa, including the West African region, and these low rates have important implications for the health and economy of the region. DRWH is proposed as a potential mechanism for water supply enhancement, especially for poor urban households in the region, and is therefore relevant to development planning and poverty alleviation initiatives. The stochastic rainfall models examined are Markov models and LARS-WG, selected for their availability and ease of use for water planners in the developing world. A first-order Markov occurrence model with a mixed exponential amount model is selected as the best option among unconditioned Markov models. However, there is no clear advantage in selecting Markov models over the LARS-WG model for DRWH in West Africa, with each model having distinct strengths and weaknesses. A multi-model approach is used in assessing DRWH in the region to illustrate the variability associated with the rainfall models. It is clear that DRWH can be successfully used as a water enhancement mechanism in West Africa for certain times of the year. A 200 L drum storage capacity could potentially optimize these simple, small roof area systems for many locations in the region.
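
    A first-order Markov occurrence model with a mixed exponential amount model is straightforward to simulate: a two-state chain decides wet or dry from the previous day's state, and wet-day depths are drawn from a two-component exponential mixture. A minimal sketch with illustrative, not fitted, parameter values:

```python
import numpy as np

def simulate_daily_rainfall(n_days, p_wd=0.3, p_ww=0.6,
                            w=0.7, mean1=5.0, mean2=20.0, seed=0):
    """First-order Markov occurrence + mixed exponential amounts.

    p_wd: P(wet | dry yesterday); p_ww: P(wet | wet yesterday);
    w, mean1, mean2: mixing weight and exponential means in mm
    (all values are illustrative placeholders, not fitted parameters).
    """
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        wet = rng.random() < (p_ww if wet else p_wd)
        if wet:
            mean = mean1 if rng.random() < w else mean2
            rain[t] = rng.exponential(mean)
    return rain
```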

  1. Reconstruction of multiple-pinhole micro-SPECT data using origin ensembles.

    Science.gov (United States)

    Lyon, Morgan C; Sitek, Arkadiusz; Metzler, Scott D; Moore, Stephen C

    2016-10-01

    The authors are currently developing a dual-resolution multiple-pinhole micro-SPECT imaging system based on three large NaI(Tl) gamma cameras. Two multiple-pinhole tungsten collimator tubes will be used sequentially for whole-body "scout" imaging of a mouse, followed by high-resolution (hi-res) imaging of an organ of interest, such as the heart or brain. Ideally, the whole-body image will be reconstructed in real time such that data need only be acquired until the area of interest can be visualized well enough to determine positioning for the hi-res scan. The authors investigated the utility of the origin ensemble (OE) algorithm for online and offline reconstructions of the scout data. This algorithm operates directly in image space, and can provide estimates of image uncertainty along with the reconstructed images. Techniques for accelerating the OE reconstruction were also introduced and evaluated. System matrices were calculated for our 39-pinhole scout collimator design. SPECT projections were simulated for a range of count levels using the MOBY digital mouse phantom. Simulated data were used for a comparison of OE and maximum-likelihood expectation maximization (MLEM) reconstructions. The OE algorithm convergence was evaluated by calculating the total-image entropy and by measuring the counts in a volume-of-interest (VOI) containing the heart. Total-image entropy was also calculated for simulated MOBY data reconstructed using OE with various levels of parallelization. For VOI measurements in the heart, liver, bladder, and soft tissue, MLEM and OE reconstructed images agreed within 6%. Image entropy converged after ∼2000 iterations of OE, while the counts in the heart converged earlier, at ∼200 iterations of OE. An accelerated version of OE completed 1000 iterations in <9 min for a 6.8M count data set, with some loss of image entropy performance, whereas the same dataset required ∼79 min to complete 1000 iterations of conventional OE. A combination of the two

  2. Strategies of reconstruction algorithms for computerized tomography

    International Nuclear Information System (INIS)

    Garderet, P.

    1984-10-01

    Image reconstruction from projections has progressively spread over all fields of medical imaging. As the mathematical aspects of the problem have become more comprehensively explored, a great variety of numerical solutions has been developed, each best suited to a particular medical imaging application and taking into account the physical phenomena related to data collection (a priori properties of signal and noise). The purpose of this survey is to present the general mathematical frame and the fundamental assumptions of the various strategies. Fourier methods approximate an explicit deterministic inversion formula for the Radon transform. Algebraic reconstruction techniques set up an a priori discrete model through a series expansion approach to the solution; the numerical system to be solved is huge when a fine grid of pixels is to be reconstructed, and iterative solutions may then be found. Recently, some least squares procedures have been shown to be tractable which avoid the use of iterative methods. Finally, maximum likelihood approaches accurately incorporate the Poisson nature of photon noise and are well adapted to emission computed tomography. The various strategies are analysed both in terms of the theoretical assumptions needed for their suitable use and in terms of computing requirements, actual performance and cost. In the end we take a glimpse at the extension of the algorithms from two-dimensional imaging to fully three-dimensional volume analysis in preparation for future medical imaging technologies.
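
    As a concrete instance of the maximum likelihood family surveyed here, the classical MLEM update for emission tomography, x ← (x / Aᵀ1) · Aᵀ(y / Ax), fits in a few lines (a dense-matrix teaching sketch, not a production implementation):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM for Poisson emission data: A is the (n_bins, n_pixels)
    system matrix, y the measured counts; returns the pixel estimates."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                           # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```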

  3. Bayesian image reconstruction for emission tomography based on median root prior

    International Nuclear Information System (INIS)

    Alenius, S.

    1997-01-01

    The aim of the present study was to investigate a new type of Bayesian one-step-late reconstruction method which utilizes a median root prior (MRP). The method favours images which have locally monotonous radioactivity concentrations. The new reconstruction algorithm was applied to ideal simulated data, phantom data and some patient examinations with PET. The same projection data were reconstructed with filtered back-projection (FBP) and maximum likelihood-expectation maximization (ML-EM) methods for comparison. The MRP method provided good-quality images with a resolution similar to that of the FBP method with a ramp filter, and at the same time the noise properties were as good as with Hann-filtered FBP images. The typical artefacts seen in FBP reconstructed images outside the object were completely removed, as was the grainy noise inside the object. Quantitatively, the resulting average regional radioactivity concentrations in a large region of interest in images produced by the MRP method corresponded to the FBP and ML-EM results, but at the pixel-by-pixel level the MRP method proved to be the most accurate of the tested methods. In contrast to other iterative reconstruction methods, e.g. ML-EM, the MRP method was not sensitive to the number of iterations nor to the adjustment of reconstruction parameters. Only the Bayesian parameter β had to be set. The proposed MRP method is much simpler to use than the methods described previously, both with regard to parameter settings and in terms of general use. The new MRP reconstruction method was shown to produce high-quality quantitative emission images with only one parameter setting in addition to the number of iterations. (orig.)
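
    The MRP fits naturally into a one-step-late EM scheme: the prior gradient β(λ − M)/M, where M is the local median of the current image, enters the EM denominator. A minimal sketch following the MRP idea, not necessarily the paper's exact formulation:

```python
import numpy as np
from scipy.ndimage import median_filter

def mrp_osl(A, y, shape, beta=0.3, n_iter=50, window=3):
    """One-step-late EM with a median root prior: penalize deviation of
    each pixel from the median of its (window x window) neighborhood."""
    x = np.ones(int(np.prod(shape)))
    sens = A.sum(axis=0)
    for _ in range(n_iter):
        med = median_filter(x.reshape(shape), size=window).ravel()
        prior_grad = beta * (x - med) / np.maximum(med, 1e-12)
        proj = A @ x
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x / np.maximum(sens + prior_grad, 1e-12) * (A.T @ ratio)
    return x.reshape(shape)
```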

  4. Optimization of the reconstruction parameters in [123I]FP-CIT SPECT

    Science.gov (United States)

    Niñerola-Baizán, Aida; Gallego, Judith; Cot, Albert; Aguiar, Pablo; Lomeña, Francisco; Pavía, Javier; Ros, Domènec

    2018-04-01

    The aim of this work was to obtain a set of parameters to be applied in [123I]FP-CIT SPECT reconstruction in order to minimize the error between standardized and true values of the specific uptake ratio (SUR) in dopaminergic neurotransmission SPECT studies. To this end, Monte Carlo simulation was used to generate a database of 1380 projection data-sets from 23 subjects, including normal cases and a variety of pathologies. Studies were reconstructed using filtered back projection (FBP) with attenuation correction and ordered subset expectation maximization (OSEM) with correction for different degradations (attenuation, scatter and PSF). The reconstruction parameters to be optimized were the cut-off frequency of a 2D Butterworth pre-filter in FBP, and the number of iterations and the full width at half maximum of a 3D Gaussian post-filter in OSEM. Reconstructed images were quantified using regions of interest (ROIs) derived from magnetic resonance scans and from the Automated Anatomical Labeling map. Results were standardized by applying a simple linear regression line obtained from the entire patient dataset. Our findings show that we can obtain a set of optimal parameters for each reconstruction strategy. The accuracy of the standardized SUR increases when the reconstruction method includes more corrections. The use of generic ROIs instead of subject-specific ROIs adds significant inaccuracies. Thus, after reconstruction with OSEM and correction for all degradations, subject-specific ROIs led to errors between standardized and true SUR values in the range [-0.5, +0.5] in 87% and 92% of the cases for the caudate and putamen, respectively. These percentages dropped to 75% and 88% when the generic ROIs were used.

  5. Edge-promoting reconstruction of absorption and diffusivity in optical tomography

    International Nuclear Information System (INIS)

    Hannukainen, A; Hyvönen, N; Majander, H; Harhanen, L

    2016-01-01

    In optical tomography a physical body is illuminated with near-infrared light and the resulting outward photon flux is measured at the object boundary. The goal is to reconstruct the internal optical properties of the body, such as absorption and diffusivity. In this work, it is assumed that the imaged object is composed of an approximately homogeneous background with clearly distinguishable embedded inhomogeneities. An algorithm for finding the maximum a posteriori estimate for the absorption and diffusion coefficients is introduced, assuming an edge-preferring prior and an additive Gaussian measurement noise model. The method is based on iteratively combining a lagged diffusivity step and a linearization of the measurement model of diffuse optical tomography with prior-conditioned LSQR. The performance of the reconstruction technique is tested via three-dimensional numerical experiments with simulated data. (paper)

  6. Resolution-recovery-embedded image reconstruction for a high-resolution animal SPECT system.

    Science.gov (United States)

    Zeraatkar, Navid; Sajedi, Salar; Farahani, Mohammad Hossein; Arabi, Hossein; Sarkar, Saeed; Ghafarian, Pardis; Rahmim, Arman; Ay, Mohammad Reza

    2014-11-01

    The small-animal High-Resolution SPECT (HiReSPECT) is a dedicated dual-head gamma camera recently designed and developed in our laboratory for imaging of murine models. Each detector is composed of an array of 1.2 × 1.2 mm² (pitch) pixelated CsI(Na) crystals. Two position-sensitive photomultiplier tubes (H8500) are coupled to each head's crystal. In this paper, we report on a resolution-recovery-embedded image reconstruction code applicable to the system and present the experimental results achieved using different phantoms and mouse scans. Collimator-detector response functions (CDRFs) were measured via a pixel-driven method using capillary sources at finite distances from the head within the field of view (FOV). CDRFs were then fitted by independent Gaussian functions. Thereafter, linear interpolation was applied to the standard deviation (σ) values of the fitted Gaussians, yielding a continuous map of the CDRF at varying distances from the head. A rotation-based maximum-likelihood expectation maximization (MLEM) method was used for reconstruction. A fast rotation algorithm was developed to rotate the image matrix according to the desired angle by means of pre-generated rotation maps. The experiments demonstrated improved resolution utilizing our resolution-recovery-embedded image reconstruction. While the full-width at half-maximum (FWHM) radial and tangential resolution measurements of the system were over 2 mm in nearly all positions within the FOV without resolution recovery, reaching around 2.5 mm in some locations, they fell below 1.8 mm everywhere within the FOV using the resolution-recovery algorithm. The noise performance of the system was also acceptable; the standard deviation of the average counts per voxel in the reconstructed images was 6.6% and 8.3% without and with resolution recovery, respectively. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  7. Joint model of motion and anatomy for PET image reconstruction

    International Nuclear Information System (INIS)

    Qiao Feng; Pan Tinsu; Clark, John W. Jr.; Mawlawi, Osama

    2007-01-01

    Anatomy-based positron emission tomography (PET) image enhancement techniques have been shown to have the potential for improving PET image quality. However, these techniques assume an accurate alignment between the anatomical and the functional images, which is not always valid when imaging the chest due to respiratory motion. In this article, we present a joint model of both motion and anatomical information by integrating a motion-incorporated PET imaging system model with an anatomy-based maximum a posteriori image reconstruction algorithm. The mismatched anatomical information due to motion can thus be effectively utilized through this joint model. A computer simulation and a phantom study were conducted to assess the efficacy of the joint model, whereby motion and anatomical information were either modeled separately or combined. The reconstructed images in each case were compared to corresponding reference images obtained using a quadratic image prior based maximum a posteriori reconstruction algorithm for quantitative accuracy. Results of these studies indicated that while modeling anatomical information or motion alone improved the PET image quantitation accuracy, a larger improvement in accuracy was achieved when using the joint model. In the computer simulation study and using similar image noise levels, the improvement in quantitation accuracy compared to the reference images was 5.3% and 19.8% when using anatomical or motion information alone, respectively, and 35.5% when using the joint model. In the phantom study, these results were 5.6%, 5.8%, and 19.8%, respectively. These results suggest that motion compensation is important in order to effectively utilize anatomical information in chest imaging using PET. The joint motion-anatomy model presented in this paper provides a promising solution to this problem.

  8. Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.

    Science.gov (United States)

    Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien

    2017-01-01

    Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.

  9. Reconstruction of pin burnup characteristics from nodal calculations in hexagonal geometry

    International Nuclear Information System (INIS)

    Yang, W.S.; Finck, P.J.; Khalil, H.S.

    1990-01-01

    A reconstruction method has been developed for recovering pin burnup characteristics from fuel cycle calculations performed in hexagonal-z geometry using the nodal diffusion option of the DIF3D/REBUS-3 code system. Intranodal distributions of group fluxes, nuclide densities, power density, burnup, and fluence are efficiently computed using polynomial shapes constrained to satisfy nodal information. The accuracy of the method has been tested by performing several numerical benchmark calculations and by comparing predicted local burnups to values measured for experimental assemblies in EBR-II. The results indicate that the reconstruction methods are quite accurate, yielding maximum errors in power and nuclide densities that are less than 2% for driver assemblies and typically less than 5% for blanket assemblies. 14 refs., 2 figs., 5 tabs

  10. Noniterative MAP reconstruction using sparse matrix representations.

    Science.gov (United States)

    Cao, Guangzhi; Bouman, Charles A; Webb, Kevin J

    2009-09-01

    We present a method for noniterative maximum a posteriori (MAP) tomographic reconstruction which is based on the use of sparse matrix representations. Our approach is to precompute and store the inverse matrix required for MAP reconstruction. This approach has generally not been used in the past because the inverse matrix is typically large and fully populated (i.e., not sparse). In order to overcome this problem, we introduce two new ideas. The first idea is a novel theory for the lossy source coding of matrix transformations, which we refer to as matrix source coding. This theory is based on a distortion metric that reflects the distortions produced in the final matrix-vector product, rather than the distortions in the coded matrix itself. The resulting algorithms are shown to require orthonormal transformations of both the measurement data and the matrix rows and columns before quantization and coding. The second idea is a method for efficiently storing and computing the required orthonormal transformations, which we call a sparse-matrix transform (SMT). The SMT is a generalization of the classical FFT in that it uses butterflies to compute an orthonormal transform; but unlike an FFT, the SMT uses the butterflies in an irregular pattern, and is numerically designed to best approximate the desired transforms. We demonstrate the potential of the noniterative MAP reconstruction with examples from optical tomography. The method requires offline computation to encode the inverse transform. However, once these offline computations are completed, the noniterative MAP algorithm is shown to reduce both storage and computation by well over two orders of magnitude, compared to linear iterative reconstruction methods.
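
    For a quadratic prior the MAP estimate is linear in the data, x = (AᵀA + λLᵀL)⁻¹Aᵀy, so the reconstruction operator can be precomputed once and applied by a single matrix-vector product per data set. A dense toy version of that precomputation follows; the paper's actual contribution, compressing this operator via matrix source coding and the SMT, is omitted here.

```python
import numpy as np

def precompute_map_operator(A, L, lam):
    """Return H = (A^T A + lam * L^T L)^(-1) A^T for a quadratic prior;
    afterwards every reconstruction is just x = H @ y (dense sketch only,
    without the sparse coded representation introduced in the paper)."""
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T)

# offline: H = precompute_map_operator(A, L, lam)
# online, per measurement vector y: x = H @ y
```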

  11. Body mass reconstruction on the basis of selected skeletal traits.

    Science.gov (United States)

    Myszka, Anna; Piontek, Janusz; Vancata, Vaclav

    2012-07-01

    The objectives of this paper are to estimate the body mass of skeletons with a mechanical method (the femoral head body mass estimation method, FH) and a non-mechanical method (the stature/living bi-iliac breadth body mass estimation method, ST/LBIB), and to compare the reliability and potential use of the results obtained with both methods. The material (46 skeletons: 26 males, 20 females) used in the study came from the medieval burial ground in Cedynia, Poland. Body mass reconstruction with the non-mechanical method was made using the equations proposed by Ruff et al. (2005); body mass estimation with the mechanical method was calculated using the formulas proposed by Ruff et al. (1995). In the mechanical method, femoral head superoinferior breadth was used. Reconstruction of body weight using the non-mechanical method was based on maximum pelvic breadth and reconstructed body height. The correlations between bi-iliac breadth and femoral head measurements, and between femoral head measurements and reconstructed body height, were also calculated. The significance of differences between the body mass of male and female individuals was tested with the Mann-Whitney U-test. The relationships between body mass values obtained with the mechanical (FH) and the non-mechanical (ST/LBIB) method, between bi-iliac breadth and femoral head measurements, and between femoral head measurements and reconstructed body height, were tested using Pearson's correlation. In males, in contrast to females, there is no statistically significant correlation between body mass estimated with the mechanical method (FH) and the non-mechanical method (ST/LBIB). In neither sex was there a statistically significant correlation between bi-iliac breadth and femoral head measurements. Only in the female group was the correlation between femoral head measurements and reconstructed body height statistically significant. It is worth continuing

  12. Penalized maximum-likelihood sinogram restoration for dual focal spot computed tomography

    International Nuclear Information System (INIS)

    Forthmann, P; Koehler, T; Begemann, P G C; Defrise, M

    2007-01-01

    Due to various system non-idealities, the raw data generated by a computed tomography (CT) machine are not readily usable for reconstruction. Although the deterministic nature of corruption effects such as crosstalk and afterglow permits correction by deconvolution, there is a drawback because deconvolution usually amplifies noise. Methods that perform raw data correction combined with noise suppression are commonly termed sinogram restoration methods. The need for sinogram restoration arises, for example, when photon counts are low and non-statistical reconstruction algorithms such as filtered backprojection are used. Many modern CT machines offer a dual focal spot (DFS) mode, which serves the goal of increased radial sampling by alternating the focal spot between two positions on the anode plate during the scan. Although the focal spot mode does not play a role with respect to how the data are affected by the above-mentioned corruption effects, it needs to be taken into account if regularized sinogram restoration is to be applied to the data. This work points out the subtle difference in processing that sinogram restoration for DFS requires, how it is correctly employed within the penalized maximum-likelihood sinogram restoration algorithm and what impact it has on image quality

  13. A new iterative reconstruction technique for attenuation correction in high-resolution positron emission tomography

    International Nuclear Information System (INIS)

    Knesaurek, K.; Machac, J.; Vallabhajosula, S.; Buchsbaum, M.S.

    1996-01-01

    A new iterative reconstruction technique (NIRT) for positron emission computed tomography (PET), which uses transmission data for nonuniform attenuation correction, is described. Utilizing general inverse problem theory, a cost functional which includes a noise term was derived. The cost functional was minimized using a weighted-least-squares maximum a posteriori conjugate gradient (CG) method. The procedure involves a change in the Hessian of the cost function by adding an additional term. Two phantoms were used in a real data acquisition. The first was a cylinder phantom filled with a uniformly distributed activity of 74 MBq of fluorine-18; two different inserts were placed in the phantom. The second was a Hoffman brain phantom filled with a uniformly distributed activity of 7.4 MBq of 18F. The resulting reconstructed images were used to test and compare the new iterative reconstruction technique with a standard filtered backprojection (FBP) method. The results confirmed that NIRT, based on the conjugate gradient method, converges rapidly and provides good reconstructed images. In comparison with standard results obtained by the FBP method, the images reconstructed by NIRT showed better noise properties. The noise was measured as rms% noise and was less, by a factor of 1.75, in images reconstructed by NIRT than in the same images reconstructed by FBP. The distance between the Hoffman brain slice created from the MRI image and the slice reconstructed by FBP was 0.526, while the same distance for the slice reconstructed by NIRT was 0.328. The NIRT method suppressed the propagation of noise without visible loss of resolution in the reconstructed PET images. (orig.)

  14. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions

    Science.gov (United States)

    Novosad, Philip; Reader, Andrew J.

    2016-06-01

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [18F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral

  15. Phylogenetic analyses of RPB1 and RPB2 support a middle Cretaceous origin for a clade comprising all agriculturally and medically important fusaria

    DEFF Research Database (Denmark)

    O’Donnell, Kerry; Rooney, Alejandro P.; Proctor, Robert H.

    2013-01-01

    Fusarium (Hypocreales, Nectriaceae) is one of the most economically important and systematically challenging groups of mycotoxigenic phytopathogens and emergent human pathogens. We conducted maximum likelihood (ML), maximum parsimony (MP) and Bayesian (B) analyses on partial DNA-directed RNA polymerase (RPB1 and RPB2) sequences.

  16. Algebraic reconstruction techniques for spectral reconstruction in diffuse optical tomography

    International Nuclear Information System (INIS)

    Brendel, Bernhard; Ziegler, Ronny; Nielsen, Tim

    2008-01-01

    Reconstruction in diffuse optical tomography (DOT) necessitates solving the diffusion equation, which is nonlinear with respect to the parameters that have to be reconstructed. Currently applied solving methods are based on the linearization of the equation. For spectral three-dimensional reconstruction, the emerging equation system is too large for direct inversion, but the application of iterative methods is feasible. The computational effort and speed of convergence of these iterative methods are crucial since they determine the computation time of the reconstruction. In this paper, the iterative methods algebraic reconstruction technique (ART) and conjugate gradients (CG) as well as a new modified ART method are investigated for spectral DOT reconstruction. The aim of the modified ART scheme is to speed up convergence by considering the specific conditions of spectral reconstruction. As a result, it converges much faster to favorable results than the conventional ART and CG methods.
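
    A plain Kaczmarz-type ART sweep, the baseline the modified scheme accelerates, projects the iterate onto each row's hyperplane ⟨aᵢ, x⟩ = bᵢ in turn. A generic sketch of that baseline (not the paper's modified spectral variant):

```python
import numpy as np

def art(A, b, n_sweeps=10, relax=1.0):
    """Kaczmarz/ART: cyclically project the iterate onto the hyperplane
    of each linearized equation; relax in (0, 2) controls the step size."""
    x = np.zeros(A.shape[1])
    row_norms = np.einsum("ij,ij->i", A, A)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0.0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```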

  17. Using tree diversity to compare phylogenetic heuristics.

    Science.gov (United States)

    Sul, Seung-Jin; Matthews, Suzanne; Williams, Tiffani L

    2009-04-29

    Evolutionary trees are family trees that represent the relationships between a group of organisms. Phylogenetic heuristics are used to search stochastically for the best-scoring trees in tree space. Given that better tree scores are believed to be better approximations of the true phylogeny, traditional evaluation techniques have used tree scores to determine the heuristics that find the best scores in the fastest time. We develop new techniques to evaluate phylogenetic heuristics based on both tree scores and topologies to compare Pauprat and Rec-I-DCM3, two popular maximum parsimony search algorithms. Our results show that although Pauprat and Rec-I-DCM3 find trees with the same best scores, topologically these trees are quite different. Furthermore, the Rec-I-DCM3 trees cluster distinctly from the Pauprat trees. In addition to heatmap visualizations that use parsimony scores and the Robinson-Foulds distance to compare the best-scoring trees found by the two heuristics, we also develop entropy-based methods to show the diversity of the trees found. Overall, Pauprat identifies more diverse trees than Rec-I-DCM3. More broadly, our work shows that there is value in comparing heuristics beyond the parsimony scores that they find. Pauprat is a slower heuristic than Rec-I-DCM3. However, our work shows that there is tremendous value in using Pauprat to reconstruct trees, especially since it finds identical-scoring but topologically distinct trees. Hence, instead of discounting Pauprat, effort should go into improving its implementation. Ultimately, improved performance measures lead to better phylogenetic heuristics and will result in better approximations of the true evolutionary history of the organisms of interest.
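
    Robinson-Foulds distances between best-scoring trees can be computed, for example, with DendroPy; the tree file names below are hypothetical, and both trees must share one taxon namespace for the comparison to be meaningful.

```python
import dendropy
from dendropy.calculate import treecompare

tns = dendropy.TaxonNamespace()
t1 = dendropy.Tree.get(path="pauprat_best.tre", schema="newick",
                       taxon_namespace=tns)
t2 = dendropy.Tree.get(path="recidcm3_best.tre", schema="newick",
                       taxon_namespace=tns)
print(treecompare.symmetric_difference(t1, t2))  # unweighted RF distance
```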

  18. Computer-assisted lung nodule volumetry from multi-detector row CT: Influence of image reconstruction parameters

    International Nuclear Information System (INIS)

    Honda, Osamu; Sumikawa, Hiromitsu; Johkoh, Takeshi; Tomiyama, Noriyuki; Mihara, Naoki; Inoue, Atsuo; Tsubamoto, Mitsuko; Natsag, Javzandulam; Hamada, Seiki; Nakamura, Hironobu

    2007-01-01

    Purpose: To investigate differences in the volumetric measurement of pulmonary nodules caused by changing the reconstruction parameters for multi-detector row CT. Materials and methods: Thirty-nine pulmonary nodules less than 2 cm in diameter were examined by multi-slice CT. All nodules were solid and located in the peripheral parts of the lungs. For each nodule, 48 image sets were reconstructed by changing the slice thickness (1.25, 2.5, 3.75, or 5 mm), field of view (FOV: 10, 20, or 30 cm), algorithm (high-spatial-frequency or low-spatial-frequency algorithm) and reconstruction interval (reconstruction with 50% overlapping of the reconstructed slices or non-overlapping reconstruction). Volumetric measurements were calculated using commercially available software. The differences between nodule volumes were analyzed by the Kruskal-Wallis test and the Wilcoxon signed-ranks test. Results: The diameter of the nodules was 8.7 ± 2.7 mm on average, ranging from 4.3 to 16.4 mm. Pulmonary nodule volume did not change significantly with changes in slice thickness or FOV (p > 0.05), but was significantly larger with the high-spatial-frequency algorithm than with the low-spatial-frequency algorithm (p < 0.05), except for one reconstruction parameter. The volumes determined by non-overlapping reconstruction were significantly larger than those of overlapping reconstruction (p < 0.05), except for a 1.25 mm thickness with 10 cm FOV with the high-spatial-frequency algorithm, and a 5 mm thickness. The maximum difference in measured volume was 16% on average between the 1.25 mm slice thickness/10 cm FOV/high-spatial-frequency algorithm parameters and overlapping reconstruction. Conclusion: Volumetric measurements of pulmonary nodules differ with changes in the reconstruction parameters, with a tendency toward larger volumes with the high-spatial-frequency algorithm and non-overlapping reconstruction compared to the low-spatial-frequency algorithm and overlapping reconstruction.

  19. Maximum Entropy Approach in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    Science.gov (United States)

    Farsani, Zahra Amini; Schmid, Volker J

    2017-01-01

    In the estimation of physiological kinetic parameters from Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) data, the determination of the arterial input function (AIF) plays a key role. This paper proposes a Bayesian method to estimate the physiological parameters of DCE-MRI along with the AIF in situations where no measurement of the AIF is available. In the proposed algorithm, the maximum entropy method (MEM) is combined with the maximum a posteriori (MAP) approach. To this end, MEM is used to specify a prior probability distribution of the unknown AIF. The ability of this method to estimate the AIF is validated using the Kullback-Leibler divergence. Subsequently, the kinetic parameters can be estimated with MAP. The proposed algorithm is evaluated with a data set from a breast cancer MRI study. The application shows that the AIF can reliably be determined from the DCE-MRI data using MEM, and the kinetic parameters can be estimated subsequently. The maximum entropy method is a powerful tool for reconstructing images from many types of data and for generating probability distributions based on given information. The proposed method gives an alternative way to assess the input function from existing data; it allows a good fit of the data and therefore a better estimation of the kinetic parameters. In the end, this allows for a more reliable use of DCE-MRI. Schattauer GmbH.
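
    The MEM building block, choosing the distribution of maximum entropy subject to moment constraints, can be sketched with SciPy. Here a discrete pmf on a fixed support is constrained only to normalize and to match a prescribed mean; the paper applies the same principle to the unknown AIF with its own constraints.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_pmf(support, mean):
    """Maximum-entropy pmf on a fixed support with a prescribed mean
    (a generic MEM illustration, not the paper's AIF model)."""
    x = np.asarray(support, dtype=float)
    def neg_entropy(p):
        p = np.clip(p, 1e-12, None)
        return np.sum(p * np.log(p))
    constraints = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},
                   {"type": "eq", "fun": lambda p: p @ x - mean})
    res = minimize(neg_entropy, np.full(x.size, 1.0 / x.size),
                   bounds=[(0.0, 1.0)] * x.size, constraints=constraints)
    return res.x

# e.g., maxent_pmf(np.arange(10), mean=3.0) tilts mass towards small values
```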

  20. Enhanced reconstruction of weighted networks from strengths and degrees

    International Nuclear Information System (INIS)

    Mastrandrea, Rossana; Fagiolo, Giorgio; Squartini, Tiziano; Garlaschelli, Diego

    2014-01-01

    Network topology plays a key role in many phenomena, from the spreading of diseases to that of financial crises. Whenever the whole structure of a network is unknown, one must resort to reconstruction methods that identify the least biased ensemble of networks consistent with the partial information available. A challenging case, frequently encountered due to privacy issues in the analysis of interbank flows and Big Data, is when there is only local (node-specific) aggregate information available. For binary networks, the relevant ensemble is one where the degree (number of links) of each node is constrained to its observed value. However, for weighted networks the problem is much more complicated. While the naïve approach prescribes to constrain the strengths (total link weights) of all nodes, recent counter-intuitive results suggest that in weighted networks the degrees are often more informative than the strengths. This implies that the reconstruction of weighted networks would be significantly enhanced by the specification of both strengths and degrees, a computationally hard and bias-prone procedure. Here we solve this problem by introducing an analytical and unbiased maximum-entropy method that works in the shortest possible time and does not require the explicit generation of reconstructed samples. We consider several real-world examples and show that, while the strengths alone give poor results, the additional knowledge of the degrees yields accurately reconstructed networks. Information-theoretic criteria rigorously confirm that the degree sequence, as soon as it is non-trivial, is irreducible to the strength sequence. Our results have strong implications for the analysis of motifs and communities and whenever the reconstructed ensemble is required as a null model to detect higher-order patterns
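
    The following sketch illustrates the degree-constrained half of such a maximum-entropy ensemble: it solves for node multipliers x_i of the binary configuration model so that expected degrees match the observed ones, with link probabilities p_ij = x_i x_j / (1 + x_i x_j). The degree sequence is invented, and the paper's full method additionally constrains the strengths.

    ```python
    import numpy as np

    def fit_degree_multipliers(degrees, n_iter=5000, tol=1e-12):
        """Fixed point of <k_i> = sum_j x_i x_j / (1 + x_i x_j) = k_i."""
        k = np.asarray(degrees, dtype=float)
        x = k / np.sqrt(k.sum() + 1.0)                  # common starting guess
        for _ in range(n_iter):
            frac = x[None, :] / (1.0 + np.outer(x, x))  # x_j / (1 + x_i x_j)
            s = frac.sum(axis=1) - np.diagonal(frac)    # exclude self-loops
            x_new = k / np.maximum(s, 1e-15)
            if np.max(np.abs(x_new - x)) < tol:
                return x_new
            x = x_new
        return x

    x = fit_degree_multipliers([2, 2, 1, 1])            # invented degree sequence
    p = np.outer(x, x) / (1.0 + np.outer(x, x))         # ensemble link probabilities
    np.fill_diagonal(p, 0.0)
    print(p.sum(axis=1))                                # ~ [2, 2, 1, 1]
    ```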

  1. Sequence Variation in Toxoplasma gondii rop17 Gene among Strains from Different Hosts and Geographical Locations

    Directory of Open Access Journals (Sweden)

    Nian-Zhang Zhang

    2014-01-01

    Full Text Available Genetic diversity of T. gondii has been the focus of many studies, owing to the biological and epidemiological diversity of this parasite. The present study examined sequence variation in the rhoptry protein 17 (ROP17) gene among T. gondii isolates from different hosts and geographical regions. The rop17 gene was amplified and sequenced from 10 T. gondii strains, and the phylogenetic relationships among these strains were reconstructed using maximum parsimony (MP), neighbor-joining (NJ), and maximum likelihood (ML) analyses. The partial rop17 gene sequences were 1375 bp in length and A+T contents varied from 49.45% to 50.11% among all examined T. gondii strains. Sequence analysis identified 33 variable nucleotide positions (2.1%), 16 of which were identified as transitions. Phylogeny reconstruction based on the rop17 gene data revealed two major clusters which could readily distinguish Type I and Type II strains. Analyses of sequence variations in nucleotides and amino acids among these strains revealed a high ratio of nonsynonymous to synonymous polymorphisms (>1), indicating that rop17 shows signs of positive selection. This study demonstrated the existence of relatively high sequence variability in the rop17 gene sequences among T. gondii strains from different hosts and geographical regions, suggesting that the rop17 gene may represent a new genetic marker for population genetic studies of T. gondii isolates.
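
    For readers unfamiliar with the parsimony criterion used here, the sketch below implements Fitch's algorithm for a single aligned site on a fixed rooted binary tree; the tip labels and states are invented and are not the rop17 data.

    ```python
    # Fitch parsimony for one site on a fixed rooted binary tree (toy data).

    def fitch_site(tree, states):
        """tree: nested 2-tuples with tip-name strings; states: tip -> base."""
        changes = 0
        def post(node):
            nonlocal changes
            if isinstance(node, str):            # tip: its observed state
                return {states[node]}
            left, right = (post(child) for child in node)
            if left & right:                     # intersection: no change
                return left & right
            changes += 1                         # union: one substitution
            return left | right
        post(tree)
        return changes

    tree = ((("strain_A", "strain_B"), "strain_C"), "strain_D")  # invented tree
    print(fitch_site(tree, {"strain_A": "A", "strain_B": "A",
                            "strain_C": "G", "strain_D": "G"}))  # -> 1
    ```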

  2. Revealing pancrustacean relationships: Phylogenetic analysis of ribosomal protein genes places Collembola (springtails) in a monophyletic Hexapoda and reinforces the discrepancy between mitochondrial and nuclear DNA markers

    Directory of Open Access Journals (Sweden)

    Mariën J

    2008-03-01

    Full Text Available Abstract Background In recent years, several new hypotheses on phylogenetic relations among arthropods have been proposed on the basis of DNA sequences. One of the challenged hypotheses is the monophyly of hexapods. This discussion originated from analyses based on mitochondrial DNA datasets that, due to an unusual positioning of Collembola, suggested that the hexapod body plan evolved at least twice. Here, we re-evaluate the position of Collembola using ribosomal protein gene sequences. Results In total 48 ribosomal proteins were obtained for the collembolan Folsomia candida. These 48 sequences were aligned with sequence data on 35 other ecdysozoans. Each ribosomal protein gene was available for 25% to 86% of the taxa. However, the total sequence information was unequally distributed over the taxa and ranged between 4% and 100%. A concatenated dataset was constructed (5034 inferred amino acids in length), of which ~66% of the positions were filled. Phylogenetic tree reconstructions, using Maximum Likelihood, Maximum Parsimony, and Bayesian methods, resulted in a topology that supports monophyly of Hexapoda. Conclusion Although ribosomal proteins in general may not evolve independently, they once more appear highly valuable for phylogenetic reconstruction. Our analyses clearly suggest that Hexapoda is monophyletic. This underpins the inconsistency between nuclear and mitochondrial datasets when analyzing pancrustacean relationships. Caution is needed when applying mitochondrial markers in deep phylogeny.

  3. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms

    International Nuclear Information System (INIS)

    Kadrmas, Dan J

    2004-01-01

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which have been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining pre-processing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties
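
    A minimal sketch of the core ML-EM update that LOR-OSEM builds on is given below, using an invented dense system matrix; the actual algorithm additionally works on the native unevenly spaced LOR grid and cycles through ordered subsets.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.random((200, 64))                 # toy system matrix: 200 LORs x 64 voxels
    y = rng.poisson(A @ rng.random(64) * 50)  # Poisson-distributed projection data

    x = np.ones(64)                           # nonnegative initial image
    sens = A.sum(axis=0)                      # sensitivity image, A^T 1
    for _ in range(50):
        ratio = y / np.maximum(A @ x, 1e-12)  # measured over forward-projected
        x *= (A.T @ ratio) / sens             # multiplicative ML-EM update
    ```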

  4. Pollen-based continental climate reconstructions at 6 and 21 ka: a global synthesis

    Energy Technology Data Exchange (ETDEWEB)

    Bartlein, P.J. [University of Oregon, Department of Geography, Eugene, Oregon (United States); Harrison, S.P. [University of Bristol, School of Geographical Sciences, Bristol (United Kingdom); Macquarie University, School of Biological Sciences, North Ryde, NSW (Australia); Brewer, S. [University of Wyoming, Botany Department, Wyoming (United States); Connor, S. [University of the Algarve, Centre for Marine and Environmental Research, Faro (Portugal); Davis, B.A.S. [Ecole Polytechnique Federale de Lausanne, School of Architecture, Civil and Environmental Engineering, Lausanne (Switzerland); Gajewski, K.; Viau, A.E. [University of Ottawa, Department of Geography, Ottawa, ON (Canada); Guiot, J. [CEREGE, Aix-en-Provence cedex 4 (France); Harrison-Prentice, T.I. [GTZ, PAKLIM, Jakarta (Indonesia); Henderson, A. [University of Minnesota, Department of Geology and Geophysics, Minneapolis, MN (United States); Peyron, O. [Laboratoire Chrono-Environnement UMR 6249 CNRS-UFC UFR Sciences et Techniques, Besancon Cedex (France); Prentice, I.C. [Macquarie University, School of Biological Sciences, North Ryde, NSW (Australia); University of Bristol, QUEST, Department of Earth Sciences, Bristol (United Kingdom); Scholze, M. [University of Bristol, QUEST, Department of Earth Sciences, Bristol (United Kingdom); Seppae, H. [University of Helsinki, Department of Geology, P.O. Box 65, Helsinki (Finland); Shuman, B. [University of Wyoming, Department of Geology and Geophysics, Laramie, WY (United States); Sugita, S. [Tallinn University, Institute of Ecology, Tallinn (Estonia); Thompson, R.S. [US Geological Survey, PO Box 25046, Denver, CO (United States); Williams, J. [University of Wisconsin, Department of Geography, Madison, WI (United States); Wu, H. [Chinese Academy of Sciences, Key Laboratory of Cenozoic Geology and Environment, Institute of Geology and Geophysics, Beijing (China)

    2011-08-15

    Subfossil pollen and plant macrofossil data derived from ¹⁴C-dated sediment profiles can provide quantitative information on glacial and interglacial climates. The data allow climate variables related to growing-season warmth, winter cold, and plant-available moisture to be reconstructed. Continental-scale reconstructions have been made for the mid-Holocene (MH, around 6 ka) and Last Glacial Maximum (LGM, around 21 ka), allowing comparison with palaeoclimate simulations currently being carried out as part of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. The synthesis of the available MH and LGM climate reconstructions and their uncertainties, obtained using modern-analogue, regression and model-inversion techniques, is presented for four temperature variables and two moisture variables. Reconstructions of the same variables based on surface-pollen assemblages are shown to be accurate and unbiased. Reconstructed LGM and MH climate anomaly patterns are coherent, consistent between variables, and robust with respect to the choice of technique. They support a conceptual model of the controls of Late Quaternary climate change whereby the first-order effects of orbital variations and greenhouse forcing on the seasonal cycle of temperature are predictably modified by responses of the atmospheric circulation and surface energy balance. (orig.)
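
    One of the techniques named above, the modern-analogue method, is simple to sketch: a fossil pollen assemblage is assigned the mean climate of its k closest modern assemblages under the squared-chord distance. The data below are synthetic and k = 5 is an arbitrary choice.

    ```python
    import numpy as np

    def squared_chord(a, b):
        return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

    def mat_reconstruct(fossil, modern_spectra, modern_climate, k=5):
        """Mean climate of the k closest modern analogues of a fossil sample."""
        d = np.array([squared_chord(fossil, m) for m in modern_spectra])
        return modern_climate[np.argsort(d)[:k]].mean(axis=0)

    rng = np.random.default_rng(1)
    modern = rng.dirichlet(np.ones(10), size=200)   # 200 synthetic pollen spectra
    climate = rng.normal(size=(200, 3))             # paired climate variables
    print(mat_reconstruct(modern[0], modern, climate))
    ```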

  5. Dynamic knee stability and ballistic knee movement after ACL reconstruction: an application on instep soccer kick.

    Science.gov (United States)

    Cordeiro, Nuno; Cortes, Nelson; Fernandes, Orlando; Diniz, Ana; Pezarat-Correia, Pedro

    2015-04-01

    The instep soccer kick is a pre-programmed ballistic movement with a typical agonist-antagonist coordination pattern. The coordination pattern of the kick can provide insight into deficient neuromuscular control. The purpose of this study was to investigate knee kinematics and the hamstrings/quadriceps coordination pattern during the ballistic knee extension phase of the instep kick in soccer players after anterior cruciate ligament reconstruction (ACL reconstruction). Seventeen players from the Portuguese Soccer League participated in this study. Eight ACL-reconstructed athletes (experimental group) and nine healthy individuals (control group) performed three instep kicks. Knee kinematics (flexion and extension angles at football contact and at the instant of maximum velocity) were calculated during the kicks. Rectus femoris (RF), vastus lateralis, vastus medialis, biceps femoris, and semitendinosus muscle activations were quantified during the knee extension phase. The ACL-reconstructed group had a significantly lower knee extension angle (-1.2 ± 1.6; p < 0.05), indicating a difference in the ballistic control movement pattern between normal and ACL-reconstructed subjects. Performing open kinetic chain exercises using ballistic movements can be beneficial when recovering from ACL reconstruction. The exercises should focus on achieving multi-joint coordination and full knee extension (range of motion). III.

  6. Comparison of the accuracy of 3-dimensional cone-beam computed tomography and micro-computed tomography reconstructions by using different voxel sizes.

    Science.gov (United States)

    Maret, Delphine; Peters, Ove A; Galibourg, Antoine; Dumoncel, Jean; Esclassan, Rémi; Kahn, Jean-Luc; Sixou, Michel; Telmon, Norbert

    2014-09-01

    Cone-beam computed tomography (CBCT) data are, in principle, metrically exact. However, clinicians need to consider the precision of measurements of dental morphology as well as other hard tissue structures. CBCT spatial resolution, and thus image reconstruction quality, is restricted by the acquisition voxel size. The aim of this study was to assess geometric discrepancies among 3-dimensional CBCT reconstructions relative to the micro-CT reference. A total of 37 permanent teeth from 9 mandibles were scanned with CBCT 9500 and 9000 3D and micro-CT. After semiautomatic segmentation, reconstructions were obtained from CBCT acquisitions (voxel sizes 76, 200, and 300 μm) and from micro-CT (voxel size 41 μm). All reconstructions were positioned in the same plane by image registration. The topography of the geometric discrepancies was displayed by using a color map allowing the maximum differences to be located. The maximum differences were mainly found at the cervical margins and on the cusp tips or incisal edges. Geometric reconstruction discrepancies were significant at 300-μm resolution (P = .01, Wilcoxon test). To study hard tissue morphology, CBCT acquisitions require voxel sizes smaller than 300 μm. This experimental study will have to be complemented by studies in vivo that consider the conditions of clinical practice. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  7. Accelerated median root prior reconstruction for pinhole single-photon emission tomography (SPET)

    Energy Technology Data Exchange (ETDEWEB)

    Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, PO Box 553 FIN-33101, Tampere (Finland); Watabe, Hiroshi [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Iida, Hidehiro [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Kuikka, Jyrki T [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland)

    2003-07-07

    Pinhole collimation can be used to improve spatial resolution in SPET. However, the resolution improvement is achieved at the cost of reduced sensitivity, which leads to projection images with poor statistics. Images reconstructed from these projections using the maximum likelihood expectation maximization (ML-EM) algorithms, which have been used to reduce the artefacts generated by the filtered backprojection (FBP) based reconstruction, suffer from noise/bias trade-off: noise contaminates the images at high iteration numbers, whereas early abortion of the algorithm produces images that are excessively smooth and biased towards the initial estimate of the algorithm. To limit the noise accumulation we propose the use of the pinhole median root prior (PH-MRP) reconstruction algorithm. MRP is a Bayesian reconstruction method that has already been used in PET imaging and shown to possess good noise reduction and edge preservation properties. In this study the PH-MRP algorithm was accelerated with the ordered subsets (OS) procedure and compared to the FBP, OS-EM and conventional Bayesian reconstruction methods in terms of noise reduction, quantitative accuracy, edge preservation and visual quality. The results showed that the accelerated PH-MRP algorithm was very robust. It provided visually pleasing images with lower noise level than the FBP or OS-EM and with smaller bias and sharper edges than the conventional Bayesian methods.
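
    The median root prior correction itself is compact; a sketch of the usual one-step-late MRP form is shown below, with an illustrative beta and 3x3 median window that are not necessarily the paper's settings.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    def mrp_step(em_image, beta=0.3):
        """One-step-late MRP correction applied after an (OS-)EM update."""
        med = np.maximum(median_filter(em_image, size=3), 1e-12)
        return em_image / (1.0 + beta * (em_image - med) / med)

    # usage inside the iteration loop:  x = mrp_step(em_update(x, projections))
    ```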

  8. The costs of breed reconstruction from cryopreserved material in mammalian livestock species

    Directory of Open Access Journals (Sweden)

    Gandini Gustavo

    2007-07-01

    Full Text Available Abstract The aim of this work was to compare costs, in the horse, cattle, sheep, swine, and rabbit species, for the creation of gene banks for reconstruction of an extinct breed, using different strategies: embryos-only, embryos in combination with semen, and semen-only. Three cost measures were used: the time required for population reconstruction, the cost of creating the gene bank, and the number of years-keeping-female needed to reach reconstruction. Semen costs were estimated across four scenarios: the presence or absence of a commercial market for semen, purchase of semen donors, and semen extracted from the epididymis. The number of cells was doubled to take into account the creation of two storage sites. The embryos-only strategy required the shortest time to reach reconstruction. With the embryos + semen strategy, time increased with decreasing proportions of embryos. With semen-only, reconstruction time varied from 2 to 21 years. A high variation of costs was observed across species and strategies, from 360 Euros in the rabbit to 1 092 300 Euros in the horse. In all species, the embryos-only strategy was about 10% more expensive than using 90% embryos + semen. Decreasing the percentage of embryos further diminished costs. The number of years-keeping-female ranged across strategies from 2 in the rabbit to a maximum of 12 878 in the horse.

  9. Applying a multiobjective metaheuristic inspired by honey bees to phylogenetic inference.

    Science.gov (United States)

    Santander-Jiménez, Sergio; Vega-Rodríguez, Miguel A

    2013-10-01

    The development of increasingly popular multiobjective metaheuristics has allowed bioinformaticians to deal with optimization problems in computational biology where multiple objective functions must be taken into account. One of the most relevant research topics that can benefit from these techniques is phylogenetic inference. Throughout the years, different researchers have proposed their own view about the reconstruction of ancestral evolutionary relationships among species. As a result, biologists often report different phylogenetic trees from the same dataset when considering distinct optimality principles. In this work, we detail a multiobjective swarm intelligence approach based on the novel Artificial Bee Colony algorithm for inferring phylogenies. The aim of this paper is to propose a complementary view of phylogenetics according to the maximum parsimony and maximum likelihood criteria, in order to generate a set of phylogenetic trees that represent a compromise between these principles. Experimental results on a variety of nucleotide data sets and statistical studies highlight the relevance of the proposal with regard to other multiobjective algorithms and state-of-the-art biological methods. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Fast image reconstruction for Compton camera using stochastic origin ensemble approach.

    Science.gov (United States)

    Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna

    2011-01-01

    Compton camera has been proposed as a potential imaging tool in astronomy, industry, homeland security, and medical diagnostics. Due to the inherent geometrical complexity of Compton camera data, image reconstruction of distributed sources can be ineffective and/or time-consuming when using standard techniques such as filtered backprojection or maximum likelihood-expectation maximization (ML-EM). In this article, the authors demonstrate a fast reconstruction of Compton camera data using a novel stochastic origin ensembles (SOE) approach based on Markov chains. During image reconstruction, the origins of the measured events are randomly assigned to locations on conical surfaces, which are the Compton camera analogs of lines-of-responses in PET. Therefore, the image is defined as an ensemble of origin locations of all possible event origins. During the course of reconstruction, the origins of events are stochastically moved and the acceptance of the new event origin is determined by the predefined acceptance probability, which is proportional to the change in event density. For example, if the event density at the new location is higher than in the previous location, the new position is always accepted. After several iterations, the reconstructed distribution of origins converges to a quasistationary state which can be voxelized and displayed. Comparison with the list-mode ML-EM reveals that the postfiltered SOE algorithm has similar performance in terms of image quality while clearly outperforming ML-EM in relation to reconstruction time. In this study, the authors have implemented and tested a new image reconstruction algorithm for the Compton camera based on the stochastic origin ensembles with Markov chains. The algorithm uses list-mode data, is parallelizable, and can be used for any Compton camera geometry. SOE algorithm clearly outperforms list-mode ML-EM for simple Compton camera geometry in terms of reconstruction time. The difference in computational time
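
    A one-dimensional toy version of the SOE move/accept rule described above might look as follows; in a real Compton camera the proposed new origin would be restricted to the event's conical surface rather than drawn from anywhere.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    nbins, nevents = 64, 5000
    origins = rng.integers(0, nbins, size=nevents)   # random initial event origins
    density = np.bincount(origins, minlength=nbins)  # current origin density

    for _ in range(20 * nevents):                    # many single-event moves
        e = rng.integers(nevents)
        old = origins[e]
        new = rng.integers(0, nbins)                 # toy: any bin is admissible
        # accept with probability given by the change in event density
        if rng.random() < (density[new] + 1) / density[old]:
            density[old] -= 1
            density[new] += 1
            origins[e] = new
    ```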

  11. Parsimonious data: How a single Facebook like predicts voting behavior in multiparty systems.

    Directory of Open Access Journals (Sweden)

    Jakob Bæk Kristensen

    Full Text Available This study shows how liking politicians' public Facebook posts can be used as an accurate measure for predicting present-day voter intention in a multiparty system. We highlight that a few, but selective, digital traces produce prediction accuracies that are on par with or even greater than most current approaches based upon bigger and broader datasets. Combining the online and offline, we connect a subsample of surveyed respondents to their public Facebook activity and apply machine learning classifiers to explore the link between their political liking behaviour and actual voting intention. Through this work, we show that even a single selective Facebook like can reveal as much about political voter intention as hundreds of heterogeneous likes. Further, by including the entire political like history of the respondents, our model reaches prediction accuracies above previous multiparty studies (60-70%). The main contribution of this paper is to show how public like-activity on Facebook allows political profiling of individual users in a multiparty system with accuracies above previous studies. Besides increased accuracies, the paper shows how such parsimonious measures allow us to generalize our findings to the entire population of a country and even across national borders, to other political multiparty systems. The approach in this study relies on data that are publicly available, and the simple setup we propose can, with some limitations, be generalized to millions of users in other multiparty systems.
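
    A schematic of the kind of classification setup described (binary like/no-like features per politician, a party label per respondent) is sketched below with synthetic data; the paper's actual feature construction and classifiers may differ.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    X = rng.integers(0, 2, size=(500, 40))   # like/no-like of 40 politicians' posts
    y = rng.integers(0, 5, size=500)         # intended vote among 5 parties

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())  # ~0.2 here: labels are random
    ```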

  12. L.U.St: a tool for approximated maximum likelihood supertree reconstruction.

    Science.gov (United States)

    Akanni, Wasiu A; Creevey, Christopher J; Wilkinson, Mark; Pisani, Davide

    2014-06-12

    Supertrees combine disparate, partially overlapping trees to generate a synthesis that provides a high-level perspective that cannot be attained from the inspection of individual phylogenies. Supertrees can be seen as meta-analytical tools that can be used to make inferences based on results of previous scientific studies. Their meta-analytical application has increased in popularity since it was realised that the power of statistical tests for the study of evolutionary trends critically depends on the use of taxon-dense phylogenies. Further to that, supertrees have found applications in phylogenomics where they are used to combine gene trees and recover species phylogenies based on genome-scale data sets. Here, we present the L.U.St package, a python tool for approximate maximum likelihood supertree inference, and illustrate its application using a genomic data set for the placental mammals. L.U.St allows the calculation of the approximate likelihood of a supertree, given a set of input trees, performs heuristic searches to look for the supertree of highest likelihood, and performs statistical tests of two or more supertrees. To this end, L.U.St implements a winning-sites test allowing ranking of a collection of a priori selected hypotheses, given as a collection of input supertree topologies. It also outputs a file of input-tree-wise likelihood scores that can be used as input to CONSEL for calculation of standard tests of two trees (e.g. Kishino-Hasegawa, Shimodaira-Hasegawa and Approximately Unbiased tests). This is the first fully parametric implementation of a supertree method; it has clearly understood properties, and provides several advantages over currently available supertree approaches. It is easy to implement and works on any platform that has python installed. bitBucket page - https://afro-juju@bitbucket.org/afro-juju/l.u.st.git. Davide.Pisani@bristol.ac.uk.

  13. The maximum entropy method for the reconstruction of the pin-by-pin neutron flux distribution in a fuel element; O metodo da maxima entropia para a reconstrucao da distribuicao pino a pino do fluxo de neutrons em um elemento combustivel

    Energy Technology Data Exchange (ETDEWEB)

    Ancalla, Lourdes Pilar Zaragoza

    2005-04-15

    The reconstruction of the pin-by-pin power density distribution in a heterogeneous fuel element of a nuclear reactor core is a subject that has long been studied in the reactor physics area. Several methods exist for this reconstruction; one of them is the maximum entropy method, an optimization method that finds the best among all possible solutions and uses Lagrange multipliers to obtain the distribution of the fluxes on the faces of the fuel element. This face-flux distribution is then used as a boundary condition in the calculation of the detailed flux distribution inside the fuel element. In this work, the heterogeneous element was first homogenized. The multiplication factor and the node-averaged values of the flux and of the net current were then computed with the NEM2D program. These nodal average values were then used in the pin-by-pin reconstruction of the flux distribution inside the fuel element. The results obtained were acceptable when compared with those obtained using a fine mesh. (author)

  14. Diffusion archeology for diffusion progression history reconstruction.

    Science.gov (United States)

    Sefer, Emre; Kingsford, Carl

    2016-11-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.

  15. New method for reconstruction of star spatial distribution in globular clusters and its application to flare stars in Pleiades

    International Nuclear Information System (INIS)

    Kosarev, E.L.

    1980-01-01

    A new method to reconstruct the spatial star distribution in globular clusters is presented. The method gives both an estimate of the unknown spatial distribution and the probable reconstruction error. This error has a statistical origin and depends only on the number of stars in a cluster. The method is applied to reconstruct the spatial density of 441 flare stars in the Pleiades. The spatial density has a maximum in the center of the cluster of about 1.6-2.5 pc⁻³ and, with increasing distance from the center, falls smoothly to zero, approximately following a Gaussian law with a scale parameter of 3.5 pc

  16. Storm Surge Reconstruction and Return Water Level Estimation in Southeast Asia for the 20th Century

    NARCIS (Netherlands)

    Cid, Alba; Wahl, Thomas; Chambers, Don P.; Muis, Sanne

    2018-01-01

    We present a methodology to reconstruct the daily maximum storm surge levels, obtained from tide gauges, based on the surrounding atmospheric conditions from an atmospheric reanalysis (20th Century Reanalysis-20CR). Tide gauge records in Southeast Asia are relatively short, so this area is often

  17. Implicit vessel surface reconstruction for visualization and CFD simulation

    International Nuclear Information System (INIS)

    Schumann, Christian; Peitgen, Heinz-Otto; Neugebauer, Mathias; Bade, Ragnar; Preim, Bernhard

    2008-01-01

    Accurate and high-quality reconstructions of vascular structures are essential for vascular disease diagnosis and blood flow simulations. These applications necessitate a trade-off between accuracy and smoothness. An additional requirement for the volume grid generation for Computational Fluid Dynamics (CFD) simulations is a high triangle quality. We propose a method that produces an accurate reconstruction of the vessel surface with satisfactory surface quality. A point cloud representing the vascular boundary is generated based on a segmentation result. Thin vessels are subsampled to enable an accurate reconstruction. A signed distance field is generated using Multi-level Partition of Unity Implicits and subsequently polygonized using a surface tracking approach. To guarantee a high triangle quality, the surface is remeshed. Compared to other methods, our approach represents a good trade-off between accuracy and smoothness. For the tested data, the average surface deviation to the segmentation results is 0.19 voxel diagonals and the maximum equi-angle skewness values are below 0.75. The generated surfaces are considerably more accurate than those obtained using model-based approaches. Compared to other model-free approaches, the proposed method produces smoother results and thus better supports the perception and interpretation of the vascular topology. Moreover, the triangle quality of the generated surfaces is suitable for CFD simulations. (orig.)

  18. Fast gradient-based methods for Bayesian reconstruction of transmission and emission PET images

    International Nuclear Information System (INIS)

    Mumcuglu, E.U.; Leahy, R.; Zhou, Z.; Cherry, S.R.

    1994-01-01

    The authors describe conjugate gradient algorithms for reconstruction of transmission and emission PET images. The reconstructions are based on a Bayesian formulation, where the data are modeled as a collection of independent Poisson random variables and the image is modeled using a Markov random field. A conjugate gradient algorithm is used to compute a maximum a posteriori (MAP) estimate of the image by maximizing over the posterior density. To ensure nonnegativity of the solution, a penalty function is used to convert the problem to one of unconstrained optimization. Preconditioners are used to enhance convergence rates. These methods generally achieve effective convergence in 15-25 iterations. Reconstructions are presented of an 18F-FDG whole-body scan from data collected using a Siemens/CTI ECAT931 whole body system. These results indicate significant improvements in emission image quality using the Bayesian approach, in comparison to filtered backprojection, particularly when reprojections of the MAP transmission image are used in place of the standard attenuation correction factors
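
    As a sketch of such a MAP iteration, the toy example below performs an EM-preconditioned gradient step on the Poisson log-posterior with a simple quadratic smoothness prior; the paper's method uses preconditioned conjugate gradients and a nonnegativity penalty, so this only conveys the flavor, on invented 1D data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    A = rng.random((300, 100))            # toy system matrix
    y = rng.poisson(A @ rng.random(100))  # Poisson data
    beta = 0.05                           # illustrative prior weight
    sens = A.sum(axis=0)                  # A^T 1

    x = np.ones(100)
    for _ in range(200):
        grad_ll = A.T @ (y / np.maximum(A @ x, 1e-12)) - sens       # Poisson log-lik
        grad_pr = -beta * (2 * x - np.roll(x, 1) - np.roll(x, -1))  # smoothness prior
        x = np.maximum(x + (x / sens) * (grad_ll + grad_pr), 0.0)   # EM-type step
    ```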

  19. New algorithms and methods to estimate maximum-likelihood phylogenies: assessing the performance of PhyML 3.0.

    Science.gov (United States)

    Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier

    2010-05-01

    PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.

  20. Anatomy assisted PET image reconstruction incorporating multi-resolution joint entropy

    International Nuclear Information System (INIS)

    Tang, Jing; Rahmim, Arman

    2015-01-01

    A promising approach in PET image reconstruction is to incorporate high resolution anatomical information (measured from MR or CT) taking the anato-functional similarity measures such as mutual information or joint entropy (JE) as the prior. These similarity measures only classify voxels based on intensity values, while neglecting structural spatial information. In this work, we developed an anatomy-assisted maximum a posteriori (MAP) reconstruction algorithm wherein the JE measure is supplied by spatial information generated using wavelet multi-resolution analysis. The proposed wavelet-based JE (WJE) MAP algorithm involves calculation of derivatives of the subband JE measures with respect to individual PET image voxel intensities, which we have shown can be computed very similarly to how the inverse wavelet transform is implemented. We performed a simulation study with the BrainWeb phantom creating PET data corresponding to different noise levels. Realistically simulated T1-weighted MR images provided by BrainWeb modeling were applied in the anatomy-assisted reconstruction with the WJE-MAP algorithm and the intensity-only JE-MAP algorithm. Quantitative analysis showed that the WJE-MAP algorithm performed similarly to the JE-MAP algorithm at low noise level in the gray matter (GM) and white matter (WM) regions in terms of noise versus bias tradeoff. When noise increased to medium level in the simulated data, the WJE-MAP algorithm started to surpass the JE-MAP algorithm in the GM region, which is less uniform with smaller isolated structures compared to the WM region. In the high noise level simulation, the WJE-MAP algorithm presented clear improvement over the JE-MAP algorithm in both the GM and WM regions. In addition to the simulation study, we applied the reconstruction algorithms to real patient studies involving DPA-173 PET data and Florbetapir PET data with corresponding T1-MPRAGE MRI images. Compared to the intensity-only JE-MAP algorithm, the WJE

  2. Influence of rebinning on the reconstructed resolution of fan-beam SPECT

    International Nuclear Information System (INIS)

    Koole, M.; D'Asseler, Y.; Staelens, S.; Vandenberghe, S.; Eede, I. van den; Walle, R. van de; Lemahieu, I.

    2002-01-01

    Aim: Fan-beam projection data can be rebinned to a parallel-beam geometry. This rebinning operation allows these data to be reconstructed with algorithms for parallel-beam projection data. The advantage of such an operation is that a dedicated projection/backprojection step for fan-beam geometry doesn't need to be developed. In clinical practice bilinear interpolation is often used for this rebinning operation. The aim of this study is to investigate the influence of the rebinning operation on the resolution properties of the reconstructed SPECT-image. Materials and methods: We have simulated the resolution properties of a fan-beam collimator, used in clinical routine, by means of a dedicated projector operation which models the distance dependent sensitivity and resolution of the collimator. With this projector, we generated noise-free sinograms for a point source located at various distances from the center of rotation. The number of angles of these sinograms varied from 60 to 180, corresponding to a step angle of 6 to 2 degrees. These generated fan-beam projection data were reconstructed directly with a filtered backprojection algorithm for fan-beam projection data, which consists of weighting and filtering the projection data with a ramp filter and of a weighted backprojection. Next, the generated fan-beam projection data were rebinned by means of bilinear interpolation and reconstructed with standard filtered backprojection for parallel-beam data. A two-dimensional Gaussian was fitted to the two point sources, one reconstructed with FBP for fan-beam and one reconstructed with FBP for parallel-beam after rebinning, yielding an estimate for the reconstructed Full Width at Half Maximum (FWHM) in the radial and tangential direction, for different locations in the field of view. Results: Results show little difference in resolution degradation in the radial direction between direct reconstruction and reconstruction after rebinning. However, significant loss in
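
    The rebinning operation itself follows the standard fan/parallel coordinate relations theta = beta + gamma and s = D sin(gamma); a sketch with bilinear interpolation on invented grids (assumed monotonically increasing) is given below.

    ```python
    import numpy as np

    def rebin_fan_to_parallel(fan, betas, gammas, thetas, svals, D):
        """fan[i_beta, i_gamma] -> parallel sinogram out[i_theta, i_s]."""
        out = np.zeros((len(thetas), len(svals)))
        for i, th in enumerate(thetas):
            for j, s in enumerate(svals):
                gamma = np.arcsin(np.clip(s / D, -1.0, 1.0))
                beta = th - gamma
                # fractional indices into the fan grid (grids assumed increasing)
                fb = np.interp(beta, betas, np.arange(len(betas)))
                fg = np.interp(gamma, gammas, np.arange(len(gammas)))
                b0, g0 = int(fb), int(fg)
                b1 = min(b0 + 1, len(betas) - 1)
                g1 = min(g0 + 1, len(gammas) - 1)
                wb, wg = fb - b0, fg - g0
                out[i, j] = ((1 - wb) * (1 - wg) * fan[b0, g0]
                             + wb * (1 - wg) * fan[b1, g0]
                             + (1 - wb) * wg * fan[b0, g1]
                             + wb * wg * fan[b1, g1])
        return out
    ```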

  3. Breast reconstruction after mastectomy

    Directory of Open Access Journals (Sweden)

    Daniel Schmauss

    2016-01-01

    Full Text Available Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades. However, the overall number of breast reconstructions has significantly increased lately. Nowadays breast reconstruction should be individualized as far as possible, first of all taking into consideration oncological aspects of the tumor, neo-/adjuvant treatment and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient's condition and wish. This article gives an overview of the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction.

  4. MR guided applicator reconstruction for brachytherapy of cervical cancer using the novel titanium Rotterdam applicator

    International Nuclear Information System (INIS)

    Petit, Steven; Wielopolski, Piotr; Rijnsdorp, Reneé; Mens, Jan-Willem; Kolkman-Deurloo, Inger-Karine

    2013-01-01

    A novel model of the titanium Rotterdam tandem and ovoid applicator is presented. As titanium produces artefacts in MR images, an MR sequence was sought and optimised for visualisation and accurate applicator reconstruction. The mean inter-observer (8 observers) variability for four patients was only 0.7 mm (maximum 1.7 mm)

  5. Breast reconstruction - implants

    Science.gov (United States)

    Breast implants surgery; Mastectomy - breast reconstruction with implants; Breast cancer - breast reconstruction with implants ... harder to find a tumor if your breast cancer comes back. Getting breast implants does not take as long as breast reconstruction ...

  6. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids.

    Science.gov (United States)

    Jansen, Robert K; Kaittanis, Charalambos; Saski, Christopher; Lee, Seung-Bum; Tomkins, Jeffrey; Alverson, Andrew J; Daniell, Henry

    2006-04-09

    The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae or to rest of rosids, though support for these different results has been weak. There has been a recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis is identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differs between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade. However, maximum likelihood analyses place

  7. Phylogenetic analyses of Vitis (Vitaceae) based on complete chloroplast genome sequences: effects of taxon sampling and phylogenetic methods on resolving relationships among rosids

    Directory of Open Access Journals (Sweden)

    Alverson Andrew J

    2006-04-01

    Full Text Available Abstract Background The Vitaceae (grape) is an economically important family of angiosperms whose phylogenetic placement is currently unresolved. Recent phylogenetic analyses based on one to several genes have suggested several alternative placements of this family, including sister to Caryophyllales, asterids, Saxifragales, Dilleniaceae or to rest of rosids, though support for these different results has been weak. There has been a recent interest in using complete chloroplast genome sequences for resolving phylogenetic relationships among angiosperms. These studies have clarified relationships among several major lineages but they have also emphasized the importance of taxon sampling and the effects of different phylogenetic methods for obtaining accurate phylogenies. We sequenced the complete chloroplast genome of Vitis vinifera and used these data to assess relationships among 27 angiosperms, including nine taxa of rosids. Results The Vitis vinifera chloroplast genome is 160,928 bp in length, including a pair of inverted repeats of 26,358 bp that are separated by small and large single copy regions of 19,065 bp and 89,147 bp, respectively. The gene content and order of Vitis is identical to many other unrearranged angiosperm chloroplast genomes, including tobacco. Phylogenetic analyses using maximum parsimony and maximum likelihood were performed on DNA sequences of 61 protein-coding genes for two datasets with 28 or 29 taxa, including eight or nine taxa from four of the seven currently recognized major clades of rosids. Parsimony and likelihood phylogenies of both data sets provide strong support for the placement of Vitaceae as sister to the remaining rosids. However, the position of the Myrtales and support for the monophyly of the eurosid I clade differs between the two data sets and the two methods of analysis. In parsimony analyses, the inclusion of Gossypium is necessary to obtain trees that support the monophyly of the eurosid I clade

  8. [MAXIMUM SINGLE DOSE OF COLLOIDAL SILVER NEGATIVELY AFFECTS ERYTHROPOIESIS IN VITRO].

    Science.gov (United States)

    Tishevskaya, N V; Zakharov, Y M; Bolotov, A A; Arkhipenko, Yu V; Sazontova, T G

    2015-01-01

    Erythroblastic islets (EI) of rat bone marrow were cultured for 24 h in the presence of silver nanoparticles (1.07 · 10(-4) mg/ml; 1.07 · 10(-3) mg/ml; and 1.07 · 10(-2) mg/ml). Colloidal silver at the 1.07 · 10(-3) mg/ml concentration inhibited the formation of new EI, by disrupting contacts of bone marrow macrophages with CFU-E (erythropoiesis de novo), by 65.3% (p < 0.05). Colloidal silver nanoparticles suppressed the reconstruction of erythropoiesis and inhibited the formation of new EI by disrupting contacts of CFU-E and central macrophages with the matured erythroid "crown" (erythropoiesis de repeto). The colloidal silver concentration of 1.07 · 10(-3) mg/ml in the culture medium also reduced the number of self-reconstructing EI by 67.5% (p < 0.05), and 1.07 · 10(-2) mg/ml colloidal silver reduced this value by 93.7% (p < 0.05). Silver nanoparticles retarded maturation of erythroid cells at the stage of oxyphilic normoblast denucleation: 1.07 · 10(-3) mg/ml colloidal silver increased the number of mature EI by 53% (p < 0.05). The negative effect of colloidal silver in a concentration equivalent to the maximum single dose is related to the effect of silver nanoparticles rather than the glycerol present in the colloidal suspension.

  9. Clinical application of 16-slice spiral CT in reconstruction imaging of the coronary arteries for diagnosing coronary disease

    International Nuclear Information System (INIS)

    Mao Xinbo; Zhu Xinjin; Zeng Huiliang; Chen Xueguang

    2005-01-01

    Objective: To evaluate reconstructed imaging of the coronary arteries with 16-slice spiral CT in the diagnosis of coronary disease. Methods: The reconstructed images of the coronary arteries obtained on a 16-slice spiral CT scanner were reviewed in 60 cases, applying the following techniques: retrospective ECG-gating, segment method with 75% R-R interval, volume rendering technique (VRT), maximum intensity projection (MIP), multiplanar reconstruction (MPR), curved planar reconstruction (CPR) and CT virtual endoscopy (CTVE). Results: Of the 60 cases, different stages of CHD were revealed in 21; no abnormality was found in 33; and image quality was poor in 2 cases but remained adequate for diagnosis. There were 4 stents implanted in 4 cases: soft plaque was suspected in 1 case, the stent was patent in 2 and occluded in 1. Conclusion: Reconstructed imaging of the coronary arteries with 16-slice spiral CT is a superior modality for evaluating severe coronary stenosis, plaques, and the patency of intraluminal stents, and is an efficient, non-invasive imaging method for diagnosing early-stage CHD and for screening high-risk populations. (authors)

  10. Reconstructing sea level from paleo and projected temperatures 200 to 2100 AD

    DEFF Research Database (Denmark)

    Grinsted, Aslak; Moore, John; Jevrejeva, Svetlana

    2010-01-01

    -proxy reconstructions assuming that the established relationship between temperature and sea level holds from 200 to 2100 AD. Over the last 2,000 years minimum sea level (-19 to -26 cm) occurred around 1730 AD, maximum sea level (12–21 cm) around 1150 AD. Sea level 2090–2099 is projected to be 0.9 to 1.3 m for the A1B...

  11. SU-F-T-117: A Pilot Study of Organ Dose Reconstruction for Wilms Tumor Patients Treated with Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Makkia, R; Pelletier, C; Jung, J [East Carolina University, Greenville, NC (United States); Gopalakrishnan, M [Northwestern Memorial Hospital, Chicago, IL (United States); Lee, C [University of Michigan, Ann Arbor, MI (United States); Mille, M; Lee, C [National Cancer Institute, Rockville, MD (United States); Kalapurakal, J [Northwestern University, Chicago, IL (United States)

    2016-06-15

    Purpose: To reconstruct major organ doses for Wilms tumor pediatric patients treated with radiation therapy, using pediatric computational phantoms, a treatment planning system (TPS), and Monte Carlo (MC) dose calculation methods. Methods: A total of ten female and male pediatric patients (15–88 months old) were selected from the National Wilms Tumor Study cohort, and ten pediatric computational phantoms corresponding to the patients' height and weight were selected for the organ dose reconstruction. Treatment plans were reconstructed on the computational phantoms in a Pinnacle TPS (v9.10) referring to treatment records and exported into DICOM-RT files, which were then used to generate the input files for the XVMC MC code. The mean doses to major organs and the dose received by 50% of the heart were calculated and compared between TPS and MC calculations. The same calculations were conducted by replacing the computational human phantoms with a series of diagnostic patient CT images, selected by matching the height and weight of the patients, to validate the anatomical accuracy of the computational phantoms. Results: Doses to organs located within the treatment fields from the computational phantoms and the diagnostic patient CT images agreed within 2% for all cases for both TPS and MC calculations. The maximum difference in organ doses was 55.9% (thyroid), but the absolute dose difference in this case was 0.33 Gy, which was 0.96% of the prescription dose. Out-of-field doses to the ovaries and testes from MC showed larger discrepancies (maximum differences of 13.2% and 50.8%, respectively). The maximum difference in the 50% heart volume dose between the phantoms and the patient CT images was 40.0%. Conclusion: This study showed that the pediatric computational phantoms are applicable to organ dose reconstruction for radiotherapy patients whose three-dimensional radiological images are not available.

  12. Reconstructed Temperature And Precipitation On A Millennial Timescale From Tree-Rings In The Southern Colorado Plateau, U.S.A.

    Energy Technology Data Exchange (ETDEWEB)

    Salzer, M.W. [Laboratory of Tree-Ring Research, The University of Arizona, Tucson, Arizona, 85721 (United States); Kipfmueller, K.F. [Department of Geography, University of Minnesota, Minneapolis, MN, 55455 (United States)

    2005-06-01

    Two independent calibrated and verified climate reconstructions from ecologically contrasting tree-ring sites in the southern Colorado Plateau, U.S.A. reveal decadal-scale climatic trends during the past two millennia. Combining precisely dated annual mean-maximum temperature and October-through-July precipitation reconstructions yields an unparalleled record of climatic variability. The approach allows for the identification of 30 extreme wet periods and 35 extreme dry periods in the 1,425-year precipitation reconstruction, and 30 extreme cool periods and 26 extreme warm periods in the 2,262-year temperature reconstruction. In addition, the reconstructions were integrated to identify intervals when conditions were extreme in both climatic variables (cool/dry, cool/wet, warm/dry, warm/wet). Noteworthy in the reconstructions are the post-1976 warm/wet period, unprecedented in the 1,425-year record in both amplitude and duration; anomalous and prolonged late 20th century warmth that, while never exceeded, was nearly equaled in magnitude for brief intervals in the past; and substantial decadal-scale variability within the Medieval Warm Period and Little Ice Age intervals.

  13. Adaptive algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Lu Wenkai; Yin Fangfang

    2004-01-01

    Algebraic reconstruction techniques (ART) are iterative procedures for reconstructing objects from their projections. It is proven that ART can be computationally efficient by carefully arranging the order in which the collected data are accessed during the reconstruction procedure and adaptively adjusting the relaxation parameters. In this paper, an adaptive algebraic reconstruction technique (AART), which adopts the same projection access scheme in multilevel scheme algebraic reconstruction technique (MLS-ART), is proposed. By introducing adaptive adjustment of the relaxation parameters during the reconstruction procedure, one-iteration AART can produce reconstructions with better quality, in comparison with one-iteration MLS-ART. Furthermore, AART outperforms MLS-ART with improved computational efficiency
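
    For reference, a sketch of the underlying additive ART (Kaczmarz) update with a simple, illustrative adaptive relaxation schedule is shown below; the paper's AART couples such adaptive relaxation with the MLS-ART projection access order.

    ```python
    import numpy as np

    def aart_like(A, b, n_sweeps=20, lam0=1.0, seed=0):
        """Additive ART with a decaying relaxation parameter (toy schedule)."""
        rng = np.random.default_rng(seed)
        x = np.zeros(A.shape[1])
        row_norms = (A ** 2).sum(axis=1)
        for sweep in range(n_sweeps):
            lam = lam0 / (1 + sweep)           # adaptive relaxation, illustrative
            for i in rng.permutation(len(b)):  # projection access order matters
                if row_norms[i] > 0:
                    x += lam * (b[i] - A[i] @ x) / row_norms[i] * A[i]
        return x

    A = np.random.default_rng(4).random((120, 30))
    x_true = np.random.default_rng(5).random(30)
    print(np.linalg.norm(aart_like(A, A @ x_true) - x_true))  # small residual
    ```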

  14. Three-dimensional fracture visualisation of multidetector CT of the skull base in trauma patients: comparison of three reconstruction algorithms

    International Nuclear Information System (INIS)

    Ringl, Helmut; Schernthaner, Ruediger; Philipp, Marcel O.; Metz-Schimmerl, Sylvia; Czerny, Christian; Weber, Michael; Steiner-Ringl, Andrea; Peloschek, Philipp; Herold, Christian J.; Schima, Wolfgang; Gaebler, Christian

    2009-01-01

    The purpose of this study was to retrospectively assess the detection rate of skull-base fractures for three different three-dimensional (3D) reconstruction methods of cranial CT examinations in trauma patients. A total of 130 cranial CT examinations of patients with previous head trauma were subjected to 3D reconstruction of the skull base, using solid (SVR) and transparent (TVR) volume-rendering technique and maximum intensity projection (MIP). Three radiologists independently evaluated all reconstructions as well as standard high-resolution multiplanar reformations (HR-MPRs). Mean fracture detection rates for all readers reading rotating reconstructions were 39, 36, 61 and 64% for SVR, TVR, MIP and HR-MPR respectively. Although not significantly different from HR-MPR with respect to sensitivity (P = 0.9), MIP visualised 18% of fractures that were not reported in HR-MPR. Because of the relatively low detection rate using HR-MPRs alone, we recommend reading MIP reconstructions in addition to the obligatory HR-MPRs to improve fracture detection. (orig.)
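
    As background, a maximum intensity projection simply keeps the brightest voxel along each ray, which is why thin, bright structures such as the bony skull base stand out. A minimal axis-aligned sketch follows; the rotating MIPs read in the study would additionally resample the volume at each viewing angle, approximated here with scipy.ndimage.rotate.

        import numpy as np
        from scipy.ndimage import rotate

        def mip(volume, axis=1):
            # brightest voxel along each ray of an axis-aligned projection
            return volume.max(axis=axis)

        def rotating_mip_frames(volume, angles_deg):
            # approximate a rotating MIP: spin the (z, y, x) volume about the
            # z axis, then project along y for each frame
            return [rotate(volume, a, axes=(1, 2), reshape=False).max(axis=1)
                    for a in angles_deg]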

  15. Molecular systematics of terraranas (Anura: Brachycephaloidea) with an assessment of the effects of alignment and optimality criteria.

    Science.gov (United States)

    Padial, José M; Grant, Taran; Frost, Darrel R

    2014-06-26

    Brachycephaloidea is a monophyletic group of frogs with more than 1000 species distributed throughout the New World tropics, subtropics, and Andean regions. Recently, the group has been the target of multiple molecular phylogenetic analyses, resulting in extensive changes in its taxonomy. Here, we test previous hypotheses of phylogenetic relationships for the group by combining available molecular evidence (sequences of 22 genes representing 431 ingroup and 25 outgroup terminals) and performing a tree-alignment analysis under the parsimony optimality criterion using the program POY. To elucidate the effects of alignment and optimality criterion on phylogenetic inferences, we also used the program MAFFT to obtain a similarity-alignment for analysis under both parsimony and maximum likelihood using the programs TNT and GARLI, respectively. Although all three analytical approaches agreed on numerous points, there was also extensive disagreement. Tree-alignment under parsimony supported the monophyly of the ingroup and the sister group relationship of the monophyletic marsupial frogs (Hemiphractidae), while maximum likelihood and parsimony analyses of the MAFFT similarity-alignment did not. All three methods differed with respect to the position of Ceuthomantis smaragdinus (Ceuthomantidae), with tree-alignment using parsimony recovering this species as the sister of Pristimantis + Yunganastes. All analyses rejected the monophyly of Strabomantidae and Strabomantinae as originally defined, and the tree-alignment analysis under parsimony further rejected the recently redefined Craugastoridae and Pristimantinae. Despite the greater emphasis in the systematics literature placed on the choice of optimality criterion for evaluating trees than on the choice of method for aligning DNA sequences, we found that the topological differences attributable to the alignment method were as great as those caused by the optimality criterion. Further, the optimal tree-alignment indicates
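
    Since the record contrasts parsimony with likelihood as optimality criteria, it may help to recall what the parsimony score of a fixed tree is. Fitch's small-parsimony algorithm counts the minimum number of substitutions a site requires on a given tree; the sketch below is a textbook version on a binary, tuple-encoded tree, not the tree-alignment machinery of POY or TNT.

        def fitch(tree, states):
            """Return (state_set, min_substitutions) for one site.
            tree: a leaf name or a (left, right) tuple; states: leaf -> base."""
            if not isinstance(tree, tuple):
                return {states[tree]}, 0
            (ls, lc), (rs, rc) = fitch(tree[0], states), fitch(tree[1], states)
            common = ls & rs
            if common:                      # no extra substitution needed
                return common, lc + rc
            return ls | rs, lc + rc + 1     # empty intersection costs one step

        # toy example: ((A,B),(C,D)) with states G,G,T,G needs one substitution
        print(fitch((("A", "B"), ("C", "D")),
                    {"A": "G", "B": "G", "C": "T", "D": "G"})[1])   # -> 1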

  16. Pulmonary sequestration: diagnosis with three dimensional reconstruction using spiral CT

    International Nuclear Information System (INIS)

    Nie Yongkang; Zhao Shaohong; Cai Zulong; Yang Li; Zhao Hong; Zhang Ailian; Huang Hui

    2003-01-01

    Objective: To evaluate the role of three-dimensional (3D) reconstruction using spiral CT in the diagnosis of pulmonary sequestration. Methods: Ten patients with pulmonary sequestration were analyzed. The diagnoses were confirmed by angiography in 2 patients, by operation in 2 patients, and by CT angiography in 6 patients. All patients were examined with a Philips SR 7000 or GE Lightspeed Plus scanner. CT images were transferred to a workstation and 3D reconstruction was performed. All images were reviewed and analyzed by two radiologists. Results: Among the 10 patients, the pulmonary sequestration was in the right lower lobe in 1 patient and in the left lower lobe in 9 patients. Anomalous systemic arteries originated from the thoracic aorta in 8 patients and from the celiac artery in 2 patients. On plain CT scans, there were 4 patients with patchy opacities, 3 patients with a hilar mass accompanied by vascular engorgement and profusion in the adjacent parenchyma, 2 patients with a finger-like appendage surrounded by hyper-inflated lung, and 1 patient with a mass-like lung lesion. Enhanced CT revealed anomalous systemic arteries in 9 patients and a drainage vein in 7 patients. Maximum intensity projection (MIP) and curvilinear reconstruction could depict the abnormal systemic artery and drainage vein in sequestration. Shaded surface display (SSD) and volume rendering (VR) could delineate the anomalous systemic artery. Conclusion: 3D reconstruction with enhanced spiral CT can depict the anomalous systemic artery and drainage vein, and it is the method of first choice in diagnosing pulmonary sequestration.

  17. Compensation of spatial system response in SPECT with conjugate gradient reconstruction technique

    International Nuclear Information System (INIS)

    Formiconi, A.R.; Pupi, A.; Passeri, A.

    1989-01-01

    A procedure for determination of the system matrix in single photon emission tomography (SPECT) is described which uses a conjugate gradient reconstruction technique to take into account the variable system resolution of a camera equipped with parallel-hole collimators. The procedure involves acquisition of system line spread functions (LSF) in the region occupied by the object studied. Those data are used to generate a set of weighting factors based on the assumption that the LSFs of the collimated camera are of Gaussian shape with a full width at half maximum (FWHM) linearly dependent on source depth over the span of image space. The factors are stored in a disc file for subsequent use in reconstruction. Afterwards, reconstruction is performed using the conjugate gradient method with the system matrix modified by incorporating these precalculated factors to take into account the variable geometrical system response. The set of weighting factors is regenerated whenever the acquisition conditions are changed (collimator, radius of rotation); with an ultra-high-resolution (UHR) collimator, 2000 weighting factors need to be calculated. (author)
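
    Per source depth, the weighting factors described here reduce to sampling a normalized Gaussian whose FWHM grows linearly with distance from the collimator. A small sketch under assumed calibration constants; the intercept and slope below are illustrative, not the paper's measured values.

        import numpy as np

        def lsf_weights(offsets_mm, depth_mm, fwhm0_mm=4.0, slope=0.05):
            # FWHM assumed linear in source depth: FWHM(d) = fwhm0 + slope * d
            fwhm = fwhm0_mm + slope * depth_mm
            sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
            w = np.exp(-0.5 * (np.asarray(offsets_mm) / sigma) ** 2)
            return w / w.sum()          # normalized row of the system matrix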

  18. MO-DE-207A-12: Toward Patient-Specific 4DCT Reconstruction Using Adaptive Velocity Binning

    International Nuclear Information System (INIS)

    Morris, E.D.; Glide-Hurst, C.; Klahr, P.

    2016-01-01

    Purpose: While 4DCT provides organ/tumor motion information, it often samples data over 10–20 breathing cycles. For patients presenting with compromised pulmonary function, breathing patterns can change over the acquisition time, potentially leading to tumor delineation discrepancies. This work introduces a novel adaptive velocity-modulated binning (AVB) 4DCT algorithm that modulates the reconstruction based on the respiratory waveform, yielding a patient-specific 4DCT solution. Methods: AVB was implemented in a research reconstruction configuration. After filtering the respiratory waveform, the algorithm examines data neighboring a phase reconstruction point, and the temporal gate is widened until the difference between the reconstruction point and the waveform exceeds a threshold value, defined as a percent difference between the maximum/minimum waveform amplitude. The algorithm only impacts the reconstruction if the gate width exceeds a set minimum temporal width required for accurate reconstruction. A sensitivity experiment with threshold values of 0.5, 1, 5, 10, and 12% was conducted to examine the interplay between threshold, signal-to-noise ratio (SNR), and image sharpness for phantom and several patient 4DCT cases using ten-phase reconstructions. Individual phase reconstructions were examined. Subtraction images and regions of interest were compared to quantify changes in SNR. Results: AVB increased signal in reconstructed 4DCT slices for respiratory waveforms that met the prescribed criteria. For the end-exhale phases, where the respiratory velocity is low, patient data revealed that a threshold of 0.5% yielded increased SNR in the AVB reconstructions. For intermediate breathing phases, threshold values had to exceed 10% to produce appreciable changes in CT intensity with AVB. AVB reconstructions exhibited appreciably higher SNR and reduced noise in regions of interest that were photon deprived, such as the liver. Conclusion: We demonstrated that patient
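
    The gate-widening logic can be pictured as follows: starting from a phase reconstruction point, neighboring waveform samples are absorbed into the gate while they stay within a threshold fraction of the peak-to-peak respiratory amplitude. This is a schematic reading of the record, with the fallback behavior below the minimum gate width assumed.

        import numpy as np

        def adaptive_gate(waveform, t0, threshold, min_width):
            """Return (lo, hi) sample indices of the widened temporal gate."""
            tol = threshold * (waveform.max() - waveform.min())
            lo = hi = t0
            while lo > 0 and abs(waveform[lo - 1] - waveform[t0]) <= tol:
                lo -= 1
            while hi < len(waveform) - 1 and abs(waveform[hi + 1] - waveform[t0]) <= tol:
                hi += 1
            # AVB only modulates the reconstruction once the gate exceeds the
            # minimum temporal width required for accurate reconstruction
            if hi - lo + 1 <= min_width:
                return t0, t0
            return lo, hi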

  19. MO-DE-207A-12: Toward Patient-Specific 4DCT Reconstruction Using Adaptive Velocity Binning

    Energy Technology Data Exchange (ETDEWEB)

    Morris, E.D.; Glide-Hurst, C. [Henry Ford Health System, Detroit, MI (United States); Wayne State University, Detroit, MI (United States); Klahr, P. [Philips Healthcare, Cleveland, Ohio (United States)

    2016-06-15

    Purpose: While 4DCT provides organ/tumor motion information, it often samples data over 10–20 breathing cycles. For patients presenting with compromised pulmonary function, breathing patterns can change over the acquisition time, potentially leading to tumor delineation discrepancies. This work introduces a novel adaptive velocity-modulated binning (AVB) 4DCT algorithm that modulates the reconstruction based on the respiratory waveform, yielding a patient-specific 4DCT solution. Methods: AVB was implemented in a research reconstruction configuration. After filtering the respiratory waveform, the algorithm examines data neighboring a phase reconstruction point, and the temporal gate is widened until the difference between the reconstruction point and the waveform exceeds a threshold value, defined as a percent difference between the maximum/minimum waveform amplitude. The algorithm only impacts the reconstruction if the gate width exceeds a set minimum temporal width required for accurate reconstruction. A sensitivity experiment with threshold values of 0.5, 1, 5, 10, and 12% was conducted to examine the interplay between threshold, signal-to-noise ratio (SNR), and image sharpness for phantom and several patient 4DCT cases using ten-phase reconstructions. Individual phase reconstructions were examined. Subtraction images and regions of interest were compared to quantify changes in SNR. Results: AVB increased signal in reconstructed 4DCT slices for respiratory waveforms that met the prescribed criteria. For the end-exhale phases, where the respiratory velocity is low, patient data revealed that a threshold of 0.5% yielded increased SNR in the AVB reconstructions. For intermediate breathing phases, threshold values had to exceed 10% to produce appreciable changes in CT intensity with AVB. AVB reconstructions exhibited appreciably higher SNR and reduced noise in regions of interest that were photon deprived, such as the liver. Conclusion: We demonstrated that patient

  20. Nuclear Enhanced X-ray Maximum Entropy Method Used to Analyze Local Distortions in Simple Structures

    DEFF Research Database (Denmark)

    Christensen, Sebastian; Bindzus, Niels; Christensen, Mogens

    We introduce a novel method for reconstructing pseudo nuclear density distributions (NDDs): the Nuclear Enhanced X-ray Maximum Entropy Method (NEXMEM). NEXMEM offers an alternative route to experimental NDDs, exploiting the superior quality of synchrotron X-ray data compared to neutron data. The method...... proposed to result from anharmonic phonon scattering or from local fluctuating dipoles on the Pb site.[1,2] No macroscopic symmetry change is associated with these effects, rendering them invisible to conventional crystallographic techniques. For this reason PbX was until recently believed to adopt

  1. Optimal contact definition for reconstruction of Contact Maps

    Directory of Open Access Journals (Sweden)

    Stehr Henning

    2010-05-01

    Abstract Background Contact maps have been extensively used as a simplified representation of protein structures. They capture most important features of a protein's fold, being preferred by a number of researchers for the description and study of protein structures. Inspired by the model's simplicity, many groups have dedicated a considerable amount of effort towards contact prediction as a proxy for protein structure prediction. However, a contact map's biological interest is subject to the availability of reliable methods for the 3-dimensional reconstruction of the structure. Results We use an implementation of the well-known distance geometry protocol to build realistic protein 3-dimensional models from contact maps, performing an extensive exploration of many of the parameters involved in the reconstruction process. We try to address the questions: (a) to what accuracy does a contact map represent its corresponding 3D structure, (b) what is the best contact map representation with regard to reconstructability, and (c) what is the effect of partial or inaccurate contact information on the 3D structure recovery. Our results suggest that contact maps derived from the application of a distance cutoff of 9 to 11 Å around the Cβ atoms constitute the most accurate representation of the 3D structure. The reconstruction process does not provide a single solution to the problem but rather an ensemble of conformations that are within 2 Å RMSD of the crystal structure and with lower values for the pairwise average ensemble RMSD. Interestingly, it is still possible to recover a structure with partial contact information, although wrong contacts can lead to a dramatic loss in reconstruction fidelity. Conclusions Thus contact maps represent a valid approximation to the structures with an accuracy comparable to that of experimental methods. The optimal contact definitions constitute key guidelines for methods based on contact maps such as structure prediction through
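
    The contact definition the authors converge on is easy to state concretely: two residues are in contact when their Cβ atoms lie within a 9-11 Å cutoff. A minimal sketch, with a 10 Å cutoff picked from inside the reported optimal range:

        import numpy as np

        def contact_map(cb_coords, cutoff=10.0):
            """cb_coords: (N, 3) array of C-beta coordinates in angstroms.
            Returns the boolean N x N contact map."""
            diff = cb_coords[:, None, :] - cb_coords[None, :, :]
            dist = np.linalg.norm(diff, axis=-1)
            return dist < cutoff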

  2. Self-reconstruction of diffraction-free and accelerating laser beams in scattering media

    International Nuclear Information System (INIS)

    Ersoy, T.; Yalizay, B.; Akturk, S.

    2012-01-01

    We experimentally investigate propagation of laser beams with different intensity profiles in highly scattering media. We generate transverse laser amplitude profiles with Gaussian, Bessel and Airy function envelopes. We then propagate these beams through optical phantoms formed with variable density intralipid solutions. At the sample exit, we compare the change in maximum intensities, as well as beam profile reconstruction. We show that the self-reconstruction properties of Bessel and Airy beams bring about a slower decrease in maximum intensity with increasing scatterer density. On the other hand, the beam profiles deteriorate faster, as compared to reference Gaussian beams. The slower decrease in intensity can be attributed to the wavevector spectra providing a continuous flow of energy to the beam center, while beam deterioration is linked to the total beam volume in the scattering medium. These results show that beam shaping methods can significantly enhance delivery of intense light deeper into turbid media, but this enhancement is compromised by stronger speckling of beam profiles. -- Highlights: ► We experimentally investigate propagation of shaped laser beams in turbid media. ► Peak intensity of Bessel and Airy beams decreases slower with increasing scatterer density. ► Shaped beam profiles deteriorate faster, as compared to reference Gaussian beams. ► Shaped beam profiles can enhance applications of lasers in scattering media.

  3. PET image reconstruction with rotationally symmetric polygonal pixel grid based highly compressible system matrix

    International Nuclear Information System (INIS)

    Yu Yunhan; Xia Yan; Liu Yaqiang; Wang Shi; Ma Tianyu; Chen Jing; Hong Baoyu

    2013-01-01

    To achieve maximum compression of the system matrix in positron emission tomography (PET) image reconstruction, we proposed a polygonal image pixel division strategy in accordance with the rotationally symmetric PET geometry. A geometrical definition and an indexing rule for the polygonal pixels were established. Image conversion from the polygonal pixel structure to the conventional rectangular pixel structure was implemented using a conversion matrix. A set of test images were analytically defined in the polygonal pixel structure, converted to conventional rectangular pixel based images, and correctly displayed, which verified the correctness of the image definition, conversion description and conversion of the polygonal pixel structure. A compressed system matrix for PET image reconstruction was generated by the tap model and tested by forward-projecting three different distributions of radioactive sources to the sinogram domain and comparing them with theoretical predictions. On a practical small animal PET scanner, a compression ratio of 12.6:1 in system matrix size was achieved with the polygonal pixel structure, compared with the conventional rectangular pixel based tap-mode one. OS-EM iterative image reconstruction algorithms with the polygonal and conventional Cartesian pixel grids were developed. A hot rod phantom was scanned and reconstructed on these two grids with reasonable time cost. The resolution of the reconstructed images was 1.35 mm in both cases. We conclude that it is feasible to reconstruct and display images in a polygonal image pixel structure based on a compressed system matrix in PET image reconstruction. (authors)
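
    The polygonal-to-rectangular image conversion the authors describe amounts to a single matrix-vector product, with the matrix holding the overlap between each Cartesian pixel and each polygonal pixel. A sketch assuming that overlap matrix has already been precomputed (its construction is not detailed in this record):

        import numpy as np

        def to_rectangular(poly_values, overlap, out_shape):
            """poly_values: image on the polygonal grid (length n_poly);
            overlap: (H*W, n_poly) conversion matrix (typically scipy.sparse)
            of area overlap fractions between rectangular and polygonal pixels."""
            return (overlap @ poly_values).reshape(out_shape)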

  4. Propagation stability of self-reconstructing Bessel beams enables contrast-enhanced imaging in thick media.

    Science.gov (United States)

    Fahrbach, Florian O; Rohrbach, Alexander

    2012-01-17

    Laser beams that can self-reconstruct their initial beam profile even in the presence of massive phase perturbations are able to propagate deeper into inhomogeneous media. This ability has crucial advantages for light sheet-based microscopy in thick media, such as cell clusters, embryos, skin or brain tissue or plants, as well as scattering synthetic materials. A ring system around the central intensity maximum of a Bessel beam enables its self-reconstruction, but at the same time illuminates out-of-focus regions and deteriorates image contrast. Here we present a detection method that minimizes the negative effect of the ring system. The beam's propagation stability along one straight line enables the use of a confocal line principle, resulting in a significant increase in image contrast. The axial resolution could be improved by nearly 100% relative to the standard light-sheet techniques using scanned Gaussian beams, while demonstrating self-reconstruction also for high propagation depths.

  5. Acoustical source reconstruction from non-synchronous sequential measurements by Fast Iterative Shrinkage Thresholding Algorithm

    Science.gov (United States)

    Yu, Liang; Antoni, Jerome; Leclere, Quentin; Jiang, Weikang

    2017-11-01

    Acoustical source reconstruction is a typical inverse problem, whose minimum reconstruction frequency hinges on the size of the array and whose maximum frequency depends on the spacing between the microphones. To enlarge the frequency range of the reconstruction and reduce the cost of the acquisition system, Cyclic Projection (CP), a method of sequential measurements without reference, was recently investigated (JSV, 2016, 372:31-49). In this paper, the Propagation based Fast Iterative Shrinkage Thresholding Algorithm (Propagation-FISTA) is introduced, which improves CP in two aspects: (1) the number of acoustic sources is no longer needed and the only assumption made is that of a "weakly sparse" eigenvalue spectrum; (2) the construction of the spatial basis is much easier and adaptive to practical scenarios of acoustical measurements, benefiting from the introduction of a propagation based spatial basis. The proposed Propagation-FISTA is first investigated with different simulations and experimental setups and is next illustrated with an industrial case.
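
    FISTA itself is a generic accelerated proximal-gradient scheme; for the l1-regularized least-squares problem min_x ||Ax - b||^2 / 2 + lam * ||x||_1 it reads as below. This is the textbook iteration only; the propagation-based spatial basis of the paper would enter through the matrix A.

        import numpy as np

        def fista(A, b, lam, n_iter=200):
            L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
            x = np.zeros(A.shape[1]); z = x.copy(); t = 1.0
            for _ in range(n_iter):
                g = A.T @ (A @ z - b)              # gradient of the smooth term at z
                u = z - g / L
                x_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)   # soft threshold
                t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
                z = x_new + (t - 1.0) / t_new * (x_new - x)                 # momentum step
                x, t = x_new, t_new
            return x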

  6. LASER: A Maximum Likelihood Toolkit for Detecting Temporal Shifts in Diversification Rates From Molecular Phylogenies

    Directory of Open Access Journals (Sweden)

    Daniel L. Rabosky

    2006-01-01

    Rates of species origination and extinction can vary over time during evolutionary radiations, and it is possible to reconstruct the history of diversification using molecular phylogenies of extant taxa only. Maximum likelihood methods provide a useful framework for inferring temporal variation in diversification rates. LASER is a package for the R programming environment that implements maximum likelihood methods based on the birth-death process to test whether diversification rates have changed over time. LASER contrasts the likelihood of phylogenetic data under models where diversification rates have changed over time to alternative models where rates have remained constant over time. Major strengths of the package include the ability to detect temporal increases in diversification rates and the inference of diversification parameters under multiple rate-variable models of diversification. The program and associated documentation are freely available from the R package archive at http://cran.r-project.org.
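
    To make the birth-death machinery concrete, the simplest constant-rate case (a pure-birth, or Yule, model) has a closed-form log-likelihood for the branching times of a reconstructed phylogeny; rate-shift tests of the kind LASER performs compare such a constant-rate fit against models whose rate changes over time. A sketch of that baseline, in Python rather than R and simplified to the pure-birth case:

        import numpy as np

        def yule_loglik(lam, split_times, present):
            """split_times: increasing times of the n-1 splits, starting at the
            root split; present: time of observation. While k lineages exist,
            the total birth rate is k * lam."""
            t = np.append(split_times, present)
            waits = np.diff(t)              # durations with 2, 3, ..., n lineages
            k = np.arange(2, 2 + len(waits))
            # a birth ends every interval except the last one (to the present)
            return np.sum(np.log(k[:-1] * lam)) - lam * np.sum(k * waits)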

  7. Quantitative assessment of oral orbicular muscle deformation after cleft lip reconstruction: an ultrasound elastography study.

    Science.gov (United States)

    de Korte, Chris L; van Hees, Nancy; Lopata, Richard G P; Weijers, Gert; Katsaros, Christos; Thijssen, Johan M

    2009-08-01

    Reconstruction of a cleft lip inevitably leads to scar tissue formation. Scar tissue within the restored oral orbicular muscle might be assessed by quantification of the local contractility of this muscle. Furthermore, information about the contraction capability of the oral orbicular muscle is crucial for planning the revision surgery of an individual patient. We used ultrasound elastography to determine the local deformation (strain) of the upper lip and to differentiate contracting muscle from passive scar tissue. Raw ultrasound data (radio-frequency format; RF) were acquired while the lips were brought from the normal state into a pout condition and back to the normal state, in three patients and three normal individuals. During this movement, the oral orbicular muscle contracts and, consequently, thickens, in contrast to scar tissue, which will not contract, or may even expand. An iterative coarse-to-fine strain estimation method was used to calculate the local tissue strain. Analysis of the raw ultrasound data allows estimation of tissue strain with high precision. The minimum strain that can be assessed reproducibly is 0.1%. In normal individuals, strain of the orbicular oral muscle was on the order of 20%. Also, a uniform strain distribution in the oral orbicular muscle was found. In patients, however, deviating values were found in the region of the reconstruction and in the surrounding muscle tissue. In two patients with a successful reconstruction, strain was reduced by 6% in the reconstructed region with respect to the normal parts of the muscle (from 22% to 16% and from 25% to 19%). In a patient with severe aesthetic and functional disability, strain decreased from 30% in the normal region to 5% in the reconstructed region. With ultrasound elastography, the strain of the oral orbicular muscle can be quantified. In healthy subjects, the strain profiles and maximum strain values in all parts of the muscle were similar. The maximum strain of the muscle during

  8. Anatomy-based reconstruction of FDG-PET images with implicit partial volume correction improves detection of hypometabolic regions in patients with epilepsy due to focal cortical dysplasia diagnosed on MRI

    Energy Technology Data Exchange (ETDEWEB)

    Goffin, Karolien; Baete, Kristof; Nuyts, Johan; Laere, Koen van [University Hospital Leuven, Division of Nuclear Medicine and Medical Imaging Center, Leuven (Belgium); Van Paesschen, Wim [University Hospital Leuven, Neurology Department, Leuven (Belgium); Dupont, Patrick [University Hospital Leuven, Division of Nuclear Medicine and Medical Imaging Center, Leuven (Belgium); University Hospital Leuven, Laboratory of Cognitive Neurology, Leuven (Belgium); Palmini, Andre [Pontificia Universidade Catolica do Rio Grande do Sul (PUCRS), Porto Alegre Epilepsy Surgery Program, Hospital Sao Lucas, Porto Alegre (Brazil)

    2010-06-15

    Detection of hypometabolic areas on interictal FDG-PET images for assessing the epileptogenic zone is hampered by partial volume effects. We evaluated the performance of an anatomy-based maximum a-posteriori (A-MAP) reconstruction algorithm which combined noise suppression with correction for the partial volume effect in the detection of hypometabolic areas in patients with focal cortical dysplasia (FCD). FDG-PET images from 14 patients with refractory partial epilepsy were reconstructed using A-MAP and maximum likelihood (ML) reconstruction. In all patients, presurgical evaluation showed that FCD represented the epileptic lesion. Correspondence between the FCD location and regional metabolism on a predefined atlas was evaluated. An asymmetry index of FCD to normal cortex was calculated. Hypometabolism at the FCD location was detected in 9/14 patients (64%) using ML and in 10/14 patients (71%) using A-MAP reconstruction. Hypometabolic areas outside the FCD location were detected in 12/14 patients (86%) using ML and in 11/14 patients (79%) using A-MAP reconstruction. The asymmetry index was higher using A-MAP reconstruction (0.61, ML 0.49, p=0.03). The A-MAP reconstruction algorithm improved visual detection of epileptic FCD on brain FDG-PET images compared to ML reconstruction, due to higher contrast and better delineation of the lesion. This improvement failed to reach significance in our small sample. Hypometabolism outside the lesion is often present, consistent with the observation that the functional deficit zone tends to be larger than the epileptogenic zone. (orig.)

  9. Anatomy-based reconstruction of FDG-PET images with implicit partial volume correction improves detection of hypometabolic regions in patients with epilepsy due to focal cortical dysplasia diagnosed on MRI

    International Nuclear Information System (INIS)

    Goffin, Karolien; Baete, Kristof; Nuyts, Johan; Laere, Koen van; Van Paesschen, Wim; Dupont, Patrick; Palmini, Andre

    2010-01-01

    Detection of hypometabolic areas on interictal FDG-PET images for assessing the epileptogenic zone is hampered by partial volume effects. We evaluated the performance of an anatomy-based maximum a-posteriori (A-MAP) reconstruction algorithm which combined noise suppression with correction for the partial volume effect in the detection of hypometabolic areas in patients with focal cortical dysplasia (FCD). FDG-PET images from 14 patients with refractory partial epilepsy were reconstructed using A-MAP and maximum likelihood (ML) reconstruction. In all patients, presurgical evaluation showed that FCD represented the epileptic lesion. Correspondence between the FCD location and regional metabolism on a predefined atlas was evaluated. An asymmetry index of FCD to normal cortex was calculated. Hypometabolism at the FCD location was detected in 9/14 patients (64%) using ML and in 10/14 patients (71%) using A-MAP reconstruction. Hypometabolic areas outside the FCD location were detected in 12/14 patients (86%) using ML and in 11/14 patients (79%) using A-MAP reconstruction. The asymmetry index was higher using A-MAP reconstruction (0.61, ML 0.49, p=0.03). The A-MAP reconstruction algorithm improved visual detection of epileptic FCD on brain FDG-PET images compared to ML reconstruction, due to higher contrast and better delineation of the lesion. This improvement failed to reach significance in our small sample. Hypometabolism outside the lesion is often present, consistent with the observation that the functional deficit zone tends to be larger than the epileptogenic zone. (orig.)

  10. Isotope specific resolution recovery image reconstruction in high resolution PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kotasidis, Fotis A. [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, CH-1211 Geneva, Switzerland and Wolfson Molecular Imaging Centre, MAHSC, University of Manchester, M20 3LJ, Manchester (United Kingdom); Angelis, Georgios I. [Faculty of Health Sciences, Brain and Mind Research Institute, University of Sydney, NSW 2006, Sydney (Australia); Anton-Rodriguez, Jose; Matthews, Julian C. [Wolfson Molecular Imaging Centre, MAHSC, University of Manchester, Manchester M20 3LJ (United Kingdom); Reader, Andrew J. [Montreal Neurological Institute, McGill University, Montreal QC H3A 2B4, Canada and Department of Biomedical Engineering, Division of Imaging Sciences and Biomedical Engineering, King's College London, St. Thomas’ Hospital, London SE1 7EH (United Kingdom); Zaidi, Habib [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, CH-1211 Geneva (Switzerland); Geneva Neuroscience Centre, Geneva University, CH-1205 Geneva (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, PO Box 30 001, Groningen 9700 RB (Netherlands)

    2014-05-15

    Purpose: Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. Methods: In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. Results: The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Conclusions: Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution

  11. Isotope specific resolution recovery image reconstruction in high resolution PET imaging

    International Nuclear Information System (INIS)

    Kotasidis, Fotis A.; Angelis, Georgios I.; Anton-Rodriguez, Jose; Matthews, Julian C.; Reader, Andrew J.; Zaidi, Habib

    2014-01-01

    Purpose: Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. Methods: In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. Results: The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Conclusions: Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution

  12. Isotope specific resolution recovery image reconstruction in high resolution PET imaging.

    Science.gov (United States)

    Kotasidis, Fotis A; Angelis, Georgios I; Anton-Rodriguez, Jose; Matthews, Julian C; Reader, Andrew J; Zaidi, Habib

    2014-05-01

    Measuring and incorporating a scanner-specific point spread function (PSF) within image reconstruction has been shown to improve spatial resolution in PET. However, due to the short half-life of clinically used isotopes, other long-lived isotopes not used in clinical practice are used to perform the PSF measurements. As such, non-optimal PSF models that do not correspond to those needed for the data to be reconstructed are used within resolution modeling (RM) image reconstruction, usually underestimating the true PSF owing to the difference in positron range. In high resolution brain and preclinical imaging, this effect is of particular importance since the PSFs become more positron range limited and isotope-specific PSFs can help maximize the performance benefit from using resolution recovery image reconstruction algorithms. In this work, the authors used a printing technique to simultaneously measure multiple point sources on the High Resolution Research Tomograph (HRRT), and the authors demonstrated the feasibility of deriving isotope-dependent system matrices from fluorine-18 and carbon-11 point sources. Furthermore, the authors evaluated the impact of incorporating them within RM image reconstruction, using carbon-11 phantom and clinical datasets on the HRRT. The results obtained using these two isotopes illustrate that even small differences in positron range can result in different PSF maps, leading to further improvements in contrast recovery when used in image reconstruction. The difference is more pronounced in the centre of the field-of-view where the full width at half maximum (FWHM) from the positron range has a larger contribution to the overall FWHM compared to the edge where the parallax error dominates the overall FWHM. Based on the proposed methodology, measured isotope-specific and spatially variant PSFs can be reliably derived and used for improved spatial resolution and variance performance in resolution recovery image reconstruction. The

  13. Parsimonious model for blood glucose level monitoring in type 2 diabetes patients.

    Science.gov (United States)

    Zhao, Fang; Ma, Yan Fen; Wen, Jing Xiao; DU, Yan Fang; Li, Chun Lin; Li, Guang Wei

    2014-07-01

    To establish a parsimonious model for blood glucose monitoring in patients with type 2 diabetes receiving oral hypoglycemic agent treatment. One hundred and fifty-nine adult Chinese type 2 diabetes patients were randomized to receive rapid-acting or sustained-release gliclazide therapy for 12 weeks. Their blood glucose levels were measured at 10 time points in a 24 h period before and after treatment, and the 24 h mean blood glucose (MBG) levels were calculated. The contribution of the blood glucose levels to the MBG and to HbA1c was assessed by multiple regression analysis. The correlation coefficients of the blood glucose levels measured at the 10 time points to the daily MBG were 0.58-0.74 and 0.59-0.79, respectively, before and after treatment (P < 0.05). Blood glucose levels measured at 6 of the 10 time points could explain 95% and 97% of the changes in MBG before and after treatment. The three blood glucose levels measured at fasting, 2 h after breakfast and before dinner could explain 84% and 86% of the changes in MBG before and after treatment, but only 36% and 26% of the changes in HbA1c before and after treatment, and they had a poorer correlation with HbA1c than with the 24 h MBG. The blood glucose levels measured at fasting, 2 h after breakfast and before dinner truly reflected the changes in the 24 h blood glucose level, suggesting that they are appropriate for the self-monitoring of blood glucose levels in diabetes patients receiving oral anti-diabetes therapy. Copyright © 2014 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
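
    The underlying analysis is an ordinary multiple regression of the 24 h MBG on a small subset of time points. A sketch of the three-point model with numpy; the column order and units are assumptions for illustration, not taken from the paper.

        import numpy as np

        def fit_three_point_model(X, y):
            """X: (n, 3) glucose at fasting, 2 h post-breakfast, pre-dinner;
            y: (n,) 24 h mean blood glucose. Returns coefficients and R^2."""
            A = np.column_stack([X, np.ones(len(X))])       # add an intercept
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
            return coef, r2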

  14. Robustness of ancestral sequence reconstruction to phylogenetic uncertainty.

    Science.gov (United States)

    Hanson-Smith, Victor; Kolaczkowski, Bryan; Thornton, Joseph W

    2010-09-01

    Ancestral sequence reconstruction (ASR) is widely used to formulate and test hypotheses about the sequences, functions, and structures of ancient genes. Ancestral sequences are usually inferred from an alignment of extant sequences using a maximum likelihood (ML) phylogenetic algorithm, which calculates the most likely ancestral sequence assuming a probabilistic model of sequence evolution and a specific phylogeny--typically the tree with the maximum likelihood. The true phylogeny is seldom known with certainty, however. ML methods ignore this uncertainty, whereas Bayesian methods incorporate it by integrating the likelihood of each ancestral state over a distribution of possible trees. It is not known whether Bayesian approaches to phylogenetic uncertainty improve the accuracy of inferred ancestral sequences. Here, we use simulation-based experiments under both simplified and empirically derived conditions to compare the accuracy of ASR carried out using ML and Bayesian approaches. We show that incorporating phylogenetic uncertainty by integrating over topologies very rarely changes the inferred ancestral state and does not improve the accuracy of the reconstructed ancestral sequence. Ancestral state reconstructions are robust to uncertainty about the underlying tree because the conditions that produce phylogenetic uncertainty also make the ancestral state identical across plausible trees; conversely, the conditions under which different phylogenies yield different inferred ancestral states produce little or no ambiguity about the true phylogeny. Our results suggest that ML can produce accurate ASRs, even in the face of phylogenetic uncertainty. Using Bayesian integration to incorporate this uncertainty is neither necessary nor beneficial.

  15. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V): A Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  16. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies

    Science.gov (United States)

    Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong

    2017-05-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity curves, TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans—each containing 1/8th of the total number of events—were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other
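
    The kinetic model referred to here is the standard one-tissue compartment model, whose tissue response is K1 * exp(-k2 t) convolved with the arterial input, mixed with left- and right-ventricular spillover terms. A discrete-time sketch; the exact spillover parameterization used by the authors is not given in this record, so the (1 - f_lv - f_rv) scaling below is an assumption.

        import numpy as np

        def one_tissue_tac(t, K1, k2, f_lv, f_rv, c_lv, c_rv):
            """t: uniform time grid; c_lv, c_rv: blood-pool TACs sampled on t."""
            dt = t[1] - t[0]
            # tissue response: K1 * exp(-k2 t) convolved with the input c_lv
            tissue = K1 * np.convolve(np.exp(-k2 * t), c_lv)[: len(t)] * dt
            return (1.0 - f_lv - f_rv) * tissue + f_lv * c_lv + f_rv * c_rv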

  17. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in-vivo studies

    Science.gov (United States)

    Petibon, Yoann; Rakvongthai, Yothin; Fakhri, Georges El; Ouyang, Jinsong

    2017-01-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity curves, TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in-vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans - each containing 1/8th of the total number of events - were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard Ordered Subset Expectation Maximization (OSEM) reconstruction algorithm on one side, and the One-Step Late Maximum a Posteriori (OSL-MAP) algorithm on the other

  18. Effect of Low-Dose MDCT and Iterative Reconstruction on Trabecular Bone Microstructure Assessment.

    Science.gov (United States)

    Kopp, Felix K; Holzapfel, Konstantin; Baum, Thomas; Nasirudin, Radin A; Mei, Kai; Garcia, Eduardo G; Burgkart, Rainer; Rummeny, Ernst J; Kirschke, Jan S; Noël, Peter B

    2016-01-01

    We investigated the effects of low-dose multidetector computed tomography (MDCT) in combination with statistical iterative reconstruction algorithms on trabecular bone microstructure parameters. Twelve donated vertebrae were scanned with the routine radiation exposure used in our department (standard-dose) and a low-dose protocol. Reconstructions were performed with filtered backprojection (FBP) and maximum-likelihood based statistical iterative reconstruction (SIR). Trabecular bone microstructure parameters were assessed and statistically compared for each reconstruction. Moreover, fracture loads of the vertebrae were biomechanically determined and correlated to the assessed microstructure parameters. Trabecular bone microstructure parameters based on low-dose MDCT and SIR significantly correlated with vertebral bone strength. There was no significant difference between microstructure parameters calculated on low-dose SIR and standard-dose FBP images. However, the results revealed a strong dependency on the regularization strength applied during SIR. It was observed that stronger regularization might corrupt the microstructure analysis, because the trabecular structure is a very small detail that might get lost during the regularization process. As a consequence, the introduction of SIR for trabecular bone microstructure analysis requires a specific optimization of the regularization parameters. Moreover, in comparison to other approaches, superior noise-resolution trade-offs can be found with the proposed methods.

  19. Statistical inference approach to structural reconstruction of complex networks from binary time series

    Science.gov (United States)

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous work, fully reconstructing the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
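
    The heart of the method is that, with enough data, the estimated link probabilities fall into two well-separated groups (actual vs. nonexistent links). As a simplified stand-in for that step, here is a two-component Gaussian-mixture EM that labels a vector of estimated probabilities; this is an illustration of the separation idea, not the authors' full network-reconstruction pipeline.

        import numpy as np

        def em_two_clusters(p, n_iter=100):
            mu = np.array([p.min(), p.max()])
            sd = np.full(2, p.std() + 1e-6)
            w = np.array([0.5, 0.5])
            for _ in range(n_iter):
                # E-step: responsibilities under each Gaussian component
                lik = w * np.exp(-0.5 * ((p[:, None] - mu) / sd) ** 2) / sd
                r = lik / lik.sum(axis=1, keepdims=True)
                # M-step: update mixture weights, means and spreads
                nk = r.sum(axis=0)
                w, mu = nk / len(p), (r * p[:, None]).sum(axis=0) / nk
                sd = np.sqrt((r * (p[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
            return r.argmax(axis=1)   # component with the larger mean = actual links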

  20. Direct Reconstruction of CT-based Attenuation Correction Images for PET with Cluster-Based Penalties

    Science.gov (United States)

    Kim, Soo Mee; Alessio, Adam M.; De Man, Bruno; Asma, Evren; Kinahan, Paul E.

    2015-01-01

    Extremely low-dose CT acquisitions for the purpose of PET attenuation correction will have a high level of noise and biasing artifacts due to factors such as photon starvation. This work explores a priori knowledge appropriate for CT iterative image reconstruction for PET attenuation correction. We investigate the maximum a posteriori (MAP) framework with cluster-based, multinomial priors for the direct reconstruction of the PET attenuation map. The objective function for direct iterative attenuation map reconstruction was modeled as a Poisson log-likelihood with prior terms consisting of quadratic (Q) and mixture (M) distributions. The attenuation map is assumed to have values in 4 clusters: air+background, lung, soft tissue, and bone. Under this assumption, the mixture prior was a mixture probability density function consisting of one exponential and three Gaussian distributions. The relative proportion of each cluster was jointly estimated during each voxel update of the direct iterative coordinate descent (dICD) method. Noise-free data were generated from the NCAT phantom and Poisson noise was added. Reconstruction with FBP (ramp filter) was performed on the noise-free (ground truth) and noisy data. For the noisy data, dICD reconstruction was performed with combinations of different prior strength parameters (β and γ) for the Q- and M-penalties. The combined quadratic and mixture penalties reduced the RMSE by 18.7% compared to post-smoothed iterative reconstruction and by only 0.7% compared to the quadratic penalty alone. For direct PET attenuation map reconstruction from ultra-low dose CT acquisitions, the combination of quadratic and mixture priors offers regularization of both variance and bias and is a potential method to derive attenuation maps with negligible patient dose. However, the small improvement in quantitative accuracy relative to the substantial increase in algorithm complexity does not currently justify the use of mixture-based PET attenuation priors for reconstruction of CT
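
    The mixture (M) prior can be written down directly: one exponential component for air/background plus three Gaussians for lung, soft tissue and bone, with the negative log of the mixture serving as the penalty. The cluster parameters below are illustrative placeholders (rough 511 keV linear attenuation values), not the paper's fitted numbers.

        import numpy as np

        def mixture_penalty(mu, w, rate=50.0,
                            means=(0.03, 0.096, 0.13), sds=(0.01, 0.005, 0.02)):
            """-log p(mu) for the 4-cluster mixture prior; mu in 1/cm.
            w: mixing proportions (air, lung, soft tissue, bone), which the
            dICD algorithm re-estimates jointly with the image."""
            pdf = w[0] * rate * np.exp(-rate * np.clip(mu, 0.0, None))
            for wk, m, s in zip(w[1:], means, sds):
                pdf += wk * np.exp(-0.5 * ((mu - m) / s) ** 2) / (np.sqrt(2 * np.pi) * s)
            return -np.log(pdf + 1e-12)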

  1. The 3D tomographic image reconstruction software for prompt-gamma measurement of the boron neutron capture therapy

    International Nuclear Information System (INIS)

    Morozov, Boris; Auterinen, Iiro; Kotiluoto, Petri; Kortesniemi, Mika

    2006-01-01

    A tomographic imaging system based on the spatial distribution measurement of the neutron capture reaction during Boron Neutron Capture Therapy (BNCT) would be very useful for clinical purposes. Using gamma-detectors in a 2D panel, boron neutron capture and hydrogen neutron capture gamma-rays emitted by the neutron irradiated region can be detected, and an image of the neutron capture events can be reconstructed. A 3D reconstruction software package has been written to support the development of a 3D prompt-gamma tomographic system. The package consists of three independent modules: phantom generation, reconstruction and evaluation modules. The reconstruction modules are based on the algebraic reconstruction technique (ART), an iterative reconstruction algorithm, and on the maximum likelihood expectation maximization method (ML-EM). In addition, two variants of ART, the simultaneous iterative reconstruction technique (SIRT) and the component averaging algorithm (CAV), have been included in the package, employing parallel codes for multiprocessor architectures. All implemented algorithms use two different field functions for the reconstruction of the region. One is the traditional voxel function; the other is the so-called blob function, a smooth, spherically symmetric generalized Kaiser-Bessel function. The generation module provides the phantom and projections with background by tracing the prompt gamma-rays for a given scanner geometry. The evaluation module makes statistical comparisons between the generated and reconstructed images, and provides figure-of-merit (FOM) values for the applied reconstruction algorithms. The package has been written in the C language and tested under Linux and Windows platforms. A simple graphical user interface (GUI) is used for command execution and visualization purposes. (author)
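
    Of the algorithms listed, ML-EM has a particularly compact update: the current image is multiplied by the back-projected ratio of measured to forward-projected counts, normalized by the detector sensitivity. A generic dense-matrix sketch; real packages such as the one described use on-the-fly projectors and, as noted in the record, voxel or blob basis functions.

        import numpy as np

        def mlem(A, y, n_iter=50):
            """ML-EM for y ~ Poisson(A x); A: (n_rays, n_voxels) system matrix."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)                   # sensitivity image (column sums)
            for _ in range(n_iter):
                proj = A @ x                       # forward projection
                ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x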

  2. LGM permafrost distribution: how well can the latest PMIP multi-model ensembles reconstruct?

    OpenAIRE

    K. Saito; T. Sueyoshi; S. Marchenko; V. Romanovsky; B. Otto-Bliesner; J. Walsh; N. Bigelow; A. Hendricks; K. Yoshikawa

    2013-01-01

    Global-scale frozen ground distribution during the Last Glacial Maximum (LGM) was reconstructed using multi-model ensembles of global climate models, and then compared with evidence-based knowledge and earlier numerical results. Modeled soil temperatures, taken from Paleoclimate Modelling Intercomparison Project Phase III (PMIP3) simulations, were used to diagnose the subsurface thermal regime and determine underlying frozen ground types for the present-day (pre-industrial; 0 k) and the LGM (...

  3. Manufacture of reconstruction-bricks in Mexico

    Science.gov (United States)

    Rojas-Valencia, Ma. Neftalí; Penagos, Armando Aguilar; Rojas, Denise Y. Fernández; López, Alberto López; Gálves, David Morillón

    2017-12-01

    In Mexico, around 33,600 tons of construction waste are generated every day, Mexico City contributing around tons/day, with fewer than 1,000 tons/day being sent to be recycled. For that reason, the purpose of this study was to manufacture sustainable bricks based on three types of waste generated in the building industry: wood cutting residues, wastes from the excavation process (from Coapa and Cuautlancingo, Puebla, Mexico) and recycled aggregates. Water was added as kneading material, and Opuntia ficus-indica (mucilage) was supplemented as a natural additive to improve the workability of the mixtures. The conventional firing process was substituted by drying in a solar drying chamber. Nine mixtures were prepared using 62% excavation wastes, 4% wood cutting residues and 11%, 17% or 34% recycled aggregates. These mixtures were classified into two groups depending on their granulometry: the first one, denominated cementitious recycled aggregates, having granulometry from 25.4 mm and 9.52 mm down to 6.35 mm and fines, and the second group, denominated all-in-one recycled aggregates, having granulometry from 6.35 mm to fines. The quality of the sustainable bricks was evaluated according to compressive strength and water absorption parameters. The results for the nine mixtures showed that the reconstruction-bricks manufactured with mixture seven, consisting of 9.52 mm and 6.35 mm construction residues (all-in-one) and fines, presented the highest strength values and the lowest maximum initial absorption (4 g/min), in comparison with the norm NMX-C-037-ONNCCE-2013, which establishes that the maximum limit for walls exposed to the outside is 5 g/min. Using a solar desiccator made from construction residues, the bricks were dried in 11 days; the maximum temperature was 76 °C and the maximum solar radiation captured was 733.4 W/m².

  4. Evaluating the relationship between evolutionary divergence and phylogenetic accuracy in AFLP data sets.

    Science.gov (United States)

    García-Pereira, María Jesús; Caballero, Armando; Quesada, Humberto

    2010-05-01

    Using in silico amplified fragment length polymorphism (AFLP) fingerprints, we explore the relationship between sequence similarity and phylogeny accuracy to test when, in terms of genetic divergence, the quality of AFLP data becomes too low to be informative for a reliable phylogenetic reconstruction. We generated DNA sequences with known phylogenies using balanced and unbalanced trees with recent, uniform and ancient radiations, and average branch lengths (from the most internal node to the tip) ranging from 0.02 to 0.4 substitutions per site. The resulting sequences were used to emulate the AFLP procedure. Trees were estimated by maximum parsimony (MP), neighbor-joining (NJ), and minimum evolution (ME) methods from both DNA sequences and virtual AFLP fingerprints. The estimated trees were compared with the reference trees using a score that measures overall differences in both topology and relative branch length. As expected, the accuracy of AFLP-based phylogenies decreased dramatically in the more divergent data sets. Above a divergence of approximately 0.05, AFLP-based phylogenies were largely inaccurate irrespective of the distinct topology, radiation model, or phylogenetic method used. This value represents an upper bound of expected tree accuracy for data sets with a simple divergence history; AFLP data sets with a similar divergence but with unbalanced topologies and short ancestral branches produced much less accurate trees. The lack of homology of AFLP bands quickly increases with divergence and reaches its maximum value (100%) at a divergence of only 0.4. Low guanine-cytosine (GC) contents increase the number of nonhomologous bands in AFLP data sets and lead to less reliable trees. However, the effect of the lack of band homology on tree accuracy is surprisingly small relative to the negative impact due to the low information content of AFLP characters. Tree-building methods based on genetic distance displayed similar trends and outperformed parsimony

  5. Image Reconstruction. Chapter 13

    Energy Technology Data Exchange (ETDEWEB)

    Nuyts, J. [Department of Nuclear Medicine and Medical Imaging Research Center, Katholieke Universiteit Leuven, Leuven (Belgium); Matej, S. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, PA (United States)

    2014-12-15

    This chapter discusses how 2D or 3D images of tracer distribution can be reconstructed from a series of so-called projection images acquired with a gamma camera or a positron emission tomography (PET) system [13.1]. This is often called an 'inverse problem': the reconstruction is the inverse of the acquisition. It is called an inverse problem because making software to compute the true tracer distribution from the acquired data turns out to be more difficult than the 'forward' direction, i.e. making software to simulate the acquisition. There are basically two approaches to image reconstruction: analytical reconstruction and iterative reconstruction. The analytical approach is based on mathematical inversion, yielding efficient, non-iterative reconstruction algorithms. In the iterative approach, the reconstruction problem is reduced to computing a finite number of image values from a finite number of measurements. That simplification enables the use of iterative numerical inversion instead of analytical inversion. Iterative inversion tends to require more computing power, but it can cope with more complex (and hopefully more accurate) models of the acquisition process.
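
    As an illustration of the iterative approach sketched above, here is a minimal ML-EM update for a Poisson measurement model. The dense system matrix A and measured counts y are toy stand-ins, not a real scanner model; a clinical implementation would use on-the-fly projectors.

    ```python
    import numpy as np

    def mlem(A, y, n_iters=20):
        """ML-EM for y ~ Poisson(A @ x): multiplicative fixed-point update."""
        x = np.ones(A.shape[1])                   # strictly positive start
        sens = A.sum(axis=0)                      # sensitivity image: A^T 1
        for _ in range(n_iters):
            proj = A @ x                          # forward projection
            ratio = np.where(proj > 0, y / proj, 0.0)
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # backproject, normalize
        return x
    ```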

  6. Reconstruction of Attosecond Pulse Trains Using an Adiabatic Phase Expansion

    International Nuclear Information System (INIS)

    Varju, K.; Gustafsson, E.; Johnsson, P.; Mauritsson, J.; L'Huillier, A.; Mairesse, Y.; Agostini, P.; Breger, P.; Carre, B.; Merdji, H.; Monchicourt, P.; Salieres, P.; Frasinski, L.J.

    2005-01-01

    We propose a new method to reconstruct the electric field of attosecond pulse trains. The phase of the high-order harmonic emission electric field is Taylor expanded around the maximum of the laser pulse envelope in the time domain and around the central harmonic in the frequency domain. Experimental measurements allow us to determine the coefficients of this expansion and to characterize the radiation with attosecond accuracy over a femtosecond time scale. The method gives access to pulse-to-pulse variations along the train, including the timing, the chirp, and the attosecond carrier envelope phase.
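
    Schematically, the two truncated Taylor expansions described above take the following form; the notation is illustrative (t_0 the envelope maximum, \Omega_0 the central harmonic frequency), not the authors' exact formulation.

    ```latex
    % Harmonic phase expanded in time around the laser-pulse envelope maximum t_0,
    % and the spectral phase expanded around the central harmonic frequency \Omega_0:
    \phi_q(t) \approx \phi_q(t_0)
        + \left.\frac{\partial \phi_q}{\partial t}\right|_{t_0} (t - t_0)
        + \frac{1}{2}\left.\frac{\partial^2 \phi_q}{\partial t^2}\right|_{t_0} (t - t_0)^2,
    \qquad
    \phi(\Omega) \approx \phi(\Omega_0)
        + \phi'(\Omega_0)\,(\Omega - \Omega_0)
        + \frac{1}{2}\,\phi''(\Omega_0)\,(\Omega - \Omega_0)^2 .
    ```

    The measured low-order coefficients (group delay and chirp terms) are what give access to the pulse-to-pulse variations mentioned in the record.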

  7. Clinical application of helical CT 3D reconstruction for the dental orthopaedics

    International Nuclear Information System (INIS)

    Han Benyi; Jiang Xiaolu; Li Hongru

    2005-01-01

    Objective: To evaluate the clinical application of the helical CT 3D reconstruction technique in dental orthopaedics. Methods: Helical CT was performed with 3.0 mm slice thickness and a pitch of 1.0 in 41 patients undergoing dental orthopaedics. The 3D reconstructions, including maximum intensity projection (MIP), surface shaded display (SSD), and multiplanar reconstruction (MPR), were made for all cases. Results: Thirty-seven of the 41 patients showed malalignment, tilt, rotation or overlap of the teeth, and differing spacing between the longitudinal axes of the teeth. Twenty-five of these cases showed a total of 36 buried teeth. The axial images contained all the information. SSD demonstrated the external contours and overall morphology of the teeth and the mandible, together with the relationship between the tooth alignment and the mandible. MIP clearly showed the full view and the longitudinal alignment of the teeth. Among the 36 buried teeth, 29 presented palatally and 7 labially, and they were morphologically delineated on MIP from various angles. Conclusion: Helical CT 3D reconstruction is a new technique for displaying the stereoscopic configuration of the teeth. The combination of axial images with MIP, SSD, and MPR provides valuable anatomic and diagnostic information that helps surgeons plan and determine the treatment protocol for dental orthopaedics. (authors)
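
    As a brief illustration of the MIP rendering used above: a maximum intensity projection keeps, for each ray, the largest voxel value along the viewing axis. A minimal sketch, with a random volume standing in for a helical CT data set:

    ```python
    import numpy as np

    def mip(volume, axis=0):
        """Maximum intensity projection of a volume along one axis."""
        return volume.max(axis=axis)

    # Hypothetical (slices, rows, cols) volume; project along the row axis
    # to obtain one panoramic-style 2D view.
    vol = np.random.rand(40, 128, 128)
    view = mip(vol, axis=1)          # shape (40, 128)
    ```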

  8. Coordinate reconstruction using box reconstruction and projection of X-ray photo

    International Nuclear Information System (INIS)

    Achmad Suntoro

    2011-01-01

    Mathematical formulas have been derived for a reconstruction process that determines the coordinates of any point relative to a preset coordinate system. The process uses a reconstruction box in which the length of each edge is known, the top-bottom and left-right faces of the box carry cross markers, and the top and right faces of the box serve as projection planes for an X-ray source in a perspective projection system. Using the data from the two X-ray projection images, any point inside the reconstruction box, as long as its projection is recorded in both photos, can have its coordinates determined relative to the midpoint of the reconstruction box, taken as the origin of the coordinate system. (author)
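
    The record derives its own closed-form formulas from the known box geometry; as a generic illustration of the underlying idea, recovering a 3D point from two perspective projections, here is a standard least-squares (DLT) triangulation sketch. The 3x4 projection matrices and image points are hypothetical inputs, not the author's formulas.

    ```python
    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Least-squares 3D point from two perspective views (DLT method).
        P1, P2: 3x4 projection matrices; uv1, uv2: (u, v) image coordinates."""
        (u1, v1), (u2, v2) = uv1, uv2
        A = np.vstack([
            u1 * P1[2] - P1[0],      # each observed coordinate gives one
            v1 * P1[2] - P1[1],      # linear constraint on the homogeneous
            u2 * P2[2] - P2[0],      # point X = (x, y, z, w)
            v2 * P2[2] - P2[1],
        ])
        _, _, vh = np.linalg.svd(A)  # null vector = least-squares solution
        X = vh[-1]
        return X[:3] / X[3]          # dehomogenize to (x, y, z)
    ```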

  9. Maximum permissible dose

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This chapter presents a historical overview of the establishment of radiation guidelines by various national and international agencies. The use of maximum permissible dose and maximum permissible body burden limits to derive working standards is discussed.

  10. Towards improving searches for optimal phylogenies.

    Science.gov (United States)

    Ford, Eric; St John, Katherine; Wheeler, Ward C

    2015-01-01

    Finding the optimal evolutionary history for a set of taxa is a challenging computational problem, even when restricting possible solutions to be "tree-like" and focusing on the maximum-parsimony optimality criterion. This has led to much work on using heuristic tree searches to find approximate solutions. We present an approach for finding exact optimal solutions that employs and complements the current heuristic methods for finding optimal trees. Given a set of taxa and a set of aligned sequences of characters, there may be subsets of characters that are compatible, and for each such subset there is an associated (possibly partially resolved) phylogeny with edges corresponding to each character-state change. These perfect phylogenies serve as anchor trees for our constrained search space. We show that, for sequences with compatible sites, the parsimony score of any tree T is at least the parsimony score of the anchor trees plus the number of inferred changes between T and the anchor trees. As the maximum-parsimony optimality score is additive, the sum of the lower bounds over compatible character partitions provides a lower bound on the complete alignment of characters. This yields a region in the space of trees within which the best tree is guaranteed to be found; limiting the search for the optimal tree to this region can significantly reduce the number of trees that must be examined in a search of the space of trees. We analyze this method empirically using four different biological data sets as well as surveying 400 data sets from the TreeBASE repository, demonstrating the effectiveness of our technique in reducing the number of steps in exact heuristic searches for trees under the maximum-parsimony optimality criterion.
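
    For readers unfamiliar with how the parsimony score itself is computed, Fitch's small-parsimony algorithm counts the minimum number of state changes needed for one character on a fixed rooted binary tree; summing over characters gives the tree's score. A minimal sketch with a hypothetical dict-based tree encoding (not the authors' code):

    ```python
    def fitch_score(tree, states, root="root"):
        """Fitch parsimony score of one character on a rooted binary tree.
        tree: dict mapping internal node -> (left_child, right_child);
        states: dict mapping leaf name -> observed character state."""
        changes = 0

        def state_set(node):
            nonlocal changes
            if node not in tree:            # leaf: its observed state
                return {states[node]}
            left, right = tree[node]
            s1, s2 = state_set(left), state_set(right)
            if s1 & s2:
                return s1 & s2              # intersection: no change needed
            changes += 1                    # disjoint sets: one change
            return s1 | s2

        state_set(root)
        return changes

    # Tiny example, tree ((A,B),(C,D)): one change suffices for C's state.
    tree = {"root": ("n1", "n2"), "n1": ("A", "B"), "n2": ("C", "D")}
    assert fitch_score(tree, {"A": "0", "B": "0", "C": "1", "D": "0"}) == 1
    ```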

  11. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    The tomographic reconstruction problem is concerned with creating a model of the interior of an object from some measured data, typically projections of the object. After reconstructing an object it is often desired to segment it, either automatically or manually. (CT ...) such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one-stage segmentation-driven reconstruction method for the phase contrast tomography reconstruction (...)

  12. Using the T-scan III system to analyze occlusal function in mandibular reconstruction patients: A pilot study

    Directory of Open Access Journals (Sweden)

    Chao-Wei Liu

    2015-02-01

    Background: This study was designed to analyze the post-rehabilitation occlusal function of subjects treated with complex mandibular resection and subsequently rehabilitated with fibula osteoseptocutaneous flaps, dental implants, and fixed prostheses, utilizing the T-scan system. Methods: Ten mandibular complex resection cases that adopted fibula osteoseptocutaneous flaps, dental implants, and fixed prostheses to reconstruct occlusal function were analyzed. The mandibular reconstructions were divided into three groups based on size: full mandibular reconstructions, reconstructions larger than half of the arch, and reconstructions smaller than half of the arch. The T-scan III system was used to measure maximum occlusal force, occlusal time, anterior-posterior and left-right occlusal force asymmetries, and anterior-posterior and left-right asymmetry of the occlusal center location. Results: Subjects with larger mandibular reconstructions and implant-supported fixed partial dentures demonstrated decreased average occlusal force; however, the difference did not reach statistical significance (p > 0.05). The most significant asymmetry of the occlusal center location occurred among subjects with reconstructed areas larger than half of the mandibular arch. Conclusions: Comparison of the T-scan parameters used to analyze occlusal function showed that occlusal force was not an objective reference. Measurements of the occlusal center location appeared more repeatable and were less affected by extraneous factors. The results of this study showed that the size of a reconstruction did not affect post-reconstruction occlusal force: larger reconstructed areas did not decrease the average occlusal force. The most significant parameter was the left-right asymmetry of the occlusion center (LROC), measured in subjects with reconstruction areas larger than half

  13. Cladistic analysis of Bantu languages: a new tree based on combined lexical and grammatical data

    Science.gov (United States)

    Rexová, Kateřina; Bastin, Yvonne; Frynta, Daniel

    2006-04-01

    The phylogeny of the Bantu languages is reconstructed by applying cladistic methodology to combined lexical and grammatical data (87 languages, 144 characters). A maximum parsimony tree and a Bayesian analysis supported some previously recognized clades, e.g., that of the eastern and southern Bantu languages. Moreover, the results revealed that the Bantu languages south and east of the equatorial forest are probably monophyletic. This suggests an unorthodox scenario of Bantu expansion in which, after an initial radiation in the homeland and neighboring territories, there was just a single passage through the rainforest areas, followed by subsequent divergence into the major clades. The likely localization of this divergence is the area west of the Great Lakes. This conforms to the view that demographic expansion and dispersal throughout the dry-forest and savanna regions of subequatorial Africa were associated with the acquisition of new technologies (iron metallurgy and grain cultivation).

  14. Metal-induced streak artifact reduction using iterative reconstruction algorithms in x-ray computed tomography image of the dentoalveolar region.

    Science.gov (United States)

    Dong, Jian; Hayakawa, Yoshihiko; Kannenberg, Sven; Kober, Cornelia

    2013-02-01

    The objective of this study was to reduce metal-induced streak artifacts in oral and maxillofacial x-ray computed tomography (CT) images by developing a fast statistical image reconstruction system using iterative reconstruction algorithms. Adjacent CT images often depict similar anatomical structures in thin slices. So, first, images were reconstructed using the same projection data as an artifact-free image. Second, images were processed by a successive iterative restoration method in which projection data were generated from the reconstructed image in sequence. Besides the maximum likelihood expectation maximization (ML-EM) algorithm, the ordered subset expectation maximization (OS-EM) algorithm was examined. Small region-of-interest (ROI) settings and reverse processing were also applied to improve performance. Both algorithms reduced artifacts at the cost of slightly decreased gray levels. OS-EM and the small ROI reduced the processing duration without apparent detriment. Sequential and reverse processing showed no apparent effects. Two alternative iterative reconstruction methods were effective for artifact reduction; the OS-EM algorithm and the small ROI setting improved performance.
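
    The OS-EM acceleration examined above restricts each ML-EM multiplicative update to one ordered subset of the projection rows, cycling through the subsets within a sweep. A minimal sketch with a toy dense system matrix (hypothetical names, not the authors' CT system):

    ```python
    import numpy as np

    def osem(A, y, n_subsets=4, n_iters=5):
        """Ordered-subsets EM: ML-EM updates on interleaved projection subsets."""
        x = np.ones(A.shape[1])
        subsets = [np.arange(s, A.shape[0], n_subsets) for s in range(n_subsets)]
        for _ in range(n_iters):
            for rows in subsets:              # one EM update per subset
                As, ys = A[rows], y[rows]
                proj = As @ x
                ratio = np.where(proj > 0, ys / proj, 0.0)
                sens = As.sum(axis=0)         # subset sensitivity: As^T 1
                x *= (As.T @ ratio) / np.maximum(sens, 1e-12)
        return x
    ```

    Each sweep then applies n_subsets updates instead of one, which is the usual source of the reduced processing duration reported in the record.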

  15. 4D-PET reconstruction using a spline-residue model with spatial and temporal roughness penalties

    Science.gov (United States)

    Ralli, George P.; Chappell, Michael A.; McGowan, Daniel R.; Sharma, Ricky A.; Higgins, Geoff S.; Fenwick, John D.

    2018-05-01

    4D reconstruction of dynamic positron emission tomography (dPET) data can improve the signal-to-noise ratio in reconstructed image sequences by fitting smooth temporal functions to the voxel time-activity-curves (TACs) during the reconstruction, though the optimal choice of function remains an open question. We propose a spline-residue model, which describes TACs as weighted sums of convolutions of the arterial input function with cubic B-spline basis functions. Convolution with the input function constrains the spline-residue model at early time-points, potentially enhancing noise suppression in early time-frames, while still allowing a wide range of TAC descriptions over the entire imaged time-course, thus limiting bias. Spline-residue based 4D-reconstruction is compared to that of a conventional (non-4D) maximum a posteriori (MAP) algorithm, and to 4D-reconstructions based on adaptive-knot cubic B-splines, the spectral model and an irreversible two-tissue compartment ('2C3K') model. 4D reconstructions were carried out using a nested-MAP algorithm including spatial and temporal roughness penalties. The algorithms were tested using Monte-Carlo simulated scanner data, generated for a digital thoracic phantom with uptake kinetics based on a dynamic [18F]-fluoromisonidazole scan of a non-small cell lung cancer patient. For every algorithm, parametric maps were calculated by fitting each voxel TAC within a sub-region of the reconstructed images with the 2C3K model. Compared to conventional MAP reconstruction, spline-residue-based 4D reconstruction achieved >50% improvements in the bias and noise measures for five of the eight combinations of the four kinetic parameters for which parametric maps were created, and produced better results for 5/8 combinations than any of the other reconstruction algorithms studied, while spectral model-based 4D reconstruction produced the best results for 2/8. 2C3K model-based 4D reconstruction generated
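
    The spline-residue basis described above can be sketched directly: each basis TAC is the arterial input function convolved with one cubic B-spline basis element, and a voxel TAC is modelled as a weighted sum of these basis TACs. The helper below is a hypothetical illustration on a uniform time grid; it omits the reconstruction itself and the spatial/temporal roughness penalties.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    def spline_residue_basis(t, input_fn, knots, degree=3):
        """Columns = input function convolved with each B-spline basis element."""
        dt = t[1] - t[0]                      # uniform time grid assumed
        cols = []
        for j in range(len(knots) - degree - 1):
            bj = BSpline.basis_element(knots[j:j + degree + 2],
                                       extrapolate=False)(t)
            bj = np.nan_to_num(bj)            # zero outside the element's support
            cols.append(np.convolve(input_fn, bj)[: len(t)] * dt)
        return np.column_stack(cols)

    # A voxel TAC is then approximated as B @ w, e.g. by least squares:
    # B = spline_residue_basis(t, aif, knots)
    # w, *_ = np.linalg.lstsq(B, tac, rcond=None)
    ```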

  16. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    Science.gov (United States)

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters introduced by ASIR relative to filtered back projection (FBP) may influence the quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison with FBP. In 352 patients, CT images were reconstructed from the same raw data using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and ASIR 100%. Image noise, plaque density, Agatston scores, and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared with FBP; ASIR reduced the Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison with FBP, adaptive statistical iterative reconstruction (ASIR) may therefore significantly decrease Agatston scores and calcium volumes.
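
    For reference, the Agatston score discussed above weights each calcified lesion's area by a density factor derived from its peak attenuation, with a 130 HU detection threshold. A per-slice sketch with hypothetical inputs (the clinical score sums the per-lesion contributions over all slices):

    ```python
    import numpy as np

    def agatston_weight(peak_hu):
        """Standard density weighting from the lesion's peak attenuation."""
        if peak_hu >= 400: return 4
        if peak_hu >= 300: return 3
        if peak_hu >= 200: return 2
        if peak_hu >= 130: return 1
        return 0

    def agatston_slice_score(lesion_masks, hu_slice, pixel_area_mm2):
        """Sum of area * weight over calcified lesions in one slice."""
        score = 0.0
        for mask in lesion_masks:             # one boolean mask per lesion
            area = mask.sum() * pixel_area_mm2
            if area < 1.0:                    # ignore sub-millimetre specks
                continue
            score += area * agatston_weight(hu_slice[mask].max())
        return score
    ```

    Because the weight jumps at fixed HU thresholds, a reconstruction that lowers peak CT values, as ASIR does here, can push lesions into a lower weight band and thereby reduce the score.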

  17. How cold was Europe at the Last Glacial Maximum? A synthesis of the progress achieved since the first PMIP model-data comparison

    Directory of Open Access Journals (Sweden)

    G. Ramstein

    2007-06-01

    The Last Glacial Maximum has been one of the first foci of the Paleoclimate Modelling Intercomparison Project (PMIP). During its first phase, the results of 17 atmosphere general circulation models were compared to paleoclimate reconstructions. One of the largest discrepancies in the simulations was the systematic underestimation, by at least 10°C, of the winter cooling over Europe and the Mediterranean region observed in the pollen-based reconstructions. In this paper, we investigate the progress achieved to reduce this inconsistency through a large modelling effort and improved temperature reconstructions. We show that increased model spatial resolution does not significantly increase the simulated LGM winter cooling. Further, neither the inclusion of a vegetation cover compatible with the LGM climate, nor the interactions with the oceans simulated by the atmosphere-ocean general circulation models run in the second phase of PMIP result in a better agreement between models and data. Accounting for changes in interannual variability in the interpretation of the pollen data does not result in a reduction of the reconstructed cooling. The largest recent improvement in the model-data comparison has instead arisen from a new climate reconstruction based on inverse vegetation modelling, which explicitly accounts for the CO2 decrease at LGM and which substantially reduces the LGM winter cooling reconstructed from pollen assemblages. As a result, the simulated and observed LGM winter cooling over Western Europe and the Mediterranean area are now in much better agreement.

  18. A hybrid reconstruction algorithm for fast and accurate 4D cone-beam CT imaging.

    Science.gov (United States)

    Yan, Hao; Zhen, Xin; Folkerts, Michael; Li, Yongbao; Pan, Tinsu; Cervino, Laura; Jiang, Steve B; Jia, Xun

    2014-07-01

    4D cone beam CT (4D-CBCT) has been utilized in radiation therapy to provide 4D image guidance in the lung and upper abdomen areas. However, clinical application of 4D-CBCT is currently limited due to the long scan time and low image quality. The purpose of this paper is to develop a new 4D-CBCT reconstruction method that restores volumetric images based on the 1-min scan data acquired with a standard 3D-CBCT protocol. The model optimizes a deformation vector field that deforms a patient-specific planning CT (p-CT), so that the calculated 4D-CBCT projections match measurements. A forward-backward splitting (FBS) method is developed to solve the optimization problem. It splits the original problem into two well-studied subproblems, i.e., image reconstruction and deformable image registration. By iteratively solving the two subproblems, FBS gradually yields correct deformation information, while maintaining high image quality. The whole workflow is implemented on a graphics processing unit (GPU) to improve efficiency. Comprehensive evaluations have been conducted on a moving phantom and three real patient cases regarding the accuracy and quality of the reconstructed images, as well as the algorithm robustness and efficiency. The proposed algorithm reconstructs 4D-CBCT images from highly under-sampled projection data acquired with 1-min scans. Regarding the anatomical structure location accuracy, an average difference of 0.204 mm and a maximum difference of 0.484 mm are found for the phantom case, and maximum differences of 0.3-0.5 mm for patients 1-3 are observed. As for the image quality, intensity errors below 5 and 20 HU compared to the planning CT are achieved for the phantom and the patient cases, respectively. Signal-to-noise-ratio values are improved by 12.74 and 5.12 times compared to results from the FDK algorithm using the 1-min data and 4-min data, respectively. The computation time of the algorithm on an NVIDIA GTX590 card is 1-1.5 min per phase. High-quality 4D-CBCT imaging based
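
    Forward-backward splitting is itself a generic scheme for minimizing a smooth term f plus a non-smooth term g by alternating a gradient step on f with a proximal step on g; the paper applies this split to the reconstruction and registration subproblems. The toy instance below (ISTA for a LASSO problem, all names hypothetical) shows the operator structure, not the 4D-CBCT solver:

    ```python
    import numpy as np

    def fbs(grad_f, prox_g, x0, step, n_iters=200):
        """Forward-backward splitting: x <- prox_g(x - step * grad_f(x))."""
        x = x0
        for _ in range(n_iters):
            x = prox_g(x - step * grad_f(x), step)
        return x

    # Toy split: f = 0.5 * ||Ax - b||^2 (smooth), g = lam * ||x||_1 (prox-friendly).
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.1
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant of grad f
    soft = lambda z, s: np.sign(z) * np.maximum(np.abs(z) - lam * s, 0.0)
    x = fbs(lambda x: A.T @ (A @ x - b), soft, np.zeros(10), step)
    ```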

  19. A hybrid reconstruction algorithm for fast and accurate 4D cone-beam CT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Hao; Folkerts, Michael; Jiang, Steve B., E-mail: xun.jia@utsouthwestern.edu, E-mail: steve.jiang@UTSouthwestern.edu; Jia, Xun, E-mail: xun.jia@utsouthwestern.edu, E-mail: steve.jiang@UTSouthwestern.edu [Department of Radiation Oncology, The University of Texas, Southwestern Medical Center, Dallas, Texas 75390 (United States); Zhen, Xin [Department of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515 (China); Li, Yongbao [Department of Radiation Oncology, The University of Texas, Southwestern Medical Center, Dallas, Texas 75390 and Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Pan, Tinsu [Department of Imaging Physics, The University of Texas, MD Anderson Cancer Center, Houston, Texas 77030 (United States); Cervino, Laura [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States)

    2014-07-15

    Purpose: 4D cone beam CT (4D-CBCT) has been utilized in radiation therapy to provide 4D image guidance in the lung and upper abdomen areas. However, clinical application of 4D-CBCT is currently limited due to the long scan time and low image quality. The purpose of this paper is to develop a new 4D-CBCT reconstruction method that restores volumetric images based on the 1-min scan data acquired with a standard 3D-CBCT protocol. Methods: The model optimizes a deformation vector field that deforms a patient-specific planning CT (p-CT), so that the calculated 4D-CBCT projections match measurements. A forward-backward splitting (FBS) method is developed to solve the optimization problem. It splits the original problem into two well-studied subproblems, i.e., image reconstruction and deformable image registration. By iteratively solving the two subproblems, FBS gradually yields correct deformation information, while maintaining high image quality. The whole workflow is implemented on a graphics processing unit (GPU) to improve efficiency. Comprehensive evaluations have been conducted on a moving phantom and three real patient cases regarding the accuracy and quality of the reconstructed images, as well as the algorithm robustness and efficiency. Results: The proposed algorithm reconstructs 4D-CBCT images from highly under-sampled projection data acquired with 1-min scans. Regarding the anatomical structure location accuracy, an average difference of 0.204 mm and a maximum difference of 0.484 mm are found for the phantom case, and maximum differences of 0.3–0.5 mm for patients 1–3 are observed. As for the image quality, intensity errors below 5 and 20 HU compared to the planning CT are achieved for the phantom and the patient cases, respectively. Signal-to-noise-ratio values are improved by 12.74 and 5.12 times compared to results from the FDK algorithm using the 1-min data and 4-min data, respectively. The computation time of the algorithm on an NVIDIA GTX590 card is 1–1.5 min per phase

  20. Influence of the Pixel Sizes of Reference Computed Tomography on Single-photon Emission Computed Tomography Image Reconstruction Using Conjugate-gradient Algorithm.

    Science.gov (United States)

    Okuda, Kyohei; Sakimoto, Shota; Fujii, Susumu; Ida, Tomonobu; Moriyama, Shigeru

    The use of a frame of reference based on the computed tomography (CT) coordinate system for single-photon emission computed tomography (SPECT) reconstruction is one of the advanced features of the xSPECT reconstruction system. The aim of this study was to reveal the influence of this high-resolution frame of reference on xSPECT reconstruction. A 99mTc line-source phantom and a National Electrical Manufacturers Association (NEMA) image-quality phantom were scanned using the SPECT/CT system. xSPECT reconstructions were performed with reference CT images of different display field-of-view (DFOV) and pixel sizes. The pixel sizes of the reconstructed xSPECT images were close to 2.4 mm, as in the originally acquired projection data, even when the reference CT resolution was varied. The full width at half maximum (FWHM) of the line source, the absolute recovery coefficient, and the background variability of the image-quality phantom were independent of the DFOV size of the reference CT images. The results of this study revealed that the image quality of reconstructed xSPECT images is not influenced by the resolution of the frame of reference used in SPECT reconstruction.
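
    The FWHM reported for the line source can be computed from a 1D profile by locating the half-maximum crossings with linear interpolation. A minimal sketch with a hypothetical profile array:

    ```python
    import numpy as np

    def fwhm_mm(profile, pixel_mm):
        """Full width at half maximum of a 1D line-source profile."""
        half = profile.max() / 2.0
        above = np.where(profile >= half)[0]
        left, right = above[0], above[-1]

        def cross(i, j):             # fractional crossing between samples i and j
            return (half - profile[i]) / (profile[j] - profile[i])

        lx = left - 1 + cross(left - 1, left) if left > 0 else float(left)
        rx = right + cross(right, right + 1) if right + 1 < len(profile) else float(right)
        return (rx - lx) * pixel_mm

    # Example: Gaussian profile with sigma = 2 px and 2.4 mm pixels;
    # expected FWHM ~ 2.355 * 2 * 2.4 mm.
    x = np.arange(-20, 21)
    print(fwhm_mm(np.exp(-x**2 / (2 * 2.0**2)), 2.4))
    ```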