WorldWideScience

Sample records for cophylogeny reconstruction problem

  1. Jane: a new tool for the cophylogeny reconstruction problem.

    Science.gov (United States)

    Conow, Chris; Fielder, Daniel; Ovadia, Yaniv; Libeskind-Hadas, Ran

    2010-02-03

    This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. The Jane software tool uses a polynomial-time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition, the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.
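Jane's full algorithm (timed trees, host-switch distance limits, a genetic algorithm over timings) is not reproduced here, but the flavor of event-based reconciliation can be sketched with the classical LCA mapping, which distinguishes cospeciations from duplications (losses and host switches, which Jane also scores, are omitted). The tree encoding and names below are hypothetical, not Jane's API:

```python
# Minimal sketch of event-based tree reconciliation using the classical
# LCA mapping (cospeciation vs. duplication only; Jane additionally
# scores losses and host switches).  A tree is a dict mapping a node
# name to a (left, right) tuple of children, or to None for a leaf.

def _root(tree):
    """Node that is nobody's child."""
    children = {c for ch in tree.values() if ch for c in ch}
    return next(n for n in tree if n not in children)

def lca_map(host, parasite, tip_map):
    """Map each parasite node to the lowest common ancestor (in the
    host tree) of the hosts of its leaf descendants."""
    parent, depth = {}, {}
    def walk(n, d, p):
        parent[n], depth[n] = p, d
        if host[n] is not None:
            for c in host[n]:
                walk(c, d + 1, n)
    walk(_root(host), 0, None)

    def lca(a, b):
        while depth[a] > depth[b]: a = parent[a]
        while depth[b] > depth[a]: b = parent[b]
        while a != b: a, b = parent[a], parent[b]
        return a

    mapping = {}
    def assign(n):
        if parasite[n] is None:
            mapping[n] = tip_map[n]
        else:
            l, r = parasite[n]
            assign(l); assign(r)
            mapping[n] = lca(mapping[l], mapping[r])
    assign(_root(parasite))
    return mapping

def classify(parasite, mapping):
    """An internal parasite node is a duplication if it maps to the
    same host node as one of its children, a cospeciation otherwise."""
    ev = {}
    for n, ch in parasite.items():
        if ch is not None:
            l, r = ch
            ev[n] = ('duplication' if mapping[n] in (mapping[l], mapping[r])
                     else 'cospeciation')
    return ev
```

For two perfectly congruent two-leaf trees, the parasite root maps to the host root and is classified as a cospeciation.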

  2. Jane: a new tool for the cophylogeny reconstruction problem

    Directory of Open Access Journals (Sweden)

    Ovadia Yaniv

    2010-02-01

    Background: This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. Results: The Jane software tool uses a polynomial-time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition, the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Conclusions: Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.

  3. Cophylogeny reconstruction via an approximate Bayesian computation.

    Science.gov (United States)

    Baudet, C; Donati, B; Sinaimeri, B; Crescenzi, P; Gautier, C; Matias, C; Sagot, M-F

    2015-05-01

    Despite an increasingly vast literature on cophylogenetic reconstructions for studying host-parasite associations, understanding the common evolutionary history of such systems remains a problem that is far from being solved. Most algorithms for host-parasite reconciliation use an event-based model, where the events include in general (a subset of) cospeciation, duplication, loss, and host switch. All known parsimonious event-based methods then assign a cost to each type of event in order to find a reconstruction of minimum cost. The main problem with this approach is that the cost of the events strongly influences the reconciliation obtained. Some earlier approaches attempt to avoid this problem by finding a Pareto set of solutions and hence by considering event costs under some minimization constraints. To deal with this problem, we developed an algorithm, called Coala, for estimating the frequency of the events based on an approximate Bayesian computation approach. The benefits of this method are 2-fold: (i) it provides more confidence in the set of costs to be used in a reconciliation, and (ii) it allows estimation of the frequency of the events in cases where the data set consists of trees with a large number of taxa. We evaluate our method on simulated and on biological data sets. We show that in both cases, for the same pair of host and parasite trees, different sets of frequencies for the events lead to equally probable solutions. Moreover, often these solutions differ greatly in terms of the number of inferred events. It appears crucial to take this into account before attempting any further biological interpretation of such reconciliations. More generally, we also show that the set of frequencies can vary widely depending on the input host and parasite trees. Indiscriminately applying a standard vector of costs may thus not be a good strategy. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
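Coala itself is not reproduced here, but the ABC rejection scheme it builds on is simple to sketch. In place of a cophylogeny simulator, the toy below estimates a binomial success probability; the simulator, prior, summary and tolerance are illustrative choices, not Coala's:

```python
import random

def abc_rejection(observed_summary, simulate, prior_sample,
                  distance, eps, n_accept):
    """Generic ABC rejection sampling (the scheme underlying methods
    like Coala): draw parameters from the prior, simulate data, and
    keep draws whose summary lies within eps of the observed one."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if distance(simulate(theta), observed_summary) <= eps:
            accepted.append(theta)
    return accepted

# Toy example: estimate the success probability behind a binomial count.
random.seed(0)
n_trials = 100
observed = 31  # pretend this count was observed

post = abc_rejection(
    observed_summary=observed,
    simulate=lambda p: sum(random.random() < p for _ in range(n_trials)),
    prior_sample=lambda: random.random(),       # Uniform(0, 1) prior
    distance=lambda a, b: abs(a - b),
    eps=2, n_accept=200)
estimate = sum(post) / len(post)
```

The accepted draws approximate the posterior; tightening `eps` improves the approximation at the cost of more simulations.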

  4. An analytical statistical approach to the 3D reconstruction problem

    Energy Technology Data Exchange (ETDEWEB)

    Cierniak, Robert [Czestochowa Univ. of Technology (Poland). Inst. of Computer Engineering

    2011-07-01

    The approach presented here concerns the reconstruction problem for 3D spiral X-ray tomography. The reconstruction problem is formulated taking into consideration the statistical properties of the signals obtained in X-ray CT. Additionally, the image processing in our approach follows an analytical methodology. This conception significantly improves the quality of the reconstructed images and decreases the complexity of the reconstruction problem in comparison with other approaches. Computer simulations showed that the reconstruction algorithm outlined here outperforms conventional analytical methods in image quality. (orig.)

  5. Honey bee-inspired algorithms for SNP haplotype reconstruction problem

    Science.gov (United States)

    PourkamaliAnaraki, Maryam; Sadeghi, Mehdi

    2016-03-01

    Reconstructing haplotypes from SNP fragments is an important problem in computational biology. There has been considerable interest in this field because haplotypes have been shown to contain promising data for disease association research. Haplotype reconstruction under the Minimum Error Correction model has been proved to be NP-hard. Therefore, several methods such as clustering techniques, evolutionary algorithms, neural networks and swarm intelligence approaches have been proposed in order to solve this problem in reasonable time. In this paper, we focus on various evolutionary clustering techniques and try to find an efficient technique for solving the haplotype reconstruction problem. Our experiments indicate that clustering methods relying on the behaviour of honey bee colonies in nature, specifically the bees algorithm and artificial bee colony methods, are expected to yield more efficient solutions. An application program implementing the methods is available at the following link: http://www.bioinf.cs.ipm.ir/software/haprs/
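The bee-inspired algorithms themselves are not shown here, but the clustering view of the Minimum Error Correction model is easy to sketch: partition fragments into two groups, one per haplotype, and count the corrections needed. The greedy alternation below is a hypothetical baseline, not the paper's method:

```python
def mismatches(frag, hap):
    """Mismatch count over covered positions ('-' marks no call)."""
    return sum(1 for f, h in zip(frag, hap) if f != '-' and f != h)

def greedy_mec(fragments, n_iter=10):
    """Greedy baseline for Minimum Error Correction haplotyping:
    alternate between assigning each fragment to the closer of two
    candidate haplotypes and recomputing majority consensus.  The MEC
    score is the total number of corrections the clustering implies."""
    m = len(fragments[0])
    h1 = fragments[0].replace('-', '0')
    h2 = ''.join('1' if c == '0' else '0' for c in h1)  # complement
    for _ in range(n_iter):
        groups = ([], [])
        for f in fragments:
            groups[mismatches(f, h1) > mismatches(f, h2)].append(f)
        def consensus(group, fallback):
            out = []
            for j in range(m):
                col = [f[j] for f in group if f[j] != '-']
                out.append(max(set(col), key=col.count) if col else fallback[j])
            return ''.join(out)
        h1, h2 = consensus(groups[0], h1), consensus(groups[1], h2)
    mec = sum(min(mismatches(f, h1), mismatches(f, h2)) for f in fragments)
    return h1, h2, mec
```

Swarm-based methods explore many such partitions instead of committing to one greedy trajectory, which is what buys them better MEC scores on noisy data.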

  6. Iterative Reconstruction Methods for Hybrid Inverse Problems in Impedance Tomography

    DEFF Research Database (Denmark)

    Hoffmann, Kristoffer; Knudsen, Kim

    2014-01-01

    For a general formulation of hybrid inverse problems in impedance tomography the Picard and Newton iterative schemes are adapted and four iterative reconstruction algorithms are developed. The general problem formulation includes several existing hybrid imaging modalities such as current density...... impedance imaging, magnetic resonance electrical impedance tomography, and ultrasound modulated electrical impedance tomography, and the unified approach to the reconstruction problem encompasses several algorithms suggested in the literature. The four proposed algorithms are implemented numerically in two...

  7. Time-dependent problems in quantum-mechanical state reconstruction

    International Nuclear Information System (INIS)

    Leonhardt, U.; Bardroff, P. J.

    1997-01-01

    We study the state reconstruction of wave packets that travel in time-dependent potentials. We solve the problem for explicitly time-dependent harmonic oscillators and sketch a general adaptive technique for finding the wave function that matches an observed evolution. (authors)

  8. Pathgroups, a dynamic data structure for genome reconstruction problems.

    Science.gov (United States)

    Zheng, Chunfang

    2010-07-01

    Ancestral gene order reconstruction problems, including the median problem, quartet construction, small phylogeny, guided genome halving and genome aliquoting, are NP-hard. Available heuristics dedicated to each of these problems are computationally costly even for small instances. We present a data structure enabling rapid heuristic solution to all these ancestral genome reconstruction problems. A generic greedy algorithm with look-ahead, based on an automatically generated priority system, suffices for all the problems using this data structure. The efficiency of the algorithm is due to fast updating of the structure at run time and to the simplicity of the priority scheme. We illustrate with the first rapid algorithm for quartet construction and apply this to a set of yeast genomes to corroborate a recent gene sequence-based phylogeny. Availability: http://albuquerque.bioinformatics.uottawa.ca/pathgroup/Quartet.html. Contact: chunfang313@gmail.com. Supplementary data are available at Bioinformatics online.

  9. Point source reconstruction principle of linear inverse problems

    International Nuclear Information System (INIS)

    Terazono, Yasushi; Matani, Ayumu; Fujimaki, Norio; Murata, Tsutomu

    2010-01-01

    Exact point source reconstruction for underdetermined linear inverse problems with a block-wise structure was studied. In a block-wise problem, elements of a source vector are partitioned into blocks. Accordingly, a leadfield matrix, which represents the forward observation process, is also partitioned into blocks. A point source is a source having only one nonzero block. An example of such a problem is current distribution estimation in electroencephalography and magnetoencephalography, where a source vector represents a vector field and a point source represents a single current dipole. In this study, the block-wise norm, a block-wise extension of the lp-norm, was defined as the family of cost functions of the inverse method. The main result is that a set of three conditions was found to be necessary and sufficient for block-wise norm minimization to ensure exact point source reconstruction for any leadfield matrix that admits such reconstruction. The block-wise norm that satisfies the conditions is the sum of the costs of all the observations of source blocks, or in other words, the block-wisely extended leadfield-weighted l1-norm. Additional results are that minimization of such a norm always provides block-wisely sparse solutions and that its solutions form cones in source space.

  10. The cophylogeny reconstruction problem is NP-complete.

    Science.gov (United States)

    Ovadia, Y; Fielder, D; Conow, C; Libeskind-Hadas, R

    2011-01-01

    The cophylogeny reconstruction problem is that of finding minimum cost explanations of differences between historical associations. The problem arises in parasitology, molecular systematics, and biogeography. Existing software tools for this problem either have worst-case exponential time or use heuristics that do not guarantee optimal solutions. To date, no polynomial time optimal algorithms have been found for this problem. In this article, we prove that the problem is NP-complete, suggesting that future research on algorithms for this problem should seek better polynomial-time approximation algorithms and heuristics rather than optimal solutions.

  11. Reconstructing the Hopfield network as an inverse Ising problem

    International Nuclear Information System (INIS)

    Huang Haiping

    2010-01-01

    We test four fast mean-field-type algorithms on Hopfield networks as an inverse Ising problem. The equilibrium behavior of Hopfield networks is simulated through Glauber dynamics. In the low-temperature regime, the simulated annealing technique is adopted. Although the performance of these network reconstruction algorithms on simulated networks of spiking neurons has been studied extensively in recent years, a corresponding analysis for Hopfield networks has been lacking. For the Hopfield network, we found that in the retrieval phase, favored when the network is required to recall one of the stored patterns, all the reconstruction algorithms fail to extract the interactions within the desired accuracy; the same failure occurs in the spin-glass phase, where spurious minima show up. In the paramagnetic phase, albeit unfavored for the retrieval dynamics, the algorithms work well to reconstruct the network itself. This implies that, as an inverse problem, the paramagnetic phase is conversely useful for reconstructing the network, while the retrieval phase loses almost all information about the interactions in the network, except in the case where only one pattern is stored. The performance of the algorithms is studied with respect to system size, memory load, and temperature; sample-to-sample fluctuations are also considered.
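As one concrete instance of the mean-field-type algorithms discussed, the naive mean-field (nMF) reconstruction reads the couplings off the inverse correlation matrix, J ≈ -(C⁻¹) off the diagonal. The sketch below uses an exactly sampled three-spin toy model rather than Glauber dynamics, and is an illustration of the nMF formula, not the paper's full protocol:

```python
import numpy as np

def nmf_reconstruct(samples):
    """Naive mean-field (nMF) inverse Ising reconstruction from +/-1
    spin samples: couplings from J ~ -(C^{-1}) off the diagonal, where
    C is the sample covariance, and fields from the nMF equations."""
    m = samples.mean(axis=0)
    C = np.cov(samples, rowvar=False)
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)          # self-couplings are not defined
    h = np.arctanh(m) - J @ m         # nMF external fields
    return J, h

# Toy check: exact sampling of a 3-spin chain (couplings on 0-1 and 1-2).
rng = np.random.default_rng(0)
J_true = np.array([[0.0, 0.3, 0.0], [0.3, 0.0, 0.3], [0.0, 0.3, 0.0]])
states = np.array([[s0, s1, s2] for s0 in (-1, 1)
                   for s1 in (-1, 1) for s2 in (-1, 1)], dtype=float)
p = np.exp(0.5 * np.einsum('ki,ij,kj->k', states, J_true, states))
p /= p.sum()
samples = states[rng.choice(len(states), size=20000, p=p)]
J_est, h_est = nmf_reconstruct(samples)
```

For this weakly coupled chain the coupled pair (0, 1) is recovered near its true value, while the uncoupled pair (0, 2) stays near zero; in the retrieval or spin-glass phases of a Hopfield network, the abstract reports, this recovery breaks down.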

  12. Mandible reconstruction: History, state of the art and persistent problems.

    Science.gov (United States)

    Ferreira, José J; Zagalo, Carlos M; Oliveira, Marta L; Correia, André M; Reis, Ana R

    2015-06-01

    Mandibular reconstruction has experienced a remarkable evolution. Several different approaches are used to reconstruct this bone, which has a fundamental role in the recovery of oral functions. This review aims to highlight the persistent problems associated with the approaches identified, whether bone grafts or prosthetic devices are used. A brief summary of the historical evolution of the surgical procedures is presented, as well as an insight into possible future pathways. A literature review was conducted from September to December 2012 using the PubMed database. The keyword used was "mandible reconstruction." Articles published in the last three years were included, as well as the relevant references from those articles and the "historical articles" they referred to. This research resulted in a monograph that this article aims to summarize. Titanium plates, bone grafts, pediculate flaps, free osteomyocutaneous flaps, rapid prototyping, and tissue engineering strategies are some of the identified possibilities. The classical approaches present considerable donor-site-related morbidity. Research that results in the development of new prosthetic devices is needed. A new prosthetic approach could minimize the identified problems and offer patients more predictable, affordable, and comfortable solutions. This review, while affirming the evolution and the good results achieved with current approaches, emphasizes the negative aspects that still subsist. Thus, it shows that mandible reconstruction is not a closed issue. On the contrary, it remains a research field where new findings could have a direct positive impact on patients' quality of life. The identification of the persistent problems reveals the characteristics to be considered in a new prosthetic device. This could overcome the current difficulties and result in more comfortable solutions. Medical teams have the responsibility to keep patients informed about the predictable

  13. Architectural and town-planning reconstruction problems of the city of Voronezh

    Science.gov (United States)

    Mikhaylova, Tatyana; Parshin, Dmitriy; Shoshinov, Vitaly; Trebukhin, Anatoliy

    2018-03-01

    An analysis of the state of the historically developed urban district of the city of Voronezh is presented. Ways of solving the identified architectural and town-planning problems of reconstructing the historically developed buildings are proposed. A concept for the reconstruction of a territory with historical buildings along Vaytsekhovsky Street is presented.

  14. An evolutionary algorithm for tomographic reconstructions in limited data sets problems

    International Nuclear Information System (INIS)

    Turcanu, Catrinel; Craciunescu, Teddy

    2000-01-01

    The paper proposes a new method for tomographic reconstructions. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints on the number of projections or limited angle views. The problem of image reconstruction from projections may be considered as that of finding an image (solution) whose projections match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by an evolutionary algorithm. Our algorithm has some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but coded as a matrix of pixels corresponding to a two-dimensional image. This kind of internal representation reflects the nature of the problem: slight differences between two points in the original problem space give rise to similar differences once they are coded. Another particular feature is a newly built crossover operator, the grid-based crossover, suitable for high-dimension two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performance of the method is evaluated in comparison with a traditional tomographic method, based on maximization of the entropy of the image, that has proved to work well with limited data sets. The test phantom is typical of an application with limited data sets: the determination of neutron energy spectra with time resolution in the case of short-pulsed neutron emission. Both qualitative judgement and quantitative assessment, based on some figures of merit, indicate that the proposed method ensures an improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise.
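The grid-based crossover can be sketched as follows; the cell-wise inheritance rule is our reading of the operator's description, not the authors' code:

```python
import numpy as np

def grid_crossover(parent_a, parent_b, grid=(4, 4), rng=None):
    """Grid-based crossover for two-dimensional image chromosomes: the
    image is cut by a coarse grid, and each cell of the child is
    inherited from one parent or the other with equal probability."""
    rng = rng if rng is not None else np.random.default_rng()
    child = parent_a.copy()
    rows = np.array_split(np.arange(parent_a.shape[0]), grid[0])
    cols = np.array_split(np.arange(parent_a.shape[1]), grid[1])
    for r in rows:
        for c in cols:
            if rng.random() < 0.5:
                child[np.ix_(r, c)] = parent_b[np.ix_(r, c)]
    return child
```

Exchanging whole rectangular patches, rather than scattered pixels, preserves the local coherence that makes the 2D chromosome encoding useful.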

  15. Improved iterative image reconstruction algorithm for the exterior problem of computed tomography

    International Nuclear Information System (INIS)

    Guo, Yumeng; Zeng, Li

    2017-01-01

    In industrial applications that are limited by the angle of a fan-beam and the length of a detector, the exterior problem of computed tomography (CT) uses only the projection data that correspond to the external annulus of the objects to reconstruct an image. Because the reconstructions are not affected by the projection data that correspond to the interior of the objects, the exterior problem is widely applied to detect cracks in the outer wall of large-sized objects, such as in-service pipelines. However, image reconstruction in the exterior problem is still a challenging problem due to truncated projection data and beam-hardening, both of which can lead to distortions and artifacts. Thus, developing an effective algorithm and adopting a scanning trajectory suited for the exterior problem may be valuable. In this study, an improved iterative algorithm that combines total variation minimization (TVM) with a region scalable fitting (RSF) model was developed for a unilateral off-centered scanning trajectory and can be utilized to inspect large-sized objects for defects. Experiments involving simulated phantoms and real projection data were conducted to validate the practicality of our algorithm. Furthermore, comparative experiments show that our algorithm outperforms others in suppressing the artifacts caused by truncated projection data and beam-hardening.
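The combined TVM-RSF algorithm is not reproduced here, but the total-variation-minimization ingredient can be illustrated with plain gradient descent on a smoothed TV objective (the RSF term and the CT projection model are omitted, and all parameter values below are illustrative):

```python
import numpy as np

def tv_smooth_denoise(img, lam=0.1, eps=1e-2, n_iter=400, step=0.05):
    """Gradient descent on a smoothed total-variation objective,
        0.5*||u - img||^2 + lam * sum_ij sqrt(|grad u|_ij^2 + eps),
    as a minimal stand-in for the TVM term in algorithms like the
    paper's (which additionally couples an RSF segmentation model)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        ux = np.diff(u, axis=1, append=u[:, -1:])   # forward differences
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / mag, uy / mag                 # dual (unit) field
        # the TV gradient is minus the divergence of (px, py)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * ((u - img) - lam * div)
    return u
```

TV regularization suppresses the streak-like artifacts that truncated projections produce while keeping sharp edges, which is why it suits the exterior problem.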

  16. Improved iterative image reconstruction algorithm for the exterior problem of computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yumeng [Chongqing University, College of Mathematics and Statistics, Chongqing 401331 (China); Chongqing University, ICT Research Center, Key Laboratory of Optoelectronic Technology and System of the Education Ministry of China, Chongqing 400044 (China); Zeng, Li, E-mail: drlizeng@cqu.edu.cn [Chongqing University, College of Mathematics and Statistics, Chongqing 401331 (China); Chongqing University, ICT Research Center, Key Laboratory of Optoelectronic Technology and System of the Education Ministry of China, Chongqing 400044 (China)

    2017-01-11

    In industrial applications that are limited by the angle of a fan-beam and the length of a detector, the exterior problem of computed tomography (CT) uses only the projection data that correspond to the external annulus of the objects to reconstruct an image. Because the reconstructions are not affected by the projection data that correspond to the interior of the objects, the exterior problem is widely applied to detect cracks in the outer wall of large-sized objects, such as in-service pipelines. However, image reconstruction in the exterior problem is still a challenging problem due to truncated projection data and beam-hardening, both of which can lead to distortions and artifacts. Thus, developing an effective algorithm and adopting a scanning trajectory suited for the exterior problem may be valuable. In this study, an improved iterative algorithm that combines total variation minimization (TVM) with a region scalable fitting (RSF) model was developed for a unilateral off-centered scanning trajectory and can be utilized to inspect large-sized objects for defects. Experiments involving simulated phantoms and real projection data were conducted to validate the practicality of our algorithm. Furthermore, comparative experiments show that our algorithm outperforms others in suppressing the artifacts caused by truncated projection data and beam-hardening.

  17. Reconstruction formula for a 3-d phaseless inverse scattering problem for the Schrodinger equation

    OpenAIRE

    Klibanov, Michael V.; Romanov, Vladimir G.

    2014-01-01

    The inverse scattering problem of the reconstruction of the unknown potential with compact support in the 3-d Schrödinger equation is considered. Only the modulus of the scattering complex-valued wave field is known, whereas the phase is unknown. It is shown that the unknown potential can be reconstructed via the inverse Radon transform. Therefore, a long-standing problem posed in 1977 by K. Chadan and P.C. Sabatier in their book "Inverse Problems in Quantum Scattering Theory" is solved.

  18. On a full Bayesian inference for force reconstruction problems

    Science.gov (United States)

    Aucejo, M.; De Smet, O.

    2018-05-01

    In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability in mathematically accounting for experimenter's prior knowledge. However, since only the Maximum a Posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To answer this legitimate question, this paper fully exploits the Bayesian framework to provide, from a Markov Chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
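The paper's structural model is not reproduced here, but the Markov Chain Monte Carlo machinery behind such credible intervals can be sketched with a random-walk Metropolis sampler on a toy one-parameter force amplitude (the likelihood, noise level and data are hypothetical):

```python
import numpy as np

def metropolis(log_post, x0, n_samples, step, rng):
    """Random-walk Metropolis sampler -- a minimal stand-in for the
    MCMC machinery used to obtain credible intervals and posterior
    summaries (mean, median, mode) in force reconstruction."""
    x, lp = x0, log_post(x0)
    out = []
    for _ in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        out.append(x)
    return np.array(out)

# Toy posterior: amplitude a with a Gaussian likelihood around
# hypothetical noisy measurements of a known response.
rng = np.random.default_rng(0)
data = 2.0 + 0.1 * rng.standard_normal(50)       # "measured" responses
log_post = lambda a: -0.5 * np.sum((data - a) ** 2) / 0.1 ** 2
chain = metropolis(log_post, x0=0.0, n_samples=5000, step=0.05, rng=rng)
lo, med, hi = np.percentile(chain[1000:], [2.5, 50, 97.5])  # 95% interval
```

Discarding the first part of the chain as burn-in, the remaining samples give the posterior median and a 95% credible interval directly, which is exactly the kind of uncertainty statement a Maximum a Posteriori point estimate cannot provide.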

  19. Sparse deconvolution for the large-scale ill-posed inverse problem of impact force reconstruction

    Science.gov (United States)

    Qiao, Baijie; Zhang, Xingwu; Gao, Jiawei; Liu, Ruonan; Chen, Xuefeng

    2017-01-01

    Most previous regularization methods for solving the inverse problem of force reconstruction minimize the l2-norm of the desired force. However, such traditional regularization methods, for example Tikhonov regularization and truncated singular value decomposition, commonly fail to solve the large-scale ill-posed inverse problem at moderate computational cost. In this paper, taking into account the sparse characteristic of impact forces, the idea of sparse deconvolution is first introduced to the field of impact force reconstruction and a general sparse deconvolution model of impact force is constructed. Second, a novel impact force reconstruction method based on the primal-dual interior point method (PDIPM) is proposed to solve such a large-scale sparse deconvolution model, where minimizing the l2-norm is replaced by minimizing the l1-norm. Meanwhile, the preconditioned conjugate gradient algorithm is used to compute the search direction of PDIPM with high computational efficiency. Finally, two experiments, covering small- to medium-scale single impact force reconstruction and relatively large-scale consecutive impact force reconstruction, are conducted on a composite wind turbine blade and a shell structure to illustrate the advantage of PDIPM. Compared with Tikhonov regularization, PDIPM is more efficient, accurate and robust, whether in single impact force reconstruction or in consecutive impact force reconstruction.
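PDIPM itself is involved; as a simpler stand-in, the l1 sparse-deconvolution model can be illustrated with ISTA, a proximal-gradient method, on a hypothetical exponentially decaying impulse response (not the paper's experimental setup):

```python
import numpy as np

def ista_deconv(A, b, lam, n_iter=2000):
    """ISTA (proximal gradient) for the sparse deconvolution model
        min_f 0.5*||A f - b||^2 + lam*||f||_1,
    a simple stand-in for the PDIPM solver proposed in the paper."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    f = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = f - A.T @ (A @ f - b) / L      # gradient step
        f = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return f

# Hypothetical impact-force toy: two impulses filtered by a decaying
# impulse response, then recovered from the filtered signal.
n = 60
h = np.exp(-np.arange(n) / 5.0)            # impulse response
A = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])          # Toeplitz convolution matrix
f_true = np.zeros(n)
f_true[10], f_true[35] = 2.0, 1.0
b = A @ f_true
f_rec = ista_deconv(A, b, lam=1e-3)
```

The l1 penalty drives most of the recovered force to exactly zero, so the two impulses reappear at the correct instants; an l2 penalty would smear them instead.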

  20. Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle–Pock algorithm

    DEFF Research Database (Denmark)

    Sidky, Emil Y.; Jørgensen, Jakob Heide; Pan, Xiaochuan

    2012-01-01

    The primal–dual optimization algorithm developed in Chambolle and Pock (CP) (2011 J. Math. Imag. Vis. 40 1–26) is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems...... for the purpose of designing iterative image reconstruction algorithms for CT. The primal–dual algorithm is briefly summarized in this paper, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application...
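In the spirit of such prototyping, here is a Chambolle-Pock instance derived for one convex problem, l1-regularized least squares (an illustration of the recipe, not one of the paper's CT instances):

```python
import numpy as np

def chambolle_pock_l1(K, b, lam, n_iter=2000):
    """Chambolle-Pock primal-dual iteration instantiated for
        min_x 0.5*||K x - b||^2 + lam*||x||_1,
    i.e. F(y) = 0.5*||y - b||^2 applied to y = Kx and G = lam*||.||_1.
    prox of sigma*F* is (z - sigma*b)/(1 + sigma); prox of tau*G is
    soft-thresholding by tau*lam."""
    Lnorm = np.linalg.norm(K, 2)
    sigma = tau = 1.0 / Lnorm              # sigma * tau * ||K||^2 = 1
    x = np.zeros(K.shape[1]); xbar = x.copy()
    y = np.zeros(K.shape[0])
    for _ in range(n_iter):
        y = (y + sigma * (K @ xbar) - sigma * b) / (1.0 + sigma)
        x_new = x - tau * (K.T @ y)
        x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0.0)
        xbar = 2.0 * x_new - x             # over-relaxation
        x = x_new
    return x
```

Prototyping a different objective only means swapping the two proximal maps, which is exactly the flexibility the abstract emphasizes.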

  1. A Solution to Hammer's X-ray Reconstruction Problem

    DEFF Research Database (Denmark)

    Gardner, Richard J.; Kiderlen, Markus

    2007-01-01

    We propose algorithms for reconstructing a planar convex body K from possibly noisy measurements of either its parallel X-rays taken in a fixed finite set of directions or its point X-rays taken at a fixed finite set of points, in known situations that guarantee a unique solution when the data is...... to K in the Hausdorff metric as k tends to infinity. This solves, for the first time in the strongest sense, Hammer’s X-ray problem published in 1963....

  2. Problems of extending the operation and reconstruction of large-panel five-storey buildings of the 1950s-1960s

    Directory of Open Access Journals (Sweden)

    BOLSHAKOV V. I.,

    2016-01-01

    Raising of the problem. In many regions, the housing stock in use is more than half a century old. According to research by the analytical center of the Association of Ukrainian Cities, the country today has 25.5 thousand houses built to the first mass-series designs of large-panel, block and brick buildings, with a total area of 72 million m2, that require reconstruction and modernization. In general, most of the housing stock of Ukraine is in poor technical condition due to deficient funding, and the tendency toward premature aging of the housing stock persists. One of the major problems of the modern construction industry is extending the service life of housing, in particular the buildings of the era of mass construction of the 1950s-1960s, known as "Khrushchevki". According to the State Statistics Service of Ukraine, the deterioration of residential buildings in Ukraine amounts to 47.2%, which calls for immediate action. At first view, the most acceptable way forward seems to be the reconstruction of the "Khrushchevki". However, reconstruction is a complex problem for the construction industry, involving economic components, social factors and the views of the residents of these homes, if a technologically and economically viable result is to be achieved. Analysis of publications. The reconstruction of the "Khrushchevki" is the subject of continual research by leading builders of Ukraine. Researchers' attention to both the technological problems [1 - 3] and the economic components [4 - 6] gives an idea of the scale of the work required to overcome the impending crisis. The purpose of the article. Defining the main problems of operation of large-panel five-storey buildings of the 1950s-1960s and their residents' views on the existing inconveniences, as well as the associated economic, technological and legal problems in carrying out building reconstruction. Conclusions

  3. On a problem of reconstruction of a discontinuous function by its Radon transform

    Energy Technology Data Exchange (ETDEWEB)

    Derevtsov, Evgeny Yu.; Maltseva, Svetlana V.; Svetov, Ivan E. [Sobolev Institute of Mathematics of SB RAS, 630090, Novosibirsk (Russian Federation); Novosibirsk State University, 630090, Novosibirsk (Russian Federation); Sultanov, Murat A. [H. A. Yassawe International Kazakh-Turkish University, 161200, Turkestan (Kazakhstan)

    2016-08-10

    A problem of reconstruction of a discontinuous function from its Radon transform is considered. One approach to the numerical solution of the problem consists of the following sequential steps: visualization of the set of breaking points; identification of this set; determination of the jump values; elimination of the discontinuities. We consider three of the listed problems, all except the determination of jump values. The problems are investigated by mathematical modeling using numerical experiments. The results of the simulation are satisfactory and allow us to hope for further development of the approach.
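The sequential steps translate directly into a toy one-dimensional analogue (the actual method operates on Radon-transform data; this sketch is only illustrative):

```python
import numpy as np

def detect_and_remove_jumps(f, threshold):
    """Toy 1D analogue of the record's pipeline: locate breaking
    points where the finite difference exceeds a threshold, read off
    the jump values there, and eliminate the discontinuities by
    subtracting the accumulated jumps."""
    d = np.diff(f)
    idx = np.where(np.abs(d) > threshold)[0]     # breaking points
    jumps = d[idx]                               # jump values
    g = f.copy()
    for i, j in zip(idx, jumps):
        g[i + 1:] -= j                           # eliminate each jump
    return idx, jumps, g
```

On a ramp with a single step discontinuity, the detector returns the step location and height, and the corrected signal is smooth up to the local slope.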

  4. Iterative Reconstruction Methods for Inverse Problems in Tomography with Hybrid Data

    DEFF Research Database (Denmark)

    Sherina, Ekaterina

    . The goal of these modalities is to quantify physical parameters of materials or tissues inside an object from given interior data, which is measured everywhere inside the object. The advantage of these modalities is that large variations in physical parameters can be resolved and therefore, they have...... data is precisely the reason why reconstructions with a high contrast and a high resolution can be expected. The main contributions of this thesis consist in formulating the underlying mathematical problems with interior data as nonlinear operator equations, theoretically analysing them within...... iteration and the Levenberg-Marquardt method are employed for solving the problems. The first problem considered in this thesis is a problem of conductivity estimation from interior measurements of the power density, known as Acousto-Electrical Tomography. A special case of limited angle tomography...

  5. The use of hamstring tendon graft for the anterior cruciate ligament reconstruction (benefits, problems and their solutions)

    Directory of Open Access Journals (Sweden)

    V. V. Slastinin

    2017-01-01

    The search for an optimal graft for anterior cruciate ligament reconstruction is ongoing, and donor site morbidity remains one of the major problems when autografts are used. The article provides an overview of the advantages and disadvantages of hamstring tendon autografts for anterior cruciate ligament reconstruction, and of ways of solving the problems associated with this type of graft.

  6. Phylogenetic Diversity, Distribution, and Cophylogeny of Giant Bacteria (Epulopiscium) with their Surgeonfish Hosts in the Red Sea

    KAUST Repository

    Miyake, Sou

    2016-03-14

    Epulopiscium is a group of giant bacteria found in high abundance in intestinal tracts of herbivorous surgeonfish. Despite their peculiarly large cell size (can be up to 600 μm), extreme polyploidy (some with over 100,000 genome copies per cell) and viviparity (whereby mother cells produce live offspring), details about their diversity, distribution or their role in the host gut are lacking. Previous studies have highlighted the existence of morphologically distinct Epulopiscium cell types (defined as morphotypes A to J) in some surgeonfish genera, but the corresponding genetic diversity and distribution among other surgeonfishes remain mostly unknown. Therefore, we investigated the phylogenetic diversity of Epulopiscium, distribution and co-occurrence in multiple hosts. Here, we identified eleven new phylogenetic clades, six of which were also morphologically characterized. Three of these novel clades were phylogenetically and morphologically similar to cigar-shaped type A1 cells, found in a wide range of surgeonfishes including Acanthurus nigrofuscus, while three were similar to smaller, rod-shaped type E that has not been phylogenetically classified thus far. Our results also confirmed that biogeography appears to have relatively little influence on Epulopiscium diversity, as clades found in the Great Barrier Reef and Hawaii were also recovered from the Red Sea. Although multiple symbiont clades inhabited a given species of host surgeonfish and multiple host species possessed a given symbiont clade, statistical analysis of host and symbiont phylogenies indicated significant cophylogeny, which in turn suggests co-evolutionary relationships. A cluster analysis of Epulopiscium sequences from previously published amplicon sequencing dataset revealed a similar pattern, where specific clades were consistently found in high abundance amongst closely related surgeonfishes. Differences in abundance may indicate specialization of clades to certain gut environments

  7. Phylogenetic Diversity, Distribution, and Cophylogeny of Giant Bacteria (Epulopiscium) with their Surgeonfish Hosts in the Red Sea

    Science.gov (United States)

    Miyake, Sou; Ngugi, David K.; Stingl, Ulrich

    2016-01-01

    Epulopiscium is a group of giant bacteria found in high abundance in intestinal tracts of herbivorous surgeonfish. Despite their peculiarly large cell size (can be up to 600 μm), extreme polyploidy (some with over 100,000 genome copies per cell) and viviparity (whereby mother cells produce live offspring), details about their diversity, distribution or their role in the host gut are lacking. Previous studies have highlighted the existence of morphologically distinct Epulopiscium cell types (defined as morphotypes A to J) in some surgeonfish genera, but the corresponding genetic diversity and distribution among other surgeonfishes remain mostly unknown. Therefore, we investigated the phylogenetic diversity of Epulopiscium, distribution and co-occurrence in multiple hosts. Here, we identified eleven new phylogenetic clades, six of which were also morphologically characterized. Three of these novel clades were phylogenetically and morphologically similar to cigar-shaped type A1 cells, found in a wide range of surgeonfishes including Acanthurus nigrofuscus, while three were similar to smaller, rod-shaped type E that has not been phylogenetically classified thus far. Our results also confirmed that biogeography appears to have relatively little influence on Epulopiscium diversity, as clades found in the Great Barrier Reef and Hawaii were also recovered from the Red Sea. Although multiple symbiont clades inhabited a given species of host surgeonfish and multiple host species possessed a given symbiont clade, statistical analysis of host and symbiont phylogenies indicated significant cophylogeny, which in turn suggests co-evolutionary relationships. A cluster analysis of Epulopiscium sequences from previously published amplicon sequencing dataset revealed a similar pattern, where specific clades were consistently found in high abundance amongst closely related surgeonfishes. Differences in abundance may indicate specialization of clades to certain gut environments

  8. Phylogenetic Diversity, Distribution, and Cophylogeny of Giant Bacteria (Epulopiscium) with their Surgeonfish Hosts in the Red Sea

    KAUST Repository

    Miyake, Sou; Ngugi, David; Stingl, Ulrich

    2016-01-01

    Epulopiscium is a group of giant bacteria found in high abundance in intestinal tracts of herbivorous surgeonfish. Despite their peculiarly large cell size (can be up to 600 μm), extreme polyploidy (some with over 100,000 genome copies per cell) and viviparity (whereby mother cells produce live offspring), details about their diversity, distribution or their role in the host gut are lacking. Previous studies have highlighted the existence of morphologically distinct Epulopiscium cell types (defined as morphotypes A to J) in some surgeonfish genera, but the corresponding genetic diversity and distribution among other surgeonfishes remain mostly unknown. Therefore, we investigated the phylogenetic diversity of Epulopiscium, distribution and co-occurrence in multiple hosts. Here, we identified eleven new phylogenetic clades, six of which were also morphologically characterized. Three of these novel clades were phylogenetically and morphologically similar to cigar-shaped type A1 cells, found in a wide range of surgeonfishes including Acanthurus nigrofuscus, while three were similar to smaller, rod-shaped type E that has not been phylogenetically classified thus far. Our results also confirmed that biogeography appears to have relatively little influence on Epulopiscium diversity, as clades found in the Great Barrier Reef and Hawaii were also recovered from the Red Sea. Although multiple symbiont clades inhabited a given species of host surgeonfish and multiple host species possessed a given symbiont clade, statistical analysis of host and symbiont phylogenies indicated significant cophylogeny, which in turn suggests co-evolutionary relationships. A cluster analysis of Epulopiscium sequences from previously published amplicon sequencing dataset revealed a similar pattern, where specific clades were consistently found in high abundance amongst closely related surgeonfishes. Differences in abundance may indicate specialization of clades to certain gut environments

  9. Dynamic MRI reconstruction as a moment problem. Pt. 1

    International Nuclear Information System (INIS)

    Zwaan, M.

    1989-03-01

    This paper deals with some mathematical aspects of magnetic resonance imaging (MRI) of the beating heart. Some of the basic theory behind magnetic resonance is given. Of special interest is the mathematical theory concerning MRI, and the ideas and problems are formulated in mathematical terms. If one uses MRI to measure and display a so-called 'dynamic' organ, like the beating heart, the situation is more complex than in the case of a static organ. A strategy is described for how a cross-section of a beating human heart is measured in practice and how the measurements are arranged before an image can be made. This technique is called retrospective synchronization. If the beating heart is measured and displayed with the help of this method, artefacts often deteriorate the image quality. Some of these artefacts have a physical cause, while others are caused by the reconstruction algorithm. Mathematical techniques may perhaps be used to improve the algorithms currently used in practice. The aim of this paper is not to solve problems, but to give an adequate mathematical formulation of the inversion problem concerning retrospective synchronization. (author). 3 refs.; 4 figs

  10. ITEM-QM solutions for EM problems in image reconstruction exemplary for the Compton Camera

    CERN Document Server

    Pauli, Josef; Anton, G

    2002-01-01

    Imaginary time expectation maximization (ITEM), a new algorithm for expectation maximization problems based on quantum-mechanical energy minimization via imaginary (Euclidean) time evolution, is presented. Both the algorithm and the implementation (http://www.johannes-pauli.de/item/index.html) are published under the terms of the GNU General Public License (http://www.gnu.org/copyleft/gpl.html). Due to its generality, ITEM is applicable to various image reconstruction problems such as CT, PET, SPECT, NMR, Compton camera and tomosynthesis, as well as to any other energy minimization problem. The choice of the optimal ITEM Hamiltonian is discussed and numerical results are presented for the Compton camera.
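
    The underlying mechanism, energy minimization via imaginary-time evolution, can be sketched for a toy problem: repeatedly applying exp(-tau*H) with renormalization damps all excited-state components, so the iterate converges to the minimum-energy eigenvector. The Hamiltonian, tau and iteration count below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Sketch of the idea behind ITEM: imaginary-time evolution
# psi -> exp(-tau * H) psi (with renormalization) suppresses high-energy
# components, so the state converges to the minimum-energy eigenvector.
# The toy Hamiltonian, tau and iteration count are illustrative assumptions.
H = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-0.1 * w)) @ V.T    # exp(-tau * H) with tau = 0.1

psi = np.ones(3)                           # any start with ground-state overlap
for _ in range(200):
    psi = U @ psi
    psi /= np.linalg.norm(psi)             # renormalize each imaginary-time step

energy = psi @ H @ psi                     # approaches min eigenvalue 2 - sqrt(2)
```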

  11. Analytical reconstruction schemes for coarse-mesh spectral nodal solution of slab-geometry SN transport problems

    International Nuclear Information System (INIS)

    Barros, R. C.; Filho, H. A.; Platt, G. M.; Oliveira, F. B. S.; Militao, D. S.

    2009-01-01

    Coarse-mesh numerical methods are very efficient in the sense that they generate accurate results in short computational time, as the number of floating point operations generally decreases with the reduced number of mesh points. On the other hand, they generate numerical solutions that do not give detailed information on the problem solution profile, as the grid points can be located considerably far apart. In this paper we describe two analytical reconstruction schemes for the coarse-mesh solution generated by the spectral nodal method for the neutral particle discrete ordinates (SN) transport model in slab geometry. The first scheme is based on the analytical reconstruction of the coarse-mesh solution within each discretization cell of the spatial grid set up on the slab. The second scheme is based on the angular reconstruction of the discrete ordinates solution between two contiguous ordinates of the angular quadrature set used in the SN model. Numerical results are given to illustrate the accuracy of the two reconstruction schemes. (authors)

  12. Stable methods for ill-posed problems and application to reconstruction of atmospheric temperature profile

    International Nuclear Information System (INIS)

    Son, H.H.; Luong, P.T.; Loan, N.T.

    1990-04-01

    The problems of remote sensing (passive or active) are investigated on the basis of the main principle, which consists in interpreting radiometric electromagnetic measurements in a spectral interval where the radiation is sensitive to the physical property of the medium of interest. Problems such as the analysis of the composition and structure of the atmosphere from records of scattered radiation, cloud identification, the investigation of the thermodynamic state and composition of the system, and the reconstruction of the atmospheric temperature profile from processing of the infrared radiation emitted by the Earth-atmosphere system belong to the class of inverse problems of mathematical physics, which are often ill-posed. In this paper a new class of regularized solutions corresponding to the generally formulated RATP problem is considered. (author). 14 refs, 3 figs, 3 tabs

  13. Exact iterative reconstruction for the interior problem

    International Nuclear Information System (INIS)

    Zeng, Gengsheng L; Gullberg, Grant T

    2009-01-01

    There is a trend in single photon emission computed tomography (SPECT) toward small, dedicated imaging systems. For example, many companies are developing small dedicated cardiac SPECT systems with different designs. These dedicated systems have a smaller field of view (FOV) than a full-size clinical system, so data truncation has become the norm rather than the exception. It is therefore important to develop region of interest (ROI) reconstruction algorithms that use truncated data, and this paper is a stepping stone in that direction. It shows that the common generic iterative image reconstruction algorithms are able to exactly reconstruct the ROI under the conditions that the convex ROI is fully sampled and the image value in a sub-region within the ROI is known. If the ROI includes a sub-region that is outside the patient body, then these conditions can easily be satisfied.

  14. On Inverse Coefficient Heat-Conduction Problems on Reconstruction of Nonlinear Components of the Thermal-Conductivity Tensor of Anisotropic Bodies

    Science.gov (United States)

    Formalev, V. F.; Kolesnik, S. A.

    2017-11-01

    The authors are the first to present a closed procedure for the numerical solution of inverse coefficient problems of heat conduction in anisotropic materials used as heat shields in rocket and space equipment. The reconstructed components of the thermal-conductivity tensor depend on temperature (are nonlinear). The procedure includes the formation of experimental data, the implicit gradient-descent method, an economical absolutely stable method for the numerical solution of parabolic problems containing mixed derivatives, the parametric identification, construction, and numerical solution of the problem for the elements of the sensitivity matrices, the development of a quadratic residual functional and regularizing functionals, and the development of algorithms and software systems. The implicit gradient-descent method permits expanding the quadratic functional in a Taylor series, retaining the linear terms in the increments of the sought functions, which substantially improves the accuracy and stability of the solution of the inverse problems. Software systems are developed both with and without taking account of the errors in the experimental data. On the basis of a priori assumptions about the qualitative behavior of the functional dependences of the components of the thermal-conductivity tensor on temperature, regularizing functionals are constructed by means of which the components of the thermal-conductivity tensor can be reconstructed with an error no higher than that of the experimental data. Results of the numerical solution of the inverse coefficient problems for the reconstruction of nonlinear components of the thermal-conductivity tensor are presented and discussed.

  15. Image Reconstruction. Chapter 13

    Energy Technology Data Exchange (ETDEWEB)

    Nuyts, J. [Department of Nuclear Medicine and Medical Imaging Research Center, Katholieke Universiteit Leuven, Leuven (Belgium); Matej, S. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, PA (United States)

    2014-12-15

    This chapter discusses how 2‑D or 3‑D images of tracer distribution can be reconstructed from a series of so-called projection images acquired with a gamma camera or a positron emission tomography (PET) system [13.1]. This is often called an ‘inverse problem’. The reconstruction is the inverse of the acquisition. The reconstruction is called an inverse problem because making software to compute the true tracer distribution from the acquired data turns out to be more difficult than the ‘forward’ direction, i.e. making software to simulate the acquisition. There are basically two approaches to image reconstruction: analytical reconstruction and iterative reconstruction. The analytical approach is based on mathematical inversion, yielding efficient, non-iterative reconstruction algorithms. In the iterative approach, the reconstruction problem is reduced to computing a finite number of image values from a finite number of measurements. That simplification enables the use of iterative instead of mathematical inversion. Iterative inversion tends to require more computer power, but it can cope with more complex (and hopefully more accurate) models of the acquisition process.
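
    The iterative approach described above, computing a finite number of image values from a finite number of measurements, can be sketched with MLEM, a standard iterative algorithm in emission tomography. The tiny system matrix, activity vector and iteration count below are illustrative assumptions, not the chapter's example:

```python
import numpy as np

# A minimal MLEM iteration for a tiny discrete system y = A x, as an
# example of the iterative approach; the system matrix, activity vector
# and iteration count are illustrative assumptions.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])       # "projection" matrix
x_true = np.array([3.0, 1.0, 2.0])    # true tracer distribution
y = A @ x_true                        # noise-free projection data

x = np.ones(3)                        # MLEM requires a positive start
sens = A.sum(axis=0)                  # sensitivity image, A^T 1
for _ in range(2000):
    ratio = y / (A @ x)               # measured over predicted projections
    x *= (A.T @ ratio) / sens         # multiplicative MLEM update (stays >= 0)
```

    The multiplicative update keeps the image non-negative, one reason such iterative schemes can encode a more accurate model of the acquisition than analytical inversion.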

  16. Vertex reconstruction in CMS

    International Nuclear Information System (INIS)

    Chabanat, E.; D'Hondt, J.; Estre, N.; Fruehwirth, R.; Prokofiev, K.; Speer, T.; Vanlaer, P.; Waltenberger, W.

    2005-01-01

    Due to the high track multiplicity in the final states expected in proton collisions at the LHC experiments, novel vertex reconstruction algorithms are required. The vertex reconstruction problem can be decomposed into a pattern recognition problem ('vertex finding') and an estimation problem ('vertex fitting'). Starting from least-squares methods, robustifications of the classical algorithms are discussed and the statistical properties of the novel methods are shown. A whole set of different approaches for the vertex finding problem is presented and compared in relevant physics channels

  17. Vertex Reconstruction in CMS

    CERN Document Server

    Chabanat, E; D'Hondt, J; Vanlaer, P; Prokofiev, K; Speer, T; Frühwirth, R; Waltenberger, W

    2005-01-01

    Because of the high track multiplicity in the final states expected in proton collisions at the LHC experiments, novel vertex reconstruction algorithms are required. The vertex reconstruction problem can be decomposed into a pattern recognition problem ("vertex finding") and an estimation problem ("vertex fitting"). Starting from least-squares methods, ways to render the classical algorithms more robust are discussed and the statistical properties of the novel methods are shown. A whole set of different approaches for the vertex finding problem is presented and compared in relevant physics channels.

  18. Dynamic dual-tracer PET reconstruction.

    Science.gov (United States)

    Gao, Fei; Liu, Huafeng; Jian, Yiqiang; Shi, Pengcheng

    2009-01-01

    Despite its important medical implications, simultaneous dual-tracer positron emission tomography (PET) reconstruction remains a challenging problem, primarily because the photon measurements from the two tracers overlap. In this paper, we propose a simultaneous dynamic dual-tracer reconstruction of tissue activity maps guided by tracer kinetics. The dual-tracer reconstruction problem is formulated in a state-space representation, where parallel compartment models serve as the continuous-time system equation describing the tracer kinetic processes of the two tracers, and the imaging data are expressed as discrete samples of the system states in the measurement equation. The image reconstruction problem thereby becomes a state estimation problem in a continuous-discrete hybrid paradigm, and H-infinity filtering is adopted as the estimation strategy. As H-infinity filtering makes no assumptions on the system and measurement statistics, robust reconstruction results can be obtained for the dual-tracer PET imaging system, where the statistical properties of the measurement data and the system uncertainty are not available a priori, even when there are disturbances in the kinetic parameters. Experimental results on digital phantoms, Monte Carlo simulations and physical phantoms demonstrate its superior performance.

  19. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    The tomographic reconstruction problem is concerned with creating a model of the interior of an object from some measured data, typically projections of the object. After reconstructing an object it is often desired to segment it, either automatically or manually. For computed tomography (CT... such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one-stage segmentation-driven reconstruction method for the phase contrast tomography reconstruction...

  20. Compressed Sensing, Pseudodictionary-Based, Superresolution Reconstruction

    Directory of Open Access Journals (Sweden)

    Chun-mei Li

    2016-01-01

    The spatial resolution of digital images is a critical factor affecting photogrammetric precision. Single-frame superresolution image reconstruction is a typical underdetermined inverse problem. To solve this type of problem, a compressed-sensing, pseudodictionary-based superresolution reconstruction method is proposed in this study. The proposed method achieves pseudodictionary learning with an available low-resolution image using the K-SVD algorithm, which exploits the sparse characteristics of the digital image. Then, the sparse representation coefficients of the low-resolution image are obtained by solving an l0-norm minimization problem, and the sparse coefficients and the high-resolution pseudodictionary are used to reconstruct image tiles with high resolution. Finally, single-frame-image superresolution reconstruction is achieved. The proposed method is applied to photogrammetric images, and the experimental results indicate that it effectively increases image resolution and information content, achieving superresolution reconstruction. The reconstructed results are better than those obtained from traditional interpolation methods in terms of visual effects and quantitative indicators.
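
    The sparse-coding step, representing each patch with a few dictionary atoms, can be sketched with orthogonal matching pursuit, a standard greedy solver for this kind of l0 problem (an assumption here for illustration; the random dictionary below stands in for a K-SVD-learned pseudodictionary):

```python
import numpy as np

# Greedy sparse coding of a "patch" y over a dictionary D: approximately
# solve min ||alpha||_0 s.t. y = D alpha with orthogonal matching pursuit.
# The random dictionary stands in for a learned (K-SVD) pseudodictionary.
rng = np.random.default_rng(1)
D = rng.standard_normal((16, 40))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms

alpha_true = np.zeros(40)
alpha_true[[3, 17, 29]] = [1.5, -2.0, 0.8]
y = D @ alpha_true                         # a 3-sparse synthetic patch

support = []
residual = y.copy()
for _ in range(3):                         # target sparsity assumed known
    k = int(np.argmax(np.abs(D.T @ residual)))
    support.append(k)                      # add the best-correlated new atom
    coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
    residual = y - D[:, support] @ coef    # re-fit on the whole support

alpha = np.zeros(40)
alpha[support] = coef                      # sparse code of the patch
```

    In the superresolution setting the recovered sparse code would then be applied to the high-resolution pseudodictionary to synthesize the high-resolution tile.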

  1. The inverse problems of reconstruction in the X-rays, gamma or positron tomographic imaging systems

    International Nuclear Information System (INIS)

    Grangeat, P.

    1999-01-01

    The revolution in imaging brought by tomographic techniques in the 1970s allows the computation of maps of local values of attenuation or emission activity. Reconstruction techniques thus connect integral measurements to the distribution of characteristic information by inversion of the measurement equations; they are a principal application of solution techniques for inverse problems. In the first part the author recalls the physical principles of measurement in X-ray, gamma and positron imaging. He then presents the various problems with their associated inversion techniques. The third part is devoted to the activity sector and examples, and the last part concludes with a forecast. (A.L.B.)

  2. Overview of image reconstruction

    International Nuclear Information System (INIS)

    Marr, R.B.

    1980-04-01

    Image reconstruction (or computerized tomography, etc.) is any process whereby a function, f, on R^n is estimated from empirical data pertaining to its integrals, ∫f(x) dx, for some collection of hyperplanes of dimension k < n. The paper begins with background information on how image reconstruction problems have arisen in practice, and describes some of the application areas of past or current interest; these include radioastronomy, optics, radiology and nuclear medicine, electron microscopy, acoustical imaging, geophysical tomography, nondestructive testing, and NMR zeugmatography. Then the various reconstruction algorithms are discussed in five classes: summation, or simple back-projection; convolution, or filtered back-projection; Fourier and other functional transforms; orthogonal function series expansion; and iterative methods. Certain more technical mathematical aspects of image reconstruction are considered from the standpoint of uniqueness, consistency, and stability of solution. The paper concludes by presenting certain open problems. 73 references

  3. Iterative methods for tomography problems: implementation to a cross-well tomography problem

    Science.gov (United States)

    Karadeniz, M. F.; Weber, G. W.

    2018-01-01

    The velocity distribution between two boreholes is reconstructed by cross-well tomography, which is commonly used in geology. In this paper, the iterative methods Kaczmarz's algorithm, the algebraic reconstruction technique (ART) and the simultaneous iterative reconstruction technique (SIRT) are applied to a specific cross-well tomography problem. The convergence of these methods and their CPU times for the cross-well tomography problem are compared. Furthermore, the three methods are compared for different tolerance values.
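
    The basic Kaczmarz/ART step can be sketched in a few lines: the iterate is projected onto the hyperplane of one equation at a time, cycling through the rows. The small consistent system below is an illustrative stand-in for cross-well travel-time data, not the paper's problem:

```python
import numpy as np

# Kaczmarz's method (the basic ART sweep): cyclically project the iterate
# onto the hyperplane of one equation a_i . x = b_i at a time.
# The small consistent system is an illustrative stand-in for
# cross-well travel-time data.
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 4.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                        # consistent "travel time" data

x = np.zeros(3)
for _ in range(500):                  # one sweep visits every row once
    for i in range(A.shape[0]):
        a = A[i]
        x += (b[i] - a @ x) / (a @ a) * a
```

    SIRT differs in that all rows contribute simultaneously to each update, which typically trades per-iteration progress for smoother convergence.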

  4. A tensor-based dictionary learning approach to tomographic image reconstruction

    DEFF Research Database (Denmark)

    Soltani, Sara; Kilmer, Misha E.; Hansen, Per Christian

    2016-01-01

    We consider tomographic reconstruction using priors in the form of a dictionary learned from training images. The reconstruction has two stages: first we construct a tensor dictionary prior from our training data, and then we pose the reconstruction problem in terms of recovering the expansion coefficients in that dictionary. Our approach differs from past approaches in that (a) we use a third-order tensor representation for our images and (b) we recast the reconstruction problem using the tensor formulation. The dictionary learning problem is presented as a non-negative tensor factorization problem with sparsity constraints. The reconstruction problem is formulated in a convex optimization framework by looking for a solution with a sparse representation in the tensor dictionary. Numerical results show that our tensor formulation leads to very sparse representations of both the training images...

  5. Shredded banknotes reconstruction using AKAZE points.

    Science.gov (United States)

    Nabiyev, Vasif V; Yılmaz, Seçkin; Günay, Asuman; Muzaffer, Gül; Ulutaş, Güzin

    2017-09-01

    Shredded banknote reconstruction is a recent topic and can be viewed as solving a large-scale jigsaw puzzle. Problems such as the reconstruction of fragmented documents, photographs and historical artefacts are closely related to this topic, and the high computational complexity of these problems increases the need for new methods. Reconstruction of shredded banknotes consists of three main stages: (1) matching fragments with a reference banknote; (2) aligning the fragments by rotating them by the appropriate angles; (3) assembling the fragments. Existing methods can be successfully applied to synthetic banknote fragments created in a computer environment, but when the real banknote reconstruction problem is considered, different subproblems arise that make them inadequate. In this study, a keypoint-based method, named AKAZE, was used to make the matching process effective. This is the first study that uses the AKAZE method in the reconstruction of shredded banknotes. A new method for fragment alignment is also proposed: the convex hulls containing all correctly matched AKAZE keypoints are found on the reference banknote and on the fragments, and the orientations of the fragments are estimated accurately by comparing these convex polygons. A new criterion was also developed to quantify the success rates of reconstructed banknotes. In addition, two different data sets, including real and synthetic banknote fragments from different countries, were created to test the proposed method. Copyright © 2017 Elsevier B.V. All rights reserved.
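
    The alignment stage (2) can be sketched with an orthogonal-Procrustes (Kabsch) fit between matched keypoint coordinates. This stands in for the paper's convex-hull comparison, and the keypoints, rotation angle and offset below are synthetic assumptions (AKAZE keypoint detection itself is available in OpenCV via cv2.AKAZE_create):

```python
import numpy as np

# Stage-2 sketch: from matched keypoint coordinates on the reference
# banknote (P) and on a fragment (Q), estimate the fragment's rotation
# with the Kabsch / orthogonal Procrustes method. This stands in for the
# paper's convex-hull comparison; all data below are synthetic.
rng = np.random.default_rng(2)
P = rng.uniform(0.0, 100.0, size=(12, 2))       # keypoints on the reference

theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
Q = P @ R_true.T + np.array([40.0, -15.0])      # rotated and shifted fragment

Pc = P - P.mean(axis=0)                         # remove the translation
Qc = Q - Q.mean(axis=0)
U, _, Vt = np.linalg.svd(Pc.T @ Qc)             # SVD of the cross-covariance
d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
R = Vt.T @ np.diag([1.0, d]) @ U.T              # rotation taking P onto Q

angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
```

    With noise-free correspondences the recovered angle is exact; with real keypoint matches the least-squares nature of the fit averages out localization noise.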

  6. Mastectomy Skin Necrosis After Breast Reconstruction: A Comparative Analysis Between Autologous Reconstruction and Implant-Based Reconstruction.

    Science.gov (United States)

    Sue, Gloria R; Lee, Gordon K

    2018-05-01

    Mastectomy skin necrosis is a significant problem after breast reconstruction. We sought to perform a comparative analysis of this complication between patients undergoing autologous breast reconstruction and patients undergoing 2-stage expander implant breast reconstruction. A retrospective review was performed on consecutive patients undergoing autologous breast reconstruction or 2-stage expander implant breast reconstruction by the senior author from 2006 through 2015. Patient demographic factors including age, body mass index, history of diabetes, history of smoking, and history of radiation to the breast were collected. Our primary outcome measure was mastectomy skin necrosis. The Fisher exact test was used for statistical analysis between the 2 patient cohorts, and the treatment patterns of mastectomy skin necrosis were then analyzed. We identified 204 patients who underwent autologous breast reconstruction and 293 patients who underwent 2-stage expander implant breast reconstruction. Patients undergoing autologous breast reconstruction were older, heavier, more likely to have diabetes, and more likely to have had prior radiation to the breast compared with patients undergoing implant-based reconstruction. The incidence of mastectomy skin necrosis was 30.4% in the autologous group compared with only 10.6% in the tissue expander group. Whereas necrosis was typically managed with local wound care in the autologous group, only 3.2% of patients in the tissue expander group were treated with local wound care. Mastectomy skin necrosis is thus significantly more likely to occur after autologous breast reconstruction than after 2-stage expander implant-based breast reconstruction, and patients with autologous reconstructions are more readily treated with local wound care, whereas patients with tissue expanders tended to require operative treatment of this complication. Patients considering breast reconstruction should be counseled appropriately regarding the differences in incidence and management of mastectomy skin necrosis.

  7. Neural network algorithm for image reconstruction using the grid friendly projections

    International Nuclear Information System (INIS)

    Cierniak, R.

    2011-01-01

    The presented paper describes the development of an original approach to the reconstruction problem using a recurrent neural network. In particular, the 'grid-friendly' angles of the performed projections are selected according to the discrete Radon transform (DRT) concept to decrease the number of projections required. The methodology of our approach is consistent with analytical reconstruction algorithms: the reconstruction problem is reformulated as an optimization problem, which is solved using a method based on the maximum likelihood methodology. The proposed reconstruction algorithm is then adapted to the more practical case of discrete fan-beam projections. Computer simulation results show that a neural network reconstruction algorithm designed in this way improves the obtained results and outperforms conventional methods in reconstructed image quality. (author)

  8. A fast sparse reconstruction algorithm for electrical tomography

    International Nuclear Information System (INIS)

    Zhao, Jia; Xu, Yanbin; Tan, Chao; Dong, Feng

    2014-01-01

    Electrical tomography (ET) has been widely investigated due to its advantages of being non-radiative, low-cost and high-speed. However, image reconstruction in ET is a nonlinear, ill-posed inverse problem, and the imaging results are easily affected by measurement noise. A sparse reconstruction algorithm based on L1 regularization is robust to noise and consequently provides high-quality reconstructed images. In this paper, the sparse reconstruction by separable approximation algorithm (SpaRSA) is extended to solve the ET inverse problem. The algorithm is competitive with the fastest state-of-the-art algorithms in solving the standard L2-L1 problem. However, it is computationally expensive when the dimension of the matrix is large. To further improve the calculation speed of solving inverse problems, a projection method based on the Krylov subspace is employed and combined with the SpaRSA algorithm. The proposed algorithm is tested on image reconstruction for electrical resistance tomography (ERT). Both simulation and experimental results demonstrate that the proposed method can reduce the computational time and improve the noise robustness of the image reconstruction. (paper)
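
    The standard L2-L1 problem mentioned above, min_x 0.5*||Ax - b||^2 + lam*||x||_1, can be sketched with ISTA, a simpler proximal-gradient relative of SpaRSA (an assumption for illustration; the system, lam and iteration count are synthetic):

```python
import numpy as np

# Minimal proximal-gradient (ISTA) sketch for the standard L2-L1 problem
# that SpaRSA targets: min_x 0.5*||A x - b||^2 + lam*||x||_1.
# ISTA is a simpler relative of SpaRSA, used here only for illustration;
# the system, lam and iteration count are assumptions.
rng = np.random.default_rng(3)
A = rng.standard_normal((30, 60))
x_sparse = np.zeros(60)
x_sparse[[5, 21, 44]] = [2.0, -1.5, 1.0]
b = A @ x_sparse                          # synthetic measurements

lam = 0.1
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
x = np.zeros(60)
for _ in range(500):
    g = A.T @ (A @ x - b)                 # gradient of the smooth term
    z = x - g / L                         # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

    The soft-thresholding step is what drives small coefficients exactly to zero, giving the noise robustness that motivates L1 regularization in ET.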

  9. AIR Tools - A MATLAB package of algebraic iterative reconstruction methods

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Saxild-Hansen, Maria

    2012-01-01

    We present a MATLAB package with implementations of several algebraic iterative reconstruction methods for discretizations of inverse problems. These so-called row action methods rely on semi-convergence for achieving the necessary regularization of the problem. Two classes of methods are implemented: Algebraic Reconstruction Techniques (ART) and Simultaneous Iterative Reconstruction Techniques (SIRT). In addition we provide a few simplified test problems from medical and seismic tomography. For each iterative method, a number of strategies are available for choosing the relaxation parameter...
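    The ART class mentioned above is built around Kaczmarz-type row actions, in which each row of the system acts in turn and a relaxation parameter scales each projection step. A rough illustration (in Python rather than the package's MATLAB, on a toy consistent system):

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=50, relax=1.0):
    """Classic ART: cycle over the rows, projecting the iterate onto each
    hyperplane a_i . x = b_i, scaled by the relaxation parameter."""
    x = np.zeros(A.shape[1])
    row_norms = np.sum(A * A, axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            residual = b[i] - A[i] @ x
            x += relax * (residual / row_norms[i]) * A[i]
    return x

# Consistent toy system: the sweeps converge to the exact solution.
A = np.array([[3.0, 1.0], [1.0, 2.0], [2.0, -1.0]])
x_true = np.array([1.0, -1.0])
b = A @ x_true
x_hat = kaczmarz(A, b)
```

    For noisy, inconsistent systems the iteration exhibits the semi-convergence the abstract refers to, which is why the choice of relaxation parameter (and stopping index) matters.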

  10. Permutationally invariant state reconstruction

    DEFF Research Database (Denmark)

    Moroder, Tobias; Hyllus, Philipp; Tóth, Géza

    2012-01-01

    Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, maximum ... optimization, which has clear advantages regarding speed, control and accuracy in comparison to commonly employed numerical routines. First prototype implementations easily allow reconstruction of a state of 20 qubits in a few minutes on a standard computer.

  11. Photoacoustic image reconstruction via deep learning

    Science.gov (United States)

    Antholzer, Stephan; Haltmeier, Markus; Nuster, Robert; Schwab, Johannes

    2018-02-01

    Applying standard algorithms to sparse data problems in photoacoustic tomography (PAT) yields low-quality images containing severe under-sampling artifacts. To some extent, these artifacts can be reduced by iterative image reconstruction algorithms, which allow prior knowledge such as smoothness, total variation (TV) or sparsity constraints to be included. These algorithms tend to be time consuming, as the forward and adjoint problems have to be solved repeatedly. Furthermore, iterative algorithms have additional drawbacks; for example, the reconstruction quality strongly depends on a priori model assumptions about the objects to be recovered, which are often not strictly satisfied in practical applications. To overcome these issues, in this paper we develop direct and efficient reconstruction algorithms based on deep learning. As opposed to iterative algorithms, we apply a convolutional neural network whose parameters are trained on a set of training data before the reconstruction process. For actual image reconstruction, a single evaluation of the trained network yields the desired result. Our numerical results (using two different network architectures) demonstrate that the proposed deep learning approach reconstructs images with a quality comparable to state-of-the-art iterative reconstruction methods.

  12. An Adaptive B-Spline Method for Low-order Image Reconstruction Problems - Final Report - 09/24/1997 - 09/24/2000

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xin; Miller, Eric L.; Rappaport, Carey; Silevich, Michael

    2000-04-11

    A common problem in signal processing is to estimate the structure of an object from noisy measurements linearly related to the desired image. These problems are broadly known as inverse problems. A key feature which complicates the solution to such problems is their ill-posedness: small perturbations in the data, arising e.g. from noise, can and do lead to severe, non-physical artifacts in the recovered image. The process of stabilizing these problems is known as regularization, of which Tikhonov regularization is one of the most common forms. While this approach leads to a simple linear least squares problem to solve for generating the reconstruction, it has the unfortunate side effect of producing smooth images, thereby obscuring important features such as edges. Therefore, over the past decade there has been much work on the development of edge-preserving regularizers. This technique leads to image estimates in which the important features are retained, but computationally they require the solution of a nonlinear least squares problem, a daunting task in many practical multi-dimensional applications. In this thesis we explore low-order models for reducing the complexity of the reconstruction process. Specifically, B-Splines are used to approximate the object. If a 'proper' collection of B-Splines is chosen, such that the object can be efficiently represented using a few basis functions, the dimensionality of the underlying problem will be significantly decreased. Consequently, an optimum distribution of splines needs to be determined. Here, an adaptive refining and pruning algorithm is developed to solve the problem. The refining part is based on curvature information; the intuition is that a relatively dense set of fine-scale basis elements should cluster near regions of high curvature, while a sparse collection of basis vectors is required to adequately represent the object over spatially smooth areas. The pruning part is a greedy...

  13. Joint-2D-SL0 Algorithm for Joint Sparse Matrix Reconstruction

    Directory of Open Access Journals (Sweden)

    Dong Zhang

    2017-01-01

    Full Text Available Sparse matrix reconstruction has wide applications, such as DOA estimation and STAP. However, its performance is usually restricted by the grid mismatch problem. In this paper, we revise the sparse matrix reconstruction model and propose a joint sparse matrix reconstruction model based on a first-order Taylor expansion, which can overcome the grid mismatch problem. We then put forward the Joint-2D-SL0 algorithm, which can solve the joint sparse matrix reconstruction problem efficiently. Compared with the Kronecker compressive sensing method, our proposed method has a higher computational efficiency and acceptable reconstruction accuracy. Finally, simulation results validate the superiority of the proposed method.

  14. Network reconstruction via graph blending

    Science.gov (United States)

    Estrada, Rolando

    2016-05-01

    Graphs estimated from empirical data are often noisy and incomplete due to the difficulty of faithfully observing all the components (nodes and edges) of the true graph. This problem is particularly acute for large networks, where the number of components may far exceed available surveillance capabilities. Errors in the observed graph can render subsequent analyses invalid, so it is vital to develop robust methods that can minimize these observational errors. Errors in the observed graph may include missing and spurious components, as well as fused (multiple nodes merged into one) and split (a single node misinterpreted as many) nodes. Traditional graph reconstruction methods are only able to identify missing or spurious components (primarily edges, and to a lesser degree nodes), so we developed a novel graph blending framework that allows us to cast the full estimation problem as a simple edge addition/deletion problem. Armed with this framework, we systematically investigate the viability of various topological graph features, such as the degree distribution or the clustering coefficients, and of existing graph reconstruction methods for tackling the full estimation problem. Our experimental results suggest that incorporating any topological feature as a source of information actually hinders reconstruction accuracy. We provide a theoretical analysis of this phenomenon and suggest several avenues for improving this estimation problem.

  15. AIR Tools - A MATLAB Package of Algebraic Iterative Reconstruction Techniques

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Saxild-Hansen, Maria

    This collection of MATLAB software contains implementations of several Algebraic Iterative Reconstruction methods for discretizations of inverse problems. These so-called row action methods rely on semi-convergence for achieving the necessary regularization of the problem. Two classes of methods are implemented: Algebraic Reconstruction Techniques (ART) and Simultaneous Iterative Reconstruction Techniques (SIRT). In addition we provide a few simplified test problems from medical and seismic tomography. For each iterative method, a number of strategies are available for choosing the relaxation parameter...

  16. Haplotyping Problem, A Clustering Approach

    International Nuclear Information System (INIS)

    Eslahchi, Changiz; Sadeghi, Mehdi; Pezeshk, Hamid; Kargar, Mehdi; Poormohammadi, Hadi

    2007-01-01

    Construction of two haplotypes from a set of Single Nucleotide Polymorphism (SNP) fragments is called the haplotype reconstruction problem. One of the most popular computational models for this problem is Minimum Error Correction (MEC). Since MEC is an NP-hard problem, we propose a novel heuristic algorithm based on clustering analysis in data mining for the haplotype reconstruction problem. Based on the hamming distance and similarity between two fragments, our iterative algorithm produces two clusters of fragments; in each iteration, the algorithm assigns a fragment to one of the clusters. Our results suggest that the algorithm has a lower reconstruction error rate in comparison with other algorithms
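    The clustering idea described above can be sketched concretely. The fragments below are hypothetical SNP fragments (characters '0'/'1', with '-' for uncovered sites), and the seeding and update rules are one plausible reading of the described heuristic, not the authors' exact algorithm:

```python
def hamming(f, g):
    # Distance over positions covered by both fragments ('-' = not covered).
    return sum(1 for a, b in zip(f, g) if a != '-' and b != '-' and a != b)

def consensus(cluster, length):
    # Majority vote per SNP site over the fragments assigned to a cluster.
    hap = []
    for i in range(length):
        col = [f[i] for f in cluster if f[i] != '-']
        hap.append(max(set(col), key=col.count) if col else '-')
    return ''.join(hap)

def two_cluster_haplotypes(fragments, n_rounds=5):
    """Greedy two-way clustering in the spirit of the MEC heuristic: seed the
    clusters with the two most distant fragments, then repeatedly assign each
    fragment to the nearer consensus haplotype."""
    length = len(fragments[0])
    h1, h2 = max(((f, g) for f in fragments for g in fragments),
                 key=lambda p: hamming(*p))      # most mutually distant pair
    for _ in range(n_rounds):
        c1 = [f for f in fragments if hamming(f, h1) <= hamming(f, h2)]
        c2 = [f for f in fragments if hamming(f, h1) > hamming(f, h2)]
        h1, h2 = consensus(c1, length), consensus(c2, length)
    return h1, h2

# Hypothetical fragments sampled from the two haplotypes 01011 and 10100.
frags = ['01011', '010--', '--011', '10100', '1010-', '--100']
h1, h2 = two_cluster_haplotypes(frags)
```

    With error-free fragments as above, the two consensus sequences recover the underlying haplotype pair; MEC proper additionally minimizes the number of fragment entries that must be corrected.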

  17. A neural network image reconstruction technique for electrical impedance tomography

    International Nuclear Information System (INIS)

    Adler, A.; Guardo, R.

    1994-01-01

    Reconstruction of Images in Electrical Impedance Tomography requires the solution of a nonlinear inverse problem on noisy data. This problem is typically ill-conditioned and requires either simplifying assumptions or regularization based on a priori knowledge. This paper presents a reconstruction algorithm using neural network techniques which calculates a linear approximation of the inverse problem directly from finite element simulations of the forward problem. This inverse is adapted to the geometry of the medium and the signal-to-noise ratio (SNR) used during network training. Results show good conductivity reconstruction where measurement SNR is similar to the training conditions. The advantages of this method are its conceptual simplicity and ease of implementation, and the ability to control the compromise between the noise performance and resolution of the image reconstruction
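    The approach described, calculating a linear approximation of the inverse directly from simulations of the forward problem at a chosen training SNR, can be sketched with ridge regression standing in for the network training. The forward matrix below is a random stand-in, not a finite element model, and all sizes are arbitrary demo choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n_meas, n_pix = 32, 16

# Stand-in linear forward operator (in EIT this would come from finite
# element simulations of the forward problem).
F = rng.standard_normal((n_meas, n_pix))

# Training set: random conductivity images with noisy simulated measurements,
# so the training SNR is baked into the fitted inverse.
X = rng.standard_normal((2000, n_pix))                    # images, one per row
Y = X @ F.T + 0.05 * rng.standard_normal((2000, n_meas))  # noisy measurements

# Fit a linear reconstruction map W (measurements -> image) by ridge
# regression; this is the "linear approximation of the inverse" learned
# from forward-model data.
lam = 1e-2
W = np.linalg.solve(Y.T @ Y + lam * np.eye(n_meas), Y.T @ X)

# Reconstruct a new image from its noisy measurements.
x_new = rng.standard_normal(n_pix)
y_new = F @ x_new + 0.05 * rng.standard_normal(n_meas)
x_rec = y_new @ W
```

    Note that the noise injected during training acts as the regularizer, which mirrors the abstract's point that the learned inverse is adapted to the SNR used during network training.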

  18. Adaptive multiresolution method for MAP reconstruction in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Acar, Erman, E-mail: erman.acar@tut.fi [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland); Peltonen, Sari; Ruotsalainen, Ulla [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland)

    2016-11-15

    3D image reconstruction in electron tomography is hampered by the severely limited range of projection angles and the low signal-to-noise ratio of the acquired projection images. Maximum a posteriori (MAP) reconstruction methods have been successful in compensating for the missing information and suppressing noise through their intrinsic regularization techniques. There are two major problems in MAP reconstruction methods: (1) selection of the regularization parameter that controls the balance between data fidelity and the prior information, and (2) long computation time. One aim of this study is to provide an adaptive solution to the regularization parameter selection problem without requiring additional knowledge about the imaging environment and the sample. The other aim is to carry out the reconstruction over a sequence of resolution levels to shorten the computation time. The reconstructions were analyzed in terms of accuracy and computational efficiency using a simulated biological phantom and publicly available experimental datasets of electron tomography. The numerical and visual evaluations of the experiments show that the adaptive multiresolution method can provide more accurate results than weighted back projection (WBP), the simultaneous iterative reconstruction technique (SIRT), and the sequential MAP expectation maximization (sMAPEM) method. The method is superior to sMAPEM also in terms of computation time and usability, since it can reconstruct 3D images significantly faster without requiring any parameter to be set by the user. - Highlights: • An adaptive multiresolution reconstruction method is introduced for electron tomography. • The method provides more accurate results than conventional reconstruction methods. • The missing wedge and noise problems can be compensated for efficiently by the method.

  19. Low dose reconstruction algorithm for differential phase contrast imaging.

    Science.gov (United States)

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Stampanoni, Marco

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refractive index, rather than the attenuation coefficient, in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT which benefits from the new compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing reconstruction problem of DPCI can be transformed into an already-solved problem in transmission CT imaging. Our algorithm has the potential to reconstruct the refractive index distribution of the sample from highly undersampled projection data, and can thus significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.

  20. Reconstruction of driving forces through recurrence plots

    International Nuclear Information System (INIS)

    Tanio, Masaaki; Hirata, Yoshito; Suzuki, Hideyuki

    2009-01-01

    We consider the problem of reconstructing one-dimensional driving forces only from the observations of driven systems. We extend the approach presented in a seminal paper [M.C. Casdagli, Physica D 108 (1997) 12] and propose a method that is robust and has wider applicability. By reinterpreting the work of Thiel et al. [M. Thiel, M.C. Romano, J. Kurths, Phys. Lett. A 330 (2004) 343], we formulate the reconstruction problem as a combinatorial optimization problem and relax conditions by assuming that a driving force is continuous. The method is demonstrated by using a tent map driven by an external force.

  1. 3-D Reconstruction From Satellite Images

    DEFF Research Database (Denmark)

    Denver, Troelz

    1999-01-01

    ...of planetary surfaces, but other purposes are considered as well. The system performance is measured with respect to precision and time consumption. The reconstruction process is divided into four major areas: acquisition, calibration, matching/reconstruction and presentation. Each of these areas is treated individually. A detailed treatment of various lens distortions is required in order to correct for these problems; this subject is included in the acquisition part. In the calibration part, the perspective distortion is removed from the images. Most attention has been paid to the matching problem...

  2. Environmental dose reconstruction: Approaches to an inexact science

    International Nuclear Information System (INIS)

    Hoffman, F.O.

    1991-01-01

    The endpoints of environmental dose reconstruction are quantitative, yet the science is inexact. Four problems related to this issue are described: (1) defining the scope of the assessment and setting logical priorities for detailed investigations, (2) recognizing the influence of investigator judgment on the results, (3) selecting an endpoint other than dose for the assessment of multiple contaminants, and (4) resolving the conflict between credibility and expertise in selecting individuals responsible for dose reconstruction. Approaches are recommended for dealing with each of these problems

  3. Clinical applications of iterative reconstruction

    International Nuclear Information System (INIS)

    Eberl, S.

    1998-01-01

    Expectation maximisation (EM) reconstruction largely eliminates the hot and cold streaking artifacts characteristic of filtered back projection (FBP) reconstruction around localised hot areas, such as the bladder. It also substantially reduces the problem of decreased inferior wall counts in MIBI myocardial perfusion studies caused by 'streaking' from high liver uptake. Non-uniform attenuation and scatter correction, resolution recovery, anatomical information (e.g. from MRI or CT), and tracer kinetic modelling can all be built into the EM reconstruction imaging model. The properties of ordered subset EM (OSEM) have also been used to correct for known patient motion as part of the reconstruction process. These uses of EM are elaborated more fully in some of the other abstracts of this meeting. Currently we use OSEM routinely for: (i) studies where streaking is a problem, including all MIBI myocardial perfusion studies, to avoid the hot-liver inferior wall artifact; (ii) all whole body FDG PET, all lung V/Q SPECT (which have a short acquisition time) and all gated 201Tl myocardial perfusion studies, due to the improved noise characteristics of OSEM in these studies; and (iii) studies with measured, non-uniform attenuation correction. With the accelerated OSEM algorithm, iterative reconstruction is practical for routine clinical applications. We have found OSEM to provide clearly superior reconstructions for the areas listed above and are investigating its application to other studies. In clinical use, we have not found OSEM to introduce artifacts which would not also occur with FBP; e.g. uncorrected patient motion will cause artifacts with both OSEM and FBP.
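    For emission data, the EM reconstruction referred to above is usually the MLEM multiplicative update (OSEM applies the same update over ordered subsets of the projections to accelerate it). A minimal sketch with a small random stand-in for the system matrix, not a real projector geometry:

```python
import numpy as np

def mlem(A, counts, n_iter=500):
    """One classic MLEM update per iteration:
    x <- x * (A^T (counts / (A x))) / (A^T 1)."""
    x = np.ones(A.shape[1])              # uniform nonnegative starting image
    sensitivity = A.sum(axis=0)          # A^T 1, the sensitivity image
    for _ in range(n_iter):
        projection = A @ x               # forward project the current estimate
        ratio = counts / np.maximum(projection, 1e-12)
        x *= (A.T @ ratio) / sensitivity
    return x

# Toy nonnegative "system matrix" and idealized noise-free expected counts.
rng = np.random.default_rng(2)
A = rng.uniform(0.0, 1.0, size=(30, 10))
x_true = rng.uniform(0.5, 2.0, size=10)
counts = A @ x_true
x_hat = mlem(A, counts)
```

    The multiplicative form keeps the estimate nonnegative throughout, which is one reason EM-type methods behave so much better than FBP around hot regions.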

  4. Early anterior cruciate ligament reconstruction can save meniscus without any complications

    Directory of Open Access Journals (Sweden)

    Chang-Ik Hur

    2017-01-01

    Conclusions: Early ACL reconstruction had excellent clinical results and stability as good as delayed reconstruction, without problems of knee motion, muscle power, or postural control. Moreover, early reconstruction showed a high possibility of meniscal repair. Therefore, early ACL reconstruction should be recommended.

  5. The problem of the architectural heritage reconstruction

    Directory of Open Access Journals (Sweden)

    Alfazhr M.A.

    2017-02-01

    Full Text Available The subject of this research is modern technology for the restoration of architectural monuments, which makes it possible to improve the design and performance, as well as the durability, of historical objects. Choosing the most efficient, cost-effective and durable technologies for the recovery and expansion of architectural monuments is a priority for historical cities. Adopting faster and sounder monument restoration technologies is necessary because many historical Russian cities are in need of repair and reconstruction. It is therefore essential that new methods and technologies for improving renovation works be found, building on western construction experience.

  6. Graph-cut based discrete-valued image reconstruction.

    Science.gov (United States)

    Tuysuzoglu, Ahmet; Karl, W Clem; Stojanovic, Ivana; Castañón, David; Ünlü, M Selim

    2015-05-01

    Efficient graph-cut methods have been used with great success for labeling and denoising problems occurring in computer vision. Unfortunately, the presence of linear image mappings has prevented the use of these techniques in most discrete-amplitude image reconstruction problems. In this paper, we develop a graph-cut based framework for the direct solution of discrete amplitude linear image reconstruction problems cast as regularized energy function minimizations. We first analyze the structure of discrete linear inverse problem cost functions to show that the obstacle to the application of graph-cut methods to their solution is the variable mixing caused by the presence of the linear sensing operator. We then propose to use a surrogate energy functional that overcomes the challenges imposed by the sensing operator yet can be utilized efficiently in existing graph-cut frameworks. We use this surrogate energy functional to devise a monotonic iterative algorithm for the solution of discrete valued inverse problems. We first provide experiments using local convolutional operators and show the robustness of the proposed technique to noise and stability to changes in regularization parameter. Then we focus on nonlocal, tomographic examples where we consider limited-angle data problems. We compare our technique with state-of-the-art discrete and continuous image reconstruction techniques. Experiments show that the proposed method outperforms state-of-the-art techniques in challenging scenarios involving discrete valued unknowns.

  7. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    Science.gov (United States)

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate the conditioning-related properties of our originally formulated statistical model-based iterative approach to the image reconstruction from projections problem, and in this manner to demonstrate the superiority of this approach over those recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of the measured signals obtained from an X-ray computed tomography system with parallel beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology, which is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Maxillary reconstruction

    Directory of Open Access Journals (Sweden)

    Brown James

    2007-12-01

    Full Text Available This article aims to discuss the various defects that occur with maxillectomy, with a full review of the literature and discussion of the advantages and disadvantages of the various techniques described. Reconstruction of the maxilla can be relatively simple for the standard low maxillectomy that does not involve the orbital floor (Class 2). In this situation the structure of the face is less damaged and there are multiple reconstructive options for the restoration of the maxilla and dental alveolus. If the maxillectomy includes the orbit (Class 4), then problems involving the eye (enophthalmos, orbital dystopia, ectropion and diplopia) are avoided, which simplifies the reconstruction. Most controversy is associated with the maxillectomy that involves the orbital floor and dental alveolus (Class 3). A case is made for the use of the iliac crest with internal oblique as an ideal option, but there are other methods which may provide a similar result. A multidisciplinary approach to these patients is emphasised, which should include a prosthodontist with special expertise in these defects.

  9. Challenges in the reconstruction of bilateral maxillectomy defects.

    Science.gov (United States)

    Joseph, Shawn T; Thankappan, Krishnakumar; Buggaveeti, Rahul; Sharma, Mohit; Mathew, Jimmy; Iyer, Subramania

    2015-02-01

    Bilateral maxillectomy defects, if not adequately reconstructed, can result in grave esthetic and functional problems. The purpose of this study was to investigate the outcome of reconstruction of such defects. This is a retrospective case series. The defects were analyzed for their components and the flaps used for reconstruction. Outcomes for flap loss and functional indices, including oral diet, speech, and dental rehabilitation, also were evaluated. Ten consecutive patients who underwent bilateral maxillectomy reconstruction received 14 flaps. Six patients had malignancies of the maxilla, and 4 patients had nonmalignant indications. Ten bony free flaps were used. Four soft tissue flaps were used. The fibula free flap was the most common flap used. Three patients had total flap loss. Seven patients were alive and available for functional evaluation. Of these, 4 were taking an oral diet with altered consistency and 2 were on a regular diet. Speech was intelligible in all patients. Only 2 patients opted for dental rehabilitation with removable dentures. Reconstruction after bilateral maxillectomy is essential to prevent esthetic and functional problems. Bony reconstruction is ideal. The fibula bone free flap is commonly used. The complexity of the defect makes reconstruction difficult and the initial success rate of free flaps is low. Secondary reconstructions after the initial flap failures were successful. A satisfactory functional outcome can be achieved. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Accelerated gradient methods for total-variation-based CT image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Joergensen, Jakob H.; Hansen, Per Christian [Technical Univ. of Denmark, Lyngby (Denmark). Dept. of Informatics and Mathematical Modeling; Jensen, Tobias L.; Jensen, Soeren H. [Aalborg Univ. (Denmark). Dept. of Electronic Systems; Sidky, Emil Y.; Pan, Xiaochuan [Chicago Univ., Chicago, IL (United States). Dept. of Radiology

    2011-07-01

    Total-variation (TV)-based CT image reconstruction has been shown experimentally to be capable of producing accurate reconstructions from sparse-view data. In particular, TV-based reconstruction is well suited for images with piecewise nearly constant regions. Computationally, however, TV-based reconstruction is demanding, especially for 3D imaging, and reconstruction from clinical data sets is far from real-time. This is undesirable from a clinical perspective, and thus there is an incentive to accelerate the solution of the underlying optimization problem. The TV reconstruction can in principle be found by any optimization method, but in practice the large scale of the systems arising in CT image reconstruction precludes the use of memory-intensive methods such as Newton's method. The simple gradient method has much lower memory requirements, but exhibits prohibitively slow convergence. In the present work we address the question of how to reduce the number of gradient-method iterations needed to achieve a high-accuracy TV reconstruction. We consider the use of two accelerated gradient-based methods, GPBB and UPN, to solve the 3D TV minimization problem in CT image reconstruction. The former incorporates several heuristics from the optimization literature, such as Barzilai-Borwein (BB) step-size selection and nonmonotone line search. The latter uses a cleverly chosen sequence of auxiliary points to achieve a better convergence rate. The methods are memory efficient and equipped with a stopping criterion to ensure that the TV reconstruction has indeed been found. An implementation of the methods (in C with interface to Matlab) is available for download from http://www2.imm.dtu.dk/~pch/TVReg/. We compare the proposed methods with the standard gradient method, applied to a 3D test problem with synthetic few-view data. We find experimentally that for realistic parameters the proposed methods significantly outperform the standard gradient method. (orig.)
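    The BB step-size idea used in GPBB can be sketched on a small 1-D analogue of the TV problem: least squares plus a smoothed TV penalty, with the BB1 step size and a crude clipping safeguard standing in for the paper's nonmonotone line search. The problem sizes and the smoothing constant are arbitrary demo choices, and this is not the authors' GPBB or UPN code:

```python
import numpy as np

def obj(x, A, b, lam, eps):
    """0.5*||Ax-b||^2 + lam * smoothed TV, with |d| smoothed to sqrt(d^2+eps)."""
    d = np.diff(x)
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.sqrt(d * d + eps))

def grad(x, A, b, lam, eps):
    d = np.diff(x)
    w = d / np.sqrt(d * d + eps)        # derivative of the smoothed |.|
    tv_grad = np.zeros_like(x)
    tv_grad[1:] += w                    # contribution of d_{j-1} = x_j - x_{j-1}
    tv_grad[:-1] -= w                   # contribution of d_j = x_{j+1} - x_j
    return A.T @ (A @ x - b) + lam * tv_grad

def bb_gradient(A, b, lam=0.5, eps=1e-2, n_iter=300):
    x = np.zeros(A.shape[1])
    g = grad(x, A, b, lam, eps)
    step = 1.0 / np.linalg.norm(A, 2) ** 2                  # safe first step
    for _ in range(n_iter):
        x_new = x - step * g
        g_new = grad(x_new, A, b, lam, eps)
        s, y = x_new - x, g_new - g
        step = np.clip((s @ s) / max(s @ y, 1e-12), 1e-6, 1.0)  # BB1, safeguarded
        x, g = x_new, g_new
    return x

# Piecewise-constant signal observed through a random overdetermined system.
rng = np.random.default_rng(3)
x_true = np.concatenate([np.zeros(20), np.ones(20), np.zeros(20)])
A = rng.standard_normal((80, 60))
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = bb_gradient(A, b)
```

    The BB step adapts to local curvature from successive iterates and gradients, which is what lets the method far outpace a fixed-step gradient descent on ill-conditioned TV problems.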

  11. Versatility of the CFR algorithm for limited angle reconstruction

    International Nuclear Information System (INIS)

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V.

    1990-01-01

    The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, or radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm, but the difference is not significant
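    The idea behind constrained Fourier completion, fill the unmeasured Fourier region by alternately enforcing the measured samples and a known spatial constraint, can be illustrated with a Gerchberg-Papoulis-style iteration in 1-D. This illustrates the general technique only, not the specific CFR algorithm evaluated in the record:

```python
import numpy as np

def constrained_fourier_restore(F_known, known_mask, support_mask, n_iter=200):
    """Alternately enforce the measured Fourier samples and the known spatial
    support of the object (Gerchberg-Papoulis-style alternating projections)."""
    F = np.where(known_mask, F_known, 0.0)
    for _ in range(n_iter):
        x = np.fft.ifft(F)
        x = np.where(support_mask, x, 0.0)     # spatial support constraint
        F = np.fft.fft(x)
        F = np.where(known_mask, F_known, F)   # data-consistency constraint
    return np.real(np.fft.ifft(F))

# Object supported on 16 of 64 samples; ~60% of Fourier coefficients measured.
rng = np.random.default_rng(4)
n = 64
support = np.zeros(n, dtype=bool)
support[10:26] = True
x_true = np.zeros(n)
x_true[10:26] = rng.standard_normal(16)
F_true = np.fft.fft(x_true)
known = rng.random(n) < 0.6
x_rec = constrained_fourier_restore(F_true, known, support)
x_zf = np.real(np.fft.ifft(np.where(known, F_true, 0.0)))  # zero-filled baseline
```

    Both constraint sets are convex and contain the true object, so each iteration cannot move the estimate farther from it; with enough measured coefficients the iterates converge to the true signal, substantially beating the zero-filled reconstruction.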

  12. l1- and l2-Norm Joint Regularization Based Sparse Signal Reconstruction Scheme

    Directory of Open Access Journals (Sweden)

    Chanzi Liu

    2016-01-01

    Full Text Available Many problems in signal processing and statistical inference involve finding a sparse solution to an underdetermined linear system of equations. This is also the application condition of compressive sensing (CS), which can find the sparse solution from measurements far fewer than the length of the original signal. In this paper, we propose an l1- and l2-norm joint regularization based reconstruction framework to approach the original l0-norm based sparseness-inducing constrained sparse signal reconstruction problem. First, it is shown that, by employing the simple conjugate gradient algorithm, the new formulation provides an effective framework to deduce the solution of the original sparse signal reconstruction problem with the l0-norm regularization term. Second, an upper reconstruction error limit is presented for the proposed framework, and it is shown that a smaller reconstruction error than that of l1-norm relaxation approaches can be realized by using the proposed scheme in most cases. Finally, simulation results are presented to validate the proposed sparse signal reconstruction approach.

  13. Performance bounds for sparse signal reconstruction with multiple side information [arXiv

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Seiler, Jurgen; Kaup, Andre

    2016-01-01

    In the context of compressive sensing (CS), this paper considers the problem of reconstructing sparse signals with the aid of other given correlated sources as multiple side information (SI). To address this problem, we propose a reconstruction algorithm with multiple SI (RAMSI) that solves...

  14. Mathematical Problems in Synthetic Aperture Radar

    Science.gov (United States)

    Klein, Jens

    2010-10-01

    This thesis is concerned with problems related to Synthetic Aperture Radar (SAR). The thesis is structured as follows: the first chapter explains what SAR is and illuminates the physical and mathematical background. The following chapter points out a problem with a divergent integral in a common approach and proposes an improvement; numerical comparisons indicate that the improvement allows for superior image quality. Thereafter the problem of limited data is analyzed: in a realistic SAR measurement, the data gathered from the electromagnetic waves reflected from the surface can only be collected over a limited area, whereas the reconstruction formula requires data from an infinite distance. The chapter gives an analysis of the artifacts which can obscure the reconstructed images due to this problem, together with numerical examples that illustrate its severity. In chapter 4, the fact that data are available only from a limited area is used to propose a new inversion formula. This inversion formula has the potential to make it easier to suppress artifacts due to limited data and, depending on the application, can be refined to a fast reconstruction formula. In the penultimate chapter a solution to the problem of left-right ambiguity is presented. This problem has existed since the invention of SAR and is caused by the geometry of the measurements, which implies that only symmetric images can be obtained. With the solution from this chapter it is possible to reconstruct not only the even part of the reflectivity function but also the odd part, thus making it possible to reconstruct asymmetric images. Numerical simulations demonstrate that this solution is not affected by the stability problems of other approaches. The final chapter develops some further ideas that could be pursued in the future.

  15. A clinical perspective of accelerated statistical reconstruction

    International Nuclear Information System (INIS)

    Hutton, B.F.; Hudson, H.M.; Beekman, F.J.

    1997-01-01

    Although the potential benefits of maximum likelihood reconstruction have been recognised for many years, the technique has only recently found widespread popularity in clinical practice. Factors which have contributed to the wider acceptance include improved models for the emission process, better understanding of the properties of the algorithm and, not least, the practicality of application with the development of acceleration schemes and the improved speed of computers. The objective in this article is to present a framework for applying maximum likelihood reconstruction for a wide range of clinically based problems. The article draws particularly on the experience of the three authors in applying an acceleration scheme involving use of ordered subsets to a range of applications. The potential advantages of statistical reconstruction techniques include: (a) the ability to better model the emission and detection process, in order to make the reconstruction converge to a quantitative image, (b) the inclusion of a statistical noise model which results in better noise characteristics, and (c) the possibility to incorporate prior knowledge about the distribution being imaged. The great flexibility in adapting the reconstruction for a specific model results in these techniques having wide applicability to problems in clinical nuclear medicine. (orig.). With 8 figs., 1 tab
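The ordered-subsets acceleration discussed here can be illustrated with a toy maximum-likelihood system. The sketch below (with an illustrative matrix, subset split, and helper name of our own choosing, not the authors' implementation) applies the multiplicative EM update once per subset of projection rows:

```python
import numpy as np

def osem(A, y, n_subsets=2, n_iter=50):
    """Ordered-subsets EM for emission data y ~ A @ x with x >= 0."""
    m, n = A.shape
    x = np.ones(n)                                  # flat initial estimate
    subsets = np.array_split(np.arange(m), n_subsets)
    for _ in range(n_iter):
        for rows in subsets:                        # one multiplicative update per subset
            As = A[rows]
            proj = As @ x
            proj = np.where(proj > 0, proj, 1e-12)  # guard against division by zero
            x = x * (As.T @ (y[rows] / proj)) / As.sum(axis=0)
    return x

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(12, 6))   # toy nonnegative system matrix
x_true = rng.uniform(0.5, 2.0, size=6)
y = A @ x_true                            # noiseless projections for illustration
x_hat = osem(A, y)
```

With noisy clinical data, the subset count trades convergence speed against noise amplification, which is part of what the article surveys.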

  16. Reconstruction of phylogenetic trees of prokaryotes using maximal common intervals.

    Science.gov (United States)

    Heydari, Mahdi; Marashi, Sayed-Amir; Tusserkani, Ruzbeh; Sadeghi, Mehdi

    2014-10-01

One of the fundamental problems in bioinformatics is phylogenetic tree reconstruction, which can be used for classifying living organisms into different taxonomic clades. The classical approach to this problem is based on a marker such as 16S ribosomal RNA. Since evolutionary events like genomic rearrangements are not captured by phylogenetic trees reconstructed from single genes, much effort has been made in recent years to find other characteristics for phylogenetic reconstruction. With the increasing availability of completely sequenced genomes, gene order can be considered as a new solution for this problem. In the present work, we applied maximal common intervals (MCIs) in two or more genomes to infer their distance and to reconstruct their evolutionary relationship. Additionally, measures based on uncommon segments (UCSs), i.e., those genomic segments which are not detected as part of any of the MCIs, are also used for phylogenetic tree reconstruction. We applied these two types of measures to reconstruct the phylogenetic tree of 63 prokaryotes with known COG (clusters of orthologous groups) families. Similarity between the MCI-based (resp. UCS-based) reconstructed phylogenetic trees and the phylogenetic tree obtained from the NCBI taxonomy browser is as high as 93.1% (resp. 94.9%). We show that in the case of this diverse dataset of prokaryotes, tree reconstruction based on MCI and UCS outperforms most of the currently available methods based on gene order, including breakpoint distance and DCJ. We additionally tested our new measures on a dataset of 13 closely related bacteria from the genus Prochlorococcus. In this case, distances like rearrangement distance, breakpoint distance and DCJ proved to be useful, while our new measures remain appropriate for phylogenetic reconstruction. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
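The common-interval idea underlying the MCI measure can be made concrete with a naive O(n^3) sketch (an illustration, not the authors' algorithm or their distance). Treating two genomes as permutations of the same gene set, the function reports the intervals of one whose gene content is contiguous in the other; the maximal common intervals are the inclusion-maximal among these:

```python
def common_intervals(p, q, min_len=2):
    """Intervals of p (as gene sets) whose gene content is contiguous in q."""
    pos_q = {g: i for i, g in enumerate(q)}
    found = set()
    n = len(p)
    for i in range(n):
        for j in range(i + min_len - 1, n):
            genes = p[i:j + 1]
            qpos = [pos_q[g] for g in genes]
            if max(qpos) - min(qpos) == len(genes) - 1:   # contiguous block in q
                found.add(frozenset(genes))
    return found

p = [1, 2, 3, 4, 5, 6]
q = [3, 1, 2, 6, 5, 4]
ci = common_intervals(p, q)
```

Here {1,2,3} and {4,5,6} are common intervals (they appear contiguously, in some order, in both genomes), while {2,3,4} is not; efficient algorithms find all of them in near-linear time.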

  17. Sparse BLIP: BLind Iterative Parallel imaging reconstruction using compressed sensing.

    Science.gov (United States)

    She, Huajun; Chen, Rong-Rong; Liang, Dong; DiBella, Edward V R; Ying, Leslie

    2014-02-01

To develop a sensitivity-based parallel imaging reconstruction method that iteratively reconstructs both the coil sensitivities and the MR image simultaneously based on their prior information. The parallel magnetic resonance imaging reconstruction problem can be formulated as a multichannel sampling problem for which solutions are sought analytically. However, the channel functions given by the coil sensitivities in parallel imaging are not known exactly, and the estimation error usually leads to artifacts. In this study, we propose a new reconstruction algorithm, termed Sparse BLIP, for BLind Iterative Parallel imaging reconstruction using compressed sensing. The proposed algorithm reconstructs both the sensitivity functions and the image simultaneously from undersampled data. It enforces the sparseness constraint on the image as done in compressed sensing, but differs from compressed sensing in that the sensing matrix is unknown and an additional constraint is enforced on the sensitivities as well. Both phantom and in vivo imaging experiments were carried out with retrospective undersampling to evaluate the performance of the proposed method. Experiments show improvement of Sparse BLIP reconstruction over Sparse SENSE, JSENSE, IRGN-TV, and L1-SPIRiT reconstructions with the same number of measurements. The proposed Sparse BLIP algorithm reduces the reconstruction errors when compared to the state-of-the-art parallel imaging methods. Copyright © 2013 Wiley Periodicals, Inc.

  18. Class of reconstructed discontinuous Galerkin methods in computational fluid dynamics

    International Nuclear Information System (INIS)

    Luo, Hong; Xia, Yidong; Nourgaliev, Robert

    2011-01-01

A class of reconstructed discontinuous Galerkin (DG) methods is presented to solve compressible flow problems on arbitrary grids. The idea is to combine the efficiency of the reconstruction methods used in finite volume methods and the accuracy of the DG methods to obtain a better numerical algorithm in computational fluid dynamics. The beauty of the resulting reconstructed discontinuous Galerkin (RDG) methods is that they provide a unified formulation for both finite volume and DG methods, and contain both classical finite volume and standard DG methods as two special cases, thus allowing for a direct efficiency comparison. Both Green-Gauss and least-squares reconstruction methods and a least-squares recovery method are presented to obtain a quadratic polynomial representation of the underlying linear discontinuous Galerkin solution on each cell via a so-called in-cell reconstruction process. The devised in-cell reconstruction aims to augment the accuracy of the discontinuous Galerkin method by increasing the order of the underlying polynomial solution. These three reconstructed discontinuous Galerkin methods are used to compute a variety of compressible flow problems on arbitrary meshes to assess their accuracy. The numerical experiments demonstrate that all three reconstructed discontinuous Galerkin methods can significantly improve the accuracy of the underlying second-order DG method, with the least-squares reconstructed DG method providing the best performance in terms of accuracy, efficiency, and robustness. (author)

  19. Preconditioned alternating projection algorithms for maximum a posteriori ECT reconstruction

    International Nuclear Information System (INIS)

    Krol, Andrzej; Li, Si; Shen, Lixin; Xu, Yuesheng

    2012-01-01

We propose a preconditioned alternating projection algorithm (PAPA) for solving the maximum a posteriori (MAP) emission computed tomography (ECT) reconstruction problem. Specifically, we formulate the reconstruction problem as a constrained convex optimization problem with total variation (TV) regularization. We then characterize the solution of the constrained convex optimization problem and show that it satisfies a system of fixed-point equations defined in terms of two proximity operators arising from the convex functions that define the TV-norm and the constraint involved in the problem. This characterization of the solution via proximity operators, which define two projection operators, naturally leads to an alternating projection algorithm for finding the solution. For efficient numerical computation, we introduce into the alternating projection algorithm a preconditioning matrix (the EM-preconditioner) for the dense system matrix involved in the optimization problem. We theoretically prove the convergence of PAPA. In numerical experiments, the performance of our algorithm, with an appropriately selected preconditioning matrix, is compared with that of the conventional MAP expectation-maximization (MAP-EM) algorithm with TV regularizer (EM-TV) and that of the recently developed nested EM-TV algorithm for ECT reconstruction. Based on the numerical experiments performed in this work, we observe that the alternating projection algorithm with the EM-preconditioner significantly outperforms EM-TV in all aspects, including convergence speed, noise in the reconstructed images and image quality. It also outperforms the nested EM-TV in convergence speed while providing comparable image quality. (paper)
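The fixed-point characterization rests on proximity operators. As a minimal illustration (using the ℓ1 norm rather than the paper's TV-norm, whose prox has no simple closed form), the prox of λ‖·‖₁ is coordinate-wise soft thresholding, and one can check numerically that it minimizes the defining objective:

```python
import numpy as np

def prox_l1(x, lam):
    """Proximity operator of lam*||.||_1: argmin_u 0.5*||u - x||^2 + lam*||u||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = np.array([3.0, -0.4, 1.2])
lam = 1.0
u = prox_l1(x, lam)

# Brute-force check: no point on a coarse grid beats u in the prox objective
obj = lambda v: 0.5 * np.sum((v - x) ** 2) + lam * np.sum(np.abs(v))
grid = np.linspace(-4.0, 4.0, 41)
best = min(obj(np.array([a, b, c])) for a in grid for b in grid for c in grid)
```

PAPA alternates applications of two such operators (suitably preconditioned); the TV prox itself is evaluated by an inner iterative scheme.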

  20. An Intelligent Actuator Fault Reconstruction Scheme for Robotic Manipulators.

    Science.gov (United States)

    Xiao, Bing; Yin, Shen

    2018-02-01

This paper investigates the difficult problem of reconstructing actuator faults for robotic manipulators. An intelligent observer-based approach with a fast reconstruction property is developed. The scheme is capable of precisely reconstructing the actual actuator fault, and it is shown by Lyapunov stability analysis that the reconstruction error converges to zero in finite time, providing reconstruction that is both precise and fast. The most important feature of the scheme is that it does not depend on the control law, the dynamic model of the actuator, the type of the faults, or their time-profile. This reconstruction performance and the capability of the proposed approach are further validated by simulation and experimental results.

  1. Pan-sharpening via compressed superresolution reconstruction and multidictionary learning

    Science.gov (United States)

    Shi, Cheng; Liu, Fang; Li, Lingling; Jiao, Licheng; Hao, Hongxia; Shang, Ronghua; Li, Yangyang

    2018-01-01

In recent compressed sensing (CS)-based pan-sharpening algorithms, pan-sharpening performance is affected by two key problems. One is that there are always errors between the high-resolution panchromatic (HRP) image and the linearly weighted high-resolution multispectral (HRM) image, resulting in loss of spatial and spectral information. The other is that the dictionary construction process depends on training samples that are not ground truth. These problems limit the applicability of CS-based pan-sharpening algorithms. To solve these two problems, we propose a pan-sharpening algorithm via compressed superresolution reconstruction and multidictionary learning. Through a two-stage implementation, the compressed superresolution reconstruction model effectively reduces the error between the HRP and the linearly weighted HRM images. Meanwhile, a multidictionary based on ridgelets and curvelets is learned for both stages of the superresolution reconstruction process. Since ridgelets and curvelets better capture structural and directional characteristics, a better reconstruction result can be obtained. Experiments are performed on QuickBird and IKONOS satellite images. The results indicate that the proposed algorithm is competitive with recent CS-based pan-sharpening methods and other well-known methods.

  2. Blind compressed sensing image reconstruction based on alternating direction method

    Science.gov (United States)

    Liu, Qinan; Guo, Shuxu

    2018-04-01

To reconstruct the original image when the sparse basis is unknown, this paper proposes an image reconstruction method based on a blind compressed sensing model. In this model, the image signal is regarded as the product of a sparse coefficient matrix and a dictionary matrix. Based on existing blind compressed sensing theory, the optimal solution is obtained by an alternating minimization method. The proposed method addresses the difficulty of representing the sparse basis in compressed sensing, suppresses noise, and improves the quality of the reconstructed image. It ensures that the blind compressed sensing model has a unique solution and can recover the original image signal from a complex environment with strong self-adaptability. The experimental results show that the proposed blind compressed sensing image reconstruction algorithm can recover high-quality image signals under undersampling conditions.
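The alternating scheme described above can be sketched on a toy factorization: with measurements Y modeled as a dictionary D times sparse coefficients X, one alternates a sparse-coding step (here a least-squares fit followed by soft thresholding, an illustrative stand-in for the paper's update rules) with a dictionary update:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def blind_factorize(Y, n_atoms=5, lam=0.05, n_iter=30, seed=0):
    """Alternately estimate dictionary D and sparse codes X so that Y ~ D @ X."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        # Sparse-coding step: least-squares fit, then shrink toward sparsity
        X = soft(np.linalg.pinv(D) @ Y, lam)
        # Dictionary step: least-squares update, then renormalize atoms
        D = Y @ np.linalg.pinv(X)
        D /= np.linalg.norm(D, axis=0) + 1e-12
    X = soft(np.linalg.pinv(D) @ Y, lam)   # final refit against the last dictionary
    return D, X

rng = np.random.default_rng(1)
D_true = rng.standard_normal((10, 5))
X_true = rng.standard_normal((5, 20)) * (rng.random((5, 20)) < 0.3)
Y = D_true @ X_true
D, X = blind_factorize(Y)
res = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

In the blind CS setting, Y would itself be an undersampled measurement, which adds a sensing operator to both subproblems.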

  3. MR image reconstruction via guided filter.

    Science.gov (United States)

    Huang, Heyan; Yang, Hang; Wang, Kang

    2018-04-01

Magnetic resonance imaging (MRI) reconstruction from the smallest possible set of Fourier samples has been a difficult problem in the medical imaging field. In this paper, we present a new approach to efficient MRI recovery based on a guided filter. The guided filter is an edge-preserving smoothing operator and behaves better near edges than the bilateral filter. Our reconstruction method consists of two steps. First, we propose two cost functions which can be computed efficiently, yielding two different images. Second, the guided filter is applied for efficient edge-preserving filtering, with one of the obtained images used as the guidance image and the other as the filtering input. By introducing the guided filter, our reconstruction algorithm recovers more detail. We compare our reconstruction algorithm with some competitive MRI reconstruction techniques in terms of PSNR and visual quality. Simulation results are given to show the performance of our new method.
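The guided filter used here has a standard closed-form algorithm (He et al.): a local linear model of the input against the guide is estimated per window and the coefficients are averaged. The sketch below is a generic single-channel implementation with box filtering, not the paper's code; the edge-replication border handling and helper names are our own choices:

```python
import numpy as np

def box(img, r):
    """Mean over a (2r+1)x(2r+1) window, borders handled by edge replication."""
    h, w = img.shape
    pad = np.pad(img, r, mode='edge')
    Z = np.zeros((h + 2 * r + 1, w + 2 * r + 1))
    Z[1:, 1:] = pad.cumsum(axis=0).cumsum(axis=1)   # integral image
    k = 2 * r + 1
    s = Z[k:k + h, k:k + w] - Z[:h, k:k + w] - Z[k:k + h, :w] + Z[:h, :w]
    return s / k ** 2

def guided_filter(I, p, r=4, eps=1e-2):
    """Single-channel guided filter: guide I, filtering input p."""
    mI, mp = box(I, r), box(p, r)
    varI = box(I * I, r) - mI * mI
    covIp = box(I * p, r) - mI * mp
    a = covIp / (varI + eps)          # local linear model p ~ a*I + b
    b = mp - a * mI
    return box(a, r) * I + box(b, r)  # average coefficients, then apply

rng = np.random.default_rng(0)
img = rng.random((32, 32))
q_keep = guided_filter(img, img, r=3, eps=1e-8)    # tiny eps: detail preserved
q_smooth = guided_filter(img, img, r=3, eps=10.0)  # large eps: acts as a smoother
```

The regularizer eps controls the trade-off: as eps shrinks, the self-guided output approaches the input; as it grows, the filter approaches plain box smoothing.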

  4. SART-Type Half-Threshold Filtering Approach for CT Reconstruction.

    Science.gov (United States)

    Yu, Hengyong; Wang, Ge

    2014-01-01

The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for ℓ1/2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define sparsity. However, the DGT is noninvertible and cannot be applied directly to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering.
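The half-thresholding operator at the core of this framework has an analytic form for ℓ1/2 regularization. The sketch below reproduces the Xu et al. formula as we understand it (the threshold constant and trigonometric expression should be checked against the original paper before serious use); in the SART-type framework it would be interleaved with SART projection updates:

```python
import numpy as np

def half_threshold(x, lam):
    """Half-thresholding operator for l_{1/2} regularization (Xu et al. form):
    zero below a threshold ~ lam^(2/3), a shrunk value above it."""
    x = np.asarray(x, dtype=float)
    t = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)   # thresholding level
    out = np.zeros_like(x)
    big = np.abs(x) > t
    phi = np.arccos((lam / 8.0) * (np.abs(x[big]) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * x[big] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

vals = half_threshold(np.array([0.5, 2.0, -2.0]), lam=1.0)
```

Compared with soft thresholding, the operator shrinks large coefficients less aggressively while still zeroing small ones, which is the source of the stronger sparsity it enforces.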

  5. Increasing efficiency of reconstruction and technological development of coking enterprises

    Energy Technology Data Exchange (ETDEWEB)

    Rozenfel' d, M.S.; Martynenko, V.M.; Tytyuk, Yu.A.; Ivanov, V.V.; Svyatogorov, A.A.; Kolomiets, A.F. (NIISP, Voroshilovgrad (USSR))

    1989-07-01

    Discusses problems associated with reconstruction of coking plants in the USSR. Planning coking plant reconstruction is analyzed. Duration of individual stages of plant reconstruction is considered. A method developed by the Giprokoks research institute for calculating reconstruction time considering duration of individual stages of coke oven battery repair is analyzed: construction of storage facilities, transport of materials and equipment, safety requirements, coke oven cooling, dismantling, construction of coke oven walls, installation of machines and equipment. Advantages of using the methods for analysis of coke oven battery reconstruction and optimization of repair time are discussed.

  6. Failed medial patellofemoral ligament reconstruction: Causes and surgical strategies

    OpenAIRE

    Sanchis-Alfonso, Vicente; Montesinos-Berry, Erik; Ramirez-Fuentes, Cristina; Leal Blanquet, Joan; Gelber, Pablo-Eduardo; Monllau García, Juan Carlos

    2017-01-01

    Patellar instability is a common clinical problem encountered by orthopedic surgeons specializing in the knee. For patients with chronic lateral patellar instability, the standard surgical approach is to stabilize the patella through a medial patellofemoral ligament (MPFL) reconstruction. Foreseeably, an increasing number of revision surgeries of the reconstructed MPFL will be seen in upcoming years. In this paper, the causes of failed MPFL reconstruction are analyzed: (1) incorrect surgical ...

  7. Obtaining sparse distributions in 2D inverse problems

    OpenAIRE

    Reci, A; Sederman, Andrew John; Gladden, Lynn Faith

    2017-01-01

    The mathematics of inverse problems has relevance across numerous estimation problems in science and engineering. L1 regularization has attracted recent attention in reconstructing the system properties in the case of sparse inverse problems; i.e., when the true property sought is not adequately described by a continuous distribution, in particular in Compressed Sensing image reconstruction. In this work, we focus on the application of L1 regularization to a class of inverse problems; relaxat...

  8. ℓ0 Gradient Minimization Based Image Reconstruction for Limited-Angle Computed Tomography.

    Directory of Open Access Journals (Sweden)

    Wei Yu

In medical and industrial applications of computed tomography (CT) imaging, limited by the scanning environment and the risk of excessive X-ray radiation exposure imposed on patients, reconstructing high-quality CT images from limited projection data has become a hot topic. X-ray imaging over a limited scanning angular range is an effective imaging modality for reducing the radiation dose to patients. As the projection data available in this modality are incomplete, limited-angle CT image reconstruction is an ill-posed inverse problem. Images reconstructed by the conventional filtered back projection (FBP) algorithm frequently exhibit conspicuous streak artifacts and gradually changing artifacts near edges. Image reconstruction based on total variation minimization (TVM) can significantly reduce streak artifacts in few-view CT, but it suffers from gradually changing artifacts near edges in limited-angle CT. To suppress this kind of artifact, we develop an image reconstruction algorithm based on ℓ0 gradient minimization for limited-angle CT. The ℓ0-norm of the image gradient is taken as the regularization function in the framework of the developed reconstruction model. We transform the optimization problem into several sub-problems and solve them by alternating iteration. Numerical experiments are performed to validate the efficiency and feasibility of the developed algorithm. Statistical analysis of the performance evaluations, peak signal-to-noise ratio (PSNR) and normalized root mean square distance (NRMSD), shows that there are significant statistical differences between different algorithms for different scanning angular ranges (p<0.0001). The experimental results also indicate that the developed algorithm outperforms classical reconstruction algorithms in suppressing the streak artifacts and the gradual changed
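The alternating-iteration idea can be illustrated on a 1D analogue of ℓ0 gradient minimization, using the standard half-quadratic splitting from Xu et al.'s ℓ0 image smoothing (an illustrative sketch, not the authors' limited-angle CT algorithm, which adds a data-fidelity term over the projection operator):

```python
import numpy as np

def l0_gradient_smooth_1d(f, lam=0.02, kappa=2.0, beta_max=1e5):
    """Minimize ||s - f||^2 + lam * #{ j : s[j+1] - s[j] != 0 }
    via half-quadratic splitting with auxiliary gradients h."""
    n = len(f)
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0              # forward difference operator
    s = np.asarray(f, dtype=float).copy()
    I = np.eye(n)
    beta = 2.0 * lam
    while beta < beta_max:
        g = D @ s
        h = np.where(g * g > lam / beta, g, 0.0)   # l0 subproblem: keep or kill each jump
        # quadratic subproblem: (I + beta D^T D) s = f + beta D^T h
        s = np.linalg.solve(I + beta * (D.T @ D), f + beta * (D.T @ h))
        beta *= kappa
    return s

# Noisy step signal: the smoother should keep the single jump and flatten the rest
x = np.arange(40)
f = (x >= 20).astype(float) + 0.05 * np.sin(2.7 * x)   # deterministic "noise"
s = l0_gradient_smooth_1d(f, lam=0.02)
```

Growing beta progressively tightens the coupling between s and the hard-thresholded gradients h, which is the alternating structure the abstract refers to.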

  9. Simulation and track reconstruction for beam telescopes

    CERN Document Server

    Maqbool, Salman

    2017-01-01

Beam telescopes are used for testing new detectors under development. Sensors are placed and a particle beam is passed through them. To test these novel detectors and determine their properties, the particle tracks need to be reconstructed from the known detectors in the telescope. Based on the reconstructed track, its predicted hits on the Device under Test (DUT) are compared with the actual hits on the DUT. Several methods exist for track reconstruction, but most of them don't account for the effects of multiple scattering. General Broken Lines is one such algorithm which incorporates these effects during reconstruction. The aim of this project was to simulate the beam telescope and extend the track reconstruction framework for the FE-I4 telescope, which takes these effects into account. Section 1 introduces the problem, while section 2 focuses on beam telescopes. This is followed by the Allpix2 simulation framework in Section 3. And finally, Section 4 introduces the Proteus track reconstruction framew...

  10. Simulation and Track Reconstruction for Beam Telescopes

    CERN Document Server

    Maqbool, Salman

    2017-01-01

    Beam telescopes are an important tool to test new detectors under development in a particle beam. To test these novel detectors and determine their properties, the particle tracks need to be reconstructed from the known detectors in the telescope. Based on the reconstructed track, its predicted position on the Device under Test (DUT) are compared with the actual hits on the DUT. Several methods exist for track reconstruction, but most of them do not account for the effects of multiple scattering. General Broken Lines is one such algorithm which incorporates these effects during reconstruction. The aim of this project was to simulate the beam telescope and extend the track reconstruction framework for the FE-I4 telescope, which takes these effects into account. Section 1 introduces the problem, while section 2 focuses on beam telescopes. This is followed by the Allpix2 simulation framework in Section 3. And finally, Section 4 introduces the Proteus track reconstruction framework along with the General Broken ...

  11. The generalized back projection theorem for cone beam reconstruction

    International Nuclear Information System (INIS)

    Peyrin, F.C.

    1985-01-01

The use of cone beam scanners raises the problem of three-dimensional reconstruction from divergent projections. After a survey of two-dimensional analytical reconstruction methods, we examine their application to the 3D problem. Finally, it is shown that the back projection theorem can be generalized to cone beam projections. This allows us to state a new inversion formula suitable for both the 4π parallel and divergent geometries, and leads to the generalization of the "rho-filtered back projection" algorithm, which is outlined.
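The principle being generalized, plain backprojection, can be sketched in the 2D parallel-beam case (a nearest-bin toy model of our own, not the paper's cone beam formula; the rho-filter that sharpens the result into a true inverse is omitted):

```python
import numpy as np

def project(img, theta):
    """Parallel-beam projection: bin pixel values by detector coordinate
    t = x*cos(theta) + y*sin(theta), nearest-bin approximation."""
    n = img.shape[0]
    ys, xs = np.mgrid[0:n, 0:n]
    c = (n - 1) / 2.0
    t = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta)
    bins = np.clip(np.round(t + n).astype(int), 0, 2 * n)
    p = np.zeros(2 * n + 1)
    np.add.at(p, bins.ravel(), img.ravel())   # accumulate along each ray
    return p

def backproject(projs, thetas, n):
    """Smear each projection back along its rays and sum over angles."""
    ys, xs = np.mgrid[0:n, 0:n]
    c = (n - 1) / 2.0
    rec = np.zeros((n, n))
    for p, theta in zip(projs, thetas):
        t = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta)
        bins = np.clip(np.round(t + n).astype(int), 0, 2 * n)
        rec += p[bins]
    return rec

n = 33
img = np.zeros((n, n))
img[20, 10] = 1.0                      # single point source
thetas = np.linspace(0, np.pi, 60, endpoint=False)
rec = backproject([project(img, th) for th in thetas], thetas, n)
peak = np.unravel_index(np.argmax(rec), rec.shape)
```

Unfiltered backprojection recovers the point's location but blurs it with a 1/r halo; the rho-filtering step removes that blur, and the paper's contribution is extending this machinery to divergent cone beam geometry.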

  12. A Survey of Urban Reconstruction

    KAUST Repository

    Musialski, P.

    2013-05-10

This paper provides a comprehensive overview of urban reconstruction. While there exists a considerable body of literature, this topic is still under active research. The work reviewed in this survey stems from the following three research communities: computer graphics, computer vision and photogrammetry and remote sensing. Our goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field. Further, we would like to bring the mentioned research communities to even more interdisciplinary work, since the reconstruction problem itself is by far not solved. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  13. A Survey of Urban Reconstruction

    KAUST Repository

    Musialski, P.; Wonka, Peter; Aliaga, D. G.; Wimmer, M.; van Gool, L.; Purgathofer, W.

    2013-01-01

This paper provides a comprehensive overview of urban reconstruction. While there exists a considerable body of literature, this topic is still under active research. The work reviewed in this survey stems from the following three research communities: computer graphics, computer vision and photogrammetry and remote sensing. Our goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field. Further, we would like to bring the mentioned research communities to even more interdisciplinary work, since the reconstruction problem itself is by far not solved. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  14. Fast Tomographic Reconstruction From Limited Data Using Artificial Neural Networks

    NARCIS (Netherlands)

    D.M. Pelt (Daniël); K.J. Batenburg (Joost)

    2013-01-01

Image reconstruction from a small number of projections is a challenging problem in tomography. Advanced algorithms that incorporate prior knowledge can sometimes produce accurate reconstructions, but they typically require long computation times. Furthermore, the required prior

  15. Super resolution reconstruction of infrared images based on classified dictionary learning

    Science.gov (United States)

    Liu, Fei; Han, Pingli; Wang, Yi; Li, Xuan; Bai, Lu; Shao, Xiaopeng

    2018-05-01

Infrared images always suffer from low-resolution problems resulting from the limitations of imaging devices. An economical approach to combat this problem involves reconstructing high-resolution images by reasonable methods without upgrading the devices. Inspired by compressed sensing theory, this study presents and demonstrates a Classified Dictionary Learning method to reconstruct high-resolution infrared images. It classifies features of the samples into several reasonable clusters and trains a dictionary pair for each cluster. The optimal pair of dictionaries is chosen for each image reconstruction, and therefore more satisfactory results are achieved without an increase in computational complexity and time cost. Experiments and results demonstrate that it is a viable method for infrared image reconstruction, since it improves image resolution and recovers detailed information of targets.

  16. Three-dimensional image reconstruction. I. Determination of pattern orientation

    International Nuclear Information System (INIS)

    Blankenbecler, Richard

    2004-01-01

    The problem of determining the Euler angles of a randomly oriented three-dimensional (3D) object from its 2D Fraunhofer diffraction patterns is discussed. This problem arises in the reconstruction of a positive semidefinite 3D object using oversampling techniques. In such a problem, the data consist of a measured set of magnitudes from 2D tomographic images of the object at several unknown orientations. After the orientation angles are determined, the object itself can then be reconstructed by a variety of methods using oversampling, the magnitude data from the 2D images, physical constraints on the image, and then iteration to determine the phases

  17. Tensor-based dictionary learning for dynamic tomographic reconstruction

    International Nuclear Information System (INIS)

    Tan, Shengqi; Wu, Zhifang; Zhang, Yanbo; Mou, Xuanqin; Wang, Ge; Cao, Guohua; Yu, Hengyong

    2015-01-01

In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from few-view projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and a dynamic mouse cardiac imaging study demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. (paper)

  18. Tensor-based Dictionary Learning for Dynamic Tomographic Reconstruction

    Science.gov (United States)

    Tan, Shengqi; Zhang, Yanbo; Wang, Ge; Mou, Xuanqin; Cao, Guohua; Wu, Zhifang; Yu, Hengyong

    2015-01-01

In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from few-view projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and a dynamic mouse cardiac imaging study demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. PMID:25779991

  19. 3D reconstruction, a new challenge in industrial radiography

    International Nuclear Information System (INIS)

    Lavayssiere, B.; Fleuet, E.; Georgel, B.

    1995-01-01

In an NDT context, industrial radiography enables the detection of defects through their projection on a film. EDF has studied the benefit that 3D reconstruction from a very limited number of radiographs may bring in terms of localisation and orientation of the defects. The reconstruction problem consists of solving an integral equation of the first kind; in a noisy context, it belongs to the class of so-called ill-posed problems. Appropriate solutions may only be found with the help of regularization techniques, by introducing a priori knowledge about the unknown solution and by using a statistical model of the physical process which produces the radiographs. Another approach simplifies the problem and reconstructs only the skeleton of a defect. All these methods, drawn from the applied mathematical sciences, enable a more precise diagnosis in non-destructive testing of thick inhomogeneous materials. (authors). 4 refs., 4 figs

  20. New vertex reconstruction algorithms for CMS

    CERN Document Server

    Frühwirth, R; Prokofiev, Kirill; Speer, T.; Vanlaer, P.; Chabanat, E.; Estre, N.

    2003-01-01

The reconstruction of interaction vertices can be decomposed into a pattern recognition problem (``vertex finding'') and a statistical problem (``vertex fitting''). We briefly review classical methods. We introduce novel approaches and motivate them in the framework of high-luminosity experiments such as those at the LHC. We then show comparisons with the classical methods in relevant physics channels.

  1. Matrix-based image reconstruction methods for tomography

    International Nuclear Information System (INIS)

    Llacer, J.; Meng, J.D.

    1984-10-01

Matrix methods of image reconstruction have not been used, in general, because of the large size of practical matrices, their ill-conditioning upon inversion, and the success of Fourier-based techniques. An exception is the work that has been done at the Lawrence Berkeley Laboratory for imaging with accelerated radioactive ions. An extension of that work into more general imaging problems shows that, with a correct formulation of the problem, positron tomography with ring geometries results in well-behaved matrices which can be used for image reconstruction with no distortion of the point response in the field of view and with flexibility in the design of the instrument. Maximum Likelihood Estimator methods of reconstruction, which use system matrices tailored to specific instruments and do not need matrix inversion, are shown to result in good preliminary images. A parallel processing computer structure based on multiple inexpensive microprocessors is proposed as a system to implement the matrix-MLE methods. 14 references, 7 figures

  2. Stable source reconstruction from a finite number of measurements in the multi-frequency inverse source problem

    DEFF Research Database (Denmark)

    Karamehmedovic, Mirza; Kirkeby, Adrian; Knudsen, Kim

    2018-01-01

setting: From measurements made at a finite set of frequencies we uniquely determine and reconstruct sources in a subspace spanned by finitely many Fourier-Bessel functions. Further, we obtain a constructive criterion for identifying a minimal set of measurement frequencies sufficient for reconstruction......, and under an additional, mild assumption, the reconstruction method is shown to be stable. Our analysis is based on a singular value decomposition of the source-to-measurement forward operators and the distribution of positive zeros of the Bessel functions of the first kind. The reconstruction method...

  3. Titanium template for scaphoid reconstruction.

    Science.gov (United States)

    Haefeli, M; Schaefer, D J; Schumacher, R; Müller-Gerbl, M; Honigmann, P

    2015-06-01

    Reconstruction of a non-united scaphoid with a humpback deformity involves resection of the non-union followed by bone grafting and fixation of the fragments. Intraoperative control of the reconstruction is difficult owing to the complex three-dimensional shape of the scaphoid and the other carpal bones overlying the scaphoid on lateral radiographs. We developed a titanium template that fits exactly to the surfaces of the proximal and distal scaphoid poles to define their position relative to each other after resection of the non-union. The templates were designed on three-dimensional computed tomography reconstructions and manufactured using selective laser melting technology. Ten conserved human wrists were used to simulate the reconstruction. The achieved precision measured as the deviation of the surface of the reconstructed scaphoid from its virtual counterpart was good in five cases (maximal difference 1.5 mm), moderate in one case (maximal difference 3 mm) and inadequate in four cases (difference more than 3 mm). The main problems were attributed to the template design and can be avoided by improved pre-operative planning, as shown in a clinical case. © The Author(s) 2014.

  4. Low rank alternating direction method of multipliers reconstruction for MR fingerprinting.

    Science.gov (United States)

    Assländer, Jakob; Cloos, Martijn A; Knoll, Florian; Sodickson, Daniel K; Hennig, Jürgen; Lattanzi, Riccardo

    2018-01-01

The proposed reconstruction framework addresses the reconstruction accuracy, noise propagation and computation time for magnetic resonance fingerprinting. Based on a singular value decomposition of the signal evolution, magnetic resonance fingerprinting is formulated as a low rank (LR) inverse problem in which one image is reconstructed for each singular value under consideration. This LR approximation of the signal evolution reduces the computational burden by reducing the number of Fourier transformations. The LR approximation also improves the conditioning of the problem, which is further improved by extending the LR inverse problem to an augmented Lagrangian that is solved by the alternating direction method of multipliers. The root mean square error and the noise propagation are analyzed in simulations. For verification, in vivo examples are provided. The proposed LR alternating direction method of multipliers approach shows a reduced root mean square error compared to the original fingerprinting reconstruction, to an LR approximation alone and to an alternating direction method of multipliers approach without an LR approximation. Incorporating sensitivity encoding allows for further artifact reduction. The proposed reconstruction provides robust convergence, reduced computational burden and improved image quality compared to the other magnetic resonance fingerprinting reconstruction approaches evaluated in this study. Magn Reson Med 79:83-96, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
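The rank reduction described rests on a truncated SVD of the simulated signal evolutions: a handful of singular vectors capture essentially the whole dictionary, so reconstruction and matching can operate in that small temporal subspace. A hedged sketch with invented mono-exponential T1-decay signals (not the paper's sequence model):

```python
import numpy as np

# Build a toy fingerprinting dictionary of signal evolutions, one per T1 value.
T = 200
t = np.linspace(0.01, 3.0, T)
T1s = np.linspace(0.1, 2.0, 300)
D = np.exp(-t[None, :] / T1s[:, None])       # (n_atoms, T)

# Truncated SVD: keep only K temporal singular vectors.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
K = 5
Vk = Vt[:K]                                   # rank-K temporal subspace basis

D_lr = (D @ Vk.T) @ Vk                        # dictionary projected to rank K
rel_err = np.linalg.norm(D - D_lr) / np.linalg.norm(D)

# Matching a query signal in the K-dimensional coefficient space gives the
# same nearest atom as matching in the full T-dimensional space.
query = np.exp(-t / 0.77)
idx_full = np.argmin(np.linalg.norm(D - query, axis=1))
coeffs = D @ Vk.T
idx_lr = np.argmin(np.linalg.norm(coeffs - query @ Vk.T, axis=1))
```

Because each retained singular vector corresponds to one coefficient image, only K Fourier-transform passes per iteration are needed instead of one per time frame.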

  5. Maxillary reconstruction: Current concepts and controversies

    Directory of Open Access Journals (Sweden)

    Subramania Iyer

    2014-01-01

Full Text Available Maxillary reconstruction is still an evolving art when compared to reconstruction of the mandible. Defects of the maxilla, apart from affecting the functions of speech, swallowing and mastication, also cause cosmetic disfigurement. Rehabilitation of form and function in patients with maxillary defects is achieved either with an obturator prosthesis or by surgical reconstruction. The literature abounds with a variety of reconstructive methods. The classification systems are also varied, with no universal acceptance of any one of them. The oncologic safety of these procedures is still debated, and conclusive evidence in this regard has not yet emerged. Management of the orbit has also not yet been properly addressed. Tissue engineering, which has been hyped as one of the possible solutions to this vexing reconstructive problem, has not yet produced reliable and reproducible results. This review article discusses the rationale and oncological safety of reconstructing maxillary defects, critically analyzes the classification systems, describes the different reconstructive methods and touches upon the controversies in this subject. The management of the retained and exenterated orbit associated with maxillectomy is reviewed. The surgical morbidity, complications and recent advances in this field are also examined. An algorithm, based on our experience, is presented.

  6. Maxillary reconstruction: Current concepts and controversies

    Science.gov (United States)

    Iyer, Subramania; Thankappan, Krishnakumar

    2014-01-01

Maxillary reconstruction is still an evolving art when compared to reconstruction of the mandible. Defects of the maxilla, apart from affecting the functions of speech, swallowing and mastication, also cause cosmetic disfigurement. Rehabilitation of form and function in patients with maxillary defects is achieved either with an obturator prosthesis or by surgical reconstruction. The literature abounds with a variety of reconstructive methods. The classification systems are also varied, with no universal acceptance of any one of them. The oncologic safety of these procedures is still debated, and conclusive evidence in this regard has not yet emerged. Management of the orbit has also not yet been properly addressed. Tissue engineering, which has been hyped as one of the possible solutions to this vexing reconstructive problem, has not yet produced reliable and reproducible results. This review article discusses the rationale and oncological safety of reconstructing maxillary defects, critically analyzes the classification systems, describes the different reconstructive methods and touches upon the controversies in this subject. The management of the retained and exenterated orbit associated with maxillectomy is reviewed. The surgical morbidity, complications and recent advances in this field are also examined. An algorithm, based on our experience, is presented. PMID:24987199

  7. 3D Tomographic Image Reconstruction using CUDA C

    International Nuclear Information System (INIS)

    Dominguez, J. S.; Assis, J. T.; Oliveira, L. F. de

    2011-01-01

This paper presents the study and implementation of software for three-dimensional reconstruction of images obtained with a tomographic system, using the capabilities of Graphics Processing Units (GPUs). Reconstruction by the filtered back-projection method was developed in CUDA C to make maximum use of the processing power of GPUs on problems with large computational cost that are highly parallelizable. The potential of GPUs is discussed and their advantages for solving this kind of problem are shown. The results, in terms of runtime, are compared with non-parallelized implementations and show a great reduction in processing time. (Author)
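Filtered back-projection, the algorithm ported to CUDA C in this work, ramp-filters each projection in Fourier space and then smears the filtered values back across the image. A plain NumPy reference version for parallel-beam geometry (illustrative only; a GPU port would parallelize the per-pixel work inside both loops):

```python
import numpy as np

def forward_project(img, angles):
    # Minimal parallel-beam Radon transform: rotate by nearest-neighbour
    # resampling, then integrate along image columns.
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    sino = np.empty((len(angles), n))
    for k, a in enumerate(angles):
        ca, sa = np.cos(a), np.sin(a)
        xr = np.clip(np.rint(ca * (xs - c) - sa * (ys - c) + c).astype(int), 0, n - 1)
        yr = np.clip(np.rint(sa * (xs - c) + ca * (ys - c) + c).astype(int), 0, n - 1)
        sino[k] = img[yr, xr].sum(axis=0)
    return sino

def fbp(sino, angles):
    # Ramp-filter each view in the detector-frequency domain, then back-project.
    n_ang, n = sino.shape
    ramp = np.abs(np.fft.fftfreq(n))
    filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    recon = np.zeros((n, n))
    for k, a in enumerate(angles):
        # Detector bin hit by each pixel for this view (matches the forward model).
        t = np.clip(np.rint(np.cos(a) * (xs - c) + np.sin(a) * (ys - c) + c).astype(int),
                    0, n - 1)
        recon += filtered[k][t]
    return recon * np.pi / n_ang

n = 64
yy, xx = np.mgrid[0:n, 0:n]
phantom = (((xx - 31.5) ** 2 + (yy - 31.5) ** 2) < 12 ** 2).astype(float)
angles = np.linspace(0.0, np.pi, 90, endpoint=False)
recon = fbp(forward_project(phantom, angles), angles)
```

Each back-projection pass is independent per pixel and per angle, which is why the method maps so naturally onto GPU thread grids.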

  8. Do our reconstructions of ENSO have too much low-frequency variability?

    Science.gov (United States)

    Loope, G. R.; Overpeck, J. T.

    2017-12-01

Reconstructing the spectrum of Pacific SST variability has proven difficult, both because of complications with proxy systems such as tree rings and because of the relatively small number of records from the tropical Pacific. We show that the small number of long coral δ18O and Sr/Ca records has caused a bias towards too much low-frequency variability in PCR, CPS, and RegEM reconstructions of Pacific variability. This occurs because the individual coral records used in the reconstructions have redder spectra than the shared signal (e.g. ENSO), so some of the unshared, low-frequency signal from local climate, salinity and possibly coral biology bleeds into the reconstruction. With enough chronologies in a reconstruction, this unshared noise cancels out, but the problem is exacerbated in our longest reconstructions, where fewer records are available. Coral proxies tend to have more low-frequency variability than SST observations, so this problem is smaller but can still be seen in pseudoproxy experiments using observations and reanalysis data. The identification of this low-frequency bias in coral reconstructions helps bring the spectra of ENSO reconstructions back into line with both models and observations. Although our analysis is mostly constrained to the 20th century due to lack of sufficient data, we expect that as more long chronologies are developed, the low-frequency signal in ENSO reconstructions will be greatly reduced.
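The bias mechanism described, unshared red noise surviving in a composite built from few records, can be reproduced with a toy pseudoproxy simulation; the AR(1) coefficient, record counts and record length below are illustrative only, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(8)
n_years = 500

def ar1(n, phi):
    # Red noise: first-order autoregressive process.
    x = np.zeros(n)
    e = rng.standard_normal(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

signal = rng.standard_normal(n_years)          # shared, white "ENSO-like" signal

def composite(n_records):
    # Each pseudoproxy = shared signal + unshared red (AR1) local noise.
    return np.mean([signal + ar1(n_years, 0.9) for _ in range(n_records)], axis=0)

def lowfreq_fraction(x):
    # Fraction of spectral power at periods longer than 20 years.
    f = np.fft.rfftfreq(len(x))
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return p[(f > 0) & (f < 0.05)].sum() / p[f > 0].sum()

lf_few = lowfreq_fraction(composite(3))        # few records: red noise survives
lf_many = lowfreq_fraction(composite(40))      # many records: noise averages out
lf_signal = lowfreq_fraction(signal)
```

Averaging divides the unshared noise variance by the number of records while leaving the shared signal intact, so the few-record composite is systematically redder than the target.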

  9. Is mammary reconstruction with the anatomical Becker expander a simple procedure? Complications and hidden problems leading to secondary surgical procedures: a follow-up study.

    Science.gov (United States)

    Farace, Francesco; Faenza, Mario; Bulla, Antonio; Rubino, Corrado; Campus, Gian Vittorio

    2013-06-01

Debate over the role of Becker expander implants (BEIs) in breast reconstruction is still ongoing. There are no clear indications for BEI use. The main indications for BEI use are one-stage breast reconstruction procedure and congenital breast deformities correction, due to the postoperative ability to vary BEI volume. Recent studies showed that BEIs were removed 5 years after mammary reconstruction in 68% of operated patients. This entails a further surgical procedure. BEIs should not, therefore, be regarded as one-stage prostheses. We performed a case-series study of breast reconstructions with anatomically shaped Becker-35™ implants, in order to highlight complications and to flag unseen problems, which might entail a second surgical procedure. A total of 229 patients, reconstructed from 2005 to 2010, were enrolled in this study. Data relating to implant type, volume, mean operative time and complications were recorded. All the patients underwent the same surgical procedure. The minimum follow-up period was 18 months. During a 5-year follow-up, 99 patients required secondary surgery to correct their complications or sequelae; 46 of them underwent BEI removal within 2 years of implantation, 56 within 3 years, 65 within 4 years and 74 within 5 years. Our findings show that two different sorts of complications can arise with these devices, leading to premature implant removal, one common to any breast implant and one peculiar to BEIs. The Becker implant is a permanent expander. Surgeons must, therefore, be aware that, once positioned, the Becker expander cannot be adjusted at a later date, as in two-stage expander/prosthesis reconstructions for instance. Surgeons must have a clear understanding of possible BEI complications in order to be able to discuss these with their patients. Therefore, only surgeons experienced in breast reconstruction should use BEIs. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons.

  10. Hybrid spectral CT reconstruction.

    Directory of Open Access Journals (Sweden)

    Darin P Clark

Full Text Available Current photon counting x-ray detector (PCD technology faces limitations associated with spectral fidelity and photon starvation. One strategy for addressing these limitations is to supplement PCD data with high-resolution, low-noise data acquired with an energy-integrating detector (EID. In this work, we propose an iterative, hybrid reconstruction technique which combines the spectral properties of PCD data with the resolution and signal-to-noise characteristics of EID data. Our hybrid reconstruction technique is based on an algebraic model of data fidelity which substitutes the EID data into the data fidelity term associated with the PCD reconstruction, resulting in a joint reconstruction problem. Within the split Bregman framework, these data fidelity constraints are minimized subject to additional constraints on spectral rank and on joint intensity-gradient sparsity measured between the reconstructions of the EID and PCD data. Following a derivation of the proposed technique, we apply it to the reconstruction of a digital phantom which contains realistic concentrations of iodine, barium, and calcium encountered in small-animal micro-CT. The results of this experiment suggest reliable separation and detection of iodine at concentrations ≥ 5 mg/ml and barium at concentrations ≥ 10 mg/ml in 2-mm features for EID and PCD data reconstructed with inherent spatial resolutions of 176 μm and 254 μm, respectively (point spread function, FWHM). Furthermore, hybrid reconstruction is demonstrated to enhance spatial resolution within material decomposition results and to improve low-contrast detectability by as much as 2.6 times relative to reconstruction with PCD data only. The parameters of the simulation experiment are based on an in vivo micro-CT experiment conducted in a mouse model of soft-tissue sarcoma. Material decomposition results produced from this in vivo data demonstrate the feasibility of distinguishing two K-edge contrast agents with

  11. Hybrid spectral CT reconstruction

    Science.gov (United States)

    Clark, Darin P.

    2017-01-01

    Current photon counting x-ray detector (PCD) technology faces limitations associated with spectral fidelity and photon starvation. One strategy for addressing these limitations is to supplement PCD data with high-resolution, low-noise data acquired with an energy-integrating detector (EID). In this work, we propose an iterative, hybrid reconstruction technique which combines the spectral properties of PCD data with the resolution and signal-to-noise characteristics of EID data. Our hybrid reconstruction technique is based on an algebraic model of data fidelity which substitutes the EID data into the data fidelity term associated with the PCD reconstruction, resulting in a joint reconstruction problem. Within the split Bregman framework, these data fidelity constraints are minimized subject to additional constraints on spectral rank and on joint intensity-gradient sparsity measured between the reconstructions of the EID and PCD data. Following a derivation of the proposed technique, we apply it to the reconstruction of a digital phantom which contains realistic concentrations of iodine, barium, and calcium encountered in small-animal micro-CT. The results of this experiment suggest reliable separation and detection of iodine at concentrations ≥ 5 mg/ml and barium at concentrations ≥ 10 mg/ml in 2-mm features for EID and PCD data reconstructed with inherent spatial resolutions of 176 μm and 254 μm, respectively (point spread function, FWHM). Furthermore, hybrid reconstruction is demonstrated to enhance spatial resolution within material decomposition results and to improve low-contrast detectability by as much as 2.6 times relative to reconstruction with PCD data only. The parameters of the simulation experiment are based on an in vivo micro-CT experiment conducted in a mouse model of soft-tissue sarcoma. Material decomposition results produced from this in vivo data demonstrate the feasibility of distinguishing two K-edge contrast agents with a spectral

  12. Tomographic reconstruction of the time-averaged density distribution in two-phase flow

    International Nuclear Information System (INIS)

    Fincke, J.R.

    1982-01-01

    The technique of reconstructive tomography has been applied to the measurement of time-average density and density distribution in a two-phase flow field. The technique of reconstructive tomography provides a model-independent method of obtaining flow-field density information. A tomographic densitometer system for the measurement of two-phase flow has two unique problems: a limited number of data values and a correspondingly coarse reconstruction grid. These problems were studied both experimentally through the use of prototype hardware on a 3-in. pipe, and analytically through computer generation of simulated data. The prototype data were taken on phantoms constructed of all Plexiglas and Plexiglas laminated with wood and polyurethane foam. Reconstructions obtained from prototype data are compared with reconstructions from the simulated data. Also presented are some representative results in a horizontal air/water flow

  13. A genetic approach to shape reconstruction in limited data tomography

    International Nuclear Information System (INIS)

    Turcanu, C.; Craciunescu, T.

    2001-01-01

The paper proposes a new method for shape reconstruction in computerized tomography. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints on the number of projections or limited view angles. The problem of image reconstruction from projections may be considered as that of finding an image (solution) whose projections match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by a genetic algorithm. The algorithm has some features common to all genetic algorithms but also some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but coded as a matrix of pixels corresponding to a two-dimensional image. This internal representation mirrors the structure of the original problem: slight differences between two points in the original problem space give rise to similarly slight differences once they are coded. Another particular feature is a newly built crossover operator: the grid-based crossover, suitable for high-dimension two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performance of the method is evaluated on a phantom typical of an application with limited data sets: the determination of neutron energy spectra with time resolution in the case of short-pulsed neutron emission. A genetic reconstruction is presented. Both qualitative and quantitative judgement, the latter based on several figures of merit, point out that the proposed method ensures improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise. (authors)
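The matrix chromosome and grid-based crossover lend themselves to a compact sketch. The toy below evolves a binary image to match just two orthogonal projections; the population size, grid cell size and mutation rate are invented, and the fitness is a simple projection mismatch rather than the paper's statistical correlation coefficient:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 8                                            # image side (toy scale)
target = (rng.random((N, N)) < 0.3).astype(float)
measured = np.concatenate([target.sum(axis=0), target.sum(axis=1)])  # two views

def fitness(img):
    # Agreement between the candidate's projections and the measured ones.
    p = np.concatenate([img.sum(axis=0), img.sum(axis=1)])
    return -np.linalg.norm(p - measured)

def grid_crossover(a, b, cell=2):
    # Matrix chromosomes: swap whole cells of a coarse cutting grid.
    child = a.copy()
    for i in range(0, N, cell):
        for j in range(0, N, cell):
            if rng.random() < 0.5:
                child[i:i + cell, j:j + cell] = b[i:i + cell, j:j + cell]
    return child

def mutate(img, p=0.02):
    flip = rng.random((N, N)) < p
    out = img.copy()
    out[flip] = 1.0 - out[flip]
    return out

pop = [(rng.random((N, N)) < 0.3).astype(float) for _ in range(60)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)          # elitist selection
    elite = pop[:20]
    pop = elite + [mutate(grid_crossover(elite[rng.integers(20)],
                                         elite[rng.integers(20)]))
                   for _ in range(40)]

best = max(pop, key=fitness)
```

Swapping contiguous blocks rather than flattened genes preserves the two-dimensional neighbourhoods that carry the shape information, which is the point of the grid-based operator.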

  14. A GREIT-type linear reconstruction algorithm for EIT using eigenimages

    International Nuclear Information System (INIS)

    Antink, Christoph Hoog; Pikkemaat, Robert; Leonhardt, Steffen

    2013-01-01

Reconstruction in electrical impedance tomography (EIT) is a nonlinear, ill-posed inverse problem. Based on point-shaped training and evaluation data, the 'Graz consensus Reconstruction algorithm for EIT' (GREIT) constitutes a universal, homogeneous method. While this is a very reasonable approach to the general problem, we ask whether an optimized reconstruction method can be found for a specific application of EIT, i.e. thoracic imaging. Instead of point-shaped training data, we propose to use spatially extended training data consisting of eigenimages. To evaluate the reconstruction quality of the proposed approach, figures of merit (FOMs) derived from the ones used in GREIT are developed. For the application of thoracic imaging, lung shapes were segmented from a publicly available CT database (www.dir-lab.com) and used to calculate the novel FOMs. With those, the general feasibility of using eigenimages is demonstrated and compared to the standard approach. In addition, it is shown that by using different sets of training data, an individually optimized linear method of reconstruction can be created.
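A GREIT-type method ultimately produces a single linear reconstruction matrix fitted to training pairs of target images and simulated measurements. The sketch below uses a random stand-in forward matrix and random training images (the paper's point is to use eigenimages rather than point targets; everything here is an invented placeholder):

```python
import numpy as np

rng = np.random.default_rng(4)
n_pix, n_meas, n_train = 64, 32, 2000

J = rng.standard_normal((n_meas, n_pix))   # stand-in linearized forward model

# Training set: desired images X and their simulated, slightly noisy
# measurements Y.
X = rng.standard_normal((n_pix, n_train))
Y = J @ X + 0.01 * rng.standard_normal((n_meas, n_train))

# Fit one linear reconstruction matrix for the whole application:
# R = argmin_R ||R Y - X||_F^2 + lam ||R||_F^2  (ridge regression, closed form).
lam = 1e-2
R = X @ Y.T @ np.linalg.inv(Y @ Y.T + lam * np.eye(n_meas))

# At run time, reconstruction is a single matrix-vector product per frame.
x_test = rng.standard_normal(n_pix)
x_rec = R @ (J @ x_test)
```

Changing the statistics of the training images X, e.g. to eigenimages of expected lung shapes, changes R and thereby tailors the linear reconstruction to the application.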

  15. Prepectoral Implant-Based Breast Reconstruction

    Directory of Open Access Journals (Sweden)

    Lyndsey Highton, BMBCh, MA, FRCS(Plast

    2017-09-01

Conclusion: Prepectoral implant placement with ADM cover is emerging as an alternative approach for IBR. This method facilitates breast reconstruction with a good cosmetic outcome for patients who want a quick recovery without potential compromise of pectoral muscle function and associated problems.

  16. Development of Image Reconstruction Algorithms in electrical Capacitance Tomography

    International Nuclear Information System (INIS)

    Fernandez Marron, J. L.; Alberdi Primicia, J.; Barcala Riveira, J. M.

    2007-01-01

Electrical Capacitance Tomography (ECT) has not yet been developed to the point of industrial use. That is due, first, to the difficulty of measuring very small capacitances (in the range of femtofarads) and, second, to the problem of on-line reconstruction of the images. The latter is compounded by the small number of electrodes (maximum 16), which causes the usual reconstruction algorithms to produce many errors. In this work a new, purely geometrical method that could be used for this purpose is described. (Author) 4 refs.

  17. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but only with recent advances in the programmability of GPUs have the best available reconstruction codes started to be implemented to run on them. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  18. Fully 3D GPU PET reconstruction

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Cal-Gonzalez, J.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2011-01-01

Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but only with recent advances in the programmability of GPUs have the best available reconstruction codes started to be implemented to run on them. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  19. Reconstruction Methods for Inverse Problems with Partial Data

    DEFF Research Database (Denmark)

    Hoffmann, Kristoffer

This thesis presents a theoretical and numerical analysis of a general mathematical formulation of hybrid inverse problems in impedance tomography. This includes problems from several existing hybrid imaging modalities such as Current Density Impedance Imaging, Magnetic Resonance Electrical...... Impedance Tomography, and Ultrasound Modulated Electrical Impedance Tomography. After giving an introduction to hybrid inverse problems in impedance tomography and the mathematical tools that facilitate the related analysis, we explain in detail the stability properties associated with the classification...... of a linearised hybrid inverse problem. This is done using pseudo-differential calculus and the theory of overdetermined boundary value problems. Using microlocal analysis we then present novel results on the propagation of singularities, which give a precise description of the distinct features of solutions...

  20. Self-prior strategy for organ reconstruction in fluorescence molecular tomography.

    Science.gov (United States)

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-10-01

The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and thereby to avoid the high cost and ionizing radiation associated with the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture for solving the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT) based algorithm is used to extract the self-prior information in the space-frequency energy spectrum, under the assumption that regions with higher fluorescence concentration have larger energy intensity; the cost function of the inverse problem is then modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is applied to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (the traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications, such as metabolic imaging and pharmacokinetic studies, can be aided by this strategy.
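The self-prior loop (solve, extract a high-energy region from the current solution, reweight the regularizer there, re-solve) can be illustrated in one dimension with a weighted Laplacian-style smoothness penalty. The blur kernel, threshold and weights below are invented stand-ins, not the paper's FMT forward model or STFT-based extraction:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 80
t = np.arange(n)

# Diffusive forward model (heavy Gaussian blur) as a stand-in for light
# propagation; x_true plays the role of a fluorescent "organ".
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 4.0 ** 2))
A /= A.sum(axis=1, keepdims=True)
x_true = np.where((t > 30) & (t < 50), 1.0, 0.0)
b = A @ x_true + 1e-4 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)          # (n-1, n) first-difference operator

def solve(edge_w, lam=1e-2):
    # Weighted smoothness penalty: sum_i edge_w[i] * (x[i+1] - x[i])^2.
    Lw = D.T @ (edge_w[:, None] * D)
    return np.linalg.solve(A.T @ A + lam * Lw, A.T @ b)

x0 = solve(np.ones(n - 1))                       # first pass: no prior
region = x0 > 0.5 * x0.max()                     # self-extracted prior region
edge_w = np.where(region[:-1] | region[1:], 0.05, 1.0)
x1 = solve(edge_w)                               # relax smoothing near the organ
```

Relaxing the penalty only where the first pass already sees high intensity lets the second pass sharpen the organ boundary without hand-supplied anatomy.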

  1. L{sub 1/2} regularization based numerical method for effective reconstruction of bioluminescence tomography

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xueli, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn; Yang, Defu; Zhang, Qitan; Liang, Jimin, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn [School of Life Science and Technology, Xidian University, Xi' an 710071 (China); Engineering Research Center of Molecular and Neuro Imaging, Ministry of Education (China)

    2014-05-14

Even though bioluminescence tomography (BLT) exhibits significant potential and wide applications in macroscopic imaging of small animals in vivo, the inverse reconstruction is still a tough problem that has plagued researchers in this area. The ill-posedness of the inverse reconstruction arises from insufficient measurements and modeling errors, so that it cannot be solved directly. In this study, an l{sub 1/2} regularization based numerical method was developed for effective reconstruction of BLT. In this method, the inverse reconstruction of BLT was cast as an l{sub 1/2} regularization problem, and the weighted interior-point algorithm (WIPA) was then applied to solve it by transforming it into the solution of a series of l{sub 1} regularizers. The feasibility and effectiveness of the proposed method were demonstrated with numerical simulations on a digital mouse. Stability verification experiments further illustrated the robustness of the proposed method for different levels of Gaussian noise.
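The reported transformation of the l{sub 1/2} problem into a series of l{sub 1} problems can be imitated with iterative reweighting, solving each weighted l1 subproblem by plain proximal gradient (ISTA) rather than the paper's weighted interior-point algorithm; the sizes, sparsity pattern and parameters below are invented:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_weighted_l1(A, b, w, lam, n_iter=500):
    # Proximal gradient for min_x 0.5||Ax - b||^2 + lam * sum_i w_i |x_i|.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - step * (A.T @ (A @ x - b)), step * lam * w)
    return x

def l_half(A, b, lam=0.05, outer=5, eps=1e-2):
    # l_{1/2} regularization approximated by a sequence of weighted l1
    # problems, reweighting by |x_i|^(-1/2) between passes.
    x = ista_weighted_l1(A, b, np.ones(A.shape[1]), lam)
    for _ in range(outer):
        x = ista_weighted_l1(A, b, 1.0 / (np.sqrt(np.abs(x)) + eps), lam)
    return x

rng = np.random.default_rng(6)
A = rng.standard_normal((30, 80))       # underdetermined toy system
x_true = np.zeros(80); x_true[[3, 40, 66]] = [1.5, -1.0, 2.0]
b = A @ x_true + 0.01 * rng.standard_normal(30)
x_hat = l_half(A, b)
```

The reweighting penalizes small entries ever more strongly while barely touching large ones, which is what makes the nonconvex l{sub 1/2} surrogate sparser and less biased than a single l1 pass.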

  2. Parallel CT image reconstruction based on GPUs

    International Nuclear Information System (INIS)

    Flores, Liubov A.; Vidal, Vicent; Mayo, Patricia; Rodenas, Francisco; Verdú, Gumersindo

    2014-01-01

    In X-ray computed tomography (CT), iterative methods are more suitable for the reconstruction of images with high contrast and precision, in noisy conditions and from a small number of projections. In practice, however, these methods are not widely used due to the high computational cost of their implementation. Current technology makes it possible to reduce this drawback effectively. The goal of this work is to develop a fast GPU-based algorithm to reconstruct high-quality images from undersampled and noisy projection data. - Highlights: • We developed a GPU-based iterative algorithm to reconstruct images. • Iterative algorithms are capable of reconstructing images from an undersampled set of projections. • The computational cost of the implementation of the developed algorithm is low. • The efficiency of the algorithm increases for large-scale problems.
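    The abstract does not spell out which iterative scheme is used. As a minimal, hedged illustration of the row-action style of algebraic reconstruction (ART/Kaczmarz) commonly used in iterative CT, the following pure-Python sketch solves a toy three-projection system; the matrix, data, and solution are invented for illustration only.

```python
# Kaczmarz (ART) sweep: project the current estimate onto each
# row's hyperplane a_i . x = b_i in turn.
def kaczmarz(A, b, sweeps=200):
    n = len(A[0])
    x = [0.0] * n
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            dot = sum(a * xi for a, xi in zip(a_i, x))
            norm2 = sum(a * a for a in a_i)
            step = (b_i - dot) / norm2
            x = [xi + step * a for xi, a in zip(x, a_i)]
    return x

# Toy consistent "projection" system with known solution x = (1, 2).
A = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
b = [1.0, 3.0, 2.0]
x = kaczmarz(A, b)
```

    For a consistent system the sweeps converge to the exact solution; GPU implementations parallelize the row operations, which is what makes the approach practical at clinical problem sizes.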

  3. Labral reconstruction: when to perform and how

    Directory of Open Access Journals (Sweden)

    Brian J White

    2015-07-01

    Over the past decade, the understanding of the anatomy and function of the hip joint has continuously evolved, and surgical treatment options for the hip have significantly progressed. Originally, surgical treatment of the hip primarily involved resection of damaged tissue. Procedures that maintain and preserve proper hip anatomy, such as labral repair and femoroacetabular impingement (FAI) correction, have shown superior results in terms of pain reduction, increased function, and ability to return to activities. Labral reconstruction is a treatment option that uses a graft to reconstruct the native labrum. The technique and outcomes of labral reconstruction have been described relatively recently, and labral reconstruction is a cutting-edge procedure that has shown promising early outcomes. The aim of this article is to review the current literature on hip labral reconstruction. We review the indications for labral reconstruction, surgical technique and graft options, and the surgical outcomes that have been described to date. Labral reconstruction provides an alternative treatment option for challenging intra-articular hip problems. It restores the original anatomy of the hip and has the potential to preserve the longevity of the hip joint. This technique is an important tool in the orthopaedic surgeon's arsenal for hip joint treatment and preservation.

  4. Neural Network for Sparse Reconstruction

    Directory of Open Access Journals (Sweden)

    Qingfa Li

    2014-01-01

    We construct a neural network based on smoothing approximation techniques and the projected gradient method to solve a class of sparse reconstruction problems. Neural networks can be implemented in circuits and are an important approach for solving optimization problems, especially large-scale ones. Smoothing approximation is an efficient technique for solving nonsmooth optimization problems. We combine these two techniques to overcome the difficulties of choosing the step size in discrete algorithms and of handling the set-valued map in the differential inclusion. In theory, the proposed network converges to the optimal solution set of the given problem. Furthermore, numerical experiments show the effectiveness of the proposed network.
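    The network itself evolves in continuous time, but its two ingredients also exist in a discrete-time analogue: a smooth surrogate sqrt(t^2 + mu^2) for the nonsmooth |t|, followed by a projection step. The hedged sketch below applies that analogue to a toy 2-variable problem with made-up data, projecting onto the nonnegative orthant; it is an illustration of the idea, not the authors' network.

```python
import math

# Smooth surrogate for |t|: sqrt(t^2 + mu^2); its derivative is t / sqrt(t^2 + mu^2).
def smooth_abs_grad(t, mu):
    return t / math.sqrt(t * t + mu * mu)

def projected_gradient(A, b, lam=0.1, mu=1e-3, step=0.1, iters=2000):
    # minimize 0.5*||Ax - b||^2 + lam * sum_j sqrt(x_j^2 + mu^2)  over x >= 0
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(a * xi for a, xi in zip(row, x)) - bi for row, bi in zip(A, b)]
        g = [sum(A[i][j] * r[i] for i in range(m)) + lam * smooth_abs_grad(x[j], mu)
             for j in range(n)]
        # gradient step followed by projection onto the nonnegative orthant
        x = [max(0.0, xj - step * gj) for xj, gj in zip(x, g)]
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [1.0, 0.0]   # the second coefficient should be driven to zero
x = projected_gradient(A, b)
```

    The recovered solution is sparse (the second coefficient stays at zero) while the first is shrunk slightly below its least-squares value, as the penalty dictates.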

  5. Profile reconstruction from neutron reflectivity data and a priori knowledge

    International Nuclear Information System (INIS)

    Leeb, H.

    2008-01-01

    The problem of incomplete and noisy information in profile reconstruction from neutron reflectometry data is considered. In particular, methods of Bayesian statistics in combination with modelling or inverse scattering techniques are considered in order to properly include the required a priori knowledge and obtain quantitatively reliable estimates of the reconstructed profiles. Applying Bayes' theorem, the results of different experiments on the same sample can be consistently included in the profile reconstruction.

  6. Image Reconstruction Based on Homotopy Perturbation Inversion Method for Electrical Impedance Tomography

    Directory of Open Access Journals (Sweden)

    Jing Wang

    2013-01-01

    Image reconstruction for electrical impedance tomography (EIT) is mathematically a typical nonlinear ill-posed inverse problem. In this paper, a novel iterative regularization scheme based on the homotopy perturbation technique, namely the homotopy perturbation inversion method, is applied to investigate the EIT image reconstruction problem. To verify its feasibility and effectiveness, simulations of image reconstruction have been performed considering different locations, sizes, and numbers of inclusions, as well as robustness to data noise. Numerical results indicate that this method can overcome numerical instability and is robust to data noise in EIT image reconstruction. Moreover, compared with the classical Landweber iteration method, our approach improves the convergence rate. The results are promising.

  7. Diffusion archeology for diffusion progression history reconstruction.

    Science.gov (United States)

    Sefer, Emre; Kingsford, Carl

    2016-11-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable for learning when certain computer nodes became infected or which people were the initial disease spreaders, so that future diffusions can be controlled. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.

  8. Reconstructing Neutrino Mass Spectrum

    OpenAIRE

    Smirnov, A. Yu.

    1999-01-01

    Reconstruction of the neutrino mass spectrum and lepton mixing is one of the fundamental problems of particle physics. In this connection we consider two central topics: (i) the origin of large lepton mixing, and (ii) the possible existence of new (sterile) neutrino states. We also discuss a possible relation between large mixing and the existence of sterile neutrinos.

  9. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  10. A phylogenetic Kalman filter for ancestral trait reconstruction using molecular data.

    Science.gov (United States)

    Lartillot, Nicolas

    2014-02-15

    Correlation between life-history or ecological traits and genomic features such as nucleotide or amino acid composition can be used for reconstructing the evolutionary history of the traits of interest along phylogenies. Thus far, however, such ancestral reconstructions have been done using simple linear regression approaches that do not account for phylogenetic inertia. These reconstructions could instead be seen as a genuine comparative regression problem, as formalized by classical generalized least-squares comparative methods, in which the trait of interest and the molecular predictor are represented as correlated Brownian characters coevolving along the phylogeny. Here, a Bayesian sampler is introduced, representing an alternative and more efficient algorithmic solution to this comparative regression problem compared with currently existing generalized least-squares approaches. Technically, ancestral trait reconstruction based on a molecular predictor is shown to be formally equivalent to a phylogenetic Kalman filter problem, for which backward and forward recursions are developed and implemented in the context of a Markov chain Monte Carlo sampler. The comparative regression method results in more accurate reconstructions and a more faithful representation of uncertainty than simple linear regression. Application to the reconstruction of the evolution of optimal growth temperature in Archaea, using GC composition in ribosomal RNA stems and amino acid composition of a sample of protein-coding genes, confirms previous findings, in particular pointing to a hyperthermophilic ancestor for the kingdom. The program is freely available at www.phylobayes.org.
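    The phylogenetic recursions themselves are beyond a short snippet, but the standard linear-Gaussian Kalman filter they generalize is compact. A hedged scalar sketch (random-walk state observed with noise; all variances and data are invented) shows the predict/update recursion the abstract refers to:

```python
# One forward pass of a scalar Kalman filter for a random-walk state
# x_t = x_{t-1} + w_t (process variance q), observed as y_t = x_t + v_t
# (observation noise variance r).
def kalman_filter(ys, q=0.1, r=1.0, m0=0.0, p0=1.0):
    m, p = m0, p0
    means = []
    for y in ys:
        # predict: the random walk inflates the variance
        p = p + q
        # update: blend prediction and observation by the Kalman gain
        k = p / (p + r)
        m = m + k * (y - m)
        p = (1.0 - k) * p
        means.append(m)
    return means

# Constant observations pull the estimate steadily toward 1.
ms = kalman_filter([1.0, 1.0, 1.0, 1.0, 1.0])
```

    On a phylogeny the same predict/update logic runs along branches (Brownian motion as the process model), with a backward smoothing pass giving the ancestral reconstructions.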

  11. Resolution effects in reconstructing ancestral genomes.

    Science.gov (United States)

    Zheng, Chunfang; Jeong, Yuji; Turcotte, Madisyn Gabrielle; Sankoff, David

    2018-05-09

    The reconstruction of ancestral genomes must deal with the problem of resolution, which necessarily involves a trade-off between identifying genomic details and being overwhelmed by noise at higher resolutions. We use the median reconstruction, at the synteny-block level, of the ancestral genome of the order Gentianales, based on coffee, Rhazya stricta and grape, to exemplify the effects of resolution (granularity) on comparative genomic analyses. We show how decreased resolution blurs the differences between evolving genomes with respect to rate, mutational process and other characteristics.

  12. Parallel MR image reconstruction using augmented Lagrangian methods.

    Science.gov (United States)

    Ramani, Sathish; Fessler, Jeffrey A

    2011-03-01

    Magnetic resonance image (MRI) reconstruction using SENSitivity Encoding (SENSE) requires regularization to suppress noise and aliasing effects. Edge-preserving and sparsity-based regularization criteria can improve image quality, but they demand computation-intensive nonlinear optimization. In this paper, we present novel methods for regularized MRI reconstruction from undersampled sensitivity encoded data--SENSE-reconstruction--using the augmented Lagrangian (AL) framework for solving large-scale constrained optimization problems. We first formulate regularized SENSE-reconstruction as an unconstrained optimization task and then convert it to a set of (equivalent) constrained problems using variable splitting. We then attack these constrained versions in an AL framework using an alternating minimization method, leading to algorithms that can be implemented easily. The proposed methods are applicable to a general class of regularizers that includes popular edge-preserving (e.g., total-variation) and sparsity-promoting (e.g., l(1)-norm of wavelet coefficients) criteria and combinations thereof. Numerical experiments with synthetic and in vivo human data illustrate that the proposed AL algorithms converge faster than both general-purpose optimization algorithms such as nonlinear conjugate gradient (NCG) and state-of-the-art MFISTA.
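    As a hedged, minimal illustration of the variable-splitting idea (not the authors' SENSE formulation), the sketch below runs ADMM, an augmented Lagrangian method, on a tiny l1-regularized least-squares problem: a quadratic x-update, a soft-thresholding z-update, and a dual ascent step. All matrices and data are invented.

```python
def solve2(M, v):
    # Cramer's rule for the tiny 2x2 quadratic subproblem
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - v[0] * M[1][0]) / det]

def soft(t, k):
    # soft-thresholding: proximal operator of k * |t|
    return max(t - k, 0.0) - max(-t - k, 0.0)

def admm_l1(A, b, lam=0.1, rho=1.0, iters=200):
    # variable splitting: min 0.5*||Ax - b||^2 + lam*||z||_1  subject to  x = z
    m, n = len(A), 2   # two unknowns keep the direct solve tiny
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)] for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    M = [[AtA[i][j] + (rho if i == j else 0.0) for j in range(n)] for i in range(n)]
    x, z, u = [0.0] * n, [0.0] * n, [0.0] * n
    for _ in range(iters):
        x = solve2(M, [Atb[i] + rho * (z[i] - u[i]) for i in range(n)])  # quadratic step
        z = [soft(x[i] + u[i], lam / rho) for i in range(n)]             # shrinkage step
        u = [u[i] + x[i] - z[i] for i in range(n)]                       # dual ascent
    return z

A = [[1.0, 0.0], [0.0, 1.0]]   # toy "encoding" matrix
b = [1.0, 0.05]                # the small second measurement gets shrunk to zero
z = admm_l1(A, b)
```

    The appeal for MRI is that each subproblem is simple (a linear solve and a shrinkage), which is what makes the AL algorithms easy to implement and fast to converge.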

  13. Geometric reconstruction methods for electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Alpers, Andreas, E-mail: alpers@ma.tum.de [Zentrum Mathematik, Technische Universität München, D-85747 Garching bei München (Germany); Gardner, Richard J., E-mail: Richard.Gardner@wwu.edu [Department of Mathematics, Western Washington University, Bellingham, WA 98225-9063 (United States); König, Stefan, E-mail: koenig@ma.tum.de [Zentrum Mathematik, Technische Universität München, D-85747 Garching bei München (Germany); Pennington, Robert S., E-mail: robert.pennington@uni-ulm.de [Center for Electron Nanoscopy, Technical University of Denmark, DK-2800 Kongens Lyngby (Denmark); Boothroyd, Chris B., E-mail: ChrisBoothroyd@cantab.net [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons and Peter Grünberg Institute, Forschungszentrum Jülich, D-52425 Jülich (Germany); Houben, Lothar, E-mail: l.houben@fz-juelich.de [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons and Peter Grünberg Institute, Forschungszentrum Jülich, D-52425 Jülich (Germany); Dunin-Borkowski, Rafal E., E-mail: rdb@fz-juelich.de [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons and Peter Grünberg Institute, Forschungszentrum Jülich, D-52425 Jülich (Germany); Joost Batenburg, Kees, E-mail: Joost.Batenburg@cwi.nl [Centrum Wiskunde and Informatica, NL-1098XG, Amsterdam, The Netherlands and Vision Lab, Department of Physics, University of Antwerp, B-2610 Wilrijk (Belgium)

    2013-05-15

    Electron tomography is becoming an increasingly important tool in materials science for studying the three-dimensional morphologies and chemical compositions of nanostructures. The image quality obtained by many current algorithms is seriously affected by the problems of missing wedge artefacts and non-linear projection intensities due to diffraction effects. The former refers to the fact that data cannot be acquired over the full 180° tilt range; the latter implies that for some orientations, crystalline structures can show strong contrast changes. To overcome these problems we introduce and discuss several algorithms from the mathematical fields of geometric and discrete tomography. The algorithms incorporate geometric prior knowledge (mainly convexity and homogeneity), which also in principle considerably reduces the number of tilt angles required. Results are discussed for the reconstruction of an InAs nanowire. - Highlights: ► Four algorithms for electron tomography are introduced that utilize prior knowledge. ► Objects are assumed to be homogeneous; convexity and regularity is also discussed. ► We are able to reconstruct slices of a nanowire from as few as four projections. ► Algorithms should be selected based on the specific reconstruction task at hand.

  14. Geometric reconstruction methods for electron tomography

    International Nuclear Information System (INIS)

    Alpers, Andreas; Gardner, Richard J.; König, Stefan; Pennington, Robert S.; Boothroyd, Chris B.; Houben, Lothar; Dunin-Borkowski, Rafal E.; Joost Batenburg, Kees

    2013-01-01

    Electron tomography is becoming an increasingly important tool in materials science for studying the three-dimensional morphologies and chemical compositions of nanostructures. The image quality obtained by many current algorithms is seriously affected by the problems of missing wedge artefacts and non-linear projection intensities due to diffraction effects. The former refers to the fact that data cannot be acquired over the full 180° tilt range; the latter implies that for some orientations, crystalline structures can show strong contrast changes. To overcome these problems we introduce and discuss several algorithms from the mathematical fields of geometric and discrete tomography. The algorithms incorporate geometric prior knowledge (mainly convexity and homogeneity), which also in principle considerably reduces the number of tilt angles required. Results are discussed for the reconstruction of an InAs nanowire. - Highlights: ► Four algorithms for electron tomography are introduced that utilize prior knowledge. ► Objects are assumed to be homogeneous; convexity and regularity is also discussed. ► We are able to reconstruct slices of a nanowire from as few as four projections. ► Algorithms should be selected based on the specific reconstruction task at hand

  15. Brachytherapy reconstruction using orthogonal scout views from the CT

    International Nuclear Information System (INIS)

    Perez, J.; Lliso, F.; Carmona, V.; Bea, J.; Tormo, A.; Petschen, I.

    1996-01-01

    Introduction: CT-assisted brachytherapy planning is proving to have great advantages, as CT-assisted external RT planning does. One of the problems we have found in this approach with the conventional gynecological Fletcher applicators is the high number of artefacts (ovoids with rectal and vesical protections) in the CT slice. We have introduced a reconstruction method based on scout views in order to avoid this problem, allowing us to perform brachytherapy reconstruction completely CT assisted. We use a virtual simulation chain by General Electric Medical Systems. Method and discussion: Two orthogonal scout views (0° and 90° tube positions) are performed. The reconstruction method takes into account the virtual position of the focus and the fact that there is only divergence in the transverse plane. Algorithms developed for source localisation as well as for reference point localisation (A, B, lymphatic Fletcher trapezoid, pelvic wall, etc.) are presented. This method has the following practical advantages: the porte-cassette is not necessary; the image quality can be improved (very helpful in pelvic lateral views, which are critical in conventional radiographs); the total time to acquire the data is shorter than for conventional radiographs (reducing patient motion effects); and problems that appear in CT-slice based reconstruction in the case of strongly curved intrauterine applicators are avoided. Even though the resolution is lower than in conventional radiographs, it is good enough for brachytherapy. Regarding CT planning, this method has the interesting feature that the coordinate system is the same for the reconstruction process as for the CT slice set. Since the application can be reconstructed from scout views and the doses can be evaluated on CT slices, it is easier to correlate the dose values obtained for the traditional points with those provided by the CT information.
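    The abstract states the reconstruction principle (known focus position, divergence only in the transverse plane) but not its equations. Purely as a toy illustration of recovering a point from two orthogonal divergent views, assume an idealized geometry in which each view magnifies the transverse coordinate by F/(F - depth along that view's beam axis), with F a hypothetical focus-to-axis distance; the two couplings can then be undone by a fixed-point iteration:

```python
def reconstruct_point(u, v, F=600.0, iters=50):
    # u: transverse coordinate in the 0-degree view, magnified by F / (F - y)
    # v: transverse coordinate in the 90-degree view, magnified by F / (F - x)
    x, y = u, v                      # start from the uncorrected projections
    for _ in range(iters):
        x = u * (F - y) / F          # undo the first view's magnification
        y = v * (F - x) / F          # undo the second view's magnification
    return x, y

# Forward-project a known point under the toy model, then recover it.
F, x0, y0 = 600.0, 40.0, -30.0
u = x0 * F / (F - y0)
v = y0 * F / (F - x0)
x, y = reconstruct_point(u, v, F)
```

    Because the coupling terms are small compared with F, the iteration contracts rapidly; the actual clinical algorithm must of course use the scanner's true focus geometry rather than this invented one.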

  16. Instrument Variables for Reducing Noise in Parallel MRI Reconstruction

    Directory of Open Access Journals (Sweden)

    Yuchou Chang

    2017-01-01

    Generalized autocalibrating partially parallel acquisition (GRAPPA) has been a widely used parallel MRI technique. However, noise deteriorates the reconstructed image as the reduction factor increases, or even at a low reduction factor for some noisy datasets. Noise, initially generated by the scanner, propagates noise-related errors through the fitting and interpolation procedures of GRAPPA and distorts the quality of the final reconstructed image. Our basic idea for improving GRAPPA is to remove noise from a system-identification perspective. In this paper, we first analyze the GRAPPA noise problem from a noisy input-output system perspective; then, a new framework based on the errors-in-variables (EIV) model is developed for analyzing the noise generation mechanism in GRAPPA and for designing a concrete method, instrument variables (IV) GRAPPA, to remove noise. The proposed EIV framework opens the possibility that noiseless GRAPPA reconstruction could be achieved by existing methods that solve the EIV problem other than the IV method. Experimental results show that the proposed reconstruction algorithm removes noise better than conventional GRAPPA, as validated with both phantom and in vivo brain data.
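    The GRAPPA-specific errors-in-variables machinery is beyond a snippet, but the core instrument-variables idea, removing the bias that noise on the regressor induces in ordinary least squares, fits in a few lines. A hedged scalar toy (all numbers invented; z is an idealized noise-free instrument):

```python
# Errors-in-variables toy: y = beta * x_true, but we only observe x = x_true + e.
# OLS on (x, y) is attenuated toward zero; an instrument z that is correlated
# with x_true and uncorrelated with e removes the bias:
#   beta_IV = sum(z*y) / sum(z*x)
beta = 2.0
x_true = [1.0, 2.0, 3.0, 4.0]
e = [0.5, -0.5, -0.5, 0.5]             # regressor noise, orthogonal to x_true
z = x_true[:]                           # idealized instrument
x = [t + n for t, n in zip(x_true, e)]  # noisy observed regressor
y = [beta * t for t in x_true]

beta_ols = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
beta_iv = sum(a * b for a, b in zip(z, y)) / sum(a * b for a, b in zip(z, x))
```

    Here the OLS slope is pulled below the true value 2.0 while the IV slope recovers it; in IV GRAPPA the same principle is applied to the noisy fitting of interpolation weights.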

  17. A sparsity-regularized Born iterative method for reconstruction of two-dimensional piecewise continuous inhomogeneous domains

    KAUST Repository

    Sandhu, Ali Imran; Desmal, Abdulla; Bagci, Hakan

    2016-01-01

    A sparsity-regularized Born iterative method (BIM) is proposed for efficiently reconstructing two-dimensional piecewise-continuous inhomogeneous dielectric profiles. Such profiles are typically not spatially sparse, which reduces the efficiency of sparsity-promoting regularization. To overcome this problem, the scattered fields are represented in terms of the spatial derivative of the dielectric profile, and the reconstruction is carried out over samples of the profile's derivative. Then, as in the conventional BIM, the nonlinear problem is iteratively converted into a sequence of linear problems (in the derivative samples), and the sparsity constraint is enforced on each linear problem using thresholded Landweber iterations. Numerical results, which demonstrate the efficiency and accuracy of the proposed method in reconstructing piecewise-continuous dielectric profiles, are presented.
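    The linearized subproblems here are solved with thresholded Landweber iterations, better known as ISTA: a gradient (Landweber) step on the data-fit term followed by soft-thresholding. A hedged pure-Python sketch on a made-up 2x2 linear problem (not the electromagnetic forward operator):

```python
def soft_threshold(t, k):
    # shrinkage applied after every Landweber (gradient) step
    return max(t - k, 0.0) - max(-t - k, 0.0)

def thresholded_landweber(A, b, lam=0.2, tau=0.5, iters=500):
    # ISTA: x <- soft(x + tau * A^T (b - A x), tau * lam)
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft_threshold(x[j] + tau * g[j], tau * lam) for j in range(n)]
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [2.0, 0.05]            # the small component is annihilated by the threshold
x = thresholded_landweber(A, b)
```

    The threshold drives small coefficients exactly to zero, which is why the method works well once the unknown (here, the profile's derivative) really is sparse.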

  18. Fast half-sibling population reconstruction: theory and algorithms.

    Science.gov (United States)

    Dexter, Daniel; Brown, Daniel G

    2013-07-12

    Kinship inference is the task of identifying genealogically related individuals. Kinship information is important for determining mating structures, notably in endangered populations. Although many solutions exist for reconstructing full sibling relationships, few exist for half-siblings. We consider the problem of determining whether a proposed half-sibling population reconstruction is valid under Mendelian inheritance assumptions. We show that this problem is NP-complete and provide a 0/1 integer program that identifies the minimum number of individuals that must be removed from a population in order for the reconstruction to become valid. We also present SibJoin, a heuristic-based clustering approach based on Mendelian genetics, which is strikingly fast. The software is available at http://github.com/ddexter/SibJoin.git+. Our SibJoin algorithm is reasonably accurate and thousands of times faster than existing algorithms. The heuristic is used to infer a half-sibling structure for a population which was, until recently, too large to evaluate.

  19. Automatic Texture Optimization for 3D Urban Reconstruction

    Directory of Open Access Journals (Sweden)

    LI Ming

    2017-03-01

    In order to solve the problem of texture optimization in 3D city reconstruction using multi-lens oblique images, this paper presents a method for seamless texture model reconstruction. First, it corrects the radiometric information of the images using camera response functions and the image dark channel. Then, according to the correspondence between the terrain triangular mesh surface model and the images, it performs occlusion detection with a sparse triangulation method and establishes the list of visible textures for each triangle. Finally, combining the topological relationships of the triangles in the 3D triangular mesh surface model with the means and variances of the images, it constructs a graph-cuts-based texture optimization algorithm under the MRF (Markov random field) framework to solve the discrete label problem of texture selection and clustering, ensuring the consistency of adjacent triangles in texture mapping and achieving seamless texture reconstruction of the city. The experimental results verify the validity and superiority of the proposed method.

  20. Reconstruction of network topology using status-time-series data

    Science.gov (United States)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and the structure of the network can be utilized to retrieve the connection pattern from the diffusion data, and information about the network structure can in turn help to devise control of the dynamics on the network. In this paper, we consider the problem of network reconstruction from available STS data using matrix analysis. The proposed method of network reconstruction is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method. Our proposed method outperforms the compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure is applied to weighted networks, where the ordering of the edges is identified with high accuracy.

  1. Image reconstruction of fluorescent molecular tomography based on the tree structured Schur complement decomposition

    Directory of Open Access Journals (Sweden)

    Wang Jiajun

    2010-05-01

    Abstract Background The inverse problem of fluorescent molecular tomography (FMT) often involves complex large-scale matrix operations, which may lead to unacceptable computational errors and complexity. In this research, a tree structured Schur complement decomposition strategy is proposed to accelerate the reconstruction process and reduce the computational complexity. Additionally, an adaptive regularization scheme is developed to mitigate the ill-posedness of the inverse problem. Methods The global system is decomposed level by level with the Schur complement system along two paths in the tree structure. The resultant subsystems are solved in combination with the biconjugate gradient method. The mesh for the inverse problem is generated incorporating the prior information. During the reconstruction, the regularization parameters are adaptive not only to the spatial variations but also to the variations of the objective function to tackle the ill-posed nature of the inverse problem. Results Simulation results demonstrate that the strategy of the tree structured Schur complement decomposition clearly outperforms the previous methods, such as the conventional Conjugate-Gradient (CG) and Schur CG methods, in both reconstruction accuracy and speed. As compared with the Tikhonov regularization method, the adaptive regularization scheme can significantly mitigate the ill-posedness of the inverse problem. Conclusions The methods proposed in this paper can significantly improve the reconstructed image quality of FMT and accelerate the reconstruction process.
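    The tree-structured decomposition applies the same elementary identity at every level: eliminate one block of unknowns via a Schur complement and solve the smaller system that remains. A hedged sketch with scalar "blocks" (in FMT each block is a large sparse submatrix and each division becomes a subsystem solve):

```python
# Solve the 2x2 block system  [A B; C D] [x; y] = [f; g]
# by eliminating x with the Schur complement S = D - C A^{-1} B.
def schur_solve(A, B, C, D, f, g):
    S = D - C * (B / A)          # Schur complement of A
    y = (g - C * (f / A)) / S    # reduced system in y
    x = (f - B * y) / A          # back-substitute for x
    return x, y

# Toy system: 4x + y = 6, 2x + 3y = 8, with solution (1, 2).
x, y = schur_solve(4.0, 1.0, 2.0, 3.0, 6.0, 8.0)
```

    Applying this elimination recursively along a tree is what lets the method replace one large solve with a hierarchy of much smaller ones.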

  2. Riemann–Hilbert problem approach for two-dimensional flow inverse scattering

    Energy Technology Data Exchange (ETDEWEB)

    Agaltsov, A. D., E-mail: agalets@gmail.com [Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, 119991 Moscow (Russian Federation); Novikov, R. G., E-mail: novikov@cmap.polytechnique.fr [CNRS (UMR 7641), Centre de Mathématiques Appliquées, Ecole Polytechnique, 91128 Palaiseau (France); IEPT RAS, 117997 Moscow (Russian Federation); Moscow Institute of Physics and Technology, Dolgoprudny (Russian Federation)

    2014-10-15

    We consider inverse scattering for the time-harmonic wave equation with first-order perturbation in two dimensions. This problem arises in particular in the acoustic tomography of moving fluid. We consider linearized and nonlinearized reconstruction algorithms for this problem of inverse scattering. Our nonlinearized reconstruction algorithm is based on the non-local Riemann–Hilbert problem approach. Comparisons with preceding results are given.

  3. Riemann–Hilbert problem approach for two-dimensional flow inverse scattering

    International Nuclear Information System (INIS)

    Agaltsov, A. D.; Novikov, R. G.

    2014-01-01

    We consider inverse scattering for the time-harmonic wave equation with first-order perturbation in two dimensions. This problem arises in particular in the acoustic tomography of moving fluid. We consider linearized and nonlinearized reconstruction algorithms for this problem of inverse scattering. Our nonlinearized reconstruction algorithm is based on the non-local Riemann–Hilbert problem approach. Comparisons with preceding results are given

  4. Silicon and Germanium (111) Surface Reconstruction

    Science.gov (United States)

    Hao, You Gong

    Silicon (111) surface (7 x 7) reconstruction has been a long-standing puzzle. For the last twenty years, various models have been put forward to explain this reconstruction, but so far the problem remains unsolved. Recent ion scattering and channeling (ISC), scanning tunneling microscopy (STM) and transmission electron diffraction (TED) experiments reveal new results about the surface which greatly help investigators establish better models. This work proposes a silicon (111) surface reconstruction mechanism, the raising and lowering mechanism, which leads to benzene-like ring and flower (raised atom) building units. Based on these building units a (7 x 7) model is proposed, which is capable of explaining the STM and ISC experiments and several others. Furthermore, the building units of the model can be used naturally to account for the germanium (111) surface c(2 x 8) reconstruction and other observed structures, including (2 x 2), (5 x 5) and (7 x 7) for germanium, as well as the (√3 x √3)R30° and (√19 x √19)R23.5° impurity-induced structures for silicon, and the higher-temperature disordered (1 x 1) structure for silicon. The model is closely related to the silicon (111) surface (2 x 1) reconstruction pi-bonded chain model, which is the most successful model for that reconstruction to date. This provides an explanation for the rather low conversion temperature (560 K) of the (2 x 1) to the (7 x 7). The model seems to meet some problems in explaining the TED result, which is explained very well by the dimer, adatom and stacking fault (DAS) model proposed by Takayanagi. In order to explain the TED result, a variation of the atomic scattering factor is proposed. Comparing the benzene-like ring model with the DAS model, the former needs more work to explain the TED result and the latter has to find a way to explain the silicon (111) surface (1 x 1) disorder experiment.

  5. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and independent component analysis is not very clear, and its solution is not unique. Aiming at these disadvantages, a new linear and instantaneous mixing model and a novel method for source signal separation and reconstruction from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compounding matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the uncorrelatedness of the source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then converted into PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both the waveform and amplitude information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only the waveform information can be separated and reconstructed when the linear mixing matrix is column-orthogonal but not normalized; and uncorrelated source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or not linear.
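    To illustrate the claimed recovery under a column-orthonormal mixing matrix, here is a hedged pure-Python toy: two uncorrelated, zero-mean sources with distinct variances are mixed by a rotation, and the leading principal component of the observations recovers the dominant source (up to sign). All signals and the mixing angle are invented.

```python
import math

def pca_2x2(obs):
    # sample covariance of two zero-mean observed channels
    n = len(obs[0])
    c00 = sum(a * a for a in obs[0]) / n
    c11 = sum(b * b for b in obs[1]) / n
    c01 = sum(a * b for a, b in zip(obs[0], obs[1])) / n
    # closed-form eigendecomposition of [[c00, c01], [c01, c11]]
    tr, det = c00 + c11, c00 * c11 - c01 * c01
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)   # larger eigenvalue
    v = (c01, l1 - c00)                          # its eigenvector
    norm = math.hypot(*v)
    v = (v[0] / norm, v[1] / norm)
    # principal-component scores = reconstructed dominant source
    return [v[0] * a + v[1] * b for a, b in zip(obs[0], obs[1])]

# Uncorrelated zero-mean sources with distinct variances (4 vs 1).
s1 = [2.0, -2.0, 2.0, -2.0]
s2 = [1.0, 1.0, -1.0, -1.0]
theta = math.radians(30)                         # column-orthonormal mixing
x1 = [math.cos(theta) * a - math.sin(theta) * b for a, b in zip(s1, s2)]
x2 = [math.sin(theta) * a + math.cos(theta) * b for a, b in zip(s1, s2)]
rec = pca_2x2([x1, x2])
```

    Because the mixing is a rotation (column-orthonormal and normalized), the leading eigenvector equals a mixing column and both the waveform and amplitude of the dominant source are recovered, exactly as the abstract states.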

  6. Analyzing octopus movements using three-dimensional reconstruction.

    Science.gov (United States)

    Yekutieli, Yoram; Mitelman, Rea; Hochner, Binyamin; Flash, Tamar

    2007-09-01

    Octopus arms, as well as other muscular hydrostats, are characterized by a very large number of degrees of freedom and a rich motion repertoire. Over the years, several attempts have been made to elucidate the interplay between the biomechanics of these organs and their control systems. Recent developments in electrophysiological recordings from both the arms and brains of behaving octopuses mark significant progress in this direction. The next stage is relating these recordings to the octopus arm movements, which requires an accurate and reliable method of movement description and analysis. Here we describe a semiautomatic computerized system for 3D reconstruction of an octopus arm during motion. It consists of two digital video cameras and a PC running custom-made software. The system overcomes the difficulty of extracting the motion of smooth, nonrigid objects in poor viewing conditions. Part of the difficulty stems from light refraction when recording motion underwater. Here we use both experiments and simulations to analyze the refraction problem and show that accurate reconstruction is possible. We have used this system successfully to reconstruct different types of octopus arm movements, such as reaching and bend-initiation movements. Our system is noninvasive and does not require attaching artificial markers to the octopus arm. It may therefore be of more general use in reconstructing other nonrigid, elongated objects in motion.
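
    The flat-interface refraction geometry mentioned above can be illustrated with a short script (not the authors' code; the camera height, depth, refractive index and all names are assumptions for illustration). It locates the point where a ray from an underwater target crosses the surface by bisecting the Snell's-law residual, and shows that naive straight-line triangulation displaces the target:

```python
import math

N_WATER = 1.33  # assumed refractive index of water

def refraction_point(cam_h, pt_depth, pt_x, n=N_WATER, tol=1e-12):
    """Horizontal coordinate r where the ray from an underwater point
    (pt_x, -pt_depth) to a camera at (0, cam_h) crosses the surface.
    Found by bisection on the Snell's-law residual, which is monotone in r."""
    def residual(r):
        sin_air = r / math.hypot(r, cam_h)                      # above water
        sin_wat = (pt_x - r) / math.hypot(pt_x - r, pt_depth)   # below water
        return sin_air - n * sin_wat                            # = 0 on the true ray
    lo, hi = 0.0, pt_x
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if residual(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

cam_h, depth, true_x = 1.0, 0.5, 0.4
r = refraction_point(cam_h, depth, true_x)

# Extending the in-air ray as a straight line (i.e. ignoring refraction)
# places the target at a wrong, outward-shifted horizontal position.
apparent_x = r * (cam_h + depth) / cam_h
```

    The gap between `apparent_x` and `true_x` is the triangulation error a refraction-aware reconstruction must remove.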

  7. Direct maximum parsimony phylogeny reconstruction from genotype data

    OpenAIRE

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-01-01

    Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data more commonly is available in the form of ge...

  8. Prosthetic breast reconstruction: indications and update

    Science.gov (United States)

    Quinn, Tam T.; Miller, George S.; Rostek, Marie; Cabalag, Miguel S.; Rozen, Warren M.

    2016-01-01

    Background: Despite 82% of patients reporting psychosocial improvement following breast reconstruction, only 33% of patients choose to undergo surgery. Implant reconstruction outnumbers autologous reconstruction in many centres. Methods: A systematic review of the literature was undertaken. Inclusion required: (I) meta-analyses or review articles; (II) adult patients aged 18 years or over undergoing alloplastic breast reconstruction; (III) studies including outcome measures; (IV) case series with more than 10 patients; (V) English language; and (VI) publication after 1st January, 2000. Results: After full-text review, analysis and data extraction were conducted for a total of 63 articles. Definitive reconstruction with an implant can be immediate or delayed. Older patients have complication rates similar to, or even lower than, those of younger patients. Complications include capsular contracture, hematoma and infection. Obesity, smoking, large breasts, diabetes and higher-grade tumors are associated with an increased risk of wound problems and reconstructive failure. Silicone implant patients have higher capsular contracture rates but higher physical and psychosocial function. No associations were found between silicone implants and cancer or systemic disease. There were no differences in outcomes or complications between round and shaped implants. Textured implants have a lower risk of capsular contracture than smooth implants. Smooth implants are more likely to be displaced and have higher rates of infection. Immediate breast reconstruction (IBR) gives the best aesthetic outcome if radiotherapy is not required but has a higher rate of capsular contracture and implant failure. Delayed-immediate reconstruction patients can achieve aesthetic results similar to IBR while preserving the breast skin if radiotherapy is required. Delayed breast reconstruction (DBR) patients have fewer complications than IBR patients. Conclusions: Implant reconstruction is a safe and popular

  9. Polynomials, Riemann surfaces, and reconstructing missing-energy events

    CERN Document Server

    Gripaios, Ben; Webber, Bryan

    2011-01-01

    We consider the problem of reconstructing energies, momenta, and masses in collider events with missing energy, along with the complications introduced by combinatorial ambiguities and measurement errors. Typically, one reconstructs more than one value and we show how the wrong values may be correlated with the right ones. The problem has a natural formulation in terms of the theory of Riemann surfaces. We discuss examples including top quark decays in the Standard Model (relevant for top quark mass measurements and tests of spin correlation), cascade decays in models of new physics containing dark matter candidates, decays of third-generation leptoquarks in composite models of electroweak symmetry breaking, and Higgs boson decay into two tau leptons.

  10. A novel algorithm of super-resolution image reconstruction based on multi-class dictionaries for natural scene

    Science.gov (United States)

    Wu, Wei; Zhao, Dewei; Zhang, Huan

    2015-12-01

    Super-resolution image reconstruction is an effective method to improve image quality and has important research significance in the field of image processing. However, the choice of dictionary directly affects the efficiency of image reconstruction. Sparse representation theory is introduced into the nearest-neighbor selection problem. Building on the sparse-representation approach to super-resolution reconstruction, a super-resolution image reconstruction algorithm based on multi-class dictionaries is analyzed. This method avoids the redundancy of training a single over-complete dictionary, makes each sub-dictionary more representative, and replaces the traditional Euclidean distance computation to improve the quality of the whole reconstructed image. In addition, non-local self-similarity regularization is introduced to handle the ill-posed nature of the problem. Experimental results show that the algorithm achieves much better results than a state-of-the-art algorithm in terms of both PSNR and visual perception.
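
    A much-simplified sketch of the multi-class-dictionary idea: code a patch with Orthogonal Matching Pursuit (OMP) against each class sub-dictionary and keep the class with the smallest residual. The toy orthonormal dictionaries and all names are assumptions for illustration, not the paper's training procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedy k-sparse coding of y over
    dictionary D (columns assumed unit-norm). Returns the residual norm."""
    idx, resid = [], y.copy()
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ resid))))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        resid = y - D[:, idx] @ coef
    return np.linalg.norm(resid)

# Two toy "class" sub-dictionaries: random orthonormal bases.
dim = 20
D0 = np.linalg.qr(rng.standard_normal((dim, dim)))[0]
D1 = np.linalg.qr(rng.standard_normal((dim, dim)))[0]

# A patch synthesized from two atoms of class 0.
y = 0.8 * D0[:, 3] + 0.6 * D0[:, 7]

# Keep the class whose sub-dictionary codes the patch best.
residuals = [omp(D, y, k=2) for D in (D0, D1)]
best_class = int(np.argmin(residuals))
```

    The correct sub-dictionary codes the patch exactly, while a mismatched class leaves a large residual; this residual comparison stands in for the paper's replacement of plain Euclidean-distance neighbor selection.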

  11. Surface Reconstruction and Image Enhancement via $L^1$-Minimization

    KAUST Repository

    Dobrev, Veselin

    2010-01-01

    A surface reconstruction technique based on minimization of the total variation of the gradient is introduced. Convergence of the method is established, and an interior-point algorithm solving the associated linear programming problem is introduced. The reconstruction algorithm is illustrated on various test cases including natural and urban terrain data, and enhancement of low-resolution or aliased images. Copyright © by SIAM.

  12. Spectrotemporal CT data acquisition and reconstruction at low dose

    International Nuclear Information System (INIS)

    Clark, Darin P.; Badea, Cristian T.; Lee, Chang-Lung; Kirsch, David G.

    2015-01-01

    Purpose: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D + dual energy + time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols. Methods: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time and energy averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time and energy averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image domain filtration approach, which the authors refer to as rank-sparse kernel regression, the authors transfer image structure from the well-sampled time and energy averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation for the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. 
The authors solved the 5D reconstruction
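
    The low-rank-plus-sparse decomposition framework referred to above rests on two proximal operators: singular-value thresholding for the low-rank part and soft thresholding for the sparse part. A minimal sketch of the two operators (a generic illustration of the framework, not the authors' 5D algorithm):

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm: shrink entries toward zero by lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def singular_value_threshold(M, tau):
    """Proximal operator of the nuclear norm: soft-threshold the
    singular values of M, which lowers its rank."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

# On a diagonal matrix the effect is easy to read off: the singular
# values 5, 3, 1 shrink to 3, 1, 0, so the rank drops from 3 to 2.
L = singular_value_threshold(np.diag([5.0, 3.0, 1.0]), 2.0)
S = soft_threshold(np.array([3.0, -0.5, 2.0]), 1.0)
```

    Rank-sparsity constrained reconstructions typically alternate these two operators (plus a data-consistency step) so that slowly varying structure lands in the low-rank term and localized contrast changes land in the sparse term.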

  13. A sparsity-regularized Born iterative method for reconstruction of two-dimensional piecewise continuous inhomogeneous domains

    KAUST Repository

    Sandhu, Ali Imran

    2016-04-10

    A sparsity-regularized Born iterative method (BIM) is proposed for efficiently reconstructing two-dimensional piecewise-continuous inhomogeneous dielectric profiles. Such profiles are typically not spatially sparse, which reduces the efficiency of the sparsity-promoting regularization. To overcome this problem, scattered fields are represented in terms of the spatial derivative of the dielectric profile, and reconstruction is carried out over samples of the dielectric profile's derivative. Then, like the conventional BIM, the nonlinear problem is iteratively converted into a sequence of linear problems (in derivative samples) and a sparsity constraint is enforced on each linear problem using thresholded Landweber iterations. Numerical results, which demonstrate the efficiency and accuracy of the proposed method in reconstructing piecewise-continuous dielectric profiles, are presented.
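
    In its basic form, the thresholded Landweber iteration is a gradient (Landweber) step on the data-fit term followed by soft thresholding. A generic sketch for a linear problem y = Ax with sparse x; the matrix sizes, step size, and threshold are illustrative assumptions, not the paper's electromagnetic operators:

```python
import numpy as np

rng = np.random.default_rng(2)

def soft(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Underdetermined linear problem with a sparse ground truth.
m, n = 50, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 42, 77]] = 5.0
y = A @ x_true

# Thresholded Landweber iterations: a Landweber (gradient) step on
# ||y - Ax||^2, then soft thresholding to enforce sparsity.
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
x = np.zeros(n)
for _ in range(300):
    x = soft(x + step * (A.T @ (y - A @ x)), step * lam)

def objective(v):
    return 0.5 * np.sum((y - A @ v) ** 2) + lam * np.sum(np.abs(v))
```

    With a step no larger than 1/‖A‖², each iteration is guaranteed not to increase the l1-regularized objective, which is what makes the scheme a safe inner solver for each linearized Born problem.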

  14. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  15. Image reconstruction under non-Gaussian noise

    DEFF Research Database (Denmark)

    Sciacchitano, Federica

    During acquisition and transmission, images are often blurred and corrupted by noise. One of the fundamental tasks of image processing is to reconstruct the clean image from a degraded version. The process of recovering the original image from the data is an example of inverse problem. Due...... to the ill-posedness of the problem, the simple inversion of the degradation model does not give any good reconstructions. Therefore, to deal with the ill-posedness it is necessary to use some prior information on the solution or the model and the Bayesian approach. Additive Gaussian noise has been......D thesis intends to solve some of the many open questions for image restoration under non-Gaussian noise. The two main kinds of noise studied in this PhD project are the impulse noise and the Cauchy noise. Impulse noise is due to for instance the malfunctioning pixel elements in the camera sensors, errors...

  16. Reconstruction of chaotic signals with applications to chaos-based communications

    CERN Document Server

    Feng, Jiu Chao

    2008-01-01

    This book provides a systematic review of the fundamental theory of signal reconstruction and the practical techniques used in reconstructing chaotic signals. Specific applications of signal reconstruction methods in chaos-based communications are expounded in full detail, along with examples illustrating the various problems associated with such applications.The book serves as an advanced textbook for undergraduate and graduate courses in electronic and information engineering, automatic control, physics and applied mathematics. It is also highly suited for general nonlinear scientists who wi

  17. Inverse Heat Conduction Methods in the CHAR Code for Aerothermal Flight Data Reconstruction

    Science.gov (United States)

    Oliver, A. Brandon; Amar, Adam J.

    2016-01-01

    Reconstruction of flight aerothermal environments often requires the solution of an inverse heat transfer problem, which is an ill-posed problem of determining boundary conditions from discrete measurements in the interior of the domain. This paper will present the algorithms implemented in the CHAR code for use in reconstruction of EFT-1 flight data and future testing activities. Implementation details will be discussed, and alternative hybrid-methods that are permitted by the implementation will be described. Results will be presented for a number of problems.

  18. Electrical Impedance Tomography: 3D Reconstructions using Scattering Transforms

    DEFF Research Database (Denmark)

    Delbary, Fabrice; Hansen, Per Christian; Knudsen, Kim

    2012-01-01

    In three dimensions the Calderon problem was addressed and solved in theory in the 1980s. The main ingredients in the solution of the problem are complex geometrical optics solutions to the conductivity equation and a (non-physical) scattering transform. The resulting reconstruction algorithm...

  19. Exact reconstruction in 2D dynamic CT: compensation of time-dependent affine deformations

    International Nuclear Information System (INIS)

    Roux, Sebastien; Desbat, Laurent; Koenig, Anne; Grangeat, Pierre

    2004-01-01

    This work is dedicated to the reduction of reconstruction artefacts due to motion occurring during the acquisition of computerized tomographic projections. This problem has to be solved when imaging moving organs such as the lungs or the heart. The proposed method belongs to the class of motion compensation algorithms, where the model of motion is included in the reconstruction formula. We address two fundamental questions: first, what conditions on the deformation are required for reconstruction of the object from projections acquired sequentially during the deformation, and second, how to reconstruct the object from those projections. Here we answer these questions in the particular case of 2D general time-dependent affine deformations, assuming the motion parameters are known. We treat the problem of admissibility conditions on the deformation in the parallel-beam and fan-beam cases. We then propose exact reconstruction methods based on rebinning or sequential FBP formulae for each of these geometries and present reconstructed images obtained with the fan-beam algorithm on simulated data.

  20. A modified sparse reconstruction method for three-dimensional synthetic aperture radar image

    Science.gov (United States)

    Zhang, Ziqiang; Ji, Kefeng; Song, Haibo; Zou, Huanxin

    2018-03-01

    There is increasing interest in three-dimensional Synthetic Aperture Radar (3-D SAR) imaging from observed sparse scattering data. However, the existing 3-D sparse imaging method requires large computing time and storage capacity. In this paper, we propose a modified method for sparse 3-D SAR imaging. The method processes the collection of noisy SAR measurements, usually collected over nonlinear flight paths, and outputs 3-D SAR imagery. Firstly, the 3-D sparse reconstruction problem is transformed into a series of 2-D slice reconstruction problems by range compression. Then the slices are reconstructed by a modified SL0 (smoothed l0 norm) reconstruction algorithm. The improved algorithm uses the hyperbolic tangent function instead of the Gaussian function to approximate the l0 norm and uses the Newton direction instead of the steepest-descent direction, which speeds up the convergence of the SL0 algorithm. Finally, numerical simulation results demonstrate the effectiveness of the proposed algorithm. Compared with the existing 3-D sparse imaging method, our method performs better in both reconstruction quality and reconstruction time.
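
    A compact sketch of the smoothed-l0 idea with a hyperbolic-tangent surrogate, as described above: start from the minimum-l2 feasible solution, take shrinkage steps on the smoothed l0 measure while a decreasing sigma sharpens the surrogate, and re-project onto the constraint Ax = y after each step. All sizes, schedules, and names are illustrative assumptions; this uses a plain gradient step rather than the paper's Newton direction:

```python
import numpy as np

rng = np.random.default_rng(3)

m, n = 40, 80
A = rng.standard_normal((m, n))
A_pinv = np.linalg.pinv(A)

x_true = np.zeros(n)
x_true[[4, 19, 55]] = 5.0
y = A @ x_true

# Start from the minimum-l2-norm feasible solution.
x = A_pinv @ y
x_init_norm = np.linalg.norm(x)

sigma = np.max(np.abs(x))
for _ in range(8):                      # decreasing-sigma outer loop
    for _ in range(5):                  # inner descent/projection steps
        # Gradient of sum(tanh(x^2 / (2 sigma^2))): pushes small entries
        # toward zero while leaving large entries almost untouched.
        u = x ** 2 / (2.0 * sigma ** 2)
        grad = (x / sigma ** 2) / np.cosh(u) ** 2
        x = x - (sigma ** 2) * grad     # shrinkage step scaled by sigma^2
        x = x - A_pinv @ (A @ x - y)    # project back onto A x = y
    sigma *= 0.5
```

    Every iterate stays feasible thanks to the projection, and the tanh surrogate drives the solution away from the dense minimum-norm starting point toward a sparse one.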

  1. Detector independent cellular automaton algorithm for track reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Kisel, Ivan; Kulakov, Igor; Zyzak, Maksym [Goethe Univ. Frankfurt am Main (Germany); Frankfurt Institute for Advanced Studies, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung GmbH (Germany); Collaboration: CBM-Collaboration

    2013-07-01

    Track reconstruction is one of the most challenging problems of data analysis in modern high energy physics (HEP) experiments, which have to process on the order of 10^7 events per second with high track multiplicity and density, registered by detectors of different types and, in many cases, located in a non-homogeneous magnetic field. Creating a reconstruction package common to all experiments is considered important in order to consolidate efforts. The cellular automaton (CA) track reconstruction approach has been used successfully in many HEP experiments. It is very simple, efficient, local and parallel. At the same time it is intrinsically independent of the detector geometry, making it a good candidate for a common track reconstruction package. The CA implementation for the CBM experiment has been generalized and applied to the ALICE ITS and STAR HFT detectors. Tests with simulated collisions have been performed. The track reconstruction efficiencies are at the level of 95% for the majority of the signal tracks for all detectors.

  2. Optimization of reconstruction algorithms using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Hanson, K.M.

    1989-01-01

    A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a nonnegativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. 11 refs., 6 figs., 2 tabs
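
    The relaxation factor being optimized is the lambda in the basic ART (Kaczmarz) row update, and the nonnegativity constraint is a clip after each update. A minimal sketch of relaxed, optionally constrained ART; the small test system is an illustrative assumption:

```python
import numpy as np

def art(A, b, relax=1.0, sweeps=500, nonneg=False):
    """Algebraic Reconstruction Technique: cycle through the rows of A,
    nudging x toward each hyperplane a_i . x = b_i by the relaxation
    factor, optionally projecting onto the nonnegative orthant."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x = x + relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
            if nonneg:
                x = np.maximum(x, 0.0)   # constrained ART
    return x

# Small consistent test system with a nonnegative solution.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true

x_hat = art(A, b, relax=1.0, nonneg=True)
```

    Varying `relax` between 0 and 2 trades per-sweep progress against noise amplification, which is exactly the parameter the Monte Carlo task-performance study sweeps over.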

  4. [Reconstructive surgery of cranio-orbital injuries].

    Science.gov (United States)

    Eolchiian, S A; Potapov, A A; Serova, N K; Kataev, M G; Sergeeva, L A; Zakharova, N E; Van Damm, P

    2011-01-01

    The aim of this study was to optimize the evaluation and surgery of cranio-orbital injuries in different periods after trauma. Material and methods: We analyzed 374 patients with cranio-orbital injuries treated in the Burdenko Neurosurgery Institute in different periods after trauma from January 1998 till April 2010. 288 (77%) underwent skull and facial skeleton reconstructive surgery within 24 hours to 7 years after trauma. Clinical and CT examination data were used for preoperative planning and assessment of surgical results. Stereolithographic models (STLM) were applied for preoperative planning in 89 cases. The follow-up period ranged from 4 months up to 10 years. Results: In 254 (88%) of 288 patients, reconstruction of the anterior skull base, upper and/or midface with restoration of different parts of the orbit was performed. Anterior skull base CSF leak repair, calvarial vault reconstruction, and maxillary and mandibular osteosynthesis were done in 34 (12%) cases. 242 (84%) of 288 patients underwent one reconstructive operation, while 46 (16%) required two or more (105 operations in total). Patients with extended fronto-orbital and midface fractures commonly needed more than one operation: 27 (62.8%) cases. Different plastic materials were used for reconstruction in 233 (80.9%) patients; of those, in 147 (51%) cases split calvarial bone grafts were preferred. Good functional and cosmetic results were achieved in 261 (90.6%) of 288 patients, while acceptable results were observed in 27 (9.4%). Conclusion: Active single-stage surgical management for repair of combined cranio-orbital injury in the acute period, with primary reconstruction, optimizes functional and cosmetic outcomes and prevents the problems of delayed or secondary reconstruction. Severe extended anterior skull base, upper and midface injuries in which intracranial surgery is needed present the greatest difficulties for adequate reconstruction. A randomized trial is required to define the extent and optimal timing of reconstructive surgery.

  5. Magnetic flux reconstruction methods for shaped tokamaks

    International Nuclear Information System (INIS)

    Tsui, Chi-Wa.

    1993-12-01

    The use of a variational method permits the Grad-Shafranov (GS) equation to be solved by reducing the problem of solving the 2D non-linear partial differential equation to the problem of minimizing a function of several variables. This high-speed algorithm approximately solves the GS equation given a parameterization of the plasma boundary and the current profile (p' and FF' functions). The author treats the current profile parameters as unknowns. The goal is to reconstruct the internal magnetic flux surfaces of a tokamak plasma and the toroidal current density profile from the external magnetic measurements. This is a classic problem of inverse equilibrium determination. The current profile parameters can be evaluated by several different matching procedures. Matching of magnetic flux and field at the probe locations using the Biot-Savart law and the magnetic Green's function provides a robust method of magnetic reconstruction. Matching of the poloidal magnetic field on the plasma surface provides a unique method of identifying the plasma current profile. However, the power of this method is greatly compromised by the experimental errors of the magnetic signals. The Casing Principle provides a very fast way to evaluate the plasma contribution to the magnetic signals, and thus has the potential of being a fast matching method. Its performance is hindered by the accuracy of the poloidal magnetic field computed from the equilibrium solver. A flux reconstruction package has been implemented which integrates a vacuum field solver using a filament model for the plasma, a multi-layer perceptron neural network as an interface, and the volume integration of plasma current density using Green's functions as a matching method for the current profile parameters. The flux reconstruction package is applied for comparison with the ASEQ and EFIT data, and the results are promising.

  6. Tomographic image reconstruction using training images

    DEFF Research Database (Denmark)

    Soltani, Sara; Andersen, Martin Skovgaard; Hansen, Per Christian

    2017-01-01

    We describe and examine an algorithm for tomographic image reconstruction where prior knowledge about the solution is available in the form of training images. We first construct a non-negative dictionary based on prototype elements from the training images; this problem is formulated within...

  7. Use of an object model in three dimensional image reconstruction. Application in medical imaging

    International Nuclear Information System (INIS)

    Delageniere-Guillot, S.

    1993-02-01

    Three-dimensional image reconstruction from projections corresponds to a set of techniques which give information on the inner structure of the studied object. These techniques are mainly used in medical imaging or in non-destructive evaluation. Image reconstruction is an ill-posed problem, so the inversion has to be regularized. This thesis deals with the introduction of a priori information within the reconstruction algorithm, where the knowledge is introduced through an object model. The proposed scheme is applied to the medical domain for cone-beam geometry. We address two specific problems. First, we study the reconstruction of high-contrast objects. This can be applied to bony morphology (bone/soft tissue) or to angiography (vascular structures opacified by injection of a contrast agent). With noisy projections, the filtering steps of standard methods tend to smooth the natural transitions of the investigated object. In order to regularize the reconstruction while keeping contrast, we introduce a model of classes based on Markov random field theory and develop an analytic reconstruction-reprojection scheme. Second, we address the case of an object changing during the acquisition. This can be applied to angiography when the contrast agent is moving through the vascular tree. The problem is then stated as a dynamic reconstruction. We define an evolution AR model and use an algebraic reconstruction method, representing the object at a particular moment as an intermediary state between the states of the object at the beginning and at the end of the acquisition. We test both methods on simulated and real data, and show how the use of an a priori model can improve the results. (author)

  8. On artefact-free reconstruction of low-energy (30–250 eV) electron holograms

    Energy Technology Data Exchange (ETDEWEB)

    Latychevskaia, Tatiana, E-mail: tatiana@physik.uzh.ch; Longchamp, Jean-Nicolas; Escher, Conrad; Fink, Hans-Werner

    2014-10-15

    Low-energy electrons (30–250 eV) have been successfully employed for imaging individual biomolecules. The simplest and most elegant design of a low-energy electron microscope for imaging biomolecules is a lensless setup that operates in the holographic mode. In this work we address the problems associated with reconstruction from the recorded holograms. We discuss the twin image problem intrinsic to inline holography and the problem of the so-called biprism-like effect specific to low-energy electrons. We demonstrate how the presence of the biprism-like effect can be efficiently identified and circumvented. The presented sideband filtering reconstruction method eliminates the twin image and allows for reconstruction despite the biprism-like effect, which we demonstrate on both simulated and experimental examples. - Highlights: • Radiation damage-free imaging of individual biomolecules. • Elimination of the twin image in inline holograms. • Circumventing biprism-like effect in low-energy electron holograms. • Artefact-free reconstructions of low-energy electron holograms.
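
    The sideband-filtering idea can be illustrated in one dimension: the hologram's Fourier transform contains the object spectrum in two conjugate sidebands (one of which produces the twin image); windowing a single sideband and demodulating recovers the object wave free of its twin. A toy simulation with an assumed plane-wave-like carrier (illustrative parameters, not the authors' electron-optics code):

```python
import numpy as np

N = 1024
x = np.arange(N)
k0 = 100                                    # carrier frequency (FFT bins)

# Object wave: a smooth envelope riding on the carrier; reference wave: 1.
g = np.exp(-((x - N / 2) ** 2) / (2.0 * 40.0 ** 2))
obj = g * np.exp(2j * np.pi * k0 * x / N)
hologram = np.abs(1.0 + obj) ** 2           # recorded intensity

# Spectrum: baseband terms near bin 0, object sideband at +k0,
# twin-image (conjugate) sideband at -k0.
H = np.fft.fft(hologram)
mask = np.zeros(N)
mask[k0 - 30:k0 + 31] = 1.0                 # keep only the +k0 sideband

# Inverse transform and demodulate: the twin term is gone.
sideband = np.fft.ifft(H * mask)
recovered = sideband * np.exp(-2j * np.pi * k0 * x / N)
```

    `recovered.real` matches the envelope `g` to high accuracy, whereas an unfiltered inverse transform would superimpose the conjugate (twin) term on the object wave.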

  10. [Development and current situation of reconstruction methods following total sacrectomy].

    Science.gov (United States)

    Huang, Siyi; Ji, Tao; Guo, Wei

    2018-05-01

    To review the development of reconstruction methods following total sacrectomy, and to provide a reference for finding a better reconstruction method following total sacrectomy. Case reports and biomechanical and finite element studies of reconstruction following total sacrectomy at home and abroad were reviewed, and the development and current status of these methods were summarized. After nearly 30 years of development, great progress has been made in reconstruction concepts and fixation techniques. The fixation methods can be summarized as the following three strategies: spinopelvic fixation (SPF), posterior pelvic ring fixation (PPRF), and anterior spinal column fixation (ASCF). SPF has undergone technical progress from intrapelvic rod and hook constructs to pedicle and iliac screw-rod systems. PPRF and ASCF can improve the stability of the reconstruction system. Reconstruction following total sacrectomy remains a challenge. Reconstruction combining SPF, PPRF, and ASCF is the developmental direction for achieving mechanical stability. How to obtain biological fixation to improve long-term stability is an urgent problem to be solved.

  11. A biomechanical modeling-guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    Science.gov (United States)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2018-02-01

    Reconstructing four-dimensional cone-beam computed tomography (4D-CBCT) images directly from respiratory phase-sorted traditional 3D-CBCT projections can capture target motion trajectory, reduce motion artifacts, and reduce imaging dose and time. However, the limited number of projections in each phase after phase-sorting degrades CBCT image quality under traditional reconstruction techniques. To address this problem, we developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm, an iterative method that can reconstruct higher quality 4D-CBCT images from limited projections using an inter-phase intensity-driven motion model. However, the accuracy of the intensity-driven motion model is limited in regions with fine details, whose quality is degraded by the insufficient number of projections; this in turn degrades the reconstructed image quality in the corresponding regions. In this study, we developed a new 4D-CBCT reconstruction algorithm by introducing biomechanical modeling into SMEIR (SMEIR-Bio) to boost the accuracy of the motion model in regions with fine structures. The biomechanical modeling uses tetrahedral meshes to model organs of interest and solves internal organ motion using tissue elasticity parameters and mesh boundary conditions. This physics-driven approach enhances the accuracy of the solved motion in the organ's finely structured regions. This study used 11 lung patient cases to evaluate the performance of SMEIR-Bio, making both qualitative and quantitative comparisons between SMEIR-Bio, SMEIR, and the algebraic reconstruction technique with total variation regularization (ART-TV). The reconstruction results suggest that SMEIR-Bio improves the motion model's accuracy in regions containing fine details, which consequently enhances the accuracy and quality of the reconstructed 4D-CBCT images.

  12. An interior-point method for total variation regularized positron emission tomography image reconstruction

    Science.gov (United States)

    Bai, Bing

    2012-03-01

    There has been much recent work on total variation (TV) regularized tomographic image reconstruction. Much of it uses gradient-based optimization algorithms with a differentiable approximation of the TV functional. In this paper we apply TV regularization to positron emission tomography (PET) image reconstruction. We reconstruct the PET image in a Bayesian framework, using a Poisson noise model and a TV prior functional. The original optimization problem is transformed into an equivalent problem with inequality constraints by adding auxiliary variables. We then use an interior point method with logarithmic barrier functions to solve the constrained optimization problem. In this method, a series of points approaching the solution from inside the feasible region is found by solving a sequence of subproblems characterized by an increasing positive parameter. We use a preconditioned conjugate gradient (PCG) algorithm to solve the subproblems directly. The nonnegativity constraint is enforced by a bent line search. The exact expression of the TV functional is used in our calculations. Simulation results show that the algorithm converges quickly and that convergence is insensitive to the values of the regularization and reconstruction parameters.
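    The barrier strategy described above can be sketched on a toy problem. The following is a minimal log-barrier interior-point loop for nonnegativity-constrained least squares, with exact Newton steps standing in for the paper's preconditioned conjugate gradient subproblem solver; the barrier schedule and tolerances are illustrative assumptions, and the TV term is omitted.

```python
import numpy as np

def barrier_solve(A, b, t0=1.0, factor=10.0, n_outer=8, n_inner=50, tol=1e-8):
    """Minimise ||Ax - b||^2 subject to x > 0 with a logarithmic barrier.

    Each outer iteration minimises phi_t(x) = t*||Ax - b||^2 - sum(log x)
    by damped Newton steps, then increases the barrier parameter t, so the
    iterates approach the constrained optimum from inside the feasible set.
    """
    x = np.ones(A.shape[1])          # strictly feasible starting point
    t = t0
    for _ in range(n_outer):
        for _ in range(n_inner):
            g = 2 * t * A.T @ (A @ x - b) - 1.0 / x
            H = 2 * t * A.T @ A + np.diag(1.0 / x ** 2)
            step = np.linalg.solve(H, g)
            alpha = 1.0              # damp the step to stay strictly feasible
            while np.any(x - alpha * step <= 0):
                alpha *= 0.5
            x = x - alpha * step
            if np.linalg.norm(g) < tol * t:
                break
        t *= factor
    return x

# Toy example: the unconstrained optimum has a negative component,
# so the constrained solution sits on the boundary x2 = 0.
x_hat = barrier_solve(np.eye(2), np.array([1.0, -1.0]))
```

    For this toy system the solver returns approximately (1, 0): the second coordinate is pushed toward the boundary while remaining strictly positive, as the barrier requires.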

  13. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and from the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach that allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Using Bayesian inference, the decomposition fractions and observation variance are estimated with the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem; this transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  14. Accelerated Compressed Sensing Based CT Image Reconstruction.

    Science.gov (United States)

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  15. Accelerated Compressed Sensing Based CT Image Reconstruction

    Directory of Open Access Journals (Sweden)

    SayedMasoud Hashemi

    2015-01-01

    Full Text Available In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  16. UV Reconstruction Algorithm And Diurnal Cycle Variability

    Science.gov (United States)

    Curylo, Aleksander; Litynska, Zenobia; Krzyscin, Janusz; Bogdanska, Barbara

    2009-03-01

    UV reconstruction is a method for estimating surface UV from available actinometric and aerological measurements. UV reconstruction is necessary for the study of long-term UV change: a typical series of UV measurements is no longer than 15 years, which is too short for trend estimation. The essential problem in the reconstruction algorithm is a good parameterization of clouds. In our previous algorithm we used an empirical relation between the Cloud Modification Factor (CMF) in global radiation and the CMF in UV. The CMF is defined as the ratio between measured and modelled irradiances. Clear-sky irradiance was calculated with a solar radiative transfer model. In the proposed algorithm, the time variability of global radiation during the diurnal cycle is used as an additional source of information. To develop the improved reconstruction algorithm, relevant data from Legionowo (52.4 N, 21.0 E, 96 m a.s.l.), Poland were collected with the following instruments: a NILU-UV multi-channel radiometer, a Kipp&Zonen pyranometer, and radiosonde profiles of ozone, humidity and temperature. The proposed algorithm has been used for reconstruction of UV at four Polish sites: Mikolajki, Kolobrzeg, Warszawa-Bielany and Zakopane since the early 1960s. Krzyscin's reconstruction of total ozone has been used in the calculations.
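    The CMF relation above can be written down directly. The sketch below computes the CMF from measured and modelled clear-sky global radiation and applies a hypothetical power-law mapping from the global-radiation CMF to the UV CMF; the coefficients `a` and `b` are placeholders, not the empirical values used by the authors.

```python
def cloud_modification_factor(measured, modelled_clear_sky):
    """CMF as defined above: ratio of measured to modelled clear-sky irradiance."""
    return measured / modelled_clear_sky

def reconstruct_uv(uv_clear_sky, cmf_global, a=1.0, b=0.7):
    """Estimate surface UV as modelled clear-sky UV scaled by an empirical
    CMF mapping.  The power-law form and the coefficients are illustrative."""
    cmf_uv = a * cmf_global ** b
    return uv_clear_sky * cmf_uv

# Example: global radiation measured at 450 W/m2 against a 600 W/m2
# clear-sky model value gives CMF = 0.75.
cmf = cloud_modification_factor(450.0, 600.0)
uv_estimate = reconstruct_uv(0.2, cmf)   # 0.2 W/m2 modelled clear-sky UV
```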

  17. A constrained reconstruction technique of hyperelasticity parameters for breast cancer assessment

    Science.gov (United States)

    Mehrabian, Hatef; Campbell, Gordon; Samani, Abbas

    2010-12-01

    In breast elastography, breast tissue usually undergoes large compression resulting in significant geometric and structural changes. This implies that breast elastography is associated with nonlinear tissue behavior. In this study, an elastography technique is presented and an inverse problem formulation is proposed to reconstruct parameters characterizing tissue hyperelasticity. Such parameters can potentially be used for tumor classification. The technique can also have other important clinical applications, such as measuring normal tissue hyperelastic parameters in vivo; such parameters are essential in planning and conducting computer-aided interventional procedures. The proposed parameter reconstruction technique uses a constrained iterative inversion and can be viewed as an inverse problem. To solve this problem, we used a nonlinear finite element model corresponding to its forward problem. In this research, we applied the Veronda-Westmann, Yeoh and polynomial models to model tissue hyperelasticity. To validate the proposed technique, we conducted studies involving numerical and tissue-mimicking phantoms. The numerical phantom consisted of a hemisphere connected to a cylinder, while we constructed the tissue-mimicking phantom from polyvinyl alcohol with freeze-thaw cycles, which exhibits nonlinear mechanical behavior. Both phantoms consisted of three types of soft tissue mimicking adipose, fibroglandular tissue and a tumor. The results of the simulations and experiments show the feasibility of accurately reconstructing tumor tissue hyperelastic parameters using the proposed method. In the numerical phantom, all hyperelastic parameters corresponding to the three models were reconstructed with less than 2% error. With the tissue-mimicking phantom, we were able to reconstruct the ratio of the hyperelastic parameters reasonably accurately. Compared to the uniaxial test results, the average error of the ratios of the parameters reconstructed for inclusion to the middle

  18. A constrained reconstruction technique of hyperelasticity parameters for breast cancer assessment

    Energy Technology Data Exchange (ETDEWEB)

    Mehrabian, Hatef; Samani, Abbas [Department of Electrical and Computer Engineering, University of Western Ontario, London, ON (Canada); Campbell, Gordon, E-mail: asamani@uwo.c [Department of Medical Biophysics, University of Western Ontario, London, ON (Canada)

    2010-12-21

    In breast elastography, breast tissue usually undergoes large compression resulting in significant geometric and structural changes. This implies that breast elastography is associated with nonlinear tissue behavior. In this study, an elastography technique is presented and an inverse problem formulation is proposed to reconstruct parameters characterizing tissue hyperelasticity. Such parameters can potentially be used for tumor classification. The technique can also have other important clinical applications, such as measuring normal tissue hyperelastic parameters in vivo; such parameters are essential in planning and conducting computer-aided interventional procedures. The proposed parameter reconstruction technique uses a constrained iterative inversion and can be viewed as an inverse problem. To solve this problem, we used a nonlinear finite element model corresponding to its forward problem. In this research, we applied the Veronda-Westmann, Yeoh and polynomial models to model tissue hyperelasticity. To validate the proposed technique, we conducted studies involving numerical and tissue-mimicking phantoms. The numerical phantom consisted of a hemisphere connected to a cylinder, while we constructed the tissue-mimicking phantom from polyvinyl alcohol with freeze-thaw cycles, which exhibits nonlinear mechanical behavior. Both phantoms consisted of three types of soft tissue mimicking adipose, fibroglandular tissue and a tumor. The results of the simulations and experiments show the feasibility of accurately reconstructing tumor tissue hyperelastic parameters using the proposed method. In the numerical phantom, all hyperelastic parameters corresponding to the three models were reconstructed with less than 2% error. With the tissue-mimicking phantom, we were able to reconstruct the ratio of the hyperelastic parameters reasonably accurately. Compared to the uniaxial test results, the average error of the ratios of the parameters reconstructed for inclusion to the middle

  19. Direct maximum parsimony phylogeny reconstruction from genotype data.

    Science.gov (United States)

    Sridhar, Srinath; Lam, Fumei; Blelloch, Guy E; Ravi, R; Schwartz, Russell

    2007-12-05

    Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data. Phylogenetic applications for autosomal data must therefore rely on other methods for first computationally inferring haplotypes from genotypes. In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes. Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower bound on the number of mutations that the genetic region has undergone.

  20. Direct maximum parsimony phylogeny reconstruction from genotype data

    Directory of Open Access Journals (Sweden)

    Ravi R

    2007-12-01

    Full Text Available Abstract Background Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstruction of maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data. Phylogenetic applications for autosomal data must therefore rely on other methods for first computationally inferring haplotypes from genotypes. Results In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes. Conclusion Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower bound on the number of mutations that the genetic region has undergone.

  1. Simultaneous reconstruction of shape and generalized impedance functions in electrostatic imaging

    International Nuclear Information System (INIS)

    Cakoni, Fioralba; Hu, Yuqing; Kress, Rainer

    2014-01-01

    Determining the geometry and the physical nature of an inclusion within a conducting medium from voltage and current measurements on the accessible boundary of the medium can be modeled as an inverse boundary value problem for the Laplace equation subject to appropriate boundary conditions on the inclusion. We continue the investigations on the particular inverse problem with a generalized impedance condition started in Cakoni and Kress (2013 Inverse Problems 29 015005) by presenting an inverse algorithm for the simultaneous reconstruction of both the shape of the inclusion and the two impedance functions via a boundary integral equation approach. In addition to describing the reconstruction algorithm and illustrating its feasibility by numerical examples we also provide some extensions to the uniqueness results in Cakoni and Kress (2013 Inverse Problems 29 015005). (paper)

  2. Reconstruction of a uniform star object from interior x-ray data: uniqueness, stability and algorithm

    International Nuclear Information System (INIS)

    Van Gompel, Gert; Batenburg, K Joost; Defrise, Michel

    2009-01-01

    In this paper we consider the problem of reconstructing a two-dimensional star-shaped object of uniform density from truncated projections of the object. In particular, we prove that such an object is uniquely determined by its parallel projections sampled over a full π angular range with a detector that only covers an interior field-of-view, even if the density of the object is not known a priori. We analyze the stability of this reconstruction problem and propose a reconstruction algorithm. Simulation experiments demonstrate that the algorithm is capable of reconstructing a star-shaped object from interior data, even if the interior region is much smaller than the size of the object. In addition, we present results for a heuristic reconstruction algorithm called DART, which was recently proposed. The heuristic method is shown to yield accurate reconstructions if the density is known in advance, and to have very good stability in the presence of noisy projection data. Finally, the performance of the DBP and DART algorithms is illustrated for the reconstruction of real micro-CT data of a diamond.

  3. Light field reconstruction robust to signal dependent noise

    Science.gov (United States)

    Ren, Kun; Bian, Liheng; Suo, Jinli; Dai, Qionghai

    2014-11-01

    Capturing four-dimensional light field data sequentially using a coded aperture camera is an effective approach but suffers from a low signal-to-noise ratio. Although multiplexing can help raise the acquisition quality, noise remains a significant issue, especially for fast acquisition. To address this problem, this paper proposes a noise-robust light field reconstruction method. First, a scene-dependent noise model is studied and incorporated into the light field reconstruction framework. Then, we derive an optimization algorithm for the final reconstruction. We built a prototype by hacking an off-the-shelf camera to capture data and prove the concept. The effectiveness of the method is validated with experiments on the real captured data.

  4. On simulated annealing phase transitions in phylogeny reconstruction.

    Science.gov (United States)

    Strobl, Maximilian A R; Barker, Daniel

    2016-08-01

    Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic of simulated annealing applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, whether for phylogeny or for other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. Just as melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
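    The specific-heat diagnostic described above is easy to instrument in any simulated-annealing loop. The sketch below anneals a rugged one-dimensional function (a stand-in for tree space, which is far too large to show here) and records C(T) = Var(E)/T² at each temperature; the cooling schedule and toy landscape are illustrative assumptions, not the authors' setup.

```python
import math
import random

def anneal(energy, neighbour, state, t0=10.0, cooling=0.95, steps=200, t_min=0.01):
    """Simulated annealing with a specific-heat trace.

    At each temperature, the variance of the sampled energies gives a
    specific-heat estimate C(T) = Var(E) / T^2; peaks in this trace mark
    the phase-transition-like regimes discussed in the abstract.
    """
    t = t0
    best, best_e = state, energy(state)
    heat_trace = []                      # (temperature, specific heat) pairs
    while t > t_min:
        energies = []
        for _ in range(steps):
            cand = neighbour(state)
            delta = energy(cand) - energy(state)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                state = cand             # Metropolis acceptance rule
            e = energy(state)
            energies.append(e)
            if e < best_e:
                best, best_e = state, e
        mean = sum(energies) / len(energies)
        var = sum((e - mean) ** 2 for e in energies) / len(energies)
        heat_trace.append((t, var / t ** 2))
        t *= cooling
    return best, best_e, heat_trace

random.seed(0)
f = lambda x: (x - 3) ** 2 + 2 * math.sin(5 * x)       # rugged toy landscape
step = lambda x: x + random.uniform(-0.5, 0.5)
best, best_e, trace = anneal(f, step, 0.0)
```

    Plotting `trace` reveals the temperature range where energy fluctuations peak, the analogue of the melting-point profiles the authors compare across alignments.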

  5. Improved magnetic resonance fingerprinting reconstruction with low-rank and subspace modeling.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Adalsteinsson, Elfar; Gagoski, Borjan; Ye, Huihui; Ma, Dan; Jiang, Yun; Ellen Grant, P; Griswold, Mark A; Wald, Lawrence L

    2018-02-01

    This article introduces a constrained imaging method based on low-rank and subspace modeling to improve the accuracy and speed of MR fingerprinting (MRF). A new model-based imaging method is developed for MRF to reconstruct high-quality time-series images and accurate tissue parameter maps (e.g., T1, T2, and spin density maps). Specifically, the proposed method exploits low-rank approximations of MRF time-series images, and further enforces temporal subspace constraints to capture magnetization dynamics. This allows the time-series image reconstruction problem to be formulated as a simple linear least-squares problem, which enables efficient computation. After image reconstruction, tissue parameter maps are estimated via dictionary-based pattern matching, as in the conventional approach. The effectiveness of the proposed method was evaluated with in vivo experiments. Compared with the conventional MRF reconstruction, the proposed method reconstructs time-series images with significantly reduced aliasing artifacts and noise contamination. Although the conventional approach exhibits some robustness to these corruptions, the improved time-series image reconstruction in turn provides more accurate tissue parameter maps. The improvement is especially pronounced when the acquisition time becomes short. The proposed method significantly improves the accuracy of MRF, and also reduces data acquisition time. Magn Reson Med 79:933-942, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
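    The subspace idea can be sketched with synthetic data. Below, a temporal subspace is taken from the SVD of a small dictionary of simulated exponential signal evolutions, and the per-frame measurement equations are stacked into one linear least-squares problem in the subspace coefficients. The sizes, the exponential dictionary, and the random per-frame operators (standing in for undersampled Fourier encoding) are all illustrative assumptions, not the paper's acquisition model.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N, r, M = 50, 30, 3, 4      # frames, voxels, subspace rank, samples per frame

# Temporal subspace: leading left singular vectors of a toy dictionary of
# simulated signal evolutions (exponential decays with a few time constants).
dictionary = np.array([np.exp(-np.arange(T) / tau)
                       for tau in (5.0, 15.0, 40.0, 80.0)]).T
U = np.linalg.svd(dictionary, full_matrices=False)[0][:, :r]   # T x r basis

C_true = rng.standard_normal((r, N))   # subspace coefficients per voxel
X_true = U @ C_true                    # T x N time-series images

# A different random operator per frame stands in for per-frame undersampling.
A = rng.standard_normal((T, M, N))
y = np.concatenate([A[t] @ X_true[t] for t in range(T)])

# Stack the frame equations y_t = A_t x_t = A_t (C^T u_t) into one linear
# least-squares problem in vec(C): the row block for frame t is kron(u_t, A_t).
G = np.concatenate([np.kron(U[t], A[t]) for t in range(T)])
c_hat = np.linalg.lstsq(G, y, rcond=None)[0]
X_hat = U @ c_hat.reshape(r, N)
```

    Although each frame alone is badly underdetermined (4 samples for 30 voxels), coupling all frames through the rank-3 subspace makes the stacked system overdetermined, and the time series is recovered exactly in this noiseless toy.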

  6. Atmospheric inverse modeling via sparse reconstruction

    Science.gov (United States)

    Hase, Nils; Miller, Scot M.; Maaß, Peter; Notholt, Justus; Palm, Mathias; Warneke, Thorsten

    2017-10-01

    Many applications in atmospheric science involve ill-posed inverse problems. A crucial component of many inverse problems is the proper formulation of a priori knowledge about the unknown parameters. In most cases, this knowledge is expressed as a Gaussian prior. This formulation often performs well at capturing smoothed, large-scale processes but is ill-equipped to capture localized structures like large point sources or localized hot spots. Over the last decade, scientists from a diverse array of applied mathematics and engineering fields have developed sparse reconstruction techniques to identify localized structures. In this study, we present a new regularization approach for ill-posed inverse problems in atmospheric science. It is based on Tikhonov regularization with a sparsity constraint and allows bounds on the parameters. We enforce sparsity using a dictionary representation system. We analyze its performance in an atmospheric inverse modeling scenario by estimating anthropogenic US methane (CH4) emissions from simulated atmospheric measurements. Different measures indicate that our sparse reconstruction approach is better able to capture large point sources or localized hot spots than other methods commonly used in atmospheric inversions. It captures the overall signal equally well but adds details on the grid scale. This feature can be of value for any inverse problem with point or spatially discrete sources. We show an example for source estimation of synthetic methane emissions from the Barnett shale formation.

  7. Few-view image reconstruction with dual dictionaries

    International Nuclear Information System (INIS)

    Lu Yang; Zhao Jun; Wang Ge

    2012-01-01

    In this paper, we formulate the problem of computed tomography (CT) under sparsity and few-view constraints, and propose a novel algorithm for image reconstruction from few-view data utilizing the simultaneous algebraic reconstruction technique (SART) coupled with dictionary learning, sparse representation and total variation (TV) minimization on two interconnected levels. The main feature of our algorithm is the use of two dictionaries: a transitional dictionary for atom matching and a global dictionary for image updating. The atoms in the global and transitional dictionaries represent the image patches from high-quality and low-quality CT images, respectively. Experiments with simulated and real projections were performed to evaluate and validate the proposed algorithm. The results reconstructed using the proposed approach are significantly better than those using either SART or SART–TV. (paper)

  8. Blockwise conjugate gradient methods for image reconstruction in volumetric CT.

    Science.gov (United States)

    Qiu, W; Titley-Peloquin, D; Soleimani, M

    2012-11-01

    Cone beam computed tomography (CBCT) enables volumetric image reconstruction from 2D projection data and plays an important role in image guided radiation therapy (IGRT). Filtered back projection is still the most frequently used algorithm in applications. Algebraic methods instead discretize the scanning process (forward projection) into a system of linear equations, which must then be solved to recover images from measured projection data. The conjugate gradients (CG) algorithm and its variants can be used to solve (possibly regularized) linear systems of equations Ax = b and linear least squares problems min_x ||b - Ax||_2, especially when the matrix A is very large and sparse. Their applications can be found in a general CT context, but they have not been widely used in tomography problems such as CBCT reconstruction. Hence, CBCT reconstruction using the CG-type algorithm LSQR was implemented and studied in this paper. In CBCT reconstruction, the main computational challenge is that the matrix A is usually very large, and storing it in full requires an amount of memory well beyond the reach of commodity computers. Because of these memory capacity constraints, only a small fraction of the weighting matrix A is typically used, leading to a poor reconstruction. In this paper, to overcome this difficulty, the matrix A is partitioned and stored blockwise, and blockwise matrix-vector multiplications are implemented within LSQR. This implementation allows us to use the full weighting matrix A for CBCT reconstruction without exceeding the memory available on standard computers. Tikhonov regularization can also be implemented in this fashion and can produce significant improvement in the reconstructed images. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
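    The blockwise idea can be sketched with SciPy's LSQR, which only needs forward and transpose matrix-vector products. Below, the system matrix is held as a list of row blocks and never assembled as one array; in a real CBCT solver each block would be generated or streamed from disk on demand. The sizes and random blocks are illustrative assumptions, not a projection model.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(0)
m, n, n_blocks = 600, 200, 6
rows = m // n_blocks
# Row blocks of the system matrix; a real implementation would generate
# or load each block on demand instead of keeping them all in memory.
blocks = [rng.standard_normal((rows, n)) for _ in range(n_blocks)]

def matvec(x):
    # forward projection A @ x, computed one row block at a time
    return np.concatenate([B @ x for B in blocks])

def rmatvec(y):
    # back projection A.T @ y, accumulated over the row blocks
    out = np.zeros(n)
    for i, B in enumerate(blocks):
        out += B.T @ y[i * rows:(i + 1) * rows]
    return out

A_op = LinearOperator((m, n), matvec=matvec, rmatvec=rmatvec)
x_true = rng.standard_normal(n)
b = matvec(x_true)
x_hat = lsqr(A_op, b, atol=1e-10, btol=1e-10)[0]
```

    Tikhonov regularization fits the same mould via LSQR's `damp` argument, which solves min ||b - Ax||^2 + damp^2 * ||x||^2 without changing the blockwise products.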

  9. Quantum interferences reconstruction with low homodyne detection efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Esposito, Martina; Randi, Francesco [Universita degli studi di Trieste, Dipartimento di Fisica, Trieste (Italy); Titimbo, Kelvin; Zimmermann, Klaus; Benatti, Fabio [Universita degli studi di Trieste, Dipartimento di Fisica, Trieste (Italy); Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, Trieste (Italy); Kourousias, Georgios; Curri, Alessio [Sincrotrone Trieste S.C.p.A., Trieste (Italy); Floreanini, Roberto [Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, Trieste (Italy); Parmigiani, Fulvio [Universita degli studi di Trieste, Dipartimento di Fisica, Trieste (Italy); Sincrotrone Trieste S.C.p.A., Trieste (Italy); University of Cologne, Institute of Physics II, Cologne (Germany); Fausti, Daniele [Universita degli studi di Trieste, Dipartimento di Fisica, Trieste (Italy); Sincrotrone Trieste S.C.p.A., Trieste (Italy)

    2016-12-15

    Optical homodyne tomography consists of reconstructing the quantum state of an optical field from repeated measurements of its amplitude at different field phases (homodyne data). The experimental noise, which unavoidably affects the homodyne data, leads to a detection efficiency η<1. The problem of reconstructing quantum states from noisy homodyne data sets prompted an intense scientific debate about the presence or absence of a lower homodyne efficiency bound (η>0.5) below which quantum features, like quantum interferences, cannot be retrieved. Here, by numerical experiments, we demonstrate that quantum interferences can be effectively reconstructed even for low homodyne detection efficiency. In particular, we address the challenging case of a Schroedinger cat state and test the minimax and adaptive Wigner function reconstruction technique by processing homodyne data distributed according to the chosen state but with an efficiency η>0.5. By numerically reproducing the Schroedinger cat interference pattern, we give evidence that quantum state reconstruction is actually possible in these conditions, and we provide a guideline for handling optical tomography based on homodyne data collected by low-efficiency detectors. (orig.)

  10. Research on reconstruction of steel tube section from few projections

    International Nuclear Information System (INIS)

    Peng Shuaijun; Wu Haifeng; Wang Kai

    2007-01-01

    Most parameters of a steel tube can be obtained from a CT image of its section, allowing its quality to be evaluated. However, a large number of projections is normally needed to reconstruct the section image, so collecting and processing the projections is time-consuming. To address this problem, this paper investigates reconstruction algorithms for steel tube sections from few projections and validates the results with simulation data. Three iterative algorithms, ART, MAP and OSEM, are applied to reconstruct the section of a steel tube using a simulation model. By incorporating prior information about the distribution of the steel tube, we improve the algorithms and obtain better reconstructed images. The simulation results indicate that ART, MAP and OSEM can reconstruct accurate section images of a steel tube from fewer than 20 projections, and approximate images from 10 projections. (authors)
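As an illustration of the row-action iteration behind ART (MAP and OSEM follow different update rules), here is a minimal Kaczmarz-style sketch in numpy; the tiny system matrix and ray sums are toy assumptions for illustration, not the paper's steel-tube model:

```python
import numpy as np

def art_reconstruct(A, p, n_iters=200, relax=0.5):
    """Kaczmarz-type ART: sweep over the projection rays, correcting the
    current image estimate x so that each ray sum A[i] @ x matches p[i]."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_iters):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            x += relax * (p[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy few-projection problem: 4 unknown "pixels", only 3 ray sums.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0]])
x_true = np.array([1.0, 2.0, 3.0, 4.0])
p = A @ x_true
x_rec = art_reconstruct(A, p)
residual = np.abs(A @ x_rec - p).max()  # should shrink toward 0 on a consistent system
```

The system is underdetermined (fewer rays than pixels), so ART converges to one image consistent with all projections, which is exactly why prior information matters in few-view reconstruction.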

  11. Sparsity reconstruction in electrical impedance tomography: An experimental evaluation

    KAUST Repository

    Gehre, Matthias

    2012-02-01

    We investigate the potential of sparsity constraints in the electrical impedance tomography (EIT) inverse problem of inferring the distributed conductivity based on boundary potential measurements. In sparsity reconstruction, inhomogeneities of the conductivity are a priori assumed to be sparse with respect to a certain basis. This prior information is incorporated into a Tikhonov-type functional by including a sparsity-promoting ℓ1-penalty term. The functional is minimized with an iterative soft shrinkage-type algorithm. In this paper, the feasibility of the sparsity reconstruction approach is evaluated by experimental data from water tank measurements. The reconstructions are computed both with sparsity constraints and with a more conventional smoothness regularization approach. The results verify that the adoption of ℓ1-type constraints can enhance the quality of EIT reconstructions: in most of the test cases the reconstructions with sparsity constraints are both qualitatively and quantitatively more feasible than those with the smoothness constraint. © 2011 Elsevier B.V. All rights reserved.

  12. Nonimaging aspects of follow-up in breast cancer reconstruction.

    Science.gov (United States)

    Wood, W C

    1991-09-01

    Follow-up of patients with breast cancer is directed at the early detection of recurrent or metastatic disease and the detection of new primary breast cancer. The survival benefit of early detection is limited to some patients with local failure or new primary tumors. The fact that imaging is not used in the follow-up of patients who have had breast cancer reconstruction is related to possible interference by the reconstructive procedure with this putative benefit. Such follow-up is accomplished by the patient's own surveillance, clinical examination, and laboratory testing supplemented by imaging studies. Clinical follow-up trials of women who have undergone breast reconstructive surgery show no evidence that locally recurrent breast carcinoma is masked when compared with follow-up of women who did not undergo reconstructive procedures. Reshaping of the contralateral breast to match the reconstructed breast introduces the possibility of interference with palpation as well as mammographic distortion in some women. This is an uncommon practical problem except when complicated by fat necrosis.

  13. A combined reconstruction-classification method for diffuse optical tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hiltunen, P [Department of Biomedical Engineering and Computational Science, Helsinki University of Technology, PO Box 3310, FI-02015 TKK (Finland); Prince, S J D; Arridge, S [Department of Computer Science, University College London, Gower Street London, WC1E 6B (United Kingdom)], E-mail: petri.hiltunen@tkk.fi, E-mail: s.prince@cs.ucl.ac.uk, E-mail: s.arridge@cs.ucl.ac.uk

    2009-11-07

    We present a combined classification and reconstruction algorithm for diffuse optical tomography (DOT). DOT is a nonlinear ill-posed inverse problem. Therefore, some regularization is needed. We present a mixture of Gaussians prior, which regularizes the DOT reconstruction step. During each iteration, the parameters of a mixture model are estimated. These associate each reconstructed pixel with one of several classes based on the current estimate of the optical parameters. This classification is exploited to form a new prior distribution to regularize the reconstruction step and update the optical parameters. The algorithm can be described as an iteration between an optimization scheme with zeroth-order variable mean and variance Tikhonov regularization and an expectation-maximization scheme for estimation of the model parameters. We describe the algorithm in a general Bayesian framework. Results from simulated test cases and phantom measurements show that the algorithm enhances the contrast of the reconstructed images with good spatial accuracy. The probabilistic classifications of each image contain only a few misclassified pixels.
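The classification step of such a scheme, fitting a mixture of Gaussians to the current parameter estimates and assigning each pixel a class responsibility, can be illustrated with a minimal two-class EM iteration; the 1D "optical parameter" samples below are synthetic assumptions, not the paper's data:

```python
import numpy as np

def gmm_em(x, n_iters=100):
    """Two-class EM: estimate means, variances and weights of a Gaussian
    mixture plus per-pixel class responsibilities (the classification step)."""
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(n_iters):
        # E-step: responsibility of each class for each pixel value
        lik = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: refit the mixture parameters from the responsibilities
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
        w = n / len(x)
    return mu, var, w, r

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 0.05, 300),   # background parameter values
                    rng.normal(1.0, 0.05, 100)])  # inclusion parameter values
mu, var, w, r = gmm_em(x)
```

In the combined algorithm the fitted class means and variances would then feed back into the reconstruction step as a spatially varying Tikhonov prior.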

  14. Total variation regularization in measurement and image space for PET reconstruction

    KAUST Repository

    Burger, M

    2014-09-18

    © 2014 IOP Publishing Ltd. The aim of this paper is to test and analyse a novel technique for image reconstruction in positron emission tomography, which is based on (total variation) regularization on both the image space and the projection space. We formulate our variational problem considering both total variation penalty terms on the image and on an idealized sinogram to be reconstructed from a given Poisson distributed noisy sinogram. We prove existence, uniqueness and stability results for the proposed model and provide some analytical insight into the structures favoured by joint regularization. For the numerical solution of the corresponding discretized problem we employ the split Bregman algorithm and extensively test the approach in comparison to standard total variation regularization on the image. The numerical results show that an additional penalty on the sinogram performs better at reconstructing images with thin structures.

  15. Patient-adapted reconstruction and acquisition dynamic imaging method (PARADIGM) for MRI

    International Nuclear Information System (INIS)

    Aggarwal, Nitin; Bresler, Yoram

    2008-01-01

    Dynamic magnetic resonance imaging (MRI) is a challenging problem because the MR data acquisition is often not fast enough to meet the combined spatial and temporal Nyquist sampling rate requirements. Current approaches to this problem include hardware-based acceleration of the acquisition, and model-based image reconstruction techniques. In this paper we propose an alternative approach, called PARADIGM, which adapts both the acquisition and reconstruction to the spatio-temporal characteristics of the imaged object. The approach is based on time-sequential sampling theory, addressing the problem of acquiring a spatio-temporal signal under the constraint that only a limited amount of data can be acquired at a time instant. PARADIGM identifies a model class for the particular imaged object using a scout MR scan or auxiliary data. This object-adapted model is then used to optimize MR data acquisition, such that the imaging constraints are met, acquisition speed requirements are minimized, essentially perfect reconstruction of any object in the model class is guaranteed, and the inverse problem of reconstructing the dynamic object has a condition number of one. We describe spatio-temporal object models for various dynamic imaging applications including cardiac imaging. We present the theory underlying PARADIGM and analyze its performance theoretically and numerically. We also propose a practical MR imaging scheme for 2D dynamic cardiac imaging based on the theory. For this application, PARADIGM is predicted to provide a 10–25× acceleration compared to the optimal non-adaptive scheme. Finally we present generalized optimality criteria and extend the scheme to dynamic imaging with three spatial dimensions.

  16. A Novel Nipple Reconstruction Technique for Maintaining Nipple Projection: The Boomerang Flap

    Directory of Open Access Journals (Sweden)

    Young-Eun Kim

    2016-09-01

    Nipple-areolar complex (NAC) reconstruction is the final step in the long journey of breast reconstruction for mastectomy patients. Successful NAC reconstruction depends on the use of appropriate surgical techniques that are simple and reliable. To date, numerous techniques have been used for nipple reconstruction, including contralateral nipple sharing and various local flaps. Recently, it has been common to utilize local flaps. However, the most common nipple reconstruction problem encountered with local flaps is the loss of nipple projection; there can be approximately 50% projection loss in reconstructed nipples over long-term follow-up. Several factors might contribute to nipple projection loss, and we tried to overcome these factors by performing nipple reconstructions using a boomerang flap technique, which is a modified C-V flap that utilizes the previous mastectomy scar to maintain long-term nipple projection.

  17. A Novel Nipple Reconstruction Technique for Maintaining Nipple Projection: The Boomerang Flap.

    Science.gov (United States)

    Kim, Young-Eun; Hong, Ki Yong; Minn, Kyung Won; Jin, Ung Sik

    2016-09-01

    Nipple-areolar complex (NAC) reconstruction is the final step in the long journey of breast reconstruction for mastectomy patients. Successful NAC reconstruction depends on the use of appropriate surgical techniques that are simple and reliable. To date, numerous techniques have been used for nipple reconstruction, including contralateral nipple sharing and various local flaps. Recently, it has been common to utilize local flaps. However, the most common nipple reconstruction problem encountered with local flaps is the loss of nipple projection; there can be approximately 50% projection loss in reconstructed nipples over long-term follow-up. Several factors might contribute to nipple projection loss, and we tried to overcome these factors by performing nipple reconstructions using a boomerang flap technique, which is a modified C-V flap that utilizes the previous mastectomy scar to maintain long-term nipple projection.

  18. Analysis of PWR control rod ejection accident with the coupled code system SKETCH-INS/TRACE by incorporating pin power reconstruction model

    International Nuclear Information System (INIS)

    Nakajima, T.; Sakai, T.

    2010-01-01

    The pin power reconstruction model was incorporated into the 3-D nodal kinetics code SKETCH-INS in order to produce accurate calculations of three-dimensional pin power distributions throughout the reactor core. To verify the employed pin power reconstruction model, the PWR MOX/UO₂ core transient benchmark problem was analyzed with the coupled code system SKETCH-INS/TRACE incorporating the model, and the influence of the pin power reconstruction model was studied. SKETCH-INS pin power distributions for 3 benchmark problems were compared with the PARCS solutions, which were provided by the host organisation of the benchmark. SKETCH-INS results were in good agreement with the PARCS results. The capability of the employed pin power reconstruction model was confirmed through the analysis of the benchmark problems. A PWR control rod ejection benchmark problem was then analyzed with the coupled code system SKETCH-INS/TRACE incorporating the pin power reconstruction model. The influence of the pin power reconstruction model was studied by comparison with the results of the conventional node-averaged flux model. The results indicate that the pin power reconstruction model has a significant effect on the pin powers during the transient, and hence on the fuel enthalpy.

  19. Reconstructible phylogenetic networks: do not distinguish the indistinguishable.

    Science.gov (United States)

    Pardi, Fabio; Scornavacca, Celine

    2015-04-01

    Phylogenetic networks represent the evolution of organisms that have undergone reticulate events, such as recombination, hybrid speciation or lateral gene transfer. An important way to interpret a phylogenetic network is in terms of the trees it displays, which represent all the possible histories of the characters carried by the organisms in the network. Interestingly, however, different networks may display exactly the same set of trees, an observation that poses a problem for network reconstruction: from the perspective of many inference methods such networks are "indistinguishable". This is true for all methods that evaluate a phylogenetic network solely on the basis of how well the displayed trees fit the available data, including all methods based on input data consisting of clades, triples, quartets, or trees with any number of taxa, and also sequence-based approaches such as popular formalisations of maximum parsimony and maximum likelihood for networks. This identifiability problem is partially solved by accounting for branch lengths, although this merely reduces the frequency of the problem. Here we propose that network inference methods should only attempt to reconstruct what they can uniquely identify. To this end, we introduce a novel definition of what constitutes a uniquely reconstructible network. For any given set of indistinguishable networks, we define a canonical network that, under mild assumptions, is unique and thus representative of the entire set. Given data that underwent reticulate evolution, only the canonical form of the underlying phylogenetic network can be uniquely reconstructed. While on the methodological side this will imply a drastic reduction of the solution space in network inference, for the study of reticulate evolution this is a fundamental limitation that will require an important change of perspective when interpreting phylogenetic networks.

  20. Reconstructible phylogenetic networks: do not distinguish the indistinguishable.

    Directory of Open Access Journals (Sweden)

    Fabio Pardi

    2015-04-01

    Phylogenetic networks represent the evolution of organisms that have undergone reticulate events, such as recombination, hybrid speciation or lateral gene transfer. An important way to interpret a phylogenetic network is in terms of the trees it displays, which represent all the possible histories of the characters carried by the organisms in the network. Interestingly, however, different networks may display exactly the same set of trees, an observation that poses a problem for network reconstruction: from the perspective of many inference methods such networks are "indistinguishable". This is true for all methods that evaluate a phylogenetic network solely on the basis of how well the displayed trees fit the available data, including all methods based on input data consisting of clades, triples, quartets, or trees with any number of taxa, and also sequence-based approaches such as popular formalisations of maximum parsimony and maximum likelihood for networks. This identifiability problem is partially solved by accounting for branch lengths, although this merely reduces the frequency of the problem. Here we propose that network inference methods should only attempt to reconstruct what they can uniquely identify. To this end, we introduce a novel definition of what constitutes a uniquely reconstructible network. For any given set of indistinguishable networks, we define a canonical network that, under mild assumptions, is unique and thus representative of the entire set. Given data that underwent reticulate evolution, only the canonical form of the underlying phylogenetic network can be uniquely reconstructed. While on the methodological side this will imply a drastic reduction of the solution space in network inference, for the study of reticulate evolution this is a fundamental limitation that will require an important change of perspective when interpreting phylogenetic networks.

  1. Multi-level damage identification with response reconstruction

    Science.gov (United States)

    Zhang, Chao-Dong; Xu, You-Lin

    2017-10-01

    Damage identification through finite element (FE) model updating usually forms an inverse problem. Solving the inverse identification problem for complex civil structures is very challenging since the dimension of potential damage parameters in a complex civil structure is often very large. Aside from the enormous computational effort needed in iterative updating, the ill-conditioning and non-global identifiability of the inverse problem can hinder the realization of model updating based damage identification for large civil structures. Following a divide-and-conquer strategy, a multi-level damage identification method is proposed in this paper. The entire structure is decomposed into several manageable substructures and each substructure is further condensed as a macro element using the component mode synthesis (CMS) technique. The damage identification is performed at two levels: the first is at the macro element level to locate the potentially damaged region, and the second is over the suspicious substructures to further locate as well as quantify the damage severity. At each level, the damage searching space over which model updating is performed is notably narrowed down, not only reducing the computation amount but also increasing the damage identifiability. In addition, Kalman filter-based response reconstruction is performed at the second level to reconstruct the response of the suspicious substructure for exact damage quantification. Numerical studies and laboratory tests are both conducted on a simply supported overhanging steel beam for conceptual verification. The results demonstrate that the proposed multi-level damage identification via response reconstruction considerably improves the identification accuracy of damage localization and quantification.

  2. The free vascularized flap and the flap plate options: comparative results of reconstruction of lateral mandibular defects

    NARCIS (Netherlands)

    Shpitzer, T.; Gullane, P. J.; Neligan, P. C.; Irish, J. C.; Freeman, J. E.; van den Brekel, M.; Gur, E.

    2000-01-01

    OBJECTIVES/HYPOTHESIS: Reconstruction of the mandible and oral cavity after segmental resection is a challenging surgical problem. Although osteocutaneous free flaps are generally accepted to be optimal for reconstruction of anterior defects, the need for bony reconstruction for a pure lateral

  3. Reconstruction of Undersampled Atomic Force Microscopy Images

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm; Arildsen, Thomas; Østergaard, Jan

    2013-01-01

    Atomic force microscopy (AFM) is one of the most advanced tools for high-resolution imaging and manipulation of nanoscale matter. Unfortunately, standard AFM imaging requires a timescale on the order of seconds to minutes to acquire an image, which makes it complicated to observe dynamic processes. Moreover, it is often required to take several images before a relevant observation region is identified. In this paper we show how to significantly reduce the image acquisition time by undersampling. The reconstruction of an undersampled AFM image can be viewed as an inpainting or interpolation problem, in which the missing pixels should be reconstructed using interpolation.

  4. Amplitude-based data selection for optimal retrospective reconstruction in micro-SPECT

    Science.gov (United States)

    Breuilly, M.; Malandain, G.; Guglielmi, J.; Marsault, R.; Pourcher, T.; Franken, P. R.; Darcourt, J.

    2013-04-01

    Respiratory motion can blur the tomographic reconstruction of positron emission tomography or single-photon emission computed tomography (SPECT) images, which subsequently impair quantitative measurements, e.g. in the upper abdomen area. Respiratory signal phase-based gated reconstruction addresses this problem, but deteriorates the signal-to-noise ratio (SNR) and other intensity-based quality measures. This paper proposes a 3D reconstruction method dedicated to micro-SPECT imaging of mice. From a 4D acquisition, the phase images exhibiting motion are identified and the associated list-mode data are discarded, which enables the reconstruction of a 3D image without respiratory artefacts. The proposed method allows a motion-free reconstruction exhibiting both satisfactory count statistics and accuracy of measures. With respect to standard 3D reconstruction (non-gated 3D reconstruction) without breathing motion correction, an increase of 14.6% of the mean standardized uptake value has been observed, while, with respect to a gated 4D reconstruction, up to 60% less noise and an increase of up to 124% of the SNR have been demonstrated.
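The data-selection idea, discard the respiratory phases that exhibit motion and pool the remaining data into one motion-free reconstruction, can be sketched as follows; the gate amplitudes and count arrays below are hypothetical stand-ins, not the paper's acquisition:

```python
import numpy as np

rng = np.random.default_rng(2)
n_phases, n_bins = 8, 64

# Hypothetical respiratory displacement per gated phase (mm);
# gates 3-5 fall in the moving part of the breathing cycle.
amplitude = np.array([0.2, 0.3, 0.5, 2.5, 3.0, 2.0, 0.6, 0.4])
counts = rng.poisson(100, size=(n_phases, n_bins))  # per-gate projection counts

keep = amplitude < 1.0              # reject gates exhibiting motion
static = counts[keep].sum(axis=0)   # pool the retained list-mode data
```

Pooling the quiescent gates is what restores count statistics relative to a single-gate 4D reconstruction, while the rejection step avoids the motion blur of a fully non-gated 3D reconstruction.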

  5. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    Science.gov (United States)

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  6. Kuwaiti reconstruction project unprecedented in size, complexity

    Energy Technology Data Exchange (ETDEWEB)

    Tippee, B.

    1993-03-15

    There had been no challenge like it: a desert emirate ablaze; its main city sacked; the economically crucial oil industry devastated; countryside shrouded in smoke from oil well fires and littered with unexploded ordnance, disabled military equipment, and unignited crude oil. Like the well-documented effort that brought 749 burning wells under control in less than 7 months, Kuwaiti reconstruction had no precedent. Unlike the firefight, reconstruction is nowhere complete. It nevertheless has placed two of three refineries back on stream, restored oil production to preinvasion levels, and repaired or rebuilt 17 of 26 oil field gathering stations. Most of the progress has come since the last well fire went out on Nov. 6, 1991. Expatriates in Kuwait since the days of Al-Awda ('the return,' in Arabic) attribute much of the rapid progress under Al-Tameer ('the reconstruction') to decisions and preparations made while the well fires still raged. The article describes the planning for Al-Awda, reentering the country, drilling plans, facilities reconstruction, and special problems.

  7. Ensemble Kalman filter for the reconstruction of the Earth's mantle circulation

    Science.gov (United States)

    Bocher, Marie; Fournier, Alexandre; Coltice, Nicolas

    2018-02-01

    Recent advances in mantle convection modeling led to the release of a new generation of convection codes, able to self-consistently generate plate-like tectonics at their surface. Those models physically link mantle dynamics to surface tectonics. Combined with plate tectonic reconstructions, they have the potential to produce a new generation of mantle circulation models that use data assimilation methods and where uncertainties in plate tectonic reconstructions are taken into account. We provided a proof of this concept by applying a suboptimal Kalman filter to the reconstruction of mantle circulation (Bocher et al., 2016). Here, we propose to go one step further and apply the ensemble Kalman filter (EnKF) to this problem. The EnKF is a sequential Monte Carlo method particularly adapted to solve high-dimensional data assimilation problems with nonlinear dynamics. We tested the EnKF using synthetic observations consisting of surface velocity and heat flow measurements on a 2-D-spherical annulus model and compared it with the method developed previously. The EnKF performs on average better and is more stable than the former method. Less than 300 ensemble members are sufficient to reconstruct an evolution. We use covariance adaptive inflation and localization to correct for sampling errors. We show that the EnKF results are robust over a wide range of covariance localization parameters. The reconstruction is associated with an estimation of the error, and provides valuable information on where the reconstruction is to be trusted or not.
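A minimal stochastic EnKF analysis step, the generic update that such studies apply sequentially to an ensemble of model states, might look like the sketch below; the three-dimensional toy state and identity observation operator are assumptions for illustration, not a mantle convection model:

```python
import numpy as np

def enkf_analysis(X, y, H, obs_var, rng):
    """Stochastic EnKF analysis: update ensemble X (state dim x members)
    with observations y seen through the linear observation operator H."""
    m = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)       # observed anomalies
    P_yy = HA @ HA.T / (m - 1) + obs_var * np.eye(len(y))
    P_xy = A @ HA.T / (m - 1)
    K = P_xy @ np.linalg.inv(P_yy)                 # Kalman gain from the ensemble
    Y = y[:, None] + rng.normal(0, np.sqrt(obs_var), size=(len(y), m))  # perturbed obs
    return X + K @ (Y - HX)

rng = np.random.default_rng(3)
x_true = np.array([1.0, -0.5, 2.0])
H = np.eye(3)                                      # observe the state directly
y = H @ x_true
X = x_true[:, None] + rng.normal(0, 1.0, size=(3, 200))  # prior ensemble
Xa = enkf_analysis(X, y, H, obs_var=0.01, rng=rng)
```

The ensemble spread after the update is what provides the error estimate mentioned in the abstract; covariance inflation and localization (not shown) correct the sampling errors of small ensembles.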

  8. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data

    International Nuclear Information System (INIS)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-01-01

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy. (paper)
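The computational appeal of an orthogonal dictionary is that sparse coding reduces to thresholding D.T @ Y, and the dictionary refit has a closed-form orthogonal Procrustes solution via an SVD. Below is a hedged sketch of one such alternation on toy patch data, with simple hard thresholding standing in for the paper's exact sparsity scheme:

```python
import numpy as np

def orthogonal_dictionary_step(Y, D, thresh):
    """One alternation: sparse-code the patch matrix Y under orthogonal D
    (cheap, since the codes are just a thresholded D.T @ Y), then refit D
    as the orthogonal Procrustes solution D = U V^T from the SVD of Y X^T."""
    X = D.T @ Y
    X[np.abs(X) < thresh] = 0.0            # hard thresholding -> sparse codes
    U, _, Vt = np.linalg.svd(Y @ X.T)
    return U @ Vt, X

rng = np.random.default_rng(4)
d, n = 16, 400
D = np.linalg.qr(rng.standard_normal((d, d)))[0]   # random orthogonal start
Y = rng.standard_normal((d, n))                    # toy image patches (columns)
for _ in range(10):
    D, X = orthogonal_dictionary_step(Y, D, thresh=0.5)
```

Each step costs only a few small matrix products and one d×d SVD, which is the source of the large speedup over K-SVD reported in the abstract.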

  9. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data.

    Science.gov (United States)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-07-21

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.

  10. Robust framework for PET image reconstruction incorporating system and measurement uncertainties.

    Directory of Open Access Journals (Sweden)

    Huafeng Liu

    In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of such a system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method casts PET reconstruction as a regularization problem, and the image estimate is obtained by means of an uncertainty-weighted least squares framework. The performance of our work is evaluated with Shepp-Logan simulated and real phantom data, which demonstrates significant improvements in image quality over least squares reconstruction efforts.
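The uncertainty-weighted least squares idea can be sketched on a linear toy problem: measurement rows whose system response or data are unreliable receive small weights in the normal equations. The matrix, noise model and weights below are illustrative assumptions, not the paper's PET system model:

```python
import numpy as np

def weighted_least_squares(A, y, w):
    """Solve min_x sum_i w_i * (y_i - A[i] @ x)^2; rows with uncertain
    system response or noisy measurements are down-weighted."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

rng = np.random.default_rng(5)
x_true = np.array([2.0, -1.0])
A = rng.random((50, 2))                      # toy system matrix
sigma = np.full(50, 0.01)
sigma[:10] = 1.0                             # first 10 rows are unreliable
y = A @ x_true + rng.normal(0, sigma)
x_rec = weighted_least_squares(A, y, w=1.0 / sigma ** 2)
```

With weights set to the inverse of the combined system-plus-measurement variance, the unreliable rows barely influence the estimate, which is the intuition behind the uncertainty-weighted framework.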

  11. Reconstructed coronal views of CT and isotopic images of the pancreas

    International Nuclear Information System (INIS)

    Kasuga, Toshio; Kobayashi, Toshio; Nakanishi, Fumiko

    1980-01-01

    To compare functional images of the pancreas obtained by scintigraphy with morphological views of the pancreas obtained by CT, coronal CT views of the pancreas were reconstructed. Because the coronal views were reconstructed from routine axial scans, longitudinal spatial resolution was limited. However, almost satisfactory total images of the pancreas were obtained after adequate image enhancement. In 27 patients with confirmed diagnoses, the reconstructed coronal CT views made it easy to compare pancreatic scintigrams with pancreatic CT images, and they yielded information that could not be obtained from the original axial CT images. In particular, defects on pancreatic images and the shape of the pancreas, which could not be visualized clearly by scintigraphy alone, could be visualized using the reconstructed coronal CT views of the pancreas. (Tsunoda, M.)

  12. Developing milk industry estimates for dose reconstruction projects

    International Nuclear Information System (INIS)

    Beck, D.M.; Darwin, R.F.

    1991-01-01

    One of the most important contributors to radiation doses from Hanford during the 1944-1947 period was radioactive iodine. Consumption of milk from cows that ate vegetation contaminated with iodine is likely the dominant pathway of human exposure. To estimate the doses people could have received from this pathway, it is necessary to reconstruct the amount of milk consumed by people living near Hanford, the source of the milk, and the type of feed that the milk cows ate. This task is challenging because the dairy industry has undergone radical changes since the end of World War II, and records that document the impact of these changes on the study area are scarce. Similar problems are faced by researchers on most dose reconstruction efforts. The purpose of this work is to document and evaluate the methods used on the Hanford Environmental Dose Reconstruction (HEDR) Project to reconstruct the milk industry and to present preliminary results.

  13. Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty

    International Nuclear Information System (INIS)

    Ensslin, Torsten A.; Frommert, Mona

    2011-01-01

    The optimal reconstruction of cosmic metric perturbations and other signals requires knowledge of their power spectra and other parameters. If these are not known a priori, they have to be measured simultaneously from the same data used for the signal reconstruction. We formulate the general problem of signal inference in the presence of unknown parameters within the framework of information field theory. To solve this, we develop a generic parameter-uncertainty renormalized estimation (PURE) technique. As a concrete application, we address the problem of reconstructing Gaussian signals with unknown power spectrum with five different approaches: (i) separate maximum-a-posteriori power-spectrum measurement and subsequent reconstruction, (ii) maximum-a-posteriori reconstruction with marginalized power spectrum, (iii) maximizing the joint posterior of signal and spectrum, (iv) guessing the spectrum from the variance in the Wiener-filter map, and (v) renormalization flow analysis of the field-theoretical problem providing the PURE filter. In all cases, the reconstruction can be described or approximated as Wiener-filter operations with assumed signal spectra derived from the data according to the same recipe, but with differing coefficients. All of these filters, except the renormalized one, exhibit a perception threshold in the case of a Jeffreys prior for the unknown spectrum. Data modes with variance below this threshold do not affect the signal reconstruction at all. Filter (iv) seems to be similar to the so-called Karhunen-Loève and Feldman-Kaiser-Peacock estimators for galaxy power spectra used in cosmology, which therefore should also exhibit a marginal perception threshold if correctly implemented. We present statistical performance tests and show that the PURE filter is superior to the others, especially if the post-Wiener-filter corrections are included or if an additional scale-independent spectral smoothness prior can be adopted.
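Since all five approaches reduce to Wiener-filter operations with data-derived spectra, the common computational core can be sketched as a diagonal Wiener filter in Fourier space. The spectra below are illustrative assumptions, not the renormalized PURE coefficients:

```python
import numpy as np

# Diagonal Wiener filter in Fourier space: each data mode d_k is weighted by
# P_s / (P_s + P_n). In the PURE setting the signal spectrum would itself be
# estimated from the data; here both spectra are assumed known (illustrative).
def wiener_filter(data_ft, P_s, P_n):
    return P_s / (P_s + P_n) * data_ft

rng = np.random.default_rng(0)
n = 64
k = np.fft.fftfreq(n)
P_s = 1.0 / (np.abs(k) + 1.0 / n) ** 2     # assumed power-law signal spectrum
P_n = np.full(n, 10.0)                     # assumed white-noise spectrum

signal_ft = np.sqrt(P_s) * rng.standard_normal(n)
data_ft = signal_ft + np.sqrt(P_n) * rng.standard_normal(n)
recon_ft = wiener_filter(data_ft, P_s, P_n)
```

Modes where noise dominates are shrunk toward zero; the perception threshold discussed above corresponds to the data-derived spectrum itself vanishing for low-variance modes, so those modes drop out of the reconstruction entirely.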

  14. [Current problems in the reconstructive surgery of the locomotor apparatus in children].

    Science.gov (United States)

    Kupatadze, D D; Nabokov, V V; Malikov, S A; Polozov, R N; Kanina, L Ia; Veselov, A G

    1997-01-01

    The authors analyze results of treatment of 778 children with malignant and benign tumors of the bones, pseudoarthroses, amputations of lower extremities and fingers, injuries of the tendons, vessels and contused-lacerated wounds of distal phalanges of fingers. The possibility to use a precision technique for the reconstructive operations of the vessels in children is shown.

  15. A variational study on BRDF reconstruction in a structured light scanner

    DEFF Research Database (Denmark)

    Nielsen, Jannik Boll; Stets, Jonathan Dyssel; Lyngby, Rasmus Ahrenkiel

    2017-01-01

    Time-efficient acquisition of reflectance behavior together with surface geometry is a challenging problem. In this study, we investigate the impact of system parameter uncertainties when incorporating a data-driven BRDF reconstruction approach into the standard pipeline of a structured light...... setup. Results show that while uncertainties in vertex positions and normals have a high impact on the quality of reconstructed BRDFs, object geometry and light source properties have very little influence on the reconstructed BRDFs. With this analysis, practitioners now have insight in the tolerances...... required for accurate BRDF acquisition to work....

  16. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  17. Haplotype reconstruction error as a classical misclassification problem: introducing sensitivity and specificity as error measures.

    Directory of Open Access Journals (Sweden)

    Claudia Lamina

    Full Text Available BACKGROUND: Statistically reconstructing haplotypes from single nucleotide polymorphism (SNP) genotypes can lead to falsely classified haplotypes. This can be an issue when interpreting haplotype association results or when selecting subjects with certain haplotypes for subsequent functional studies. It was our aim to quantify haplotype reconstruction error and to provide tools for it. METHODS AND RESULTS: In numerous simulation scenarios, we systematically investigated several error measures, including discrepancy, error rate, and R(2), and introduced sensitivity and specificity to this context. We exemplified several measures in the KORA study, a large population-based study from Southern Germany. We find that the specificity is slightly reduced only for common haplotypes, while the sensitivity was decreased for some, but not all, rare haplotypes. The overall error rate generally increased with increasing number of loci, increasing minor allele frequency of SNPs, decreasing correlation between the alleles, and increasing ambiguity. CONCLUSIONS: We conclude that, with the analytical approach presented here, haplotype-specific error measures can be computed to gain insight into the haplotype uncertainty. This method provides information on whether a specific risk haplotype can be expected to be reconstructed with little or with high misclassification, and thus on the magnitude of expected bias in association estimates. We also illustrate that sensitivity and specificity separate two dimensions of the haplotype reconstruction error, which completely describe the misclassification matrix and thus provide the prerequisite for methods accounting for misclassification.

  18. On the Complexity of Reconstructing Chemical Reaction Networks

    DEFF Research Database (Denmark)

    Fagerberg, Rolf; Flamm, Christoph; Merkle, Daniel

    2013-01-01

    The analysis of the structure of chemical reaction networks is crucial for a better understanding of chemical processes. Such networks are well described as hypergraphs. However, due to the available methods, analyses regarding network properties are typically made on standard graphs derived from...... the full hypergraph description, e.g. on the so-called species and reaction graphs. However, a reconstruction of the underlying hypergraph from these graphs is not necessarily unique. In this paper, we address the problem of reconstructing a hypergraph from its species and reaction graph and show NP...

  19. Hopfield neural network in HEP track reconstruction

    International Nuclear Information System (INIS)

    Muresan, R.; Pentia, M.

    1997-01-01

    In experimental particle physics, pattern recognition problems suited to neural network methods occur frequently in track finding and feature extraction. Track finding is a combinatorial optimization problem: given a set of points in Euclidean space, one attempts to reconstruct particle trajectories subject to smoothness constraints. The basic ingredients of a neural network are the N binary neurons and the synaptic strengths connecting them. In our case the neurons are the segments connecting all possible point pairs. The dynamics of the neural network is given by a local updating rule which evaluates, for each neuron, the sign of the 'upstream activity'. An updating rule in the form of a sigmoid function is given. The synaptic strengths are defined in terms of the angle between the segments and the lengths of the segments involved in the track reconstruction. An algorithm based on the Hopfield neural network has been developed and tested on track coordinates measured by a silicon microstrip tracking system.
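A minimal sketch of this scheme, with illustrative values for the temperature and activity penalty (neither is taken from the paper): neurons are candidate segments between hit pairs, synaptic strengths reward near-collinear segments sharing a hit, and activations evolve under a sigmoid updating rule.

```python
import numpy as np
from itertools import combinations

# Hits: three roughly collinear points forming a track, plus one outlier.
hits = np.array([[0.0, 0.0], [1.0, 0.05], [2.0, 0.0], [0.5, 2.0]])
segs = list(combinations(range(len(hits)), 2))   # neurons = candidate segments

def strength(a, b):
    """Synaptic strength: nonzero only for segments sharing exactly one hit,
    sharply rewarding small relative angles (smoothness)."""
    if len(set(a) & set(b)) != 1:
        return 0.0
    da = hits[a[1]] - hits[a[0]]
    db = hits[b[1]] - hits[b[0]]
    cosang = abs(da @ db) / (np.linalg.norm(da) * np.linalg.norm(db))
    return cosang ** 5

W = np.array([[strength(a, b) for b in segs] for a in segs])
np.fill_diagonal(W, 0.0)

T, alpha = 0.1, 0.5             # temperature and activity penalty (assumed)
v = np.full(len(segs), 0.5)     # neuron activations in [0, 1]
for _ in range(100):
    v = 1.0 / (1.0 + np.exp(-(W @ v - alpha) / T))   # sigmoid updating rule
```

After the iteration settles, segments with activation above 0.5 are kept as track candidates; the mutually collinear segments along the three aligned hits reinforce each other and stay active.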

  20. Image Reconstruction of Metal Pipe in Electrical Resistance Tomography

    Directory of Open Access Journals (Sweden)

    Suzanna RIDZUAN AW

    2017-02-01

    Full Text Available This paper demonstrates a Linear Back Projection (LBP) algorithm based on the reconstruction of conductivity distributions to identify different sizes and locations of bubble phantoms in a metal pipe. Both the forward and inverse problems are discussed. Reconstructed images of the phantoms under test conditions are presented. The results verify that the sensitivity maps of the conducting-boundary strategy can be applied successfully to identify the location of the phantom of interest using the LBP algorithm. Additionally, the number and spatial distribution of the bubble phantoms can be clearly distinguished at any location in the pipeline. The reconstructed images agree well with the bubble phantoms.
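As a rough sketch of the LBP step (with toy sensitivity maps and measurements, not the paper's metal-pipe sensor model), each boundary measurement is smeared back through its sensitivity map and the result is normalized by the total sensitivity per pixel:

```python
import numpy as np

# Toy sensitivity matrix S: rows = boundary measurements, columns = pixels
# of a 2x2 image grid. Values are illustrative, not from a real ERT sensor.
S = np.array([[0.8, 0.2, 0.1, 0.1],
              [0.1, 0.7, 0.2, 0.1],
              [0.1, 0.2, 0.7, 0.2]])
m = np.array([1.0, 0.2, 0.1])            # normalized boundary measurements

# Linear Back Projection: back-smear measurements, normalize per pixel.
image = (S.T @ m) / S.sum(axis=0)
image = image.reshape(2, 2)
```

The largest reconstructed value lands in the pixel to which the strongest measurement is most sensitive; a real ERT system uses many electrode-pair measurements and finely meshed sensitivity maps.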

  1. A Total Variation-Based Reconstruction Method for Dynamic MRI

    Directory of Open Access Journals (Sweden)

    Germana Landi

    2008-01-01

    Full Text Available In recent years, total variation (TV) regularization has become a popular and powerful tool for image restoration and enhancement. In this work, we apply TV minimization to improve the quality of dynamic magnetic resonance images. Dynamic magnetic resonance imaging is an increasingly popular clinical technique used to monitor spatio-temporal changes in tissue structure. Fast data acquisition is necessary in order to capture the dynamic process. Most commonly, the requirement of high temporal resolution is fulfilled by sacrificing spatial resolution, so the numerical methods have to address the issue of image reconstruction from limited Fourier data. One of the most successful techniques for dynamic imaging applications is the reduced-encoding imaging by generalized-series reconstruction method of Liang and Lauterbur. However, even though this method utilizes a priori data for optimal image reconstruction, the produced dynamic images are degraded by truncation artifacts, most notably Gibbs ringing, due to the low spatial resolution of the data. We use a TV regularization strategy to reduce these truncation artifacts in the dynamic images. The resulting TV minimization problem is solved by the fixed-point iteration method of Vogel and Oman. Results of test problems with simulated and real data are presented to illustrate the effectiveness of the proposed approach in reducing the truncation artifacts of the reconstructed images.
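A minimal 1D sketch of the TV-regularized step, using the lagged-diffusivity fixed-point iteration in the spirit of Vogel and Oman (the data term here is plain denoising rather than reconstruction from limited Fourier data, and the parameters are illustrative):

```python
import numpy as np

def tv_denoise(d, lam=1.0, eps=1e-6, iters=50):
    """Minimize 0.5*||u - d||^2 + lam * sum_i sqrt((u_{i+1} - u_i)^2 + eps)
    by the lagged-diffusivity fixed-point iteration."""
    n = len(d)
    u = d.copy()
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]      # forward-difference operator
    for _ in range(iters):
        w = 1.0 / np.sqrt((D @ u) ** 2 + eps)     # lagged-diffusivity weights
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        u = np.linalg.solve(A, d)                 # fixed-point update
    return u

rng = np.random.default_rng(1)
d = np.concatenate([np.zeros(20), np.ones(20)]) + 0.1 * rng.standard_normal(40)
u = tv_denoise(d)
```

The edge at the midpoint is preserved while small oscillations (the 1D analog of Gibbs ringing) are flattened, which is the behavior exploited above for dynamic MR images.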

  2. Homotopy Based Reconstruction from Acoustic Images

    DEFF Research Database (Denmark)

    Sharma, Ojaswa

    of the inherent arrangement. The problem of reconstruction from arbitrary cross sections is a generic problem and is also shown to be solved here using the mathematical tool of continuous deformations. As part of a complete processing, segmentation using level set methods is explored for acoustic images and fast...... GPU (Graphics Processing Unit) based methods are suggested for a streaming computation on large volumes of data. Validation of results for acoustic images is not straightforward due to unavailability of ground truth. Accuracy figures for the suggested methods are provided using phantom object...

  3. An Lq–Lp optimization framework for image reconstruction of electrical resistance tomography

    International Nuclear Information System (INIS)

    Zhao, Jia; Xu, Yanbin; Dong, Feng

    2014-01-01

    Image reconstruction in electrical resistance tomography (ERT) is an ill-posed and nonlinear problem, which is easily affected by measurement noise. The regularization method with an L2 constraint term or an L1 constraint term is often used to solve the inverse problem of ERT. Reconstruction with L2 regularization imposes smoothness to obtain stability, which blurs the interface between different conductivities. The regularization method with the L1 norm is powerful at dealing with these over-smoothing effects, which is beneficial for obtaining a sharp transition in the conductivity distribution. To find the reason for these effects, an Lq–Lp optimization framework (1 ⩽ q ⩽ 2, 1 ⩽ p ⩽ 2) for the image reconstruction of ERT is presented in this paper. The Lq–Lp optimization framework is solved based on an approximation handled with a Gauss–Newton iteration algorithm. The optimization framework is tested for image reconstruction of ERT with different models, and the effects of the Lp regularization term on the quality of the reconstructed images are discussed with both simulation and experiment. By comparing the reconstructed results for different p in the regularization term, it is found that a large penalty is applied to small entries of the solution when p is small, and a lesser penalty when p is larger. Larger p also makes the reconstructed images smoother and more easily affected by noise. (paper)

  4. Realtime Reconstruction of an Animating Human Body from a Single Depth Camera.

    Science.gov (United States)

    Chen, Yin; Cheng, Zhi-Quan; Lai, Chao; Martin, Ralph R; Dang, Gang

    2016-08-01

    We present a method for realtime reconstruction of an animating human body, which produces a sequence of deforming meshes representing a given performance captured by a single commodity depth camera. We achieve realtime single-view mesh completion by enhancing the parameterized SCAPE model. Our method, which we call Realtime SCAPE, performs full-body reconstruction without the use of markers. In Realtime SCAPE, estimations of body shape parameters and pose parameters, needed for reconstruction, are decoupled. Intrinsic body shape is first precomputed for a given subject, by determining shape parameters with the aid of a body shape database. Subsequently, per-frame pose parameter estimation is performed by means of linear blending skinning (LBS); the problem is decomposed into separately finding skinning weights and transformations. The skinning weights are also determined offline from the body shape database, reducing online reconstruction to simply finding the transformations in LBS. Doing so is formulated as a linear variational problem; carefully designed constraints are used to impose temporal coherence and alleviate artifacts. Experiments demonstrate that our method can produce full-body mesh sequences with high fidelity.

  5. Ultrasound guided electrical impedance tomography for 2D free-interface reconstruction

    Science.gov (United States)

    Liang, Guanghui; Ren, Shangjie; Dong, Feng

    2017-07-01

    The free-interface detection problem is normally seen in industrial or biological processes. Electrical impedance tomography (EIT) is a non-invasive technique with advantages of high-speed and low cost, and is a promising solution for free-interface detection problems. However, due to the ill-posed and nonlinear characteristics, the spatial resolution of EIT is low. To deal with the issue, an ultrasound guided EIT is proposed to directly reconstruct the geometric configuration of the target free-interface. In the method, the position of the central point of the target interface is measured by a pair of ultrasound transducers mounted at the opposite side of the objective domain, and then the position measurement is used as the prior information for guiding the EIT-based free-interface reconstruction. During the process, a constrained least squares framework is used to fuse the information from different measurement modalities, and the Lagrange multiplier-based Levenberg-Marquardt method is adopted to provide the iterative solution of the constraint optimization problem. The numerical results show that the proposed ultrasound guided EIT method for the free-interface reconstruction is more accurate than the single modality method, especially when the number of valid electrodes is limited.
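The fusion step can be sketched as an equality-constrained least-squares problem solved via the Lagrange-multiplier (KKT) system. All matrices below are toy stand-ins, with a single linear constraint playing the role of the ultrasound-measured interface point:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((20, 5))           # toy linearized EIT sensitivity model
b = rng.standard_normal(20)                # toy EIT measurements
C = np.array([[1.0, 0.0, 0.0, 0.0, 0.0]])  # "ultrasound" prior: fixes parameter 0
d = np.array([0.7])

# minimize ||A x - b||^2 subject to C x = d, via the KKT system:
#   [ A^T A  C^T ] [ x  ]   [ A^T b ]
#   [ C      0   ] [ mu ] = [ d     ]
n, m = A.shape[1], C.shape[0]
KKT = np.block([[A.T @ A, C.T], [C, np.zeros((m, m))]])
rhs = np.concatenate([A.T @ b, d])
sol = np.linalg.solve(KKT, rhs)
x, mu = sol[:n], sol[n:]                   # solution and Lagrange multiplier
```

In the paper's actual method the constrained problem is solved iteratively with a Lagrange multiplier-based Levenberg-Marquardt scheme, since the EIT forward model is nonlinear; the single KKT solve above illustrates only the linearized core.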

  6. Ultrasound guided electrical impedance tomography for 2D free-interface reconstruction

    International Nuclear Information System (INIS)

    Liang, Guanghui; Ren, Shangjie; Dong, Feng

    2017-01-01

    The free-interface detection problem is normally seen in industrial or biological processes. Electrical impedance tomography (EIT) is a non-invasive technique with advantages of high-speed and low cost, and is a promising solution for free-interface detection problems. However, due to the ill-posed and nonlinear characteristics, the spatial resolution of EIT is low. To deal with the issue, an ultrasound guided EIT is proposed to directly reconstruct the geometric configuration of the target free-interface. In the method, the position of the central point of the target interface is measured by a pair of ultrasound transducers mounted at the opposite side of the objective domain, and then the position measurement is used as the prior information for guiding the EIT-based free-interface reconstruction. During the process, a constrained least squares framework is used to fuse the information from different measurement modalities, and the Lagrange multiplier-based Levenberg–Marquardt method is adopted to provide the iterative solution of the constraint optimization problem. The numerical results show that the proposed ultrasound guided EIT method for the free-interface reconstruction is more accurate than the single modality method, especially when the number of valid electrodes is limited. (paper)

  7. Continuous analog of multiplicative algebraic reconstruction technique for computed tomography

    Science.gov (United States)

    Tateishi, Kiyoko; Yamaguchi, Yusaku; Abou Al-Ola, Omar M.; Kojima, Takeshi; Yoshinaga, Tetsuya

    2016-03-01

    We propose a hybrid dynamical system as a continuous analog to the block-iterative multiplicative algebraic reconstruction technique (BI-MART), which is a well-known iterative image reconstruction algorithm for computed tomography. The hybrid system is described by a switched nonlinear system with a piecewise smooth vector field or differential equation and, for consistent inverse problems, the convergence of non-negatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem. Namely, we can prove theoretically that a weighted Kullback-Leibler divergence measure is a common Lyapunov function for the switched system. We show that discretizing the differential equation using the first-order approximation (Euler's method) based on the geometric multiplicative calculus leads to the same iterative formula as the BI-MART, with the scaling parameter as the time step of the numerical discretization. The present paper is the first to reveal that an iterative image reconstruction algorithm of this kind can be constructed by discretizing a continuous-time dynamical system for solving tomographic inverse problems. Iterative algorithms based not only on the Euler method but also on lower-order Runge-Kutta methods applied to the continuous-time system can be used for image reconstruction. A numerical example showing the characteristics of the discretized iterative methods is presented.
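The correspondence can be illustrated on a tiny consistent system: a multiplicative Euler step of the continuous dynamics (in the spirit of the geometric calculus) yields a MART-type update, with the time step playing the role of the scaling parameter. The system below is a toy example, not taken from the paper:

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true                      # consistent projection data

x = np.ones(3)                      # positive initial image
dt = 0.5                            # time step = MART scaling parameter
for _ in range(200):
    ratios = b / (A @ x)            # measurement / forward projection
    # multiplicative Euler step: log(x) advances along A^T log(ratios),
    # which is a MART-type multiplicative update
    x = x * np.exp(dt * (A.T @ np.log(ratios)))
```

Because the system is consistent and the multiplicative form preserves positivity automatically, the iterate converges to the exact solution.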

  8. Codiversification of gastrointestinal microbiota and phylogeny in passerines is not explained by ecological divergence

    Czech Academy of Sciences Publication Activity Database

    Kropáčková, L.; Těšický, M.; Albrecht, Tomáš; Kubovčiak, J.; Čížková, Dagmar; Tomášek, Oldřich; Martin, J.-F.; Bobek, Lukáš; Králová, Tereza; Procházka, Petr; Kreisinger, J.

    2017-01-01

    Roč. 26, č. 19 (2017), s. 5292-5304 ISSN 0962-1083 R&D Projects: GA ČR GA15-11782S Institutional support: RVO:68081766 Keywords : birds * cophylogeny * metagenomics * microbiome * neutral/adaptive evolution Subject RIV: EG - Zoology OBOR OECD: Biochemistry and molecular biology Impact factor: 6.086, year: 2016

  9. Efficient L1 regularization-based reconstruction for fluorescent molecular tomography using restarted nonlinear conjugate gradient.

    Science.gov (United States)

    Shi, Junwei; Zhang, Bin; Liu, Fei; Luo, Jianwen; Bai, Jing

    2013-09-15

    For the ill-posed fluorescent molecular tomography (FMT) inverse problem, L1 regularization can preserve high-frequency information such as edges while effectively reducing image noise. However, the state-of-the-art L1 regularization-based algorithms for FMT reconstruction are expensive in memory, especially for large-scale problems. An efficient L1 regularization-based reconstruction algorithm based on nonlinear conjugate gradient with a restarted strategy is proposed to increase the computational speed with low memory consumption. The reconstruction results from phantom experiments demonstrate that the proposed algorithm can obtain high spatial resolution and a high signal-to-noise ratio, as well as high localization accuracy for fluorescence targets.
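A compact way to see the role of the L1 term is the sketch below, which uses ISTA, a standard proximal-gradient L1 solver, rather than the paper's restarted nonlinear conjugate gradient; the toy underdetermined system is an assumption:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the L1 norm: shrinks small entries to exactly zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(6)
A = rng.standard_normal((40, 80))         # underdetermined toy forward model
x_true = np.zeros(80)
x_true[[5, 30, 61]] = [2.0, -1.5, 1.0]    # sparse "fluorescence targets"
b = A @ x_true

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2    # step size from the Lipschitz constant
x = np.zeros(80)
for _ in range(2000):
    x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
```

Despite only 40 measurements for 80 unknowns, the three-element support is recovered; an L2-regularized solution of the same system would instead spread energy over all entries.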

  10. Total variation superiorized conjugate gradient method for image reconstruction

    Science.gov (United States)

    Zibetti, Marcelo V. W.; Lin, Chuan; Herman, Gabor T.

    2018-03-01

    The conjugate gradient (CG) method is commonly used for the relatively rapid solution of least squares problems. In image reconstruction, the problem can be ill-posed and also contaminated by noise; due to this, approaches such as regularization should be utilized. Total variation (TV) is a useful regularization penalty, frequently utilized in image reconstruction for generating images with sharp edges. When a non-quadratic norm is selected for regularization, as is the case for TV, then it is no longer possible to use CG. Non-linear CG is an alternative, but it does not share the efficiency that CG shows with least squares, and methods such as fast iterative shrinkage-thresholding algorithms (FISTA) are preferred for problems with the TV norm. A different approach to including prior information is superiorization. In this paper it is shown that the conjugate gradient method can be superiorized. Five different CG variants are proposed, including preconditioned CG. The CG methods superiorized by the total variation norm are presented and their performance in image reconstruction is demonstrated. It is illustrated that some of the proposed variants of the superiorized CG method can produce reconstructions of superior quality to those produced by FISTA, and in less computational time, due to the speed of the original CG for least squares problems. In the Appendix we examine the behavior of one of the superiorized CG methods (we call it S-CG); one of its input parameters is a positive number ɛ. It is proved that, for any given ɛ that is greater than the half-squared-residual for the least squares solution, S-CG terminates in a finite number of steps with an output for which the half-squared-residual is less than or equal to ɛ. Importantly, it is also the case that the output will have a lower value of TV than what would be provided by unsuperiorized CG for the same value ɛ of the half-squared residual.
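The idea can be sketched with restarted CG on the normal equations, interleaved with diminishing TV-reducing perturbation steps. This is a simplified stand-in for the paper's S-CG; the toy problem, step schedule, and smoothing are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((60, 30))
x_true = np.concatenate([np.zeros(15), np.ones(15)])   # piecewise-constant target
b = A @ x_true + 0.05 * rng.standard_normal(60)

def tv_subgrad(x, eps=1e-8):
    """Subgradient of the (smoothed) 1D total variation."""
    g = np.zeros_like(x)
    d = np.diff(x) / np.sqrt(np.diff(x) ** 2 + eps)
    g[:-1] -= d
    g[1:] += d
    return g

AtA, Atb = A.T @ A, A.T @ b
x = np.zeros(30)
for outer in range(5):
    x = x - 0.5 ** outer * tv_subgrad(x)    # superiorization: TV-reducing nudge
    r = Atb - AtA @ x                       # restart CG from the perturbed iterate
    p = r.copy()
    for _ in range(10):                     # standard CG inner iterations
        Ap = AtA @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
```

The perturbations are summable, so they do not destroy the convergence of the underlying least-squares iteration; they merely steer it toward iterates with lower TV.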

  11. Gamma regularization based reconstruction for low dose CT

    International Nuclear Information System (INIS)

    Zhang, Junfeng; Chen, Yang; Hu, Yining; Luo, Limin; Shu, Huazhong; Li, Bicao; Liu, Jin; Coatrieux, Jean-Louis

    2015-01-01

    Reducing the radiation in computerized tomography is today a major concern in radiology. Low dose computerized tomography (LDCT) offers a sound way to deal with this problem. However, more severe noise in the reconstructed CT images is observed under low dose scan protocols (e.g. lowered tube current or voltage values). In this paper we propose a Gamma regularization based algorithm for LDCT image reconstruction. This solution is flexible and provides a good balance between the regularizations based on the l0-norm and the l1-norm. We evaluate the proposed approach using projection data from simulated phantoms and scanned Catphan phantoms. Qualitative and quantitative results show that the Gamma regularization based reconstruction performs better in both edge preservation and noise suppression when compared with other norms. (paper)

  12. A Total Variation Regularization Based Super-Resolution Reconstruction Algorithm for Digital Video

    Directory of Open Access Journals (Sweden)

    Zhang Liangpei

    2007-01-01

    Full Text Available Super-resolution (SR) reconstruction techniques are capable of producing a high-resolution image from a sequence of low-resolution images. In this paper, we study an efficient SR algorithm for digital video. To effectively deal with the intractable problems in SR video reconstruction, such as inevitable motion estimation errors, noise, blurring, missing regions, and compression artifacts, total variation (TV) regularization is employed in the reconstruction model. We use the fixed-point iteration method and preconditioning techniques to efficiently solve the associated nonlinear Euler-Lagrange equations of the corresponding variational problem in SR. The proposed algorithm has been tested in several cases of motion and degradation. It is also compared with the Laplacian regularization-based SR algorithm and other TV-based SR algorithms. Experimental results are presented to illustrate the effectiveness of the proposed algorithm.

  13. Wavelet-sparsity based regularization over time in the inverse problem of electrocardiography.

    Science.gov (United States)

    Cluitmans, Matthijs J M; Karel, Joël M H; Bonizzi, Pietro; Volders, Paul G A; Westra, Ronald L; Peeters, Ralf L M

    2013-01-01

    Noninvasive, detailed assessment of electrical cardiac activity at the level of the heart surface has the potential to revolutionize diagnostics and therapy of cardiac pathologies. Due to the requirement of noninvasiveness, body-surface potentials are measured and have to be projected back to the heart surface, yielding an ill-posed inverse problem. Ill-posedness ensures that there are non-unique solutions to this problem, resulting in a problem of choice. In the current paper, it is proposed to restrict this choice by requiring that the time series of reconstructed heart-surface potentials is sparse in the wavelet domain. A local search technique is introduced that pursues a sparse solution, using an orthogonal wavelet transform. Epicardial potentials reconstructed from this method are compared to those from existing methods, and validated with actual intracardiac recordings. The new technique improves the reconstructions in terms of smoothness and recovers physiologically meaningful details. Additionally, reconstruction of activation timing seems to be improved when pursuing sparsity of the reconstructed signals in the wavelet domain.
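The sparsity operation at the heart of this approach can be sketched with a one-level Haar transform and hard thresholding. The threshold and test signal are illustrative; the paper uses a full orthogonal wavelet transform of the time series of heart-surface potentials:

```python
import numpy as np

def haar(x):
    """One-level orthogonal Haar transform: approximation and detail bands."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar(a, d):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 64)
signal = np.where(t < 0.5, 0.0, 1.0) + 0.05 * rng.standard_normal(64)

a, d = haar(signal)
d_sparse = np.where(np.abs(d) > 0.1, d, 0.0)   # keep only large detail coefficients
reconstructed = ihaar(a, d_sparse)
```

Small detail coefficients, which here mostly encode noise, are zeroed, while the large-scale structure survives in the approximation band; pursuing such sparsity is what restricts the choice among the non-unique inverse solutions.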

  14. Non-stationary reconstruction for dynamic fluorescence molecular tomography with extended kalman filter.

    Science.gov (United States)

    Liu, Xin; Wang, Hongkai; Yan, Zhuangzhi

    2016-11-01

    Dynamic fluorescence molecular tomography (FMT) plays an important role in drug delivery research. However, the majority of current reconstruction methods focus on solving stationary FMT problems. If stationary reconstruction methods are applied to time-varying fluorescence measurements, the reconstructed results may suffer from a high level of artifacts. In addition, with stationary methods only one tomographic image can be obtained after scanning one full circle of projection data. As a result, the movement of fluorophore in the imaged object may not be detected, due to the relatively long data acquisition time (typically >1 min). In this paper, we apply the extended Kalman filter (EKF) technique to solve the non-stationary fluorescence tomography problem. In particular, to improve the EKF reconstruction performance, the generalized inverse of the Kalman gain is calculated by a second-order iterative method. Numerical simulation, phantom, and in vivo experiments are performed to evaluate the performance of the method. The experimental results indicate that with the proposed EKF-based second-order iterative (EKF-SOI) method we can not only clearly resolve the time-varying distributions of fluorophore within the imaged object, but also greatly improve the reconstruction time resolution (~2.5 sec/frame), which makes it possible to detect the movement of fluorophore during the imaging process.
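A toy sketch of the EKF machinery, with a random-walk state, a made-up nonlinear measurement model, and illustrative noise levels (the EKF-SOI method additionally replaces the gain's matrix inversion with a second-order iterative approximation, which is omitted here):

```python
import numpy as np

def h(x):                      # nonlinear measurement model (illustrative)
    return np.array([x[0] ** 2, x[0] * x[1], x[1] ** 2])

def H_jac(x):                  # Jacobian of h, evaluated at the current estimate
    return np.array([[2 * x[0], 0.0],
                     [x[1], x[0]],
                     [0.0, 2 * x[1]]])

rng = np.random.default_rng(3)
Q = 0.01 * np.eye(2)           # process noise: slow temporal variation
R = 0.001 * np.eye(3)          # measurement-noise covariance
x_est, P = np.array([1.0, 1.0]), np.eye(2)

x_state = np.array([1.5, 0.8])           # time-varying "fluorophore" state
for _ in range(50):
    x_state = x_state + 0.01 * rng.standard_normal(2)      # slow drift
    z = h(x_state) + np.sqrt(0.001) * rng.standard_normal(3)
    P = P + Q                            # predict (random-walk dynamics)
    H = H_jac(x_est)                     # update (linearize at the estimate)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x_est = x_est + K @ (z - h(x_est))
    P = (np.eye(2) - K @ H) @ P
```

Because each measurement is assimilated as it arrives, the estimate tracks the drifting state rather than averaging it over a full acquisition cycle, which is the source of the improved time resolution.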

  15. Application of Super-Resolution Image Reconstruction to Digital Holography

    Directory of Open Access Journals (Sweden)

    Zhang Shuqun

    2006-01-01

    Full Text Available We describe a new application of super-resolution image reconstruction to digital holography, a technique for three-dimensional information recording and reconstruction. Digital holography has suffered from the low resolution of CCD sensors, which significantly limits the size of objects that can be recorded. The existing solution to this problem is to use optics to bandlimit the object to be recorded, which can cause the loss of details. Here, super-resolution image reconstruction is proposed for enhancing the spatial resolution of digital holograms. By introducing a global camera translation before sampling, a high-resolution hologram can be reconstructed from a set of undersampled hologram images. This permits the recording of larger objects and reduces the distance between the object and the hologram. Practical results from real and simulated holograms are presented to demonstrate the feasibility of the proposed technique.

  16. A Convex Formulation for Magnetic Particle Imaging X-Space Reconstruction.

    Science.gov (United States)

    Konkle, Justin J; Goodwill, Patrick W; Hensley, Daniel W; Orendorff, Ryan D; Lustig, Michael; Conolly, Steven M

    2015-01-01

    Magnetic particle imaging (MPI) is an emerging imaging modality with exceptional promise for clinical applications in rapid angiography, cell therapy tracking, cancer imaging, and inflammation imaging. Recent publications have demonstrated quantitative MPI across rat-sized fields of view with x-space reconstruction methods. Critical to any medical imaging technology is the reliability and accuracy of image reconstruction. Because the average value of the MPI signal is lost during direct-feedthrough signal filtering, MPI reconstruction algorithms must recover this zero-frequency value. Prior x-space MPI recovery techniques were limited to 1D approaches, which could introduce artifacts when reconstructing a 3D image. In this paper, we formulate x-space reconstruction as a 3D convex optimization problem and apply robust a priori knowledge of image smoothness and non-negativity to reduce non-physical banding and haze artifacts. We conclude with a discussion of the powerful extensibility of the presented formulation for future applications.

  17. Industrial dynamic tomographic reconstruction

    International Nuclear Information System (INIS)

    Oliveira, Eric Ferreira de

    2016-01-01

    The state-of-the-art methods applied to industrial processes are currently based on the principles of classical tomographic reconstruction developed for tomographic patterns of static distributions, or are limited to cases of low variability of the density distribution function of the tomographed object. Noise and motion artifacts are the main problems caused by a mismatch in the data from views acquired at different instants. All of these add to the known fact that using a limited amount of data can result in the presence of noise, artifacts and some inconsistencies with the distribution under study. One of the objectives of the present work is to discuss the difficulties that arise from implementing reconstruction algorithms in dynamic tomography that were originally developed for static distributions. Another objective is to propose solutions that aim at reducing a temporal type of information loss caused by employing regular acquisition systems for dynamic processes. With respect to dynamic image reconstruction, a comparison was conducted between different static reconstruction methods, such as MART and FBP, when used for dynamic scenarios. This comparison was based on an MCNPx simulation as well as an analytical setup of an aluminum cylinder that moves along the section of a riser during the process of acquisition, and also on cross-section images from CFD techniques. As for the adaptation of current tomographic acquisition systems to dynamic processes, this work established a sequence of tomographic views in a just-in-time fashion for visualization purposes, a form of visually presenting density information as soon as it becomes amenable to image reconstruction.
A third contribution was to take advantage of the triple color channel necessary to display colored images in most displays, so that, by appropriately scaling the acquired values of each view in the linear system of the reconstruction, it was possible to imprint a temporal trace into the regularly
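
The MART method mentioned in the comparison above can be sketched as follows; this is a generic multiplicative ART iteration on an invented 2x2 phantom with row- and column-sum rays, not the thesis' MCNPx or CFD setups.

```python
import numpy as np

# Illustrative MART (multiplicative ART) sketch on a tiny 2x2 image.
x_true = np.array([1.0, 2.0, 3.0, 4.0])       # flattened 2x2 phantom
A = np.array([[1, 1, 0, 0],                    # ray: row 0 sum
              [0, 0, 1, 1],                    # row 1
              [1, 0, 1, 0],                    # column 0
              [0, 1, 0, 1]], dtype=float)      # column 1
b = A @ x_true                                 # noise-free projections

x = np.ones(4)                                 # strictly positive start
lam = 1.0                                      # relaxation parameter
for _ in range(200):                           # cyclic sweeps over rays
    for i in range(len(b)):
        ratio = b[i] / (A[i] @ x)
        x *= ratio ** (lam * A[i])             # update only pixels on ray i
```

MART converges to the maximum-entropy image consistent with the projections, so the recovered x need not equal x_true even though A @ x matches b; with moving objects the mismatch between views makes this worse, which is the thesis' point.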

  18. Image reconstruction from multiple fan-beam projections

    International Nuclear Information System (INIS)

    Jelinek, J.; Overton, T.R.

    1984-01-01

    Special-purpose third-generation fan-beam CT systems can be greatly simplified by limiting the number of detectors, but this requires a different mode of data collection to provide a set of projections appropriate to the required spatial resolution in the reconstructed image. Repeated rotation of the source-detector fan, combined with shift of the detector array and perhaps offset of the source with respect to the fan's axis after each 360 0 rotation(cycle), provides a fairly general pattern of projection space filling. The authors' investigated the problem of optimal data-collection geometry for a multiple-rotation fan-beam scanner and of corresponding reconstruction algorithm

  19. Ensemble Kalman filter for the reconstruction of the Earth's mantle circulation

    Directory of Open Access Journals (Sweden)

    M. Bocher

    2018-02-01

    Full Text Available Recent advances in mantle convection modeling led to the release of a new generation of convection codes, able to self-consistently generate plate-like tectonics at their surface. Those models physically link mantle dynamics to surface tectonics. Combined with plate tectonic reconstructions, they have the potential to produce a new generation of mantle circulation models that use data assimilation methods and where uncertainties in plate tectonic reconstructions are taken into account. We provided a proof of this concept by applying a suboptimal Kalman filter to the reconstruction of mantle circulation (Bocher et al., 2016. Here, we propose to go one step further and apply the ensemble Kalman filter (EnKF to this problem. The EnKF is a sequential Monte Carlo method particularly adapted to solve high-dimensional data assimilation problems with nonlinear dynamics. We tested the EnKF using synthetic observations consisting of surface velocity and heat flow measurements on a 2-D-spherical annulus model and compared it with the method developed previously. The EnKF performs on average better and is more stable than the former method. Less than 300 ensemble members are sufficient to reconstruct an evolution. We use covariance adaptive inflation and localization to correct for sampling errors. We show that the EnKF results are robust over a wide range of covariance localization parameters. The reconstruction is associated with an estimation of the error, and provides valuable information on where the reconstruction is to be trusted or not.
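
The EnKF analysis step described above can be sketched in a few lines. This is the generic stochastic EnKF with perturbed observations; the dimensions, observation operator and covariances below are toy choices, not the paper's mantle-convection setup.

```python
import numpy as np

# Stochastic EnKF analysis step with perturbed observations (toy sizes).
rng = np.random.default_rng(0)
n, m, N = 10, 4, 300                 # state dim, obs dim, ensemble size

H = np.zeros((m, n))
H[np.arange(m), np.arange(m)] = 1.0  # observe the first m state entries
R = 0.1 * np.eye(m)                  # observation-error covariance

x_true = rng.normal(size=n)
y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)

E = rng.normal(size=(n, N))          # prior ensemble (columns = members)

Xp = E - E.mean(axis=1, keepdims=True)
P = Xp @ Xp.T / (N - 1)              # sample forecast covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
E_a = E + K @ (Y - H @ E)            # analysis ensemble

prior_err = np.abs(H @ E.mean(axis=1) - y).mean()
post_err = np.abs(H @ E_a.mean(axis=1) - y).mean()
```

The inflation and localization corrections the paper applies would modify P before the gain is computed; they are omitted in this sketch.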

  20. Surface Reconstruction and Image Enhancement via $L^1$-Minimization

    KAUST Repository

    Dobrev, Veselin; Guermond, Jean-Luc; Popov, Bojan

    2010-01-01

    A surface reconstruction technique based on minimization of the total variation of the gradient is introduced. Convergence of the method is established, and an interior-point algorithm solving the associated linear programming problem is introduced
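
The linear-programming reformulation of an L1 objective can be illustrated with a 1D analogue (the paper minimizes the TV of the gradient in 2D; this sketch minimizes the TV of the profile itself, with hypothetical sample locations, solved by an off-the-shelf LP solver):

```python
import numpy as np
from scipy.optimize import linprog

n = 10
samples = {0: 0.0, 5: 1.0, 9: 1.0}   # known heights (hypothetical)

# Variables: heights x[0..n-1] plus slacks t[0..n-2], t_i >= |x[i+1]-x[i]|.
# Objective: minimize sum(t) == total variation of the profile.
c = np.concatenate([np.zeros(n), np.ones(n - 1)])

D = np.zeros((n - 1, n))                      # forward-difference operator
for i in range(n - 1):
    D[i, i], D[i, i + 1] = -1.0, 1.0

A_ub = np.block([[D, -np.eye(n - 1)],         #  Dx - t <= 0
                 [-D, -np.eye(n - 1)]])       # -Dx - t <= 0
b_ub = np.zeros(2 * (n - 1))

A_eq = np.zeros((len(samples), 2 * n - 1))    # interpolate known samples
b_eq = np.zeros(len(samples))
for row, (k, v) in enumerate(samples.items()):
    A_eq[row, k] = 1.0
    b_eq[row] = v

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * n + [(0, None)] * (n - 1))
surface = res.x[:n]
```

The optimum equals the total variation of the sample values themselves (here 1.0), achieved by any monotone fill-in between samples; the slack-variable trick is the standard way absolute values enter an LP.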

  1. First results of genetic algorithm application in ML image reconstruction in emission tomography

    International Nuclear Information System (INIS)

    Smolik, W.

    1999-01-01

    This paper concerns the application of a genetic algorithm to maximum likelihood image reconstruction in emission tomography. An example of a genetic algorithm for image reconstruction is presented. The genetic algorithm was based on the typical genetic scheme, modified due to the nature of the solved problem. The convergence of the algorithm was examined. Different adaptation functions and selection and crossover methods were verified. The algorithm was tested on simulated SPECT data. The obtained results of image reconstruction are discussed. (author)
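
A minimal version of the idea looks like the following: a real-coded genetic algorithm maximizing a Poisson log-likelihood for a 2x2 emission image. The geometry, operators and GA settings here are invented for illustration, not taken from the paper.

```python
import numpy as np

# Toy GA for maximum-likelihood emission reconstruction.
rng = np.random.default_rng(1)
A = np.array([[1, 1, 0, 0], [0, 0, 1, 1],
              [1, 0, 1, 0], [0, 1, 0, 1]], dtype=float)
x_true = np.array([4.0, 1.0, 2.0, 5.0])
counts = rng.poisson(A @ x_true)              # simulated SPECT-like counts

def loglik(x):
    lam = A @ x + 1e-9                        # expected counts per ray
    return np.sum(counts * np.log(lam) - lam) # Poisson log-likelihood

pop = rng.uniform(0, 8, size=(40, 4))         # initial population
best0 = max(loglik(p) for p in pop)
for _ in range(100):
    fit = np.array([loglik(p) for p in pop])
    elite = pop[fit.argmax()].copy()          # elitism: keep the best
    # Tournament selection of parents.
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]],
                           idx[:, 0], idx[:, 1])]
    # Blend crossover with shuffled partners, then Gaussian mutation.
    partners = parents[rng.permutation(len(parents))]
    w = rng.uniform(size=(len(parents), 1))
    pop = w * parents + (1 - w) * partners
    pop += rng.normal(0.0, 0.3, size=pop.shape)
    pop = np.clip(pop, 0.0, None)             # activities stay non-negative
    pop[0] = elite
best = max(loglik(p) for p in pop)
```

Because of elitism the best likelihood never decreases across generations; the choice of adaptation (fitness) function and crossover scheme is exactly what the paper varies.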

  2. Two-Stage Tissue-Expander Breast Reconstruction: A Focus on the Surgical Technique

    Directory of Open Access Journals (Sweden)

    Elisa Bellini

    2017-01-01

    Full Text Available Objective. Breast cancer, the most common malignancy in women, comprises 18% of all female cancers. Mastectomy is an essential intervention to save lives, but it can destroy one’s body image, causing both physical and psychological trauma. Reconstruction is an important step in restoring patient quality of life after the mutilating treatment. Material and Methods. Tissue expanders and implants are now commonly used in breast reconstruction. Autologous reconstruction allows a better aesthetic result; however, many patients prefer implant reconstruction due to the shorter operation time and lack of donor site morbidity. Moreover, this reconstruction strategy is safe and can be performed in patients with multiple health problems. Tissue-expander reconstruction is conventionally performed as a two-stage procedure starting immediately after mammary gland removal. Results. Mastectomy is a destructive but essential intervention for women with breast cancer. Tissue expansion breast reconstruction is a safe, reliable, and efficacious procedure with considerable psychological benefits since it provides a healthy body image. Conclusion. This article focuses on this surgical technique and how to achieve the best reconstruction possible.

  3. Simultaneous reconstruction and segmentation for dynamic SPECT imaging

    International Nuclear Information System (INIS)

    Burger, Martin; Rossmanith, Carolin; Zhang, Xiaoqun

    2016-01-01

    This work deals with the reconstruction of dynamic images that incorporate characteristic dynamics in certain subregions, as arising in the kinetics of many tracers in emission tomography (SPECT, PET). We make use of a basis function approach for the unknown tracer concentration by assuming that the region of interest can be divided into subregions with spatially constant concentration curves. Applying a regularised variational framework reminiscent of the Chan-Vese model for image segmentation, we simultaneously reconstruct both the labelling functions of the subregions and the subconcentrations within each region. Our particular focus is on applications in SPECT with the Poisson noise model, resulting in a Kullback–Leibler data fidelity in the variational approach. We present a detailed analysis of the proposed variational model and prove existence of minimisers as well as error estimates. The latter apply to a more general class of problems and generalise existing results in the literature, since we deal with a nonlinear forward operator and a nonquadratic data fidelity. A computational algorithm based on alternating minimisation and splitting techniques is developed for the solution of the problem and tested on appropriately designed synthetic data sets. For these we compare the results to those of standard EM reconstructions and investigate the effects of Poisson noise in the data. (paper)
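
The standard EM (MLEM) baseline mentioned for comparison has a compact multiplicative form, which corresponds to minimizing the same Kullback-Leibler fidelity without the segmentation prior. The 2x2 geometry and noise-free counts below are illustrative only.

```python
import numpy as np

# Minimal MLEM sketch for Poisson data (the EM baseline, no segmentation).
A = np.array([[1, 1, 0, 0], [0, 0, 1, 1],
              [1, 0, 1, 0], [0, 1, 0, 1]], dtype=float)
x_true = np.array([2.0, 1.0, 4.0, 3.0])
y = A @ x_true                        # noise-free projection counts

x = np.ones(4)                        # positive initialization
sens = A.sum(axis=0)                  # sensitivity image A^T 1
for _ in range(2000):
    ratio = y / (A @ x + 1e-12)
    x *= (A.T @ ratio) / sens         # classic multiplicative EM update
```

The update keeps the image non-negative by construction; the paper's approach replaces the per-pixel unknowns with region labels plus per-region time curves, estimated by alternating minimisation.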

  4. Higher order total variation regularization for EIT reconstruction.

    Science.gov (United States)

    Gong, Bo; Schullcke, Benjamin; Krueger-Ziolek, Sabine; Zhang, Fan; Mueller-Lisse, Ullrich; Moeller, Knut

    2018-01-08

    Electrical impedance tomography (EIT) attempts to reveal the conductivity distribution of a domain based on the electrical boundary condition. This is an ill-posed inverse problem; its solution is very unstable. Total variation (TV) regularization is one of the techniques commonly employed to stabilize reconstructions. However, it is well known that TV regularization induces staircase effects, which are not realistic in clinical applications. To reduce such artifacts, modified TV regularization terms considering a higher order differential operator were developed in several previous studies. One of them is called total generalized variation (TGV) regularization. TGV regularization has been successfully applied in image processing in a regular grid context. In this study, we adapted TGV regularization to the finite element model (FEM) framework for EIT reconstruction. Reconstructions using simulation and clinical data were performed. First results indicate that, in comparison to TV regularization, TGV regularization promotes more realistic images. Graphical abstract: Reconstructed conductivity changes located on selected vertical lines. For each of the reconstructed images as well as the ground truth image, conductivity changes located along the selected left and right vertical lines are plotted. In these plots, the notation GT in the legend stands for ground truth, TV stands for the total variation method, and TGV stands for the total generalized variation method. Reconstructed conductivity distributions from the GREIT algorithm are also demonstrated.

  5. A Survey on Methods for Reconstructing Surfaces from Unorganized Point Sets

    Directory of Open Access Journals (Sweden)

    Vilius Matiukas

    2011-08-01

    Full Text Available This paper addresses the issue of reconstructing and visualizing surfaces from unorganized point sets. These can be acquired using different techniques, such as 3D laser scanning, computerized tomography, magnetic resonance imaging and multi-camera imaging. The problem of reconstructing surfaces from their unorganized point sets is common to many diverse areas, including computer graphics, computer vision, computational geometry and reverse engineering. The paper presents three alternative methods that all use variations in complementary cones to triangulate and reconstruct the tested 3D surfaces. The article evaluates and contrasts the three alternatives.

  6. Enhanced reconstruction of weighted networks from strengths and degrees

    International Nuclear Information System (INIS)

    Mastrandrea, Rossana; Fagiolo, Giorgio; Squartini, Tiziano; Garlaschelli, Diego

    2014-01-01

    Network topology plays a key role in many phenomena, from the spreading of diseases to that of financial crises. Whenever the whole structure of a network is unknown, one must resort to reconstruction methods that identify the least biased ensemble of networks consistent with the partial information available. A challenging case, frequently encountered due to privacy issues in the analysis of interbank flows and Big Data, is when only local (node-specific) aggregate information is available. For binary networks, the relevant ensemble is one where the degree (number of links) of each node is constrained to its observed value. However, for weighted networks the problem is much more complicated. While the naïve approach prescribes constraining the strengths (total link weights) of all nodes, recent counter-intuitive results suggest that in weighted networks the degrees are often more informative than the strengths. This implies that the reconstruction of weighted networks would be significantly enhanced by the specification of both strengths and degrees, a computationally hard and bias-prone procedure. Here we solve this problem by introducing an analytical and unbiased maximum-entropy method that works in the shortest possible time and does not require the explicit generation of reconstructed samples. We consider several real-world examples and show that, while the strengths alone give poor results, the additional knowledge of the degrees yields accurately reconstructed networks. Information-theoretic criteria rigorously confirm that the degree sequence, as soon as it is non-trivial, is irreducible to the strength sequence. Our results have strong implications for the analysis of motifs and communities and whenever the reconstructed ensemble is required as a null model to detect higher-order patterns
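
The degree-constrained part of such a maximum-entropy ensemble (the binary configuration model) can be sketched as a fixed-point problem for hidden variables x_i, with connection probabilities p_ij = x_i x_j / (1 + x_i x_j). The paper's enhanced method additionally constrains strengths; that part is omitted here, and the degree sequence below is toy data.

```python
import numpy as np

# Fit hidden variables so expected degrees match observed degrees.
k = np.array([1.0, 2.0, 2.0, 2.0, 3.0])   # observed degrees (toy)
x = k / np.sqrt(k.sum())                  # common starting guess

for _ in range(5000):
    xx = np.outer(x, x)
    P = xx / (1.0 + xx)                   # connection probabilities
    np.fill_diagonal(P, 0.0)
    denom = (P / x[:, None]).sum(axis=1)  # = sum_j x_j / (1 + x_i x_j)
    x = 0.5 * x + 0.5 * k / denom         # damped fixed-point update

xx = np.outer(x, x)
P = xx / (1.0 + xx)
np.fill_diagonal(P, 0.0)
expected_degrees = P.sum(axis=1)
```

At the fixed point each node's expected degree equals its observed degree, giving an unbiased ensemble without sampling; the enhanced model couples an analogous set of variables for the strengths.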

  7. An inverse Sturm–Liouville problem with a fractional derivative

    KAUST Repository

    Jin, Bangti

    2012-05-01

    In this paper, we numerically investigate an inverse problem of recovering the potential term in a fractional Sturm-Liouville problem from one spectrum. The qualitative behaviors of the eigenvalues and eigenfunctions are discussed, and numerical reconstructions of the potential with a Newton method from finite spectral data are presented. Surprisingly, it allows very satisfactory reconstructions for both smooth and discontinuous potentials, provided that the order α ∈ (1, 2) of the fractional derivative is sufficiently far away from 2. © 2012 Elsevier Inc.

  8. Reconstruction of electrical impedance tomography (EIT) images based on the expectation maximum (EM) method.

    Science.gov (United States)

    Wang, Qi; Wang, Huaxiang; Cui, Ziqiang; Yang, Chengyi

    2012-11-01

    Electrical impedance tomography (EIT) calculates the internal conductivity distribution within a body using electrical contact measurements. The image reconstruction for EIT is an inverse problem, which is both non-linear and ill-posed. The traditional regularization method cannot avoid introducing negative values into the solution. The negativity of the solution produces artifacts in reconstructed images in the presence of noise. A statistical method, namely, the expectation maximization (EM) method, is used to solve the inverse problem for EIT in this paper. The mathematical model of EIT is transformed into a non-negatively constrained likelihood minimization problem. The solution is obtained by the gradient projection-reduced Newton (GPRN) iteration method. This paper also discusses strategies for choosing parameters. Simulation and experimental results indicate that reconstructed images of higher quality can be obtained by the EM method, compared with the traditional Tikhonov and conjugate gradient (CG) methods, even with non-negative processing. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
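
The non-negativity constraint at the heart of the formulation can be illustrated with a plain gradient-projection iteration on a toy linearized problem; this is a simplified stand-in for the paper's GPRN solver, with a random "sensitivity" matrix and true image.

```python
import numpy as np

# Gradient projection for a non-negativity-constrained linear inverse problem.
rng = np.random.default_rng(3)
J = rng.normal(size=(20, 8))             # toy Jacobian (20 measurements)
s_true = np.abs(rng.normal(size=8))      # non-negative true parameters
d = J @ s_true                           # noise-free data

s = np.zeros(8)
step = 1.0 / np.linalg.norm(J.T @ J, 2)  # 1 / Lipschitz constant of gradient
for _ in range(5000):
    grad = J.T @ (J @ s - d)             # gradient of 0.5*||J s - d||^2
    s = np.maximum(0.0, s - step * grad) # project onto the constraint s >= 0
```

Projection after each gradient step keeps every iterate feasible, which is what prevents the negative-value artifacts the abstract describes; the reduced-Newton refinement accelerates this basic scheme.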

  9. A Support-Based Reconstruction for SENSE MRI

    Directory of Open Access Journals (Sweden)

    Bradley S. Peterson

    2013-03-01

    Full Text Available A novel, rapid algorithm to speed up and improve the reconstruction of sensitivity encoding (SENSE) MRI was proposed in this paper. The essence of the algorithm was that it iteratively solved the model of simple SENSE on a pixel-by-pixel basis in the region of support (ROS). The ROS was obtained from scout images of eight channels by morphological operations such as opening and filling. All the pixels in the FOV were paired and classified into four types, according to their spatial locations with respect to the ROS, each with a corresponding procedure for solving the inverse problem for image reconstruction. The sensitivity maps, used for the image reconstruction and covering only the ROS, were obtained by a polynomial regression model without extrapolation to keep the estimation errors small. The experiments demonstrate that the proposed method improves the reconstruction of SENSE in terms of speed and accuracy. The mean square error (MSE) of our reconstruction is reduced by 16.05% for a 2D brain MR image, and the mean MSE over the whole slices in a 3D brain MRI is reduced by 30.44% compared to those of the traditional methods. The computation time is only 25%, 45%, and 70% of the traditional method for images with numbers of pixels in the orders of 10^3, 10^4, and 10^5–10^7, respectively.
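
The pixel-by-pixel SENSE model the algorithm iterates over is a small least-squares system per aliased location: with acceleration factor R, each aliased pixel is a sensitivity-weighted sum of R true pixels across the coils. The coil sensitivities and pixel values below are synthetic.

```python
import numpy as np

# Unfolding one aliased location in SENSE (eight coils, R = 2).
rng = np.random.default_rng(7)
n_coils, R = 8, 2

# Complex coil sensitivities at the R pixels that alias together.
S = rng.normal(size=(n_coils, R)) + 1j * rng.normal(size=(n_coils, R))
rho = np.array([1.5 + 0.0j, 0.7 + 0.2j])   # true pixel values

a = S @ rho                                # aliased measurement per coil

# Least-squares solution of the small n_coils x R system.
rho_hat, *_ = np.linalg.lstsq(S, a, rcond=None)
```

Restricting such solves to the ROS, as the paper does, skips background pixels entirely, which is where the reported speedup comes from.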

  10. Reconstruction of binary geological images using analytical edge and object models

    Science.gov (United States)

    Abdollahifard, Mohammad J.; Ahmadi, Sadegh

    2016-04-01

    Reconstruction of fields using partial measurements is of vital importance in different applications in geosciences. Solving such an ill-posed problem requires a well-chosen model. In recent years, training images (TI) have been widely employed as strong prior models for solving these problems. However, in the absence of enough evidence it is difficult to find an adequate TI which is capable of describing the field behavior properly. In this paper a very simple and general model is introduced which is applicable to a fairly wide range of binary images without any modifications. The model is motivated by the fact that nearly all binary images are composed of simple linear edges at the micro-scale. The analytic essence of this model allows us to formulate the template matching problem as a convex optimization problem having efficient and fast solutions. The model has the potential to incorporate the qualitative and quantitative information provided by geologists. The image reconstruction problem is also formulated as an optimization problem and solved using an iterative greedy approach. The proposed method is capable of recovering the unknown image values with accuracies of about 90%, given samples representing as few as 2% of the original image.

  11. Visualization and Analysis-Oriented Reconstruction of Material Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Childs, Henry R.

    2010-03-05

    Reconstructing boundaries along material interfaces from volume fractions is a difficult problem, especially because the under-resolved nature of the input data allows for many correct interpretations. Worse, algorithms widely accepted as appropriate for simulation are inappropriate for visualization. In this paper, we describe a new algorithm that is specifically intended for reconstructing material interfaces for visualization and analysis requirements. The algorithm performs well with respect to memory footprint and execution time, has desirable properties in various accuracy metrics, and also produces smooth surfaces with few artifacts, even when faced with more than two materials per cell.

  12. Noise reduction by sparse representation in learned dictionaries for application to blind tip reconstruction problem

    International Nuclear Information System (INIS)

    Jóźwiak, Grzegorz

    2017-01-01

    Scanning probe microscopy (SPM) is a well known tool used for the investigation of phenomena in objects in the nanometer size range. However, quantitative results are limited by the size and the shape of the nanoprobe used in experiments. Blind tip reconstruction (BTR) is a very popular method used to reconstruct the upper boundary on the shape of the probe. This method is known to be very sensitive to all kinds of interference in the atomic force microscopy (AFM) image. Due to mathematical morphology calculus, the interference makes the BTR results biased rather than randomly disrupted. For this reason, the careful choice of methods used for image enhancement and denoising, as well as the shape of a calibration sample are very important. In the paper, the results of thorough investigations on the shape of a calibration standard are shown. A novel shape is proposed and a tool for the simulation of AFM images of this calibration standard was designed. It was shown that careful choice of the initial tip allows us to use images of hole structures to blindly reconstruct the shape of a probe. The simulator was used to test the impact of modern filtration algorithms on the BTR process. These techniques are based on sparse approximation with function dictionaries learned on the basis of an image itself. Various learning algorithms and parameters were tested to determine the optimal combination for sparse representation. It was observed that the strong reduction of noise does not guarantee strong reduction in reconstruction errors. It seems that further improvements will be possible by the combination of BTR and a noise reduction procedure. (paper)
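
The sparse-approximation step underlying such filtration can be sketched with orthogonal matching pursuit (OMP) over a dictionary. The paper learns the dictionary from the AFM image itself; the fixed random dictionary, signal and sparsity level here are synthetic stand-ins.

```python
import numpy as np

# OMP: greedily select atoms and refit the coefficients on the support.
rng = np.random.default_rng(5)
D = rng.normal(size=(32, 64))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms

coef = np.zeros(64)
coef[[3, 17, 40]] = [1.0, -2.0, 0.5]       # 3-sparse ground truth
y = D @ coef                               # noise-free signal

support, r = [], y.copy()
for _ in range(3):                         # assume sparsity level is known
    support.append(int(np.argmax(np.abs(D.T @ r))))   # best-matching atom
    sub = D[:, support]
    c, *_ = np.linalg.lstsq(sub, y, rcond=None)       # refit on support
    r = y - sub @ c                        # new residual

x_hat = np.zeros(64)
x_hat[support] = c
```

Denoising keeps only the sparse part of each image patch and discards the residual, which is assumed to carry the noise; with a learned dictionary the atoms adapt to the tip and surface structures in the image.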

  13. Accurate reconstruction in digital holographic microscopy using antialiasing shift-invariant contourlet transform

    Science.gov (United States)

    Zhang, Xiaolei; Zhang, Xiangchao; Xu, Min; Zhang, Hao; Jiang, Xiangqian

    2018-03-01

    The measurement of microstructured components is a challenging task in optical engineering. Digital holographic microscopy has attracted intensive attention due to its remarkable capability of measuring complex surfaces. However, speckles arise in the recorded interferometric holograms, and they will degrade the reconstructed wavefronts. Existing speckle removal methods suffer from the problems of frequency aliasing and phase distortions. A reconstruction method based on the antialiasing shift-invariant contourlet transform (ASCT) is developed. Salient edges and corners have sparse representations in the transform domain of ASCT, and speckles can be recognized and removed effectively. As subsampling in the scale and directional filtering schemes is avoided, the problems of frequency aliasing and phase distortions occurring in the conventional multiscale transforms can be effectively overcome, thereby improving the accuracy of wavefront reconstruction. As a result, the proposed method is promising for the digital holographic measurement of complex structures.

  14. Sediment core and glacial environment reconstruction - a method review

    Science.gov (United States)

    Bakke, Jostein; Paasche, Øyvind

    2010-05-01

    Alpine glaciers are often located in remote and high-altitude regions of the world, areas that are only rarely covered by instrumental records. Reconstructions of glaciers have therefore proven useful for understanding past climate dynamics on both shorter and longer time-scales. One major drawback with glacier reconstructions based solely on moraine chronologies (by far the most common) is that, due to selective preservation of moraine ridges, such records do not exclude the possibility of multiple Holocene glacier advances. This problem holds regardless of whether cosmogenic isotopes or lichenometry have been used to date the moraines, or radiocarbon dating of mega-fossils buried in till or underneath the moraines themselves. To overcome this problem Karlén (1976) initially suggested that glacial erosion and the associated production of rock-flour deposited in downstream lakes could provide a continuous record of glacial fluctuations, hence overcoming the problem of incomplete reconstructions. We want to discuss the methods used to reconstruct past glacier activity based on sediments deposited in distal glacier-fed lakes. By quantifying physical properties of glacial and extra-glacial sediments deposited in catchments, and in downstream lakes and fjords, it is possible to isolate and identify past glacier activity - size and production rate - that subsequently can be used to reconstruct changing environmental shifts and trends. Changes in average sediment evacuation from alpine glaciers are mainly governed by glacier size and the mass turnover gradient, determining the deformation rate at any given time. The amount of solid precipitation (mainly winter accumulation) versus loss due to melting during the ablation season (mainly summer temperature) determines the mass turnover gradient in either the positive or negative direction. A prevailing positive net balance will lead to higher sedimentation rates and vice versa, which in turn can be recorded in downstream

  15. Direct delayed breast reconstruction with TAP flap, implant and acellular dermal matrix (TAPIA)

    DEFF Research Database (Denmark)

    Børsen-Koch, Mikkel; Gunnarsson, Gudjon L; Udesen, Ann

    2015-01-01

    BACKGROUND: The latissimus dorsi (LD) flap is considered one of the workhorses within the field of breast reconstruction, and it offers several advantages. However, donor-site morbidity may pose a problem. This article describes a new and modified technique for delayed breast reconstruction...... there is a learning curve, this simple modified technique does not demand any perforator or other vessel dissection. Any trained plastic surgeon should be able to adopt the technique into the growing armamentarium of breast reconstruction possibilities....

  16. An iterative hyperelastic parameters reconstruction for breast cancer assessment

    Science.gov (United States)

    Mehrabian, Hatef; Samani, Abbas

    2008-03-01

    In breast elastography, breast tissues usually undergo large compressions, resulting in significant geometric and structural changes and consequently nonlinear mechanical behavior. In this study, an elastography technique is presented in which parameters characterizing tissue nonlinear behavior are reconstructed. Such parameters can be used for tumor tissue classification. To model the nonlinear behavior, tissues are treated as hyperelastic materials. The proposed technique uses a constrained iterative inversion method to reconstruct the tissue hyperelastic parameters. The reconstruction technique uses a nonlinear finite element (FE) model for solving the forward problem. In this research, we applied Yeoh and polynomial models to model the tissue hyperelasticity. To mimic the breast geometry, we used a computational phantom, which comprises a hemisphere connected to a cylinder. This phantom consists of two types of soft tissue, mimicking adipose and fibroglandular tissues, and a tumor. Simulation results show the feasibility of the proposed method in reconstructing the hyperelastic parameters of the tumor tissue.

  17. Failed medial patellofemoral ligament reconstruction: Causes and surgical strategies

    Science.gov (United States)

    Sanchis-Alfonso, Vicente; Montesinos-Berry, Erik; Ramirez-Fuentes, Cristina; Leal-Blanquet, Joan; Gelber, Pablo E; Monllau, Joan Carles

    2017-01-01

    Patellar instability is a common clinical problem encountered by orthopedic surgeons specializing in the knee. For patients with chronic lateral patellar instability, the standard surgical approach is to stabilize the patella through a medial patellofemoral ligament (MPFL) reconstruction. Foreseeably, an increasing number of revision surgeries of the reconstructed MPFL will be seen in upcoming years. In this paper, the causes of failed MPFL reconstruction are analyzed: (1) incorrect surgical indication or inappropriate surgical technique/patient selection; (2) a technical error; and (3) an incorrect assessment of the concomitant risk factors for instability. An understanding of the anatomy and biomechanics of the MPFL and cautiousness with the imaging techniques while favoring clinical over radiological findings and the use of common sense to determine the adequate surgical technique for each particular case, are critical to minimizing MPFL surgery failure. Additionally, our approach to dealing with failure after primary MPFL reconstruction is also presented. PMID:28251062

  18. Reconstruction of Low Pressure Gas Supply System

    Directory of Open Access Journals (Sweden)

    S. N. Osipov

    2013-01-01

    Full Text Available The current reconstruction of residential areas in large cities, especially those with developed heat-supply systems fed from thermal power stations, together with the reduction of heat consumption for heating due to the higher thermal resistance of building enclosing structures, requires new technical solutions to gas-supply problems. When reconstructing the gas-supply system of modernized or new buildings in the operating zone of one gas-distribution plant, it is necessary to change the hot water-supply systems from gas direct-flow water heaters to centralized heat supply, and the freed gas volumes are to be used for other needs or for the gas supply of new buildings through the current external gas-distribution network. The selection of additional gas-line sections and points of the gas-supply systems of new and reconstructed buildings for their connection to the current gas-supply distribution system is to be executed in accordance with the presented methodology.

  19. Manhattan-World Urban Reconstruction from Point Clouds

    KAUST Repository

    Li, Minglei; Wonka, Peter; Nan, Liangliang

    2016-01-01

    Manhattan-world urban scenes are common in the real world. We propose a fully automatic approach for reconstructing such scenes from 3D point samples. Our key idea is to represent the geometry of the buildings in the scene using a set of well-aligned boxes. We first extract plane hypotheses from the points, followed by an iterative refinement step. Then, candidate boxes are obtained by partitioning the space of the point cloud into a non-uniform grid. After that, we choose an optimal subset of the candidate boxes to approximate the geometry of the buildings. The contribution of our work is that we transform scene reconstruction into a labeling problem that is solved based on a novel Markov Random Field formulation. Unlike previous methods designed for particular types of input point clouds, our method can obtain faithful reconstructions from a variety of data sources. Experiments demonstrate that our method is superior to state-of-the-art methods. © Springer International Publishing AG 2016.

  1. Inverse problems in classical and quantum physics

    International Nuclear Information System (INIS)

    Almasy, A.A.

    2007-01-01

    The subject of this thesis is in the area of Applied Mathematics known as Inverse Problems. Inverse problems are those where a set of measured data is analysed in order to get as much information as possible on a model which is assumed to represent a system in the real world. We study two inverse problems in the fields of classical and quantum physics: QCD condensates from tau-decay data and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continues to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract within rather general assumptions phenomenological parameters of QCD (the condensates) from a comparison of the time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with the help of methods based on QCD sum rules. EIT is a technology developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches of EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations. This method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation which uses more then one set of measurements. 

  2. Inverse problems in classical and quantum physics

    Energy Technology Data Exchange (ETDEWEB)

    Almasy, A.A.

    2007-06-29

    The subject of this thesis is in the area of Applied Mathematics known as Inverse Problems. Inverse problems are those where a set of measured data is analysed in order to get as much information as possible on a model which is assumed to represent a system in the real world. We study two inverse problems in the fields of classical and quantum physics: QCD condensates from tau-decay data and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continue to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract, within rather general assumptions, phenomenological parameters of QCD (the condensates) from a comparison of the time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with the help of methods based on QCD sum rules. EIT is a technology developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches to EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations. This method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation which uses more than one set of measurements.

  3. 3D reconstruction from X-ray fluoroscopy for clinical veterinary medicine using differential volume rendering

    International Nuclear Information System (INIS)

    Khongsomboon, K.; Hamamoto, Kazuhiko; Kondo, Shozo

    2007-01-01

    3D reconstruction from ordinary X-ray equipment, rather than CT or MRI, is required in clinical veterinary medicine. The authors have already proposed a 3D reconstruction technique based on X-ray photographs to present bone structure. Although that reconstruction is useful for veterinary medicine, the technique has two problems: one concerns X-ray exposure and the other the data acquisition process. An item of X-ray equipment that is not specialized but can solve both problems is X-ray fluoroscopy. Therefore, in this paper, we propose a method for 3D reconstruction from X-ray fluoroscopy for clinical veterinary medicine. Fluoroscopy is usually used to observe the movement of an organ, or to identify its position for surgery, using weak X-ray intensity. Since fluoroscopy can output the observed result as a movie, the previous two problems, which are caused by the use of X-ray photographs, can be solved. However, a new problem arises from the weak X-ray intensity: although fluoroscopy can present information on soft tissues as well as bone structure, the contrast is very low and it is very difficult to recognize some soft tissues. It would be very useful in clinical veterinary medicine to be able to observe not only bone structure but also soft tissues clearly with ordinary X-ray equipment. To solve this problem, this paper proposes a new method to determine opacity in the volume rendering process. The opacity is determined according to the 3D differential coefficient of the 3D reconstruction. This differential volume rendering can present a 3D structure image of multiple organs volumetrically and clearly for clinical veterinary medicine. This paper shows results of a simulation, an experimental investigation of a small dog, and an evaluation by veterinarians. (author)

  4. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    Science.gov (United States)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decreasing gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations to the radiative transfer equation are not optimal for this problem. To address this gastric-cancer-specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. Radiosity theory, integrated with the diffusion equation to form the hybrid light transport model, is utilized to describe light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem which incorporates an l1 norm based regularization term to reflect the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital mouse based simulation, with a reconstruction error of less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The preliminary result shows the ability of the novel BLT reconstruction algorithm in early gastric cancer detection.
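
The l1-regularized minimization at the heart of such sparse-source reconstructions can be illustrated with a generic iterative soft-thresholding (ISTA) loop. This is a hedged toy sketch, not the paper's finite-element formulation: the system matrix A, the data y, the regularization weight lam and all sizes are invented for illustration.

```python
import numpy as np

# Toy l1-regularized least squares: min_x 0.5*||A x - y||^2 + lam*||x||_1,
# solved by ISTA (gradient step + soft-thresholding). The sparse x_true
# stands in for a sparse bioluminescent source distribution.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))          # illustrative system matrix
x_true = np.zeros(100)
x_true[[5, 30, 77]] = [1.0, -2.0, 1.5]      # 3-sparse "source"
y = A @ x_true                              # noiseless data

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = largest squared singular value
x = np.zeros(100)
for _ in range(1000):
    z = x - step * (A.T @ (A @ x - y))      # gradient step on the smooth term
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox of lam*||.||_1

support = np.flatnonzero(np.abs(x) > 0.5)   # recovered source locations
```

The soft-threshold is exactly the proximal operator of the l1 term, which is what makes this simple alternation minimize the composite objective.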

  5. Accuracy improvement of CT reconstruction using tree-structured filter bank

    International Nuclear Information System (INIS)

    Ueda, Kazuhiro; Morimoto, Hiroaki; Morikawa, Yoshitaka; Murakami, Junichi

    2009-01-01

    An accuracy improvement of the 'CT reconstruction algorithm using a TSFB (Tree-Structured Filter Bank)', a high-speed CT reconstruction algorithm, is proposed. The TSFB method greatly reduces the amount of computation compared with the CB (Convolution Backprojection) method, but artifacts occur in the reconstructed image because signals outside the reconstruction domain are ignored during stage processing. In addition, the whole-band filter, a component of the two-dimensional synthesis filter, is an IIR filter, so artifacts also appear at the edge of the reconstructed image. To suppress these artifacts, the proposed method enlarges the processing range of the TSFB method into the outer domain by controlling the width of the sample lines and by adding lines outside the reconstruction domain. Furthermore, to avoid increasing the amount of computation, the algorithm determines the required processing range from the number of TSFB processing stages and the slope of the filter, and then updates the position and width of the sample lines to cover only that range. Simulations show that the proposed method achieves high-speed and highly accurate CT reconstruction in this way: the quality of its reconstructed image is improved over the TSFB method and matches that of the CB method. (T. Tanaka)

  6. Quantitative tomography simulations and reconstruction algorithms

    International Nuclear Information System (INIS)

    Martz, H.E.; Aufderheide, M.B.; Goodman, D.; Schach von Wittenau, A.; Logan, C.; Hall, J.; Jackson, J.; Slone, D.

    2000-01-01

    X-ray, neutron and proton transmission radiography and computed tomography (CT) are important diagnostic tools that are at the heart of LLNL's effort to meet the goals of the DOE's Advanced Radiography Campaign. This campaign seeks to improve radiographic simulation and analysis so that radiography can be a useful quantitative diagnostic tool for stockpile stewardship. Current radiographic accuracy does not allow satisfactory separation of experimental effects from the true features of an object's tomographically reconstructed image. This can lead to difficult and sometimes incorrect interpretation of the results. By improving our ability to simulate the whole radiographic and CT system, it will be possible to examine the contribution of system components to various experimental effects, with the goal of removing or reducing them. In this project, we are merging this simulation capability with a maximum-likelihood (constrained-conjugate-gradient, CCG) reconstruction technique, yielding a physics-based, forward-model image-reconstruction code. In addition, we seek to improve the accuracy of computed tomography from transmission radiographs by studying what physics is needed in the forward model. During FY 2000, an improved version of the LLNL ray-tracing code called HADES has been coupled with a recently developed LLNL CT algorithm known as CCG. The problem of image reconstruction is expressed as a large matrix equation relating a model for the object being reconstructed to its projections (radiographs). Using a constrained-conjugate-gradient search algorithm, a maximum likelihood solution is sought. This search continues until the difference between the input measured radiographs or projections and the simulated or calculated projections is satisfactorily small.
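
The search described above, iterating until simulated projections match measured ones, can be sketched with a plain conjugate-gradient solve of the matrix equation in the least-squares sense. This is a hedged stand-in for CCG on a random toy system: the matrix A, the nonnegativity clip, and all sizes are invented, not LLNL's HADES/CCG implementation.

```python
import numpy as np

# Solve A x ~= b (model -> projections) by CG on the normal equations
# A^T A x = A^T b, iterating until the projection mismatch is small.
rng = np.random.default_rng(1)
A = rng.random((60, 30))            # illustrative projection matrix
x_true = rng.random(30)             # nonnegative "object"
b = A @ x_true                      # measured projections

M = A.T @ A
x = np.zeros(30)
r = A.T @ b - M @ x                 # residual of the normal equations
p = r.copy()
for _ in range(200):
    rr = r @ r
    if rr < 1e-24:                  # converged: avoid division by ~0
        break
    Mp = M @ p
    alpha = rr / (p @ Mp)
    x = x + alpha * p
    r = r - alpha * Mp
    p = r + ((r @ r) / rr) * p      # conjugate search direction

x = np.clip(x, 0.0, None)           # crude stand-in for the "constrained" part
discrepancy = np.linalg.norm(A @ x - b)   # measured vs. simulated projections
```

In exact arithmetic CG finishes in at most 30 steps here; the loop cap and the early break just make the sketch robust in floating point.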

  7. Multifractal signal reconstruction based on singularity power spectrum

    International Nuclear Information System (INIS)

    Xiong, Gang; Yu, Wenxian; Xia, Wenxiang; Zhang, Shuning

    2016-01-01

    Highlights: • We propose a novel multifractal reconstruction method based on singularity power spectrum analysis (MFR-SPS). • The proposed MFR-SPS method has better power characteristics than the algorithm in Fraclab. • Further, the SPS-ISE algorithm performs better than the SPS-MFS algorithm. • Based on the proposed MFR-SPS method, we can reconstruct singularity white fractal noise (SWFN) and linear singularity modulation (LSM) multifractal signals, in an equivalent sense, similar to the linear frequency modulation (LFM) signal and WGN in the Fourier domain. - Abstract: Fractal reconstruction (FR) and multifractal reconstruction (MFR) can be considered as the inverse problem of singularity spectrum analysis, and it is challenging to reconstruct a fractal signal in accord with a multifractal spectrum (MFS). Due to the multiple solutions of fractal reconstruction, the traditional methods of FR/MFR, such as the fBm-based method, the wavelet-based method and random wavelet series, fail to reconstruct the fractal signal deterministically; besides, those methods neglect the power spectral distribution in the singular domain. In this paper, we propose a novel MFR method based on the singularity power spectrum (SPS). Supposing a consistent uniform covering of the multifractal measurement, we control the traditional power law of each scale of the wavelet coefficients based on the instantaneous singularity exponents (ISE) or MFS, simultaneously control the singularity power law based on the SPS, and deduce the principle and algorithm of MFR based on SPS. Reconstruction simulation and error analysis of the estimated ISE, MFS and SPS show the effectiveness and the improvement of the proposed methods compared to those obtained with the Fraclab package.

  8. Reconstruction and restoration of historical buildings of transport infrastructure

    Science.gov (United States)

    Kareeva, Daria; Glazkova, Valeriya

    2017-10-01

    The aim of this article is to identify the main problems in the restoration of historical objects. To this end, it is rational to collect and analyze the existing world experience of restoration. The information collected shows that some problems are common and can be solved. In addition, the committees for the protection of Monuments of Culture and Architecture always complicate the restoration and reconstruction of historical buildings. The examples of Germany, Italy and Russia show that there are problems in organization, economy, planning and control. Engineers should devise and justify a methodology for organizing and monitoring the restoration of historical buildings. As a second solution, time and financial costs can be minimized through a favorable financial and legal background for investors and through the creation of a system for organizing restoration work. And, to speed up restoration, simulation programs should be optimized for the research and selection of technological and economic reconstruction methods.

  9. Blind spectrum reconstruction algorithm with L0-sparse representation

    International Nuclear Information System (INIS)

    Liu, Hai; Zhang, Zhaoli; Liu, Sanyan; Shu, Jiangbo; Liu, Tingting; Zhang, Tianxu

    2015-01-01

    Raman spectrum often suffers from band overlap and Poisson noise. This paper presents a new blind Poissonian Raman spectrum reconstruction method, which incorporates the L0-sparse prior together with the total variation constraint into the maximum a posteriori framework. Furthermore, the greedy analysis pursuit algorithm is adopted to solve the L0-based minimization problem. Simulated and real spectrum experimental results show that the proposed method can effectively preserve spectral structure and suppress noise. The reconstructed Raman spectra are easily used for interpreting unknown chemical mixtures. (paper)
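
The greedy-pursuit strategy mentioned above can be illustrated with the simplest member of that family, orthogonal matching pursuit, recovering two sharp "bands" from their blurred observation. This is a hedged toy sketch: the Gaussian dictionary, sizes and known sparsity are invented, and the paper's greedy analysis pursuit works on an analysis model rather than this synthesis model.

```python
import numpy as np

# Orthogonal matching pursuit on a dictionary of Gaussian-blurred spikes:
# greedily pick the atom most correlated with the residual, then refit.
rng = np.random.default_rng(2)
n = 80
kernel = np.exp(-0.5 * (np.arange(-8, 9) / 2.5) ** 2)   # blur ("band shape")
I = np.eye(n)
D = np.stack([np.convolve(I[i], kernel, mode="same")     # column i = blurred
              for i in range(n)], axis=1)                # spike at position i

x_true = np.zeros(n)
x_true[[20, 45]] = [3.0, 2.0]          # two sparse band locations/amplitudes
y = D @ x_true                         # observed blurred spectrum

support = []
residual = y.copy()
for _ in range(2):                     # assume sparsity k = 2 is known
    atom = int(np.argmax(np.abs(D.T @ residual)))        # best-matching atom
    support.append(atom)
    coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # refit on support
    residual = y - D[:, support] @ coef
```

Because the two bands here are far apart relative to the blur width, the greedy picks are exact and the final residual is essentially zero.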

  10. Genital reconstruction in exstrophy patients

    Directory of Open Access Journals (Sweden)

    R B Nerli

    2012-01-01

    Full Text Available Introduction: Surgery for bladder exstrophy has been evolving over the last four to five decades. Because survival has become almost universal, the focus in the exstrophy-epispadias complex has changed to improving quality of life. The most prevalent problem in the long-term function of exstrophy patients is the sexual activity of adolescent and adult males. The penis in exstrophy patients appears short because of a marked congenital deficiency of anterior corporal tissue. Many patients present for genital reconstruction to improve cosmesis as well as to correct chordee. We report our series of male patients seeking genital reconstruction following exstrophy repair in the past. Materials and Methods: Fourteen adolescent/adult male patients attended urology services during the period January 2000-December 2009 seeking genital reconstruction following exstrophy repair in the past. Results: Three patients underwent epispadias repair, four patients had chordee correction with cosmetic excision of skin tags and seven patients underwent chordee correction with penile lengthening. All patients reported satisfaction in the answered questionnaire. Patients undergoing penile lengthening by partial corporal dissection achieved a mean increase in length of 1.614 ± 0.279 cm dorsally and 1.543 ± 0.230 cm ventrally. The satisfaction rate assessed by the Short Form-36 (SF-36) showed that, irrespective of the different genital reconstructive procedures done, the patients were satisfied with the cosmetic and functional outcome. Conclusions: Surgical procedures have transformed the management of these patients with bladder exstrophy. Bladders can be safely placed within the pelvis, with most patients achieving urinary continence and cosmetically acceptable external genitalia. Genital reconstruction in the form of correction of chordee, excision of ugly skin tags and lengthening of the penis can be performed to give the patients a satisfactory cosmetic and functional outcome.

  11. Self-narrative reconstruction in emotion-focused therapy: A preliminary task analysis.

    Science.gov (United States)

    Cunha, Carla; Mendes, Inês; Ribeiro, António P; Angus, Lynne; Greenberg, Leslie S; Gonçalves, Miguel M

    2017-11-01

    This research explored the consolidation phase of emotion-focused therapy (EFT) for depression and studies-through a task-analysis method-how client-therapist dyads evolved from the exploration of the problem to self-narrative reconstruction. Innovative moments (IMs) were used to situate the process of self-narrative reconstruction within sessions, particularly through reconceptualization and performing change IMs. We contrasted the observation of these occurrences with a rational model of self-narrative reconstruction, previously built. This study presents the rational model and the revised rational-empirical model of the self-narrative reconstruction task in three EFT dyads, suggesting nine steps necessary for task resolution: (1) Explicit recognition of differences in the present and steps in the path of change; (2) Development of a meta-perspective contrast between present self and past self; (3) Amplification of contrast in the self; (4) A positive appreciation of changes is conveyed; (5) Occurrence of feelings of empowerment, competence, and mastery; (6) Reference to difficulties still present; (7) Emphasis on the loss of centrality of the problem; (8) Perception of change as a gradual, developing process; and (9) Reference to projects, experiences of change, or elaboration of new plans. Central aspects of therapist activity in facilitating the client's progression along these nine steps are also elaborated.

  12. Revision allograft reconstruction of the lateral collateral ligament complex in elbows with previous failed reconstruction and persistent posterolateral rotatory instability.

    Science.gov (United States)

    Baghdadi, Yaser M K; Morrey, Bernard F; O'Driscoll, Shawn W; Steinmann, Scott P; Sanchez-Sotelo, Joaquin

    2014-07-01

    six elbows were rated with a good or excellent result. All patients with persistent instability had some degree of preoperative bone loss. Revision allograft reconstruction of the LCLC is an option for treating recurrent PLRI, although this is a complex and resistant problem, and nearly half of the patients in this cohort either had persistent instability and/or a fair or poor elbow score. Level IV, therapeutic study. See Instructions for Authors for a complete description of levels of evidence.

  13. Well-posedness of the conductivity reconstruction from an interior current density in terms of Schauder theory

    KAUST Repository

    Kim, Yong-Jung

    2015-06-23

    We show the well-posedness of the conductivity image reconstruction problem with a single set of interior electrical current data and boundary conductivity data. Isotropic conductivity is considered in two space dimensions. Uniqueness for similar conductivity reconstruction problems has been known for several cases. However, the existence and the stability are obtained in this paper for the first time. The main tool of the proof is the method of characteristics of a related curl equation.

  14. Well-posedness of the conductivity reconstruction from an interior current density in terms of Schauder theory

    KAUST Repository

    Kim, Yong-Jung; Lee, Min-Gi

    2015-01-01

    We show the well-posedness of the conductivity image reconstruction problem with a single set of interior electrical current data and boundary conductivity data. Isotropic conductivity is considered in two space dimensions. Uniqueness for similar conductivity reconstruction problems has been known for several cases. However, the existence and the stability are obtained in this paper for the first time. The main tool of the proof is the method of characteristics of a related curl equation.

  15. Existence and uniqueness in anisotropic conductivity reconstruction with Faraday's law

    KAUST Repository

    Lee, Min-Gi

    2015-03-18

    We show that three sets of internal current densities are the right amount of data that give the existence and the uniqueness at the same time in reconstructing an anisotropic conductivity in two space dimensions. The curl free equation of Faraday's law is taken instead of the usual divergence free equation of the electrical impedance tomography. Boundary conditions related to given current densities are introduced which complete a well determined problem for conductivity reconstruction together with Faraday's law.

  16. Linearized image reconstruction method for ultrasound modulated electrical impedance tomography based on power density distribution

    International Nuclear Information System (INIS)

    Song, Xizi; Xu, Yanbin; Dong, Feng

    2017-01-01

    Electrical resistance tomography (ERT) is a promising measurement technique with important industrial and clinical applications. However, with limited effective measurements, it suffers from poor spatial resolution due to the ill-posedness of the inverse problem. Recently, there has been increasing research interest in hybrid imaging techniques, which utilize couplings of physical modalities, because these techniques obtain much more effective measurement information and promise high resolution. Ultrasound modulated electrical impedance tomography (UMEIT) is one of the newly developed hybrid imaging techniques, combining electric and acoustic modalities. A linearized image reconstruction method based on power density is proposed for UMEIT. The interior data, the power density distribution, is adopted to reconstruct the conductivity distribution with the proposed image reconstruction method. At the same time, by relating the power density change to the change in conductivity, the Jacobian matrix is employed to convert the nonlinear problem into a linear one. The analytic formulation of this Jacobian matrix is derived and its effectiveness verified. In addition, different excitation patterns are tested and analyzed, and opposite excitation provides the best performance with the proposed method. Also, multiple power density distributions are combined to implement image reconstruction. Finally, image reconstruction is implemented with the linear back-projection (LBP) algorithm. Compared with ERT, with the proposed image reconstruction method, UMEIT can produce reconstructed images with higher quality and better quantitative evaluation results. (paper)
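
The back-projection step named above can be sketched generically: a perturbation in the linearized data is mapped back to image space through the transpose of the Jacobian. The sensitivity matrix J below is a random stand-in, not UMEIT's power-density Jacobian, and a practical LBP would additionally normalize by the summed sensitivity of each pixel.

```python
import numpy as np

# One-step linearized reconstruction: delta_sigma_hat ~ J^T delta_y,
# where J maps conductivity changes to (linearized) measurement changes.
rng = np.random.default_rng(3)
n_meas, n_pix = 64, 64
J = rng.standard_normal((n_meas, n_pix))   # illustrative sensitivity matrix

delta_sigma = np.zeros(n_pix)
delta_sigma[10] = 1.0                      # small perturbation at pixel 10
delta_y = J @ delta_sigma                  # linearized measurement change

back_projection = J.T @ delta_y            # unnormalized LBP estimate (blurred)
peak = int(np.argmax(np.abs(back_projection)))
```

The estimate is a smeared version of the true perturbation, but its peak still localizes the change, which is the essence of back-projection imaging.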

  17. Homotopic non-local regularized reconstruction from sparse positron emission tomography measurements

    International Nuclear Information System (INIS)

    Wong, Alexander; Liu, Chenyi; Wang, Xiao Yu; Fieguth, Paul; Bie, Hongxia

    2015-01-01

    Positron emission tomography scanners collect measurements of a patient's in vivo radiotracer distribution. The system detects pairs of gamma rays emitted indirectly by a positron-emitting radionuclide (tracer), which is introduced into the body on a biologically active molecule, and the tomograms must be reconstructed from projections. The reconstruction of tomograms from the acquired PET data is an inverse problem that requires regularization. The use of tightly packed discrete detector rings, although it improves the signal-to-noise ratio, is often associated with the high cost of positron emission tomography systems. Thus a sparse reconstruction, capable of overcoming the noise effect while allowing for a reduced number of detectors, would have a great deal to offer. In this study, we introduce and investigate the potential of a homotopic non-local regularization reconstruction framework for effectively reconstructing positron emission tomograms from such sparse measurements. Results obtained using the proposed approach are compared with traditional filtered back-projection as well as expectation maximization reconstruction with total variation regularization. A new reconstruction method was developed for the purpose of improving the quality of positron emission tomography reconstruction from sparse measurements. We illustrate that promising reconstruction performance can be achieved with the proposed approach even at low sampling fractions, which allows for the use of significantly fewer detectors and has the potential to reduce scanner costs.
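
The expectation-maximization baseline the study compares against can be sketched with the classic MLEM multiplicative update for emission tomography. This is a hedged toy version on a random positive system matrix with noise-free counts; the sizes, data and iteration count are invented and no TV term is included.

```python
import numpy as np

# MLEM update for Poisson emission data: x <- x * (A^T (y / A x)) / (A^T 1).
# The update preserves positivity and increases the Poisson likelihood.
rng = np.random.default_rng(4)
A = rng.random((50, 20)) + 0.05        # strictly positive system matrix
x_true = rng.random(20) + 0.1          # positive activity image
y = A @ x_true                         # expected counts (noise-free here)

x = np.ones(20)                        # positive initial image
sens = A.T @ np.ones(50)               # sensitivity image A^T 1
for _ in range(1000):
    x = x * (A.T @ (y / (A @ x))) / sens

rel_err = np.linalg.norm(A @ x - y) / np.linalg.norm(y)
```

With consistent noiseless data the forward-projected estimate converges toward the measurements; with real Poisson data one would stop early or regularize instead.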

  18. Duality reconstruction algorithm for use in electrical impedance tomography

    International Nuclear Information System (INIS)

    Abdullah, M.Z.; Dickin, F.J.

    1996-01-01

    A duality reconstruction algorithm for solving the inverse problem in electrical impedance tomography (EIT) is described. In this method, an algorithm based on the Geselowitz compensation (GC) theorem is used first to reconstruct an approximate version of the image. This is then fed as a first guess to the modified Newton-Raphson (MNR) algorithm, which iteratively corrects the image until a final acceptable solution is reached. The implementation of the GC- and MNR-based algorithms using the finite element method will be discussed. Reconstructed images produced by the algorithm will also be presented. Consideration is also given to the most computationally intensive aspect of the algorithm, namely the inversion of the large and sparse matrices. The methods taken to approximately compute the inverses of those matrices will be outlined. (author)
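
The two-stage structure described above (a cheap one-shot estimate seeding an iterative Newton-type correction) can be sketched on a toy nonlinear forward model. Everything below is invented for illustration: the small quadratic term stands in for EIT's nonlinearity, and a plain back-projection stands in for the GC first guess; it is not the paper's GC/MNR implementation.

```python
import numpy as np

# Stage 1: crude linear estimate. Stage 2: Gauss-Newton (a modified
# Newton-Raphson for least squares) iteratively corrects it.
rng = np.random.default_rng(5)
A = rng.standard_normal((40, 15))
x_true = rng.standard_normal(15)

def forward(x):
    z = A @ x
    return z + 0.01 * z ** 2            # mildly nonlinear toy forward model

def jacobian(x):
    z = A @ x
    return (1.0 + 0.02 * z)[:, None] * A  # d forward / dx

y = forward(x_true)

x = A.T @ y / np.trace(A.T @ A)          # stage 1: crude back-projection guess
for _ in range(25):                      # stage 2: Newton-type correction
    dx, *_ = np.linalg.lstsq(jacobian(x), y - forward(x), rcond=None)
    x = x + dx

err = np.linalg.norm(x - x_true)
```

The point of the first stage is exactly what the abstract describes: starting the Newton iteration from a reasonable image rather than from scratch.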

  19. REGULARIZED D-BAR METHOD FOR THE INVERSE CONDUCTIVITY PROBLEM

    DEFF Research Database (Denmark)

    Knudsen, Kim; Lassas, Matti; Mueller, Jennifer

    2009-01-01

    A strategy for regularizing the inversion procedure for the two-dimensional D-bar reconstruction algorithm based on the global uniqueness proof of Nachman [Ann. Math. 143 (1996)] for the ill-posed inverse conductivity problem is presented. The strategy utilizes truncation of the boundary integral...... the convergence of the reconstructed conductivity to the true conductivity as the noise level tends to zero. The results provide a link between two traditions of inverse problems research: theory of regularization and inversion methods based on complex geometrical optics. Also, the procedure is a novel...

  20. Scenery reconstruction in two dimensions with many colors

    NARCIS (Netherlands)

    Löwe, M.; Matzinger, H.

    2002-01-01

    Kesten has observed that the known reconstruction methods of random sceneries seem to strongly depend on the one-dimensional setting of the problem and asked whether a construction still is possible in two dimensions. In this paper we answer this question in the affirmative under the condition that

  1. Three-dimension reconstruction based on spatial light modulator

    International Nuclear Information System (INIS)

    Deng Xuejiao; Zhang Nanyang; Zeng Yanan; Yin Shiliang; Wang Weiyu

    2011-01-01

    Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace, biology and so on. Via such technology we can obtain a three-dimensional digital point cloud from a two-dimensional image, and then simulate the three-dimensional structure of the physical object for further study. At present, the acquisition of three-dimensional digital point cloud data is mainly based on adaptive optics systems with a Shack-Hartmann sensor and on phase-shifting digital holography. For surface fitting, there are also many available methods, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems we encountered in three-dimensional reconstruction are the extraction of feature points and the arithmetic of curve fitting. To solve these problems we first calculate the surface normal vector of each pixel in the light source coordinate system; these vectors are then converted to image coordinates through a coordinate transformation, which yields the expected 3D point cloud. Secondly, after de-noising and repair, feature points are selected and fitted by means of Zernike polynomials to obtain a fitting function of the surface topography, so as to reconstruct the object's three-dimensional topography. In this paper, a new kind of three-dimensional reconstruction algorithm is proposed, with whose assistance the topography can be estimated from the grayscale at different sample points. Moreover, simulation and experimental results prove that the new algorithm has a strong fitting capability, especially for large-scale objects.

  2. Three-dimension reconstruction based on spatial light modulator

    Science.gov (United States)

    Deng, Xuejiao; Zhang, Nanyang; Zeng, Yanan; Yin, Shiliang; Wang, Weiyu

    2011-02-01

    Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace, biology and so on. Via such technology we can obtain a three-dimensional digital point cloud from a two-dimensional image, and then simulate the three-dimensional structure of the physical object for further study. At present, the acquisition of three-dimensional digital point cloud data is mainly based on adaptive optics systems with a Shack-Hartmann sensor and on phase-shifting digital holography. For surface fitting, there are also many available methods, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems we encountered in three-dimensional reconstruction are the extraction of feature points and the arithmetic of curve fitting. To solve these problems we first calculate the surface normal vector of each pixel in the light source coordinate system; these vectors are then converted to image coordinates through a coordinate transformation, which yields the expected 3D point cloud. Secondly, after de-noising and repair, feature points are selected and fitted by means of Zernike polynomials to obtain a fitting function of the surface topography, so as to reconstruct the object's three-dimensional topography. In this paper, a new kind of three-dimensional reconstruction algorithm is proposed, with whose assistance the topography can be estimated from the grayscale at different sample points. Moreover, simulation and experimental results prove that the new algorithm has a strong fitting capability, especially for large-scale objects.

  3. A Superresolution Image Reconstruction Algorithm Based on Landweber in Electrical Capacitance Tomography

    Directory of Open Access Journals (Sweden)

    Chen Deyun

    2013-01-01

    Full Text Available Because image reconstruction accuracy in electrical capacitance tomography is affected by the “soft field” nature of the sensing field and by the ill-conditioning of the inverse problem, a superresolution image reconstruction algorithm based on Landweber is proposed in this paper, starting from the working principle of the electrical capacitance tomography system. The method derives a regularized solution and obtains a closed-form solution by fast Fourier transform of the convolution kernel, which ensures the uniqueness of the solution and improves the stability and quality of the image reconstruction results. Simulation results show that the imaging precision and real-time performance of the algorithm are better than those of the Landweber algorithm, so this algorithm provides a new method for electrical capacitance tomography image reconstruction.
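
The Landweber baseline that the proposed scheme builds on is simple enough to sketch directly: a fixed-point iteration x ← x + τ·Sᵀ(c − Sx). The sensitivity matrix S and capacitance vector c below are random stand-ins, not an actual ECT sensor model.

```python
import numpy as np

# Plain Landweber iteration for S x ~= c; with a step size below
# 2/||S||^2 it converges to a least-squares solution.
rng = np.random.default_rng(6)
S = rng.standard_normal((30, 50))       # illustrative sensitivity matrix
x_true = rng.random(50)                 # "permittivity" image
c = S @ x_true                          # "capacitance" measurements

tau = 1.0 / np.linalg.norm(S, 2) ** 2   # safe step size (1/||S||^2)
x = np.zeros(50)
for _ in range(5000):
    x = x + tau * (S.T @ (c - S @ x))   # gradient step on 0.5*||S x - c||^2

residual = np.linalg.norm(S @ x - c)
```

Landweber's slow, many-iteration convergence is visible here and is one reason accelerated or regularized variants, like the one this abstract proposes, are of interest.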

  4. Algorithms For Phylogeny Reconstruction In a New Mathematical Model

    NARCIS (Netherlands)

    Lenzini, Gabriele; Marianelli, Silvia

    1997-01-01

    The evolutionary history of a set of species is represented by a tree called phylogenetic tree or phylogeny. Its structure depends on precise biological assumptions about the evolution of species. Problems related to phylogeny reconstruction (i.e., finding a tree representation of information

  5. A New Method for Coronal Magnetic Field Reconstruction

    Science.gov (United States)

    Yi, Sibaek; Choe, Gwang-Son; Cho, Kyung-Suk; Kim, Kap-Sung

    2017-08-01

    A precise method of coronal magnetic field reconstruction (extrapolation) is an indispensable tool for understanding various solar activities. A variety of reconstruction codes have been developed and are available to researchers, but each has its shortcomings. In this paper, a new efficient method for coronal magnetic field reconstruction is presented. The method imposes only the normal components of the magnetic field and current density at the bottom boundary, to avoid overspecification of the reconstruction problem, and employs vector potentials to guarantee divergence-freeness. In our method, the normal component of the current density is imposed not by adjusting the tangential components of A but by adjusting its normal component. This allows us to avoid a numerical instability that occasionally arises in codes using A. In real reconstruction problems, information for the lateral and top boundaries is absent. The arbitrariness of the boundary conditions imposed there, as well as various preprocessing steps, brings about a diversity of resulting solutions. We impose a source-surface condition at the top boundary to accommodate the flux imbalance that always shows up in magnetograms. To enhance the convergence rate, we equip our code with a gradient-method-type accelerator. Our code is tested on two analytical force-free solutions. When the solution is given only at the bottom boundary, our result surpasses competitors in most figures of merit devised by Schrijver et al. (2006). We have also applied our code to the real active region NOAA 11974, in which two M-class flares and a halo CME took place. The EUV observations show the sudden appearance of an erupting loop before the first flare. Our numerical solutions show that two entwining flux tubes exist before the flare and that their entanglement is released after the CME, with one of them opened up.
    We suggest that the erupting loop is created by magnetic reconnection between

  6. A two-way regularization method for MEG source reconstruction

    KAUST Repository

    Tian, Tian Siva; Huang, Jianhua Z.; Shen, Haipeng; Li, Zhimin

    2012-01-01

    The MEG inverse problem refers to the reconstruction of the neural activity of the brain from magnetoencephalography (MEG) measurements. We propose a two-way regularization (TWR) method to solve the MEG inverse problem under the assumptions that only a small number of locations in space are responsible for the measured signals (focality), and each source time course is smooth in time (smoothness). The focality and smoothness of the reconstructed signals are ensured respectively by imposing a sparsity-inducing penalty and a roughness penalty in the data fitting criterion. A two-stage algorithm is developed for fast computation, where a raw estimate of the source time course is obtained in the first stage and then refined in the second stage by the two-way regularization. The proposed method is shown to be effective on both synthetic and real-world examples. © Institute of Mathematical Statistics, 2012.

  7. A two-way regularization method for MEG source reconstruction

    KAUST Repository

    Tian, Tian Siva

    2012-09-01

    The MEG inverse problem refers to the reconstruction of the neural activity of the brain from magnetoencephalography (MEG) measurements. We propose a two-way regularization (TWR) method to solve the MEG inverse problem under the assumptions that only a small number of locations in space are responsible for the measured signals (focality), and each source time course is smooth in time (smoothness). The focality and smoothness of the reconstructed signals are ensured respectively by imposing a sparsity-inducing penalty and a roughness penalty in the data fitting criterion. A two-stage algorithm is developed for fast computation, where a raw estimate of the source time course is obtained in the first stage and then refined in the second stage by the two-way regularization. The proposed method is shown to be effective on both synthetic and real-world examples. © Institute of Mathematical Statistics, 2012.
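    The two-way penalty structure described in this abstract (a sparsity-inducing penalty for focality plus a roughness penalty for temporal smoothness) can be sketched with a toy proximal-gradient solver. The observation model, penalties, and parameter names below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 (sparsity) penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def two_way_estimate(Y, lam_sparse=0.5, lam_smooth=5.0, iters=300):
    """Toy two-way-regularized estimate of a source matrix X (locations x time)
    from noisy observations Y = X + noise: an l1 penalty enforces focality and
    a squared second-difference penalty (lam_smooth/2)*||X D^T||_F^2 enforces
    temporal smoothness. Solved by proximal gradient (ISTA)."""
    n, T = Y.shape
    D = np.diff(np.eye(T), 2, axis=0)      # second-difference operator, (T-2) x T
    L = D.T @ D
    step = 1.0 / (1.0 + lam_smooth * np.linalg.norm(L, 2))
    X = np.zeros_like(Y)
    for _ in range(iters):
        grad = (X - Y) + lam_smooth * X @ L    # gradient of the smooth part
        X = soft_threshold(X - step * grad, step * lam_sparse)
    return X

Y = np.zeros((4, 20))
Y[1] = 5 * np.sin(np.linspace(0, np.pi, 20))   # one active, smooth source
X = two_way_estimate(Y)
```

    Inactive locations are driven exactly to zero by the l1 proximal step, while the active time course is smoothed by the roughness penalty.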

  8. A BPF-FBP tandem algorithm for image reconstruction in reverse helical cone-beam CT

    International Nuclear Information System (INIS)

    Cho, Seungryong; Xia, Dan; Pellizzari, Charles A.; Pan Xiaochuan

    2010-01-01

    Purpose: Reverse helical cone-beam computed tomography (CBCT) is a scanning configuration with potential applications in image-guided radiation therapy, in which an accurate anatomic image of the patient is needed for image-guidance procedures. The authors previously developed an algorithm for image reconstruction from nontruncated data of an object that is completely within the reverse helix. The purpose of this work is to develop an image reconstruction approach for reverse helical CBCT of a long object that extends out of the reverse helix and therefore gives rise to data truncation. Methods: The proposed approach comprises two reconstruction steps. In the first step, a chord-based backprojection-filtration (BPF) algorithm reconstructs a volumetric image of the object from the original cone-beam data. Because there exists a chordless region in the middle of the reverse helix, the image obtained in the first step contains an unreconstructed central-gap region. In the second step, the gap region is reconstructed by use of a Pack-Noo-formula-based filtered-backprojection (FBP) algorithm from the modified cone-beam data obtained by subtracting from the original cone-beam data the reprojection of the image reconstructed in the first step. Results: The authors have performed numerical studies to validate the proposed approach for image reconstruction from reverse helical cone-beam data. The results confirm that the proposed approach can reconstruct accurate images of a long object without suffering from data-truncation artifacts or cone-angle artifacts. Conclusions: The authors developed and validated a BPF-FBP tandem algorithm to reconstruct images of a long object from reverse helical cone-beam data. The chord-based BPF algorithm was utilized for converting the long-object problem into a short-object problem. The proposed approach is applicable to other scanning configurations such as reduced circular sinusoidal trajectories.

  9. Adaptive wavelet tight frame construction for accelerating MRI reconstruction

    Directory of Open Access Journals (Sweden)

    Genjiao Zhou

    2017-09-01

    Full Text Available The sparsity regularization approach, which assumes that the image of interest is likely to have a sparse representation in some transform domain, has been an active research area in image processing and medical image reconstruction. Although various sparsifying transforms have been used in medical image reconstruction, such as wavelets, contourlets, and total variation (TV), the efficiency of these transforms typically relies on the special structure of the underlying image. A better way to address this issue is to learn an overcomplete dictionary from the input data in order to obtain a better sparsifying transform for the underlying image. However, general overcomplete dictionaries do not satisfy the so-called perfect reconstruction property, which ensures that a given signal can be perfectly represented by its canonical coefficients in a manner similar to orthonormal bases; as a result, iterative image reconstruction becomes time consuming. This work develops an adaptive wavelet tight frame method for magnetic resonance image reconstruction. The proposed scheme incorporates the adaptive wavelet tight frame approach into magnetic resonance image reconstruction by solving an l0-regularized minimization problem. Numerical results show that the proposed approach provides significant time savings compared with overcomplete-dictionary-based methods, with comparable performance in terms of both peak signal-to-noise ratio and subjective visual quality.
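    The perfect reconstruction property of a (Parseval) tight frame, which the abstract contrasts with general overcomplete dictionaries, can be checked directly on a tiny example. The frame and the l0-style hard thresholding below are illustrative, not the paper's learned wavelet frame.

```python
import numpy as np

def hard_threshold(c, t):
    """Proximal operator of the l0 penalty: keep coefficients above t."""
    return np.where(np.abs(c) >= t, c, 0.0)

# A tiny Parseval tight frame for R^2 (three vectors, W.T @ W = I), so
# x = W.T @ (W @ x) holds for every x: the perfect reconstruction property.
W = np.sqrt(2.0 / 3.0) * np.array([
    [1.0, 0.0],
    [-0.5, np.sqrt(3) / 2],
    [-0.5, -np.sqrt(3) / 2],
])
x = np.array([2.0, 1.0])
assert np.allclose(W.T @ W, np.eye(2))    # tightness
assert np.allclose(W.T @ (W @ x), x)      # perfect reconstruction
denoised = W.T @ hard_threshold(W @ x, 0.5)
```

    Because synthesis is simply the transpose of analysis, each iteration of an l0-regularized reconstruction stays cheap, which is the computational advantage the abstract refers to.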

  10. One-stage bilateral anterior cruciate ligament reconstruction with use of hamstring tendon autografts: a case report

    Directory of Open Access Journals (Sweden)

    Matjaž Sajovic

    2007-12-01

    Full Text Available Background: Bilateral ACL rupture is not a common clinical problem, but the incidence of bilateral knee injuries is increasing, especially in the highly sports-active population. The mechanism of injury rarely causes a simultaneous bilateral ACL tear; usually a unilateral knee injury occurs first and the contralateral injury takes place later. Patient and methods: This case report presents the unusual problem of a patient with chronic bilateral ACL-deficient knees and constitutionally very thin patellar tendons. The author decided to perform one-stage bilateral ACL reconstruction using hamstring tendon autografts so as not to weaken the quadriceps muscles by compromising the extensor mechanism. Results: At three-year follow-up the patient's opinion was that both ACL-reconstructed knees had normal function, and he had returned to his preinjury activity level. The overall Lysholm knee score was 100 for the left knee and 95 for the right. Both knees had full range of motion; Lachman and pivot-shift signs were negative. Conclusions: Two-stage bilateral ACL reconstruction is much more time consuming for the patient and expensive for health insurance, so one-stage bilateral ACL reconstruction is a logical solution to the problem.

  11. Postoperative 3D spine reconstruction by navigating partitioning manifolds

    Energy Technology Data Exchange (ETDEWEB)

    Kadoury, Samuel, E-mail: samuel.kadoury@polymtl.ca [Department of Computer and Software Engineering, Ecole Polytechnique Montreal, Montréal, Québec H3C 3A7 (Canada); Labelle, Hubert, E-mail: hubert.labelle@recherche-ste-justine.qc.ca; Parent, Stefan, E-mail: stefan.parent@umontreal.ca [CHU Sainte-Justine Hospital Research Center, Montréal, Québec H3T 1C5 (Canada)

    2016-03-15

    Purpose: The postoperative evaluation of scoliosis patients undergoing corrective treatment is an important task to assess the strategy of the spinal surgery. Using accurate 3D geometric models of the patient’s spine is essential to measure longitudinal changes in the patient’s anatomy. On the other hand, reconstructing the spine in 3D from postoperative radiographs is a challenging problem due to the presence of instrumentation (metallic rods and screws) occluding vertebrae on the spine. Methods: This paper describes the reconstruction problem by searching for the optimal model within a manifold space of articulated spines learned from a training dataset of pathological cases who underwent surgery. The manifold structure is implemented based on a multilevel manifold ensemble to structure the data, incorporating connections between nodes within a single manifold, in addition to connections between different multilevel manifolds, representing subregions with similar characteristics. Results: The reconstruction pipeline was evaluated on x-ray datasets from both preoperative patients and patients with spinal surgery. By comparing the method to ground-truth models, a 3D reconstruction accuracy of 2.24 ± 0.90 mm was obtained from 30 postoperative scoliotic patients, while handling patients with highly deformed spines. Conclusions: This paper illustrates how this manifold model can accurately identify similar spine models by navigating in the low-dimensional space, as well as computing nonlinear charts within local neighborhoods of the embedded space during the testing phase. This technique allows postoperative follow-ups of spinal surgery using personalized 3D spine models and the assessment of surgical strategies for spinal deformities.

  12. Postoperative 3D spine reconstruction by navigating partitioning manifolds

    International Nuclear Information System (INIS)

    Kadoury, Samuel; Labelle, Hubert; Parent, Stefan

    2016-01-01

    Purpose: The postoperative evaluation of scoliosis patients undergoing corrective treatment is an important task to assess the strategy of the spinal surgery. Using accurate 3D geometric models of the patient’s spine is essential to measure longitudinal changes in the patient’s anatomy. On the other hand, reconstructing the spine in 3D from postoperative radiographs is a challenging problem due to the presence of instrumentation (metallic rods and screws) occluding vertebrae on the spine. Methods: This paper describes the reconstruction problem by searching for the optimal model within a manifold space of articulated spines learned from a training dataset of pathological cases who underwent surgery. The manifold structure is implemented based on a multilevel manifold ensemble to structure the data, incorporating connections between nodes within a single manifold, in addition to connections between different multilevel manifolds, representing subregions with similar characteristics. Results: The reconstruction pipeline was evaluated on x-ray datasets from both preoperative patients and patients with spinal surgery. By comparing the method to ground-truth models, a 3D reconstruction accuracy of 2.24 ± 0.90 mm was obtained from 30 postoperative scoliotic patients, while handling patients with highly deformed spines. Conclusions: This paper illustrates how this manifold model can accurately identify similar spine models by navigating in the low-dimensional space, as well as computing nonlinear charts within local neighborhoods of the embedded space during the testing phase. This technique allows postoperative follow-ups of spinal surgery using personalized 3D spine models and the assessment of surgical strategies for spinal deformities.

  13. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history

    OpenAIRE

    Cherry, Joshua L.

    2017-01-01

    Background Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Results Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data....

  14. Reconstruction of the MSRs in-situ at Beaver Valley

    International Nuclear Information System (INIS)

    Yarden, A.; Tam, C.W.; Deahna, S.T.; McFeaters, C.V.

    1992-01-01

    The Moisture Separator Reheaters (MSRs) have been problem components at the Beaver Valley 1 pressurized water reactor since the plant started up 16 years ago; many of the problems encountered are widespread in the nuclear industry. In 1991, Duquesne Light rebuilt the Beaver Valley 1 MSRs, and in 1992 did the same at unit 2. The reconstruction projects have proved cost effective, with short payback times and significant improvements in station performance. (Author)

  15. Reconstruction of Ancestral Genomes in Presence of Gene Gain and Loss.

    Science.gov (United States)

    Avdeyev, Pavel; Jiang, Shuai; Aganezov, Sergey; Hu, Fei; Alekseyev, Max A

    2016-03-01

    Since most dramatic genomic changes are caused by genome rearrangements as well as gene duplications and gain/loss events, it becomes crucial to understand their mechanisms and to reconstruct the ancestral genomes of given genomes. This problem was shown to be NP-complete even in the "simplest" case of three genomes, thus calling for heuristic rather than exact algorithmic solutions. At the same time, a larger number of input genomes may actually simplify the problem in practice, as was earlier illustrated with MGRA, a state-of-the-art software tool for reconstruction of ancestral genomes of multiple genomes. One of the key obstacles for MGRA and similar tools is the presence of breakpoint reuse, where the same breakpoint region is broken by several different genome rearrangements in the course of evolution. Furthermore, such tools are often limited to genomes composed of the same genes, with each gene present in a single copy in every genome. This limitation makes these tools inapplicable to many biological datasets and degrades the resolution of ancestral reconstructions in diverse datasets. We address these deficiencies by extending the MGRA algorithm to genomes with unequal gene contents. The developed next-generation tool MGRA2 can handle gene gain/loss events and shares the ability of MGRA to reconstruct ancestral genomes uniquely in the case of limited breakpoint reuse. Furthermore, MGRA2 employs a number of novel heuristics to cope with higher breakpoint reuse and to process datasets inaccessible to MGRA. In practical experiments, MGRA2 shows superior performance on simulated and real genomes as compared to other ancestral genome reconstruction tools.

  16. Relaxed Simultaneous Tomographic Reconstruction and Segmentation with Class Priors for Poisson Noise

    DEFF Research Database (Denmark)

    Romanov, Mikhail; Dahl, Anders Bjorholm; Dong, Yiqiu

    Our new algorithm can handle Poisson noise in the data, and it can solve much larger problems since it does not store the matrix. We formulate this algorithm and test it on artificial test problems. Our results show that the algorithm performs well, and that we are able to produce reconstructions...

  17. Exact fan-beam image reconstruction algorithm for truncated projection data acquired from an asymmetric half-size detector

    International Nuclear Information System (INIS)

    Leng Shuai; Zhuang Tingliang; Nett, Brian E; Chen Guanghong

    2005-01-01

    In this paper, we present a new algorithm designed for a specific data truncation problem in fan-beam CT. We consider a scanning configuration in which the fan-beam projection data are acquired from an asymmetrically positioned half-sized detector. Namely, the asymmetric detector covers only one half of the scanning field of view. Thus, the acquired fan-beam projection data are truncated at every view angle. If an explicit data rebinning process is not invoked, this data acquisition configuration will wreak havoc on many known fan-beam image reconstruction schemes, including the standard filtered backprojection (FBP) algorithm and the super-short-scan FBP reconstruction algorithms. However, we demonstrate that a recently developed fan-beam image reconstruction algorithm, which reconstructs an image via filtering a backprojection image of differentiated projection data (FBPD), survives the above fan-beam data truncation problem. Namely, we may exactly reconstruct the whole image object using the truncated data acquired in a full-scan mode (2π angular range). We may also exactly reconstruct a small region of interest (ROI) using the truncated projection data acquired in a short-scan mode (less than 2π angular range). The most important characteristic of the proposed reconstruction scheme is that an explicit data rebinning process is not introduced. Numerical simulations were conducted to validate the new reconstruction algorithm.

  18. Problems in modernization of automation systems at coal preparation plants

    Science.gov (United States)

    Myshlyaev, L. P.; Lyakhovets, M. V.; Venger, K. G.; Leontiev, I. A.; Makarov, G. V.; Salamatin, A. S.

    2018-05-01

    The factors influencing the modernization (reconstruction) of automation systems at coal preparation plants are described. Problems such as the heterogeneity of existing and newly developed systems, planning the reconstruction of a technological complex without taking the modernization of automated systems into account, commissioning without stopping the existing technological complex, as well as problems with procurement procedures, are discussed. An option for stage-by-stage start-up and adjustment work during system modernization, without long stoppages of the process equipment, is proposed.

  19. Noise propagation in iterative reconstruction algorithms with line searches

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    In this paper we analyze the propagation of noise in iterative image reconstruction algorithms. We derive theoretical expressions for the general form of preconditioned gradient algorithms with line searches. The results are applicable to a wide range of iterative reconstruction problems, such as emission tomography, transmission tomography, and image restoration. A unique contribution of this paper compared with our previous work [1] is that the line search is explicitly modeled and we do not use the approximation that the gradient of the objective function is zero. As a result, the error in the estimate of noise at early iterations is significantly reduced.
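    The preconditioned gradient iteration with an explicit line search that the analysis covers can be sketched for a quadratic objective, where the line search has a closed form. This is a toy illustration, not the paper's tomography setting.

```python
import numpy as np

def precond_gradient(A, b, M_inv, iters=50):
    """Preconditioned gradient descent with exact line search for the
    quadratic objective f(x) = 0.5 x^T A x - b^T x (A symmetric positive
    definite). The step length alpha minimizes f along the preconditioned
    search direction, i.e. the line search is modeled explicitly."""
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x                       # negative gradient
        if not r.any():                     # already at the minimizer
            break
        d = M_inv @ r                       # preconditioned search direction
        alpha = (d @ r) / (d @ (A @ d))     # exact line-search step
        x = x + alpha * d
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
M_inv = np.diag(1.0 / np.diag(A))           # Jacobi preconditioner
x = precond_gradient(A, b, M_inv)
```

    In the paper's analysis, noise propagates through exactly this iteration map, with alpha treated as data-dependent rather than fixed.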

  20. EM for phylogenetic topology reconstruction on nonhomogeneous data.

    Science.gov (United States)

    Ibáñez-Marcelo, Esther; Casanellas, Marta

    2014-06-17

    The reconstruction of the phylogenetic tree topology of four taxa is still, nowadays, one of the main challenges in phylogenetics. The difficulty lies in considering evolutionary models that are not too restrictive while correctly dealing with the long-branch attraction problem. The correct reconstruction of 4-taxon trees is crucial for making quartet-based methods work and for being able to recover large phylogenies. We adapt the well-known expectation-maximization algorithm to evolutionary Markov models on phylogenetic 4-taxon trees. We then use this algorithm to estimate the substitution parameters, compute the corresponding likelihood, and infer the most likely quartet. In this paper we consider an expectation-maximization method for maximizing the likelihood of (time-nonhomogeneous) evolutionary Markov models on trees. We study its success in reconstructing 4-taxon topologies and its performance as an input method in quartet-based phylogenetic reconstruction methods such as QFIT and QuartetSuite. Our results show that the method proposed here outperforms neighbor-joining and the usual (time-homogeneous continuous-time) maximum likelihood methods on 4-leaved trees with among-lineage instantaneous rate heterogeneity, and performs similarly to the usual continuous-time maximum likelihood when the data satisfy the assumptions of both methods. The method presented in this paper is well suited for reconstructing the topology of any number of taxa via quartet-based methods and is highly accurate, especially for largely divergent trees and time-nonhomogeneous data.
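    The E-step/M-step alternation that the paper adapts to tree models can be illustrated with the classic two-coin mixture EM. This is a generic stand-in: in the paper, the E-step instead sums over the unobserved ancestral states at the internal nodes of a 4-taxon tree.

```python
import numpy as np

def em_two_coins(heads, n, iters=200):
    """EM for a two-component binomial mixture: each sample reports `heads`
    successes out of n trials, generated by one of two coins with unknown
    biases p and unknown mixture weight pi."""
    pi, p = 0.5, np.array([0.3, 0.6])   # initial guesses
    for _ in range(iters):
        # E-step: posterior probability that each sample came from coin 1
        log_lik = (heads[:, None] * np.log(p)
                   + (n - heads[:, None]) * np.log(1 - p))
        lik = np.array([1 - pi, pi]) * np.exp(log_lik)
        r = lik[:, 1] / lik.sum(axis=1)
        # M-step: re-estimate parameters from expected counts
        pi = r.mean()
        p = np.array([np.sum((1 - r) * heads) / np.sum((1 - r) * n),
                      np.sum(r * heads) / np.sum(r * n)])
    return pi, p

heads = np.array([2, 9, 8, 3, 1, 10])   # heads out of n=10 flips per sample
pi, p = em_two_coins(heads, 10)
```

    The tree version replaces the binomial likelihood with the likelihood of site patterns under a Markov substitution model, but the alternation of expected-count computation and parameter re-estimation is the same.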

  1. A dynamic state observer for real-time reconstruction of the tokamak plasma profile state and disturbances

    NARCIS (Netherlands)

    Felici, F.; De Baar, M.; Steinbuch, M.

    2014-01-01

    A dynamic observer is presented which can reconstruct the internal state of a tokamak fusion plasma, consisting of the spatial distribution of current and temperature, from measurements. Today, the internal plasma state is usually reconstructed by solving an ill-conditioned inversion problem using a

  2. Reconstruction and modernization of Novi Han radioactive waste repository

    International Nuclear Information System (INIS)

    Kolev, I.; Dralchev, D.; Spasov, P.; Jordanov, M.

    2000-01-01

    This report briefly presents the most important issues of the study performed by EQE - Bulgaria. The objectives of the study are the development of conceptual solutions for the construction of the following facilities at the Novi Han radioactive waste repository: an operational storage for unconditioned high-level spent sources; new temporary buildings over the existing radioactive waste storage facilities; a rain-water drainage system, etc. The study also includes engineering solutions for the conservation of the existing facilities, currently filled with high-level spent sources. A 'Program for reconstruction and modernization' has been created, including an analysis of the regulatory aspects of its implementation. In conclusion, the engineering problems of the Novi Han repository are clear and appropriate solutions are available. They can be implemented in the case of either a 'small' or a 'large' reconstruction. The reconstruction project should in any case start with the construction of a new site infrastructure. Reconstruction and modernization of the Novi Han radioactive waste repository is the only way to improve the management and safety of radioactive waste from medicine, industry, and scientific research in Bulgaria.

  3. Algebraic reconstruction techniques for spectral reconstruction in diffuse optical tomography

    International Nuclear Information System (INIS)

    Brendel, Bernhard; Ziegler, Ronny; Nielsen, Tim

    2008-01-01

    Reconstruction in diffuse optical tomography (DOT) necessitates solving the diffusion equation, which is nonlinear with respect to the parameters that have to be reconstructed. Currently applied solution methods are based on linearization of the equation. For spectral three-dimensional reconstruction, the resulting equation system is too large for direct inversion, but the application of iterative methods is feasible. The computational effort and convergence speed of these iterative methods are crucial, since they determine the computation time of the reconstruction. In this paper, the iterative methods algebraic reconstruction technique (ART) and conjugate gradients (CG), as well as a new modified ART method, are investigated for spectral DOT reconstruction. The aim of the modified ART scheme is to speed up convergence by considering the specific conditions of spectral reconstruction. As a result, it converges much faster to favorable results than the conventional ART and CG methods.
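    The basic ART update referred to above is the Kaczmarz row-projection iteration, sketched here on a toy consistent system. The paper's modified ART scheme for spectral DOT is not reproduced.

```python
import numpy as np

def art(A, b, sweeps=50, relax=1.0):
    """Algebraic reconstruction technique (Kaczmarz): cycle through the rows
    of A and project the current estimate onto each row's hyperplane
    a_i . x = b_i, optionally with a relaxation factor."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy consistent system: ART converges to the true solution.
A = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, -1.0]])
x_true = np.array([2.0, -1.0])
x = art(A, A @ x_true)
```

    Each sweep touches one row at a time, which is why ART's cost per iteration is low compared with methods that require full matrix-vector products.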

  4. Simulation and reconstruction of free-streaming data in CBM

    International Nuclear Information System (INIS)

    Friese, Volker

    2011-01-01

    The CBM experiment will investigate heavy-ion reactions at the FAIR facility at unprecedented interaction rates. This implies a novel read-out and data acquisition concept with self-triggered front-end electronics and free-streaming data. Event association must be performed in software on-line, and may require four-dimensional reconstruction routines. In order to study the problem of event association and to develop proper algorithms, simulations must be performed which go beyond the normal event-by-event processing as available from most experimental simulation frameworks. In this article, we discuss the challenges and concepts for the reconstruction of such free-streaming data and present first steps for a time-based simulation which is necessary for the development and validation of the reconstruction algorithms, and which requires modifications to the current software framework FAIRROOT as well as to the data model.

  5. Image reconstruction with an adaptive threshold technique in electrical resistance tomography

    International Nuclear Information System (INIS)

    Kim, Bong Seok; Khambampati, Anil Kumar; Kim, Sin; Kim, Kyung Youn

    2011-01-01

    In electrical resistance tomography, electrical currents are injected through the electrodes placed on the surface of a domain and the corresponding voltages are measured. Based on these currents and voltage data, the cross-sectional resistivity distribution is reconstructed. Electrical resistance tomography shows high temporal resolution for monitoring fast transient processes, but it still remains a challenging problem to improve the spatial resolution of the reconstructed images. In this paper, a novel image reconstruction technique is proposed to improve the spatial resolution by employing an adaptive threshold method to the iterative Gauss–Newton method. Numerical simulations and phantom experiments have been performed to illustrate the superior performance of the proposed scheme in the sense of spatial resolution
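    The iterative Gauss-Newton method that the proposed scheme builds on can be sketched for a generic nonlinear least-squares problem. The forward model below is a toy; the ERT physics and the adaptive-threshold step are omitted.

```python
import numpy as np

def gauss_newton(f, jac, y, x0, iters=20, reg=1e-6):
    """Regularized Gauss-Newton iteration for min ||y - f(x)||^2:
    x_{k+1} = x_k + (J^T J + reg*I)^{-1} J^T (y - f(x_k))."""
    x = x0.astype(float)
    for _ in range(iters):
        r = y - f(x)
        J = jac(x)
        x = x + np.linalg.solve(J.T @ J + reg * np.eye(x.size), J.T @ r)
    return x

# Toy forward model f(x) = (x1^2 + x2, x1 * x2) and its Jacobian.
f = lambda x: np.array([x[0] ** 2 + x[1], x[0] * x[1]])
jac = lambda x: np.array([[2 * x[0], 1.0], [x[1], x[0]]])
x_true = np.array([2.0, 1.0])
y = f(x_true)                               # noise-free "measurements"
x = gauss_newton(f, jac, y, np.array([1.5, 0.5]))
```

    In ERT, f maps a resistivity distribution to boundary voltages and J is the sensitivity matrix; the adaptive threshold in the paper post-processes the Gauss-Newton estimate at each iteration.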

  6. Reconstruction phases in the planar three- and four-vortex problems

    Science.gov (United States)

    Hernández-Garduño, Antonio; Shashikanth, Banavara N.

    2018-03-01

    Pure reconstruction phases—geometric and dynamic—are computed in the N-point-vortex model in the plane, for the cases N=3 and N=4 . The phases are computed relative to a metric-orthogonal connection on appropriately defined principal fiber bundles. The metric is similar to the kinetic energy metric for point masses but with the masses replaced by vortex strengths. The geometric phases are shown to be proportional to areas enclosed by the closed orbit on the symmetry reduced spaces. More interestingly, simple formulae are obtained for the dynamic phases, analogous to Montgomery’s result for the free rigid body, which show them to be proportional to the time period of the symmetry reduced closed orbits. For the case N = 3 a non-zero total vortex strength is assumed. For the case N = 4 the vortex strengths are assumed equal.

  7. Reconstruction of blood propagation in three-dimensional rotational X-ray angiography (3D-RA).

    Science.gov (United States)

    Schmitt, Holger; Grass, Michael; Suurmond, Rolf; Köhler, Thomas; Rasche, Volker; Hähnel, Stefan; Heiland, Sabine

    2005-10-01

    This paper presents a framework of non-interactive algorithms for mapping blood flow information onto vessels in 3D-RA images. With the presented method, mapping of flow information to 3D-RA images is done automatically, without user interaction. So far, radiologists had to perform this task by extensive image comparisons and did not obtain visualizations of the results. In our approach, flow information is reconstructed by forward projection of vessel pieces in a 3D-RA image onto a two-dimensional projection series capturing the propagation of a short additional contrast agent bolus. For accurate 2D-3D image registration, an efficient patient motion compensation technique is introduced. As an exemplary flow-related quantity, bolus arrival times are reconstructed for the vessel pieces by matching of intensity-time curves. A plausibility-check framework was developed which handles projection ambiguities and corrects for noisy flow reconstruction results; it is based on a linear programming approach to model the feeding structure of the vessel. The flow reconstruction method was applied to 12 cases of cerebral stenoses, AVMs, and aneurysms, and it proved to be feasible in the clinical environment. The propagation of the injected contrast agent was reconstructed and visualized in three-dimensional images. The flow reconstruction method was able to visualize different types of useful information. In cases of stenosis of the middle cerebral artery (MCA), flow reconstruction can reveal impeded blood flow depending on the severity of the stenosis. In cases of AVMs, flow reconstruction can clarify the feeding structure. The presented methods handle the problems imposed by clinical demands, such as non-interactive algorithms, patient motion compensation, and short reconstruction times, and technical requirements, such as correction of noisy bolus arrival times and handling of overlapping vessel pieces. Problems occurred mainly in the reconstruction and segmentation of 3D
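    The matching of intensity-time curves used to reconstruct bolus arrival times can be approximated by cross-correlating a vessel's curve against a reference curve, as in this sketch. The authors' exact matching criterion is not specified here; cross-correlation is an illustrative substitute.

```python
import numpy as np

def arrival_time(curve, reference, dt=1.0):
    """Estimate a bolus arrival delay by matching an intensity-time curve
    against a reference curve via cross-correlation: the lag maximizing
    the correlation gives the delay (in units of the frame interval dt)."""
    corr = np.correlate(curve - curve.mean(),
                        reference - reference.mean(), mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)
    return lag * dt

t = np.arange(50, dtype=float)
reference = np.exp(-((t - 10) ** 2) / 8.0)   # bolus passing at frame 10
delayed = np.exp(-((t - 17) ** 2) / 8.0)     # same bolus, 7 frames later
est = arrival_time(delayed, reference)
```

    Per-vessel-piece delays estimated this way can then be checked for consistency against the vessel's feeding structure, which is the role of the plausibility-check framework in the paper.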

  8. Dynamic PET Image reconstruction for parametric imaging using the HYPR kernel method

    Science.gov (United States)

    Spencer, Benjamin; Qi, Jinyi; Badawi, Ramsey D.; Wang, Guobao

    2017-03-01

    Dynamic PET image reconstruction is a challenging problem because of the ill-conditioned nature of PET and the low counting statistics resulting from the short time frames used in dynamic imaging. The kernel method for image reconstruction has been developed to improve image reconstruction of low-count PET data by incorporating prior information derived from high-count composite data. In contrast to most of the existing regularization-based methods, the kernel method embeds image prior information in the forward projection model and does not require an explicit regularization term in the reconstruction formula. Inspired by the existing highly constrained back-projection (HYPR) algorithm for dynamic PET image denoising, we propose in this work a new type of kernel that is simpler to implement and further improves kernel-based dynamic PET image reconstruction. Our evaluation study, using a physical phantom scan with synthetic FDG tracer kinetics, demonstrates that the new HYPR kernel-based reconstruction can achieve a better region-of-interest (ROI) bias versus standard deviation trade-off for dynamic PET parametric imaging than the post-reconstruction HYPR denoising method and the previously used nonlocal-means kernel.
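
As a rough illustration of the HYPR idea that the new kernel builds on, the post-reconstruction form weights a high-count composite image by a low-pass-filtered ratio of each noisy frame to the composite. This is a minimal numpy sketch of that denoising step; the filter width and images are illustrative, and the paper instead embeds related prior information in the forward projection model.

```python
import numpy as np

def lowpass(img, sigma):
    """Gaussian low-pass filter applied in the Fourier domain."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    return np.fft.ifft2(np.fft.fft2(img)
                        * np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx**2 + fy**2))).real

def hypr_denoise(frame, composite, sigma=2.0, eps=1e-8):
    """HYPR-style denoising: weight the high-count composite image by the
    low-pass-filtered ratio of the noisy frame to the composite. (Post-
    reconstruction form; the kernel above instead embeds related prior
    information in the forward projection model.)"""
    return composite * lowpass(frame, sigma) / (lowpass(composite, sigma) + eps)
```

The composite supplies spatial detail while the filtered ratio restores each frame's temporal information at lower noise.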

  9. Reconstruction of ribosomal RNA genes from metagenomic data.

    Directory of Open Access Journals (Sweden)

    Lu Fan

    Full Text Available Direct sequencing of environmental DNA (metagenomics) has great potential for describing the 16S rRNA gene diversity of microbial communities. However, current approaches using this 16S rRNA gene information to describe community diversity suffer from low taxonomic resolution or chimera problems. Here we describe a new strategy that involves stringent assembly and data filtering to reconstruct full-length 16S rRNA genes from metagenomic pyrosequencing data. Simulations showed that reconstructed 16S rRNA genes provided a true picture of the community diversity, had minimal rates of chimera formation and gave taxonomic resolution down to the genus level. The strategy was furthermore compared to PCR-based methods to determine the microbial diversity in two marine sponges. This showed that about 30% of the abundant phylotypes reconstructed from metagenomic data failed to be amplified by PCR. Our approach is readily applicable to existing metagenomic datasets and is expected to lead to the discovery of new microbial phylotypes.

  10. Regional compensation for statistical maximum likelihood reconstruction error of PET image pixels

    International Nuclear Information System (INIS)

    Forma, J; Ruotsalainen, U; Niemi, J A

    2013-01-01

    In positron emission tomography (PET), there is increasing interest in studying not only the regional mean tracer concentration, but also its variation arising from local differences in physiology, i.e. the tissue heterogeneity. However, in reconstructed images this physiological variation is shadowed by a large reconstruction error, which is caused by noisy data and the inversion of the tomographic problem. We present a new procedure which can quantify the error variation in regional reconstructed values for a given PET measurement, and reveal the remaining tissue heterogeneity. The error quantification is made by creating and reconstructing noise realizations of virtual sinograms, which are statistically similar to the measured sinogram. Tests with physical phantom data show that the characterization of the error variation and the true heterogeneity is possible, despite the existing model error when a real measurement is considered. (paper)
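
The core of the procedure, generating virtual sinograms statistically similar to the measurement, can be sketched by treating the measured counts as Poisson means and resampling. This is a simplifying assumption standing in for the procedure described above; reconstructing each realization would then yield the error variation.

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_realizations(sinogram, n):
    """Virtual sinograms statistically similar to the measurement: treat the
    measured counts as Poisson means and resample. Reconstructing each
    realization and taking regional statistics of the results would then
    quantify the reconstruction-error variation."""
    return rng.poisson(sinogram, size=(n,) + sinogram.shape)

sino = np.full((4, 6), 100.0)        # toy sinogram, 100 expected counts per bin
reals = noise_realizations(sino, 500)
print(reals.shape, round(float(reals.std()), 1))  # std ≈ sqrt(100) = 10
```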

  11. Total variation regularization for a backward time-fractional diffusion problem

    International Nuclear Information System (INIS)

    Wang, Liyan; Liu, Jijun

    2013-01-01

    Consider a two-dimensional backward problem for a time-fractional diffusion process, which can be interpreted as image de-blurring where the blurring process is assumed to be slow diffusion. In order to avoid the over-smoothing effect for object images with edges and to construct a fast reconstruction scheme, the total variation regularizing term and the data residual error in the frequency domain are coupled to construct the cost functional. The well-posedness of this optimization problem is studied. The minimizer is sought approximately using an iteration process for a series of optimization problems with the Bregman distance as a penalty term. This iterative reconstruction scheme is essentially a new regularizing scheme with the coupling parameter in the cost functional and the iteration stopping time as two regularizing parameters. We give a choice strategy for the regularizing parameters in terms of the noise level of the measurement data, which yields the optimal error estimate on the iterative solution. The series of optimization problems is solved by alternating iteration with an explicit exact solution, and the amount of computation is therefore greatly reduced. Numerical implementations are given to support our theoretical analysis on the convergence rate and to show the significant reconstruction improvements. (paper)
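
A minimal sketch of the kind of cost functional described above: total variation plus a data-fidelity term with the blur applied through the frequency domain, minimised here by plain gradient descent on a smoothed TV rather than the paper's Bregman iteration. All parameters and the transfer function are illustrative assumptions.

```python
import numpy as np

def tv_deblur(g, H, lam=0.01, tau=0.2, iters=200, eps=0.1):
    """Minimise 0.5*||F^-1(H·F(u)) - g||^2 + lam*TV_eps(u) by gradient
    descent. H is the blur's transfer function, so the data residual is
    evaluated through the frequency domain; the smoothed TV and plain
    descent stand in for the Bregman iteration described above."""
    u = g.copy()
    for _ in range(iters):
        resid = np.fft.ifft2(H * np.fft.fft2(u)).real - g
        grad_data = np.fft.ifft2(np.conj(H) * np.fft.fft2(resid)).real
        ux = np.roll(u, -1, axis=1) - u          # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps**2)    # smoothed gradient magnitude
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= tau * (grad_data - lam * div)       # gradient of lam*TV is -div
    return u
```

The TV term is what preserves edges that a quadratic (over-smoothing) penalty would blur.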

  12. Acellular dermal matrix based nipple reconstruction: A modified technique

    Directory of Open Access Journals (Sweden)

    Raghavan Vidya

    2017-09-01

    Full Text Available Nipple areolar reconstruction (NAR) has evolved with the advancement of breast reconstruction and can improve self-esteem and, consequently, patient satisfaction. Although a variety of reconstruction techniques have been described in the literature, varying from nipple sharing and local flaps to alloplastic and allograft augmentation, loss of nipple projection over time remains a major problem. Acellular dermal matrices (ADM) have revolutionised breast reconstruction more recently. We discuss the use of ADM to act as a base plate and strut to support the base and provide nipple bulk and projection in a primary NAR procedure with a local clover-shaped dermal flap in 5 breasts (4 patients). We used 5-point Likert scales (1 = highly unsatisfied, 5 = highly satisfied) to assess patient satisfaction. Median age was 46 years (range: 38–55 years). Nipple projections of 8 mm, 7 mm, and 7 mm were achieved in the unilateral cases and 6 mm in the bilateral case over a median 18-month period. All patients reported at least a 4 on the Likert scale. We had no post-operative complications. It appears that NAR using ADM can achieve nipple projection which patients consider aesthetically pleasing.

  13. Block Compressed Sensing of Images Using Adaptive Granular Reconstruction

    Directory of Open Access Journals (Sweden)

    Ran Li

    2016-01-01

    Full Text Available In the framework of block Compressed Sensing (CS), the reconstruction algorithm based on the Smoothed Projected Landweber (SPL) iteration can achieve good rate-distortion performance with a low computational complexity, especially when using Principal Component Analysis (PCA) to perform the adaptive hard-thresholding shrinkage. However, neglecting the stationary local structural characteristics of the image when learning the PCA matrix degrades the reconstruction performance of the Landweber iteration. To solve this problem, this paper first uses Granular Computing (GrC) to decompose an image into several granules depending on the structural features of patches. Then, we perform PCA to learn the sparse representation basis corresponding to each granule. Finally, hard-thresholding shrinkage is employed to remove the noise in the patches. The patches in a granule share a stationary local structural characteristic, so our method can effectively improve the performance of hard-thresholding shrinkage. Experimental results indicate that images reconstructed by the proposed algorithm have better objective quality when compared with several traditional algorithms. The edge and texture details in the reconstructed images are better preserved, which guarantees better visual quality. Besides, our method still has a low computational complexity of reconstruction.
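
The adaptive hard-thresholding step can be sketched as follows: learn a PCA basis from the vectorised patches of one granule, zero the small transform coefficients, and map back. The threshold value and patch construction here are illustrative assumptions.

```python
import numpy as np

def pca_hard_threshold(patches, tau):
    """Hard-thresholding shrinkage in a PCA basis learned from the patches of
    one granule (patches: (n, d) array of vectorised patches; tau is an
    illustrative threshold)."""
    mean = patches.mean(axis=0)
    centered = patches - mean
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)  # PCA basis
    coeffs = centered @ Vt.T
    coeffs[np.abs(coeffs) < tau] = 0.0                       # hard threshold
    return coeffs @ Vt + mean
```

Because the patches in a granule are structurally similar, their signal concentrates in a few large PCA coefficients while the noise spreads over many small ones, which is exactly what hard thresholding exploits.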

  14. Experiments in Reconstructing Twentieth-Century Sea Levels

    Science.gov (United States)

    Ray, Richard D.; Douglas, Bruce C.

    2011-01-01

    One approach to reconstructing historical sea level from the relatively sparse tide-gauge network is to employ Empirical Orthogonal Functions (EOFs) as interpolatory spatial basis functions. The EOFs are determined from independent global data, generally sea-surface heights from either satellite altimetry or a numerical ocean model. The problem is revisited here for sea level since 1900. A new approach to handling the tide-gauge datum problem by direct solution offers possible advantages over the method of integrating sea-level differences, with the potential of eventually adjusting datums into the global terrestrial reference frame. The resulting time series of global mean sea levels appears fairly insensitive to the adopted set of EOFs. In contrast, charts of regional sea level anomalies and trends are very sensitive to the adopted set of EOFs, especially for the sparser network of gauges in the early 20th century. The reconstructions appear especially suspect before 1950 in the tropical Pacific. While this limits some applications of the sea-level reconstructions, the sensitivity does appear adequately captured by formal uncertainties. All our solutions show regional trends over the past five decades to be fairly uniform throughout the global ocean, in contrast to trends observed over the shorter altimeter era. Consistent with several previous estimates, the global sea-level rise since 1900 is 1.70 +/- 0.26 mm/yr. The global trend since 1995 exceeds 3 mm/yr which is consistent with altimeter measurements, but this large trend was possibly also reached between 1935 and 1950.
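
The EOF-based reconstruction idea can be sketched on synthetic data: derive spatial basis functions from a dense "altimetry" field via the SVD, then fit them by least squares to values at a sparse set of "gauge" locations. Everything here (field, modes, gauge positions) is an illustrative stand-in, and the sketch ignores the datum problem discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "altimetry" field: two spatial modes with random amplitudes
# (an illustrative stand-in for satellite sea-surface heights).
nt, nx = 120, 40
modes_true = np.stack([np.sin(np.linspace(0.0, np.pi, nx)),
                       np.cos(np.linspace(0.0, 2.0 * np.pi, nx))])
field = rng.normal(size=(nt, 2)) @ modes_true

# EOFs = right singular vectors of the space-time field.
_, _, Vt = np.linalg.svd(field, full_matrices=False)
eofs = Vt[:2]

# "Tide gauges": the field sampled at a few locations; reconstruct each time
# step by a least-squares fit of the EOFs to the gauge values.
gauges = [3, 11, 25, 33]
coef, *_ = np.linalg.lstsq(eofs[:, gauges].T, field[:, gauges].T, rcond=None)
recon = coef.T @ eofs

print(np.allclose(recon, field, atol=1e-8))  # the rank-2 field is recovered
```

With real data the field is not exactly low-rank and the gauges are noisy, which is why the choice of EOF set matters so much for the regional patterns.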

  15. Accelerating image reconstruction in three-dimensional optoacoustic tomography on graphics processing units.

    Science.gov (United States)

    Wang, Kun; Huang, Chao; Kao, Yu-Jiun; Chou, Cheng-Ying; Oraevsky, Alexander A; Anastasio, Mark A

    2013-02-01

    Optoacoustic tomography (OAT) is inherently a three-dimensional (3D) inverse problem. However, most studies of OAT image reconstruction still employ two-dimensional imaging models. One important reason is that 3D image reconstruction is computationally burdensome. The aim of this work is to accelerate existing image reconstruction algorithms for 3D OAT by use of parallel programming techniques. Parallelization strategies are proposed to accelerate a filtered backprojection (FBP) algorithm and two different pairs of projection/backprojection operations that correspond to two different numerical imaging models. The algorithms are designed to fully exploit the parallel computing power of graphics processing units (GPUs). In order to evaluate the parallelization strategies for the projection/backprojection pairs, an iterative image reconstruction algorithm is implemented. Computer-simulation and experimental studies are conducted to investigate the computational efficiency and numerical accuracy of the developed algorithms. The GPU implementations improve the computational efficiency by factors of 1000, 125, and 250 for the FBP algorithm and the two pairs of projection/backprojection operators, respectively. Accurate images are reconstructed by use of the FBP and iterative image reconstruction algorithms from both computer-simulated and experimental data. Parallelization strategies for 3D OAT image reconstruction are proposed for the first time. These GPU-based implementations significantly reduce the computational time for 3D image reconstruction, complementing our earlier work on 3D OAT iterative image reconstruction.

  16. Three-dimension reconstruction based on spatial light modulator

    Energy Technology Data Exchange (ETDEWEB)

    Deng Xuejiao; Zhang Nanyang; Zeng Yanan; Yin Shiliang; Wang Weiyu, E-mail: daisydelring@yahoo.com.cn [Huazhong University of Science and Technology (China)

    2011-02-01

    Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace and biology. With this technology we can obtain a three-dimensional digital point cloud from two-dimensional images and then simulate the three-dimensional structure of the physical object for further study. At present, three-dimensional digital point cloud data are mainly obtained with adaptive optics systems using a Shack-Hartmann sensor and with phase-shifting digital holography. For surface fitting, many methods are also available, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems encountered in three-dimensional reconstruction are the extraction of feature points and the curve-fitting arithmetic. To solve these problems, we first calculate the surface normal vector of each pixel in the light-source coordinate system; these vectors are then converted to image coordinates through a coordinate transformation, yielding the expected 3D point cloud. Secondly, after de-noising and repair, feature points are selected and fitted by means of Zernike polynomials to obtain a fitting function of the surface topography, so as to reconstruct the object's three-dimensional topography. In this paper, a new three-dimensional reconstruction algorithm is proposed, with which the topography can be estimated from grayscale values at different sample points. Simulation and experimental results show that the new algorithm has a strong fitting capability, especially for large-scale objects.
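
The Zernike-polynomial surface fit mentioned above reduces to a linear least-squares problem once the polynomials are evaluated at the sample points. A minimal sketch using only the first few Cartesian-form Zernike terms; the basis selection and sample data are illustrative assumptions.

```python
import numpy as np

def zernike_basis(x, y):
    """First few Zernike polynomials on the unit disk in Cartesian form:
    piston, x-tilt, y-tilt, defocus, and the two astigmatism terms."""
    r2 = x**2 + y**2
    return np.stack([np.ones_like(x), x, y, 2.0 * r2 - 1.0,
                     x**2 - y**2, 2.0 * x * y], axis=1)

def fit_topography(x, y, z):
    """Least-squares fit of sampled surface heights to the Zernike basis,
    returning the coefficients of the fitting function."""
    coeffs, *_ = np.linalg.lstsq(zernike_basis(x, y), z, rcond=None)
    return coeffs

# Illustrative surface: piston + tilt + defocus, sampled on the unit disk.
rng = np.random.default_rng(2)
r = np.sqrt(rng.uniform(0.0, 1.0, 300))
th = rng.uniform(0.0, 2.0 * np.pi, 300)
x, y = r * np.cos(th), r * np.sin(th)
z = 0.5 + 0.2 * x - 0.1 * (2.0 * (x**2 + y**2) - 1.0)

print(np.round(fit_topography(x, y, z), 3))  # ≈ [0.5, 0.2, 0, -0.1, 0, 0]
```

Higher-order Zernike terms extend the basis in the same way; only the design matrix grows.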

  17. Joint reconstruction of activity and attenuation in Time-of-Flight PET: A Quantitative Analysis.

    Science.gov (United States)

    Rezaei, Ahmadreza; Deroose, Christophe M; Vahle, Thomas; Boada, Fernando; Nuyts, Johan

    2018-03-01

    Joint activity and attenuation reconstruction methods from time of flight (TOF) positron emission tomography (PET) data provide an effective solution to attenuation correction when no (or incomplete/inaccurate) information on the attenuation is available. One of the main barriers limiting their use in clinical practice is the lack of validation of these methods on a relatively large patient database. In this contribution, we aim at validating the activity reconstructions of the maximum likelihood activity reconstruction and attenuation registration (MLRR) algorithm on a whole-body patient data set. Furthermore, a partial validation (since the scale problem of the algorithm is avoided for now) of the maximum likelihood activity and attenuation reconstruction (MLAA) algorithm is also provided. We present a quantitative comparison of the joint reconstructions to the current clinical gold-standard maximum likelihood expectation maximization (MLEM) reconstruction with CT-based attenuation correction. Methods: The whole-body TOF-PET emission data of each patient data set is processed as a whole to reconstruct an activity volume covering all the acquired bed positions, which helps to reduce the problem of a scale per bed position in MLAA to a global scale for the entire activity volume. Three reconstruction algorithms are used: MLEM, MLRR and MLAA. A maximum likelihood (ML) scaling of the single scatter simulation (SSS) estimate to the emission data is used for scatter correction. The reconstruction results are then analyzed in different regions of interest. Results: The joint reconstructions of the whole-body patient data set provide better quantification in cases of PET and CT misalignment caused by patient and organ motion. Our quantitative analysis shows a difference of -4.2% (±2.3%) and -7.5% (±4.6%) between the joint reconstructions of MLRR and MLAA compared to MLEM, averaged over all regions of interest, respectively. Conclusion: Joint activity and attenuation

  18. Three-dimensional digital tomosynthesis iterative reconstruction, artifact reduction and alternative acquisition geometry

    CERN Document Server

    Levakhina, Yulia

    2014-01-01

    Yulia Levakhina gives an introduction to the major challenges of image reconstruction in Digital Tomosynthesis (DT), particularly to the connection of the reconstruction problem with the incompleteness of the DT dataset. The author discusses the factors which cause the formation of limited angle artifacts and proposes how to account for them in order to improve image quality and axial resolution of modern DT. The addressed methods include a weighted non-linear back projection scheme for algebraic reconstruction and a novel dual-axis acquisition geometry. All discussed algorithms and methods are supplemented by detailed illustrations, hints for practical implementation, pseudo-code, simulation results and real patient case examples.

  19. Super-Resolution Image Reconstruction Applied to Medical Ultrasound

    Science.gov (United States)

    Ellis, Michael

    Ultrasound is the preferred imaging modality for many diagnostic applications due to its real-time image reconstruction and low cost. Nonetheless, conventional ultrasound is not used in many applications because of limited spatial resolution and soft tissue contrast. Most commercial ultrasound systems reconstruct images using a simple delay-and-sum architecture on receive, which is fast and robust but does not utilize all the information available in the raw data. Recently, more sophisticated image reconstruction methods have been developed that make use of far more of the information in the raw data to improve resolution and contrast. One such method is the Time-Domain Optimized Near-Field Estimator (TONE), which employs maximum a posteriori estimation to solve a highly underdetermined problem, given a well-defined system model. TONE has been shown to significantly improve both the contrast and resolution of ultrasound images when compared to conventional methods. However, TONE's lack of robustness to deviations from the system model and its extremely high computational cost hinder it from being readily adopted in clinical scanners. This dissertation aims to reduce the impact of TONE's shortcomings, transforming it from an academic construct into a clinically viable image reconstruction algorithm. By altering the system model from a collection of individual hypothetical scatterers to a collection of weighted, diffuse regions, dTONE achieves much greater robustness to modeling errors. A method for efficient parallelization of dTONE is presented that reduces reconstruction time by more than an order of magnitude with little loss in image fidelity. An alternative reconstruction algorithm, called qTONE, is also developed and is able to reduce reconstruction times by another two orders of magnitude while simultaneously improving image contrast. Each of these methods for improving TONE is presented, its limitations are explored, and all are used in concert to reconstruct in
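
For contrast with TONE, the conventional delay-and-sum receive architecture mentioned above can be sketched in a few lines: delay each channel by the time of flight from the focus to the element and sum across the aperture. This is a simplified receive-only version; the geometry, sampling rate and sound speed below are illustrative assumptions.

```python
import numpy as np

def delay_and_sum(rf, element_x, focus, c=1540.0, fs=40e6):
    """Simplified receive-only delay-and-sum: pick, for each element, the
    sample at the one-way time of flight from the focus and sum across the
    aperture. rf is (n_elements, n_samples) channel data."""
    out = 0.0
    for x, channel in zip(element_x, rf):
        tof = np.hypot(focus[0] - x, focus[1]) / c      # one-way time of flight
        idx = int(round(tof * fs))
        if idx < channel.size:
            out += channel[idx]
    return out

# Point target at the focus: each channel holds an impulse at its own delay.
element_x = np.linspace(-0.01, 0.01, 8)                 # 8-element aperture (m)
focus = (0.0, 0.03)                                     # target 30 mm deep
rf = np.zeros((8, 2000))
for i, x in enumerate(element_x):
    rf[i, int(round(np.hypot(focus[0] - x, focus[1]) / 1540.0 * 40e6))] = 1.0

print(delay_and_sum(rf, element_x, focus))              # coherent sum: 8.0
print(delay_and_sum(rf, element_x, (0.004, 0.03)) < 8)  # off-target: True
```

Model-based estimators like TONE replace this fixed geometric summation with an inversion of the full system model, which is where both their resolution gain and their computational cost come from.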

  20. Parallel computing for event reconstruction in high-energy physics

    International Nuclear Information System (INIS)

    Wolbers, S.

    1993-01-01

    Parallel computing has been recognized as a solution to large computing problems. In High Energy Physics, offline event reconstruction of detector data is a very large computing problem that has been solved with parallel computing techniques. A review is given of the parallel programming package CPS (Cooperative Processes Software), developed and used at Fermilab for offline reconstruction of Terabytes of data requiring the delivery of hundreds of Vax-Years per experiment. The Fermilab UNIX farms, consisting of 180 Silicon Graphics workstations and 144 IBM RS6000 workstations, are used to provide the computing power for the experiments. Fermilab has had a long history of providing production parallel computing, starting with the ACP (Advanced Computer Project) Farms in 1986. The Fermilab UNIX Farms have been in production for over 2 years with 24 hour/day service to experimental user groups. Additional tools for managing, controlling and monitoring these large systems are also described. Possible future directions for parallel computing in High Energy Physics are discussed.

  1. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  2. TECHNIQUES FOR RECONSTRUCTION OF THE PRESERVED HOUSING STOCK

    Directory of Open Access Journals (Sweden)

    Kustikova Yuliya Olegovna

    2017-10-01

    Full Text Available Nowadays, in Russian cities a significant part of the housing stock in areas of existing buildings has a high level of physical and moral deterioration, and the indicators of infrastructure elements do not meet current and future requirements. Reconstruction of residential buildings is one of the important directions in solving the housing problem. It allows us not only to extend the life cycle of buildings but also to significantly improve the quality of housing, eliminate communal settlement, provide the houses with modern engineering equipment, improve the architectural expressiveness of buildings and increase their energy efficiency. Buildings of different construction periods require an individual approach to the development of reconstruction methods and technologies. At the same time, the process should take place not in a separate building but in a group of buildings, a neighborhood or a district. This makes it possible to undertake a comprehensive assessment of the urban development situation, make the most rational decisions to meet modern conditions, and provide a logical connection between various architectural trends. At the same time, there are possibilities for compaction and decompaction of buildings and the rational use of inter-district and underground space and communication systems. Moscow region is a large region which occupies an area of 46 thousand square kilometers. The region includes more than 38 municipalities (municipal districts, urban and rural settlements). The region’s population is more than 7 million people. Moscow oblast has a central location in the Russian Federation and a close relationship with the capital. This relationship with Moscow is manifested through common social, scientific, industrial and transport links, environmental protection problems and labor resources. In 2016, the total area of the housing stock in Moscow region was about 220 million sq. m. The total area of dilapidated and emergency housing stock is just

  3. Reconstruction-based Digital Dental Occlusion of the Partially Edentulous Dentition

    Science.gov (United States)

    Zhang, Jian; Xia, James J.; Li, Jianfu; Zhou, Xiaobo

    2016-01-01

    Partially edentulous dentition presents a challenging problem for the surgical planning of digital dental occlusion in the field of craniomaxillofacial surgery because of the incorrect maxillomandibular distance caused by missing teeth. We propose an innovative approach called Dental Reconstruction with Symmetrical Teeth (DRST) to achieve accurate dental occlusion for the partially edentulous cases. In this DRST approach, the rigid transformation between two symmetrical teeth existing on the left and right dental model is estimated through probabilistic point registration by matching the two shapes. With the estimated transformation, the partially edentulous space can be virtually filled with the teeth in its symmetrical position. Dental alignment is performed by digital dental occlusion reestablishment algorithm with the reconstructed complete dental model. Satisfactory reconstruction and occlusion results are demonstrated with the synthetic and real partially edentulous models. PMID:26584502

  4. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms has increased in recent years for solving real engineering problems that demand intense computation, especially when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts with evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  5. Multi-view Multi-sparsity Kernel Reconstruction for Multi-class Image Classification

    KAUST Repository

    Zhu, Xiaofeng; Xie, Qing; Zhu, Yonghua; Liu, Xingyi; Zhang, Shichao

    2015-01-01

    This paper addresses the problem of multi-class image classification by proposing a novel multi-view multi-sparsity kernel reconstruction (MMKR for short) model. Given images (including test images and training images) representing with multiple

  6. Resolving ambiguities in reconstructed grain maps using discrete tomography

    DEFF Research Database (Denmark)

    Alpers, A.; Knudsen, E.; Poulsen, H.F.

    2005-01-01

    reconstruct the image from diffraction data, but they are often unable to assign unambiguous values to all pixels. We present an approach that resolves these ambiguous pixels by using a Monte Carlo technique that exploits the discrete nature of the problem and utilizes proven methods of discrete tomography...

  7. Investigation on magnetoacoustic signal generation with magnetic induction and its application to electrical conductivity reconstruction

    International Nuclear Information System (INIS)

    Ma Qingyu; He Bin

    2007-01-01

    A theoretical study of magnetoacoustic signal generation with magnetic induction and its application to electrical conductivity reconstruction is conducted. An object with a concentric cylindrical geometry is located in a static magnetic field and a pulsed magnetic field. Driven by the Lorentz force generated by the static magnetic field, the magnetically induced eddy current produces acoustic vibration, and the propagated sound wave is received by a transducer around the object to reconstruct the corresponding electrical conductivity distribution of the object. A theory of magnetoacoustic waveform generation for a circularly symmetric model is provided as the forward problem. The explicit formulae and a quantitative algorithm for the electrical conductivity reconstruction are then presented as the inverse problem. Computer simulations were conducted to test the proposed theory and assess the performance of the inverse algorithms for a multi-layer cylindrical model. The simulation results confirm the validity of the proposed theory and suggest the feasibility of reconstructing the electrical conductivity distribution based on the proposed theory of magnetoacoustic signal generation with magnetic induction.

  8. Choice of reconstructed tissue properties affects interpretation of lung EIT images.

    Science.gov (United States)

    Grychtol, Bartłomiej; Adler, Andy

    2014-06-01

    Electrical impedance tomography (EIT) estimates an image of change in electrical properties within a body from stimulations and measurements at surface electrodes. There is significant interest in EIT as a tool to monitor and guide ventilation therapy in mechanically ventilated patients. In lung EIT, the EIT inverse problem is commonly linearized and only changes in electrical properties are reconstructed. Early algorithms reconstructed changes in resistivity, while most recent work using the finite element method reconstructs conductivity. Recently, we demonstrated that EIT images of ventilation can be misleading if the electrical contrasts within the thorax are not taken into account during the image reconstruction process. In this paper, we explore the effect of the choice of the reconstructed electrical properties (resistivity or conductivity) on the resulting EIT images. We show in simulation and experimental data that EIT images reconstructed with the same algorithm but with different parametrizations lead to large and clinically significant differences in the resulting images, which persist even after attempts to eliminate the impact of the parameter choice by recovering volume changes from the EIT images. Since there is no consensus among the most popular reconstruction algorithms and devices regarding the parametrization, this finding has implications for potential clinical use of EIT. We propose a program of research to develop reconstruction techniques that account for both the relationship between air volume and electrical properties of the lung and artefacts introduced by the linearization.

  9. Choice of reconstructed tissue properties affects interpretation of lung EIT images

    International Nuclear Information System (INIS)

    Grychtol, Bartłomiej; Adler, Andy

    2014-01-01

    Electrical impedance tomography (EIT) estimates an image of change in electrical properties within a body from stimulations and measurements at surface electrodes. There is significant interest in EIT as a tool to monitor and guide ventilation therapy in mechanically ventilated patients. In lung EIT, the EIT inverse problem is commonly linearized and only changes in electrical properties are reconstructed. Early algorithms reconstructed changes in resistivity, while most recent work using the finite element method reconstructs conductivity. Recently, we demonstrated that EIT images of ventilation can be misleading if the electrical contrasts within the thorax are not taken into account during the image reconstruction process. In this paper, we explore the effect of the choice of the reconstructed electrical properties (resistivity or conductivity) on the resulting EIT images. We show in simulation and experimental data that EIT images reconstructed with the same algorithm but with different parametrizations lead to large and clinically significant differences in the resulting images, which persist even after attempts to eliminate the impact of the parameter choice by recovering volume changes from the EIT images. Since there is no consensus among the most popular reconstruction algorithms and devices regarding the parametrization, this finding has implications for potential clinical use of EIT. We propose a program of research to develop reconstruction techniques that account for both the relationship between air volume and electrical properties of the lung and artefacts introduced by the linearization. (paper)

  10. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems.

    Science.gov (United States)

    Salleh, Faridah Hani Mohamed; Zainudin, Suhaila; Arif, Shereena M

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods has been the inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite active research on various GRN prediction methods, discussion of specific methods for solving problems related to cascade errors is still lacking. In fact, the experiments conducted in past studies were not specifically geared towards proving the ability of GRN prediction methods to avoid cascade errors. Hence, this research proposes Multiple Linear Regression (MLR) to infer GRNs from gene expression data while avoiding the inference of an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations in the real experimental datasets was far smaller than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error, using a novel experimental procedure proposed in this work. The experiments revealed that the number of cascade errors was very small, and the Belsley collinearity test showed that multicollinearity strongly affected the datasets used in this experiment. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5.
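    The cascade idea above can be sketched in a few lines (a hedged toy simulation, not the authors' pipeline): in a cascade A → B → C, a multiple linear regression of C on both A and B assigns the effect to B, so the indirect edge A → C is not called as direct.

```python
import random

random.seed(0)

# Toy cascade A -> B -> C (no direct A -> C edge); all coefficients assumed.
n = 500
A = [random.gauss(0, 1) for _ in range(n)]
B = [2.0 * a + random.gauss(0, 0.1) for a in A]   # B regulated by A
C = [1.5 * b + random.gauss(0, 0.1) for b in B]   # C regulated by B only

def solve(M, rhs):
    """Gaussian elimination with partial pivoting for a small dense system."""
    d = len(M)
    aug = [row[:] + [r] for row, r in zip(M, rhs)]
    for i in range(d):
        p = max(range(i, d), key=lambda r: abs(aug[r][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for r in range(d):
            if r != i:
                f = aug[r][i] / aug[i][i]
                aug[r] = [u - f * v for u, v in zip(aug[r], aug[i])]
    return [aug[i][d] / aug[i][i] for i in range(d)]

# Ordinary least squares for C ~ intercept + A + B via the normal equations.
X = [[1.0, a, b] for a, b in zip(A, B)]
XtX = [[sum(x[i] * x[j] for x in X) for j in range(3)] for i in range(3)]
Xty = [sum(x[i] * c for x, c in zip(X, C)) for i in range(3)]
intercept, coef_A, coef_B = solve(XtX, Xty)
print(coef_A, coef_B)   # coef_A close to 0, coef_B close to 1.5
```

    A pairwise-correlation method would link A and C strongly; the multivariate fit suppresses the indirect edge, which is the behaviour the paper exploits.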

  11. Convergence of iterative image reconstruction algorithms for Digital Breast Tomosynthesis

    DEFF Research Database (Denmark)

    Sidky, Emil; Jørgensen, Jakob Heide; Pan, Xiaochuan

    2012-01-01

    Most iterative image reconstruction algorithms are based on some form of optimization, such as minimization of a data-fidelity term plus an image regularizing penalty term. While achieving the solution of these optimization problems may not directly be clinically relevant, accurate optimization s...

  12. Greedy algorithms for diffuse optical tomography reconstruction

    Science.gov (United States)

    Dileep, B. P. V.; Das, Tapan; Dutta, Pranab K.

    2018-03-01

    Diffuse optical tomography (DOT) is a noninvasive imaging modality that reconstructs the optical parameters of a highly scattering medium. However, the inverse problem of DOT is ill-posed and highly nonlinear due to the zig-zag propagation of photons as they diffuse through the tissue cross section. Conventional DOT imaging methods iteratively invoke a forward diffusion-equation solver, which makes the problem computationally expensive, and they fail when the geometry is complex. Recently, the theory of compressive sensing (CS) has received considerable attention because of its efficient use in biomedical imaging applications. The objective of this paper is to solve a given DOT inverse problem within the compressive sensing framework; greedy algorithms such as orthogonal matching pursuit (OMP), compressive sampling matching pursuit (CoSaMP), stagewise orthogonal matching pursuit (StOMP), regularized orthogonal matching pursuit (ROMP) and simultaneous orthogonal matching pursuit (S-OMP) are studied to reconstruct the change in the absorption parameter, i.e., Δα, from the boundary data. The greedy algorithms are also validated experimentally on a paraffin-wax rectangular phantom through a well-designed experimental set-up, and the conventional DOT methods, the least-squares method and truncated singular value decomposition (TSVD), are studied for comparison. One of the main features of this work is the use of fewer source-detector pairs, which can facilitate the use of DOT in routine screening applications. Performance metrics such as mean square error (MSE), normalized mean square error (NMSE), structural similarity index (SSIM), and peak signal-to-noise ratio (PSNR) are used to evaluate the algorithms. Extensive simulation results confirm that CS-based DOT reconstruction outperforms the conventional DOT imaging methods in terms of
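    The first of the greedy algorithms named above, OMP, can be sketched on a tiny hand-built dictionary (illustrative numbers, not the paper's DOT sensing matrix): greedily pick the atom most correlated with the residual, then refit by least squares on the selected atoms.

```python
# Minimal orthogonal matching pursuit (OMP) sketch: recover a 2-sparse
# change vector from 4 measurements over a 6-atom dictionary (toy example).
Phi = [  # 4 measurements x 6 atoms
    [1.0, 0.0, 0.0, 0.0, 0.5,  0.5],
    [0.0, 1.0, 0.0, 0.0, 0.5, -0.5],
    [0.0, 0.0, 1.0, 0.0, 0.5,  0.5],
    [0.0, 0.0, 0.0, 1.0, 0.5, -0.5],
]
m, n, k = 4, 6, 2
x_true = [0.0] * n
x_true[4], x_true[1] = 1.0, -0.7              # sparse change (e.g. in absorption)
y = [sum(Phi[i][j] * x_true[j] for j in range(n)) for i in range(m)]

def col(j): return [Phi[i][j] for i in range(m)]
def dot(u, v): return sum(a * b for a, b in zip(u, v))

def solve(M, rhs):
    """Gaussian elimination with partial pivoting for a small dense system."""
    d = len(M)
    aug = [row[:] + [r] for row, r in zip(M, rhs)]
    for i in range(d):
        p = max(range(i, d), key=lambda r: abs(aug[r][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for r in range(d):
            if r != i:
                f = aug[r][i] / aug[i][i]
                aug[r] = [u - f * v for u, v in zip(aug[r], aug[i])]
    return [aug[i][d] / aug[i][i] for i in range(d)]

support, residual = [], y[:]
for _ in range(k):
    # greedy step: atom most correlated with the current residual
    j_best = max((j for j in range(n) if j not in support),
                 key=lambda j: abs(dot(col(j), residual)))
    support.append(j_best)
    # least-squares refit on the selected atoms (normal equations)
    G = [[dot(col(a), col(b)) for b in support] for a in support]
    coef = solve(G, [dot(col(a), y) for a in support])
    fit = [sum(c * col(j)[i] for c, j in zip(coef, support)) for i in range(m)]
    residual = [yi - fi for yi, fi in zip(y, fit)]

x_hat = [0.0] * n
for c, j in zip(coef, support):
    x_hat[j] = c
print(sorted(support), x_hat[4], x_hat[1])
```

    CoSaMP, StOMP and ROMP vary the selection rule (multiple atoms per pass, thresholding, regularized grouping), but share this correlate-then-refit structure.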

  13. Tomographic reconstruction of transverse phase space from turn-by-turn profile data

    CERN Document Server

    Hancock, S; Lindroos, M

    1999-01-01

    Tomographic methods have the potential for useful application in beam diagnostics. The tomographic reconstruction of transverse phase space density from turn-by-turn profile data has been studied with particular attention to the effects of dispersion and chromaticity. It is shown that the modified Algebraic Reconstruction Technique (ART) that deals successfully with the problem of non-linear motion in the longitudinal plane cannot, in general, be extended to cover the transverse case. Instead, an approach is proposed in which the effect of dispersion is deconvoluted from the measured profiles before the phase space picture is reconstructed using either the modified ART algorithm or the inverse Radon Transform. This requires an accurate knowledge of the momentum distribution of the beam and the modified ART reconstruction of longitudinal phase space density yields just such information. The method has been tested extensively with simulated data.
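    The modified ART used above is beyond a short sketch, but its core update (Kaczmarz's row-projection method) is easy to show on a tiny 2 × 2 "image" measured by its row and column sums (an illustrative toy, not the beam-diagnostics code):

```python
# Core ART/Kaczmarz update: project the estimate onto each ray equation in turn.
A = [[1.0, 1.0, 0.0, 0.0],   # row sums of the 2x2 image
     [0.0, 0.0, 1.0, 1.0],
     [1.0, 0.0, 1.0, 0.0],   # column sums
     [0.0, 1.0, 0.0, 1.0]]
x_true = [1.0, 2.0, 3.0, 4.0]
b = [sum(a * x for a, x in zip(row, x_true)) for row in A]

x = [0.0] * 4
for _ in range(200):                     # full sweeps over all rays
    for row, bi in zip(A, b):
        step = (bi - sum(a * xj for a, xj in zip(row, x))) / sum(a * a for a in row)
        x = [xj + step * a for xj, a in zip(x, row)]
print(x)
```

    Note that these four equations are rank-deficient (both totals agree), yet the iterates still converge: from a zero start Kaczmarz reaches the minimum-norm solution, which in this toy case coincides with the true image.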

  14. Fan-beam filtered-backprojection reconstruction without backprojection weight

    International Nuclear Information System (INIS)

    Dennerlein, Frank; Noo, Frederic; Hornegger, Joachim; Lauritsch, Guenter

    2007-01-01

    In this paper, we address the problem of two-dimensional image reconstruction from fan-beam data acquired along a full 2π scan. Conventional approaches that follow the filtered-backprojection (FBP) structure require a weighted backprojection with the weight depending on the point to be reconstructed and also on the source position; this weight appears only in the case of divergent beam geometries. Compared to reconstruction from parallel-beam data, the backprojection weight implies an increase in computational effort and is also thought to have some negative impacts on noise properties of the reconstructed images. We demonstrate here that direct FBP reconstruction from full-scan fan-beam data is possible with no backprojection weight. Using computer-simulated, realistic fan-beam data, we compared our novel FBP formula with no backprojection weight to the use of an FBP formula based on equal weighting of all data. Comparisons in terms of signal-to-noise ratio, spatial resolution and computational efficiency are presented. These studies show that the formula we suggest yields images with a reduced noise level, at almost identical spatial resolution. This effect increases quickly with the distance from the center of the field of view, from 0% at the center to 20% less noise at 20 cm, and to 40% less noise at 25 cm. Furthermore, the suggested method is computationally less demanding and reduces computation time with a gain that was found to vary between 12% and 43% on the computers used for evaluation

  15. Reconstructing source-sink dynamics in a population with a pelagic dispersal phase.

    Directory of Open Access Journals (Sweden)

    Kun Chen

    Full Text Available For many organisms, the reconstruction of source-sink dynamics is hampered by limited knowledge of the spatial assemblage of either the source or sink components, or by a lack of information on the strength of the linkage for any source-sink pair. In the case of marine species with a pelagic dispersal phase, these problems may be mitigated through the use of particle drift simulations based on an ocean circulation model. However, when simulated particle trajectories do not intersect sampling sites, the corroboration of model drift simulations with field data is hampered. Here, we apply a new statistical approach for reconstructing source-sink dynamics that overcomes the aforementioned problems. Our research is motivated by the need to understand observed changes in jellyfish distributions in the eastern Bering Sea since 1990. By contrasting the source-sink dynamics reconstructed with data from the pre-1990 period with those from the post-1990 period, it appears that changes in jellyfish distribution resulted from the combined effects of higher jellyfish productivity and longer dispersal of jellyfish caused by a shift in the ocean circulation starting in 1991. A sensitivity analysis suggests that the source-sink reconstruction is robust to typical systematic and random errors in the ocean circulation model driving the particle drift simulations. The jellyfish analysis illustrates that new insights can be gained by studying structural changes in source-sink dynamics. The proposed approach is applicable to the spatial source-sink reconstruction of other species and even abiotic processes, such as sediment transport.

  16. Faster PET reconstruction with a stochastic primal-dual hybrid gradient method

    KAUST Repository

    Ehrhardt, Matthias J.; Markiewicz, Pawel J.; Richtárik, Peter; Schott, Jonathan; Chambolle, Antonin; Schoenlieb, Carola-Bibiane


    2017-01-01

    Image reconstruction in positron emission tomography (PET) is computationally challenging due to Poisson noise, constraints and potentially non-smooth priors-let alone the sheer size of the problem. An algorithm that can cope well with the first

  17. A penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography.

    Science.gov (United States)

    Shang, Shang; Bai, Jing; Song, Xiaolei; Wang, Hongkai; Lau, Jaclyn

    2007-01-01

    The conjugate gradient method is known to be efficient for nonlinear optimization problems with high-dimensional data. In this paper, a penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography (FMT) is presented. The algorithm combines the linear and nonlinear conjugate gradient methods through a restart strategy, in order to exploit the advantages of both and compensate for their disadvantages. A quadratic penalty method is adopted to impose a nonnegativity constraint and reduce the ill-posedness of the problem. Simulation studies show that the presented algorithm is accurate, stable, and fast, and that it performs better than conventional conjugate-gradient-based reconstruction algorithms. It offers an effective approach to reconstructing fluorochrome information for FMT.
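    One ingredient of the method above, plain linear conjugate gradient, can be sketched on penalized normal equations (AᵀA + μI)x = Aᵀb. This is a stand-in only: the matrix is illustrative, a Tikhonov-style term replaces the paper's nonnegativity penalty, and the linear/nonlinear coupling with restarts is omitted.

```python
# Minimal linear conjugate gradient on (A^T A + mu*I) x = A^T b (toy data).
A = [[4.0, 1.0], [1.0, 3.0], [0.0, 1.0]]
b = [1.0, 2.0, 1.0]
mu = 0.1
m, n = 3, 2

H = [[sum(A[k][i] * A[k][j] for k in range(m)) + (mu if i == j else 0.0)
      for j in range(n)] for i in range(n)]
g = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]

x = [0.0] * n
r = g[:]                  # residual g - H*x with x = 0
p = r[:]
for _ in range(n):        # exact-arithmetic CG finishes in at most n steps
    Hp = [sum(H[i][j] * p[j] for j in range(n)) for i in range(n)]
    rr = sum(ri * ri for ri in r)
    alpha = rr / sum(pi * hpi for pi, hpi in zip(p, Hp))
    x = [xi + alpha * pi for xi, pi in zip(x, p)]
    r = [ri - alpha * hpi for ri, hpi in zip(r, Hp)]
    beta = sum(ri * ri for ri in r) / rr
    p = [ri + beta * pi for ri, pi in zip(r, p)]
print(x)
```

    Only matrix-vector products with H are needed, which is why CG scales to the large, implicitly defined systems that arise in FMT.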

  18. Maximum entropy reconstruction of poloidal magnetic field and radial electric field profiles in tokamaks

    Science.gov (United States)

    Chen, Yihang; Xiao, Chijie; Yang, Xiaoyi; Wang, Tianbo; Xu, Tianchao; Yu, Yi; Xu, Min; Wang, Long; Lin, Chen; Wang, Xiaogang

    2017-10-01

    The laser-driven ion beam trace probe (LITP) is a new diagnostic method for measuring the poloidal magnetic field (Bp) and radial electric field (Er) in tokamaks. LITP injects a laser-driven ion beam into the tokamak, and the Bp and Er profiles can be reconstructed using tomography methods. A reconstruction code has been developed to validate the LITP theory, and both 2D reconstruction of Bp and simultaneous reconstruction of Bp and Er have been attained. To reconstruct from experimental data with noise, maximum entropy and Gaussian-Bayesian tomography methods were applied and improved according to the characteristics of the LITP problem. With these improved methods, a reconstruction error level below 15% has been attained with a data noise level of 10%. These methods will be further tested and applied in upcoming LITP experiments. Supported by the ITER-CHINA program 2015GB120001, CHINA MOST under 2012YQ030142, and the National Natural Science Foundation of China under 11575014 and 11375053.
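    The entropy-flavored tomography mentioned above can be illustrated by multiplicative ART (MART), a classical scheme whose limit, for a consistent underdetermined system started from a uniform guess, is the entropy-optimal (minimum I-divergence) solution. This is a generic sketch with a toy system, not the LITP code.

```python
# MART: multiplicative row updates x_j <- x_j * (b_i / (A x)_i) ** A_ij.
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]
b = [2.0, 1.0]

x = [1.0, 1.0, 1.0]              # uniform (maximum-entropy) starting point
for _ in range(300):
    for row, bi in zip(A, b):
        ax = sum(a * xj for a, xj in zip(row, x))
        x = [xj * (bi / ax) ** a for xj, a in zip(x, row)]
print(x)   # -> approximately [1.4142, 0.5858, 0.4142]
```

    The limit satisfies both ray sums and the stationarity condition x₂ = x₁·x₃ of the entropy objective, giving (√2, 2-√2, √2-1); a plain least-squares solution of the same underdetermined system would be different.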

  19. (Re)Constructing the Wicked Problem Through the Visual and the Verbal

    DEFF Research Database (Denmark)

    Holm Jacobsen, Peter; Harty, Chris; Tryggestad, Kjell

    2016-01-01

    Wicked problems are open-ended and complex societal problems. There is a lack of empirical research into the dynamics and mechanisms that (re)construct problems to become wicked. This paper builds on an ethnographic study of a dialogue-based architect competition to do just that. The competition... processes create new knowledge and insights, but at the same time present new problems related to the ongoing verbal feedback. The design problem being (re)constructed appears as Heracles' fight with the Hydra: every time Heracles cuts off a head, two new heads grow back. The paper contributes to understanding... the relationship between the visual and the verbal (dialogue) in complex design processes in the early phases of large construction projects, and how the dynamic interplay between design visualization and verbal dialogue develops before the competition produces, or negotiates, a "winning design".

  20. A memory efficient method for fully three-dimensional object reconstruction with HAADF STEM

    International Nuclear Information System (INIS)

    Van den Broek, W.; Rosenauer, A.; Van Aert, S.; Sijbers, J.; Van Dyck, D.

    2014-01-01

    The conventional approach to object reconstruction through electron tomography is to reduce the three-dimensional problem to a series of independent two-dimensional slice-by-slice reconstructions. However, at atomic resolution the image of a single atom extends over many such slices and incorporating this image as prior knowledge in tomography or depth sectioning therefore requires a fully three-dimensional treatment. Unfortunately, the size of the three-dimensional projection operator scales highly unfavorably with object size and readily exceeds the available computer memory. In this paper, it is shown that for incoherent image formation the memory requirement can be reduced to the fundamental lower limit of the object size, both for tomography and depth sectioning. Furthermore, it is shown through multislice calculations that high angle annular dark field scanning transmission electron microscopy can be sufficiently incoherent for the reconstruction of single element nanocrystals, but that dynamical diffraction effects can cause classification problems if more than one element is present. - Highlights: • The full 3D approach to atomic resolution object retrieval has high memory load. • For incoherent imaging the projection process is a matrix–vector product. • Carrying out this product implicitly as Fourier transforms reduces memory load. • Reconstructions are demonstrated from HAADF STEM and depth sectioning simulations
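    The memory argument above can be illustrated in one dimension: under incoherent imaging the projection acts as a convolution, so the operator can be applied implicitly from the kernel alone (the paper uses Fourier transforms; a direct circular convolution stands in here), matching the explicit matrix while storing O(n) instead of O(n²) numbers. Kernel and object values are illustrative.

```python
# Matrix-free application of a convolution operator vs. the explicit matrix.
n = 8
kernel = [0.6, 0.3, 0.1] + [0.0] * (n - 3)   # assumed 1-D point-spread function
obj = [float(i) for i in range(n)]           # toy 1-D "object"

# Implicit operator: O(n) memory, just the kernel.
implicit = [sum(kernel[k] * obj[(i - k) % n] for k in range(n)) for i in range(n)]

# Explicit circulant matrix: O(n^2) memory.
C = [[kernel[(i - j) % n] for j in range(n)] for i in range(n)]
explicit = [sum(C[i][j] * obj[j] for j in range(n)) for i in range(n)]

assert all(abs(a - b) < 1e-12 for a, b in zip(implicit, explicit))
```

    In three dimensions the same identity lets the projection operator be applied at the memory cost of the object itself, which is the paper's fundamental lower limit.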

  1. Reconstruction of hyperspectral image using matting model for classification

    Science.gov (United States)

    Xie, Weiying; Li, Yunsong; Ge, Chiru

    2016-05-01

    Although hyperspectral images (HSIs) captured by satellites provide much information in spectral regions, some bands are redundant or have large amounts of noise, which are not suitable for image analysis. To address this problem, we introduce a method for reconstructing the HSI with noise reduction and contrast enhancement using a matting model for the first time. The matting model refers to each spectral band of an HSI that can be decomposed into three components, i.e., alpha channel, spectral foreground, and spectral background. First, one spectral band of an HSI with more refined information than most other bands is selected, and is referred to as an alpha channel of the HSI to estimate the hyperspectral foreground and hyperspectral background. Finally, a combination operation is applied to reconstruct the HSI. In addition, the support vector machine (SVM) classifier and three sparsity-based classifiers, i.e., orthogonal matching pursuit (OMP), simultaneous OMP, and OMP based on first-order neighborhood system weighted classifiers, are utilized on the reconstructed HSI and the original HSI to verify the effectiveness of the proposed method. Specifically, using the reconstructed HSI, the average accuracy of the SVM classifier can be improved by as much as 19%.
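    The matting model above writes each spectral band as band = α·foreground + (1 − α)·background. A minimal recombination sketch with illustrative per-pixel values (not real HSI data):

```python
# Matting-model recombination: band = alpha*F + (1 - alpha)*B per pixel.
alpha      = [0.9, 0.2, 0.7, 0.5]   # assumed alpha channel (one value per pixel)
foreground = [0.8, 0.6, 0.9, 0.7]   # spectral foreground
background = [0.1, 0.3, 0.2, 0.2]   # spectral background

reconstructed = [a * f + (1 - a) * b
                 for a, f, b in zip(alpha, foreground, background)]
print(reconstructed)
```

    In the paper the decomposition runs the other way: a refined band serves as the alpha channel from which foreground and background are estimated, and this combination step then rebuilds each band with reduced noise.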

  2. A three-step reconstruction method for fluorescence molecular tomography based on compressive sensing

    DEFF Research Database (Denmark)

    Zhu, Yansong; Jha, Abhinav K.; Dreyer, Jakob K.

    2017-01-01

    Fluorescence molecular tomography (FMT) is a promising tool for real time in vivo quantification of neurotransmission (NT) as we pursue in our BRAIN initiative effort. However, the acquired image data are noisy and the reconstruction problem is ill-posed. Further, while spatial sparsity of the NT...... matrix coherence. The resultant image data are input to a homotopy-based reconstruction strategy that exploits sparsity via ℓ1 regularization. The reconstructed image is then input to a maximum-likelihood expectation maximization (MLEM) algorithm that retains the sparseness of the input estimate...... and improves upon the quantitation by accurate Poisson noise modeling. The proposed reconstruction method was evaluated in a three-dimensional simulated setup with fluorescent sources in a cuboidal scattering medium with optical properties simulating human brain cortex (reduced scattering coefficient: 9.2 cm-1...
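    The final MLEM step above can be sketched generically (Poisson MLEM on a toy system, not the authors' three-step pipeline): the update is multiplicative, so voxels that enter at zero stay at zero, which is why MLEM "retains the sparseness of the input estimate".

```python
# Generic MLEM update: x_j <- x_j / s_j * sum_i A_ij * y_i / (A x)_i.
A = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.2, 0.7]]
x_true = [5.0, 0.0, 3.0]                                    # sparse source (toy)
y = [sum(a * x for a, x in zip(row, x_true)) for row in A]  # noise-free means

x = [1.0, 0.0, 1.0]          # sparse input estimate (as from an l1 step)
s = [sum(A[i][j] for i in range(3)) for j in range(3)]      # sensitivities
for _ in range(2000):
    Ax = [sum(a * xj for a, xj in zip(row, x)) for row in A]
    x = [x[j] / s[j] * sum(A[i][j] * y[i] / Ax[i] for i in range(3))
         for j in range(3)]
print(x)   # x[1] stays exactly 0; x[0] and x[2] approach 5 and 3
```

    All iterates remain nonnegative by construction, so no separate constraint handling is needed, and with Poisson data the update increases the likelihood at every step.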

  3. Tomographic ventricular reconstruction using multiple view first-pass radionuclide angiography

    International Nuclear Information System (INIS)

    Lacy, J.L.; Ball, M.E.; Verani, M.S.; Wiles, H.; Roberts, R.

    1985-01-01

    In first-pass radionuclide angiography (FPRA), images of both the left and right ventricles are uncontaminated by adjacent structures. The problem of tomographic reconstruction is thus vastly simplified compared to equilibrium blood-pool imaging, in which all structures are imaged simultaneously, and tomographic reconstruction from a limited number of views may be possible. A simple filtered interpolative back-projection reconstruction technique was employed, in which interpolation was used between sectional distributions at successive angles. Interpolations yielding 9 and 13 back-projection angles, at spacings of 22.5° and 15°, were evaluated. Ventricular borders were obtained in each back-projected tomographic slice by locating the intensity level that gave the correct total ventricular volume. Cast cross sections were quantitatively well represented by these borders. This ventricular border definition algorithm forms the basis for applications of the technique in animals and humans

  4. Breast reconstruction after mastectomy

    Directory of Open Access Journals (Sweden)

    Daniel eSchmauss

    2016-01-01

    Full Text Available Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in recent decades; however, the overall number of breast reconstructions has increased significantly. Nowadays breast reconstruction should be highly individualized, taking into consideration first of all the oncological aspects of the tumor, neo-/adjuvant treatment and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient's condition and wishes. This article gives an overview of the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction.

  5. Breast reconstruction - implants

    Science.gov (United States)

    Breast implants surgery; Mastectomy - breast reconstruction with implants; Breast cancer - breast reconstruction with implants ... harder to find a tumor if your breast cancer comes back. Getting breast implants does not take as long as breast reconstruction ...

  6. A simple method to take urethral sutures for neobladder reconstruction and radical prostatectomy

    Directory of Open Access Journals (Sweden)

    B Satheesan

    2007-01-01

    Full Text Available For the reconstruction of the urethro-vesical anastomosis after radical prostatectomy and for neobladder reconstruction, taking adequate sutures that include the urethral mucosa is vital. Due to retraction of the urethra and an unfriendly pelvis, the process of taking satisfactory urethral sutures may be laborious. Here, we describe a simple method by which we could overcome such technical problems during surgery, using the Foley catheter as a guide for the suture.

  7. NCT-ART - a neutron computer tomography code based on the algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Krueger, A.

    1988-01-01

    A computer code is presented which calculates two-dimensional cuts of material assemblies from a number of neutron radiographic projections. Mathematically, the reconstruction is performed by an iterative solution of a system of linear equations. If the system is fully determined, clear pictures are obtained. Even for an underdetermined system (a low number of projections), reasonable pictures are reconstructed, but picture artefacts and convergence problems then occur increasingly. (orig.) With 37 figs. [de]

  8. Time Reversal Reconstruction Algorithm Based on PSO Optimized SVM Interpolation for Photoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Mingjian Sun

    2015-01-01

    Full Text Available Photoacoustic imaging is an innovative technique for imaging biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on particle swarm optimization (PSO) optimized support vector machine (SVM) interpolation is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of time reversal algorithms based on nearest-neighbor, linear, or cubic convolution interpolation, providing higher imaging quality with significantly fewer measurement positions or scanning times.

  9. Maxillary Tuberosity Reconstruction with Transport Distraction Osteogenesis

    Directory of Open Access Journals (Sweden)

    F. Ugurlu

    2012-01-01

    Full Text Available Severe bone loss due to pathology in the maxillary tuberosity region is a challenging problem both surgically and prosthetically. Large bone grafts have a poor survival rate due to the delicate bony architecture in this area and the presence of the maxillary sinus. Our case presentation describes a new technique for reconstructing a severe bony defect in the maxillary tuberosity with horizontal distraction osteogenesis in a 45-year-old man. A 4 × 6 × 3 cm cyst was discovered in the left maxillary molar region and enucleated. Three months postoperatively, the area had a severe bone defect extending to the zygomatic buttress superiorly and the hamular notch posteriorly. Three months later, a bone segment including the right upper second premolar was osteotomised and distracted horizontally. The bone segment was distracted 15 mm distally. After consolidation, implants were placed when the distractor was removed. A fixed denture was loaded over the implants after 3 months. Complete alveolar bone loss extending to the cranial base can be reconstructed with transport distraction osteogenesis. Distalisation of the alveolar bone segment adjacent to the bony defect is an easy method for reconstructing such severe defects.

  10. Born iterative reconstruction using perturbed-phase field estimates.

    Science.gov (United States)

    Astheimer, Jeffrey P; Waag, Robert C

    2008-10-01

    A method of image reconstruction from scattering measurements for use in ultrasonic imaging is presented. The method employs distorted-wave Born iteration but does not require using a forward-problem solver or solving large systems of equations. These calculations are avoided by limiting intermediate estimates of medium variations to smooth functions in which the propagated fields can be approximated by phase perturbations derived from variations in a geometric path along rays. The reconstruction itself is formed by a modification of the filtered-backpropagation formula that includes correction terms to account for propagation through an estimated background. Numerical studies that validate the method for parameter ranges of interest in medical applications are presented. The efficiency of this method offers the possibility of real-time imaging from scattering measurements.

  11. Three Dimensional Dynamic Model Based Wind Field Reconstruction from Lidar Data

    International Nuclear Information System (INIS)

    Raach, Steffen; Schlipf, David; Haizmann, Florian; Cheng, Po Wen

    2014-01-01

    Using the inflowing horizontal and vertical wind shears for individual pitch control is a promising method if blade bending measurements are not available. Due to the limited information provided by a lidar system, the reconstruction of shears in real time is a challenging task, especially for the horizontal shear in the presence of changing wind direction. The internal model principle has been shown to be a promising approach for estimating the shears and directions in 10-minute averages from real measurement data. In this work, the static model-based wind vector field reconstruction is extended with a dynamic reconstruction model based on Taylor's frozen turbulence hypothesis. The presented method provides time series over several seconds of the wind speed, shears and direction, which can be used directly in advanced optimal preview control; this work is therefore an important step towards the application of preview individual blade pitch control under realistic wind conditions. The method is tested using a turbulent wind field and a detailed lidar simulator. In the simulation, the turbulent wind field structure flows towards the lidar system and is continuously misaligned with respect to the horizontal axis of the wind turbine, with Taylor's frozen turbulence hypothesis used to model the wind evolution. For the reconstruction, the structure is discretized into several stages, each reduced to an effective wind speed superposed with a linear horizontal and vertical wind shear, and previous lidar measurements are shifted using, again, Taylor's hypothesis. The wind field reconstruction problem is then formulated as a nonlinear optimization problem that minimizes the residual between the assumed wind model and the lidar measurements, to obtain the misalignment angle, the effective wind speed and the wind shears for each stage. This method shows good results in reconstructing the wind characteristics of a three
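    The per-stage model above (an effective speed plus linear horizontal and vertical shears) can be sketched as a plain least-squares fit to sample points; the sampling geometry and wind parameters are illustrative, and the paper's full nonlinear optimization over misalignment is omitted.

```python
# Fit u(y, z) = u0 + sh*y + sv*z to lidar-style samples by least squares.
points = [(-20.0, -10.0), (-20.0, 10.0), (0.0, 0.0), (20.0, -10.0), (20.0, 10.0)]
u0_true, shear_h, shear_v = 12.0, 0.05, 0.12     # assumed wind parameters
meas = [u0_true + shear_h * yy + shear_v * zz for yy, zz in points]

def solve(M, rhs):
    """Gaussian elimination with partial pivoting for a small dense system."""
    d = len(M)
    aug = [row[:] + [r] for row, r in zip(M, rhs)]
    for i in range(d):
        p = max(range(i, d), key=lambda r: abs(aug[r][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for r in range(d):
            if r != i:
                f = aug[r][i] / aug[i][i]
                aug[r] = [u - f * v for u, v in zip(aug[r], aug[i])]
    return [aug[i][d] / aug[i][i] for i in range(d)]

# Normal equations for the parameters [u0, sh, sv].
X = [[1.0, yy, zz] for yy, zz in points]
G = [[sum(x[i] * x[j] for x in X) for j in range(3)] for i in range(3)]
rhs = [sum(x[i] * mm for x, mm in zip(X, meas)) for i in range(3)]
u0, sh, sv = solve(G, rhs)
print(u0, sh, sv)
```

    In the paper each lidar stage contributes such a fit, and the stages are coupled through the misalignment angle, turning the overall problem nonlinear.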

  12. Factors associated with returning to football after anterior cruciate ligament reconstruction.

    Science.gov (United States)

    Sandon, Alexander; Werner, Suzanne; Forssblad, Magnus

    2015-09-01

    The aim of the present investigation was to identify possible factors associated with returning to football, on average 3.2 ± 1.4 years after anterior cruciate ligament (ACL) reconstruction, in both male and female football players. The players were recruited from a patient database of football players who had undergone ACL reconstruction between 2004 and 2007 at the Capio Artro Clinic, Sophiahemmet, in Stockholm, Sweden. Special attention was paid to gender, age, type of graft for ACL reconstruction, associated injuries, anterior knee laxity, thigh muscle torques, and symptoms/problems during and/or after physical activity. In the beginning of the summer of 2009, 205 players (37.9%) out of 541 filled out a questionnaire designed to evaluate physical activity and knee function in a sports-specific setting. A detailed dropout analysis showed that females responded to a higher degree than males; no other significant differences between responders and non-responders were found. Fifty-four per cent (n = 111) had returned to football and 46% (n = 94) had not. Using logistic regression analyses, we found that female gender (p = 0.036, OR 0.518), cartilage injury (p = 0.013, OR 0.368), and pain during physical activity (p = 0.002, OR 0.619) were significant negative predictors for returning to football after ACL reconstruction and rehabilitation. For players with all three significant factors, only 10% returned to football, compared to 76.5% of those without any of these factors. Female gender, cartilage injury, and knee pain during physical activity were thus independent negative predictors for returning to football after ACL reconstruction. At a mean follow-up of 3.2 ± 1.4 years after ACL reconstruction, pain during physical activity was the most commonly reported symptom/problem in football players. The clinical relevance of this study is to improve the treatment of ACL-injured football players, focusing on female gender and knee pain. Furthermore
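    The arithmetic behind the reported odds ratios can be sketched as follows. This is only the textbook logistic-model idealization in which the three adjusted ORs act multiplicatively on the baseline odds; it is not the authors' model fit, and the result differs from the observed 10% precisely because the ORs are adjusted within the full regression.

```python
# Combine the paper's three odds ratios with the observed baseline return rate
# under the multiplicative (logistic) idealization.
or_female, or_cartilage, or_pain = 0.518, 0.368, 0.619
baseline_p = 0.765                               # return rate with none of the factors
baseline_odds = baseline_p / (1.0 - baseline_p)
combined_odds = baseline_odds * or_female * or_cartilage * or_pain
p_all_three = combined_odds / (1.0 + combined_odds)
print(round(p_all_three, 3))   # 0.278 under this idealization, vs. 10% observed
```

    The gap between the idealized prediction and the observed rate is a reminder that adjusted odds ratios do not simply multiply onto raw subgroup rates.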

  13. Adaptive algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Lu Wenkai; Yin Fangfang

    2004-01-01

    Algebraic reconstruction techniques (ART) are iterative procedures for reconstructing objects from their projections. It is proven that ART can be computationally efficient by carefully arranging the order in which the collected data are accessed during the reconstruction procedure and adaptively adjusting the relaxation parameters. In this paper, an adaptive algebraic reconstruction technique (AART), which adopts the same projection access scheme in multilevel scheme algebraic reconstruction technique (MLS-ART), is proposed. By introducing adaptive adjustment of the relaxation parameters during the reconstruction procedure, one-iteration AART can produce reconstructions with better quality, in comparison with one-iteration MLS-ART. Furthermore, AART outperforms MLS-ART with improved computational efficiency

  14. Extra Facial Landmark Localization via Global Shape Reconstruction

    Directory of Open Access Journals (Sweden)

    Shuqiu Tan

    2017-01-01

    Full Text Available Localizing facial landmarks is a popular topic in the field of face analysis. However, problems arising in practical applications, such as handling pose variations and partial occlusions while maintaining a moderate training model size and computational efficiency, still challenge current solutions. In this paper, we present a global shape reconstruction method for locating extra facial landmarks beyond those used in the training phase. In the proposed method, the reduced configuration of facial landmarks is first decomposed into the corresponding sparse coefficients. Explicit face shape correlations are then exploited to regress between the sparse coefficients of different facial landmark configurations. Finally, extra facial landmarks are reconstructed by combining the pretrained shape dictionary with the approximation of the sparse coefficients. By applying the proposed method, both the training time and the model size of a class of methods that stack local evidence as an appearance descriptor can be scaled down with only a minor compromise in detection accuracy. Extensive experiments show that the proposed method is feasible and is able to reconstruct extra facial landmarks even under very asymmetrical face poses.
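    The decompose-then-synthesize idea above can be sketched in its simplest linear form: fit basis coefficients from the observed (reduced) landmark coordinates, then evaluate the full basis to recover landmarks never observed. A tiny dense basis stands in for the paper's sparse dictionary, and all numbers are illustrative.

```python
# Reconstruct unobserved landmarks from a reduced configuration via a shape basis.
D = [[1.0, 0.0],     # rows: landmark coordinates; columns: basis shapes
     [1.0, 0.5],
     [1.0, 1.0],
     [1.0, 1.5]]
coef_true = [2.0, 0.8]
full_true = [sum(d * c for d, c in zip(row, coef_true)) for row in D]

observed_idx = [0, 2]                       # the reduced configuration
obs = [full_true[i] for i in observed_idx]

# Fit the coefficients from the observed rows (2 equations, 2 unknowns here).
a, b = D[observed_idx[0]], D[observed_idx[1]]
det = a[0] * b[1] - a[1] * b[0]
coef = [(obs[0] * b[1] - a[1] * obs[1]) / det,
        (a[0] * obs[1] - obs[0] * b[0]) / det]

# Synthesize every landmark, including the two that were never observed.
full_hat = [sum(d * c for d, c in zip(row, coef)) for row in D]
print(full_hat)
```

    The paper replaces this dense two-column basis with a learned dictionary and ℓ1-sparse coefficients, and learns a regression between coefficient spaces of different configurations, but the synthesis step has the same structure.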

  15. Image reconstruction from limited angle Compton camera data

    International Nuclear Information System (INIS)

    Tomitani, T.; Hirasawa, M.

    2002-01-01

    The Compton camera is used for imaging γ-ray direction distributions in γ-ray telescopes for astrophysics and for imaging radioisotope distributions in nuclear medicine, in both cases without the need for collimators. The camera measures the integral of γ rays over a cone, so some sort of inversion method is needed. Parra found an analytical inversion algorithm based on a spherical harmonics expansion of the projection data. His algorithm is applicable to the full set of projection data. In this paper, six possible reconstruction algorithms that allow image reconstruction from projections with a finite range of scattering angles are investigated. Four algorithms have instability problems and the other two are practical. However, the variance of the reconstructed image diverges in these two cases, so window functions are introduced with which the variance becomes finite at a cost of spatial resolution. These two algorithms are compared in terms of variance. The algorithm based on inversion of the summed back-projection is superior to the algorithm based on inversion of the summed projection. (author)

  16. Electromagnetic three-dimensional reconstruction of targets from free space experimental data

    International Nuclear Information System (INIS)

    Geffrin, J.-M.; Chaumet, P. C.; Eyraud, C.; Belkebir, K.; Sabouroux, P.

    2008-01-01

    This paper deals with the problem of reconstructing the relative permittivity of three-dimensional targets from experimental scattered fields. The fields concerned were measured in an anechoic chamber on the surface of a sphere surrounding the target. The inverse scattering problem is reformulated as an optimization problem that is solved iteratively with a conjugate gradient method, using the coupled dipoles method as the forward solver. The measurement technique and the inversion procedure are briefly described, along with the inversion results. This work demonstrates the reliability of the experiments and the efficiency of the proposed inverse scattering scheme.

  17. Improved convergence of gradient-based reconstruction using multi-scale models

    International Nuclear Information System (INIS)

    Cunningham, G.S.; Hanson, K.M.; Koyfman, I.

    1996-01-01

    Geometric models have received increasing attention in medical imaging for tasks such as segmentation, reconstruction, restoration, and registration. In order to determine the best configuration of the geometric model in the context of any of these tasks, one needs to perform a difficult global optimization of an energy function that may have many local minima. Explicit models of geometry, also called deformable models, snakes, or active contours, have been used extensively to solve image segmentation problems in a non-Bayesian framework. Researchers have seen empirically that multi-scale analysis is useful for convergence to a configuration that is near the global minimum. In this type of analysis, the image data are convolved with blur functions of increasing resolution, and an optimal configuration of the snake is found for each blurred image. The configuration obtained using the highest resolution blur is used as the solution to the global optimization problem. In this article, the authors use explicit models of geometry for a variety of Bayesian estimation problems, including image segmentation, reconstruction and restoration. The authors introduce a multi-scale approach that blurs the geometric model, rather than the image data, and show that this approach turns a global, highly nonquadratic optimization into a sequence of local, approximately quadratic problems that converge to the global minimum. The result is a deterministic, robust, and efficient optimization strategy applicable to a wide variety of Bayesian estimation problems in which geometric models of images are an important component

  18. Efficient reconstruction of dispersive dielectric profiles using time domain reflectometry (TDR)

    Directory of Open Access Journals (Sweden)

    P. Leidenberger

    2006-01-01

    Full Text Available We present a numerical model for time domain reflectometry (TDR) signal propagation in dispersive dielectric materials. The numerical probe model is terminated with a parallel circuit, consisting of an ohmic resistor and an ideal capacitance. We derive analytical approximations for the capacitance, the inductance and the conductance of three-wire probes. We couple the time domain model with global optimization in order to reconstruct water content profiles from TDR traces. For efficiently solving the inverse problem we use genetic algorithms combined with a hierarchical parameterization. We investigate the performance of the method by reconstructing synthetically generated profiles. The algorithm is then applied to retrieve dielectric profiles from TDR traces measured in the field. We succeed in reconstructing dielectric and ohmic profiles where conventional methods, based on travel time extraction, fail.
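
The coupling of a forward model with a genetic-algorithm search can be sketched as follows. The forward model here is a deliberately crude stand-in (a smoothing filter rather than actual TDR wave propagation), and all parameter choices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(profile):
    """Stand-in forward model: the real one propagates a TDR signal through
    a dispersive medium; here a simple smoothing filter plays that role."""
    return np.convolve(profile, np.ones(3) / 3, mode='same')

target = np.concatenate([np.full(8, 5.0), np.full(8, 20.0)])  # sharp wetting front
trace = forward(target)                                       # synthetic "TDR trace"

def misfit(p):
    return float(np.sum((forward(p) - trace) ** 2))

# Minimal genetic algorithm: elitism, blend crossover, Gaussian mutation.
pop = rng.uniform(1.0, 30.0, size=(60, 16))
for _ in range(500):
    order = np.argsort([misfit(p) for p in pop])
    elite = pop[order[:20]]                        # keep the 20 fittest unchanged
    parents = elite[rng.integers(0, 20, size=(40, 2))]
    w = rng.random((40, 16))
    children = w * parents[:, 0] + (1 - w) * parents[:, 1]
    children += rng.normal(0.0, 0.3, children.shape)
    pop = np.vstack([elite, np.clip(children, 1.0, 30.0)])
best = pop[np.argmin([misfit(p) for p in pop])]
```

A hierarchical parameterization, as in the paper, would refine the profile grid between GA runs instead of fixing 16 samples from the start.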

  19. Flap Lymphedema after Successful Reconstruction of the Chronic Inguinal Wound with a Vertical Rectus Abdominis Flap (VRAM)

    Directory of Open Access Journals (Sweden)

    Yalcin Kulahci

    2012-07-01

    Full Text Available The reconstruction of extensive and complex wounds represents a challenging problem for the reconstructive surgeon. The reconstructive options to provide coverage following debridement of these complicated wounds are local flaps, distant flaps, or free tissue transfer. Vertical rectus abdominis flaps have been used successfully to repair defects in the groin, hip, perineal, trunk, and breast regions. We encountered flap lymphedema after successful reconstruction of a chronic inguinal wound with a vertical rectus abdominis (VRAM) flap. As far as we were able to ascertain, there is no report in the literature related to flap lymphedema.

  20. Reconstruction of radionecrotic ulcer using a myocutaneous flap

    International Nuclear Information System (INIS)

    Takahashi, Hiroyuki; Okano, Shinji; Iwasaki, Yasumasa; Mori, Tamotsu; Miyamoto, Yoshihiro; Shigeki, Sadayuki

    1990-01-01

    Problems in the surgical treatment of radionecrotic ulcers, using a myocutaneous flap, have been reviewed in 21 patients. These problems included poor wound healing, radiation damage to important nerves and vessels thereby making dissection difficult, malignant changes, infections, continuing necrosis of the tissue, bleeding during surgery, and secondary hemorrhaging. The use of a myocutaneous flap has many advantages when compared with conventional flaps and free skin grafts in the reconstruction of radionecrotic ulcers. Flap survival was good, but an incomplete excision of the ulcer delayed primary wound healing. Therefore, complete excision of the radionecrotic ulcer is imperative. (author)

  1. Reconstruction of Undersampled Big Dynamic MRI Data Using Non-Convex Low-Rank and Sparsity Constraints

    Directory of Open Access Journals (Sweden)

    Ryan Wen Liu

    2017-03-01

    Full Text Available Dynamic magnetic resonance imaging (MRI) has been extensively utilized for enhancing medical living environment visualization; however, in clinical practice it often suffers from long data acquisition times. Dynamic imaging essentially reconstructs the visual image from raw (k,t)-space measurements, commonly referred to as big data. The purpose of this work is to accelerate big medical data acquisition in dynamic MRI by developing a non-convex minimization framework. In particular, to overcome the inherent speed limitation, both non-convex low-rank and sparsity constraints were combined to accelerate the dynamic imaging. However, the non-convex constraints make the dynamic reconstruction problem difficult to solve directly with the commonly-used numerical methods. To guarantee solution efficiency and stability, a numerical algorithm based on the Alternating Direction Method of Multipliers (ADMM) is proposed to solve the resulting non-convex optimization problem. ADMM decomposes the original complex optimization problem into several simple sub-problems. Each sub-problem has a closed-form solution or can be efficiently solved using existing numerical methods. It has been proven that the quality of images reconstructed from fewer measurements can be significantly improved using non-convex minimization. Numerous experiments have been conducted on two in vivo cardiac datasets to compare the proposed method with several state-of-the-art imaging methods. Experimental results illustrated that the proposed method could guarantee superior imaging performance in terms of quantitative and visual image quality assessments.
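
ADMM's split-and-alternate structure can be illustrated on a related convex low-rank-plus-sparse decomposition (not the paper's non-convex dynamic-MRI formulation; matrix sizes and parameters are illustrative). Each sub-problem has the closed-form proximal solution the abstract alludes to:

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entrywise soft thresholding: prox of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_admm(M, lam, rho=1.0, n_iter=300):
    """ADMM for: min ||L||_* + lam*||S||_1  s.t.  L + S = M."""
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / rho, 1.0 / rho)      # L-subproblem (closed form)
        S = soft(M - L + Y / rho, lam / rho)     # S-subproblem (closed form)
        Y += rho * (M - L - S)                   # dual ascent on the constraint
    return L, S

rng = np.random.default_rng(0)
L_true = np.outer(rng.normal(size=20), rng.normal(size=20))     # rank-1 component
S_true = np.zeros((20, 20))
S_true[rng.integers(0, 20, 15), rng.integers(0, 20, 15)] = 5.0  # sparse spikes
M = L_true + S_true
L, S = rpca_admm(M, lam=1.0 / np.sqrt(20))
```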

  2. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n-qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
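
The projected-gradient idea can be sketched for single-qubit MLE with noiseless Pauli data: ascend the log-likelihood, then project back onto the density-matrix set (positive semidefinite, unit trace) by projecting the eigenvalues onto the probability simplex. This is a minimal plain-gradient illustration, not the accelerated scheme of the paper:

```python
import numpy as np

def proj_density(H):
    """Project a Hermitian matrix onto density matrices (PSD, trace 1)
    by projecting its eigenvalues onto the probability simplex."""
    w, V = np.linalg.eigh(H)
    u = np.sort(w)[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, len(u) + 1)
    k = ks[u - (css - 1.0) / ks > 0].max()
    w = np.maximum(w - (css[k - 1] - 1.0) / k, 0.0)
    return (V * w) @ V.conj().T

# Pauli measurements: projectors onto the +/- eigenstates of X, Y, Z.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
E = [(I2 + P) / 2 for P in (sx, sy, sz)] + [(I2 - P) / 2 for P in (sx, sy, sz)]

rho_true = np.array([[0.85, 0.3], [0.3, 0.15]], dtype=complex)
f = np.array([np.trace(Ek @ rho_true).real for Ek in E]) / 3  # noiseless freqs

rho = I2 / 2                        # maximally mixed starting point
for _ in range(500):                # plain projected gradient ascent
    p = np.array([np.trace(Ek @ rho).real for Ek in E])
    grad = sum(fk / pk * Ek for fk, pk, Ek in zip(f, p, E))
    rho = proj_density(rho + 0.1 * grad)
```

With exact frequencies the MLE is the true state, so the iteration converges back to `rho_true`; the paper's accelerated variant adds momentum to this same ascend-then-project loop.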

  3. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems

    Directory of Open Access Journals (Sweden)

    Faridah Hani Mohamed Salleh

    2017-01-01

    Full Text Available Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods has been inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, discussion of specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted in past studies were not specifically geared towards proving the ability of GRN prediction methods to avoid the occurrence of cascade errors. Hence, this research aims to propose Multiple Linear Regression (MLR) to infer GRNs from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations in the real experiment datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure proposed in this work. The experiment revealed that the number of cascade errors was very minimal. Apart from that, the Belsley collinearity test proved that multicollinearity did affect the datasets used in this experiment greatly. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5.
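
The cascade-error scenario can be reproduced in a few lines of synthetic data: in the chain A → B → C, pairwise correlation links A and C, but a joint multiple linear regression of C on (A, B) assigns A a near-zero partial weight (a toy illustration, not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.normal(size=n)                    # upstream gene
B = 0.9 * A + 0.3 * rng.normal(size=n)    # regulated by A
C = 0.9 * B + 0.1 * rng.normal(size=n)    # regulated by B only

# Pairwise correlation links A and C strongly: the classic cascade error.
r_AC = float(np.corrcoef(A, C)[0, 1])

# Joint MLR of C on (A, B): the partial weight of A collapses toward zero,
# so the indirect A -> C edge is not called a direct interaction.
coef, *_ = np.linalg.lstsq(np.column_stack([A, B]), C, rcond=None)
```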

  4. Self-expressive Dictionary Learning for Dynamic 3D Reconstruction.

    Science.gov (United States)

    Zheng, Enliang; Ji, Dinghuang; Dunn, Enrique; Frahm, Jan-Michael

    2017-08-22

    We target the problem of sparse 3D reconstruction of dynamic objects observed by multiple unsynchronized video cameras with unknown temporal overlap. To this end, we develop a framework to recover the unknown structure without sequencing information across video sequences. Our proposed compressed sensing framework poses the estimation of 3D structure as a problem of dictionary learning, where the dictionary is defined as an aggregation of the temporally varying 3D structures. Given the smooth motion of dynamic objects, we observe that any element in the dictionary can be well approximated by a sparse linear combination of other elements in the same dictionary (i.e. self-expression). Our formulation optimizes a biconvex cost function that leverages a compressed sensing formulation and enforces both structural dependency coherence across video streams and motion smoothness across estimates from common video sources. We further analyze the reconstructability of our approach under different capture scenarios, and its comparison and relation to existing methods. Experimental results on large amounts of synthetic data as well as real imagery demonstrate the effectiveness of our approach.

  5. Control-oriented modeling of the plasma particle density in tokamaks and application to real-time density profile reconstruction

    NARCIS (Netherlands)

    Blanken, T.C.; Felici, F.; Rapson, C.J.; de Baar, M.R.; Heemels, W.P.M.H.

    2018-01-01

    A model-based approach to real-time reconstruction of the particle density profile in tokamak plasmas is presented, based on a dynamic state estimator. Traditionally, the density profile is reconstructed in real-time by solving an ill-conditioned inversion problem using a measurement at a single

  6. Reconstructing perceived faces from brain activations with deep adversarial neural decoding

    NARCIS (Netherlands)

    Güçlütürk, Y.; Güçlü, U.; Seeliger, K.; Bosch, S.E.; Lier, R.J. van; Gerven, M.A.J. van; Guyon, I.; Luxburg, U.V.; Bengio, S.; Wallach, H.; Fergus, R.; Vishwanathan, S.; Garnett, R.

    2017-01-01

    Here, we present a novel approach to solve the problem of reconstructing perceived stimuli from brain responses by combining probabilistic inference with deep learning. Our approach first inverts the linear transformation from latent features to brain responses with maximum a posteriori estimation

  7. Statistical perspectives on inverse problems

    DEFF Research Database (Denmark)

    Andersen, Kim Emil

    Inverse problems arise in many scientific disciplines and pertain to situations where inference is to be made about a particular phenomenon from indirect measurements. A typical example, arising in diffusion tomography, is the inverse boundary value problem for non-invasive reconstruction of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation ... problem is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation ...

  8. Solution of 3D inverse scattering problems by combined inverse equivalent current and finite element methods

    International Nuclear Information System (INIS)

    Kılıç, Emre; Eibert, Thomas F.

    2015-01-01

    An approach combining boundary integral and finite element methods is introduced for the solution of three-dimensional inverse electromagnetic medium scattering problems. Based on the equivalence principle, unknown equivalent electric and magnetic surface current densities on a closed surface are utilized to decompose the inverse medium problem into two parts: a linear radiation problem and a nonlinear cavity problem. The first problem is formulated by a boundary integral equation, the computational burden of which is reduced by employing the multilevel fast multipole method (MLFMM). Reconstructed Cauchy data on the surface allows the utilization of the Lorentz reciprocity and the Poynting's theorems. Exploiting these theorems, the noise level and an initial guess are estimated for the cavity problem. Moreover, it is possible to determine whether the material is lossy or not. In the second problem, the estimated surface currents form inhomogeneous boundary conditions of the cavity problem. The cavity problem is formulated by the finite element technique and solved iteratively by the Gauss–Newton method to reconstruct the properties of the object. Regularization for both the first and the second problems is achieved by a Krylov subspace method. The proposed method is tested against both synthetic and experimental data and promising reconstruction results are obtained

  9. Solution of 3D inverse scattering problems by combined inverse equivalent current and finite element methods

    Energy Technology Data Exchange (ETDEWEB)

    Kılıç, Emre, E-mail: emre.kilic@tum.de; Eibert, Thomas F.

    2015-05-01

    An approach combining boundary integral and finite element methods is introduced for the solution of three-dimensional inverse electromagnetic medium scattering problems. Based on the equivalence principle, unknown equivalent electric and magnetic surface current densities on a closed surface are utilized to decompose the inverse medium problem into two parts: a linear radiation problem and a nonlinear cavity problem. The first problem is formulated by a boundary integral equation, the computational burden of which is reduced by employing the multilevel fast multipole method (MLFMM). Reconstructed Cauchy data on the surface allows the utilization of the Lorentz reciprocity and the Poynting's theorems. Exploiting these theorems, the noise level and an initial guess are estimated for the cavity problem. Moreover, it is possible to determine whether the material is lossy or not. In the second problem, the estimated surface currents form inhomogeneous boundary conditions of the cavity problem. The cavity problem is formulated by the finite element technique and solved iteratively by the Gauss–Newton method to reconstruct the properties of the object. Regularization for both the first and the second problems is achieved by a Krylov subspace method. The proposed method is tested against both synthetic and experimental data and promising reconstruction results are obtained.

  10. Bayesian tomographic reconstruction of microsystems

    International Nuclear Information System (INIS)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-01-01

    Microtomography by X-ray transmission plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: enhanced measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed. It therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, which is the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.

  11. L'Aquila's reconstruction challenges: has Italy learned from its previous earthquake disasters?

    Science.gov (United States)

    Ozerdem, Alpaslan; Rufini, Gianni

    2013-01-01

    Italy is an earthquake-prone country and its disaster emergency response experiences over the past few decades have varied greatly, with some being much more successful than others. Overall, however, its reconstruction efforts have been criticised for being ad hoc, delayed, ineffective, and untargeted. In addition, while the emergency relief response to the L'Aquila earthquake of 6 April 2009, the primary case study in this evaluation, seems to have been successful, the reconstruction initiative got off to a very problematic start. To explore the root causes of this phenomenon, the paper argues that, owing to the way in which Italian Prime Minister Silvio Berlusconi has politicised the process, the L'Aquila reconstruction endeavour is likely to suffer problems with local ownership, national/regional/municipal coordination, and corruption. It concludes with a set of recommendations aimed at addressing the pitfalls that may confront the L'Aquila reconstruction process over the next few years. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  12. Sparse regularization for EIT reconstruction incorporating structural information derived from medical imaging.

    Science.gov (United States)

    Gong, Bo; Schullcke, Benjamin; Krueger-Ziolek, Sabine; Mueller-Lisse, Ullrich; Moeller, Knut

    2016-06-01

    Electrical impedance tomography (EIT) reconstructs the conductivity distribution of a domain using electrical data on its boundary. This is an ill-posed inverse problem usually solved on a finite element mesh. In this article, a special regularization method incorporating structural information of the targeted domain is proposed and evaluated. Structural information was obtained either from computed tomography images or from preliminary EIT reconstructions by a modified k-means clustering. The proposed regularization method integrates this structural information into the reconstruction as a soft constraint preferring sparsity at the group level. A first evaluation with Monte Carlo simulations indicated that the proposed solver is more robust to noise and the resulting images show fewer artifacts. This finding is supported by real data analysis. The structure-based regularization has the potential to balance structural a priori information with data-driven reconstruction. It is robust to noise, reduces artifacts, and produces images that reflect anatomy and are thus easier for physicians to interpret.
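
Group-level sparsity can be sketched as a group-lasso proximal step inside a proximal-gradient loop. The sensitivity matrix, groups, and parameters below are arbitrary stand-ins for the anatomically derived structure:

```python
import numpy as np

def block_soft(x, groups, tau):
    """Prox of tau * sum_g ||x_g||_2: shrink each group's norm, zeroing
    groups whose norm falls below tau (sparsity at the group level)."""
    out = x.copy()
    for g in groups:
        nrm = np.linalg.norm(x[g])
        out[g] = 0.0 if nrm <= tau else (1.0 - tau / nrm) * x[g]
    return out

rng = np.random.default_rng(2)
A = rng.normal(size=(40, 20))                      # stand-in sensitivity matrix
groups = [np.arange(i, i + 5) for i in range(0, 20, 5)]
x_true = np.zeros(20)
x_true[5:10] = 1.0                                 # one active "region"
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2             # 1/L for the data term
x = np.zeros(20)
for _ in range(500):                               # ISTA with group-lasso prox
    x = block_soft(x - step * (A.T @ (A @ x - b)), groups, 0.5 * step)
```

At convergence the inactive groups are driven exactly to zero, which is the behavior the soft structural constraint exploits.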

  13. Hybrid inverse problems for a system of Maxwell’s equations

    International Nuclear Information System (INIS)

    Bal, Guillaume; Zhou, Ting

    2014-01-01

    This paper concerns the quantitative step of the medical imaging modality thermo-acoustic tomography (TAT). We model the radiation propagation by a system of Maxwell’s equations. We show that the index of refraction of light and the absorption coefficient (conductivity) can be uniquely and stably reconstructed from a sufficiently large number of TAT measurements. Our method is based on verifying that the linearization of the inverse problem forms a redundant elliptic system of equations. We also observe that the reconstructions are qualitatively quite different from the setting where radiation is modeled by a scalar Helmholtz equation as in Bal G et al (2011 Inverse Problems 27 055007). (paper)

  14. Accurate 3D reconstruction by a new PDS-OSEM algorithm for HRRT

    International Nuclear Information System (INIS)

    Chen, Tai-Been; Horng-Shing Lu, Henry; Kim, Hang-Keun; Son, Young-Don; Cho, Zang-Hee

    2014-01-01

    State-of-the-art high resolution research tomography (HRRT) provides high resolution PET images with full 3D human brain scanning. However, a short time frame in a dynamic study causes many problems related to the low counts in the acquired data. The PDS-OSEM algorithm was proposed to reconstruct the HRRT image with a high signal-to-noise ratio that provides accurate information for dynamic data. The new algorithm was evaluated on simulated images, empirical phantoms, and real human brain data. Meanwhile, the time activity curve was adopted to validate the reconstruction performance on dynamic data between the PDS-OSEM and OP-OSEM algorithms. According to the simulated and empirical studies, the PDS-OSEM algorithm reconstructs images with higher quality, higher accuracy, less noise, and a lower average sum of squared errors than OP-OSEM. The presented algorithm is useful for providing quality images under low count rates in dynamic studies with a short scan time. - Highlights: • PDS-OSEM reconstructs PET images by iteratively compensating for random and scatter corrections from the prompt sinogram. • PDS-OSEM can reconstruct PET images from low-count and contaminated data. • PDS-OSEM provides less noise and higher quality reconstructed images than the OP-OSEM algorithm in a statistical sense.
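
The OSEM family is built on the multiplicative MLEM update; a minimal generic version (not the paper's PDS-OSEM, which additionally compensates for randoms and scatter) looks like:

```python
import numpy as np

def mlem(A, y, n_iter=2000):
    """Multiplicative MLEM update: x <- x * A^T(y / A x) / A^T 1.
    OSEM accelerates this by cycling the same update over subsets of rows."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                        # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)    # measured / forward-projected
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Toy system: 2-pixel image, 3 lines of response.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 5.0])
x = mlem(A, A @ x_true)                         # noiseless counts
```

With noiseless, consistent counts the iteration converges to the true activity; the multiplicative form keeps the estimate non-negative automatically.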

  15. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    Yiming Yan

    2017-01-01

    Full Text Available In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), which is used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods rely heavily on the completeness of offline-constructed building models, which is not easily guaranteed since buildings in modern cities can be of a variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings was introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and then a recently proposed ‘occlusions of random textures model’ is used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved by airborne-like images and satellites are compared. Experiments show that the segmentation method has good performance, 3D reconstruction is easily performed within our framework, and better visualization results are obtained with airborne-like images, which can be further replaced by UAV images.

  16. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but limits are imposed by noise and compression ratio on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem for compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficients for different regularizations and frames are resolved by the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  17. Generalized Fourier slice theorem for cone-beam image reconstruction.

    Science.gov (United States)

    Zhao, Shuang-Ren; Jiang, Dazong; Yang, Kevin; Yang, Kang

    2015-01-01

    The cone-beam reconstruction theory was developed by Kirillov in 1961, Tuy in 1983, Feldkamp in 1984, Smith in 1985, and Pierre Grangeat in 1990. The Fourier slice theorem was proposed by Bracewell in 1956 and leads to the Fourier image reconstruction method for parallel-beam geometry. The Fourier slice theorem was extended to fan-beam geometry by Zhao in 1993 and 1995. By combining the cone-beam image reconstruction theory and the fan-beam Fourier slice theory mentioned above, the Fourier slice theorem for cone-beam geometry was proposed by Zhao in 1995 in a short conference publication. This article offers the details of the derivation and implementation of this Fourier slice theorem for cone-beam geometry. In particular, the problem of reconstruction from the Fourier domain has been overcome, namely that the value at the origin of Fourier space is of the indeterminate form 0/0; this 0/0-type limit is handled properly. As examples, implementation results for single-circle and two-perpendicular-circles source orbits are shown. If an interpolation process is used in the cone-beam reconstruction, the number of calculations for the generalized Fourier slice theorem algorithm is O(N^4), close to that of the filtered back-projection method, where N is the one-dimensional image size. However, the interpolation process can be avoided, in which case the number of calculations is O(N^5).
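
For the parallel-beam case that these cone-beam results generalize, the Fourier slice theorem can be verified numerically in a few lines: the 1D FFT of a projection equals the corresponding central line of the object's 2D FFT.

```python
import numpy as np

# Fourier slice theorem (parallel-beam, 0-degree view): the 1D FFT of the
# projection along y equals the k_y = 0 line of the object's 2D FFT.
img = np.zeros((64, 64))
img[20:40, 24:44] = 1.0            # simple block phantom

proj = img.sum(axis=0)             # parallel projection onto the x axis
slice_1d = np.fft.fft(proj)
slice_2d = np.fft.fft2(img)[0, :]  # k_y = 0 row of the 2D FFT
```

Other view angles correspond to other radial lines through the origin of Fourier space, which is why the value at the origin is shared by every projection.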

  18. LOR-interleaving image reconstruction for PET imaging with fractional-crystal collimation

    International Nuclear Information System (INIS)

    Li, Yusheng; Matej, Samuel; Karp, Joel S; Metzler, Scott D

    2015-01-01

    Positron emission tomography (PET) has become an important modality in medical and molecular imaging. However, in most PET applications, the resolution is still mainly limited by the physical crystal sizes or the detector’s intrinsic spatial resolution. To achieve images with better spatial resolution in a central region of interest (ROI), we have previously proposed using collimation in PET scanners. The collimator is designed to partially mask detector crystals to detect lines of response (LORs) within fractional crystals. A sequence of collimator-encoded LORs is measured with different collimation configurations. This novel collimated scanner geometry makes the reconstruction problem challenging, as both detector and collimator effects need to be modeled to reconstruct high-resolution images from collimated LORs. In this paper, we present a LOR-interleaving (LORI) algorithm, which incorporates these effects and has the advantage of reusing existing reconstruction software, to reconstruct high-resolution images for PET with fractional-crystal collimation. We also develop a 3D ray-tracing model incorporating both the collimator and crystal penetration for simulations and reconstructions of the collimated PET. By registering the collimator-encoded LORs with the collimator configurations, high-resolution LORs are restored based on the modeled transfer matrices using the non-negative least-squares method and EM algorithm. The resolution-enhanced images are then reconstructed from the high-resolution LORs using the MLEM or OSEM algorithm. For validation, we applied the LORI method to a small-animal PET scanner, A-PET, with a specially designed collimator. We demonstrate through simulated reconstructions with a hot-rod phantom and MOBY phantom that the LORI reconstructions can substantially improve spatial resolution and quantification compared to the uncollimated reconstructions. 
The LORI algorithm is crucial to improving the overall image quality of collimated PET.
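As a rough illustration of the restoration step described above, the sketch below applies an EM update to a toy transfer matrix that mixes pairs of high-resolution LORs; the matrix, sizes, and weights are invented for illustration and are not the authors' actual collimator models.

```python
import numpy as np

# Toy sketch of the LORI restoration stage: recover high-resolution LOR
# values from collimator-encoded measurements through a modeled transfer
# matrix, here with a standard EM update. A second MLEM pass would then
# reconstruct the image from the restored LORs.

def mlem(A, y, n_iter=100):
    """Standard MLEM update: x <- x * A^T(y / Ax) / A^T 1, for A, y >= 0."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity, A^T 1
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Hypothetical encoding: 4 collimation configurations, each mixing a pair
# of neighbouring high-resolution LORs with equal weight.
T = np.zeros((4, 8))
for i in range(4):
    T[i, 2 * i] = T[i, 2 * i + 1] = 0.5

rng = np.random.default_rng(0)
x_true = rng.uniform(1, 5, size=8)             # "true" high-res LOR values
y = T @ x_true                                  # collimator-encoded data

x_restored = mlem(T, y)
print(np.round(T @ x_restored - y, 6))          # encoded data are reproduced
```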

  19. Reconstructing Unrooted Phylogenetic Trees from Symbolic Ternary Metrics.

    Science.gov (United States)

    Grünewald, Stefan; Long, Yangjing; Wu, Yaokun

    2018-03-09

    Böcker and Dress (Adv Math 138:105-125, 1998) presented a 1-to-1 correspondence between symbolically dated rooted trees and symbolic ultrametrics. We consider the corresponding problem for unrooted trees. More precisely, given a tree T with leaf set X and a proper vertex coloring of its interior vertices, we can map every triple of three different leaves to the color of its median vertex. We characterize all ternary maps that can be obtained in this way in terms of 4- and 5-point conditions, and we show that the corresponding tree and its coloring can be reconstructed from a ternary map that satisfies those conditions. Further, we give an additional condition that characterizes whether the tree is binary, and we describe an algorithm that reconstructs general trees in a bottom-up fashion.
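The triple-to-color map described above can be sketched directly: for each triple of leaves, intersect the three pairwise paths to find their median vertex and report its color. The toy tree, coloring, and helper functions below are illustrative assumptions, not the paper's construction.

```python
from itertools import combinations

# In an unrooted tree with colored interior vertices, every triple of
# leaves determines a unique median vertex (the common point of the three
# pairwise paths); the triple maps to that vertex's color.

tree = {          # adjacency list of a small unrooted tree (toy example)
    "a": ["u"], "b": ["u"], "c": ["v"], "d": ["v"],
    "u": ["a", "b", "v"], "v": ["c", "d", "u"],
}
color = {"u": "red", "v": "blue"}

def path(tree, s, t):
    """Unique path between two vertices of a tree, found by DFS."""
    stack, seen = [(s, [s])], {s}
    while stack:
        node, p = stack.pop()
        if node == t:
            return p
        for nb in tree[node]:
            if nb not in seen:
                seen.add(nb)
                stack.append((nb, p + [nb]))

def median_color(tree, color, x, y, z):
    pxy = set(path(tree, x, y))
    pxz = set(path(tree, x, z))
    pyz = set(path(tree, y, z))
    (m,) = pxy & pxz & pyz          # the three paths meet in one vertex
    return color[m]

for triple in combinations("abcd", 3):
    print(triple, median_color(tree, color, *triple))
```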

  20. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V): A Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P < 0.05). CNR was highest for ASIR-V 60%, with CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P < 0.05). Veo 3.0 and ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.
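For readers unfamiliar with the CNR metric reported above, a minimal sketch of its usual definition follows: the difference of ROI means divided by the background noise. The HU samples are made-up illustrative numbers.

```python
import numpy as np

# Contrast-to-noise ratio (CNR) as commonly defined in CT image-quality
# studies: |mean(lesion ROI) - mean(background ROI)| / SD(background ROI).

def cnr(roi, background):
    roi = np.asarray(roi, dtype=float)
    bg = np.asarray(background, dtype=float)
    return abs(roi.mean() - bg.mean()) / bg.std(ddof=1)

lesion = [95.0, 98.0, 97.0, 96.0]        # HU samples inside a lesion ROI
liver = [60.0, 62.0, 58.0, 61.0, 59.0]   # HU samples in background tissue
print(round(cnr(lesion, liver), 2))
```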

  1. Multi-view Multi-sparsity Kernel Reconstruction for Multi-class Image Classification

    KAUST Repository

    Zhu, Xiaofeng

    2015-05-28

    This paper addresses the problem of multi-class image classification by proposing a novel multi-view multi-sparsity kernel reconstruction (MMKR for short) model. Given images (both test and training images) represented by multiple visual features, MMKR first maps them into a high-dimensional space, e.g., a reproducing kernel Hilbert space (RKHS), where test images are then linearly reconstructed from a set of representative training images, rather than all of them. Furthermore, a classification rule is proposed to classify test images. Experimental results on real datasets show the effectiveness of the proposed MMKR compared to state-of-the-art algorithms.
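A minimal sketch of the reconstruction-in-kernel-space idea: each test sample is linearly reconstructed from each class's training samples in an RKHS and assigned to the class with the smaller reconstruction error. A Gaussian kernel and ridge-regularized weights stand in here for the paper's sparse multi-view model.

```python
import numpy as np

# Reconstruct a test point from each class's training set in kernel space
# and classify by the smaller RKHS reconstruction error (illustrative
# single-view, ridge-regularized stand-in for the MMKR model).

def gauss_kernel(X, Z, gamma=0.5):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def recon_error(x, X_class, lam=1e-3, gamma=0.5):
    K = gauss_kernel(X_class, X_class, gamma)
    k = gauss_kernel(X_class, x[None, :], gamma)[:, 0]
    w = np.linalg.solve(K + lam * np.eye(len(K)), k)
    # ||phi(x) - Phi w||^2 in the RKHS, expanded via the kernel trick
    # (k(x, x) = 1 for the Gaussian kernel).
    return 1.0 - 2.0 * w @ k + w @ K @ w

X0 = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3]])   # class-0 training
X1 = np.array([[2.0, 2.0], [2.2, 1.9], [1.8, 2.1]])   # class-1 training
x_test = np.array([1.9, 2.05])

e0, e1 = recon_error(x_test, X0), recon_error(x_test, X1)
print(int(e1 < e0))  # prints 1: class 1 reconstructs the test sample better
```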

  2. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    International Nuclear Information System (INIS)

    Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

    2013-01-01

    Purpose: Dual-energy computed tomography (DECT) makes it possible to obtain two basis-material fractions without segmentation: the soft-tissue equivalent water fraction and the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water-bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for producing accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach that allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated jointly by maximum a posteriori (MAP) estimation. Subject to an adaptive prior model assigned to the variance, the joint MAP estimation reduces to a single minimization problem with a nonquadratic cost function, which is solved with a monotone conjugate gradient algorithm with suboptimal descent steps. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials.
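The nonlinearity that linear forward models miss can be seen in a few lines: a polychromatic measurement is a spectrum-weighted sum of Beer-Lambert attenuations, so doubling the material thicknesses does not square the transmitted intensity. The spectrum and mass-attenuation values below are made-up illustrative numbers, not calibrated data.

```python
import numpy as np

# Polychromatic forward model for two basis materials (water, bone):
# the detector sees a spectrum-weighted sum of monochromatic attenuations.

energies = np.array([40.0, 60.0, 80.0])       # keV bins (illustrative)
spectrum = np.array([0.2, 0.5, 0.3])          # normalized source spectrum
mu_water = np.array([0.27, 0.21, 0.18])       # cm^-1, illustrative values
mu_bone = np.array([1.30, 0.60, 0.40])        # cm^-1, illustrative values

def measure(L_water, L_bone):
    """Polychromatic measurement for given material line integrals (cm)."""
    return float(spectrum @ np.exp(-(mu_water * L_water + mu_bone * L_bone)))

m1, m2 = measure(5.0, 1.0), measure(10.0, 2.0)
# Beam hardening: the doubled path transmits MORE than the monochromatic
# prediction m1**2, so -log(m) is not linear in thickness.
print(m2 > m1 ** 2)  # True
```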

  3. An Algorithmic Approach to Total Breast Reconstruction with Free Tissue Transfer

    Directory of Open Access Journals (Sweden)

    Seong Cheol Yu

    2013-05-01

    Full Text Available As microvascular techniques continue to improve, perforator flap free tissue transfer is now the gold standard for autologous breast reconstruction. Various options are available for breast reconstruction with autologous tissue. These include the free transverse rectus abdominis myocutaneous (TRAM) flap, deep inferior epigastric perforator flap, superficial inferior epigastric artery flap, superior gluteal artery perforator flap, and transverse/vertical upper gracilis flap. In addition, pedicled flaps can be very successful in the right hands and the right patient, such as the pedicled TRAM flap, latissimus dorsi flap, and thoracodorsal artery perforator flap. Each flap comes with its own advantages and disadvantages related to tissue properties and donor-site morbidity. The problem at present is determining the most appropriate flap for a particular patient among these potential candidates. Based on a thorough review of the literature and accumulated experience at the author's institution, this article provides a logical approach to autologous breast reconstruction. The algorithms presented here can help customize breast reconstruction to individual patient needs.

  4. Analysis of stability of tomographic reconstruction of x-ray medical images

    Directory of Open Access Journals (Sweden)

    L. A. Bulavin

    2017-09-01

    Full Text Available Slice reconstruction in X-ray computed tomography is reduced to the solution of integral equations, or to a system of algebraic equations in the discrete case. It is considered an ill-posed problem because of the mismatch between the number of equations and variables and because of errors in the experimental data. Determining the best method of slice reconstruction is therefore of great interest; furthermore, all available methods give approximate results. The aim of this article was two-fold: (i) to compare two methods of image reconstruction, viz. inverse projection and a variational method, in a numerical experiment; (ii) to obtain the relationship between image accuracy and experimental error. The image obtained by inverse projection proved unstable: the approximate image did not converge to the exact one as the experimental error tended to zero. In turn, the image obtained by the variational method was exact at zero experimental error. Overall, the latter gave better slice reconstruction despite the low number of projections and the experimental errors.
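In the discrete setting the reconstruction reduces to solving A x = b. As a point of reference for the algebraic side of such comparisons, a minimal Kaczmarz (ART) iteration is sketched below on a toy consistent system; it is a representative algebraic method, not one of the two methods compared in the article.

```python
import numpy as np

# Kaczmarz / ART: cyclically project the estimate onto the hyperplane of
# each "projection" equation a_i . x = b_i until the system is satisfied.

def kaczmarz(A, b, n_sweeps=200):
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a
    return x

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])    # toy "projection" matrix
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true                     # consistent, noise-free data

x = kaczmarz(A, b)
print(np.round(x, 4))  # → [1. 2. 3.]
```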

  5. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history.

    Science.gov (United States)

    Cherry, Joshua L

    2017-02-23

    Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data. The algorithm is applied to bacterial data sets containing up to nearly 2000 genomes with several thousand variable nucleotide sites. Run times are several seconds or less. Computational experiments show that maximum compatibility is less sensitive than maximum parsimony to the inclusion of nucleotide data that, though derived from actual sequence reads, has been identified as likely to be misleading. Maximum compatibility is a useful tool for certain phylogenetic problems, such as inferring the relationships among closely-related bacteria from whole-genome sequence data. The algorithm presented here rapidly solves fairly large problems of this type, and provides robustness against misleading characters that can pollute large-scale sequencing data.
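The compatibility relation underlying this method can be illustrated with the classical four-gamete test for binary characters: two characters are compatible (can evolve on a common tree without homoplasy) if and only if not all four state combinations occur. The pairwise check below is illustrative only; it is not the paper's clique-based algorithm with ambiguity handling.

```python
from itertools import combinations

# Four-gamete test: two binary site patterns over the same taxa are
# compatible iff fewer than all four state pairs {00, 01, 10, 11} appear.
# Finding a largest mutually compatible set of sites is then a maximum
# clique problem on this pairwise relation.

def compatible(site_a, site_b):
    pairs = set(zip(site_a, site_b))
    return len(pairs) < 4

sites = {
    "s1": "00110",
    "s2": "00111",
    "s3": "01100",
}

for a, b in combinations(sites, 2):
    print(a, b, compatible(sites[a], sites[b]))
```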

  6. BIOLOGICAL AND SYNTHETIC MATERIALS IN RECONSTRUCTIVE SURGERY FOR BREAST CANCER TREATMENT (LITERATURE REVIEW)

    Directory of Open Access Journals (Sweden)

    A. D. Zikiryakhodzhaev

    2018-01-01

    Full Text Available In recent years there has been a worldwide trend toward breast cancer occurring at younger ages, and the evolution of reconstructive breast surgery is proceeding at a rapid pace. Surgery is the primary method in the combined and complex treatment of breast cancer, and radical mastectomy is still the main option for surgical treatment in most Russian clinics. Most women who need a mastectomy prefer a one-stage breast reconstruction, because a woman is rehabilitated psychologically and physically more quickly after such an operation. Nevertheless, the use of silicone endoprostheses has not solved the problems of breast reconstruction within combined oncological treatment. The issue of various complications remains unresolved, related not only to infections but also to the development of capsular contracture after radiotherapy. Many patients undergoing one-stage breast reconstruction with a silicone endoprosthesis lack sufficient tissue of their own to reliably cover the endoprosthesis. In such cases, synthetic reticulated implants, biological implants, or autologous flaps are used to cover and reinforce the lower slope of the reconstructed breast.

  7. An Approximate Cone Beam Reconstruction Algorithm for Gantry-Tilted CT Using Tangential Filtering

    Directory of Open Access Journals (Sweden)

    Ming Yan

    2006-01-01

    Full Text Available FDK is a well-known 3D (three-dimensional) approximate algorithm for CT (computed tomography) image reconstruction, and it is also known to suffer from considerable artifacts when the scanning cone angle is large. Recently, it has been improved by performing the ramp filtering along the tangential direction of the X-ray source helix to deal with the large cone angle problem. In this paper, we present an FDK-type approximate reconstruction algorithm for gantry-tilted CT imaging. The proposed method improves the image reconstruction by filtering the projection data along a proper direction, which is determined by the CT parameters and the gantry-tilted angle. As a result, the proposed algorithm for gantry-tilted CT reconstruction provides more scanning flexibility in clinical CT scanning and is computationally efficient. The performance of the proposed algorithm is evaluated with the Turbell clock phantom and a thorax phantom and compared with the FDK algorithm and a popular 2D (two-dimensional) approximate algorithm. The results show that the proposed algorithm can achieve better image quality for gantry-tilted CT image reconstruction.

  8. Reconstruction of CMB temperature anisotropies with primordial CMB induced polarization in galaxy clusters

    Science.gov (United States)

    Liu, Guo-Chin; Ichiki, Kiyotomo; Tashiro, Hiroyuki; Sugiyama, Naoshi

    2016-07-01

    Scattering of cosmic microwave background (CMB) radiation in galaxy clusters induces polarization signals determined by the quadrupole anisotropy in the photon distribution at the location of the clusters. This `remote quadrupole' derived from measurements of the induced polarization in galaxy clusters provides an opportunity to reconstruct local CMB temperature anisotropies. In this Letter, we develop a reconstruction algorithm based on estimating the underlying primordial gravitational potential, which is the origin of the CMB temperature and polarization fluctuations and of the CMB-induced polarization in galaxy clusters. We find that the quadrupole and octopole components of the CMB temperature anisotropies are reconstructed well with the assistance of the CMB-induced polarization signals. The reconstruction can serve as an important consistency test for the puzzles of the CMB anomalies, especially the low-quadrupole and axis-of-evil problems reported in the Wilkinson Microwave Anisotropy Probe and Planck data.

  9. Exact Reconstruction From Uniformly Attenuated Helical Cone-Beam Projections in SPECT

    International Nuclear Information System (INIS)

    Gullberg, Grant T.; Huang, Qiu; You, Jiangsheng; Zeng, Gengsheng L.

    2008-01-01

    In recent years the development of cone-beam reconstruction algorithms has been an active research area in x-ray computed tomography (CT), and significant progress has been made in the advancement of algorithms. Theoretically exact and computationally efficient analytical algorithms can be found in the literature. However, in single photon emission computed tomography (SPECT), published cone-beam reconstruction algorithms are either approximate or involve iterative methods. The SPECT reconstruction problem is more complicated due to degradations in the imaging detection process, one of which is the effect of attenuation of gamma ray photons. Attenuation should be compensated for to obtain quantitative results. In this paper, an analytical reconstruction algorithm for uniformly attenuated cone-beam projection data is presented for SPECT imaging. The algorithm adopts the DBH method, a procedure consisting of differentiation and backprojection followed by a finite inverse cosh-weighted Hilbert transform. The significance of the proposed approach is that a selected region of interest can be reconstructed even with a detector with a reduced field of view. The algorithm is designed for a general trajectory. However, to validate the algorithm, a numerical study was performed using a helical trajectory. The implementation is efficient and the simulation result is promising.

  10. Efficient Sum of Outer Products Dictionary Learning (SOUP-DIL) and Its Application to Inverse Problems.

    Science.gov (United States)

    Ravishankar, Saiprasad; Nadakuditi, Raj Rao; Fessler, Jeffrey A

    2017-12-01

    The sparsity of signals in a transform domain or dictionary has been exploited in applications such as compression, denoising and inverse problems. More recently, data-driven adaptation of synthesis dictionaries has shown promise compared to analytical dictionary models. However, dictionary learning problems are typically non-convex and NP-hard, and the usual alternating minimization approaches for these problems are often computationally expensive, with the computations dominated by the NP-hard synthesis sparse coding step. This paper exploits the ideas that drive algorithms such as K-SVD, and investigates in detail efficient methods for aggregate sparsity penalized dictionary learning by first approximating the data with a sum of sparse rank-one matrices (outer products) and then using a block coordinate descent approach to estimate the unknowns. The resulting block coordinate descent algorithms involve efficient closed-form solutions. Furthermore, we consider the problem of dictionary-blind image reconstruction, and propose novel and efficient algorithms for adaptive image reconstruction using block coordinate descent and sum of outer products methodologies. We provide a convergence study of the algorithms for dictionary learning and dictionary-blind image reconstruction. Our numerical experiments show the promising performance and speedups provided by the proposed methods over previous schemes in sparse data representation and compressed sensing-based image reconstruction.
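The sum-of-outer-products decomposition with closed-form block coordinate updates can be sketched as follows; the atom count, threshold, and data are illustrative assumptions, not the paper's settings or its full dictionary-blind reconstruction pipeline.

```python
import numpy as np

# Approximate Y by sum_j d_j c_j^T with unit-norm atoms d_j and sparse
# coefficient rows c_j, updating one rank-one term at a time (block
# coordinate descent with hard thresholding for the sparsity penalty).

def soup_dil(Y, n_atoms=4, thresh=0.1, n_sweeps=20, seed=0):
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    C = np.zeros((n_atoms, n))
    for _ in range(n_sweeps):
        for j in range(n_atoms):
            # Residual with atom j's own rank-one term excluded.
            R = Y - D @ C + np.outer(D[:, j], C[j])
            c = D[:, j] @ R                  # closed-form sparse coding step
            c[np.abs(c) < thresh] = 0.0      # hard threshold for sparsity
            C[j] = c
            if np.any(c):                    # closed-form dictionary update
                d = R @ c
                D[:, j] = d / np.linalg.norm(d)
    return D, C

rng = np.random.default_rng(1)
Y = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 30))  # rank-2 data
D, C = soup_dil(Y)
rel_err = np.linalg.norm(Y - D @ C) / np.linalg.norm(Y)
print(rel_err)  # small: a few sparse rank-one terms fit low-rank data well
```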

  11. Coordinate reconstruction using box reconstruction and projection of X-ray photo

    International Nuclear Information System (INIS)

    Achmad Suntoro

    2011-01-01

    Mathematical formulas have been derived for a reconstruction process that determines the coordinates of any point relative to a preset coordinate system. The process uses a reconstruction box whose edge lengths are known, whose top-bottom and left-right faces each carry a cross marker, and whose top and right faces serve as projection planes for an X-ray source in a perspective projection system. Using the data of the two X-ray projection images, any point inside the reconstruction box, as long as its projection is recorded in both photos, can have its coordinates determined relative to the midpoint of the reconstruction box as the origin. (author)

  12. Track reconstruction in liquid hydrogen ionization chamber

    International Nuclear Information System (INIS)

    Balbekov, V.I.; Baranov, A.M.; Krasnokutski, R.N.; Perelygin, V.P.; Rasuvaev, E.A.; Shuvalov, R.S.; Zhigunov, V.P.; Lebedenko, V.N.; Stern, B.E.

    1979-01-01

    It is shown that particle track parameters can be reconstructed from the currents in the anode cells of the ionization chamber. The calculations are carried out for a chamber with a 10 cm anode-cathode gap. For simplicity a two-dimensional chamber model is used. To simplify the calculations further, the charge density along the track is taken to be constant and equal to 10^4 electrons/mm, and the electron drift velocity is assumed to be 5×10^6 cm/s. The anode is divided into cells 2 cm in width. An event in the chamber is defined by the coordinates X and Z of the event vertex, the polar angle THETA of each track, and the track length l. The coordinates x, y and the track angle THETA are reconstructed from the currents with errors of up to a millimetre and a milliradian. The reconstruction errors are proportional to the noise level of the electronics and also depend on the track geometry and argon purification. The energy resolution of the chamber is calculated for high-energy electrons by means of a computer program based on a Monte Carlo method. The conclusion is made that the energy resolution depends on the gap width as a square root. Two ways to solve the track reconstruction problem are considered: 1. the initial charge density is determined by measuring the charges induced in anode strips at some discrete moments of time; 2. the parameters are estimated by a traditional minimization technique. The second method is applicable only for a limited number of hypotheses, but it is less time consuming.

  13. Missing data reconstruction using Gaussian mixture models for fingerprint images

    Science.gov (United States)

    Agaian, Sos S.; Yeole, Rushikesh D.; Rao, Shishir P.; Mulawka, Marzena; Troy, Mike; Reinecke, Gary

    2016-05-01

    Publisher's Note: This paper, originally published on 25 May 2016, was replaced with a revised version on 16 June 2016. If you downloaded the original PDF, but are unable to access the revision, please contact SPIE Digital Library Customer Service for assistance. One of the most important areas in biometrics is matching partial fingerprints in fingerprint databases. Recently, significant progress has been made in designing fingerprint identification systems for missing fingerprint information. However, a dependable reconstruction of fingerprint images still remains challenging due to the complexity and the ill-posed nature of the problem. In this article, both binary and gray-level images are reconstructed. This paper also presents a new similarity score to evaluate the performance of the reconstructed binary image. The offered fingerprint image identification system can be automated and extended to numerous other security applications such as postmortem fingerprints, forensic science, investigations, artificial intelligence, robotics, all-access control, and financial security, as well as for the verification of firearm purchasers, driver license applicants, etc.

  14. A Novel Kernel-Based Regularization Technique for PET Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Abdelwahhab Boudjelal

    2017-06-01

    Full Text Available Positron emission tomography (PET) is an imaging technique that generates 3D detail of physiological processes at the cellular level. The technique requires a radioactive tracer, which decays and releases a positron that collides with an electron; consequently, annihilation photons are emitted, which can be measured. The purpose of PET is to use the measurement of photons to reconstruct the distribution of radioisotopes in the body. Currently, PET is undergoing a revamp, with advancements in data measurement instruments and in the computing methods used to create the images. These computational methods are required to solve the inverse problem of “image reconstruction from projection”. This paper proposes a novel kernel-based regularization technique for maximum-likelihood expectation-maximization (κ-MLEM) to reconstruct the image. Compared to standard MLEM, the proposed algorithm is more robust and more effective in removing background noise while preserving the edges; this suppresses image artifacts such as out-of-focus slice blur.
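One common way to realize a kernel-based regularization of MLEM, sketched here as an assumption about the general flavor of such methods rather than a reproduction of κ-MLEM itself, is to write the image as x = K a for a smoothing kernel matrix K and run the EM update on the coefficients a:

```python
import numpy as np

# Kernelized EM sketch: the image is parameterized as x = K a, so the EM
# iteration on the coefficients a yields an implicitly smoothed image.
# The 1-D geometry, kernel width, and system matrix are toy assumptions.

rng = np.random.default_rng(3)
n = 16
pos = np.arange(n)
K = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / 1.5) ** 2)
K /= K.sum(axis=1, keepdims=True)            # row-normalized Gaussian kernel

A = rng.uniform(0, 1, size=(24, n))          # toy nonnegative system matrix
x_true = np.zeros(n)
x_true[5:9] = 4.0                            # a "hot" region
y = rng.poisson(A @ x_true).astype(float)    # Poisson projection data

AK = A @ K                                   # composed forward model
a = np.ones(n)
sens = AK.sum(axis=0)
for _ in range(200):                         # EM iterations on coefficients
    proj = AK @ a
    a *= (AK.T @ np.where(proj > 0, y / proj, 0.0)) / np.maximum(sens, 1e-12)
x_hat = K @ a                                # kernel-smoothed reconstruction
print(np.round(x_hat, 2))
```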

  15. Bayesian image reconstruction for improving detection performance of muon tomography.

    Science.gov (United States)

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
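The shrinkage mechanism can be seen in one dimension. The paper derives inverse quadratic and inverse cubic shrinkage functions for generalized Laplacian and generalized Gaussian priors; the sketch below instead uses the familiar soft threshold, the MAP shrinkage for a pure Laplacian prior, as the simplest member of that family.

```python
import numpy as np

# Shrinkage idea: the regularized (MAP) update is obtained by passing the
# unregularized ML update through a shrinkage function determined by the
# prior. For a Laplacian prior this is the soft threshold.

def soft_shrink(x_ml, noise_var, beta):
    """argmin_x (x - x_ml)^2 / (2*noise_var) + beta*|x|."""
    t = beta * noise_var
    return np.sign(x_ml) * np.maximum(np.abs(x_ml) - t, 0.0)

x_ml = np.array([-3.0, -0.2, 0.05, 0.4, 2.5])   # unregularized updates
print(soft_shrink(x_ml, noise_var=1.0, beta=0.3))
# Values shrink toward zero; entries inside the threshold vanish entirely,
# which is what suppresses noise in low-density regions of the image.
```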

  16. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    Science.gov (United States)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. This prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. Our work focuses in particular on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we search for the 3D model that maximizes a probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem is then handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of the conflicts that efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting results on industrial data sets.

  17. Reconstruction of the spatial dependence of dielectric and geometrical properties of adhesively bonded structures

    International Nuclear Information System (INIS)

    Mackay, C; Hayward, D; Mulholland, A J; McKee, S; Pethrick, R A

    2005-01-01

    An inverse problem motivated by the nondestructive testing of adhesively bonded structures used in the aircraft industry is studied. Using transmission line theory, a model is developed which, when supplied with electrical and geometrical parameters, accurately predicts the reflection coefficient associated with such structures. Particular attention is paid to modelling the connection between the structures and the equipment used to measure the reflection coefficient. The inverse problem is then studied and an optimization approach employed to recover these electrical and geometrical parameters from experimentally obtained data. In particular the approach focuses on the recovery of spatially varying geometrical parameters as this is paramount to the successful reconstruction of electrical parameters. Reconstructions of structure geometry using this method are found to be in close agreement with experimental observations

  18. TV-based conjugate gradient method and discrete L-curve for few-view CT reconstruction of X-ray in vivo data.

    Science.gov (United States)

    Yang, Xiaoli; Hofmann, Ralf; Dapp, Robin; van de Kamp, Thomas; dos Santos Rolo, Tomy; Xiao, Xianghui; Moosmann, Julian; Kashef, Jubin; Stotzka, Rainer

    2015-03-09

    High-resolution, three-dimensional (3D) imaging of soft tissues requires the solution of two inverse problems: phase retrieval and the reconstruction of the 3D image from a tomographic stack of two-dimensional (2D) projections. The number of projections per stack should be small, both to accommodate fast tomography of rapid processes and to constrain the X-ray radiation dose to optimal levels, either to increase the duration of in vivo time-lapse series at a given goal for spatial resolution or to conserve structure under X-ray irradiation. Pursuing the 3D reconstruction problem in the sense of compressive sampling theory, we propose to reduce the number of projections by applying an advanced algebraic technique subject to minimization of the total variation (TV) in the reconstructed slice. The problem is formulated in a Lagrangian multiplier fashion, with the parameter value determined from a discrete L-curve in conjunction with a conjugate gradient method. The usefulness of this reconstruction modality is demonstrated for simulated and in vivo data, the latter acquired in parallel-beam imaging experiments using synchrotron radiation.
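The TV-minimization idea can be sketched on a toy 1-D problem; plain gradient descent on a smoothed TV term and a hand-picked Lagrangian parameter stand in here for the paper's conjugate gradient solver and discrete L-curve parameter selection.

```python
import numpy as np

# Minimize ||A x - y||^2 + lam * TV(x) for a piecewise-constant signal
# observed through an underdetermined ("few views") system A.

def tv_grad(x, eps=1e-3):
    d = np.diff(x)
    w = d / np.sqrt(d * d + eps)      # derivative of smoothed |d|
    g = np.zeros_like(x)
    g[:-1] -= w
    g[1:] += w
    return g

rng = np.random.default_rng(2)
n = 32
x_true = np.zeros(n)
x_true[10:20] = 1.0                   # piecewise-constant phantom
A = rng.standard_normal((20, n)) / np.sqrt(n)   # 20 measurements, 32 unknowns
y = A @ x_true

x = np.zeros(n)
lam, step = 0.02, 0.1
for _ in range(3000):
    grad = 2 * A.T @ (A @ x - y) + lam * tv_grad(x)
    x -= step * grad
rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)   # well below 1: the TV prior recovers the blocky signal
```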

  19. Reconstruction of structural damage based on reflection intensity spectra of fiber Bragg gratings

    International Nuclear Information System (INIS)

    Huang, Guojun; Wei, Changben; Chen, Shiyuan; Yang, Guowei

    2014-01-01

    We present an approach for structural damage reconstruction based on the reflection intensity spectra of fiber Bragg gratings (FBGs). Our approach incorporates the finite element method, transfer matrix (T-matrix), and genetic algorithm to solve the inverse photo-elastic problem of damage reconstruction, i.e. to identify the location, size, and shape of a defect. By introducing a parameterized characterization of the damage information, the inverse photo-elastic problem is reduced to an optimization problem, and a relevant computational scheme was developed. The scheme iteratively searches for the solution to the corresponding direct photo-elastic problem until the simulated and measured (or target) reflection intensity spectra of the FBGs near the defect coincide within a prescribed error. Proof-of-concept validations of our approach were performed numerically and experimentally using both holed and cracked plate samples as typical cases of plane-stress problems. The damage identifiability was simulated by changing the deployment of the FBG sensors, including the total number of sensors and their distance to the defect. Both the numerical and experimental results demonstrate that our approach is effective and promising. It provides us with a photo-elastic method for developing a remote, automatic damage-imaging technique that substantially improves damage identification for structural health monitoring. (paper)

  20. Sparse Image Reconstruction on the Sphere: Analysis and Synthesis.

    Science.gov (United States)

    Wallis, Christopher G R; Wiaux, Yves; McEwen, Jason D

    2017-11-01

    We develop techniques to solve ill-posed inverse problems on the sphere by sparse regularization, exploiting sparsity in both axisymmetric and directional scale-discretized wavelet space. Denoising, inpainting, and deconvolution problems and combinations thereof, are considered as examples. Inverse problems are solved in both the analysis and synthesis settings, with a number of different sampling schemes. The most effective approach is that with the most restricted solution-space, which depends on the interplay between the adopted sampling scheme, the selection of the analysis/synthesis problem, and any weighting of the l1 norm appearing in the regularization problem. More efficient sampling schemes on the sphere improve reconstruction fidelity by restricting the solution-space and also by improving sparsity in wavelet space. We apply the technique to denoise Planck 353-GHz observations, improving the ability to extract the structure of Galactic dust emission, which is important for studying Galactic magnetism.

  1. Three-dimensional reconstruction of vessels with stenoses and aneurysms from dual biplane angiograms

    Science.gov (United States)

    Fessler, Jeffrey A.; Macovski, Albert

    1989-05-01

    Parametric model-based approaches to 3-D reconstruction of vessels overcome the inherent underdetermination of reconstruction from limited views by incorporating a priori knowledge about the structure of vessels and about the measurement statistics. In this paper, we describe two extensions to the parametric approach. First, we consider the problem of reconstruction from a pair of biplane angiograms acquired at different projection angles. Since biplane angiography systems are widely available, this is a practical measurement geometry. The patient may move between acquisitions, so we have extended our model to allow for object translation between the first and second pair of projections. Second, we describe how to accurately estimate the dimensions of an aneurysm from the dual-biplane angiogram. We applied the new algorithm to four synthetic angiograms (projection angles 0°, 20°, 90°, and 110°) of a vessel with a small aneurysm and an eccentric stenosis. The angiograms were corrupted by additive noise and background structure. Except near the top and bottom of the aneurysm, the estimated cross sections of the aneurysm and stenosis agree very well with the true cross sections.

  2. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT∕CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate the partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT∕CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
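The Map/Reduce decomposition described above exploits the linearity of filtered backprojection: partial backprojections of disjoint projection subsets sum to the full volume. A toy parallel-beam sketch (plain Python stand-ins for Hadoop tasks; the `ramp_filter` and nearest-neighbour backprojector below are simplifications, not the FDK implementation used in the paper):

```python
import numpy as np
from functools import reduce

def ramp_filter(proj):
    # toy frequency-domain ramp filter (stand-in for the FDK filtering step)
    freqs = np.abs(np.fft.fftfreq(proj.size))
    return np.real(np.fft.ifft(np.fft.fft(proj) * freqs))

def backproject(task):
    # "Map" task: filter and backproject one subset of parallel-beam projections
    angles, projections, n = task
    xs = np.arange(n) - n / 2.0
    X, Y = np.meshgrid(xs, xs)
    volume = np.zeros((n, n))
    for theta, proj in zip(angles, projections):
        filtered = ramp_filter(proj)
        t = X * np.cos(theta) + Y * np.sin(theta) + n / 2.0
        volume += filtered[np.clip(t.astype(int), 0, n - 1)]
    return volume

n = 32
rng = np.random.default_rng(1)
angles = np.linspace(0.0, np.pi, 16, endpoint=False)
projections = rng.random((16, n))

# Map: each task handles a 4-projection subset; Reduce: sum the partial volumes.
tasks = [(angles[i:i + 4], projections[i:i + 4], n) for i in range(0, 16, 4)]
partial_volumes = map(backproject, tasks)
full_volume = reduce(np.add, partial_volumes)
```

Because each Map task is independent, a framework like Hadoop can schedule them on separate nodes; the Reduce step is a simple elementwise sum, which is why the observed speedup scales roughly linearly with node count.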

  3. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    Energy Technology Data Exchange (ETDEWEB)

    Bielecki, J.; Scholz, M.; Drozdowicz, K. [Institute of Nuclear Physics, Polish Academy of Sciences, PL-31342 Krakow (Poland); Giacomelli, L. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Istituto di Fisica del Plasma “P. Caldirola,” Milano (Italy); Kiptily, V.; Kempenaars, M. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Conroy, S. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Department of Physics and Astronomy, Uppsala University (Sweden); Craciunescu, T. [IAP, National Institute for Laser Plasma and Radiation Physics, Bucharest (Romania); Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-15

A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, only 19 lines of sight) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction on JET.
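Phillips-Tikhonov regularization of a linear ill-posed problem, min ||Ax - b||² + λ²||Lx||² with a second-difference smoothing operator L, can be sketched as a stacked least-squares solve. The forward operator and "emissivity" profile below are illustrative stand-ins, not the JET KN3 geometry:

```python
import numpy as np

def tikhonov(A, b, lam, L):
    # Phillips-Tikhonov solution of min ||A x - b||^2 + lam^2 ||L x||^2,
    # obtained from the equivalent stacked least-squares system
    A_aug = np.vstack([A, lam * L])
    b_aug = np.concatenate([b, np.zeros(L.shape[0])])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

n = 50
grid = np.arange(n)
A = np.exp(-0.1 * (grid[:, None] - grid[None, :]) ** 2)   # smoothing forward operator
x_true = np.exp(-0.5 * ((grid - 25) / 5.0) ** 2)          # smooth "emissivity" profile
rng = np.random.default_rng(2)
b = A @ x_true + 0.01 * rng.standard_normal(n)

L = np.diff(np.eye(n), n=2, axis=0)          # second-difference (smoothness) prior
x_naive = np.linalg.lstsq(A, b, rcond=None)[0]   # unregularized: noise-dominated
x_reg = tikhonov(A, b, 0.5, L)
```

The severe ill-conditioning of the smoothing operator makes the unregularized solution blow up on the measurement noise, while the curvature penalty keeps the regularized estimate close to the smooth truth; this is the same mechanism that lets a priori profile shape information stabilize the two-angle JET problem.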

  4. [Trachea repair and reconstruction with new composite artificial trachea transplantation].

    Science.gov (United States)

    Liu, Wenliang; Xiao, Peng; Liang, Hengxing; An, Ran; Cheng, Gang; Yu, Fenglei

    2013-03-01

To construct a new composite artificial trachea and to investigate the feasibility of trachea repair and reconstruction with the new composite artificial trachea transplantation in dogs. The basic skeleton of the new composite artificial trachea was a polytetrafluoroethylene vascular prosthesis linked with titanium rings at both ends. Dualmesh was sutured on the titanium rings. Sixteen dogs, weighing (14.9 +/- 2.0) kg, female or male, were selected. A 5 cm segment of cervical trachea was resected to prepare the cervical trachea defect model. Trachea repair and reconstruction was performed with the new composite artificial trachea. Fiberoptic bronchoscope examination, CT scan, and three-dimensional reconstruction were conducted immediately, 1 month, and 6 months after operation. Gross observation and histological examination were conducted at 14 months to evaluate the repair and reconstruction efficacy. No dog died during the operation of trachea reconstruction. Six dogs died of dyspnea at 37, 41, 55, 66, 140, and 274 days, respectively, because of anastomotic dehiscence and artificial trachea displacement; the other 10 dogs survived until 14 months. Fiberoptic bronchoscope examination, CT scan, and three-dimensional reconstruction showed that the artificial tracheas were all in good location without twisting immediately after operation; mild stenosis occurred and anastomoses had slight granulation in 6 dogs at 1 month; severe stenosis with more granulation at the anastomosis developed in 1 dog, and the other dogs were alive without anastomotic stenosis at 6 months. At 14 months, gross observation revealed that the outer surface of the artificial trachea was encapsulated by fibrous connective tissue in all of the 10 dogs. Histological examination showed inflammatory infiltration and hyperplasia of fibrous tissue and no epithelium growth on the inner wall of the artificial trachea. The new composite artificial trachea can be used to repair and reconstruct defect of the trachea for a short

  5. Vaginal reconstruction

    International Nuclear Information System (INIS)

    Lesavoy, M.A.

    1985-01-01

Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients.

  6. Early and late complications in the reconstructed mandible with free fibula flaps.

    Science.gov (United States)

    van Gemert, Johannes T M; Abbink, Jan H; van Es, Robert J J; Rosenberg, Antoine J W P; Koole, Ron; Van Cann, Ellen M

    2018-03-01

    Evaluation of mandibular reconstructions with free fibula flaps. Identification of factors associated with major recipient site complications, that is, necessitating surgical intervention under general anaesthesia. Seventy-nine reconstructions were included. The following factors were analyzed: fixation type, number of osteotomies, site of defect (bilateral/unilateral), surgeon, sex, ASA classification, continuous smoking, pathological N-stage, age, defect size, flap ischemic time, and postoperative radiotherapy. Proportional hazards regression was used to test the effect on the time between reconstruction and intervention. Sixty-nine (87%) of the 79 fibula flaps were successful at the last follow-up. Forty-eight major recipient site complications occurred in 41 reconstructions. Nineteen complications required surgical intervention within six weeks and were mostly vascular problems, necessitating immediate intervention. These early complications were associated with defects crossing the midline, with an estimated relative risk of 5.3 (CI 1.1-20, P = 0.01). Twenty-nine complications required surgical intervention more than 6 weeks after the reconstruction. These late complications generally occurred after months or years, and were associated with smoking, with an estimated relative risk of 2.8 (CI 1.0-8.3, P = 0.05). Fibula flaps crossing the midline have a higher risk of early major recipient site complications than unilateral reconstructions. Smoking increases the risk of late complications. © 2018 The Authors. Journal of Surgical Oncology Published by Wiley Periodicals, Inc.

  7. Direct and inverse problems of infrared tomography

    DEFF Research Database (Denmark)

    Sizikov, Valery S.; Evseev, Vadim; Fateev, Alexander

    2016-01-01

The problems of infrared tomography, direct (the modeling of measured functions) and inverse (the reconstruction of gaseous medium parameters), are considered with a laboratory burner flame as an example application. Two measurement modes are used: active (ON) with an external IR source...

  8. Bessel Fourier orientation reconstruction: an analytical EAP reconstruction using multiple shell acquisitions in diffusion MRI.

    Science.gov (United States)

    Hosseinbor, Ameer Pasha; Chung, Moo K; Wu, Yu-Chien; Alexander, Andrew L

    2011-01-01

The estimation of the ensemble average propagator (EAP) directly from q-space DWI signals is an open problem in diffusion MRI. Diffusion spectrum imaging (DSI) is one common technique to compute the EAP directly from the diffusion signal, but it is burdened by the large sampling required. Recently, several analytical EAP reconstruction schemes for multiple q-shell acquisitions have been proposed. One in particular is Diffusion Propagator Imaging (DPI), which is based on Laplace's-equation estimation of the diffusion signal for each shell acquisition. Viewed intuitively in terms of the heat equation, the DPI solution is obtained when the heat distribution between temperature measurements at each shell is at steady state. We propose a generalized extension of DPI, Bessel Fourier Orientation Reconstruction (BFOR), whose solution is based on heat-equation estimation of the diffusion signal for each shell acquisition. That is, the heat distribution between shell measurements is no longer at steady state. In addition to being analytical, the BFOR solution also includes an intrinsic exponential smoothing term. We illustrate the effectiveness of the proposed method by showing results on both synthetic and real MR datasets.

  9. Technical Note: Correcting for signal attenuation from noisy proxy data in climate reconstructions

    KAUST Repository

    Ammann, C. M.

    2010-04-20

    Regression-based climate reconstructions scale one or more noisy proxy records against a (generally) short instrumental data series. Based on that relationship, the indirect information is then used to estimate that particular measure of climate back in time. A well-calibrated proxy record(s), if stationary in its relationship to the target, should faithfully preserve the mean amplitude of the climatic variable. However, it is well established in the statistical literature that traditional regression parameter estimation can lead to substantial amplitude attenuation if the predictors carry significant amounts of noise. This issue is known as "Measurement Error" (Fuller, 1987; Carroll et al., 2006). Climate proxies derived from tree-rings, ice cores, lake sediments, etc., are inherently noisy and thus all regression-based reconstructions could suffer from this problem. Some recent applications attempt to ward off amplitude attenuation, but implementations are often complex (Lee et al., 2008) or require additional information, e.g. from climate models (Hegerl et al., 2006, 2007). Here we explain the cause of the problem and propose an easy, generally applicable, data-driven strategy to effectively correct for attenuation (Fuller, 1987; Carroll et al., 2006), even at annual resolution. The impact is illustrated in the context of a Northern Hemisphere mean temperature reconstruction. An inescapable trade-off for achieving an unbiased reconstruction is an increase in variance, but for many climate applications the change in mean is a core interest.
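The attenuation effect and its correction can be demonstrated in a few lines: regressing the climate target on a noisy proxy biases the slope toward zero by the reliability ratio, and dividing by that ratio restores the amplitude. The sketch below is synthetic and assumes the proxy noise variance is known for simplicity; in the data-driven strategy of the note, it must itself be estimated:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
climate = rng.standard_normal(n)                 # "true" climate, unit variance
noise_var = 0.5                                  # proxy noise variance (assumed known here)
proxy = climate + np.sqrt(noise_var) * rng.standard_normal(n)

# Naive calibration regresses climate on the noisy proxy; the slope is
# attenuated by the reliability ratio k = var(signal) / var(proxy).
slope_naive = np.cov(proxy, climate)[0, 1] / np.var(proxy, ddof=1)

# Measurement-error correction: divide by the estimated reliability ratio.
reliability = (np.var(proxy, ddof=1) - noise_var) / np.var(proxy, ddof=1)
slope_corrected = slope_naive / reliability
```

With a signal-to-total variance ratio of 1/1.5, the naive slope sits near 0.67 instead of 1; the corrected slope is unbiased, at the cost of a larger variance, matching the trade-off described in the note.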

  10. Technical Note: Correcting for signal attenuation from noisy proxy data in climate reconstructions

    Directory of Open Access Journals (Sweden)

    C. M. Ammann

    2010-04-01

Full Text Available Regression-based climate reconstructions scale one or more noisy proxy records against a (generally) short instrumental data series. Based on that relationship, the indirect information is then used to estimate that particular measure of climate back in time. A well-calibrated proxy record(s), if stationary in its relationship to the target, should faithfully preserve the mean amplitude of the climatic variable. However, it is well established in the statistical literature that traditional regression parameter estimation can lead to substantial amplitude attenuation if the predictors carry significant amounts of noise. This issue is known as "Measurement Error" (Fuller, 1987; Carroll et al., 2006). Climate proxies derived from tree-rings, ice cores, lake sediments, etc., are inherently noisy and thus all regression-based reconstructions could suffer from this problem. Some recent applications attempt to ward off amplitude attenuation, but implementations are often complex (Lee et al., 2008) or require additional information, e.g. from climate models (Hegerl et al., 2006, 2007). Here we explain the cause of the problem and propose an easy, generally applicable, data-driven strategy to effectively correct for attenuation (Fuller, 1987; Carroll et al., 2006), even at annual resolution. The impact is illustrated in the context of a Northern Hemisphere mean temperature reconstruction. An inescapable trade-off for achieving an unbiased reconstruction is an increase in variance, but for many climate applications the change in mean is a core interest.

  11. Breast Reconstruction After Mastectomy

    Science.gov (United States)

What is breast reconstruction? How do surgeons use implants to reconstruct a woman's breast? How do surgeons ...

  12. Multi-sheet surface rebinning methods for reconstruction from asymmetrically truncated cone beam projections: I. Approximation and optimality

    International Nuclear Information System (INIS)

    Betcke, Marta M; Lionheart, William R B

    2013-01-01

    The mechanical motion of the gantry in conventional cone beam CT scanners restricts the speed of data acquisition in applications with near real time requirements. A possible resolution of this problem is to replace the moving source detector assembly with static parts that are electronically activated. An example of such a system is the Rapiscan Systems RTT80 real time tomography scanner, with a static ring of sources and axially offset static cylinder of detectors. A consequence of such a design is asymmetrical axial truncation of the cone beam projections resulting, in the sense of integral geometry, in severely incomplete data. In particular we collect data only in a fraction of the Tam–Danielsson window, hence the standard cone beam reconstruction techniques do not apply. In this work we propose a family of multi-sheet surface rebinning methods for reconstruction from such truncated projections. The proposed methods combine analytical and numerical ideas utilizing linearity of the ray transform to reconstruct data on multi-sheet surfaces, from which the volumetric image is obtained through deconvolution. In this first paper in the series, we discuss the rebinning to multi-sheet surfaces. In particular we concentrate on the underlying transforms on multi-sheet surfaces and their approximation with data collected by offset multi-source scanning geometries like the RTT. The optimal multi-sheet surface and the corresponding rebinning function are found as a solution of a variational problem. In the case of the quadratic objective, the variational problem for the optimal rebinning pair can be solved by a globally convergent iteration. Examples of optimal rebinning pairs are computed for different trajectories. We formulate the axial deconvolution problem for the recovery of the volumetric image from the reconstructions on multi-sheet surfaces. Efficient and stable solution of the deconvolution problem is the subject of the second paper in this series (Betcke and

  13. A maximum-likelihood reconstruction algorithm for tomographic gamma-ray nondestructive assay

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Estep, R.J.; Cole, R.A.; Sheppard, G.A.

    1994-01-01

    A new tomographic reconstruction algorithm for nondestructive assay with high resolution gamma-ray spectroscopy (HRGS) is presented. The reconstruction problem is formulated using a maximum-likelihood approach in which the statistical structure of both the gross and continuum measurements used to determine the full-energy response in HRGS is precisely modeled. An accelerated expectation-maximization algorithm is used to determine the optimal solution. The algorithm is applied to safeguards and environmental assays of large samples (for example, 55-gal. drums) in which high continuum levels caused by Compton scattering are routinely encountered. Details of the implementation of the algorithm and a comparative study of the algorithm's performance are presented
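The core of an expectation-maximization reconstruction of this kind is the multiplicative MLEM update. A minimal dense-matrix sketch on a toy system (without the gross/continuum response modelling or the acceleration described in the record):

```python
import numpy as np

def mlem(A, b, n_iter=300):
    # multiplicative EM update for Poisson data b ~ Poisson(A x):
    #   x <- x * [A^T (b / (A x))] / (A^T 1)
    x = np.ones(A.shape[1])
    sensitivity = A.T @ np.ones(A.shape[0])
    for _ in range(n_iter):
        x *= (A.T @ (b / (A @ x))) / sensitivity
    return x

rng = np.random.default_rng(4)
A = rng.random((40, 20))                         # toy system matrix
x_true = rng.random(20) + 0.1
b = rng.poisson(1000.0 * (A @ x_true)) / 1000.0  # high-count Poisson measurements
x_est = mlem(A, b)
```

The update is multiplicative, so the estimate stays non-negative automatically, and each iteration monotonically increases the Poisson likelihood; both properties are why EM variants are standard for emission tomography.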

  14. Weighted expectation maximization reconstruction algorithms with application to gated megavoltage tomography

    International Nuclear Information System (INIS)

    Zhang Jin; Shi Daxin; Anastasio, Mark A; Sillanpaa, Jussi; Chang Jenghwa

    2005-01-01

    We propose and investigate weighted expectation maximization (EM) algorithms for image reconstruction in x-ray tomography. The development of the algorithms is motivated by the respiratory-gated megavoltage tomography problem, in which the acquired asymmetric cone-beam projections are limited in number and unevenly sampled over view angle. In these cases, images reconstructed by use of the conventional EM algorithm can contain ring- and streak-like artefacts that are attributable to a combination of data inconsistencies and truncation of the projection data. By use of computer-simulated and clinical gated fan-beam megavoltage projection data, we demonstrate that the proposed weighted EM algorithms effectively mitigate such image artefacts. (note)

  15. Evidence-Based ACL Reconstruction

    Directory of Open Access Journals (Sweden)

    E. Carlos RODRIGUEZ-MERCHAN

    2015-01-01

Full Text Available There is controversy in the literature regarding a number of topics related to anterior cruciate ligament (ACL) reconstruction. The purpose of this article is to answer the following questions: (1) bone-patellar tendon-bone (BPTB) reconstruction or hamstring reconstruction (HR); (2) double bundle or single bundle; (3) allograft or autograft; (4) early or late reconstruction; (5) rate of return to sports after ACL reconstruction; (6) rate of osteoarthritis after ACL reconstruction. A Cochrane Library and PubMed (MEDLINE) search of systematic reviews and meta-analyses related to ACL reconstruction was performed. The key words were: ACL reconstruction, systematic reviews and meta-analysis. The main criterion for selection was that the articles were systematic reviews and meta-analyses focused on the aforementioned questions. Sixty-nine articles were found, but only 26 were selected and reviewed because they had a high grade (I-II) of evidence. BPTB reconstruction was associated with better postoperative knee stability but with a higher rate of morbidity. However, the results of both procedures in terms of long-term functional outcome were similar. The double-bundle ACL reconstruction technique showed better outcomes in rotational laxity, although functional recovery was similar between single-bundle and double-bundle techniques. Autograft yielded better results than allograft. There was no difference between early and delayed reconstruction. 82% of patients were able to return to some kind of sport participation. 28% of patients presented radiological signs of osteoarthritis with a minimum follow-up of 10 years.

  16. Quiet planting in the locked constraints satisfaction problems

    Energy Technology Data Exchange (ETDEWEB)

    Zdeborova, Lenka [Los Alamos National Laboratory; Krzakala, Florent [Los Alamos National Laboratory

    2009-01-01

We study the planted ensemble of locked constraint satisfaction problems. We describe the connection between the random and planted ensembles. The use of the cavity method is combined with arguments from reconstruction on trees and first and second moment considerations; in particular, the connection with reconstruction on trees appears to be crucial. Our main result is the location of the hard region in the planted ensemble, thus providing hard satisfiable benchmarks. In a part of that hard region, instances have, with high probability, a single satisfying assignment.

  17. Automatic Indoor Building Reconstruction from Mobile Laser Scanning Data

    Science.gov (United States)

    Xie, L.; Wang, R.

    2017-09-01

Indoor reconstruction from point clouds is a hot topic in photogrammetry, computer vision and computer graphics. Reconstructing indoor scenes from point clouds is challenging due to complex room floorplans and line-of-sight occlusions. Most existing methods deal with stationary terrestrial laser scanning point clouds or RGB-D point clouds. In this paper, we propose an automatic method for reconstructing indoor 3D building models from mobile laser scanning point clouds. The method includes 2D floorplan generation, 3D building modeling, door detection and room segmentation. The main idea behind our approach is to separate the wall structure into two different types, the inner wall and the outer wall, based on the observed point distribution. We then utilize a graph-cut based optimization method to solve the labeling problem and generate the 2D floorplan from the optimization result. Subsequently, we leverage an α-shape based method to detect the doors on the 2D projected point clouds and utilize the floorplan to segment the individual rooms. The experiments show that the door detection method achieves a recognition rate of 97% and the room segmentation method attains correct segmentation results. We also evaluate the reconstruction accuracy on synthetic data, which indicates that the accuracy of our method is comparable to the state of the art.

  18. PET image reconstruction: mean, variance, and optimal minimax criterion

    International Nuclear Information System (INIS)

    Liu, Huafeng; Guo, Min; Gao, Fei; Shi, Pengcheng; Xue, Liying; Nie, Jing

    2015-01-01

Given the noisy nature of positron emission tomography (PET) measurements, it is critical to know the image quality and reliability as well as the expected radioactivity map (mean image) for both qualitative interpretation and quantitative analysis. While existing efforts have often been devoted to providing only the reconstructed mean image, we present a unified framework for joint estimation of the mean and corresponding variance of the radioactivity map based on an efficient optimal minimax criterion. The proposed framework formulates the PET image reconstruction problem as a transformation from system uncertainties to estimation errors, where the minimax criterion is adopted to minimize the estimation errors under possibly maximized system uncertainties. The estimation errors, in the form of a covariance matrix, express the measurement uncertainties in a complete way. The framework is then optimized by H∞-norm optimization and solved with the corresponding H∞ filter. Unlike conventional statistical reconstruction algorithms that rely on statistical modeling of the measurement data or noise, the proposed joint estimation starts from the point of view of signal energies and can handle anything from imperfect statistical assumptions to no a priori statistical assumptions at all. The performance and accuracy of the reconstructed mean and variance images are validated using Monte Carlo simulations. Experiments on phantom scans with a small animal PET scanner and real patient scans are also conducted for assessment of clinical potential. (paper)

  19. A TRAM flap design refinement for use in delayed breast reconstruction.

    LENUS (Irish Health Repository)

    Patel, A J K

    2009-09-01

Autologous breast reconstruction following mastectomy is commonly achieved using the free Transverse Rectus Abdominis Myocutaneous (TRAM) flap. Since its first description, refinements and modifications have resulted in improved operative techniques and more aesthetically pleasing reconstructions. Pre-operative flap design, however, is a relatively new concept that has not received much attention in the literature. Patients who undergo breast reconstruction may have large, ptotic contralateral breasts. In these patients there is a tendency to raise a large abdominal flap in an attempt to achieve symmetry, or simply a larger breast. This has the potential to lead to tight closure of the abdomen and the risk of subsequent wound problems. Reconstructions that are too small or have inadequate ptosis commit the patient to contralateral breast surgery to achieve symmetry. Pre-operatively designing the flap, using a template created from the opposite breast, can help achieve a good match, often reducing the need for contralateral breast surgery. Even when contralateral breast reduction surgery is planned in advance, many of these patients still require, and prefer, a large reconstruction in order to achieve a well-proportioned result. We present a design template that addresses these particular issues and in the senior author's hands has proved to be a very effective technique. Our technique allows raising an abdominal flap of less vertical height than traditionally used (thus reducing the risk of tight abdominal closure) and incorporates an inverted V-shaped flap of skin from the inferior mastectomy skin flap into the reconstruction. This allows more flap tissue to be available to fill the upper poles of the reconstructed breast and at the same time produces good ptosis.

  20. Reconstructing missing daily precipitation data using regression trees and artificial neural networks

    Science.gov (United States)

    Incomplete meteorological data has been a problem in environmental modeling studies. The objective of this work was to develop a technique to reconstruct missing daily precipitation data in the central part of Chesapeake Bay Watershed using regression trees (RT) and artificial neural networks (ANN)....
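As a toy illustration of tree-based gap filling, the sketch below trains a depth-1 regression "stump" in pure NumPy (a drastic simplification of the RT and ANN models of the study) on days where a target gauge is observed, then predicts its missing days from a correlated neighbor gauge; the gauge data are synthetic:

```python
import numpy as np

def fit_stump(x, y):
    # depth-1 regression tree: choose the split on x minimizing squared error
    best = (np.inf, 0.0, y.mean(), y.mean())
    for s in np.unique(x)[:-1]:
        left, right = y[x <= s], y[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, s, left.mean(), right.mean())
    return best[1:]

def predict_stump(model, x):
    s, lo, hi = model
    return np.where(x <= s, lo, hi)

rng = np.random.default_rng(5)
neighbor = rng.gamma(0.5, 8.0, size=365)             # neighbor-gauge daily precipitation
target = np.maximum(0.8 * neighbor + rng.normal(0.0, 1.0, 365), 0.0)
missing = rng.random(365) < 0.1                      # ~10% of target days missing

model = fit_stump(neighbor[~missing], target[~missing])
filled = target.copy()
filled[missing] = predict_stump(model, neighbor[missing])
```

Real regression trees recurse on such splits (and ANNs fit a smooth mapping instead), but even this single split exploits the spatial correlation between gauges that the reconstruction technique relies on.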

  1. Tomographic Reconstruction from a Few Views: A Multi-Marginal Optimal Transport Approach

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, I., E-mail: isabelle.abraham@cea.fr [CEA Ile de France (France); Abraham, R., E-mail: romain.abraham@univ-orleans.fr; Bergounioux, M., E-mail: maitine.bergounioux@univ-orleans.fr [Université d’Orléans, UFR Sciences, MAPMO, UMR 7349 (France); Carlier, G., E-mail: carlier@ceremade.dauphine.fr [CEREMADE, UMR CNRS 7534, Université Paris IX Dauphine, Pl. de Lattre de Tassigny (France)

    2017-02-15

    In this article, we focus on tomographic reconstruction. The problem is to determine the shape of the interior interface using a tomographic approach while very few X-ray radiographs are performed. We use a multi-marginal optimal transport approach. Preliminary numerical results are presented.
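The record gives no algorithmic detail; as background, entropic regularization reduces two-marginal optimal transport to Sinkhorn's matrix-scaling iteration, sketched below (the multi-marginal formulation used in the paper generalizes this to several marginals at once):

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, n_iter=2000):
    # entropic optimal transport between marginals mu, nu with ground cost C:
    # scale K = exp(-C/eps) so that P = diag(u) K diag(v) matches both marginals
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    return u[:, None] * K * v[None, :]

x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-200.0 * (x - 0.3) ** 2); mu /= mu.sum()   # source marginal
nu = np.exp(-200.0 * (x - 0.7) ** 2); nu /= nu.sum()   # target marginal
C = (x[:, None] - x[None, :]) ** 2                     # squared-distance ground cost
P = sinkhorn(mu, nu, C)
```

The returned plan P is a joint distribution whose row and column sums reproduce the two marginals; in the tomographic setting, constraints of this kind couple the unknown density to the few measured radiographs.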

  2. In-process 3D geometry reconstruction of objects produced by direct light projection

    DEFF Research Database (Denmark)

    Andersen, Ulrik Vølcker; Pedersen, David Bue; Hansen, Hans Nørgaard

    2013-01-01

    al. 2011), this method has shown its potential with 3D printing (3DP) and selective laser sintering additive manufacturing processes, where it is possible to directly capture the geometrical features of each individual layer during a build job using a digital camera. When considering the process...... equipment such as coordinate measuring machines cannot be verified easily. This problem is addressed by developing an in-line reverse engineering and 3D reconstruction method that allows a true-to-scale reconstruction of a part being additively manufactured. In earlier works (Pedersen et al. 2010; Hansen et...

  3. Simultaneous reconstruction of outer boundary shape and conductivity distribution in electrical impedance tomography

    KAUST Repository

    Hyvönen, Nuutti

    2016-01-05

The simultaneous retrieval of the exterior boundary shape and the interior admittivity distribution of an examined body in electrical impedance tomography is considered. The reconstruction method is built for the complete electrode model and it is based on the Fréchet derivative of the corresponding current-to-voltage map with respect to the body shape. The reconstruction problem is cast into the Bayesian framework, and maximum a posteriori estimates for the admittivity and the boundary geometry are computed. The feasibility of the approach is evaluated by experimental data from water tank measurements.
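For a linearized Gaussian model, the maximum a posteriori estimate mentioned above has a closed form; the sketch below uses a random Jacobian as a hypothetical stand-in for the Fréchet derivative of the current-to-voltage map (the actual complete-electrode-model problem is nonlinear):

```python
import numpy as np

def map_estimate(J, y, noise_cov, prior_cov, prior_mean):
    # MAP estimate for the linear Gaussian model y = J x + e,
    # e ~ N(0, noise_cov), x ~ N(prior_mean, prior_cov):
    #   x_map = m + (J^T Ge^-1 J + Gp^-1)^-1 J^T Ge^-1 (y - J m)
    Ge_inv = np.linalg.inv(noise_cov)
    H = J.T @ Ge_inv @ J + np.linalg.inv(prior_cov)
    return prior_mean + np.linalg.solve(H, J.T @ Ge_inv @ (y - J @ prior_mean))

rng = np.random.default_rng(6)
J = rng.standard_normal((30, 10))       # stand-in for the linearized forward map
x_true = rng.standard_normal(10)
noise_cov = 0.01 * np.eye(30)
prior_cov = np.eye(10)
y = J @ x_true + rng.multivariate_normal(np.zeros(30), noise_cov)
x_map = map_estimate(J, y, noise_cov, prior_cov, np.zeros(10))
```

In the full problem, the parameter vector would stack admittivity and boundary-shape coefficients, and the Jacobian would be re-linearized iteratively around the current estimate.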

  4. Reconstruction of initial pressure from limited view photoacoustic images using deep learning

    Science.gov (United States)

    Waibel, Dominik; Gröhl, Janek; Isensee, Fabian; Kirchner, Thomas; Maier-Hein, Klaus; Maier-Hein, Lena

    2018-02-01

    Quantification of tissue properties with photoacoustic (PA) imaging typically requires a highly accurate representation of the initial pressure distribution in tissue. Almost all PA scanners reconstruct the PA image only from a partial scan of the emitted sound waves. Especially handheld devices, which have become increasingly popular due to their versatility and ease of use, only provide limited view data because of their geometry. Owing to such limitations in hardware as well as to the acoustic attenuation in tissue, state-of-the-art reconstruction methods deliver only approximations of the initial pressure distribution. To overcome the limited view problem, we present a machine learning-based approach to the reconstruction of initial pressure from limited view PA data. Our method involves a fully convolutional deep neural network based on a U-Net-like architecture with pixel-wise regression loss on the acquired PA images. It is trained and validated on in silico data generated with Monte Carlo simulations. In an initial study we found an increase in accuracy over the state-of-the-art when reconstructing simulated linear-array scans of blood vessels.

  5. The degeneracy problem in non-canonical inflation

    International Nuclear Information System (INIS)

    Easson, Damien A.; Powell, Brian A.

    2013-01-01

    While attempting to connect inflationary theories to observational physics, a potential difficulty is the degeneracy problem: a single set of observables maps to a range of different inflaton potentials. Two important classes of models affected by the degeneracy problem are canonical and non-canonical models, the latter marked by the presence of a non-standard kinetic term that generates observables beyond the scalar and tensor two-point functions on CMB scales. The degeneracy problem is manifest when these distinguishing observables go undetected. We quantify the size of the resulting degeneracy in this case by studying the most well-motivated non-canonical theory, that with a Dirac-Born-Infeld Lagrangian. Beyond the scalar and tensor two-point functions on CMB scales, we then consider the possible detection of equilateral non-Gaussianity at Planck precision and a measurement of primordial gravitational waves from prospective space-based laser interferometers. The former detection breaks the degeneracy with canonical inflation but results in poor reconstruction prospects, while the latter measurement enables a determination of n_T which, while not breaking the degeneracy, can be shown to greatly improve the non-canonical reconstruction.

  6. Novel arenavirus sequences in Hylomyscus sp. and Mus (Nannomys) setulosus from Côte d'Ivoire: implications for evolution of arenaviruses in Africa.

    Directory of Open Access Journals (Sweden)

    David Coulibaly-N'Golo

    Full Text Available This study aimed to identify new arenaviruses and gather insights into the evolution of arenaviruses in Africa. During 2003 through 2005, 1,228 small mammals representing 14 different genera were trapped in 9 villages in the south, east, and middle west of Côte d'Ivoire. Specimens were screened by pan-Old World arenavirus RT-PCRs targeting the S and L RNA segments as well as by immunofluorescence assay. Sequences of two novel tentative species of the family Arenaviridae, Menekre and Gbagroube virus, were detected in Hylomyscus sp. and Mus (Nannomys) setulosus, respectively. Arenavirus infection of Mus (Nannomys) setulosus was also demonstrated by serological testing. Lassa virus was not found, although 60% of the captured animals were Mastomys natalensis. Complete S RNA and partial L RNA sequences of the novel viruses were recovered from the rodent specimens and subjected to phylogenetic analysis. Gbagroube virus is a closely related sister taxon of Lassa virus, while Menekre virus clusters with the Ippy/Mobala/Mopeia virus complex. Reconstruction of possible virus-host co-phylogeny scenarios suggests that, within the African continent, signatures of co-evolution might have been obliterated by multiple host-switching events.

  7. Four-dimensional volume-of-interest reconstruction for cone-beam computed tomography-guided radiation therapy.

    Science.gov (United States)

    Ahmad, Moiz; Balter, Peter; Pan, Tinsu

    2011-10-01

    Data sufficiency is a major problem in four-dimensional cone-beam computed tomography (4D-CBCT) on linear accelerator-integrated scanners for image-guided radiotherapy. Scan times must be in the range of 4-6 min to avoid undersampling artifacts. Various image reconstruction algorithms have been proposed to accommodate undersampled data acquisitions, but these algorithms are computationally expensive, may require long reconstruction times, and may require algorithm parameters to be optimized. The authors present a novel reconstruction method, 4D volume-of-interest (4D-VOI) reconstruction, which suppresses undersampling artifacts and resolves lung tumor motion for undersampled 1-min scans. The 4D-VOI reconstruction is much less computationally expensive than other 4D-CBCT algorithms. The 4D-VOI method uses respiration-correlated projection data to reconstruct a four-dimensional (4D) image inside a VOI containing the moving tumor, and uncorrelated projection data to reconstruct a three-dimensional (3D) image outside the VOI. Anatomical motion is resolved inside the VOI and blurred outside the VOI. The authors acquired a 1-min scan of an anthropomorphic chest phantom containing a moving water-filled sphere. The authors also used previously acquired 1-min scans for two lung cancer patients who had received CBCT-guided radiation therapy. The same raw data were used to test and compare the 4D-VOI reconstruction with the standard 4D reconstruction and the McKinnon-Bates (MB) reconstruction algorithms. Both the 4D-VOI and the MB reconstructions suppress nearly all the streak artifacts compared with the standard 4D reconstruction, but the 4D-VOI has a 3-8 times greater contrast-to-noise ratio than the MB reconstruction. In the dynamic chest phantom study, the 4D-VOI and the standard 4D reconstructions both resolved a moving sphere with an 18 mm displacement. The 4D-VOI reconstruction shows a motion blur of only 3 mm, whereas the MB reconstruction shows a motion blur of 13 mm.
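The respiration-correlated binning that drives this kind of 4D reconstruction can be sketched in a few lines. The frame rate, breathing period, and bin count below are invented for illustration:

```python
import numpy as np

# Sketch of respiration-correlated projection binning for a 4D-VOI-style
# reconstruction (illustrative parameters, not the paper's acquisition).
idx = np.arange(600)                  # projection index: 10/s over a 1-min scan
period = 40                           # assumed breathing period: 4 s = 40 projections
n_bins = 10
phase_bin = (idx % period) * n_bins // period   # respiratory phase bin per projection

# Inside the VOI: each respiratory phase uses only its own projections (4D).
phase0 = np.where(phase_bin == 0)[0]
# Outside the VOI: a single static image uses all projections (3D).
all_proj = idx
```

Each phase bin receives only a tenth of the projections (hence the undersampling artifacts inside the VOI that the method must suppress), while the region outside the VOI is reconstructed from the full set.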

  8. l0 regularization based on a prior image incorporated non-local means for limited-angle X-ray CT reconstruction.

    Science.gov (United States)

    Zhang, Lingli; Zeng, Li; Guo, Yumeng

    2018-03-15

    Restricted by the scanning environment in some CT imaging modalities, the acquired projection data are usually incomplete, which may lead to a limited-angle reconstruction problem. Image quality then usually suffers from slope artifacts. The objective of this study is to first investigate the distorted regions of reconstructed images affected by slope artifacts and then present a new iterative reconstruction method to address the limited-angle X-ray CT reconstruction problem. The framework of the new method exploits the structural similarity between the prior image and the reconstructed image in order to compensate for the distorted edges. Specifically, the new method utilizes l0 regularization and wavelet tight framelets to suppress the slope artifacts and pursue sparsity. The new method comprises the following four steps: (1) address the data fidelity using SART; (2) compensate for the slope artifacts due to the missing projection data using the prior image and modified non-local means (PNLM); (3) utilize l0 regularization to suppress the slope artifacts and pursue the sparsity of the wavelet coefficients of the transformed image by iterative hard thresholding (l0W); and (4) apply an inverse wavelet transform to reconstruct the image. In summary, this method is referred to as "l0W-PNLM". Numerical implementations showed that the presented l0W-PNLM was superior in suppressing the slope artifacts while preserving the edges of some features, as compared to the commercial and other popular investigative algorithms. When the image to be reconstructed is inconsistent with the prior image, the new method can avoid or minimize distorted edges in the reconstructed images. Quantitative assessments also showed that the new method obtained the highest image quality compared to the existing algorithms.
This study demonstrated that the presented l0W-PNLM yielded higher image quality due to a number of unique characteristics, which include that (1) it utilizes
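The iterative hard-thresholding mechanism behind the l0 step (step 3) can be shown in isolation with a generic sparse-recovery loop. The sensing matrix, sparsity level, and sizes below are invented, and the wavelet transform, SART, and prior-image terms of l0W-PNLM are all omitted:

```python
import numpy as np

# Generic iterative hard thresholding (IHT): x <- H_k(x + mu * A^T (y - A x)),
# where H_k keeps the k largest-magnitude entries (the l0 projection).
rng = np.random.default_rng(1)
A = rng.normal(size=(80, 100)) / np.sqrt(80)   # stand-in measurement operator
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -1.5, 3.0]          # 3-sparse ground truth
y = A @ x_true

def hard_threshold(v, k):
    # Keep the k largest-magnitude coefficients, zero the rest.
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

mu = 1.0 / np.linalg.norm(A, 2) ** 2            # conservative step size
x = np.zeros(100)
for _ in range(500):
    x = hard_threshold(x + mu * A.T @ (y - A @ x), k=3)
```

The hard-thresholding projection is what enforces the l0 constraint at every iteration, in contrast to the soft thresholding used by l1 methods.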

  9. STEP: Self-supporting tailored k-space estimation for parallel imaging reconstruction.

    Science.gov (United States)

    Zhou, Zechen; Wang, Jinnan; Balu, Niranjan; Li, Rui; Yuan, Chun

    2016-02-01

    A new subspace-based iterative reconstruction method, termed Self-supporting Tailored k-space Estimation for Parallel imaging reconstruction (STEP), is presented and evaluated in comparison to the existing autocalibrating method SPIRiT and the calibrationless method SAKE. In STEP, two tailored schemes, k-space partition and basis selection, are proposed to promote a spatially variant signal subspace and are incorporated into a self-supporting structured low-rank model to enforce the properties of locality, sparsity, and rank deficiency; this can be formulated as a constrained optimization problem and solved by an iterative algorithm. Simulated and in vivo datasets were used to investigate the performance of STEP in terms of overall image quality and detail structure preservation. The advantage of STEP in image quality is demonstrated on retrospectively undersampled multichannel Cartesian data with various sampling patterns. Compared with SPIRiT and SAKE, STEP provides more accurate reconstructed images with fewer residual aliasing artifacts and reduced noise amplification in simulation and in vivo experiments. In addition, STEP is capable of combining compressed sensing with arbitrary sampling trajectories. Using k-space partition and basis selection can further improve the performance of parallel imaging reconstruction with or without calibration signals. © 2015 Wiley Periodicals, Inc.
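The rank-deficiency property that structured low-rank k-space models exploit can be illustrated with a toy example (the full STEP pipeline is far richer than this). A Hankel matrix built from a sum of r complex exponentials has rank r; the frequencies and sizes below are invented:

```python
import numpy as np

# Toy demonstration of structured low rank: a Hankel matrix formed from a
# signal composed of 2 complex exponentials has rank 2.
n = 64
t = np.arange(n)
signal = np.exp(2j * np.pi * 0.11 * t) + 0.5 * np.exp(2j * np.pi * 0.31 * t)

window = 32
H = np.array([signal[i:i + window] for i in range(n - window + 1)])  # Hankel matrix
rank = np.linalg.matrix_rank(H, tol=1e-8)
```

Enforcing such low rank on matrices built from undersampled k-space is what allows missing samples to be estimated; STEP additionally tailors the partitioning and basis selection spatially.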

  10. Optimized 3D Street Scene Reconstruction from Driving Recorder Images

    Directory of Open Access Journals (Sweden)

    Yongjun Zhang

    2015-07-01

    Full Text Available The paper presents an automatic region-detection-based method to reconstruct street scenes from driving recorder images. The driving recorder in this paper is a dashboard camera that collects images while the motor vehicle is moving. An enormous number of moving vehicles appear in the collected data because typical recorders are mounted at the front of moving vehicles and face forward, which can make matching points on vehicles and guardrails unreliable. Because these image data are inexpensive, widely used, and offer extensive shooting coverage, utilizing them can reduce street scene reconstruction and updating costs; we therefore propose a new method, called the Mask automatic detecting method, to improve structure-from-motion reconstruction results. Note that we define vehicle and guardrail regions as the “mask” in this paper, since the features on them should be masked out to avoid poor matches. After removing these feature points, the camera poses and sparse 3D points are reconstructed from the remaining matches. Our comparison experiments with typical structure-from-motion (SfM) reconstruction pipelines, such as Photosynth and VisualSFM, demonstrated that the Mask decreased the root-mean-square error (RMSE) of the pairwise matching results, which led to more accurate recovery of the relative camera poses. Removing features with the Mask also increased the accuracy of the point clouds by nearly 30%–40% and corrected the typical methods’ problem of reconstructing a single target building several times.
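The masking step itself is simple to sketch: keypoints that fall inside detected vehicle or guardrail regions are discarded before matching. The mask and keypoint coordinates below are invented for illustration:

```python
import numpy as np

# Sketch of mask-based keypoint filtering prior to SfM matching.
mask = np.zeros((100, 100), dtype=bool)
mask[40:80, 20:60] = True                       # detected vehicle region

keypoints = np.array([[10, 10], [50, 30], [70, 55], [90, 90]])  # (row, col)
keep = ~mask[keypoints[:, 0], keypoints[:, 1]]  # True where keypoint is off-mask
filtered = keypoints[keep]                      # only static-scene features remain
```

Only the features on the static scene survive, so subsequent pairwise matching and pose estimation are not contaminated by points that move with the traffic.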

  11. A new OH5 reconstruction with an assessment of its uncertainty.

    Science.gov (United States)

    Benazzi, Stefano; Bookstein, Fred L; Strait, David S; Weber, Gerhard W

    2011-07-01

    The OH5 cranium, holotype of Paranthropus boisei, consists of two main portions that do not fit together: the extensively reconstructed face and a portion of the neurocranium. A physical reconstruction of the cranium was carried out by Tobias in 1967, who did not discuss problems related to deformation, although he noted a slight functional asymmetry. Nevertheless, the reconstructed cranium shows some anomalies, mainly due to the rightward-skewed position of the upper calvariofacial fragment and uncertainty in the position of the neurocranium relative to the face, which hamper further quantitative analysis of OH5's cranial geometry. Here, we present a complete virtual reconstruction of OH5 using three-dimensional (3D) digital data, geometric morphometric (GM) methods, and computer-aided design (CAD) techniques. Starting from a CT scan of Tobias's reconstruction, a semi-automatic segmentation method was used to remove Tobias's plaster. The upper calvariofacial fragment was separated from the lower facial fragment and re-aligned by superposing their independent midsagittal planes within a range of feasible positions. The missing parts of the right hemiface were reconstructed using a non-uniform rational basis-spline (NURBS) surface and subsequently mirrored across the midsagittal plane to arrive at a symmetrical facial reconstruction. A symmetric neurocranium was obtained as the average of the original shape and its mirrored version. The alignment between the two symmetric shapes (face and neurocranium) used their independent midsagittal planes and a reference shape (KNM-ER 406) to greatly reduce their degrees of freedom. From the series of alternative reconstructions, we selected the middle of this rather small feasible range.
When reconstructed as a range in this way, the whole cranial form of this unique specimen can be further quantified by comparative coordinate-based methods such as GM or can be used for finite element modeling (FEM) explorations of hypotheses about

  12. The Effects of High-Intensity versus Low-Intensity Resistance Training on Leg Extensor Power and Recovery of Knee Function after ACL-Reconstruction

    DEFF Research Database (Denmark)

    Bieler, Theresa; Sobol, Nanna Aue; Andersen, Lars L

    2014-01-01

    OBJECTIVE: Persistent weakness is a common problem after anterior cruciate ligament- (ACL-) reconstruction. This study investigated the effects of high-intensity (HRT) versus low-intensity (LRT) resistance training on leg extensor power and recovery of knee function after ACL-reconstruction. METH...

  13. Noniterative MAP reconstruction using sparse matrix representations.

    Science.gov (United States)

    Cao, Guangzhi; Bouman, Charles A; Webb, Kevin J

    2009-09-01

    We present a method for noniterative maximum a posteriori (MAP) tomographic reconstruction based on the use of sparse matrix representations. Our approach is to precompute and store the inverse matrix required for MAP reconstruction. This approach has generally not been taken in the past because the inverse matrix is typically large and fully populated (i.e., not sparse). In order to overcome this problem, we introduce two new ideas. The first is a novel theory for the lossy source coding of matrix transformations, which we refer to as matrix source coding. This theory is based on a distortion metric that reflects the distortions produced in the final matrix-vector product, rather than the distortions in the coded matrix itself. The resulting algorithms are shown to require orthonormal transformations of both the measurement data and the matrix rows and columns before quantization and coding. The second idea is a method for efficiently storing and computing the required orthonormal transformations, which we call a sparse-matrix transform (SMT). The SMT is a generalization of the classical FFT in that it uses butterflies to compute an orthonormal transform; but unlike an FFT, the SMT uses the butterflies in an irregular pattern and is numerically designed to best approximate the desired transforms. We demonstrate the potential of noniterative MAP reconstruction with examples from optical tomography. The method requires offline computation to encode the inverse transform. However, once these offline computations are completed, the noniterative MAP algorithm is shown to reduce both storage and computation by well over two orders of magnitude, as compared to linear iterative reconstruction methods.
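The offline/online split at the heart of the approach can be sketched on a tiny dense problem. The sizes and regularization weight are invented, and the paper's matrix source coding and SMT compression of the stored operator are not reproduced here:

```python
import numpy as np

# Offline: precompute the dense MAP inverse operator for a quadratic prior,
# H_inv = (A^T A + lam I)^{-1} A^T. Online: reconstruction is one mat-vec.
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 30))                   # stand-in forward model
lam = 0.01                                      # prior (regularization) weight

H_inv = np.linalg.solve(A.T @ A + lam * np.eye(30), A.T)   # shape (30, 50)

# Online: noniterative reconstruction of a measurement y.
x_true = rng.normal(size=30)
y = A @ x_true
x_map = H_inv @ y
```

The whole point of the paper is that `H_inv` is too large to store naively at realistic sizes, which is what the matrix source coding and sparse-matrix transform address.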

  14. [Reconstructive methods after Fournier gangrene].

    Science.gov (United States)

    Wallner, C; Behr, B; Ring, A; Mikhail, B D; Lehnhardt, M; Daigeler, A

    2016-04-01

    Fournier's gangrene is a variant of necrotizing fasciitis restricted to the perineal and genital region. It presents as an acute life-threatening disease and demands rapid surgical debridement, resulting in large soft tissue defects. Various reconstructive methods have to be applied to reconstitute functionality and aesthetics. The objective of this work is to identify the different reconstructive methods in the literature and compare them with our current concepts for reconstructing defects caused by Fournier gangrene. We analyzed the current literature and our own reconstructive methods for Fournier gangrene. Fournier gangrene is an emergency requiring rapid, calculated antibiotic treatment and radical surgical debridement. After the acute phase of the disease, appropriate reconstructive methods are indicated. Planning the reconstruction of the defect depends on many factors, especially functional and aesthetic demands. Scrotal reconstruction places higher aesthetic and functional demands on the reconstruction than perineal cutaneous wounds. In general, thorough wound hygiene, proper preoperative planning, and careful consideration of the patient's demands are essential for successful reconstruction. In the literature, various methods for reconstruction after Fournier gangrene are described. Reconstruction with a flap is required for a good functional result in complex regions such as the scrotum and penis, while cutaneous wounds can be managed with skin grafting. Patient compliance and tissue demand are crucial factors in the decision-making process.

  15. Stability analysis and reconstruction of wave distribution functions in warm plasmas

    International Nuclear Information System (INIS)

    Oscarsson, T.E.

    1989-05-01

    The purpose of this thesis is first to describe stability analysis and reconstruction of the wave distribution function (WDF) separately, and then to show how the two approaches can be combined in an investigation of satellite data. To demonstrate the type of stability investigation that is often used in space physics, we study instabilities below the local proton gyrofrequency which are caused by anisotropic proton distributions. Arbitrary angles between the wavevector and the background magnetic field are considered, and effects of the warm plasma on the wave propagation properties are included. We also comment briefly on an often-used scheme for classifying instabilities. In our discussion of WDF analysis we develop a completely new and general method for reconstructing the WDF. Our scheme can be used to reconstruct the distribution function of waves in warm as well as cold plasma. Doppler effects introduced by satellite motion are included, and the reconstructions can be performed over a broad frequency range simultaneously. The applicability of our new WDF reconstruction method is studied in model problems and in an application to observations made by the Swedish satellite Viking. In the application to Viking data we combine stability and WDF analyses in a unique way that promises to become an important tool in future studies of wave-particle interactions in space plasmas. (author)

  16. Efficient operator splitting algorithm for joint sparsity-regularized SPIRiT-based parallel MR imaging reconstruction.

    Science.gov (United States)

    Duan, Jizhong; Liu, Yu; Jing, Peiguang

    2018-02-01

    Self-consistent parallel imaging (SPIRiT) is an auto-calibrating model for the reconstruction of parallel magnetic resonance imaging, which can be formulated as a regularized SPIRiT problem. The Projection Over Convex Sets (POCS) method has been used to solve the regularized SPIRiT problem, but the quality of the reconstructed image still needs to be improved. Although methods such as NonLinear Conjugate Gradients (NLCG) can achieve higher spatial resolution, they demand complex computation and converge slowly. In this paper, we propose a new algorithm to solve the formulated Cartesian SPIRiT problem with the JTV and JL1 regularization terms. The proposed algorithm uses the operator splitting (OS) technique to decompose the problem into a gradient problem and a denoising problem with two regularization terms, the latter solved by our proposed split-Bregman-based denoising algorithm, and adopts the Barzilai-Borwein method to update the step size. Simulation experiments on two in vivo data sets demonstrate that the proposed algorithm is 1.3 times faster than ADMM for datasets with 8 channels and 2 times faster than ADMM for the dataset with 32 channels. Copyright © 2017 Elsevier Inc. All rights reserved.
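The Barzilai-Borwein step-size rule adopted for the gradient subproblem is concrete enough to sketch on a plain least-squares objective; the SPIRiT and regularization operators are omitted, and the matrix and sizes are invented:

```python
import numpy as np

# Barzilai-Borwein (BB1) gradient descent on f(x) = 0.5 * ||A x - b||^2:
# alpha_k = (s^T s) / (s^T y), with s = x_k - x_{k-1}, y = g_k - g_{k-1}.
rng = np.random.default_rng(3)
A = rng.normal(size=(30, 10))
b = rng.normal(size=30)

def grad(x):
    return A.T @ (A @ x - b)

x_prev = np.zeros(10)
g_prev = grad(x_prev)
x = x_prev - 1e-3 * g_prev            # small first step to seed BB
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:     # converged
        break
    s, yv = x - x_prev, g - g_prev
    alpha = (s @ s) / (s @ yv)        # BB1 step size
    x_prev, g_prev = x, g
    x = x - alpha * g
```

The BB step approximates second-order curvature with two vector inner products per iteration, which is why it is attractive compared with line searches inside NLCG.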

  17. Shaping the breast in secondary microsurgical breast reconstruction: single- vs. two-esthetic unit reconstruction.

    Science.gov (United States)

    Gravvanis, Andreas; Smith, Roger W

    2010-10-01

    The esthetic outcome is dictated not only by the position, size, and shape of the reconstructed breast, but also by the extra scarring involved. In the present study, we conducted a visual analog scale survey to compare the esthetic outcome in delayed autologous breast reconstruction following two different abdominal flap insets. Twenty-five patients had their reconstruction using the single-esthetic-unit principle and were compared with 25 patients whose breasts were reconstructed using the two-esthetic-unit principle. Photographic images were compiled into a PowerPoint presentation, and cosmetic outcomes were assessed by 30 physicians by means of a questionnaire and a visual analog scale. Our data showed that single-esthetic-unit breast reconstruction presents significant advantages over the traditional two esthetic units, owing to inconspicuous flap reconstruction, better position of the inframammary fold, and a more natural transition between native and reconstructed tissues. Moreover, patient self-evaluation of esthetic outcome and quality of life showed that single-esthetic-unit reconstruction is associated with higher patient satisfaction and should therefore be considered the method of choice. © 2010 Wiley-Liss, Inc.

  18. Separated reconstruction of images from ultrasonic holograms with tridimensional object by digital processing

    International Nuclear Information System (INIS)

    Son, J.H.

    1979-01-01

    Digital reconstruction of images from ultrasonic holograms by computer has attracted wide interest in recent years. Because digitally reconstructed images are displayed in a single plane, most studies have treated holograms obtained from two-dimensional objects. However, many applications of ultrasonic holography, such as non-destructive testing and ultrasonic diagnosis, mostly involve three-dimensional objects. In ordinary digital reconstruction from a hologram of a three-dimensional object, a hidden-image problem arises, and separated reconstruction of the image for the part of the object under consideration is required. In this paper, multiple diffraction by a three-dimensional object is assumed to be linear, i.e., a superposition of the diffractions of its two-dimensional constituents. A new algorithm is proposed in which the image of one such two-dimensional object within the three-dimensional object is reconstructed by an operation on two holograms tilted at unequal angles; such tilted holograms are obtained from tilted linear array receivers by a scanning method. That images can be reconstructed by this operation on the two holograms verifies the new algorithm. A further new method, the transformation of a hologram into an arbitrarily tilted hologram, has also been proved valid. The images reconstructed with the transformation and operation methods are more distinctly separated than images reconstructed from a single hologram of the three-dimensional object. (author)

  19. [Use and versatility of titanium for the reconstruction of the thoracic wall].

    Science.gov (United States)

    Córcoles Padilla, Juan Manuel; Bolufer Nadal, Sergio; Kurowski, Krzysztof; Gálvez Muñoz, Carlos; Rodriguez Paniagua, José Manuel

    2014-02-01

    Chest wall deformities/defects and chest wall resections, as well as complex rib fractures, require reconstruction with various prosthetic materials to ensure the basic functions of the chest wall. Titanium provides many features that make it an ideal material for this surgery. The aim is to present our initial results with this material in several diseases. From 2008 to 2012, 14 patients were operated on with titanium used for reconstruction of the chest wall. A total of 7 patients had chest wall tumors, 2 with sternal resection; 4 patients had chest wall deformities/defects; and 3 patients had severe rib injuries due to traffic accidents. The reconstruction was successful in all cases, with early extubation and no problems detected in the respiratory function of the chest wall. Patients with chest wall tumors, including those with sternal resections, were extubated in the operating room, as were those with chest wall deformities. Chest trauma cases were extubated within 24 h of internal rib fixation. There were no complications related to the material used or the method of implantation. Titanium is an ideal material for reconstruction of the chest wall in several clinical situations, allowing for great versatility and adaptability in different chest wall reconstructions. Copyright © 2013 AEC. Published by Elsevier Espana. All rights reserved.

  20. EDITORIAL: BUILT ENVIRONMENT PERSPECTIVES ON POST-DISASTER RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Jason von Meding

    2013-11-01

    It is with great pleasure that we introduce this special issue of IJAR, a compilation of cutting-edge research that covers many of the key themes relevant to built environment researchers in disaster-related areas. This knowledge area is by its very nature multidisciplinary, and for this reason it is difficult to quantify built environment impacts, drivers, and outcomes in isolation and disaggregate them from non-built-environment factors. However, regardless of certain limitations of research carried out from a built environment perspective, as would be the case from any specific disciplinary perspective, a significant body of work has emerged and is constantly growing and evolving in parallel with the research agenda. Built environment researchers around the globe are now, more than ever, exploring various problems that threaten humanity in the form of dire vulnerability and more frequent and powerful hazards. This collection of papers looks specifically at one area of disaster management: post-disaster reconstruction. Reconstruction projects primarily occur during the recovery phase of the disaster cycle, playing a key role in bringing vulnerable communities back to normalcy and integrating disaster risk reduction and preparedness measures to increase resilience to future hazard events. The special issue is broken into four thematic areas: Context and Culture; Theory; Resilience and Risk Reduction; and Design. The three papers in Section 1, Context and Culture, deal with the impacts of disasters on places and the people that inhabit them, investigating the extent to which reconstruction projects can address social and cultural problems. Meanwhile, the two papers in Section 2, Theory, put forward new theoretical perspectives on stakeholder engagement and management, representing the growth of alternative points of departure in this area. The three papers in Section 3, Resilience and Risk Reduction, explore various approaches to

  1. Our Clinical Experience in Lower Eyelid Reconstruction and Comparison of Reconstruction Techniques.

    Directory of Open Access Journals (Sweden)

    Erol Kesiktas

    2011-03-01

    Full Text Available Lower eyelid reconstruction after tumor resection or trauma is extremely important for protecting the globe and requires extra care. The aim of this study is to evaluate the techniques that we use in lower eyelid reconstruction. Between 1999 and 2009, a total of 23 patients who had lower eyelid defects due to tumor resection (21 patients) or trauma (2 patients) were reconstructed with various flaps. Eleven of the patients were women and twelve were men; the average age was 60.7 years. For reconstruction, 13 Mustarde flaps, 9 bipedicled Tripier flaps, 1 Tenzel flap, and 1 bipedicled infraorbital flap were used. The average postoperative follow-up interval was 35 months (range 18-47 months). Neither flap necrosis nor loss of function in the lower lid was observed in any of the patients, and cosmetically satisfactory results were obtained. In lower eyelid reconstruction, the size of the defect and the amount of surrounding skin must be carefully evaluated and considered when planning the reconstruction. [Cukurova Med J 2011; 36(1): 15-23]

  2. Theoretical stability in coefficient inverse problems for general hyperbolic equations with numerical reconstruction

    Science.gov (United States)

    Yu, Jie; Liu, Yikan; Yamamoto, Masahiro

    2018-04-01

    In this article, we investigate the determination of the spatial component in the time-dependent second order coefficient of a hyperbolic equation from both theoretical and numerical aspects. By the Carleman estimates for general hyperbolic operators and an auxiliary Carleman estimate, we establish local Hölder stability with either partial boundary or interior measurements under certain geometrical conditions. For numerical reconstruction, we minimize a Tikhonov functional which penalizes the gradient of the unknown function. Based on the resulting variational equation, we design an iteration method which is updated by solving a Poisson equation at each step. One-dimensional prototype examples illustrate the numerical performance of the proposed iteration.
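A Tikhonov functional that penalizes the gradient of the unknown can be sketched on a small discrete problem. Here the functional is minimized directly via its normal equations, whereas the paper iterates, solving a Poisson equation at each step; the observation operator, sizes, and regularization weight are all invented:

```python
import numpy as np

# min_f ||K f - d||^2 + a ||D f||^2, with D the discrete gradient operator,
# solved in closed form: (K^T K + a D^T D) f = K^T d.
n = 50
rng = np.random.default_rng(4)
K = rng.normal(size=(80, n))                    # stand-in observation operator
f_true = np.sin(np.linspace(0.0, np.pi, n))     # smooth unknown function
d = K @ f_true + 0.01 * rng.normal(size=80)     # noisy measurements

D = np.diff(np.eye(n), axis=0)                  # forward-difference gradient, (n-1, n)
a = 1e-3                                        # regularization weight
f = np.linalg.solve(K.T @ K + a * D.T @ D, K.T @ d)
```

The term `D.T @ D` is a discrete Laplacian, which is why each update step in a gradient-penalized iteration amounts to a Poisson solve.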

  3. Strategies of reconstruction algorithms for computerized tomography

    International Nuclear Information System (INIS)

    Garderet, P.

    1984-10-01

    Image reconstruction from projections has progressively spread over all fields of medical imaging. As the mathematical aspects of the problem have become more comprehensively explored, a great variety of numerical solutions has been developed, each best suited to a particular medical imaging application and taking into account the physical phenomena related to data collection (a priori properties of signal and noise). The purpose of this survey is to present the general mathematical frame and the fundamental assumptions of the various strategies. Fourier methods approximate the explicit deterministic inversion formula for the Radon transform. Algebraic reconstruction techniques set up an a priori discrete model through a series-expansion approach to the solution; the numerical system to be solved is huge when a fine grid of pixels is to be reconstructed, so iterative solutions are used. Recently, some least-squares procedures have been shown to be tractable which avoid the use of iterative methods. Finally, maximum likelihood approaches accurately incorporate the Poisson nature of photon noise and are well adapted to emission computed tomography. The various strategies are analysed with respect to both the theoretical assumptions needed for their suitable use and the computing facilities required, their actual performance, and cost. In the end we take a glimpse at the extension of the algorithms from two-dimensional imaging to fully three-dimensional volume analysis in preparation for future medical imaging technologies.
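The algebraic (series-expansion) strategy mentioned above can be sketched with a minimal ART/Kaczmarz loop, which cycles through the rays and projects the current estimate onto each ray's hyperplane. The system below is a tiny consistent stand-in, not real projection data:

```python
import numpy as np

# Minimal ART / Kaczmarz iteration: for each ray i,
# x <- x + (b_i - a_i . x) / ||a_i||^2 * a_i.
rng = np.random.default_rng(5)
A = rng.normal(size=(12, 4))        # rows play the role of ray sums over 4 "pixels"
x_true = rng.normal(size=4)
b = A @ x_true                      # noiseless, consistent measurements

x = np.zeros(4)
for sweep in range(200):
    for i in range(A.shape[0]):
        a_i = A[i]
        x += (b[i] - a_i @ x) / (a_i @ a_i) * a_i
```

For a consistent system the iterates converge to the solution; with noisy, inconsistent data the iteration instead cycles near a weighted least-squares solution, which is one reason the survey contrasts it with least-squares and maximum likelihood formulations.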

  4. Research on Image Reconstruction Algorithms for Tuber Electrical Resistance Tomography System

    Directory of Open Access Journals (Sweden)

    Jiang Zili

    2016-01-01

    Full Text Available The application of electrical resistance tomography (ERT) technology has been expanded to the field of agriculture, and the concept of TERT (Tuber Electrical Resistance Tomography) is proposed. On the basis of research on the forward and inverse problems of the TERT system, a hybrid algorithm based on a genetic algorithm is proposed, which can be used in a TERT system to monitor the growth status of plant tubers. Image reconstruction in a TERT system differs from that in a conventional ERT system for two-phase flow measurement: TERT imaging requires higher measurement precision, whereas conventional ERT is more concerned with reconstruction speed. A variety of algorithms, for example linear back projection, modified Newton-Raphson, and the genetic algorithm, are analyzed and optimized to make them suitable for the TERT system. Experimental results showed that the novel hybrid algorithm is superior to the other algorithms and can effectively improve image reconstruction quality.
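A genetic algorithm of the kind used in such hybrid schemes can be sketched with the three classic operators: selection, crossover, and mutation. The fitness function below is an invented least-squares stand-in, not the TERT forward model, and all parameters are illustrative:

```python
import numpy as np

# Minimal real-valued genetic algorithm: elitist selection, one-point
# crossover, and Gaussian mutation, minimizing a made-up squared error.
rng = np.random.default_rng(6)
target = np.array([0.3, -1.2, 0.8])

def fitness(pop):
    return -np.sum((pop - target) ** 2, axis=1)     # higher is better

pop = rng.normal(size=(40, 3))                      # initial population
for gen in range(200):
    parents = pop[np.argsort(fitness(pop))[-20:]]   # keep the best half
    children = parents.copy()
    for i in range(0, 20, 2):                       # one-point crossover per pair
        c = rng.integers(1, 3)
        children[i, c:], children[i + 1, c:] = (
            parents[i + 1, c:].copy(), parents[i, c:].copy())
    children += 0.05 * rng.normal(size=children.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])            # elitist replacement
best = pop[np.argmax(fitness(pop))]
```

Keeping the parents unchanged (elitism) guarantees the best fitness never degrades between generations, a common choice when the evaluation, here a trivial residual, is an expensive forward solve in practice.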

  5. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing

    International Nuclear Information System (INIS)

    Menin, O.H.; Martinez, A.S.; Costa, A.M.

    2016-01-01

    A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach sets the initial acceptance and visitation temperatures and standardizes the terms of the objective function to automate the algorithm and accommodate different spectral ranges. Experiments with both numerical and measured attenuation data are presented. Results show that the algorithm reconstructs spectrum shapes accurately. It should be noted that the regularization function was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present. - Highlights: • X-ray spectrum reconstruction from attenuation data using generalized simulated annealing. • The algorithm employs a smoothing regularization function and sets the initial acceptance and visitation temperatures. • The algorithm is automated by standardizing the terms of the objective function. • The algorithm is compared with classical methods.
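
    The idea of annealing a smoothness-regularized objective can be sketched as follows. This is plain (not generalized) simulated annealing with a first-difference smoothing penalty; the forward matrix, cooling schedule, and all constants are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(s, A, m, lam):
    """Data misfit plus a first-difference smoothing penalty."""
    return np.sum((A @ s - m) ** 2) + lam * np.sum(np.diff(s) ** 2)

def anneal(A, m, lam=0.1, n_iter=4000, t0=1.0, scale=0.05):
    """Plain simulated annealing with a 1/(1+k) cooling schedule,
    keeping spectrum values non-negative."""
    s = rng.random(A.shape[1])
    f = objective(s, A, m, lam)
    best, f_best = s.copy(), f
    for k in range(n_iter):
        t = t0 / (1.0 + k)                       # cooling schedule
        cand = np.clip(s + rng.normal(scale=scale, size=s.size), 0.0, None)
        f_cand = objective(cand, A, m, lam)
        if f_cand < f or rng.random() < np.exp(-(f_cand - f) / t):
            s, f = cand, f_cand
            if f < f_best:
                best, f_best = s.copy(), f
    return best, f_best

spectrum, f_best = anneal(np.eye(3), np.array([1.0, 2.0, 1.5]))
```

    The generalized (Tsallis-statistics) variant used by the authors differs mainly in its visiting distribution and acceptance rule; the smoothing term plays the same role in both.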

  6. A new iterative reconstruction technique for attenuation correction in high-resolution positron emission tomography

    International Nuclear Information System (INIS)

    Knesaurek, K.; Machac, J.; Vallabhajosula, S.; Buchsbaum, M.S.

    1996-01-01

    A new iterative reconstruction technique (NIRT) for positron emission computed tomography (PET), which uses transmission data for nonuniform attenuation correction, is described. Utilizing general inverse problem theory, a cost functional which includes a noise term was derived. The cost functional was minimized using a weighted-least-squares maximum a posteriori conjugate gradient (CG) method. The procedure involves a change in the Hessian of the cost function by adding an additional term. Two phantoms were used in a real data acquisition. The first was a cylinder phantom filled with uniformly distributed activity of 74 MBq of fluorine-18; two different inserts were placed in the phantom. The second was a Hoffman brain phantom filled with uniformly distributed activity of 7.4 MBq of 18F. The resulting reconstructed images were used to test and compare the new iterative reconstruction technique with a standard filtered backprojection (FBP) method. The results confirmed that NIRT, based on the conjugate gradient method, converges rapidly and provides good reconstructed images. In comparison with standard results obtained by the FBP method, the images reconstructed by NIRT showed better noise properties. The noise was measured as rms% noise and was less, by a factor of 1.75, in images reconstructed by NIRT than in the same images reconstructed by FBP. The distance between the Hoffman brain slice created from the MRI image and the slice reconstructed by FBP was 0.526, while the same distance for the Hoffman brain slice reconstructed by NIRT was 0.328. The NIRT method suppressed the propagation of noise without visible loss of resolution in the reconstructed PET images. (orig.)
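
    The conjugate gradient minimization at the core of such methods can be sketched as follows (a generic CG solver for a symmetric positive-definite system, not the authors' NIRT code; the toy matrix is invented):

```python
import numpy as np

def conjugate_gradient(H, g, n_iter=50, tol=1e-12):
    """Solve H x = g for symmetric positive-definite H, i.e. minimise
    the quadratic 0.5 x'Hx - g'x, by conjugate directions."""
    x = np.zeros_like(g)
    r = g - H @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(n_iter):
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x = x + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate update of the direction
        rs = rs_new
    return x

H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
g = np.array([1.0, 2.0])
x = conjugate_gradient(H, g)
```

    In a weighted-least-squares MAP setting, `H` would be the (modified) Hessian of the cost functional and `g` its gradient at the current estimate.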

  7. Breast Reconstruction Following Cancer Treatment.

    Science.gov (United States)

    Gerber, Bernd; Marx, Mario; Untch, Michael; Faridi, Andree

    2015-08-31

    About 8000 breast reconstructions after mastectomy are performed in Germany each year. It has become more difficult to advise patients because of the wide variety of heterologous and autologous techniques that are now available and because of changes in the recommendations about radiotherapy. This article is based on a review of pertinent articles (2005-2014) that were retrieved by a selective search employing the search terms "mastectomy" and "breast reconstruction." The goal of reconstruction is to achieve an oncologically safe and aesthetically satisfactory result for the patient over the long term. Heterologous, i.e., implant-based, breast reconstruction (IBR) and autologous breast reconstruction (ABR) are complementary techniques. Immediate reconstruction preserves the skin of the breast and its natural form and prevents the psychological trauma associated with mastectomy. If post-mastectomy radiotherapy (PMRT) is not indicated, implant-based reconstruction with or without a net/acellular dermal matrix (ADM) is a common option. Complications such as seroma formation, infection, and explantation are significantly more common when an ADM is used (15.3% vs. 5.4%). If PMRT is performed, the complication rate of implant-based breast reconstruction is 1 to 48%; in particular, Baker grade III/IV capsular fibrosis occurs in 7 to 22% of patients, and the prosthesis must be explanted in 9 to 41%. Primary or, preferably, secondary autologous reconstruction is an alternative. The results of ABR are more stable over the long term, but the operation is markedly more complex. Autologous breast reconstruction after PMRT does not increase the risk of serious complications (20.5% vs. 17.9% without radiotherapy). No randomized controlled trials have yet been conducted to compare the reconstructive techniques with each other. If radiotherapy will not be performed, immediate reconstruction with an implant is recommended. On the other hand, if post-mastectomy radiotherapy is planned, secondary autologous reconstruction is preferred.

  8. TV-based conjugate gradient method and discrete L-curve for few-view CT reconstruction of X-ray in vivo data

    International Nuclear Information System (INIS)

    Yang, Xiaoli; Hofmann, Ralf; Dapp, Robin; Van de Kamp, Thomas; Rolo, Tomy dos Santos; Xiao, Xianghui; Moosmann, Julian; Kashef, Jubin; Stotzka, Rainer

    2015-01-01

    High-resolution, three-dimensional (3D) imaging of soft tissues requires the solution of two inverse problems: phase retrieval and the reconstruction of the 3D image from a tomographic stack of two-dimensional (2D) projections. The number of projections per stack should be small, both to accommodate fast tomography of rapid processes and to constrain the X-ray radiation dose to levels that increase the achievable duration of in vivo time-lapse series at a given goal for spatial resolution and/or conserve structure under X-ray irradiation. Pursuing the 3D reconstruction problem in the sense of compressive sampling theory, we propose to reduce the number of projections by applying an advanced algebraic technique subject to minimisation of the total variation (TV) in the reconstructed slice. The problem is formulated in a Lagrangian multiplier fashion, with the parameter value determined by appealing to a discrete L-curve in conjunction with a conjugate gradient method. The usefulness of this reconstruction modality is demonstrated for simulated and in vivo data, the latter acquired in parallel-beam imaging experiments using synchrotron radiation.
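
    The role of the TV term can be seen in a stripped-down 1-D denoising sketch; this uses plain gradient descent on a smoothed TV objective rather than the paper's conjugate-gradient/L-curve machinery, and every parameter value is an invented example:

```python
import numpy as np

def tv_objective(u, f, lam, eps):
    """Quadratic data fidelity plus smoothed total variation of a 1-D signal."""
    return 0.5 * np.sum((u - f) ** 2) + lam * np.sum(np.sqrt(np.diff(u) ** 2 + eps))

def tv_denoise(f, lam=0.5, step=0.05, n_iter=500, eps=1e-2):
    """Gradient descent on the smoothed-TV objective, starting from the data."""
    u = f.astype(float).copy()
    for _ in range(n_iter):
        du = np.diff(u)
        g = du / np.sqrt(du ** 2 + eps)   # derivative of each smoothed TV term
        grad = u - f
        grad[:-1] -= lam * g              # each difference touches two samples
        grad[1:] += lam * g
        u = u - step * grad
    return u

# Noisy piecewise-constant signal: TV favours exactly this kind of structure
clean = np.concatenate([np.zeros(20), np.ones(20)])
rng = np.random.default_rng(1)
noisy = clean + 0.2 * rng.normal(size=clean.size)
denoised = tv_denoise(noisy)
```

    In the tomographic setting the data-fidelity term compares measured projections with the forward projection of the slice instead of comparing signals pointwise, but the TV regularizer acts in the same way.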

  9. Health-related quality of life after segmental resection of the lateral mandible: Free fibula flap versus plate reconstruction.

    Science.gov (United States)

    van Gemert, Johannes; Holtslag, Irene; van der Bilt, Andries; Merkx, Matthias; Koole, Ron; Van Cann, Ellen

    2015-06-01

    Segmental resection of the mandible causes functional, aesthetic and social problems affecting health-related quality of life (HRQoL). It is often assumed that reconstruction with composite free flaps guarantees better function and aesthetics than bridging the defect with reconstruction plates. Using the European Organization for Research and Treatment of Cancer questionnaires (EORTC QLQ-C30 version 3.0 and EORTC QLQ-H&N35), we compared HRQoL in patients who received free fibula flaps versus reconstruction plates after segmental resection of the lateral mandible. Thirty-seven completed questionnaires (18 patients with fibula reconstructions and 19 with reconstruction plates) were available. Reconstruction with a free fibula flap did not provide clear additional benefit over bridging the defect with a reconstruction plate after segmental resection of the lateral mandible. In particular, aspects known to have the most impact on HRQoL, such as swallowing, speech and chewing, were not influenced by the type of reconstruction. Reconstruction of segmental defects of the lateral mandible with a free fibula flap and with a reconstruction plate resulted in comparable HRQoL. If dental rehabilitation by means of dental implants in the fibula is not anticipated, then plate reconstruction with adequate soft tissue remains a suitable technique for the reconstruction of segmental defects of the lateral mandible. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  10. Graph reconstruction with a betweenness oracle

    DEFF Research Database (Denmark)

    Abrahamsen, Mikkel; Bodwin, Greg; Rotenberg, Eva

    2016-01-01

    Graph reconstruction algorithms seek to learn a hidden graph by repeatedly querying a black-box oracle for information about the graph structure. Perhaps the most well studied and applied version of the problem uses a distance oracle, which can report the shortest path distance between any pair of nodes. We introduce and study the betweenness oracle, where bet(a, m, z) is true iff m lies on a shortest path between a and z. This oracle is strictly weaker than a distance oracle, in the sense that a betweenness query can be simulated by a constant number of distance queries, but not vice versa.

  11. ECG-gated interventional cardiac reconstruction for non-periodic motion.

    Science.gov (United States)

    Rohkohl, Christopher; Lauritsch, Günter; Biller, Lisa; Hornegger, Joachim

    2010-01-01

    The 3-D reconstruction of cardiac vasculature using C-arm CT is an active and challenging field of research. In interventional environments patients often have arrhythmic heart signals or cannot hold their breath during the complete data acquisition. This important group of patients cannot be reconstructed with current approaches, which depend strongly on a high degree of cardiac motion periodicity to work properly. In last year's MICCAI contribution, a first algorithm was presented that is able to estimate non-periodic 4-D motion patterns. However, that algorithm still depends on periodicity to some degree, as it requires a prior image obtained using a simple ECG-gated reconstruction. In this work we aim to solve this problem by developing a motion-compensated ECG-gating algorithm. It is built upon a 4-D time-continuous affine motion model which is capable of compactly describing highly non-periodic motion patterns. A stochastic optimization scheme is derived which minimizes the error between the measured projection data and the forward projection of the motion-compensated reconstruction. For evaluation, the algorithm is applied to 5 datasets of the left coronary arteries of patients who did not follow the breath-hold command and/or had arrhythmic heart signals during the data acquisition. By applying the developed algorithm the average visibility of the vessel segments could be increased by 27%. The results show that the proposed algorithm provides excellent reconstruction quality in cases where classical approaches fail. The algorithm is highly parallelizable and a clinically feasible runtime of under 4 minutes is achieved using modern graphics card hardware.

  12. Minimal residual cone-beam reconstruction with attenuation correction in SPECT

    International Nuclear Information System (INIS)

    La, Valerie; Grangeat, Pierre

    1998-01-01

    This paper presents an iterative method based on the minimal residual algorithm for tomographic attenuation-compensated reconstruction from attenuated cone-beam projections, given the attenuation distribution. Unlike conjugate-gradient-based reconstruction techniques, the proposed minimal-residual-based algorithm directly solves a quasi-symmetric linear system, which is a preconditioned system; it thus avoids the use of the normal equations, which improves the convergence rate. Two main contributions are introduced. First, a regularization method is derived for quasi-symmetric problems, based on a Tikhonov-Phillips regularization applied to the factorization of the symmetric part of the system matrix. This regularization is made spatially adaptive to avoid smoothing the region of interest. Second, our existing reconstruction algorithm for attenuation correction in parallel-beam geometry is extended to cone-beam geometry. A circular orbit is considered. Two preconditioning operators are proposed: the first is Grangeat's inversion formula and the second is Feldkamp's inversion formula. Experimental results obtained on simulated data are presented and the shadow-zone effect on attenuated data is illustrated. (author)

  13. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    Cserkaszky, Á; Légrády, D.; Wirth, A.; Bükki, T.; Patay, G.

    2011-01-01

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method all the physical effects in a PET system are taken into account, thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort into the iterations in limited-time reconstructions. (author)

  14. Reconstruction and visualization of nanoparticle composites by transmission electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.Y. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Department of Physics, University of Alberta, Edmonton, Canada T6G 2G7 (Canada); Lockwood, R. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Malac, M., E-mail: marek.malac@nrc-cnrc.gc.ca [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Department of Physics, University of Alberta, Edmonton, Canada T6G 2G7 (Canada); Furukawa, H. [SYSTEM IN FRONTIER INC., 2-8-3, Shinsuzuharu bldg. 4F, Akebono-cho, Tachikawa-shi, Tokyo 190-0012 (Japan); Li, P.; Meldrum, A. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada)

    2012-02-15

    This paper examines the limits of transmission electron tomography reconstruction methods for a nanocomposite object composed of many closely packed nanoparticles. Two commonly used reconstruction methods in TEM tomography were examined and compared, and the sources of various artefacts were explored. Common visualization methods were investigated, and the resulting 'interpretation artefacts' (i.e., deviations from 'actual' particle sizes and shapes arising from the visualization) were determined. Setting a known or estimated nanoparticle volume fraction as a criterion for thresholding does not in fact give a good visualization. Unexpected effects associated with common built-in image filtering methods were also found. Ultimately, this work set out to establish the common problems and pitfalls associated with electron beam tomographic reconstruction and visualization of samples consisting of closely spaced nanoparticles. -- Highlights: • Electron tomography limits were explored by both experiment and simulation. • Reliable quantitative volumetry using electron tomography is not presently feasible. • Volume rendering appears to be a better choice for visualization of composite samples.

  15. Sea level reconstructions from altimetry and tide gauges using independent component analysis

    Science.gov (United States)

    Brunnabend, Sandra-Esther; Kusche, Jürgen; Forootan, Ehsan

    2017-04-01

    Many reconstructions of global and regional sea level rise derived from tide gauges and satellite altimetry have used the method of empirical orthogonal functions (EOF) to reduce noise, improve the spatial resolution of the reconstructed outputs, and investigate the different signals in climate time series. However, the second-order EOF method has some limitations, e.g. in the separation of individual physical signals into different modes of sea level variation and in the capability to physically interpret the different modes, as they are assumed to be orthogonal. Therefore, we investigate the use of a more advanced statistical signal decomposition technique, independent component analysis (ICA), to reconstruct global and regional sea level change from satellite altimetry and tide gauge records. Our results indicate that the method used has almost no influence on the reconstruction of global mean sea level change (1.6 mm/yr from 1960-2010 and 2.9 mm/yr from 1993-2013); only different numbers of modes are needed for the reconstruction. Using the ICA method is advantageous for separating independent climate variability signals from regional sea level variations, as the mixing problem of the EOF method is strongly reduced. As an example, the modes most dominated by the El Niño-Southern Oscillation (ENSO) signal are compared. Regional sea level changes near Tianjin, China; Los Angeles, USA; and Majuro, Marshall Islands are reconstructed and the contributions from ENSO are identified.
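
    The baseline EOF decomposition that ICA is compared against can be sketched via the SVD (a generic sketch on invented data; the ICA step itself would require an additional algorithm such as FastICA and is not shown):

```python
import numpy as np

def eof_decompose(field, n_modes):
    """EOF analysis of a (time, space) anomaly field via SVD.
    Returns temporal expansion coefficients (PCs) and spatial EOF modes."""
    anom = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    pcs = u[:, :n_modes] * s[:n_modes]   # principal component time series
    eofs = vt[:n_modes]                  # orthogonal spatial patterns
    return pcs, eofs

rng = np.random.default_rng(0)
field = rng.normal(size=(24, 6))         # invented 24 months x 6 stations
pcs, eofs = eof_decompose(field, n_modes=6)
recon = pcs @ eofs                       # reconstruction from all modes
```

    Keeping all modes reproduces the anomaly field exactly; truncating to the leading modes is what provides the noise reduction, and it is this orthogonality constraint on `eofs` that ICA relaxes.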

  16. A subzone reconstruction algorithm for efficient staggered compatible remapping

    Energy Technology Data Exchange (ETDEWEB)

    Starinshak, D.P., E-mail: starinshak1@llnl.gov; Owen, J.M., E-mail: mikeowen@llnl.gov

    2015-09-01

    Staggered-grid Lagrangian hydrodynamics algorithms frequently make use of subzonal discretization of state variables for the purposes of improved numerical accuracy, generality to unstructured meshes, and exact conservation of mass, momentum, and energy. For Arbitrary Lagrangian–Eulerian (ALE) methods using a geometric overlay, it is difficult to remap subzonal variables in an accurate and efficient manner due to the number of subzone–subzone intersections that must be computed. This becomes prohibitive in the case of 3D, unstructured, polyhedral meshes. A new procedure is outlined in this paper to avoid direct subzonal remapping. The new algorithm reconstructs the spatial profile of a subzonal variable using remapped zonal and nodal representations of the data. The reconstruction procedure is cast as an under-constrained optimization problem. Enforcing conservation at each zone and node on the remapped mesh provides the set of equality constraints; the objective function corresponds to a quadratic variation per subzone between the values to be reconstructed and a set of target reference values. Numerical results for various pure-remapping and hydrodynamics tests are provided. Ideas for extending the algorithm to staggered-grid radiation-hydrodynamics are discussed as well as ideas for generalizing the algorithm to include inequality constraints.

  17. Sparse/Low Rank Constrained Reconstruction for Dynamic PET Imaging.

    Directory of Open Access Journals (Sweden)

    Xingjian Yu

    Full Text Available In dynamic Positron Emission Tomography (PET), an estimate of the radioactivity concentration is obtained from a series of frames of sinogram data, with frame durations ranging from 10 seconds to minutes depending on the acquisition criteria. So far, all the well-known reconstruction algorithms require known data statistical properties. This limits the speed of data acquisition; moreover, it cannot provide separate information about the structure and about the variation of shape and rate of metabolism, which play a major role in improving the visualization of contrast for some diagnostic requirements. This paper presents a novel low-rank-based activity map reconstruction scheme from emission sinograms of dynamic PET, termed SLCR (Sparse/Low Rank Constrained Reconstruction for Dynamic PET Imaging). In this method, the stationary background is formulated as a low-rank component while variations between successive frames are treated as sparse. The resulting minimization problem, involving the nuclear norm and the l1 norm, can be efficiently solved by many recently developed numerical methods; in this paper, the linearized alternating direction method is applied. The effectiveness of the proposed scheme is illustrated on three data sets.
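
    The two proximal operators behind such nuclear-norm/l1 splits can be sketched as follows. This is a naive alternating scheme for illustration, not the paper's linearized alternating direction method, and the constants `lam` and `mu` are invented:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: prox of the nuclear norm (low-rank part)."""
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    return u @ np.diag(np.maximum(s - tau, 0.0)) @ vt

def soft(X, tau):
    """Entrywise soft thresholding: prox of the l1 norm (sparse part)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def split_low_rank_sparse(D, lam=0.1, mu=1.0, n_iter=50):
    """Naive alternating low-rank + sparse split of a data matrix D."""
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    for _ in range(n_iter):
        L = svt(D - S, 1.0 / mu)   # background: stationary, low rank
        S = soft(D - L, lam / mu)  # frame-to-frame variation: sparse
    return L, S
```

    In SLCR the columns of `D` would be successive dynamic frames; proper ADM variants add a multiplier update and a linearization of the data-fidelity term around the current iterate.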

  18. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    International Nuclear Information System (INIS)

    Kadrmas, Dan J.; Karimi, Seemeen S.; Frey, Eric C.; Tsui, Benjamin M.W.

    1998-01-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with 99mTc tracer, and also using experimentally acquired data with 201Tl tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within clinically realistic times (under 10 minutes for a 64x64x24 image reconstruction). (author)

  19. Update on orbital reconstruction.

    Science.gov (United States)

    Chen, Chien-Tzung; Chen, Yu-Ray

    2010-08-01

    Orbital trauma is common and frequently complicated by ocular injuries. The recent literature on orbital fracture is analyzed with emphasis on epidemiological data assessment, surgical timing, method of approach and reconstruction materials. Computed tomographic (CT) scan has become a routine evaluation tool for orbital trauma, and mobile CT can be applied intraoperatively if necessary. Concomitant serious ocular injury should be carefully evaluated preoperatively. Patients presenting with nonresolving oculocardiac reflex, 'white-eyed' blowout fracture, or diplopia with a positive forced duction test and CT evidence of orbital tissue entrapment require early surgical repair. Otherwise, enophthalmos can be corrected by late surgery with a similar outcome to early surgery. The use of an endoscope-assisted approach for orbital reconstruction continues to grow, offering an alternative method. Advances in alloplastic materials have improved surgical outcome and shortened operating time. In this review of modern orbital reconstruction, several controversial issues such as surgical indication, surgical timing, method of approach and choice of reconstruction material are discussed. Preoperative fine-cut CT image and thorough ophthalmologic examination are key elements to determine surgical indications. The choice of surgical approach and reconstruction materials much depends on the surgeon's experience and the reconstruction area. Prefabricated alloplastic implants together with image software and stereolithographic models are significant advances that help to more accurately reconstruct the traumatized orbit. The recent evolution of orbit reconstruction improves functional and aesthetic results and minimizes surgical complications.

  20. A Penalization Approach for Tomographic Reconstruction of Binary Axially Symmetric Objects

    International Nuclear Information System (INIS)

    Abraham, R.; Bergounioux, M.; Trelat, E.

    2008-01-01

    We propose a variational method for tomographic reconstruction of blurred and noisy binary images based on a penalization process of a minimization problem set in the space of bounded variation functions. We prove existence and/or uniqueness results and derive a penalized optimality system. Numerical simulations are provided to demonstrate the relevance of the approach.

  1. Solving ill-posed control problems by stabilized finite element methods: an alternative to Tikhonov regularization

    Science.gov (United States)

    Burman, Erik; Hansbo, Peter; Larson, Mats G.

    2018-03-01

    Tikhonov regularization is one of the most commonly used methods for the regularization of ill-posed problems. In the setting of finite element solutions of elliptic partial differential control problems, Tikhonov regularization amounts to adding suitably weighted least squares terms of the control variable, or derivatives thereof, to the Lagrangian determining the optimality system. In this note we show that the stabilization methods for discretely ill-posed problems developed in the setting of convection-dominated convection-diffusion problems, can be highly suitable for stabilizing optimal control problems, and that Tikhonov regularization will lead to less accurate discrete solutions. We consider some inverse problems for Poisson’s equation as an illustration and derive new error estimates both for the reconstruction of the solution from the measured data and reconstruction of the source term from the measured data. These estimates include both the effect of the discretization error and error in the measurements.
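
    In its simplest discrete form, Tikhonov regularization adds a weighted identity to the normal equations (a generic sketch to fix ideas, not the paper's stabilized finite element method; the toy system is invented):

```python
import numpy as np

def tikhonov_solve(A, b, alpha):
    """Minimise ||A x - b||^2 + alpha * ||x||^2 via the regularized
    normal equations (A'A + alpha I) x = A'b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Mildly ill-conditioned toy system with exact solution x = (1, 1)
A = np.array([[1.0, 1.0],
              [1.0, 1.001]])
b = np.array([2.0, 2.001])
x_small = tikhonov_solve(A, b, 1e-8)   # weak regularization
x_large = tikhonov_solve(A, b, 1.0)    # strong regularization
```

    The note's criticism is visible even here: increasing `alpha` stabilizes the solve but biases the solution toward zero, which is the accuracy loss the stabilized-FEM alternative seeks to avoid.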

  2. Two-dimensional semi-analytic nodal method for multigroup pin power reconstruction

    International Nuclear Information System (INIS)

    Seung Gyou, Baek; Han Gyu, Joo; Un Chul, Lee

    2007-01-01

    A pin power reconstruction method applicable to multigroup problems involving square fuel assemblies is presented. The method is based on a two-dimensional semi-analytic nodal solution which consists of eight exponential terms and 13 polynomial terms. The 13 polynomial terms represent the particular solution obtained under the condition of a two-dimensional 13-term source expansion. In order to achieve a better approximation of the source distribution, the least squares fitting method is employed. The 8 exponential terms represent a part of the analytically obtained homogeneous solution, and the 8 coefficients are determined by imposing constraints on the 4 surface average currents and 4 corner point fluxes. The surface average currents determined from a transverse-integrated nodal solution are used directly, whereas the corner point fluxes are determined during the course of the reconstruction by employing an iterative scheme that realizes the corner point balance condition. An outgoing-current-based corner point flux determination scheme is newly introduced. The accuracy of the proposed method is demonstrated with the L336C5 benchmark problem. (authors)

  3. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

    A popular approach to medical image reconstruction has been through sparsity regularization, assuming the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is such a widely used system, due to its capability for sparsely approximating piecewise-smooth functions such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, methods based on adaptive over-complete dictionaries that are specific to the structures of the targeted images have demonstrated superiority for image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs an adaptive wavelet tight frame that is task specific, and then reconstructs the image of interest by solving an l1-regularized minimization problem using the constructed adaptive tight frame system. A proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves reconstructed CT image quality relative to the traditional tight frame method. (paper)
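
    The l1-regularized minimization at the heart of such schemes is commonly attacked with iterative soft thresholding; the following is a generic ISTA sketch under a trivial orthonormal system, not the paper's adaptive tight frame construction:

```python
import numpy as np

def soft(x, t):
    """Soft thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, step, n_iter=200):
    """ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1:
    a gradient step on the fidelity term followed by soft thresholding."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - step * (A.T @ (A @ x - b)), step * lam)
    return x

# With an orthonormal system the minimiser is simply soft(A.T b, lam)
A = np.eye(3)
b = np.array([2.0, -0.5, 1.0])
x_hat = ista(A, b, lam=1.0, step=1.0)
```

    For a genuine tight frame, `A` would be replaced by the frame analysis/synthesis operators, and in the adaptive variant the frame itself is re-learned from the current image estimate between reconstruction passes.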

  4. Surface reconstruction and deformation monitoring of stratospheric airship based on laser scanning technology

    Science.gov (United States)

    Guo, Kai; Xie, Yongjie; Ye, Hu; Zhang, Song; Li, Yunfei

    2018-04-01

    Due to the uncertainty of a stratospheric airship's shape and the safety problems this uncertainty causes, surface reconstruction and surface deformation monitoring of an airship were conducted based on laser scanning technology, and a √3-subdivision scheme based on Shepard interpolation was developed. The scheme was then compared with the original √3-subdivision scheme. The results show that our subdivision scheme reduces surface shrinkage and the number of narrow triangles while preserving sharp features. Surface reconstruction and surface deformation monitoring of the airship can therefore be conducted precisely with our subdivision scheme.
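
    The Shepard interpolation underlying the proposed subdivision scheme is plain inverse-distance weighting; a minimal sketch (the point set, values, and power parameter are invented, and the √3-subdivision logic itself is not shown):

```python
import numpy as np

def shepard(x, pts, vals, p=2):
    """Inverse-distance-weighted (Shepard) interpolation at query point x."""
    d = np.linalg.norm(pts - x, axis=1)
    if np.any(d < 1e-12):              # query coincides with a data point
        return vals[np.argmin(d)]
    w = 1.0 / d ** p                   # closer points get larger weights
    return w @ vals / w.sum()

# Two scan points on a line; the midpoint gets the average value
pts = np.array([[0.0, 0.0], [1.0, 0.0]])
vals = np.array([0.0, 2.0])
mid = shepard(np.array([0.5, 0.0]), pts, vals)
```

    In the subdivision setting, each newly inserted vertex would be positioned by such a weighted blend of nearby scanned surface points rather than by the purely combinatorial rule of the original √3 scheme.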

  5. A filtering approach to image reconstruction in 3D SPECT

    International Nuclear Information System (INIS)

    Bronnikov, Andrei V.

    2000-01-01

    We present a new approach to three-dimensional (3D) image reconstruction using analytical inversion of the exponential divergent beam transform, which can serve as a mathematical model for cone-beam 3D SPECT imaging. We apply a circular cone-beam scan and assume constant attenuation inside a convex area with a known boundary, which is satisfactory in brain imaging. The reconstruction problem is reduced to an image restoration problem characterized by a shift-variant point spread function which is given analytically. The method requires two computation steps: backprojection and filtering. The modulation transfer function (MTF) of the filter is derived by means of an original methodology using the 2D Laplace transform. The filter is implemented in the frequency domain and requires 2D Fourier transform of transverse slices. In order to obtain a shift-invariant cone-beam projection-backprojection operator we resort to an approximation, assuming that the collimator has a relatively large focal length. Nevertheless, numerical experiments demonstrate surprisingly good results for detectors with relatively short focal lengths. The use of a wavelet-based filtering algorithm greatly improves the stability to Poisson noise. (author)
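
    The frequency-domain filtering step of such backprojection-then-filter schemes can be illustrated with a plain ramp (|ω|) filter, the textbook FBP ingredient, rather than the paper's shift-variant MTF derived via the 2D Laplace transform:

```python
import numpy as np

def ramp_filter(projection):
    """Apply the |omega| ramp filter to a 1-D projection via the FFT."""
    freqs = np.fft.fftfreq(projection.size)       # normalized frequencies
    return np.real(np.fft.ifft(np.fft.fft(projection) * np.abs(freqs)))

# A pure cosine is an eigenfunction: it is scaled by its |frequency|
n = 8
x = np.cos(2 * np.pi * np.arange(n) / n)          # frequency 1/8 cycles/sample
y = ramp_filter(x)
```

    The ramp filter zeroes the DC component and amplifies high frequencies, which is why the abstract's wavelet-based variant of the filtering step is needed to stabilize the inversion against Poisson noise.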

  6. Industrial dynamic tomographic reconstruction; Reconstrucao tomografica dinamica industrial

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Eric Ferreira de

    2016-07-01

    The state of the art in methods applied to industrial processes is currently based on the principles of classical tomographic reconstruction developed for static distributions, or is limited to cases of low variability of the density distribution of the tomographed object. Noise and motion artifacts are the main problems caused by a mismatch in the data from views acquired at different instants. These add to the known fact that using a limited amount of data can result in noise, artifacts and inconsistencies with the distribution under study. One objective of the present work is to discuss the difficulties that arise from applying reconstruction algorithms originally developed for static distributions to dynamic tomography. Another objective is to propose solutions aimed at reducing the temporal information loss caused by employing regular acquisition systems for dynamic processes. With respect to dynamic image reconstruction, a comparison was conducted between different static reconstruction methods, such as MART and FBP, when used in dynamic scenarios. This comparison was based on an MCNPx simulation as well as an analytical setup of an aluminum cylinder that moves along the section of a riser during the acquisition process, and also on cross-section images from CFD techniques. As for adapting current tomographic acquisition systems to dynamic processes, this work established a sequence of tomographic views in a just-in-time fashion for visualization purposes, a form of visually displaying density information as soon as it becomes amenable to image reconstruction.
A third contribution was to take advantage of the triple color channel necessary to display colored images in most displays, so that, by appropriately scaling the acquired values of each view in the linear system of the reconstruction, it was possible to imprint a temporal trace into the regularly
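The triple-channel idea in that last contribution — scaling consecutive views into the red, green and blue channels so that temporal change appears as colour fringes — can be sketched as follows (an illustrative Python/NumPy sketch; the per-frame normalization is an assumption, not the thesis's exact scaling):

```python
import numpy as np

def encode_temporal_rgb(frames):
    """Pack three consecutive reconstructed cross-sections into one RGB
    image: each time step is normalized to [0, 1] and assigned to one
    colour channel, so any motion between steps shows up as colour."""
    assert len(frames) == 3
    channels = []
    for f in frames:
        f = np.asarray(f, dtype=float)
        rng = f.max() - f.min()
        channels.append((f - f.min()) / rng if rng > 0 else np.zeros_like(f))
    return np.stack(channels, axis=-1)  # shape (H, W, 3)

# A moving diagonal stripe: the displacement appears as colour separation.
frame = np.eye(8)
rgb = encode_temporal_rgb([frame,
                           np.roll(frame, 1, axis=1),
                           np.roll(frame, 2, axis=1)])
```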

  7. The impact of the Ice Model on tau neutrino reconstruction in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Usner, Marcel; Kowalski, Marek [DESY Zeuthen (Germany); Collaboration: IceCube-Collaboration

    2015-07-01

    The IceCube Neutrino Observatory at the South Pole is a Cherenkov detector with an instrumented volume of about one cubic kilometer of the Antarctic ice. Tau neutrinos can be measured via the double bang signature that links two subsequent cascades from the neutrino interaction and the tau decay. Reconstruction of double bang events is currently limited to PeV energies and above, where the decay length of the tau is greater than 50 m. At lower energies it is important to consider small effects that affect the propagation of Cherenkov photons in the ice. The most recent model of the glacial ice below the South Pole contains a tilt of the ice layers and an anisotropy of the scattering coefficient in the direction of the glacier flow. These effects cannot be incorporated trivially into the existing reconstruction methods and can have a significant impact on single and double cascade reconstruction. Updates on finding a solution to this problem are presented, and the effect on the reconstruction of tau neutrino events is discussed.

  8. [Drainage variants in reconstructive and restorative operations for high strictures and injuries of the biliary tract].

    Science.gov (United States)

    Toskin, K D; Starosek, V N; Grinchesku, A E

    1990-10-01

    The article deals with the authors' views on certain aspects of the problem of reconstructive and restorative surgery of the biliary tract. Original methods are suggested for external drainage (through the inferior surface of the right hepatic lobe in the region of the gallbladder bed and through the round ligament of the liver) in the formation of hepato-hepatico- and hepaticojejunoanastomoses. Problems of operative technique in the formation of the anastomoses are discussed. Thirty-nine operations were carried out in the clinic over the past decade for high strictures and injuries of the biliary tract; 25 were reconstructive and 14 restorative. Postoperative mortality was 28.2% (11 patients). Intoxication and hepatargia associated with cholangiolytic abscesses of the liver were the main causes of death.

  9. An electron tomography algorithm for reconstructing 3D morphology using surface tangents of projected scattering interfaces

    Science.gov (United States)

    Petersen, T. C.; Ringer, S. P.

    2010-03-01

    Upon discerning the mere shape of an imaged object, as portrayed by projected perimeters, the full three-dimensional scattering density may not be of particular interest. In this situation considerable simplifications of the reconstruction problem are possible, allowing calculations based upon geometric principles. Here we describe and provide an algorithm which reconstructs the three-dimensional morphology of specimens from tilt series of images for application to electron tomography. Our algorithm uses a differential approach to infer the intersection of projected tangent lines with surfaces which define boundaries between regions of different scattering densities within and around the perimeters of specimens. Details of the algorithm implementation are given and explained using reconstruction calculations from simulations, which are built into the code. An experimental application of the algorithm to a nano-sized aluminium tip is also presented to demonstrate practical analysis for a real specimen. Program summary: Program title: STOMO version 1.0. Catalogue identifier: AEFS_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFS_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 2988. No. of bytes in distributed program, including test data, etc.: 191 605. Distribution format: tar.gz. Programming language: C/C++. Computer: PC. Operating system: Windows XP. RAM: depends upon the size of experimental data as input, ranging from 200 MB to 1.5 GB. Supplementary material: sample output files, for the test run provided, are available. Classification: 7.4, 14. External routines: Dev-C++ (http://www.bloodshed.net/devcpp.html). Nature of problem: electron tomography of specimens for which conventional back projection may fail and/or data for which there is a limited angular

  10. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study addresses and tests a new improved algorithm applied to incomplete projection data to generate high quality reconstructed images by reducing artifacts and noise in computed tomography. For incomplete projections, an augmented Lagrangian based on compressed sensing is first used in the initial reconstruction for the segmentation step of DART, to obtain higher-contrast graphics for boundary and non-boundary pixels. Then, a block-matching 3D filtering operator is used to suppress noise and improve the gray-level distribution of the reconstructed image. Finally, simulation studies on a polychromatic spectrum were performed to test the performance of the new algorithm. The results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of images reconstructed from incomplete data. The SNRs and AGs of the new images reconstructed by DART-ALBM were on average 30%-40% and 10% higher, respectively, than those of images reconstructed by DART algorithms. Since the improved DART-ALBM algorithm is more robust for limited-view reconstruction, making the edges of the image clear while improving the gray-level distribution of non-boundary pixels, it has the potential to improve image quality from incomplete or sparse projections.
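The segmentation-plus-free-pixel idea that DART-type methods alternate with algebraic updates can be sketched as follows (an illustrative Python/NumPy sketch of the generic DART steps; the gray levels, thresholds and 4-neighbourhood boundary rule are textbook DART conventions, not the exact DART-ALBM procedure):

```python
import numpy as np

def dart_segment(image, gray_levels, thresholds):
    """Segmentation step of DART-type methods: snap a continuous
    reconstruction onto the known discrete gray levels."""
    seg = np.full(image.shape, gray_levels[0], dtype=float)
    for level, thr in zip(gray_levels[1:], thresholds):
        seg[image >= thr] = level
    return seg

def boundary_mask(seg):
    """Mark pixels whose 4-neighbourhood contains another gray level;
    only these 'free' pixels are updated by the next algebraic sweeps."""
    mask = np.zeros(seg.shape, dtype=bool)
    mask[:-1, :] |= seg[:-1, :] != seg[1:, :]
    mask[1:, :] |= seg[1:, :] != seg[:-1, :]
    mask[:, :-1] |= seg[:, :-1] != seg[:, 1:]
    mask[:, 1:] |= seg[:, 1:] != seg[:, :-1]
    return mask

# A 2x2 object on an empty background.
img = np.zeros((6, 6))
img[2:4, 2:4] = 1.0
seg = dart_segment(img, gray_levels=[0.0, 1.0], thresholds=[0.5])
free = boundary_mask(seg)
```

Fixing the interior pixels to their gray levels and iterating only on the boundary set is what lets DART cope with few or limited-angle projections.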

  11. Reconstruction from one boundary measurement of a potential homogeneous of degree zero

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Knudsen, Kim

    We consider the inverse boundary value problem concerning the determination and reconstruction of an unknown potential in a Schrödinger equation in a bounded domain from measurements on the boundary of the domain. For the special case of a small potential homogeneous of degree zero we show that one...

  12. Reconstruction from one boundary measurement of a potential homogeneous of degree zero

    DEFF Research Database (Denmark)

    Cornean, Horia; Knudsen, Kim

    2006-01-01

    We consider the inverse boundary value problem concerning the determination and reconstruction of an unknown potential in a Schrödinger equation in a bounded domain from measurements on the boundary of the domain. For the special case of a small potential homogeneous of degree zero we show that one...

  13. BES-II fast data reconstruction

    International Nuclear Information System (INIS)

    Rong Gang; Zhang Jiawen; Guo Yiqing; Zhang Shaoqiang; Zhao Dixin

    2002-01-01

    The BES-II fast data reconstruction is reported. Based on a PC farm and/or a 'Distributed Clustered Linux PC System', the BES-II fast data reconstruction system has been set up. With this system, BES-II data can be fully reconstructed in about 20 minutes after data collection; it takes only 12 minutes to fully reconstruct 30000 events, collected with the BES-II detector at the BEPC collider, on a PIII-800 PC. The detector performance can thus be examined on fully reconstructed data about 20 minutes after data taking in the BES-II experiment

  14. Low dose CT reconstruction via L1 norm dictionary learning using alternating minimization algorithm and balancing principle.

    Science.gov (United States)

    Wu, Junfeng; Dai, Fang; Hu, Gang; Mou, Xuanqin

    2018-04-18

    Excessive radiation exposure in computed tomography (CT) scans increases the chance of developing cancer and has become a major clinical concern. Recently, statistical iterative reconstruction (SIR) with l0-norm dictionary learning regularization has been developed to reconstruct CT images from the low dose and few-view dataset in order to reduce radiation dose. Nonetheless, the sparse regularization term adopted in this approach is l0-norm, which cannot guarantee the global convergence of the proposed algorithm. To address this problem, in this study we introduced the l1-norm dictionary learning penalty into SIR framework for low dose CT image reconstruction, and developed an alternating minimization algorithm to minimize the associated objective function, which transforms CT image reconstruction problem into a sparse coding subproblem and an image updating subproblem. During the image updating process, an efficient model function approach based on balancing principle is applied to choose the regularization parameters. The proposed alternating minimization algorithm was evaluated first using real projection data of a sheep lung CT perfusion and then using numerical simulation based on sheep lung CT image and chest image. Both visual assessment and quantitative comparison using terms of root mean square error (RMSE) and structural similarity (SSIM) index demonstrated that the new image reconstruction algorithm yielded similar performance with l0-norm dictionary learning penalty and outperformed the conventional filtered backprojection (FBP) and total variation (TV) minimization algorithms.
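The l1-penalized sparse coding subproblem has a classic solution by iterative soft thresholding, sketched below (an illustrative Python/NumPy sketch; the ISTA solver and the toy identity dictionary are assumptions for demonstration, not the authors' trained dictionary or exact minimization scheme):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (closed-form shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code(D, patches, lam, n_iter=100):
    """Sparse coding subproblem min_A 0.5||P - D A||_F^2 + lam ||A||_1,
    solved per patch by iterative soft thresholding (ISTA).
    D: dictionary (n_pixels x n_atoms); patches: (n_pixels x n_patches)."""
    L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], patches.shape[1]))
    for _ in range(n_iter):
        grad = D.T @ (D @ A - patches)
        A = soft_threshold(A - grad / L, lam / L)
    return A

# Toy check with an identity dictionary: the code for the patch [1,0,0,0]
# is its soft-thresholded version.
D = np.eye(4)
patches = np.array([[1.0], [0.0], [0.0], [0.0]])
codes = sparse_code(D, patches, lam=0.1)   # codes[0, 0] ≈ 0.9
```

Unlike the l0 penalty, this l1 subproblem is convex, which is what allows the alternating scheme's convergence guarantee.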

  15. Free gracilis flap for chest wall reconstruction in male patient with Poland syndrome after implant failure

    OpenAIRE

    Cherubino, Mario; Maggiulli, Francesca; Pellegatta, Igor; Valdatta, Luigi

    2016-01-01

    Poland's syndrome (PS) is a congenital monolateral deformity that may involve the breast, chest wall, and upper limb with different degrees of clinical expression. In some cases, the problem is mainly cosmetic, and the reconstruction should be performed to achieve minimal scarring and donor-site morbidity. The authors describe a case report of a male patient with PS who developed severe capsular contracture 25 years after implant reconstruction, and who was treated after explantation using free gr...

  16. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Science.gov (United States)

    Kung, Woon-Man; Chen, Shuo-Tsung; Lin, Chung-Hsiang; Lu, Yu-Mei; Chen, Tzu-Hsuan; Lin, Muh-Shi

    2013-01-01

    Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implants in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using the cranial index of symmetry (CIS). From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software and checked in terms of symmetry by CIS scores. CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). These scores were statistically significantly greater than 95%, statistically identical to 99.5%, and lower than 99.6% (paired signed-rank test). These data evidence the highly accurate symmetry of these CAD models with regular contours. CIS calculation is beneficial for assessing the aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.

  17. Antireflective Boundary Conditions for Deblurring Problems

    Directory of Open Access Journals (Sweden)

    Marco Donatelli

    2010-01-01

    Full Text Available This survey paper deals with the use of antireflective boundary conditions for deblurring problems where the issues that we consider are the precision of the reconstruction when the noise is not present, the linear algebra related to these boundary conditions, the iterative and noniterative regularization solvers when the noise is considered, both from the viewpoint of the computational cost and from the viewpoint of the quality of the reconstruction. In the latter case, we consider a reblurring approach that replaces the transposition operation with correlation. For many of the considered items, the anti-reflective algebra coming from the given boundary conditions is the optimal choice. Numerical experiments corroborating the previous statement and a conclusion section end the paper.

  18. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    Science.gov (United States)

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problem as data-fidelity-constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for the data fidelity and non-negativity constraints, and steepest descent for TV minimization. The novelty of this work is to determine iterative parameters automatically from the data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are compared with the error bound to decide whether to perform ART, so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
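The adaptive two-stage idea — tying the TV steepest-descent step size to the magnitude of the preceding POCS update — can be sketched as follows (an illustrative Python/NumPy sketch; the smoothed TV gradient, the scaling constant and the boundary handling are assumptions, not the authors' exact rule):

```python
import numpy as np

def tv_gradient(u, eps=1e-6):
    """Gradient of a smoothed isotropic total variation of image u."""
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
    dx, dy = ux / mag, uy / mag
    div = (dx - np.roll(dx, 1, axis=1)) + (dy - np.roll(dy, 1, axis=0))
    return -div

def tv_descent(u_pocs, u_prev, n_steps=5, scale=0.05):
    """TV-minimization stage of the two-stage scheme: the steepest-descent
    step size is tied to the magnitude of the preceding POCS update."""
    step = scale * np.linalg.norm(u_pocs - u_prev)  # data-driven step size
    u = u_pocs.copy()
    for _ in range(n_steps):
        g = tv_gradient(u)
        norm = np.linalg.norm(g)
        if norm > 0:
            u -= step * g / norm
    return u

# Smoothing a noisy image: a few small TV steps reduce total variation.
u_prev = np.zeros((16, 16))
u_pocs = np.random.RandomState(0).rand(16, 16)
u_smooth = tv_descent(u_pocs, u_prev, n_steps=3, scale=0.01)
```

Because the step shrinks automatically as the POCS updates shrink, the TV stage cannot overwhelm the data-fidelity stage near convergence, which is the point of the adaptive scheme.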

  19. Development of Image Reconstruction Algorithms in electrical Capacitance Tomography; Desarrollo de algoritmos de reconstruccion de imagenes en tomografia de capacitancia electrica

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez Marron, J. L.; Alberdi Primicia, J.; Barcala Riveira, J. M.

    2007-12-28

    Electrical Capacitance Tomography (ECT) has not seen wide development for use at the industrial level. That is due, first, to the difficulty of measuring very small capacitances (in the femtofarad range) and, second, to the problem of reconstructing images on-line. The latter problem is also due to the small number of electrodes (at most 16), which causes the usual reconstruction algorithms to produce many errors. In this work a new, purely geometrical method that could be used for this purpose is described. (Author) 4 refs.

  20. Lipschitz stability for an inverse hyperbolic problem of determining two coefficients by a finite number of observations

    Science.gov (United States)

    Beilina, L.; Cristofol, M.; Li, S.; Yamamoto, M.

    2018-01-01

    We consider an inverse problem of reconstructing two spatially varying coefficients in an acoustic equation of hyperbolic type using interior data of solutions with suitable choices of initial condition. Using a Carleman estimate, we prove Lipschitz stability estimates which ensure unique reconstruction of both coefficients. Our theoretical results are justified by numerical studies on the reconstruction of two unknown coefficients using noisy backscattered data.

  1. QSdpR: Viral quasispecies reconstruction via correlation clustering.

    Science.gov (United States)

    Barik, Somsubhra; Das, Shreepriya; Vikalo, Haris

    2017-12-19

    RNA viruses are characterized by high mutation rates that give rise to populations of closely related genomes, known as viral quasispecies. The underlying heterogeneity enables the quasispecies to adapt to changing conditions and proliferate over the course of an infection. Determining the genetic diversity of a virus (i.e., inferring haplotypes and their proportions in the population) is essential for understanding its mutation patterns and for effective drug development. Here, we present QSdpR, a method and software for the reconstruction of quasispecies from short sequencing reads. The reconstruction is achieved by solving a correlation clustering problem on a read-similarity graph, and the results of the clustering are used to estimate frequencies of sub-species; the number of sub-species is determined using the pseudo F index. Extensive tests on both synthetic datasets and experimental HIV-1 and Zika virus data demonstrate that QSdpR compares favorably to existing methods in terms of various performance metrics. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. The transverse musculocutaneous gracilis flap for breast reconstruction: guidelines for flap and patient selection.

    Science.gov (United States)

    Schoeller, Thomas; Huemer, Georg M; Wechselberger, Gottfried

    2008-07-01

    The transverse musculocutaneous gracilis (TMG) flap has received little attention in the literature as a valuable alternative source of donor tissue in the setting of breast reconstruction. The authors give an in-depth review of their experience with breast reconstruction using the TMG flap. A retrospective review was undertaken of 111 patients treated with a TMG flap for breast reconstruction in an immediate or a delayed setting between August of 2002 and July of 2007. Of these, 26 patients underwent bilateral reconstruction, 68 underwent unilateral reconstruction, and 17 underwent unilateral reconstruction with a double TMG flap. Patient age ranged between 24 and 65 years (mean, 37 years). Twelve patients had to be taken back to the operating room because of flap-related problems; nine underwent successful microsurgical revision, resulting in three complete flap losses in a series of 111 patients with 154 transplanted TMG flaps. Partial flap loss was encountered in two patients, whereas fat tissue necrosis was managed conservatively in six patients. Low donor-site morbidity was an advantage of this flap, with a concealed scar and minimal contour irregularities of the thigh, even in unilateral harvest. Complications included delayed wound healing (n = 10), hematoma (n = 5), and transient sensory deficit over the posterior thigh (n = 49). The TMG flap is more than an alternative to the deep inferior epigastric perforator (DIEP) flap in microsurgical breast reconstruction in selected patients. In certain indications, such as bilateral reconstructions, it possibly surpasses the DIEP flap because of a better-concealed donor scar and easier harvest.

  3. On proton CT reconstruction using MVCT-converted virtual proton projections

    Energy Technology Data Exchange (ETDEWEB)

    Wang Dongxu; Mackie, T. Rockwell; Tome, Wolfgang A. [Department of Medical Physics, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin 53705 and Department of Radiation Oncology, University of Iowa Hospitals and Clinics, Iowa City, Iowa 52242 (United States); Department of Medical Physics, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin 53705 and Morgridge Institute of Research, University of Wisconsin, Madison, Wisconsin 53715 (United States); Department of Medical Physics, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin 53705 and Oncophysics Institute, Albert Einstein College of Medicine, Yeshiva University, Bronx, New York 10461 (United States)

    2012-06-15

    . If these images are used for treatment planning, the average proton range uncertainty is estimated to be less than 1.5% for an imaging dose in the milligray range. Conclusions: The proposed method can be used to convert x-ray projections into virtual proton projections. The converted proton projections can be blended with existing proton projections or can be used solely for pCT reconstruction, addressing the range limit problem of pCT using current therapeutic proton machines.

  4. On proton CT reconstruction using MVCT-converted virtual proton projections

    International Nuclear Information System (INIS)

    Wang Dongxu; Mackie, T. Rockwell; Tomé, Wolfgang A.

    2012-01-01

    these images are used for treatment planning, the average proton range uncertainty is estimated to be less than 1.5% for an imaging dose in the milligray range. Conclusions: The proposed method can be used to convert x-ray projections into virtual proton projections. The converted proton projections can be blended with existing proton projections or can be used solely for pCT reconstruction, addressing the range limit problem of pCT using current therapeutic proton machines.

  5. Direct EIT reconstructions of complex admittivities on a chest-shaped domain in 2-D.

    Science.gov (United States)

    Hamilton, Sarah J; Mueller, Jennifer L

    2013-04-01

    Electrical impedance tomography (EIT) is a medical imaging technique in which current is applied on electrodes on the surface of the body, the resulting voltage is measured, and an inverse problem is solved to recover the conductivity and/or permittivity in the interior. Images are then formed from the reconstructed conductivity and permittivity distributions. In the 2-D geometry, EIT is clinically useful for chest imaging. In this work, an implementation of a D-bar method for complex admittivities on a general 2-D domain is presented. In particular, reconstructions are computed on a chest-shaped domain for several realistic phantoms including a simulated pneumothorax, hyperinflation, and pleural effusion. The method demonstrates robustness in the presence of noise. Reconstructions from trigonometric and pairwise current injection patterns are included.

  6. Reconstructive options in pelvic tumours

    Directory of Open Access Journals (Sweden)

    Mayilvahanan N

    2005-01-01

    Full Text Available Background: Pelvic tumours present a complex problem. It is difficult to choose between limb salvage and hemipelvectomy. Method: Forty-three patients with tumours of the pelvis underwent limb-salvage resection, with reconstruction in 32 patients. The majority were chondrosarcomas (20 cases), followed by Ewing sarcoma. Stage IIB was the most common stage among the malignant lesions, and all seven benign lesions were aggressive (B3). Surgical margins achieved were wide in 31 and marginal in 12 cases. The ilium was involved in 51% of cases, and periacetabular involvement was seen in 12 patients. The resections performed were mostly of types I & II of Enneking's classification of pelvic resection. Arthrodesis was attempted in 24 patients. A customized saddle prosthesis was used in seven patients and no reconstruction in 12 patients. Adjuvant chemotherapy was given for all high-grade malignant tumours, combined with radiotherapy in 7 patients. Results: With a mean follow-up of 48.5 months and one patient lost to follow-up, the recurrence rate among the evaluated cases was 16.6%. Oncologically, 30 patients were continuously disease-free, with 7 local recurrences and 4 deaths due to disseminated disease; 2 patients died of other causes. During the initial years, satisfactory functional results were achieved with prosthetic replacement. The long-term functional result of the 36 patients alive at the latest follow-up was satisfactory in 75% of those who underwent arthrodesis and in those where no reconstruction was used. We also describe a new classification of pelvic resections that addresses certain shortcomings of the previous systems. Conclusion: Selection of a procedure depends largely on patient factors, tumour grade, the resultant defect and tissue factors. Resection with proper margins gives better functional and oncological results

  7. Electrical Impedance Tomography Reconstruction Through Simulated Annealing using a New Outside-in Heuristic and GPU Parallelization

    International Nuclear Information System (INIS)

    Tavares, R S; Tsuzuki, M S G; Martins, T C

    2012-01-01

    Electrical Impedance Tomography (EIT) is an imaging technique that attempts to reconstruct the conductivity distribution inside an object from electrical currents and potentials applied and measured at its surface. The EIT reconstruction problem is approached as an optimization problem, where the difference between the simulated and measured distributions must be minimized. This optimization problem can be solved using Simulated Annealing (SA), but at a high computational cost. To reduce the computational load, it is possible to use an incomplete evaluation of the objective function. This algorithm was found to exhibit an outside-in behavior, determining the impedance of the external elements first, similarly to a layer-stripping algorithm. A new outside-in heuristic that exploits this property is proposed. The impact of using a GPU to parallelize matrix-vector multiplication and triangular solvers is also presented, together with results on experimental data. The outside-in heuristic proved faster than the conventional SA algorithm.
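The incomplete-evaluation idea — scoring each Simulated Annealing candidate on only a random subset of the measurements — can be sketched on a toy linear problem as follows (an illustrative Python/NumPy sketch; the proposal step, cooling schedule and subset fraction are assumptions, and the toy residual stands in for the EIT forward model):

```python
import numpy as np

def sa_partial(residual_fn, x0, n_meas, n_iter=2000, frac=0.5,
               t0=1.0, cooling=0.995, rng=None):
    """Simulated annealing with incomplete objective evaluation: each
    candidate is scored on a random subset of the measurements."""
    if rng is None:
        rng = np.random.default_rng(0)
    x, t = x0.copy(), t0
    for _ in range(n_iter):
        cand = x + rng.normal(scale=0.1, size=x.shape)
        # Incomplete evaluation: use only a random subset of residuals.
        idx = rng.choice(n_meas, size=max(1, int(frac * n_meas)),
                         replace=False)
        delta = (np.sum(residual_fn(cand)[idx] ** 2)
                 - np.sum(residual_fn(x)[idx] ** 2))
        if delta < 0 or rng.random() < np.exp(-delta / t):
            x = cand
        t *= cooling
    return x

# Toy "impedance" problem: recover x from measurements m = A x.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
m = A @ np.array([2.0, -1.0])
sol = sa_partial(lambda x: A @ x - m, np.zeros(2), n_meas=3)
```

Each Metropolis test is cheaper by the subset fraction; over many iterations the random subsets cover all measurements, so the chain still drives the full residual down.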

  8. Three-dimensional reconstruction of a radionuclide distribution within a medium of uniform coefficient of attenuation

    International Nuclear Information System (INIS)

    Diaz, J.E.

    1982-01-01

    The non-invasive, fully three-dimensional reconstruction of a radionuclide distribution is studied. The problem is considered in ideal form. Several solutions, ranging from the completely analytical to the completely graphical, are presented for both the non-attenuated and uniformly attenuated cases. A function is defined which, if enacted as a response to each detected photon, will yield, upon superposition, a faithful reconstruction of the radionuclide density. Two- and three-dimensional forms of this function are defined for both the non-attenuated and uniformly attenuated cases

  9. Discussion of Source Reconstruction Models Using 3D MCG Data

    Science.gov (United States)

    Melis, Massimo De; Uchikawa, Yoshinori

    In this study we performed source reconstruction of magnetocardiographic signals generated by the human heart to localize the site of origin of the heart activation. The localizations were performed in a four-compartment model of the human volume conductor. The analyses were conducted on normal subjects and on a subject affected by the Wolff-Parkinson-White syndrome. Different models of the source activation were used to evaluate whether a general model of the current source can be applied in the study of the cardiac inverse problem. The data analyses were repeated using normal and vector-component MCG data. The results show that a distributed source model achieves the best accuracy in the source reconstructions, and that 3D MCG data allow finding smaller differences between the different source models.

  10. Space-Varying Iterative Restoration of Diffuse Optical Tomograms Reconstructed by the Photon Average Trajectories Method

    Directory of Open Access Journals (Sweden)

    Kravtsenyuk Olga V

    2007-01-01

    Full Text Available The possibility of improving the spatial resolution of diffuse optical tomograms reconstructed by the photon average trajectories (PAT) method is substantiated. The PAT method recently presented by us is based on a concept of an average statistical trajectory for transfer of light energy, the photon average trajectory (PAT). The inverse problem of diffuse optical tomography is reduced to a solution of an integral equation with integration along a conditional PAT. As a result, the conventional algorithms of projection computed tomography can be used for fast reconstruction of diffuse optical images. The shortcoming of the PAT method is that it reconstructs the images blurred due to averaging over spatial distributions of photons which form the signal measured by the receiver. To improve the resolution, we apply a spatially variant blur model based on an interpolation of the spatially invariant point spread functions simulated for the different small subregions of the image domain. Two iterative algorithms for solving a system of linear algebraic equations, the conjugate gradient algorithm for the least squares problem and the modified residual norm steepest descent algorithm, are used for deblurring. It is shown that a gain in spatial resolution can be obtained.
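Of the two deblurring solvers mentioned, the conjugate gradient method for least squares needs only products with the blur operator and its adjoint; a minimal sketch follows (illustrative Python/NumPy, with a small explicit matrix standing in for the spatially variant PSF model; the names and stopping tolerance are assumptions):

```python
import numpy as np

def cgls(apply_A, apply_At, b, n_iter=50, tol=1e-28):
    """Conjugate gradients for min_x ||A x - b||_2, using only the
    matrix-vector products A.v and A^T.v (CGLS / CGNR iteration)."""
    x = np.zeros_like(apply_At(b))
    r = b - apply_A(x)
    s = apply_At(r)
    p, gamma = s.copy(), float(s @ s)
    for _ in range(n_iter):
        q = apply_A(p)
        qq = float(q @ q)
        if qq == 0.0:
            break
        alpha = gamma / qq
        x = x + alpha * p
        r = r - alpha * q
        s = apply_At(r)
        gamma_new = float(s @ s)
        if gamma_new < tol:          # normal-equations residual converged
            break
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

# Toy spatially varying blur: each row of A is a different local PSF.
A = np.array([[1.0, 0.3, 0.0],
              [0.2, 1.0, 0.2],
              [0.0, 0.3, 1.0]])
blurred = A @ np.array([0.0, 1.0, 0.0])
sharp = cgls(lambda v: A @ v, lambda v: A.T @ v, blurred)
```

The matrix-free formulation is what makes the spatially variant blur tractable: the interpolated PSF model only ever needs to be applied, never assembled or inverted.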

  11. Space-Varying Iterative Restoration of Diffuse Optical Tomograms Reconstructed by the Photon Average Trajectories Method

    Directory of Open Access Journals (Sweden)

    Vladimir V. Lyubimov

    2007-01-01

    Full Text Available The possibility of improving the spatial resolution of diffuse optical tomograms reconstructed by the photon average trajectories (PAT) method is substantiated. The PAT method recently presented by us is based on a concept of an average statistical trajectory for transfer of light energy, the photon average trajectory (PAT). The inverse problem of diffuse optical tomography is reduced to a solution of an integral equation with integration along a conditional PAT. As a result, the conventional algorithms of projection computed tomography can be used for fast reconstruction of diffuse optical images. The shortcoming of the PAT method is that it reconstructs the images blurred due to averaging over spatial distributions of photons which form the signal measured by the receiver. To improve the resolution, we apply a spatially variant blur model based on an interpolation of the spatially invariant point spread functions simulated for the different small subregions of the image domain. Two iterative algorithms for solving a system of linear algebraic equations, the conjugate gradient algorithm for the least squares problem and the modified residual norm steepest descent algorithm, are used for deblurring. It is shown that a 27% gain in spatial resolution can be obtained.

  12. Robust Adaptive Beamforming with Sensor Position Errors Using Weighted Subspace Fitting-Based Covariance Matrix Reconstruction.

    Science.gov (United States)

    Chen, Peng; Yang, Yixin; Wang, Yong; Ma, Yuanliang

    2018-05-08

    When sensor position errors exist, the performance of recently proposed interference-plus-noise covariance matrix (INCM)-based adaptive beamformers may be severely degraded. In this paper, we propose a weighted subspace fitting-based INCM reconstruction algorithm to overcome sensor displacement in linear arrays. By estimating the rough signal directions, we construct a novel set of possibly mismatched steering vectors (SVs). We analyze the proximity between the signal subspace of the sample covariance matrix (SCM) and the space spanned by this SV set. After solving an iterative optimization problem, we reconstruct the INCM using the estimated sensor position errors. We then estimate the SV of the desired signal by solving an optimization problem with the reconstructed INCM. The main advantage of the proposed algorithm is its robustness against SV mismatches dominated by unknown sensor position errors. Numerical examples show that even when the position errors are up to half the assumed sensor spacing, the output signal-to-interference-plus-noise ratio is reduced by only 4 dB. Beam patterns plotted from experimental data show that the interference suppression capability of the proposed beamformer outperforms that of the other tested beamformers.
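
    The final stage the abstract describes, forming beamformer weights from a reconstructed INCM, can be sketched as follows. This is not the paper's weighted subspace fitting algorithm: the interference directions, powers, and array geometry below are illustrative assumptions, and only the generic INCM-plus-MVDR step is shown.

```python
import numpy as np

def steering_vector(n_sensors, theta, d=0.5):
    """Plane-wave steering vector for a uniform linear array.
    theta: direction in radians; d: sensor spacing in wavelengths."""
    k = np.arange(n_sensors)
    return np.exp(2j * np.pi * d * k * np.sin(theta))

def mvdr_weights(R, a0):
    """MVDR weights from a (reconstructed) interference-plus-noise
    covariance matrix R and desired-signal steering vector a0."""
    Ri_a = np.linalg.solve(R, a0)
    return Ri_a / (a0.conj() @ Ri_a)

# Reconstruct an INCM from assumed interference directions and powers
n = 10
thetas_i = [np.deg2rad(40), np.deg2rad(-30)]   # interference directions (hypothetical)
powers_i = [10.0, 10.0]                         # interference powers (hypothetical)
sigma_n2 = 1.0                                  # noise power
R = sigma_n2 * np.eye(n, dtype=complex)
for th, p in zip(thetas_i, powers_i):
    a = steering_vector(n, th)
    R += p * np.outer(a, a.conj())

a0 = steering_vector(n, 0.0)                    # look direction: broadside
w = mvdr_weights(R, a0)
```

    The weights keep a unit (distortionless) response toward the look direction while placing nulls toward the interferers encoded in the INCM.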

  13. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows.

    Science.gov (United States)

    Bieberle, M; Hampel, U

    2015-06-13

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
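
    A heavily simplified sketch of one level-set iteration of the kind described, combining a projection-matching force with a smoothing force, might look as follows. The paper's curvature-dependent smoothing is replaced here by a plain Laplacian of the level-set function, and the projection matrix `A` is a placeholder.

```python
import numpy as np

def level_set_step(phi, A, y, mu=0.1, dt=0.2):
    """One (heavily simplified) level-set update for binary tomography.

    phi: level-set function on an n*n pixel grid, flattened; the
    reconstructed binary phase distribution is (phi > 0).
    A: (m, n*n) projection matrix; y: (m,) measured projections.
    mu weights the smoothing force; a Laplacian of phi stands in for the
    curvature-dependent smoothing of the phase boundary used in the paper.
    """
    img = (phi > 0).astype(float)
    data_force = -(A.T @ (A @ img - y))   # pull phi towards matching the data
    n = int(np.sqrt(phi.size))
    p = phi.reshape(n, n)
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
    return phi + dt * (data_force + mu * lap.ravel())
```

    Iterating this step evolves the zero level set of `phi` so that the implied binary image reproduces the limited-angle projection data.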

  14. Evaluation of the influence of uncertain forward models on the EEG source reconstruction problem

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2009-01-01

    in the different areas of the brain when noise is present. Results Due to mismatch between the true and experimental forward model, the reconstruction of the sources is determined by the angles between the i'th forward field associated with the true source and the j'th forward field in the experimental forward...... representation of the signal. Conclusions This analysis demonstrated that caution is needed when evaluating the source estimates in different brain regions. Moreover, we demonstrated the importance of reliable forward models, which may be used as a motivation for including the forward model uncertainty...
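
    The angle between a forward field of the true model and one of the experimental model, which the abstract identifies as the quantity governing the source reconstruction, can be computed directly from the two lead-field matrices. A small sketch (the matrices are placeholders for the true and experimental forward models):

```python
import numpy as np

def forward_field_angles(L_true, L_exp):
    """Pairwise angles (degrees) between the columns (forward fields) of a
    'true' and an 'experimental' lead-field matrix (channels x sources).
    Entry (i, j) is the angle between the i-th true forward field and the
    j-th experimental forward field."""
    Lt = L_true / np.linalg.norm(L_true, axis=0)
    Le = L_exp / np.linalg.norm(L_exp, axis=0)
    cos = np.clip(Lt.T @ Le, -1.0, 1.0)   # clip guards against rounding
    return np.degrees(np.arccos(cos))
```
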

  15. Hardware system of parallel processing for fast CT image reconstruction based on circular shifting float memory architecture

    International Nuclear Information System (INIS)

    Wang Shi; Kang Kejun; Wang Jingjin

    1995-01-01

    Computerized tomography (CT) is expected to become an indispensable diagnostic technique in the future. However, the long time required to reconstruct an image has been one of the major drawbacks of this technique. Parallel processing is one of the best ways to solve this problem. This paper gives the architecture and hardware design of PIRS-4 (4-processor Parallel Image Reconstruction System), a parallel processing system for fast 3D-CT image reconstruction based on a circular shifting float memory architecture. It covers the structure and components of the system, the design of the crossbar switch, and the details of the control module. Test results are also described.
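
    The idea of distributing a reconstruction across several processors, as PIRS-4 does in hardware, can be imitated in software by splitting the projection angles among workers and summing the partial backprojections. The sketch below illustrates only that decomposition, not the PIRS-4 circular shifting float memory architecture itself.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def backproject(sinogram, thetas, n):
    """Unfiltered backprojection of a parallel-beam sinogram onto an n x n
    grid (nearest-neighbour interpolation, illustrative only)."""
    xs = np.arange(n) - n / 2.0
    X, Y = np.meshgrid(xs, xs)
    img = np.zeros((n, n))
    for proj, th in zip(sinogram, thetas):
        t = X * np.cos(th) + Y * np.sin(th)                   # detector coordinate
        idx = np.clip(np.round(t + n / 2.0).astype(int), 0, n - 1)
        img += proj[idx]
    return img

def parallel_backproject(sinogram, thetas, n, n_workers=4):
    """Split the angular range over n_workers and sum the partial images,
    mirroring how a 4-processor system distributes the reconstruction."""
    chunks = np.array_split(np.arange(len(thetas)), n_workers)
    with ThreadPoolExecutor(n_workers) as ex:
        parts = ex.map(lambda c: backproject(sinogram[c], thetas[c], n), chunks)
    return sum(parts)
```

    Because backprojection is a sum over angles, the partial images from disjoint angle subsets add up to exactly the serial result.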

  16. Adaptive reconstructions for magnetic resonance imaging of moving organs

    International Nuclear Information System (INIS)

    Lohezic, Maelene

    2011-01-01

    Magnetic resonance imaging (MRI) is a valuable tool for clinical diagnosis in brain imaging as well as in cardiac and abdominal imaging. For instance, MRI is the only modality that enables the visualization and characterization of myocardial edema. However, motion remains a challenging problem for cardiac MRI. Breathing as well as cardiac beating have to be carefully handled during patient examination; moreover, they limit the achievable temporal and spatial resolution of the images. In this work, an approach that takes these physiological motions into account during the image reconstruction process is proposed. It allows cardiac examinations to be performed while breathing freely. First, an iterative reconstruction algorithm, which compensates for motion estimated from a motion model constrained by physiological signals, is applied to morphological cardiac imaging. A semi-automatic method for edema detection has been tested on the reconstructed images. It has also been combined with an adaptive acquisition strategy which enables free-breathing end-systolic imaging. This reconstruction has then been extended to the assessment of the transverse relaxation time T2, which is used for myocardial edema characterization. The proposed method, ARTEMIS, enables free-breathing T2 mapping without additional acquisition time. The proposed free-breathing approaches take advantage of physiological signals to estimate the motion that occurs during MR acquisitions. Several solutions have been tested to measure this information. Among them, accelerometer-based external sensors allow local measurements at several locations. Another approach consists in the use of k-space-based measurements, which are 'embedded' inside the MRI pulse sequence (navigator) and avoid the need for additional recording hardware. Hence, several adaptive reconstruction algorithms were developed to obtain diagnostic information from free-breathing acquisitions. This work allows performing efficient and accurate

  17. Accurate 3D reconstruction by a new PDS-OSEM algorithm for HRRT

    Science.gov (United States)

    Chen, Tai-Been; Horng-Shing Lu, Henry; Kim, Hang-Keun; Son, Young-Don; Cho, Zang-Hee

    2014-03-01

    State-of-the-art high resolution research tomography (HRRT) provides high-resolution PET images with full 3D human brain scanning. However, the short time frames used in dynamic studies cause many problems related to the low counts in the acquired data. The PDS-OSEM algorithm was proposed to reconstruct HRRT images with a high signal-to-noise ratio, providing accurate information for dynamic data. The new algorithm was evaluated on simulated images, empirical phantoms, and real human brain data. In addition, time activity curves were used to validate the reconstruction performance of the PDS-OSEM and OP-OSEM algorithms on dynamic data. According to the simulated and empirical studies, the PDS-OSEM algorithm reconstructs images with higher quality, higher accuracy, less noise, and a lower average sum of squared errors than OP-OSEM. The presented algorithm is useful for providing quality images under the low count rates of dynamic studies with short scan times.
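
    PDS-OSEM and OP-OSEM are both members of the ordered-subsets EM family; the generic OSEM update they build on can be sketched as follows (the PDS-specific modifications are omitted, and the system matrix below is a placeholder).

```python
import numpy as np

def osem(A, y, n_subsets=4, n_iter=10, eps=1e-12):
    """Generic ordered-subsets EM reconstruction: multiplicative EM updates
    cycling over subsets of the projection rows.
    A: (m, n) system matrix; y: (m,) measured counts (nonnegative)."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As = A[s]
            ratio = y[s] / np.maximum(As @ x, eps)   # measured / modelled counts
            sens = As.T @ np.ones(len(s))            # subset sensitivity image
            nz = sens > 0                            # only update voxels seen by subset
            x[nz] *= (As.T @ ratio)[nz] / sens[nz]   # EM multiplicative update
    return x
```

    Each subset pass applies a full EM-style correction from a fraction of the data, which is why OSEM converges in far fewer passes over the data than plain MLEM.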

  18. Landscape design strategies for post-disaster reconstructions based on traditional ethical wisdom

    Science.gov (United States)

    Yi, Shouli; Hu, Di; Gao, Suping; Lei, Ting; Chen, Qibin

    2018-03-01

    In the face of the black swan events that frequently occur worldwide, we take the 20 April Ya'an earthquake in Sichuan as a case study. The results indicate that the social responsibility of landscape architects in post-disaster reconstruction is to rebuild a material and spiritual homeland for victims and to provide mental care for individuals, which is specifically reflected in the process of rebuilding victims' sense of security. The reconstruction of a sense of security must follow certain ethics and moralities which embody ecological wisdom. Taking the earthquake-stricken Snow Mountain Village in Lushan, Ya'an, as a typical case, we found through PTSD analysis that the incidence of the disorder was up to 68.6%, indicating an overall absence of a sense of security. To solve this problem and reconstruct people's mental and material homeland, the article discusses strategies and solutions for post-disaster landscape design based on traditional ethical wisdom.

  19. Flaw shape reconstruction – an experimental approach

    Directory of Open Access Journals (Sweden)

    Marilena STANCULESCU

    2009-05-01

    Full Text Available Flaws can be classified as acceptable and unacceptable. As a result of nondestructive testing, one takes the Admit/Reject decision regarding the tested product relative to some acceptability criteria. In order to take the right decision, one should know the shape and dimensions of the flaw. On the other hand, flaws considered acceptable develop in time, such that they can become unacceptable. In this case, knowing the shape and dimensions of the flaw allows the product's lifetime to be determined. For interior flaw shape reconstruction, the best procedure is the use of the difference static magnetic field. We have a stationary magnetic field problem, but we face the difficulty posed by nonlinear media. This paper presents the results of experimental work on control specimens with and without flaws.

  20. Novel automated inversion algorithm for temperature reconstruction using gas isotopes from ice cores

    Directory of Open Access Journals (Sweden)

    M. Döring

    2018-06-01

    Full Text Available Greenland's past temperature history can be reconstructed by forcing the output of a firn-densification and heat-diffusion model to fit multiple gas-isotope datasets (δ15N or δ40Ar or δ15Nexcess) extracted from ancient air in Greenland ice cores, using published accumulation-rate (Acc) datasets. We present here a novel methodology to solve this inverse problem by designing a fully automated algorithm. To demonstrate the performance of this novel approach, we begin by intentionally constructing synthetic temperature histories and associated δ15N datasets, mimicking real Holocene data, that we use as true values (targets) to be compared to the output of the algorithm. This allows us to quantify uncertainties originating from the algorithm itself. The presented approach is completely automated and therefore minimizes the subjective impact of manual parameter tuning, leading to reproducible temperature estimates. In contrast to many other ice-core-based temperature reconstruction methods, the presented approach is completely independent of ice-core stable-water isotopes, providing the opportunity to validate water-isotope-based reconstructions or reconstructions where water isotopes are used together with δ15N or δ40Ar. We solve the inverse problem T(δ15N, Acc) by using a combination of a Monte Carlo based iterative approach and the analysis of remaining mismatches between modelled and target data, based on cubic-spline filtering of random numbers and the laboratory-determined temperature sensitivity for nitrogen isotopes. Additionally, the presented reconstruction approach was tested by fitting measured δ40Ar and δ15Nexcess data, which also led to a robust agreement between modelled and measured data. The obtained final mismatches follow a symmetric standard-distribution function. For the study on synthetic data, 95 % of the mismatches compared to the synthetic target data lie in an envelope between 3.0 and 6.3 permeg for δ15N and 0.23 to 0
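
    The core of the described inversion, Monte Carlo proposals built from filtered random numbers that are kept when they reduce the model-data mismatch, can be sketched in toy form. The forward model, filter width, and step size below are placeholders, and a moving-average filter stands in for the cubic-spline filtering of random numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_noise(n, width=15):
    """Low-pass-filtered random perturbation; a moving average stands in
    here for the cubic-spline filtering of random numbers."""
    noise = rng.standard_normal(n + width - 1)
    return np.convolve(noise, np.ones(width) / width, mode='valid')

def invert(target, forward, n_steps=500, step=0.5):
    """Toy greedy Monte Carlo inversion: propose smoothed perturbations of
    the history T and keep those that reduce the mismatch between
    forward(T) and the target data (e.g. measured d15N)."""
    T = np.zeros_like(target)
    best = np.mean((forward(T) - target) ** 2)
    for _ in range(n_steps):
        cand = T + step * smooth_noise(len(T))
        m = np.mean((forward(cand) - target) ** 2)
        if m < best:                      # keep only improving proposals
            T, best = cand, m
    return T, best
```

    In the actual method the forward model is the firn-densification and heat-diffusion model, and the remaining mismatches are analysed further rather than just minimized greedily.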