WorldWideScience

Sample records for deconvoluting lung evolution

  1. Deconvoluting lung evolution: from phenotypes to gene regulatory networks

    DEFF Research Database (Denmark)

    Torday, J.S.; Rehan, V.K.; Hicks, J.W.

    2007-01-01

    … Pathways of lung evolution are similar between crocodiles and birds, but the low compliance of the mammalian lung may have driven the development of the diaphragm to permit lung inflation during inspiration. To meet the high oxygen demands of flight, bird lungs have evolved separate gas exchange … independent of ventilation as well as a unique mechanism for adjusting metabolic rate. Some of the most ancient oxygen-sensing molecules, i.e., hypoxia-inducible factor-1alpha and erythropoietin, are up-regulated during mammalian lung development and growth under apparently normoxic conditions, suggesting …

  2. Anatomic and energy variation of scatter compensation for digital chest radiography with Fourier deconvolution

    International Nuclear Information System (INIS)

    Floyd, C.E.; Beatty, P.T.; Ravin, C.E.

    1988-01-01

    The Fourier deconvolution algorithm for scatter compensation in digital chest radiography has been evaluated in four anatomically different regions at three energies. A shift-invariant scatter distribution shape, optimized for the lung region at 140 kVp, was applied at 90 kVp and 120 kVp in the lung, retrocardiac, subdiaphragmatic, and thoracic spine regions. Scatter estimates from the deconvolution were compared with measured values. While some regional variation is apparent, the use of a shift-invariant scatter distribution shape (optimized for a given energy) produces reasonable scatter compensation in the chest. A different set of deconvolution parameters was required at each energy.
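
    The record above describes scatter compensation with a single shift-invariant kernel. The following is only a minimal sketch of that general idea, not the authors' algorithm: the Gaussian kernel shape, its width, and the scatter fraction are illustrative placeholders that a real system would calibrate per region and tube voltage.

```python
# Sketch: shift-invariant scatter estimation for a digital chest radiograph.
# Assumptions (not from the paper): a Gaussian scatter kernel and a global
# scatter fraction; real systems calibrate both per region and energy (kVp).
import numpy as np
from scipy.signal import fftconvolve

def estimate_scatter(image, kernel_sigma_px=50, scatter_fraction=0.4):
    """Estimate scatter with one shift-invariant kernel and subtract it."""
    half = 3 * kernel_sigma_px
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    kernel = np.exp(-(x**2 + y**2) / (2.0 * kernel_sigma_px**2))
    kernel /= kernel.sum()
    scatter = scatter_fraction * fftconvolve(image, kernel, mode="same")
    primary = np.clip(image - scatter, 0, None)
    return scatter, primary

# Example with synthetic counts
img = np.random.poisson(1000, size=(512, 512)).astype(float)
scatter, primary = estimate_scatter(img)
```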

  3. Evolution of lung breathing from a lungless primitive vertebrate.

    Science.gov (United States)

    Hoffman, M; Taylor, B E; Harris, M B

    2016-04-01

    Air breathing was critical to the terrestrial radiation and evolution of tetrapods and arose in fish. The vertebrate lung originated from a progenitor structure present in primitive bony fish. The origin of the neural substrates, which are sensitive to metabolically produced CO2 and which rhythmically activate respiratory muscles to match lung ventilation to metabolic demand, is enigmatic. We have found that a distinct, periodic, centrally generated rhythm, described as "cough" and occurring in lamprey in vivo and in vitro, is modulated by central sensitivity to CO2. This suggests that elements critical for the evolution of breathing in tetrapods were present in the most basal vertebrate ancestors prior to the evolution of the lung. We propose that the evolution of breathing in all vertebrates occurred through exaptations derived from these critical basal elements. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Deconvoluting double Doppler spectra

    International Nuclear Information System (INIS)

    Ho, K.F.; Beling, C.D.; Fung, S.; Chan, K.L.; Tang, H.W.

    2001-01-01

    The successful deconvolution of data from double Doppler broadening of annihilation radiation (D-DBAR) spectroscopy is a promising area of endeavour aimed at producing momentum distributions of a quality comparable to those of the angular correlation technique. The deconvolution procedure we test in the present study is the constrained generalized least-squares method. Computer-simulated D-DBAR spectra are generated and deconvoluted in order to find the best form of regularizer and the regularization parameter. For these trials the Neumann (reflective) boundary condition is used, which reduces the deconvolution to a single matrix operation in Fourier space. Experimental D-DBAR spectra are also subjected to the same type of deconvolution after background subtraction, using a symmetrized resolution function obtained from an 85Sr source with wide coincidence windows. (orig.)
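
    A minimal sketch of constrained (regularized) least-squares deconvolution performed as a single Fourier-space operation, assuming a known resolution function and a discrete-Laplacian smoothness regularizer. For brevity the sketch uses periodic rather than the reflective (Neumann) boundary handling described above, and the regularization parameter is illustrative.

```python
# Sketch: constrained least-squares deconvolution as one Fourier-space operation.
# C is a discrete second derivative (smoothness constraint); lam is the
# regularization parameter that would be tuned on simulated spectra.
import numpy as np

def cls_deconvolve(spectrum, resolution, lam=1e-2):
    n = len(spectrum)
    H = np.fft.fft(resolution, n)                 # resolution function
    C = np.fft.fft([1.0, -2.0, 1.0], n)           # Laplacian regularizer
    G = np.fft.fft(spectrum, n)
    F = np.conj(H) * G / (np.abs(H)**2 + lam * np.abs(C)**2)
    return np.real(np.fft.ifft(F))

# Example: blur a synthetic momentum distribution (circularly) and recover it
x = np.linspace(-5, 5, 512)
truth = np.exp(-x**2)
res = np.exp(-x**2 / 0.1); res /= res.sum()
res0 = np.fft.ifftshift(res)                      # resolution centred at index 0
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(res0)))
estimate = cls_deconvolve(blurred, res0, lam=1e-3)
```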

  5. Partial Deconvolution with Inaccurate Blur Kernel.

    Science.gov (United States)

    Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei

    2017-10-17

    Most non-blind deconvolution methods are developed under the error-free kernel assumption and are not robust to an inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with an inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of the estimated blur kernel, and partial deconvolution is applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternately. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by an inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.
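
    A minimal sketch of the partial-map idea described above: keep only the Fourier entries where the estimated kernel is reliably large, and invert only on those entries (here with a simple Wiener-style division). The reliability threshold and regularization weight are illustrative assumptions, not the authors' model.

```python
# Sketch: "partial map" deconvolution with an inaccurate kernel estimate.
# Entries of the kernel spectrum below a relative threshold are treated as
# unreliable and left untouched in the blurred image spectrum.
import numpy as np

def partial_fourier_deconvolve(blurred, kernel_est, rel_thresh=0.05, reg=1e-3):
    # kernel_est is assumed to be a small 2-D array placed at the FFT origin
    K = np.fft.fft2(kernel_est, s=blurred.shape)
    B = np.fft.fft2(blurred)
    reliable = np.abs(K) > rel_thresh * np.abs(K).max()   # the partial map
    X = np.where(reliable, np.conj(K) * B / (np.abs(K)**2 + reg), B)
    return np.real(np.fft.ifft2(X))
```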

  6. Deconvolution algorithms applied in ultrasonics; Methodes de deconvolution en echographie ultrasonore

    Energy Technology Data Exchange (ETDEWEB)

    Perrot, P

    1993-12-01

    In a complete system of acquisition and processing of ultrasonic signals, it is often necessary at one stage to use some processing tools to get rid of the influence of the different elements of that system. By that means, the final quality of the signals in terms of resolution is improved. There are two main characteristics of ultrasonic signals which make this task difficult. Firstly, the signals generated by transducers are very often non-minimum phase. The classical deconvolution algorithms are unable to deal with such characteristics. Secondly, depending on the medium, the shape of the propagating pulse is evolving. The spatial invariance assumption often used in classical deconvolution algorithms is rarely valid. Many classical algorithms, parametric and non-parametric, have been investigated: the Wiener-type, the adaptive predictive techniques, the Oldenburg technique in the frequency domain, and minimum variance deconvolution. All the algorithms were first tested on simulated data. One specific experimental set-up has also been analysed, and simulated and real data have been produced. This set-up demonstrated the benefit of applying deconvolution in terms of the achieved resolution. (author). 32 figs., 29 refs.

  7. Development of a Prognostic Marker for Lung Cancer Using Analysis of Tumor Evolution

    Science.gov (United States)

    2017-08-01

    AWARD NUMBER: W81XWH-15-1-0243. TITLE: Development of a Prognostic Marker for Lung Cancer Using Analysis of Tumor Evolution. SUBJECT TERMS: NSCLC; tumor evolution; whole exome sequencing. (Only fragments of the report documentation page were captured; the stated aim is to derive a prognostic classifier from tumor-evolution analysis.)

  8. Deconvolution of Positrons' Lifetime spectra

    International Nuclear Information System (INIS)

    Calderin Hidalgo, L.; Ortega Villafuerte, Y.

    1996-01-01

    In this paper, we explain the iterative method previously develop for the deconvolution of Doppler broadening spectra using the mathematical optimization theory. Also, we start the adaptation and application of this method to the deconvolution of positrons' lifetime annihilation spectra

  9. Machine Learning Approaches to Image Deconvolution

    OpenAIRE

    Schuler, Christian

    2017-01-01

    Image blur is a fundamental problem in both photography and scientific imaging. Even the most well-engineered optics are imperfect, and finite exposure times cause motion blur. To reconstruct the original sharp image, the field of image deconvolution tries to recover recorded photographs algorithmically. When the blur is known, this problem is called non-blind deconvolution. When the blur is unknown and has to be inferred from the observed image, it is called blind deconvolution. The key to r...

  10. Convolution-deconvolution in DIGES

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Simos, N.

    1995-01-01

    Convolution and deconvolution operations are a very important aspect of soil-structure interaction (SSI) analysis since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented into the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations involving a stack of layers overlying a rock is treated by DIGES in a similar fashion to that of available codes, e.g. CARES, SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, the corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free-field motion in terms of both deterministic and probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.

  11. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  12. Deconvolution algorithms applied in ultrasonics

    International Nuclear Information System (INIS)

    Perrot, P.

    1993-12-01

    In a complete system of acquisition and processing of ultrasonic signals, it is often necessary at one stage to use some processing tools to get rid of the influence of the different elements of that system. By that means, the final quality of the signals in terms of resolution is improved. There are two main characteristics of ultrasonic signals which make this task difficult. Firstly, the signals generated by transducers are very often non-minimum phase. The classical deconvolution algorithms are unable to deal with such characteristics. Secondly, depending on the medium, the shape of the propagating pulse is evolving. The spatial invariance assumption often used in classical deconvolution algorithms is rarely valid. Many classical algorithms, parametric and non-parametric, have been investigated: the Wiener-type, the adaptive predictive techniques, the Oldenburg technique in the frequency domain, and minimum variance deconvolution. All the algorithms were first tested on simulated data. One specific experimental set-up has also been analysed, and simulated and real data have been produced. This set-up demonstrated the benefit of applying deconvolution in terms of the achieved resolution. (author). 32 figs., 29 refs.

  13. Blind source deconvolution for deep Earth seismology

    Science.gov (United States)

    Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.

    2007-12-01

    We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal in this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses, permitting better constraints in high-resolution waveform analyses. Our method consists of two stages: (1) we first estimate the empirical source by automatically registering traces to their first principal component with a weighting scheme based on their deviation from this shape; we then use this shape as an estimate of the earthquake source. (2) We compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which exploits the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case, impulsive onsets of seismic arrivals. We show several examples of deep-focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water-level deconvolution, Tikhonov deconvolution, and L1-norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications, waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
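
    Water-level deconvolution, used above as a baseline against the TV-regularized approach, is simple enough to sketch in a few lines. The relative water level below is an illustrative choice, and the trace and empirical source arrays are assumed to be sampled identically.

```python
# Sketch: water-level deconvolution, the classical baseline compared against
# TV-regularized deconvolution. The "water level" floors the source power
# spectrum so the spectral division never blows up at notches.
import numpy as np

def water_level_deconvolve(trace, source, level=0.01):
    n = len(trace)
    S = np.fft.rfft(source, n)
    D = np.fft.rfft(trace, n)
    power = np.abs(S)**2
    floor = level * power.max()              # water level relative to peak power
    R = D * np.conj(S) / np.maximum(power, floor)
    return np.fft.irfft(R, n)
```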

  14. Cell–cell signaling drives the evolution of complex traits: introduction—lung evo-devo

    Science.gov (United States)

    Torday, John S.; Rehan, V. K.

    2009-01-01

    Physiology integrates biology with the environment through cell–cell interactions at multiple levels. The evolution of the respiratory system has been “deconvoluted” (Torday and Rehan in Am J Respir Cell Mol Biol 31:8–12, 2004) through Gene Regulatory Networks (GRNs) applied to cell–cell communication for all aspects of lung biology development, homeostasis, regeneration, and aging. Using this approach, we have predicted the phenotypic consequences of failed signaling for lung development, homeostasis, and regeneration based on evolutionary principles. This cell–cell communication model predicts other aspects of vertebrate physiology as adaptational responses. For example, the oxygen-induced differentiation of alveolar myocytes into alveolar adipocytes was critical for the evolution of the lung in land-dwelling animals adapting to fluctuating Phanerozoic oxygen levels over the past 500 million years. Adipocytes prevent lung injury due to oxygen radicals and facilitate the rise of endothermy. In addition, they produce the class I cytokine leptin, which augments pulmonary surfactant activity and alveolar surface area, increasing selection pressure for both respiratory oxygenation and metabolic demand initially constrained by high systemic vascular pressure, but subsequently compensated by the evolution of the adrenomedullary beta-adrenergic receptor mechanism. Concerted positive selection for the lung and adrenals created further selection pressure for the heart, which becomes progressively more complex phylogenetically in tandem with the lung. Developmentally, increasing heart complexity and size impinges precociously on the gut mesoderm to induce the liver. That evolutionary-developmental interaction is significant because the liver provides regulated sources of glucose and glycogen to the evolving physiologic system, which is necessary for the evolution of the neocortex. Evolution of neocortical control furthers integration of physiologic systems. Such …

  15. Multi-Channel Deconvolution for Forward-Looking Phase Array Radar Imaging

    Directory of Open Access Journals (Sweden)

    Jie Xia

    2017-07-01

    The cross-range resolution of forward-looking phase array radar (PAR) is limited by the effective antenna beamwidth since the azimuth echo is the convolution of antenna pattern and targets’ backscattering coefficients. Therefore, deconvolution algorithms are proposed to improve the imaging resolution under the limited antenna beamwidth. However, as a typical inverse problem, deconvolution is essentially a highly ill-posed problem which is sensitive to noise and cannot ensure a reliable and robust estimation. In this paper, multi-channel deconvolution is proposed for improving the performance of deconvolution, which intends to considerably alleviate the ill-posed problem of single-channel deconvolution. To depict the performance improvement obtained by multi-channel more effectively, evaluation parameters are generalized to characterize the angular spectrum of antenna pattern or singular value distribution of observation matrix, which are conducted to compare different deconvolution systems. Here we present two multi-channel deconvolution algorithms which improve upon the traditional deconvolution algorithms via combining with multi-channel technique. Extensive simulations and experimental results based on real data are presented to verify the effectiveness of the proposed imaging methods.

  16. Tracking Genomic Cancer Evolution for Precision Medicine: The Lung TRACERx Study

    DEFF Research Database (Denmark)

    Jamal-Hanjani, Mariam; Hackshaw, Alan; Ngai, Yenting

    2014-01-01

    … TRACERx (TRAcking non-small cell lung Cancer Evolution through therapy [Rx]), a prospective study of patients with primary non-small cell lung cancer (NSCLC), aims to define the evolutionary trajectories of lung cancer in both space and time through multiregion and longitudinal tumour sampling and genetic … analysis. By following cancers from diagnosis to relapse, tracking the evolutionary trajectories of tumours in relation to therapeutic interventions, and determining the impact of clonal heterogeneity on clinical outcomes, TRACERx may help to identify novel therapeutic targets for NSCLC and may also serve …

  17. Parsimonious Charge Deconvolution for Native Mass Spectrometry

    Science.gov (United States)

    2018-01-01

    Charge deconvolution infers the mass from mass over charge (m/z) measurements in electrospray ionization mass spectra. When applied over a wide input m/z or broad target mass range, charge-deconvolution algorithms can produce artifacts, such as false masses at one-half or one-third of the correct mass. Indeed, a maximum entropy term in the objective function of MaxEnt, the most commonly used charge deconvolution algorithm, favors a deconvolved spectrum with many peaks over one with fewer peaks. Here we describe a new “parsimonious” charge deconvolution algorithm that produces fewer artifacts. The algorithm is especially well-suited to high-resolution native mass spectrometry of intact glycoproteins and protein complexes. Deconvolution of native mass spectra poses special challenges due to salt and small molecule adducts, multimers, wide mass ranges, and fewer and lower charge states. We demonstrate the performance of the new deconvolution algorithm on a range of samples. On the heavily glycosylated plasma properdin glycoprotein, the new algorithm could deconvolve monomer and dimer simultaneously and, when focused on the m/z range of the monomer, gave accurate and interpretable masses for glycoforms that had previously been analyzed manually using m/z peaks rather than deconvolved masses. On therapeutic antibodies, the new algorithm facilitated the analysis of extensions, truncations, and Fab glycosylation. The algorithm facilitates the use of native mass spectrometry for the qualitative and quantitative analysis of protein and protein assemblies. PMID:29376659

  18. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule to determine the auto-correlation and cross-correlation functions. The Toeplitz equation and Levinson algorithm are used to calculate the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method to measure the receiver function in the time domain.
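
    The Toeplitz/Levinson step above amounts to solving the normal equations of a prediction-error (spiking) filter from an autocorrelation. A minimal sketch using SciPy's Levinson-type Toeplitz solver is given below; the filter length and prewhitening factor are illustrative, and the receiver-function specifics (cross-correlation with the radial component, extrapolation) are omitted.

```python
# Sketch: prediction-error (spiking) filter from the autocorrelation via a
# Toeplitz solve, the core step behind Levinson-recursion deconvolution.
import numpy as np
from scipy.linalg import solve_toeplitz

def spiking_filter(signal, length=64, prewhitening=0.001):
    # autocorrelation lags 0 .. length-1
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:][:length]
    ac = ac.astype(float)
    ac[0] *= 1.0 + prewhitening                  # stabilize the Toeplitz system
    desired = np.zeros(length)
    desired[0] = 1.0                             # spike at zero lag
    return solve_toeplitz((ac, ac), desired)     # symmetric Toeplitz solve

# Apply: deconvolved = np.convolve(signal, spiking_filter(signal), mode="full")
```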

  19. Parallelization of a blind deconvolution algorithm

    Science.gov (United States)

    Matson, Charles L.; Borelli, Kathy J.

    2006-09-01

    Often it is of interest to deblur imagery in order to obtain higher-resolution images. Deblurring requires knowledge of the blurring function - information that is often not available separately from the blurred imagery. Blind deconvolution algorithms overcome this problem by jointly estimating both the high-resolution image and the blurring function from the blurred imagery. Because blind deconvolution algorithms are iterative in nature, they can take minutes to days to deblur an image, depending on how many frames of data are used for the deblurring and the platforms on which the algorithms are executed. Here we present our progress in parallelizing a blind deconvolution algorithm to increase its execution speed. This progress includes sub-frame parallelization and a code structure that is not specialized to a specific computer hardware architecture.

  20. Tracking the Evolution of Non-Small-Cell Lung Cancer

    DEFF Research Database (Denmark)

    Jamal-Hanjani, Mariam; Wilson, Gareth A.; McGranahan, Nicholas

    2017-01-01

    Background Among patients with non-small-cell lung cancer (NSCLC), data on intratumor heterogeneity and cancer genome evolution have been limited to small retrospective cohorts. We wanted to prospectively investigate intratumor heterogeneity in relation to clinical outcome and to determine … as a prognostic predictor. (Funded by Cancer Research UK and others; TRACERx ClinicalTrials.gov number, NCT01888601.) …

  1. Data-driven efficient score tests for deconvolution hypotheses

    NARCIS (Netherlands)

    Langovoy, M.

    2008-01-01

    We consider testing statistical hypotheses about densities of signals in deconvolution models. A new approach to this problem is proposed. We construct score tests for deconvolution density testing with a known noise density, and efficient score tests for the case of an unknown noise density. The …

  2. Monte-Carlo error analysis in x-ray spectral deconvolution

    International Nuclear Information System (INIS)

    Shirk, D.G.; Hoffman, N.M.

    1985-01-01

    The deconvolution of spectral information from sparse x-ray data is a widely encountered problem in data analysis. An often-neglected aspect of this problem is the propagation of random error in the deconvolution process. We have developed a Monte-Carlo approach that enables us to attach error bars to unfolded x-ray spectra. Our Monte-Carlo error analysis has been incorporated into two specific deconvolution techniques: the first is an iterative convergent weight method; the second is a singular-value-decomposition (SVD) method. These two methods were applied to an x-ray spectral deconvolution problem having m channels of observations with n points in energy space. When m is less than n, this problem has no unique solution. We discuss the systematics of nonunique solutions and energy-dependent error bars for both methods. The Monte-Carlo approach has a particular benefit in relation to the SVD method: It allows us to apply the constraint of spectral nonnegativity after the SVD deconvolution rather than before. Consequently, we can identify inconsistencies between different detector channels
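
    A minimal sketch of the Monte-Carlo idea described above, assuming a Poisson noise model, a placeholder response matrix, and a truncated-SVD pseudo-inverse; following the abstract, the nonnegativity constraint is applied after the SVD unfolding rather than before.

```python
# Sketch: Monte-Carlo error bars for an SVD unfolding. response maps an n-point
# spectrum to m channels (m < n gives a non-unique problem; the truncated
# pseudo-inverse picks the minimum-norm solution).
import numpy as np

def mc_svd_unfold(measured, response, n_trials=500, rank=None):
    U, s, Vt = np.linalg.svd(response, full_matrices=False)
    if rank is not None:                         # truncate small singular values
        s = np.where(np.arange(len(s)) < rank, s, np.inf)
    pinv = (Vt.T / s) @ U.T                      # (truncated) pseudo-inverse
    trials = np.random.poisson(np.maximum(measured, 0),
                               size=(n_trials, len(measured)))
    unfolded = trials @ pinv.T                   # one unfolded spectrum per trial
    spectrum = np.clip(unfolded.mean(axis=0), 0, None)  # nonnegativity afterwards
    sigma = unfolded.std(axis=0)                 # Monte-Carlo error bars
    return spectrum, sigma
```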

  3. Z-transform Zeros in Mixed Phase Deconvolution of Speech

    DEFF Research Database (Denmark)

    Pedersen, Christian Fischer

    2013-01-01

    The present thesis addresses mixed phase deconvolution of speech by z-transform zeros. This includes investigations into stability, accuracy, and time complexity of a numerical bijection between the time domain and the domain of z-transform zeros. Z-transform factorization is by no means esoteric …, but employing zeros of the z-transform (ZZT) as a signal representation, analysis, and processing domain per se is only scarcely researched. A notable property of this domain is the translation of time-domain convolution into union of sets; thus, the ZZT domain is appropriate for convolving and deconvolving … discrimination achieves mixed phase deconvolution and is equivalent to complex-cepstrum-based deconvolution by causality, which has lower time and space complexities, as demonstrated. However, deconvolution by ZZT prevents phase wrapping. Existence and persistence of ZZT domain immiscibility of the opening and closing …

  4. Scalar flux modeling in turbulent flames using iterative deconvolution

    Science.gov (United States)

    Nikolaou, Z. M.; Cant, R. S.; Vervisch, L.

    2018-04-01

    In the context of large eddy simulations, deconvolution is an attractive alternative for modeling the unclosed terms appearing in the filtered governing equations. Such methods have been used in a number of studies for non-reacting and incompressible flows; however, their application in reacting flows is limited in comparison. Deconvolution methods originate from clearly defined operations, and in theory they can be used in order to model any unclosed term in the filtered equations including the scalar flux. In this study, an iterative deconvolution algorithm is used in order to provide a closure for the scalar flux term in a turbulent premixed flame by explicitly filtering the deconvoluted fields. The assessment of the method is conducted a priori using a three-dimensional direct numerical simulation database of a turbulent freely propagating premixed flame in a canonical configuration. In contrast to most classical a priori studies, the assessment is more stringent as it is performed on a much coarser mesh which is constructed using the filtered fields as obtained from the direct simulations. For the conditions tested in this study, deconvolution is found to provide good estimates both of the scalar flux and of its divergence.

  5. Evaluation of deconvolution modelling applied to numerical combustion

    Science.gov (United States)

    Mehl, Cédric; Idier, Jérôme; Fiorina, Benoît

    2018-01-01

    A possible modelling approach in the large eddy simulation (LES) of reactive flows is to deconvolve resolved scalars. Indeed, by inverting the LES filter, scalars such as mass fractions are reconstructed. This information can be used to close budget terms of filtered species balance equations, such as the filtered reaction rate. Being ill-posed in the mathematical sense, the problem is very sensitive to any numerical perturbation. The objective of the present study is to assess the ability of this kind of methodology to capture the chemical structure of premixed flames. For that purpose, three deconvolution methods are tested on a one-dimensional filtered laminar premixed flame configuration: the approximate deconvolution method based on Van Cittert iterative deconvolution, a Taylor decomposition-based method, and the regularised deconvolution method based on the minimisation of a quadratic criterion. These methods are then extended to the reconstruction of subgrid scale profiles. Two methodologies are proposed: the first one relies on subgrid scale interpolation of deconvolved profiles and the second uses parametric functions to describe small scales. Conducted tests analyse the ability of the method to capture the chemical filtered flame structure and front propagation speed. Results show that the deconvolution model should include information about small scales in order to regularise the filter inversion. a priori and a posteriori tests showed that the filtered flame propagation speed and structure cannot be captured if the filter size is too large.
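
    The approximate deconvolution method mentioned above is built on the Van Cittert fixed-point iteration, which is compact enough to sketch. The Gaussian filter stands in for the LES filter, and the filter width, profile, and iteration count are illustrative assumptions.

```python
# Sketch: Van Cittert iterative deconvolution, the fixed-point iteration behind
# the approximate deconvolution method:  phi_{k+1} = phi_k + (filtered - G*phi_k).
import numpy as np
from scipy.ndimage import gaussian_filter1d

def van_cittert(filtered, sigma, n_iter=5):
    estimate = filtered.copy()
    for _ in range(n_iter):
        refiltered = gaussian_filter1d(estimate, sigma)   # re-apply the filter G
        estimate = estimate + (filtered - refiltered)     # correct the estimate
    return estimate

# Example on a filtered 1-D "flame front" profile
x = np.linspace(0.0, 1.0, 400)
progress = 0.5 * (1 + np.tanh((x - 0.5) / 0.01))   # sharp front
filtered = gaussian_filter1d(progress, sigma=8.0)   # LES-like filtering
deconvolved = van_cittert(filtered, sigma=8.0)
```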

  6. The discrete Kalman filtering approach for seismic signals deconvolution

    International Nuclear Information System (INIS)

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

    2012-01-01

    Seismic signals are a convolution of reflectivity and the seismic wavelet. One of the most important stages in seismic data processing is the deconvolution process; deconvolution applies inverse filters based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantage of Kalman filtering is the capability of the technique to handle continually time-varying models, and it has high resolution capabilities. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function, hence the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.
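
    A minimal sketch of a discrete Kalman filter used for deconvolution, not the authors' combination with primitive deconvolution: the state holds the most recent reflectivity samples, the observation operator is the known wavelet, and the process/measurement noise variances are illustrative.

```python
# Sketch: discrete Kalman filter for reflectivity estimation. The state is a
# shift register of the last len(wavelet) reflectivity samples; each trace
# sample is the wavelet applied to that state plus measurement noise.
import numpy as np

def kalman_deconvolve(trace, wavelet, q=1.0, r=0.01):
    p = len(wavelet)
    F = np.eye(p, k=-1)                  # shift register: new sample enters slot 0
    H = np.asarray(wavelet, float)[np.newaxis, :]   # observation = convolution
    Q = np.zeros((p, p)); Q[0, 0] = q    # process noise drives the new sample only
    x = np.zeros(p)
    P = np.eye(p)
    refl = np.zeros(len(trace))
    for k, y in enumerate(trace):
        x = F @ x                        # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r              # innovation covariance (1x1)
        K = (P @ H.T) / S                # Kalman gain
        x = x + (K * (y - H @ x)).ravel()
        P = P - K @ H @ P
        refl[k] = x[0]                   # filtered estimate of the newest sample
    return refl

# Usage: refl_est = kalman_deconvolve(trace, wavelet)
```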

  7. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used

  8. Improving the efficiency of deconvolution algorithms for sound source localization

    DEFF Research Database (Denmark)

    Lylloff, Oliver Ackermann; Fernandez Grande, Efren; Agerkvist, Finn T.

    2015-01-01

    … of the unknown acoustic source distribution and the beamformer's response to a point source, i.e., point-spread function. A significant limitation of deconvolution is, however, an additional computational effort compared to beamforming. In this paper, computationally efficient deconvolution algorithms...

  9. Single breath study for lung scan with krypton-81m: proposition of a mathematical model

    International Nuclear Information System (INIS)

    Pommet, R.; Mathieu, E.

    1981-01-01

    A single-breath study with 81mKr was performed in patients, and a theoretical model was developed. Based on experimental data, the model was extrapolated under a simple compartmental hypothesis, permitting a region-by-region study of the instantaneous alveolar lung flow by a deconvolution operation. Another approach to presenting the local ventilation is also proposed. Based on an average-ventilation-flow index, the calculation is obtained more easily than by the deconvolution method, and it agrees well with the proposed model. This index allows functional views of the local ventilation flow to be produced, made possible by the use of a computer for the study of each elementary area of the lung and of the activity curve recorded during the first breath of 81mKr. [fr]

  10. Filtering and deconvolution for bioluminescence imaging of small animals; Filtrage et deconvolution en imagerie de bioluminescence chez le petit animal

    Energy Technology Data Exchange (ETDEWEB)

    Akkoul, S.

    2010-06-22

    This thesis is devoted to the analysis of bioluminescence images applied to the small animal. This kind of imaging modality is used in cancerology studies. Nevertheless, some problems are related to the diffusion and absorption, by the tissues, of the light from internal bioluminescent sources. In addition, system noise and cosmic-ray noise are present. This degrades the quality of the images and makes them difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first propose an image formation model for bioluminescence images. The processing chain is constituted by a filtering stage followed by a deconvolution stage. We propose a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter represents the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms. It allowed us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we have shown that the processing chain allows a significant improvement of the spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for the users of bioluminescence images. (author)

  11. Deconvolution using a neural network

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S.K.

    1990-11-15

    Viewing one-dimensional deconvolution as a matrix inversion problem, we compare a neural network backpropagation matrix inverse with LMS and the pseudo-inverse. This is largely an exercise in understanding how our neural network code works. 1 ref.
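
    The matrix-inversion viewpoint is easy to make concrete: build the Toeplitz convolution matrix of a known pulse and invert it with the pseudo-inverse. The sketch below covers only that baseline; the backpropagation and LMS inverses compared in the report are not reproduced, and the pulse and spike positions are illustrative.

```python
# Sketch: 1-D deconvolution as matrix inversion. A is the Toeplitz convolution
# matrix of the known pulse; the pseudo-inverse gives the least-squares estimate
# used here as a baseline (cf. LMS and backpropagation inverses in the report).
import numpy as np
from scipy.linalg import toeplitz

def deconvolve_pinv(observed, pulse, n):
    col = np.r_[pulse, np.zeros(len(observed) - len(pulse))]   # first column
    row = np.r_[pulse[0], np.zeros(n - 1)]                     # first row
    A = toeplitz(col, row)                                     # full-convolution matrix
    return np.linalg.pinv(A) @ observed

# Example
pulse = np.array([1.0, 0.6, 0.2])
x_true = np.zeros(64); x_true[[10, 30, 31]] = [1.0, -0.5, 0.8]
y = np.convolve(x_true, pulse)           # length 66 observation
x_est = deconvolve_pinv(y, pulse, n=64)
```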

  12. Hybrid sparse blind deconvolution: an implementation of SOOT algorithm to real data

    Science.gov (United States)

    Pakmanesh, Parvaneh; Goudarzi, Alireza; Kourki, Meisam

    2018-06-01

    Extracting information from seismic data depends on deconvolution as an important processing step; it provides the reflectivity series by signal compression. This compression can be obtained by removing the wavelet effects from the traces. Blind deconvolution has recently provided reliable performance for sparse signal recovery. In this study, two deconvolution methods have been applied to seismic data; their combination provides a robust spiking deconvolution approach. This hybrid deconvolution is applied using the sparse deconvolution (MM algorithm) and the Smoothed One-Over-Two (SOOT) algorithm in a chain. The MM algorithm is based on the minimization of a cost function defined by the l1 and l2 norms. After applying the two algorithms to the seismic data, the SOOT algorithm provided well-compressed data with a higher resolution than the MM algorithm. The SOOT algorithm requires initial values when applied to real data, such as the wavelet coefficients and reflectivity series, which can be obtained through the MM algorithm. The computational cost of the hybrid method is high, and it needs to be implemented on post-stack or pre-stack seismic data from regions of complex structure.
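
    As a concrete point of reference for the sparse (l1-type) stage described above, here is a minimal iterative soft-thresholding (ISTA) sketch for spike deconvolution with a known wavelet. It is not the MM or SOOT algorithm: the smoothed l1-over-l2 penalty is not implemented, the wavelet is assumed known and odd-length, and the penalty weight and iteration count are illustrative.

```python
# Sketch: l1 sparse spike deconvolution by ISTA (proximal gradient). This
# illustrates the kind of sparsity-promoting cost the MM stage minimizes; the
# SOOT smoothed l1/l2 ratio penalty is not reproduced here.
import numpy as np

def ista_deconvolve(trace, wavelet, lam=0.05, n_iter=300):
    # 'same'-mode correlation approximates the adjoint for an odd-length wavelet
    def A(x):  return np.convolve(x, wavelet, mode="same")        # forward model
    def At(y): return np.convolve(y, wavelet[::-1], mode="same")  # approx. adjoint
    L = np.sum(np.abs(wavelet))**2          # upper bound on the Lipschitz constant
    x = np.zeros(len(trace))
    for _ in range(n_iter):
        grad = At(A(x) - trace)             # gradient of the l2 data-fit term
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)     # soft threshold
    return x

# Usage: refl = ista_deconvolve(trace, wavelet)
```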

  13. Constrained blind deconvolution using Wirtinger flow methods

    KAUST Repository

    Walk, Philipp

    2017-09-04

    In this work we consider one-dimensional blind deconvolution with prior knowledge of signal autocorrelations in the classical framework of polynomial factorization. In particular this univariate case highly suffers from several non-trivial ambiguities and therefore blind deconvolution is known to be ill-posed in general. However, if additional autocorrelation information is available and the corresponding polynomials are co-prime, blind deconvolution is uniquely solvable up to global phase. Using lifting, the outer product of the unknown vectors is the solution to a (convex) semi-definite program (SDP), demonstrating that, theoretically, recovery is computationally tractable. However, for practical applications efficient algorithms are required which should operate in the original signal space. To this end we also discuss a gradient descent algorithm (Wirtinger flow) for the original non-convex problem. We demonstrate numerically that such an approach has performance comparable to the semidefinite program in the noisy case. Our work is motivated by applications in blind communication scenarios and we will discuss a specific signaling scheme where information is encoded into polynomial roots.

  14. Constrained blind deconvolution using Wirtinger flow methods

    KAUST Repository

    Walk, Philipp; Jung, Peter; Hassibi, Babak

    2017-01-01

    In this work we consider one-dimensional blind deconvolution with prior knowledge of signal autocorrelations in the classical framework of polynomial factorization. In particular this univariate case highly suffers from several non-trivial ambiguities and therefore blind deconvolution is known to be ill-posed in general. However, if additional autocorrelation information is available and the corresponding polynomials are co-prime, blind deconvolution is uniquely solvable up to global phase. Using lifting, the outer product of the unknown vectors is the solution to a (convex) semi-definite program (SDP), demonstrating that, theoretically, recovery is computationally tractable. However, for practical applications efficient algorithms are required which should operate in the original signal space. To this end we also discuss a gradient descent algorithm (Wirtinger flow) for the original non-convex problem. We demonstrate numerically that such an approach has performance comparable to the semidefinite program in the noisy case. Our work is motivated by applications in blind communication scenarios and we will discuss a specific signaling scheme where information is encoded into polynomial roots.

  15. MINIMUM ENTROPY DECONVOLUTION OF ONE-AND MULTI-DIMENSIONAL NON-GAUSSIAN LINEAR RANDOM PROCESSES

    Institute of Scientific and Technical Information of China (English)

    程乾生

    1990-01-01

    The minimum entropy deconvolution is considered as one of the methods for decomposing non-Gaussian linear processes. The concept of peakedness of a system response sequence is presented and its properties are studied. With the aid of the peakedness, the convergence theory of the minimum entropy deconvolution is established. The problem of the minimum entropy deconvolution of multi-dimensional non-Gaussian linear random processes is first investigated and the corresponding theory is given. In addition, the relation between the minimum entropy deconvolution and parameter method is discussed.

  16. Deconvolution of ferromagnetic resonance in devitrification process of Co-based amorphous alloys

    International Nuclear Information System (INIS)

    Montiel, H.; Alvarez, G.; Betancourt, I.; Zamorano, R.; Valenzuela, R.

    2006-01-01

    Ferromagnetic resonance (FMR) measurements were carried out on soft magnetic amorphous ribbons of composition Co66Fe4B12Si13Nb4Cu prepared by melt spinning. In the as-cast sample, a simple FMR spectrum was apparent. For treatment times of 5-20 min a complex resonant absorption at lower fields was detected; deconvolution calculations were carried out on the FMR spectra and it was possible to separate two contributions. These results can be interpreted as the combination of two different magnetic phases, corresponding to the amorphous matrix and nanocrystallites. The parameters of resonant absorptions can be associated with the evolution of nanocrystallization during the annealing
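
    Spectral deconvolution of the kind described above is often carried out by least-squares fitting of two resonance line shapes. The sketch below fits two Lorentzian-derivative lines to a synthetic dP/dH spectrum; the line shape, field values, and starting guesses are illustrative assumptions, not parameters from the study.

```python
# Sketch: separating an FMR spectrum into two resonant contributions by fitting
# two Lorentzian-derivative lines (e.g. amorphous matrix + nanocrystallites).
import numpy as np
from scipy.optimize import curve_fit

def dlorentz(H, A, H0, dH):
    """Derivative-of-Lorentzian FMR line centred at H0 with width dH."""
    return -2.0 * A * dH**2 * (H - H0) / (((H - H0)**2 + dH**2)**2)

def two_lines(H, A1, H1, dH1, A2, H2, dH2):
    return dlorentz(H, A1, H1, dH1) + dlorentz(H, A2, H2, dH2)

# Synthetic spectrum with illustrative parameters and a little noise
H = np.linspace(0.0, 3000.0, 1000)
signal = two_lines(H, 1.0, 1500.0, 120.0, 0.4, 1100.0, 300.0)
signal += 5e-5 * np.random.randn(H.size)

p0 = [1.0, 1450.0, 100.0, 0.5, 1050.0, 250.0]     # starting guesses
popt, pcov = curve_fit(two_lines, H, signal, p0=p0)
component_1 = dlorentz(H, *popt[:3])
component_2 = dlorentz(H, *popt[3:])
```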

  17. Simultaneous super-resolution and blind deconvolution

    International Nuclear Information System (INIS)

    Sroubek, F; Flusser, J; Cristobal, G

    2008-01-01

    In many real applications, blur in input low-resolution images is a nuisance, which prevents traditional super-resolution methods from working correctly. This paper presents a unifying approach to the blind deconvolution and superresolution problem of multiple degraded low-resolution frames of the original scene. We introduce a method which assumes no prior information about the shape of degradation blurs and which is properly defined for any rational (fractional) resolution factor. The method minimizes a regularized energy function with respect to the high-resolution image and blurs, where regularization is carried out in both the image and blur domains. The blur regularization is based on a generalized multichannel blind deconvolution constraint. Experiments on real data illustrate robustness and utilization of the method

  18. Real Time Deconvolution of In-Vivo Ultrasound Images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2013-01-01

    … and two wavelengths. This can be improved by deconvolution, which increases the bandwidth and equalizes the phase to increase resolution under the constraint of the electronic noise in the received signal. A fixed-interval Kalman filter based deconvolution routine written in C is employed. It uses a state … resolution has been determined from the in-vivo liver image using the auto-covariance function. From the envelope of the estimated pulse the axial resolution at Full-Width-Half-Max is 0.581 mm, corresponding to 1.13 λ at 3 MHz. The algorithm increases the resolution to 0.116 mm or 0.227 λ, corresponding … to a factor of 5.1. The basic pulse can be estimated in roughly 0.176 seconds on a single CPU core on an Intel i5 CPU running at 1.8 GHz. An in-vivo image consisting of 100 lines of 1600 samples can be processed in roughly 0.1 seconds, making it possible to perform real-time deconvolution on ultrasound data...

  19. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    Science.gov (United States)

    Xuan, Chuang; Oda, Hirokuni

    2015-11-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to view conveniently and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.

  20. Method for the deconvolution of incompletely resolved CARS spectra in chemical dynamics experiments

    International Nuclear Information System (INIS)

    Anda, A.A.; Phillips, D.L.; Valentini, J.J.

    1986-01-01

    We describe a method for deconvoluting incompletely resolved CARS spectra to obtain quantum state population distributions. No particular form for the rotational and vibrational state distribution is assumed, the population of each quantum state is treated as an independent quantity. This method of analysis differs from previously developed approaches for the deconvolution of CARS spectra, all of which assume that the population distribution is Boltzmann, and thus are limited to the analysis of CARS spectra taken under conditions of thermal equilibrium. The method of analysis reported here has been developed to deconvolute CARS spectra of photofragments and chemical reaction products obtained in chemical dynamics experiments under nonequilibrium conditions. The deconvolution procedure has been incorporated into a computer code. The application of that code to the deconvolution of CARS spectra obtained for samples at thermal equilibrium and not at thermal equilibrium is reported. The method is accurate and computationally efficient

  1. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    Science.gov (United States)

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared with existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

  2. The deconvolution of sputter-etching surface concentration measurements to determine impurity depth profiles

    International Nuclear Information System (INIS)

    Carter, G.; Katardjiev, I.V.; Nobes, M.J.

    1989-01-01

    The quasi-linear partial differential continuity equations that describe the evolution of the depth profiles and surface concentrations of marker atoms in kinematically equivalent systems undergoing sputtering, ion collection and atomic mixing are solved using the method of characteristics. It is shown how atomic mixing probabilities can be deduced from measurements of ion collection depth profiles with increasing ion fluence, and how this information can be used to predict surface concentration evolution. Even with this information, however, it is shown that it is not possible to deconvolute directly the surface concentration measurements to provide initial depth profiles, except when only ion collection and sputtering from the surface layer alone occur. It is demonstrated further that optimal recovery of initial concentration depth profiles could be ensured if the concentration-measuring analytical probe preferentially sampled depths near and at the maximum depth of bombardment-induced perturbations. (author)

  3. Point spread functions and deconvolution of ultrasonic images.

    Science.gov (United States)

    Dalitz, Christoph; Pohle-Fröhlich, Regina; Michalk, Thorsten

    2015-03-01

    This article investigates the restoration of ultrasonic pulse-echo C-scan images by means of deconvolution with a point spread function (PSF). The deconvolution concept from linear system theory (LST) is linked to the wave equation formulation of the imaging process, and an analytic formula for the PSF of planar transducers is derived. For this analytic expression, different numerical and analytic approximation schemes for evaluating the PSF are presented. By comparing simulated images with measured C-scan images, we demonstrate that the assumptions of LST in combination with our formula for the PSF are a good model for the pulse-echo imaging process. To reconstruct the object from a C-scan image, we compare different deconvolution schemes: the Wiener filter, the ForWaRD algorithm, and the Richardson-Lucy algorithm. The best results are obtained with the Richardson-Lucy algorithm with total variation regularization. For distances greater than or equal to twice the near-field distance, our experiments show that the numerically computed PSF can be replaced with a simple closed analytic term based on a far-field approximation.
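
    For reference, the plain Richardson-Lucy iteration (without the total-variation term the authors found best) can be sketched in a few lines with FFT-based convolution; the image and PSF arrays are placeholders and the iteration count is illustrative.

```python
# Sketch: plain Richardson-Lucy deconvolution for a 2-D image and a known PSF
# (the total-variation regularization reported as best is omitted for brevity).
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    psf = psf / psf.sum()                     # normalized 2-D PSF
    psf_mirror = psf[::-1, ::-1]              # flipped PSF for the adjoint step
    estimate = np.full(image.shape, float(image.mean()))
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, eps)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```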

  4. Is deconvolution applicable to renography?

    NARCIS (Netherlands)

    Kuyvenhoven, JD; Ham, H; Piepsz, A

    The feasibility of deconvolution depends on many factors, but the technique cannot provide accurate results if the maximal transit time (MaxTT) is longer than the duration of the acquisition. This study evaluated whether, on the basis of a 20 min renogram, it is possible to predict in which cases …

  5. Time evolution of regional CT density changes in normal lung after IMRT for NSCLC

    International Nuclear Information System (INIS)

    Bernchou, Uffe; Schytte, Tine; Bertelsen, Anders; Bentzen, Søren M.; Hansen, Olfred; Brink, Carsten

    2013-01-01

    Purpose: This study investigates the clinical radiobiology of radiation induced lung disease in terms of regional computed tomography (CT) density changes following intensity modulated radiotherapy (IMRT) for non-small-cell lung cancer (NSCLC). Methods: A total of 387 follow-up CT scans in 131 NSCLC patients receiving IMRT to a prescribed dose of 60 or 66 Gy in 2 Gy fractions were analyzed. The dose-dependent temporal evolution of the density change was analyzed using a two-component model, a superposition of an early, transient component and a late, persistent component. Results: The CT density of healthy lung tissue was observed to increase significantly (p […]) […] 12 months. Conclusions: The radiobiology of lung injury may be analyzed in terms of CT density change. The initial transient change in density is consistent with radiation pneumonitis, while the subsequent stabilization of the density is consistent with pulmonary fibrosis

  6. Filtering and deconvolution for bioluminescence imaging of small animals

    International Nuclear Information System (INIS)

    Akkoul, S.

    2010-01-01

    This thesis is devoted to the analysis of bioluminescence images applied to the small animal. This kind of imaging modality is used in cancerology studies. Nevertheless, some problems are related to the diffusion and absorption, by the tissues, of the light from internal bioluminescent sources. In addition, system noise and cosmic-ray noise are present. This degrades the quality of the images and makes them difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first propose an image formation model for bioluminescence images. The processing chain is constituted by a filtering stage followed by a deconvolution stage. We propose a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter represents the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms. It allowed us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we have shown that the processing chain allows a significant improvement of the spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for the users of bioluminescence images. (author)

  7. 4Pi microscopy deconvolution with a variable point-spread function.

    Science.gov (United States)

    Baddeley, David; Carl, Christian; Cremer, Christoph

    2006-09-20

    To remove the axial sidelobes from 4Pi images, deconvolution forms an integral part of 4Pi microscopy. As a result of its high axial resolution, the 4Pi point spread function (PSF) is particularly susceptible to imperfect optical conditions within the sample. This is typically observed as a shift in the position of the maxima under the PSF envelope. A significantly varying phase shift renders deconvolution procedures based on a spatially invariant PSF essentially useless. We present a technique for computing the forward transformation in the case of a varying phase at a computational expense of the same order of magnitude as that of the shift invariant case, a method for the estimation of PSF phase from an acquired image, and a deconvolution procedure built on these techniques.

  8. Deconvolution of neutron scattering data: a new computational approach

    International Nuclear Information System (INIS)

    Weese, J.; Hendricks, J.; Zorn, R.; Honerkamp, J.; Richter, D.

    1996-01-01

    In this paper we address the problem of reconstructing the scattering function S_Q(E) from neutron spectroscopy data which represent a convolution of the former function with an instrument dependent resolution function. It is well known that this kind of deconvolution is an ill-posed problem. Therefore, we apply the Tikhonov regularization technique to get an estimate of S_Q(E) from the data. Special features of the neutron spectroscopy data require modifications of the basic procedure, the most important one being a transformation to a non-linear problem. The method is tested by deconvolution of actual data from the IN6 time-of-flight spectrometer (resolution: 90 μeV) and simulated data. As a result the deconvolution is shown to be feasible down to an energy transfer of ∼100 μeV for this instrument without recognizable error and down to ∼20 μeV with 10% relative error. (orig.)

  9. Deconvolution of time series in the laboratory

    Science.gov (United States)

    John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian

    2016-10-01

    In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
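
    A minimal sketch of the Fourier-space deconvolution used in both applications follows. It assumes the system's impulse response has been measured separately; the small stabilising constant `eps` is an added assumption that keeps the inverse filter bounded where the response is weak, and is not part of the paper.

```python
import numpy as np

def fourier_deconvolve(recorded, impulse_response, eps=1e-3):
    """Regularised inverse filtering: X = Y * conj(H) / (|H|^2 + eps)."""
    n = len(recorded)
    H = np.fft.rfft(impulse_response, n)   # measured system frequency response
    Y = np.fft.rfft(recorded)
    X = Y * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(X, n)

# Typical use: feed a known test signal through the system once to obtain
# `impulse_response`, then apply fourier_deconvolve to every later recording.
```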

  10. Deconvolution using the complex cepstrum

    Energy Technology Data Exchange (ETDEWEB)

    Riley, H B

    1980-12-01

    The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.

  11. A method of PSF generation for 3D brightfield deconvolution.

    Science.gov (United States)

    Tadrous, P J

    2010-02-01

    This paper addresses the problem of 3D deconvolution of through focus widefield microscope datasets (Z-stacks). One of the most difficult stages in brightfield deconvolution is finding the point spread function. A theoretically calculated point spread function (called a 'synthetic PSF' in this paper) requires foreknowledge of many system parameters and still gives only approximate results. A point spread function measured from a sub-resolution bead suffers from low signal-to-noise ratio, compounded in the brightfield setting (by contrast to fluorescence) by absorptive, refractive and dispersal effects. This paper describes a method of point spread function estimation based on measurements of a Z-stack through a thin sample. This Z-stack is deconvolved by an idealized point spread function derived from the same Z-stack to yield a point spread function of high signal-to-noise ratio that is also inherently tailored to the imaging system. The theory is validated by a practical experiment comparing the non-blind 3D deconvolution of the yeast Saccharomyces cerevisiae with the point spread function generated using the method presented in this paper (called the 'extracted PSF') to a synthetic point spread function. Restoration of both high- and low-contrast brightfield structures is achieved with fewer artefacts using the extracted point spread function obtained with this method. Furthermore the deconvolution progresses further (more iterations are allowed before the error function reaches its nadir) with the extracted point spread function compared to the synthetic point spread function indicating that the extracted point spread function is a better fit to the brightfield deconvolution model than the synthetic point spread function.

  12. SU-F-T-478: Effect of Deconvolution in Analysis of Mega Voltage Photon Beam Profiles

    Energy Technology Data Exchange (ETDEWEB)

    Muthukumaran, M [Apollo Speciality Hospitals, Chennai, Tamil Nadu (India); Manigandan, D [Fortis Cancer Institute, Mohali, Punjab (India); Murali, V; Chitra, S; Ganapathy, K [Apollo Speciality Hospital, Chennai, Tamil Nadu (India); Vikraman, S [JAYPEE HOSPITAL- RADIATION ONCOLOGY, Noida, UTTAR PRADESH (India)

    2016-06-15

    Purpose: To study and compare the penumbra of 6 MV and 15 MV photon beam profiles after deconvolving the responses of ionization chambers of different volumes. Methods: A 0.125 cc Semi-Flex chamber, a Markus chamber and a PTW Farmer chamber were used to measure in-plane and cross-plane profiles at 5 cm depth for 6 MV and 15 MV photons. Profiles were measured for field sizes from 2 × 2 cm up to 30 × 30 cm. PTW TBA scan software was used for the measurements, and its "deconvolution" functionality was used to remove the volume-averaging effect of the finite chamber volume along the lateral and longitudinal directions for all ionization chambers. The predicted true profiles were compared, and the change in penumbra before and after deconvolution was studied. Results: After deconvolution the penumbra decreased by about 1 mm for field sizes from 2 × 2 cm up to 20 × 20 cm, along both the lateral and longitudinal directions. For field sizes from 20 × 20 cm up to 30 × 30 cm the decrease in penumbra was about 1.2 to 1.8 mm. This was observed for both 6 MV and 15 MV photon beams. The penumbra was always smaller in the deconvolved profiles for all ionization chambers in the study. The difference in penumbral values between the deconvolved profiles along the lateral and longitudinal directions was of the order of 0.1 to 0.3 mm for all chambers. Deconvolution of the profiles along the longitudinal direction of the Farmer chamber was poor and not comparable with the other deconvolved profiles. Conclusion: The deconvolved profiles of the 0.125 cc and Markus chambers were comparable, and the deconvolution functionality can be used to overcome the volume-averaging effect.

  13. Convex blind image deconvolution with inverse filtering

    Science.gov (United States)

    Lv, Xiao-Guang; Li, Fang; Zeng, Tieyong

    2018-03-01

    Blind image deconvolution is the process of estimating both the original image and the blur kernel from the degraded image with only partial or no information about degradation and the imaging system. It is a bilinear ill-posed inverse problem corresponding to the direct problem of convolution. Regularization methods are used to handle the ill-posedness of blind deconvolution and get meaningful solutions. In this paper, we investigate a convex regularized inverse filtering method for blind deconvolution of images. We assume that the support region of the blur object is known, as has been done in a few existing works. By studying the inverse filters of signal and image restoration problems, we observe the oscillation structure of the inverse filters. Inspired by the oscillation structure of the inverse filters, we propose to use the star norm to regularize the inverse filter. Meanwhile, we use the total variation to regularize the resulting image obtained by convolving the inverse filter with the degraded image. The proposed minimization model is shown to be convex. We employ the first-order primal-dual method for the solution of the proposed minimization model. Numerical examples for blind image restoration are given to show that the proposed method outperforms some existing methods in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), visual quality and time consumption.

  14. New Lagrange Multipliers for the Blind Adaptive Deconvolution Problem Applicable for the Noisy Case

    Directory of Open Access Journals (Sweden)

    Monika Pinchas

    2016-02-01

    Full Text Available Recently, a new blind adaptive deconvolution algorithm was proposed based on a new closed-form approximated expression for the conditional expectation (the expectation of the source input given the equalized or deconvolutional output), where the output and input probability density functions (pdfs) of the deconvolutional process were approximated with the maximum entropy density approximation technique. The Lagrange multipliers for the output pdf were set to those used for the input pdf. Although this new blind adaptive deconvolution method has been shown to have improved equalization performance compared to the maximum entropy blind adaptive deconvolution algorithm recently proposed by the same author, it is not applicable for the very noisy case. In this paper, we derive new Lagrange multipliers for the output and input pdfs, where the Lagrange multipliers related to the output pdf are a function of the channel noise power. Simulation results indicate that the newly obtained blind adaptive deconvolution algorithm using these new Lagrange multipliers is robust to the signal-to-noise ratio (SNR), unlike the previously proposed method, and is applicable for the whole range of SNR down to 7 dB. In addition, we also obtain new closed-form approximated expressions for the conditional expectation and mean square error (MSE).

  15. Blind deconvolution using the similarity of multiscales regularization for infrared spectrum

    International Nuclear Information System (INIS)

    Huang, Tao; Liu, Hai; Zhang, Zhaoli; Liu, Sanyan; Liu, Tingting; Shen, Xiaoxuan; Zhang, Jianfeng; Zhang, Tianxu

    2015-01-01

    Band overlap and random noise exist widely when the spectra are captured using an infrared spectrometer, especially since the aging of instruments has become a serious problem. In this paper, via introducing the similarity of multiscales, a blind spectral deconvolution method is proposed. Considering that there is a similarity between latent spectra at different scales, it is used as prior knowledge to constrain the estimated latent spectrum similar to pre-scale to reduce artifacts which are produced from deconvolution. The experimental results indicate that the proposed method is able to obtain a better performance than state-of-the-art methods, and to obtain satisfying deconvolution results with fewer artifacts. The recovered infrared spectra can easily extract the spectral features and recognize unknown objects. (paper)

  16. Studying Regional Wave Source Time Functions Using A Massive Automated EGF Deconvolution Procedure

    Science.gov (United States)

    Xie, J.; Schaff, D. P.

    2010-12-01

    Reliably estimated source time functions (STF) from high-frequency regional waveforms, such as Lg, Pn and Pg, provide important input for seismic source studies, explosion detection, and minimization of parameter trade-off in attenuation studies. The empirical Green’s function (EGF) method can be used for estimating STF, but it requires a strict recording condition. Waveforms from pairs of events that are similar in focal mechanism, but different in magnitude must be on-scale recorded on the same stations for the method to work. Searching for such waveforms can be very time consuming, particularly for regional waves that contain complex path effects and have reduced S/N ratios due to attenuation. We have developed a massive, automated procedure to conduct inter-event waveform deconvolution calculations from many candidate event pairs. The procedure automatically evaluates the “spikiness” of the deconvolutions by calculating their “sdc”, which is defined as the peak divided by the background value. The background value is calculated as the mean absolute value of the deconvolution, excluding 10 s around the source time function. When the sdc values are about 10 or higher, the deconvolutions are found to be sufficiently spiky (pulse-like), indicating similar path Green’s functions and good estimates of the STF. We have applied this automated procedure to Lg waves and full regional wavetrains from 989 M ≥ 5 events in and around China, calculating about a million deconvolutions. Of these we found about 2700 deconvolutions with sdc greater than 9, which, if having a sufficiently broad frequency band, can be used to estimate the STF of the larger events. We are currently refining our procedure, as well as the estimated STFs. We will infer the source scaling using the STFs. We will also explore the possibility that the deconvolution procedure could complement cross-correlation in a real time event-screening process.
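
    The "sdc" spikiness measure is simple enough to state in code. The sketch below follows the definition quoted in the abstract (peak over mean absolute background, excluding 10 s around the source time function); centring the excluded window on the peak is our assumption.

```python
import numpy as np

def sdc(deconv, dt, exclude_s=10.0):
    """Peak of the deconvolution divided by the mean absolute background,
    excluding `exclude_s` seconds around the peak (assumed to mark the STF)."""
    ipk = int(np.argmax(np.abs(deconv)))
    half = int(0.5 * exclude_s / dt)
    mask = np.ones(len(deconv), dtype=bool)
    mask[max(0, ipk - half):ipk + half + 1] = False
    background = np.mean(np.abs(deconv[mask]))
    return np.abs(deconv[ipk]) / background

# Pairs whose deconvolution yields sdc of roughly 10 or more would be kept,
# following the threshold quoted in the abstract.
```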

  17. Seismic interferometry by multidimensional deconvolution as a means to compensate for anisotropic illumination

    Science.gov (United States)

    Wapenaar, K.; van der Neut, J.; Ruigrok, E.; Draganov, D.; Hunziker, J.; Slob, E.; Thorbecke, J.; Snieder, R.

    2008-12-01

    It is well-known that under specific conditions the crosscorrelation of wavefields observed at two receivers yields the impulse response between these receivers. This principle is known as 'Green's function retrieval' or 'seismic interferometry'. Recently it has been recognized that in many situations it can be advantageous to replace the correlation process by deconvolution. One of the advantages is that deconvolution compensates for the waveform emitted by the source; another advantage is that it is not necessary to assume that the medium is lossless. The approaches that have been developed to date employ a 1D deconvolution process. We propose a method for seismic interferometry by multidimensional deconvolution and show that under specific circumstances the method compensates for irregularities in the source distribution. This is an important difference with crosscorrelation methods, which rely on the condition that waves are equipartitioned. This condition is for example fulfilled when the sources are regularly distributed along a closed surface and the power spectra of the sources are identical. The proposed multidimensional deconvolution method compensates for anisotropic illumination, without requiring knowledge about the positions and the spectra of the sources.

  18. Deconvolution for the localization of sound sources using a circular microphone array

    DEFF Research Database (Denmark)

    Tiana Roig, Elisabet; Jacobsen, Finn

    2013-01-01

    During the last decade, the aeroacoustic community has examined various methods based on deconvolution to improve the visualization of acoustic fields scanned with planar sparse arrays of microphones. These methods assume that the beamforming map in an observation plane can be approximated by a convolution, and that the beamformer's point-spread function is shift-invariant. This makes it possible to apply computationally efficient deconvolution algorithms that consist of spectral procedures in the entire region of interest, such as the deconvolution approach for the mapping of the acoustic sources 2, the Fourier-based non-negative least squares, and the Richardson-Lucy. This investigation examines the matter with computer simulations and measurements.

  19. Lineshape estimation for magnetic resonance spectroscopy (MRS) signals: self-deconvolution revisited

    International Nuclear Information System (INIS)

    Sima, D M; Garcia, M I Osorio; Poullet, J; Van Huffel, S; Suvichakorn, A; Antoine, J-P; Van Ormondt, D

    2009-01-01

    Magnetic resonance spectroscopy (MRS) is an effective diagnostic technique for monitoring biochemical changes in an organism. The lineshape of MRS signals can deviate from the theoretical Lorentzian lineshape due to inhomogeneities of the magnetic field applied to patients and to tissue heterogeneity. We call this deviation a distortion and study the self-deconvolution method for automatic estimation of the unknown lineshape distortion. The method is embedded within a time-domain metabolite quantitation algorithm for short-echo-time MRS signals. Monte Carlo simulations are used to analyze whether estimation of the unknown lineshape can improve the overall quantitation result. We use a signal with eight metabolic components inspired by typical MRS signals from healthy human brain and allocate special attention to the step of denoising and spike removal in the self-deconvolution technique. To this end, we compare several modeling techniques, based on complex damped exponentials, splines and wavelets. Our results show that self-deconvolution performs well, provided that some unavoidable hyper-parameters of the denoising methods are well chosen. Comparison of the first and last iterations shows an improvement when considering iterations instead of a single step of self-deconvolution

  20. Deconvolution of astronomical images using SOR with adaptive relaxation.

    Science.gov (United States)

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-04

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive to other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition, +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution, where stationarity of the object is a necessity.
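
    For readers unfamiliar with SOR in this context, the sketch below applies classical Gauss-Seidel sweeps with a relaxation parameter, plus an optional positivity projection ("+SOR"), to the normal equations of a small 1-D deconvolution problem. The fixed `omega` is only a placeholder; the point of the paper is precisely how to choose and update it.

```python
import numpy as np

def sor_deconvolve(y, H, omega=1.5, n_iter=200, nonneg=True):
    """Gauss-Seidel sweeps with relaxation (SOR) on the normal equations
    (H^T H) x = H^T y; `H` is the blur matrix built from the PSF."""
    A = H.T @ H
    b = H.T @ y
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_iter):
        for i in range(b.size):
            sigma = A[i] @ x - A[i, i] * x[i]       # uses already-updated entries
            x_new = (b[i] - sigma) / A[i, i]
            x[i] = (1.0 - omega) * x[i] + omega * x_new
            if nonneg and x[i] < 0.0:               # "+SOR": positivity projection
                x[i] = 0.0
    return x
```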

  1. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.

  2. Simulation Study of Effects of the Blind Deconvolution on Ultrasound Image

    Science.gov (United States)

    He, Xingwu; You, Junchen

    2018-03-01

    Ultrasonic image restoration is an essential subject in medical ultrasound imaging. However, without sufficient and precise system knowledge, traditional image restoration methods based on prior knowledge of the system often fail to improve image quality. In this paper, we use simulated ultrasound images to assess the effectiveness of the blind deconvolution method for ultrasound image restoration. Experimental results demonstrate that, compared with traditional image restoration methods, blind deconvolution can be applied to ultrasound image restoration and achieves satisfactory results without precise prior knowledge. Even with an inaccurate small initial PSF, the results show that blind deconvolution improves the overall image quality of ultrasound images, with much better SNR and image resolution. We also report the time consumption of these methods, which does not increase significantly on a GPU platform.

  3. Blind Deconvolution With Model Discrepancies

    Czech Academy of Sciences Publication Activity Database

    Kotera, Jan; Šmídl, Václav; Šroubek, Filip

    2017-01-01

    Roč. 26, č. 5 (2017), s. 2533-2544 ISSN 1057-7149 R&D Projects: GA ČR GA13-29225S; GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : blind deconvolution * variational Bayes * automatic relevance determination Subject RIV: JD - Computer Applications, Robotics OBOR OECD: Computer hardware and architecture Impact factor: 4.828, year: 2016 http://library.utia.cas.cz/separaty/2017/ZOI/kotera-0474858.pdf

  4. Comparison of Deconvolution Filters for Photoacoustic Tomography.

    Directory of Open Access Journals (Sweden)

    Dominique Van de Sompel

    Full Text Available In this work, we compare the merits of three temporal data deconvolution methods for use in the filtered backprojection algorithm for photoacoustic tomography (PAT. We evaluate the standard Fourier division technique, the Wiener deconvolution filter, and a Tikhonov L-2 norm regularized matrix inversion method. Our experiments were carried out on subjects of various appearances, namely a pencil lead, two man-made phantoms, an in vivo subcutaneous mouse tumor model, and a perfused and excised mouse brain. All subjects were scanned using an imaging system with a rotatable hemispherical bowl, into which 128 ultrasound transducer elements were embedded in a spiral pattern. We characterized the frequency response of each deconvolution method, compared the final image quality achieved by each deconvolution technique, and evaluated each method's robustness to noise. The frequency response was quantified by measuring the accuracy with which each filter recovered the ideal flat frequency spectrum of an experimentally measured impulse response. Image quality under the various scenarios was quantified by computing noise versus resolution curves for a point source phantom, as well as the full width at half maximum (FWHM and contrast-to-noise ratio (CNR of selected image features such as dots and linear structures in additional imaging subjects. It was found that the Tikhonov filter yielded the most accurate balance of lower and higher frequency content (as measured by comparing the spectra of deconvolved impulse response signals to the ideal flat frequency spectrum, achieved a competitive image resolution and contrast-to-noise ratio, and yielded the greatest robustness to noise. While the Wiener filter achieved a similar image resolution, it tended to underrepresent the lower frequency content of the deconvolved signals, and hence of the reconstructed images after backprojection. In addition, its robustness to noise was poorer than that of the Tikhonov
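
    The three filters compared in the study have compact frequency-domain forms. The sketch below applies them to a 1-D trace with a measured impulse response; the noise-to-signal and regularisation constants are placeholders, and the paper's Tikhonov variant is a regularised matrix inversion rather than the spectral shortcut shown here.

```python
import numpy as np

def deconv_filters(y, h, nsr=1e-2, lam=1e-2):
    """Apply plain Fourier division, a Wiener filter with a constant
    noise-to-signal ratio `nsr`, and an L2 (Tikhonov-like) filter to trace y."""
    n = len(y)
    H = np.fft.rfft(h, n)
    Y = np.fft.rfft(y)
    fourier  = np.fft.irfft(Y / (H + 1e-12), n)
    wiener   = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + nsr), n)
    tikhonov = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + lam), n)
    return fourier, wiener, tikhonov
```

    With a flat noise-to-signal ratio the Wiener and L2 expressions coincide up to the choice of constant, so the differences reported in the paper come mainly from how those constants are set and from the matrix formulation of the Tikhonov method.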

  5. Motion correction of PET brain images through deconvolution: II. Practical implementation and algorithm optimization

    Science.gov (United States)

    Raghunath, N.; Faber, T. L.; Suryanarayanan, S.; Votaw, J. R.

    2009-02-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. When patient motion is known, deconvolution methods can be used to correct the reconstructed image and reduce motion blur. This paper describes the implementation and optimization of an iterative deconvolution method that uses an ordered subset approach to make it practical and clinically viable. We performed ten separate FDG PET scans using the Hoffman brain phantom and simultaneously measured its motion using the Polaris Vicra tracking system (Northern Digital Inc., Ontario, Canada). The feasibility and effectiveness of the technique was studied by performing scans with different motion and deconvolution parameters. Deconvolution resulted in visually better images and significant improvement as quantified by the Universal Quality Index (UQI) and contrast measures. Finally, the technique was applied to human studies to demonstrate marked improvement. Thus, the deconvolution technique presented here appears promising as a valid alternative to existing motion correction methods for PET. It has the potential for deblurring an image from any modality if the causative motion is known and its effect can be represented in a system matrix.

  6. Motion correction of PET brain images through deconvolution: II. Practical implementation and algorithm optimization

    International Nuclear Information System (INIS)

    Raghunath, N; Faber, T L; Suryanarayanan, S; Votaw, J R

    2009-01-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. When patient motion is known, deconvolution methods can be used to correct the reconstructed image and reduce motion blur. This paper describes the implementation and optimization of an iterative deconvolution method that uses an ordered subset approach to make it practical and clinically viable. We performed ten separate FDG PET scans using the Hoffman brain phantom and simultaneously measured its motion using the Polaris Vicra tracking system (Northern Digital Inc., Ontario, Canada). The feasibility and effectiveness of the technique was studied by performing scans with different motion and deconvolution parameters. Deconvolution resulted in visually better images and significant improvement as quantified by the Universal Quality Index (UQI) and contrast measures. Finally, the technique was applied to human studies to demonstrate marked improvement. Thus, the deconvolution technique presented here appears promising as a valid alternative to existing motion correction methods for PET. It has the potential for deblurring an image from any modality if the causative motion is known and its effect can be represented in a system matrix.

  7. Motion correction of PET brain images through deconvolution: II. Practical implementation and algorithm optimization

    Energy Technology Data Exchange (ETDEWEB)

    Raghunath, N; Faber, T L; Suryanarayanan, S; Votaw, J R [Department of Radiology, Emory University Hospital, 1364 Clifton Road, N.E. Atlanta, GA 30322 (United States)], E-mail: John.Votaw@Emory.edu

    2009-02-07

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. When patient motion is known, deconvolution methods can be used to correct the reconstructed image and reduce motion blur. This paper describes the implementation and optimization of an iterative deconvolution method that uses an ordered subset approach to make it practical and clinically viable. We performed ten separate FDG PET scans using the Hoffman brain phantom and simultaneously measured its motion using the Polaris Vicra tracking system (Northern Digital Inc., Ontario, Canada). The feasibility and effectiveness of the technique was studied by performing scans with different motion and deconvolution parameters. Deconvolution resulted in visually better images and significant improvement as quantified by the Universal Quality Index (UQI) and contrast measures. Finally, the technique was applied to human studies to demonstrate marked improvement. Thus, the deconvolution technique presented here appears promising as a valid alternative to existing motion correction methods for PET. It has the potential for deblurring an image from any modality if the causative motion is known and its effect can be represented in a system matrix.

  8. Deconvolution of the vestibular evoked myogenic potential.

    Science.gov (United States)

    Lütkenhöner, Bernd; Basel, Türker

    2012-02-07

    The vestibular evoked myogenic potential (VEMP) and the associated variance modulation can be understood by a convolution model. Two functions of time are incorporated into the model: the motor unit action potential (MUAP) of an average motor unit, and the temporal modulation of the MUAP rate of all contributing motor units, briefly called rate modulation. The latter is the function of interest, whereas the MUAP acts as a filter that distorts the information contained in the measured data. Here, it is shown how to recover the rate modulation by undoing the filtering using a deconvolution approach. The key aspects of our deconvolution algorithm are as follows: (1) the rate modulation is described in terms of just a few parameters; (2) the MUAP is calculated by Wiener deconvolution of the VEMP with the rate modulation; (3) the model parameters are optimized using a figure-of-merit function where the most important term quantifies the difference between measured and model-predicted variance modulation. The effectiveness of the algorithm is demonstrated with simulated data. An analysis of real data confirms the view that there are basically two components, which roughly correspond to the waves p13-n23 and n34-p44 of the VEMP. The rate modulation corresponding to the first, inhibitory component is much stronger than that corresponding to the second, excitatory component. But the latter is more extended so that the two modulations have almost the same equivalent rectangular duration. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Waveform inversion with exponential damping using a deconvolution-based objective function

    KAUST Repository

    Choi, Yun Seok

    2016-09-06

    The lack of low frequency components in seismic data usually leads full waveform inversion into the local minima of its objective function. An exponential damping of the data, on the other hand, generates artificial low frequencies, which can be used to admit long wavelength updates for waveform inversion. Another feature of exponential damping is that the energy of each trace also exponentially decreases with source-receiver offset, where the least-squares misfit function does not work well. Thus, we propose a deconvolution-based objective function for waveform inversion with an exponential damping. Since the deconvolution filter includes a division process, it can properly address the unbalanced energy levels of the individual traces of the damped wavefield. Numerical examples demonstrate that our proposed FWI based on the deconvolution filter can generate a convergent long wavelength structure from the artificial low frequency components introduced by the exponential damping.

  10. Image processing of globular clusters - Simulation for deconvolution tests (GlencoeSim)

    Science.gov (United States)

    Blazek, Martin; Pata, Petr

    2016-10-01

    This paper presents an algorithmic approach for efficiency tests of deconvolution algorithms in astronomic image processing. Due to the existence of noise in astronomical data there is no certainty that a mathematically exact result of stellar deconvolution exists and iterative or other methods such as aperture or PSF fitting photometry are commonly used. Iterative methods are important namely in the case of crowded fields (e.g., globular clusters). For tests of the efficiency of these iterative methods on various stellar fields, information about the real fluxes of the sources is essential. For this purpose a simulator of artificial images with crowded stellar fields provides initial information on source fluxes for a robust statistical comparison of various deconvolution methods. The "GlencoeSim" simulator and the algorithms presented in this paper consider various settings of Point-Spread Functions, noise types and spatial distributions, with the aim of producing as realistic an astronomical optical stellar image as possible.

  11. Deconvolution of In Vivo Ultrasound B-Mode Images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Stage, Bjarne; Mathorne, Jan

    1993-01-01

    An algorithm for deconvolution of medical ultrasound images is presented. The procedure involves estimation of the basic one-dimensional ultrasound pulse, determining the ratio of the covariance of the noise to the covariance of the reflection signal, and finally deconvolution of the rf signal from the transducer. Using pulse and covariance estimators makes the approach self-calibrating, as all parameters for the procedure are estimated from the patient under investigation. An example of use on a clinical, in-vivo image is given. A 2 × 2 cm region of the portal vein in a liver is deconvolved. An increase in axial resolution by a factor of 2.4 is obtained. The procedure can also be applied to whole images, when it is ensured that the rf signal is properly measured. A method for doing that is outlined.

  12. Designing a stable feedback control system for blind image deconvolution.

    Science.gov (United States)

    Cheng, Shichao; Liu, Risheng; Fan, Xin; Luo, Zhongxuan

    2018-05-01

    Blind image deconvolution is one of the main low-level vision problems with wide applications. Many previous works manually design regularization to simultaneously estimate the latent sharp image and the blur kernel under a maximum a posteriori framework. However, it has been demonstrated that such joint estimation strategies may lead to the undesired trivial solution. In this paper, we present a novel perspective, using a stable feedback control system, to simulate the latent sharp image propagation. The controller of our system consists of regularization and guidance, which decide the sparsity and sharp features of the latent image, respectively. Furthermore, the formation model of the blurred image is introduced into the feedback process to avoid the image restoration deviating from the stable point. The stability analysis of the system indicates that the latent image propagation in the blind deconvolution task can be efficiently estimated and controlled by cues and priors. Thus the kernel estimate used for image restoration becomes more precise. Experimental results show that our system is effective on image propagation, and can perform favorably against the state-of-the-art blind image deconvolution methods on different benchmark image sets and special blurred images. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Application of deconvolution interferometry with both Hi-net and KiK-net data

    Science.gov (United States)

    Nakata, N.

    2013-12-01

    Application of deconvolution interferometry to wavefields observed by KiK-net, a strong-motion recording network in Japan, is useful for estimating wave velocities and S-wave splitting in the near surface. Using this technique, for example, Nakata and Snieder (2011, 2012) found changes in velocities caused by the Tohoku-Oki earthquake in Japan. At the location of the borehole accelerometer of each KiK-net station, a velocity sensor is also installed as part of a high-sensitivity seismograph network (Hi-net). I present a technique that uses both Hi-net and KiK-net records for computing deconvolution interferometry. The deconvolved waveform obtained from the combination of Hi-net and KiK-net data is similar to the waveform computed from KiK-net data only, which indicates that one can use Hi-net wavefields for deconvolution interferometry. Because Hi-net records have a high signal-to-noise ratio (S/N) and high dynamic resolution, the S/N and the quality of amplitude and phase of deconvolved waveforms can be improved with Hi-net data. These advantages are especially important for short-time moving-window seismic interferometry and deconvolution interferometry using later coda waves.
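
    A minimal sketch of the receiver-pair deconvolution underlying this kind of interferometry is given below: the surface record is deconvolved by the borehole record with a water-level stabiliser. The water-level scheme and the 1% level are illustrative assumptions, not the author's exact processing.

```python
import numpy as np

def deconvolution_interferometry(u_surface, u_borehole, water_level=0.01):
    """Deconvolve the surface record by the borehole record with a
    water-level stabiliser; returns the band-limited impulse response."""
    n = len(u_surface)
    S = np.fft.rfft(u_surface)
    B = np.fft.rfft(u_borehole)
    power = np.abs(B) ** 2
    D = S * np.conj(B) / (power + water_level * power.mean())
    return np.fft.irfft(D, n)
```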

  14. Optimising delineation accuracy of tumours in PET for radiotherapy planning using blind deconvolution

    International Nuclear Information System (INIS)

    Guvenis, A.; Koc, A.

    2015-01-01

    Positron emission tomography (PET) imaging has been proven to be useful in radiotherapy planning for the determination of the metabolically active regions of tumours. Delineation of tumours, however, is a difficult task in part due to high noise levels and the partial volume effects originating mainly from the low camera resolution. The goal of this work is to study the effect of blind deconvolution on tumour volume estimation accuracy for different computer-aided contouring methods. The blind deconvolution estimates the point spread function (PSF) of the imaging system in an iterative manner in a way that the likelihood of the given image being the convolution output is maximised. In this way, the PSF of the imaging system does not need to be known. Data were obtained from a NEMA NU-2 IQ-based phantom with a GE DSTE-16 PET/CT scanner. The artificial tumour diameters were 13, 17, 22, 28 and 37 mm with a target/background ratio of 4:1. The tumours were delineated before and after blind deconvolution. Student's two-tailed paired t-test showed a significant decrease in volume estimation error ( p < 0.001) when blind deconvolution was used in conjunction with computer-aided delineation methods. A manual delineation confirmation demonstrated an improvement from 26 to 16 % for the artificial tumour of size 37 mm while an improvement from 57 to 15 % was noted for the small tumour of 13 mm. Therefore, it can be concluded that blind deconvolution of reconstructed PET images may be used to increase tumour delineation accuracy. (authors)

  15. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  16. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.

    Science.gov (United States)

    Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G

    2012-05-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters.

  17. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program

    International Nuclear Information System (INIS)

    Afouxenidis, D.; Polymeris, G. S.; Tsirliganis, N. C.; Kitis, G.

    2012-01-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the Glow Curve Analysis Intercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters. (authors)

  18. PERT: A Method for Expression Deconvolution of Human Blood Samples from Varied Microenvironmental and Developmental Conditions

    Science.gov (United States)

    Csaszar, Elizabeth; Yu, Mei; Morris, Quaid; Zandstra, Peter W.

    2012-01-01

    The cellular composition of heterogeneous samples can be predicted using an expression deconvolution algorithm to decompose their gene expression profiles based on pre-defined, reference gene expression profiles of the constituent populations in these samples. However, the expression profiles of the actual constituent populations are often perturbed from those of the reference profiles due to gene expression changes in cells associated with microenvironmental or developmental effects. Existing deconvolution algorithms do not account for these changes and give incorrect results when benchmarked against those measured by well-established flow cytometry, even after batch correction was applied. We introduce PERT, a new probabilistic expression deconvolution method that detects and accounts for a shared, multiplicative perturbation in the reference profiles when performing expression deconvolution. We applied PERT and three other state-of-the-art expression deconvolution methods to predict cell frequencies within heterogeneous human blood samples that were collected under several conditions (uncultured mono-nucleated and lineage-depleted cells, and culture-derived lineage-depleted cells). Only PERT's predicted proportions of the constituent populations matched those assigned by flow cytometry. Genes associated with cell cycle processes were highly enriched among those with the largest predicted expression changes between the cultured and uncultured conditions. We anticipate that PERT will be widely applicable to expression deconvolution strategies that use profiles from reference populations that vary from the corresponding constituent populations in cellular state but not cellular phenotypic identity. PMID:23284283

  19. Deconvolution of shift-variant broadening for Compton scatter imaging

    International Nuclear Information System (INIS)

    Evans, Brian L.; Martin, Jeffrey B.; Roggemann, Michael C.

    1999-01-01

    A technique is presented for deconvolving shift-variant Doppler broadening of singly Compton scattered gamma rays from their recorded energy distribution. Doppler broadening is important in Compton scatter imaging techniques employing gamma rays with energies below roughly 100 keV. The deconvolution unfolds an approximation to the angular distribution of scattered photons from their recorded energy distribution in the presence of statistical noise and background counts. Two unfolding methods are presented, one based on a least-squares algorithm and one based on a maximum likelihood algorithm. Angular distributions unfolded from measurements made on small scattering targets show less evidence of Compton broadening. This deconvolution is shown to improve the quality of filtered backprojection images in multiplexed Compton scatter tomography. Improved sharpness and contrast are evident in the images constructed from unfolded signals

  20. Optimisation of digital noise filtering in the deconvolution of ultrafast kinetic data

    International Nuclear Information System (INIS)

    Banyasz, Akos; Dancs, Gabor; Keszei, Erno

    2005-01-01

    Ultrafast kinetic measurements in the sub-picosecond time range are always distorted by a convolution with the instrumental response function. To restore the undistorted signal, deconvolution of the measured data is needed, which can be done via inverse filtering, using Fourier transforms, if experimental noise can be successfully filtered. However, in the case of experimental data when no underlying physical model is available, no quantitative criteria are known to find an optimal noise filter which would remove excessive noise without distorting the signal itself. In this paper, we analyse the Fourier transforms used during deconvolution and describe a graphical method to find such optimal noise filters. Comparison of graphically found optima to those found by quantitative criteria in the case of known synthetic kinetic signals shows the reliability of the proposed method to get fairly good deconvolved kinetic curves. A few examples of deconvolution of real-life experimental curves with the graphical noise filter optimisation are also shown

  1. Combined failure acoustical diagnosis based on improved frequency domain blind deconvolution

    International Nuclear Information System (INIS)

    Pan, Nan; Wu, Xing; Chi, YiLin; Liu, Xiaoqin; Liu, Chang

    2012-01-01

    To extract combined gearbox failures in a complex sound field, an acoustic fault detection method based on improved frequency-domain blind deconvolution is proposed. Following the frequency-domain blind deconvolution flow, morphological filtering is first used to extract the modulation features embedded in the observed signals, the CFPA algorithm is then employed for complex-domain blind separation, and finally the J-divergence of the spectra is used as a distance measure to resolve the permutation ambiguity. Experiments using real machine sound signals were carried out. The results demonstrate that this algorithm can be efficiently applied to combined gearbox failure detection in practice.

  2. Study of the Van Cittert and Gold iterative methods of deconvolution and their application in the deconvolution of experimental spectra of positron annihilation

    International Nuclear Information System (INIS)

    Bandzuch, P.; Morhac, M.; Kristiak, J.

    1997-01-01

    The study of deconvolution by the Van Cittert and Gold iterative algorithms and their use in the processing of experimental spectra of the Doppler broadening of the annihilation line in positron annihilation measurements is described. By comparing results from both algorithms it was observed that the Gold algorithm was able to eliminate linear instability of the measuring equipment if the 1274 keV ²²Na peak, measured simultaneously with the annihilation peak, is used for the deconvolution of the 511 keV annihilation peak. This permitted the measurement of small changes of the annihilation peak (e.g. the S-parameter) with high confidence. The dependence of γ-ray-like peak parameters on the number of iterations and the ability of these algorithms to distinguish a γ-ray doublet with different intensities and positions were also studied. (orig.)
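
    The two iterations compared in this study have simple textbook forms, sketched below for a 1-D counting spectrum y (non-negative) blurred by a normalised, non-negative response; iteration counts and the relaxation constant are illustrative.

```python
import numpy as np

def van_cittert(y, kernel, n_iter=50, mu=1.0):
    """Additive update: x <- x + mu * (y - K x)."""
    x = y.astype(float).copy()
    for _ in range(n_iter):
        x = x + mu * (y - np.convolve(x, kernel, mode="same"))
    return x

def gold(y, kernel, n_iter=200):
    """Multiplicative update: x <- x * y / (K x)."""
    x = np.full(len(y), float(np.mean(y)))
    for _ in range(n_iter):
        blurred = np.convolve(x, kernel, mode="same")
        x = x * y / np.maximum(blurred, 1e-12)
    return x
```

    The multiplicative Gold update keeps the estimate non-negative whenever the data and response are non-negative, which is one reason it is often preferred for counting spectra such as these.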

  3. Euler deconvolution and spectral analysis of regional aeromagnetic ...

    African Journals Online (AJOL)

    Existing regional aeromagnetic data from the south-central Zimbabwe craton has been analysed using 3D Euler deconvolution and spectral analysis to obtain quantitative information on the geological units and structures for depth constraints on the geotectonic interpretation of the region. The Euler solution maps confirm ...

  4. New deconvolution method for microscopic images based on the continuous Gaussian radial basis function interpolation model.

    Science.gov (United States)

    Chen, Zhaoxue; Chen, Hao

    2014-01-01

    A deconvolution method based on Gaussian radial basis function (GRBF) interpolation is proposed. Both the original image and the Gaussian point spread function are expressed with the same continuous GRBF model, so image degradation is simplified to the convolution of two continuous Gaussian functions, and image deconvolution is converted into calculating the weighted coefficients of two-dimensional control points. Compared with the Wiener filter and the Lucy-Richardson algorithm, the GRBF method has an obvious advantage in the quality of the restored images, at the cost of long computation times. To overcome this drawback, graphics-processing-unit multithreading or an increased spacing of the control points is adopted, respectively, to speed up the implementation of the GRBF method. The experiments show that, based on the continuous GRBF model, image deconvolution can be efficiently implemented by the method, which also has considerable reference value for the study of three-dimensional microscopic image deconvolution.

  5. Advanced Source Deconvolution Methods for Compton Telescopes

    Science.gov (United States)

    Zoglauer, Andreas

    The next generation of space telescopes utilizing Compton scattering for astrophysical observations is destined to one day unravel the mysteries behind Galactic nucleosynthesis, to determine the origin of the positron annihilation excess near the Galactic center, and to uncover the hidden emission mechanisms behind gamma-ray bursts. Besides astrophysics, Compton telescopes are establishing themselves in heliophysics, planetary sciences, medical imaging, accelerator physics, and environmental monitoring. Since the COMPTEL days, great advances in the achievable energy and position resolution were possible, creating an extremely vast, but also extremely sparsely sampled data space. Unfortunately, the optimum way to analyze the data from the next generation of Compton telescopes has not yet been found, which can retrieve all source parameters (location, spectrum, polarization, flux) and achieves the best possible resolution and sensitivity at the same time. This is especially important for all sciences objectives looking at the inner Galaxy: the large amount of expected sources, the high background (internal and Galactic diffuse emission), and the limited angular resolution, make it the most taxing case for data analysis. In general, two key challenges exist: First, what are the best data space representations to answer the specific science questions? Second, what is the best way to deconvolve the data to fully retrieve the source parameters? For modern Compton telescopes, the existing data space representations can either correctly reconstruct the absolute flux (binned mode) or achieve the best possible resolution (list-mode), both together were not possible up to now. Here we propose to develop a two-stage hybrid reconstruction method which combines the best aspects of both. Using a proof-of-concept implementation we can for the first time show that it is possible to alternate during each deconvolution step between a binned-mode approach to get the flux right and a

  6. Preliminary study of some problems in deconvolution

    International Nuclear Information System (INIS)

    Gilly, Louis; Garderet, Philippe; Lecomte, Alain; Max, Jacques

    1975-07-01

    After defining the convolution operator, its physical meaning and principal properties are given. Several deconvolution methods are analysed: the Fourier-transform method and iterative numerical methods. The positivity of the measured magnitude is the object of a new method by Yvon Biraud. The analytic continuation of the Fourier transform applied to the unknown function has been studied by Jean-Paul Sheidecker. An extensive bibliography is given [fr

  7. Iterative choice of the optimal regularization parameter in TV image deconvolution

    International Nuclear Information System (INIS)

    Sixou, B; Toma, A; Peyrin, F; Denis, L

    2013-01-01

    We present an iterative method for choosing the optimal regularization parameter for the linear inverse problem of Total Variation image deconvolution. This approach is based on the Morozov discrepancy principle and on an exponential model function for the data term. The Total Variation image deconvolution is performed with the Alternating Direction Method of Multipliers (ADMM). With a smoothed l2 norm, the differentiability of the value of the Lagrangian at the saddle point can be shown and an approximate model function obtained. The choice of the optimal parameter can be refined with a Newton method. The efficiency of the method is demonstrated on a blurred and noisy bone CT cross section

  8. Gamma-ray spectra deconvolution by maximum-entropy methods

    International Nuclear Information System (INIS)

    Los Arcos, J.M.

    1996-01-01

    A maximum-entropy method which includes the response of detectors and the statistical fluctuations of spectra is described and applied to the deconvolution of γ-ray spectra. Resolution enhancement of 25% can be reached for experimental peaks and up to 50% for simulated ones, while the intensities are conserved within 1-2%. (orig.)

  9. Triggerless Readout with Time and Amplitude Reconstruction of Event Based on Deconvolution Algorithm

    International Nuclear Information System (INIS)

    Kulis, S.; Idzik, M.

    2011-01-01

    In future linear colliders like CLIC, where the period between bunch crossings is in the sub-nanosecond range (~500 ps), an appropriate detection technique with triggerless signal processing is needed. In this work we discuss a technique, based on a deconvolution algorithm, suitable for time and amplitude reconstruction of an event. In the implemented method the output of a relatively slow shaper (spanning many bunch crossing periods) is sampled and digitised in an ADC, and the deconvolution procedure is then applied to the digital data. The time of an event can be found with a precision of a few percent of the sampling time. The signal-to-noise ratio is only slightly decreased after passing through the deconvolution filter. The theoretical and Monte Carlo studies performed are confirmed by the results of preliminary measurements obtained with a dedicated system comprising a radiation source, silicon sensor, front-end electronics, ADC and further digital processing implemented on a PC. (author)
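
    As a simplified illustration of the sampling-plus-deconvolution idea, the sketch below models the slow shaper as a single-pole exponential; its sampled response can then be undone exactly by a two-tap FIR filter, recovering the time and amplitude of the underlying impulse. The real front-end shaper and the filter used in the paper are more elaborate; the single-pole model is purely our assumption.

```python
import numpy as np

def deconvolve_single_pole(samples, tau, Ts):
    """Exact inverse of a sampled single-pole exponential shaper:
    x[n] = y[n] - exp(-Ts/tau) * y[n-1]."""
    a = np.exp(-Ts / tau)
    out = samples.astype(float).copy()
    out[1:] -= a * samples[:-1]
    return out

# Example: an impulse of amplitude 3.0 at sample 10, seen through the shaper.
Ts, tau = 1.0, 8.0
n = np.arange(40)
y = 3.0 * np.where(n >= 10, np.exp(-(n - 10) * Ts / tau), 0.0)
x = deconvolve_single_pole(y, tau, Ts)   # ~3.0 at n == 10, ~0 elsewhere
```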

  10. ALFITeX. A new code for the deconvolution of complex alpha-particle spectra

    International Nuclear Information System (INIS)

    Caro Marroyo, B.; Martin Sanchez, A.; Jurado Vargas, M.

    2013-01-01

    A new code for the deconvolution of complex alpha-particle spectra has been developed. The ALFITeX code is written in Visual Basic for Microsoft Office Excel 2010 spreadsheets, incorporating several features aimed at making it a fast, robust and useful tool with a user-friendly interface. The deconvolution procedure is based on the Levenberg-Marquardt algorithm; the curve fitted to the experimental data is the mathematical function formed by the convolution of a Gaussian with two left-handed exponentials in the low-energy-tail region. The code also includes the capability of fitting a possible constant background contribution. The application of the singular value decomposition method for matrix inversion permits the fit of any kind of alpha-particle spectrum, even those presenting singularities or an ill-conditioned curvature matrix. ALFITeX has been checked by applying it to the deconvolution and the calculation of the alpha-particle emission probabilities of ²³⁹Pu, ²⁴¹Am and ²³⁵U. (author)
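
    The fitted line shape has a closed analytic form. The sketch below writes the convolution of a Gaussian with a single left-handed exponential tail (ALFITeX itself uses two tail terms) plus a constant background, and fits it by Levenberg-Marquardt via scipy.optimize.curve_fit; variable names and initial guesses are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfcx   # exp(z**2) * erfc(z), numerically stable

def alpha_peak(E, area, mu, sigma, tau, bkg):
    """Gaussian (mu, sigma) convolved with a left-handed exponential tail of
    decay constant tau (tau > 0), plus a constant background. Written with
    erfcx to avoid overflow of exp(...) * erfc(...)."""
    z = ((E - mu) / sigma + sigma / tau) / np.sqrt(2.0)
    return area / (2.0 * tau) * erfcx(z) * np.exp(-((E - mu) ** 2) / (2.0 * sigma ** 2)) + bkg

# Example fit to a measured spectrum (channel energies E_ch, counts):
# p0 = [counts.sum(), E_ch[np.argmax(counts)], 5.0, 10.0, 0.0]
# popt, pcov = curve_fit(alpha_peak, E_ch, counts, p0=p0)
```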

  11. An alternating minimization method for blind deconvolution from Poisson data

    International Nuclear Information System (INIS)

    Prato, Marco; La Camera, Andrea; Bonettini, Silvia

    2014-01-01

    Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system has to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters
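
    For orientation, the sketch below shows a generic alternating scheme for Poisson blind deconvolution: Richardson-Lucy-type multiplicative updates applied alternately to the object and to the PSF, with the PSF renormalised to unit sum. This is a textbook stand-in, not the inexact alternating algorithm with feasible-set constraints analysed in the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def blind_rl(image, n_outer=20, n_inner=5, psf_sigma0=3.0):
    """Alternate Richardson-Lucy updates of object and PSF; image, object and
    PSF share the same (preferably odd-sized) shape so that the 'same'-mode
    convolutions stay aligned."""
    eps = 1e-12
    obj = np.full(image.shape, float(image.mean()))
    yy, xx = np.indices(image.shape)
    cy, cx = (np.asarray(image.shape) - 1) / 2.0
    psf = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * psf_sigma0 ** 2))
    psf /= psf.sum()
    for _ in range(n_outer):
        for _ in range(n_inner):                              # object step
            est = fftconvolve(obj, psf, mode="same") + eps
            obj *= fftconvolve(image / est, psf[::-1, ::-1], mode="same")
        for _ in range(n_inner):                              # PSF step
            est = fftconvolve(obj, psf, mode="same") + eps
            psf *= fftconvolve(image / est, obj[::-1, ::-1], mode="same")
            psf /= psf.sum()
    return obj, psf
```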

  12. Primary variables influencing generation of earthquake motions by a deconvolution process

    International Nuclear Information System (INIS)

    Idriss, I.M.; Akky, M.R.

    1979-01-01

    In many engineering problems, the analysis of the potential earthquake response of a soil deposit, a soil structure or a soil-foundation-structure system requires knowledge of the earthquake ground motions at some depth below the level at which the motions are recorded, specified, or estimated. A process by which such motions are commonly calculated is termed a deconvolution process. This paper presents the results of a parametric study which was conducted to examine the accuracy, convergence, and stability of a frequently used deconvolution process and the significant parameters that may influence the output of this process. Parameters studied included: soil profile characteristics, input motion characteristics, level of input motion, and frequency cut-off. (orig.)

  13. Deconvolution of continuous paleomagnetic data from pass-through magnetometer: A new algorithm to restore geomagnetic and environmental information based on realistic optimization

    Science.gov (United States)

    Oda, Hirokuni; Xuan, Chuang

    2014-10-01

    The development of pass-through superconducting rock magnetometers (SRM) has greatly promoted collection of paleomagnetic data from continuous long-core samples. The output of pass-through measurement is smoothed and distorted due to convolution of magnetization with the magnetometer sensor response. Although several studies could restore high-resolution paleomagnetic signal through deconvolution of pass-through measurement, difficulties in accurately measuring the magnetometer sensor response have hindered the application of deconvolution. We acquired a reliable sensor response of an SRM at Oregon State University based on repeated measurements of a precisely fabricated magnetic point source. In addition, we present an improved deconvolution algorithm based on Akaike's Bayesian Information Criterion (ABIC) minimization, incorporating new parameters to account for errors in sample measurement position and length. The new algorithm was tested using synthetic data constructed by convolving "true" paleomagnetic signal containing an "excursion" with the sensor response. Realistic noise was added to the synthetic measurement using a Monte Carlo method based on the measurement noise distribution acquired from 200 repeated measurements of a u-channel sample. Deconvolution of 1000 synthetic measurements with realistic noise closely resembles the "true" magnetization, and successfully restored fine-scale magnetization variations including the "excursion." Our analyses show that inaccuracy in sample measurement position and length significantly affects deconvolution estimation, and can be resolved using the new deconvolution algorithm. Optimized deconvolution of 20 repeated measurements of a u-channel sample yielded highly consistent deconvolution results and estimates of error in sample measurement position and length, demonstrating the reliability of the new deconvolution algorithm for real pass-through measurements.

  14. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    Science.gov (United States)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers around the hierarchical data format 5 (HDF5) file format for storing datasets, which significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.

  15. XDGMM: eXtreme Deconvolution Gaussian Mixture Modeling

    Science.gov (United States)

    Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.

    2017-08-01

    XDGMM uses Gaussian mixtures to perform density estimation of noisy, heterogeneous, and incomplete data using extreme deconvolution (XD) algorithms, and is compatible with the scikit-learn machine learning methods. It implements both the astroML and Bovy et al. (2011) algorithms, and extends the BaseEstimator class from scikit-learn so that cross-validation methods work. It allows the user to produce a conditioned model if values of some parameters are known.

  16. Stain Deconvolution Using Statistical Analysis of Multi-Resolution Stain Colour Representation.

    Directory of Open Access Journals (Sweden)

    Najah Alsubaie

    Full Text Available Stain colour estimation is a prominent factor in the analysis pipeline of most histology image processing algorithms. Providing a reliable and efficient stain colour deconvolution approach is fundamental for a robust algorithm. In this paper, we propose a novel method for stain colour deconvolution of histology images. This approach statistically analyses the multi-resolution representation of the image to separate the independent observations from the correlated ones. We then estimate the stain mixing matrix using the filtered uncorrelated data. We conducted an extensive set of experiments to compare the proposed method to recent state-of-the-art methods and demonstrate the robustness of this approach using three different datasets of scanned slides, prepared in different labs using different scanners.

  17. Retinal image restoration by means of blind deconvolution

    Czech Academy of Sciences Publication Activity Database

    Marrugo, A.; Šorel, Michal; Šroubek, Filip; Millan, M.

    2011-01-01

    Roč. 16, č. 11 (2011), 116016-1-116016-11 ISSN 1083-3668 R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind deconvolution * image restoration * retinal image * deblurring Subject RIV: JD - Computer Applications, Robotics Impact factor: 3.157, year: 2011 http://library.utia.cas.cz/separaty/2011/ZOI/sorel-0366061.pdf

  18. Thermoluminescence glow-curve deconvolution functions for mixed order of kinetics and continuous trap distribution

    International Nuclear Information System (INIS)

    Kitis, G.; Gomez-Ros, J.M.

    2000-01-01

    New glow-curve deconvolution functions are proposed for mixed order of kinetics and for continuous-trap distribution. The only free parameters of the presented glow-curve deconvolution functions are the maximum peak intensity (I_m) and the maximum peak temperature (T_m), which can be estimated experimentally together with the activation energy (E). The other free parameter is the activation energy range (ΔE) for the case of the continuous-trap distribution or a constant α for the case of mixed-order kinetics.

  19. Improvement in volume estimation from confocal sections after image deconvolution

    Czech Academy of Sciences Publication Activity Database

    Difato, Francesco; Mazzone, F.; Scaglione, S.; Fato, M.; Beltrame, F.; Kubínová, Lucie; Janáček, Jiří; Ramoino, P.; Vicidomini, G.; Diaspro, A.

    2004-01-01

    Roč. 64, č. 2 (2004), s. 151-155 ISSN 1059-910X Institutional research plan: CEZ:AV0Z5011922 Keywords : confocal microscopy * image deconvolution * point spread function Subject RIV: EA - Cell Biology Impact factor: 2.609, year: 2004

  20. Deconvolution of EPR spectral lines with an approximate method

    International Nuclear Information System (INIS)

    Jimenez D, H.; Cabral P, A.

    1990-10-01

    A recently reported approximate expression for the deconvolution of Lorentzian-Gaussian spectral lines with a small Gaussian contribution is applied to study an EPR line shape. The potassium-ammonium solution line reported in the literature by other authors was used, and the results are compared with those obtained by employing a precise method. (Author)

  1. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    International Nuclear Information System (INIS)

    Looe, H.K.; Uphoff, Y.; Poppe, B.; Carl von Ossietzky Univ., Oldenburg; Harder, D.; Willborn, K.C.

    2012-01-01

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are shortly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)
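
    The role of the convolution kernel K(x,y) can be illustrated with a minimal frequency-domain deconvolution sketch. This is not the clinical algorithm of the paper: the kernel is assumed to be known, centred, and padded to the image size, and a single Wiener-type regularization constant stands in for a properly tuned filter.

    import numpy as np

    def fourier_deconvolve(image, kernel, reg=1e-3):
        """Divide out the blurring kernel in Fourier space, with a small
        regularization term to avoid amplifying noise where the kernel's
        transfer function is close to zero."""
        K = np.fft.fft2(np.fft.ifftshift(kernel))   # kernel assumed centred
        I = np.fft.fft2(image)
        restored = np.fft.ifft2(I * np.conj(K) / (np.abs(K) ** 2 + reg))
        return np.real(restored)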

  2. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    Energy Technology Data Exchange (ETDEWEB)

    Looe, H.K.; Uphoff, Y.; Poppe, B. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy; Carl von Ossietzky Univ., Oldenburg (Germany). WG Medical Radiation Physics; Harder, D. [Georg August Univ., Goettingen (Germany). Medical Physics and Biophysics; Willborn, K.C. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy

    2012-02-15

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are shortly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)

  3. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)

  4. Deconvolution of Complex 1D NMR Spectra Using Objective Model Selection.

    Directory of Open Access Journals (Sweden)

    Travis S Hughes

    Full Text Available Fluorine (19F) NMR has emerged as a useful tool for characterization of slow dynamics in 19F-labeled proteins. One-dimensional (1D) 19F NMR spectra of proteins can be broad, irregular and complex, due to exchange of probe nuclei between distinct electrostatic environments, and therefore cannot be deconvoluted and analyzed in an objective way using currently available software. We have developed a Python-based deconvolution program, decon1d, which uses Bayesian information criteria (BIC) to objectively determine which model (number of peaks) would most likely produce the experimentally obtained data. The method also allows for fitting of intermediate exchange spectra, which is not supported by current software in the absence of a specific kinetic model. In current methods, determination of the deconvolution model best supported by the data is done manually through comparison of residual error values, which can be time consuming and requires model selection by the user. In contrast, the BIC method used by decon1d provides a quantitative method for model comparison that penalizes for model complexity, helping to prevent over-fitting of the data and allowing identification of the most parsimonious model. The decon1d program is freely available as a downloadable Python script at the project website (https://github.com/hughests/decon1d/).
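
    The BIC-based model selection idea can be sketched as follows (this is not the decon1d implementation itself): fit the spectrum with an increasing number of Lorentzian peaks and keep the model with the lowest Bayesian information criterion. The peak shape, the crude initial guesses, and the Gaussian-residual form of the BIC are all illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian_sum(x, *params):
        """Sum of Lorentzian peaks; params = (amplitude, centre, width) triples."""
        y = np.zeros_like(x, dtype=float)
        for a, c, w in zip(params[0::3], params[1::3], params[2::3]):
            y += a * w ** 2 / ((x - c) ** 2 + w ** 2)
        return y

    def bic_for_n_peaks(x, y, n_peaks):
        """Fit n_peaks Lorentzians and return the Bayesian information
        criterion; the lowest BIC marks the most parsimonious model."""
        p0 = []
        for i in range(n_peaks):
            p0 += [y.max(), x[np.argmax(y)] + i, (x[-1] - x[0]) / 10.0]
        popt, _ = curve_fit(lorentzian_sum, x, y, p0=p0, maxfev=20000)
        resid = y - lorentzian_sum(x, *popt)
        n, k = len(y), 3 * n_peaks
        return n * np.log(np.mean(resid ** 2)) + k * np.log(n)

    # preferred model: min(range(1, 6), key=lambda m: bic_for_n_peaks(x, y, m))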

  5. Retinal image restoration by means of blind deconvolution

    Science.gov (United States)

    Marrugo, Andrés G.; Šorel, Michal; Šroubek, Filip; Millán, María S.

    2011-11-01

    Retinal imaging plays a key role in the diagnosis and management of ophthalmologic disorders, such as diabetic retinopathy, glaucoma, and age-related macular degeneration. Because of the acquisition process, retinal images often suffer from blurring and uneven illumination. This problem may seriously affect disease diagnosis and progression assessment. Here we present a method for color retinal image restoration by means of multichannel blind deconvolution. The method is applied to a pair of retinal images acquired within a lapse of time, ranging from several minutes to months. It consists of a series of preprocessing steps to adjust the images so they comply with the considered degradation model, followed by the estimation of the point-spread function and, ultimately, image deconvolution. The preprocessing is mainly composed of image registration, uneven illumination compensation, and segmentation of areas with structural changes. In addition, we have developed a procedure for the detection and visualization of structural changes. This enables the identification of subtle developments in the retina not caused by variation in illumination or blur. The method was tested on synthetic and real images. Encouraging experimental results show that the method is capable of significant restoration of degraded retinal images.

  6. Bayesian Semiparametric Density Deconvolution in the Presence of Conditionally Heteroscedastic Measurement Errors

    KAUST Repository

    Sarkar, Abhra

    2014-10-02

    We consider the problem of estimating the density of a random variable when precise measurements on the variable are not available, but replicated proxies contaminated with measurement error are available for sufficiently many subjects. Under the assumption of additive measurement errors this reduces to a problem of deconvolution of densities. Deconvolution methods often make restrictive and unrealistic assumptions about the density of interest and the distribution of measurement errors, e.g., normality and homoscedasticity and thus independence from the variable of interest. This article relaxes these assumptions and introduces novel Bayesian semiparametric methodology based on Dirichlet process mixture models for robust deconvolution of densities in the presence of conditionally heteroscedastic measurement errors. In particular, the models can adapt to asymmetry, heavy tails and multimodality. In simulation experiments, we show that our methods vastly outperform a recent Bayesian approach based on estimating the densities via mixtures of splines. We apply our methods to data from nutritional epidemiology. Even in the special case when the measurement errors are homoscedastic, our methodology is novel and dominates other methods that have been proposed previously. Additional simulation results, instructions on getting access to the data set and R programs implementing our methods are included as part of online supplemental materials.

  7. Bayesian Semiparametric Density Deconvolution in the Presence of Conditionally Heteroscedastic Measurement Errors

    KAUST Repository

    Sarkar, Abhra; Mallick, Bani K.; Staudenmayer, John; Pati, Debdeep; Carroll, Raymond J.

    2014-01-01

    We consider the problem of estimating the density of a random variable when precise measurements on the variable are not available, but replicated proxies contaminated with measurement error are available for sufficiently many subjects. Under the assumption of additive measurement errors this reduces to a problem of deconvolution of densities. Deconvolution methods often make restrictive and unrealistic assumptions about the density of interest and the distribution of measurement errors, e.g., normality and homoscedasticity and thus independence from the variable of interest. This article relaxes these assumptions and introduces novel Bayesian semiparametric methodology based on Dirichlet process mixture models for robust deconvolution of densities in the presence of conditionally heteroscedastic measurement errors. In particular, the models can adapt to asymmetry, heavy tails and multimodality. In simulation experiments, we show that our methods vastly outperform a recent Bayesian approach based on estimating the densities via mixtures of splines. We apply our methods to data from nutritional epidemiology. Even in the special case when the measurement errors are homoscedastic, our methodology is novel and dominates other methods that have been proposed previously. Additional simulation results, instructions on getting access to the data set and R programs implementing our methods are included as part of online supplemental materials.

  8. Sparse spectral deconvolution algorithm for noncartesian MR spectroscopic imaging.

    Science.gov (United States)

    Bhave, Sampada; Eslami, Ramin; Jacob, Mathews

    2014-02-01

    The purpose of this work is to minimize line shape distortions and spectral leakage artifacts in MR spectroscopic imaging (MRSI). A spatially and spectrally regularized non-Cartesian MRSI algorithm that uses the line shape distortion priors, estimated from water reference data, to deconvolve the spectra is introduced. Sparse spectral regularization is used to minimize noise amplification associated with deconvolution. A spiral MRSI sequence that heavily oversamples the central k-space regions is used to acquire the MRSI data. The spatial regularization term uses the spatial supports of brain and extracranial fat regions to recover the metabolite spectra and nuisance signals at two different resolutions. Specifically, the nuisance signals are recovered at the maximum resolution to minimize spectral leakage, while the point spread functions of metabolites are controlled to obtain acceptable signal-to-noise ratio. The comparison of the algorithm against Tikhonov regularized reconstructions demonstrates considerably reduced line-shape distortions and improved metabolite maps. The proposed sparsity constrained spectral deconvolution scheme is effective in minimizing the line-shape distortions. The dual resolution reconstruction scheme is capable of minimizing spectral leakage artifacts. Copyright © 2013 Wiley Periodicals, Inc.

  9. Blind image deconvolution methods and convergence

    CERN Document Server

    Chaudhuri, Subhasis; Rameshan, Renu

    2014-01-01

    Blind deconvolution is a classical image processing problem which has been investigated by a large number of researchers over the last four decades. The purpose of this monograph is not to propose yet another method for blind image restoration. Rather the basic issue of deconvolvability has been explored from a theoretical view point. Some authors claim very good results while quite a few claim that blind restoration does not work. The authors clearly detail when such methods are expected to work and when they will not. In order to avoid the assumptions needed for convergence analysis in the

  10. Understanding AuNP interaction with low-generation PAMAM dendrimers: a CIELab and deconvolution study

    International Nuclear Information System (INIS)

    Jimenez-Ruiz, A.; Carnerero, J. M.; Castillo, P. M.; Prado-Gotor, R.

    2017-01-01

    Low-generation polyamidoamine (PAMAM) dendrimers are known to adsorb on the surface of gold nanoparticles (AuNPs), causing aggregation and color changes. In this paper, a thorough study of this affinity using absorption spectroscopy, colorimetric, and emission methods has been carried out. Results show that, for citrate-capped gold nanoparticles, interaction with the dendrimer is not only of an electrostatic character but instead occurs, at least in part, through the dendrimer’s uncharged internal amino groups. The possibilities of the CIELab chromaticity system parameters’ evolution have also been explored in order to quantify dendrimer interaction with the red-colored nanoparticles. By measuring and quantifying 17 nm citrate-capped AuNP color changes, which are strongly dependent on their aggregation state, binding free energies are obtained for the first time for these systems. Results are confirmed via an alternate fitting method which makes use of deconvolution parameters from absorbance spectra. Binding free energies obtained with both methods are in good agreement with each other.

  11. Understanding AuNP interaction with low-generation PAMAM dendrimers: a CIELab and deconvolution study

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez-Ruiz, A., E-mail: ailjimrui@alum.us.es; Carnerero, J. M.; Castillo, P. M.; Prado-Gotor, R., E-mail: pradogotor@us.es [University of Seville, The Department of Physical Chemistry (Spain)

    2017-01-15

    Low-generation polyamidoamine (PAMAM) dendrimers are known to adsorb on the surface of gold nanoparticles (AuNPs), causing aggregation and color changes. In this paper, a thorough study of this affinity using absorption spectroscopy, colorimetric, and emission methods has been carried out. Results show that, for citrate-capped gold nanoparticles, interaction with the dendrimer is not only of an electrostatic character but instead occurs, at least in part, through the dendrimer’s uncharged internal amino groups. The possibilities of the CIELab chromaticity system parameters’ evolution have also been explored in order to quantify dendrimer interaction with the red-colored nanoparticles. By measuring and quantifying 17 nm citrate-capped AuNP color changes, which are strongly dependent on their aggregation state, binding free energies are obtained for the first time for these systems. Results are confirmed via an alternate fitting method which makes use of deconvolution parameters from absorbance spectra. Binding free energies obtained with both methods are in good agreement with each other.

  12. Robust Multichannel Blind Deconvolution via Fast Alternating Minimization

    Czech Academy of Sciences Publication Activity Database

    Šroubek, Filip; Milanfar, P.

    2012-01-01

    Roč. 21, č. 4 (2012), s. 1687-1700 ISSN 1057-7149 R&D Projects: GA MŠk 1M0572; GA ČR GAP103/11/1552; GA MV VG20102013064 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind deconvolution * augmented Lagrangian * sparse representation Subject RIV: JD - Computer Applications, Robotics Impact factor: 3.199, year: 2012 http://library.utia.cas.cz/separaty/2012/ZOI/sroubek-0376080.pdf

  13. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    Science.gov (United States)

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547(1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors for correction of line shape distortions. These occur due to differences between the water reference and the metabolite distributions.
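
    The water reference deconvolution step itself reduces to a simple time-domain ratio, sketched below under the assumption of a Lorentzian (exponentially decaying) ideal reference; the pipeline described in the paper wraps this in spatial processing and parametric spectral analysis, and, as noted above, the lineshape part of the correction can be error-prone.

    import numpy as np

    def water_reference_deconvolution(metab_fid, water_fid, ideal_lw_hz, dwell_s):
        """Divide the metabolite FID by the measured water FID and multiply
        by an ideal reference envelope, so that lineshape and frequency
        distortions common to both signals cancel (after Morris, 1988)."""
        t = np.arange(len(water_fid)) * dwell_s
        ideal = np.exp(-np.pi * ideal_lw_hz * t)      # ideal Lorentzian decay
        return metab_fid * ideal / (water_fid + 1e-12)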

  14. Deconvolution of Doppler-broadened positron annihilation lineshapes by fast Fourier transformation using a simple automatic filtering technique

    International Nuclear Information System (INIS)

    Britton, D.T.; Bentvelsen, P.; Vries, J. de; Veen, A. van

    1988-01-01

    A deconvolution scheme for digital lineshapes using fast Fourier transforms and a filter based on background subtraction in Fourier space has been developed. In tests on synthetic data this has been shown to give optimum deconvolution without prior inspection of the Fourier spectrum. Although offering significant improvements on the raw data, deconvolution is shown to be limited. The contribution of the resolution function is substantially reduced but not eliminated completely and unphysical oscillations are introduced into the lineshape. The method is further tested on measurements of the lineshape for positron annihilation in single crystal copper at the relatively poor resolution of 1.7 keV at 512 keV. A two-component fit is possible yielding component widths in agreement with previous measurements. (orig.)
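
    A crude stand-in for this kind of filtered Fourier deconvolution is sketched below. The threshold used here (a fixed fraction of the peak Fourier amplitude) only loosely imitates the background-subtraction filter of the paper, and no attempt is made to suppress the residual oscillations the authors mention.

    import numpy as np

    def fft_deconvolve_lineshape(measured, resolution, noise_frac=0.01):
        """Deconvolve a 1D lineshape by division in Fourier space, zeroing
        Fourier components whose amplitude falls below a background level."""
        M = np.fft.fft(measured)
        R = np.fft.fft(np.fft.ifftshift(resolution))  # resolution assumed centred
        keep = np.abs(M) > noise_frac * np.abs(M).max()
        D = np.where(keep, M / (R + 1e-15), 0.0)
        return np.real(np.fft.ifft(D))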

  15. Chemometric deconvolution of gas chromatographic unresolved conjugated linoleic acid isomers triplet in milk samples.

    Science.gov (United States)

    Blasko, Jaroslav; Kubinec, Róbert; Ostrovský, Ivan; Pavlíková, Eva; Krupcík, Ján; Soják, Ladislav

    2009-04-03

    A generally known problem of the GC separation of the trans-7,cis-9; cis-9,trans-11; and trans-8,cis-10 CLA (conjugated linoleic acid) isomers was studied by GC-MS on a 100 m capillary column coated with a cyanopropyl silicone phase at isothermal column temperatures in the range of 140-170 degrees C. The resolution of these CLA isomers obtained at the given conditions was not high enough for direct quantitative analysis, but it was, however, sufficient for the determination of their peak areas by commercial deconvolution software. Resolution factors of the overlapped CLA isomers, determined by the separation of a model CLA mixture prepared by mixing a commercial CLA mixture and a CLA isomer fraction obtained by HPLC semi-preparative separation of milk fatty acid methyl esters, were used to validate the deconvolution procedure. The developed deconvolution procedure allowed the determination of the content of the studied CLA isomers in ewes' and cows' milk samples, where the dominant isomer cis-9,trans-11 elutes between the two small isomers trans-7,cis-9 and trans-8,cis-10 (in ratios up to 1:100).

  16. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification By Spectral Deconvolution Ratio Analysis

    Directory of Open Access Journals (Sweden)

    Fausto Carnevale Neto

    2016-09-01

    Full Text Available Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY with Automated Mass Spectral Deconvolution and Identification System software (AMDIS. Plants species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential and economical value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC-peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS to peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach as an improved dereplication method for complex biological samples such as plant extracts.

  17. Inter-source seismic interferometry by multidimensional deconvolution (MDD) for borehole sources

    NARCIS (Netherlands)

    Liu, Y.; Wapenaar, C.P.A.; Romdhane, A.

    2014-01-01

    Seismic interferometry (SI) is usually implemented by crosscorrelation (CC) to retrieve the impulse response between pairs of receiver positions. An alternative approach by multidimensional deconvolution (MDD) has been developed and shown in various studies the potential to suppress artifacts due to

  18. Blind Deconvolution of Anisoplanatic Images Collected by a Partially Coherent Imaging System

    National Research Council Canada - National Science Library

    MacDonald, Adam

    2004-01-01

    ... have limited emissivity or reflectivity. This research proposes a novel blind deconvolution algorithm that is based on a maximum a posteriori Bayesian estimator constructed upon a physically based statistical model for the intensity...

  19. A new deconvolution approach to robust fluence for intensity modulation under geometrical uncertainty

    Science.gov (United States)

    Zhang, Pengcheng; De Crevoisier, Renaud; Simon, Antoine; Haigron, Pascal; Coatrieux, Jean-Louis; Li, Baosheng; Shu, Huazhong

    2013-09-01

    This work addresses random geometrical uncertainties that are intrinsically observed in radiation therapy by means of a new deconvolution method combining a series expansion and a Butterworth filter. The method efficiently suppresses high-frequency components by discarding the higher order terms of the series expansion and then filtering out deviations on the field edges. An additional approximation is made in order to set the fluence values outside the field to zero in the robust profiles. This method is compared to the deconvolution kernel method for a regular 2D fluence map, a real intensity-modulated radiation therapy field, and a prostate case. The results show that accuracy is improved while fulfilling clinical planning requirements.

  20. A new deconvolution approach to robust fluence for intensity modulation under geometrical uncertainty

    International Nuclear Information System (INIS)

    Zhang Pengcheng; Coatrieux, Jean-Louis; Shu Huazhong; De Crevoisier, Renaud; Simon, Antoine; Haigron, Pascal; Li Baosheng

    2013-01-01

    This work addresses random geometrical uncertainties that are intrinsically observed in radiation therapy by means of a new deconvolution method combining a series expansion and a Butterworth filter. The method efficiently suppresses high-frequency components by discarding the higher order terms of the series expansion and then filtering out deviations on the field edges. An additional approximation is made in order to set the fluence values outside the field to zero in the robust profiles. This method is compared to the deconvolution kernel method for a regular 2D fluence map, a real intensity-modulated radiation therapy field, and a prostate case. The results show that accuracy is improved while fulfilling clinical planning requirements. (paper)

  1. Resolution improvement of ultrasonic echography methods in non destructive testing by adaptative deconvolution

    International Nuclear Information System (INIS)

    Vivet, L.

    1989-01-01

    Ultrasonic echography has many advantages that make it attractive for nondestructive testing. However, the high acoustic energy needed to penetrate highly attenuating materials can only be delivered by resonant transducers, which limits the resolution of the measured echograms. This resolution can be improved by deconvolution, but such methods are problematic for austenitic steel. A time-domain deconvolution method is developed here that takes the characteristics of the propagating wave into account: a first step of phase correction is followed by a second step of spectral equalization that restores the spectral content of the ideal reflectivity. Both steps use fast Kalman filters, which reduce the computational cost of the method.

  2. Constrained variable projection method for blind deconvolution

    International Nuclear Information System (INIS)

    Cornelio, A; Piccolomini, E Loli; Nagy, J G

    2012-01-01

    This paper is focused on the solution of the blind deconvolution problem, here modeled as a separable nonlinear least squares problem. The well known ill-posedness, both on recovering the blurring operator and the true image, makes the problem really difficult to handle. We show that, by imposing appropriate constraints on the variables and with well chosen regularization parameters, it is possible to obtain an objective function that is fairly well behaved. Hence, the resulting nonlinear minimization problem can be effectively solved by classical methods, such as the Gauss-Newton algorithm.

  3. Direct imaging of phase objects enables conventional deconvolution in bright field light microscopy.

    Directory of Open Access Journals (Sweden)

    Carmen Noemí Hernández Candia

    Full Text Available In transmitted optical microscopy, absorption structure and phase structure of the specimen determine the three-dimensional intensity distribution of the image. The elementary impulse responses of the bright field microscope therefore consist of separate absorptive and phase components, precluding general application of linear, conventional deconvolution processing methods to improve image contrast and resolution. However, conventional deconvolution can be applied in the case of pure phase (or pure absorptive objects if the corresponding phase (or absorptive impulse responses of the microscope are known. In this work, we present direct measurements of the phase point- and line-spread functions of a high-aperture microscope operating in transmitted bright field. Polystyrene nanoparticles and microtubules (biological polymer filaments serve as the pure phase point and line objects, respectively, that are imaged with high contrast and low noise using standard microscopy plus digital image processing. Our experimental results agree with a proposed model for the response functions, and confirm previous theoretical predictions. Finally, we use the measured phase point-spread function to apply conventional deconvolution on the bright field images of living, unstained bacteria, resulting in improved definition of cell boundaries and sub-cellular features. These developments demonstrate practical application of standard restoration methods to improve imaging of phase objects such as cells in transmitted light microscopy.

  4. Data-driven haemodynamic response function extraction using Fourier-wavelet regularised deconvolution

    NARCIS (Netherlands)

    Wink, Alle Meije; Hoogduin, Hans; Roerdink, Jos B.T.M.

    2008-01-01

    Background: We present a simple, data-driven method to extract haemodynamic response functions (HRF) from functional magnetic resonance imaging (fMRI) time series, based on the Fourier-wavelet regularised deconvolution (ForWaRD) technique. HRF data are required for many fMRI applications, such as

  5. Data-driven haemodynamic response function extraction using Fourier-wavelet regularised deconvolution

    NARCIS (Netherlands)

    Wink, Alle Meije; Hoogduin, Hans; Roerdink, Jos B.T.M.

    2010-01-01

    Background: We present a simple, data-driven method to extract haemodynamic response functions (HRF) from functional magnetic resonance imaging (fMRI) time series, based on the Fourier-wavelet regularised deconvolution (ForWaRD) technique. HRF data are required for many fMRI applications, such as

  6. The Small-scale Structure of Photospheric Convection Retrieved by a Deconvolution Technique Applied to Hinode /SP Data

    Energy Technology Data Exchange (ETDEWEB)

    Oba, T. [SOKENDAI (The Graduate University for Advanced Studies), 3-1-1 Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252–5210 (Japan); Riethmüller, T. L.; Solanki, S. K. [Max-Planck-Institut für Sonnensystemforschung (MPS), Justus-von-Liebig-Weg 3, D-37077 Göttingen (Germany); Iida, Y. [Department of Science and Technology/Kwansei Gakuin University, Gakuen 2-1, Sanda, Hyogo, 669–1337 Japan (Japan); Quintero Noda, C.; Shimizu, T. [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252–5210 (Japan)

    2017-11-01

    Solar granules are bright patterns surrounded by dark channels, called intergranular lanes, in the solar photosphere and are a manifestation of overshooting convection. Observational studies generally find stronger upflows in granules and weaker downflows in intergranular lanes. This trend is, however, inconsistent with the results of numerical simulations in which downflows are stronger than upflows through the joint action of gravitational acceleration/deceleration and pressure gradients. One cause of this discrepancy is the image degradation caused by optical distortion and light diffraction and scattering that takes place in an imaging instrument. We apply a deconvolution technique to Hinode /SP data in an attempt to recover the original solar scene. Our results show a significant enhancement in both the convective upflows and downflows but particularly for the latter. After deconvolution, the up- and downflows reach maximum amplitudes of −3.0 km s{sup −1} and +3.0 km s{sup −1} at an average geometrical height of roughly 50 km, respectively. We found that the velocity distributions after deconvolution match those derived from numerical simulations. After deconvolution, the net LOS velocity averaged over the whole field of view lies close to zero as expected in a rough sense from mass balance.

  7. A soft double regularization approach to parametric blind image deconvolution.

    Science.gov (United States)

    Chen, Li; Yap, Kim-Hui

    2005-05-01

    This paper proposes a blind image deconvolution scheme based on soft integration of parametric blur structures. Conventional blind image deconvolution methods encounter a difficult dilemma of either imposing stringent and inflexible preconditions on the problem formulation or experiencing poor restoration results due to lack of information. This paper attempts to address this issue by assessing the relevance of parametric blur information, and incorporating the knowledge into the parametric double regularization (PDR) scheme. The PDR method assumes that the actual blur satisfies up to a certain degree of parametric structure, as there are many well-known parametric blurs in practical applications. Further, it can be tailored flexibly to include other blur types if some prior parametric knowledge of the blur is available. A manifold soft parametric modeling technique is proposed to generate the blur manifolds, and estimate the fuzzy blur structure. The PDR scheme involves the development of the meaningful cost function, the estimation of blur support and structure, and the optimization of the cost function. Experimental results show that it is effective in restoring degraded images under different environments.

  8. Isotope pattern deconvolution as a tool to study iron metabolism in plants.

    Science.gov (United States)

    Rodríguez-Castrillón, José Angel; Moldovan, Mariella; García Alonso, J Ignacio; Lucena, Juan José; García-Tomé, Maria Luisa; Hernández-Apaolaza, Lourdes

    2008-01-01

    Isotope pattern deconvolution is a mathematical technique for isolating distinct isotope signatures from mixtures of natural abundance and enriched tracers. In iron metabolism studies measurement of all four isotopes of the element by high-resolution multicollector or collision cell ICP-MS allows the determination of the tracer/tracee ratio with simultaneous internal mass bias correction and lower uncertainties. This technique was applied here for the first time to study iron uptake by cucumber plants using 57Fe-enriched iron chelates of the o,o and o,p isomers of ethylenediaminedi(o-hydroxyphenylacetic) acid (EDDHA) and ethylenediamine tetraacetic acid (EDTA). Samples of root, stem, leaves, and xylem sap, after exposure of the cucumber plants to the mentioned 57Fe chelates, were collected, dried, and digested using nitric acid. The isotopic composition of iron in the samples was measured by ICP-MS using a high-resolution multicollector instrument. Mass bias correction was computed using both a natural abundance iron standard and by internal correction using isotope pattern deconvolution. It was observed that, for plants with low 57Fe enrichment, isotope pattern deconvolution provided lower tracer/tracee ratio uncertainties than the traditional method applying external mass bias correction. The total amount of the element in the plants was determined by isotope dilution analysis, using a collision cell quadrupole ICP-MS instrument, after addition of 57Fe or natural abundance Fe in a known amount which depended on the isotopic composition of the sample.
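
    At its core, isotope pattern deconvolution expresses the measured isotope abundances as a linear combination of the natural and tracer patterns and solves for the molar fractions by least squares. The sketch below uses the natural iron isotope abundances together with a made-up 57Fe spike pattern and a made-up sample measurement (placeholders only), and omits the internal mass-bias correction described in the paper.

    import numpy as np

    # columns: 54Fe, 56Fe, 57Fe, 58Fe
    natural = np.array([0.05845, 0.91754, 0.02119, 0.00282])   # natural iron abundances
    tracer = np.array([0.0002, 0.0021, 0.9965, 0.0012])        # hypothetical 57Fe spike
    measured = np.array([0.0541, 0.8493, 0.0940, 0.0026])      # hypothetical sample

    A = np.column_stack([natural, tracer])
    fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)   # [x_natural, x_tracer]
    print("tracer/tracee ratio ~", fractions[1] / fractions[0])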

  9. Isotope pattern deconvolution as a tool to study iron metabolism in plants

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Castrillon, Jose A.; Moldovan, Mariella; Garcia Alonso, J.I. [University of Oviedo, Department of Physical and Analytical Chemistry, Oviedo (Spain); Lucena, Juan J.; Garcia-Tome, Maria L.; Hernandez-Apaolaza, Lourdes [Autonoma University of Madrid, Department of Agricultural Chemistry, Madrid (Spain)

    2008-01-15

    Isotope pattern deconvolution is a mathematical technique for isolating distinct isotope signatures from mixtures of natural abundance and enriched tracers. In iron metabolism studies measurement of all four isotopes of the element by high-resolution multicollector or collision cell ICP-MS allows the determination of the tracer/tracee ratio with simultaneous internal mass bias correction and lower uncertainties. This technique was applied here for the first time to study iron uptake by cucumber plants using {sup 57}Fe-enriched iron chelates of the o,o and o,p isomers of ethylenediaminedi(o-hydroxyphenylacetic) acid (EDDHA) and ethylenediamine tetraacetic acid (EDTA). Samples of root, stem, leaves, and xylem sap, after exposure of the cucumber plants to the mentioned {sup 57}Fe chelates, were collected, dried, and digested using nitric acid. The isotopic composition of iron in the samples was measured by ICP-MS using a high-resolution multicollector instrument. Mass bias correction was computed using both a natural abundance iron standard and by internal correction using isotope pattern deconvolution. It was observed that, for plants with low {sup 57}Fe enrichment, isotope pattern deconvolution provided lower tracer/tracee ratio uncertainties than the traditional method applying external mass bias correction. The total amount of the element in the plants was determined by isotope dilution analysis, using a collision cell quadrupole ICP-MS instrument, after addition of {sup 57}Fe or natural abundance Fe in a known amount which depended on the isotopic composition of the sample. (orig.)

  10. Resolution enhancement for ultrasonic echographic technique in non destructive testing with an adaptive deconvolution method

    International Nuclear Information System (INIS)

    Vivet, L.

    1989-01-01

    The ultrasonic echographic technique has specific advantages which make it essential in a lot of Non Destructive Testing (NDT) investigations. However, the high acoustic power necessary to propagate through highly attenuating media can only be transmitted by resonant transducers, which induces severe limitations of the resolution on the received echograms. This resolution may be improved with deconvolution methods. But one-dimensional deconvolution methods come up against problems in non destructive testing when the investigated medium is highly anisotropic and inhomogeneous (e.g. austenitic steel). Numerous deconvolution techniques are well documented in the NDT literature. But they often come from other application fields (biomedical engineering, geophysics) and we show they do not apply well to specific NDT problems: frequency-dependent attenuation and non-minimum phase of the emitted wavelet. We therefore introduce a new time-domain approach which takes into account the wavelet features. Our method solves the deconvolution problem as an estimation one and is performed in two steps: (i) A phase correction step which takes into account the phase of the wavelet and estimates a phase-corrected echogram. The phase of the wavelet is only due to the transducer and is assumed time-invariant during the propagation. (ii) A band equalization step which restores the spectral content of the ideal reflectivity. The two steps of the method are performed using fast Kalman filters which allow a significant reduction of the computational effort. Synthetic and actual results are given to prove that this is a good approach for resolution improvement in attenuating media [fr

  11. A feasibility study for the application of seismic interferometry by multidimensional deconvolution for lithospheric-scale imaging

    Science.gov (United States)

    Ruigrok, Elmer; van der Neut, Joost; Djikpesse, Hugues; Chen, Chin-Wu; Wapenaar, Kees

    2010-05-01

    Active-source surveys are widely used for the delineation of hydrocarbon accumulations. Most source and receiver configurations are designed to illuminate the first 5 km of the earth. For a deep understanding of the evolution of the crust, much larger depths need to be illuminated. The use of large-scale active surveys is feasible, but rather costly. As an alternative, we use passive acquisition configurations, aiming at detecting responses from distant earthquakes, in combination with seismic interferometry (SI). SI refers to the principle of generating new seismic responses by combining seismic observations at different receiver locations. We apply SI to the earthquake responses to obtain responses as if there were a source at each receiver position in the receiver array. These responses are subsequently migrated to obtain an image of the lithosphere. Conventionally, SI is applied by a crosscorrelation of responses. Recently, an alternative implementation was proposed as SI by multidimensional deconvolution (MDD) (Wapenaar et al. 2008). SI by MDD compensates for both the source-sampling and the source-wavelet irregularities. Another advantage is that the MDD relation also holds for media with severe anelastic losses. A severe restriction, though, for the implementation of MDD was the need to estimate responses without free-surface interaction from the earthquake responses. To mitigate this restriction, Groenestijn and Verschuur (2009) proposed to introduce the incident wavefield as an additional unknown in the inversion process. As an alternative solution, van der Neut et al. (2010) showed that the required wavefield separation may be implemented after a crosscorrelation step. These last two approaches facilitate the application of MDD for lithospheric-scale imaging. In this work, we study the feasibility of implementing MDD when considering teleseismic wavefields. We address specific problems for teleseismic wavefields, such as long and complicated source

  12. Noise Quantification with Beamforming Deconvolution: Effects of Regularization and Boundary Conditions

    DEFF Research Database (Denmark)

    Lylloff, Oliver Ackermann; Fernandez Grande, Efren

    Delay-and-sum (DAS) beamforming can be described as a linear convolution of an unknown sound source distribution and the microphone array response to a point source, i.e., point-spread function. Deconvolution tries to compensate for the influence of the array response and reveal the true source...

  13. A comparison of deconvolution and the Rutland-Patlak plot in parenchymal renal uptake rate.

    Science.gov (United States)

    Al-Shakhrah, Issa A

    2012-07-01

    Deconvolution and the Rutland-Patlak (R-P) plot are two of the most commonly used methods for analyzing dynamic radionuclide renography. Both methods allow estimation of absolute and relative renal uptake of radiopharmaceutical and of its rate of transit through the kidney. Seventeen patients (32 kidneys) were referred for further evaluation by renal scanning. All patients were positioned supine with their backs to the scintillation gamma camera, so that the kidneys and the heart are both in the field of view. Approximately 5-7 mCi of (99m)Tc-DTPA (diethylenetriamine penta-acetic acid) in about 0.5 ml of saline is injected intravenously and sequential 20 s frames were acquired; the study on each patient lasts for approximately 20 min. The time-activity curves of the parenchymal region of interest of each kidney, as well as the heart, were obtained for analysis. The data were then analyzed with deconvolution and the R-P plot. A strong positive association (n = 32; r = 0.83; R^2 = 0.68) was found between the values obtained by applying the two methods. Bland-Altman statistical analysis demonstrated that ninety-seven percent of the values in the study (31 of 32 cases) were within the limits of agreement (mean ± 1.96 standard deviation). We believe that the R-P analysis method is expected to be more reproducible than the iterative deconvolution method, because the deconvolution technique (the iterative method) relies heavily on the accuracy of the first point analyzed, as any errors are carried forward into the calculations of all the subsequent points, whereas the R-P technique is based on an initial analysis of the data by means of the R-P plot, and it can be considered an alternative technique to find and calculate the renal uptake rate.
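
    The Rutland-Patlak part of this comparison reduces to a straight-line fit: plotting kidney/heart activity against the cumulative integral of the heart curve divided by the heart curve, with the slope of the early linear portion giving the parenchymal uptake rate. A minimal sketch follows; the frame indices marking the linear portion are placeholders and would normally be chosen from roughly the 1-3 min post-injection window.

    import numpy as np

    def rutland_patlak_uptake(kidney, heart, dt, fit=slice(3, 9)):
        """Estimate the uptake rate as the slope of kidney/heart versus
        cumulative-integral-of-heart/heart; 'kidney' and 'heart' are
        background-corrected time-activity curves sampled every dt seconds."""
        kidney = np.asarray(kidney, dtype=float)
        heart = np.asarray(heart, dtype=float)
        y = kidney / heart
        x = np.cumsum(heart) * dt / heart
        slope, intercept = np.polyfit(x[fit], y[fit], 1)
        return slope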

  14. Streaming Multiframe Deconvolutions on GPUs

    Science.gov (United States)

    Lee, M. A.; Budavári, T.

    2015-09-01

    Atmospheric turbulence distorts all ground-based observations, which is especially detrimental to faint detections. The point spread function (PSF) defining this blur is unknown for each exposure and varies significantly over time, making image analysis difficult. Lucky imaging and traditional co-adding throw away much of the information. We developed blind deconvolution algorithms that can simultaneously obtain robust solutions for the background image and all the PSFs. This is done in a streaming setting, which makes it practical for large numbers of big images. We implemented a new tool that runs on GPUs and achieves exceptional running times that can scale to the new time-domain surveys. Our code can quickly and effectively recover high-resolution images exceeding the quality of traditional co-adds. We demonstrate the power of the method on the repeated exposures in the Sloan Digital Sky Survey's Stripe 82.

  15. Gabor Deconvolution as Preliminary Method to Reduce Pitfall in Deeper Target Seismic Data

    Science.gov (United States)

    Oktariena, M.; Triyoso, W.

    2018-03-01

    Anelastic attenuation during seismic wave propagation is what makes seismic data non-stationary: absorption and scattering of energy cause a progressive loss of seismic energy with increasing depth. A series of thin reservoir layers found in the study area is located within the Talang Akar Fm. level, showing an indication of an interpretation pitfall due to the attenuation effect that commonly occurs in deeper-level seismic data. The attenuation effect greatly influences the seismic images of the deeper target level, creating pitfalls in several respects. The seismic amplitude at the deeper target level often cannot represent the real subsurface character because of low amplitude values or chaotic events near the Basement. In terms of frequency, the decay appears as diminishing frequency content in the deeper target. Meanwhile, seismic amplitude is a simple tool for pointing out a Direct Hydrocarbon Indicator (DHI) in a preliminary geophysical study before further advanced interpretation methods are applied. A quick look at the post-stack seismic data shows the reservoir associated with a bright-spot DHI, while another, bigger bright-spot body is detected in the northeast area near the field edge. A horizon slice confirms the possibility that the other bright-spot zone has a smaller delineation; an interpretation pitfall that commonly occurs at deeper levels of seismic data. We evaluate this pitfall by applying Gabor Deconvolution to address the attenuation problem. Gabor Deconvolution forms a partition of unity to factorize the trace into smaller convolution windows that can be processed as stationary packets. It estimates both the magnitude of the source signature and its attenuation function. The enhanced seismic data show better imaging in the pitfall area that was previously detected as a vast bright-spot zone. When the enhanced seismic data are used for further advanced reprocessing, the Seismic Impedance and Vp/Vs Ratio slices show a better reservoir delineation, in which the

  16. A fast Fourier transform program for the deconvolution of IN10 data

    International Nuclear Information System (INIS)

    Howells, W.S.

    1981-04-01

    A deconvolution program based on the Fast Fourier Transform technique is described and some examples are presented to help users run the programs and interpret the results. Instructions are given for running the program on the RAL IBM 360/195 computer. (author)

  17. A study of the real-time deconvolution of digitized waveforms with pulse pile up for digital radiation spectroscopy

    International Nuclear Information System (INIS)

    Guo Weijun; Gardner, Robin P.; Mayo, Charles W.

    2005-01-01

    Two new real-time approaches have been developed and compared to the least-squares fit approach for the deconvolution of experimental waveforms with pile-up pulses. The single pulse shape chosen is typical for scintillators such as LSO and NaI(Tl). Simulated waveforms with pulse pile up were also generated and deconvolved to compare these three different approaches under cases where the single pulse component has a constant shape and the digitization error dominates. The effects of temporal separation and amplitude ratio between pile-up component pulses were also investigated and statistical tests were applied to quantify the consistency of deconvolution results for each case. Monte Carlo simulation demonstrated that applications of these pile-up deconvolution techniques to radiation spectroscopy are effective in extending the counting-rate range while preserving energy resolution for scintillation detectors
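
    As a simple illustration of what deconvolving pile-up pulses involves when the single-pulse shape is known and constant, the sketch below strips pulses one at a time from a digitized waveform. It is a naive stand-in for the real-time and least-squares approaches compared in the paper and ignores noise, baseline shifts, and amplitude-dependent shape changes.

    import numpy as np

    def strip_pileup(waveform, single_pulse, max_pulses=10, threshold=0.05):
        """Repeatedly locate the largest remaining excursion, scale the known
        single-pulse shape to it, and subtract; returns (start index,
        amplitude) pairs sorted by arrival order."""
        residual = np.asarray(waveform, dtype=float).copy()
        shape = np.asarray(single_pulse, dtype=float)
        apex = int(np.argmax(shape))
        found = []
        for _ in range(max_pulses):
            i = int(np.argmax(residual))
            if residual[i] < threshold * float(np.max(waveform)):
                break
            amp = residual[i] / shape[apex]
            start = i - apex
            lo, hi = max(start, 0), min(start + len(shape), len(residual))
            residual[lo:hi] -= amp * shape[lo - start:hi - start]
            found.append((start, amp))
        return sorted(found)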

  18. Time-domain full waveform inversion of exponentially damped wavefield using the deconvolution-based objective function

    KAUST Repository

    Choi, Yun Seok

    2017-11-15

    Full waveform inversion (FWI) suffers from the cycle-skipping problem when the available frequency-band of data is not low enough. We apply an exponential damping to the data to generate artificial low frequencies, which helps FWI avoid cycle skipping. In this case, the least-square misfit function does not properly deal with the exponentially damped wavefield in FWI, because the amplitude of traces decays almost exponentially with increasing offset in a damped wavefield. Thus, we use a deconvolution-based objective function for FWI of the exponentially damped wavefield. The deconvolution filter includes inherently a normalization between the modeled and observed data, thus it can address the unbalanced amplitude of a damped wavefield. We, specifically, normalize the modeled data with the observed data in the frequency-domain to estimate the deconvolution filter and selectively choose a frequency-band for normalization that mainly includes the artificial low frequencies. We calculate the gradient of the objective function using the adjoint-state method. The synthetic and benchmark data examples show that our FWI algorithm generates a convergent long wavelength structure without low frequency information in the recorded data.
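
    The frequency-domain normalization described here can be sketched, trace by trace, as estimating a filter that maps the observed trace onto the modeled one within a chosen low-frequency band. The band indices and stabilization constant below are placeholders, and the adjoint-state gradient computation is not shown.

    import numpy as np

    def deconvolution_filter(modeled, observed, band=slice(1, 20), eps=1e-8):
        """Estimate the per-trace deconvolution filter by normalizing the
        modeled trace with the observed trace in the frequency domain,
        restricted to a low-frequency band of the damped wavefield."""
        M = np.fft.rfft(modeled)
        O = np.fft.rfft(observed)
        F = np.zeros_like(M)
        F[band] = M[band] * np.conj(O[band]) / (np.abs(O[band]) ** 2 + eps)
        return np.fft.irfft(F, n=len(modeled))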

  19. Time-domain full waveform inversion of exponentially damped wavefield using the deconvolution-based objective function

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2017-01-01

    Full waveform inversion (FWI) suffers from the cycle-skipping problem when the available frequency-band of data is not low enough. We apply an exponential damping to the data to generate artificial low frequencies, which helps FWI avoid cycle skipping. In this case, the least-square misfit function does not properly deal with the exponentially damped wavefield in FWI, because the amplitude of traces decays almost exponentially with increasing offset in a damped wavefield. Thus, we use a deconvolution-based objective function for FWI of the exponentially damped wavefield. The deconvolution filter includes inherently a normalization between the modeled and observed data, thus it can address the unbalanced amplitude of a damped wavefield. We, specifically, normalize the modeled data with the observed data in the frequency-domain to estimate the deconvolution filter and selectively choose a frequency-band for normalization that mainly includes the artificial low frequencies. We calculate the gradient of the objective function using the adjoint-state method. The synthetic and benchmark data examples show that our FWI algorithm generates a convergent long wavelength structure without low frequency information in the recorded data.

  20. Quantitative interpretation of nuclear logging data by adopting point-by-point spectrum striping deconvolution technology

    International Nuclear Information System (INIS)

    Tang Bin; Liu Ling; Zhou Shumin; Zhou Rongsheng

    2006-01-01

    The paper discusses gamma-ray spectrum interpretation technology for nuclear logging. The principles of the familiar quantitative interpretation methods, including the average content method and the traditional spectrum striping method, are introduced, and their limitations in determining the contents of radioactive elements in unsaturated ledges (where the radioactive elements are distributed unevenly) are presented. On the basis of the intensity gamma-logging quantitative interpretation technology using the deconvolution method, a new quantitative interpretation method for separating radioactive elements is presented for interpreting gamma spectrum logging. This point-by-point spectrum striping deconvolution technique provides a quantitative interpretation of the logging data. (authors)

  1. Nonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation

    DEFF Research Database (Denmark)

    Schmidt, Mikkel N.; Mørup, Morten

    2006-01-01

    We present a novel method for blind separation of instruments in polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. Using a model which is convolutive in both time and frequency we factorize a spectrogram representation of music into components corresponding...

  2. A MAP blind image deconvolution algorithm with bandwidth over-constrained

    Science.gov (United States)

    Ren, Zhilei; Liu, Jin; Liang, Yonghui; He, Yulong

    2018-03-01

    We demonstrate a maximum a posteriori (MAP) blind image deconvolution algorithm with a bandwidth over-constraint and total variation (TV) regularization to recover a clear image from AO-corrected images. The point spread functions (PSFs) are estimated with their bandwidth limited to less than the cutoff frequency of the optical system. Our algorithm performs well in avoiding noise magnification. The performance is demonstrated on simulated data.

  3. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: III. Convolution and deconvolution.

    Science.gov (United States)

    Langenbucher, Frieder

    2003-11-01

    Convolution and deconvolution are the classical in-vitro-in-vivo correlation tools for describing the relationship between input and weighting/response in a linear system, where the input represents the drug release in vitro and the weighting/response any body response in vivo. While a functional treatment, e.g. in terms of a polyexponential or Weibull distribution, is more appropriate for general survey or prediction, numerical algorithms are useful for treating actual experimental data. Deconvolution is not considered an algorithm in its own right, but the inversion of a corresponding convolution. MS Excel is shown to be a useful tool for all these applications.
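
    To make the numerical side concrete, here is a small Python sketch (not the Excel workbook described above) of discrete convolution and deconvolution on a uniform time grid: convolution of an input rate with a unit-impulse response is a lower-triangular linear system, so deconvolution reduces to a triangular solve. The exponential input and weighting functions and the time step are illustrative.

        import numpy as np
        from scipy.linalg import solve_triangular

        def convolve_discrete(input_rate, unit_response, dt):
            n = len(input_rate)
            # Lower-triangular convolution matrix built from the unit impulse response.
            C = np.array([[unit_response[i - j] if i >= j else 0.0 for j in range(n)]
                          for i in range(n)]) * dt
            return C @ input_rate, C

        dt = 0.5                                       # h
        t = np.arange(0, 12, dt)
        weighting = np.exp(-0.3 * t)                   # in vivo unit impulse response
        release = 0.2 * np.exp(-0.2 * t)               # in vitro input (release rate)

        response, C = convolve_discrete(release, weighting, dt)
        recovered = solve_triangular(C, response, lower=True)   # deconvolution
        print(np.allclose(recovered, release))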

  4. The measurement of layer thickness by the deconvolution of ultrasonic signals

    International Nuclear Information System (INIS)

    McIntyre, P.J.

    1977-07-01

    An ultrasonic technique for measuring layer thickness, such as oxide on corroded steel, is described. A time-domain response function is extracted from an ultrasonic signal reflected from the layered system. This signal is the convolution of the input signal with the response function of the layer. By using a signal reflected from a non-layered surface to represent the input, the response function may be obtained by deconvolution. The advantage of this technique over that described by Haines and Bel (1975) is that the quality of the results obtained with their method depends on the ability of a skilled operator to line up an arbitrary common feature of the received signals. Using deconvolution, no operator manipulation is necessary, so less highly trained personnel may successfully make the measurements. Results are presented for layers of araldite on aluminium and magnetite on steel. The results agreed satisfactorily with predictions but, in the case of magnetite, its high velocity of sound meant that thicknesses of less than 250 microns were difficult to measure accurately. (author)

  5. Computerized glow curve deconvolution of thermoluminescent emission from polyminerals of Jamaica Mexican flower

    Science.gov (United States)

    Favalli, A.; Furetta, C.; Zaragoza, E. Cruz; Reyes, A.

    The aim of this work is to study the main thermoluminescence (TL) characteristics of the inorganic polyminerals extracted from dehydrated Jamaica flower or roselle (Hibiscus sabdariffa L.), belonging to the Malvaceae family, of Mexican origin. The TL emission properties of the polymineral fraction in powder form were studied using the initial rise (IR) method. The complex structure and kinetic parameters of the glow curves have been analysed accurately using computerized glow curve deconvolution (CGCD), assuming an exponential distribution of trapping levels. The extension of the IR method to the case of a continuous, exponential distribution of traps is reported, together with the derivation of the TL glow curve deconvolution functions for a continuous trap distribution. CGCD is performed both for a temperature-independent frequency factor s and for s as a function of temperature.

  6. Sparse deconvolution for the large-scale ill-posed inverse problem of impact force reconstruction

    Science.gov (United States)

    Qiao, Baijie; Zhang, Xingwu; Gao, Jiawei; Liu, Ruonan; Chen, Xuefeng

    2017-01-01

    Most previous regularization methods for solving the inverse problem of force reconstruction minimize the l2-norm of the desired force. However, traditional regularization methods such as Tikhonov regularization and truncated singular value decomposition commonly fail to solve large-scale ill-posed inverse problems at moderate computational cost. In this paper, taking into account the sparse characteristic of impact force, the idea of sparse deconvolution is first introduced to the field of impact force reconstruction and a general sparse deconvolution model of impact force is constructed. Second, a novel impact force reconstruction method based on the primal-dual interior point method (PDIPM) is proposed to solve such a large-scale sparse deconvolution model, where minimizing the l2-norm is replaced by minimizing the l1-norm. Meanwhile, the preconditioned conjugate gradient algorithm is used to compute the search direction of PDIPM with high computational efficiency. Finally, two experiments, including small-scale or medium-scale single impact force reconstruction and relatively large-scale consecutive impact force reconstruction, are conducted on a composite wind turbine blade and a shell structure to illustrate the advantage of PDIPM. Compared with Tikhonov regularization, PDIPM is more efficient, accurate and robust, whether in single impact force reconstruction or in consecutive impact force reconstruction.
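
    The following Python sketch illustrates the l1-regularized (sparse) deconvolution model described above, but solves it with plain iterative soft thresholding (ISTA) rather than the paper's primal-dual interior point method, and on a small synthetic problem rather than a wind-turbine blade. The impulse response, regularization weight, iteration count and detection threshold are illustrative choices.

        import numpy as np

        def ista_deconvolve(H, y, lam=0.05, n_iter=500):
            # Solve min_f 0.5*||H f - y||^2 + lam*||f||_1 by iterative soft thresholding.
            L = np.linalg.norm(H, 2) ** 2            # Lipschitz constant of the smooth gradient
            f = np.zeros(H.shape[1])
            for _ in range(n_iter):
                grad = H.T @ (H @ f - y)
                z = f - grad / L
                f = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
            return f

        rng = np.random.default_rng(0)
        n = 200
        h = np.exp(-np.arange(50) / 10.0) * np.sin(np.arange(50) / 3.0)  # impulse response
        H = np.array([[h[i - j] if 0 <= i - j < len(h) else 0.0 for j in range(n)]
                      for i in range(n)])
        f_true = np.zeros(n); f_true[[40, 120]] = [1.0, 0.7]             # sparse impact forces
        y = H @ f_true + 0.01 * rng.standard_normal(n)
        f_hat = ista_deconvolve(H, y)
        print(np.flatnonzero(np.abs(f_hat) > 0.3))                       # indices near 40 and 120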

  7. Novel response function resolves by image deconvolution more details of surface nanomorphology

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    2010-01-01

    and to imaging by in situ STM of electrocrystallization of copper on gold in electrolytes containing copper sulfate and sulfuric acid. It is suggested that the observed peaks of the recorded image do not represent atoms, but the atomic structure may be recovered by image deconvolution followed by calibration...

  8. Ex vivo lung perfusion in clinical lung transplantation--state of the art.

    Science.gov (United States)

    Andreasson, Anders S I; Dark, John H; Fisher, Andrew J

    2014-11-01

    Ex vivo lung perfusion (EVLP) has emerged as a new technique for assessing and potentially reconditioning human donor lungs previously unacceptable for clinical transplantation with the potential to dramatically push the limits of organ acceptability. With the recent introduction of portable EVLP, a new era in lung preservation may be upon us with the opportunity to also limit organ ischaemic times and potentially improve the outcome of donor lungs already deemed acceptable for transplantation. It took over half a century for the technique to evolve from basic theory to semi-automated circuits fit for clinical use that are now rapidly being adopted in transplant centres across the globe. With this field in constant evolution and many unanswered questions remaining, our review serves as an update on the state of the art of EVLP in clinical lung transplantation. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  9. Seeing deconvolution of globular clusters in M31

    International Nuclear Information System (INIS)

    Bendinelli, O.; Zavatti, F.; Parmeggiani, G.; Djorgovski, S.

    1990-01-01

    The morphology of six M31 globular clusters is examined using seeing-deconvolved CCD images. The deconvolution techniques developed by Bendinelli (1989) are reviewed and applied to the M31 globular clusters to demonstrate the methodology. It is found that the effective resolution limit of the method is about 0.1-0.3 arcsec for CCD images obtained in FWHM = 1 arcsec seeing, and sampling of 0.3 arcsec/pixel. Also, the robustness of the method is discussed. The implications of the technique for future studies using data from the Hubble Space Telescope are considered. 68 refs

  10. Deconvolution in the presence of noise using the Maximum Entropy Principle

    International Nuclear Information System (INIS)

    Steenstrup, S.

    1984-01-01

    The main problem in deconvolution in the presence of noise is the nonuniqueness. This problem is overcome by the application of the Maximum Entropy Principle. The way the noise enters in the formulation of the problem is examined in some detail and the final equations are derived such that the necessary assumptions becomes explicit. Examples using X-ray diffraction data are shown. (orig.)
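
    For readers unfamiliar with the formulation, the following is a schematic statement of maximum-entropy deconvolution consistent with the description above (the notation is illustrative and not taken from the paper): the restored profile f maximizes the entropy S relative to a prior model m while keeping the misfit to the noisy data d = r * f + noise within the expected chi-squared level.

        S(f) = -\sum_i f_i \ln\!\frac{f_i}{m_i}, \qquad
        \chi^2(f) = \sum_k \frac{\bigl[d_k - (r * f)_k\bigr]^2}{\sigma_k^2} \le N, \qquad
        \hat f = \arg\max_{f \ge 0} \bigl\{ S(f) - \lambda\, \chi^2(f) \bigr\}.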

  11. A new deconvolution method applied to ultrasonic images; Etude d'une methode de deconvolution adaptee aux images ultrasonores

    Energy Technology Data Exchange (ETDEWEB)

    Sallard, J

    1999-07-01

    This dissertation presents the development of a new method for the restoration of ultrasonic signals. Our goal is to remove the perturbations induced by the ultrasonic probe and to help characterize defects due to a strong local discontinuity of the acoustic impedance. The point of view adopted consists in taking the physical properties into account in the signal processing in order to develop an algorithm which gives good results even on experimental data. The received ultrasonic signal is modeled as a convolution between a function that represents the waveform emitted by the transducer and a function that is loosely called the 'defect impulse response'. It is established that, in numerous cases, the ultrasonic signal can be expressed as a sum of weighted, phase-shifted replicas of a reference signal. Deconvolution is an ill-posed problem, and a priori information must be taken into account to solve it. The a priori information translates the physical properties of the ultrasonic signals. The defect impulse response is modeled as a double Bernoulli-Gaussian sequence. Deconvolution then becomes the problem of detecting the optimal Bernoulli sequence and estimating the associated complex amplitudes. The optimal parameters of the sequence are those which maximize a likelihood function. We develop a new estimation procedure based on an optimization process. An adapted initialization procedure and an iterative algorithm make it possible to process a huge number of data quickly. Many experimental ultrasonic data sets that reflect usual inspection configurations have been processed, and the results demonstrate the robustness of the method. Our algorithm not only removes the waveform emitted by the transducer but also estimates the phase. This parameter is useful for defect characterization. Finally, the algorithm makes data interpretation easier by concentrating the information, so automatic characterization should be possible in the future. (author)

  12. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Full Text Available Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a quality similar to that of parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  13. Deconvolutions based on singular value decomposition and the pseudoinverse: a guide for beginners.

    Science.gov (United States)

    Hendler, R W; Shrager, R I

    1994-01-01

    Singular value decomposition (SVD) is deeply rooted in the theory of linear algebra, and because of this is not readily understood by a large group of researchers who could profit from its application. In this paper, we discuss the subject on a level that should be understandable to scientists who are not well versed in linear algebra. However, because it is necessary that certain key concepts in linear algebra be appreciated in order to comprehend what is accomplished by SVD, we present the section, 'Bare basics of linear algebra'. This is followed by a discussion of the theory of SVD. Next we present step-by-step examples to illustrate how SVD is applied to deconvolute a titration involving a mixture of three pH indicators. One noiseless case is presented as well as two cases where either a fixed or varying noise level is present. Finally, we discuss additional deconvolutions of mixed spectra based on the use of the pseudoinverse.
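
    In the same spirit as the tutorial, the sketch below deconvolutes synthetic mixed spectra with the SVD-based pseudoinverse: mixtures are modeled as D = C S and the concentration matrix is recovered as D times the pseudoinverse of S. The three Gaussian "component spectra" and mixing fractions are invented for illustration and are not the pH-indicator data of the paper.

        import numpy as np

        wl = np.linspace(400, 700, 150)                        # wavelength axis, nm
        def gauss(mu, sig):
            return np.exp(-0.5 * ((wl - mu) / sig) ** 2)

        S = np.vstack([gauss(450, 20), gauss(550, 25), gauss(650, 15)])   # 3 component spectra
        C_true = np.array([[0.7, 0.2, 0.1],
                           [0.3, 0.5, 0.2],
                           [0.1, 0.3, 0.6]])                   # mixtures x components
        D = C_true @ S + 0.01 * np.random.default_rng(2).standard_normal((3, 150))

        # Pseudoinverse via SVD (np.linalg.pinv truncates tiny singular values).
        C_est = D @ np.linalg.pinv(S)
        print(np.round(C_est, 2))                              # close to C_true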

  14. The origin and evolution of the surfactant system in fish: insights into the evolution of lungs and swim bladders.

    Science.gov (United States)

    Daniels, Christopher B; Orgeig, Sandra; Sullivan, Lucy C; Ling, Nicholas; Bennett, Michael B; Schürch, Samuel; Val, Adalberto Luis; Brauner, Colin J

    2004-01-01

    Several times throughout their radiation fish have evolved either lungs or swim bladders as gas-holding structures. Lungs and swim bladders have different ontogenetic origins and can be used either for buoyancy or as an accessory respiratory organ. Therefore, the presence of air-filled bladders or lungs in different groups of fishes is an example of convergent evolution. We propose that air breathing could not occur without the presence of a surfactant system and suggest that this system may have originated in epithelial cells lining the pharynx. Here we present new data on the surfactant system in swim bladders of three teleost fish (the air-breathing pirarucu Arapaima gigas and tarpon Megalops cyprinoides and the non-air-breathing New Zealand snapper Pagrus auratus). We determined the presence of surfactant using biochemical, biophysical, and morphological analyses and determined homology using immunohistochemical analysis of the surfactant proteins (SPs). We relate the presence and structure of the surfactant system to those previously described in the swim bladders of another teleost, the goldfish, and those of the air-breathing organs of the other members of the Osteichthyes, the more primitive air-breathing Actinopterygii and the Sarcopterygii. Snapper and tarpon swim bladders are lined with squamous and cuboidal epithelial cells, respectively, containing membrane-bound lamellar bodies. Phosphatidylcholine dominates the phospholipid (PL) profile of lavage material from all fish analyzed to date. The presence of the characteristic surfactant lipids in pirarucu and tarpon, lamellar bodies in tarpon and snapper, SP-B in tarpon and pirarucu lavage, and SPs (A, B, and D) in swim bladder tissue of the tarpon provide strong evidence that the surfactant system of teleosts is homologous with that of other fish and of tetrapods. This study is the first demonstration of the presence of SP-D in the air-breathing organs of nonmammalian species and SP-B in actinopterygian

  15. TLD-100 glow-curve deconvolution for the evaluation of the thermal stress and radiation damage effects

    CERN Document Server

    Sabini, M G; Cuttone, G; Guasti, A; Mazzocchi, S; Raffaele, L

    2002-01-01

    In this work, the dose response of TLD-100 dosimeters has been studied in a 62 MeV clinical proton beam. The signal versus dose curve has been compared with the one measured in a 60Co beam. Different experiments have been performed in order to observe the effects of thermal stress and radiation damage on the detector sensitivity. A LET dependence of the TL response has been observed. In order to obtain a physical interpretation of these effects, a computerised glow-curve deconvolution has been employed. The results of all the performed experiments and deconvolutions are reported extensively, and the possible fields of application of TLD-100 in clinical proton dosimetry are discussed.

  16. Deconvolution of the density of states of tip and sample through constant-current tunneling spectroscopy

    Directory of Open Access Journals (Sweden)

    Holger Pfeifer

    2011-09-01

    Full Text Available We introduce a scheme to obtain the deconvolved density of states (DOS) of the tip and sample from scanning tunneling spectra determined in the constant-current mode (z–V spectroscopy). The scheme is based on the validity of the Wentzel–Kramers–Brillouin (WKB) approximation and the trapezoidal approximation of the electron potential within the tunneling barrier. In a numerical treatment of z–V spectroscopy, we first analyze how the position and amplitude of characteristic DOS features change depending on parameters such as the energy position, width, barrier height, and the tip–sample separation. Then it is shown that the deconvolution scheme is capable of recovering the original DOS of tip and sample with an accuracy of better than 97% within the one-dimensional WKB approximation. Application of the deconvolution scheme to experimental data obtained on Nb(110) reveals a convergent behavior, providing separately the DOS of both sample and tip. In detail, however, there are systematic quantitative deviations between the DOS results based on z–V data and those based on I–V data. This points to an inconsistency between the assumed and the actual transmission probability function. Indeed, the experimentally determined differential barrier height still clearly deviates from that derived from the deconvolved DOS. Thus, the present progress in developing a reliable deconvolution scheme shifts the focus towards how to access the actual transmission probability function.

  17. Application of Fourier-wavelet regularized deconvolution for improving image quality of free space propagation x-ray phase contrast imaging.

    Science.gov (United States)

    Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin

    2012-11-21

    New x-ray phase contrast imaging techniques without using synchrotron radiation confront a common problem from the negative effects of finite source size and limited spatial resolution. These negative effects swamp the fine phase contrast fringes and make them almost undetectable. In order to alleviate this problem, deconvolution procedures should be applied to the blurred x-ray phase contrast images. In this study, three different deconvolution techniques, including Wiener filtering, Tikhonov regularization and Fourier-wavelet regularized deconvolution (ForWaRD), were applied to the simulated and experimental free space propagation x-ray phase contrast images of simple geometric phantoms. These algorithms were evaluated in terms of phase contrast improvement and signal-to-noise ratio. The results demonstrate that the ForWaRD algorithm is most appropriate for phase contrast image restoration among above-mentioned methods; it can effectively restore the lost information of phase contrast fringes while reduce the amplified noise during Fourier regularization.
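
    Of the three restoration methods compared above, the Wiener filter is the simplest to write down; the sketch below applies it to a one-dimensional synthetic blur. The Gaussian PSF, the noise-to-signal ratio and the test object are illustrative, and the wavelet stage of ForWaRD is not reproduced here.

        import numpy as np

        def wiener_deconvolve(blurred, psf, nsr=0.01):
            # nsr: assumed noise-to-signal power ratio (acts as regularization).
            H = np.fft.rfft(psf, len(blurred))
            Y = np.fft.rfft(blurred)
            G = np.conj(H) / (np.abs(H) ** 2 + nsr)        # Wiener filter
            return np.fft.irfft(Y * G, len(blurred))

        n = 512
        x = np.zeros(n); x[200:210] = 1.0                  # sharp feature standing in for a fringe
        g = np.exp(-0.5 * ((np.arange(n) - n // 2) / 4.0) ** 2)
        psf = np.fft.ifftshift(g / g.sum())                # zero-centered blur kernel
        blurred = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(psf), n)
        restored = wiener_deconvolve(blurred, psf)
        print(round(float(restored.max()), 2))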

  18. Assessment of perfusion by dynamic contrast-enhanced imaging using a deconvolution approach based on regression and singular value decomposition.

    Science.gov (United States)

    Koh, T S; Wu, X Y; Cheong, L H; Lim, C C T

    2004-12-01

    The assessment of tissue perfusion by dynamic contrast-enhanced (DCE) imaging involves a deconvolution process. For analysis of DCE imaging data, we implemented a regression approach to select appropriate regularization parameters for deconvolution using the standard and generalized singular value decomposition methods. Monte Carlo simulation experiments were carried out to study the performance and to compare with other existing methods used for deconvolution analysis of DCE imaging data. The present approach is found to be robust and reliable at the levels of noise commonly encountered in DCE imaging, and for different models of the underlying tissue vasculature. The advantages of the present method, as compared with previous methods, include its efficiency of computation, ability to achieve adequate regularization to reproduce less noisy solutions, and that it does not require prior knowledge of the noise condition. The proposed method is applied on actual patient study cases with brain tumors and ischemic stroke, to illustrate its applicability as a clinical tool for diagnosis and assessment of treatment response.

  19. The impact of technology on the changing practice of lung SBRT

    NARCIS (Netherlands)

    M.C. Aznar (Marianne C.); Warren, S. (Samantha); M.S. Hoogeman (Mischa); M. Josipovic (Mirjana)

    2018-01-01

    textabstractStereotactic body radiotherapy (SBRT) for lung tumours has been gaining wide acceptance in lung cancer. Here, we review the technological evolution of SBRT delivery in lung cancer, from the first treatments using the stereotactic body frame in the 1990's to modern developments in image

  20. Partial volume effect correction in PET using regularized iterative deconvolution with variance control based on local topology

    International Nuclear Information System (INIS)

    Kirov, A S; Schmidtlein, C R; Piao, J Z

    2008-01-01

    Correcting positron emission tomography (PET) images for the partial volume effect (PVE) due to the limited resolution of PET has been a long-standing challenge. Various approaches including incorporation of the system response function in the reconstruction have been previously tested. We present a post-reconstruction PVE correction based on iterative deconvolution using a 3D maximum likelihood expectation-maximization (MLEM) algorithm. To achieve convergence we used a one step late (OSL) regularization procedure based on the assumption of local monotonic behavior of the PET signal following Alenius et al. This technique was further modified to selectively control variance depending on the local topology of the PET image. No prior 'anatomic' information is needed in this approach. An estimate of the noise properties of the image is used instead. The procedure was tested for symmetric and isotropic deconvolution functions with Gaussian shape and full width at half-maximum (FWHM) ranging from 6.31 mm to infinity. The method was applied to simulated and experimental scans of the NEMA NU 2 image quality phantom with the GE Discovery LS PET/CT scanner. The phantom contained uniform activity spheres with diameters ranging from 1 cm to 3.7 cm within uniform background. The optimal sphere activity to variance ratio was obtained when the deconvolution function was replaced by a step function few voxels wide. In this case, the deconvolution method converged in ∼3-5 iterations for most points on both the simulated and experimental images. For the 1 cm diameter sphere, the contrast recovery improved from 12% to 36% in the simulated and from 21% to 55% in the experimental data. Recovery coefficients between 80% and 120% were obtained for all larger spheres, except for the 13 mm diameter sphere in the simulated scan (68%). No increase in variance was observed except for a few voxels neighboring strong activity gradients and inside the largest spheres. Testing the method for
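
    For orientation, the sketch below shows the basic multiplicative MLEM (Richardson-Lucy) deconvolution update that the correction above builds on, in one dimension and without the one-step-late regularization or topology-dependent variance control described in the record. The kernel width, test object and iteration count are illustrative.

        import numpy as np

        def mlem_deconvolve(image, psf, n_iter=30):
            psf = psf / psf.sum()
            psf_flip = psf[::-1]
            est = np.full_like(image, image.mean())
            for _ in range(n_iter):
                blurred = np.convolve(est, psf, mode="same")
                ratio = image / np.maximum(blurred, 1e-12)
                est *= np.convolve(ratio, psf_flip, mode="same")   # multiplicative MLEM update
            return est

        x = np.zeros(128); x[40:50] = 1.0; x[80:83] = 2.0          # "hot" structures
        psf = np.exp(-0.5 * (np.arange(-15, 16) / 3.0) ** 2)
        blurred = np.convolve(x, psf / psf.sum(), mode="same")
        recovered = mlem_deconvolve(blurred, psf)
        print(round(float(recovered.max()), 2))                    # contrast recovery after deconvolution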

  1. Motion correction of PET brain images through deconvolution: I. Theoretical development and analysis in software simulations

    Science.gov (United States)

    Faber, T. L.; Raghunath, N.; Tudorascu, D.; Votaw, J. R.

    2009-02-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. Existing correction methods that use known patient motion obtained from tracking devices either require multi-frame acquisitions, detailed knowledge of the scanner, or specialized reconstruction algorithms. A deconvolution algorithm has been developed that alleviates these drawbacks by using the reconstructed image to estimate the original non-blurred image using maximum likelihood estimation maximization (MLEM) techniques. A high-resolution digital phantom was created by shape-based interpolation of the digital Hoffman brain phantom. Three different sets of 20 movements were applied to the phantom. For each frame of the motion, sinograms with attenuation and three levels of noise were simulated and then reconstructed using filtered backprojection. The average of the 20 frames was considered the motion blurred image, which was restored with the deconvolution algorithm. After correction, contrast increased from a mean of 2.0, 1.8 and 1.4 in the motion blurred images, for the three increasing amounts of movement, to a mean of 2.5, 2.4 and 2.2. Mean error was reduced by an average of 55% with motion correction. In conclusion, deconvolution can be used for correction of motion blur when subject motion is known.

  2. Motion correction of PET brain images through deconvolution: I. Theoretical development and analysis in software simulations

    Energy Technology Data Exchange (ETDEWEB)

    Faber, T L; Raghunath, N; Tudorascu, D; Votaw, J R [Department of Radiology, Emory University Hospital, 1364 Clifton Road, N.E. Atlanta, GA 30322 (United States)], E-mail: tfaber@emory.edu

    2009-02-07

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. Existing correction methods that use known patient motion obtained from tracking devices either require multi-frame acquisitions, detailed knowledge of the scanner, or specialized reconstruction algorithms. A deconvolution algorithm has been developed that alleviates these drawbacks by using the reconstructed image to estimate the original non-blurred image using maximum likelihood estimation maximization (MLEM) techniques. A high-resolution digital phantom was created by shape-based interpolation of the digital Hoffman brain phantom. Three different sets of 20 movements were applied to the phantom. For each frame of the motion, sinograms with attenuation and three levels of noise were simulated and then reconstructed using filtered backprojection. The average of the 20 frames was considered the motion blurred image, which was restored with the deconvolution algorithm. After correction, contrast increased from a mean of 2.0, 1.8 and 1.4 in the motion blurred images, for the three increasing amounts of movement, to a mean of 2.5, 2.4 and 2.2. Mean error was reduced by an average of 55% with motion correction. In conclusion, deconvolution can be used for correction of motion blur when subject motion is known.

  3. Deconvolution map-making for cosmic microwave background observations

    International Nuclear Information System (INIS)

    Armitage, Charmaine; Wandelt, Benjamin D.

    2004-01-01

    We describe a new map-making code for cosmic microwave background observations. It implements fast algorithms for convolution and transpose convolution of two functions on the sphere [B. Wandelt and K. Gorski, Phys. Rev. D 63, 123002 (2001)]. Our code can account for arbitrary beam asymmetries and can be applied to any scanning strategy. We demonstrate the method using simulated time-ordered data for three beam models and two scanning patterns, including a coarsened version of the WMAP strategy. We quantitatively compare our results with a standard map-making method and demonstrate that the true sky is recovered with high accuracy using deconvolution map-making

  4. Optimized coincidence Doppler broadening spectroscopy using deconvolution algorithms

    International Nuclear Information System (INIS)

    Ho, K.F.; Ching, H.M.; Cheng, K.W.; Beling, C.D.; Fung, S.; Ng, K.P.

    2004-01-01

    In the last few years a number of excellent deconvolution algorithms have been developed for use in 'de-blurring' 2D images. Here we report briefly on one such algorithm we have studied, which uses the non-negativity constraint to optimize the regularization and which is applied to the 2D image-like data produced in Coincidence Doppler Broadening Spectroscopy (CDBS). The system instrumental resolution functions are obtained using the 514 keV line from 85 Sr. The technique, when applied to a series of well annealed polycrystalline metals, gives two-photon momentum data of a quality comparable to that obtainable using 1D Angular Correlation of Annihilation Radiation (ACAR). (orig.)

  5. Deconvolution of X-ray diffraction profiles using series expansion: a line-broadening study of polycrystalline 9-YSZ

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Bajo, F. [Universidad de Extremadura, Badajoz (Spain). Dept. de Electronica e Ingenieria Electromecanica; Ortiz, A.L.; Cumbrera, F.L. [Universidad de Extremadura, Badajoz (Spain). Dept. de Fisica

    2001-07-01

    Deconvolution of X-ray diffraction profiles is a fundamental step in obtaining reliable results in the microstructural characterization (crystallite size, lattice microstrain, etc.) of polycrystalline materials. In this work we have analyzed a powder sample of 9-YSZ using a technique based on the Fourier series expansion of the pure profile. This procedure, which can be combined with regularization methods, is especially powerful for minimizing the effects of the ill-posed nature of the linear integral equation involved in the kinematical theory of X-ray diffraction. Finally, the deconvoluted profiles have been used to obtain microstructural parameters by means of the integral-breadth method. (orig.)
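
    The Fourier-series route described above amounts, in its simplest form, to Stokes-type deconvolution: the Fourier coefficients of the pure profile are the ratio of those of the measured and instrumental profiles, stabilized against noise amplification. The sketch below implements that simplest form on synthetic Gaussian profiles; the stabilization constant stands in for the regularization methods mentioned in the record, and no 9-YSZ data are used.

        import numpy as np

        def stokes_deconvolve(h, g, eps=1e-3):
            """Pure profile f from measured profile h and instrumental profile g via F = H/G."""
            H = np.fft.rfft(h)
            G = np.fft.rfft(g)
            F = H * np.conj(G) / (np.abs(G) ** 2 + eps * np.abs(G).max() ** 2)
            return np.fft.irfft(F, len(h))

        two_theta = np.linspace(-2, 2, 400)                  # angular axis (arbitrary units)
        g = np.exp(-0.5 * (two_theta / 0.05) ** 2)           # instrumental broadening
        f_true = np.exp(-0.5 * (two_theta / 0.10) ** 2)      # "pure" specimen profile
        h = np.convolve(f_true, g / g.sum(), mode="same")    # observed profile
        f_est = stokes_deconvolve(h, np.fft.ifftshift(g / g.sum()))
        print(round(float(two_theta[np.argmax(f_est)]), 3))  # peak recovered near zero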

  6. Cramer-Rao Lower Bound for Support-Constrained and Pixel-Based Multi-Frame Blind Deconvolution (Postprint)

    National Research Council Canada - National Science Library

    Matson, Charles; Haji, Aiim

    2006-01-01

    Multi-frame blind deconvolution (MFBD) algorithms can be used to reconstruct a single high-resolution image of an object from one or more measurement frames that are blurred and noisy realizations of that object...

  7. Optimal filtering values in renogram deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Puchal, R.; Pavia, J.; Gonzalez, A.; Ros, D.

    1988-07-01

    The evaluation of the isotopic renogram by means of the renal retention function (RRF) is a technique that supplies valuable information about renal function. It is not unusual to perform a smoothing of the data because of the sensitivity of the deconvolution algorithms with respect to noise. The purpose of this work is to confirm the existence of an optimal smoothing which minimises the error between the calculated RRF and the theoretical value for two filters (linear and non-linear). In order to test the effectiveness of these optimal smoothing values, some parameters of the calculated RRF were considered using this optimal smoothing. The comparison of these parameters with the theoretical ones revealed a better result in the case of the linear filter than in the non-linear case. The study was carried out simulating the input and output curves which would be obtained when using hippuran and DTPA as tracers.

  8. A new deconvolution method applied to ultrasonic images

    International Nuclear Information System (INIS)

    Sallard, J.

    1999-01-01

    This dissertation presents the development of a new method for the restoration of ultrasonic signals. Our goal is to remove the perturbations induced by the ultrasonic probe and to help characterize defects due to a strong local discontinuity of the acoustic impedance. The point of view adopted consists in taking the physical properties into account in the signal processing in order to develop an algorithm which gives good results even on experimental data. The received ultrasonic signal is modeled as a convolution between a function that represents the waveform emitted by the transducer and a function that is loosely called the 'defect impulse response'. It is established that, in numerous cases, the ultrasonic signal can be expressed as a sum of weighted, phase-shifted replicas of a reference signal. Deconvolution is an ill-posed problem, and a priori information must be taken into account to solve it. The a priori information translates the physical properties of the ultrasonic signals. The defect impulse response is modeled as a double Bernoulli-Gaussian sequence. Deconvolution then becomes the problem of detecting the optimal Bernoulli sequence and estimating the associated complex amplitudes. The optimal parameters of the sequence are those which maximize a likelihood function. We develop a new estimation procedure based on an optimization process. An adapted initialization procedure and an iterative algorithm make it possible to process a huge number of data quickly. Many experimental ultrasonic data sets that reflect usual inspection configurations have been processed, and the results demonstrate the robustness of the method. Our algorithm not only removes the waveform emitted by the transducer but also estimates the phase. This parameter is useful for defect characterization. Finally, the algorithm makes data interpretation easier by concentrating the information, so automatic characterization should be possible in the future. (author)

  9. Fatal defect in computerized glow curve deconvolution of thermoluminescence

    International Nuclear Information System (INIS)

    Sakurai, T.

    2001-01-01

    The method of computerized glow curve deconvolution (CGCD) is a powerful tool in the study of thermoluminescence (TL). In a system where several trapping levels have a probability of retrapping, the electrons trapped at one level can transfer from this level to another through retrapping via the conduction band while the TL is being read out. However, at present, the method of CGCD takes no account of electron transitions between trapping levels; this is a fatal defect. It is shown by computer simulation that CGCD using general-order kinetics thus cannot yield the correct trap parameters. (author)

  10. Stable Blind Deconvolution over the Reals from Additional Autocorrelations

    KAUST Repository

    Walk, Philipp

    2017-10-22

    Recently the one-dimensional time-discrete blind deconvolution problem was shown to be solvable uniquely, up to a global phase, by a semi-definite program for almost any signal, provided its autocorrelation is known. We will show in this work that, under a sufficient zero separation of the corresponding signal in the $z$-domain, a stable reconstruction against additive noise is possible. Moreover, the stability constant depends on the signal dimension and on the magnitudes of the signal's first and last coefficients. We give an analytical expression for this constant by using spectral bounds of Vandermonde matrices.

  11. Ultrasonic inspection of studs (bolts) using dynamic predictive deconvolution and wave shaping.

    Science.gov (United States)

    Suh, D M; Kim, W W; Chung, J G

    1999-01-01

    Bolt degradation has become a major issue in the nuclear industry since the 1980s. If small cracks in stud bolts are not detected early enough, they grow rapidly and cause catastrophic disasters. Their detection, despite its importance, is known to be a very difficult problem due to the complicated structure of the stud bolts. This paper presents a method of detecting and sizing a small crack in the root between two adjacent crests in the threads. The key idea comes from the fact that the mode-converted Rayleigh wave travels slowly down the face of the crack and returns from the intersection of the crack and the root of the thread to the transducer. Thus, when a crack exists, a small delayed pulse due to the Rayleigh wave is detected between the large regularly spaced pulses from the thread. The delay time is the same as the propagation delay time of the slow Rayleigh wave and is proportional to the size of the crack. To efficiently detect the slow Rayleigh wave, three methods based on digital signal processing are proposed: wave shaping, dynamic predictive deconvolution, and dynamic predictive deconvolution combined with wave shaping.

  12. Nuclear pulse signal processing techniques based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Qi Zhong; Meng Xiangting; Fu Yanyan; Li Dongcang

    2012-01-01

    This article presents a method for the measurement and analysis of nuclear pulse signals. An FPGA controls a high-speed ADC that digitizes the nuclear radiation signals and manages the high-speed USB transmission in Slave FIFO mode, while LabVIEW is used for online data processing and display. The blind deconvolution method is used to remove the pulse pile-up introduced during signal acquisition and to restore the nuclear pulse signals. Real-time measurements demonstrate the advantages of the approach in terms of transmission speed. (authors)

  13. Comparison of alternative methods for multiplet deconvolution in the analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Blaauw, Menno; Keyser, Ronald M.; Fazekas, Bela

    1999-01-01

    Three methods for multiplet deconvolution were tested using the 1995 IAEA reference spectra: Total area determination, iterative fitting and the library-oriented approach. It is concluded that, if statistical control (i.e. the ability to report results that agree with the known, true values to within the reported uncertainties) is required, the total area determination method performs the best. If high deconvolution power is required and a good, internally consistent library is available, the library oriented method yields the best results. Neither Erdtmann and Soyka's gamma-ray catalogue nor Browne and Firestone's Table of Radioactive Isotopes were found to be internally consistent enough in this respect. In the absence of a good library, iterative fitting with restricted peak width variation performs the best. The ultimate approach as yet to be implemented might be library-oriented fitting with allowed peak position variation according to the peak energy uncertainty specified in the library. (author)

  14. The thermoluminescence glow-curve analysis using GlowFit - the new powerful tool for deconvolution

    International Nuclear Information System (INIS)

    Puchalska, M.; Bilski, P.

    2005-10-01

    A new computer program, GlowFit, for deconvoluting first-order kinetics thermoluminescence (TL) glow-curves has been developed. A non-linear function describing a single glow-peak is fitted to experimental points using the least squares Levenberg-Marquardt method. The main advantage of GlowFit is in its ability to resolve complex TL glow-curves consisting of strongly overlapping peaks, such as those observed in heavily doped LiF:Mg,Ti (MTT) detectors. This resolution is achieved mainly by setting constraints or by fixing selected parameters. The initial values of the fitted parameters are placed in the so-called pattern files. GlowFit is a Microsoft Windows-operated user-friendly program. Its graphic interface enables easy intuitive manipulation of glow-peaks, at the initial stage (parameter initialization) and at the final stage (manual adjustment) of fitting peak parameters to the glow-curves. The program is freely downloadable from the web site www.ifj.edu.pl/NPP/deconvolution.htm (author)
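
    As background to what such a program fits, the sketch below evaluates a single first-order (Randall-Wilkins) glow peak by numerical integration and adds noise to mimic a measured curve. GlowFit itself, its pattern files and its handling of strongly overlapping LiF:Mg,Ti peaks are not reproduced, and the activation energy, frequency factor and heating rate are illustrative values.

        import numpy as np

        K_B = 8.617e-5                        # Boltzmann constant, eV/K

        def first_order_peak(T, n0, E, s, beta):
            # I(T) = n0*s*exp(-E/kT) * exp(-(s/beta) * integral_{T0}^{T} exp(-E/kT') dT')
            boltz = np.exp(-E / (K_B * T))
            integral = np.concatenate(
                ([0.0], np.cumsum(0.5 * (boltz[1:] + boltz[:-1]) * np.diff(T))))
            return n0 * s * boltz * np.exp(-(s / beta) * integral)

        T = np.linspace(300.0, 600.0, 600)    # temperature grid, K
        I = first_order_peak(T, n0=1e5, E=1.2, s=1e12, beta=1.0)
        I_noisy = I + np.random.default_rng(3).normal(0.0, 0.01 * I.max(), I.size)
        print(round(float(T[np.argmax(I)]), 1))   # peak temperature of the model glow peak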

  15. 3D image restoration for confocal microscopy: toward a wavelet deconvolution for the study of complex biological structures

    Science.gov (United States)

    Boutet de Monvel, Jacques; Le Calvez, Sophie; Ulfendahl, Mats

    2000-05-01

    Image restoration algorithms provide efficient tools for recovering part of the information lost in the imaging process of a microscope. We describe recent progress in the application of deconvolution to confocal microscopy. The point spread function of a Biorad-MRC1024 confocal microscope was measured under various imaging conditions, and used to process 3D-confocal images acquired in an intact preparation of the inner ear developed at Karolinska Institutet. Using these experiments we investigate the application of denoising methods based on wavelet analysis as a natural regularization of the deconvolution process. Within the Bayesian approach to image restoration, we compare wavelet denoising with the use of a maximum entropy constraint as another natural regularization method. Numerical experiments performed with test images show a clear advantage of the wavelet denoising approach, allowing to `cool down' the image with respect to the signal, while suppressing much of the fine-scale artifacts appearing during deconvolution due to the presence of noise, incomplete knowledge of the point spread function, or undersampling problems. We further describe a natural development of this approach, which consists of performing the Bayesian inference directly in the wavelet domain.

  16. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum

    International Nuclear Information System (INIS)

    Wille, M-L; Langton, C M; Zapf, M; Ruiter, N V; Gemmeke, H

    2015-01-01

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity. (note)
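
    The matched-filtering baseline referred to above can be sketched in a few lines: the received signal, containing a direct and a reflected copy of the transmitted chirp, is cross-correlated with the chirp, and the two transit times are read off the correlation peaks. The chirp parameters, path delays and noise level are illustrative, and the active-set deconvolution alternative is not reproduced here.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 10e6                                          # sampling rate, Hz
        t = np.arange(0, 20e-6, 1 / fs)
        chirp = np.sin(2 * np.pi * (0.5e6 * t + (2.0e6 / (2 * 20e-6)) * t ** 2))  # 0.5-2.5 MHz sweep

        n = 4000
        received = np.zeros(n)
        for delay_samples, amp in [(800, 1.0), (830, 0.6)]:   # direct and reflected paths
            received[delay_samples:delay_samples + chirp.size] += amp * chirp
        received += 0.05 * np.random.default_rng(4).standard_normal(n)

        # Matched filter: correlate the received signal with the transmitted chirp.
        mf = np.correlate(received, chirp, mode="valid")
        peaks, _ = find_peaks(mf, height=0.5 * mf.max(), distance=10)
        print(np.round(peaks / fs * 1e6, 1), "microseconds")  # ~[80. 83.]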

  17. A deconvolution technique for processing small intestinal transit data

    Energy Technology Data Exchange (ETDEWEB)

    Brinch, K. [Department of Clinical Physiology and Nuclear Medicine, Glostrup Hospital, University Hospital of Copenhagen (Denmark); Larsson, H.B.W. [Danish Research Center of Magnetic Resonance, Hvidovre Hospital, University Hospital of Copenhagen (Denmark); Madsen, J.L. [Department of Clinical Physiology and Nuclear Medicine, Hvidovre Hospital, University Hospital of Copenhagen (Denmark)

    1999-03-01

    The deconvolution technique can be used to compute small intestinal impulse response curves from scintigraphic data. Previously suggested approaches, however, are sensitive to noise from the data. We investigated whether deconvolution based on a new simple iterative convolving technique can be recommended. Eight healthy volunteers ingested a meal that contained indium-111 diethylene triamine penta-acetic acid labelled water and technetium-99m stannous colloid labelled omelette. Imaging was performed at 30-min intervals until all radioactivity was located in the colon. A Fermi function, F(t) = (1 + e^(-αβ))/(1 + e^((t-α)β)), was chosen to characterize the small intestinal impulse response function. By changing only two parameters, α and β, it is possible to obtain configurations from nearly a square function to nearly a monoexponential function. The small intestinal input function was obtained from the gastric emptying curve and convolved with the Fermi function. The sum of least squares was used to find the α and β yielding the best fit of the convolved curve to the observed small intestinal time-activity curve. Finally, a small intestinal mean transit time was calculated from the Fermi function referred to. In all cases, we found an excellent fit of the convolved curve to the observed small intestinal time-activity curve, that is, the Fermi function reflected the small intestinal impulse response curve. The small intestinal mean transit time of the liquid marker (median 2.02 h) was significantly shorter than that of the solid marker (median 2.99 h; P<0.02). The iterative convolving technique seems to be an attractive alternative to ordinary approaches for the processing of small intestinal transit data. (orig.) With 2 figs., 13 refs.
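
    A minimal Python sketch of the convolve-and-fit approach described above: the Fermi function F(t) = (1 + e^(-αβ))/(1 + e^((t-α)β)) is convolved with the small-intestinal input and α, β are adjusted by least squares to match the observed curve. The time grid, the synthetic input function, the parameter bounds and the noise level are illustrative, and the "observed" data here are simulated rather than scintigraphic.

        import numpy as np
        from scipy.optimize import least_squares

        dt = 0.25                                      # h
        t = np.arange(0, 12, dt)

        def fermi(t, alpha, beta):
            return (1 + np.exp(-alpha * beta)) / (1 + np.exp((t - alpha) * beta))

        def model(params, input_rate):
            alpha, beta = params
            # Discrete convolution of the input with the Fermi impulse response.
            return np.convolve(input_rate, fermi(t, alpha, beta), mode="full")[:t.size] * dt

        gastric_input = 0.8 * np.exp(-0.8 * t)         # synthetic small-intestinal input
        observed = model([2.0, 3.0], gastric_input)    # "true" alpha = 2 h, beta = 3
        observed += 0.01 * np.random.default_rng(5).standard_normal(t.size)

        fit = least_squares(lambda p: model(p, gastric_input) - observed,
                            x0=[1.0, 1.0], bounds=([0.1, 0.1], [10.0, 10.0]))
        print(np.round(fit.x, 2))                      # recovered [alpha, beta]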

  18. A Robust Gold Deconvolution Approach for LiDAR Waveform Data Processing to Characterize Vegetation Structure

    Science.gov (United States)

    Zhou, T.; Popescu, S. C.; Krause, K.; Sheridan, R.; Ku, N. W.

    2014-12-01

    Increasing attention has been paid in the remote sensing community to the next generation Light Detection and Ranging (lidar) waveform data systems for extracting information on topography and the vertical structure of vegetation. However, processing waveform lidar data raises some challenges compared to analyzing discrete return data. The overall goal of this study was to present a robust deconvolution algorithm, the Gold algorithm, used to deconvolve waveforms in a lidar dataset acquired within a 60 x 60 m study area located in the Harvard Forest in Massachusetts. The waveform lidar data were collected by the National Ecological Observatory Network (NEON). Specific objectives were to: (1) explore advantages and limitations of various waveform processing techniques to derive topography and canopy height information; (2) develop and implement a novel deconvolution algorithm, the Gold algorithm, to extract elevation and canopy metrics; and (3) compare results and assess accuracy. We modeled lidar waveforms with a mixture of Gaussian functions using the nonlinear least squares (NLS) algorithm implemented in R and derived a Digital Terrain Model (DTM) and canopy height. We compared our waveform-derived topography and canopy height measurements obtained using the Gold deconvolution algorithm to results using the Richardson-Lucy algorithm. Our findings show that the Gold algorithm performed better than the Richardson-Lucy algorithm in terms of recovering the hidden echoes and detecting false echoes for generating a DTM, which indicates that the Gold algorithm could potentially be applied to the processing of waveform lidar data to derive information on terrain elevation and canopy characteristics.
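
    For reference, the Gold algorithm itself is a simple multiplicative, non-negativity-preserving iteration of the form x <- x * (A^T y) / (A^T A x), sketched below in one dimension on a synthetic waveform. The Gaussian system response, return positions, baseline and iteration count are illustrative assumptions, not the NEON sensor characteristics, and no Gaussian-mixture modeling or DTM generation is attempted.

        import numpy as np

        def gold_deconvolve(y, kernel, n_iter=200):
            n = y.size
            c = kernel.size // 2
            # Convolution matrix A for a centered kernel.
            A = np.array([[kernel[(i - j) + c] if 0 <= (i - j) + c < kernel.size else 0.0
                           for j in range(n)] for i in range(n)])
            x = np.full(n, max(float(y.mean()), 1e-3))    # positive starting estimate
            Aty = A.T @ y
            AtA = A.T @ A
            for _ in range(n_iter):
                x = x * Aty / np.maximum(AtA @ x, 1e-12)  # Gold multiplicative update
            return x

        truth = np.zeros(200); truth[[60, 75, 140]] = [1.0, 0.5, 0.8]   # discrete returns
        kernel = np.exp(-0.5 * (np.arange(-20, 21) / 4.0) ** 2)
        kernel /= kernel.sum()
        waveform = np.convolve(truth, kernel, mode="same") + 0.002      # small positive baseline
        recovered = gold_deconvolve(waveform, kernel)
        print(np.flatnonzero(recovered > 0.1))            # bins containing the sharpened returns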

  19. Photoacoustic imaging optimization with raw signal deconvolution and empirical mode decomposition

    Science.gov (United States)

    Guo, Chengwen; Wang, Jing; Qin, Yu; Zhan, Hongchen; Yuan, Jie; Cheng, Qian; Wang, Xueding

    2018-02-01

    The photoacoustic (PA) signal of an ideal optical absorbing particle is a single N-shaped wave. The PA signals of a complicated biological tissue can be considered as the combination of individual N-shaped waves. However, the N-shaped wave basis not only complicates the subsequent work, but also results in aliasing between adjacent micro-structures, which deteriorates the quality of the final PA images. In this paper, we propose a method to improve PA image quality through signal processing applied directly to the raw signals, comprising deconvolution and empirical mode decomposition (EMD). During the deconvolution procedure, the raw PA signals are deconvolved with a system-dependent point spread function (PSF) which is measured in advance. Then, EMD is adopted to adaptively re-shape the PA signals under two constraints, positive polarity and spectrum consistency. With our proposed method, the reconstructed PA images can yield more detailed structural information. Micro-structures are clearly separated and revealed. To validate the effectiveness of this method, we present numerical simulations and phantom studies consisting of a densely distributed point-source model and a blood vessel model. In the future, our study might hold potential for clinical PA imaging as it can help to distinguish micro-structures in the optimized images and even measure the size of objects from the deconvolved signals.

  20. Determination of ion mobility collision cross sections for unresolved isomeric mixtures using tandem mass spectrometry and chemometric deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Brett [Institute of Biomedical Studies, Baylor University, Waco, TX 76798 (United States); Neumann, Elizabeth K. [Department of Chemistry and Biochemistry, Baylor University, Waco, TX 76798 (United States); Stow, Sarah M.; May, Jody C.; McLean, John A. [Department of Chemistry, Vanderbilt University, Nashville, TN 37235 (United States); Vanderbilt Institute of Chemical Biology, Nashville, TN 37235 (United States); Vanderbilt Institute for Integrative Biosystems Research and Education, Nashville, TN 37235 (United States); Center for Innovative Technology, Nashville, TN 37235 (United States); Solouki, Touradj, E-mail: Touradj_Solouki@baylor.edu [Department of Chemistry and Biochemistry, Baylor University, Waco, TX 76798 (United States)

    2016-10-05

    Ion mobility (IM) is an important analytical technique for determining ion collision cross section (CCS) values in the gas-phase and gaining insight into molecular structures and conformations. However, limited instrument resolving powers for IM may restrict adequate characterization of conformationally similar ions, such as structural isomers, and reduce the accuracy of IM-based CCS calculations. Recently, we introduced an automated technique for extracting “pure” IM and collision-induced dissociation (CID) mass spectra of IM overlapping species using chemometric deconvolution of post-IM/CID mass spectrometry (MS) data [J. Am. Soc. Mass Spectrom., 2014, 25, 1810–1819]. Here we extend those capabilities to demonstrate how extracted IM profiles can be used to calculate accurate CCS values of peptide isomer ions which are not fully resolved by IM. We show that CCS values obtained from deconvoluted IM spectra match with CCS values measured from the individually analyzed corresponding peptides on uniform field IM instrumentation. We introduce an approach that utilizes experimentally determined IM arrival time (AT) “shift factors” to compensate for ion acceleration variations during post-IM/CID and significantly improve the accuracy of the calculated CCS values. Also, we discuss details of this IM deconvolution approach and compare empirical CCS values from traveling wave (TW)IM-MS and drift tube (DT)IM-MS with theoretically calculated CCS values using the projected superposition approximation (PSA). For example, experimentally measured deconvoluted TWIM-MS mean CCS values for doubly-protonated RYGGFM, RMFGYG, MFRYGG, and FRMYGG peptide isomers were 288.8 Å², 295.1 Å², 296.8 Å², and 300.1 Å²; all four of these CCS values were within 1.5% of independently measured DTIM-MS values.

  1. Determination of ion mobility collision cross sections for unresolved isomeric mixtures using tandem mass spectrometry and chemometric deconvolution

    International Nuclear Information System (INIS)

    Harper, Brett; Neumann, Elizabeth K.; Stow, Sarah M.; May, Jody C.; McLean, John A.; Solouki, Touradj

    2016-01-01

    Ion mobility (IM) is an important analytical technique for determining ion collision cross section (CCS) values in the gas-phase and gaining insight into molecular structures and conformations. However, limited instrument resolving powers for IM may restrict adequate characterization of conformationally similar ions, such as structural isomers, and reduce the accuracy of IM-based CCS calculations. Recently, we introduced an automated technique for extracting “pure” IM and collision-induced dissociation (CID) mass spectra of IM overlapping species using chemometric deconvolution of post-IM/CID mass spectrometry (MS) data [J. Am. Soc. Mass Spectrom., 2014, 25, 1810–1819]. Here we extend those capabilities to demonstrate how extracted IM profiles can be used to calculate accurate CCS values of peptide isomer ions which are not fully resolved by IM. We show that CCS values obtained from deconvoluted IM spectra match with CCS values measured from the individually analyzed corresponding peptides on uniform field IM instrumentation. We introduce an approach that utilizes experimentally determined IM arrival time (AT) “shift factors” to compensate for ion acceleration variations during post-IM/CID and significantly improve the accuracy of the calculated CCS values. Also, we discuss details of this IM deconvolution approach and compare empirical CCS values from traveling wave (TW)IM-MS and drift tube (DT)IM-MS with theoretically calculated CCS values using the projected superposition approximation (PSA). For example, experimentally measured deconvoluted TWIM-MS mean CCS values for doubly-protonated RYGGFM, RMFGYG, MFRYGG, and FRMYGG peptide isomers were 288.8 Å², 295.1 Å², 296.8 Å², and 300.1 Å²; all four of these CCS values were within 1.5% of independently measured DTIM-MS values.

  2. Double spike with isotope pattern deconvolution for mercury speciation

    International Nuclear Information System (INIS)

    Castillo, A.; Rodriguez-Gonzalez, P.; Centineo, G.; Roig-Navarro, A.F.; Garcia Alonso, J.I.

    2009-01-01

    Full text: A double-spiking approach, based on an isotope pattern deconvolution numerical methodology, has been developed and applied for the accurate and simultaneous determination of inorganic mercury (IHg) and methylmercury (MeHg). Isotopically enriched mercury species (¹⁹⁹IHg and ²⁰¹MeHg) are added before sample preparation to quantify the extent of methylation and demethylation processes. Focused microwave digestion was evaluated to perform the quantitative extraction of such compounds from solid matrices of environmental interest. Satisfactory results were obtained for different certified reference materials (dogfish liver DOLT-4 and tuna fish CRM-464) both by GC-ICPMS and GC-MS, demonstrating the suitability of the proposed analytical method. (author)

  3. Blind Deconvolution for Distributed Parameter Systems with Unbounded Input and Output and Determining Blood Alcohol Concentration from Transdermal Biosensor Data.

    Science.gov (United States)

    Rosen, I G; Luczak, Susan E; Weiss, Jordan

    2014-03-15

    We develop a blind deconvolution scheme for input-output systems described by distributed parameter systems with boundary input and output. An abstract functional analytic theory based on results for the linear quadratic control of infinite dimensional systems with unbounded input and output operators is presented. The blind deconvolution problem is then reformulated as a series of constrained linear and nonlinear optimization problems involving infinite dimensional dynamical systems. A finite dimensional approximation and convergence theory is developed. The theory is applied to the problem of estimating blood or breath alcohol concentration (respectively, BAC or BrAC) from biosensor-measured transdermal alcohol concentration (TAC) in the field. A distributed parameter model with boundary input and output is proposed for the transdermal transport of ethanol from the blood through the skin to the sensor. The problem of estimating BAC or BrAC from the TAC data is formulated as a blind deconvolution problem. A scheme to identify distinct drinking episodes in TAC data based on a Hodrick Prescott filter is discussed. Numerical results involving actual patient data are presented.

  4. Deconvolution effect of near-fault earthquake ground motions on stochastic dynamic response of tunnel-soil deposit interaction systems

    Directory of Open Access Journals (Sweden)

    K. Hacıefendioğlu

    2012-04-01

    Full Text Available The deconvolution effect of near-fault earthquake ground motions on the stochastic dynamic response of tunnel-soil deposit interaction systems is investigated using the finite element method. Two different earthquake input mechanisms are used to account for deconvolution effects in the analyses: the standard rigid-base input model and the deconvolved-base-rock input model. The Bolu tunnel in Turkey is chosen as a numerical example, and the 1999 Kocaeli earthquake ground motion is selected as the near-fault record. Interface finite elements are used between the tunnel and the soil deposit. The mean maximum quasi-static, dynamic and total responses obtained from the two input models are compared with each other.

  5. Fourier Deconvolution Methods for Resolution Enhancement in Continuous-Wave EPR Spectroscopy.

    Science.gov (United States)

    Reed, George H; Poyner, Russell R

    2015-01-01

    An overview of resolution enhancement of conventional, field-swept, continuous-wave electron paramagnetic resonance spectra using Fourier transform-based deconvolution methods is presented. Basic steps that are involved in resolution enhancement of calculated spectra using an implementation based on complex discrete Fourier transform algorithms are illustrated. Advantages and limitations of the method are discussed. An application to an experimentally obtained spectrum is provided to illustrate the power of the method for resolving overlapped transitions. © 2015 Elsevier Inc. All rights reserved.

  6. Deconvolution of ferredoxin, plastocyanin, and P700 transmittance changes in intact leaves with a new type of kinetic LED array spectrophotometer.

    Science.gov (United States)

    Klughammer, Christof; Schreiber, Ulrich

    2016-05-01

    A newly developed compact measuring system for assessment of transmittance changes in the near-infrared spectral region is described; it allows deconvolution of redox changes due to ferredoxin (Fd), P700, and plastocyanin (PC) in intact leaves. In addition, it can also simultaneously measure chlorophyll fluorescence. The major opto-electronic components as well as the principles of data acquisition and signal deconvolution are outlined. Four original pulse-modulated dual-wavelength difference signals are measured (785-840 nm, 810-870 nm, 870-970 nm, and 795-970 nm). Deconvolution is based on specific spectral information presented graphically in the form of 'Differential Model Plots' (DMP) of Fd, P700, and PC that are derived empirically from selective changes of these three components under appropriately chosen physiological conditions. Whereas information on maximal changes of Fd is obtained upon illumination after dark-acclimation, maximal changes of P700 and PC can be readily induced by saturating light pulses in the presence of far-red light. Using the information of DMP and maximal changes, the new measuring system enables on-line deconvolution of Fd, P700, and PC. The performance of the new device is demonstrated by some examples of practical applications, including fast measurements of flash relaxation kinetics and of the Fd, P700, and PC changes paralleling the polyphasic fluorescence rise upon application of a 300-ms pulse of saturating light.

  7. Nuclear pulse signal processing technique based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Fu Tingyan; Qi Zhong; Li Dongcang; Ren Zhongguo

    2012-01-01

    In this paper, we present a method for the measurement and analysis of nuclear pulse signals, with which pile-up signals are removed, the signal baseline is restored, and the original signal is recovered. The data acquisition system includes an FPGA, an ADC and a USB interface. The FPGA controls the high-speed ADC to sample the nuclear radiation signal, and the USB controller operates in Slave FIFO mode to implement high-speed transmission. Using LabVIEW, the system performs online data processing with the blind deconvolution algorithm and displays the data. Simulation and experimental results demonstrate the advantages of the method. (authors)

  8. Approximate deconvolution models of turbulence analysis, phenomenology and numerical analysis

    CERN Document Server

    Layton, William J

    2012-01-01

    This volume presents a mathematical development of a recent approach to the modeling and simulation of turbulent flows based on methods for the approximate solution of inverse problems. The resulting Approximate Deconvolution Models or ADMs have some advantages over more commonly used turbulence models – as well as some disadvantages. Our goal in this book is to provide a clear and complete mathematical development of ADMs, while pointing out the difficulties that remain. In order to do so, we present the analytical theory of ADMs, along with its connections, motivations and complements in the phenomenology of and algorithms for ADMs.

  9. X-ray scatter removal by deconvolution

    International Nuclear Information System (INIS)

    Seibert, J.A.; Boone, J.M.

    1988-01-01

    The distribution of scattered x rays detected in a two-dimensional projection radiograph at diagnostic x-ray energies is measured as a function of field size and object thickness at a fixed x-ray potential and air gap. An image intensifier-TV based imaging system is used for image acquisition, manipulation, and analysis. A scatter point spread function (PSF) with an assumed linear, spatially invariant response is modeled as a modified Gaussian distribution, and is characterized by two parameters describing the width of the distribution and the fraction of scattered events detected. The PSF parameters are determined from analysis of images obtained with radio-opaque lead disks centrally placed on the source side of a homogeneous phantom. Analytical methods are used to convert the PSF into the frequency domain. Numerical inversion provides an inverse filter that operates on frequency transformed, scatter degraded images. Resultant inverse transformed images demonstrate the nonarbitrary removal of scatter, increased radiographic contrast, and improved quantitative accuracy. The use of the deconvolution method appears to be clinically applicable to a variety of digital projection images

  10. Sparse Non-negative Matrix Factor 2-D Deconvolution for Automatic Transcription of Polyphonic Music

    DEFF Research Database (Denmark)

    Schmidt, Mikkel N.; Mørup, Morten

    2006-01-01

    We present a novel method for automatic transcription of polyphonic music based on a recently published algorithm for non-negative matrix factor 2-D deconvolution. The method works by simultaneously estimating a time-frequency model for an instrument and a pattern corresponding to the notes which...... are played based on a log-frequency spectrogram of the music....

  11. Evolution and diversification of Pseudomonas aeruginosa in the paranasal sinuses of cystic fibrosis children have implications for chronic lung infection

    DEFF Research Database (Denmark)

    Hansen, Susse Kirkelund; Rau, Martin Holm; Johansen, Helle Krogh

    2012-01-01

    that the paranasal sinuses constitute an important niche for the colonizing bacteria in many patients. The paranasal sinuses often harbor distinct bacterial subpopulations, and in the early colonization phases there seems to be a migration from the sinuses to the lower airways, suggesting that independent adaptation...... and evolution take place in the sinuses. Importantly, before the onset of chronic lung infection, lineages with mutations conferring a large fitness benefit in CF airways such as mucA and lasR as well as small colony variants and antibiotic-resistant clones are part of the sinus populations. Thus, the paranasal...

  12. Interpretation of high resolution airborne magnetic data (HRAMD) of Ilesha and its environs, Southwest Nigeria, using the Euler deconvolution method

    Directory of Open Access Journals (Sweden)

    Olurin Oluwaseun Tolutope

    2017-12-01

    Full Text Available Interpretation of high resolution aeromagnetic data of Ilesha and its environs within the basement complex of Southwestern Nigeria was carried out in this study. The study area is delimited by geographic latitudes 7°30′–8°00′N and longitudes 4°30′–5°00′E. The investigation used Euler deconvolution on filtered digitised total magnetic data (Sheet Number 243) to delineate geological structures within the area under consideration. The digitised airborne magnetic data, acquired in 2009, were obtained from the archives of the Nigeria Geological Survey Agency (NGSA). The airborne magnetic data were filtered, processed and enhanced; the resultant data were subjected to qualitative and quantitative magnetic interpretation, geometry and depth-weighting analyses across the study area using the Euler deconvolution control file in the Oasis montaj software. Total magnetic intensity in the field ranged from –77.7 to 139.7 nT. The total magnetic field reveals both high-magnitude intensities (high-amplitude anomalies) and low intensities (low-amplitude anomalies) in the area under consideration. The study area is characterised by high-intensity anomalies correlated with lithological variation in the basement; the sharp contrast reflects the difference in magnetic susceptibility between the crystalline and sedimentary rocks. The reduced-to-equator (RTE) map is characterised by high-frequency, short-wavelength, small, weak-intensity, sharp, low-amplitude and irregularly shaped anomalies, which may be due to near-surface sources such as shallow geologic units and cultural features. The Euler deconvolution solution indicates a generally undulating basement, with depths ranging from −500 to 1000 m. The Euler deconvolution results show that the basement relief is generally gentle and flat, lying within the basement terrain.
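
    For readers unfamiliar with the operation itself, Euler deconvolution solves Euler's homogeneity equation, (x − x0)∂T/∂x + (y − y0)∂T/∂y + (z − z0)∂T/∂z = N(B − T), by least squares within a sliding data window. The sketch below shows that core step only; the function and array layout are illustrative and do not reproduce the Oasis montaj implementation used in the study.

```python
import numpy as np

def euler_window_solution(x, y, z, T, Tx, Ty, Tz, structural_index):
    """Least-squares Euler deconvolution for a single data window.

    x, y, z    : observation coordinates in the window (1-D arrays, m)
    T          : total-field anomaly values (nT)
    Tx, Ty, Tz : field gradients in x, y, z (nT/m)
    Returns (x0, y0, z0, B): estimated source position and regional background.
    """
    N = structural_index
    # Euler's homogeneity equation rearranged as A @ [x0, y0, z0, B] = b
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol
```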

  13. Model-based deconvolution of cell cycle time-series data reveals gene expression details at high resolution.

    Directory of Open Access Journals (Sweden)

    Dan Siegal-Gaskins

    2009-08-01

    Full Text Available In both prokaryotic and eukaryotic cells, gene expression is regulated across the cell cycle to ensure "just-in-time" assembly of select cellular structures and molecular machines. However, present in all time-series gene expression measurements is variability that arises from both systematic error in the cell synchrony process and variance in the timing of cell division at the level of the single cell. Thus, gene or protein expression data collected from a population of synchronized cells is an inaccurate measure of what occurs in the average single-cell across a cell cycle. Here, we present a general computational method to extract "single-cell"-like information from population-level time-series expression data. This method removes the effects of (1) variance in growth rate and (2) variance in the physiological and developmental state of the cell. Moreover, this method represents an advance in the deconvolution of molecular expression data in its flexibility, minimal assumptions, and the use of a cross-validation analysis to determine the appropriate level of regularization. Applying our deconvolution algorithm to cell cycle gene expression data from the dimorphic bacterium Caulobacter crescentus, we recovered critical features of cell cycle regulation in essential genes, including ctrA and ftsZ, that were obscured in population-based measurements. In doing so, we highlight the problem with using population data alone to decipher cellular regulatory mechanisms and demonstrate how our deconvolution algorithm can be applied to produce a more realistic picture of temporal regulation in a cell.
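
    The abstract does not give the exact forward model or penalty, so the following is only a generic sketch of the ingredients it describes: a linear model mapping a single-cell profile onto the population measurement, a smoothness regularizer, and cross-validation to choose the regularization weight. The matrix K (encoding the synchrony / cell-age distribution) and all names are illustrative assumptions.

```python
import numpy as np

def deconvolve_population(y, K, lambdas, n_folds=5, seed=0):
    """Tikhonov-regularised deconvolution y ≈ K @ s, with the regularisation
    weight chosen by cross-validation over held-out time points.

    y : population-level expression time series (length m)
    K : m x n matrix mapping a single-cell profile s onto the population signal
    """
    m, n = K.shape
    D = np.diff(np.eye(n), axis=0)          # first-difference (smoothness) penalty
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, n_folds, size=m)

    def solve(rows, lam):
        A = np.vstack([K[rows], np.sqrt(lam) * D])
        b = np.concatenate([y[rows], np.zeros(D.shape[0])])
        return np.linalg.lstsq(A, b, rcond=None)[0]

    cv_err = []
    for lam in lambdas:
        err = 0.0
        for f in range(n_folds):
            train, test = folds != f, folds == f
            s_hat = solve(np.where(train)[0], lam)
            err += np.sum((y[test] - K[test] @ s_hat) ** 2)
        cv_err.append(err)
    best = lambdas[int(np.argmin(cv_err))]
    return solve(np.arange(m), best), best
```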

  14. An l1-TV Algorithm for Deconvolution with Salt and Pepper Noise

    Science.gov (United States)

    2009-04-01

    Brendt Wohlberg, T-7 Mathematical Modeling and Analysis, Los Alamos National Laboratory. Abstract (fragment): “… and pepper noise, but the extension of this formulation to more general problems, such as deconvolution, has received little attention. We consider …”

  15. Methods for deconvoluting and interpreting complex gamma- and x-ray spectral regions

    International Nuclear Information System (INIS)

    Gunnink, R.

    1983-06-01

    Germanium and silicon detectors are now widely used for the detection and measurement of x and gamma radiation. However, some analysis situations and spectral regions have heretofore been too complex to deconvolute and interpret by techniques in general use. One example is the L x-ray spectrum of an element taken with a Ge or Si detector. This paper describes some new tools and methods that were developed to analyze complex spectral regions; they are illustrated with examples

  16. A Design Methodology for Efficient Implementation of Deconvolutional Neural Networks on an FPGA

    OpenAIRE

    Zhang, Xinyu; Das, Srinjoy; Neopane, Ojash; Kreutz-Delgado, Ken

    2017-01-01

    In recent years deep learning algorithms have shown extremely high performance on machine learning tasks such as image classification and speech recognition. In support of such applications, various FPGA accelerator architectures have been proposed for convolutional neural networks (CNNs) that enable high performance for classification tasks at lower power than CPU and GPU processors. However, to date, there has been little research on the use of FPGA implementations of deconvolutional neural...

  17. Solving a Deconvolution Problem in Photon Spectrometry

    CERN Document Server

    Aleksandrov, D; Hille, P T; Polichtchouk, B; Kharlov, Y; Sukhorukov, M; Wang, D; Shabratova, G; Demanov, V; Wang, Y; Tveter, T; Faltys, M; Mao, Y; Larsen, D T; Zaporozhets, S; Sibiryak, I; Lovhoiden, G; Potcheptsov, T; Kucheryaev, Y; Basmanov, V; Mares, J; Yanovsky, V; Qvigstad, H; Zenin, A; Nikolaev, S; Siemiarczuk, T; Yuan, X; Cai, X; Redlich, K; Pavlinov, A; Roehrich, D; Manko, V; Deloff, A; Ma, K; Maruyama, Y; Dobrowolski, T; Shigaki, K; Nikulin, S; Wan, R; Mizoguchi, K; Petrov, V; Mueller, H; Ippolitov, M; Liu, L; Sadovsky, S; Stolpovsky, P; Kurashvili, P; Nomokonov, P; Xu, C; Torii, H; Il'kaev, R; Zhang, X; Peresunko, D; Soloviev, A; Vodopyanov, A; Sugitate, T; Ullaland, K; Huang, M; Zhou, D; Nystrand, J; Punin, V; Yin, Z; Batyunya, B; Karadzhev, K; Nazarov, G; Fil'chagin, S; Nazarenko, S; Buskenes, J I; Horaguchi, T; Djuvsland, O; Chuman, F; Senko, V; Alme, J; Wilk, G; Fehlker, D; Vinogradov, Y; Budilov, V; Iwasaki, T; Ilkiv, I; Budnikov, D; Vinogradov, A; Kazantsev, A; Bogolyubsky, M; Lindal, S; Polak, K; Skaali, B; Mamonov, A; Kuryakin, A; Wikne, J; Skjerdal, K

    2010-01-01

    We solve numerically a deconvolution problem to extract the undisturbed spectrum from the measured distribution contaminated by the finite resolution of the measuring device. A problem of this kind emerges when one wants to infer the momentum distribution of neutral pions by detecting their decay photons using the photon spectrometer of the ALICE LHC experiment at CERN [1]. The underlying integral equation connecting the sought-for pion spectrum and the measured gamma spectrum has been discretized and subsequently reduced to a system of linear algebraic equations. The latter system, however, is known to be ill-posed and must be regularized to obtain a stable solution. This task has been accomplished here by means of the Tikhonov regularization scheme combined with the L-curve method. The resulting pion spectrum is in excellent quantitative agreement with the pion spectrum obtained from a Monte Carlo simulation. (C) 2010 Elsevier B.V. All rights reserved.
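
    As a concrete illustration of the regularization step described above (not the authors' code), a Tikhonov solution scanned over a grid of regularization parameters, with a simple L-curve corner heuristic to pick one, might look like this sketch:

```python
import numpy as np

def tikhonov_l_curve(A, b, lambdas):
    """Solve the discretised unfolding problem A @ x ≈ b for a grid of
    Tikhonov parameters and select one with a simple L-curve heuristic
    (the point closest to the lower-left corner in normalised log-log
    residual-norm / solution-norm space)."""
    n = A.shape[1]
    solutions, res_norm, sol_norm = [], [], []
    for lam in lambdas:
        x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
        solutions.append(x)
        res_norm.append(np.linalg.norm(A @ x - b))
        sol_norm.append(np.linalg.norm(x))
    r, s = np.log(res_norm), np.log(sol_norm)
    r_n = (r - r.min()) / (r.max() - r.min() + 1e-30)
    s_n = (s - s.min()) / (s.max() - s.min() + 1e-30)
    best = int(np.argmin(np.hypot(r_n, s_n)))
    return solutions[best], lambdas[best]
```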

  18. Transcriptome profile and unique genetic evolution of positively selected genes in yak lungs.

    Science.gov (United States)

    Lan, DaoLiang; Xiong, XianRong; Ji, WenHui; Li, Jian; Mipam, Tserang-Donko; Ai, Yi; Chai, ZhiXin

    2018-04-01

    The yak (Bos grunniens), a unique bovine breed distributed mainly on the Qinghai-Tibetan Plateau, is considered a good model for studying plateau adaptability in mammals. The lungs are important functional organs that enable animals to adapt to their external environment. However, the genetic mechanism underlying the adaptability of yak lungs to harsh plateau environments remains unknown. To explore the unique evolutionary process and genetic mechanism of yak adaptation to plateau environments, we performed transcriptome sequencing of yak and cattle (Bos taurus) lungs using RNA-Seq technology, followed by a comparative analysis to identify positively selected genes in the yak. After deep sequencing, a normal transcriptome profile of yak lung containing a total of 16,815 expressed genes was obtained, and the characteristics of the yak lung transcriptome were described by functional analysis. Furthermore, Ka/Ks comparison identified 39 strongly positively selected genes in yak lungs, and GO and KEGG analyses were conducted for their functional annotation. The results of this study provide valuable data for further exploration of the unique evolutionary process of high-altitude hypoxia adaptation in yaks on the Tibetan Plateau and of the underlying genetic mechanism at the molecular level.

  19. Decline and infiltrated lung

    International Nuclear Information System (INIS)

    Giraldo Estrada, Horacio; Arboleda Casas, Felipe; Duarte, Monica; Triana Harker, Ricardo

    2001-01-01

    The paper describes decline and infiltrated lung in a 45-year-old woman diagnosed with rheumatoid arthritis at age 43, asymptomatic and without treatment, married, who smoked 3 to 10 cigarettes daily from age 15 to 35. She reports a 7-month history of episodes of moderate dyspnoea on exertion and dry cough with occasional mucous expectoration, among other symptoms.

  20. Analysis of low-pass filters for approximate deconvolution closure modelling in one-dimensional decaying Burgers turbulence

    Science.gov (United States)

    San, O.

    2016-01-01

    The idea of spatial filtering is central in approximate deconvolution large-eddy simulation (AD-LES) of turbulent flows. The need for low-pass filters arises naturally in the approximate deconvolution approach, which is based solely on mathematical approximations obtained by employing repeated filtering operators. Two families of low-pass spatial filters are studied in this paper: the Butterworth filters and the Padé filters. With a selection of various filtering parameters, variants of the AD-LES are systematically applied to the decaying Burgers turbulence problem, which is a standard prototype for more complex turbulent flows. Compared with direct numerical simulations, all forms of the AD-LES approach are shown to predict significantly better results than the under-resolved simulations at the same grid resolution. However, the results depend strongly on the selection of the filtering procedure and the filter design. It is concluded that a complete attenuation of the smallest scales is crucial to prevent energy accumulation at the grid cut-off.
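
    The repeated-filtering construction at the heart of approximate deconvolution is compact enough to sketch. The truncated van Cittert series u* = Σ_{k=0..N} (I − G)^k ū approximates the inverse of the filter G; the simple three-point discrete filter below is only a stand-in for the Butterworth and Padé filters studied in the paper, and the names are illustrative.

```python
import numpy as np

def approximate_deconvolution(u_bar, filt, order=3):
    """Approximate the unfiltered field from the filtered one by the
    truncated van Cittert series  u* = sum_{k=0..N} (I - G)^k u_bar,
    where the low-pass filter G is supplied as a callable."""
    u_star = np.zeros_like(u_bar)
    term = u_bar.copy()
    for _ in range(order + 1):
        u_star += term
        term = term - filt(term)      # apply (I - G) once more
    return u_star

def box_filter(u):
    """Example low-pass filter: three-point (1/4, 1/2, 1/4) average on a
    periodic 1-D grid (illustration only)."""
    return 0.25 * np.roll(u, 1) + 0.5 * u + 0.25 * np.roll(u, -1)
```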

  1. Lung perfusion scintigraphy by SPECT

    International Nuclear Information System (INIS)

    Hirayama, Takanobu

    1990-01-01

    The initial study reports the characteristic performance using a lung segmental phantom filled with Tc-99m pertechnetate. To evaluate segmental defects in lung perfusion scintigraphy, we applied Bull's-eye analysis in addition to the planar image set. Bull's-eye analysis particularly facilitated interpretation in the middle and lower lobes. Subsequently, to evaluate the clinical application of Bull's-eye analysis, pulmonary scintigraphy was performed on 10 normal subjects and 60 patients with several pulmonary diseases; of interest, Bull's-eye analysis again aided interpretation in both lower lobes. The present study also describes the use of Bull's-eye analysis to calculate the extension and severity of perfusion defects. Quantitative scores were higher in patients with lung cancer than in those with pulmonary tuberculosis. The present study suggests that Bull's-eye analysis can be useful for evaluating perfusion in patients with a range of pulmonary diseases. (author)

  2. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantification of chest abnormalities are usually performed on computed tomography (CT) scans. This quantification is important to assess the evolution and treatment of TB and to compare different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)

  3. Obtaining Crustal Properties From the P Coda Without Deconvolution: an Example From the Dakotas

    Science.gov (United States)

    Frederiksen, A. W.; Delaney, C.

    2013-12-01

    Receiver functions are a popular technique for mapping variations in crustal thickness and bulk properties, as the travel times of Ps conversions and multiples from the Moho constrain both Moho depth (h) and the Vp/Vs ratio (k) of the crust. The established approach is to generate a suite of receiver functions, which are then stacked along arrival-time curves for a set of (h,k) values (the h-k stacking approach of Zhu and Kanamori, 2000). However, this approach is sensitive to noise issues with the receiver functions, deconvolution artifacts, and the effects of strong crustal layering (such as in sedimentary basins). In principle, however, the deconvolution is unnecessary; for any given crustal model, we can derive a transfer function allowing us to predict the radial component of the P coda from the vertical, and so determine a misfit value for a particular crustal model. We apply this idea to an Earthscope Transportable Array data set from North and South Dakota and western Minnesota, for which we already have measurements obtained using conventional h-k stacking, and so examine the possibility of crustal thinning and modification by a possible failed branch of the Mid-Continent Rift.
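
    For reference, the Moho phase delay times that both conventional h-k stacking and the deconvolution-free transfer-function misfit exploit can be computed as in the sketch below (variable names are illustrative; the relations are those of Zhu and Kanamori, 2000).

```python
import numpy as np

def moho_phase_delays(h, k, vp, p):
    """Predicted delay times (s) of the Moho Ps conversion and its crustal
    multiples relative to direct P, for crustal thickness h (km), Vp/Vs
    ratio k, crustal Vp (km/s) and ray parameter p (s/km)."""
    vs = vp / k
    qs = np.sqrt(1.0 / vs**2 - p**2)   # vertical S slowness
    qp = np.sqrt(1.0 / vp**2 - p**2)   # vertical P slowness
    t_ps = h * (qs - qp)               # Ps conversion
    t_ppps = h * (qs + qp)             # PpPs multiple
    t_ppss = 2.0 * h * qs              # PpSs + PsPs multiple
    return t_ps, t_ppps, t_ppss
```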

  4. Generative adversarial networks recover features in astrophysical images of galaxies beyond the deconvolution limit

    Science.gov (United States)

    Schawinski, Kevin; Zhang, Ce; Zhang, Hantian; Fowler, Lucas; Santhanam, Gokula Krishnan

    2017-05-01

    Observations of astrophysical objects such as galaxies are limited by various sources of random and systematic noise from the sky background, the optical system of the telescope and the detector used to record the data. Conventional deconvolution techniques are limited in their ability to recover features in imaging data by the Shannon-Nyquist sampling theorem. Here, we train a generative adversarial network (GAN) on a sample of 4550 images of nearby galaxies at 0.01 < z < 0.02 from the Sloan Digital Sky Survey and conduct 10× cross-validation to evaluate the results. We present a method using a GAN trained on galaxy images that can recover features from artificially degraded images with worse seeing and higher noise than the original with a performance that far exceeds simple deconvolution. The ability to better recover detailed features such as galaxy morphology from low signal to noise and low angular resolution imaging data significantly increases our ability to study existing data sets of astrophysical objects as well as future observations with observatories such as the Large Synoptic Sky Telescope (LSST) and the Hubble and James Webb space telescopes.

  5. VATS anatomic lung resections-the European experience

    DEFF Research Database (Denmark)

    Begum, Sofina; Hansen, Henrik Jessen; Papagiannopoulos, Kostas

    2014-01-01

    Video-assisted thoracoscopic surgery (VATS) has undergone significant evolution over several decades. Although endoscopic instruments continued to improve, it was not until 1992 that the first VATS lobectomy for lung cancer was performed. Despite significant seeding of such procedure in several...

  6. The Study of Lung Cancer Personalized Medicine Through Circulating Cell Free DNA Test

    DEFF Research Database (Denmark)

    Ye, Mingzhi

    Given the serious situation of lung cancer in Chinese cancer incidence and mortality, better prognosis and early diagnosis are the key problems. These works revolve around lung cancer genetic profiling, pathway signaling and tumor evolution, targeted therapy and transplant monitoring, and f...

  7. Deconvolution of 238,239,240Pu conversion electron spectra measured with a silicon drift detector

    DEFF Research Database (Denmark)

    Pommé, S.; Marouli, M.; Paepen, J.

    2018-01-01

    Internal conversion electron (ICE) spectra of thin 238,239,240Pu sources, measured with a windowless Peltier-cooled silicon drift detector (SDD), were deconvoluted and relative ICE intensities were derived from the fitted peak areas. Corrections were made for energy dependence of the full...

  8. Contribution of radioisotopes in the diagnosis of lung tumours

    International Nuclear Information System (INIS)

    Raynaud, C.; Crouzel, M.

    1976-01-01

    ¹⁹⁷Hg, a radioisotope of half-life 65 h, injected as acetate builds up in malignant tumours and evolutive inflammatory lesions. In spite of a high proportion of false positives it can lead to a lung cancer diagnosis under well-defined circumstances such as: round lung images, X-ray tuberculosis images persisting after prolonged therapy, treated cancers where it can detect relapses before X-rays, and in the difficult case of benign tumours. In man the uptake kinetics of carrier-free ⁶⁷Cu (half-life 58 h), injected as citrate, are apparently not the same in cancers as in evolutive inflammatory lesions. The false positive yield, though much lower than that observed with ¹⁹⁷Hg, is still too high in this preliminary study. [fr]

  9. Contribution of radioisotopes in the diagnosis of lung tumours

    Energy Technology Data Exchange (ETDEWEB)

    Raynaud, C; Crouzel, M [CEA, 91 - Orsay (France). Service Hospitalier Frederic Joliot

    1976-12-01

    Mercury-197, a radioisotope of half-life 65 h, injected as acetate builds up in malignant tumours and evolutive inflammatory lesions. In spite of a high proportion of false positives it can lead to a lung cancer diagnosis under well-defined circumstances such as: round lung images, X-ray tuberculosis images persisting after prolonged therapy, treated cancers where it can detect relapses before X-rays, and in the difficult case of benign tumours. In man the uptake kinetics of carrier-free ⁶⁷Cu (half-life 58 h), injected as citrate, are apparently not the same in cancers as in evolutive inflammatory lesions. The false positive yield, though much lower than that observed with ¹⁹⁷Hg, is still too high in this preliminary study.

  10. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    Science.gov (United States)

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to a reasonable cost and a simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new peak extraction algorithm operating on the raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model comprising a smooth baseline part plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with an important baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method derivation and discusses some of its interpretations. The algorithm is then described in pseudo-code form, where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure, baseline removal followed by peak extraction. Finally, some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, where peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one where baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra, in which a collection of spectra of spiked proteins was acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.
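
    The paper's optimization is not reproduced here, but the additive "smooth baseline plus sparse peaks convolved with a known peak shape" model it describes can be sketched as a small proximal-gradient (ISTA) fit. The polynomial baseline basis, the circularly shifted peak-shape dictionary and all parameter names are simplifying assumptions for illustration only, and the dense m × m dictionary is practical only for toy spectra.

```python
import numpy as np

def joint_peak_baseline_fit(y, peak_shape, n_baseline=6, lam=0.1, n_iter=500):
    """Jointly fit non-negative sparse peak amplitudes (convolved with a
    known peak shape) and a smooth polynomial baseline to a raw spectrum y,
    under a Gaussian noise model.  peak_shape is an array of length len(y)."""
    m = len(y)
    t = np.arange(m)
    # dictionary columns: the known peak shape centred at every candidate position
    P = np.column_stack([np.roll(peak_shape, s) for s in range(m)])
    # low-order polynomial basis for the smooth baseline
    B = np.column_stack([(t / m) ** d for d in range(n_baseline)])
    A = np.hstack([P, B])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ c - y)
        c = c - grad / L
        # soft-threshold the peak amplitudes only, and keep them non-negative;
        # the baseline coefficients are left unpenalised
        c[:m] = np.maximum(c[:m] - lam / L, 0.0)
    peaks, baseline = c[:m], B @ c[m:]
    return peaks, baseline
```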

  11. A HOS-based blind deconvolution algorithm for the improvement of time resolution of mixed phase low SNR seismic data

    International Nuclear Information System (INIS)

    Hani, Ahmad Fadzil M; Younis, M Shahzad; Halim, M Firdaus M

    2009-01-01

    A blind deconvolution technique using a modified higher order statistics (HOS)-based eigenvector algorithm (EVA) is presented in this paper. The main purpose of the technique is to enable the processing of low SNR short length seismograms. In our study, the seismogram is assumed to be the output of a mixed phase source wavelet (system) driven by a non-Gaussian input signal (due to earth) with additive Gaussian noise. Techniques based on second-order statistics are shown to fail when processing non-minimum phase seismic signals because they only rely on the autocorrelation function of the observed signal. In contrast, existing HOS-based blind deconvolution techniques are suitable in the processing of a non-minimum (mixed) phase system; however, most of them are unable to converge and show poor performance whenever noise dominates the actual signal, especially in the cases where the observed data are limited (few samples). The developed blind equalization technique is primarily based on the EVA for blind equalization, initially to deal with mixed phase non-Gaussian seismic signals. In order to deal with the dominant noise issue and small number of available samples, certain modifications are incorporated into the EVA. For determining the deconvolution filter, one of the modifications is to use more than one higher order cumulant slice in the EVA. This overcomes the possibility of non-convergence due to a low signal-to-noise ratio (SNR) of the observed signal. The other modification conditions the cumulant slice by increasing the power of eigenvalues of the cumulant slice, related to actual signal, and rejects the eigenvalues below the threshold representing the noise. This modification reduces the effect of the availability of a small number of samples and strong additive noise on the cumulant slices. These modifications are found to improve the overall deconvolution performance, with approximately a five-fold reduction in a mean square error (MSE) and a six

  12. Blind deconvolution of time-of-flight mass spectra from atom probe tomography

    International Nuclear Information System (INIS)

    Johnson, L.J.S.; Thuvander, M.; Stiller, K.; Odén, M.; Hultman, L.

    2013-01-01

    A major source of uncertainty in compositional measurements in atom probe tomography stems from the uncertainties of assigning peaks or parts of peaks in the mass spectrum to their correct identities. In particular, peak overlap is a limiting factor, whereas an ideal mass spectrum would have peaks at their correct positions with zero broadening. Here, we report a method to deconvolute the experimental mass spectrum into such an ideal spectrum and a system function describing the peak broadening introduced by the field evaporation and detection of each ion. By making the assumption of a linear and time-invariant behavior, a system of equations is derived that describes the peak shape and peak intensities. The model is fitted to the observed spectrum by minimizing the squared residuals, regularized by the maximum entropy method. For synthetic data perfectly obeying the assumptions, the method recovered peak intensities to within ±0.33 at.%. The application of this model to experimental APT data is exemplified with Fe–Cr data. Knowledge of the peak shape opens up several new possibilities, not just for better overall compositional determination, but, e.g., for the estimation of errors of ranging due to peak overlap or peak separation constrained by isotope abundances. - Highlights: • A method for the deconvolution of atom probe mass spectra is proposed. • Applied to synthetic randomly generated spectra the accuracy was ±0.33 at.%. • Application of the method to an experimental Fe–Cr spectrum is demonstrated

  13. Deconvolution, differentiation and Fourier transformation algorithms for noise-containing data based on splines and global approximation

    NARCIS (Netherlands)

    Wormeester, Herbert; Sasse, A.G.B.M.; van Silfhout, Arend

    1988-01-01

    One of the main problems in the analysis of measured spectra is how to reduce the influence of noise in data processing. We show a deconvolution, a differentiation and a Fourier Transform algorithm that can be run on a small computer (64 K RAM) and suffer less from noise than commonly used routines.

  14. A novel lung slice system with compromised antioxidant defenses

    Energy Technology Data Exchange (ETDEWEB)

    Hardwick, S.J.; Adam, A.; Cohen, G.M. (Univ. of London (England)); Smith, L.L. (Imperial Chemical Industries PLC, Cheshire (England))

    1990-04-01

    In order to facilitate the study of oxidative stress in lung tissue, rat lung slices with impaired antioxidant defenses were prepared and used. Incubation of lung slices with the antineoplastic agent 1,3-bis(2-chloroethyl)-1-nitrosourea (BCNU) (100 μM) in an amino acid-rich medium for 45 min produced a near-maximal (approximately 85%), irreversible inhibition of glutathione reductase, accompanied by only a modest (approximately 15%) decrease in pulmonary nonprotein sulfhydryls (NPSH) and no alteration in intracellular ATP, NADP⁺, and NADPH levels. The amounts of NADP(H), ATP, and NPSH were stable over a 4-hr incubation period following the removal from BCNU. The viability of the system was further evaluated by measuring the rate of evolution of ¹⁴CO₂ from D-[¹⁴C(U)]-glucose. The rates of evolution were almost identical in the compromised system when compared with control slices over a 4-hr time period. By using slices with compromised oxidative defenses, preliminary results have been obtained with paraquat, nitrofurantoin, and 2,3-dimethoxy-1,4-naphthoquinone.

  15. A novel lung slice system with compromised antioxidant defenses

    International Nuclear Information System (INIS)

    Hardwick, S.J.; Adam, A.; Cohen, G.M.; Smith, L.L.

    1990-01-01

    In order to facilitate the study of oxidative stress in lung tissue, rat lung slices with impaired antioxidant defenses were prepared and used. Incubation of lung slices with the antineoplastic agent 1,3-bis(2-chloroethyl)-1-nitrosourea (BCNU) (100 μM) in an amino acid-rich medium for 45 min produced a near-maximal (approximately 85%), irreversible inhibition of glutathione reductase, accompanied by only a modest (approximately 15%) decrease in pulmonary nonprotein sulfhydryls (NPSH) and no alteration in intracellular ATP, NADP⁺, and NADPH levels. The amounts of NADP(H), ATP, and NPSH were stable over a 4-hr incubation period following the removal from BCNU. The viability of the system was further evaluated by measuring the rate of evolution of ¹⁴CO₂ from D-[¹⁴C(U)]-glucose. The rates of evolution were almost identical in the compromised system when compared with control slices over a 4-hr time period. By using slices with compromised oxidative defenses, preliminary results have been obtained with paraquat, nitrofurantoin, and 2,3-dimethoxy-1,4-naphthoquinone.

  16. [Lung transplantation in pulmonary fibrosis and other interstitial lung diseases].

    Science.gov (United States)

    Berastegui, Cristina; Monforte, Victor; Bravo, Carlos; Sole, Joan; Gavalda, Joan; Tenório, Luis; Villar, Ana; Rochera, M Isabel; Canela, Mercè; Morell, Ferran; Roman, Antonio

    2014-09-15

    Interstitial lung disease (ILD) is the second most common indication for lung transplantation (LT) after emphysema. The aim of this study is to review the results of LT for ILD at Hospital Vall d'Hebron (Barcelona, Spain). We retrospectively studied 150 patients (87 [58%] men; mean age 48 years, range 20-67) transplanted between August 1990 and January 2010. One hundred and four (69%) were single lung transplants (SLT) and 46 (31%) bilateral lung transplants (BLT). The postoperative diagnoses were: 94 (63%) usual interstitial pneumonia, 23 (15%) nonspecific interstitial pneumonia, 11 (7%) unclassifiable interstitial pneumonia and 15% miscellaneous. We describe the functional results, complications and survival. Actuarial survival was 87, 70 and 53% at one, 3 and 5 years, respectively. The most frequent causes of death included early graft dysfunction and development of chronic rejection in the form of bronchiolitis obliterans syndrome (BOS). The mean postoperative increase in forced vital capacity and forced expiratory volume in the first second (FEV1) was similar in SLT and BLT. The best FEV1 was reached after 10 (r: 1-36) months. Sixteen percent of patients returned to work. At some point during follow-up, acute rejection was proven histologically in 53 (35%) patients. The prevalence of BOS among survivors was 20% at one year, 45% at 3 years and 63% at 5 years. LT is the best treatment option currently available for ILD in which medical treatment has failed. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  17. A technique for the deconvolution of the pulse shape of acoustic emission signals back to the generating defect source

    International Nuclear Information System (INIS)

    Houghton, J.R.; Packman, P.F.; Townsend, M.A.

    1976-01-01

    Acoustic emission signals recorded after passage through the instrumentation system can be deconvoluted to produce signal traces indicative of those at the generating source, and these traces can be used to identify characteristics of the source

  18. An Algorithm-Independent Analysis of the Quality of Images Produced Using Multi-Frame Blind Deconvolution Algorithms--Conference Proceedings (Postprint)

    National Research Council Canada - National Science Library

    Matson, Charles; Haji, Alim

    2007-01-01

    Multi-frame blind deconvolution (MFBD) algorithms can be used to generate a deblurred image of an object from a sequence of short-exposure and atmospherically-blurred images of the object by jointly estimating the common object...

  19. DECONVOLUTION OF IMAGES FROM BLAST 2005: INSIGHT INTO THE K3-50 AND IC 5146 STAR-FORMING REGIONS

    International Nuclear Information System (INIS)

    Roy, Arabindo; Netterfield, Calvin B.; Ade, Peter A. R.; Griffin, Matthew; Hargrave, Peter C.; Mauskopf, Philip; Bock, James J.; Brunt, Christopher M.; Chapin, Edward L.; Gibb, Andrew G.; Halpern, Mark; Marsden, Gaelen; Devlin, Mark J.; Dicker, Simon R.; Klein, Jeff; France, Kevin; Gundersen, Joshua O.; Hughes, David H.; Martin, Peter G.; Olmi, Luca

    2011-01-01

    We present an implementation of the iterative flux-conserving Lucy-Richardson (L-R) deconvolution method of image restoration for maps produced by the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Compared to the direct Fourier transform method of deconvolution, the L-R operation restores images with better-controlled background noise and increases source detectability. Intermediate iterated images are useful for studying extended diffuse structures, while the later iterations truly enhance point sources to near the designed diffraction limit of the telescope. The L-R method of deconvolution is efficient in resolving compact sources in crowded regions while simultaneously conserving their respective flux densities. We have analyzed its performance and convergence extensively through simulations and cross-correlations of the deconvolved images with available high-resolution maps. We present new science results from two BLAST surveys, in the Galactic regions K3-50 and IC 5146, further demonstrating the benefits of performing this deconvolution. We have resolved three clumps within a radius of 4.'5 inside the star-forming molecular cloud containing K3-50. Combining the well-resolved dust emission map with available multi-wavelength data, we have constrained the spectral energy distributions (SEDs) of five clumps to obtain masses (M), bolometric luminosities (L), and dust temperatures (T). The L-M diagram has been used as a diagnostic tool to estimate the evolutionary stages of the clumps. There are close relationships between dust continuum emission and both 21 cm radio continuum and ¹²CO molecular line emission. The restored extended large-scale structures in the Northern Streamer of IC 5146 have a strong spatial correlation with both SCUBA and high-resolution extinction images. A dust temperature of 12 K has been obtained for the central filament. We report physical properties of ten compact sources, including six associated protostars, by fitting
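
    The core Lucy-Richardson update is compact; below is a minimal 2-D sketch of it, without the masking, error weighting and stopping criteria that a production pipeline such as the one described above would need. Function and variable names are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=50, eps=1e-12):
    """Iterative Lucy-Richardson deconvolution.  For a normalised PSF the
    multiplicative update conserves the total flux of the estimate."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean(), dtype=float)
    for _ in range(n_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(reblurred, eps)   # data / model
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```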

  20. Optimization of deconvolution software used in the study of spectra of soil samples from Madagascar

    International Nuclear Information System (INIS)

    ANDRIAMADY NARIMANANA, S.F.

    2005-01-01

    The aim of this work is to perform the deconvolution of gamma spectra by using the peak deconvolution program. Synthetic spectra, reference materials and ten soil samples with various U-238 activities from three regions of Madagascar were used. This work concerns: soil sample spectra with low activities of about (47 ± 2) Bq·kg⁻¹ from Ankatso, soil sample spectra with average activities of about (125 ± 2) Bq·kg⁻¹ from Antsirabe, and soil sample spectra with high activities of about (21100 ± 120) Bq·kg⁻¹ from Vinaninkarena. Singlet and multiplet peaks with various intensities were found in each soil spectrum. The Interactive Peak Fit (IPF) program in Genie-PC from Canberra Industries allows the deconvolution of many multiplet regions: the quartet within 235–242 keV; Pb-214 and Pb-212 within 294–301 keV; Th-232 daughters within 582–584 keV; Ac-228 within 904–911 keV and 964–970 keV; and Bi-214 within 1401–1408 keV. These peaks were used to quantify the considered radionuclides. However, IPF cannot resolve the Ra-226 peak at 186.1 keV. [fr]

  1. Lunge feeding in early marine reptiles and fast evolution of marine tetrapod feeding guilds.

    Science.gov (United States)

    Motani, Ryosuke; Chen, Xiao-hong; Jiang, Da-yong; Cheng, Long; Tintori, Andrea; Rieppel, Olivier

    2015-03-10

    Traditional wisdom holds that biotic recovery from the end-Permian extinction was slow and gradual, and was not complete until the Middle Triassic. Here, we report that the evolution of marine predator feeding guilds, and their trophic structure, proceeded faster. Marine reptile lineages with unique feeding adaptations emerged during the Early Triassic (about 248 million years ago), including the enigmatic Hupehsuchus that possessed an unusually slender mandible. A new specimen of this genus reveals a well-preserved palate and mandible, which suggest that it was a rare lunge feeder as also occurs in rorqual whales and pelicans. The diversity of feeding strategies among Triassic marine tetrapods reached their peak in the Early Triassic, soon after their first appearance in the fossil record. The diet of these early marine tetrapods most likely included soft-bodied animals that are not preserved as fossils. Early marine tetrapods most likely introduced a new trophic mechanism to redistribute nutrients to the top 10 m of the sea, where the primary productivity is highest. Therefore, a simple recovery to a Permian-like trophic structure does not explain the biotic changes seen after the Early Triassic.

  2. A Convolution Tree with Deconvolution Branches: Exploiting Geometric Relationships for Single Shot Keypoint Detection

    OpenAIRE

    Kumar, Amit; Chellappa, Rama

    2017-01-01

    Recently, Deep Convolution Networks (DCNNs) have been applied to the task of face alignment and have shown potential for learning improved feature representations. Although deeper layers can capture abstract concepts like pose, it is difficult to capture the geometric relationships among the keypoints in DCNNs. In this paper, we propose a novel convolution-deconvolution network for facial keypoint detection. Our model predicts the 2D locations of the keypoints and their individual visibility ...

  3. Phenotypes selected during chronic lung infection in cystic fibrosis patients

    DEFF Research Database (Denmark)

    Ciofu, Oana; Mandsberg, Lotte F; Wang, Hengzhuang

    2012-01-01

    During chronic lung infection of patients with cystic fibrosis, Pseudomonas aeruginosa can survive for long periods of time under the challenging selective pressure imposed by the immune system and antibiotic treatment as a result of its biofilm mode of growth and adaptive evolution mediated by g...... the importance of biofilm prevention strategies by early aggressive antibiotic prophylaxis or therapy before phenotypic diversification during chronic lung infection of patients with cystic fibrosis....

  4. Data matching for free-surface multiple attenuation by multidimensional deconvolution

    Science.gov (United States)

    van der Neut, Joost; Frijlink, Martijn; van Borselen, Roald

    2012-09-01

    A common strategy for surface-related multiple elimination of seismic data is to predict multiples by a convolutional model and subtract these adaptively from the input gathers. Problems can be posed by interfering multiples and primaries. Removing multiples by multidimensional deconvolution (MDD) (inversion) does not suffer from these problems. However, this approach requires data to be consistent, which is often not the case, especially not at interpolated near-offsets. A novel method is proposed to improve data consistency prior to inversion. This is done by backpropagating first-order multiples with a time-gated reference primary event and matching these with early primaries in the input gather. After data matching, multiple elimination by MDD can be applied with a deterministic inversion scheme.

  5. Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors

    OpenAIRE

    Dupé , François-Xavier; Fadili , Jalal M.; Starck , Jean-Luc

    2012-01-01

    In this paper, we propose a Bayesian MAP estimator for solving the deconvolution problems when the observations are corrupted by Poisson noise. Towards this goal, a proper data fidelity term (log-likelihood) is introduced to reflect the Poisson statistics of the noise. On the other hand, as a prior, the images to restore are assumed to be positive and sparsely represented in a dictionary of waveforms such as wavelets or curvelets. Both analysis and synthesis-type spars...

  6. Epithelioid lung haemangioendothelioma

    International Nuclear Information System (INIS)

    Finozzi, V.; Andrade, E.; Campos, N.; Pizarrosa, C.

    2000-01-01

    The first national case of epithelioid lung haemangioendothelioma concerned a 60-year-old woman. The clinical picture and CT findings showed a vascularized tumor producing persistent haemoptoic expectoration. Familiarity with this entity is necessary for the purpose of differential diagnosis. It is a tumor with specific immunohistochemical markers against factor VIII. Treatment consisted of surgical resection. Concerning its classification, it is situated at the limit between benign and malignant; prognosis is usually good, but evolution is slow, extending over 20 years.

  7. Phylogenetic ctDNA analysis depicts early stage lung cancer evolution

    DEFF Research Database (Denmark)

    Abbosh, Christopher; Birkbak, Nicolai Juul; Wilson, Gareth A.

    2017-01-01

    The early detection of relapse following primary surgery for non-small cell lung cancer and the characterization of emerging subclones seeding metastatic sites might offer new therapeutic approaches to limit tumor recurrence. The potential to non-invasively track tumor evolutionary dynamics in ct...

  8. Application of Glow Curve Deconvolution Method to Evaluate Low Dose TLD LiF

    International Nuclear Information System (INIS)

    Kurnia, E; Oetami, H R; Mutiah

    1996-01-01

    Thermoluminescence dosimeters (TLD), especially of LiF:Mg,Ti material, are among the most practical personal dosimeters known to date. Dose measurement below 100 μGy using a TLD reader is very difficult at a high precision level, so software analysis is used to improve the precision of the TLD reader. The objective of the research is to compare three TL glow curve analysis methods for doses in the range of 5 to 250 μGy. The first method is manual analysis, in which dose information is obtained from the area under the glow curve between preselected temperature limits and the background signal is estimated by a second readout following the first. The second method is deconvolution, separating the glow curve mathematically into four peaks, with dose information obtained from the area of peak 5 and the background signal eliminated computationally. The third method is also deconvolution, but the dose is represented by the sum of the areas of peaks 3, 4 and 5. The results show that the sum of peaks 3, 4 and 5 improves reproducibility six-fold compared with manual analysis at a dose of 20 μGy, and reduces the MMD to 10 μGy, rather than 60 μGy with manual analysis or 20 μGy with the peak 5 area method. In terms of linearity, the sum of peaks 3, 4 and 5 yields an exactly linear dose-response curve over the entire dose range.

  9. Increasing the darkfield contrast-to-noise ratio using a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging.

    Science.gov (United States)

    Weber, Thomas; Pelzer, Georg; Bayer, Florian; Horn, Florian; Rieger, Jens; Ritter, André; Zang, Andrea; Durst, Jürgen; Anton, Gisela; Michel, Thilo

    2013-07-29

    A novel information retrieval algorithm for X-ray grating-based phase-contrast imaging, based on the deconvolution of the object and the reference phase stepping curve (PSC) as proposed by Modregger et al., was investigated in this paper. We applied the method for the first time to data obtained with a polychromatic spectrum and compared the results to those obtained by applying the commonly used method based on a Fourier analysis. We confirmed the expectation that both methods deliver the same results for the absorption and the differential phase image. For the darkfield image, a mean contrast-to-noise ratio (CNR) increase by a factor of 1.17 using the new method was found. Furthermore, the dose saving potential was estimated experimentally for the deconvolution method. It is found that for the conventional method a dose higher by a factor of 1.66 is needed to obtain a CNR value similar to that of the novel method. A further analysis of the data revealed that the improvement in CNR and dose efficiency is due to the superior background noise properties of the deconvolution method, but at the cost of comparability between measurements at different applied dose values, as the mean value becomes dependent on the photon statistics used.
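
    For context, the conventional Fourier analysis that the deconvolution-based retrieval is benchmarked against reduces, per pixel, to comparing the zeroth and first Fourier coefficients of the object and reference phase-stepping curves. The sketch below assumes equidistant phase steps over exactly one stepping period; it illustrates the baseline method, not the deconvolution algorithm of the paper.

```python
import numpy as np

def phase_stepping_fourier(psc_obj, psc_ref):
    """Conventional per-pixel Fourier analysis of phase-stepping curves.
    Returns transmission, differential phase and darkfield signals."""
    def first_coeffs(psc):
        f = np.fft.rfft(psc)
        a0 = f[0].real / len(psc)            # mean intensity
        a1 = 2.0 * np.abs(f[1]) / len(psc)   # amplitude of the first harmonic
        phi1 = np.angle(f[1])                # phase of the first harmonic
        return a0, a1, phi1

    a0_o, a1_o, phi_o = first_coeffs(psc_obj)
    a0_r, a1_r, phi_r = first_coeffs(psc_ref)
    transmission = a0_o / a0_r
    diff_phase = phi_o - phi_r
    darkfield = (a1_o / a0_o) / (a1_r / a0_r)   # visibility ratio
    return transmission, diff_phase, darkfield
```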

  10. Ganoderma lucidum targeting lung cancer signaling: A review.

    Science.gov (United States)

    Gill, Balraj Singh; Navgeet; Kumar, Sanjeev

    2017-06-01

    Lung cancer causes substantial mortality in the population, and pharmaceutical companies require new drugs, either synthetic or natural, targeting lung cancer. Conventional therapies cause side effects, and therefore natural products are being explored as therapeutic candidates in lung cancer. Chemical diversity among natural products highlights the impact of evolution and survival of the fittest. One such neglected natural product is Ganoderma lucidum, which has long been used for promoting health and longevity. The major bioconstituents of G. lucidum are mainly terpenes, polysaccharides, and proteins, which have been explored for various activities ranging from apoptosis to autophagy. The bioconstituents of G. lucidum activate plasma membrane receptors and initiate various downstream signaling pathways leading to nuclear factor-κB, phosphoinositide 3-kinase, Akt, and mammalian target of rapamycin in cancer. The bioconstituents regulate the expression of various genes involved in cell cycle, immune response, apoptosis, and autophagy in lung cancer. This review highlights the inextricable role of G. lucidum and its bioconstituents in lung cancer signaling for the first time.

  11. Maximum entropy deconvolution of low count nuclear medicine images

    International Nuclear Information System (INIS)

    McGrath, D.M.

    1998-12-01

    Maximum entropy is applied to the problem of deconvolving nuclear medicine images, with special consideration for very low count data. The physics of the formation of scintigraphic images is described, illustrating the phenomena which degrade planar estimates of the tracer distribution. Various techniques which are used to restore these images are reviewed, outlining the relative merits of each. The development and theoretical justification of maximum entropy as an image processing technique is discussed. Maximum entropy is then applied to the problem of planar deconvolution, highlighting the question of the choice of error parameters for low count data. A novel iterative version of the algorithm is suggested which allows the errors to be estimated from the predicted Poisson mean values. This method is shown to produce the exact results predicted by combining Poisson statistics and a Bayesian interpretation of the maximum entropy approach. A facility for total count preservation has also been incorporated, leading to improved quantification. In order to evaluate this iterative maximum entropy technique, two comparable methods, Wiener filtering and a novel Bayesian maximum likelihood expectation maximisation technique, were implemented. The comparison of results obtained indicated that this maximum entropy approach may produce equivalent or better measures of image quality than the compared methods, depending upon the accuracy of the system model used. The novel Bayesian maximum likelihood expectation maximisation technique was shown to be preferable over many existing maximum a posteriori methods due to its simplicity of implementation. A single parameter is required to define the Bayesian prior, which suppresses noise in the solution and may reduce the processing time substantially. Finally, maximum entropy deconvolution was applied as a pre-processing step in single photon emission computed tomography reconstruction of low count data. Higher contrast results were

  12. Learning High-Order Filters for Efficient Blind Deconvolution of Document Photographs

    KAUST Repository

    Xiao, Lei

    2016-09-16

    Photographs of text documents taken by hand-held cameras can be easily degraded by camera motion during exposure. In this paper, we propose a new method for blind deconvolution of document images. Observing that document images are usually dominated by small-scale high-order structures, we propose to learn a multi-scale, interleaved cascade of shrinkage fields model, which contains a series of high-order filters to facilitate joint recovery of blur kernel and latent image. With extensive experiments, we show that our method produces high quality results and is highly efficient at the same time, making it a practical choice for deblurring high resolution text images captured by modern mobile devices. © Springer International Publishing AG 2016.

  13. Approximate deconvolution model for the simulation of turbulent gas-solid flows: An a priori analysis

    Science.gov (United States)

    Schneiderbauer, Simon; Saeedipour, Mahdi

    2018-02-01

    Highly resolved two-fluid model (TFM) simulations of gas-solid flows in vertical periodic channels have been performed to study closures for the filtered drag force and the Reynolds-stress-like contribution stemming from the convective terms. An approximate deconvolution model (ADM) for the large-eddy simulation of turbulent gas-solid suspensions is detailed and subsequently used to reconstruct those unresolved contributions in an a priori manner. With such an approach, an approximation of the unfiltered solution is obtained by repeated filtering allowing the determination of the unclosed terms of the filtered equations directly. A priori filtering shows that predictions of the ADM model yield fairly good agreement with the fine grid TFM simulations for various filter sizes and different particle sizes. In particular, strong positive correlation (ρ > 0.98) is observed at intermediate filter sizes for all sub-grid terms. Additionally, our study reveals that the ADM results moderately depend on the choice of the filters, such as box and Gaussian filter, as well as the deconvolution order. The a priori test finally reveals that ADM is superior compared to isotropic functional closures proposed recently [S. Schneiderbauer, "A spatially-averaged two-fluid model for dense large-scale gas-solid flows," AIChE J. 63, 3544-3562 (2017)].
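
    To make the reconstruction step concrete, here is a hedged sketch of the approximate-deconvolution idea in its common van Cittert form, not the authors' solver: an approximately unfiltered field is recovered from the filtered one by repeatedly applying the explicit filter G. The box filter, deconvolution order and synthetic field are assumptions for illustration.

    ```python
    # Approximate deconvolution: u* = sum_{k=0..N} (I - G)^k applied to the filtered field.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def adm_reconstruct(filtered_field, filter_size=3, order=5):
        G = lambda f: uniform_filter(f, size=filter_size, mode="wrap")  # box filter, periodic domain
        u_star = filtered_field.copy()
        residual = filtered_field.copy()
        for _ in range(order):
            residual = residual - G(residual)   # builds the (I - G)^k term
            u_star = u_star + residual
        return u_star

    # a priori test on a synthetic solids-fraction field (illustrative only)
    rng = np.random.default_rng(0)
    phi = rng.random((64, 64))                   # stands in for a fine-grid field
    phi_bar = uniform_filter(phi, 3, mode="wrap")
    phi_star = adm_reconstruct(phi_bar)
    print("correlation with fine-grid field:", np.corrcoef(phi.ravel(), phi_star.ravel())[0, 1])
    ```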

  14. Genomics Assisted Ancestry Deconvolution in Grape

    Science.gov (United States)

    Sawler, Jason; Reisch, Bruce; Aradhya, Mallikarjuna K.; Prins, Bernard; Zhong, Gan-Yuan; Schwaninger, Heidi; Simon, Charles; Buckler, Edward; Myles, Sean

    2013-01-01

    The genus Vitis (the grapevine) is a group of highly diverse, diploid woody perennial vines consisting of approximately 60 species from across the northern hemisphere. It is the world’s most valuable horticultural crop with ~8 million hectares planted, most of which is processed into wine. To gain insights into the use of wild Vitis species during the past century of interspecific grape breeding and to provide a foundation for marker-assisted breeding programmes, we present a principal components analysis (PCA) based ancestry estimation method to calculate admixture proportions of hybrid grapes in the United States Department of Agriculture grape germplasm collection using genome-wide polymorphism data. We find that grape breeders have backcrossed to both the domesticated V. vinifera and wild Vitis species and that reasonably accurate genome-wide ancestry estimation can be performed on interspecific Vitis hybrids using a panel of fewer than 50 ancestry informative markers (AIMs). We compare measures of ancestry informativeness used in selecting SNP panels for two-way admixture estimation, and verify the accuracy of our method on simulated populations of admixed offspring. Our method of ancestry deconvolution provides a first step towards selection at the seed or seedling stage for desirable admixture profiles, which will facilitate marker-assisted breeding that aims to introgress traits from wild Vitis species while retaining the desirable characteristics of elite V. vinifera cultivars. PMID:24244717
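
    The following is a hedged sketch of a PCA-based two-way admixture estimate in the spirit of the described method, not the authors' pipeline: hybrids are projected onto the axis joining the two parental centroids in PC space and the ancestry fraction is read off that axis. The genotype matrices, marker counts and simulated F1 offspring are hypothetical.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def admixture_proportion(ref_a, ref_b, hybrids, n_components=2):
        pca = PCA(n_components=n_components)
        pca.fit(np.vstack([ref_a, ref_b]))         # PCs defined by the two reference panels
        a_c = pca.transform(ref_a).mean(axis=0)    # centroid of V. vinifera references
        b_c = pca.transform(ref_b).mean(axis=0)    # centroid of wild Vitis references
        axis = b_c - a_c
        h = pca.transform(hybrids)
        # wild-ancestry fraction = projection onto the centroid-to-centroid axis, clipped to [0, 1]
        frac_b = ((h - a_c) @ axis) / (axis @ axis)
        return np.clip(frac_b, 0.0, 1.0)

    rng = np.random.default_rng(1)
    vinifera = rng.binomial(2, 0.2, size=(30, 50)).astype(float)   # 30 samples x 50 AIMs
    wild = rng.binomial(2, 0.8, size=(30, 50)).astype(float)
    f1 = (vinifera[:10] + wild[:10]) / 2                           # simulated 50/50 offspring
    print(admixture_proportion(vinifera, wild, f1).round(2))       # should cluster near 0.5
    ```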

  15. Genomics assisted ancestry deconvolution in grape.

    Directory of Open Access Journals (Sweden)

    Jason Sawler

    Full Text Available The genus Vitis (the grapevine) is a group of highly diverse, diploid woody perennial vines consisting of approximately 60 species from across the northern hemisphere. It is the world's most valuable horticultural crop with ~8 million hectares planted, most of which is processed into wine. To gain insights into the use of wild Vitis species during the past century of interspecific grape breeding and to provide a foundation for marker-assisted breeding programmes, we present a principal components analysis (PCA) based ancestry estimation method to calculate admixture proportions of hybrid grapes in the United States Department of Agriculture grape germplasm collection using genome-wide polymorphism data. We find that grape breeders have backcrossed to both the domesticated V. vinifera and wild Vitis species and that reasonably accurate genome-wide ancestry estimation can be performed on interspecific Vitis hybrids using a panel of fewer than 50 ancestry informative markers (AIMs). We compare measures of ancestry informativeness used in selecting SNP panels for two-way admixture estimation, and verify the accuracy of our method on simulated populations of admixed offspring. Our method of ancestry deconvolution provides a first step towards selection at the seed or seedling stage for desirable admixture profiles, which will facilitate marker-assisted breeding that aims to introgress traits from wild Vitis species while retaining the desirable characteristics of elite V. vinifera cultivars.

  16. Full cycle rapid scan EPR deconvolution algorithm.

    Science.gov (United States)

    Tseytlin, Mark

    2017-08-01

    Rapid scan electron paramagnetic resonance (RS EPR) is a continuous-wave (CW) method that combines narrowband excitation and broadband detection. Sinusoidal magnetic field scans that span the entire EPR spectrum cause electron spin excitations twice during the scan period. Periodic transient RS signals are digitized and time-averaged. Deconvolution of the absorption spectrum from the measured full-cycle signal is an ill-posed problem that does not have a stable solution, because the magnetic field passes the same EPR line twice per sinusoidal scan, during the up- and down-field passages. As a result, RS signals consist of two contributions that need to be separated and postprocessed individually. Deconvolution of either of the contributions is a well-posed problem that has a stable solution. The current version of the RS EPR algorithm solves the separation problem by cutting the full-scan signal into two half-period pieces. This imposes a constraint on the experiment: the EPR signal must completely decay by the end of each half-scan in order not to be truncated. The constraint limits the maximum scan frequency and, therefore, the RS signal-to-noise gain. Faster scans permit the use of higher excitation powers without saturating the spin system, translating into a higher EPR sensitivity. A stable, full-scan algorithm is described in this paper that does not require truncation of the periodic response. This algorithm utilizes the additive property of linear systems: the response to a sum of two inputs is equal to the sum of the responses to each of the inputs separately. Based on this property, the mathematical model for CW RS EPR can be replaced by that of a sum of two independent full-cycle pulsed field-modulated experiments. In each of these experiments, the excitation power equals zero during either the up- or down-field scan. The full-cycle algorithm permits approaching the upper theoretical scan frequency limit; the transient spin system response must decay within the scan

  17. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools

    NARCIS (Netherlands)

    Bade, R.; Causanilles, A.; Emke, E.; Bijlsma, L.; Sancho, J.V.; Hernandez, F.; de Voogt, P.

    2016-01-01

    A screening approach was applied to influent and effluent wastewater samples. After injection into an LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of >

  18. Application of Deconvolution Algorithm of Point Spread Function in Improving Image Quality: An Observer Preference Study on Chest Radiography.

    Science.gov (United States)

    Chae, Kum Ju; Goo, Jin Mo; Ahn, Su Yeon; Yoo, Jin Young; Yoon, Soon Ho

    2018-01-01

    To evaluate observer preference for the image quality of chest radiographs processed with a deconvolution algorithm of the point spread function (PSF) (TRUVIEW ART algorithm, DRTECH Corp.) compared with original chest radiographs for visualization of anatomic regions of the chest. Fifty prospectively enrolled pairs of posteroanterior chest radiographs, collected with the standard protocol and with the additional TRUVIEW ART algorithm, were compared by four chest radiologists. This algorithm corrects scattered signals generated by a scintillator. Readers independently evaluated the visibility of 10 anatomical regions and overall image quality on a 5-point preference scale. The significance of differences in reader preference was tested with Wilcoxon's signed rank test. All four readers preferred the images processed with the algorithm to those without it for all 10 anatomical regions (mean, 3.6; range, 3.2-4.0), and the differences were statistically significant. The visibility of chest anatomical structures with the deconvolution algorithm of the PSF was therefore superior to that of the original chest radiography.

  19. Analysis of soda-lime glasses using non-negative matrix factor deconvolution of Raman spectra

    OpenAIRE

    Woelffel , William; Claireaux , Corinne; Toplis , Michael J.; Burov , Ekaterina; Barthel , Etienne; Shukla , Abhay; Biscaras , Johan; Chopinet , Marie-Hélène; Gouillart , Emmanuelle

    2015-01-01

    International audience; Novel statistical analysis and machine learning algorithms are proposed for the deconvolution and interpretation of Raman spectra of silicate glasses in the Na2O-CaO-SiO2 system. Raman spectra are acquired along diffusion profiles of three pairs of glasses centered around an average composition of 69.9 wt.% SiO2, 12.7 wt.% CaO, 16.8 wt.% Na2O. The shape changes of the Raman spectra across the compositional domain are analyzed using a combination of principal...
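
    As a hedged illustration of the non-negative factorisation step (an assumed workflow, not the published code), a stack of Raman spectra can be decomposed into a few non-negative end-member spectra and their mixing weights. The file name and number of components below are hypothetical.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    spectra = np.loadtxt("raman_profile.txt")       # shape (n_spectra, n_wavenumbers), non-negative
    model = NMF(n_components=4, init="nndsvda", max_iter=2000, random_state=0)
    weights = model.fit_transform(spectra)          # (n_spectra, 4) mixing coefficients
    components = model.components_                  # (4, n_wavenumbers) end-member spectra
    print("reconstruction error:", model.reconstruction_err_)
    ```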

  20. Improved Transient Response Estimations in Predicting 40 Hz Auditory Steady-State Response Using Deconvolution Methods

    Directory of Open Access Journals (Sweden)

    Xiaodan Tan

    2017-12-01

    Full Text Available The auditory steady-state response (ASSR) is one of the main approaches in the clinic for health screening and frequency-specific hearing assessment. However, its generation mechanism is still of much controversy. In the present study, the linear superposition hypothesis for the generation of ASSRs was investigated by comparing the relationships between the classical 40 Hz ASSR and three synthetic ASSRs obtained from three different templates for the transient auditory evoked potential (AEP). These three AEPs are the traditional AEP at 5 Hz and two 40 Hz AEPs derived from two deconvolution algorithms using stimulus sequences, i.e., continuous loop averaging deconvolution (CLAD) and multi-rate steady-state average deconvolution (MSAD). CLAD requires irregular inter-stimulus intervals (ISIs) in the sequence while MSAD uses the same ISIs but evenly-spaced stimulus sequences, which mimics the classical 40 Hz ASSR. It has been reported that these reconstructed templates show similar patterns but significant differences in morphology and distinct frequency characteristics in synthetic ASSRs. The prediction accuracies of ASSR using these templates show significant differences (p < 0.05) in 45.95, 36.28, and 10.84% of total time points within four cycles of ASSR for the traditional, CLAD, and MSAD templates, respectively, as compared with the classical 40 Hz ASSR, and the ASSR synthesized from the MSAD transient AEP shows the best similarity. Such a similarity is also demonstrated at the individual level only for MSAD, showing no statistically significant difference (Hotelling's T2 test, T2 = 6.96, F = 0.80, p = 0.592) as compared with the classical 40 Hz ASSR. The present results indicate that both stimulation rate and the sequencing factor (ISI variation) affect transient AEP reconstructions from steady-state stimulation protocols. Furthermore, both the auditory brainstem response (ABR) and the middle latency response (MLR) are observed to contribute to the composition of the ASSR but

  1. Visualizing Escherichia coli sub-cellular structure using sparse deconvolution Spatial Light Interference Tomography.

    Directory of Open Access Journals (Sweden)

    Mustafa Mir

    Full Text Available Studying the 3D sub-cellular structure of living cells is essential to our understanding of biological function. However, tomographic imaging of live cells is challenging mainly because they are transparent, i.e., weakly scattering structures. Therefore, this type of imaging has been implemented largely using fluorescence techniques. While confocal fluorescence imaging is a common approach to achieve sectioning, it requires fluorescence probes that are often harmful to the living specimen. On the other hand, by using the intrinsic contrast of the structures it is possible to study living cells in a non-invasive manner. One method that provides high-resolution quantitative information about nanoscale structures is a broadband interferometric technique known as Spatial Light Interference Microscopy (SLIM). In addition to rendering quantitative phase information, when combined with a high numerical aperture objective, SLIM also provides excellent depth sectioning capabilities. However, like in all linear optical systems, SLIM's resolution is limited by diffraction. Here we present a novel 3D field deconvolution algorithm that exploits the sparsity of phase images and renders images with resolution beyond the diffraction limit. We employ this label-free method, called deconvolution Spatial Light Interference Tomography (dSLIT), to visualize coiled sub-cellular structures in E. coli cells which are most likely the cytoskeletal MreB protein and the division site regulating MinCDE proteins. Previously these structures have only been observed using specialized strains and plasmids and fluorescence techniques. Our results indicate that dSLIT can be employed to study such structures in a practical and non-invasive manner.

  2. Deconvolution of gamma energy spectra from NaI (Tl) detector using the Nelder-Mead zero order optimisation method

    International Nuclear Information System (INIS)

    RAVELONJATO, R.H.M.

    2010-01-01

    The aim of this work is to develop a method for deconvolution of gamma-ray spectra from a NaI(Tl) detector. Deconvolution programs written in Matlab 7.6 using the Nelder-Mead method were developed to determine multiplet shape parameters. The simulation parameters were the centroid distance/FWHM ratio, the signal/continuum ratio and the counting rate. The test using a synthetic spectrum was built with 3σ uncertainty. The tests gave suitable results for centroid distance/FWHM ratio ≥ 2, signal/continuum ratio ≥ 2 and a counting level of 100 counts. The technique was applied to measure the activity of soil and rock samples from the Anosy region. The rock activity varies from (140±8) Bq.kg-1 to (190±17) Bq.kg-1 for potassium-40, from (343±7) Bq.kg-1 to (881±6) Bq.kg-1 for thorium-232 and from (100±3) Bq.kg-1 to (164±4) Bq.kg-1 for uranium-238. The soil activity varies from (148±1) Bq.kg-1 to (652±31) Bq.kg-1 for potassium-40, from (1100±11) Bq.kg-1 to (5700±40) Bq.kg-1 for thorium-232 and from (190±2) Bq.kg-1 to (779±15) Bq.kg-1 for uranium-238. Among the 11 samples, the discrepancies in activity values compared with a high-resolution HPGe detector vary from 0.62% to 42.86%. The fitting residuals are between -20% and +20%, and the Figure of Merit values are around 5%. These results show that the method developed is reliable for this activity range and that the convergence is good. A NaI(Tl) detector combined with the deconvolution method developed may therefore replace an HPGe detector within an acceptable limit, provided the identification of each nuclide in the radioactive series is not required [fr]
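
    An illustrative sketch of the underlying idea only (channel numbers, starting values and file name are invented): a NaI(Tl) doublet is resolved by least-squares fitting two Gaussian peaks plus a linear continuum with the derivative-free Nelder-Mead simplex method.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def model(p, ch):
        a1, c1, s1, a2, c2, s2, b0, b1 = p
        peaks = (a1 * np.exp(-0.5 * ((ch - c1) / s1) ** 2)
                 + a2 * np.exp(-0.5 * ((ch - c2) / s2) ** 2))
        return peaks + b0 + b1 * ch              # two Gaussians on a linear continuum

    def objective(p, ch, counts):
        return np.sum((counts - model(p, ch)) ** 2)

    channels = np.arange(200, 360)
    counts = np.loadtxt("multiplet.txt")         # hypothetical spectrum segment, same length
    p0 = [500, 250, 8, 300, 290, 8, 20, 0.0]     # rough initial guess for the simplex
    res = minimize(objective, p0, args=(channels, counts), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-3, "fatol": 1e-3})
    a1, c1, s1, a2, c2, s2, *_ = res.x
    print(f"peak areas: {a1 * s1 * np.sqrt(2*np.pi):.0f}, {a2 * s2 * np.sqrt(2*np.pi):.0f}")
    ```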

  3. Deconvolution of 2D coincident Doppler broadening spectroscopy using the Richardson-Lucy algorithm

    International Nuclear Information System (INIS)

    Zhang, J.D.; Zhou, T.J.; Cheung, C.K.; Beling, C.D.; Fung, S.; Ng, M.K.

    2006-01-01

    Coincident Doppler Broadening Spectroscopy (CDBS) measurements are popular in positron solid-state studies of materials. By utilizing the instrumental resolution function obtained from a gamma line close in energy to the 511 keV annihilation line, it is possible to significantly enhance the quality of the CDBS spectra using deconvolution algorithms. In this paper, we compare two algorithms, namely the Non-Negative Least Squares (NNLS) regularized method and the Richardson-Lucy (RL) algorithm. The latter, which is based on the method of maximum likelihood, is found to give superior results to the regularized least-squares algorithm and requires significantly less computer processing time
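
    A compact Richardson-Lucy sketch for 2D spectra (not the authors' implementation): the estimate is updated iteratively using the measured resolution function as the kernel. The input file names and iteration count are hypothetical.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(measured, psf, n_iter=50, eps=1e-12):
        psf = psf / psf.sum()
        psf_mirror = psf[::-1, ::-1]
        estimate = np.full_like(measured, measured.mean(), dtype=float)
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = measured / np.maximum(blurred, eps)    # data / model
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate

    # cdbs = np.loadtxt("cdbs_2d.txt")           # hypothetical 2D coincidence spectrum
    # irf = np.loadtxt("resolution_2d.txt")      # resolution function from a nearby gamma line
    # deconvolved = richardson_lucy(cdbs, irf, n_iter=100)
    ```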

  4. The airway microbiota in early cystic fibrosis lung disease.

    Science.gov (United States)

    Frayman, Katherine B; Armstrong, David S; Grimwood, Keith; Ranganathan, Sarath C

    2017-11-01

    Infection plays a critical role in the pathogenesis of cystic fibrosis (CF) lung disease. Over the past two decades, the application of molecular and extended culture-based techniques to microbial analysis has changed our understanding of the lungs in both health and disease. CF lung disease is a polymicrobial disorder, with obligate and facultative anaerobes recovered alongside traditional pathogens in varying proportions, with some differences observed to correlate with disease stage. While healthy lungs are not sterile, differences between the lower airway microbiota of individuals with CF and disease-controls are already apparent in childhood. Understanding the evolution of the CF airway microbiota, and its relationship with clinical treatments and outcome at each disease stage, will improve our understanding of the pathogenesis of CF lung disease and potentially inform clinical management. This review summarizes current knowledge of the early development of the respiratory microbiota in healthy children and then discusses what is known about the airway microbiota in individuals with CF, including how it evolves over time and where future research priorities lie. © 2017 Wiley Periodicals, Inc.

  5. Deconvolution analysis of 99mTc-methylene diphosphonate kinetics in metabolic bone disease

    Energy Technology Data Exchange (ETDEWEB)

    Knop, J.; Kroeger, E.; Stritzke, P.; Schneider, C.; Kruse, H.P.

    1981-02-01

    The kinetics of 99mTc-methylene diphosphonate (MDP) and 47Ca were studied in three patients with osteoporosis, three patients with hyperparathyroidism, and two patients with osteomalacia. The activities of 99mTc-MDP were recorded in the lumbar spine, paravertebral soft tissues, and in venous blood samples for 1 h after injection. The results were submitted to deconvolution analysis to determine regional bone accumulation rates. 47Ca kinetics were analysed by a linear two-compartment model quantitating short-term mineral exchange, exchangeable bone calcium, and calcium accretion. The 99mTc-MDP accumulation rates were small in osteoporosis, greater in hyperparathyroidism, and greatest in osteomalacia. No correlations were obtained between 99mTc-MDP bone accumulation rates and the results of 47Ca kinetics. However, there was a significant relationship between the level of serum alkaline phosphatase and bone accumulation rates (R = 0.71, P < 0.025). As a result, deconvolution analysis of regional 99mTc-MDP kinetics in dynamic bone scans might be useful to quantitate osseous tracer accumulation in metabolic bone disease. The lack of correlation between the results of 99mTc-MDP kinetics and 47Ca kinetics might suggest a preferential binding of 99mTc-MDP to the organic matrix of the bone, as has been suggested by other authors on the basis of experimental and clinical investigations.

  6. Mammographic image restoration using maximum entropy deconvolution

    International Nuclear Information System (INIS)

    Jannetta, A; Jackson, J C; Kotre, C J; Birch, I P; Robson, K J; Padgett, R

    2004-01-01

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization

  7. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann

    2005-01-01

    Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas...... of the data for high-resolution studies such as annual layer counting. The presented method uses deconvolution techniques and is robust to the presence of noise in the measurements. If integrated into the data processing, it requires no additional data collection. The method is applied to selected ice core...

  8. Seismic Input Motion Determined from a Surface-Downhole Pair of Sensors: A Constrained Deconvolution Approach

    OpenAIRE

    Dino Bindi; Stefano Parolai; M. Picozzi; A. Ansal

    2010-01-01

    We apply a deconvolution approach to the problem of determining the input motion at the base of an instrumented borehole using only a pair of recordings, one at the borehole surface and the other at its bottom. To stabilize the bottom-to-surface spectral ratio, we apply an iterative regularization algorithm that allows us to constrain the solution to be positively defined and to have a finite time duration. Through the analysis of synthetic data, we show that the method is capable...

  9. Memory-effect based deconvolution microscopy for super-resolution imaging through scattering media

    Science.gov (United States)

    Edrei, Eitan; Scarcelli, Giuliano

    2016-09-01

    High-resolution imaging through turbid media is a fundamental challenge of optical sciences that has attracted a lot of attention in recent years for its wide range of potential applications. Here, we demonstrate that the resolution of imaging systems looking behind a highly scattering medium can be improved below the diffraction-limit. To achieve this, we demonstrate a novel microscopy technique enabled by the optical memory effect that uses a deconvolution image processing and thus it does not require iterative focusing, scanning or phase retrieval procedures. We show that this newly established ability of direct imaging through turbid media provides fundamental and practical advantages such as three-dimensional refocusing and unambiguous object reconstruction.

  10. Deconvolution based attenuation correction for time-of-flight positron emission tomography

    Science.gov (United States)

    Lee, Nam-Yong

    2017-10-01

    For an accurate quantitative reconstruction of the radioactive tracer distribution in positron emission tomography (PET), we need to take into account the attenuation of the photons by the tissues. For this purpose, we propose an attenuation correction method for the case when a direct measurement of the attenuation distribution in the tissues is not available. The proposed method can determine the attenuation factor up to a constant multiple by exploiting the consistency condition that the exact deconvolution of a noise-free time-of-flight (TOF) sinogram must satisfy. Simulation studies show that the proposed method corrects attenuation artifacts quite accurately for TOF sinograms over a wide range of temporal resolutions and noise levels, and improves the image reconstruction for TOF sinograms of higher temporal resolutions by providing more accurate attenuation correction.

  11. Pixel-by-pixel mean transit time without deconvolution.

    Science.gov (United States)

    Dobbeleir, Andre A; Piepsz, Amy; Ham, Hamphrey R

    2008-04-01

    The mean transit time (MTT) within a kidney is given by the integral of the renal activity on a well-corrected renogram between time zero and time t divided by the integral of the plasma activity between zero and t, provided that t is close to infinity. However, as the data acquisition of a renogram is finite, the MTT calculated using this approach might underestimate the true MTT. To evaluate the degree of this underestimation we conducted a simulation study. One thousand renograms were created by convolving various plasma curves obtained from patients with different renal clearance levels with simulated retention curves having different shapes and mean transit times. For a 20 min renogram, the calculated MTT started to underestimate the true MTT when the MTT was higher than 6 min; the longer the MTT, the greater the underestimation. Up to an MTT value of 6 min, the error in the MTT estimation is negligible. As normal cortical transit is less than 2 min, this approach is used in patients to calculate the pixel-by-pixel cortical mean transit time and to create an MTT parametric image without deconvolution.
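
    A minimal numerical sketch of the described estimate (not the authors' software): MTT ≈ ∫R(t)dt / ∫P(t)dt over the acquisition, for a background-corrected renal curve R and a plasma curve P sampled on the same time grid. The time grid and file names are hypothetical.

    ```python
    import numpy as np

    def mean_transit_time(t, renogram, plasma):
        """t in minutes; renogram and plasma sampled on the same time grid."""
        return np.trapz(renogram, t) / np.trapz(plasma, t)

    t = np.arange(0, 20, 1 / 6)                   # 20 min acquisition, 10 s frames (illustrative)
    renogram = np.loadtxt("kidney_curve.txt")     # hypothetical corrected renal curve
    plasma = np.loadtxt("plasma_curve.txt")       # hypothetical plasma activity curve
    print(f"estimated MTT: {mean_transit_time(t, renogram, plasma):.1f} min")
    ```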

  12. Evolution of the respiratory function after irradiation of the two lungs (about 50 cases)

    International Nuclear Information System (INIS)

    Boulier, Alain.

    1976-01-01

    Whole-chest irradiation to a dose of 870 rets, i.e., 1500 rads in 4 sessions over 7 days, causes only minimal functional reduction in patients whose pulmonary function was normal prior to radiation therapy. This reduction results from the combination of a mild restrictive syndrome (decreased vital capacity and residual volume) and a slight impairment in gas transfer. The changes in gas transfer do not seem to be related to the restrictive syndrome; they appear to be due to changes in the gas exchange zone other than a restriction of the gas exchange surface. A comparison of the results with those in the literature shows a distinct relationship between the dose delivered to the lung and the functional reduction in the gas transfer zone: the reduction (DT) increases exponentially with the biologically active dose (in rets). The lung tolerance dose calculated from the results of lung function studies corresponds to that evaluated by Abbatucci et al. on the basis of clinical and radiological criteria; it is very close to 900 rets. An increase in this dose would inevitably result in a deterioration in function that would rapidly become too severe: a 20% reduction in alveolar-arterial 'ductance' would be too great even for patients whose lung function was normal prior to radiation therapy. The recommended total dose of 870 rets already exposes the patient to a risk of a 5% (±10%) reduction in gas exchange. Function studies prior to radiation therapy are indispensable: the radiotherapist remains, of course, the sole judge of the advisability of lung irradiation, but it is imperative that the physiologist participate in the post-irradiation follow-up [fr]

  13. The deconvolution of Doppler-broadened positron annihilation measurements using fast Fourier transforms and power spectral analysis

    International Nuclear Information System (INIS)

    Schaffer, J.P.; Shaughnessy, E.J.; Jones, P.L.

    1984-01-01

    A deconvolution procedure which corrects Doppler-broadened positron annihilation spectra for instrument resolution is described. The method employs fast Fourier transforms, is model independent, and does not require iteration. The mathematical difficulties associated with the incorrectly posed first order Fredholm integral equation are overcome by using power spectral analysis to select a limited number of low frequency Fourier coefficients. The FFT/power spectrum method is then demonstrated for an irradiated high purity single crystal sapphire sample. (orig.)
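
    The following is a hedged sketch of the frequency-domain idea (illustrative, not the original code): the FFT of the measured spectrum is divided by the FFT of the resolution function, but only a small number of low-frequency coefficients are kept, which is the role played by the power-spectrum selection. The file names and cutoff are hypothetical.

    ```python
    import numpy as np

    def fft_deconvolve(measured, resolution, n_keep=40):
        # shift the resolution function so its peak sits at index 0 (avoids a phase shift)
        irf = np.roll(resolution / resolution.sum(), -int(np.argmax(resolution)))
        M = np.fft.rfft(measured)
        R = np.fft.rfft(irf)
        D = np.zeros_like(M)
        D[:n_keep] = M[:n_keep] / R[:n_keep]      # keep only low-frequency coefficients
        return np.fft.irfft(D, n=len(measured))

    # spectrum = np.loadtxt("doppler.txt")        # hypothetical annihilation line
    # irf = np.loadtxt("resolution.txt")          # instrument resolution function, same length
    # corrected = fft_deconvolve(spectrum, irf, n_keep=30)
    ```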

  14. Intersections of lung progenitor cells, lung disease and lung cancer.

    Science.gov (United States)

    Kim, Carla F

    2017-06-30

    The use of stem cell biology approaches to study adult lung progenitor cells and lung cancer has brought a variety of new techniques to the field of lung biology and has elucidated new pathways that may be therapeutic targets in lung cancer. Recent results have begun to identify the ways in which different cell populations interact to regulate progenitor activity, and this has implications for the interventions that are possible in cancer and in a variety of lung diseases. Today's better understanding of the mechanisms that regulate lung progenitor cell self-renewal and differentiation, including understanding how multiple epigenetic factors affect lung injury repair, holds the promise for future better treatments for lung cancer and for optimising the response to therapy in lung cancer. Working between platforms in sophisticated organoid culture techniques, genetically engineered mouse models of injury and cancer, and human cell lines and specimens, lung progenitor cell studies can begin with basic biology, progress to translational research and finally lead to the beginnings of clinical trials. Copyright ©ERS 2017.

  15. Intersections of lung progenitor cells, lung disease and lung cancer

    Directory of Open Access Journals (Sweden)

    Carla F. Kim

    2017-06-01

    Full Text Available The use of stem cell biology approaches to study adult lung progenitor cells and lung cancer has brought a variety of new techniques to the field of lung biology and has elucidated new pathways that may be therapeutic targets in lung cancer. Recent results have begun to identify the ways in which different cell populations interact to regulate progenitor activity, and this has implications for the interventions that are possible in cancer and in a variety of lung diseases. Today's better understanding of the mechanisms that regulate lung progenitor cell self-renewal and differentiation, including understanding how multiple epigenetic factors affect lung injury repair, holds the promise for future better treatments for lung cancer and for optimising the response to therapy in lung cancer. Working between platforms in sophisticated organoid culture techniques, genetically engineered mouse models of injury and cancer, and human cell lines and specimens, lung progenitor cell studies can begin with basic biology, progress to translational research and finally lead to the beginnings of clinical trials.

  16. Determining mineralogical variations of aeolian deposits using thermal infrared emissivity and linear deconvolution methods

    Science.gov (United States)

    Hubbard, Bernard E.; Hooper, Donald M.; Solano, Federico; Mars, John C.

    2018-01-01

    We apply linear deconvolution methods to derive mineral and glass proportions for eight field sample training sites at seven dune fields: (1) Algodones, California; (2) Big Dune, Nevada; (3) Bruneau, Idaho; (4) Great Kobuk Sand Dunes, Alaska; (5) Great Sand Dunes National Park and Preserve, Colorado; (6) Sunset Crater, Arizona; and (7) White Sands National Monument, New Mexico. These dune fields were chosen because they represent a wide range of mineral grain mixtures and allow us to gauge a better understanding of both compositional and sorting effects within terrestrial and extraterrestrial dune systems. We also use actual ASTER TIR emissivity imagery to map the spatial distribution of these minerals throughout the seven dune fields and evaluate the effects of degraded spectral resolution on the accuracy of mineral abundances retrieved. Our results show that hyperspectral data convolutions of our laboratory emissivity spectra outperformed multispectral data convolutions of the same data with respect to the mineral, glass and lithic abundances derived. Both the number and wavelength position of spectral bands greatly impacts the accuracy of linear deconvolution retrieval of feldspar proportions (e.g. K-feldspar vs. plagioclase) especially, as well as the detection of certain mafic and carbonate minerals. In particular, ASTER mapping results show that several of the dune sites display patterns such that less dense minerals typically have higher abundances near the center of the active and most evolved dunes in the field, while more dense minerals and glasses appear to be more abundant along the margins of the active dune fields.
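
    As a hedged sketch of the linear unmixing step (not the authors' processing chain): non-negative abundances are found whose weighted sum of laboratory end-member emissivity spectra best matches a measured spectrum, then normalised to fractions. The file names and array shapes are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    endmembers = np.loadtxt("endmember_emissivities.txt")   # shape (n_bands, n_minerals), hypothetical
    pixel = np.loadtxt("pixel_emissivity.txt")              # measured spectrum, shape (n_bands,)

    abundances, residual_norm = nnls(endmembers, pixel)     # non-negative least squares
    fractions = abundances / abundances.sum()
    print("modelled fractions:", np.round(fractions, 3), " residual norm:", residual_norm)
    ```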

  17. Determining mineralogical variations of aeolian deposits using thermal infrared emissivity and linear deconvolution methods

    Science.gov (United States)

    Hubbard, Bernard E.; Hooper, Donald M.; Solano, Federico; Mars, John C.

    2018-02-01

    We apply linear deconvolution methods to derive mineral and glass proportions for eight field sample training sites at seven dune fields: (1) Algodones, California; (2) Big Dune, Nevada; (3) Bruneau, Idaho; (4) Great Kobuk Sand Dunes, Alaska; (5) Great Sand Dunes National Park and Preserve, Colorado; (6) Sunset Crater, Arizona; and (7) White Sands National Monument, New Mexico. These dune fields were chosen because they represent a wide range of mineral grain mixtures and allow us to gauge a better understanding of both compositional and sorting effects within terrestrial and extraterrestrial dune systems. We also use actual ASTER TIR emissivity imagery to map the spatial distribution of these minerals throughout the seven dune fields and evaluate the effects of degraded spectral resolution on the accuracy of mineral abundances retrieved. Our results show that hyperspectral data convolutions of our laboratory emissivity spectra outperformed multispectral data convolutions of the same data with respect to the mineral, glass and lithic abundances derived. Both the number and wavelength position of spectral bands greatly impacts the accuracy of linear deconvolution retrieval of feldspar proportions (e.g. K-feldspar vs. plagioclase) especially, as well as the detection of certain mafic and carbonate minerals. In particular, ASTER mapping results show that several of the dune sites display patterns such that less dense minerals typically have higher abundances near the center of the active and most evolved dunes in the field, while more dense minerals and glasses appear to be more abundant along the margins of the active dune fields.

  18. Multi-kernel deconvolution for contrast improvement in a full field imaging system with engineered PSFs using conical diffraction

    Science.gov (United States)

    Enguita, Jose M.; Álvarez, Ignacio; González, Rafael C.; Cancelas, Jose A.

    2018-01-01

    The problem of restoring a high-resolution image from several degraded versions of the same scene (deconvolution) has been receiving attention in recent years in fields such as optics and computer vision. Deconvolution methods are usually based on sets of images taken with small (sub-pixel) displacements or slightly different focus. Techniques based on sets of images obtained with different point-spread-functions (PSFs) engineered by an optical system are less popular and mostly restricted to microscopic systems, where a spot of light is projected onto the sample under investigation, which is then scanned point-by-point. In this paper, we use the effect of conical diffraction to shape the PSFs in a full-field macroscopic imaging system. We describe a series of simulations and real experiments that help to evaluate the possibilities of the system, showing the enhancement in image contrast even at frequencies that are strongly filtered by the lens transfer function or when sampling near the Nyquist frequency. Although the results are preliminary and there is room to optimize the prototype, the idea shows promise to overcome the limitations of image sensor technology in many fields, such as forensic, medical, satellite, or scientific imaging.

  19. Thermogravimetric pyrolysis kinetics of bamboo waste via Asymmetric Double Sigmoidal (Asym2sig) function deconvolution.

    Science.gov (United States)

    Chen, Chuihan; Miao, Wei; Zhou, Cheng; Wu, Hongjuan

    2017-02-01

    The thermogravimetric kinetics of bamboo waste (BW) pyrolysis have been studied using Asymmetric Double Sigmoidal (Asym2sig) function deconvolution. Through deconvolution, BW pyrolytic profiles could be separated well into three reactions, each of which corresponded to pseudo hemicellulose (P-HC), pseudo cellulose (P-CL), and pseudo lignin (P-LG) decomposition. Based on the Friedman method, the apparent activation energies of P-HC, P-CL, and P-LG were found to be 175.6 kJ/mol, 199.7 kJ/mol, and 158.4 kJ/mol, respectively. The energy compensation effects (ln k0,z vs. Ez) of the pseudo components were highly linear, from which pre-exponential factors (k0) were determined as 6.22E+11 s^-1 (P-HC), 4.50E+14 s^-1 (P-CL) and 1.3E+10 s^-1 (P-LG). Integral master-plots results showed that the pyrolytic mechanisms of P-HC, P-CL, and P-LG were reaction-order models of f(α)=(1-α)^2, f(α)=1-α and f(α)=(1-α)^n (n=6-8), respectively. The mechanisms of P-HC and P-CL could be further reconstructed to n-th order Avrami-Erofeyev models of f(α)=0.62(1-α)[-ln(1-α)]^-0.61 (n=0.62) and f(α)=1.08(1-α)[-ln(1-α)]^0.074 (n=1.08). A two-step reaction was more suitable for P-LG pyrolysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
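
    A sketch of the deconvolution step only (all parameter values are invented): the DTG curve is represented as the sum of three asymmetric double-sigmoidal components, one per pseudo component, and fitted by non-linear least squares. One common parameterisation of the Asym2sig function is assumed here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def asym2sig(T, A, Tc, w1, w2, w3):
        rise = 1.0 / (1.0 + np.exp(-(T - Tc + w1 / 2) / w2))
        fall = 1.0 - 1.0 / (1.0 + np.exp(-(T - Tc - w1 / 2) / w3))
        return A * rise * fall

    def dtg_model(T, *p):
        return sum(asym2sig(T, *p[i:i + 5]) for i in range(0, 15, 5))   # P-HC + P-CL + P-LG

    T = np.linspace(150, 600, 900)               # temperature axis (deg C), illustrative
    dtg = np.loadtxt("bamboo_dtg.txt")           # hypothetical mass-loss-rate curve
    p0 = [0.2, 300, 60, 15, 15,                  # pseudo hemicellulose (rough guess)
          0.5, 350, 40, 10, 10,                  # pseudo cellulose (rough guess)
          0.1, 420, 150, 40, 40]                 # pseudo lignin (rough guess)
    popt, _ = curve_fit(dtg_model, T, dtg, p0=p0, maxfev=50000)
    print("fitted peak temperatures:", popt[1], popt[6], popt[11])
    ```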

  20. Understanding Cancer Genome and Its Evolution by Next Generation Sequencing

    DEFF Research Database (Denmark)

    Hou, Yong

    Cancer will cause 13 million deaths by the year 2030, ranking as the second leading cause of death worldwide. Previous studies indicate that most cancers originate from cells that acquired somatic mutations and evolved in a Darwinian fashion. Ten biological insights of cancer have been summarized...... recently. Cutting-edge technologies like next generation sequencing (NGS) enable exploring the cancer genome and its evolution much more efficiently. However, integrated cancer genome sequencing studies have shown great inter-/intra-tumoral heterogeneity (ITH) and complex evolution patterns beyond the cancer biological...... knowledge we previously had. There is very limited knowledge of the East Asian lung cancer genome except for an enrichment of EGFR mutations and a lack of KRAS mutations. We carried out integrated genomic, transcriptomic and methylomic analysis of 335 primary Chinese lung adenocarcinomas (LUAD) and 35 corresponding...

  1. Multichannel deconvolution and source detection using sparse representations: application to Fermi project

    International Nuclear Information System (INIS)

    Schmitt, Jeremy

    2011-01-01

    This thesis presents new methods for spherical Poisson data analysis for the Fermi mission. Fermi's main scientific objectives, the study of the diffuse galactic background and the building of the source catalog, are complicated by the weakness of the photon flux and the point spread function of the instrument. This thesis proposes a new multi-scale representation for Poisson data on the sphere, the Multi-Scale Variance Stabilizing Transform on the Sphere (MS-VSTS), consisting in the combination of a spherical multi-scale transform (wavelets, curvelets) with a variance stabilizing transform (VST). This method is applied to mono- and multichannel Poisson noise removal, missing data interpolation, background extraction and multichannel deconvolution. Finally, this thesis deals with the problem of component separation using sparse representations (template fitting). (author) [fr]

  2. Deconvolution of H-alpha profiles measured by Thompson scattering collecting optics

    International Nuclear Information System (INIS)

    LeBlanc, B.; Grek, B.

    1986-01-01

    This paper discusses how optically fast multichannel Thomson scattering collecting optics can be used for H-alpha emission profile measurements. A technique based on the fact that a particular volume element of the overall field of view can be seen by many channels, depending on its location, is discussed. It is applied to measurements made on PDX with the vertically viewing TVTS collecting optics (56 channels). The authors found that, for this case, about 28 Fourier modes are optimal to represent the spatial behavior of the plasma emissivity. The coefficients for these modes are obtained by a least-squares fit to the data subject to certain constraints. The important constraints are non-negative emissivity, the assumed up-down symmetry, and zero emissivity beyond the liners. H-alpha deconvolutions are presented for diverted and circular discharges

  3. Restoring defect structures in 3C-SiC/Si (001) from spherical aberration-corrected high-resolution transmission electron microscope images by means of deconvolution processing.

    Science.gov (United States)

    Wen, C; Wan, W; Li, F H; Tang, D

    2015-04-01

    The [110] cross-sectional samples of 3C-SiC/Si (001) were observed with a spherical aberration-corrected 300 kV high-resolution transmission electron microscope. Two images that were taken away from the Scherzer focus condition, and therefore do not represent the projected structures intuitively, were used to perform the deconvolution. The principle and procedure of image deconvolution and atomic sort recognition are summarized. The restoration of defect structures, together with the recognition of Si and C atoms from the experimental images, is illustrated. Structure maps of an intrinsic stacking fault in the SiC region, and of Lomer and 60° shuffle dislocations at the interface, have been obtained at the atomic level. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Variation of High-Intensity Therapeutic Ultrasound (HITU) Pressure Field Characterization: Effects of Hydrophone Choice, Nonlinearity, Spatial Averaging and Complex Deconvolution.

    Science.gov (United States)

    Liu, Yunbo; Wear, Keith A; Harris, Gerald R

    2017-10-01

    Reliable acoustic characterization is fundamental for patient safety and clinical efficacy during high-intensity therapeutic ultrasound (HITU) treatment. Technical challenges, such as measurement variation and signal analysis, still exist for HITU exposimetry using ultrasound hydrophones. In this work, four hydrophones were compared for pressure measurement: a robust needle hydrophone, a small polyvinylidene fluoride capsule hydrophone and two fiberoptic hydrophones. The focal waveform and beam distribution of a single-element HITU transducer (1.05 MHz and 3.3 MHz) were evaluated. Complex deconvolution between the hydrophone voltage signal and frequency-dependent complex sensitivity was performed to obtain pressure waveforms. Compressional pressure (p+), rarefactional pressure (p-) and focal beam distribution were compared up to 10.6/-6.0 MPa (p+/p-) (1.05 MHz) and 20.65/-7.20 MPa (3.3 MHz). The effects of spatial averaging, local non-linear distortion, complex deconvolution and hydrophone damage thresholds were investigated. This study showed a variation of no better than 10%-15% among hydrophones during HITU pressure characterization. Published by Elsevier Inc.

  5. [Primitive lung abscess: an unusual situation in children].

    Science.gov (United States)

    Bouyahia, O; Jlidi, S; Sammoud, A

    2014-12-01

    A lung abscess is a localized area of non-tuberculous suppurative necrosis of the lung parenchyma, resulting in the formation of a cavity containing purulent material. This pathology is uncommon in childhood. A boy aged 3 years and 6 months was admitted with prolonged fever and dyspnea. Chest X-ray showed a non-systematized, well-limited, thick-walled, fluid-density, excavated opacity containing an air-fluid level. Chest ultrasound examination showed a collection 6.8 cm in diameter in the right pulmonary field with an air-fluid level. Blood culture grew Staphylococcus aureus. The patient received broad-spectrum antibiotic therapy. Three days later, he developed septic shock and surgical drainage was indicated. Histological examination confirmed the diagnosis of lung abscess. No underlying condition, such as an inoculation site, a local cause or immune deficiency, was found, and the diagnosis of primary abscess was made. The patient made a complete recovery and is asymptomatic with normal chest X-ray and pulmonary function after 3 years of follow-up. Lung abscess is a rare cause of prolonged fever in childhood. An underlying condition must be excluded to rule out a secondary abscess. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  6. CXCR4/CXCL12 in Non-Small-Cell Lung Cancer Metastasis to the Brain

    Directory of Open Access Journals (Sweden)

    Sebastiano Cavallaro

    2013-01-01

    Full Text Available Lung cancer represents the leading cause of cancer-related mortality throughout the world. Patients die of local progression, disseminated disease, or both. At least one third of the people with lung cancer develop brain metastases at some point during their disease, even often before the diagnosis of lung cancer is made. The high rate of brain metastasis makes lung cancer the most common type of tumor to spread to the brain. It is critical to understand the biologic basis of brain metastases to develop novel diagnostic and therapeutic approaches. This review will focus on the emerging data supporting the involvement of the chemokine CXCL12 and its receptor CXCR4 in the brain metastatic evolution of non-small-cell lung cancer (NSCLC) and the pharmacological tools that may be used to interfere with this signaling axis.

  7. Benchmark of the non-parametric Bayesian deconvolution method implemented in the SINBAD code for X/γ rays spectra processing

    Energy Technology Data Exchange (ETDEWEB)

    Rohée, E. [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Coulon, R., E-mail: romain.coulon@cea.fr [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Carrel, F. [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Dautremer, T.; Barat, E.; Montagu, T. [CEA, LIST, Laboratoire de Modélisation et Simulation des Systèmes, F-91191 Gif-sur-Yvette (France); Normand, S. [CEA, DAM, Le Ponant, DPN/STXN, F-75015 Paris (France); Jammes, C. [CEA, DEN, Cadarache, DER/SPEx/LDCI, F-13108 Saint-Paul-lez-Durance (France)

    2016-11-11

    Radionuclide identification and quantification are a serious concern for many applications, such as in situ monitoring at nuclear facilities, laboratory analysis, special nuclear materials detection, environmental monitoring, and waste measurements. High resolution gamma-ray spectrometry based on high purity germanium diode detectors is the best solution available for isotopic identification. Over the last decades, methods have been developed to improve gamma spectra analysis. However, some difficulties remain in the analysis when full energy peaks are folded together with a high ratio between their amplitudes, and when the Compton background is much larger than the signal of a single peak. In this context, this study deals with the comparison between a conventional analysis based on the “iterative peak fitting deconvolution” method and a “nonparametric Bayesian deconvolution” approach developed by the CEA LIST and implemented in the SINBAD code. The iterative peak fit deconvolution is used in this study as a reference method, largely validated by industrial standards, to unfold complex spectra from HPGe detectors. Complex spectrum cases are studied from IAEA benchmark protocol tests and with measured spectra. The SINBAD code shows promising deconvolution capabilities compared to the conventional method without any expert parameter fine tuning.

  8. Nonlinear spatio-temporal filtering of dynamic PET data using a four-dimensional Gaussian filter and expectation-maximization deconvolution

    International Nuclear Information System (INIS)

    Floberg, J M; Holden, J E

    2013-01-01

    We introduce a method for denoising dynamic PET data, spatio-temporal expectation-maximization (STEM) filtering, that combines four-dimensional Gaussian filtering with EM deconvolution. The initial Gaussian filter suppresses noise at a broad range of spatial and temporal frequencies and EM deconvolution quickly restores the frequencies most important to the signal. We aim to demonstrate that STEM filtering can improve variance in both individual time frames and in parametric images without introducing significant bias. We evaluate STEM filtering with a dynamic phantom study, and with simulated and human dynamic PET studies of a tracer with reversible binding behaviour, [C-11]raclopride, and a tracer with irreversible binding behaviour, [F-18]FDOPA. STEM filtering is compared to a number of established three and four-dimensional denoising methods. STEM filtering provides substantial improvements in variance in both individual time frames and in parametric images generated with a number of kinetic analysis techniques while introducing little bias. STEM filtering does bias early frames, but this does not affect quantitative parameter estimates. STEM filtering is shown to be superior to the other simple denoising methods studied. STEM filtering is a simple and effective denoising method that could be valuable for a wide range of dynamic PET applications. (paper)
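
    A simplified sketch of the described two-step idea (not the authors' code): smooth the dynamic PET array with a 4D Gaussian, then run a few EM (Richardson-Lucy type) iterations with the same Gaussian kernel to restore the signal frequencies most attenuated by the initial filter. The sigma values and data source are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def stem_filter(dynamic, sigmas=(1.5, 1.5, 1.5, 1.0), n_em=10, eps=1e-12):
        """dynamic: 4D array (x, y, z, frame); sigmas: Gaussian widths per axis."""
        blur = lambda f: gaussian_filter(f, sigmas, mode="nearest")
        smoothed = blur(dynamic)
        estimate = smoothed.copy()
        for _ in range(n_em):                    # EM deconvolution steps with the same kernel
            ratio = smoothed / np.maximum(blur(estimate), eps)
            estimate *= blur(ratio)
        return estimate

    # pet = np.load("dynamic_pet.npy")           # hypothetical (x, y, z, t) dataset
    # denoised = stem_filter(pet)
    ```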

  9. A blind deconvolution method based on L1/L2 regularization prior in the gradient space

    Science.gov (United States)

    Cai, Ying; Shi, Yu; Hua, Xia

    2018-02-01

    In the process of image restoration, the result can differ greatly from the real image because of noise. In order to solve this ill-posed problem, a blind deconvolution method based on an L1/L2 regularization prior in the gradient domain is proposed. The method presented in this paper first adds a function to the prior knowledge, namely the ratio of the L1 norm to the L2 norm, and takes this function as the penalty term in the high-frequency domain of the image. The function is then iteratively updated, and the iterative shrinkage-thresholding algorithm is applied to solve for the high-frequency image. Because the information in the gradient domain is better suited to the estimation of the blur kernel, the blur kernel is estimated in the gradient domain. This problem can be implemented quickly in the frequency domain via the Fast Fourier Transform. In addition, in order to improve the effectiveness of the algorithm, a multi-scale iterative optimization scheme is added. The proposed blind deconvolution method based on L1/L2 regularization priors in the gradient space can obtain a unique and stable solution in the process of image restoration, which not only preserves the edges and details of the image, but also ensures the accuracy of the results.
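
    For illustration, here is an assumed form of the prior term only, not the paper's full solver: the L1/L2 sparsity measure evaluated on the image gradient, which the described method uses as the penalty when estimating the latent sharp image.

    ```python
    import numpy as np

    def l1_over_l2_gradient(img, eps=1e-12):
        gx = np.diff(img, axis=1, append=img[:, -1:])    # horizontal gradient
        gy = np.diff(img, axis=0, append=img[-1:, :])    # vertical gradient
        g = np.concatenate([gx.ravel(), gy.ravel()])
        return np.sum(np.abs(g)) / (np.linalg.norm(g) + eps)

    # A blurry image has spread-out gradients (large L1/L2); a sharp image with sparse
    # gradients has a smaller ratio, so minimising this term favours sharp latent images.
    ```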

  10. Evaluation of obstructive uropathy by deconvolution analysis of 99mTc-mercaptoacetyltriglycine (99mTc-MAG3) renal scintigraphic data. A comparison with diuresis renography

    Energy Technology Data Exchange (ETDEWEB)

    Hada, Yoshiyuki [Mie Univ., Tsu (Japan). School of Medicine

    1997-06-01

    The clinical significance of ERPF (effective renal plasma flow) and MTT (mean transit time) calculated by deconvolution analysis was studied in patients with obstructive uropathy. The subjects were 84 kidneys of 38 patients and 4 people without renal abnormality (22 males and 20 females) with a mean age of 53.8 years. Scintigraphy was performed with a Toshiba γ-camera GCA-7200A equipped with a low-energy, high-resolution collimator and an energy window of 149 keV±20%, starting 20 min after loading with 500 ml of water and immediately after intravenous administration of 99mTc-MAG3 (200 MBq). Five minutes later, blood was collected, and at 10 min furosemide was given intravenously. Plasma radioactivity was measured in a well-type scintillation counter and was used to correct the blood concentration-time curve obtained from the heart-area data. Split MTT, regional MTT and ERPF were calculated by deconvolution analysis. Impaired transit was judged from the renogram after furosemide loading and was classified into 6 types. ERPF was found to be lowered in cases of obstruction and in low renal function, whereas regional MTT was prolonged only in the former. The examination with deconvolution analysis was concluded to merit wide use, since it gave information useful for treatment. (K.H.)

  11. Chromatic aberration correction and deconvolution for UV sensitive imaging of fluorescent sterols in cytoplasmic lipid droplets

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Faergeman, Nils J

    2008-01-01

    adipocyte differentiation. DHE is targeted to transferrin-positive recycling endosomes in preadipocytes but associates with droplets in mature adipocytes. Only in adipocytes but not in foam cells fluorescent sterol was confined to the droplet-limiting membrane. We developed an approach to visualize...... macrophage foam cells and in adipocytes. We used deconvolution microscopy and developed image segmentation techniques to assess the DHE content of lipid droplets in both cell types in an automated manner. Pulse-chase studies and colocalization analysis were performed to monitor the redistribution of DHE upon...

  12. MAXED, a computer code for the deconvolution of multisphere neutron spectrometer data using the maximum entropy method

    International Nuclear Information System (INIS)

    Reginatto, M.; Goldhagen, P.

    1998-06-01

    The problem of analyzing data from a multisphere neutron spectrometer to infer the energy spectrum of the incident neutrons is discussed. The main features of the code MAXED, a computer program developed to apply the maximum entropy principle to the deconvolution (unfolding) of multisphere neutron spectrometer data, are described, and the use of the code is illustrated with an example. A user's guide for the code MAXED is included in an appendix. The code is available from the authors upon request
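
    The snippet below is a generic entropy-regularized unfolding sketch in the spirit of the maximum entropy principle described above; it is not the MAXED code itself. The spectrum is parameterized through its logarithm to enforce positivity, and the trade-off weight mu between data fidelity and entropy is an illustrative assumption.

```python
# Generic maximum-entropy unfolding sketch (not MAXED): fit a spectrum that
# reproduces the detector counts while staying close to a default spectrum.
import numpy as np
from scipy.optimize import minimize

def max_entropy_unfold(R, counts, sigma, default, mu=1.0):
    """R: (n_detectors, n_bins) response matrix; counts, sigma: measured counts
    and uncertainties; default: strictly positive default spectrum."""
    def objective(log_f):
        f = np.exp(log_f)                                   # positive by construction
        chi2 = np.sum(((R @ f - counts) / sigma) ** 2)      # data fidelity
        entropy = np.sum(f - default - f * np.log(f / default))
        return chi2 - mu * entropy                          # maximize entropy, fit data

    res = minimize(objective, x0=np.log(default), method="L-BFGS-B")
    return np.exp(res.x)
```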

  13. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA){sub n} repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
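
    A toy version of the stutter-removal idea is sketched below: the observed band profile is modelled as a linear mixture of per-allele stutter patterns, and the allele amounts are recovered by non-negative least squares. The stutter weights and function name are made-up examples, not the calibrated patterns used by the authors.

```python
# Toy stutter deconvolution: observed bands = stutter pattern matrix x allele amounts.
import numpy as np
from scipy.optimize import nnls

def deconvolve_stutter(observed, stutter=(1.0, 0.35, 0.1)):
    """observed: band intensities indexed by fragment length (shortest first).
    A true allele at band k is assumed to contribute `stutter` to bands k, k-1, k-2."""
    observed = np.asarray(observed, float)
    n = len(observed)
    A = np.zeros((n, n))
    for k in range(n):                    # column k = stutter pattern of allele k
        for s, w in enumerate(stutter):
            if k - s >= 0:
                A[k - s, k] = w
    amounts, _ = nnls(A, observed)
    return amounts                        # estimated amount of each true allele
```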

  14. New insight into the evolution of the vertebrate respiratory system and the discovery of unidirectional airflow in iguana lungs.

    Science.gov (United States)

    Cieri, Robert L; Craven, Brent A; Schachner, Emma R; Farmer, C G

    2014-12-02

    The generally accepted framework for the evolution of a key feature of the avian respiratory system, unidirectional airflow, is that it is an adaptation for efficiency of gas exchange and expanded aerobic capacities, and therefore it has historically been viewed as important to the ability of birds to fly and to maintain an endothermic metabolism. This pattern of flow has been presumed to arise from specific features of the respiratory system, such as an enclosed intrapulmonary bronchus and parabronchi. Here we show unidirectional airflow in the green iguana, a lizard with a strikingly different natural history from that of birds and lacking these anatomical features. This discovery indicates a paradigm shift is needed. The selective drivers of the trait, its date of origin, and the fundamental aerodynamic mechanisms by which unidirectional flow arises must be reassessed to be congruent with the natural history of this lineage. Unidirectional flow may serve functions other than expanded aerobic capacity; it may have been present in the ancestral diapsid; and it can occur in structurally simple lungs.

  15. On the Evolution of the Mammalian Brain.

    Science.gov (United States)

    Torday, John S; Miller, William B

    2016-01-01

    Hobson and Friston have hypothesized that the brain must actively dissipate heat in order to process information (Hobson et al., 2014). This physiologic trait is functionally homologous with the first instantiation of life formed by lipids suspended in water forming micelles, allowing the reduction in entropy (heat dissipation). This circumvents the Second Law of Thermodynamics, permitting the transfer of information between living entities and enabling them to perpetually glean information from the environment, which is felt by many to correspond to evolution per se. The next evolutionary milestone was the advent of cholesterol, embedded in the cell membranes of primordial eukaryotes, facilitating metabolism, oxygenation and locomotion, the triadic basis for vertebrate evolution. Lipids were key to homeostatic regulation of calcium, forming calcium channels. Cell membrane cholesterol also fostered metazoan evolution by forming lipid rafts for receptor-mediated cell-cell signaling, the origin of the endocrine system. The eukaryotic cell membrane exapted to all complex physiologic traits, including the lung and brain, which are molecularly homologous through the function of neuregulin, mediating both lung development and myelinization of neurons. That cooption later exapted as endothermy during the water-land transition (Torday, 2015a), perhaps being the functional homolog for brain heat dissipation and conscious/mindful information processing. The skin and brain similarly share molecular homologies through the "skin-brain" hypothesis, giving insight into the cellular-molecular "arc" of consciousness from its unicellular origins to integrated physiology. This perspective on the evolution of the central nervous system clarifies self-organization, reconciling thermodynamic and informational definitions of the underlying biophysical mechanisms, thereby elucidating relations between the predictive capabilities of the brain and self-organizational processes.

  16. The evolution of blood pressure and the rise of mankind.

    Science.gov (United States)

    Schulte, Kevin; Kunter, Uta; Moeller, Marcus J

    2015-05-01

    Why is it that only human beings continuously perform acts of heroism? Looking back at our evolutionary history can offer us some potentially useful insight. This review highlights some of the major steps in our evolution-more specifically, the evolution of high blood pressure. When we were fish, the first kidney was developed to create a standardized internal 'milieu' preserving the primordial sea within us. When we conquered land as amphibians, the evolution of the lung required a low systemic blood pressure, which explains why early land vertebrates (amphibians, reptiles) are such low performers. Gaining independence from water required the evolution of an impermeable skin and a water-retaining kidney. The latter was accomplished twice with two different solutions in the two major branches of vertebrate evolution: mammals excrete nitrogenous waste products as urea, which can be utilized by the kidney as an osmotic agent to produce more concentrated urine. Dinosaurs and birds have a distinct nitrogen metabolism and excrete nitrogen as water-insoluble uric acid-therefore, their kidneys cannot use urea to concentrate as well. Instead, some birds have developed the capability to reabsorb water from their cloacae. The convergent development of a separate small circulation of the lung in mammals and birds allowed for the evolution of 'high blood-pressure animals' with better capillarization of the peripheral tissues allowing high endurance performance. Finally, we investigate why mankind outperforms any other mammal on earth and why, to this day, we continue to perform acts of heroism on our eternal quest for personal bliss. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  17. Water Residence Time estimation by 1D deconvolution in the form of a l2-regularized inverse problem with smoothness, positivity and causality constraints

    Science.gov (United States)

    Meresescu, Alina G.; Kowalski, Matthieu; Schmidt, Frédéric; Landais, François

    2018-06-01

    The Water Residence Time distribution is the equivalent of the impulse response of a linear system allowing the propagation of water through a medium, e.g. the propagation of rain water from the top of the mountain towards the aquifers. We consider the output aquifer levels as the convolution between the input rain levels and the Water Residence Time, starting with an initial aquifer base level. The estimation of Water Residence Time is important for a better understanding of hydro-bio-geochemical processes and mixing properties of wetlands used as filters in ecological applications, as well as protecting fresh water sources for wells from pollutants. Common methods of estimating the Water Residence Time focus on cross-correlation, parameter fitting and non-parametric deconvolution methods. Here we propose a 1D full-deconvolution, regularized, non-parametric inverse problem algorithm that enforces smoothness and uses constraints of causality and positivity to estimate the Water Residence Time curve. Compared to Bayesian non-parametric deconvolution approaches, it has a fast runtime per test case; compared to the popular and fast cross-correlation method, it produces a more precise Water Residence Time curve even in the case of noisy measurements. The algorithm needs only one regularization parameter to balance between smoothness of the Water Residence Time and accuracy of the reconstruction. We propose an approach on how to automatically find a suitable value of the regularization parameter from the input data only. Tests on real data illustrate the potential of this method to analyze hydrological datasets.
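
    A minimal version of such a constrained deconvolution is sketched below: causality is imposed by a lower-triangular convolution matrix, positivity by non-negative least squares, and smoothness by a second-difference penalty weighted by a single regularization parameter. This is a sketch of the general approach, not the authors' implementation, and the parameter value is arbitrary.

```python
# l2-regularized, causal, non-negative deconvolution sketch: aquifer = rain * WRT.
# Assumes the initial aquifer base level has already been subtracted.
import numpy as np
from scipy.optimize import nnls

def estimate_wrt(rain, aquifer, lam=1.0):
    rain = np.asarray(rain, float)
    aquifer = np.asarray(aquifer, float)
    n = len(rain)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = rain[i::-1]          # causal (lower-triangular) convolution
    D = np.diff(np.eye(n), n=2, axis=0)     # second-difference smoothness operator
    A_aug = np.vstack([A, np.sqrt(lam) * D])
    y_aug = np.concatenate([aquifer, np.zeros(D.shape[0])])
    wrt, _ = nnls(A_aug, y_aug)             # positivity via NNLS
    return wrt
```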

  18. Investigation of the lithosphere of the Texas Gulf Coast using phase-specific Ps receiver functions produced by wavefield iterative deconvolution

    Science.gov (United States)

    Gurrola, H.; Berdine, A.; Pulliam, J.

    2017-12-01

    Interference between Ps phases and reverberations (PPs, PSs phases and reverberations thereof) makes it difficult to use Ps receiver functions (RF) in regions with thick sediments. Crustal reverberations typically interfere with Ps phases from the lithosphere-asthenosphere boundary (LAB). We have developed a method to separate Ps phases from reverberations by deconvolution of all the data recorded at a seismic station, removing the phases from a single wavefront at each iteration of the deconvolution (wavefield iterative deconvolution or WID). We applied WID to data collected in the Gulf Coast and Llano Front regions of Texas by the EarthScope Transportable Array and by a temporary deployment of 23 broadband seismometers (deployed by Texas Tech and Baylor Universities). The 23-station temporary deployment was 300 km long, crossing from Matagorda Island onto the Llano uplift. 3-D imaging using these data shows that the deepest part of the sedimentary basin may be inboard of the coastline. The Moho beneath the Gulf Coast plain does not appear in many of the images. This could be due to interference from reverberations from shallower layers, or it may indicate the lack of a strong velocity contrast at the Moho, perhaps due to serpentinization of the uppermost mantle. The Moho appears to be flat (at 40 km) beneath most of the Llano uplift but may thicken to the south and thin beneath the Coastal plain. After application of WID, we were able to identify a negatively polarized Ps phase consistent with LAB depths identified in Sp RF images. The LAB appears to be 80-100 km deep beneath most of the coast but is 100 to 120 km deep beneath the Llano uplift. There are other negatively polarized phases between 160 and 200 km depths beneath the Gulf Coast and the Llano Uplift. These deeper phases may indicate that, in this region, the LAB is transitional in nature rather than a discrete boundary.

  19. Electrospray Ionization with High-Resolution Mass Spectrometry as a Tool for Lignomics: Lignin Mass Spectrum Deconvolution

    Science.gov (United States)

    Andrianova, Anastasia A.; DiProspero, Thomas; Geib, Clayton; Smoliakova, Irina P.; Kozliak, Evguenii I.; Kubátová, Alena

    2018-05-01

    The capability to characterize lignin, lignocellulose, and their degradation products is essential for the development of new renewable feedstocks. An electrospray ionization high-resolution time-of-flight mass spectrometry (ESI-HR TOF-MS) method was developed, expanding the lignomics toolkit while targeting the simultaneous detection of low and high molecular weight (MW) lignin species. The effect of a broad range of electrolytes and various ionization conditions on ion formation and ionization effectiveness was studied using a suite of mono-, di-, and triarene lignin model compounds as well as kraft alkali lignin. Contrary to previous studies, the positive ionization mode was found to be more effective for methoxy-substituted arenes and polyphenols, i.e., species of broadly varied MW structurally similar to native lignin. For the first time, we report an effective formation of multiply charged species of lignin, with subsequent mass spectrum deconvolution, in the presence of 100 mmol L-1 formic acid in the positive ESI mode. The developed method enabled the detection of lignin species with an MW between 150 and 9000 Da or higher, depending on the mass analyzer. The obtained Mn and Mw values of 1500 and 2500 Da, respectively, were in good agreement with those determined by gel permeation chromatography. Furthermore, the deconvoluted ESI mass spectrum was similar to that obtained with matrix-assisted laser desorption/ionization (MALDI)-HR TOF-MS, yet featured a higher signal-to-noise ratio. The formation of multiply charged species was confirmed with ion mobility ESI-HR Q-TOF-MS.
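
    The arithmetic behind charge-state deconvolution can be illustrated with a short, self-contained example: two adjacent peaks assumed to belong to consecutive charge states of the same species determine the charge and hence the neutral mass. The peak values below are hypothetical and the routine is only a sketch of the principle, not the deconvolution software used in the study.

```python
# Charge-state deconvolution of two adjacent ESI peaks (illustrative values).
PROTON = 1.00728  # Da

def neutral_mass(mz_high_charge, mz_low_charge):
    """mz_high_charge < mz_low_charge are m/z values of charge states z+1 and z."""
    # (z+1) * (mz_high - PROTON) == z * (mz_low - PROTON)  ->  solve for z
    z = (mz_high_charge - PROTON) / (mz_low_charge - mz_high_charge)
    z = round(z)
    return z * (mz_low_charge - PROTON), z

mass, charge = neutral_mass(1001.0, 1251.0)   # ~5000 Da species seen at z=5 and z=4
```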

  20. Extravascular Lung Water and Acute Lung Injury

    Directory of Open Access Journals (Sweden)

    Ritesh Maharaj

    2012-01-01

    Full Text Available Acute lung injury carries a high burden of morbidity and mortality and is characterised by nonhydrostatic pulmonary oedema. The aim of this paper is to highlight the role of accurate quantification of extravascular lung water in diagnosis, management, and prognosis in “acute lung injury” and “acute respiratory distress syndrome”. Several studies have verified the accuracy of both the single and the double transpulmonary thermal indicator techniques. Both experimental and clinical studies were searched in PUBMED using the term “extravascular lung water” and “acute lung injury”. Extravascular lung water measurement offers information not otherwise available by other methods such as chest radiography, arterial blood gas, and chest auscultation at the bedside. Recent data have highlighted the role of extravascular lung water in response to treatment to guide fluid therapy and ventilator strategies. The quantification of extravascular lung water may predict mortality and multiorgan dysfunction. The limitations of the dilution method are also discussed.

  1. Lung cancer

    International Nuclear Information System (INIS)

    Aisner, J.

    1985-01-01

    This book contains 13 chapters. Some of the chapter titles are: The Pathology of Lung Cancer; Radiotherapy for Non-Small-Cell Cancer of the Lung; Chemotherapy for Non-Small-Cell Lung Cancer; Immunotherapy in the Management of Lung Cancer; Preoperative Staging and Surgery for Non-Small-Cell Lung Cancer; and Prognostic Factors in Lung Cancer

  2. Imaging by Electrochemical Scanning Tunneling Microscopy and Deconvolution Resolving More Details of Surfaces Nanomorphology

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    observed in high-resolution images of metallic nanocrystallites may be effectively deconvoluted, as to resolve more details of the crystalline morphology (see figure). Images of surface-crystalline metals indicate that more than a single atomic layer is involved in mediating the tunneling current......Upon imaging, electrochemical scanning tunneling microscopy (ESTM), scanning electrochemical micro-scopy (SECM) and in situ STM resolve information on electronic structures and on surface topography. At very high resolution, imaging processing is required, as to obtain information that relates...... to crystallographic-surface structures. Within the wide range of new technologies, those images surface features, the electrochemical scanning tunneling microscope (ESTM) provides means of atomic resolution where the tip participates actively in the process of imaging. Two metallic surfaces influence ions trapped...

  3. On the evolution of the mammalian brain

    Directory of Open Access Journals (Sweden)

    John Steven Torday

    2016-04-01

    Full Text Available Hobson and Friston have hypothesized that the brain must actively dissipate heat in order to process information (Virtual reality and consciousness inference in dreaming. Front Psychol. 2014 Oct 9;5:1133). This physiologic trait is functionally homologous with the first instantiation of life formed by lipids suspended in water forming micelles, allowing the reduction in entropy (heat dissipation), circumventing the Second Law of Thermodynamics and permitting the transfer of information between living entities, enabling them to perpetually glean information from the environment (= evolution). The next evolutionary milestone was the advent of cholesterol, embedded in the cell membranes of primordial eukaryotes, facilitating metabolism, oxygenation and locomotion, the triadic basis for vertebrate evolution. Lipids were key to homeostatic regulation of calcium, forming calcium channels. Cell membrane cholesterol also fostered metazoan evolution by forming lipid rafts for receptor-mediated cell-cell signaling, the origin of the endocrine system. The eukaryotic cell membrane exapted to all complex physiologic traits, including the lung and brain, which are molecularly homologous through the function of neuregulin, mediating both lung development and myelinization of neurons. That cooption later exapted as endothermy during the water-land transition (Torday JS. A Central Theory of Biology. Med Hypotheses. 2015 Jul;85(1):49-57), perhaps being the functional homolog for brain heat dissipation and consciousness/mind. The skin and brain similarly share molecular homologies through the 'skin-brain' hypothesis, giving insight into the cellular-molecular 'arc' of consciousness from its unicellular origins to integrated physiology. This perspective on the evolution of the central nervous system clarifies self-organization, reconciling thermodynamic and informational definitions of the underlying biophysical mechanisms, thereby elucidating relations between the

  4. Rapid analysis for 567 pesticides and endocrine disrupters by GC/MS using deconvolution reporting software

    Energy Technology Data Exchange (ETDEWEB)

    Wylie, P.; Szelewski, M.; Meng, Chin-Kai [Agilent Technologies, Wilmington, DE (United States)

    2004-09-15

    More than 700 pesticides are approved for use around the world, many of which are suspected endocrine disrupters. Other pesticides, though no longer used, persist in the environment where they bioaccumulate in the flora and fauna. Analytical methods target only a subset of the possible compounds. The analysis of food and environmental samples for pesticides is usually complicated by the presence of co-extracted natural products. Food or tissue extracts can be exceedingly complex matrices that require several stages of sample cleanup prior to analysis. Even then, it can be difficult to detect trace levels of contaminants in the presence of the remaining matrix. For efficiency, multi-residue methods (MRMs) must be used to analyze for most pesticides. Traditionally, these methods have relied upon gas chromatography (GC) with a constellation of element-selective detectors to locate pesticides in the midst of a variable matrix. GC with mass spectral detection (GC/MS) has been widely used for confirmation of hits. Liquid chromatography (LC) has been used for those compounds that are not amenable to GC. Today, more and more pesticide laboratories are relying upon LC with mass spectral detection (LC/MS) and GC/MS as their primary analytical tools. Still, most MRMs are target compound methods that look for a small subset of the possible pesticides. Any compound not on the target list is likely to be missed by these methods. Using the techniques of retention time locking (RTL) and RTL database searching together with spectral deconvolution, a method has been developed to screen for 567 pesticides and suspected endocrine disrupters in a single GC/MS analysis. Spectral deconvolution helps to identify pesticides even when they co-elute with matrix compounds while RTL helps to eliminate false positives and gives greater confidence in the results.

  5. Interplay between the lung microbiome and lung cancer.

    Science.gov (United States)

    Mao, Qixing; Jiang, Feng; Yin, Rong; Wang, Jie; Xia, Wenjie; Dong, Gaochao; Ma, Weidong; Yang, Yao; Xu, Lin; Hu, Jianzhong

    2018-02-28

    The human microbiome confers benefits or disease susceptibility to the human body through multiple pathways. Disruption of the symbiotic balance of the human microbiome is commonly found in systematic diseases such as diabetes, obesity, and chronic gastric diseases. Emerging evidence has suggested that dysbiosis of the microbiota may also play vital roles in carcinogenesis at multiple levels, e.g., by affecting metabolic, inflammatory, or immune pathways. Although the impact of the gut microbiome on the digestive cancer has been widely explored, few studies have investigated the interplay between the microbiome and lung cancer. Some recent studies have shown that certain microbes and microbiota dysbiosis are correlated with development of lung cancer. In this mini-review, we briefly summarize current research findings describing the relationship between the lung microbiome and lung cancer. We further discuss the potential mechanisms through which the lung microbiome may play a role in lung carcinogenesis and impact lung cancer treatment. A better knowledge of the interplay between the lung microbiome and lung cancer may promote the development of innovative strategies for early prevention and personalized treatment in lung cancer. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Inflammatory Pseudotumors of the Lung in Children: Aggressive Types

    International Nuclear Information System (INIS)

    Delgado, J.; Guzman de Villoria, J. A.; Casonava, A.; Zabalza, M. R.

    2003-01-01

    Inflammatory pseudotumors are tumors that can present very aggressively in children. Three patients were studied as illustration. Each was diagnosed with inflammatory pseudotumor of the lung after surgical intervention and open lung biopsy, with each case being very different in terms of clinical and radiological manifestations and evolution. Basic radiological procedures and CT were performed in all three cases, and MR in two. While the results were not specific, they did provide some means for arriving at a diagnosis, the common trait among them being the manifestation of primarily aggressive behavior. In conclusion, although inflammatory pseudotumor of the lung generally presents itself as a solitary peripheral mass of benign character which usually evolves favorably after surgery, it sometimes behaves in a primarily aggressive way which could hinder, or render impossible, any forthcoming surgical intervention. Imaging techniques tend to provide findings which are varied and unspecific. In this regard, MR is more advantageous than CT for determining not only the relationship between the mass and its adjacent structures, but also for the detection and follow-up of relapses. (Author) 40 refs

  7. Blind deconvolution of seismograms regularized via minimum support

    International Nuclear Information System (INIS)

    Royer, A A; Bostock, M G; Haber, E

    2012-01-01

    The separation of earthquake source signature and propagation effects (the Earth's ‘Green's function') that encode a seismogram is a challenging problem in seismology. The task of separating these two effects is called blind deconvolution. By considering seismograms of multiple earthquakes from similar locations recorded at a given station and that therefore share the same Green's function, we may write a linear relation in the time domain u_i(t)*s_j(t) − u_j(t)*s_i(t) = 0, where u_i(t) is the seismogram for the ith source and s_j(t) is the jth unknown source. The symbol * represents the convolution operator. From two or more seismograms, we obtain a homogeneous linear system where the unknowns are the sources. This system is subject to a scaling constraint to deliver a non-trivial solution. Since source durations are not known a priori and must be determined, we augment our system by introducing the source durations as unknowns and we solve the combined system (sources and source durations) using separation of variables. Our solution is derived using direct linear inversion to recover the sources and Newton's method to recover source durations. This method is tested using two sets of synthetic seismograms created by convolution of (i) random Gaussian source-time functions and (ii) band-limited sources with a simplified Green's function and signal-to-noise levels up to 10%, with encouraging results. (paper)
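
    A compact sketch of the homogeneous system described above is given below for the two-seismogram case with a known source duration L: the relation u_1(t)*s_2(t) − u_2(t)*s_1(t) = 0 becomes a matrix whose (approximate) null vector stacks the two sources. The duration handling and Newton step of the paper are omitted, and the function names are hypothetical.

```python
# Two-seismogram blind deconvolution sketch: solve u1*s2 - u2*s1 = 0 via the
# smallest singular vector of a convolution-matrix system.
import numpy as np
from scipy.linalg import toeplitz, svd

def conv_matrix(u, L):
    """Matrix C such that C @ s equals np.convolve(u, s) for len(s) == L."""
    col = np.concatenate([u, np.zeros(L - 1)])
    row = np.zeros(L)
    row[0] = u[0]
    return toeplitz(col, row)

def blind_sources(u1, u2, L):
    M = np.hstack([-conv_matrix(u2, L), conv_matrix(u1, L)])
    _, _, vt = svd(M)
    s = vt[-1]                 # null-space vector; defined up to the scaling constraint
    return s[:L], s[L:]        # estimated s1, s2
```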

  8. Evolution and clinical impact of co-occurring genetic alterations in advanced-stage EGFR-mutant lung cancers. | Office of Cancer Genomics

    Science.gov (United States)

    A widespread approach to modern cancer therapy is to identify a single oncogenic driver gene and target its mutant-protein product (for example, EGFR-inhibitor treatment in EGFR-mutant lung cancers). However, genetically driven resistance to targeted therapy limits patient survival. Through genomic analysis of 1,122 EGFR-mutant lung cancer cell-free DNA samples and whole-exome analysis of seven longitudinally collected tumor samples from a patient with EGFR-mutant lung cancer, we identified critical co-occurring oncogenic events present in most advanced-stage EGFR-mutant lung cancers.

  9. Resolving deconvolution ambiguity in gene alternative splicing

    Directory of Open Access Journals (Sweden)

    Hubbell Earl

    2009-08-01

    Full Text Available Abstract Background For many gene structures it is impossible to resolve intensity data uniquely to establish abundances of splice variants. This was empirically noted by Wang et al., who called it a "degeneracy problem". The ambiguity results from an ill-posed problem where additional information is needed in order to obtain a unique answer in splice variant deconvolution. Results In this paper, we analyze the situations under which the problem occurs and perform a rigorous mathematical study which gives necessary and sufficient conditions on how many and what type of constraints are needed to resolve all ambiguity. This analysis is generally applicable to matrix models of splice variants. We explore the proposal that probe sequence information may provide sufficient additional constraints to resolve real-world instances. However, probe behavior cannot be predicted with sufficient accuracy by any existing probe sequence model, and so we present a Bayesian framework for estimating variant abundances by incorporating the prediction uncertainty from the micro-model of probe responsiveness into the macro-model of probe intensities. Conclusion The matrix analysis of constraints provides a tool for detecting real-world instances in which additional constraints may be necessary to resolve splice variants. While purely mathematical constraints can be stated without error, real-world constraints may themselves be poorly resolved. Our Bayesian framework provides a generic solution to the problem of uniquely estimating transcript abundances given additional constraints that themselves may be uncertain, such as a regression fit to probe sequence models. We demonstrate its efficacy through extensive simulations as well as various biological data.
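
    The core of the identifiability question above can be phrased as a rank condition on the probe-to-variant design matrix, as in the hypothetical check below: if the matrix rank is smaller than the number of variants, intensities alone cannot determine the variant abundances and additional constraints are required.

```python
# Rank-based ambiguity check for a probe-to-variant design matrix (toy example).
import numpy as np

def is_identifiable(design):
    """design: (n_probes, n_variants) 0/1 matrix of which probes hit which variants."""
    return np.linalg.matrix_rank(design) == design.shape[1]

# Two variants covered by exactly the same probes cannot be separated:
design = np.array([[1, 1, 0],
                   [1, 1, 0],
                   [0, 0, 1]])
print(is_identifiable(design))   # False -> extra constraints (e.g. probe models) needed
```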

  10. A deconvolution method for deriving the transit time spectrum for ultrasound propagation through cancellous bone replica models.

    Science.gov (United States)

    Langton, Christian M; Wille, Marie-Luise; Flegg, Mark B

    2014-04-01

    The acceptance of broadband ultrasound attenuation for the assessment of osteoporosis suffers from a limited understanding of ultrasound wave propagation through cancellous bone. It has recently been proposed that the ultrasound wave propagation can be described by a concept of parallel sonic rays. This concept approximates the detected transmission signal to be the superposition of all sonic rays that travel directly from transmitting to receiving transducer. The transit time of each ray is defined by the proportion of bone and marrow propagated. An ultrasound transit time spectrum describes the proportion of sonic rays having a particular transit time, effectively describing lateral inhomogeneity of transit times over the surface of the receiving ultrasound transducer. The aim of this study was to provide a proof of concept that a transit time spectrum may be derived from digital deconvolution of input and output ultrasound signals. We have applied the active-set method deconvolution algorithm to determine the ultrasound transit time spectra in the three orthogonal directions of four cancellous bone replica samples and have compared experimental data with the prediction from the computer simulation. The agreement between experimental and predicted ultrasound transit time spectrum analyses derived from Bland-Altman analysis ranged from 92% to 99%, thereby supporting the concept of parallel sonic rays for ultrasound propagation in cancellous bone. In addition to further validation of the parallel sonic ray concept, this technique offers the opportunity to consider quantitative characterisation of the material and structural properties of cancellous bone, not previously available utilising ultrasound.
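
    As a rough illustration of deriving a transit time spectrum by deconvolution, the sketch below models the received signal as a non-negative mixture of delayed copies of the input signal and solves for the delay weights with SciPy's non-negative least squares routine (itself an active-set algorithm). The signal lengths, delay grid and function name are simplifying assumptions, not the authors' implementation.

```python
# Transit time spectrum by non-negative deconvolution of input and output signals.
import numpy as np
from scipy.optimize import nnls

def transit_time_spectrum(input_sig, output_sig, n_delays):
    """Model: output[t] ~= sum_k tts[k] * input[t - k], with tts[k] >= 0.
    input_sig and output_sig are assumed to have equal length and sampling."""
    input_sig = np.asarray(input_sig, float)
    output_sig = np.asarray(output_sig, float)
    n = len(output_sig)
    A = np.zeros((n, n_delays))
    for k in range(n_delays):
        A[k:, k] = input_sig[:n - k]      # column k = input delayed by k samples
    tts, _ = nnls(A, output_sig)
    return tts                            # proportion of rays per transit-time bin
```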

  11. Evolution of emphysema in relation to smoking

    International Nuclear Information System (INIS)

    Bellomi, Massimo; Rampinelli, Cristiano; Veronesi, Giulia; Harari, Sergio; Lanfranchi, Federica; Raimondi, Sara; Maisonneuve, Patrick

    2010-01-01

    We have little knowledge about the evolution of emphysema, and relatively little is understood about its evolution in relation to smoking habits. This study aims to assess the evolution of emphysema in asymptomatic current and former smokers over 2 years and to investigate the association with subjects' characteristics. The study was approved by our Ethics Committee and all participants provided written informed consent. We measured emphysema by automatic low-dose computed tomography densitometry in 254 current and 282 former smokers enrolled in a lung-cancer screening. The measures were repeated after 2 years. The association between subjects' characteristics, smoking habits and emphysema were assessed by chi-squared and Wilcoxon tests. Univariate and multivariate odds ratios (OR) with 95% confidence intervals (CI) were calculated for the risk of emphysema worsening according to subjects' characteristics. We assessed the trend of increasing risk of emphysema progression by smoking habits using the Mantel-Haenszel chi-squared test. The median percentage increase in emphysema over a 2-year period was significantly higher in current than in former smokers (OR 1.8; 95% CI 1.3-2.6; p < 0.0001). The risk of worsening emphysema (by 30% in 2 years) in current smokers increased with smoking duration (p for trend <0.02). As emphysema is a known risk factor for lung cancer, its evaluation could be used as a potential factor for identification of a high-risk population. The evaluation of emphysema progression can be added to low-dose CT screening programmes to inform and incite participants to stop smoking. (orig.)

  12. Evolution of emphysema in relation to smoking

    Energy Technology Data Exchange (ETDEWEB)

    Bellomi, Massimo [European Institute of Oncology, Department of Radiology, Milan (Italy); University of Milan - IT, School of Medicine, Milan (Italy); Rampinelli, Cristiano [European Institute of Oncology, Department of Radiology, Milan (Italy); Veronesi, Giulia [European Institute of Oncology, Department of Thoracic Surgery, Milan (Italy); Harari, Sergio [Fatebenefratelli-Milanocuore, Pneumology Operative Unit, San Giuseppe Hospital, Milan (Italy); Lanfranchi, Federica [University of Milan - IT, School of Medicine, Milan (Italy); Raimondi, Sara; Maisonneuve, Patrick [European Institute of Oncology, Department of Epidemiology and Biostatistics, Milan (Italy)

    2010-02-15

    We have little knowledge about the evolution of emphysema, and relatively little is understood about its evolution in relation to smoking habits. This study aims to assess the evolution of emphysema in asymptomatic current and former smokers over 2 years and to investigate the association with subjects' characteristics. The study was approved by our Ethics Committee and all participants provided written informed consent. We measured emphysema by automatic low-dose computed tomography densitometry in 254 current and 282 former smokers enrolled in a lung-cancer screening. The measures were repeated after 2 years. The association between subjects' characteristics, smoking habits and emphysema were assessed by chi-squared and Wilcoxon tests. Univariate and multivariate odds ratios (OR) with 95% confidence intervals (CI) were calculated for the risk of emphysema worsening according to subjects' characteristics. We assessed the trend of increasing risk of emphysema progression by smoking habits using the Mantel-Haenszel chi-squared test. The median percentage increase in emphysema over a 2-year period was significantly higher in current than in former smokers (OR 1.8; 95% CI 1.3-2.6; p < 0.0001). The risk of worsening emphysema (by 30% in 2 years) in current smokers increased with smoking duration (p for trend <0.02). As emphysema is a known risk factor for lung cancer, its evaluation could be used as a potential factor for identification of a high-risk population. The evaluation of emphysema progression can be added to low-dose CT screening programmes to inform and incite participants to stop smoking. (orig.)

  13. LungMAP: The Molecular Atlas of Lung Development Program.

    Science.gov (United States)

    Ardini-Poleske, Maryanne E; Clark, Robert F; Ansong, Charles; Carson, James P; Corley, Richard A; Deutsch, Gail H; Hagood, James S; Kaminski, Naftali; Mariani, Thomas J; Potter, Steven S; Pryhuber, Gloria S; Warburton, David; Whitsett, Jeffrey A; Palmer, Scott M; Ambalavanan, Namasivayam

    2017-11-01

    The National Heart, Lung, and Blood Institute is funding an effort to create a molecular atlas of the developing lung (LungMAP) to serve as a research resource and public education tool. The lung is a complex organ with lengthy development time driven by interactive gene networks and dynamic cross talk among multiple cell types to control and coordinate lineage specification, cell proliferation, differentiation, migration, morphogenesis, and injury repair. A better understanding of the processes that regulate lung development, particularly alveologenesis, will have a significant impact on survival rates for premature infants born with incomplete lung development and will facilitate lung injury repair and regeneration in adults. A consortium of four research centers, a data coordinating center, and a human tissue repository provides high-quality molecular data of developing human and mouse lungs. LungMAP includes mouse and human data for cross correlation of developmental processes across species. LungMAP is generating foundational data and analysis, creating a web portal for presentation of results and public sharing of data sets, establishing a repository of young human lung tissues obtained through organ donor organizations, and developing a comprehensive lung ontology that incorporates the latest findings of the consortium. The LungMAP website (www.lungmap.net) currently contains more than 6,000 high-resolution lung images and transcriptomic, proteomic, and lipidomic human and mouse data and provides scientific information to stimulate interest in research careers for young audiences. This paper presents a brief description of research conducted by the consortium, database, and portal development and upcoming features that will enhance the LungMAP experience for a community of users. Copyright © 2017 the American Physiological Society.

  14. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans that would be required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs with the compromised, dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)

  15. Lung volumes and emphysema in smokers with interstitial lung abnormalities.

    Science.gov (United States)

    Washko, George R; Hunninghake, Gary M; Fernandez, Isis E; Nishino, Mizuki; Okajima, Yuka; Yamashiro, Tsuneo; Ross, James C; Estépar, Raúl San José; Lynch, David A; Brehm, John M; Andriole, Katherine P; Diaz, Alejandro A; Khorasani, Ramin; D'Aco, Katherine; Sciurba, Frank C; Silverman, Edwin K; Hatabu, Hiroto; Rosas, Ivan O

    2011-03-10

    Cigarette smoking is associated with emphysema and radiographic interstitial lung abnormalities. The degree to which interstitial lung abnormalities are associated with reduced total lung capacity and the extent of emphysema is not known. We looked for interstitial lung abnormalities in 2416 (96%) of 2508 high-resolution computed tomographic (HRCT) scans of the lung obtained from a cohort of smokers. We used linear and logistic regression to evaluate the associations between interstitial lung abnormalities and HRCT measurements of total lung capacity and emphysema. Interstitial lung abnormalities were present in 194 (8%) of the 2416 HRCT scans evaluated. In statistical models adjusting for relevant covariates, interstitial lung abnormalities were associated with reduced total lung capacity (-0.444 liters; 95% confidence interval [CI], -0.596 to -0.292; P<0.001) and with lower odds of chronic obstructive pulmonary disease (COPD) (odds ratio, 0.53; 95% CI, 0.37 to 0.76; P<0.001). The effect of interstitial lung abnormalities on total lung capacity and emphysema was dependent on COPD status (P<0.02 for the interactions). Interstitial lung abnormalities were positively associated with both greater exposure to tobacco smoke and current smoking. In smokers, interstitial lung abnormalities--which were present on about 1 of every 12 HRCT scans--were associated with reduced total lung capacity and a lesser amount of emphysema. (Funded by the National Institutes of Health and the Parker B. Francis Foundation; ClinicalTrials.gov number, NCT00608764.).

  16. Combining a Deconvolution and a Universal Library Search Algorithm for the Nontarget Analysis of Data-Independent Acquisition Mode Liquid Chromatography-High-Resolution Mass Spectrometry Results.

    Science.gov (United States)

    Samanipour, Saer; Reid, Malcolm J; Bæk, Kine; Thomas, Kevin V

    2018-04-17

    Nontarget analysis is considered one of the most comprehensive tools for the identification of unknown compounds in a complex sample analyzed via liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS). Due to the complexity of the data generated via LC-HRMS, the data-dependent acquisition mode, which produces MS2 spectra for a limited number of precursor ions, has been one of the most common approaches used during nontarget screening. However, the data-independent acquisition mode produces highly complex spectra that require proper deconvolution and library search algorithms. We have developed a deconvolution algorithm and a universal library search algorithm (ULSA) for the analysis of complex spectra generated via data-independent acquisition. These algorithms were validated and tested using both semisynthetic and real environmental data. A total of 6000 randomly selected spectra from MassBank were introduced across the total ion chromatograms of 15 sludge extracts at three levels of background complexity for the validation of the algorithms via semisynthetic data. The deconvolution algorithm successfully extracted more than 60% of the added ions in the analytical signal for 95% of processed spectra (i.e., 3 complexity levels multiplied by 6000 spectra). The ULSA ranked the correct spectra among the top three for more than 95% of cases. We further tested the algorithms with 5 wastewater effluent extracts for 59 artificial unknown analytes (i.e., their presence or absence was confirmed via target analysis). These algorithms did not produce any false identifications while correctly identifying ∼70% of the total inquiries. The implications, capabilities, and limitations of both algorithms are further discussed.

  17. Comparative transcriptome analyses indicate molecular homology of zebrafish swimbladder and mammalian lung.

    Directory of Open Access Journals (Sweden)

    Weiling Zheng

    Full Text Available The fish swimbladder is a unique organ in vertebrate evolution and it functions for regulating buoyancy in most teleost species. It has long been postulated as a homolog of the tetrapod lung, but the molecular evidence is scarce. In order to understand the molecular function of swimbladder as well as its relationship with lungs in tetrapods, transcriptomic analyses of zebrafish swimbladder were carried out by RNA-seq. Gene ontology classification showed that genes in cytoskeleton and endoplasmic reticulum were enriched in the swimbladder. Further analyses depicted gene sets and pathways closely related to cytoskeleton constitution and regulation, cell adhesion, and extracellular matrix. Several prominent transcription factor genes in the swimbladder including hoxc4a, hoxc6a, hoxc8a and foxf1 were identified and their expressions in developing swimbladder during embryogenesis were confirmed. By comparison of enriched transcripts in the swimbladder with those in human and mouse lungs, we established the resemblance of transcriptome of the zebrafish swimbladder and mammalian lungs. Based on the transcriptomic data of zebrafish swimbladder, the predominant functions of swimbladder are in its epithelial and muscular tissues. Our comparative analyses also provide molecular evidence of the relatedness of the fish swimbladder and mammalian lung.

  18. Lung cancer mimicking lung abscess formation on CT images.

    Science.gov (United States)

    Taira, Naohiro; Kawabata, Tsutomu; Gabe, Atsushi; Ichi, Takaharu; Kushi, Kazuaki; Yohena, Tomofumi; Kawasaki, Hidenori; Yamashiro, Toshimitsu; Ishikawa, Kiyoshi

    2014-01-01

    Patient: male, 64. Final diagnosis: lung pleomorphic carcinoma. Symptoms: cough, fever. Specialty: oncology. Unusual clinical course. The diagnosis of lung cancer is often made based on computed tomography (CT) image findings if it cannot be confirmed on pathological examinations, such as bronchoscopy. However, the CT image findings of cancerous lesions are similar to those of abscesses. We herein report a case of lung cancer that resembled a lung abscess on CT. We describe the case of a 64-year-old male who was diagnosed with lung cancer at surgery. In this case, it was quite difficult to distinguish between lung cancer and a lung abscess on CT images, and a lung abscess was initially suspected due to symptoms such as fever and coughing, contrast-enhanced CT findings showing a ring-enhancing mass in the right upper lobe, and the patient's laboratory test results. However, a pathological diagnosis of lung cancer was confirmed from a rapid frozen-section biopsy of the lesion. This case suggests that physicians should suspect both lung abscess and malignancy in cases involving masses presenting as ring-enhancing lesions on contrast-enhanced CT.

  19. Multi-processor system for real-time deconvolution and flow estimation in medical ultrasound

    DEFF Research Database (Denmark)

    Jensen, Jesper Lomborg; Jensen, Jørgen Arendt; Stetson, Paul F.

    1996-01-01

    of the algorithms. Many of the algorithms can only be properly evaluated in a clinical setting with real-time processing, which generally cannot be done with conventional equipment. This paper therefore presents a multi-processor system capable of performing 1.2 billion floating point operations per second on RF...... filter is used with a second time-reversed recursive estimation step. Here it is necessary to perform about 70 arithmetic operations per RF sample or about 1 billion operations per second for real-time deconvolution. Furthermore, these have to be floating point operations due to the adaptive nature...... interfaced to our previously-developed real-time sampling system that can acquire RF data at a rate of 20 MHz and simultaneously transmit the data at 20 MHz to the processing system via several parallel channels. These two systems can, thus, perform real-time processing of ultrasound data. The advantage...

  20. Specter: linear deconvolution for targeted analysis of data-independent acquisition mass spectrometry proteomics.

    Science.gov (United States)

    Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D

    2018-05-01

    Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.
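
    The core linear-algebra step can be illustrated independently of Specter itself: place the mixture spectrum and the library spectra on a shared m/z grid and fit non-negative coefficients, which then represent peptide contributions. The sketch below is a simplification under those assumptions, with synthetic data standing in for real spectra.

```python
# Library-based linear deconvolution of a DIA mixture spectrum (simplified).
import numpy as np
from scipy.optimize import nnls

def deconvolute_mixture(mixture, library):
    """mixture: (n_mz,) binned intensities; library: (n_mz, n_peptides) binned spectra."""
    coeffs, _ = nnls(library, mixture)
    return coeffs                          # contribution of each library peptide

rng = np.random.default_rng(0)
library = rng.random((500, 20))            # 20 synthetic library spectra
truth = np.zeros(20)
truth[[3, 11]] = [2.0, 0.5]                # two peptides actually present
mixture = library @ truth + 0.01 * rng.random(500)
print(np.argsort(deconvolute_mixture(mixture, library))[-2:])   # recovers 11 and 3
```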

  1. Making sense of large data sets without annotations: analyzing age-related correlations from lung CT scans

    Science.gov (United States)

    Dicente Cid, Yashin; Mamonov, Artem; Beers, Andrew; Thomas, Armin; Kovalev, Vassili; Kalpathy-Cramer, Jayashree; Müller, Henning

    2017-03-01

    The analysis of large data sets can help to gain knowledge about specific organs or specific diseases, just as big data analysis does in many non-medical areas. This article aims to gain information from 3D volumes, i.e., the visual content of lung CT scans of a large number of patients. In the described data set, only limited annotation is available: the patients were all part of an ongoing screening program, and besides age and gender no information on the patients or the findings was available for this work. This is a scenario that can happen regularly, as image data sets are produced and become available in increasingly large quantities, but manual annotations are often missing and clinical data such as text reports are often harder to share. We extracted a set of visual features from 12,414 CT scans of 9,348 patients that had CT scans of the lung taken in the context of a national lung screening program in Belarus. Lung fields were segmented by two segmentation algorithms, and only cases where both algorithms were able to find the left and right lung and had a Dice coefficient above 0.95 were analyzed. This ensures that only segmentations of good quality were used to extract features of the lung. Patients ranged in age from 0 to 106 years. Data analysis shows that age can be predicted with fairly high accuracy for persons under 15 years. Relatively good results were also obtained between 30 and 65 years, where a steady trend is seen. For young adults and older people the results are not as good, as variability is very high in these groups. Several visualizations of the data show the evolution patterns of the lung texture, size and density with age. The experiments allow the evolution of the lung to be studied, and the results show that even with limited metadata we can extract interesting information from large-scale visual data. These age-related changes (for example of the lung volume, the density histogram of the tissue) can also be

  2. Breast image feature learning with adaptive deconvolutional networks

    Science.gov (United States)

    Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.

    2012-03-01

    Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristic lesion-extracted features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images, with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently proposed unsupervised, generative hierarchical models that decompose images via convolutional sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full-field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local-structure-preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect the image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

  3. LLL/DOR seismic conservatism of operating plants project. Interim report on Task II.1.3: soil-structure interaction. Deconvolution of the June 7, 1975, Ferndale Earthquake at the Humboldt Bay Power Plant

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Smith, P.D.

    1978-01-01

    The Ferndale Earthquake of June 7, 1975, provided a unique opportunity to study the accuracy of seismic soil-structure interaction methods used in the nuclear industry because, other than this event, there have been no cases of significant earthquakes for which moderate motions of nuclear plants have been recorded. Future studies are planned which will evaluate the soil-structure interaction methodology further, using increasingly complex methods as required. The first step in this task was to perform deconvolution and soil-structure interaction analyses for the effects of the Ferndale earthquake at the Humboldt Bay Power Plant site. The deconvolution analyses of the bedrock motions that were performed are compared, along with additional studies on analytical sensitivity.

  4. Application of an improved maximum correlated kurtosis deconvolution method for fault diagnosis of rolling element bearings

    Science.gov (United States)

    Miao, Yonghao; Zhao, Ming; Lin, Jing; Lei, Yaguo

    2017-08-01

    The extraction of periodic impulses, which are important indicators of rolling bearing faults, from vibration signals is of considerable significance for fault diagnosis. Maximum correlated kurtosis deconvolution (MCKD), developed from minimum entropy deconvolution (MED), has been proven an efficient tool for enhancing the periodic impulses in the diagnosis of rolling element bearings and gearboxes. However, challenges still exist when MCKD is applied to bearings operating under harsh working conditions. The difficulties mainly come from the rigorous requirements on the multiple input parameters and the complicated resampling process. To overcome these limitations, an improved MCKD (IMCKD) is presented in this paper. The new method estimates the iterative period by calculating the autocorrelation of the envelope signal rather than relying on a provided prior period. Moreover, the iterative period gradually approaches the true fault period by being updated after every iterative step. Since IMCKD is unaffected by impulse signals with high kurtosis values, the new method selects the filtered signal with the maximum kurtosis as the final choice from all candidates within the assigned iteration count. Compared with MCKD, IMCKD has three advantages. First, without requiring a prior period or the choice of the order of shift, IMCKD is more efficient and has higher robustness. Second, the resampling process is not necessary for IMCKD, which greatly simplifies the subsequent frequency spectrum analysis and envelope spectrum analysis without resetting the sampling rate. Third, IMCKD has a significant performance advantage in diagnosing bearing compound faults, which expands its application range. Finally, the effectiveness and superiority of IMCKD are validated by a number of simulated bearing fault signals and by application to compound-fault and single-fault diagnosis of a locomotive bearing.
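
    The period-estimation step described above can be sketched in a few lines: take the envelope of the vibration signal via the Hilbert transform, autocorrelate it, and pick the lag of the dominant peak within a plausible search range as the current estimate of the fault period. The search bounds and function name are illustrative assumptions, not the authors' exact implementation.

```python
# Envelope-autocorrelation period estimate (the IMCKD-style iterative period seed).
import numpy as np
from scipy.signal import hilbert

def estimate_period(signal, min_lag, max_lag):
    envelope = np.abs(hilbert(signal))                   # demodulate the impulses
    env = envelope - envelope.mean()
    acf = np.correlate(env, env, mode="full")[len(env) - 1:]
    lag = min_lag + int(np.argmax(acf[min_lag:max_lag]))
    return lag   # samples per fault period; re-estimated after each filtering pass
```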

  5. Airway Basal Cell Heterogeneity and Lung Squamous Cell Carcinoma.

    Science.gov (United States)

    Hynds, Robert E; Janes, Sam M

    2017-09-01

    Basal cells are stem/progenitor cells that maintain airway homeostasis, enact repair following epithelial injury, and are a candidate cell-of-origin for lung squamous cell carcinoma. Heterogeneity of basal cells is recognized in terms of gene expression and differentiation capacity. In this issue, Pagano and colleagues isolate a subset of immortalized basal cells that are characterized by high motility, suggesting that they might also be heterogeneous in their biophysical properties. Motility-selected cells displayed an increased ability to colonize the lung in vivo. The possible implications of these findings are discussed in terms of basal cell heterogeneity, epithelial cell migration, and modeling of metastasis that occurs early in cancer evolution. Cancer Prev Res; 10(9); 491-3. ©2017 AACR. See related article by Pagano et al., p. 514.

  6. Lung density

    DEFF Research Database (Denmark)

    Garnett, E S; Webber, C E; Coates, G

    1977-01-01

    The density of a defined volume of the human lung can be measured in vivo by a new noninvasive technique. A beam of gamma-rays is directed at the lung and, by measuring the scattered gamma-rays, lung density is calculated. The density in the lower lobe of the right lung in normal man during quiet...... breathing in the sitting position ranged from 0.25 to 0.37 g.cm-3. Subnormal values were found in patients with emphysema. In patients with pulmonary congestion and edema, lung density values ranged from 0.33 to 0.93 g.cm-3. The lung density measurement correlated well with the findings in chest radiographs...... but the lung density values were more sensitive indices. This was particularly evident in serial observations of individual patients....

  7. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools

    International Nuclear Information System (INIS)

    Bade, Richard; Causanilles, Ana; Emke, Erik; Bijlsma, Lubertus; Sancho, Juan V.; Hernandez, Felix; Voogt, Pim de

    2016-01-01

    A screening approach was applied to influent and effluent wastewater samples. After injection in a LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of > 200 pharmaceuticals and illicit drugs or ChemSpider. This hidden target screening approach led to the detection of numerous compounds including the illicit drug cocaine and its metabolite benzoylecgonine and the pharmaceuticals carbamazepine, gemfibrozil and losartan. The compounds found using both approaches were combined, and isotopic pattern and retention time prediction were used to filter out false positives. The remaining potential positives were reanalysed in MS/MS mode and their product ions were compared with literature and/or mass spectral libraries. The inclusion of the chemical database ChemSpider led to the tentative identification of several metabolites, including paraxanthine, theobromine, theophylline and carboxylosartan, as well as the pharmaceutical phenazone. The first three of these compounds are isomers and they were subsequently distinguished based on their product ions and predicted retention times. This work has shown that the use of deconvolution tools facilitates non-target screening and enables the identification of a higher number of compounds. - Highlights: • A hidden target non-target screening method is utilised using two databases • Two software tools (MsXelerator and Sieve 2.1) used for both methods • 22 compounds tentatively identified following MS/MS reinjection • More information gleaned from this combined approach than individually

  8. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools

    Energy Technology Data Exchange (ETDEWEB)

    Bade, Richard [Research Institute for Pesticides and Water, University Jaume I, Avda. Sos Baynat s/n, E-12071 Castellón (Spain); Causanilles, Ana; Emke, Erik [KWR Watercycle Research Institute, Chemical Water Quality and Health, P.O. Box 1072, 3430 BB Nieuwegein (Netherlands); Bijlsma, Lubertus; Sancho, Juan V.; Hernandez, Felix [Research Institute for Pesticides and Water, University Jaume I, Avda. Sos Baynat s/n, E-12071 Castellón (Spain); Voogt, Pim de, E-mail: w.p.devoogt@uva.nl [KWR Watercycle Research Institute, Chemical Water Quality and Health, P.O. Box 1072, 3430 BB Nieuwegein (Netherlands); Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, P.O. Box 94248, 1090 GE Amsterdam (Netherlands)

    2016-11-01

    A screening approach was applied to influent and effluent wastewater samples. After injection in a LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of > 200 pharmaceuticals and illicit drugs or ChemSpider. This hidden target screening approach led to the detection of numerous compounds including the illicit drug cocaine and its metabolite benzoylecgonine and the pharmaceuticals carbamazepine, gemfibrozil and losartan. The compounds found using both approaches were combined, and isotopic pattern and retention time prediction were used to filter out false positives. The remaining potential positives were reanalysed in MS/MS mode and their product ions were compared with literature and/or mass spectral libraries. The inclusion of the chemical database ChemSpider led to the tentative identification of several metabolites, including paraxanthine, theobromine, theophylline and carboxylosartan, as well as the pharmaceutical phenazone. The first three of these compounds are isomers and they were subsequently distinguished based on their product ions and predicted retention times. This work has shown that the use of deconvolution tools facilitates non-target screening and enables the identification of a higher number of compounds. - Highlights: • A hidden target non-target screening method is utilised using two databases • Two software tools (MsXelerator and Sieve 2.1) used for both methods • 22 compounds tentatively identified following MS/MS reinjection • More information gleaned from this combined approach than individually.

  9. A joint Richardson—Lucy deconvolution algorithm for the reconstruction of multifocal structured illumination microscopy data

    International Nuclear Information System (INIS)

    Ströhl, Florian; Kaminski, Clemens F

    2015-01-01

    We demonstrate the reconstruction of images obtained by multifocal structured illumination microscopy, MSIM, using a joint Richardson–Lucy, jRL-MSIM, deconvolution algorithm, which is based on an underlying widefield image-formation model. The method is efficient in the suppression of out-of-focus light and greatly improves image contrast and resolution. Furthermore, it is particularly well suited for the processing of noise-corrupted data. The principle is verified on simulated as well as experimental data, and a comparison of the jRL-MSIM approach with the standard reconstruction procedure, which is based on image scanning microscopy, ISM, is made. Our algorithm is efficient and freely available in a user-friendly software package. (paper)
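
    The joint algorithm builds on the classical Richardson-Lucy update; a minimal single-view Richardson-Lucy deconvolution in Python is sketched below (assuming a non-negative image and a normalized PSF), as a building block rather than the jRL-MSIM reconstruction itself.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
            """Basic Richardson-Lucy deconvolution (single view, sketch only)."""
            estimate = np.full_like(image, image.mean(), dtype=float)
            psf_mirror = psf[::-1, ::-1]              # adjoint of convolution with the PSF
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode='same')
                ratio = image / (blurred + eps)       # multiplicative correction term
                estimate *= fftconvolve(ratio, psf_mirror, mode='same')
            return estimate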

  10. Lung Cancer Screening

    Science.gov (United States)

    ... factors increase or decrease the risk of lung cancer. Lung cancer is a disease in which malignant (cancer) ... following PDQ summaries for more information about lung cancer: Lung Cancer Prevention Non-Small Cell Lung Cancer Treatment ...

  11. Nutrition for Lung Cancer

    Science.gov (United States)

    ... Become An Advocate Volunteer Ways To Give Lung Cancer www.lung.org > Lung Health and Diseases > Lung Disease Lookup > ... Cancer Learn About Lung Cancer What Is Lung Cancer Lung Cancer Basics Causes & Risk Factors Lung Cancer Staging ...

  12. Using Fractal And Morphological Criteria For Automatic Classification Of Lung Diseases

    Science.gov (United States)

    Vehel, Jacques Levy

    1989-11-01

    Medical images are difficult to analyze by means of classical image processing tools because they are very complex and irregular. Such shapes are obtained, for instance, in nuclear medicine from the spatial distribution of activity for organs such as the lungs, liver, and heart. We have applied two different theories to these signals: fractal geometry, which deals with the analysis of complex irregular shapes that cannot be well described by classical Euclidean geometry, and integral geometry, which treats sets globally and allows the introduction of robust measures. We have computed three parameters on three kinds of lung SPECT images (normal, pulmonary embolism, and chronic disease): the commonly used fractal dimension (FD), which gives a measurement of the irregularity of the 3D shape; the generalized lacunarity dimension (GLD), defined as the variance of the ratio of the local activity to the mean activity, which is sensitive only to the distribution and size of gaps in the surface; and the Favard length, which gives an approximation of the surface area of a 3D shape. The results show that each slice of the lung, considered as a 3D surface, is fractal and that the fractal dimension is the same for each slice and for the three kinds of lungs; the lacunarity and Favard length, in contrast, are clearly different for normal lungs, pulmonary embolism, and chronic disease. These results indicate that automatic classification of lung SPECT images can be achieved, and that a quantitative measurement of the evolution of the disease could be made.
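
    As a concrete illustration of the fractal-dimension measurement, a standard box-counting estimate for a 2D binary mask is sketched below; the paper works with 3D count-distribution surfaces from SPECT, so this is only a simplified analogue, and the names and box sizes are illustrative.

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            """Box-counting fractal dimension of a 2D binary mask (sketch, assumes a non-empty mask)."""
            counts = []
            for s in sizes:
                H = int(np.ceil(mask.shape[0] / s)) * s   # pad to a multiple of s
                W = int(np.ceil(mask.shape[1] / s)) * s
                padded = np.zeros((H, W), dtype=bool)
                padded[:mask.shape[0], :mask.shape[1]] = mask
                boxes = padded.reshape(H // s, s, W // s, s).any(axis=(1, 3))
                counts.append(boxes.sum())                # occupied boxes at scale s
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope                                 # FD = -d log N / d log s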

  13. Suspected-target pesticide screening using gas chromatography-quadrupole time-of-flight mass spectrometry with high resolution deconvolution and retention index/mass spectrum library.

    Science.gov (United States)

    Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong

    2014-10-01

    A strategy for suspected-target screening of pesticide residues in complicated matrices was developed using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by high-resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra, or of representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow to real samples, three matrices (apple, spinach, and scallion), each spiked with 165 test pesticides over a range of concentrations, were selected as models. The results showed that the use of a high-resolution TOF enabled effective extraction of spectra from noisy chromatograms based on a narrow mass window (5 mDa), with suspected-target compounds identified by similarity matching of the deconvoluted full mass spectra and filtering of linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, product ion spectra were capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library appears to be one of the most efficient tools for the analysis of suspected-target pesticide residues.

  14. Deep Deconvolutional Neural Network for Target Segmentation of Nasopharyngeal Cancer in Planning Computed Tomography Images.

    Science.gov (United States)

    Men, Kuo; Chen, Xinyuan; Zhang, Ye; Zhang, Tao; Dai, Jianrong; Yi, Junlin; Li, Yexiong

    2017-01-01

    Radiotherapy is one of the main treatment methods for nasopharyngeal carcinoma (NPC). It requires exact delineation of the nasopharynx gross tumor volume (GTVnx), the metastatic lymph node gross tumor volume (GTVnd), the clinical target volume (CTV), and organs at risk in the planning computed tomography images. However, this task is time-consuming and operator dependent. In the present study, we developed an end-to-end deep deconvolutional neural network (DDNN) for segmentation of these targets. The proposed DDNN is an end-to-end architecture enabling fast training and testing. It consists of two important components: an encoder network and a decoder network. The encoder network was used to extract the visual features of a medical image and the decoder network was used to recover the original resolution by deploying deconvolution. A total of 230 patients diagnosed with NPC stage I or stage II were included in this study. Data from 184 patients were chosen randomly as a training set to adjust the parameters of DDNN, and the remaining 46 patients were the test set to assess the performance of the model. The Dice similarity coefficient (DSC) was used to quantify the segmentation results of the GTVnx, GTVnd, and CTV. In addition, the performance of DDNN was compared with the VGG-16 model. The proposed DDNN method outperformed the VGG-16 in all the segmentation. The mean DSC values of DDNN were 80.9% for GTVnx, 62.3% for the GTVnd, and 82.6% for CTV, whereas VGG-16 obtained 72.3, 33.7, and 73.7% for the DSC values, respectively. DDNN can be used to segment the GTVnx and CTV accurately. The accuracy for the GTVnd segmentation was relatively low due to the considerable differences in its shape, volume, and location among patients. The accuracy is expected to increase with more training data and combination of MR images. In conclusion, DDNN has the potential to improve the consistency of contouring and streamline radiotherapy workflows, but careful human review and a
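
    The encoder-decoder idea described here (convolutional downsampling followed by transposed-convolution upsampling back to the input resolution) can be sketched in PyTorch as below; the layer sizes and class name are illustrative and far smaller than the DDNN of the study.

        import torch.nn as nn

        class TinyEncoderDecoder(nn.Module):
            """Minimal encoder-decoder segmentation sketch (not the authors' DDNN)."""
            def __init__(self, n_classes=4):
                super().__init__()
                self.encoder = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),                               # 1/2 resolution
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),                               # 1/4 resolution
                )
                self.decoder = nn.Sequential(
                    nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),  # "deconvolution" upsampling
                    nn.ConvTranspose2d(16, n_classes, 2, stride=2),      # back to input resolution
                )

            def forward(self, x):                                  # x: (N, 1, H, W), H and W divisible by 4
                return self.decoder(self.encoder(x))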

  15. Single-Lung Ventilation with Contralateral Lung Deflation

    Science.gov (United States)

    Dallan, Luís Alberto O.; Lisboa, Luiz Augusto F.; Platania, Fernando; Oliveira, Sérgio A.; Stolf, Noedir A.

    2007-01-01

    There are many new alternative methods of minimally invasive myocardial revascularization that can be applied in selected patients who have multivessel coronary artery disease. However, these techniques often require new and expensive equipment. Most multivessel myocardial revascularization is performed via median sternotomy and involves the use of a conventional endotracheal tube. Both lungs are ventilated, and frequently the left pleural cavity is opened. In contrast, single-lung deflation naturally moves the mediastinum within the thorax toward the collapsed lung, without the need to open the pleural cavities. Herein, we describe a simple alternative procedure that facilitates off-pump multivessel coronary artery bypass grafting via complete median sternotomy: single-lung ventilation with contralateral lung deflation. This technique better exposes the more distal right and circumflex coronary artery branches with or without the opening of the pleural cavities. PMID:17622364

  16. Data-driven Green's function retrieval and application to imaging with multidimensional deconvolution

    Science.gov (United States)

    Broggini, Filippo; Wapenaar, Kees; van der Neut, Joost; Snieder, Roel

    2014-01-01

    An iterative method is presented that allows one to retrieve the Green's function originating from a virtual source located inside a medium using reflection data measured only at the acquisition surface. In addition to the reflection response, an estimate of the travel times corresponding to the direct arrivals is required. However, no detailed information about the heterogeneities in the medium is needed. The iterative scheme generalizes the Marchenko equation for inverse scattering to the seismic reflection problem. To give insight in the mechanism of the iterative method, its steps for a simple layered medium are analyzed using physical arguments based on the stationary phase method. The retrieved Green's wavefield is shown to correctly contain the multiples due to the inhomogeneities present in the medium. Additionally, a variant of the iterative scheme enables decomposition of the retrieved wavefield into its downgoing and upgoing components. These wavefields then enable creation of a ghost-free image of the medium with either cross correlation or multidimensional deconvolution, presenting an advantage over standard prestack migration.
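
    As a toy illustration of the deconvolution imaging step, a 1D frequency-domain deconvolution with water-level regularization is sketched below; the actual multidimensional deconvolution operates on matrices of traces per frequency, so this is only a simplified stand-in with illustrative names.

        import numpy as np

        def spectral_division(upgoing, downgoing, eps=1e-3):
            """Regularized 1D deconvolution of an upgoing by a downgoing wavefield (sketch)."""
            U = np.fft.rfft(upgoing)
            D = np.fft.rfft(downgoing)
            denom = (D * np.conj(D)).real
            denom = np.maximum(denom, eps * denom.max())   # water-level stabilization
            R = U * np.conj(D) / denom                     # reflectivity estimate per frequency
            return np.fft.irfft(R, n=len(upgoing))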

  17. Measurement and deconvolution of detector response time for short HPM pulses: Part 1, Microwave diodes

    International Nuclear Information System (INIS)

    Bolton, P.R.

    1987-06-01

    A technique is described for measuring and deconvolving the response times of microwave diode detection systems in order to generate corrected input signals representative of an infinite detection rate. The method has been applied to cases of 2.86 GHz ultra-short HPM pulse detection where the pulse rise time is comparable to that of the detector, whereas the pulse duration of a few nanoseconds is significantly longer. Results are specified in terms of the enhancement of equivalent deconvolved input voltages for given observed voltages. The convolution integral imposes the constraint of linear detector response to input power levels, which is physically equivalent to the conservation of integrated pulse energy in the deconvolution process. The applicable dynamic range of a microwave diode is therefore limited to a smaller signal region as determined by its calibration.

  18. Two cases with giant lung abscess originating in the irradiated lung field following the concurrent chemo-radiotherapy of lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Ikeda, Takeshi; Inui, Hiroyuki; Yukawa, Susumu; Nomoto, Hiroshi (Wakayama Medical Coll. (Japan)); Minakata, Yoshiaki; Yamagata, Toshiyuki

    1992-05-01

    Two patients with giant lung abscesses originating in the irradiated lung field are reported. The lung abscesses occurred during a period of leukopenia following concurrent chemo-radiotherapy of lung cancer. Both patients were diagnosed with small cell lung cancer and were treated concurrently with chemotherapy (cisplatin + etoposide) and radiotherapy (total 40-50 Gy). Case 1 was a 59-year-old male. Seven weeks after the first irradiation, a giant lung abscess caused by methicillin-resistant Staphylococcus aureus (MRSA) developed in the lung field with radiation pneumonitis, and a giant bronchial fistula was formed, which showed specific bronchofiberscopic findings. Case 2 was a 67-year-old male. Twelve weeks after the first irradiation, a giant lung abscess caused by Pseudomonas aeruginosa developed in the irradiated lung field following the formation of a pneumatocele. MRSA and Pseudomonas aeruginosa are important causes of hospital infection, and both can cause lung abscess. In our cases, however, the lung abscesses formed precisely in the irradiated lung field and enlarged rapidly. These clinical findings suggest that myelosuppression and radiation injury of the lung tissue might cause such giant lung abscesses. (author).

  19. Two cases with giant lung abscess originating in the irradiated lung field following the concurrent chemo-radiotherapy of lung cancer

    International Nuclear Information System (INIS)

    Ikeda, Takeshi; Inui, Hiroyuki; Yukawa, Susumu; Nomoto, Hiroshi; Minakata, Yoshiaki; Yamagata, Toshiyuki.

    1992-01-01

    Two patients with giant lung abscesses originating in the irradiated lung field are reported. The lung abscesses occurred during a period of leukopenia following concurrent chemo-radiotherapy of lung cancer. Both patients were diagnosed with small cell lung cancer and were treated concurrently with chemotherapy (cisplatin + etoposide) and radiotherapy (total 40-50 Gy). Case 1 was a 59-year-old male. Seven weeks after the first irradiation, a giant lung abscess caused by methicillin-resistant Staphylococcus aureus (MRSA) developed in the lung field with radiation pneumonitis, and a giant bronchial fistula was formed, which showed specific bronchofiberscopic findings. Case 2 was a 67-year-old male. Twelve weeks after the first irradiation, a giant lung abscess caused by Pseudomonas aeruginosa developed in the irradiated lung field following the formation of a pneumatocele. MRSA and Pseudomonas aeruginosa are important causes of hospital infection, and both can cause lung abscess. In our cases, however, the lung abscesses formed precisely in the irradiated lung field and enlarged rapidly. These clinical findings suggest that myelosuppression and radiation injury of the lung tissue might cause such giant lung abscesses. (author)

  20. Lung Cancers Associated with Cystic Airspaces: Underrecognized Features of Early Disease.

    Science.gov (United States)

    Sheard, Sarah; Moser, Joanna; Sayer, Charlie; Stefanidis, Konstantinos; Devaraj, Anand; Vlahos, Ioannis

    2018-01-01

    Early lung cancers associated with cystic airspaces are increasingly being recognized as a cause of delayed diagnosis, owing to data gathered from screening trials and encounters in routine clinical practice as more patients undergo serial imaging. Several morphologic subtypes of cancers associated with cystic airspaces exist and can exhibit variable patterns of progression as the solid elements of the tumor grow. Current understanding of the pathogenesis of these malignancies is limited, and the numbers of cases reported in the literature are small. However, several tumor cell types are represented in these lesions, with adenocarcinoma predominating. The features of cystic airspaces differ among cases and include emphysematous bullae, congenital or fibrotic cysts, subpleural blebs, bronchiectatic airways, and distended distal airspaces. Once identified, these cystic lesions pose management challenges to radiologists in terms of distinguishing them from benign mimics of cancer that are commonly seen in patients who also are at increased risk of lung cancer. Rendering a definitive tissue-based diagnosis can be difficult when the lesions are small, and affected patients tend to be in groups that are at higher risk of requiring biopsy or resection. In addition, the decision to monitor these cases can add to patient anxiety and cause the additional burden of strained departmental resources. The authors have drawn from their experience, emerging evidence from international lung cancer screening trials, and large databases of lung cancer cases from other groups to analyze the prevalence and evolution of lung cancers associated with cystic airspaces and provide guidance for managing these lesions. Although there are insufficient data to support specific management guidelines similar to those for managing small solid and ground-glass lung nodules, these data and guidelines should be the direction for ongoing research on early detection of lung cancer. © RSNA, 2018.

  1. Lung Emergencies

    Science.gov (United States)

    ... The Marfan Foundation Marfan & Related Disorders What is Marfan Syndrome? What are Related Disorders? What are the Signs? ... Emergencies Lung Emergencies Surgeries Lung Emergencies People with Marfan syndrome can be at increased risk of sudden lung ...

  2. Study of the lifetime of the TL peaks of quartz: comparison of the deconvolution using the first order kinetic with the initial rise method

    International Nuclear Information System (INIS)

    RATOVONJANAHARY, A.J.F.

    2005-01-01

    Quartz is a thermoluminescent material which can be used for dating and/or for dosimetry. This material has been used since the 1960s for dating samples such as pottery, flint, etc., but the method is still subject to improvement. One of the problems of thermoluminescence dating is the estimation of the lifetime of the "used peak". The glow-curve deconvolution (GCD) technique, which resolves a composite thermoluminescence glow curve into its individual glow peaks, has been applied widely since the 1980s, and many functions describing a single glow peak have been proposed. To analyse the behaviour of quartz, thermoluminescence GCD functions are compared for first-order kinetics. The free parameters of the GCD functions are the maximum peak intensity (Im) and the maximum peak temperature (Tm), which can be obtained experimentally; the activation energy (E) is the additional free parameter. The lifetime (τ) of each glow peak, which is an important factor for dating, is calculated from these three parameters. For the lifetime analysis of the used peak, GCD results are compared with those from the initial rise method (IRM). Results vary fairly strongly from method to method. [fr]
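
    For first-order kinetics the lifetime follows from the standard Randall-Wilkins relations: the peak-maximum condition gives the frequency factor s from E and Tm, and the lifetime at a storage temperature T is then s^-1 exp(E/kT). A minimal sketch is given below, assuming a linear heating rate beta in K/s; the function name and default values are illustrative, not the code used in the thesis.

        import numpy as np

        K_B = 8.617e-5  # Boltzmann constant, eV/K

        def trap_lifetime(E, Tm, beta=1.0, T_storage=293.0):
            """First-order (Randall-Wilkins) frequency factor s and lifetime tau (sketch).
            E in eV, Tm and T_storage in K, beta in K/s."""
            s = beta * E / (K_B * Tm**2) * np.exp(E / (K_B * Tm))   # from the peak-maximum condition
            tau = np.exp(E / (K_B * T_storage)) / s                 # lifetime at storage temperature, s
            return s, tau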

  3. Deconvolution analysis of 24-h serum cortisol profiles informs the amount and distribution of hydrocortisone replacement therapy.

    Science.gov (United States)

    Peters, Catherine J; Hill, Nathan; Dattani, Mehul T; Charmandari, Evangelia; Matthews, David R; Hindmarsh, Peter C

    2013-03-01

    Hydrocortisone therapy is based on a dosing regimen derived from estimates of cortisol secretion, but little is known of how the dose should be distributed throughout the 24 h. We have used deconvolution analysis of 24-h serum cortisol profiles to determine 24-h cortisol secretion and distribution to inform hydrocortisone dosing schedules in young children and older adults. Twenty-four-hour serum cortisol profiles from 80 adults (41 men, aged 60-74 years) and 29 children (24 boys, aged 5-9 years) were subjected to deconvolution analysis using an 80-min half-life to ascertain total cortisol secretion and distribution throughout the 24-h period. Mean daily cortisol secretion was similar between adults (6.3 mg/m(2) body surface area/day, range 5.1-9.3) and children (8.0 mg/m(2) body surface area/day, range 5.3-12.0). Peak serum cortisol concentration was higher in children compared with adults, whereas nadir serum cortisol concentrations were similar. Timing of the peak serum cortisol concentration was similar (07.05-07.25), whereas that of the nadir concentration occurred later in adults (midnight) compared with children (22.48) (P = 0.003). Children had the highest percentage of cortisol secretion between 06.00 and 12.00 (38.4%), whereas in adults this took place between midnight and 06.00 (45.2%). These observations suggest that the daily hydrocortisone replacement dose should be equivalent on average to 6.3 mg/m(2) body surface area/day in adults and 8.0 mg/m(2) body surface area/day in children. Differences in distribution of the total daily dose between older adults and young children need to be taken into account when using a three or four times per day dosing regimen. © 2012 Blackwell Publishing Ltd.
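
    Under a single-exponential elimination model with the 80-min half-life used in the study, the secretion rate can be recovered from a sampled concentration profile as S(t) = C'(t) + k C(t); a minimal finite-difference sketch is shown below, with illustrative names and secretion expressed in concentration units per minute, not the authors' deconvolution software.

        import numpy as np

        def secretion_from_profile(times_min, conc, half_life_min=80.0):
            """Deconvolve a serum concentration profile into a secretion-rate profile (sketch)."""
            k = np.log(2) / half_life_min              # elimination rate constant, 1/min
            conc = np.asarray(conc, dtype=float)
            dCdt = np.gradient(conc, times_min)        # numerical derivative of the profile
            secretion = dCdt + k * conc                # S(t) = C'(t) + k*C(t)
            return np.clip(secretion, 0.0, None)       # secretion cannot be negative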

  4. Lung Cancer

    International Nuclear Information System (INIS)

    Maghfoor, Irfan; Perry, M.C.

    2005-01-01

    Lung cancer is the leading cause of cancer-related mortality. Since tobacco smoking is the cause in the vast majority of cases, the incidence of lung cancer is expected to rise in those countries with a high or rising incidence of tobacco smoking. Even though populations at risk of developing lung cancer are easily identified, mass screening for lung cancer is not supported by currently available evidence. In the case of non-small cell lung cancer, a cure may be possible with surgical resection followed by post-operative chemotherapy in those diagnosed at an early stage. A small minority of patients who present with locally advanced disease may also benefit from preoperative chemotherapy and/or radiation therapy to downstage the tumor and render it potentially operable. In the vast majority of patients, however, lung cancer presents at an advanced stage and a cure is not possible with currently available therapeutic strategies. Similarly, small cell lung cancer confined to one hemithorax may be curable with a combination of chemotherapy and thoracic irradiation followed by prophylactic cranial irradiation, if complete remission is achieved at the primary site. Small cell lung cancer that has spread beyond the confines of one hemithorax is, however, considered incurable. In this era of molecular targeted therapies, new agents are constantly undergoing pre-clinical and clinical testing with the aim of targeting the molecular pathways thought to be involved in the etiology and pathogenesis of lung cancer. (author)

  5. Lung Cancer—Patient Version

    Science.gov (United States)

    The two main types of lung cancer are non-small cell lung cancer and small cell lung cancer. Smoking causes most lung cancers, but nonsmokers can also develop lung cancer. Start here to find information on lung cancer treatment, causes and prevention, screening, research, and statistics on lung cancer.

  6. Analysis of gravity data beneath Endut geothermal prospect using horizontal gradient and Euler deconvolution

    Science.gov (United States)

    Supriyanto, Noor, T.; Suhanto, E.

    2017-07-01

    The Endut geothermal prospect is located in Banten Province, Indonesia. The geological setting of the area is dominated by Quaternary volcanics, Tertiary sediments, and Tertiary rock intrusions. The area has been through the preliminary study phase of geology, geochemistry, and geophysics. As part of the geophysical study, gravity measurements were carried out and analyzed in order to understand the geological conditions, especially the subsurface fault structures that control the geothermal system in the Endut area. After preconditioning was applied to the gravity data, the complete Bouguer anomaly was analyzed using derivative methods, namely the Horizontal Gradient (HG) and Euler Deconvolution (ED), to clarify the existence of fault structures. These techniques detected the boundaries of anomalous bodies and fault structures, which were compared with the lithologies in the geological map. The results of the analysis will be useful for building a more realistic conceptual model of the Endut geothermal area.
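
    The horizontal gradient itself is simple to compute on a gridded Bouguer anomaly; a minimal sketch is given below (grid spacings dx and dy are assumed known, and maxima of HG are taken to mark steep lateral density contrasts such as faults).

        import numpy as np

        def horizontal_gradient(g, dx, dy):
            """Total horizontal gradient of a gridded anomaly g (rows along y, columns along x)."""
            dgdy, dgdx = np.gradient(g, dy, dx)   # derivatives along axis 0 (y) and axis 1 (x)
            return np.hypot(dgdx, dgdy)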

  7. Further optimization of SeDDaRA blind image deconvolution algorithm and its DSP implementation

    Science.gov (United States)

    Wen, Bo; Zhang, Qiheng; Zhang, Jianlin

    2011-11-01

    Efficient algorithms for blind image deconvolution and their high-speed implementation are of great value in practice. A further optimization of SeDDaRA is developed, covering both the algorithm structure and the numerical calculation methods. The main optimizations are: modularization of the structure for good implementation feasibility, reduction of the data computation and dependency of the 2D FFT/IFFT, and acceleration of the power operation by a segmented look-up table. The resulting Fast SeDDaRA is proposed and specialized for low complexity. As the final implementation, a hardware image restoration system is built using multi-DSP parallel processing. Experimental results show that the processing time and memory demand of Fast SeDDaRA decrease by at least 50%, and that the data throughput of the image restoration system exceeds 7.8 Msps. The optimization is proved efficient and feasible, and Fast SeDDaRA is able to support real-time applications.

  8. Enhanced tumor growth in the remaining lung after major lung resection.

    Science.gov (United States)

    Sano, Fumiho; Ueda, Kazuhiro; Murakami, Junichi; Hayashi, Masataro; Nishimoto, Arata; Hamano, Kimikazu

    2016-05-01

    Pneumonectomy induces active growth of the remaining lung in order to compensate for lost lung tissue. We hypothesized that tumor progression is enhanced in the activated local environment. We examined the effects of mechanical strain on the activation of lung growth and tumor progression in mice. The mechanical strain imposed on the right lung after left pneumonectomy was neutralized by filling the empty space that remained after pneumonectomy with a polypropylene prosthesis. The neutralization of the strain prevented active lung growth. According to an angiogenesis array, stronger monocyte chemoattractant protein-1 (MCP-1) expression was found in the strain-induced growing lung. The neutralization of the strain attenuated the release of MCP-1 from the lung cells. The intravenous injection of Lewis lung cancer cells resulted in the enhanced development of metastatic foci in the strain-induced growing lung, but the enhanced development was canceled by the neutralization of the strain. An immunohistochemical analysis revealed the prominent accumulation of tumor-associated macrophages in tumors arising in the strain-induced growing lung, and that there was a relationship between the accumulation and the MCP-1 expression status. Our results suggested that mechanical lung strain, induced by pulmonary resection, triggers active lung growth, thereby creating a tumor-friendly environment. The modification of that environment, as well as the minimizing of surgical stress, may be a meaningful strategy to improve the therapeutic outcome after lung cancer surgery. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Lung Cancer Prevention

    Science.gov (United States)

    ... Colorectal Cancer Kidney (Renal Cell) Cancer Leukemia Liver Cancer Lung Cancer Lymphoma Pancreatic Cancer Prostate Cancer Skin Cancer ... following PDQ summaries for more information about lung cancer: Lung Cancer Screening Non-Small Cell Lung Cancer Treatment ...

  10. Abscess in the Lungs

    Science.gov (United States)

    ... Home Lung and Airway Disorders Abscess in the Lungs Abscess in the Lungs Causes Symptoms Diagnosis Treatment Resources ... here for the Professional Version Abscess in the Lungs Abscess in the Lungs A lung abscess is a ...

  11. Lung nodules after whole lung radiation

    International Nuclear Information System (INIS)

    Cohen, M.D.; Mirkin, D.L.; Provisor, A.; Hornback, N.B.; Smith, J.A.; Slabaugh, R.D.

    1983-01-01

    It is essential to recognize radiation pneumonitis after whole lung irradiation, or nodular changes in response to chemotherapy, so that such conditions are not mistaken for tumor metastases, which would cause grave errors in patient management and the possibility of further lung damage.

  12. Increased mean lung density: Another independent predictor of lung cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Sverzellati, Nicola, E-mail: nicola.sverzellati@unipr.it [Department of Department of Surgical Sciences, Section of Diagnostic Imaging, University of Parma, Padiglione Barbieri, University Hospital of Parma, V. Gramsci 14, 43100 Parma (Italy); Randi, Giorgia, E-mail: giorgia.randi@marionegri.it [Department of Epidemiology, Mario Negri Institute, Via La Masa 19, 20156 Milan (Italy); Spagnolo, Paolo, E-mail: paolo.spagnolo@unimore.it [Respiratory Disease Unit, Center for Rare Lung Disease, Department of Oncology, Hematology and Respiratory Disease, University of Modena and Reggio Emilia, Via del Pozzo 71, 44124 Modena (Italy); Marchianò, Alfonso, E-mail: alfonso.marchiano@istitutotumori.mi.it [Department of Radiology, Fondazione IRCCS Istituto Nazionale dei Tumori, Via Venezian 1, 20133 Milan (Italy); Silva, Mario, E-mail: mac.mario@hotmail.it [Department of Department of Surgical Sciences, Section of Diagnostic Imaging, University of Parma, Padiglione Barbieri, University Hospital of Parma, V. Gramsci 14, 43100 Parma (Italy); Kuhnigk, Jan-Martin, E-mail: Jan-Martin.Kuhnigk@mevis.fraunhofer.de [Fraunhofer MEVIS, Universitaetsallee 29, 28359 Bremen (Germany); La Vecchia, Carlo, E-mail: carlo.lavecchia@marionegri.it [Department of Occupational Health, University of Milan, Via Venezian 1, 20133 Milan (Italy); Zompatori, Maurizio, E-mail: maurizio.zompatori@unibo.it [Department of Radiology, Cardio-Thoracic Section, S. Orsola-Malpighi Hospital, Via Albertoni 15, 40138 Bologna (Italy); Pastorino, Ugo, E-mail: ugo.pastorino@istitutotumori.mi.it [Department of Surgery, Section of Thoracic Surgery, Fondazione IRCCS Istituto Nazionale dei Tumori, Via Venezian 1, 20133 Milan (Italy)

    2013-08-15

    Objectives: To investigate the relationship between emphysema phenotype, mean lung density (MLD), lung function and lung cancer by using an automated multiple feature analysis tool on thin-section computed tomography (CT) data. Methods: Both emphysema phenotype and MLD evaluated by automated quantitative CT analysis were compared between outpatients and screening participants with lung cancer (n = 119) and controls (n = 989). Emphysema phenotype was defined by assessing features such as extent, distribution on core/peel of the lung and hole size. Adjusted multiple logistic regression models were used to evaluate independent associations of CT densitometric measurements and pulmonary function test (PFT) with lung cancer risk. Results: No emphysema feature was associated with lung cancer. Lung cancer risk increased with decreasing values of forced expiratory volume in 1 s (FEV{sub 1}) independently of MLD (OR 5.37, 95% CI: 2.63–10.97 for FEV{sub 1} < 60% vs. FEV{sub 1} ≥ 90%), and with increasing MLD independently of FEV{sub 1} (OR 3.00, 95% CI: 1.60–5.63 for MLD > −823 vs. MLD < −857 Hounsfield units). Conclusion: Emphysema per se was not associated with lung cancer whereas decreased FEV{sub 1} was confirmed as being a strong and independent risk factor. The cross-sectional association between increased MLD and lung cancer requires future validations.

  13. What Is Lung Cancer?

    Science.gov (United States)

    ... Shareable Graphics Infographics “African-American Men and Lung Cancer” “Lung Cancer Is the Biggest Cancer Killer in Both ... starts in the lungs, it is called lung cancer. Lung cancer begins in the lungs and may spread ...

  14. Lung Cancer: Glossary

    Science.gov (United States)

    ... professional support team today. Learn More . Find more lung cancer resources. Learn More Donate Today! What is Lung ... to Give How Your Support Helps Events Lung Cancer Awareness © Lung Cancer Alliance. The information presented in this website ...

  15. Thermoluminescence of nanocrystalline CaSO{sub 4}: Dy for gamma dosimetry and calculation of trapping parameters using deconvolution method

    Energy Technology Data Exchange (ETDEWEB)

    Mandlik, Nandkumar, E-mail: ntmandlik@gmail.com [Department of Physics, University of Pune, Ganeshkhind, Pune -411007, India and Department of Physics, Fergusson College, Pune- 411004 (India); Patil, B. J.; Bhoraskar, V. N.; Dhole, S. D. [Department of Physics, University of Pune, Ganeshkhind, Pune -411007 (India); Sahare, P. D. [Department of Physics and Astrophysics, University of Delhi, Delhi- 110007 (India)

    2014-04-24

    Nanorods of CaSO{sub 4}: Dy having a diameter of 20 nm and a length of 200 nm have been synthesized by the chemical coprecipitation method. These samples were irradiated with gamma radiation at doses varying from 0.1 Gy to 50 kGy and their TL characteristics have been studied. The TL dose response shows a linear behavior up to 5 kGy and saturates with a further increase in dose. A Computerized Glow Curve Deconvolution (CGCD) program was used for the analysis of the TL glow curves. Trapping parameters for the various peaks have been calculated using the CGCD program.

  16. Open lung biopsy

    Science.gov (United States)

    Biopsy - open lung ... An open lung biopsy is done in the hospital using general anesthesia . This means you will be asleep and ... The open lung biopsy is done to evaluate lung problems seen on x-ray or CT scan .

  17. Restrictive allograft syndrome after lung transplantation: new radiological insights

    Energy Technology Data Exchange (ETDEWEB)

    Dubbeldam, Adriana; Barthels, Caroline; Coolen, Johan; Verschakelen, Johny A.; Wever, Walter de [University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Verleden, Stijn E.; Vos, Robin; Verleden, Geert M. [University Hospitals Leuven, Department of Pneumology, Leuven (Belgium)

    2017-07-15

    To describe the CT changes in patients with restrictive allograft syndrome (RAS) after lung transplantation, before and after clinical diagnosis. This retrospective study included 22 patients with a clinical diagnosis of RAS. Diagnosis was based on a combination of forced expiratory volume (FEV1) decline (≥20 %) and total lung capacity (TLC) decline (≥10 %). All available CT scans after transplantation were analyzed for the appearance and evolution of lung abnormalities. In 14 patients, non-regressing nodules and reticulations predominantly affecting the upper lobes developed an average of 13.9 months prior to the diagnosis of RAS. Median graft survival after the onset of non-regressing abnormalities was 33.5 months, with most patients still in follow-up (9/14). In eight patients, a sudden appearance of diffuse consolidations mainly affecting both the upper and lower lobes was seen an average of 2.8 months prior to the diagnosis of RAS. Median graft survival was 6.4 months after the first onset of non-regressing abnormalities, with graft loss in most patients (6/8). RAS has previously been described as a homogeneous group. However, our study shows two different groups of RAS patients: one with slow progression and one with fast progression. The two groups show different onset and progression patterns of CT abnormalities. (orig.)

  18. Bacterial lung abscess

    International Nuclear Information System (INIS)

    Groskin, S.A.; Panicek, D.M.; Ewing, D.K.; Rivera, F.; Math, K.; Teixeira, J.; Heitzman, E.R.

    1987-01-01

    A retrospective review of patients with bacterial lung abscess was carried out. Demographic, clinical, and radiographical features of this patient group are compared with similar data from patients with empyema and/or cavitated lung carcinoma; differential diagnostic points are stressed. The entity of radiographically occult lung abscess is discussed. Complications associated with bacterial lung abscess are discussed. Current therapeutic options and treatment philosophy for patients with bacterial lung abscess are noted

  19. The Murine Lung Microbiome Changes During Lung Inflammation and Intranasal Vancomycin Treatment

    Science.gov (United States)

    Barfod, Kenneth Klingenberg; Vrankx, Katleen; Mirsepasi-Lauridsen, Hengameh Chloé; Hansen, Jitka Stilund; Hougaard, Karin Sørig; Larsen, Søren Thor; Ouwenhand, Arthur C.; Krogfelt, Karen Angeliki

    2015-01-01

    Most microbiome research related to airway diseases has focused on the gut microbiome. This is despite advances in culture-independent microbial identification techniques revealing that even healthy lungs possess a unique, dynamic microbiome. This conceptual change raises the question of whether lung diseases could be causally linked to dysbiosis of the local lung microbiota. Here, we manipulate the murine lung and gut microbiome in order to show that the lung microbiota can be changed experimentally. We used four different approaches: lung inflammation by exposure to carbon nanotube particles, oral probiotics, and oral or intranasal exposure to the antibiotic vancomycin. Bacterial DNA was extracted from broncho-alveolar and nasal lavage fluids and caecum samples and compared by DGGE. Our results show that the lung microbiota is sex dependent and not just a reflection of the gut microbiota, and that induced inflammation can change the lung microbiota. This change is not transferred to offspring. Oral probiotics in adult mice do not change the lung microbiome detectably by DGGE. Nasal vancomycin can change the lung microbiome preferentially, while oral exposure does not. These observations should be considered in future studies of the causal relationship between lung microbiota and lung diseases. PMID:26668669

  20. Lung needle biopsy

    Science.gov (United States)

    ... if you have certain lung diseases such as emphysema. Usually, a collapsed lung after a biopsy does not need treatment. But ... any type Bullae (enlarged alveoli that occur with emphysema) Cor pulmonale (condition ... of the lung High blood pressure in the lung arteries Severe ...

  1. Lung Segmentation in Thorax X-Ray Images Using Distance Regularized Level Set Evolution (DRLSE)

    Directory of Open Access Journals (Sweden)

    M Amin Hariyadi

    2017-03-01

    Full Text Available -- The lung is one of the organs responsible for the circulation of air (oxygen) in the human body, so detection of disorders of the human respiratory system is urgently needed. To detect disturbances in the lungs, X-ray imaging is used; the resulting thorax X-ray image contains information used to analyze and determine the shape of the lungs. To obtain such information, a segmentation process is required. In this study the Distance Regularized Level Set Evolution (DRLSE) method was used, a region-based model that is an improvement over edge-based models. The purpose of this study was to implement the DRLSE segmentation method on the lungs in thorax X-ray images. Trials of the system with the DRLSE method performed on 20 thorax X-ray images gave an average accuracy of 87.90%, a sensitivity of 76.27%, and a specificity of 93.98%.

  2. Hypertonic saline reduces inflammation and enhances the resolution of oleic acid induced acute lung injury

    Directory of Open Access Journals (Sweden)

    Costello Joseph F

    2008-07-01

    Full Text Available Abstract Background Hypertonic saline (HTS) reduces the severity of lung injury in ischemia-reperfusion, endotoxin-induced and ventilation-induced lung injury. However, the potential for HTS to modulate the resolution of lung injury is not known. We investigated the potential for hypertonic saline to modulate the evolution and resolution of oleic acid induced lung injury. Methods Adult male Sprague Dawley rats were used in all experiments. Series 1 examined the potential for HTS to reduce the severity of evolving oleic acid (OA) induced acute lung injury. Following intravenous OA administration, animals were randomized to receive isotonic (Control, n = 12) or hypertonic saline (HTS, n = 12), and the extent of lung injury was assessed after 6 hours. Series 2 examined the potential for HTS to enhance the resolution of oleic acid (OA) induced acute lung injury. Following intravenous OA administration, animals were randomized to receive isotonic (Control, n = 6) or hypertonic saline (HTS, n = 6), and the extent of lung injury was assessed after 6 hours. Results In Series 1, HTS significantly reduced the bronchoalveolar lavage (BAL) neutrophil count compared to Control [61.5 ± 9.08 versus 102.6 ± 11.89 × 10^3 cells.ml^-1]. However, there were no between-group differences with regard to: A-a O2 gradient [11.9 ± 0.5 vs. 12.0 ± 0.5 kPa]; arterial PO2; static lung compliance; or histologic injury. In contrast, in Series 2, hypertonic saline significantly reduced histologic injury and reduced the BAL neutrophil count [24.5 ± 5.9 versus 46.8 ± 4.4 × 10^3 cells.ml^-1] and interleukin-6 levels [681.9 ± 190.4 versus 1365.7 ± 246.8 pg.ml^-1]. Conclusion These findings demonstrate, for the first time, the potential for HTS to reduce pulmonary inflammation and enhance the resolution of oleic acid induced lung injury.

  3. INTRAVAL project phase 2. Analysis of STRIPA 3D data by a deconvolution technique

    International Nuclear Information System (INIS)

    Ilvonen, M.; Hautojaervi, A.; Paatero, P.

    1994-09-01

    The data analysed in this report were obtained in tracer experiments performed from a specially excavated drift in good granite rock at the level of 360 m below the ground in the Stripa mine. Tracer transport paths from the injection points to the collecting sheets at the tunnel walls were tens of meters long. Data for six tracers that arrived in measurable concentrations were elaborated by different means of data analysis to reveal the transport behaviour of solutes in the rock fractures. Techniques like direct inversion of the data, Fourier analysis, Singular Value Decomposition (SVD) and non-negative least squares fitting (NNLS) were employed. A newly developed code based on a general-purpose approach for solving deconvolution-type or integral equation problems, Extreme Value Estimation (EVE), proved to be a very helpful tool in deconvolving impulse responses from the injection flow rates and break-through curves of tracers and assessing the physical confidence of the results. (23 refs., 33 figs.)
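
    The non-negative least squares step mentioned above can be sketched directly: build the lower-triangular convolution matrix from the injection flow-rate record and solve for a non-negative impulse response. The function below is a simplified illustration (same time grid assumed for both records, time-step scaling absorbed into the response), not the EVE code.

        import numpy as np
        from scipy.linalg import toeplitz
        from scipy.optimize import nnls

        def nnls_deconvolve(injection, breakthrough):
            """Non-negative impulse response h such that breakthrough ~= injection * h (sketch)."""
            n = len(breakthrough)
            col = np.zeros(n)
            col[:min(len(injection), n)] = np.asarray(injection, dtype=float)[:n]
            A = toeplitz(col, np.zeros(n))        # lower-triangular convolution matrix
            h, _ = nnls(A, np.asarray(breakthrough, dtype=float))
            return h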

  4. Lung scintigraphy

    International Nuclear Information System (INIS)

    Dalenz, Roberto.

    1994-01-01

    A review of lung scintigraphy covering perfusion scintigraphy with SPECT, lung ventilation SPECT, and blood pool SPECT. The procedure for lung perfusion studies is described, including the radiopharmaceutical, its administration, clinical applications, and image processing. Results encountered and evaluation criteria after Biello and PIOPED are presented, together with recommendations and general considerations on the relation of this radiopharmaceutical to other pathologies.

  5. Unevenness on aerosol inhalation lung images and lung function

    International Nuclear Information System (INIS)

    Teshima, Takeo; Isawa, Toyoharu; Hirano, Tomio; Ebina, Akio; Shiraishi, Koichiro; Konno, Kiyoshi

    1985-01-01

    The unevenness or inhomogeneity of aerosol deposition patterns on radioaerosol inhalation lung images has been interpreted rather qualitatively in the clinical practice. We have reported our approach to quantitatively analyze the radioactive count distribution on radioaerosol inhalation lung images in relation to the actual lung function data. We have defined multiple indexes to express the shape and the unevenness of the count distribution of the lung images. To reduce as much as possible the number of indexes to be used in the regression functions, the method of selection of variables was introduced to the multiple regression analysis. Because some variables showed greater coefficients of simple correlation, while others did not, multicollinearity of variables had to be taken into consideration. For this reason, we chose a principal components regression analysis. The multiple regression function for each item of pulmonary function data thus established from analysis of 67 subjects appeared usable as a predictor of the actual lung function: for example, % VC (vital capacity) could be estimated by using four indexes out of the multiple ones with a coefficient of multiple correlation (R) of 0.753, and FEVsub(1.0) % (forced expiratory volume in one second divided by forced expiratory volume), by 7 indexes with R = 0.921. Pulmonary function data regarding lung volumes and lung mechanics were estimated more accurately with greater R's than those for lung diffusion, but even in the latter the prediction was still statistically significant at p less than 0.01. We believe the multiple regression functions thus obtained are useful for estimating not only the overall but also the regional function of the lungs. (author)

  6. Mitigation of motion artifacts in CBCT of lung tumors based on tracked tumor motion during CBCT acquisition

    International Nuclear Information System (INIS)

    Lewis, John H; Li Ruijiang; Jia Xun; Watkins, W Tyler; Song, William Y; Jiang, Steve B; Lou, Yifei

    2011-01-01

    An algorithm capable of mitigating respiratory motion blurring artifacts in cone-beam computed tomography (CBCT) lung tumor images based on the motion of the tumor during the CBCT scan is developed. The tumor motion trajectory and probability density function (PDF) are reconstructed from the acquired CBCT projection images using a recently developed algorithm (Lewis et al 2010 Phys. Med. Biol. 55 2505-22). Assuming that the effects of motion blurring can be represented by convolution of the static lung (or tumor) anatomy with the motion PDF, a cost function is defined, consisting of a data fidelity term and a total variation regularization term. Deconvolution is performed through iterative minimization of this cost function. The algorithm was tested on digital respiratory phantom, physical respiratory phantom and patient data. A clear qualitative improvement is evident in the deblurred images as compared to the motion-blurred images for all cases. Line profiles show that the tumor boundaries are more accurately and clearly represented in the deblurred images. The normalized root-mean-squared errors between the images used as ground truth and the motion-blurred images are 0.29, 0.12 and 0.30 in the digital phantom, physical phantom and patient data, respectively. Deblurring reduces the corresponding values to 0.13, 0.07 and 0.19. Application of a -700 HU threshold to the digital phantom results in tumor dimension measurements along the superior-inferior axis of 2.8, 1.8 and 1.9 cm in the motion-blurred, ground truth and deblurred images, respectively. Corresponding values for the physical phantom are 3.4, 2.7 and 2.7 cm. A threshold of -500 HU applied to the patient case gives measurements of 3.1, 1.6 and 1.7 cm along the SI axis in the CBCT, 4DCT and deblurred images, respectively. This technique could provide more accurate information about a lung tumor's size and shape on the day of treatment.
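
    The stated cost (data fidelity plus total-variation regularization, with blurring modeled as convolution with the motion PDF) can be minimized by plain gradient descent; the sketch below uses a smoothed TV term and assumes a normalized blur kernel, with illustrative parameter values rather than the authors' solver.

        import numpy as np
        from scipy.signal import fftconvolve

        def tv_deblur(blurred, kernel, lam=0.02, step=0.5, n_iter=200, eps=1e-6):
            """Gradient descent on 0.5*||k*x - y||^2 + lam*TV_smooth(x) (sketch)."""
            x = blurred.astype(float).copy()
            k_flip = kernel[::-1, ::-1]                       # adjoint of convolution with the kernel
            for _ in range(n_iter):
                resid = fftconvolve(x, kernel, mode='same') - blurred
                grad_data = fftconvolve(resid, k_flip, mode='same')
                gx = np.diff(x, axis=1, append=x[:, -1:])     # forward differences
                gy = np.diff(x, axis=0, append=x[-1:, :])
                mag = np.sqrt(gx**2 + gy**2 + eps)
                div = (np.diff(gx / mag, axis=1, prepend=(gx / mag)[:, :1]) +
                       np.diff(gy / mag, axis=0, prepend=(gy / mag)[:, :1]))
                x -= step * (grad_data - lam * div)           # TV gradient is -div(grad x / |grad x|)
            return x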

  7. Digital high-pass filter deconvolution by means of an infinite impulse response filter

    Energy Technology Data Exchange (ETDEWEB)

    Födisch, P., E-mail: p.foedisch@hzdr.de [Helmholtz-Zentrum Dresden - Rossendorf, Department of Research Technology, Bautzner Landstr. 400, 01328 Dresden (Germany); Wohsmann, J. [Helmholtz-Zentrum Dresden - Rossendorf, Department of Research Technology, Bautzner Landstr. 400, 01328 Dresden (Germany); Dresden University of Applied Sciences, Faculty of Electrical Engineering, Friedrich-List-Platz 1, 01069 Dresden (Germany); Lange, B. [Helmholtz-Zentrum Dresden - Rossendorf, Department of Research Technology, Bautzner Landstr. 400, 01328 Dresden (Germany); Schönherr, J. [Dresden University of Applied Sciences, Faculty of Electrical Engineering, Friedrich-List-Platz 1, 01069 Dresden (Germany); Enghardt, W. [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Fetscherstr. 74, PF 41, 01307 Dresden (Germany); Helmholtz-Zentrum Dresden - Rossendorf, Institute of Radiooncology, Bautzner Landstr. 400, 01328 Dresden (Germany); German Cancer Consortium (DKTK) and German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Kaever, P. [Helmholtz-Zentrum Dresden - Rossendorf, Department of Research Technology, Bautzner Landstr. 400, 01328 Dresden (Germany); Dresden University of Applied Sciences, Faculty of Electrical Engineering, Friedrich-List-Platz 1, 01069 Dresden (Germany)

    2016-09-11

    In the application of semiconductor detectors, the charge-sensitive amplifier is widely used in front-end electronics. Its output signal exhibits a typical exponential decay. Depending on the feedback network, this type of front-end electronics suffers from the ballistic deficit problem, or an increased rate of pulse pile-ups. Moreover, spectroscopy applications require a correction of the pulse height, while a shortened pulse width is desirable for high-throughput applications. For both objectives, digital deconvolution of the exponential decay is convenient. With a general method and the signals of our custom charge-sensitive amplifier for cadmium zinc telluride detectors, we show how the transfer function of an amplifier is adapted to an infinite impulse response (IIR) filter. This paper investigates different design methods for an IIR filter in the discrete-time domain and verifies the obtained filter coefficients with respect to the equivalent continuous-time frequency response. Finally, the exponential decay is shaped to a step-like output signal that is exploited by forward-looking pulse processing.
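
    The underlying deconvolution can be written as a first-order IIR filter that cancels the decay pole and replaces it with an integrator, turning each exponential pulse into a step whose height equals the pulse amplitude; a textbook sketch of this idea (not the authors' filter design) is shown below.

        import numpy as np
        from scipy.signal import lfilter

        def deconvolve_exponential(x, tau, fs):
            """Pole-zero cancellation: exponential pulses (decay time tau, sample rate fs)
            become step-like outputs (sketch)."""
            a = np.exp(-1.0 / (tau * fs))   # pole of the sampled exponential decay
            num = [1.0, -a]                 # zero cancelling the decay pole
            den = [1.0, -1.0]               # integrator pole at z = 1
            return lfilter(num, den, x)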

  8. Acute Lung Injury Results from Innate Sensing of Viruses by an ER Stress Pathway

    Directory of Open Access Journals (Sweden)

    Eike R. Hrincius

    2015-06-01

    Full Text Available Incursions of new pathogenic viruses into humans from animal reservoirs are occurring with alarming frequency. The molecular underpinnings of immune recognition, host responses, and pathogenesis in this setting are poorly understood. We studied pandemic influenza viruses to determine the mechanism by which increasing glycosylation during evolution of surface proteins facilitates diminished pathogenicity in adapted viruses. ER stress during infection with poorly glycosylated pandemic strains activated the unfolded protein response, leading to inflammation, acute lung injury, and mortality. Seasonal strains or viruses engineered to mimic adapted viruses displaying excess glycans on the hemagglutinin did not cause ER stress, allowing preservation of the lungs and survival. We propose that ER stress resulting from recognition of non-adapted viruses is utilized to discriminate “non-self” at the level of protein processing and to activate immune responses, with unintended consequences on pathogenesis. Understanding this mechanism should improve strategies for treating acute lung injury from zoonotic viral infections.

  9. Static lung compliance and body pressures in Tupinambis merianae with and without post-hepatic septum.

    Science.gov (United States)

    Klein, Wilfried; Abe, Augusto S; Perry, Steven F

    2003-04-15

    The surgical removal of the post-hepatic septum (PHS) in the tegu lizard, Tupinambis merianae, significantly reduces resting lung volume (V(Lr)) and maximal lung volume (V(Lm)) when compared with tegus with intact PHS. Standardised for body mass (M(B)), static lung compliance was significantly less in tegus without PHS. Pleural and abdominal pressures followed, like ventilation, a biphasic pattern. In general, pressures increased during expiration and decreased during inspiration. However, during expiration pressure changes showed a marked intra- and interindividual variation. The removal of the PHS resulted in a lower cranio-caudal intracoelomic pressure differential, but had no effect on the general pattern of pressure changes accompanying ventilation. These results show that a perforated PHS that lacks striated muscle has significant influence on static breathing mechanics in Tupinambis and by analogy provides valuable insight into similar processes that led to the evolution of the mammalian diaphragm.
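    Static compliance here is the slope of the quasi-static pressure-volume relationship. The sketch below shows one common way to estimate it from stepwise inflation data; it is only an illustration under assumed units, not the authors' protocol.

```python
import numpy as np

def static_lung_compliance(pressures_cmH2O, volumes_ml):
    """Static compliance C = dV/dP, estimated as the least-squares slope of
    the quasi-linear part of a stepwise inflation pressure-volume curve.

    Returns ml/cmH2O; dividing by body mass gives the mass-specific
    compliance compared between groups in the study.
    """
    slope, _intercept = np.polyfit(pressures_cmH2O, volumes_ml, 1)
    return slope

# Illustrative (made-up) inflation steps:
print(static_lung_compliance([2, 4, 6, 8, 10], [20, 55, 90, 123, 158]))
```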

  10. Measurement of canine pancreatic perfusion using dynamic computed tomography: Influence of input-output vessels on deconvolution and maximum slope methods

    Energy Technology Data Exchange (ETDEWEB)

    Kishimoto, Miori, E-mail: miori@mx6.et.tiki.ne.jp [Department of Clinical Veterinary Science, Obihiro University of Agriculture and Veterinary Medicine, Nishi 2-11 Inada-cho, Obihiro 080-8555 (Japan); Tsuji, Yoshihisa, E-mail: y.tsuji@extra.ocn.ne.jp [Department of Gastroenterology and Hepatology, Kyoto University Graduate School of Medicine, Shogoinkawara-cho 54, Sakyo-ku 606-8507 (Japan); Katabami, Nana; Shimizu, Junichiro; Lee, Ki-Ja [Department of Clinical Veterinary Science, Obihiro University of Agriculture and Veterinary Medicine, Nishi 2-11 Inada-cho, Obihiro 080-8555 (Japan); Iwasaki, Toshiroh [Department of Veterinary Internal Medicine, Tokyo University of Agriculture and Technology, Saiwai-cho, 3-5-8, Fuchu 183-8509 (Japan); Miyake, Yoh-Ichi [Department of Clinical Veterinary Science, Obihiro University of Agriculture and Veterinary Medicine, Nishi 2-11 Inada-cho, Obihiro 080-8555 (Japan); Yazumi, Shujiro [Digestive Disease Center, Kitano Hospital, 2-4-20 Ougi-machi, Kita-ku, Osaka 530-8480 (Japan); Chiba, Tsutomu [Department of Gastroenterology and Hepatology, Kyoto University Graduate School of Medicine, Shogoinkawara-cho 54, Sakyo-ku 606-8507 (Japan); Yamada, Kazutaka, E-mail: kyamada@obihiro.ac.jp [Department of Clinical Veterinary Science, Obihiro University of Agriculture and Veterinary Medicine, Nishi 2-11 Inada-cho, Obihiro 080-8555 (Japan)

    2011-01-15

Objective: We investigated whether the prerequisites of the maximum slope and deconvolution methods are satisfied in pancreatic perfusion CT and whether the parameters measured with these algorithms are correlated. Methods: We examined nine beagles injected with iohexol (200 mgI kg⁻¹) at 5.0 ml s⁻¹. The abdominal aorta and the splenic and celiac arteries were selected as input arteries, and the splenic vein as the output vein. For the maximum slope method, we determined the arterial contrast volume of each artery by measuring the area under the curve (AUC) and compared the peak enhancement time in the pancreas with the contrast appearance time in the splenic vein. For the deconvolution method, the artery-to-vein collection rate of contrast medium was calculated. We calculated pancreatic tissue blood flow (TBF), tissue blood volume (TBV), and mean transit time (MTT) using both algorithms and investigated their correlation according to vessel selection. Results: The arterial AUC decreased significantly as the artery neared the pancreas (P < 0.01). In all cases, the peak enhancement time of the pancreas (11.5 ± 1.6) was shorter than the appearance time (14.1 ± 1.6) in the splenic vein. The splenic artery-vein combination exhibited the highest collection rate (91.1%) and was the only combination for which TBF, TBV, and MTT were significantly correlated between the two algorithms. Conclusion: Selecting a vessel nearest to the pancreas is considered more appropriate for satisfying the prerequisites. Therefore, vessel selection is important when comparing the semi-quantitative parameters obtained with different algorithms.
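    A minimal sketch of the two quantities discussed above, assuming time-attenuation curves have already been extracted from the dynamic CT series; the function names and unit handling are illustrative, not from the paper.

```python
import numpy as np

def arterial_auc(t, c_artery):
    """Area under the arterial time-attenuation curve, used above to compare
    candidate input vessels (aorta vs. splenic vs. celiac artery)."""
    return np.trapz(c_artery, t)

def tbf_maximum_slope(t, c_tissue, c_artery):
    """Maximum slope estimate of tissue blood flow.

    TBF ~ max(dC_tissue/dt) / max(C_artery); valid only while no contrast has
    yet left the tissue through the draining vein, which is the prerequisite
    the study checks against the splenic-vein appearance time.
    """
    slope = np.gradient(np.asarray(c_tissue, dtype=float), t)
    return slope.max() / np.max(c_artery)
```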

  11. Measurement of canine pancreatic perfusion using dynamic computed tomography: Influence of input-output vessels on deconvolution and maximum slope methods

    International Nuclear Information System (INIS)

    Kishimoto, Miori; Tsuji, Yoshihisa; Katabami, Nana; Shimizu, Junichiro; Lee, Ki-Ja; Iwasaki, Toshiroh; Miyake, Yoh-Ichi; Yazumi, Shujiro; Chiba, Tsutomu; Yamada, Kazutaka

    2011-01-01

Objective: We investigated whether the prerequisites of the maximum slope and deconvolution methods are satisfied in pancreatic perfusion CT and whether the parameters measured with these algorithms are correlated. Methods: We examined nine beagles injected with iohexol (200 mgI kg⁻¹) at 5.0 ml s⁻¹. The abdominal aorta and the splenic and celiac arteries were selected as input arteries, and the splenic vein as the output vein. For the maximum slope method, we determined the arterial contrast volume of each artery by measuring the area under the curve (AUC) and compared the peak enhancement time in the pancreas with the contrast appearance time in the splenic vein. For the deconvolution method, the artery-to-vein collection rate of contrast medium was calculated. We calculated pancreatic tissue blood flow (TBF), tissue blood volume (TBV), and mean transit time (MTT) using both algorithms and investigated their correlation according to vessel selection. Results: The arterial AUC decreased significantly as the artery neared the pancreas (P < 0.01). In all cases, the peak enhancement time of the pancreas (11.5 ± 1.6) was shorter than the appearance time (14.1 ± 1.6) in the splenic vein. The splenic artery-vein combination exhibited the highest collection rate (91.1%) and was the only combination for which TBF, TBV, and MTT were significantly correlated between the two algorithms. Conclusion: Selecting a vessel nearest to the pancreas is considered more appropriate for satisfying the prerequisites. Therefore, vessel selection is important when comparing the semi-quantitative parameters obtained with different algorithms.

  12. Estimation of 123I-metaiodobenzylguanidine lung uptake in heart and lung diseases. With reference to lung uptake ratio and decrease of lung uptake

    International Nuclear Information System (INIS)

    Fujii, Tadashige; Tanaka, Masao; Yazaki, Yoshikazu; Kitabayashi, Hiroshi; Koizumi, Tomonori; Sekiguchi, Morie; Gomi, Tsutomu; Yano, Kesato; Itoh, Atsuko.

    1997-01-01

123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy was performed in 64 patients with heart and lung diseases. The distribution of MIBG in the chest was evaluated on planar images using count ratios of the heart to the mediastinum (H/M) and of each lung to the mediastinum (Lu/M). Most patients with heart disease showed obvious lung uptake of MIBG. The H/M ratios were 1.75±0.20 in the group without heart failure and 1.55±0.19 in the group with heart failure. The Lu/M ratios in the right and left lung were 1.56±0.16 and 1.28±0.16 in the group without heart failure, and 1.45±0.16 and 1.19±0.15 in the group with heart failure. However, 3 patients with concomitant chronic pulmonary emphysema and one patient with interstitial pneumonia due to dermatomyositis showed markedly decreased lung uptake; their right and left Lu/M ratios were 1.20, 1.17; 1.17, 1.13; 1.01, 0.97; and 1.27, 0.94, respectively. These results suggest that lung uptake of MIBG may reflect pulmonary endothelial cell function in clinical situations, given that MIBG has been shown to be a useful marker of pulmonary endothelial cell function in the isolated rat lung. (author)
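    The ratios reported above are plain mean-count ratios between regions of interest on the planar image. A sketch under the assumption that the ROI masks have already been drawn by the operator (the function and argument names are illustrative):

```python
import numpy as np

def mibg_uptake_ratios(planar_image, roi_heart, roi_mediastinum,
                       roi_lung_right, roi_lung_left):
    """Mean-count ratios used above: H/M and Lu/M for each lung.

    planar_image: 2-D array of counts from the planar acquisition.
    Each roi_* argument is a boolean mask of the same shape; the ROIs are
    assumed to be drawn by the operator.
    """
    mediastinum = planar_image[roi_mediastinum].mean()
    return {
        "H/M": planar_image[roi_heart].mean() / mediastinum,
        "Lu/M (right)": planar_image[roi_lung_right].mean() / mediastinum,
        "Lu/M (left)": planar_image[roi_lung_left].mean() / mediastinum,
    }
```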

  13. Nonrespiratory lung function

    Energy Technology Data Exchange (ETDEWEB)

    Isawa, Toyoharu [Tohoku University Research Institute for Chest Disease and Cancer, Sendai (Japan)

    1994-07-01

The primary function of the lungs is gas exchange: the venous blood returning to the lungs is arterialized with oxygen, and the arterialized blood is sent back to the peripheral tissues of the whole body to be used for metabolic oxygenation. Besides this gas-exchanging function, which we call "respiratory lung function", the lungs have functions that have little to do with gas exchange itself; we categorically call these "nonrespiratory lung function". The lungs consist of the conductive airways, the gas-exchanging units such as the alveoli, and the interstitial space that surrounds the former two compartments. The interstitial space contains the blood and lymphatic capillaries, collagen and elastic fibers, and cement substances. The conductive airways and the gas-exchanging units are directly exposed to the atmosphere, which contains various toxic and nontoxic gases, fumes, and biological or nonbiological particles. Because the conductive airways are equipped with defense mechanisms such as mucociliary clearance and cough to remove these toxic gases, particles, or locally produced biological debris, we usually do not succumb to the ill effects of inhaled materials. Using nuclear medicine techniques, we can now evaluate mucociliary clearance and other nonrespiratory lung functions in vivo.

  14. Nonrespiratory lung function

    International Nuclear Information System (INIS)

    Isawa, Toyoharu

    1994-01-01

The primary function of the lungs is gas exchange: the venous blood returning to the lungs is arterialized with oxygen, and the arterialized blood is sent back to the peripheral tissues of the whole body to be used for metabolic oxygenation. Besides this gas-exchanging function, which we call "respiratory lung function", the lungs have functions that have little to do with gas exchange itself; we categorically call these "nonrespiratory lung function". The lungs consist of the conductive airways, the gas-exchanging units such as the alveoli, and the interstitial space that surrounds the former two compartments. The interstitial space contains the blood and lymphatic capillaries, collagen and elastic fibers, and cement substances. The conductive airways and the gas-exchanging units are directly exposed to the atmosphere, which contains various toxic and nontoxic gases, fumes, and biological or nonbiological particles. Because the conductive airways are equipped with defense mechanisms such as mucociliary clearance and cough to remove these toxic gases, particles, or locally produced biological debris, we usually do not succumb to the ill effects of inhaled materials. Using nuclear medicine techniques, we can now evaluate mucociliary clearance and other nonrespiratory lung functions in vivo.

  15. Deep Deconvolutional Neural Network for Target Segmentation of Nasopharyngeal Cancer in Planning Computed Tomography Images

    Directory of Open Access Journals (Sweden)

    Kuo Men

    2017-12-01

Full Text Available Background: Radiotherapy is one of the main treatment methods for nasopharyngeal carcinoma (NPC). It requires exact delineation of the nasopharynx gross tumor volume (GTVnx), the metastatic lymph node gross tumor volume (GTVnd), the clinical target volume (CTV), and organs at risk in the planning computed tomography images. However, this task is time-consuming and operator dependent. In the present study, we developed an end-to-end deep deconvolutional neural network (DDNN) for segmentation of these targets. Methods: The proposed DDNN is an end-to-end architecture enabling fast training and testing. It consists of two important components: an encoder network and a decoder network. The encoder network was used to extract the visual features of a medical image and the decoder network was used to recover the original resolution by deploying deconvolution. A total of 230 patients diagnosed with NPC stage I or stage II were included in this study. Data from 184 patients were chosen randomly as a training set to adjust the parameters of DDNN, and the remaining 46 patients formed the test set to assess the performance of the model. The Dice similarity coefficient (DSC) was used to quantify the segmentation results of the GTVnx, GTVnd, and CTV. In addition, the performance of DDNN was compared with the VGG-16 model. Results: The proposed DDNN method outperformed VGG-16 in all the segmentations. The mean DSC values of DDNN were 80.9% for GTVnx, 62.3% for the GTVnd, and 82.6% for CTV, whereas VGG-16 obtained 72.3, 33.7, and 73.7%, respectively. Conclusion: DDNN can be used to segment the GTVnx and CTV accurately. The accuracy for the GTVnd segmentation was relatively low due to the considerable differences in its shape, volume, and location among patients. The accuracy is expected to increase with more training data and the combination of MR images. In conclusion, DDNN has the potential to improve the consistency of contouring and streamline radiotherapy.
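    The Dice similarity coefficient used to score the segmentations is straightforward to compute from binary masks; a minimal sketch (not the authors' code):

```python
import numpy as np

def dice_similarity(prediction, ground_truth):
    """Dice similarity coefficient (DSC) between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 no overlap.
    Both arguments are arrays of the same shape (e.g. voxel masks of GTVnx).
    """
    a = np.asarray(prediction, dtype=bool)
    b = np.asarray(ground_truth, dtype=bool)
    denominator = a.sum() + b.sum()
    if denominator == 0:
        return 1.0  # convention: two empty masks count as a perfect match
    return 2.0 * np.logical_and(a, b).sum() / denominator
```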

  16. Human pericytes adopt myofibroblast properties in the microenvironment of the IPF lung.

    Science.gov (United States)

    Sava, Parid; Ramanathan, Anand; Dobronyi, Amelia; Peng, Xueyan; Sun, Huanxing; Ledesma-Mendoza, Adrian; Herzog, Erica L; Gonzalez, Anjelica L

    2017-12-21

Idiopathic pulmonary fibrosis (IPF) is a fatal disease of unknown etiology characterized by a compositionally and mechanically altered extracellular matrix. Poor understanding of the origin of α-smooth muscle actin (α-SMA) expressing myofibroblasts has hindered curative therapies. Though proposed as a source of myofibroblasts in mammalian tissues, identification of microvascular pericytes (PC) as contributors to α-SMA-expressing populations in human IPF and the mechanisms driving this accumulation remain unexplored. Here, we demonstrate enhanced detection of α-SMA+ cells coexpressing the PC marker neural/glial antigen 2 in the human IPF lung. Isolated human PC cultured on decellularized IPF lung matrices adopt expression of α-SMA, demonstrating that these cells undergo phenotypic transition in response to direct contact with the extracellular matrix (ECM) of the fibrotic human lung. Using potentially novel human lung-conjugated hydrogels with tunable mechanical properties, we decoupled PC responses to matrix composition and stiffness to show that α-SMA+ PC accumulate in a mechanosensitive manner independent of matrix composition. PC activated with TGF-β1 remodel the normal lung matrix, increasing tissue stiffness to facilitate the emergence of α-SMA+ PC via MKL-1/MRTF-A mechanotransduction. Nintedanib, a tyrosine-kinase inhibitor approved for IPF treatment, restores the elastic modulus of fibrotic lung matrices to reverse the α-SMA+ phenotype. This work furthers our understanding of the role that microvascular PC play in the evolution of IPF, describes the creation of an ex vivo platform that advances the study of fibrosis, and presents a potentially novel mode of action for a commonly used antifibrotic therapy that has great relevance for human disease.

  17. Study of the ventilatory lung motion imaging in primary lung cancer

    International Nuclear Information System (INIS)

    Fujii, Tadashige; Tanaka, Masao; Yazaki, Yosikazu; Kitabayashi, Hiroshi; Sekiguchi, Morie.

    1996-01-01

Using perfusion lung scintigrams with Tc-99m macroaggregated albumin at maximal inspiration (I) and expiration (E), images of ventilatory lung motion, calculated and displayed as (E-I)/I, were obtained in 84 cases of primary lung cancer, and the clinical significance of this technique in the diagnosis of primary lung cancer was studied. The (E-I)/I image consisted of positive and negative components: the former visualized the motion of regional intrapulmonary areas and the latter the motion of the lung border. The sum of positive (E-I)/I in the lung containing the primary lesion, which was lower than that in the contralateral lung, was significantly low in cases with a hilar mass, pleural effusion, and TNM classification T3+T4. The sum of positive (E-I)/I in both lungs and the vital capacity were relatively low in cases with a hilar mass, pleural effusion, TNM classification T3+T4, and M1. The distribution patterns of pulmonary perfusion and positive (E-I)/I matched fairly well in 48 cases, but a mismatch was observed in 36 cases. In the negative (E-I)/I image, decreased motion of the lung border, including the diaphragm, was seen in cases with pleural adhesion and thickening, pleural effusion, phrenic nerve palsy, and other conditions with hypoventilation. This technique appears useful for estimating regional pulmonary perfusion and lung motion and for assessing the extent and pathophysiology of primary lung cancer. (author)
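    A sketch of how such an (E-I)/I image could be formed from registered count images; the background threshold is an assumption introduced here to avoid dividing by noise, not a value from the study.

```python
import numpy as np

def ventilatory_motion_image(insp_counts, exp_counts, background_fraction=0.05):
    """(E-I)/I image from registered perfusion scans at maximal inspiration (I)
    and expiration (E).

    Pixels below `background_fraction` of the inspiratory maximum are masked
    out. Positive values reflect regional intrapulmonary motion, negative
    values motion of the lung border.
    """
    insp = np.asarray(insp_counts, dtype=float)
    exp = np.asarray(exp_counts, dtype=float)
    motion = np.zeros_like(insp)
    lung = insp > background_fraction * insp.max()
    motion[lung] = (exp[lung] - insp[lung]) / insp[lung]
    return motion
```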

  18. Study of the ventilatory lung motion imaging in primary lung cancer

    Energy Technology Data Exchange (ETDEWEB)

Fujii, Tadashige [Shinshu Univ., Matsumoto, Nagano (Japan). School of Allied Medical Sciences; Tanaka, Masao; Yazaki, Yosikazu; Kitabayashi, Hiroshi; Sekiguchi, Morie

    1996-12-01

Using perfusion lung scintigrams with Tc-99m macroaggregated albumin at maximal inspiration (I) and expiration (E), images of ventilatory lung motion, calculated and displayed as (E-I)/I, were obtained in 84 cases of primary lung cancer, and the clinical significance of this technique in the diagnosis of primary lung cancer was studied. The (E-I)/I image consisted of positive and negative components: the former visualized the motion of regional intrapulmonary areas and the latter the motion of the lung border. The sum of positive (E-I)/I in the lung containing the primary lesion, which was lower than that in the contralateral lung, was significantly low in cases with a hilar mass, pleural effusion, and TNM classification T3+T4. The sum of positive (E-I)/I in both lungs and the vital capacity were relatively low in cases with a hilar mass, pleural effusion, TNM classification T3+T4, and M1. The distribution patterns of pulmonary perfusion and positive (E-I)/I matched fairly well in 48 cases, but a mismatch was observed in 36 cases. In the negative (E-I)/I image, decreased motion of the lung border, including the diaphragm, was seen in cases with pleural adhesion and thickening, pleural effusion, phrenic nerve palsy, and other conditions with hypoventilation. This technique appears useful for estimating regional pulmonary perfusion and lung motion and for assessing the extent and pathophysiology of primary lung cancer. (author)

  19. Gravity and the Evolution of Cardiopulmonary Morphology in Snakes

    Science.gov (United States)

    Lillywhite, Harvey B.; Albert, James S.; Sheehy, Coleman M.; Seymour, Roger S.

    2011-01-01

    Physiological investigations of snakes have established the importance of heart position and pulmonary structure in contexts of gravity effects on blood circulation. Here we investigate morphological correlates of cardiopulmonary physiology in contexts related to ecology, behavior and evolution. We analyze data for heart position and length of vascular lung in 154 species of snakes that exhibit a broad range of characteristic behaviors and habitat associations. We construct a composite phylogeny for these species, and we codify gravitational stress according to species habitat and behavior. We use conventional regression and phylogenetically independent contrasts to evaluate whether trait diversity is correlated with gravitational habitat related to evolutionary transitions within the composite tree topology. We demonstrate that snake species living in arboreal habitats, or which express strongly climbing behaviors, possess relatively short blood columns between the heart and the head, as well as relatively short vascular lungs, compared to terrestrial species. Aquatic species, which experience little or no gravity stress in water, show the reverse – significantly longer heart–head distance and longer vascular lungs. These phylogenetic differences complement the results of physiological studies and are reflected in multiple habitat transitions during the evolutionary histories of these snake lineages, providing strong evidence that heart–to–head distance and length of vascular lung are co–adaptive cardiopulmonary features of snakes. PMID:22079804

  20. Utilization of the statistics techniques for the analysis of the XPS (X-ray photoelectron spectroscopy) and Auger electronic spectra's deconvolutions

    International Nuclear Information System (INIS)

    Puentes, M.B.

    1987-01-01

For the analysis of XPS (X-ray photoelectron spectroscopy) and Auger spectra, it is important to perform the separation of the peaks and to estimate their intensities. For this purpose, a methodology was implemented, including: a) filtering of the spectrum; b) subtraction of the baseline (or inelastic background); c) deconvolution (separation of the distributions that make up the spectrum); and d) estimation of the error of the mean, including goodness-of-fit tests. Software (FORTRAN IV plus) that applies the proposed methodology to experimental spectra was implemented. The quality of the methodology was tested with simulated spectra. (Author) [es
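    A minimal sketch of steps b) through d) of such a workflow, assuming a simple linear background and Gaussian line shapes; the report's exact filter, background model and error analysis are not specified here, so these are stand-in assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(x, *params):
    """Sum of Gaussian peaks; params is a flat list of (amplitude, centre, sigma)."""
    y = np.zeros_like(x, dtype=float)
    for amp, centre, sigma in zip(params[0::3], params[1::3], params[2::3]):
        y += amp * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
    return y

def deconvolute_peaks(energy, counts, initial_guess):
    """Fit overlapping peaks after removing a crude linear inelastic background.

    initial_guess is a flat list of (amplitude, centre, sigma) starting values.
    Returns the fitted parameters and their standard errors from the
    covariance matrix, standing in for the error estimation step.
    """
    background = np.linspace(counts[0], counts[-1], len(counts))
    popt, pcov = curve_fit(gaussian_sum, energy, counts - background,
                           p0=initial_guess)
    return popt, np.sqrt(np.diag(pcov))
```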

  1. Application of stable isotopes and isotope pattern deconvolution-ICPMS to speciation of endogenous and exogenous Fe and Se in rats

    International Nuclear Information System (INIS)

    Gonzalez Iglesias, H.; Fernandez-Sanchez, M.L.; Garcia Alonso, J.I.; Lopez Sastre, J.B.; Sanz-Medel, A.

    2009-01-01

Full text: Enriched stable isotopes are crucial for studying essential trace element metabolism (e.g. Se, Fe) in biological systems. Measuring isotope ratios by ICPMS and applying appropriate mathematical calculations based on isotope pattern deconvolution (IPD) may provide quantitative data on endogenous and exogenous essential or toxic elements and their metabolism. In this work, IPD was applied to explore the feasibility of using two enriched stable isotopes of Se (or Fe), one as a metabolic tracer and the other as a quantitation tracer, to discriminate between endogenous and supplemented Se (or Fe) species in rat fluids by collision cell ICPMS coupled to HPLC separation. (author)
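    Isotope pattern deconvolution reduces to a small linear least-squares problem once the isotopic patterns of the natural element and each tracer are known. A toy two-source sketch with made-up abundances (not data from the study):

```python
import numpy as np

def isotope_pattern_deconvolution(measured, source_patterns):
    """Least-squares isotope pattern deconvolution (IPD).

    measured        : observed isotope abundances, normalised to sum to 1.
    source_patterns : matrix whose columns are the isotopic patterns of the
                      natural element and of each enriched tracer.
    Returns the molar fraction of each source in the measured sample.
    """
    fractions, *_ = np.linalg.lstsq(source_patterns, measured, rcond=None)
    return fractions / fractions.sum()

# Illustrative two-isotope example: natural element vs. enriched tracer.
natural = np.array([0.90, 0.10])
tracer = np.array([0.05, 0.95])
blend = 0.7 * natural + 0.3 * tracer
print(isotope_pattern_deconvolution(blend, np.column_stack([natural, tracer])))
# -> approximately [0.7, 0.3]
```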

  2. Comparison of lung preservation solutions in human lungs using an ex vivo lung perfusion experimental model

    Directory of Open Access Journals (Sweden)

    Israel L. Medeiros

    2012-09-01

Full Text Available OBJECTIVE: Experimental studies on lung preservation have always been performed using animal models. We present ex vivo lung perfusion as a new model for the study of lung preservation. Using human lungs instead of animal models may bring the results of experimental studies closer to what could be expected in clinical practice. METHOD: Brain-dead donors whose lungs had been declined by transplantation teams were used. The cases were randomized into two groups. In Group 1, Perfadex® was used for pulmonary preservation, and in Group 2, LPDnac, a solution manufactured in Brazil, was used. An ex vivo lung perfusion system was used, and the lungs were ventilated and perfused after 10 hours of cold ischemia. The extent of ischemia-reperfusion injury was measured using functional and histological parameters. RESULTS: After reperfusion, the mean oxygenation capacity was 405.3 mmHg in Group 1 and 406.0 mmHg in Group 2 (p = 0.98). The mean pulmonary vascular resistance values were 697.6 and 378.3 dyn·s·cm⁻⁵, respectively (p = 0.035). The mean pulmonary compliance was 46.8 ml/cm H2O in Group 1 and 49.3 ml/cm H2O in Group 2 (p = 0.816). The mean wet/dry weight ratios were 2.06 and 2.02, respectively (p = 0.87). The mean Lung Injury Scores for the biopsy performed after reperfusion were 4.37 and 4.37 in Groups 1 and 2, respectively (p = 1.0), and the apoptotic cell counts were 118.75/mm² and 137.50/mm², respectively (p = 0.71). CONCLUSION: The locally produced preservation solution proved to be as good as Perfadex®. The clinical use of LPDnac may reduce costs in our centers. Therefore, it is important to develop new models to study lung preservation.

  3. Frequency and number of ultrasound lung rockets (B-lines) using a regionally based lung ultrasound examination named vet BLUE (veterinary bedside lung ultrasound exam) in dogs with radiographically normal lung findings.

    Science.gov (United States)

    Lisciandro, Gregory R; Fosgate, Geoffrey T; Fulton, Robert M

    2014-01-01

Lung ultrasound is superior to lung auscultation and supine chest radiography for many respiratory conditions in human patients. Ultrasound diagnoses are based on easily learned patterns of sonographic findings and artifacts in standardized images. By applying the wet lung (ultrasound lung rockets or B-lines, representing interstitial edema) versus dry lung (A-lines with a glide sign) concept, many respiratory conditions can be diagnosed or excluded. The ultrasound probe can be used as a visual stethoscope for the evaluation of human lungs because dry artifacts (A-lines with a glide sign) predominate over wet artifacts (ultrasound lung rockets or B-lines). However, the frequency and number of wet lung ultrasound artifacts in dogs with radiographically normal lungs is unknown. Thus, the primary objective was to determine the baseline frequency and number of ultrasound lung rockets in dogs without clinical signs of respiratory disease and with radiographically normal lung findings, using a novel 8-view, regionally based lung ultrasound examination called Vet BLUE. The frequency of ultrasound lung rockets was statistically compared based on signalment, body condition score, investigator, and reason for radiography. Ten dogs with left-sided heart failure were similarly enrolled. The overall frequency of ultrasound lung rockets was 11% (95% confidence interval, 6-19%) in dogs without respiratory disease versus 100% (95% confidence interval, 74-100%) in those with left-sided heart failure. The low frequency and number of ultrasound lung rockets observed in dogs without respiratory disease and with radiographically normal lungs suggests that Vet BLUE will be clinically useful for the identification of canine respiratory conditions. © 2014 American College of Veterinary Radiology.
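    The interval quoted above (11%, 95% CI 6-19%) is a binomial confidence interval for a proportion. The Wilson score method below produces intervals of that form; whether the authors used this exact method is an assumption.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion."""
    if n == 0:
        return 0.0, 0.0
    p = successes / n
    denom = 1.0 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half_width = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half_width, centre + half_width
```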

  4. Epidemiology of Lung Cancer

    Science.gov (United States)

    Brock, Malcolm V.; Ford, Jean G.; Samet, Jonathan M.; Spivack, Simon D.

    2013-01-01

    Background: Ever since a lung cancer epidemic emerged in the mid-1900s, the epidemiology of lung cancer has been intensively investigated to characterize its causes and patterns of occurrence. This report summarizes the key findings of this research. Methods: A detailed literature search provided the basis for a narrative review, identifying and summarizing key reports on population patterns and factors that affect lung cancer risk. Results: Established environmental risk factors for lung cancer include smoking cigarettes and other tobacco products and exposure to secondhand tobacco smoke, occupational lung carcinogens, radiation, and indoor and outdoor air pollution. Cigarette smoking is the predominant cause of lung cancer and the leading worldwide cause of cancer death. Smoking prevalence in developing nations has increased, starting new lung cancer epidemics in these nations. A positive family history and acquired lung disease are examples of host factors that are clinically useful risk indicators. Risk prediction models based on lung cancer risk factors have been developed, but further refinement is needed to provide clinically useful risk stratification. Promising biomarkers of lung cancer risk and early detection have been identified, but none are ready for broad clinical application. Conclusions: Almost all lung cancer deaths are caused by cigarette smoking, underscoring the need for ongoing efforts at tobacco control throughout the world. Further research is needed into the reasons underlying lung cancer disparities, the causes of lung cancer in never smokers, the potential role of HIV in lung carcinogenesis, and the development of biomarkers. PMID:23649439

  5. RANK rewires energy homeostasis in lung cancer cells and drives primary lung cancer.

    Science.gov (United States)

    Rao, Shuan; Sigl, Verena; Wimmer, Reiner Alois; Novatchkova, Maria; Jais, Alexander; Wagner, Gabriel; Handschuh, Stephan; Uribesalgo, Iris; Hagelkruys, Astrid; Kozieradzki, Ivona; Tortola, Luigi; Nitsch, Roberto; Cronin, Shane J; Orthofer, Michael; Branstetter, Daniel; Canon, Jude; Rossi, John; D'Arcangelo, Manolo; Botling, Johan; Micke, Patrick; Fleur, Linnea La; Edlund, Karolina; Bergqvist, Michael; Ekman, Simon; Lendl, Thomas; Popper, Helmut; Takayanagi, Hiroshi; Kenner, Lukas; Hirsch, Fred R; Dougall, William; Penninger, Josef M

    2017-10-15

Lung cancer is the leading cause of cancer deaths. Besides smoking, epidemiological studies have linked female sex hormones to lung cancer in women; however, the underlying mechanisms remain unclear. Here we report that the receptor activator of nuclear factor-κB (RANK), the key regulator of osteoclastogenesis, is frequently expressed in primary lung tumors, an active RANK pathway correlates with decreased survival, and pharmacologic RANK inhibition reduces tumor growth in patient-derived lung cancer xenografts. Clonal genetic inactivation of KRas G12D in mouse lung epithelial cells markedly impairs the progression of KRas G12D-driven lung cancer, resulting in a significant survival advantage. Mechanistically, RANK rewires energy homeostasis in human and murine lung cancer cells and promotes expansion of lung cancer stem-like cells, which is blocked by inhibiting mitochondrial respiration. Our data also indicate survival differences in KRas G12D-driven lung cancer between male and female mice, and we show that female sex hormones can promote lung cancer progression via the RANK pathway. These data uncover a direct role for RANK in lung cancer and may explain why female sex hormones accelerate lung cancer development. Inhibition of RANK using the approved drug denosumab may be a therapeutic drug candidate for primary lung cancer. © 2017 Rao et al.; Published by Cold Spring Harbor Laboratory Press.

  6. VEGF receptor expression decreases during lung development in congenital diaphragmatic hernia induced by nitrofen

    Energy Technology Data Exchange (ETDEWEB)

Sbragia, L. [Divisão de Cirurgia Pediátrica, Departamento de Cirurgia e Anatomia, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Nassr, A.C.C. [Departamento de Hidrobiologia do Centro de Ciências Biológicas e da Saúde, Universidade Federal de São Carlos, São Carlos, SP (Brazil); Gonçalves, F.L.L. [Divisão de Cirurgia Pediátrica, Departamento de Cirurgia e Anatomia, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Schmidt, A.F. [Pediatrics House Office, Cincinnati Children's Hospital Medical Center, Cincinnati, OH (United States); Zuliani, C.C. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Garcia, P.V. [Departamento de Histologia e Embriologia, Instituto de Biologia, Universidade Estadual de Campinas, UNICAMP, Campinas, SP (Brazil); Gallindo, R.M. [Divisão de Cirurgia Pediátrica, Departamento de Cirurgia e Anatomia, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Pereira, L.A.V. [Departamento de Histologia e Embriologia, Instituto de Biologia, Universidade Estadual de Campinas, UNICAMP, Campinas, SP (Brazil)

    2014-02-17

    Changes in vascular endothelial growth factor (VEGF) in pulmonary vessels have been described in congenital diaphragmatic hernia (CDH) and may contribute to the development of pulmonary hypoplasia and hypertension; however, how the expression of VEGF receptors changes during fetal lung development in CDH is not understood. The aim of this study was to compare morphological evolution with expression of VEGF receptors, VEGFR1 (Flt-1) and VEGFR2 (Flk-1), in pseudoglandular, canalicular, and saccular stages of lung development in normal rat fetuses and in fetuses with CDH. Pregnant rats were divided into four groups (n=20 fetuses each) of four different gestational days (GD) 18.5, 19.5, 20.5, 21.5: external control (EC), exposed to olive oil (OO), exposed to 100 mg nitrofen, by gavage, without CDH (N-), and exposed to nitrofen with CDH (CDH) on GD 9.5 (term=22 days). The morphological variables studied were: body weight (BW), total lung weight (TLW), left lung weight, TLW/BW ratio, total lung volume, and left lung volume. The histometric variables studied were: left lung parenchymal area density and left lung parenchymal volume. VEGFR1 and VEGFR2 expression were determined by Western blotting. The data were analyzed using analysis of variance with the Tukey-Kramer post hoc test. CDH frequency was 37% (80/216). All the morphological and histometric variables were reduced in the N- and CDH groups compared with the controls, and reductions were more pronounced in the CDH group (P<0.05) and more evident on GD 20.5 and GD 21.5. Similar results were observed for VEGFR1 and VEGFR2 expression. We conclude that N- and CDH fetuses showed primary pulmonary hypoplasia, with a decrease in VEGFR1 and VEGFR2 expression.

  7. VEGF receptor expression decreases during lung development in congenital diaphragmatic hernia induced by nitrofen

    Directory of Open Access Journals (Sweden)

    L. Sbragia

    2014-02-01

Full Text Available Changes in vascular endothelial growth factor (VEGF) in pulmonary vessels have been described in congenital diaphragmatic hernia (CDH) and may contribute to the development of pulmonary hypoplasia and hypertension; however, how the expression of VEGF receptors changes during fetal lung development in CDH is not understood. The aim of this study was to compare morphological evolution with expression of VEGF receptors, VEGFR1 (Flt-1) and VEGFR2 (Flk-1), in pseudoglandular, canalicular, and saccular stages of lung development in normal rat fetuses and in fetuses with CDH. Pregnant rats were divided into four groups (n=20 fetuses each) of four different gestational days (GD) 18.5, 19.5, 20.5, 21.5: external control (EC), exposed to olive oil (OO), exposed to 100 mg nitrofen, by gavage, without CDH (N-), and exposed to nitrofen with CDH (CDH) on GD 9.5 (term=22 days). The morphological variables studied were: body weight (BW), total lung weight (TLW), left lung weight, TLW/BW ratio, total lung volume, and left lung volume. The histometric variables studied were: left lung parenchymal area density and left lung parenchymal volume. VEGFR1 and VEGFR2 expression were determined by Western blotting. The data were analyzed using analysis of variance with the Tukey-Kramer post hoc test. CDH frequency was 37% (80/216). All the morphological and histometric variables were reduced in the N- and CDH groups compared with the controls, and reductions were more pronounced in the CDH group (P<0.05) and more evident on GD 20.5 and GD 21.5. Similar results were observed for VEGFR1 and VEGFR2 expression. We conclude that N- and CDH fetuses showed primary pulmonary hypoplasia, with a decrease in VEGFR1 and VEGFR2 expression.

  8. Early and mid-term results of lung transplantation with donors 60 years and older.

    Science.gov (United States)

    López, Iker; Zapata, Ricardo; Solé, Juan; Jaúregui, Alberto; Deu, María; Romero, Laura; Pérez, Javier; Bello, Irene; Wong, Manuel; Ribas, Montse; Masnou, Nuria; Rello, Jordi; Roman, Antonio; Canela, Mercedes

    2015-01-01

    There are doubts about the age limit for lung donors and the ideal donor has traditionally been considered to be one younger than 55 years. The objective of this study was to compare the outcomes in lung transplantation between organs from donors older and younger than 60 years. We performed a retrospective observational study comparing the group of patients receiving organs from donors 60 years or older (Group A) or younger than 60 years (Group B) between January 2007 and December 2011. Postoperative evolution and mortality rates, short-term and mid-term postoperative complications, and global survival rate were evaluated. We analysed a total of 230 lung transplants, of which 53 (23%) involved lungs from donors 60 years of age or older (Group A), and 177 (77%) were from donors younger than 60 years (Group B). Three (5.7%) patients from Group A and 14 patients (7.9%) from Group B died within 30 days (P = 0.58). The percentage of patients free from chronic lung allograft dysfunction at 1-3 years was 95.5, 74.3 and 69.3% for Group A, and 94.5, 84.8 and 73.3% for Group B, respectively (P = 0.47). There were no statistically significant differences between Groups A and B in terms of survival at 3 years, (69.4 vs 68.8%; P = 0.28). Our results support the idea that lungs from donors aged 60-70 years can be used safely for lung transplantation with comparable results to lungs from younger donors in terms of postoperative mortality and mid-term survival. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  9. VEGF receptor expression decreases during lung development in congenital diaphragmatic hernia induced by nitrofen

    International Nuclear Information System (INIS)

    Sbragia, L.; Nassr, A.C.C.; Gonçalves, F.L.L.; Schmidt, A.F.; Zuliani, C.C.; Garcia, P.V.; Gallindo, R.M.; Pereira, L.A.V.

    2014-01-01

    Changes in vascular endothelial growth factor (VEGF) in pulmonary vessels have been described in congenital diaphragmatic hernia (CDH) and may contribute to the development of pulmonary hypoplasia and hypertension; however, how the expression of VEGF receptors changes during fetal lung development in CDH is not understood. The aim of this study was to compare morphological evolution with expression of VEGF receptors, VEGFR1 (Flt-1) and VEGFR2 (Flk-1), in pseudoglandular, canalicular, and saccular stages of lung development in normal rat fetuses and in fetuses with CDH. Pregnant rats were divided into four groups (n=20 fetuses each) of four different gestational days (GD) 18.5, 19.5, 20.5, 21.5: external control (EC), exposed to olive oil (OO), exposed to 100 mg nitrofen, by gavage, without CDH (N-), and exposed to nitrofen with CDH (CDH) on GD 9.5 (term=22 days). The morphological variables studied were: body weight (BW), total lung weight (TLW), left lung weight, TLW/BW ratio, total lung volume, and left lung volume. The histometric variables studied were: left lung parenchymal area density and left lung parenchymal volume. VEGFR1 and VEGFR2 expression were determined by Western blotting. The data were analyzed using analysis of variance with the Tukey-Kramer post hoc test. CDH frequency was 37% (80/216). All the morphological and histometric variables were reduced in the N- and CDH groups compared with the controls, and reductions were more pronounced in the CDH group (P<0.05) and more evident on GD 20.5 and GD 21.5. Similar results were observed for VEGFR1 and VEGFR2 expression. We conclude that N- and CDH fetuses showed primary pulmonary hypoplasia, with a decrease in VEGFR1 and VEGFR2 expression

  10. Lung growth and development.

    Science.gov (United States)

    Joshi, Suchita; Kotecha, Sailesh

    2007-12-01

Human lung growth starts as a primitive lung bud in early embryonic life and undergoes several morphological stages which continue into postnatal life. Each stage of lung growth is the result of complex and tightly regulated events governed by physical, environmental, hormonal and genetic factors. Fetal lung liquid and fetal breathing movements are by far the most important determinants of lung growth. Although the timing of the stages of lung growth in animals does not mimic that of humans, numerous animal studies, mainly in sheep and rats, have given us a better understanding of the regulators of lung growth. Insight into the genetic basis of lung growth has helped us understand and improve management of complex life-threatening congenital abnormalities such as congenital diaphragmatic hernia and pulmonary hypoplasia. Although advances in perinatal medicine have improved survival of preterm infants, premature birth is perhaps still the most important factor for adverse lung growth.

  11. The Evolution of Therapies in Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Boolell, Vishal, E-mail: vishal.boolell@monashhealth.org.au; Alamgeer, Muhammad [Department of Medical Oncology, Monash Medical Centre, 823-865 Centre Road, East Bentleigh VIC 3165 (Australia); Hudson Institute of Medical Research, Monash University, 27-31 Wright Street, Clayton VIC 3168 (Australia); Watkins, David N. [Hudson Institute of Medical Research, Monash University, 27-31 Wright Street, Clayton VIC 3168 (Australia); Garvan Institute of Medical Research, 384 Victoria Street, Darlinghurst, Sydney NSW 2010 (Australia); UNSW Faculty of Medicine, St Vincent’s Clinical School, 390 Victoria Street, Darlinghurst, Sydney NSW 2010 (Australia); Department of Thoracic Medicine, St Vincent’s Hospital, 390 Victoria Street, Darlinghurst, Sydney NSW 2010 (Australia); Ganju, Vinod [Department of Medical Oncology, Monash Medical Centre, 823-865 Centre Road, East Bentleigh VIC 3165 (Australia); Hudson Institute of Medical Research, Monash University, 27-31 Wright Street, Clayton VIC 3168 (Australia)

    2015-09-09

The landscape of advanced non-small cell lung cancer (NSCLC) therapies has rapidly been evolving beyond chemotherapy over the last few years. The discovery of oncogenic driver mutations has led to new ways of classifying NSCLC as well as novel therapeutic targets for anticancer therapy. Drivers such as epidermal growth factor receptor (EGFR) mutations and anaplastic lymphoma kinase (ALK) gene rearrangements have successfully been targeted with appropriate tyrosine kinase inhibitors (TKIs). Other driver mutations such as ROS1, MET, RET, and BRAF have also been investigated with targeted agents, with some success in the early-phase clinical setting. Novel strategies in the field of immuno-oncology have also led to the development of inhibitors of cytotoxic T lymphocyte antigen-4 (CTLA-4) and the programmed death-1 receptor (PD-1), which target important pathways that allow cancer cells to escape detection by the immune system. These inhibitors have been successfully tried in NSCLC and now bring the exciting possibility of long-term responses in advanced NSCLC. In this review, recent data on novel targets and therapeutic strategies and their future prospects are discussed.

  12. How Lungs Work

    Science.gov (United States)

The Respiratory System: Your lungs are part of the respiratory system, ... your sense of smell. The Parts of the Respiratory System and How They Work. Airways: SINUSES are hollow ...

  13. The evolution of thoracic anesthesia.

    Science.gov (United States)

    Brodsky, Jay B

    2005-02-01

The specialty of thoracic surgery has evolved along with the modern practice of anesthesia. This close relationship began in the 1930s and continues today. Thoracic surgery has grown from a field limited almost exclusively to simple chest wall procedures to the present situation in which complex procedures, such as lung volume reduction or lung transplantation, can now be performed on the most severely compromised patients. The great advances in thoracic surgery have followed discoveries and technical innovations in many medical fields. One of the most important reasons for the rapid escalation in the number and complexity of thoracic surgical procedures now being performed has been the evolution of anesthesia for thoracic surgery. There has been so much progress in this area that numerous books and journals are devoted entirely to this subject. The author has been privileged to work with several surgeons who specialized in noncardiac thoracic surgery. As a colleague of 25 years, the noted pulmonary surgeon James B.D. Mark wrote, "Any operation is a team effort... (but) nowhere is this team effort more important than in thoracic surgery, where near-choreography of moves by all participants is essential. Exchange of information, status and plans are mandatory". This team approach between the thoracic surgeon and the anesthesiologist reflects the history of the two specialties. With new advances in technology, such as continuous blood gas monitoring and the pharmacologic management of pulmonary circulation to maximize oxygenation during one-lung ventilation, in the future even more complex procedures may be performed safely on even higher-risk patients.

  14. Lung scintigraphy in differential diagnosis of peripheral lung cancer and community-acquired pneumonia

    Energy Technology Data Exchange (ETDEWEB)

    Krivonogov, Nikolay G., E-mail: kng@cardio-tomsk.ru [Research Institute of Cardiology, Kievskaya Street 111a, Tomsk, 634012 (Russian Federation); Efimova, Nataliya Y., E-mail: efimova@cardio-tomsk.ru; Zavadovsky, Konstantin W.; Lishmanov, Yuri B. [Research Institute of Cardiology, Kievskaya Street 111a, Tomsk, 634012 (Russian Federation); Tomsk Polytechnic University, Lenin Avenue 30, Tomsk, 634050 (Russian Federation)

    2016-08-02

Ventilation/perfusion lung scintigraphy was performed in 39 patients with a verified diagnosis of community-acquired pneumonia (CAP) and in 14 patients with peripheral lung cancer. The ventilation/perfusion ratio, the apical-basal gradients of ventilation (U/L(V)) and lung perfusion (U/L(P)), and the alveolar-capillary permeability to radionuclide aerosol were determined from the scintigraphy data. The study demonstrated that the main signs of CAP were an increased ventilation/perfusion ratio, increased perfusion and ventilation gradients on the side of the diseased lung, and a bilateral increase in the alveolar-capillary permeability rate for radionuclide aerosol. In contrast, the scintigraphic signs of peripheral lung cancer comprise a ventilation/perfusion ratio above 1.0 on the side of the diseased lung with a simultaneous decrease on the contralateral side, normal perfusion and ventilation gradients in both lungs, and delayed alveolar-capillary clearance in the diseased lung compared with the intact lung.
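    A sketch of how the reported indices could be computed from registered ventilation and perfusion count images, with operator-defined upper- and lower-zone masks (an assumption; the study's exact zoning is not specified here):

```python
import numpy as np

def scintigraphy_indices(ventilation, perfusion, upper_zone, lower_zone):
    """Indices of the kind reported above, from registered V and Q images.

    ventilation, perfusion : 2-D count arrays for one lung.
    upper_zone, lower_zone : boolean masks for the apical and basal zones.
    Returns the V/Q ratio and the apical-basal gradients U/L(V) and U/L(P).
    """
    vq_ratio = ventilation.sum() / perfusion.sum()
    ul_ventilation = ventilation[upper_zone].sum() / ventilation[lower_zone].sum()
    ul_perfusion = perfusion[upper_zone].sum() / perfusion[lower_zone].sum()
    return vq_ratio, ul_ventilation, ul_perfusion
```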

  15. Lung scintigraphy in differential diagnosis of peripheral lung cancer and community-acquired pneumonia

    Science.gov (United States)

    Krivonogov, Nikolay G.; Efimova, Nataliya Y.; Zavadovsky, Konstantin W.; Lishmanov, Yuri B.

    2016-08-01

Ventilation/perfusion lung scintigraphy was performed in 39 patients with a verified diagnosis of community-acquired pneumonia (CAP) and in 14 patients with peripheral lung cancer. The ventilation/perfusion ratio, the apical-basal gradients of ventilation (U/L(V)) and lung perfusion (U/L(P)), and the alveolar-capillary permeability to radionuclide aerosol were determined from the scintigraphy data. The study demonstrated that the main signs of CAP were an increased ventilation/perfusion ratio, increased perfusion and ventilation gradients on the side of the diseased lung, and a bilateral increase in the alveolar-capillary permeability rate for radionuclide aerosol. In contrast, the scintigraphic signs of peripheral lung cancer comprise a ventilation/perfusion ratio above 1.0 on the side of the diseased lung with a simultaneous decrease on the contralateral side, normal perfusion and ventilation gradients in both lungs, and delayed alveolar-capillary clearance in the diseased lung compared with the intact lung.

  16. Lung cancer - small cell

    Science.gov (United States)

    Cancer - lung - small cell; Small cell lung cancer; SCLC ... About 15% of all lung cancer cases are SCLC. Small cell lung cancer is slightly more common in men than women. Almost all cases of SCLC are ...

  17. Surgical and survival outcomes of lung cancer patients with intratumoral lung abscesses.

    Science.gov (United States)

    Yamanashi, Keiji; Okumura, Norihito; Takahashi, Ayuko; Nakashima, Takashi; Matsuoka, Tomoaki

    2017-05-26

    Intratumoral lung abscess is a secondary lung abscess that is considered to be fatal. Therefore, surgical procedures, although high-risk, have sometimes been performed for intratumoral lung abscesses. However, no studies have examined the surgical outcomes of non-small cell lung cancer patients with intratumoral lung abscesses. The aim of this study was to investigate the surgical and survival outcomes of non-small cell lung cancer patients with intratumoral lung abscesses. Eleven consecutive non-small cell lung cancer patients with intratumoral lung abscesses, who had undergone pulmonary resection at our institution between January 2007 and December 2015, were retrospectively analysed. The post-operative prognoses were investigated and prognostic factors were evaluated. Ten of 11 patients were male and one patient was female. The median age was 64 (range, 52-80) years. Histopathologically, 4 patients had Stage IIA, 2 patients had Stage IIB, 2 patients had Stage IIIA, and 3 patients had Stage IV tumors. The median operative time was 346 min and the median amount of bleeding was 1327 mL. The post-operative morbidity and mortality rates were 63.6% and 0.0%, respectively. Recurrence of respiratory infections, including lung abscesses, was not observed in all patients. The median post-operative observation period was 16.1 (range, 1.3-114.5) months. The 5-year overall survival rate was 43.3%. No pre-operative, intra-operative, or post-operative prognostic factors were identified in the univariate analyses. Surgical procedures for advanced-stage non-small cell lung cancer patients with intratumoral lung abscesses, although high-risk, led to satisfactory post-operative mortality rates and acceptable prognoses.

  18. Protecting Your Lungs

    Science.gov (United States)

    ... lung capacity. Specific breathing exercises can also help improve your lung function if you have certain lung diseases, like COPD. Exercise and breathing techniques are also great for improving your mood and helping you relax. Public Health and Your ...

  19. Geriatric Assessment and Functional Decline in Older Patients with Lung Cancer.

    Science.gov (United States)

    Decoster, L; Kenis, C; Schallier, D; Vansteenkiste, J; Nackaerts, K; Vanacker, L; Vandewalle, N; Flamaing, J; Lobelle, J P; Milisen, K; De Grève, J; Wildiers, H

    2017-10-01

Older patients with lung cancer are a heterogeneous population, making treatment decisions complex. This study aimed to evaluate the value of geriatric assessment (GA) as well as the evolution of functional status (FS) in older patients with lung cancer, and to identify predictors associated with functional decline and overall survival (OS). At baseline, GA was performed in patients ≥70 years with newly diagnosed lung cancer. FS, measured by activities of daily living (ADL) and instrumental activities of daily living (IADL), was reassessed at follow-up to define functional decline, and OS was collected. Predictors of functional decline and OS were determined. Two hundred and forty-five patients were included in this study. At baseline, GA deficiencies were present in all domains, and ADL and IADL were impaired in 51 and 63% of patients, respectively. At follow-up, functional decline in ADL was observed in 23% of patients and in IADL in 45%. In multivariable analysis, radiotherapy was predictive of ADL decline. No other predictors of ADL or IADL decline were identified. Stage and baseline performance status were predictive of OS. Older patients with lung cancer present with multiple deficiencies covering all geriatric domains. During treatment, functional decline is observed in almost half of the patients. None of the specific domains of the GA was predictive of functional decline or survival, probably because of the high impact of the aggressiveness of this tumor type, which leads to a poor prognosis.

  20. Lung regeneration by fetal lung tissue implantation in a mouse pulmonary emphysema model.

    Science.gov (United States)

    Uyama, Koh; Sakiyama, Shoji; Yoshida, Mitsuteru; Kenzaki, Koichiro; Toba, Hiroaki; Kawakami, Yukikiyo; Okumura, Kazumasa; Takizawa, Hiromitsu; Kondo, Kazuya; Tangoku, Akira

    2016-01-01

    The mortality and morbidity of chronic obstructive pulmonary disease are high. However, no radical therapy has been developed to date. The purpose of this study was to evaluate whether fetal mouse lung tissue can grow and differentiate in the emphysematous lung. Fetal lung tissue from green fluorescent protein C57BL/6 mice at 16 days' gestation was used as donor material. Twelve-month-old pallid mice were used as recipients. Donor lungs were cut into small pieces and implanted into the recipient left lung by performing thoracotomy under anesthesia. The recipient mice were sacrificed at day 7, 14, and 28 after implantation and used for histological examination. Well-developed spontaneous pulmonary emphysema was seen in 12-month-old pallid mice. Smooth and continuous connection between implanted fetal lung tissue and recipient lung was recognized. Air space expansion and donor tissue differentiation were observed over time. We could clearly distinguish the border zones between injected tissue and native tissue by the green fluorescence of grafts. Fetal mouse lung fragments survived and differentiated in the emphysematous lung of pallid mice. Implantation of fetal lung tissue in pallid mice might lead to further lung regeneration research from the perspective of respiratory and exercise function. J. Med. Invest. 63: 182-186, August, 2016.

  1. Serial perfusion in native lungs in patients with idiopathic pulmonary fibrosis and other interstitial lung diseases after single lung transplantation.

    Science.gov (United States)

    Sokai, Akihiko; Handa, Tomohiro; Chen, Fengshi; Tanizawa, Kiminobu; Aoyama, Akihiro; Kubo, Takeshi; Ikezoe, Kohei; Nakatsuka, Yoshinari; Oguma, Tsuyoshi; Hirai, Toyohiro; Nagai, Sonoko; Chin, Kazuo; Date, Hiroshi; Mishima, Michiaki

    2016-04-01

    Lung perfusions after single lung transplantation (SLT) have not been fully clarified in patients with interstitial lung disease (ILD). The present study aimed to investigate temporal changes in native lung perfusion and their associated clinical factors in patients with ILD who have undergone SLT. Eleven patients were enrolled. Perfusion scintigraphy was serially performed up to 12 months after SLT. Correlations between the post-operative perfusion ratio in the native lung and clinical parameters, including pre-operative perfusion ratio and computed tomography (CT) volumetric parameters, were evaluated. On average, the perfusion ratio of the native lung was maintained at approximately 30% until 12 months after SLT. However, the ratio declined more significantly in idiopathic pulmonary fibrosis (IPF) than in other ILDs (p = 0.014). The perfusion ratio before SLT was significantly correlated with that at three months after SLT (ρ = 0.64, p = 0.048). The temporal change of the perfusion ratio in the native lung did not correlate with those of the CT parameters. The pre-operative perfusion ratio may predict the post-operative perfusion ratio of the native lung shortly after SLT in ILD. Perfusion of the native lung may decline faster in IPF compared with other ILDs. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. A simple method for the deconvolution of 134 Cs/137 Cs peaks in gamma-ray scintillation spectrometry

    International Nuclear Information System (INIS)

    Darko, E.O.; Osae, E.K.; Schandorf, C.

    1998-01-01

    A simple method for the deconvolution of 134 Cs / 137 Cs peaks in a given mixture of 134 Cs and 137 Cs using NaI(Tl) gamma-ray scintillation spectrometry is described. In this method the 795 keV energy of 134 Cs is used as a reference peak to calculate the activity of 137 Cs directly from the measured peaks. Certified reference materials were measured using the method and compared with high-resolution gamma-ray spectrometry measurements. The results showed good agreement with the certified values. The method is very simple and does not need any complicated mathematics or computer programme to deconvolute the overlapping 604.7 keV and 661.6 keV peaks of 134 Cs and 137 Cs, respectively. (author). 14 refs.; 1 tab., 2 figs
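
    The arithmetic behind the reference-peak approach can be sketched as follows. The function and calibration constants below are illustrative assumptions, not the authors' published procedure; the gamma emission probabilities are nominal literature values.

```python
# Minimal sketch of the reference-peak idea: use the interference-free 795 keV
# peak of 134Cs to remove its contribution from the unresolved 605/662 keV
# region, leaving the 137Cs activity. Efficiencies and counts are made up.

def cs134_cs137_activities(counts_795, counts_overlap,
                           eff_795, eff_605, eff_662,
                           p_795=0.854, p_605=0.976, p_662=0.851,
                           live_time=1.0):
    """Return (A_134, A_137) in Bq.

    counts_795     -- net counts in the 795 keV peak (134Cs only)
    counts_overlap -- net counts in the unresolved 605/662 keV region
    eff_*          -- detection efficiencies at each energy (assumed known)
    p_*            -- gamma emission probabilities (nominal literature values)
    """
    # 134Cs activity follows directly from its interference-free 795 keV peak.
    a134 = counts_795 / (eff_795 * p_795 * live_time)
    # Subtract the 134Cs contribution (605 keV line) from the overlap region;
    # the remainder is attributed to the 662 keV line of 137Cs.
    counts_137 = counts_overlap - a134 * eff_605 * p_605 * live_time
    a137 = counts_137 / (eff_662 * p_662 * live_time)
    return a134, a137

# Example with made-up numbers:
print(cs134_cs137_activities(counts_795=12000, counts_overlap=65000,
                             eff_795=0.010, eff_605=0.014, eff_662=0.013,
                             live_time=3600))
```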

  3. Tracking juniper berry content in oils and distillates by spectral deconvolution of gas chromatography/mass spectrometry data.

    Science.gov (United States)

    Robbat, Albert; Kowalsick, Amanda; Howell, Jessalin

    2011-08-12

    The complex nature of botanicals and essential oils makes it difficult to identify all of the constituents by gas chromatography/mass spectrometry (GC/MS) alone. In this paper, automated sequential, multidimensional gas chromatography/mass spectrometry (GC-GC/MS) was used to obtain a matrix-specific, retention time/mass spectrometry library of 190 juniper berry oil compounds. GC/MS analysis on stationary phases with different polarities confirmed the identities of each compound when spectral deconvolution software was used to analyze the oil. Also analyzed were distillates of juniper berry and its oil as well as gin from four different manufacturers. Findings showed the chemical content of juniper berry can be traced from starting material to final product and can be used to authenticate and differentiate brands. Copyright © 2011 Elsevier B.V. All rights reserved.
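
    As a rough illustration of what spectral deconvolution does in this setting, a co-eluting GC/MS scan can be modelled as a non-negative mixture of library spectra and unmixed by least squares. The toy spectra and compound names below are placeholders; this is not the commercial deconvolution software used in the study.

```python
# Hypothetical sketch: deconvolute one co-eluting GC/MS scan as a non-negative
# mixture of library mass spectra.
import numpy as np
from scipy.optimize import nnls

# Rows = library compounds, columns = m/z channels (unit-normalised intensities).
library = np.array([
    [0.9, 0.1, 0.0, 0.3],   # e.g. alpha-pinene (illustrative values)
    [0.1, 0.8, 0.4, 0.0],   # e.g. myrcene
    [0.0, 0.2, 0.9, 0.5],   # e.g. limonene
])
measured = np.array([0.55, 0.45, 0.40, 0.25])   # observed mixed spectrum

# Solve min ||library.T @ x - measured|| subject to x >= 0.
contributions, residual = nnls(library.T, measured)
print("component contributions:", contributions, "residual:", residual)
```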

  4. Analysis of Photosystem I Donor and Acceptor Sides with a New Type of Online-Deconvoluting Kinetic LED-Array Spectrophotometer.

    Science.gov (United States)

    Schreiber, Ulrich; Klughammer, Christof

    2016-07-01

    The newly developed Dual/KLAS-NIR spectrophotometer, technical details of which were reported very recently, is used in measuring redox changes of P700, plastocyanin (PC) and ferredoxin (Fd) in intact leaves of Hedera helix, Taxus baccata and Brassica napus. An overview of various light-/dark-induced changes of deconvoluted P700+, PC+ and Fd− signals is presented, demonstrating the wealth of novel information and the consistency of the obtained results. Fd− changes are particularly large after dark adaptation. PC oxidation precedes P700 oxidation during dark-light induction and in steady-state light response curves. Fd reoxidation during induction correlates with the secondary decline of simultaneously measured fluorescence yield, both of which are eliminated by removal of O2. By determination of 100% redox changes, relative contents of PC/P700 and Fd/P700 can be assessed, which show considerable variations between different leaves, with a trend to higher values in sun leaves. Based on deconvoluted P700+ signals, the complementary quantum yields of PSI, Y(I) (photochemical energy use), Y(ND) (non-photochemical loss due to oxidized primary donor) and Y(NA) (non-photochemical loss due to reduced acceptor) are determined as a function of light intensity and compared with the corresponding complementary quantum yields of PSII, Y(II) (photochemical energy use), Y(NPQ) (regulated non-photochemical loss) and Y(NO) (non-regulated non-photochemical loss). The ratio Y(I)/Y(II) increases with increasing intensities. In the low intensity range, a two-step increase of PC+ is indicative of heterogeneous PC pools. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
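
    The online deconvolution itself amounts to unmixing overlapping near-infrared difference signals with pre-measured model spectra. The sketch below shows the least-squares step under assumed (made-up) model spectra; it is not the instrument's actual calibration.

```python
import numpy as np

# Columns: model spectra of P700+, PC+ and Fd- at four dual-wavelength channels.
# These numbers are placeholders, not the instrument's calibration data.
model = np.array([
    [1.00, 0.45, 0.05],
    [0.60, 1.00, 0.10],
    [0.30, 0.20, 1.00],
    [0.15, 0.10, 0.70],
])
signal = np.array([0.82, 0.74, 0.41, 0.25])   # measured dA at one time point

# Least-squares unmixing: signal ~ model @ [P700+, PC+, Fd-]
components, *_ = np.linalg.lstsq(model, signal, rcond=None)
p700_ox, pc_ox, fd_red = components
print(f"P700+ = {p700_ox:.3f}, PC+ = {pc_ox:.3f}, Fd- = {fd_red:.3f}")
```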

  5. Staging of Lung Cancer

    Science.gov (United States)

    ... LUNG CANCER MINI-SERIES #2 Staging of Lung Cancer Once your lung cancer is diagnosed, staging tells you and your health care provider about ... at it under a microscope. The stages of lung cancer are listed as I, II, III, and IV ...

  6. Predictive value of thyroid hormones in the evolution of bone marrow transplantation

    International Nuclear Information System (INIS)

    Alonso, C.A.; Carnot, J.; De Castro, R.; Morera, M.L.; Garcia, I.

    1998-01-01

    In this work, the predictive value of thyroid hormones in the evolution of bone marrow transplantation is assessed, as indicators of oxidative metabolism and albumin synthesis. The patients received conditioning treatment before transplantation and in the post-operative period. The radiation dose was 1000 cGy, delivered in the lateral decubitus position with lung blocking, given over 3 irradiation sessions to patients who then underwent an allogeneic marrow transplant.

  7. MRI and CT lung biomarkers: Towards an in vivo understanding of lung biomechanics.

    Science.gov (United States)

    Young, Heather M; Eddy, Rachel L; Parraga, Grace

    2017-09-29

    The biomechanical properties of the lung are necessarily dependent on its structure and function, both of which are complex and change over time and space. This makes in vivo evaluation of lung biomechanics and a deep understanding of lung biomarkers very challenging. In patients and animal models of lung disease, in vivo evaluations of lung structure and function are typically made at the mouth and include spirometry, multiple-breath gas washout tests and the forced oscillation technique. These techniques, and the biomarkers they provide, incorporate the properties of the whole organ system including the parenchyma, large and small airways, mouth, diaphragm and intercostal muscles. Unfortunately, these well-established measurements mask regional differences, limiting their ability to probe the lung's gross and micro-biomechanical properties, which vary widely throughout the organ and its subcompartments. Pulmonary imaging has the advantage of providing regional, non-invasive measurements of healthy and diseased lung, in vivo. Here we summarize well-established and emerging lung imaging tools and biomarkers and how they may be used to generate lung biomechanical measurements. We review well-established and emerging lung anatomical, microstructural and functional imaging biomarkers generated using synchrotron x-ray tomographic-microscopy (SRXTM), micro-x-ray computed-tomography (micro-CT), clinical CT as well as magnetic resonance imaging (MRI). Pulmonary imaging provides measurements of lung structure, function and biomechanics with high spatial and temporal resolution. Imaging biomarkers that reflect the biomechanical properties of the lung are now being validated to provide a deeper understanding of the lung that cannot be achieved using measurements made at the mouth. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Lung cancer mimicking lung abscess formation on CT images

    OpenAIRE

    Taira, Naohiro; Kawabata, Tsutomu; Gabe, Atsushi; Ichi, Takaharu; Kushi, Kazuaki; Yohena, Tomofumi; Kawasaki, Hidenori; Yamashiro, Toshimitsu; Ishikawa, Kiyoshi

    2014-01-01

    Patient: Male, 64 Final Diagnosis: Lung pleomorphic carcinoma Symptoms: Cough • fever Medication: — Clinical Procedure: — Specialty: Oncology Objective: Unusual clinical course Background: The diagnosis of lung cancer is often made based on computed tomography (CT) image findings if it cannot be confirmed on pathological examinations, such as bronchoscopy. However, the CT image findings of cancerous lesions are similar to those of abscesses. We herein report a case of lung cancer that resemble...

  9. Stochastic rat lung dosimetry for inhaled radon progeny: a surrogate for the human lung for lung cancer risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Winkler-Heil, R.; Hofmann, W. [University of Salzburg, Division of Physics and Biophysics, Department of Materials Research and Physics, Salzburg (Austria); Hussain, M. [University of Salzburg, Division of Physics and Biophysics, Department of Materials Research and Physics, Salzburg (Austria); Higher Education Commission of Pakistan, Islamabad (Pakistan)

    2015-05-15

    Laboratory rats are frequently used in inhalation studies as a surrogate for human exposures. The objective of the present study was therefore to develop a stochastic dosimetry model for inhaled radon progeny in the rat lung, to predict bronchial dose distributions and to compare them with corresponding dose distributions in the human lung. The most significant difference between human and rat lungs is the branching structure of the bronchial tree, which is relatively symmetric in the human lung, but monopodial in the rat lung. Radon progeny aerosol characteristics used in the present study encompass conditions typical for PNNL and COGEMA rat inhalation studies, as well as uranium miners and human indoor exposure conditions. It is shown here that depending on exposure conditions and modeling assumptions, average bronchial doses in the rat lung ranged from 5.4 to 7.3 mGy WLM{sup -1}. If plotted as a function of airway generation, bronchial dose distributions exhibit a significant maximum in large bronchial airways. If, however, plotted as a function of airway diameter, then bronchial doses are much more uniformly distributed throughout the bronchial tree. Comparisons between human and rat exposures indicate that rat bronchial doses are slightly higher than human bronchial doses by about a factor of 1.3, while lung doses, averaged over the bronchial (BB), bronchiolar (bb) and alveolar-interstitial (AI) regions, are higher by a factor of about 1.6. This supports the current view that the rat lung is indeed an appropriate surrogate for the human lung in case of radon-induced lung cancers. Furthermore, airway diameter seems to be a more appropriate morphometric parameter than airway generations to relate bronchial doses to bronchial carcinomas. (orig.)

  10. Lung-derived growth factors: possible paracrine effectors of fetal lung development

    International Nuclear Information System (INIS)

    Montes, A.M.

    1985-01-01

    A potential role for paracrine secretions in lung organogenesis has been hypothesized (Alescio and Piperno, 1957). These studies present direct support for the paracrine model by demonstrating the presence of locally produced mitogenic/maturational factors in fetal rat lung tissue. Conditioned serum-free medium (CSFM) from nineteen-day fetal rat lung cultures was shown to contain several bioactive peptides as detected by 3 H-thymidine incorporation into chick embryo and rat lung fibroblasts, as well as 14 C-choline incorporation into surfactant in mixed cell cultures. Using ion-exchange chromatography and Sephadex gel filtration, a partially purified mitogen, 11-III, was obtained. The partially purified 11-III stimulates mitosis in chick embryo fibroblasts and post-natal rat lung fibroblasts. Multiplication in fetal rat lung fibroblast cultures is stimulated only when these are pre-incubated with a competence factor or unprocessed CSFM. This suggests the existence of an endogenously produced competence factor important in the regulation of fetal lung growth. Preparation 11-III does not possess surfactant stimulating activity as assessed by 3 H-choline incorporation into lipids in predominantly type-II cell cultures. These data demonstrate the presence of a maturational/mitogenic factor influencing type-II mixed cell cultures. In addition, 11-III has been shown to play an autocrine role stimulating the proliferation of fetal lung fibroblasts. Finally, these data suggest the existence of a locally produced competence factor

  11. Deconvolution of the tree ring based delta13C record

    International Nuclear Information System (INIS)

    Peng, T.; Broecker, W.S.; Freyer, H.D.; Trumbore, S.

    1983-01-01

    We assumed the tree-ring-based 13 C/ 12 C record constructed by Freyer and Belacy (1983) to be representative of the fossil-fuel- and forest-soil-induced 13 C/ 12 C change for atmospheric CO 2 . Through the use of a modification of the Oeschger et al. ocean model, we have computed the contribution of the combustion of coal, oil, and natural gas to this observed 13 C/ 12 C change. A large residual remains when the tree-ring-based record is corrected for the contribution of fossil fuel CO 2 . A deconvolution was performed on this residual to determine the time history and magnitude of the forest-soil reservoir changes over the past 150 years. Several important conclusions were reached. (1) The magnitude of the integrated CO 2 input from these sources was about 1.6 times that from fossil fuels. (2) The forest-soil contribution reached a broad maximum centered at about 1900. (3) Over the 2 decade period covered by the Mauna Loa atmospheric CO 2 content record, the input from forests and soils was about 30% that from fossil fuels. (4) The 13 C/ 12 C trend over the last 20 years was dominated by the input of fossil fuel CO 2 . (5) The forest-soil release did not contribute significantly to the secular increase in atmospheric CO 2 observed over the last 20 years. (6) The pre-1850 atmospheric pCO 2 values must have been in the range 245 to 270 x 10 -6 atmospheres
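
    The deconvolution step can be illustrated with a toy discrete model in which the observed atmospheric perturbation is the convolution of an annual source history with an assumed airborne-fraction impulse response; inverting that lower-triangular system recovers the source. The response function and numbers below are placeholders, not the Oeschger et al. ocean model.

```python
# Toy sketch of the deconvolution step: recover an annual source history from an
# observed atmospheric residual, given an assumed impulse response.
import numpy as np

years = np.arange(1850, 1981)
n = len(years)

# Assumed impulse response: fraction of a unit pulse remaining airborne after
# k years (simple exponential decay toward a constant fraction).
k = np.arange(n)
response = 0.2 + 0.8 * np.exp(-k / 40.0)

# Forward model: observed[i] = sum_{j<=i} response[i - j] * source[j],
# i.e. a lower-triangular convolution matrix.
A = np.zeros((n, n))
for i in range(n):
    A[i, : i + 1] = response[: i + 1][::-1]

# Synthetic "observed" residual record for demonstration only.
true_source = 0.5 * np.exp(-((years - 1900) / 30.0) ** 2)
observed = A @ true_source

# Deconvolution: invert the triangular system.
recovered = np.linalg.solve(A, observed)
print("max reconstruction error:", np.abs(recovered - true_source).max())
```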

  12. The Evolution of Therapies in Non-Small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Vishal Boolell

    2015-09-01

    Full Text Available The landscape of advanced non-small cell lung cancer (NSCLC) therapies has rapidly been evolving beyond chemotherapy over the last few years. The discovery of oncogenic driver mutations has led to new ways of classifying NSCLC as well as offered novel therapeutic targets for anticancer therapy. Targets such as epidermal growth factor receptor (EGFR) mutations and anaplastic lymphoma kinase (ALK) gene rearrangements have successfully been targeted with appropriate tyrosine kinase inhibitors (TKIs). Other driver mutations such as ROS, MET, RET and BRAF have also been investigated with targeted agents with some success in the early phase clinical setting. Novel strategies in the field of immune-oncology have also led to the development of inhibitors of cytotoxic T lymphocyte antigen-4 (CTLA-4) and programmed death-1 receptor (PD-1), which are important pathways in allowing cancer cells to escape detection by the immune system. These inhibitors have been successfully tried in NSCLC and also now bring the exciting possibility of long term responses in advanced NSCLC. In this review recent data on novel targets and therapeutic strategies and their future prospects are discussed.

  13. Amebic lung abscess with coexisting lung adenocarcinoma: an unusual case of amebiasis.

    Science.gov (United States)

    Zhu, Hailong; Min, Xiangyang; Li, Shuai; Feng, Meng; Zhang, Guofeng; Yi, Xianghua

    2014-01-01

    Amebic lung abscess with concurrent lung cancer, but without either a liver abscess or amebic colitis, is extremely uncommon. Here, we report a 70-year-old man presenting with pulmonary amebiasis and coexisting lung adenocarcinoma. During his first hospitalization, the diagnosis of lung amebiasis was confirmed by morphological observation and PCR in formalin-fixed and paraffin-embedded sediments of pleural effusion. Almost four months later, the patient was readmitted to hospital for similar complaints. On readmission, lung adenocarcinoma was diagnosed by liquid-based sputum cytology, and the diagnosis was thought to have been delayed because of the coexisting amebic lung abscess. This case demonstrated that sediments of pleural effusion may be used for further pathological examination after routine cytology has shown negative results. At the same time, we concluded that lung cancer may easily go undetected in patients with pulmonary amebiasis, and that repeated evaluation by cytology and imaging follow-up is useful for detecting a potential cancer.

  14. Regeneration of the lung: Lung stem cells and the development of lung mimicking devices

    NARCIS (Netherlands)

    Schilders, K.; Eenjes, E.; van Riet, S.; Poot, Andreas A.; Stamatialis, Dimitrios; Truckenmüller, R.K.; Hiemstra, P.; Rottier, R.

    2016-01-01

    Inspired by the increasing burden of lung-associated diseases in society and a growing demand to accommodate patients, great efforts by the scientific community produce an increasing stream of data focused on delineating the basic principles of lung development and growth, as well as

  15. Application of constrained deconvolution technique for reconstruction of electron bunch profile with strongly non-Gaussian shape

    Science.gov (United States)

    Geloni, G.; Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    2004-08-01

    An effective and practical technique based on the detection of the coherent synchrotron radiation (CSR) spectrum can be used to characterize the profile function of ultra-short bunches. The CSR spectrum measurement has an important limitation: no spectral phase information is available, and the complete profile function cannot be obtained in general. In this paper we propose to use a constrained deconvolution method for bunch profile reconstruction based on a priori known information about the formation of the electron bunch. Application of the method is illustrated with a practically important example of a bunch formed in a single bunch compressor. Downstream of the bunch compressor the bunch charge distribution is strongly non-Gaussian with a narrow leading peak and a long tail. The longitudinal bunch distribution is derived by measuring the bunch tail constant with a streak camera and by using a priori available information about the profile function.
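
    A generic flavour of such a constrained deconvolution, using non-negativity and a smoothness penalty as the a priori information, is sketched below. It is a simplified stand-in for illustration, not the CSR-spectrum reconstruction described in the paper.

```python
# Constrained deconvolution sketch: recover a non-negative bunch profile from a
# smeared, noisy measurement using Tikhonov-regularised non-negative least squares.
import numpy as np
from scipy.optimize import nnls

n = 200
t = np.linspace(-5.0, 5.0, n)
dt = t[1] - t[0]

# "True" profile: narrow leading peak plus a long trailing tail.
profile = np.exp(-t**2 / 0.05) + 0.3 * np.exp(-((t - 1.5) ** 2) / 2.0) * (t > 0)

# Measurement model: smearing by a Gaussian response plus noise.
smear = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.15**2)) * dt
measured = smear @ profile + 0.01 * np.random.default_rng(0).normal(size=n)

# A priori constraints: non-negativity (via NNLS) and smoothness (Tikhonov block).
lam = 0.05
A = np.vstack([smear, lam * np.eye(n)])
b = np.concatenate([measured, np.zeros(n)])
recovered, _ = nnls(A, b)
print("true peak at t =", t[profile.argmax()],
      " recovered peak at t =", t[recovered.argmax()])
```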

  16. Application of blind deconvolution with crest factor for recovery of original rolling element bearing defect signals

    International Nuclear Information System (INIS)

    Son, J. D.; Yang, B. S.; Tan, A. C. C.; Mathew, J.

    2004-01-01

    Many machine failures are not detected well in advance due to the masking of background noise and attenuation of the source signal through the transmission media. Advanced signal processing techniques using adaptive filters and higher order statistics have been attempted to extract the source signal from the measured data at the machine surface. In this paper, blind deconvolution using the Eigenvector Algorithm (EVA) technique is used to recover a damaged bearing signal using only the measured signal at the machine surface. A damaged bearing signal corrupted by noise with varying signal-to-noise (s/n) ratios was used to determine the effectiveness of the technique in detecting an incipient signal and the optimum choice of filter length. The results show that the technique is effective in detecting the source signal with an s/n ratio as low as 0.21, but requires a relatively large filter length
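
    For illustration, a kurtosis-maximising FIR filter (in the spirit of minimum entropy deconvolution) can recover impulsive defect signatures blindly, with the crest factor used to track the improvement. This sketch is not the eigenvector algorithm (EVA) used in the study, and all signals below are synthetic.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

def crest_factor(x):
    return np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2))

def blind_deconvolve(x, filt_len=30, n_iter=30):
    # Symmetric Toeplitz autocorrelation matrix of the measured signal.
    r = np.correlate(x, x, mode="full")[len(x) - 1: len(x) - 1 + filt_len]
    R = toeplitz(r)
    f = np.zeros(filt_len)
    f[filt_len // 2] = 1.0                         # start from a pure delay
    for _ in range(n_iter):
        y = lfilter(f, [1.0], x)
        # Cross-correlation of y**3 with x drives the kurtosis-maximising update.
        b = np.array([np.dot(y[filt_len:] ** 3, x[filt_len - l: len(x) - l])
                      for l in range(filt_len)])
        f = np.linalg.solve(R, b)
        f /= np.linalg.norm(f)
    return lfilter(f, [1.0], x)

# Synthetic test: periodic impulses (a bearing defect) smeared by a transmission
# path and buried in noise.
rng = np.random.default_rng(1)
impulses = np.zeros(4096)
impulses[::256] = 1.0
path = np.exp(-np.arange(64) / 8.0) * np.cos(0.6 * np.arange(64))
measured = np.convolve(impulses, path, mode="same") + 0.3 * rng.normal(size=4096)

recovered = blind_deconvolve(measured)
print("crest factor before:", round(crest_factor(measured), 2),
      "after:", round(crest_factor(recovered), 2))
```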

  17. Application of constrained deconvolution technique for reconstruction of electron bunch profile with strongly non-Gaussian shape

    International Nuclear Information System (INIS)

    Geloni, G.; Saldin, E.L.; Schneidmiller, E.A.; Yurkov, M.V.

    2004-01-01

    An effective and practical technique based on the detection of the coherent synchrotron radiation (CSR) spectrum can be used to characterize the profile function of ultra-short bunches. The CSR spectrum measurement has an important limitation: no spectral phase information is available, and the complete profile function cannot be obtained in general. In this paper we propose to use a constrained deconvolution method for bunch profile reconstruction based on a priori known information about the formation of the electron bunch. Application of the method is illustrated with a practically important example of a bunch formed in a single bunch compressor. Downstream of the bunch compressor the bunch charge distribution is strongly non-Gaussian with a narrow leading peak and a long tail. The longitudinal bunch distribution is derived by measuring the bunch tail constant with a streak camera and by using a priori available information about the profile function.

  18. Effects of lung elasticity on the sound propagation in the lung

    International Nuclear Information System (INIS)

    Yoneda, Takahiro; Wada, Shigeo; Nakamura, Masanori; Horii, Noriaki; Mizushima, Koichiro

    2011-01-01

    Sound propagation in the lung was simulated to gain insight into its acoustic properties. A thorax model consisting of lung parenchyma, thoracic bones, trachea and other tissues was made from human CT images. The acoustic nature of the lung parenchyma and bones was expressed with the Biot model of poroelastic material, whereas the trachea and tissues were modeled with gas and an elastic material. A point sound source of white noise was placed at the first bifurcation of the trachea. The sound propagation in the thorax model was simulated in the frequency domain. The results demonstrated significant attenuation of sound, especially at frequencies above 1,000 Hz. Simulations with a stiffened lung demonstrated suppression of the sound attenuation at higher frequencies observed in the normal lung. These results indicate that the normal lung has the nature of a low-pass filter, and that stiffening helps the sound at higher frequencies to propagate without attenuation. (author)

  19. Contemporary review on the inequities in the management of lung cancer among the African-American population.

    Science.gov (United States)

    Kim, Anthony W; Liptay, Michael J; Higgins, Robert S D

    2008-06-01

    Lung cancer is the leading cause of cancer-related mortality. Nonsmall-cell lung cancer (NSCLC) constitutes approximately 80% of all lung cancers observed. Despite the aggressive nature of this disease, a fully adequate and comprehensive treatment yielding outstanding outcomes and survival has yet to be found. A uniform approach to managing NSCLC is still evolving. Without a universally accepted algorithm against which clinical decisions can be referenced or compared, differences in the treatment of this disease process can and will exist. Racial bias in the management of NSCLC is increasingly recognized as a cause of substandard delivery of adequate care. Whether this is a newly emerging phenomenon or simply one that is being exposed is unclear. Nevertheless, this inequity in management ranges from the early to late stages of NSCLC. The purpose of this manuscript is to explore the reasons behind the differences in the receipt of care for NSCLC that exist between the African-American and Caucasian populations.

  20. Lung structure and function relation in systemic sclerosis: Application of lung densitometry

    Energy Technology Data Exchange (ETDEWEB)

    Ninaber, Maarten K., E-mail: m.k.ninaber@lumc.nl [Department of Pulmonology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands); Stolk, Jan; Smit, Jasper; Le Roy, Ernest J. [Department of Pulmonology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands); Kroft, Lucia J.M. [Department of Radiology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands); Els Bakker, M. [Division of Image Processing, Radiology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands); Vries Bouwstra, Jeska K. de; Schouffoer, Anne A. [Department of Rheumatology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands); Staring, Marius; Stoel, Berend C. [Division of Image Processing, Radiology, Leiden University Medical Center, Albinusdreef 2, 2333ZA Leiden (Netherlands)

    2015-05-15

    Highlights: • A quantitative CT parameter of lung parenchyma in systemic sclerosis is presented. • We examine the optimal percentage threshold for the percentile density. • The 85th percentile density threshold correlated significantly with lung function. • A lung structure–function relation is confirmed. • We report applicability of Perc85 in progression mapping of interstitial lung disease. - Abstract: Introduction: Interstitial lung disease occurs frequently in patients with systemic sclerosis (SSc). Quantitative computed tomography (CT) densitometry using the percentile density method may provide a sensitive assessment of lung structure for monitoring parenchymal damage. Therefore, we aimed to evaluate the optimal percentile density score in SSc by quantitative CT densitometry, against pulmonary function. Material and methods: We investigated 41 SSc patients by chest CT scan, spirometry and gas transfer tests. Lung volumes and the nth percentile density (between 1 and 99%) of the entire lungs were calculated from CT histograms. The nth percentile density is defined as the threshold value of densities expressed in Hounsfield units. A prerequisite for an optimal percentage was its correlation with baseline DLCO %predicted. Two patients showed distinct changes in lung function 2 years after baseline. We obtained CT scans from these patients and performed progression analysis. Results: Regression analysis for the relation between DLCO %predicted and the nth percentile density was optimal at 85% (Perc85). There was significant agreement between Perc85 and DLCO %predicted (R = −0.49, P = 0.001) and FVC %predicted (R = −0.64, P < 0.001). Two patients showed a marked change in Perc85 over a 2 year period, but the localization of change differed clearly. Conclusions: We identified Perc85 as optimal lung density parameter, which correlated significantly with DLCO and FVC, confirming a lung parenchymal structure–function relation in SSc. This provides
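
    The percentile density itself is simple to compute once the lung voxels are segmented: it is the Hounsfield-unit threshold below which n% of the voxels fall. A minimal sketch with synthetic voxel values is shown below; the numbers are illustrative only.

```python
# Minimal sketch of the percentile-density idea (e.g. Perc85). Lung segmentation
# is assumed to have been done already; the voxel values are synthetic.
import numpy as np

def percentile_density(lung_voxels_hu, n=85):
    """Return the nth percentile density (in HU) of the segmented lung voxels."""
    return np.percentile(lung_voxels_hu, n)

# Example: fibrotic change adds denser voxels and shifts Perc85 upward.
rng = np.random.default_rng(0)
healthy = rng.normal(-850, 60, size=100_000)
fibrotic = np.concatenate([healthy, rng.normal(-300, 100, size=20_000)])
print("Perc85 healthy:", round(percentile_density(healthy)),
      "Perc85 fibrotic:", round(percentile_density(fibrotic)))
```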

  1. Lung structure and function relation in systemic sclerosis: Application of lung densitometry

    International Nuclear Information System (INIS)

    Ninaber, Maarten K.; Stolk, Jan; Smit, Jasper; Le Roy, Ernest J.; Kroft, Lucia J.M.; Els Bakker, M.; Vries Bouwstra, Jeska K. de; Schouffoer, Anne A.; Staring, Marius; Stoel, Berend C.

    2015-01-01

    Highlights: • A quantitative CT parameter of lung parenchyma in systemic sclerosis is presented. • We examine the optimal percentage threshold for the percentile density. • The 85th percentile density threshold correlated significantly with lung function. • A lung structure–function relation is confirmed. • We report applicability of Perc85 in progression mapping of interstitial lung disease. - Abstract: Introduction: Interstitial lung disease occurs frequently in patients with systemic sclerosis (SSc). Quantitative computed tomography (CT) densitometry using the percentile density method may provide a sensitive assessment of lung structure for monitoring parenchymal damage. Therefore, we aimed to evaluate the optimal percentile density score in SSc by quantitative CT densitometry, against pulmonary function. Material and methods: We investigated 41 SSc patients by chest CT scan, spirometry and gas transfer tests. Lung volumes and the nth percentile density (between 1 and 99%) of the entire lungs were calculated from CT histograms. The nth percentile density is defined as the threshold value of densities expressed in Hounsfield units. A prerequisite for an optimal percentage was its correlation with baseline DLCO %predicted. Two patients showed distinct changes in lung function 2 years after baseline. We obtained CT scans from these patients and performed progression analysis. Results: Regression analysis for the relation between DLCO %predicted and the nth percentile density was optimal at 85% (Perc85). There was significant agreement between Perc85 and DLCO %predicted (R = −0.49, P = 0.001) and FVC %predicted (R = −0.64, P < 0.001). Two patients showed a marked change in Perc85 over a 2 year period, but the localization of change differed clearly. Conclusions: We identified Perc85 as optimal lung density parameter, which correlated significantly with DLCO and FVC, confirming a lung parenchymal structure–function relation in SSc. This provides

  2. Evolution of Functional Groups during Pyrolysis Oil Upgrading

    Energy Technology Data Exchange (ETDEWEB)

    Stankovikj, Filip [Department; Tran, Chi-Cong [Department; Kaliaguine, Serge [Department; Olarte, Mariefel V. [Pacific Northwest National Laboratory, Richland, Washington 99354, United States; Garcia-Perez, Manuel [Department

    2017-07-14

    In this paper, we examine the evolution of functional groups (carbonyl, carboxyl, phenol, and hydroxyl) during stabilization at 100–200 °C of two typical wood-derived pyrolysis oils from BTG and Amaron in a batch reactor over a Ru/C catalyst for 4 h. An aqueous and an oily phase were obtained. The content of functional groups in both phases was analyzed by GC/MS, 31P-NMR, 1H-NMR, elemental analysis, KF titration, carbonyl groups by the Faix method, the Folin–Ciocalteu method and UV-fluorescence. The consumption of hydrogen was between 0.007 and 0.016 g/g oil, and 0.001-0.020 g of CH4/g of oil, 0.005-0.016 g of CO2/g oil and 0.03-0.10 g H2O/g oil were formed. The content of carbonyl, hydroxyl, and carboxyl groups in the volatile GC-MS detectable fraction decreased (80, 65, and ~70%, respectively), while their behavior in the total oil, and hence in the non-volatile fraction, was more complex. The carbonyl groups initially decreased, reaching a minimum at ~125-150 °C, and then increased, while the hydroxyl groups showed the reverse trend. This might be explained by initial hydrogenation of the carbonyl groups to form hydroxyls, followed by continued dehydration reactions at higher temperatures that may increase their content. 31P-NMR was at the limit of its sensitivity for the carboxylic groups to precisely detect changes in the non-volatile fraction; however, the more precise titration method showed that the concentration of carboxylic groups in the non-volatile fraction remains constant with increased stabilization temperature. The UV-fluorescence results show that repolymerization increases with temperature. The ATR-FTIR method, coupled with deconvolution of the region between 1490 and 1850 cm-1, proved to be a good tool for following the changes in carbonyl groups and phenols of the stabilized pyrolysis oils. The deconvolution of the IR bands around 1050 and 1260 cm-1 correlated very well with the changes in the 31P-NMR silent O groups (likely ethers). Most of the H2O formation could be
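
    Band deconvolution of an FTIR region of this kind is commonly done by fitting the measured absorbance as a sum of component peaks. The sketch below fits two Gaussian bands to a synthetic spectrum as an illustration; the band positions and data are assumptions, not the authors' spectra.

```python
# Illustrative band deconvolution of the 1490-1850 cm-1 region: fit the
# absorbance as a sum of Gaussian components and report their areas.
import numpy as np
from scipy.optimize import curve_fit

wavenumber = np.linspace(1490, 1850, 500)

def gaussian(x, amp, center, width):
    return amp * np.exp(-((x - center) ** 2) / (2 * width ** 2))

def two_bands(x, a1, c1, w1, a2, c2, w2):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic spectrum: an aromatic-ring band near 1600 cm-1 and a carbonyl band
# near 1710 cm-1, plus a little noise.
rng = np.random.default_rng(2)
spectrum = (two_bands(wavenumber, 0.8, 1600, 25, 0.5, 1710, 30)
            + 0.01 * rng.normal(size=wavenumber.size))

p0 = [1.0, 1590, 20, 0.4, 1700, 20]          # initial guesses
params, _ = curve_fit(two_bands, wavenumber, spectrum, p0=p0)
print("fitted band areas:",
      params[0] * params[2] * np.sqrt(2 * np.pi),
      params[3] * params[5] * np.sqrt(2 * np.pi))
```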

  3. 67Ga lung scan

    International Nuclear Information System (INIS)

    Niden, A.H.; Mishkin, F.S.; Khurana, M.M.L.; Pick, R.

    1977-01-01

    Twenty-three patients with clinical signs of pulmonary embolic disease and lung infiltrates were studied to determine the value of gallium citrate 67 Ga lung scan in differentiating embolic from inflammatory lung disease. In 11 patients without angiographically proved embolism, only seven had corresponding ventilation-perfusion defects compatible with inflammatory disease. In seven of these 11 patients, the 67 Ga concentration indicated inflammatory disease. In the 12 patients with angiographically proved embolic disease, six had corresponding ventilation-perfusion defects compatible with inflammatory disease. None had an accumulation of 67 Ga in the area of pulmonary infiltrate. Thus, ventilation-perfusion lung scans are of limited value when lung infiltrates are present. In contrast, the accumulation of 67 Ga in the lung indicates an inflammatory process. Gallium imaging can help select those patients with lung infiltrates who need angiography

  4. Schistosoma mansoni: quantitative aspects of the evolution of gamma-radiation cercariae at the skin, lungs and portal system, in mice

    International Nuclear Information System (INIS)

    Sa Cardoso, G. de; Coelho, P.M.Z.

    1989-01-01

    The migration of Schistosoma mansoni (LE and SJ strains) has been studied in eight groups of outbred Swiss albino mice (Mus musculus), which were previously infected transcutaneously with ca. 450 cercariae. The infection of mice was performed with non-irradiated cercariae (control groups), or with gamma-irradiated cercariae at doses of 3, 20 and 40 krad. Regarding the skin, the recovery rates decreased progressively with time after infection. As far as the lungs and portal system are concerned, a significant inverse correlation was observed between the total recovery rate and the irradiation doses. The dose of 20 krad practically hinders the migration of the parasites (in both strains) from the lungs to the portal system, whereas the dose of 40 krad prevents the migration of most of the parasites from the skin to the lungs. (author)

  5. History of Lung Transplantation.

    Science.gov (United States)

    Dabak, Gül; Şenbaklavacı, Ömer

    2016-04-01

    The history of lung transplantation can be traced back to the early years of the 20th century, when experimental vascular anastomotic techniques were developed by Carrel and Guthrie, followed by transplantation of thoracic organs in animal models by Demikhov, and finally by James Hardy, who made the first lung transplantation attempt in a human. But it was not until the discovery of cyclosporine and the development of better surgical techniques that success could be achieved in the field, by the Toronto Lung Transplant Group led by Joel Cooper. Up to the present day, over 51,000 lung transplants have been performed worldwide at different centers. The start of lung transplantation in Turkey was delayed for various reasons. From 1998 on, there were several attempts, but the first successful lung transplant was performed at Sureyyapasa Hospital in 2009. Today there are four lung transplant centers in Turkey: two in Istanbul, one in Ankara and another in Izmir. Three lung transplant centers from Istanbul that belong to the private sector have newly applied for a licence from the Ministry of Health.

  6. MRI of the lung

    Energy Technology Data Exchange (ETDEWEB)

    Kauczor, Hans-Ulrich (ed.) [University Clinic Heidelberg (Germany). Diagnostic and Interventional Radiology

    2009-07-01

    For a long time, only chest X-ray and CT were used to image lung structure, while nuclear medicine was employed to assess lung function. During the past decade significant developments have been achieved in the field of magnetic resonance imaging (MRI), enabling MRI to enter the clinical arena of chest imaging. Standard protocols can now be implemented on up-to-date scanners, allowing MRI to be used as a first-line imaging modality for various lung diseases, including cystic fibrosis, pulmonary hypertension and even lung cancer. The diagnostic benefits stem from the ability of MRI to visualize changes in lung structure while simultaneously imaging different aspects of lung function, such as perfusion, respiratory motion, ventilation and gas exchange. On this basis, novel quantitative surrogates for lung function can be obtained. This book provides a comprehensive overview of how to use MRI for imaging of lung disease. Special emphasis is placed on benign diseases requiring regular monitoring, given that it is patients with these diseases who derive the greatest benefit from the avoidance of ionizing radiation. (orig.)

  7. American Lung Association

    Science.gov (United States)

    Our vision is a world free of lung disease. The American Lung Association delivers education, advocacy and research to those affected by lung disease.

  8. Unexpandable lung.

    Science.gov (United States)

    Pereyra, Marco F; Ferreiro, Lucía; Valdés, Luis

    2013-02-01

    Unexpandable lung is a mechanical complication by which the lung does not expand to the chest wall, impeding a normal apposition between the two pleural layers. The main mechanism involved is the restriction of the visceral pleura due to the formation of a fibrous layer along this pleural membrane. This happens because of the presence of an active pleural disease (lung entrapment), which can be resolved if proper therapeutic measures are taken, or a remote disease (trapped lung), in which an irreversible fibrous pleural layer has been formed. The clinical suspicion arises with the presence of post-thoracocentesis hydropneumothorax or a pleural effusion that cannot be drained due to the appearance of thoracic pain. The diagnosis is based on the analysis of the pleural fluid, the determination of pleural pressures as the effusion is drained, and on air-contrast chest CT. As both represent stages of the same continuous process, the results will depend on the time at which these procedures are done. If the necessary therapeutic measures are not taken while a lung is becoming entrapped, the final result will be a trapped lung. In this instance, most patients are asymptomatic or have mild exertional dyspnea and therefore do not require treatment. Nevertheless, in cases of incapacitating dyspnea, it may be necessary to use pleural decortication in order to resolve the symptoms. Copyright © 2012 SEPAR. Published by Elsevier Espana. All rights reserved.

  9. Epidemiology of Lung Cancer.

    Science.gov (United States)

    Schwartz, Ann G; Cote, Michele L

    2016-01-01

    Lung cancer continues to be one of the most common causes of cancer death despite understanding the major cause of the disease: cigarette smoking. Smoking increases lung cancer risk 5- to 10-fold with a clear dose-response relationship. Exposure to environmental tobacco smoke among nonsmokers increases lung cancer risk about 20%. Risks for marijuana and hookah use, and the new e-cigarettes, are yet to be consistently defined and will be important areas for continued research as use of these products increases. Other known environmental risk factors include exposures to radon, asbestos, diesel, and ionizing radiation. Host factors have also been associated with lung cancer risk, including family history of lung cancer, history of chronic obstructive pulmonary disease and infections. Studies to identify genes associated with lung cancer susceptibility have consistently identified chromosomal regions on 15q25, 6p21 and 5p15 associated with lung cancer risk. Risk prediction models for lung cancer typically include age, sex, cigarette smoking intensity and/or duration, medical history, and occupational exposures, however there is not yet a risk prediction model currently recommended for general use. As lung cancer screening becomes more widespread, a validated model will be needed to better define risk groups to inform screening guidelines.

  10. The relationship between ventilatory lung motion and pulmonary perfusion shown by ventilatory lung motion imaging

    International Nuclear Information System (INIS)

    Fujii, Tadashige; Tanaka, Masao; Nakatsuka, Tatsuya; Yoshimura, Kazuhiko; Hirose, Yoshiki; Hirayama, Jiro; Kobayashi, Toshio; Handa, Kenjiro

    1991-01-01

    Using ventilatory lung motion imaging, which was obtained from two perfusion lung scintigrams with 99m Tc-macroaggregated albumin taken in maximal inspiration and maximal expiration, the lung motion (E-I)/I of each unilateral lung was studied in various cardiopulmonary diseases. The sum of (E-I)/I(+) of the unilateral lung was decreased in the diseased lung for localized pleuropulmonary diseases, including primary lung cancer and pleural thickening, and in both lungs for heart diseases and diffuse pulmonary diseases including diffuse interstitial pneumonia and diffuse panbronchiolitis. The sum of (E-I)/I(+) of both lungs, which correlated with vital capacity and PaO 2 , was decreased in diffuse interstitial pneumonia, pulmonary emphysema, diffuse panbronchiolitis, primary lung cancer, pleural diseases and so on. (E-I)/I(+) correlated with pulmonary perfusion (n=49, r=0.51) and correlated more closely with regional ventilation measured by 81m Kr or 133 Xe (n=49, r=0.61, p<0.001) than with pulmonary perfusion. The ventilatory lung motion imaging, which demonstrates the motion of the intra-pulmonary areas and lung edges, appears useful for estimating pulmonary ventilation of the perfused area as well as pulmonary perfusion. (author)
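
    A plausible reading of the motion index is a per-pixel (E-I)/I computed from the expiration and inspiration count images, with the positive values summed over a lung region. The sketch below implements that reading with synthetic data and should be taken as an assumption, not the authors' exact algorithm.

```python
# Hedged sketch of the ventilatory lung-motion index (E-I)/I(+). Arrays are
# synthetic placeholders standing in for the two scintigram count images.
import numpy as np

def motion_index_sum(expiration_counts, inspiration_counts, lung_mask, eps=1e-6):
    """Sum of positive (E-I)/I values over the masked lung region."""
    ratio = (expiration_counts - inspiration_counts) / (inspiration_counts + eps)
    positive = np.clip(ratio, 0, None)
    return positive[lung_mask].sum()

rng = np.random.default_rng(3)
inspiration = rng.poisson(100, size=(64, 64)).astype(float)
# Assumption: per-pixel counts rise at expiration as the lung shrinks.
expiration = rng.poisson(110, size=(64, 64)).astype(float)
mask = np.ones((64, 64), dtype=bool)
print("sum of (E-I)/I(+):", motion_index_sum(expiration, inspiration, mask))
```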

  11. First Danish experience with ex vivo lung perfusion of donor lungs before transplantation

    DEFF Research Database (Denmark)

    Henriksen, Ian Sune Iversen; Møller-Sørensen, Hasse; Møller, Christian Holdfold

    2014-01-01

    INTRODUCTION: The number of lung transplantations is limited by a general lack of donor organs. Ex vivo lung perfusion (EVLP) is a novel method to optimise and evaluate marginal donor lungs prior to transplantation. We describe our experiences with EVLP in Denmark during the first year after its...... introduction. MATERIAL AND METHODS: The study was conducted by prospective registration of donor offers and lung transplantations in Denmark from 1 May 2012 to 30 April 2013. Donor lungs without any contraindications were transplanted in the traditional manner. Taken for EVLP were donor lungs that were...... otherwise considered transplantable, but failed to meet the usual criteria due to possible contusions or because they were from donors with sepsis or unable to pass the oxygenation test. RESULTS: In the study period, seven of 33 Danish lung transplantations were made possible due to EVLP. One patient died...

  12. Angiogenin and vascular endothelial growth factor expression in lungs of lung cancer patients.

    Science.gov (United States)

    Rozman, Ales; Silar, Mira; Kosnik, Mitja

    2012-12-01

    BACKGROUND: Lung cancer is the leading cause of cancer deaths. Angiogenesis is a crucial process in cancer growth and progression. This prospective study evaluated expression of two central regulatory molecules, angiogenin and vascular endothelial growth factor (VEGF), in patients with lung cancer. PATIENTS AND METHODS: Clinical data, blood samples and broncho-alveolar lavage (BAL) from 23 patients with primary lung carcinoma were collected. BAL fluid was taken from the part of the lung with malignancy, and from the corresponding healthy side of the lung. VEGF and angiogenin concentrations were analysed by an enzyme-linked immunosorbent assay. Dilution of bronchial secretions in the BAL fluid was calculated from the urea concentration ratio between serum and BAL fluid. RESULTS: We found no statistical correlation between angiogenin concentrations in serum and in bronchial secretions from either part of the lung. VEGF concentrations were greater in bronchial secretions from the affected side of the lung than from the healthy side. Both were greater than the serum VEGF concentration. The VEGF concentration in serum was positively correlated with tumour size (p = 0.003) and with metastatic stage of disease (p = 0.041). There was a correlation between VEGF and angiogenin concentrations in bronchial secretions from the healthy side of the lung, and between VEGF and angiogenin concentrations in bronchial secretions from the part of the lung with malignancy. CONCLUSION: Angiogenin and VEGF concentrations in systemic, background and local samples of patients with lung cancer are affected by different mechanisms. Pro-angiogenic activity of lung cancer has an important influence on the levels of angiogenin and VEGF.

  13. The reactions of neutral iron clusters with D2O: Deconvolution of equilibrium constants from multiphoton processes

    International Nuclear Information System (INIS)

    Weiller, B.H.; Bechthold, P.S.; Parks, E.K.; Pobo, L.G.; Riley, S.J.

    1989-01-01

    The chemical reactions of neutral iron clusters with D 2 O are studied in a continuous flow tube reactor by molecular beam sampling and time-of-flight mass spectrometry with laser photoionization. Product distributions are invariant to a four-fold change in reaction time demonstrating that equilibrium is attained between free and adsorbed D 2 O. The observed negative temperature dependence is consistent with an exothermic, molecular addition reaction at equilibrium. Under our experimental conditions, there is significant photodesorption of D 2 O (Fe n (D 2 O) m + hν → Fe n + m D 2 O) along with ionization due to absorption of multiple photons from the ionizing laser. Using a simple model based on a rate equation analysis, we are able to quantitatively deconvolute this desorption process from the equilibrium constants. 8 refs., 1 fig

  14. Variable tidal volumes improve lung protective ventilation strategies in experimental lung injury.

    Science.gov (United States)

    Spieth, Peter M; Carvalho, Alysson R; Pelosi, Paolo; Hoehn, Catharina; Meissner, Christoph; Kasper, Michael; Hübler, Matthias; von Neindorff, Matthias; Dassow, Constanze; Barrenschee, Martina; Uhlig, Stefan; Koch, Thea; de Abreu, Marcelo Gama

    2009-04-15

    Noisy ventilation with variable Vt may improve respiratory function in acute lung injury. To determine the impact of noisy ventilation on respiratory function and its biological effects on lung parenchyma compared with conventional protective mechanical ventilation strategies. In a porcine surfactant depletion model of lung injury, we randomly combined noisy ventilation with the ARDS Network protocol or the open lung approach (n = 9 per group). Respiratory mechanics, gas exchange, and distribution of pulmonary blood flow were measured at intervals over a 6-hour period. Postmortem, lung tissue was analyzed to determine histological damage, mechanical stress, and inflammation. We found that, at comparable minute ventilation, noisy ventilation (1) improved arterial oxygenation and reduced mean inspiratory peak airway pressure and elastance of the respiratory system compared with the ARDS Network protocol and the open lung approach, (2) redistributed pulmonary blood flow to caudal zones compared with the ARDS Network protocol and to peripheral ones compared with the open lung approach, (3) reduced histological damage in comparison to both protective ventilation strategies, and (4) did not increase lung inflammation or mechanical stress. Noisy ventilation with variable Vt and fixed respiratory frequency improves respiratory function and reduces histological damage compared with standard protective ventilation strategies.

  15. An experimental two-stage rat model of lung carcinoma initiated by radon exposure

    International Nuclear Information System (INIS)

    Poncy, J.L.; Laroque, P.; Fritsch, P.; Monchaux, G.; Masse, R.; Chameaud, J.

    1992-01-01

    We present the results of a two-stage biological model of lung carcinogenesis in rats. The histogenesis of these tumors was examined, and DNA content of lung cells was measured by flow cytometry during the evolving neoplastic stage. Tumors were induced in rat lungs after radon inhalation (1600 WLM) followed by a promoter treatment: six intramuscular injections of 5,6-benzoflavone (25 mg/kg of body weight/injection) every 2 wk. Less than 3 mo after the first injection of benzoflavone, squamous cell carcinoma was observed in the lungs of all rats exposed to radon. The preneoplastic lesions gradually developed as follows: hyperplastic bronchiolar-type cells migrated to the alveoli from cells that proliferated in bronchioles and alveolar ducts; initial lesions were observed in almost all respiratory bronchioles. From some hyperplasias, epidermoid metaplasias arose distally, forming nodular epidermoid lesions in alveoli, which progressed to form squamous papilloma and, finally, epidermoid carcinomas. The histogenesis of these experimentally induced epidermoid carcinomas showed the bronchioloalveolar origin of the tumor. This factor must be considered when comparing these with human lesions; in humans, lung epidermoid carcinomas are thought to arise mainly in the first bronchial generations. The labeling index of pulmonary tissue after incorporation of 3 H-thymidine by the cells was 0.2% in control rats. This index reached a value of 1 to 2% in the hyperplastic area of the bronchioles and 10 to 15% in epidermoid nodules and epidermoid tumors, respectively. DNA cytometric analysis was performed on cell suspensions obtained after enzymatic treatment of paraffin sections of lungs from rats sacrificed during different stages of neoplastic transformation. Data showed the early appearance of a triploid cell population that grew during the evolution of nodular epidermoid lesions to epidermoid carcinomas

  16. Amebic lung abscess with coexisting lung adenocarcinoma: an unusual case of amebiasis

    OpenAIRE

    Zhu, Hailong; Min, Xiangyang; Li, Shuai; Feng, Meng; Zhang, Guofeng; Yi, Xianghua

    2014-01-01

    Amebic lung abscess with concurrent lung cancer, but without either a liver abscess or amebic colitis, is extremely uncommon. Here, we report a 70-year-old man presenting with pulmonary amebiasis and coexisting lung adenocarcinoma. During his first-time hospitalization, the diagnosis of lung amebiasis was confirmed by morphological observation and PCR in formalin-fixed and paraffin-embedded sediments of pleural effusion. Almost four months later, the patient was readmitted to hospital for sim...

  17. Cervical lung hernia

    Science.gov (United States)

    Lightwood, Robin G.; Cleland, W. P.

    1974-01-01

    Lightwood, R. G., and Cleland, W. P. (1974). Thorax, 29, 349-351. Cervical lung hernia. Lung hernias occur in the cervical position in about one third of cases. The remainder appear through the chest wall. Some lung hernias are congenital, but trauma is the most common cause. The indications for surgery depend upon the severity of symptoms. Repair by direct suture can be used for small tears in Sibson's (costovertebral) fascia while larger defects have been closed using prosthetic materials. Four patients with cervical lung hernia are described together with an account of their operations. PMID:4850946

  18. Seismic interferometry by crosscorrelation and by multidimensional deconvolution: a systematic comparison

    Science.gov (United States)

    Wapenaar, Kees; van der Neut, Joost; Ruigrok, Elmer; Draganov, Deyan; Hunziker, Jürg; Slob, Evert; Thorbecke, Jan; Snieder, Roel

    2011-06-01

    Seismic interferometry, also known as Green's function retrieval by crosscorrelation, has a wide range of applications, ranging from surface-wave tomography using ambient noise, to creating virtual sources for improved reflection seismology. Despite its successful applications, the crosscorrelation approach also has its limitations. The main underlying assumptions are that the medium is lossless and that the wavefield is equipartitioned. These assumptions are in practice often violated: the medium of interest is often illuminated from one side only, the sources may be irregularly distributed, and losses may be significant. These limitations may partly be overcome by reformulating seismic interferometry as a multidimensional deconvolution (MDD) process. We present a systematic analysis of seismic interferometry by crosscorrelation and by MDD. We show that for the non-ideal situations mentioned above, the correlation function is proportional to a Green's function with a blurred source. The source blurring is quantified by a so-called interferometric point-spread function which, like the correlation function, can be derived from the observed data (i.e. without the need to know the sources and the medium). The source of the Green's function obtained by the correlation method can be deblurred by deconvolving the correlation function for the point-spread function. This is the essence of seismic interferometry by MDD. We illustrate the crosscorrelation and MDD methods for controlled-source and passive-data applications with numerical examples and discuss the advantages and limitations of both methods.
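
    The deblurring step can be illustrated in one dimension: the correlation function equals the Green's function convolved with the point-spread function, and dividing their spectra with a water-level regularisation undoes the blur. The sketch below uses synthetic waveforms and illustrates the principle only; it is not a full MDD implementation.

```python
# 1D sketch: crosscorrelation gives a Green's function with a blurred source;
# regularised spectral division by the point-spread function deblurs it.
import numpy as np

rng = np.random.default_rng(4)
n = 1024
dt = 0.004

# "True" Green's function: two arrivals.
green = np.zeros(n)
green[100], green[260] = 1.0, 0.5

# One-sided, irregular illumination produces a non-impulsive point-spread function.
psf = np.convolve(rng.normal(size=64), np.hanning(16), mode="same")
psf = np.r_[psf, np.zeros(n - psf.size)]

# Correlation function = Green's function blurred by the point-spread function.
correlation = np.fft.ifft(np.fft.fft(green) * np.fft.fft(psf)).real

# Deconvolution-style deblurring (water-level regularised spectral division).
G_corr = np.fft.fft(correlation)
G_psf = np.fft.fft(psf)
eps = 0.01 * np.max(np.abs(G_psf)) ** 2
deblurred = np.fft.ifft(G_corr * np.conj(G_psf) / (np.abs(G_psf) ** 2 + eps)).real

print("arrival estimate (crosscorrelation):", np.argmax(np.abs(correlation)) * dt)
print("arrival estimate (deconvolution):  ", np.argmax(np.abs(deblurred)) * dt)
```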

  19. The Azygous Lobe of the Lung: in the Case of Lung Cancer.

    Science.gov (United States)

    Darlong, L M; Ram, Dharma; Sharma, Ashwani; Sharma, Anil Kumar; Iqbal, Sayed Assif; Nagar, Anand; Hazarika, Dibyamohan

    2017-06-01

    The azygous lobe of the lung is an uncommon developmental anomaly. Its surgical importance has rarely been described in the literature. Here, we present a case of lung cancer with an incidental azygous lobe and discuss its surgical relevance during lung cancer surgery.

  20. Evaluation of lung injury induced by pingyangmycin with 99Tcm-HMPAO lung imaging

    International Nuclear Information System (INIS)

    Zhao Changjiu; Yang Zhijie; Fu Peng; Zhang Rui

    2005-01-01

    Objective: To investigate the lung uptake of 99 Tc m -hexamethyl propylene amine oxime (HMPAO) in pingyangmycin-induced lung injury and its mechanism. Methods: 24 white rabbits were randomly divided into 4 groups. Group I: the control with normal diet. In groups II, III and IV, 0.2, 0.3 and 0.5 mg/kg pingyangmycin, respectively, were given via the marginal ear vein every other day. 99 Tc m -HMPAO static lung imaging was performed before and 8, 16, 24, 32 d after injection of pingyangmycin. 7 pixel x 5 pixel regions of interest (ROIs) were drawn on the right lung (R) and right upper limb (B), and R/B ratios were calculated. Also, 2 ml of venous blood was withdrawn for measurement of endothelin by radioimmunoassay. 16 d after pingyangmycin in group IV and 32 d in groups I, II and III, all the rabbits were sacrificed. Both lungs were examined immediately under light and electron microscopy. Results: Compared with the control group, there were statistically significant differences in 99 Tc m -HMPAO lung uptake in groups II, III and IV. Conclusions: 99 Tc m -HMPAO lung imaging can detect early pingyangmycin-induced lung injury. The endothelium of the lung microcapillaries is presumably the main site of abnormal 99 Tc m -HMPAO concentration. (authors)

  1. "Open lung ventilation optimizes pulmonary function during lung surgery".

    Science.gov (United States)

    Downs, John B; Robinson, Lary A; Steighner, Michael L; Thrush, David; Reich, Richard R; Räsänen, Jukka O

    2014-12-01

    We evaluated an "open lung" ventilation (OV) strategy using low tidal volumes, low respiratory rate, low FiO2, and high continuous positive airway pressure in patients undergoing major lung resections. In this phase I pilot study, twelve consecutive patients were anesthetized using conventional ventilator settings (CV) and then OV strategy during which oxygenation and lung compliance were noted. Subsequently, a lung resection was performed. Data were collected during both modes of ventilation in each patient, with each patient acting as his own control. The postoperative course was monitored for complications. Twelve patients underwent open thoracotomies for seven lobectomies and five segmentectomies. The OV strategy provided consistent one-lung anesthesia and improved static compliance (40 ± 7 versus 25 ± 4 mL/cm H2O, P = 0.002) with airway pressures similar to CV. Postresection oxygenation (SpO2/FiO2) was better during OV (433 ± 11 versus 386 ± 15, P = 0.008). All postoperative chest x-rays were free of atelectasis or infiltrates. No patient required supplemental oxygen at any time postoperatively or on discharge. The mean hospital stay was 4 ± 1 d. There were no complications or mortality. The OV strategy, previously shown to have benefits during mechanical ventilation of patients with respiratory failure, proved safe and effective in lung resection patients. Because postoperative pulmonary complications may be directly attributable to the anesthetic management, adopting an OV strategy that optimizes lung mechanics and gas exchange may help reduce postoperative problems and improve overall surgical results. A randomized trial is planned to ascertain whether this technique will reduce postoperative pulmonary complications. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Factors affecting the local control of stereotactic body radiotherapy for lung tumors including primary lung cancer and metastatic lung tumors

    International Nuclear Information System (INIS)

    Hamamoto, Yasushi; Kataoka, Masaaki; Yamashita, Motohiro

    2012-01-01

    The purpose of this study was to identify factors affecting local control of stereotactic body radiotherapy (SBRT) for lung tumors including primary lung cancer and metastatic lung tumors. Between June 2006 and June 2009, 159 lung tumors in 144 patients (primary lung cancer, 128; metastatic lung tumor, 31) were treated with SBRT with 48-60 Gy (mean 50.1 Gy) in 4-5 fractions. Higher doses were given to larger tumors and metastatic tumors in principle. Assessed factors were age, gender, tumor origin (primary vs. metastatic), histological subtype, tumor size, tumor appearance (solid vs. ground glass opacity), maximum standardized uptake value of positron emission tomography using 18 F-fluoro-2-deoxy-D-glucose, and SBRT doses. Follow-up time was 1-60 months (median 18 months). The 1-, 2-, and 3-year local failure-free rates of all lesions were 90, 80, and 77%, respectively. On univariate analysis, metastatic tumors (p<0.0001), solid tumors (p=0.0246), and higher SBRT doses (p=0.0334) were the statistically significant unfavorable factors for local control. On multivariate analysis, only tumor origin was statistically significant (p=0.0027). The 2-year local failure-free rates of primary lung cancer and metastatic lung tumors were 87 and 50%, respectively. A metastatic tumor was the only independently significant unfavorable factor for local control after SBRT. (author)

  3. Diet and lung cancer

    DEFF Research Database (Denmark)

    Fabricius, P; Lange, Peter

    2003-01-01

    Lung cancer is the leading cause of cancer-related deaths worldwide. While cigarette smoking is of key importance, factors such as diet also play a role in the development of lung cancer. MedLine and Embase were searched with diet and lung cancer as the key words. Recently published reviews and large well designed original articles were preferred to form the basis for the present article. A diet rich in fruit and vegetables reduces the incidence of lung cancer by approximately 25%. The reduction is of the same magnitude in current smokers, ex-smokers and never smokers. Supplementation with vitamins A, C and E and beta-carotene offers no protection against the development of lung cancer. On the contrary, beta-carotene supplementation has, in two major randomised intervention trials, resulted in an increased mortality. Smoking remains the leading cause of lung cancer. The adverse effects are only ameliorated to a minor degree by a healthy diet.

  4. Lung Focused Resuscitation at a Specialized Donor Care Facility Improves Lung Procurement Rates.

    Science.gov (United States)

    Chang, Stephanie H; Kreisel, Daniel; Marklin, Gary F; Cook, Lindsey; Hachem, Ramsey; Kozower, Benjamin D; Balsara, Keki R; Bell, Jennifer M; Frederiksen, Christine; Meyers, Bryan F; Patterson, G Alexander; Puri, Varun

    2018-05-01

    Lung procurement for transplantation occurs in approximately 20% of brain dead donors and is a major impediment to wider application of lung transplantation. We investigated the effect of lung protective management at a specialized donor care facility on lung procurement rates from brain dead donors. Our local organ procurement organization instituted a protocol of lung protective management at a freestanding specialized donor care facility in 2008. Brain dead donors from 2001 to 2007 (early period) were compared with those from 2009 to 2016 (current period) for lung procurement rates and other solid-organ procurement rates using a prospectively maintained database. An overall increase occurred in the number of brain dead donors during the study period (early group, 791; late group, 1,333). The lung procurement rate (lung donors/all brain dead donors) improved markedly after the introduction of lung protective management (early group, 157 of 791 [19.8%]; current group, 452 of 1,333 [33.9%]). The overall organ procurement rate (total number of organs procured/donor) also increased during the study period (early group, 3.5 organs/donor; current group, 3.8 organs/donor; p = 0.006). Lung protective management in brain dead donors at a specialized donor care facility is associated with higher lung utilization rates compared with conventional management. This strategy does not adversely affect the utilization of other organs in a multiorgan donor. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  5. Lung cancer in women

    Directory of Open Access Journals (Sweden)

    Barrera-Rodriguez R

    2012-12-01

    Full Text Available Raúl Barrera-Rodriguez,1 Jorge Morales-Fuentes2 1Biochemistry and Environmental Medicine Laboratory, National Institute of Respiratory Disease, 2Lung Cancer Medical Service, National Institute of Respiratory Disease, Tlalpan, Mexico City, Distrito Federal, Mexico Both authors contributed equally to this work. Abstract: Recent biological advances in tumor research provide clear evidence that lung cancer in females is different from that in males. These differences appear to have a direct impact on the clinical presentation, histology, and outcomes of lung cancer. Women are more likely to present with lung adenocarcinoma, tend to receive a diagnosis at an earlier age, and are more likely to be diagnosed with localized disease. Women may also be more predisposed to molecular aberrations resulting from the carcinogenic effects of tobacco, but do not appear to be more susceptible than men to developing lung cancer. The gender differences found in female lung cancer make it mandatory that gender stratification is used in clinical trials in order to improve the survival rates of patients with lung cancer. Keywords: lung cancer, adenocarcinoma, women, genetic susceptibility, genetic differences, tobacco

  6. AUTOMATIC LUNG NODULE SEGMENTATION USING AUTOSEED REGION GROWING WITH MORPHOLOGICAL MASKING (ARGMM) AND FEATURE EXTRACTION THROUGH COMPLETE LOCAL BINARY PATTERN AND MICROSCOPIC INFORMATION PATTERN

    Directory of Open Access Journals (Sweden)

    Senthil Kumar

    2015-04-01

    Full Text Available An efficient Autoseed Region Growing with Morphological Masking (ARGMM) is implemented in this paper on the Lung CT slice to segment the lung nodules, which may be a potential indicator of lung cancer. The segmentation of lung nodules is carried out in this paper through Multi-Thresholding, ARGMM and Level Set Evolution. ARGMM takes twice the time compared to Level Set, but the number of suspected segmented nodules is doubled, which makes sure that no potential cancerous nodules go unnoticed at the earlier stages of diagnosis. It is very important not to panic the patient by reporting the presence of nodules from a lung CT scan, as only 40 percent of nodules may be cancerous. Hence, in this paper an efficient shape and texture analysis is computed to quantitatively describe the segmented lung nodules. The frequency spectrum of the lung nodules is developed and its frequency domain features are computed. The Complete Local Binary Pattern of the lung nodules is computed in this paper by constructing the combined histogram of Sign and Magnitude Local Binary Patterns. The Local Configuration Pattern is also determined in this work for lung nodules to numerically model the microscopic information of the nodule pattern.
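
    A rough sketch of the seeded region-growing idea behind such a pipeline is given below; it is purely illustrative (the tolerance, seed position and structuring element are arbitrary choices, and this is not the authors' ARGMM implementation):

        import numpy as np
        from scipy import ndimage as ndi

        def grow_region(image, seed, tol=150.0):
            # Keep pixels connected to `seed` whose intensity stays within `tol` HU of the
            # seed value, then smooth the mask with a morphological closing ("masking" step).
            candidate = np.abs(image - image[seed]) <= tol
            labels, _ = ndi.label(candidate)
            region = labels == labels[seed]
            return ndi.binary_closing(region, structure=np.ones((3, 3)))

        # Toy example: a bright nodule-like blob on a darker background.
        img = -800.0 * np.ones((64, 64))
        img[20:30, 20:30] = -100.0
        mask = grow_region(img, seed=(25, 25))
        print(mask.sum(), "pixels in the grown region")   # 100 for this synthetic blob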

  7. Lung Cancer

    Science.gov (United States)

    Lung cancer is one of the most common cancers in the world. It is a leading cause of cancer death in men and women in the United States. Cigarette smoking causes most lung cancers. The more cigarettes you smoke per day and ...

  8. Lung cancer after internal alpha-exposure of the lung from incorporated plutonium

    International Nuclear Information System (INIS)

    Mikhail, S.

    2004-01-01

    Several epidemiological studies among workers of the first Russian nuclear complex, Mayak, which produced weapon-grade plutonium, showed a significant increase in lung cancer mortality. The estimated shape of the dose-response was linear with both alpha and gamma dose, but risk coefficients for gamma-exposure are on the edge of the significance level. This study was performed in the cohort of male Mayak nuclear workers initially hired in 1948-1958 with known levels of plutonium exposure. The number of observed lung cancer cases available for analysis in this cohort was 217. The relative risk of death from lung cancer among smokers was 10.7 (5.5-25.2) compared with non-smokers. This is in good correspondence with the results of other studies. The excess relative risk per Gray was 6.3 (4.1-9.7) for internal alpha-exposure and 0.18 (0.01-0.5) for external gamma-exposure. According to a model, this gives 16:112:60:29 cases of lung cancer attributed to background, smoking, internal alpha- and external gamma-exposure, respectively. The relative risks of death from lung cancer were also estimated in a nested case-control study with lung cancer deaths as cases. Controls were selected from the cohort and matched for birth year to account for the trend in lung cancer mortality with time. The analyses with the nested case-control approach gave a relative risk for smoking of 14.7 (6.8-38.9). The relative risk of lung cancer among non-smokers after accumulating 0.34 Gy of alpha-exposure to the lung was 3.7 (1.7-9.0). It should be emphasized that after accumulation of 0.3-0.4 Gy of absorbed dose, a 3-4 fold increase in lung cancer mortality was observed. This dose is very close to the dose which would be produced after intake of plutonium in quantities which are permissible today. (Author)
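
    For orientation, the linear excess-relative-risk figures quoted above can be turned into relative risks directly. The two-line sketch below assumes the simple linear form RR(D) = 1 + ERR_per_Gy x D with the quoted point estimates; it is only an illustration, not the authors' full risk model (which also accounts jointly for smoking and for external gamma dose):

        # Illustrative only: linear excess relative risk, RR(D) = 1 + ERR_per_Gy * D.
        err_alpha_per_gy = 6.3      # internal alpha exposure (point estimate quoted above)
        err_gamma_per_gy = 0.18     # external gamma exposure

        def relative_risk(dose_gy, err_per_gy):
            return 1.0 + err_per_gy * dose_gy

        # ~3.1 at 0.34 Gy of alpha dose, within the reported range of 3.7 (1.7-9.0)
        print(relative_risk(0.34, err_alpha_per_gy))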

  9. The mathematics of a successful deconvolution: a quantitative assessment of mixture-based combinatorial libraries screened against two formylpeptide receptors.

    Science.gov (United States)

    Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia

    2013-05-30

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.

  10. Metastatic tumors of lungs

    International Nuclear Information System (INIS)

    Rozenshtraukh, L.C.; Rybakova, N.I.; Vinner, M.G.

    1987-01-01

    The roentgenologic semiotics of lung metastases and their complications are presented, as well as the peculiarities of lung metastases from tumours of particular primary sites. A table for identifying the primary tumour from the roentgenologic appearance of the lung metastases is given.

  11. Lung cancer - non-small cell

    Science.gov (United States)

    Cancer - lung - non-small cell; Non-small cell lung cancer; NSCLC; Adenocarcinoma - lung; Squamous cell carcinoma - lung ... Research shows that smoking marijuana may help cancer cells grow. But there is no direct link between ...

  12. Dosimetric lung models

    International Nuclear Information System (INIS)

    James, A.C.; Roy, M.

    1986-01-01

    The anatomical and physiological factors that vary with age and influence the deposition of airborne radionuclides in the lung are reviewed. The efficiency with which aerosols deposit in the lung for a given exposure at various ages from birth to adulthood is evaluated. Deposition within the lung is considered in relation to the clearance mechanisms acting in different regions or compartments. The procedure for evaluating dose to sensitive tissues in lung and transfer to other organs that is being considered by the Task Group established by ICRP to review the Lung Model is outlined. Examples of the application of this modelling procedure to evaluate lung dose as a function of age are given, for exposure to radon daughters in dwellings, and for exposure to an insoluble 239 Pu aerosol. The former represents exposure to short-lived radionuclides that deliver relatively high doses to bronchial tissue. In this case, dose rates are marginally higher in children than in adults. Plutonium exposure represents the case where dose is predominantly delivered to respiratory tissue and lymph nodes. In this case, the life-time doses tend to be lower for exposure in childhood. Some of the uncertainties in this modelling procedure are noted

  13. Lung abscess

    International Nuclear Information System (INIS)

    Ha, H.K.; Kang, M.W.; Park, J.M.; Yang, W.J.; Shinn, K.S.; Bahk, Y.W.

    1993-01-01

    Lung abscess was successfully treated with percutaneous drainage in 5 of 6 patients. Complete abscess resolution occurred in 4 patients, partial resolution in one, and no response in one. The duration of drainage ranged from 7 to 18 days (mean 15.5 days) in successful cases. The failure of drainage in one neurologically impaired patient was attributed to persistent aspiration. In 2 patients, concurrent pleural empyema was also cured. CT provided the anatomic details necessary for choosing the puncture site and avoiding puncture of the lung parenchyma. Percutaneous catheter drainage is a safe and effective method for treating lung abscess. (orig.)

  14. Lung function

    International Nuclear Information System (INIS)

    Sorichter, S.

    2009-01-01

    The term lung function is often restricted to the assessment of volume-time curves measured at the mouth. Spirometry includes the assessment of lung volumes which can be mobilised, with the corresponding flow-volume curves. In addition, lung volumes that cannot be mobilised, such as the residual volume, or that can be mobilised only partially, such as FRC and TLC, can be measured by body plethysmography combined with the determination of the airway resistance. Body plethysmography allows the correct positioning of forced breathing manoeuvres on the volume axis, e.g. before and after pharmacotherapy. Adding the CO single-breath transfer factor (T LCO ), which includes the measurement of the ventilated lung volume using He, enables a clear diagnosis of different obstructive, restrictive or mixed ventilatory defects with and without trapped air. Tests of reversibility and provocation, as well as the assessment of inspiratory mouth pressures (PI max , P 0.1 ), help to classify the underlying disorder and to clarify treatment strategies. To complete the diagnostic work-up of disturbances of ventilation, diffusion and/or perfusion, (capillary-)arterial blood gases at rest and under physical strain, sometimes supplemented by ergospirometry, are recommended. Ideally, lung function measurements are complemented by radiological and nuclear medicine techniques. (orig.) [de]

  15. Comparison of lung protective ventilation strategies in a rabbit model of acute lung injury.

    Science.gov (United States)

    Rotta, A T; Gunnarsson, B; Fuhrman, B P; Hernan, L J; Steinhorn, D M

    2001-11-01

    To determine the impact of different protective and nonprotective mechanical ventilation strategies on the degree of pulmonary inflammation, oxidative damage, and hemodynamic stability in a saline lavage model of acute lung injury. A prospective, randomized, controlled, in vivo animal laboratory study. Animal research facility of a health sciences university. Forty-six New Zealand White rabbits. Mature rabbits were instrumented with a tracheostomy and vascular catheters. Lavage-injured rabbits were randomized to receive conventional ventilation with either a) low peak end-expiratory pressure (PEEP; tidal volume of 10 mL/kg, PEEP of 2 cm H2O); b) high PEEP (tidal volume of 10 mL/kg, PEEP of 10 cm H2O); c) low tidal volume with PEEP above Pflex (open lung strategy, tidal volume of 6 mL/kg, PEEP set 2 cm H2O > Pflex); or d) high-frequency oscillatory ventilation. Animals were ventilated for 4 hrs. Lung lavage fluid and tissue samples were obtained immediately after animals were killed. Lung lavage fluid was assayed for measurements of total protein, elastase activity, tumor necrosis factor-alpha, and malondialdehyde. Lung tissue homogenates were assayed for measurements of myeloperoxidase activity and malondialdehyde. The need for inotropic support was recorded. Animals that received a lung protective strategy (open lung or high-frequency oscillatory ventilation) exhibited more favorable oxygenation and lung mechanics compared with the low PEEP and high PEEP groups. Animals ventilated by a lung protective strategy also showed attenuation of inflammation (reduced tracheal fluid protein, tracheal fluid elastase, tracheal fluid tumor necrosis factor-alpha, and pulmonary leukostasis). Animals treated with high-frequency oscillatory ventilation had attenuated oxidative injury to the lung and greater hemodynamic stability compared with the other experimental groups. Both lung protective strategies were associated with improved oxygenation, attenuated inflammation, and

  16. Association Between RT-Induced Changes in Lung Tissue Density and Global Lung Function

    International Nuclear Information System (INIS)

    Ma Jinli; Zhang Junan; Zhou Sumin; Hubbs, Jessica L.; Foltz, Rodney J.; Hollis, Donna R.; Light, Kim L.; Wong, Terence Z.; Kelsey, Christopher R.; Marks, Lawrence B.

    2009-01-01

    Purpose: To assess the association between radiotherapy (RT)-induced changes in computed tomography (CT)-defined lung tissue density and pulmonary function tests (PFTs). Methods and Materials: Patients undergoing incidental partial lung RT were prospectively assessed for global (PFTs) and regional (CT and single photon emission CT [SPECT]) lung function before and, serially, after RT. The percent reductions in the PFT and the average changes in lung density were compared (Pearson correlations) in the overall group and subgroups stratified according to various clinical factors. Comparisons were also made between the CT- and SPECT-based computations using the Mann-Whitney U test. Results: Between 1991 and 2004, 343 patients were enrolled in this study. Of these, 111 patients had a total of 203 concurrent post-RT evaluations of changes in lung density and PFTs available for the analyses, and 81 patients had a total of 141 concurrent post-RT SPECT images. The average increases in lung density were related to the percent reductions in the PFTs, albeit with modest correlation coefficients (range, 0.20-0.43). The analyses also indicated that the association between lung density and PFT changes is essentially equivalent to the corresponding association with SPECT-defined lung perfusion. Conclusion: We found a weak quantitative association between the degree of increase in lung density as defined by CT and the percent reduction in the PFTs.
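
    The comparison itself is a plain Pearson correlation between paired per-patient values; a minimal sketch with hypothetical numbers (not the study data) might be:

        import numpy as np
        from scipy.stats import pearsonr

        # Hypothetical paired observations: mean post-RT increase in lung density (HU)
        # and percent reduction in a pulmonary function test for each patient.
        density_increase = np.array([5.0, 12.0, 3.0, 20.0, 8.0, 15.0, 1.0, 10.0])
        pft_reduction_pct = np.array([2.0, 10.0, 1.0, 18.0, 4.0, 9.0, 0.5, 7.0])

        r, p = pearsonr(density_increase, pft_reduction_pct)
        print(f"Pearson r = {r:.2f}, p = {p:.3f}")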

  17. Radiation Therapy for Lung Cancer

    Science.gov (United States)

    ... is almost always due to smoking. TREATING LUNG CANCER Lung cancer treatment depends on several factors, including the ... org TARGETING CANCER CARE Radiation Therapy for Lung Cancer Lung cancer is the second most common cancer in ...

  18. Use of new spectral analysis methods in gamma spectra deconvolution

    International Nuclear Information System (INIS)

    Pinault, J.L.

    1991-01-01

    A general deconvolution method applicable to X and gamma ray spectrometry is proposed. Using new spectral analysis methods, it is applied to an actual case: the accurate on-line analysis of three elements (Ca, Si, Fe) in a cement plant using neutron capture gamma rays. Neutrons are provided by a low activity (5 μg) 252 Cf source; the detector is a BGO 3 in.x8 in. scintillator. The principle of the methods rests on the Fourier transform of the spectrum. The search for peaks and determination of peak areas are worked out in the Fourier representation, which enables separation of background and peaks and very efficiently discriminates peaks, or elements represented by several peaks. First the spectrum is transformed so that in the new representation the full width at half maximum (FWHM) is independent of energy. Thus, the spectrum is arranged symmetrically and transformed into the Fourier representation. The latter is multiplied by a function in order to transform original Gaussian into Lorentzian peaks. An autoregressive filter is calculated, leading to a characteristic polynomial whose complex roots represent both the location and the width of each peak, provided that the absolute value is lower than unit. The amplitude of each component (the area of each peak or the sum of areas of peaks characterizing an element) is fitted by the weighted least squares method, taking into account that errors in spectra are independent and follow a Poisson law. Very accurate results are obtained, which would be hard to achieve by other methods. The DECO FORTRAN code has been developed for compatible PC microcomputers. Some features of the code are given. (orig.)
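
    The final fitting step described above (peak amplitudes obtained by weighted least squares with Poisson weights, once peak positions and widths are known) can be sketched as follows; the peak positions, widths and amplitudes are invented for the example and this is not the DECO code itself:

        import numpy as np

        channels = np.arange(512)

        def gaussian(x, mu, sigma):
            return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        # Design matrix: one column per peak (or per element) plus a flat background term.
        A = np.column_stack([gaussian(channels, 150, 6),
                             gaussian(channels, 300, 8),
                             np.ones(channels.size)])
        true_amplitudes = np.array([800.0, 300.0, 20.0])
        spectrum = np.random.poisson(A @ true_amplitudes)      # simulated counts

        weights = 1.0 / np.maximum(spectrum, 1)                # Poisson: variance ~ counts
        W = np.diag(weights)
        amplitudes = np.linalg.solve(A.T @ W @ A, A.T @ W @ spectrum)
        print(amplitudes)                                      # close to true_amplitudes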

  19. Genetics Home Reference: lung cancer

    Science.gov (United States)

    ... Additional NIH Resources (3 links): National Cancer Institute: Lung Cancer Overview; National Cancer Institute: Lung Cancer Prevention ...

  20. Estimation of Lung Ventilation

    Science.gov (United States)

    Ding, Kai; Cao, Kunlin; Du, Kaifang; Amelon, Ryan; Christensen, Gary E.; Raghavan, Madhavan; Reinhardt, Joseph M.

    Since the primary function of the lung is gas exchange, ventilation can be interpreted as an index of lung function in addition to perfusion. Injury and disease processes can alter lung function on a global and/or a local level. MDCT can be used to acquire multiple static breath-hold CT images of the lung taken at different lung volumes, or with proper respiratory control, 4DCT images of the lung reconstructed at different respiratory phases. Image registration can be applied to this data to estimate a deformation field that transforms the lung from one volume configuration to the other. This deformation field can be analyzed to estimate local lung tissue expansion, calculate voxel-by-voxel intensity change, and make biomechanical measurements. The physiologic significance of the registration-based measures of respiratory function can be established by comparing to more conventional measurements, such as nuclear medicine or contrast wash-in/wash-out studies with CT or MR. An important emerging application of these methods is the detection of pulmonary function change in subjects undergoing radiation therapy (RT) for lung cancer. During RT, treatment is commonly limited to sub-therapeutic doses due to unintended toxicity to normal lung tissue. Measurement of pulmonary function may be useful as a planning tool during RT planning, may be useful for tracking the progression of toxicity to nearby normal tissue during RT, and can be used to evaluate the effectiveness of a treatment post-therapy. This chapter reviews the basic measures to estimate regional ventilation from image registration of CT images, the comparison of them to the existing golden standard and the application in radiation therapy.
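
    One of the standard registration-based measures mentioned above, local tissue expansion, is usually taken from the Jacobian determinant of the estimated transform; a minimal sketch on a synthetic displacement field (not tied to any particular registration package) is:

        import numpy as np

        # u is a (3, Z, Y, X) displacement field in voxel units; det(I + du/dx) > 1 means
        # local expansion, < 1 means contraction. Here u is a synthetic 5% uniform expansion.
        shape = (20, 20, 20)
        zz, yy, xx = np.meshgrid(*[np.arange(s, dtype=float) for s in shape], indexing="ij")
        u = 0.05 * np.stack([zz, yy, xx])

        grads = [np.gradient(u[i]) for i in range(3)]          # d(u_i)/d(z, y, x)
        J = np.zeros(shape + (3, 3))
        for i in range(3):
            for j in range(3):
                J[..., i, j] = (i == j) + grads[i][j]
        jac_det = np.linalg.det(J)
        print(jac_det.mean())                                  # ~1.05**3 = 1.158, i.e. ~15.8% volume gain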

  1. Deconvolution of Thermal Emissivity Spectra of Mercury to their Endmember Counterparts measured in Simulated Mercury Surface Conditions

    Science.gov (United States)

    Varatharajan, I.; D'Amore, M.; Maturilli, A.; Helbert, J.; Hiesinger, H.

    2017-12-01

    The Mercury Radiometer and Thermal Imaging Spectrometer (MERTIS) payload of ESA/JAXA Bepicolombo mission to Mercury will map the thermal emissivity at wavelength range of 7-14 μm and spatial resolution of 500 m/pixel [1]. Mercury was also imaged at the same wavelength range using the Boston University's Mid-Infrared Spectrometer and Imager (MIRSI) mounted on the NASA Infrared Telescope Facility (IRTF) on Mauna Kea, Hawaii with the minimum spatial coverage of 400-600km/spectra which blends all rocks, minerals, and soil types [2]. Therefore, the study [2] used quantitative deconvolution algorithm developed by [3] for spectral unmixing of this composite thermal emissivity spectrum from telescope to their respective areal fractions of endmember spectra; however, the thermal emissivity of endmembers used in [2] is the inverted reflectance measurements (Kirchhoff's law) of various samples measured at room temperature and pressure. Over a decade, the Planetary Spectroscopy Laboratory (PSL) at the Institute of Planetary Research (PF) at the German Aerospace Center (DLR) facilitates the thermal emissivity measurements under controlled and simulated surface conditions of Mercury by taking emissivity measurements at varying temperatures from 100-500°C under vacuum conditions supporting MERTIS payload. The measured thermal emissivity endmember spectral library therefore includes major silicates such as bytownite, anorthoclase, synthetic glass, olivine, enstatite, nepheline basanite, rocks like komatiite, tektite, Johnson Space Center lunar simulant (1A), and synthetic powdered sulfides which includes MgS, FeS, CaS, CrS, TiS, NaS, and MnS. Using such specialized endmember spectral library created under Mercury's conditions significantly increases the accuracy of the deconvolution model results. In this study, we revisited the available telescope spectra and redeveloped the algorithm by [3] by only choosing the endmember spectral library created at PSL for unbiased model
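
    At its core, this kind of unmixing is a non-negative linear fit of the measured spectrum to the endmember library. The sketch below shows that principle with synthetic stand-in spectra and scipy's non-negative least squares; it is not the algorithm of [3], which is considerably more elaborate:

        import numpy as np
        from scipy.optimize import nnls

        wavelengths = np.linspace(7.0, 14.0, 200)              # micrometres (MERTIS range)
        endmembers = np.vstack([                               # synthetic stand-ins for lab spectra
            0.95 - 0.05 * np.exp(-((wavelengths - 9.0) / 0.4) ** 2),
            0.97 - 0.08 * np.exp(-((wavelengths - 11.5) / 0.6) ** 2),
            0.90 + 0.0 * wavelengths,
        ]).T                                                   # (n_wavelengths, n_endmembers)

        true_fractions = np.array([0.5, 0.3, 0.2])
        measured = endmembers @ true_fractions + np.random.normal(0, 0.002, wavelengths.size)

        fractions, _ = nnls(endmembers, measured)
        fractions /= fractions.sum()                           # normalise to areal fractions
        print(fractions)                                       # ~[0.5, 0.3, 0.2]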

  2. Treatment of intractable interstitial lung injury with alemtuzumab after lung transplantation

    DEFF Research Database (Denmark)

    Kohno, M; Perch, M; Andersen, E

    2011-01-01

    A 44-year-old woman underwent left single-lung transplantation for end-stage emphysema due to α1-antitrypsin deficiency in January 2010. Cyclosporine, azathioprine, and prednisolone were administered for immunosuppression and antithymocyte globulin for induction therapy at the time of transplantation. Routine examination of a lung biopsy, 4 months after transplantation, showed nonspecific, diffuse interstitial inflammation with alveolar septal fibrosis. The patient's clinical status and imaging studies were consistent with nonspecific interstitial pneumonitis, which was considered as signs ..., posttransplant antirejection drug regimen. We have since successfully used alemtuzumab to treat three additional patients who developed interstitial lung injury after lung transplantation; they are also summarized in this report.

  3. Gene Expression Deconvolution for Uncovering Molecular Signatures in Response to Therapy in Juvenile Idiopathic Arthritis.

    Directory of Open Access Journals (Sweden)

    Ang Cui

    Full Text Available Gene expression-based signatures help identify pathways relevant to diseases and treatments, but are challenging to construct when there is a diversity of disease mechanisms and treatments in patients with complex diseases. To overcome this challenge, we present a new application of an in silico gene expression deconvolution method, ISOpure-S1, and apply it to identify a common gene expression signature corresponding to response to treatment in 33 juvenile idiopathic arthritis (JIA) patients. Using pre- and post-treatment gene expression profiles only, we found a gene expression signature that significantly correlated with a reduction in the number of joints with active arthritis, a measure of clinical outcome (Spearman rho = 0.44, p = 0.040, Bonferroni correction). This signature may be associated with a decrease in T-cells, monocytes, neutrophils and platelets. The products of most differentially expressed genes include known biomarkers for JIA such as major histocompatibility complexes and interleukins, as well as novel biomarkers including α-defensins. This method is readily applicable to expression datasets of other complex diseases to uncover shared mechanistic patterns in heterogeneous samples.

  4. Analysis of the deconvolution of the thermoluminescent curve of the zirconium oxide doped with graphite

    International Nuclear Information System (INIS)

    Salas C, P.; Estrada G, R.; Gonzalez M, P.R.; Mendoza A, D.

    2003-01-01

    In this work, we present a mathematical analysis of the behavior of the thermoluminescent (Tl) curve induced by gamma radiation in samples made of zirconium oxide doped with different amounts of graphite. According to the results, gamma radiation induces a Tl curve with two emission maxima located at temperatures of 139 and 250 C, and the area under the curve increases as a function of the exposure time. From the deconvolution analysis of the curve, in accordance with the theory indicating that this behavior must obey a Boltzmann distribution, we found that each of the maxima has a different growth velocity as the exposure time increases. Likewise, we observed that after the irradiation was suspended, each of the maxima decreases with a different velocity. The behaviour observed in the samples is of particular interest because zirconium oxide has attracted the attention of many research groups; this material has demonstrated many applications in thermoluminescent dosimetry and can be used in the quantification of radiation. (Author)
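
    The growth behaviour referred to above is commonly fitted with a Boltzmann-type sigmoid, A(t) = A2 + (A1 - A2) / (1 + exp((t - t0)/dt)). The sketch below fits such a curve to synthetic data with scipy, purely to illustrate the functional form; the parameters are made up and are not those of the ZrO2:C samples:

        import numpy as np
        from scipy.optimize import curve_fit

        def boltzmann(t, a1, a2, t0, dt):
            return a2 + (a1 - a2) / (1.0 + np.exp((t - t0) / dt))

        exposure = np.linspace(0, 120, 13)                                   # exposure time
        signal = boltzmann(exposure, 0.0, 100.0, 50.0, 12.0) \
                 + np.random.normal(0, 2, exposure.size)                     # synthetic peak area

        popt, _ = curve_fit(boltzmann, exposure, signal, p0=[0, 90, 40, 10])
        print(popt)        # recovered (a1, a2, t0, dt) close to the generating values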

  5. 4D PET iterative deconvolution with spatiotemporal regularization for quantitative dynamic PET imaging.

    Science.gov (United States)

    Reilhac, Anthonin; Charil, Arnaud; Wimberley, Catriona; Angelis, Georgios; Hamze, Hasar; Callaghan, Paul; Garcia, Marie-Paule; Boisson, Frederic; Ryder, Will; Meikle, Steven R; Gregoire, Marie-Claude

    2015-09-01

    Quantitative measurements in dynamic PET imaging are usually limited by the poor counting statistics particularly in short dynamic frames and by the low spatial resolution of the detection system, resulting in partial volume effects (PVEs). In this work, we present a fast and easy to implement method for the restoration of dynamic PET images that have suffered from both PVE and noise degradation. It is based on a weighted least squares iterative deconvolution approach of the dynamic PET image with spatial and temporal regularization. Using simulated dynamic [(11)C] Raclopride PET data with controlled biological variations in the striata between scans, we showed that the restoration method provides images which exhibit less noise and better contrast between emitting structures than the original images. In addition, the method is able to recover the true time activity curve in the striata region with an error below 3% while it was underestimated by more than 20% without correction. As a result, the method improves the accuracy and reduces the variability of the kinetic parameter estimates calculated from the corrected images. More importantly it increases the accuracy (from less than 66% to more than 95%) of measured biological variations as well as their statistical detectivity. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
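
    A heavily simplified 1-D stand-in for this kind of iterative least-squares deconvolution with a smoothness penalty (the published method additionally uses the system PSF in 3-D and a temporal regulariser) could look like this:

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def blur(x, sigma=3.0):                                # stands in for the PSF operator H
            return gaussian_filter1d(x, sigma)

        truth = np.zeros(128)
        truth[60:66] = 1.0                                     # a small "hot" structure
        observed = blur(truth) + np.random.normal(0, 0.01, truth.size)

        # Gradient descent on ||observed - H x||^2 / 2 + lam/2 * ||grad x||^2
        x = observed.copy()
        step, lam = 0.5, 0.02
        for _ in range(200):
            residual = observed - blur(x)
            laplacian = np.roll(x, 1) - 2 * x + np.roll(x, -1)
            x = x + step * (blur(residual) + lam * laplacian)  # blur() ~ H^T (symmetric PSF)
        print(observed.max(), x.max())                         # recovered peak is closer to the true 1.0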

  6. Precision cut lung slices as an efficient tool for in vitro lung physio-pharmacotoxicology studies.

    Science.gov (United States)

    Morin, Jean-Paul; Baste, Jean-Marc; Gay, Arnaud; Crochemore, Clément; Corbière, Cécile; Monteil, Christelle

    2013-01-01

    1. We review the specific approaches for lung tissue slice preparation and incubation systems, and the research application fields in which lung slices have proved to be a very efficient alternative to animal experimentation for biomechanical, physiological, pharmacological and toxicological approaches. 2. Focus is placed on air-liquid interface dynamic organ culture systems that allow direct tissue exposure to complex aerosols and that best mimic in vivo lung tissue physiology. 3. A compilation of research applications in the fields of vascular and airway reactivity, mucociliary transport, polyamine transport, xenobiotic biotransformation, chemical toxicology and complex aerosols supports the concept that precision cut lung slices are a very efficient tool maintaining highly differentiated functions similar to the in vivo lung when kept under dynamic organ culture. They have also been successfully used for assessment of lung gene transfer efficiency, for assessment of lung viral infection efficiency, and for studies of tissue preservation media and tissue post-conditioning to optimize lung tissue viability before grafting. 4. Taken all together, the reviewed studies point to a great interest in precision cut lung slices as an efficient and valuable alternative to in vivo lung organ experimentation.

  7. Traumatic lung hernia

    International Nuclear Information System (INIS)

    Rabaza, M. J.; Alcazar, P. P.; Touma, C.

    2001-01-01

    Lung hernia is an uncommon entity that is defined as the protrusion of the lung parenchyma through a defect in the thoracic cavity. It is classified on the basis of its location (cervical, intercostal and diaphragmatic) and etiology (congenital and acquired). Acquired lung hernias can be further grouped as spontaneous, traumatic or pathological, depending on the responsible mechanism. Nearly half of them are secondary to chest trauma, whether penetrating or blunt. We present a case of lung hernia in a patient with penetrating chest trauma. The diagnosis was suspected from the radiographic images and was confirmed by computed tomography. We also review the literature concerning its classification and incidence, diagnostic methods used and treatment. (Author) 9 refs

  8. The mean lung dose (MLD). Predictive criterion for lung damage

    Energy Technology Data Exchange (ETDEWEB)

    Geyer, Peter; Appold, Steffen [Dresden University of Technology (TU Dresden), Clinic and Polyclinic for Radiotherapy and Radiation Oncology, Carl Gustav Carus Medical Faculty, Dresden (Germany); Herrmann, Thomas

    2015-07-15

    The purpose of this work was to prove the validity of the mean lung dose (MLD), widely used in clinical practice to estimate the lung toxicity of a treatment plan, by reevaluating experimental data from mini pigs. A total of 43 mini pigs were irradiated in one of four dose groups (25, 29, 33, and 37 Gy). Two regimens were applied: homogeneous irradiation of the right lung or partial irradiation of both lungs - including parts with lower dose - but with similar mean lung doses. The animals were treated with five fractions with a linear accelerator applying a CT-based treatment plan. The clinical lung reaction (breathing frequency) and morphological changes in CT scans were examined frequently during the 48 weeks after irradiation. A clear dose-effect relationship was found for both regimens of the trial. However, a straightforward relationship between the MLD and the relative number of responders with respect to different grades of increased breathing frequency for both regimens was not found. A morphologically based parameter NTCP{sub lung} was found to be more suitable for this purpose. The dependence of this parameter on the MLD is markedly different for the two regimens. In clinical practice, the MLD can be used to predict lung toxicity of a treatment plan, except for dose values that could lead to severe side effects. In the latter case, limitations to the predictive value of the MLD are possible. Such severe developments of a radiation-induced pneumopathy are better predicted by the NTCP{sub lung} formalism. The predictive advantage of this parameter compared to the MLD appears to lie in the evaluation and comparison of widely differing dose distributions, as in the investigated trial. (orig.)

  9. Deconvolution analysis of sup(99m)Tc-methylene diphosphonate kinetics in metabolic bone disease

    International Nuclear Information System (INIS)

    Knop, J.; Kroeger, E.; Stritzke, P.; Schneider, C.; Kruse, H.P.; Hamburg Univ.

    1981-01-01

    The kinetics of sup(99m)Tc-methylene diphosphonate (MDP) and 47 Ca were studied in three patients with osteoporosis, three patients with hyperparathyroidism, and two patients with osteomalacia. The activities of sup(99m)Tc-MDP were recorded in the lumbar spine, paravertebral soft tissues, and in venous blood samples for 1 h after injection. The results were submitted to deconvolution analysis to determine regional bone accumulation rates. 47 Ca kinetics were analysed by a linear two-compartment model quantitating short-term mineral exchange, exchangeable bone calcium, and calcium accretion. The sup(99m)Tc-MDP accumulation rates were small in osteoporosis, greater in hyperparathyroidism, and greatest in osteomalacia. No correlations were obtained between sup(99m)Tc-MDP bone accumulation rates and the results of 47 Ca kinetics. However, there was a significant relationship between the level of serum alkaline phosphatase and bone accumulation rates (R = 0.71). The lack of correlation with the 47 Ca kinetics might suggest a preferential binding of sup(99m)Tc-MDP to the organic matrix of the bone, as has been suggested by other authors on the basis of experimental and clinical investigations. (orig.)
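
    The deconvolution step itself can be written as the inversion of a lower-triangular convolution matrix built from the blood (input) curve, with a little damping against noise; a minimal synthetic sketch (not the authors' procedure or data) is:

        import numpy as np

        dt = 1.0                                               # minutes
        t = np.arange(0, 60, dt)
        blood = np.exp(-t / 10.0)                              # simplified blood input curve
        impulse_true = 0.05 * np.exp(-t / 40.0)                # tissue impulse response (retention)
        bone = np.convolve(blood, impulse_true)[: t.size] * dt # measured regional bone curve

        A = np.zeros((t.size, t.size))                         # convolution (Toeplitz) matrix
        for i in range(t.size):
            for j in range(i + 1):
                A[i, j] = blood[i - j] * dt

        lam = 1e-2                                             # mild Tikhonov damping
        impulse_est = np.linalg.solve(A.T @ A + lam * np.eye(t.size), A.T @ bone)
        print(impulse_est[0], impulse_true[0])                 # early accumulation rate, ~0.05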

  10. Estimation of gas and tissue lung volumes by MRI: functional approach of lung imaging.

    Science.gov (United States)

    Qanadli, S D; Orvoen-Frija, E; Lacombe, P; Di Paola, R; Bittoun, J; Frija, G

    1999-01-01

    The purpose of this work was to assess the accuracy of MRI for the determination of lung gas and tissue volumes. Fifteen healthy subjects underwent MRI of the thorax and pulmonary function tests [vital capacity (VC) and total lung capacity (TLC)] in the supine position. MR examinations were performed at inspiration and expiration. Lung volumes were measured by a previously validated technique on phantoms. Both individual and total lung volumes and capacities were calculated. MRI total vital capacity (VC(MRI)) was compared with spirometric vital capacity (VC(SP)). Capacities were correlated to lung volumes. Tissue volume (V(T)) was estimated as the difference between the total lung volume at full inspiration and the TLC. No significant difference was seen between VC(MRI) and VC(SP). Individual capacities were well correlated (r = 0.9) to static volume at full inspiration. The V(T) was estimated to be 836+/-393 ml. This preliminary study demonstrates that MRI can accurately estimate lung gas and tissue volumes. The proposed approach appears well suited for functional imaging of the lung.
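
    The tissue-volume estimate is simple bookkeeping once the two volumes are known; with illustrative numbers (litres, not the study values):

        mri_total_volume_insp = 6.8     # lung volume segmented on MRI at full inspiration
        tlc_gas_volume = 6.0            # total lung capacity from pulmonary function testing
        tissue_volume = mri_total_volume_insp - tlc_gas_volume
        print(f"estimated tissue (non-gas) volume: {tissue_volume * 1000:.0f} ml")  # ~800 ml, cf. 836 +/- 393 ml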

  11. Lung PET scan

    Science.gov (United States)

    ... Chest PET scan; Lung positron emission tomography; PET - chest; PET - lung; PET - tumor imaging; ... Grainger & Allison's Diagnostic Radiology: A Textbook of Medical Imaging . 6th ed. Philadelphia, ...

  12. Postoperative complications do not influence the pattern of early lung function recovery after lung resection for lung cancer in patients at risk.

    Science.gov (United States)

    Ercegovac, Maja; Subotic, Dragan; Zugic, Vladimir; Jakovic, Radoslav; Moskovljevic, Dejan; Bascarevic, Slavisa; Mujovic, Natasa

    2014-05-19

    The pattern and factors influencing the lung function recovery in the first postoperative days are still not fully elucidated, especially in patients at increased risk. Prospective study on 60 patients at increased risk who underwent a lung resection for primary lung cancer; inclusion criteria were complete resection and one or more known risk factors in the form of COPD, cardiovascular disorders, advanced age or other comorbidities. Previous myocardial infarction, myocardial revascularization or stenting, cardiac rhythm disorders, arterial hypertension and myocardiopathy determined the increased cardiac risk. The severity of COPD was graded according to GOLD criteria. The trend of the postoperative lung function recovery was assessed by performing spirometry with a portable spirometer. Cardiac comorbidity existed in 55%, and mild and moderate COPD in 20% and 35% of patients, respectively. Measured values of FVC% and FEV1% on postoperative days one, three and seven showed continuous improvement, with a significant difference between the days of measurement, especially between days three and seven. There was no difference in the trend of the lung function recovery between patients with and without postoperative complications. Whilst pO2 decreased during the first three days in a roughly parallel fashion in patients with respiratory complications, in patients with surgical complications and in patients without complications, a slight hypercapnia registered on the first postoperative day was gradually abolished in all groups except in patients with cardiac complications. The extent of the lung resection and postoperative complications do not significantly influence the trend of the lung function recovery after lung resection for lung cancer.

  13. Lung cancer in elderly

    International Nuclear Information System (INIS)

    Wagnerova, M.

    2007-01-01

    Lung cancer is the leading cause of cancer deaths in Europe and the USA. The median age at diagnosis is currently 69 years; however, this is gradually increasing with the aging population. Patients over the age of 70 represent 40% of all patients with non-small cell lung cancer. Age alone has not been found to be a significant prognostic factor in many malignancies, including lung cancer, with performance status and stage being of greater importance. In lung cancer it is also evident that older patients gain equivalent benefit from cancer therapies to that of their younger counterparts. Elderly patients are under-treated in all aspects of their disease course, from histological diagnosis to active therapy with surgical resection, radiotherapy or chemotherapy, irrespective of performance status or co-morbidities. Elderly patients are also underrepresented in lung cancer clinical trials. This review presents the current knowledge about lung cancer in the elderly. (author)

  14. Diet and lung cancer

    DEFF Research Database (Denmark)

    Fabricius, P; Lange, Peter

    2003-01-01

    Lung cancer is the leading cause of cancer-related deaths worldwide. While cigarette smoking is of key importance, factors such as diet also play a role in the development of lung cancer. MedLine and Embase were searched with diet and lung cancer as the key words. Recently published reviews and large well designed original articles were preferred to form the basis for the present article. A diet rich in fruit and vegetables reduces the incidence of lung cancer by approximately 25%. The reduction is of the same magnitude in current smokers, ex-smokers and never smokers. Supplementation with vitamins A, C and E and beta-carotene offers no protection against the development of lung cancer. On the contrary, beta-carotene supplementation has, in two major randomised intervention trials, resulted in an increased mortality. Smoking remains the leading cause of lung cancer. The adverse effects are only ameliorated to a minor degree by a healthy diet.

  15. SARS – Lung Pathology

    Indian Academy of Sciences (India)

    Dry nonproductive cough – may show minimal lung infiltration. Recovery; * Lungs get fluid in bronchi- droplets infective and +ve for virus in culture and PCR. May also have co-infection with chlamydia/metapneumoviruses. Recovery; * Lung tissue destroyed due to ? immunological/cytokine mediated damage-Recovery ...

  16. Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis

    International Nuclear Information System (INIS)

    Greenberg, M.; Ebel, D.S.

    2009-01-01

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ∼15 (micro)m. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 (micro)m/pixel, without the use of oil-based lenses. A full textural analysis on track No.82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No.128, No.129 and No.140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D Deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.
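
    A generic Richardson-Lucy deconvolution with an axially elongated PSF, of the kind used to undo the axial stretching of confocal stacks, can be sketched as below (illustrative only; the Stardust work used its own deconvolution code and measured or computed PSFs):

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, iterations=25):
            estimate = np.full(image.shape, image.mean())
            psf_mirror = psf[::-1, ::-1, ::-1]
            for _ in range(iterations):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = image / np.maximum(blurred, 1e-12)
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate

        # Synthetic example: a point-like feature blurred by a PSF elongated along z (the optical axis).
        z, y, x = np.mgrid[-8:9, -8:9, -8:9]
        psf = np.exp(-(x**2 + y**2) / 2.0 - z**2 / 18.0)
        psf /= psf.sum()
        truth = np.zeros((33, 33, 33))
        truth[16, 16, 16] = 100.0
        blurred = np.maximum(fftconvolve(truth, psf, mode="same"), 0)
        restored = richardson_lucy(blurred, psf)
        print(blurred.max(), restored.max())                   # the restored stack is much sharper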

  17. How to optimize the lung donor.

    Science.gov (United States)

    Sales, Gabriele; Costamagna, Andrea; Fanelli, Vito; Boffini, Massimo; Pugliese, Francesco; Mascia, Luciana; Brazzi, Luca

    2018-02-01

    Over the last two decades, lung transplantation has emerged as the standard of care for patients with advanced and terminal lung disease. Despite the increase in lung transplantation rates, in 2016 the overall mortality while on the waiting list in Italy reached 10%, whereas only 39% of wait-listed patients were successfully transplanted. A number of approaches, including a protective ventilatory strategy, accurate management of fluid balance, and administration of hormonal resuscitation therapy, have been reported to improve lung donor performance before organ retrieval. These approaches, in conjunction with the use of the ex-vivo lung perfusion technique, have contributed to expanding the lung donor pool without affecting the harvest of other organs or the outcomes of lung recipients. However, several issues related to the ex-vivo lung perfusion technique, such as the optimal ventilation strategy, the management of ischemia-reperfusion induced lung injury, the prophylaxis of germ transmission from donor to recipient and the application of targeted pharmacologic therapies to treat specific donor lung injuries, are still to be explored. The main objective of the present review is to summarize the "state-of-the-art" strategies to optimize donor lungs and to present the actual role of ex-vivo lung perfusion in the process of lung transplantation. Moreover, different approaches to the technique reported in the literature and several issues that are under investigation to treat specific donor lung injury will be discussed.

  18. Bilateral lung masses: The same aetiology?

    Directory of Open Access Journals (Sweden)

    C. Damas

    2007-03-01

    Full Text Available The authors describe the case of a 50 year old woman, smoker, healthy until September 2003 when she presented persistent dry cough, fatigue and weight loss. Chest x-ray showed two lung masses, one in the superior right lobe and the other in the lingula of the left lung. The patient underwent TFNA (transthoracic fine needle aspiration) and the cytological result was compatible with small cell lung cancer. Staging procedures identified hepatic lesions, probably secondary. The presence of hepatic metastasis and contralateral lung lesions defined the stage of the disease as disseminated. Chemotherapy with carboplatin and etoposide was started. Six months later the right lesion had decreased but the left lesion had increased. TFNA of this lesion revealed adenocarcinoma. A new treatment was started with vinorelbine and gemcitabine. After four cycles of chemotherapy without any response, the patient underwent radiotherapy of the left lesion. After 28 months of follow up the patient was asymptomatic and able to manage her normal daily routine. Multiple lung cancers can be considered as synchronous or metachronous, depending on the time of diagnosis. Metachronous lesions are the most frequent (50–70% of all cases) and adenocarcinoma the most frequent histological pattern. In this case the disease was at a disseminated stage, which did not suggest a synchronous lung tumour. While the disease was at an advanced stage with poor prognosis at diagnosis, the evolution of the two different lung tumours did not seem to compromise the patient's daily routine.

  19. Interstitial Lung Disease

    Science.gov (United States)

    ... propranolol (Inderal, Innopran), may harm lung tissue. Some antibiotics. Nitrofurantoin (Macrobid, Macrodantin, others) and ethambutol (Myambutol) can cause lung damage. Anti-inflammatory drugs. Certain anti-inflammatory drugs, such as rituximab ( ...

  20. First Danish experience with ex vivo lung perfusion of donor lungs before transplantation.

    Science.gov (United States)

    Henriksen, Ian Sune Iversen; Møller-Sørensen, Hasse; Møller, Christian Holdfold; Zemtsovski, Mikhail; Nilsson, Jens Christian; Seidelin, Casper Tobias; Perch, Michael; Iversen, Martin; Steinbrüchel, Daniel

    2014-03-01

    The number of lung transplantations is limited by a general lack of donor organs. Ex vivo lung perfusion (EVLP) is a novel method to optimise and evaluate marginal donor lungs prior to transplantation. We describe our experiences with EVLP in Denmark during the first year after its introduction. The study was conducted by prospective registration of donor offers and lung transplantations in Denmark from 1 May 2012 to 30 April 2013. Donor lungs without any contraindications were transplanted in the traditional manner. Taken for EVLP were donor lungs that were otherwise considered transplantable, but failed to meet the usual criteria due to possible contusions or because they were from donors with sepsis or unable to pass the oxygenation test. In the study period, seven of 33 Danish lung transplantations were made possible due to EVLP. One patient died of non-EVLP-related causes, but all other recipients were alive with normal graft function at the end of our registration period. All lungs showed an improved PaO2/FiO2 ratio from a median 23.1 kPa (8.8-38.9) within the donor to 58.8 kPa (34.9-76.5) (FiO2 = 1.0) after EVLP, which corresponds to a 155% improved oxygenation. The median time to extubation, time in intensive care unit and the admission period were 1, 7 and 39 days, respectively. In the first year after the introduction of EVLP in Denmark, seven pairs of donor lungs that previously would have been rejected have been transplanted as a result of their improved function. EVLP seems to be a safe way to increase the use of marginal donor lungs. no funding was granted for the present paper. not relevant.
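
    The quoted 155% figure follows directly from the median PaO2/FiO2 values:

        before_kpa, after_kpa = 23.1, 58.8                     # median PaO2/FiO2 before and after EVLP
        improvement_pct = (after_kpa - before_kpa) / before_kpa * 100
        print(f"{improvement_pct:.0f}% improvement in oxygenation")   # ~155%, as reported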

  1. Lung cancer-A global perspective.

    Science.gov (United States)

    McIntyre, Amanda; Ganti, Apar Kishor

    2017-04-01

    Lung cancer is the leading cause of cancer deaths worldwide. While tobacco exposure is responsible for the majority of lung cancers, the incidence of lung cancer in never smokers, especially Asian women, is increasing. There is a global variation in lung cancer biology with EGFR mutations being more common in Asian patients, while Kras mutation is more common in Caucasians. This review will focus on the global variations in lung cancer and its treatment. © 2017 Wiley Periodicals, Inc.

  2. Arterioscanning of lungs

    International Nuclear Information System (INIS)

    Petrovskij, B.V.; Rabkin, I.Kh.; Matevosov, A.L.

    1980-01-01

    Lung microcirculation was studied by introducing radioactive albumin (MAA 131 I introduced through a catheter) into the bronchial vessels. The arterioscanning technique and its peculiarities are described in detail. It is established that the results of arterioscanning must be evaluated taking into account the nature of MAA 131 I distribution and fixation, the counting rate and the duration of radioactivity registration in the region of pathologic neoplasms. It is shown that arterioscanning permits visualization of vessels of 20-80 μm diameter. This method can be one of the most important in the early diagnosis of lung cancer. Data on the diagnostic effectiveness of bronchial arteriography and arterioscanning of the lung in cases of chronic inflammatory diseases, tuberculosis and some benign lung tumours and neoplasms are also presented.

  3. Dutch Lung Surgery Audit: A National Audit Comprising Lung and Thoracic Surgery Patients.

    Science.gov (United States)

    Berge, Martijn Ten; Beck, Naomi; Heineman, David Jonathan; Damhuis, Ronald; Steup, Willem Hans; van Huijstee, Pieter Jan; Eerenberg, Jan Peter; Veen, Eelco; Maat, Alexander; Versteegh, Michel; van Brakel, Thomas; Schreurs, Wilhemina Hendrika; Wouters, Michel Wilhelmus

    2018-04-21

    The nationwide Dutch Lung Surgery Audit (DLSA) started in 2012 to monitor and evaluate the quality of lung surgery in the Netherlands as an improvement tool. This outline describes the establishment, structure and organization of the audit by the Dutch Society of Lung Surgeons (NVvL) and the Dutch Society of Cardiothoracic Surgeons (NVT), in collaboration with the Dutch Institute for Clinical Auditing (DICA). In addition, first four-year results are presented. The NVvL and NVT initiated a web-based registration including weekly updated online feedback for participating hospitals. Data verification by external data managers is performed on regular basis. The audit is incorporated in national quality improvement programs and participation in the DLSA is mandatory by health insurance organizations and the National Healthcare Inspectorate. Between 1 January 2012 and 31 December 2015, all hospitals performing lung surgery participated and a total of 19,557 patients were registered from which almost half comprised lung cancer patients. Nationwide the guideline adherence increased over the years and 96.5% of lung cancer patients were discussed in preoperative multidisciplinary teams. Overall postoperative complications and mortality after non-small cell lung cancer surgery were 15.5% and 2.0%, respectively. The audit provides reliable benchmarked information for caregivers and hospital management with potential to start local, regional or national improvement initiatives. Currently, the audit is further completed with data from non-surgical lung cancer patients including treatment data from pulmonary oncologists and radiation oncologists. This will ultimately provide a comprehensive overview of lung cancer treatment in The Netherlands. Copyright © 2018. Published by Elsevier Inc.

  4. Genetic Variation in GSTP1, Lung Function, Risk of Lung Cancer, and Mortality

    DEFF Research Database (Denmark)

    Nørskov, Marianne S.; Dahl, Morten; Tybjærg-Hansen, Anne

    2017-01-01

    Introduction: Glutathione S-transferase pi 1 metabolizes carcinogens from tobacco smoke in the lung. We tested whether genetically altered glutathione S-transferase pi 1 activity affects lung function and risk for tobacco-related cancer and mortality in the general population. Methods: We genotyped 66,069 individuals from the white general population for two common functional variants in the glutathione S-transferase pi 1 gene (GSTP1), amino acid isoleucine 105 changed to a valine (Ile105Val) and amino acid alanine 114 changed to a valine (Ala114Val), and recorded lung function, lung cancer, tobacco-related cancer, and death as outcomes. Results: Lung function was increased stepwise with the Ile105Val genotype overall.

  5. Prediction of residual lung function after lung surgery, and examination of blood perfusion in the pre- and postoperative lung using three-dimensional SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Shimatani, Shinji [Toho Univ., Tokyo (Japan). School of Medicine

    2001-01-01

    In order to predict postoperative pulmonary function after lung surgery, preoperative {sup 99m}Tc-macroaggregated albumin (MAA) lung perfusion scans with single-photon emission computed tomography (SPECT) were performed. Spirometry was also performed before and 4-6 months after surgery in 40 patients. In addition, changes in blood perfusion in the pre- and postoperative lung were examined by postoperative lung perfusion scans in 18 of the 40 patients. We measured the three-dimensional (3-D) imaging volume of the operative and contralateral lungs using the volumes rendering method at blood perfusion thresholds of 20, 50 and 75%, utilizing {sup 99m}Tc-MAA lung perfusion, and predicted pulmonary function by means of the measured volumes. We examined the correlation between predicted and the measured values of postoperative pulmonary function, forced vital capacity (FVC) and forced expiratory volume in one second (FEV{sub 1.0}). The correlation between FEV{sub 1.0} predicted by SPECT (threshold 50%) and measured postoperative lung function resembled that between lung function predicted by the standard planar method and measured FEV{sub 1.0} in the lobectomy group. We then examined the ratios of both pre- and postoperative blood perfusion volumes obtained using 3-D imaging at lung perfusion threshold ranges of 10% each (PV20-29, PV30-39) to pre- and postoperative total perfusion (PV20-100). In the lobectomy group, the postoperative PV20-29/PV20-100 value was significantly higher for the operative side lung than the preoperative PV20-29/PV20-100 value, and the postoperative PV50-59, 60-69, 70-79, 80-89 and 90-100/PV20-100 values were significantly lower than the respective preoperative values. However, in the contralateral lung, the respective pre- and postoperative PV/PV20-100 values were almost identical. These findings suggest that the rate of low blood perfusion increased while the rate of middle to high perfusion decreased in the lobectomy group in the operative
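
    The underlying prediction principle is the usual perfusion-weighted scaling of the preoperative value; the sketch below shows that simple form with made-up numbers, not the 3-D volume-rendering variant evaluated in the study:

        preop_fev1_l = 2.4                   # preoperative FEV1 (litres), illustrative
        resected_perfusion_fraction = 0.18   # fraction of total 99mTc-MAA counts in the resected region
        predicted_postop_fev1 = preop_fev1_l * (1.0 - resected_perfusion_fraction)
        print(f"predicted postoperative FEV1: {predicted_postop_fev1:.2f} L")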

  6. [Lung abscess which needed to be distinguished from lung cancer; report of a case].

    Science.gov (United States)

    Kamiya, Kazunori; Yoshizu, Akira; Misumi, Yuki; Hida, Naoya; Okamoto, Hiroaki; Yoshida, Sachiko

    2011-12-01

    Differential diagnosis of lung abscess from lung cancer is sometimes difficult. In February 2009, a 57-year-old man consulted our hospital complaining of bloody sputum. Chest computed tomography (CT) demonstrated a 2.5 cm nodule with pleural indentation, spicula and vascular involvement in the right S(3). Bronchofiberscopy could not establish a definitive diagnosis. Blood tests showed no abnormality. Three months later, progression of the nodule to the adjacent middle lobe was demonstrated by follow-up CT, and F-18 fluorodeoxyglucose positron emission tomography (FDG-PET) showed isotope accumulation in the nodule and hilar lymph node. A diagnosis of lung cancer was suspected and surgery was performed. A diagnosis of possible lung cancer was made by needle biopsy, and the patient underwent right upper lobectomy and partial resection of the middle lobe with standard nodal dissection. The final pathological diagnosis was lung abscess. Lung abscess must be kept in mind as a possible differential diagnosis when an abnormal shadow suggestive of lung cancer is observed.

  7. Quantitative computed tomography of lung parenchyma in patients with emphysema: analysis of higher-density lung regions

    Science.gov (United States)

    Lederman, Dror; Leader, Joseph K.; Zheng, Bin; Sciurba, Frank C.; Tan, Jun; Gur, David

    2011-03-01

    Quantitative computed tomography (CT) has been widely used to detect and evaluate the presence (or absence) of emphysema by applying density masks at specific thresholds, e.g., -910 or -950 Hounsfield Units (HU). However, it has also been observed that subjects with similar density-mask based emphysema scores can have varying lung function, possibly indicating differences in disease severity. To assess this possible discrepancy, we investigated whether the density distribution of "viable" lung parenchyma regions with pixel values > -910 HU correlates with lung function. A dataset of 38 subjects, who underwent both pulmonary function testing and CT examinations in a COPD SCCOR study, was assembled. After the lung regions depicted on CT images were automatically segmented by a computerized scheme, we systematically divided the lung parenchyma into different density groups (bins) and computed a number of statistical features (i.e., mean, standard deviation (STD), and skewness of the pixel value distributions) in these density bins. We then analyzed the correlations between each feature and lung function. A correlation was observed between diffusion lung capacity (DLCO) and the STD of pixel values in the density bins of the > -910 HU lung parenchyma, which indicates that, similar to the conventional density mask method, the pixel value distribution features in "viable" lung parenchyma areas may also provide clinically useful information to improve assessments of lung disease severity as measured by lung function tests.
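
    A minimal sketch of the kind of analysis described above, assuming `hu` is a 1-D array of HU values from the segmented lung: a density-mask emphysema score plus mean/STD/skewness features per density bin. Thresholds, bin edges and data are illustrative.

```python
# Illustrative density-mask score and per-bin statistics for "viable" lung
# parenchyma (> -910 HU); thresholds, bin edges and data are assumptions.
import numpy as np
from scipy.stats import skew

def emphysema_score(hu, threshold=-910):
    """Fraction of lung voxels at or below the density-mask threshold."""
    return float(np.mean(hu <= threshold))

def bin_features(hu, edges=(-910, -860, -810, -760, -710)):
    """(mean, STD, skewness) of the pixel values in each density bin."""
    feats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        vals = hu[(hu > lo) & (hu <= hi)]
        feats.append((float(np.mean(vals)), float(np.std(vals)), float(skew(vals))))
    return feats

hu = np.random.normal(-860, 40, 100_000)   # synthetic segmented-lung HU values
print(emphysema_score(hu))
print(bin_features(hu)[0])                 # features of the -910 to -860 HU bin
```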

  8. Anisotropic strain in YBa2Cu3O7-δ films analysed by deconvolution of two-dimensional intensity data

    International Nuclear Information System (INIS)

    Broetz, J.; Fuess, H.

    2001-01-01

    The influence of the instrumental resolution on two-dimensional reflection profiles of epitaxic YBa 2 Cu 3 O 7-δ films on SrTiO 3 (001) has been studied in order to investigate the strain in the superconducting films. The X-ray diffraction intensity data were obtained by two-dimensional scans in reciprocal space (q-scan). Since the reflection broadening caused by the apparatus differs for each position in reciprocal space, a highly crystalline substrate was used as a standard. Thus it was possible to measure a standard very close to the YBa 2 Cu 3 O 7-δ reflections in reciprocal space. The two-dimensional deconvolution of reflections by a new computer program revealed an anisotropic strain of the two twinning systems of the film. (orig.)
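
    The abstract does not describe the program itself; the sketch below shows one generic way such a two-dimensional deconvolution could be done, a Wiener-style division in Fourier space using the measured substrate reflection as the resolution function.

```python
# Generic 2-D deconvolution sketch (not the authors' program): Wiener-style
# division in Fourier space. `measured` and `resolution` are 2-D arrays on the
# same reciprocal-space grid, with the resolution peak centred on the grid.
import numpy as np

def wiener_deconvolve_2d(measured, resolution, eps=1e-3):
    R = np.fft.fft2(np.fft.ifftshift(resolution / resolution.sum()))
    M = np.fft.fft2(measured)
    estimate = M * np.conj(R) / (np.abs(R) ** 2 + eps)   # eps regularizes noise amplification
    return np.real(np.fft.ifft2(estimate))
```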

  9. /sup 67/Ga lung scan

    Energy Technology Data Exchange (ETDEWEB)

    Niden, A.H.; Mishkin, F.S.; Khurana, M.M.L.; Pick, R.

    1977-03-21

    Twenty-three patients with clinical signs of pulmonary embolic disease and lung infiltrates were studied to determine the value of gallium citrate /sup 67/Ga lung scan in differentiating embolic from inflammatory lung disease. In 11 patients without angiographically proved embolism, only seven had corresponding ventilation-perfusion defects compatible with inflammatory disease. In seven of these 11 patients, the /sup 67/Ga concentration indicated inflammatory disease. In the 12 patients with angiographically proved embolic disease, six had corresponding ventilation-perfusion defects compatible with inflammatory disease. None had an accumulation of /sup 67/Ga in the area of pulmonary infiltrate. Thus, ventilation-perfusion lung scans are of limited value when lung infiltrates are present. In contrast, the accumulation of /sup 67/Ga in the lung indicates an inflammatory process. Gallium imaging can help select those patients with lung infiltrates who need angiography.

  10. The Danish Lung Cancer Registry

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Rasmussen, Torben Riis

    2016-01-01

    AIM OF DATABASE: The Danish Lung Cancer Registry (DLCR) was established by the Danish Lung Cancer Group. The primary and first goal of the DLCR was to improve survival and the overall clinical management of Danish lung cancer patients. STUDY POPULATION: All Danish primary lung cancer patients since...... 2000 are included into the registry and the database today contains information on more than 50,000 cases of lung cancer. MAIN VARIABLES: The database contains information on patient characteristics such as age, sex, diagnostic procedures, histology, tumor stage, lung function, performance...... the results are commented for local, regional, and national audits. Indicator results are supported by descriptive reports with details on diagnostics and treatment. CONCLUSION: DLCR has since its creation been used to improve the quality of treatment of lung cancer in Denmark and it is increasingly used...

  11. Relationship between radiation dose and lung function in patients with lung cancer receiving radiotherapy

    International Nuclear Information System (INIS)

    Harsaker, V.; Dale, E.; Bruland, O.S.; Olsen, D.R.

    2003-01-01

    In patients with inoperable non-small cell lung cancer (NSCLC), radical radiotherapy is the treatment of choice. The dose is limited by consequential pneumonitis and lung fibrosis. Hence, a better understanding of the relationship between the dose-volume distributions and normal tissue side effects is needed. CT is a non-invasive method to monitor the development of fibrosis and pneumonitis, and spirometry is an established tool to measure lung function. NSCLC patients were included in a multicenter trial and treated with megavoltage conformal radiotherapy. In a subgroup comprising 16 patients, a total dose of 59-63 Gy with 1.8-1.9 Gy per fraction was given. Dose-volume histograms were calculated and corrected according to the linear-quadratic formula using alpha/beta=3 Gy. The patients underwent repetitive CT examinations (mean follow-up, 133 days) following radiotherapy, and pre- and post-treatment spirometry (mean follow-up, 240 days). A significant correlation was demonstrated between local lung dose and changes in CT numbers >30 days after treatment. Above 40 Gy there was a sudden increase in CT numbers at 70-90 days. Somewhat unexpectedly, the highest mean lung doses were found in patients with the least reductions in lung function (peak expiratory flow; p<0.001). The correlation between CT numbers, radiation dose and time after treatment shows that CT may be used to monitor the development of lung fibrosis/pneumonitis after radiotherapy for lung cancer. Paradoxically, the patients with the highest mean lung doses experienced the minimum deterioration of lung function. This may be explained by reduction in the volume of existing tumour masses obstructing the airways, leading to relief of symptoms. This finding stresses the role of radiotherapy for lung cancer, especially where the treatment aim is palliative
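
    The linear-quadratic correction mentioned above is commonly applied bin by bin to the dose-volume histogram; a minimal sketch of such a conversion to the equivalent dose in 2 Gy fractions (EQD2) with alpha/beta = 3 Gy is given below (the study's exact correction formula is not reproduced here).

```python
# Assumed LQ conversion to the equivalent dose in 2 Gy fractions (EQD2).
def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy=3.0):
    return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

# Example: 60 Gy delivered in 1.8 Gy fractions with alpha/beta = 3 Gy.
print(round(eqd2(60.0, 1.8), 1))  # -> 57.6
```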

  12. Two unusual cases of brain metastases from lung primary malignant melanoma

    International Nuclear Information System (INIS)

    Rodríguez, A.; Mañana, G.; Panuncio, A.; Rodríguez, R.; Roldán, G.; Sosa, A.

    2004-01-01

    Two cases of brain metastases from primary lung melanoma, diagnosed in the Neuropathology Laboratory of the Department of Pathological Anatomy, Institute of Neurology, Hospital de Clinicas, Montevideo, are presented, with emphasis on the pathological diagnostic criteria and the clinical evolution. At the initial consultation both patients presented with a single brain lesion, amelanotic in one case and pigmented in the other. In both cases, metastatic carcinoma was ruled out on morphology and with the use of complementary techniques. The main differential diagnosis of such lesions is between a primary brain tumor, pigmented or not, and a secondary melanin-containing tumor, i.e. metastatic malignant melanoma. Both patients had been investigated for lung disease: one had an unresectable lung lesion, while the other had a single pulmonary nodule that was resected in its entirety. The pulmonary lesions corresponded to malignant melanoma, one with abundant pigment and the other largely amelanotic with only a few pigment-retaining areas. Dermatological examination ruled out a primary cutaneous malignant melanoma, and other primary locations were also excluded

  13. Eosinophilic Lung Disorders

    Science.gov (United States)

    ... problems characterized by having an increased number of eosinophils (white blood cells) in the lungs. These white ... category of pneumonias that feature increased numbers of eosinophils in the lung tissue. Pneumonia is an inflammatory ...

  14. Flock worker's lung: chronic interstitial lung disease in the nylon flocking industry.

    Science.gov (United States)

    Kern, D G; Crausman, R S; Durand, K T; Nayer, A; Kuhn, C

    1998-08-15

    Two young men working at a nylon flocking plant in Rhode Island developed interstitial lung disease of unknown cause. Similar clusters at the same company's Canadian plant were reported previously. To define the extent, clinicopathologic features, and potential causes of the apparent disease outbreak. Case-finding survey and retrospective cohort study. Academic occupational medicine program. All workers employed at the Rhode Island plant on or after 15 June 1990. Symptomatic employees had chest radiography, pulmonary function tests, high-resolution computed tomography, and serologic testing. Those with unexplained radiographic or pulmonary function abnormalities underwent bronchoalveolar lavage, lung biopsy, or both. The case definition of "flock worker's lung" required histologic evidence of interstitial lung disease (or lavage evidence of lung inflammation) not explained by another condition. Eight cases of flock worker's lung were identified at the Rhode Island plant. Three cases were characterized by a high proportion of eosinophils (25% to 40%) in lavage fluid. Six of the seven patients who had biopsy had histologic findings of nonspecific interstitial pneumonia, and the seventh had bronchiolitis obliterans organizing pneumonia. All seven of these patients had peribronchovascular interstitial lymphoid nodules, usually with germinal centers, and most had lymphocytic bronchiolitis and interstitial fibrosis. All improved after leaving work. Review of the Canadian tissue specimens showed many similar histologic findings. Among the 165-member study cohort, a 48-fold or greater increase was seen in the sex-adjusted incidence rate of all interstitial lung disease. Work in the nylon flocking industry poses substantial risk for a previously unrecognized occupational interstitial lung disease. Nylon fiber is the suspected cause of this condition.

  15. Lung scintigraphy; Centellograma pulmonar

    Energy Technology Data Exchange (ETDEWEB)

    Dalenz, Roberto

    1994-12-31

    A review of lung scintigraphy covering perfusion scintigraphy with SPECT, lung ventilation SPECT and blood pool SPECT. The procedure for lung perfusion studies is described, including the radiopharmaceutical, its administration, clinical applications and image processing, together with the results encountered and the evaluation criteria of Biello and PIOPED. Recommendations and general considerations on the relation of this radiopharmaceutical to other pathologies are also presented.

  16. Donor Lung Procurement by Surgical Fellow with an Expectation of High Rate of Lung Utilisation.

    Science.gov (United States)

    Smail, Hassiba; Saxena, Pankaj; Wallinder, Andreas; Lin, Enjarn; Snell, Gregory I; Hobson, Jamie; Zimmet, Adam D; Marasco, Silvana F; McGiffin, David C

    2017-12-22

    There is an ever increasing demand for donor lungs in patients waiting for transplantation. Lungs of many potential donors will be rejected if the standard criteria for donor assessment are followed. We have expanded our donor lung pool by accepting marginal donors and establishing a donation after circulatory death program. We have achieved comparable results using marginal donors and accepting donor lungs following donation after circulatory death. We present our assessment and technical guidelines on lung procurement taking into consideration an increasingly complex cohort of lung donors. These guidelines form the basis of the lung procurement training program involving surgical Fellows at the Alfred Hospital in Melbourne, Australia. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  17. Nuclear magnetic resonance relaxation times for human lung cancer and lung tissues

    International Nuclear Information System (INIS)

    Matsuura, Yoshifumi; Shioya, Sumie; Kurita, Daisaku; Ohta, Takashi; Haida, Munetaka; Ohta, Yasuyo; Suda, Syuichi; Fukuzaki, Minoru.

    1994-01-01

    We investigated the nuclear magnetic resonance (NMR) relaxation times, T1 and T2, for lung cancer tissue, and other samples of lung tissue obtained from surgical specimens. The samples were nine squamous cell carcinomas, five necrotic squamous cell carcinomas, 15 adenocarcinomas, two benign mesotheliomas, and 13 fibrotic lungs. The relaxation times were measured with a 90 MHz NMR spectrometer and the results were correlated with histological changes. The values of T1 and T2 for squamous cell carcinoma and mesothelioma were significantly longer than those of adenocarcinoma and fibrotic lung tissue. There were no significant differences in values of T1 and T2 between adenocarcinoma and lung tissue. The values of T1 and T2 for benign mesothelioma were similar to those of squamous cell carcinoma, which suggested that increases in T1 and T2 are not specific to malignant tissues. (author)

  18. Mesenchymal Stem Cells Adopt Lung Cell Phenotype in Normal and Radiation-induced Lung Injury Conditions.

    Science.gov (United States)

    Maria, Ola M; Maria, Ahmed M; Ybarra, Norma; Jeyaseelan, Krishinima; Lee, Sangkyu; Perez, Jessica; Shalaby, Mostafa Y; Lehnert, Shirley; Faria, Sergio; Serban, Monica; Seuntjens, Jan; El Naqa, Issam

    2016-04-01

    Lung tissue exposure to ionizing irradiation can invariably occur during the treatment of a variety of cancers, leading to an increased risk of radiation-induced lung disease (RILD). Mesenchymal stem cells (MSCs) possess the potential to differentiate into epithelial cells. However, cell culture methods of primary type II pneumocytes are slow and cannot provide a sufficient number of cells to regenerate damaged lungs. Moreover, effects of ablative radiation doses on the ability of MSCs to differentiate in vitro into lung cells have not been investigated yet. Therefore, an in vitro coculture system was used, where MSCs were physically separated from dissociated lung tissue obtained from either healthy rats or rats given high ablative whole-thorax irradiation doses of 16 or 20 Gy. Around 10±5% and 20±3% of cocultured MSCs demonstrated a change into lung-specific Clara and type II pneumocyte cells when MSCs were cocultured with healthy lung tissue. Interestingly, in cocultures with irradiated lung biopsies, the percentage of MSCs that changed into Clara and type II pneumocyte cells increased to 40±7% and 50±6% at the 16 Gy irradiation dose and 30±5% and 40±8% at the 20 Gy irradiation dose, respectively. These data suggest that MSC-to-lung-cell differentiation is possible without cell fusion. In addition, 16 and 20 Gy whole-thorax irradiation doses, which can cause varying levels of RILD, induced different percentages of MSCs to adopt a lung cell phenotype compared with healthy lung tissue, providing an encouraging outlook for RILD therapeutic intervention with ablative radiotherapy prescriptions.

  19. Statistical lung model for microdosimetry

    International Nuclear Information System (INIS)

    Fisher, D.R.; Hadley, R.T.

    1984-03-01

    To calculate the microdosimetry of plutonium in the lung, a mathematical description is needed of lung tissue microstructure that defines source-site parameters. Beagle lungs were expanded using a glutaraldehyde fixative at 30 cm water pressure. Tissue specimens, five microns thick, were stained with hematoxylin and eosin then studied using an image analyzer. Measurements were made along horizontal lines through the magnified tissue image. The distribution of air space and tissue chord lengths and locations of epithelial cell nuclei were recorded from about 10,000 line scans. The distribution parameters constituted a model of lung microstructure for predicting the paths of random alpha particle tracks in the lung and the probability of traversing biologically sensitive sites. This lung model may be used in conjunction with established deposition and retention models for determining the microdosimetry in the pulmonary lung for a wide variety of inhaled radioactive materials
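
    A toy Monte Carlo sketch of how measured chord-length distributions can feed such a model: alternate air-space and tissue chords, here assumed exponential with placeholder mean lengths, are laid along straight alpha tracks and the tissue fraction of each track is tallied. This is illustrative only and not the authors' model.

```python
# Toy Monte Carlo (not the authors' model): alternate air-space and tissue
# chords, assumed exponential with placeholder means, laid along straight
# alpha tracks of fixed length; the mean tissue fraction per track is returned.
import random

def mean_tissue_fraction(track_length_um=40.0, mean_air_um=60.0,
                         mean_tissue_um=10.0, n_tracks=50_000, seed=1):
    random.seed(seed)
    total = 0.0
    for _ in range(n_tracks):
        pos, tissue = 0.0, 0.0
        # start phase chosen with probability proportional to the mean chord lengths
        in_tissue = random.random() < mean_tissue_um / (mean_tissue_um + mean_air_um)
        while pos < track_length_um:
            mean = mean_tissue_um if in_tissue else mean_air_um
            chord = min(random.expovariate(1.0 / mean), track_length_um - pos)
            if in_tissue:
                tissue += chord
            pos += chord
            in_tissue = not in_tissue
        total += tissue / track_length_um
    return total / n_tracks

print(round(mean_tissue_fraction(), 3))
```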

  20. Telomerase in lung cancer diagnostics

    International Nuclear Information System (INIS)

    Kovkarova, E.; Stefanovski, T.; Dimov, A.; Naumovski, J.

    2003-01-01

    Background. Telomerase is a ribonucleoprotein that maintains the telomeric caps of linear chromosomes by preserving their length. It is overexpressed in tumour tissues, but not in normal somatic cells. Therefore, the aim of this study was to determine telomerase activity in lung cancer patients as a novel marker for lung cancer detection, evaluating the influence of the tissue/cell sampling technique. Material and methods. Using the TRAP (telomeric repeat amplification protocol) assay, telomerase activity was determined in material obtained from bronchial biopsy (60 lung cancer patients compared with 20 controls) and from washings from transthoracic fine needle aspiration biopsy performed in 10 patients with peripheral lung tumours. Results. Telomerase activity was detected in 75% of the lung cancer bronchial biopsies, and in 100% of transthoracic needle washings. Conclusions. Measurement of telomerase activity can contribute to the diagnosis of lung masses and nodules suspicious for lung cancer. (author)

  1. Measurement of asbestos bodies in lung tissue of autopsy cases diagnosed with primary lung cancer

    International Nuclear Information System (INIS)

    Idei, Yuka; Kamada, Satoe; Matsumoto, Shoji; Ohnishi, Kazuo; Kitazawa, Riko; Kitazawa, Sohei

    2007-01-01

    To investigate the relation between asbestos-related lung cancer and the concentration of asbestos bodies in lung tissue, we analyzed the concentration in 24 autopsy cases diagnosed with primary lung cancer, with regard to the gender, age, histological type of lung cancer and occupation of each case. The asbestos bodies were measured according to Kohyama's method. Positive cases (more than 5,000 bodies per 1 g of dry lung tissue) were further analyzed for asbestosis and pleural plaques by chest X-ray and chest CT. Two cases exhibited more than 5,000 bodies, five cases between 1,000 and 5,000, and seventeen cases less than 1,000. The occupation of the two positive cases was not informative: one demonstrated neither asbestosis nor pleural plaques, and the other showed only pleural plaques. Although the number of cases of asbestos-related lung cancer is minimal among all lung cancer cases, the number of the former may exceed that of mesothelioma patients. Not only physicians but also radiologists, surgeons and pathologists need to collaborate in the diagnosis of asbestos-related lung cancer. (author)

  2. Effect of increases in lung volume on clearance of aerosolized solute from human lungs

    Energy Technology Data Exchange (ETDEWEB)

    Marks, J.D.; Luce, J.M.; Lazar, N.M.; Wu, J.N.; Lipavsky, A.; Murray, J.F.

    1985-10-01

    To study the effect of increases in lung volume on solute uptake, we measured clearance of /sup 99m/Tc-diethylenetriaminepentaacetic acid (Tc-DTPA) at different lung volumes in 19 healthy humans. Seven subjects inhaled aerosols (1 micron activity median aerodynamic diam) at ambient pressure; clearance and functional residual capacity (FRC) were measured at ambient pressure (control) and at increased lung volume produced by positive pressure (12 cmH2O continuous positive airway pressure (CPAP)) or negative pressure (voluntary breathing). Six different subjects inhaled aerosol at ambient pressure; clearance and FRC were measured at ambient pressure and CPAP of 6, 12, and 18 cmH2O pressure. Six additional subjects inhaled aerosol at ambient pressure or at CPAP of 12 cmH2O; clearance and FRC were determined at CPAP of 12 cmH2O. According to the results, Tc-DTPA clearance from human lungs is accelerated exponentially by increases in lung volume, this effect occurs whether lung volume is increased by positive or negative pressure breathing, and the effect is the same whether lung volume is increased during or after aerosol administration. The effect of lung volume must be recognized when interpreting the results of this method.
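
    A minimal sketch of the usual way such clearance values are derived, assuming a monoexponential washout of external lung counts; the data in the example are synthetic.

```python
# Assumed analysis: estimate a Tc-DTPA clearance rate by fitting a straight
# line to log(counts) versus time (monoexponential washout). Data are synthetic.
import numpy as np

def clearance_rate_per_min(times_min, counts):
    slope, _ = np.polyfit(np.asarray(times_min, float),
                          np.log(np.asarray(counts, float)), 1)
    return -slope   # fraction of lung activity cleared per minute

t = np.arange(0, 60, 5)              # minutes
c = 1e4 * np.exp(-0.012 * t)         # synthetic washout at 1.2 %/min
print(round(100 * clearance_rate_per_min(t, c), 2), "%/min")
```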

  3. Effect of increases in lung volume on clearance of aerosolized solute from human lungs

    International Nuclear Information System (INIS)

    Marks, J.D.; Luce, J.M.; Lazar, N.M.; Wu, J.N.; Lipavsky, A.; Murray, J.F.

    1985-01-01

    To study the effect of increases in lung volume on solute uptake, we measured clearance of /sup 99m/Tc-diethylenetriaminepentaacetic acid (Tc-DTPA) at different lung volumes in 19 healthy humans. Seven subjects inhaled aerosols (1 micron activity median aerodynamic diam) at ambient pressure; clearance and functional residual capacity (FRC) were measured at ambient pressure (control) and at increased lung volume produced by positive pressure [12 cmH2O continuous positive airway pressure (CPAP)] or negative pressure (voluntary breathing). Six different subjects inhaled aerosol at ambient pressure; clearance and FRC were measured at ambient pressure and CPAP of 6, 12, and 18 cmH2O pressure. Six additional subjects inhaled aerosol at ambient pressure or at CPAP of 12 cmH2O; clearance and FRC were determined at CPAP of 12 cmH2O. According to the results, Tc-DTPA clearance from human lungs is accelerated exponentially by increases in lung volume, this effect occurs whether lung volume is increased by positive or negative pressure breathing, and the effect is the same whether lung volume is increased during or after aerosol administration. The effect of lung volume must be recognized when interpreting the results of this method

  4. New estimates for human lung dimensions

    International Nuclear Information System (INIS)

    Kennedy, Christine; Sidavasan, Sivalal; Kramer, Gary

    2008-01-01

    Full text: The currently used lung dimensions in dosimetry were originally estimated in the 1940s from Army recruits. This study provides new estimates of lung dimensions based on images acquired from a sample of the general population (varying age and sex). Building accurate models, called phantoms, of the human lung requires that the spatial dimensions (length, width, and depth) be quantified, in addition to volume. Errors in dose estimates may result from improperly sized lungs, as the counting efficiency of externally mounted detectors (e.g., in a lung counter) is dependent on the position of internally deposited radioactive material (i.e., the size of the lung). This study investigates the spatial dimensions of human lungs. Lung phantoms have previously been made in one of two sizes. The Lawrence Livermore National Laboratory Torso Phantom (LLNL) has deep, short lungs whose dimensions do not comply well with the data published in Report 23 (Reference Man) issued by the International Commission on Radiological Protection (ICRP). The Japanese Atomic Energy Research Institute Torso Phantom (JAERI) has longer, shallower lungs that also deviate from the ICRP values. However, careful examination of the ICRP recommended values shows that they are soft. In fact, they have been dropped from the ICRP's Report 89, which updates Report 23. Literature surveys have revealed a wealth of information on lung volume, but very little data on the spatial dimensions of human lungs. Better lung phantoms need to be constructed to more accurately represent a person so that dose estimates may be quantified more accurately in view of the new, lower dose limits for occupationally exposed workers and the general public. Retrospective chest images of 60 patients who underwent chest and lung imaging as part of occupational screening of healthy persons for lung disease were chosen. The chosen normal lung images represent the general population. Ages, gender and weight of the

  5. Stages of Small Cell Lung Cancer

    Science.gov (United States)

    ... Small Cell Lung Cancer Treatment (PDQ®)–Patient Version: General Information About Small Cell Lung Cancer ...

  6. Hypercapnic acidosis modulates inflammation, lung mechanics, and edema in the isolated perfused lung.

    Science.gov (United States)

    De Smet, Hilde R; Bersten, Andrew D; Barr, Heather A; Doyle, Ian R

    2007-12-01

    Low tidal volume (V(T)) ventilation strategies may be associated with permissive hypercapnia, which has been shown by ex vivo and in vivo studies to have protective effects. We hypothesized that hypercapnic acidosis may be synergistic with low V(T) ventilation; therefore, we studied the effects of hypercapnia and V(T) on unstimulated and lipopolysaccharide-stimulated isolated perfused lungs. Isolated perfused rat lungs were ventilated for 2 hours with low (7 mL/kg) or moderately high (20 mL/kg) V(T) and 5% or 20% CO(2), with lipopolysaccharide or saline added to the perfusate. Hypercapnia resulted in reduced pulmonary edema, lung stiffness, tumor necrosis factor alpha (TNF-alpha) and interleukin 6 (IL-6) in the lavage and perfusate. The moderately high V(T) did not cause lung injury but increased lavage IL-6 and perfusate IL-6 as well as TNF-alpha. Pulmonary edema and respiratory mechanics improved, possibly as a result of a stretch-induced increase in surfactant turnover. Lipopolysaccharide did not induce significant lung injury. We conclude that hypercapnia exerts a protective effect by modulating inflammation, lung mechanics, and edema. The moderately high V(T) used in this study stimulated inflammation but paradoxically improved edema and lung mechanics with an associated increase in surfactant release.

  7. Factors influencing the decline in lung density in a Danish lung cancer screening cohort

    DEFF Research Database (Denmark)

    Shaker, Saher B.; Dirksen, Asger; Lo, Pechin Chien Pau

    2012-01-01

    Lung cancer screening trials provide an opportunity to study the natural history of emphysema by using CT lung density as a surrogate parameter.In the Danish Lung Cancer Screening Trial, 2,052 participants were included. At screening rounds, smoking habits were recorded and spirometry was performed....... CT lung density was measured as the volume-adjusted 15th percentile density (PD15). A mixed effects model was used with former smoking males with...
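
    A sketch of the PD15 metric mentioned above, assuming `hu` holds the HU values of the segmented lung voxels; the volume adjustment applied in the trial is approximated here by a simple ratio scaling, which is an assumption rather than the trial's exact method.

```python
# Sketch of the 15th percentile density (PD15); the volume adjustment is
# approximated by a simple ratio scaling, which is an assumption.
import numpy as np

def pd15(hu, lung_volume_l=None, reference_volume_l=None):
    density_g_per_l = np.percentile(hu, 15) + 1000.0   # HU -> approximate density
    if lung_volume_l and reference_volume_l:
        density_g_per_l *= lung_volume_l / reference_volume_l
    return density_g_per_l

hu = np.random.normal(-860, 45, 200_000)   # synthetic segmented-lung HU values
print(round(float(pd15(hu)), 1), "g/L")
```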

  8. Cannabis smoking and lung cancer risk: Pooled analysis in the International Lung Cancer Consortium

    OpenAIRE

    Zhang, L.R.; Morgenstern, H.; Greenland, S.; Chang, S.C.; Lazarus, P.; Teare, M.D.; Woll, P.J.; Orlow, I.; Cox, B.; Brhane, Y.; Liu, G.; Hung, R.J.

    2015-01-01

    To investigate the association between cannabis smoking and lung cancer risk, data on 2,159 lung cancer cases and 2,985 controls were pooled from 6 case-control studies in the US, Canada, UK, and New Zealand within the International Lung Cancer Consortium. Study-specific associations between cannabis smoking and lung cancer were estimated using unconditional logistic regression adjusting for sociodemographic factors, tobacco smoking status and pack-years; odds-ratio estimates were pooled usin...
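
    One way the pooling described above can be reproduced in outline is fixed-effect, inverse-variance averaging of the study-specific log odds ratios; the consortium's actual model may differ, and the numbers below are synthetic.

```python
# Fixed-effect, inverse-variance pooling of study-specific odds ratios; inputs
# are (OR, lower 95% CI, upper 95% CI) tuples and the values are synthetic.
import math

def pooled_or(study_estimates):
    num = den = 0.0
    for or_, lo, hi in study_estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log(OR) from the CI
        w = 1.0 / se ** 2
        num += w * math.log(or_)
        den += w
    log_pooled, se_pooled = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

print(pooled_or([(0.90, 0.60, 1.40), (1.10, 0.80, 1.50), (0.95, 0.70, 1.30)]))
```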

  9. Measurement of lung volume by lung perfusion scanning using SPECT and prediction of postoperative respiratory function

    International Nuclear Information System (INIS)

    Andou, Akio; Shimizu, Nobuyosi; Maruyama, Shuichiro

    1992-01-01

    Measurement of lung volume by lung perfusion scanning using single photon emission computed tomography (SPECT) and its usefulness for the prediction of respiratory function after lung resection were investigated. The lung volumes calculated in 5 patients by SPECT (threshold level 20%) using 99mTc-macroaggregated albumin (MAA) were very closely related to the actually measured lung volumes. This result prompted us to calculate the total lung volume and the volume of the lobe to be resected in 18 patients with lung cancer by SPECT. Based on the data obtained, postoperative respiratory function was predicted. The predicted values of forced vital capacity (FVC), forced expiratory volume (FEV1.0), and maximum vital volume (MVV) showed closer correlations with the actually measured postoperative values (FVC, FEV1.0, MVV: r=0.944, r=0.917, r=0.795, respectively) than the values predicted by ordinary lung perfusion scanning. This method facilitates more detailed evaluation of local lung function on a lobe-by-lobe basis, and can be applied clinically to predict postoperative respiratory function. (author)
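
    The threshold step described above amounts to counting voxels above a fraction of the maximum perfusion and converting the count to millilitres; a minimal sketch follows, with an assumed voxel size.

```python
# Threshold step sketch: count voxels above a fraction of the maximum counts
# and convert to millilitres. The voxel size is an assumed placeholder.
import numpy as np

def perfused_volume_ml(spect, threshold_fraction=0.20, voxel_size_mm=(4.42, 4.42, 4.42)):
    mask = spect >= threshold_fraction * spect.max()
    voxel_ml = float(np.prod(voxel_size_mm)) / 1000.0   # mm^3 -> mL
    return int(mask.sum()) * voxel_ml

spect = np.random.rand(64, 64, 64)    # synthetic reconstructed perfusion volume
print(round(perfused_volume_ml(spect), 1), "mL")
```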

  10. Lung release of HIPDM: A new index of lung dysfunction for clinical and experimental studies

    International Nuclear Information System (INIS)

    Pistolesi, M.; Miniati, M.; Ghelarducci, L.

    1985-01-01

    Lung uptake, metabolism and release of amines has been experimentally documented. The authors studied in rabbit and man the lung kinetics of radioiodinated N-N-N'-trimethyl-N'-(2-hydroxy-3-methyl-5-iodobenzyl)-1, 3-propanediamine (HIPDM). In rabbits, after i.v. injection, 95% of HIPDM is kept within the lungs and is then released with a mean time (t-bar) of several hours as assessed both in vivo, by gamma camera external counting (n=5; t-bar=7.0 hrs), and in vitro by measuring activity in lung homogenates at various times after injection (n=56; t-bar=7.6 hrs). In 10 healthy non smoking subjects t-bar was 6.4 +- 1 hrs, whereas it was 12.1 +- 2 hrs in 10 asymptomatic smokers with normal pulmonary function tests. Preliminary clinical studies showed that HIPDM lung release is delayed in non smoking patients with primary pulmonary hypertension (n=4; t-bar=11.5 +- 2 hrs) and to a greater extent in adult respiratory distress syndrome (n=4; t-bar=25.8 +- 5hrs), whereas it was not significantly affected in cardiogenic pulmonary edema (n=4; t-bar=8.8 +- 2 hrs). Hence, both smoke exposure and injury to the lung microcirculation may impair HIPDM lung kinetics. HIPDM external counting may therefore provide a new index of lung dysfunction in man. Rabbit can be used as a model to evaluate HIPDM lung kinetics in experimentally induced lung injury
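
    As an illustration of how a mean release time t-bar can be obtained from external counting, the sketch below integrates a normalised retention curve; for a monoexponential washout this approaches the washout time constant. This is a generic estimate, not necessarily the authors' fitting procedure, and the data are synthetic.

```python
# Generic estimate of the mean release time t-bar as the area under the
# normalised retention curve (trapezoidal rule); data are synthetic.
import numpy as np

def mean_release_time_hr(times_hr, counts):
    r = np.asarray(counts, float) / float(counts[0])    # normalised retention
    t = np.asarray(times_hr, float)
    return float(np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(t)))

t = np.linspace(0.0, 48.0, 25)            # hours
c = 1e5 * np.exp(-t / 7.0)                # synthetic washout, time constant 7 h
print(round(mean_release_time_hr(t, c), 1), "h")
```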

  11. The aging lung

    Directory of Open Access Journals (Sweden)

    Lowery EM

    2013-11-01

    Full Text Available. Erin M Lowery (1), Aleah L Brubaker (2), Erica Kuhlmann (1), Elizabeth J Kovacs (3). 1: Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine at Loyola University Medical Center; 2: Loyola University Stritch School of Medicine; 3: Department of Surgery, Loyola University Medical Center, Maywood, IL, USA. Abstract: There are many age-associated changes in the respiratory and pulmonary immune system. These changes include decreases in the volume of the thoracic cavity, reduced lung volumes, and alterations in the muscles that aid respiration. Muscle function on a cellular level in the aging population is less efficient. The elderly population has less pulmonary reserve, and cough strength is decreased in the elderly population due to anatomic changes and muscle atrophy. Clearance of particles from the lung through the mucociliary elevator is decreased and associated with ciliary dysfunction. Many complex changes in immunity with aging contribute to increased susceptibility to infections including a less robust immune response from both the innate and adaptive immune systems. Considering all of these age-related changes to the lungs, pulmonary disease has significant consequences for the aging population. Chronic lower respiratory tract disease is the third leading cause of death in people aged 65 years and older. With a large and growing aging population, it is critical to understand how the body changes with age and how this impacts the entire respiratory system. Understanding the aging process in the lung is necessary in order to provide optimal care to our aging population. This review focuses on the nonpathologic aging process in the lung, including structural changes, changes in muscle function, and pulmonary immunologic function, with special consideration of obstructive lung disease in the elderly. Keywords: aging, lung, pulmonary immunology, COPD

  12. Lung deformations and radiation-induced regional lung collapse in patients treated with stereotactic body radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Diot, Quentin, E-mail: quentin.diot@ucdenver.edu; Kavanagh, Brian; Vinogradskiy, Yevgeniy; Gaspar, Laurie; Miften, Moyed [Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado 80045 (United States); Garg, Kavita [Department of Radiology, University of Colorado School of Medicine, Aurora, Colorado 80045 (United States)

    2015-11-15

    Purpose: To differentiate radiation-induced fibrosis from regional lung collapse outside of the high dose region in patients treated with stereotactic body radiation therapy (SBRT) for lung tumors. Methods: Lung deformation maps were computed from pre-treatment and post-treatment computed tomography (CT) scans using a point-to-point translation method. Fifty anatomical landmarks inside the lung (vessel or airway branches) were matched on planning and follow-up scans for the computation process. Two methods using the deformation maps were developed to differentiate regional lung collapse from fibrosis: vector field and Jacobian methods. A total of 40 planning and follow-ups CT scans were analyzed for 20 lung SBRT patients. Results: Regional lung collapse was detected in 15 patients (75%) using the vector field method, in ten patients (50%) using the Jacobian method, and in 12 patients (60%) by radiologists. In terms of sensitivity and specificity the Jacobian method performed better. Only weak correlations were observed between the dose to the proximal airways and the occurrence of regional lung collapse. Conclusions: The authors presented and evaluated two novel methods using anatomical lung deformations to investigate lung collapse and fibrosis caused by SBRT treatment. Differentiation of these distinct physiological mechanisms beyond what is usually labeled “fibrosis” is necessary for accurate modeling of lung SBRT-induced injuries. With the help of better models, it becomes possible to expand the therapeutic benefits of SBRT to a larger population of lung patients with large or centrally located tumors that were previously considered ineligible.
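
    The Jacobian method named above rests on the determinant of the local deformation gradient, which measures fractional volume change; the sketch below computes it from a displacement field and flags strongly shrinking voxels, with the cut-off value being an assumption.

```python
# Jacobian-determinant sketch: `dvf` is a displacement field of shape
# (3, nz, ny, nx) in voxel units, ordered (z, y, x); voxels whose determinant
# falls below an assumed cut-off are flagged as candidate collapse.
import numpy as np

def jacobian_determinant(dvf):
    grads = [np.gradient(dvf[i]) for i in range(3)]      # d(u_i)/d(z, y, x)
    J = np.empty(dvf.shape[1:] + (3, 3))
    for i in range(3):
        for j in range(3):
            J[..., i, j] = (1.0 if i == j else 0.0) + grads[i][j]
    return np.linalg.det(J)

def collapse_mask(dvf, threshold=0.5):
    return jacobian_determinant(dvf) < threshold

dvf = np.zeros((3, 20, 20, 20))                  # synthetic: no deformation
print(float(jacobian_determinant(dvf).mean()))   # -> 1.0
```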

  13. Alveolar epithelial fluid transport capacity in reperfusion lung injury after lung transplantation.

    Science.gov (United States)

    Ware, L B; Golden, J A; Finkbeiner, W E; Matthay, M A

    1999-03-01

    Reperfusion lung injury is an important cause of morbidity and mortality after orthotopic lung transplantation. The purpose of this study was to investigate the function of the alveolar epithelium in the setting of reperfusion lung injury. Simultaneous samples of pulmonary edema fluid and plasma were collected from eight patients with severe post-transplantation reperfusion edema. The edema fluid to plasma protein ratio was measured as an indicator of alveolar-capillary barrier permeability. The initial edema fluid to plasma protein ratio was > 0.75 in six of eight patients, confirming the presence of increased permeability of the alveolar-capillary barrier. Graft ischemic time was positively correlated with the degree of permeability (r = 0.77). Alveolar fluid clearance was calculated from serial samples in six patients. Intact alveolar fluid clearance correlated with less histologic injury, rapid resolution of hypoxemia, and more rapid resolution of radiographic infiltrates. The two patients with no net alveolar fluid clearance had persistent hypoxemia and more severe histologic injury. This study provides the first direct evidence that increased permeability to protein is the usual cause of reperfusion edema after lung transplantation, with longer ischemic times associated with greater permeability to protein in the transplanted lung. The high rates of alveolar fluid clearance indicate that the fluid transport capacity of the alveolar epithelium may be well preserved in the allograft despite reperfusion lung injury. The ability to reabsorb fluid from the alveolar space was a marker of less severe reperfusion injury, whereas the degree of alveolar-capillary barrier permeability to protein was not. Measurement of alveolar fluid clearance may be useful to assess the severity of reperfusion lung injury and to predict outcome when pulmonary edema develops after lung transplantation.
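
    The permeability and clearance quantities discussed above are conventionally computed as below; the formulas follow the standard edema fluid protein approach rather than being copied from the paper, and the sample values are invented.

```python
# Standard edema-fluid protein calculations (not copied from the paper):
# permeability indicator and net alveolar fluid clearance between samples.
def ef_plasma_ratio(edema_protein_g_dl, plasma_protein_g_dl):
    return edema_protein_g_dl / plasma_protein_g_dl

def alveolar_fluid_clearance_pct(initial_ef_protein, final_ef_protein):
    """Percent of alveolar fluid cleared as edema fluid protein concentrates."""
    return 100.0 * (1.0 - initial_ef_protein / final_ef_protein)

print(round(ef_plasma_ratio(5.1, 6.0), 2))               # > 0.75 suggests increased permeability
print(round(alveolar_fluid_clearance_pct(5.1, 6.8), 1))  # % cleared over the sampling interval
```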

  14. Why does the lung hyperinflate?

    Science.gov (United States)

    Ferguson, Gary T

    2006-04-01

    Patients with chronic obstructive pulmonary disease (COPD) often have some degree of hyperinflation of the lungs. Hyperinflated lungs can produce significant detrimental effects on breathing, as highlighted by improvements in patient symptoms after lung volume reduction surgery. Measures of lung volumes correlate better with impairment of patient functional capabilities than do measures of airflow. Understanding the mechanisms by which hyperinflation occurs in COPD provides better insight into how treatments can improve patients' health. Both static and dynamic processes can contribute to lung hyperinflation in COPD. Static hyperinflation is caused by a decrease in elasticity of the lung due to emphysema. The lungs exert less recoil pressure to counter the recoil pressure of the chest wall, resulting in an equilibrium of recoil forces at a higher resting volume than normal. Dynamic hyperinflation is more common and can occur independent of or in addition to static hyperinflation. It results from air being trapped within the lungs after each breath due to a disequilibrium between the volumes inhaled and exhaled. The ability to fully exhale depends on the degree of airflow limitation and the time available for exhalation. These can both vary, causing greater hyperinflation during exacerbations or increased respiratory demand, such as during exercise. Reversibility of dynamic hyperinflation offers the possibility for intervention. Use of bronchodilators with prolonged durations of action, such as tiotropium, can sustain significant reductions in lung inflation similar in effect to lung volume reduction surgery. How efficacy of bronchodilators is assessed may, therefore, need to be reevaluated.

  15. SU-E-J-249: Correlation of Mean Lung Ventilation Value with Ratio of Total Lung Volumes

    International Nuclear Information System (INIS)

    Yu, N; Qu, H; Xia, P

    2014-01-01

    Purpose: Lung ventilation function measured from 4D-CT and from breathing correlated CT images is a novel concept to incorporate the lung physiologic function into treatment planning of radiotherapy. The calculated ventilation functions may vary from different breathing patterns, affecting evaluation of the treatment plans. The purpose of this study is to correlate the mean lung ventilation value with the ratio of the total lung volumes obtained from the relevant CTs. Methods: A ventilation map was calculated from the variations of voxel-to-voxel CT densities from two breathing phases from either 4D-CT or breathing correlated CTs. An open source image registration tool of Plastimatch was used to deform the inhale phase images to the exhale phase images. To calculate the ventilation map inside lung, the whole lung was delineated and the tissue outside the lung was masked out. With a software tool developed in house, the 3D ventilation map was then converted in the DICOM format associated with the planning CT images. The ventilation map was analyzed on a clinical workstation. To correlate ventilation map thus calculated with lung volume change, the total lung volume change was compared the mean ventilation from our method. Results: Twenty-two patients who underwent stereotactic body irradiation for lung cancer were selected for this retrospective study. For this group of patients, the ratio of lung volumes for the inhale (Vin) and exhale phase (Vex) was shown to be linearly related to the mean of the local ventilation (Vent), Vin/Vex = 1.0 + 0.49*Vent (R2 = 0.93, p < 0.01). Conclusion: The total lung volume change is highly correlated with the mean of local ventilation. The mean of local ventilation may be useful to assess the patient's lung capacity
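
    A hedged sketch of a density-change ventilation map of the kind described above: after deformable registration, the fractional change in air volume per voxel is estimated from the exhale and deformed-inhale HU values. The formula is one common density-based choice, not necessarily the exact one used in this work.

```python
# One common density-based ventilation estimate (assumed, not necessarily the
# authors' formula): specific air-volume change per voxel from exhale HU and
# the deformed inhale HU, with air at -1000 HU and tissue near 0 HU.
import numpy as np

def ventilation_map(hu_ex, hu_in_deformed):
    return 1000.0 * (hu_in_deformed - hu_ex) / (hu_ex * (1000.0 + hu_in_deformed))

hu_ex = np.full((8, 8, 8), -820.0)        # synthetic exhale lung HU
hu_in = np.full((8, 8, 8), -870.0)        # synthetic deformed inhale lung HU
vent = ventilation_map(hu_ex, hu_in)
print(round(float(vent.mean()), 3))        # mean local ventilation, ~0.47 here
```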

  16. The effect of irradiation on lung function and perfusion in patients with lung cancer

    International Nuclear Information System (INIS)

    Abratt, Raymond P.; Willcox, Paul A.

    1995-01-01

    Purpose: To prospectively study the changes in lung function in patients with lung carcinoma treated with relatively high doses of irradiation. Methods and Materials: Lung function was assessed prior to and at 6 and 12 months following radiation therapy by a clinical dyspnea score, formal pulmonary function tests (lung volume spirometry and diffusion capacity) as well as an ipsilateral hemithorax lung perfusion scan. Changes in dyspnea score were evaluated by the chi-square and the Fisher exact test. Changes in formal lung function tests were compared with the t-test for dependent data and correlations with the t-test for independent data. Fifty-one patients were entered into the study. There were 42 evaluable patients at 6 months after irradiation and 22 evaluable patients at 12 months after irradiation. Results: A worsening of dyspnea score from 1 to 2, which is clinically acceptable, occurred in 50% or more of patients. However, a dyspnea score of 3, which is a serious complication, developed in only 5% of patients. The diffusion capacity (DLCO) decreased by 14% at 6 months and 12% at 12 months (p < 0.0001). The forced vital capacity and total lung capacity decreased between 6% and 8% at 6 and 12 months, which was statistically significant. The forced expiratory volume in 1 s decreased between 2 and 3% at 6 and 12 months, which was not statistically significant. The ipsilateral hemithorax perfusion decreased by 17 and 20% at 6 and 12 months (p < 0.0001). There was no correlation between the initial hemithorax perfusion, or its decrease at follow-up, and the decrease in DLCO. Conclusion: Lung irradiation results in some loss of lung function in patients with lung cancer with a projected survival of 6 months or more. The pretreatment DLCO assessment should be useful in predicting clinical tolerance to irradiation

  17. Obesity-Induced Endoplasmic Reticulum Stress Causes Lung Endothelial Dysfunction and Promotes Acute Lung Injury.

    Science.gov (United States)

    Shah, Dilip; Romero, Freddy; Guo, Zhi; Sun, Jianxin; Li, Jonathan; Kallen, Caleb B; Naik, Ulhas P; Summer, Ross

    2017-08-01

    Obesity is a significant risk factor for acute respiratory distress syndrome. The mechanisms underlying this association are unknown. We recently showed that diet-induced obese mice exhibit pulmonary vascular endothelial dysfunction, which is associated with enhanced susceptibility to LPS-induced acute lung injury. Here, we demonstrate that lung endothelial dysfunction in diet-induced obese mice coincides with increased endoplasmic reticulum (ER) stress. Specifically, we observed enhanced expression of the major sensors of misfolded proteins, including protein kinase R-like ER kinase, inositol-requiring enzyme α, and activating transcription factor 6, in whole lung and in primary lung endothelial cells isolated from diet-induced obese mice. Furthermore, we found that primary lung endothelial cells exposed to serum from obese mice, or to saturated fatty acids that mimic obese serum, resulted in enhanced expression of markers of ER stress and the induction of other biological responses that typify the lung endothelium of diet-induced obese mice, including an increase in expression of endothelial adhesion molecules and a decrease in expression of endothelial cell-cell junctional proteins. Similar changes were observed in lung endothelial cells and in whole-lung tissue after exposure to tunicamycin, a compound that causes ER stress by blocking N-linked glycosylation, indicating that ER stress causes endothelial dysfunction in the lung. Treatment with 4-phenylbutyric acid, a chemical protein chaperone that reduces ER stress, restored vascular endothelial cell expression of adhesion molecules and protected against LPS-induced acute lung injury in diet-induced obese mice. Our work indicates that fatty acids in obese serum induce ER stress in the pulmonary endothelium, leading to pulmonary endothelial cell dysfunction. Our work suggests that reducing protein load in the ER of pulmonary endothelial cells might protect against acute respiratory distress syndrome in obese

  18. Response function during oxygen sputter profiling and its application to deconvolution of ultrashallow B depth profiles in Si

    International Nuclear Information System (INIS)

    Shao Lin; Liu Jiarui; Wang Chong; Ma, Ki B.; Zhang Jianming; Chen, John; Tang, Daniel; Patel, Sanjay; Chu Weikan

    2003-01-01

    The secondary ion mass spectrometry (SIMS) response function to a B 'δ surface layer' has been investigated. Using electron-gun evaporation combined with liquid nitrogen cooling of target, we are able to deposit an ultrathin B layer without detectable island formation. The B spatial distribution obtained from SIMS is exponentially decaying with a decay length approximately a linear function of the incident energy of the oxygen during the SIMS analysis. Deconvolution with the response function has been applied to reconstruct the spatial distribution of ultra-low-energy B implants. A correction to depth and yield scales due to transient sputtering near the Si surface region was also applied. Transient erosion shifts the profile shallower, but beam mixing shifts it deeper. These mutually compensating effects make the adjusted distribution almost the same as original data. The one significant difference is a buried B peak observed near the surface region
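
    The deconvolution with an exponential response function can be sketched generically as below, here with a small Richardson-Lucy loop and an explicit causal convolution matrix; the decay length, grid and test profile are placeholders rather than the paper's values.

```python
# Generic Richardson-Lucy deconvolution with an exponential response function
# (placeholder decay length and grid); not the authors' code.
import numpy as np

def exp_response(depth_nm, decay_length_nm):
    r = np.exp(-depth_nm / decay_length_nm)
    return r / r.sum()

def deconvolve_rl(measured, response, iterations=200):
    n = len(measured)
    # causal convolution matrix: K[i, j] = response[i - j] for i >= j
    K = np.array([[response[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])
    norm = np.maximum(K.T @ np.ones(n), 1e-12)
    est = np.full(n, float(measured.mean()))
    for _ in range(iterations):
        ratio = measured / np.maximum(K @ est, 1e-12)
        est *= (K.T @ ratio) / norm
    return est

depth = np.arange(0.0, 40.0, 0.5)                      # nm
true = np.exp(-0.5 * ((depth - 5.0) / 1.0) ** 2)       # narrow implanted B peak at 5 nm
resp = exp_response(depth, decay_length_nm=3.0)
meas = np.convolve(true, resp)[:depth.size]            # causal smearing toward depth
print(depth[int(np.argmax(deconvolve_rl(meas, resp)))], "nm")   # peak recovered near 5 nm
```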

  19. Radiofrequency Ablation of Lung Tumors

    Science.gov (United States)

    ... Radiofrequency Ablation (RFA) / Microwave Ablation (MWA) of Lung Tumors: What are Radiofrequency and Microwave Ablation of Lung Tumors? Radiofrequency ablation, ...

  20. EPR spectrum deconvolution and dose assessment of fossil tooth enamel using maximum likelihood common factor analysis

    International Nuclear Information System (INIS)

    Vanhaelewyn, G.; Callens, F.; Gruen, R.

    2000-01-01

    In order to determine the components which give rise to the EPR spectrum around g = 2, we applied Maximum Likelihood Common Factor Analysis (MLCFA) to the EPR spectra of enamel sample 1126, which had previously been analysed by continuous wave and pulsed EPR as well as EPR microscopy. MLCFA yielded consistent results on three sets of X-band spectra and the following components were identified: an orthorhombic component attributed to CO2^-, an axial CO3^3- component, as well as four isotropic components, three of which could be attributed to SO2^-, a tumbling CO2^- and the central line of a dimethyl radical. The X-band results were confirmed by analysis of Q-band spectra, where three additional isotropic lines were found; however, these three components could not be attributed to known radicals. The orthorhombic component was used to establish dose response curves for the assessment of the past radiation dose, D_E. The results appear to be more reliable than those based on conventional peak-to-peak EPR intensity measurements or simple Gaussian deconvolution methods
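
    The dose-response step mentioned above is commonly handled by fitting a saturating-exponential additive-dose curve and extrapolating to zero intensity to obtain D_E; the sketch below assumes that model and synthetic data, and the authors' fitting procedure may differ.

```python
# Assumed additive-dose model: fit a saturating exponential to component
# intensity versus added dose and read the equivalent dose D_E; data synthetic.
import numpy as np
from scipy.optimize import curve_fit

def dose_response(added_dose, i_max, d_e, d_0):
    return i_max * (1.0 - np.exp(-(added_dose + d_e) / d_0))

np.random.seed(0)
added = np.array([0, 50, 100, 200, 400, 800, 1600], float)   # added laboratory dose, Gy
signal = dose_response(added, 10.0, 120.0, 900.0) * (1 + 0.02 * np.random.randn(added.size))
popt, _ = curve_fit(dose_response, added, signal, p0=(signal.max(), 100.0, 500.0))
print("D_E ~", round(popt[1], 1), "Gy")
```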