WorldWideScience

Sample records for reconstruction based trajectory

  1. Mid-Ventilation Concept for Mobile Pulmonary Tumors: Internal Tumor Trajectory Versus Selective Reconstruction of Four-Dimensional Computed Tomography Frames Based on External Breathing Motion

    International Nuclear Information System (INIS)

    Guckenberger, Matthias; Wilbert, Juergen; Krieger, Thomas; Richter, Anne; Baier, Kurt; Flentje, Michael

    2009-01-01

    Purpose: To evaluate the accuracy of direct reconstruction of mid-ventilation and peak-phase four-dimensional (4D) computed tomography (CT) frames based on the external breathing signal. Methods and Materials: For 11 patients with 15 pulmonary targets, a respiration-correlated CT study (4D CT) was acquired for treatment planning. After retrospective time-based sorting of raw projection data and reconstruction of eight CT frames equally distributed over the breathing cycle, the mean tumor position (P_mean), mid-ventilation frame, and breathing motion were evaluated based on the internal tumor trajectory. Analysis of the external breathing signal (pressure sensor around the abdomen) with amplitude-based sorting of projections was performed for direct reconstruction of the mid-ventilation frame and frames at peak phases of the breathing cycle. Results: On the basis of the eight 4D CT frames equally spaced in time, tumor motion was largest in the craniocaudal direction, with 12 ± 7 mm on average. Tumor motion between the two frames reconstructed at peak phases was not different in the craniocaudal and anterior-posterior directions but was systematically smaller in the left-right direction by 1 mm on average. The 3-dimensional distance between P_mean and the tumor position in the mid-ventilation frame based on the internal tumor trajectory was 1.2 ± 1 mm. Reconstruction of the mid-ventilation frame at the mean amplitude position of the external breathing signal resulted in tumor positions 2.0 ± 1.1 mm distant from P_mean. Breathing-induced motion artifacts in mid-ventilation frames caused negligible changes in tumor volume and shape. Conclusions: Direct reconstruction of the mid-ventilation frame and frames at peak phases based on the external breathing signal was reliable. This makes the reconstruction of only three 4D CT frames sufficient for application of the mid-ventilation technique in clinical practice.
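The mid-ventilation frame selection described in this abstract reduces to a small computation: average the tumour centroid over the breathing-cycle frames, then pick the frame whose centroid is closest to that mean. A minimal sketch, with invented illustrative positions (mm), not patient data:

```python
import math

# Eight illustrative tumour centroids (LR, AP, CC in mm), one per 4D CT frame.
positions = [
    (0.0, 0.0, 0.0), (0.2, 0.5, 3.0), (0.4, 1.0, 7.0), (0.5, 1.2, 11.0),
    (0.5, 1.1, 12.0), (0.3, 0.8, 8.0), (0.2, 0.4, 4.0), (0.1, 0.1, 1.0),
]

# Mean tumour position over the breathing cycle (P_mean in the abstract).
p_mean = tuple(sum(c) / len(positions) for c in zip(*positions))

def dist(p, q):
    # 3D Euclidean distance between two centroids
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Mid-ventilation frame: the one whose centroid is nearest to P_mean.
mid_vent_index = min(range(len(positions)), key=lambda i: dist(positions[i], p_mean))
```

With these sample centroids the third frame (index 2) is selected, since its craniocaudal position lies closest to the cycle mean.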

  2. Reconstruction of equilibrium trajectories during whole-body movements.

    Science.gov (United States)

    Domen, K; Latash, M L; Zatsiorsky, V M

    1999-03-01

    The framework of the equilibrium-point hypothesis was used to reconstruct equilibrium trajectories (ETs) of the ankle, hip and body center of mass during quick voluntary hip flexions ('Japanese courtesy bow') by standing subjects. Different spring loads applied to the subject's back were used to introduce smooth perturbations that are necessary to reconstruct ETs based on a series of trials at the same task. Time patterns of muscle torques were calculated using inverse dynamics techniques. A second-order linear model was employed to calculate the instantaneous position of the spring-like joint or center of mass characteristic at different times during the movement. ETs of the joints and of the center of mass had significantly different shapes from the actual trajectories. Integral measures of electromyographic bursts of activity in postural muscles demonstrated a relation to muscle length corresponding to the equilibrium-point hypothesis.

  3. IRVE-II Post-Flight Trajectory Reconstruction

    Science.gov (United States)

    O'Keefe, Stephen A.; Bose, David M.

    2010-01-01

    NASA's Inflatable Re-entry Vehicle Experiment (IRVE) II successfully demonstrated an inflatable aerodynamic decelerator after being launched aboard a sounding rocket from Wallops Flight Facility (WFF). Preliminary day-of-flight data compared well with pre-flight Monte Carlo analysis, and a more complete trajectory reconstruction was performed with an Extended Kalman Filter (EKF) approach. The reconstructed trajectory and comparisons to an attitude solution provided by NASA Sounding Rocket Operations Contract (NSROC) personnel at WFF are presented. Additional comparisons are made between the reconstructed trajectory and pre- and post-flight Monte Carlo trajectory predictions. Alternative observations of the trajectory are summarized which leverage flight accelerometer measurements, the pre-flight aerodynamic database, and on-board flight video. Finally, analysis of the payload separation and aeroshell deployment events is presented. The flight trajectory is reconstructed to fidelity sufficient to assess overall project objectives related to flight dynamics and, overall, IRVE-II flight dynamics are in line with expectations.
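The EKF approach mentioned above can be illustrated with a toy filter. For a linear constant-velocity model the EKF reduces to the ordinary Kalman filter; the model matrices and noise levels below are invented for illustration and are not taken from the IRVE-II reconstruction:

```python
import numpy as np

# Estimate altitude and vertical rate from noisy altitude measurements.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
H = np.array([[1.0, 0.0]])              # we only measure altitude
Q = np.diag([1e-4, 1e-3])               # process noise covariance (assumed)
R = np.array([[0.25]])                  # measurement noise covariance (assumed)

x = np.array([0.0, 0.0])                # state estimate: [altitude, rate]
P = np.eye(2)                           # state covariance

rng = np.random.default_rng(0)
truth = np.array([0.0, 50.0])           # true initial state
for _ in range(200):
    truth = F @ truth                               # propagate truth
    z = H @ truth + rng.normal(0.0, 0.5, size=1)    # noisy measurement
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

alt_err = abs(x[0] - truth[0])
rate_err = abs(x[1] - truth[1])
```

In the actual reconstruction the state is the full 6-DOF trajectory and the measurement models are nonlinear, which is where the "extended" linearization comes in.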

  4. Reconstruction of pre-instrumental storm track trajectories across the U.S. Pacific Northwest using circulation-based field sampling of Pinus Ponderosa

    Science.gov (United States)

    Wise, E.; Dannenberg, M. P.

    2015-12-01

    The trajectory of incoming storms from the Pacific Ocean is a key influence on drought and flood regimes in western North America. Flow is typically from the west in a zonal pattern, but decadal shifts between zonal and meridional flow have been identified as key features in hydroclimatic variability over the instrumental period. In Washington and most of the Pacific Northwest, there tend to be lower-latitude storm systems that result in decreased precipitation in El Niño years. However, the Columbia Basin in central Washington behaves in opposition to the surrounding region and typically has average to above-average precipitation in El Niño years due to changing storm-track trajectories and a decreasing rain shadow effect on the leeward side of the Cascades. This direct connection between storm-track position and precipitation patterns in Washington provided an exceptional opportunity for circulation-based field sampling and chronology development. New Pinus ponderosa (Ponderosa pine) tree-ring chronologies were developed from eight sites around the Columbia Basin in Washington and used to examine year-to-year changes in moisture regimes. Results show that these sites are representative of the two distinct climate response areas. The divergence points between these two site responses allowed us to reconstruct changing precipitation patterns since the late-17th century, and to link these patterns to previously reconstructed atmospheric pressure and El Niño indices. This study highlights the potential for using synoptic climatology to inform field-based proxy collection.

  5. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most…

  6. Direct iterative reconstruction of computed tomography trajectories (DIRECTT)

    International Nuclear Information System (INIS)

    Lange, A.; Hentschel, M.P.; Schors, J.

    2004-01-01

    The direct reconstruction approach employs an iterative procedure with selection of, and angular averaging over, projected trajectory data of volume elements. This avoids the blur effects of the classical Fourier method due to the sampling theorem, but requires longer computing time. The reconstructed tomographic images reveal at least the spatial resolution of the radiation detector. Any set of projection angles may be selected for the measurements. Limited rotation of the object still yields good reconstruction of details. Projections of a partial region of the object can be reconstructed without additional artifacts, thus reducing the overall radiation dose. Noisy signal data from low-dose irradiation have low impact on spatial resolution. The image quality is monitored during all iteration steps and is pre-selected according to the specific requirements. DIRECTT can be applied independently of the measurement equipment, in addition to conventional reconstruction or as a refinement filter. (author)

  7. X-ray computed tomography reconstruction on non-standard trajectories for robotized inspection

    International Nuclear Information System (INIS)

    Banjak, Hussein

    2016-01-01

    The number of industrial applications of computed tomography (CT) is large and rapidly increasing, with typical areas of use in the aerospace, automotive and transport industries. To support this growth of CT in the industrial field, the identified requirements concern firstly software development to improve the reconstruction algorithms and secondly the automation of the inspection process. Indeed, the use of robots gives more flexibility in the acquisition trajectory and allows the inspection of large and complex objects, which cannot be inspected using classical CT systems. In the context of this new CT trend, a robotic platform has been installed at CEA LIST to better understand and solve specific challenges linked to the robotization of the CT process. The considered system integrates two robots that move the X-ray generator and detector. This thesis contributes to this new development. In particular, the objective is to develop and implement analytical and iterative reconstruction algorithms adapted to such robotized trajectories. The main focus of this thesis is helical-like scanning trajectories. We consider two main problems that can occur during the acquisition process: truncated and limited-angle data. We present in this work experimental results for reconstruction on such non-standard trajectories. CIVA software is used to simulate these complex inspections, and our developed algorithms are integrated as reconstruction tools. This thesis contains three parts. In the first part, we introduce the basic principles of CT and we present an overview of existing analytical and iterative algorithms for non-standard trajectories. In the second part, we modify the approximate helical FDK algorithm to deal with transversely truncated data and we propose a modified FDK algorithm adapted to a reverse helical trajectory with a scan range of less than 360 degrees. For iterative reconstruction, we propose two algebraic methods named SART-FISTA-TV and DART

  8. Space-Varying Iterative Restoration of Diffuse Optical Tomograms Reconstructed by the Photon Average Trajectories Method

    Directory of Open Access Journals (Sweden)

    Kravtsenyuk Olga V

    2007-01-01

    The possibility of improving the spatial resolution of diffuse optical tomograms reconstructed by the photon average trajectories (PAT) method is substantiated. The PAT method, recently presented by us, is based on the concept of an average statistical trajectory for the transfer of light energy, the photon average trajectory (PAT). The inverse problem of diffuse optical tomography is reduced to the solution of an integral equation with integration along a conditional PAT. As a result, the conventional algorithms of projection computed tomography can be used for fast reconstruction of diffuse optical images. The shortcoming of the PAT method is that it reconstructs images blurred by averaging over the spatial distributions of the photons that form the signal measured by the receiver. To improve the resolution, we apply a spatially variant blur model based on an interpolation of the spatially invariant point spread functions simulated for different small subregions of the image domain. Two iterative algorithms for solving a system of linear algebraic equations, the conjugate gradient algorithm for least squares problems and the modified residual norm steepest descent algorithm, are used for deblurring. It is shown that a gain in spatial resolution can be obtained.

  9. Space-Varying Iterative Restoration of Diffuse Optical Tomograms Reconstructed by the Photon Average Trajectories Method

    Directory of Open Access Journals (Sweden)

    Vladimir V. Lyubimov

    2007-01-01

    The possibility of improving the spatial resolution of diffuse optical tomograms reconstructed by the photon average trajectories (PAT) method is substantiated. The PAT method, recently presented by us, is based on the concept of an average statistical trajectory for the transfer of light energy, the photon average trajectory (PAT). The inverse problem of diffuse optical tomography is reduced to the solution of an integral equation with integration along a conditional PAT. As a result, the conventional algorithms of projection computed tomography can be used for fast reconstruction of diffuse optical images. The shortcoming of the PAT method is that it reconstructs images blurred by averaging over the spatial distributions of the photons that form the signal measured by the receiver. To improve the resolution, we apply a spatially variant blur model based on an interpolation of the spatially invariant point spread functions simulated for different small subregions of the image domain. Two iterative algorithms for solving a system of linear algebraic equations, the conjugate gradient algorithm for least squares problems and the modified residual norm steepest descent algorithm, are used for deblurring. It is shown that a 27% gain in spatial resolution can be obtained.
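The deblurring step above amounts to solving a linear system A x = b where A is the blur operator. A minimal sketch with a spatially *invariant* 1D blur (the paper interpolates spatially variant PSFs, but the conjugate-gradient least squares solver is the same idea); the signal and kernel are invented:

```python
import numpy as np

n = 64
kernel = np.array([0.25, 0.5, 0.25])   # illustrative symmetric PSF

def blur(x):
    # symmetric blur, so the operator equals its own transpose here
    return np.convolve(x, kernel, mode="same")

x_true = np.zeros(n)
x_true[20:26] = 1.0        # sharp "object"
b = blur(x_true)           # blurred observation

# CGLS: conjugate gradients on the normal equations A^T A x = A^T b
x = np.zeros(n)
r = blur(b) - blur(blur(x))      # A^T (b - A x)
p = r.copy()
rs = r @ r
for _ in range(100):
    if rs < 1e-20:
        break
    Ap = blur(blur(p))           # A^T A p
    alpha = rs / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    p = r + (rs_new / rs) * p
    rs = rs_new

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
rel_res = np.linalg.norm(blur(x) - b) / np.linalg.norm(b)
```

In the spatially variant case, `blur` would apply a different interpolated PSF per subregion, and its transpose must be coded explicitly.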

  10. Fast MR image reconstruction for partially parallel imaging with arbitrary k-space trajectories.

    Science.gov (United States)

    Ye, Xiaojing; Chen, Yunmei; Lin, Wei; Huang, Feng

    2011-03-01

    Both acquisition and reconstruction speed are crucial for magnetic resonance (MR) imaging in clinical applications. In this paper, we present a fast reconstruction algorithm for SENSE in partially parallel MR imaging with arbitrary k-space trajectories. The proposed method is a combination of variable splitting, the classical penalty technique and the optimal gradient method. Variable splitting and the penalty technique reformulate the SENSE model with sparsity regularization as an unconstrained minimization problem, which can be solved by alternating two simple minimizations: one is total-variation- and wavelet-based denoising, which can be quickly solved by several recent numerical methods, whereas the other involves a linear inversion, which is solved by the optimal first-order gradient method in our algorithm to significantly improve the performance. Comparisons with several recent parallel imaging algorithms indicate that the proposed method significantly improves computational efficiency and achieves state-of-the-art reconstruction quality.

  11. A parallel algorithm for 3D particle tracking and Lagrangian trajectory reconstruction

    International Nuclear Information System (INIS)

    Barker, Douglas; Zhang, Yuanhui; Lifflander, Jonathan; Arya, Anshu

    2012-01-01

    Particle-tracking methods are widely used in fluid mechanics and multi-target tracking research because of their unique ability to reconstruct long trajectories with high spatial and temporal resolution. Researchers have recently demonstrated 3D tracking of several objects in real time, but as the number of objects is increased, real-time tracking becomes impossible due to data transfer and processing bottlenecks. This problem may be solved by using parallel processing. In this paper, a parallel-processing framework has been developed based on frame decomposition and is programmed using the asynchronous object-oriented Charm++ paradigm. This framework can be a key step in achieving a scalable Lagrangian measurement system for particle-tracking velocimetry and may lead to real-time measurement capabilities. The parallel tracking algorithm was evaluated with three data sets, including the particle image velocimetry standard 3D images data set #352, a uniform data set for optimal parallel performance, and a computational-fluid-dynamics-generated non-uniform data set, to test trajectory reconstruction accuracy, consistency with the sequential version, and scalability to more than 500 processors. The algorithm showed strong scaling up to 512 processors and no inherent limits of scalability were seen. Ultimately, up to a 200-fold speedup was observed compared to the serial algorithm when 256 processors were used. The parallel algorithm is adaptable and could be easily modified to use any sequential tracking algorithm which inputs frames of 3D particle location data and outputs particle trajectories.
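The frame-decomposition idea can be sketched in a few lines: per-frame particle detection is embarrassingly parallel, while the cheap linking step runs sequentially. This toy version uses Python threads and a stand-in 1D "detector" in place of Charm++ and real 3D location data:

```python
from concurrent.futures import ThreadPoolExecutor

def detect(frame):
    # stand-in "detection": positions and values of above-threshold samples
    return [(i, v) for i, v in enumerate(frame) if v > 0.5]

# three synthetic 1D "frames" of sensor values
frames = [[0.1, 0.9, 0.2], [0.2, 0.8, 0.1], [0.9, 0.1, 0.3]]

# frame decomposition: detection in each frame farmed out to a worker
with ThreadPoolExecutor(max_workers=4) as pool:
    detections = list(pool.map(detect, frames))

def link(dets_a, dets_b):
    # greedy nearest-neighbour linking between consecutive frames
    return [(a, min(dets_b, key=lambda b: abs(a[0] - b[0]))) for a in dets_a]

links = [link(detections[k], detections[k + 1]) for k in range(len(detections) - 1)]
```

A production version would decompose 3D frames across processes or nodes and use a proper assignment algorithm (e.g. Hungarian matching) for linking.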

  12. DIRECT DETECTION OF THE HELICAL MAGNETIC FIELD GEOMETRY FROM 3D RECONSTRUCTION OF PROMINENCE KNOT TRAJECTORIES

    Energy Technology Data Exchange (ETDEWEB)

    Zapiór, Maciej; Martinez-Gómez, David, E-mail: zapior.maciek@gmail.com [Physics Department, University of the Balearic Islands, Cra. de Valldemossa, km 7.5. Palma (Illes Balears), E-07122 (Spain)

    2016-02-01

    Based on the data collected by the Vacuum Tower Telescope located in the Teide Observatory in the Canary Islands, we analyzed the three-dimensional (3D) motion of so-called knots in a solar prominence of 2014 June 9. Trajectories of seven knots were reconstructed, giving information on the 3D geometry of the magnetic field. Helical motion was detected. From the equipartition principle, we estimated the lower limit of the magnetic field in the prominence to be ≈1–3 G and, from Ampère's law, the lower limit of the electric current to be ≈1.2 × 10⁹ A.
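The equipartition estimate is an order-of-magnitude calculation: setting the magnetic energy density B²/(2μ₀) equal to the kinetic energy density ρv²/2 gives B = v·√(μ₀ρ). The density and knot speed below are illustrative prominence-scale values, not the paper's measurements:

```python
import math

mu0 = 4e-7 * math.pi   # vacuum permeability, T·m/A
rho = 2e-10            # plasma mass density, kg/m^3 (assumed)
v = 1.0e4              # knot speed, m/s (assumed, ~10 km/s)

# equipartition: B^2 / (2 mu0) = rho v^2 / 2  =>  B = v * sqrt(mu0 * rho)
B_tesla = v * math.sqrt(mu0 * rho)
B_gauss = B_tesla * 1e4   # 1 T = 10^4 G
```

With these assumed values the lower limit comes out near 1.6 G, consistent in magnitude with the ≈1–3 G quoted above.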

  13. Direct Detection of the Helical Magnetic Field Geometry from 3D Reconstruction of Prominence Knot Trajectories

    Science.gov (United States)

    Zapiór, Maciej; Martínez-Gómez, David

    2016-02-01

    Based on the data collected by the Vacuum Tower Telescope located in the Teide Observatory in the Canary Islands, we analyzed the three-dimensional (3D) motion of so-called knots in a solar prominence of 2014 June 9. Trajectories of seven knots were reconstructed, giving information on the 3D geometry of the magnetic field. Helical motion was detected. From the equipartition principle, we estimated the lower limit of the magnetic field in the prominence to be ≈1–3 G and, from Ampère's law, the lower limit of the electric current to be ≈1.2 × 10⁹ A.

  14. Path-based Queries on Trajectory Data

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Pelekis, Nikos; Theodoridis, Yannis

    2014-01-01

    In traffic research, management, and planning a number of path-based analyses are heavily used, e.g., for computing turn-times, evaluating green waves, or studying traffic flow. These analyses require retrieving the trajectories that follow the full path being analyzed. Existing path queries cannot sufficiently support such path-based analyses because they retrieve all trajectories that touch any edge in the path. In this paper, we define and formalize the strict path query. This is a novel query type tailored to support path-based analysis, where trajectories must follow all edges in the path… a specific path by only retrieving data from the first and last edge in the path. To correctly answer strict path queries, existing network-constrained trajectory indexes must retrieve data from all edges in the path. An extensive performance study of NETTRA using a very large real-world trajectory data set…

  15. Voting based object boundary reconstruction

    Science.gov (United States)

    Tian, Qi; Zhang, Like; Ma, Jingsheng

    2005-07-01

    A voting-based object boundary reconstruction approach is proposed in this paper. Morphological techniques have been adopted in many video object extraction applications to reconstruct missing pixels. However, when the missing areas become large, morphological processing cannot produce good results. Recently, tensor voting has attracted attention, and it can be used for boundary estimation on curves or irregular trajectories. However, the complexity of saliency tensor creation limits its application in real-time systems. An alternative approach based on tensor voting is introduced in this paper. Rather than creating saliency tensors, we use a "2-pass" method for orientation estimation. In the first pass, a Sobel detector is applied to a coarse boundary image to obtain the gradient map. In the second pass, each pixel casts decreasing weights based on its gradient information, and the direction with the maximum weight sum is selected as the correct orientation of the pixel. After the orientation map is obtained, pixels begin linking edges or intersections along their direction. The approach is applied to various video surveillance clips under different conditions, and the experimental results demonstrate significant improvement in the accuracy of the final extracted objects.
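The first pass of the "2-pass" scheme is a standard Sobel gradient map. A minimal sketch on a synthetic 5×5 image with a vertical edge (the second, voting pass would then accumulate weights along each pixel's candidate directions; only the gradient pass is shown):

```python
import numpy as np

# standard Sobel masks for horizontal and vertical gradients
Kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
Ky = Kx.T

def correlate2d(img, k):
    # naive 'valid' 2D correlation, adequate for a sketch
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

img = np.zeros((5, 5))
img[:, 3:] = 1.0   # vertical edge between columns 2 and 3

gx = correlate2d(img, Kx)
gy = correlate2d(img, Ky)
magnitude = np.hypot(gx, gy)          # gradient strength per pixel
orientation = np.arctan2(gy, gx)      # edge-normal direction per pixel
```

The vertical edge shows up as a purely horizontal gradient: gy is zero everywhere and the magnitude peaks in the columns adjacent to the edge.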

  16. Optical wedge method for spatial reconstruction of particle trajectories

    International Nuclear Information System (INIS)

    Asatiani, T.L.; Alchudzhyan, S.V.; Gazaryan, K.A.; Zograbyan, D.Sh.; Kozliner, L.I.; Krishchyan, V.M.; Martirosyan, G.S.; Ter-Antonyan, S.V.

    1978-01-01

    A technique of optical wedges allowing the full reconstruction of pictures of events in space is considered. The technique is used for the detection of particle tracks in optical wide-gap spark chambers by photographing in one projection. The optical wedges are refracting right-angle plastic prisms positioned between the camera and the spark chamber so that both ends of the track are photographed through them. A method for calibrating measurements is given, and an estimate is made of the accuracy of the determination of the second projection with the help of the optical wedges.

  17. Three-Dimensional Reconstruction of a Gas Bubble Trajectory in Liquid

    Directory of Open Access Journals (Sweden)

    Augustyniak Jakub

    2014-01-01

    The identification of the shape of the bubble trajectory is crucial for understanding the mechanism of bubble motion in liquid. This paper presents a technique for 3D bubble trajectory reconstruction using a single high-speed camera and a system of mirrors. In the experiment a glass tank filled with distilled water was used. The nozzle through which the bubbles were generated was placed in the centre of the tank. The movement of the bubbles was recorded with a high-speed camera, the Phantom v1610, at 600 fps. Techniques of image analysis were applied to determine the coordinates of the centre of mass in each bubble image. The 3D trajectory of each bubble can then be obtained using triangulation methods. The measurement error of the imaging has been estimated; the maximum measurement error was ±0.65 mm. Trajectories of subsequently departing bubbles were visualized.
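The triangulation step can be sketched under a simplifying assumption: the mirror system provides two orthogonal orthographic views, so the front view gives (x, z) and the side view gives (y, z) of each bubble centroid. Real setups calibrate perspective cameras; this only shows how two 2D centroid tracks merge into one 3D trajectory:

```python
def reconstruct_3d(front_track, side_track):
    """Merge per-frame (x, z) and (y, z) centroids into (x, y, z) points."""
    traj = []
    for (x, z1), (y, z2) in zip(front_track, side_track):
        z = 0.5 * (z1 + z2)   # the shared vertical axis is seen twice; average
        traj.append((x, y, z))
    return traj

# illustrative centroid tracks in mm (invented, not measured data)
front = [(0.0, 0.0), (0.5, 10.0), (-0.5, 20.0)]   # (x, z) from direct view
side = [(0.0, 0.2), (0.4, 10.2), (-0.4, 19.8)]    # (y, z) from mirror view

trajectory = reconstruct_3d(front, side)
```

The zig-zag of x and y against monotonically rising z is the kind of signature used to classify trajectory shape (rectilinear, zigzag, helical).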

  18. Synthetic triphones from trajectory-based feature distributions

    CSIR Research Space (South Africa)

    Badenhorst, J

    2015-11-01

    …we reconstruct models for unseen transitions. In the current study, we restrict ourselves to triphone modelling, and aim to generate synthetic triphones from seen diphones. If this is possible, the same approach should be applicable to larger contexts… are applied in a similar fashion. Using trajectory models for the same goal builds on prior work analysing co-articulation trajectories [7], [8], [9] as well as various studies on trajectory modelling for ASR purposes [10], [11], [12], [13]. Particularly…

  19. Three-dimensional Reconstruction of Dust Particle Trajectories in the NSTX

    International Nuclear Information System (INIS)

    Boeglin, W.U.; Roquemore, A.L.; Maqueda, R.

    2009-01-01

    Highly mobile incandescent dust particles are routinely observed on NSTX using two fast cameras operating in the visible region. An analysis method to reconstruct dust particle trajectories in space using the two fast cameras is presented in this paper. Position accuracies of a few millimeters, depending on the particle's location, have been achieved, and particle velocities between 10 and 200 m/s have been observed.

  20. Post-flight trajectory reconstruction of suborbital free-flyers using GPS raw data

    Science.gov (United States)

    Ivchenko, N.; Yuan, Y.; Linden, E.

    2017-08-01

    This paper describes the reconstruction of post-flight trajectories of suborbital free-flying units using logged GPS raw data. We treated the reconstruction as a global least squares optimization problem, using both the pseudo-range and Doppler observables, and solved it using the trust-region-reflective algorithm, which enabled navigational solutions of high accuracy. The code tracking was implemented with a large number of correlators and least squares curve fitting, in order to improve the precision of the code start times, while a more conventional phase-locked loop was used for Doppler tracking. We proposed a weighting scheme to account for fast signal strength variation due to the fast rotation of the free-flyers, and a penalty on jerk to achieve a smooth solution. We applied these methods to flight data from two suborbital free-flying units launched on the REXUS 12 sounding rocket, reconstructing the trajectory, receiver clock error and wind-up rates. The trajectory exhibits a parabola with apogee around 80 km, and the velocity profile shows the details of payload wobbling. The wind-up rates obtained match the measurements from onboard angular rate sensors.
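The jerk penalty described above can be demonstrated on a toy 1D problem: fit a position series to noisy observations while penalizing the third finite difference, solved as one linear least squares system. The real reconstruction uses pseudo-range and Doppler observation models with per-sample weights; here the observation model is the identity and all numbers are synthetic:

```python
import numpy as np

n = 100
t = np.linspace(0.0, 10.0, n)
truth = 80000.0 - 0.5 * 9.81 * (t - 5.0) ** 2   # smooth parabolic arc (m)

rng = np.random.default_rng(1)
y = truth + rng.normal(0.0, 5.0, n)             # noisy "position" observations

# third-difference (jerk) operator on the sample grid
D = np.zeros((n - 3, n))
for i in range(n - 3):
    D[i, i:i + 4] = [-1.0, 3.0, -3.0, 1.0]

# minimize ||x - y||^2 + lam * ||D x||^2  =>  (I + lam D^T D) x = y
lam = 1e4
x = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

rms_raw = np.sqrt(np.mean((y - truth) ** 2))
rms_smooth = np.sqrt(np.mean((x - truth) ** 2))
```

Because the third difference of a parabola is zero, the penalty suppresses noise without biasing the underlying ballistic arc.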

  1. Post-flight trajectory reconstruction of suborbital free-flyers using GPS raw data

    Directory of Open Access Journals (Sweden)

    Ivchenko N.

    2017-08-01

    This paper describes the reconstruction of post-flight trajectories of suborbital free-flying units using logged GPS raw data. We treated the reconstruction as a global least squares optimization problem, using both the pseudo-range and Doppler observables, and solved it using the trust-region-reflective algorithm, which enabled navigational solutions of high accuracy. The code tracking was implemented with a large number of correlators and least squares curve fitting, in order to improve the precision of the code start times, while a more conventional phase-locked loop was used for Doppler tracking. We proposed a weighting scheme to account for fast signal strength variation due to the fast rotation of the free-flyers, and a penalty on jerk to achieve a smooth solution. We applied these methods to flight data from two suborbital free-flying units launched on the REXUS 12 sounding rocket, reconstructing the trajectory, receiver clock error and wind-up rates. The trajectory exhibits a parabola with apogee around 80 km, and the velocity profile shows the details of payload wobbling. The wind-up rates obtained match the measurements from onboard angular rate sensors.

  2. LINEAR LATTICE AND TRAJECTORY RECONSTRUCTION AND CORRECTION AT FAST LINEAR ACCELERATOR

    Energy Technology Data Exchange (ETDEWEB)

    Romanov, A. [Fermilab; Edstrom, D. [Fermilab; Halavanau, A. [Northern Illinois U.

    2017-07-16

    The low-energy part of the FAST linear accelerator, based on 1.3 GHz superconducting RF cavities, was successfully commissioned [1]. During commissioning, beam-based, model-dependent methods were used to correct the linear lattice and trajectory. The lattice correction algorithm is based on the analysis of beam shapes from profile monitors and of trajectory responses to dipole correctors. Trajectory responses to field gradient variations in quadrupoles and phase variations in superconducting RF cavities were used to correct bunch offsets in quadrupoles and accelerating cavities relative to their magnetic axes. Details of the methods used and experimental results are presented.
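The core of response-based trajectory correction can be sketched as a least squares fit: given an orbit response matrix R (BPM readings per unit corrector kick), choose kicks that reproduce the measured trajectory, then apply their negation. All matrices below are synthetic stand-ins, not the FAST optics model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_bpm, n_corr = 8, 4

# orbit response matrix: trajectory reading at each BPM per unit kick
R = rng.normal(0.0, 1.0, (n_bpm, n_corr))

# synthetic "machine": an unknown set of kicks produces the measured orbit
kicks_true = np.array([0.3, -0.1, 0.2, 0.05])
orbit = R @ kicks_true

# least squares inversion of the response matrix
kicks, *_ = np.linalg.lstsq(R, orbit, rcond=None)
residual = np.linalg.norm(orbit - R @ kicks)

correction = -kicks   # kicks to apply to flatten the trajectory
```

In practice R is measured (or computed from the lattice model), is noisy, and is often regularized (e.g. truncated SVD) before inversion.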

  3. Compressed sensing reconstruction of cardiac cine MRI using golden angle spiral trajectories.

    Science.gov (United States)

    Tolouee, Azar; Alirezaie, Javad; Babyn, Paul

    2015-11-01

    In dynamic cardiac cine Magnetic Resonance Imaging (MRI), the spatiotemporal resolution is limited by the low imaging speed. Compressed sensing (CS) theory has been applied to improve the imaging speed and thus the spatiotemporal resolution. The purpose of this paper is to improve CS reconstruction of undersampled data by exploiting spatiotemporal sparsity and efficient spiral trajectories. We extend the k-t sparse algorithm to spiral trajectories to achieve high spatiotemporal resolution in cardiac cine imaging. We have exploited the spatiotemporal sparsity of cardiac cine MRI by applying a 2D+time wavelet-Fourier transform. For efficient coverage of k-space, we have used a modified version of multi-shot (interleaved) spiral trajectories. In order to reduce incoherent aliasing artifacts, we use a different random undersampling pattern for each temporal frame. Finally, we have used the nonuniform fast Fourier transform (NUFFT) algorithm to reconstruct the image from the non-uniformly acquired samples. The proposed approach was tested on simulated and cardiac cine MRI data. Results show that higher acceleration factors with improved image quality can be obtained with the proposed approach in comparison to the existing state-of-the-art method. The flexibility of the introduced method should allow it to be used not only for the challenging case of cardiac imaging, but also for other applications where the patient moves or breathes during acquisition. Copyright © 2015 Elsevier Inc. All rights reserved.
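The CS ingredients above (random undersampling plus a sparsity prior) can be demonstrated in miniature with iterative soft-thresholding (ISTA). This sketch recovers a 1D signal, sparse in its own domain, from a random Fourier mask; far simpler than spiral k-t sampling with a wavelet-Fourier prior, but mechanically the same:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 128
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0.0, 1.0, 5) + 2.0

mask = rng.random(n) < 0.4                   # keep ~40% of k-space
b = mask * np.fft.fft(x_true) / np.sqrt(n)   # undersampled measurements

def A(x):   # forward operator: normalized FFT, then mask
    return mask * np.fft.fft(x) / np.sqrt(n)

def At(k):  # adjoint: mask, then normalized inverse FFT
    return np.real(np.fft.ifft(mask * k) * np.sqrt(n))

# ISTA: gradient step on the data term, then soft-threshold (sparsity prior)
x = np.zeros(n)
tau = 0.05
for _ in range(300):
    g = x + At(b - A(x))                               # step size 1 (||A||=1)
    x = np.sign(g) * np.maximum(np.abs(g) - tau, 0.0)  # soft threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

A cine reconstruction replaces the FFT/mask pair with a NUFFT over spiral interleaves and thresholds 2D+time wavelet-Fourier coefficients instead of the signal itself.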

  4. On the possibility of the geometrical reconstruction of the charged particle trajectories in the streamer chamber

    International Nuclear Information System (INIS)

    Constantin, F.; Jipa, A.; Ilie, Gh.

    1998-01-01

    An interesting problem in experiments using visualisation detectors is that of the geometrical reconstruction of the trajectories. In this work a new method for the geometrical reconstruction of the trajectories of charged particles produced in nucleus-nucleus collisions at 4.5 A GeV/c is proposed. The experiments have been performed at the JINR Synchrophasotron, in the frame of the SKM 200 Collaboration. The geometrical reconstruction method is based on the facilities offered by the Sun3VME-MaxVideo20 workstation, a real-time image processing machine produced by DataCube Corporation. An algorithm is constructed taking into account some relevant characteristics of the pictures. For a typical picture, the centre, a very noisy region, is the starting point for all main tracks (the vertex); its poor contrast makes track identification difficult. Surrounding this first region there is an almost circular belt with better contrast and without overlapping tracks. Finally, the third region, the outer one, is the origin of the secondary tracks and is also noisy. The secondary tracks identify particles created in the chamber far from the vertex; secondary particle creation induces large noise into the image and reduces sharpness. The areas of these three regions vary from one picture to another, their fractions amounting to around 20%, 50%, and 30%, respectively. The algorithm treats the primary tracks only. It takes advantage of the well-defined geometrical vertex position. The primary tracks represent curved trajectories of charged particles moving in a magnetic field. As curved tracks are harder to identify than straight lines, we propose a conformal transformation from the plane z = x + iy to the plane w = u + iv related by z = a²/w. It transforms circles passing through the origin in the z plane into straight lines in the w plane. The a² factor is a constant which must be determined. Practically, we transform a discrete image by
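The stated mapping z = a²/w (equivalently w = a²/z) can be checked numerically: sampling a circle of radius r centred at (r, 0), which passes through the origin, the images all share the same real part a²/(2r), i.e. they lie on a vertical line. The constants below are arbitrary test values:

```python
import cmath

a2 = 4.0   # the constant a^2 (arbitrary test value)
r = 2.0    # circle radius; the circle is centred at z = r, so it passes through 0

ws = []
for k in range(1, 12):
    theta = 0.5 * k                      # avoid theta = 0, where z = 0 exactly
    z = r + r * cmath.exp(1j * theta)    # point on the circle through the origin
    ws.append(a2 / z)                    # image under w = a^2 / z

# all images should lie on the vertical line Re(w) = a^2 / (2 r)
spread = max(w.real for w in ws) - min(w.real for w in ws)
```

This is why the transform simplifies track finding: circular tracks through the vertex become straight lines, which are cheap to detect (e.g. with a Hough transform).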

  5. Fast reconstruction of trajectories of charged muons recorded by the MUCH detector in the CBM experiment

    International Nuclear Information System (INIS)

    Ablyazimov, T.O.; Ivanov, V.V.

    2017-01-01

    The CBM experiment is currently being developed at GSI (Darmstadt, Germany) at the FAIR accelerator complex by an international collaboration that includes JINR. One of the main goals of the experiment is the study of the charmonium production process in nucleus-nucleus collisions at high energies. The registration of decays such as J/ψ → μ⁺μ⁻ is planned to be carried out in real time. This paper presents an algorithm for the fast reconstruction of the trajectories of charged muons from J/ψ decays recorded by the MUCH detector.

  6. Trajectory Design to Benefit Trajectory-Based Surface Operations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Trajectory-based operations constitute a key mechanism considered by the Joint Planning and Development Office (JPDO) for managing traffic in high-density or...

  7. Trajectory Design to Benefit Trajectory-Based Surface Operations, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Trajectory-based operations constitute a key mechanism considered by the Joint Planning and Development Office (JPDO) for managing traffic in high-density or...

  8. Trajectory Reconstruction and Uncertainty Analysis Using Mars Science Laboratory Pre-Flight Scale Model Aeroballistic Testing

    Science.gov (United States)

    Lugo, Rafael A.; Tolson, Robert H.; Schoenenberger, Mark

    2013-01-01

    As part of the Mars Science Laboratory (MSL) trajectory reconstruction effort at NASA Langley Research Center, free-flight aeroballistic experiments with instrumented MSL scale models were conducted at Aberdeen Proving Ground in Maryland. The models carried an inertial measurement unit (IMU) and a flush air data system (FADS) similar to the MSL Entry Atmospheric Data System (MEADS), providing data types similar to those from the MSL entry. Multiple sources of redundant data were available, including tracking radar and on-board magnetometers. These experimental data enabled the testing and validation of the tools and methodologies that will be used for MSL trajectory reconstruction. The aerodynamic parameters Mach number, angle of attack, and sideslip angle were estimated using minimum variance with a priori to combine the pressure data with pre-flight computational fluid dynamics (CFD) data. Linear and non-linear pressure model terms were also estimated for each pressure transducer as a measure of the errors introduced by CFD and transducer calibration. Parameter uncertainties were estimated using a "consider parameters" approach.
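The "minimum variance with a priori" scheme mentioned above can be sketched as a weighted least-squares estimate that blends a prior state with measurements; the one-dimensional numbers below are illustrative, not MSL data:

```python
import numpy as np

# Hedged sketch of minimum-variance estimation with a priori information:
# combine measurements y = H x + noise (covariance R) with a prior state
# x0 (covariance P0). Function name and example values are invented.
def min_variance_with_prior(y, H, R, x0, P0):
    Ri = np.linalg.inv(R)
    P0i = np.linalg.inv(P0)
    P = np.linalg.inv(H.T @ Ri @ H + P0i)   # posterior covariance
    x = P @ (H.T @ Ri @ y + P0i @ x0)       # posterior (minimum-variance) estimate
    return x, P

# 1-D example: prior 10 (variance 4) combined with a direct measurement
# 12 (variance 1). The estimate is pulled toward the better-known value.
x, P = min_variance_with_prior(
    y=np.array([12.0]), H=np.eye(1), R=np.array([[1.0]]),
    x0=np.array([10.0]), P0=np.array([[4.0]]))
print(x[0], P[0, 0])  # estimate 11.6 with variance 0.8
```

The posterior variance is always smaller than either the prior or measurement variance alone, which is the point of fusing the pressure data with the pre-flight CFD prior.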

  9. GPU based Monte Carlo for PET image reconstruction: detector modeling

    International Nuclear Information System (INIS)

    Légrády; Cserkaszky, Á; Lantos, J.; Patay, G.; Bükki, T.

    2011-01-01

    Given the similarities between visible light transport and neutral particle trajectories, Graphical Processing Units (GPUs) are almost like dedicated hardware designed for Monte Carlo (MC) particle-transport calculations. A GPU-based MC gamma transport code has been developed for iterative Positron Emission Tomography image reconstruction; at each iteration step it calculates the projection from unknowns to data, taking into account the full physics of the system. This paper describes the simplified scintillation detector modeling and its effect on convergence. (author)

  10. Model-based segmentation and classification of trajectories (Extended abstract)

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Buchin, K.; Buchin, M.; Sijben, S.; Westenberg, M.A.

    2014-01-01

    We present efficient algorithms for segmenting and classifying a trajectory based on a parameterized movement model like the Brownian bridge movement model. Segmentation is the problem of subdividing a trajectory into parts such that each part is homogeneous in its movement characteristics. We

  11. Underwater navigation using diffusion-based trajectory observers

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Opderbecke, Jan

    2007-01-01

    This paper addresses the issue of estimating underwater vehicle trajectories using gyro-Doppler (body-fixed velocities) and acoustic positioning signals (earth-fixed positions). The approach consists of diffusion-based observers processing a whole trajectory segment at a time, allowing the consid...

  12. Game of thrown bombs in 3D: using high speed cameras and photogrammetry techniques to reconstruct bomb trajectories at Stromboli (Italy)

    Science.gov (United States)

    Gaudin, D.; Taddeucci, J.; Scarlato, P.; Del Bello, E.; Houghton, B. F.; Orr, T. R.; Andronico, D.; Kueppers, U.

    2015-12-01

    Large juvenile bombs and lithic clasts, produced and ejected during explosive volcanic eruptions, follow ballistic trajectories. Of particular interest are: 1) the determination of ejection velocity and launch angle, which give insights into shallow conduit conditions and geometry; 2) particle trajectories, with an eye on trajectory evolution caused by collisions between bombs, as well as the interaction between bombs and ash/gas plumes; and 3) the computation of the final emplacement of bomb-sized clasts, which is important for hazard assessment and risk management. Ground-based imagery from a single camera only allows the reconstruction of bomb trajectories in a plane perpendicular to the line of sight, which may lead to underestimation of bomb velocities and does not allow the directionality of the ejections to be studied. To overcome this limitation, we adapted photogrammetry techniques to reconstruct 3D bomb trajectories from two or three synchronized high-speed video cameras. In particular, we modified existing algorithms to account for the errors that may arise from the very high velocity of the particles and the impossibility of measuring tie points close to the scene. Our method was tested during two field campaigns at Stromboli. In 2014, two high-speed cameras with a 500 Hz frame rate and a ~2 cm resolution were set up ~350 m from the crater, 10° apart and synchronized. The experiment was repeated with similar parameters in 2015, but using three high-speed cameras in order to significantly reduce uncertainties and allow their estimation. Trajectory analyses for tens of bombs at various times allowed for the identification of shifts in the mean directivity and dispersal angle of the jets during the explosions. These time evolutions are also visible on the permanent video-camera monitoring system, demonstrating the applicability of our method to all kinds of explosive volcanoes.
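A minimal sketch of the core geometric step, triangulating a 3D point from two or more synchronized camera sightlines, is given below; the camera positions and bomb location are invented for the example, and the authors' full photogrammetric calibration is not reproduced:

```python
import numpy as np

# Hedged sketch: least-squares triangulation of the point closest to a set
# of 3D rays (one sightline per camera). All geometry values are invented.
def triangulate(origins, directions):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        Pm = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += Pm
        b += Pm @ o
    return np.linalg.solve(A, b)          # minimizes summed distance² to rays

bomb = np.array([10.0, 5.0, 80.0])        # hypothetical bomb position (m)
cams = [np.array([0.0, 0.0, 0.0]), np.array([60.0, 0.0, 0.0])]
rays = [bomb - c for c in cams]           # noise-free sightlines toward it
estimate = triangulate(cams, rays)
print(estimate)                           # recovers ~[10, 5, 80]
```

With noisy, near-parallel rays (cameras only 10° apart, as in the 2014 setup) the normal matrix becomes ill-conditioned, which is one reason a third camera reduces uncertainty.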

  13. Young adults' trajectories of Ecstasy use: a population based study.

    Science.gov (United States)

    Smirnov, Andrew; Najman, Jake M; Hayatbakhsh, Reza; Plotnikova, Maria; Wells, Helene; Legosz, Margot; Kemp, Robert

    2013-11-01

    Young adults' Ecstasy use trajectories have important implications for individual and population-level consequences of Ecstasy use, but little relevant research has been conducted. This study prospectively examines Ecstasy trajectories in a population-based sample. Data are from the Natural History Study of Drug Use, a retrospective/prospective cohort study conducted in Australia. Population screening identified a probability sample of Ecstasy users aged 19-23 years. Complete data for 30 months of follow-up, comprising 4 time intervals, were available for 297 participants (88.4% of sample). Trajectories were derived using cluster analysis based on recent Ecstasy use at each interval. Trajectory predictors were examined using a generalized ordered logit model and included Ecstasy dependence (World Mental Health Composite International Diagnostic Instrument), psychological distress (Hospital Anxiety Depression Scale), aggression (Young Adult Self Report) and contextual factors (e.g. attendance at electronic/dance music events). Three Ecstasy trajectories were identified (low, intermediate and high use). At its peak, the high-use trajectory involved 1-2 days Ecstasy use per week. Decreasing frequency of use was observed for intermediate and high-use trajectories from 12 months, independently of market factors. Intermediate and high-use trajectory membership was predicted by past Ecstasy consumption (>70 pills) and attendance at electronic/dance music events. High-use trajectory members were unlikely to have used Ecstasy for more than 3 years and tended to report consistently positive subjective effects at baseline. Given the social context and temporal course of Ecstasy use, Ecstasy trajectories might be better understood in terms of instrumental rather than addictive drug use patterns. © 2013 Elsevier Ltd. All rights reserved.

  14. A Framework for Autonomous Trajectory-Based Operations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed is a framework for autonomous Traffic Flow Management (TFM) under Trajectory Based Operations (TBO) for Unmanned Aerial Systems (UAS). The...

  15. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories.

    Directory of Open Access Journals (Sweden)

    Victor Hanson-Smith

    2016-07-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and "resurrect" (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server.

  16. Reconstructing the landing trajectory of the CE-3 lunar probe by using images from the landing camera

    International Nuclear Information System (INIS)

    Liu Jian-Jun; Yan Wei; Li Chun-Lai; Tan Xu; Ren Xin; Mu Ling-Li

    2014-01-01

    An accurate determination of the landing trajectory of Chang'e-3 (CE-3) is significant for verifying orbital control strategy, optimizing orbital planning, accurately determining the landing site of CE-3 and analyzing the geological background of the landing site. Due to complexities involved in the landing process, there are some differences between the planned trajectory and the actual trajectory of CE-3. The landing camera on CE-3 recorded a sequence of the landing process with a frequency of 10 frames per second. These images recorded by the landing camera and high-resolution images of the lunar surface are utilized to calculate the position of the probe, so as to reconstruct its precise trajectory. This paper proposes using the method of trajectory reconstruction by Single Image Space Resection to make a detailed study of the hovering stage at a height of 100 m above the lunar surface. Analysis of the data shows that the closer CE-3 came to the lunar surface, the higher the spatial resolution of images that were acquired became, and the more accurately the horizontal and vertical position of CE-3 could be determined. The horizontal and vertical accuracies were 7.09 m and 4.27 m respectively during the hovering stage at a height of 100.02 m. The reconstructed trajectory can reflect the change in CE-3's position during the powered descent process. A slight movement in CE-3 during the hovering stage is also clearly demonstrated. These results will provide a basis for analysis of orbit control strategy, and it will be conducive to adjustment and optimization of orbit control strategy in follow-up missions

  17. Region-of-interest reconstruction for a cone-beam dental CT with a circular trajectory

    International Nuclear Information System (INIS)

    Hu, Zhanli; Zou, Jing; Gui, Jianbao; Zheng, Hairong; Xia, Dan

    2013-01-01

    Dental CT is the most appropriate and accurate device for preoperative evaluation of dental implantation. It can demonstrate the quantity of bone in three dimensions (3D), the location of important adjacent anatomic structures, and the quality of available bone with minimal geometric distortion. Nevertheless, with the rapid increase in dental CT examinations, we face the problem of dose reduction without loss of image quality. In this work, the backprojection-filtration (BPF) and Feldkamp–Davis–Kress (FDK) algorithms were applied, by computer simulation, to reconstruct the 3D full image from complete circular cone-beam data and the region-of-interest (ROI) image from truncated data, respectively. In addition, the BPF algorithm was evaluated on 3D ROI-image reconstruction from real data acquired with our circular cone-beam prototype dental CT system. The results demonstrated that the quality of the ROI image reconstructed from truncated data with the BPF algorithm was comparable to that reconstructed from complete data, whereas the FDK algorithm created artifacts when reconstructing the ROI image. Hence, for circular cone-beam dental CT, reducing the scanning angular range with the BPF algorithm for ROI-image reconstruction helps to reduce the radiation dose and scanning time. Finally, an analytical method was developed to estimate the ROI projection area on the detector before a CT scan, which would help doctors roughly estimate the total radiation dose before the examination. -- Highlights: ► BPF algorithm applied to dental CT for the first time. ► A method developed to estimate the projection region before CT scanning. ► Rough prediction of the total radiation dose before CT scans. ► Potential reduction of imaging radiation dose, scatter, and scanning time

  18. VISUAL UAV TRAJECTORY PLAN SYSTEM BASED ON NETWORK MAP

    Directory of Open Access Journals (Sweden)

    X. L. Li

    2012-07-01

    The base map of UP-30, the software currently used for trajectory planning for Unmanned Aircraft Vehicles, is a vector diagram, and UP-30 requires navigation points to be drawn manually. During field operation, however, efficiency and the quality of the work suffer from insufficient information, screen reflection, inconvenient calculation, and other factors. If the work is done indoors, the effect of these external factors on the results is eliminated: users of the network earth service can browse free, high-definition satellite images of the world through a downloadable client, and can export high-resolution imagery in a standard file format. This brings unprecedented convenience to trajectory planning, but the images must first be processed by coordinate transformation and geometric correction. In addition, from the required mapping scale, the camera parameters, and the overlap degree, the exposure interval and the distance between adjacent flight lines can be calculated automatically, improving the degree of automation of data collection. The software judges the position of the next point from the intersection of the trajectory with the survey area and fixes each point according to the line spacing; points can also be adjusted manually, so trajectory planning is both automatic and flexible. For safety, the data are used for flying only after a simulated flight. Finally, all of the data can be exported with a single key.

  19. Visual Uav Trajectory Plan System Based on Network Map

    Science.gov (United States)

    Li, X. L.; Lin, Z. J.; Su, G. Z.; Wu, B. Y.

    2012-07-01

    The base map of UP-30, the software currently used for trajectory planning for Unmanned Aircraft Vehicles, is a vector diagram, and UP-30 requires navigation points to be drawn manually. During field operation, however, efficiency and the quality of the work suffer from insufficient information, screen reflection, inconvenient calculation, and other factors. If the work is done indoors, the effect of these external factors on the results is eliminated: users of the network earth service can browse free, high-definition satellite images of the world through a downloadable client, and can export high-resolution imagery in a standard file format. This brings unprecedented convenience to trajectory planning, but the images must first be processed by coordinate transformation and geometric correction. In addition, from the required mapping scale, the camera parameters, and the overlap degree, the exposure interval and the distance between adjacent flight lines can be calculated automatically, improving the degree of automation of data collection. The software judges the position of the next point from the intersection of the trajectory with the survey area and fixes each point according to the line spacing; points can also be adjusted manually, so trajectory planning is both automatic and flexible. For safety, the data are used for flying only after a simulated flight. Finally, all of the data can be exported with a single key.
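The exposure-interval and line-spacing arithmetic described above can be sketched as follows; the function name, the ground-sample-distance value, and the overlap fractions are illustrative assumptions, not UP-30 internals:

```python
# Hedged sketch of standard flight-planning arithmetic: given the ground
# sample distance (m/pixel), sensor size in pixels, and the required
# forward/side overlap fractions, compute the distance between exposures
# and the spacing between adjacent flight lines. All values are invented.
def flight_plan(gsd_m, img_w_px, img_h_px, forward_overlap, side_overlap):
    ground_w = img_w_px * gsd_m                 # across-track footprint (m)
    ground_h = img_h_px * gsd_m                 # along-track footprint (m)
    base = ground_h * (1.0 - forward_overlap)   # distance between exposures
    spacing = ground_w * (1.0 - side_overlap)   # distance between lines
    return base, spacing

# 5 cm GSD, 6000 x 4000 px sensor, 80% forward and 60% side overlap.
base, spacing = flight_plan(0.05, 6000, 4000, 0.8, 0.6)
print(base, spacing)  # ~40 m between exposures, ~120 m between lines
```

Larger overlaps shrink both spacings, so the same formulas also bound the number of exposures and flight lines needed to cover a survey area.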

  20. Adaptive density trajectory cluster based on time and space distance

    Science.gov (United States)

    Liu, Fagui; Zhang, Zhijie

    2017-10-01

    Several open problems remain in trajectory clustering for discovering regularities in mobile behavior, such as computing the distance between sub-trajectories, setting the parameter values of the clustering algorithm, and handling the uncertainty/boundary problem of the data set. Accordingly, this paper defines a method for calculating the distance between sub-trajectories based on time and space. The significance of this distance calculation is that it clearly reveals the differences between moving trajectories and improves the accuracy of the clustering algorithm. In addition, a novel adaptive density trajectory clustering algorithm is proposed, in which the cluster radius is computed from the density of the data distribution, the cluster centers and their number are selected automatically by a given strategy, and the uncertainty/boundary problem of the data set is solved by a designed weighted rough c-means. Experimental results demonstrate that the proposed algorithm performs fuzzy trajectory clustering effectively on the basis of the time and space distance, and adaptively obtains optimal cluster centers and rich clustering information for mining the features of mobile behavior in mobile and social networks.
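A hypothetical combined time-and-space distance between sub-trajectories, in the spirit of (but not identical to) the measure the paper defines, might look like this:

```python
import numpy as np

# Illustrative sketch only: a weighted sum of mean spatial separation and
# endpoint time differences between two sub-trajectories sampled at the
# same number of points. The weights and the formula are assumptions,
# not the paper's exact definition.
def st_distance(traj_a, traj_b, w_space=1.0, w_time=1.0):
    """Each trajectory is a sequence of (t, x, y) rows of equal length."""
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    d_space = np.mean(np.linalg.norm(a[:, 1:] - b[:, 1:], axis=1))
    d_time = abs(a[0, 0] - b[0, 0]) + abs(a[-1, 0] - b[-1, 0])
    return w_space * d_space + w_time * d_time

# Two parallel tracks 3 m apart, recorded over the same time window.
a = [(0, 0.0, 0.0), (1, 1.0, 0.0), (2, 2.0, 0.0)]
b = [(0, 0.0, 3.0), (1, 1.0, 3.0), (2, 2.0, 3.0)]
d = st_distance(a, b)
print(d)  # 3.0: purely spatial separation, no temporal offset
```

Such a distance feeds directly into a density-based clustering step, where the cluster radius can then be chosen from the distribution of pairwise distances rather than fixed by hand.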

  1. Dynamic trajectory-based couch motion for improvement of radiation therapy trajectories in cranial SRT

    Energy Technology Data Exchange (ETDEWEB)

    MacDonald, R. Lee [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, Nova Scotia B3H 4R2 (Canada); Thomas, Christopher G., E-mail: Chris.Thomas@cdha.nshealth.ca [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, Nova Scotia B3H 4R2 (Canada); Department of Medical Physics, Nova Scotia Cancer Centre, Queen Elizabeth II Health Sciences Centre, Halifax, Nova Scotia B3H 1V7 (Canada); Department of Radiation Oncology, Dalhousie University, Halifax, Nova Scotia B3H 4R2 (Canada); Department of Radiology, Dalhousie University, Halifax, Nova Scotia B3H 4R2 (Canada)

    2015-05-15

    Purpose: To investigate potential improvement in external beam stereotactic radiation therapy plan quality for cranial cases using an optimized dynamic gantry and patient support couch motion trajectory, which could minimize exposure to sensitive healthy tissue. Methods: Anonymized patient anatomy and treatment plans of cranial cancer patients were used to quantify the geometric overlap between planning target volumes and organs-at-risk (OARs) based on their two-dimensional projection from source to a plane at isocenter as a function of gantry and couch angle. Published dose constraints were then used as weighting factors for the OARs to generate a map of couch-gantry coordinate space, indicating the degree of overlap at each point in that space. A couch-gantry collision space was generated by direct measurement on a linear accelerator and couch using an anthropomorphic solid-water phantom. A dynamic, fully customizable algorithm was written to generate a navigable ideal trajectory for the patient-specific couch-gantry space. The advanced algorithm can be used to balance the implementation of absolute minimum values of overlap with the clinical practicality of large-scale couch motion and delivery time. Optimized cranial cancer treatment trajectories were compared to conventional treatment trajectories. Results: Comparison of optimized treatment trajectories with conventional treatment trajectories indicated an average decrease in mean dose to the OARs of 19% and an average decrease in maximum dose to the OARs of 12%. Degradation was seen for homogeneity index (from 6.14% ± 0.67% to 5.48% ± 0.76%) and conformation number (from 0.82 ± 0.02 to 0.79 ± 0.02), but neither change was statistically significant. Removal of OAR constraints from volumetric modulated arc therapy optimization reveals that the reduction in dose to OARs is almost exclusively due to the optimized trajectory and not the OAR constraints. Conclusions: The authors’ study indicated that simultaneous couch and gantry motion

  2. Dynamic trajectory-based couch motion for improvement of radiation therapy trajectories in cranial SRT

    International Nuclear Information System (INIS)

    MacDonald, R. Lee; Thomas, Christopher G.

    2015-01-01

    Purpose: To investigate potential improvement in external beam stereotactic radiation therapy plan quality for cranial cases using an optimized dynamic gantry and patient support couch motion trajectory, which could minimize exposure to sensitive healthy tissue. Methods: Anonymized patient anatomy and treatment plans of cranial cancer patients were used to quantify the geometric overlap between planning target volumes and organs-at-risk (OARs) based on their two-dimensional projection from source to a plane at isocenter as a function of gantry and couch angle. Published dose constraints were then used as weighting factors for the OARs to generate a map of couch-gantry coordinate space, indicating the degree of overlap at each point in that space. A couch-gantry collision space was generated by direct measurement on a linear accelerator and couch using an anthropomorphic solid-water phantom. A dynamic, fully customizable algorithm was written to generate a navigable ideal trajectory for the patient-specific couch-gantry space. The advanced algorithm can be used to balance the implementation of absolute minimum values of overlap with the clinical practicality of large-scale couch motion and delivery time. Optimized cranial cancer treatment trajectories were compared to conventional treatment trajectories. Results: Comparison of optimized treatment trajectories with conventional treatment trajectories indicated an average decrease in mean dose to the OARs of 19% and an average decrease in maximum dose to the OARs of 12%. Degradation was seen for homogeneity index (from 6.14% ± 0.67% to 5.48% ± 0.76%) and conformation number (from 0.82 ± 0.02 to 0.79 ± 0.02), but neither change was statistically significant. Removal of OAR constraints from volumetric modulated arc therapy optimization reveals that the reduction in dose to OARs is almost exclusively due to the optimized trajectory and not the OAR constraints. Conclusions: The authors’ study indicated that simultaneous couch and gantry motion

  3. Evidence-Based ACL Reconstruction

    Directory of Open Access Journals (Sweden)

    E. Carlos RODRIGUEZ-MERCHAN

    2015-01-01

    There is controversy in the literature regarding a number of topics related to anterior cruciate ligament (ACL) reconstruction. The purpose of this article is to answer the following questions: (1) bone-patellar tendon-bone (BPTB) reconstruction or hamstring reconstruction (HR); (2) double bundle or single bundle; (3) allograft or autograft; (4) early or late reconstruction; (5) rate of return to sports after ACL reconstruction; (6) rate of osteoarthritis after ACL reconstruction. A Cochrane Library and PubMed (MEDLINE) search of systematic reviews and meta-analyses related to ACL reconstruction was performed. The key words were: ACL reconstruction, systematic reviews and meta-analysis. The main criterion for selection was that the articles were systematic reviews and meta-analyses focused on the aforementioned questions. Sixty-nine articles were found, but only 26 were selected and reviewed because they had a high grade (I-II) of evidence. BPTB reconstruction was associated with better postoperative knee stability but with a higher rate of morbidity. However, the results of both procedures in terms of long-term functional outcome were similar. The double-bundle ACL reconstruction technique showed better outcomes in rotational laxity, although functional recovery was similar between single-bundle and double-bundle. Autograft yielded better results than allograft. There was no difference between early and delayed reconstruction. 82% of patients were able to return to some kind of sport participation. 28% of patients presented radiological signs of osteoarthritis with a minimum follow-up of 10 years.

  4. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Since road accident analyses and reconstructions are increasingly based on specific computer software for the simulation of vehicle driving dynamics and collision dynamics, and for the simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, such computer software, when properly used, can provide a more authentic and more trustworthy accident reconstruction. Practical experiences using computer software tools for road accident reconstruction, obtained in the Transport Safety Laboratory at the Faculty of Maritime Studies and Transport of the University of Ljubljana, are therefore presented and discussed. The paper also addresses software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police at the road accident scene defined by this technology.

  5. Diffusion-Based Trajectory Observers with Variance Constraints

    DEFF Research Database (Denmark)

    Alcocer, Alex; Jouffroy, Jerome; Oliveira, Paulo

    Diffusion-based trajectory observers have been recently proposed as a simple and efficient framework to solve diverse smoothing problems in underwater navigation. For instance, to obtain estimates of the trajectories of an underwater vehicle given position fixes from an acoustic positioning system...... of smoothing and is determined by resorting to trial and error. This paper presents a methodology to choose the observer gain by taking into account a priori information on the variance of the position measurement errors. Experimental results with data from an acoustic positioning system are presented...

  6. Trajectory data privacy protection based on differential privacy mechanism

    Science.gov (United States)

    Gu, Ke; Yang, Lihao; Liu, Yongzhi; Liao, Niandong

    2018-05-01

    In this paper, we propose a trajectory data privacy protection scheme based on a differential privacy mechanism. In the proposed scheme, the algorithm first selects the points to be protected from the user’s trajectory data; second, it forms a polygon from each protected point and the adjacent, frequently accessed points selected from the accessing-point database, and calculates the polygon centroids; finally, noise is added to the polygon centroids by the differential privacy method, the noisy centroids replace the protected points, and the algorithm constructs and issues the new trajectory data. Experiments show that the proposed algorithms run quickly, the privacy protection of the scheme is effective, and the data usability of the scheme is high.
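The noise-addition step can be sketched with the standard Laplace mechanism (scale = sensitivity/ε); the polygon, the ε and sensitivity values, and the function name are illustrative, and the point-selection logic of the scheme is not reproduced:

```python
import numpy as np

# Hedged sketch: perturb a polygon centroid with Laplace noise, the
# standard differential-privacy mechanism. Epsilon, sensitivity, and the
# polygon below are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)

def private_centroid(points, epsilon, sensitivity):
    centroid = np.mean(np.asarray(points, dtype=float), axis=0)
    scale = sensitivity / epsilon            # Laplace scale parameter
    noise = rng.laplace(loc=0.0, scale=scale, size=centroid.shape)
    return centroid + noise                  # released in place of the point

polygon = [(0, 0), (4, 0), (4, 4), (0, 4)]   # polygon around a protected point
noisy = private_centroid(polygon, epsilon=1.0, sensitivity=0.5)
print(noisy)  # centroid (2, 2) plus Laplace noise
```

Smaller ε means stronger privacy but larger perturbation, which is exactly the privacy/usability trade-off the experiments evaluate.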

  7. Using machine learning and surface reconstruction to accurately differentiate different trajectories of mood and energy dysregulation in youth.

    Science.gov (United States)

    Versace, Amelia; Sharma, Vinod; Bertocci, Michele A; Bebko, Genna; Iyengar, Satish; Dwojak, Amanda; Bonar, Lisa; Perlman, Susan B; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Diwadkar, Vaibhav A; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Frazier, Thomas W; Arnold, L Eugene; Fristad, Mary A; Youngstrom, Eric A; Horwitz, Sarah M; Findling, Robert L; Phillips, Mary L

    2017-01-01

    Difficulty regulating positive mood and energy is a feature that cuts across different pediatric psychiatric disorders. Yet, little is known regarding the neural mechanisms underlying different developmental trajectories of positive mood and energy regulation in youth. Recent studies indicate that machine learning techniques can help elucidate the role of neuroimaging measures in classifying individual subjects by specific symptom trajectory. Cortical thickness measures were extracted in sixty-eight anatomical regions covering the entire brain in 115 participants from the Longitudinal Assessment of Manic Symptoms (LAMS) study and 31 healthy comparison youth (12.5 y/o; Male/Female = 15/16; IQ = 104; Right/Left handedness = 24/5). Using a combination of trajectory analyses, surface reconstruction, and machine learning techniques, the present study aims to identify the extent to which measures of cortical thickness can accurately distinguish youth with higher (n = 18) from those with lower (n = 34) trajectories of manic-like behaviors in a large sample of LAMS youth (n = 115; 13.6 y/o; M/F = 68/47; IQ = 100.1; R/L = 108/7). Machine learning analyses revealed that widespread cortical thickening in portions of the left dorsolateral prefrontal cortex, right inferior and middle temporal gyrus, bilateral precuneus, and bilateral paracentral gyri, and cortical thinning in portions of the right dorsolateral prefrontal cortex, left ventrolateral prefrontal cortex, and right parahippocampal gyrus, accurately differentiated (Area Under Curve = 0.89; p = 0.03) youth with different (higher vs lower) trajectories of positive mood and energy dysregulation over a period of up to 5 years, as measured by the Parent General Behavior Inventory 10-Item Mania Scale. Our findings suggest that specific patterns of cortical thickness may reflect transdiagnostic neural mechanisms associated with different temporal trajectories of positive mood and energy dysregulation in youth. This approach has

  8. Using machine learning and surface reconstruction to accurately differentiate different trajectories of mood and energy dysregulation in youth.

    Directory of Open Access Journals (Sweden)

    Amelia Versace

    Full Text Available Difficulty regulating positive mood and energy is a feature that cuts across different pediatric psychiatric disorders. Yet, little is known regarding the neural mechanisms underlying different developmental trajectories of positive mood and energy regulation in youth. Recent studies indicate that machine learning techniques can help elucidate the role of neuroimaging measures in classifying individual subjects by specific symptom trajectory. Cortical thickness measures were extracted in sixty-eight anatomical regions covering the entire brain in 115 participants from the Longitudinal Assessment of Manic Symptoms (LAMS) study and 31 healthy comparison youth (12.5 y/o; Male/Female = 15/16; IQ = 104; Right/Left handedness = 24/5). Using a combination of trajectory analyses, surface reconstruction, and machine learning techniques, the present study aims to identify the extent to which measures of cortical thickness can accurately distinguish youth with higher (n = 18) from those with lower (n = 34) trajectories of manic-like behaviors in a large sample of LAMS youth (n = 115; 13.6 y/o; M/F = 68/47; IQ = 100.1; R/L = 108/7). Machine learning analyses revealed that widespread cortical thickening in portions of the left dorsolateral prefrontal cortex, right inferior and middle temporal gyrus, bilateral precuneus, and bilateral paracentral gyri, and cortical thinning in portions of the right dorsolateral prefrontal cortex, left ventrolateral prefrontal cortex, and right parahippocampal gyrus, accurately differentiate (Area Under Curve = 0.89; p = 0.03) youth with different (higher vs. lower) trajectories of positive mood and energy dysregulation over a period of up to 5 years, as measured by the Parent General Behavior Inventory-10 Item Mania Scale. Our findings suggest that specific patterns of cortical thickness may reflect transdiagnostic neural mechanisms associated with different temporal trajectories of positive mood and energy dysregulation in youth. This

  9. Human action recognition using trajectory-based representation

    Directory of Open Access Journals (Sweden)

    Haiam A. Abdul-Azim

    2015-07-01

    Full Text Available Recognizing human actions in video sequences has been a challenging problem in the last few years due to its real-world applications. Many action representation approaches have been proposed to improve action recognition performance. Despite the popularity of local feature-based approaches together with the “Bag-of-Words” model for action representation, they fail to capture adequate spatial or temporal relationships. In an attempt to overcome this problem, trajectory-based local representation approaches have been proposed to capture the temporal information. This paper introduces an improvement of trajectory-based human action recognition approaches to capture discriminative temporal relationships. In our approach, we extract trajectories by tracking the detected spatio-temporal interest points, named “cuboid features”, by matching their SIFT descriptors over consecutive frames. We also propose a linking and exploring method to obtain efficient trajectories for motion representation in realistic conditions. Then the volumes around the trajectories’ points are described to represent human actions based on the Bag-of-Words (BOW) model. Finally, a support vector machine is used to classify human actions. The effectiveness of the proposed approach was evaluated on three popular datasets (KTH, Weizmann and UCF sports). Experimental results showed that the proposed approach yields considerable performance improvement over state-of-the-art approaches.
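
The pipeline this abstract describes — quantize local trajectory descriptors against a learned vocabulary, pool them into a Bag-of-Words histogram, then classify — can be sketched in plain NumPy. This is an illustrative toy, not the authors' code: synthetic 2-D descriptors stand in for real cuboid/SIFT trajectory descriptors, and a nearest-template comparison stands in for the SVM.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    # Plain Lloyd's algorithm: cluster local descriptors into a visual vocabulary.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def bow_histogram(descriptors, vocab):
    # Quantize each descriptor to its nearest visual word, then represent
    # the whole video as a normalized word histogram.
    words = np.argmin(((descriptors[:, None] - vocab[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(vocab)).astype(float)
    return hist / hist.sum()

# Toy data: two "action classes" whose trajectory descriptors live in
# different regions of a 2-D descriptor space.
class_a = rng.normal(0.0, 0.3, size=(40, 2))
class_b = rng.normal(2.0, 0.3, size=(40, 2))
vocab = kmeans(np.vstack([class_a, class_b]), k=4)

h_a = bow_histogram(class_a, vocab)
h_b = bow_histogram(class_b, vocab)

# A query video drawn from class A should be closer (in histogram space)
# to the class-A template than to the class-B template.
query = bow_histogram(rng.normal(0.0, 0.3, size=(15, 2)), vocab)
pred = "A" if np.linalg.norm(query - h_a) < np.linalg.norm(query - h_b) else "B"
print(pred)
```

In the real approach, the vocabulary would be learned from training-video trajectory descriptors and the histograms fed to a support vector machine.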

  10. A new circulation type classification based upon Lagrangian air trajectories

    Directory of Open Access Journals (Sweden)

    Alexandre M. Ramos

    2014-10-01

    Full Text Available A new classification method of the large-scale circulation characteristic for a specific target area (NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area, calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories into up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which is missing in an Eulerian approach. The new method is climatologically evaluated for the five-year time period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is briefly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and the precipitation in the target area is studied.
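
One of the trajectory measures mentioned above — the cyclonic vs. anticyclonic sense of the flow — is easy to sketch under a flat, small-area approximation of the lon/lat plane. The function name and sign convention below are illustrative assumptions, not FLEXPART code:

```python
import numpy as np

def flow_sense(lon, lat):
    # Signed turning of a trajectory: sum of 2-D cross products of successive
    # displacement vectors. Positive total => counterclockwise turning
    # (cyclonic in the Northern Hemisphere); negative => anticyclonic.
    # Flat-plane approximation; valid only over small areas.
    pts = np.column_stack([lon, lat])
    d = np.diff(pts, axis=0)
    cross = d[:-1, 0] * d[1:, 1] - d[:-1, 1] * d[1:, 0]
    return cross.sum()

# Toy trajectories: a counterclockwise arc and a clockwise arc.
t = np.linspace(0, np.pi, 50)
ccw = flow_sense(np.cos(t), np.sin(t))    # upper semicircle, counterclockwise
cw = flow_sense(np.cos(t), -np.sin(t))    # lower semicircle, clockwise
print(ccw > 0, cw < 0)
```

The same per-trajectory scalar could feed the cluster characterisation step alongside moisture evolution and trajectory length.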

  11. Sensor-Based Trajectory Generation for Advanced Driver Assistance System

    Directory of Open Access Journals (Sweden)

    Christopher James Shackleton

    2013-03-01

    Full Text Available This paper investigates the trajectory generation problem for an advanced driver assistance system that could sense the driving state of the vehicle, so that a collision-free trajectory can be generated safely. Specifically, the problem of trajectory generation is solved for the safety assessment of the driving state and to manipulate the vehicle in order to avoid any possible collisions. The vehicle senses the environment so as to obtain information about other vehicles and static obstacles ahead. Vehicles may share the perception of the environment via an inter-vehicle communication system. The planning algorithm is based on a visibility graph. A lateral repulsive potential is applied to adaptively maintain a trade-off between the trajectory length and vehicle clearance, which is the greatest problem associated with visibility graphs. As opposed to adaptive roadmap approaches, the algorithm exploits the structured nature of the environment for construction of the roadmap. Furthermore, the mostly organized nature of traffic systems is exploited to obtain orientation invariance, which is another limitation of both visibility graphs and adaptive roadmaps. Simulation results show that the algorithm can successfully solve the problem for a variety of commonly found scenarios.

  12. Accelerated Compressed Sensing Based CT Image Reconstruction.

    Science.gov (United States)

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  13. Accelerated Compressed Sensing Based CT Image Reconstruction

    Directory of Open Access Journals (Sweden)

    SayedMasoud Hashemi

    2015-01-01

    Full Text Available In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
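
Independently of the pseudopolar Fourier machinery, the weighted CS problem at the heart of this method can be illustrated with a generic iterative soft-thresholding (ISTA) solver. This is a sketch on a toy 1-D problem, not the authors' implementation; the uniform weights here merely stand in for the statistically derived ones:

```python
import numpy as np

def ista_weighted(A, y, w, lam=0.1, iters=1000):
    # Iterative soft-thresholding for the weighted compressed-sensing problem
    #   min_x 0.5*||A x - y||^2 + lam * sum_i w_i |x_i|.
    # Per-coefficient weights w_i play the role of the weights the paper
    # derives from measurement-noise and interpolation-error statistics.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - (A.T @ (A @ x - y)) / L    # gradient step on the data term
        thresh = lam * w / L
        x = np.sign(g) * np.maximum(np.abs(g) - thresh, 0.0)  # weighted soft threshold
    return x

rng = np.random.default_rng(1)
n, m, k = 64, 32, 4                        # signal length, measurements, sparsity
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], k) * (1.0 + rng.random(k))  # magnitudes in [1, 2]
A = rng.normal(0, 1 / np.sqrt(m), (m, n))
y = A @ x_true

x_hat = ista_weighted(A, y, w=np.ones(n), lam=0.1)
rel = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(round(float(rel), 3))
```

A real CT problem replaces the dense Gaussian `A` with the (re)binned projection operator and a sparsifying transform.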

  14. Parallel CT image reconstruction based on GPUs

    International Nuclear Information System (INIS)

    Flores, Liubov A.; Vidal, Vicent; Mayo, Patricia; Rodenas, Francisco; Verdú, Gumersindo

    2014-01-01

    In X-ray computed tomography (CT) iterative methods are more suitable for the reconstruction of images with high contrast and precision in noisy conditions from a small number of projections. However, in practice, these methods are not widely used due to the high computational cost of their implementation. Current technology makes it possible to reduce this drawback effectively. The goal of this work is to develop a fast GPU-based algorithm to reconstruct high quality images from undersampled and noisy projection data. - Highlights: • We developed a GPU-based iterative algorithm to reconstruct images. • Iterative algorithms are capable of reconstructing images from undersampled sets of projections. • The computational cost of the implementation of the developed algorithm is low. • The efficiency of the algorithm increases for large-scale problems
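
The kind of iterative reconstruction meant here is typified by ART (Kaczmarz's method), whose row-wise updates are exactly what GPU implementations parallelize. A minimal NumPy sketch on a tiny consistent system (illustrative only; a real reconstruction works on a huge sparse projection matrix):

```python
import numpy as np

def kaczmarz(A, b, sweeps=200):
    # ART / Kaczmarz iteration: cycle through the measurement equations,
    # each time projecting the current image estimate onto the hyperplane
    # defined by one projection ray.
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a
    return x

# Tiny consistent system standing in for "projection data".
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([2.0, -1.0, 0.5])
x_hat = kaczmarz(A, A @ x_true)
print(np.allclose(x_hat, x_true, atol=1e-6))
```

For a consistent full-rank system the iteration converges to the exact image; with noisy, undersampled data one stops early or adds regularization.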

  15. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model so that it can also be used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user-defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  16. Rapid time-resolved magnetic resonance angiography via a multiecho radial trajectory and GraDeS reconstruction.

    Science.gov (United States)

    Lee, Gregory R; Seiberlich, Nicole; Sunshine, Jeffrey L; Carroll, Timothy J; Griswold, Mark A

    2013-02-01

    Contrast-enhanced magnetic resonance angiography is challenging due to the need for both high spatial and temporal resolution. A multishot trajectory composed of pseudo-random rotations of a single multiecho radial readout was developed. The trajectory is designed to give incoherent aliasing artifacts and a relatively uniform distribution of projections over all time scales. A field map (computed from the same data set) is used to avoid signal dropout in regions of substantial field inhomogeneity. A compressed sensing reconstruction using the GraDeS algorithm was used. Whole brain angiograms were reconstructed at 1-mm isotropic resolution and a 1.1-s frame rate (corresponding to an acceleration factor > 100). The only parameter that must be chosen is the number of iterations of the GraDeS algorithm. A larger number of iterations improves the temporal behavior at the cost of decreased image signal-to-noise ratio. The resulting images provide a good depiction of the cerebral vasculature and have excellent arterial/venous separation. Copyright © 2012 Wiley Periodicals, Inc.

  17. Compressed Sensing, Pseudodictionary-Based, Superresolution Reconstruction

    Directory of Open Access Journals (Sweden)

    Chun-mei Li

    2016-01-01

    Full Text Available The spatial resolution of digital images is the critical factor that affects photogrammetry precision. Single-frame superresolution image reconstruction is a typical underdetermined inverse problem. To solve this type of problem, a compressed-sensing, pseudodictionary-based superresolution reconstruction method is proposed in this study. The proposed method achieves pseudodictionary learning with an available low-resolution image and uses the K-SVD algorithm, which is based on the sparse characteristics of the digital image. Then, the sparse representation coefficient of the low-resolution image is obtained by solving the l0-norm minimization problem, and the sparse coefficient and high-resolution pseudodictionary are used to reconstruct image tiles with high resolution. Finally, single-frame-image superresolution reconstruction is achieved. The proposed method is applied to photogrammetric images, and the experimental results indicate that the proposed method effectively increases image resolution and information content, achieving superresolution reconstruction. The reconstructed results are better than those obtained from traditional interpolation methods in terms of both visual quality and quantitative indicators.
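
The sparse-coding step that such dictionary-based methods rely on can be sketched with Orthogonal Matching Pursuit (a standard greedy approximation to the l0 problem; the full K-SVD dictionary update is omitted here, and the toy dictionary below is a stand-in, not a learned pseudodictionary):

```python
import numpy as np

def omp(D, y, k):
    # Orthogonal Matching Pursuit: greedily pick the dictionary atom most
    # correlated with the residual, then re-fit the signal on all atoms
    # selected so far. Returns a k-sparse coefficient vector.
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(3)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
x_true = np.zeros(50)
x_true[[5, 17, 40]] = [1.5, -2.0, 0.7]         # 3-sparse ground truth
y = D @ x_true

x_hat = omp(D, y, k=3)
resid = float(np.linalg.norm(D @ x_hat - y))
print(np.count_nonzero(x_hat), round(resid, 4))
```

In the superresolution setting, `y` would be a low-resolution patch, `D` the learned pseudodictionary, and the recovered sparse coefficients would be applied to the high-resolution dictionary to synthesize the output tile.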

  18. Multi Sector Planning Tools for Trajectory-Based Operations

    Science.gov (United States)

    Prevot, Thomas; Mainini, Matthew; Brasil, Connie

    2010-01-01

    This paper discusses a suite of multi sector planning tools for trajectory-based operations that were developed and evaluated in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The toolset included tools for traffic load and complexity assessment as well as trajectory planning and coordination. The situation assessment tools included an integrated suite of interactive traffic displays, load tables, load graphs, and dynamic aircraft filters. The planning toolset allowed for single and multi aircraft trajectory planning and data communication-based coordination of trajectories between operators. Also newly introduced into the toolset was a real-time computation of sector complexity that operators could use in lieu of aircraft count to better estimate and manage sector workload, especially in situations with convective weather. The tools were used during a joint NASA/FAA multi sector planner simulation in the AOL in 2009 that had multiple objectives, one of which was assessing the effectiveness of the tools. Current air traffic control operators who were experienced as area supervisors and traffic management coordinators used the tools throughout the simulation and provided usefulness and usability ratings in post-simulation questionnaires. This paper presents these subjective assessments as well as the actual usage data that was collected during the simulation. The toolset was rated very useful and usable overall. Many elements received high scores from the operators and were used frequently and successfully. Other functions were not used at all, but various requests for new functions and capabilities were received that could be added to the toolset.

  19. GPU-based online track reconstruction for PANDA and application to the analysis of D→Kππ

    Energy Technology Data Exchange (ETDEWEB)

    Herten, Andreas

    2015-07-02

    The PANDA experiment is a new hadron physics experiment which is being built for the FAIR facility in Darmstadt, Germany. PANDA will employ a novel scheme of data acquisition: the experiment will reconstruct the full stream of events in real time to make trigger decisions based on the event topology. An important part of this online event reconstruction is online track reconstruction. Online track reconstruction algorithms need to reconstruct particle trajectories in nearly real time. This work uses high-throughput Graphics Processing Units to benchmark different online track reconstruction algorithms. The reconstruction of D± → K∓π±π± is studied extensively and one online track reconstruction algorithm is applied.

  20. MILP-Based 4D Trajectory Planning for Tactical Trajectory Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences proposes to develop specialized algorithms and software decision-aiding tools for four-dimensional (4D) vehicle-centric, tactical trajectory...

  1. Adaptive Square-Shaped Trajectory-Based Service Location Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hwa-Jung Lim

    2010-04-01

    Full Text Available In this paper we propose an adaptive square-shaped trajectory (ASST)-based service location method to ensure load scalability in wireless sensor networks. The method first establishes a square-shaped trajectory over the nodes that surround a target point computed by the hash function, which any user can locate using the same hash. Both the width and the size of the trajectory are dynamically adjustable, depending on the number of queries made to the service information on the trajectory. The number of sensor nodes on the trajectory varies in proportion to the changing trajectory shape, allowing high loads to be distributed around the hot spot area.
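
The rendezvous idea — providers and consumers independently hash the same service name to the same target point — can be sketched as follows. All names, field dimensions, and the choice of SHA-256 are illustrative assumptions, not the protocol's specification:

```python
import hashlib

def service_point(name, width=100.0, height=100.0):
    # Hash a service name to a rendezvous point inside the sensor field, so
    # that providers and queriers independently derive the same target point
    # around which the square-shaped trajectory is built.
    digest = hashlib.sha256(name.encode()).digest()
    x = int.from_bytes(digest[:4], "big") / 2**32 * width
    y = int.from_bytes(digest[4:8], "big") / 2**32 * height
    return x, y

def square_trajectory(center, half_side):
    # Corners of an axis-aligned square trajectory around the target point;
    # in the adaptive scheme, half_side would grow or shrink with query load.
    cx, cy = center
    return [(cx - half_side, cy - half_side), (cx + half_side, cy - half_side),
            (cx + half_side, cy + half_side), (cx - half_side, cy + half_side)]

p = service_point("temperature")
corners = square_trajectory(p, half_side=5.0)
print(p, corners)
```

Because the mapping is deterministic, any node that knows the service name can compute the trajectory's location without a directory lookup.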

  2. Reconstruction of implanted marker trajectories from cone-beam CT projection images using interdimensional correlation modeling

    International Nuclear Information System (INIS)

    Chung, Hyekyun; Poulsen, Per Rugaard; Keall, Paul J.; Cho, Seungryong; Cho, Byungchul

    2016-01-01

    Purpose: Cone-beam CT (CBCT) is a widely used imaging modality for image-guided radiotherapy. Most vendors provide CBCT systems that are mounted on a linac gantry. Thus, CBCT can be used to estimate the actual 3-dimensional (3D) position of moving respiratory targets in the thoracic/abdominal region using 2D projection images. The authors have developed a method for estimating the 3D trajectory of respiratory-induced target motion from CBCT projection images using interdimensional correlation modeling. Methods: Because the superior–inferior (SI) motion of a target can be easily analyzed on projection images of a gantry-mounted CBCT system, the authors investigated the interdimensional correlation of the SI motion with left–right and anterior–posterior (AP) movements while the gantry is rotating. A simple linear model and a state-augmented model were implemented and applied to the interdimensional correlation analysis, and their performance was compared. The parameters of the interdimensional correlation models were determined by least-square estimation of the 2D error between the actual and estimated projected target position. The method was validated using 160 3D tumor trajectories from 46 thoracic/abdominal cancer patients obtained during CyberKnife treatment. The authors’ simulations assumed two application scenarios: (1) retrospective estimation for the purpose of moving tumor setup used just after volumetric matching with CBCT; and (2) on-the-fly estimation for the purpose of real-time target position estimation during gating or tracking delivery, either for full-rotation volumetric-modulated arc therapy (VMAT) in 60 s or a stationary six-field intensity-modulated radiation therapy (IMRT) with a beam delivery time of 20 s. Results: For the retrospective CBCT simulations, the mean 3D root-mean-square error (RMSE) for all 4893 trajectory segments was 0.41 mm (simple linear model) and 0.35 mm (state-augmented model). In the on-the-fly simulations, prior

  3. Reconstruction of implanted marker trajectories from cone-beam CT projection images using interdimensional correlation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Hyekyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon 34141, South Korea and Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul 138-736 (Korea, Republic of); Poulsen, Per Rugaard [Department of Oncology, Aarhus University Hospital, Nørrebrogade 44, 8000 Aarhus C (Denmark); Keall, Paul J. [Radiation Physics Laboratory, Sydney Medical School, University of Sydney, NSW 2006 (Australia); Cho, Seungryong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon 34141 (Korea, Republic of); Cho, Byungchul, E-mail: cho.byungchul@gmail.com, E-mail: bcho@amc.seoul.kr [Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul 05505 (Korea, Republic of)

    2016-08-15

    Purpose: Cone-beam CT (CBCT) is a widely used imaging modality for image-guided radiotherapy. Most vendors provide CBCT systems that are mounted on a linac gantry. Thus, CBCT can be used to estimate the actual 3-dimensional (3D) position of moving respiratory targets in the thoracic/abdominal region using 2D projection images. The authors have developed a method for estimating the 3D trajectory of respiratory-induced target motion from CBCT projection images using interdimensional correlation modeling. Methods: Because the superior–inferior (SI) motion of a target can be easily analyzed on projection images of a gantry-mounted CBCT system, the authors investigated the interdimensional correlation of the SI motion with left–right and anterior–posterior (AP) movements while the gantry is rotating. A simple linear model and a state-augmented model were implemented and applied to the interdimensional correlation analysis, and their performance was compared. The parameters of the interdimensional correlation models were determined by least-square estimation of the 2D error between the actual and estimated projected target position. The method was validated using 160 3D tumor trajectories from 46 thoracic/abdominal cancer patients obtained during CyberKnife treatment. The authors’ simulations assumed two application scenarios: (1) retrospective estimation for the purpose of moving tumor setup used just after volumetric matching with CBCT; and (2) on-the-fly estimation for the purpose of real-time target position estimation during gating or tracking delivery, either for full-rotation volumetric-modulated arc therapy (VMAT) in 60 s or a stationary six-field intensity-modulated radiation therapy (IMRT) with a beam delivery time of 20 s. Results: For the retrospective CBCT simulations, the mean 3D root-mean-square error (RMSE) for all 4893 trajectory segments was 0.41 mm (simple linear model) and 0.35 mm (state-augmented model). In the on-the-fly simulations, prior
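
The "simple linear model" variant of the interdimensional correlation idea reduces to ordinary least squares: fit the hard-to-observe coordinate as a linear function of the easily measured SI position. A toy sketch with synthetic motion (all amplitudes, frequencies, and noise levels below are made up for illustration):

```python
import numpy as np

# Toy respiratory target motion: the AP position correlates with the
# (easily measured) SI position. Fitting AP = a*SI + b by least squares
# mimics the simple linear interdimensional correlation model.
rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)                         # time, s
si = 6.0 * np.sin(2 * np.pi * 0.25 * t)             # SI motion, mm
ap = 0.4 * si + 1.0 + rng.normal(0, 0.05, t.size)   # correlated AP motion, mm

X = np.column_stack([si, np.ones_like(si)])
(a, b), *_ = np.linalg.lstsq(X, ap, rcond=None)     # least-squares fit

ap_est = a * si + b
rmse = float(np.sqrt(np.mean((ap_est - ap) ** 2)))
print(round(float(a), 2), round(float(b), 2))
```

In the CBCT setting the fit is instead performed on 2D projected positions as the gantry rotates, and the state-augmented model adds further regressors, but the estimation principle is the same.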

  4. Trajectory-Based Visual Localization in Underwater Surveying Missions

    Directory of Open Access Journals (Sweden)

    Antoni Burguera

    2015-01-01

    Full Text Available We present a new vision-based localization system applied to an autonomous underwater vehicle (AUV) with limited sensing and computation capabilities. The traditional EKF-SLAM approaches are usually expensive in terms of execution time; the approach presented in this paper strengthens this method by adopting a trajectory-based schema that reduces the computational requirements. The pose of the vehicle is estimated using an extended Kalman filter (EKF), which predicts the vehicle motion by means of a visual odometer and corrects these predictions using the data associations (loop closures) between the current frame and the previous ones. One of the most important steps in this procedure is the image registration method, as it reinforces the data association and, thus, makes it possible to close loops reliably. Since the use of standard EKFs entails linearization errors that can distort the vehicle pose estimations, the approach has also been tested using an iterated Kalman filter (IEKF). Experiments have been conducted using a real underwater vehicle in controlled scenarios and in shallow sea waters, showing an excellent performance with very small errors, both in the vehicle pose and in the overall trajectory estimates.

  5. Trajectory-Based Visual Localization in Underwater Surveying Missions

    Science.gov (United States)

    Burguera, Antoni; Bonin-Font, Francisco; Oliver, Gabriel

    2015-01-01

    We present a new vision-based localization system applied to an autonomous underwater vehicle (AUV) with limited sensing and computation capabilities. The traditional EKF-SLAM approaches are usually expensive in terms of execution time; the approach presented in this paper strengthens this method by adopting a trajectory-based schema that reduces the computational requirements. The pose of the vehicle is estimated using an extended Kalman filter (EKF), which predicts the vehicle motion by means of a visual odometer and corrects these predictions using the data associations (loop closures) between the current frame and the previous ones. One of the most important steps in this procedure is the image registration method, as it reinforces the data association and, thus, makes it possible to close loops reliably. Since the use of standard EKFs entails linearization errors that can distort the vehicle pose estimations, the approach has also been tested using an iterated Kalman filter (IEKF). Experiments have been conducted using a real underwater vehicle in controlled scenarios and in shallow sea waters, showing an excellent performance with very small errors, both in the vehicle pose and in the overall trajectory estimates. PMID:25594602
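
The predict/correct structure described in these two records (visual-odometry prediction, loop-closure correction) is the standard Kalman cycle. A minimal 1-D linear sketch follows; a real EKF or IEKF additionally linearizes nonlinear motion and observation models with Jacobians, and all noise values below are made up:

```python
# Minimal (linear) Kalman predict/correct cycle in one dimension.
x, P = 0.0, 1.0          # pose estimate and its variance
Q, R = 0.1, 0.5          # motion-noise and measurement-noise variances

def predict(x, P, u):
    # Visual-odometry step: move by u; uncertainty grows by Q.
    return x + u, P + Q

def correct(x, P, z):
    # Loop-closure step: fuse a direct observation z of the pose.
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1 - K) * P

for u in [1.0, 1.0, 1.0]:                # three odometry steps
    x, P = predict(x, P, u)
x, P = correct(x, P, z=3.2)              # one loop-closure observation

print(round(x, 2), round(P, 2))          # prints: 3.14 0.36
```

Note how the correction both pulls the pose toward the observation and shrinks the variance; that shrinkage is what makes loop closures so valuable in SLAM.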

  6. Vision-based map building and trajectory planning to enable autonomous flight through urban environments

    Science.gov (United States)

    Watkins, Adam S.

    The desire to use Unmanned Air Vehicles (UAVs) in a variety of complex missions has motivated the need to increase the autonomous capabilities of these vehicles. This research presents autonomous vision-based mapping and trajectory planning strategies for a UAV navigating in an unknown urban environment. It is assumed that the vehicle's inertial position is unknown because GPS is unavailable due to environmental occlusions or jamming by hostile military assets. Therefore, the environment map is constructed from noisy sensor measurements taken at uncertain vehicle locations. Under these restrictions, map construction becomes a state estimation task known as the Simultaneous Localization and Mapping (SLAM) problem. Solutions to the SLAM problem endeavor to estimate the state of a vehicle relative to concurrently estimated environmental landmark locations. The presented work focuses specifically on SLAM for aircraft, denoted as airborne SLAM, where the vehicle is capable of six-degree-of-freedom motion characterized by highly nonlinear equations of motion. The airborne SLAM problem is solved with a variety of filters based on the Rao-Blackwellized particle filter. Additionally, the environment is represented as a set of geometric primitives that are fit to the three-dimensional points reconstructed from gathered onboard imagery. The second half of this research builds on the mapping solution by addressing the problem of trajectory planning for optimal map construction. Optimality is defined in terms of maximizing environment coverage in minimum time. The planning process is decomposed into two phases of global navigation and local navigation. The global navigation strategy plans a coarse, collision-free path through the environment to a goal location that will take the vehicle to previously unexplored or incompletely viewed territory.
The local navigation strategy plans detailed, collision-free paths within the currently sensed environment that maximize local coverage

  7. Mastectomy Skin Necrosis After Breast Reconstruction: A Comparative Analysis Between Autologous Reconstruction and Implant-Based Reconstruction.

    Science.gov (United States)

    Sue, Gloria R; Lee, Gordon K

    2018-05-01

    Mastectomy skin necrosis is a significant problem after breast reconstruction. We sought to perform a comparative analysis of this complication between patients undergoing autologous breast reconstruction and patients undergoing 2-stage expander implant breast reconstruction. A retrospective review was performed on consecutive patients undergoing autologous breast reconstruction or 2-stage expander implant breast reconstruction by the senior author from 2006 through 2015. Patient demographic factors including age, body mass index, history of diabetes, history of smoking, and history of radiation to the breast were collected. Our primary outcome measure was mastectomy skin necrosis. Fisher exact test was used for statistical analysis between the 2 patient cohorts. The treatment patterns of mastectomy skin necrosis were then analyzed. We identified 204 patients who underwent autologous breast reconstruction and 293 patients who underwent 2-stage expander implant breast reconstruction. Patients undergoing autologous breast reconstruction were older, heavier, more likely to have diabetes, and more likely to have had prior radiation to the breast compared with patients undergoing implant-based reconstruction. The incidence of mastectomy skin necrosis was 30.4% of patients in the autologous group compared with only 10.6% of patients in the tissue expander group (P ...). While ...% were treated with local wound care in the autologous group, only 3.2% were treated with local wound care in the tissue expander group (P ...). Mastectomy skin necrosis is significantly more likely to occur after autologous breast reconstruction compared with 2-stage expander implant-based breast reconstruction. Patients with autologous reconstructions are more readily treated with local wound care compared with patients with tissue expanders, who tended to require operative treatment of this complication. Patients considering breast reconstruction should be counseled appropriately regarding the differences in incidence and management of mastectomy skin necrosis.

  8. An FDK-like cone-beam SPECT reconstruction algorithm for non-uniform attenuated projections acquired using a circular trajectory

    International Nuclear Information System (INIS)

    Huang, Q; Zeng, G L; You, J; Gullberg, G T

    2005-01-01

    In this paper, Novikov's inversion formula of the attenuated two-dimensional (2D) Radon transform is applied to the reconstruction of attenuated fan-beam projections acquired with equal detector spacing and of attenuated cone-beam projections acquired with a flat planar detector and circular trajectory. The derivation of the fan-beam algorithm is obtained by transformation from parallel-beam coordinates to fan-beam coordinates. The cone-beam reconstruction algorithm is an extension of the fan-beam reconstruction algorithm using the Feldkamp-Davis-Kress (FDK) method. Computer simulations indicate that the algorithm is efficient and accurate in reconstructing slices close to the central slice of the cone-beam orbit plane. When the attenuation map is set to zero, the implementation is equivalent to the FDK method. Reconstructed images are also shown for noise-corrupted projections.

  9. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Science.gov (United States)

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human influenza A viruses. In both cases, we recover more of the believed aspects of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
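As context for the nonparametric approach above, the constant-population-size special case has a simple closed-form maximum-likelihood estimator: while k lineages remain, the coalescent waiting time is exponential with rate C(k,2)/Ne. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
from math import comb

def mle_constant_ne(waiting_times):
    """waiting_times: (k, w) pairs, where w is the waiting time during which
    k lineages were extant before a coalescence. Under a constant-size
    coalescent each w is Exp(C(k,2)/Ne), giving the closed-form MLE below."""
    m = len(waiting_times)
    return sum(comb(k, 2) * w for k, w in waiting_times) / m

# toy genealogy of 4 samples: three coalescent events
data = [(4, 0.2), (3, 0.5), (2, 1.1)]
```

The GP method of the abstract replaces this single constant with an entire trajectory Ne(t), treating the coalescent times as an inhomogeneous point process.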

  10. Optimization on Trajectory of Stanford Manipulator based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Han Xi

    2017-01-01

    Full Text Available The optimization of robot manipulator trajectories has become a hot topic in academic and industrial fields. In this paper, a method for minimizing the moving distance of robot manipulators is presented. The Stanford Manipulator is used as the research object, and the inverse kinematics model is established with the Denavit-Hartenberg method. Based on the initial posture matrix, the inverse kinematics model is used to find the initial state of each joint. Given each joint's starting moment, cubic polynomial interpolation is applied to each joint variable, and the forward kinematics model is used to calculate the moving distance of the end effector. A genetic algorithm is used to optimize the sequential order of the joints and the differences between their starting times. Numerical applications involving a Stanford manipulator are presented.
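The cubic interpolation step described above can be sketched per joint: with zero velocity at both endpoints, the four polynomial coefficients follow directly from the start and end joint angles and the duration (a hypothetical minimal implementation, not the authors' code):

```python
def cubic_coeffs(q0, qf, T):
    """Coefficients of q(t) = a0 + a1*t + a2*t^2 + a3*t^3 satisfying
    q(0) = q0, q(T) = qf, and zero velocity at both endpoints."""
    d = qf - q0
    return (q0, 0.0, 3.0 * d / T**2, -2.0 * d / T**3)

def evaluate(coeffs, t):
    """Return joint position and velocity at time t."""
    a0, a1, a2, a3 = coeffs
    return (a0 + a1 * t + a2 * t**2 + a3 * t**3,
            a1 + 2 * a2 * t + 3 * a3 * t**2)

# one joint moving from 0.0 rad to 1.2 rad in 2 s
c = cubic_coeffs(0.0, 1.2, 2.0)
```

The paper's optimizer would then vary the joint ordering and start-time offsets while each joint follows such a polynomial.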

  11. Homotopy Based Reconstruction from Acoustic Images

    DEFF Research Database (Denmark)

    Sharma, Ojaswa

    of the inherent arrangement. The problem of reconstruction from arbitrary cross sections is a generic problem and is also shown to be solved here using the mathematical tool of continuous deformations. As part of a complete processing, segmentation using level set methods is explored for acoustic images, and fast … GPU (Graphics Processing Unit) based methods are suggested for a streaming computation on large volumes of data. Validation of results for acoustic images is not straightforward due to unavailability of ground truth. Accuracy figures for the suggested methods are provided using phantom object …

  12. Variance-based Salt Body Reconstruction

    KAUST Repository

    Ovcharenko, Oleg

    2017-05-26

    Seismic inversions of salt bodies are challenging when updating velocity models based on Born approximation- inspired gradient methods. We propose a variance-based method for velocity model reconstruction in regions complicated by massive salt bodies. The novel idea lies in retrieving useful information from simultaneous updates corresponding to different single frequencies. Instead of the commonly used averaging of single-iteration monofrequency gradients, our algorithm iteratively reconstructs salt bodies in an outer loop based on updates from a set of multiple frequencies after a few iterations of full-waveform inversion. The variance among these updates is used to identify areas where considerable cycle-skipping occurs. In such areas, we update velocities by interpolating maximum velocities within a certain region. The result of several recursive interpolations is later used as a new starting model to improve results of conventional full-waveform inversion. An application on part of the BP 2004 model highlights the evolution of the proposed approach and demonstrates its effectiveness.
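The core variance idea can be illustrated with a small sketch: given several single-frequency velocity-update grids, compute the per-cell variance across them and flag high-variance cells as cycle-skipping candidates. Names and the fixed thresholding are illustrative assumptions, not the paper's implementation:

```python
def variance_map(updates):
    """updates: list of equally-shaped 2-D velocity-update grids, one per
    single-frequency inversion. Returns the per-cell variance across them."""
    n = len(updates)
    rows, cols = len(updates[0]), len(updates[0][0])
    var = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            vals = [u[i][j] for u in updates]
            mean = sum(vals) / n
            var[i][j] = sum((v - mean) ** 2 for v in vals) / n
    return var

def flag_cycle_skipped(var, threshold):
    """Cells whose update variance exceeds the threshold are candidates
    for the interpolation-based velocity replacement step."""
    return [[v > threshold for v in row] for row in var]
```

In the paper's outer loop, flagged regions would be overwritten by interpolated maximum velocities before restarting full-waveform inversion.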

  13. Introduction to Mobile Trajectory Based Services: A New Direction in Mobile Location Based Services

    Science.gov (United States)

    Khokhar, Sarfraz; Nilsson, Arne A.

    The mandate of E911 gave birth to the idea of Location Based Services (LBS), capitalizing on knowledge of the mobile location. The underlying estimated location is a feasible area. There is yet another class of mobile services that could be based on the mobility profiling of a mobile user. The mobility profile of a mobile user is a set of the routine trajectories of his or her travel paths. We call such services Mobile Trajectory Based Services (MTBS). This paper introduces MTBS and the functional architecture of an MTBS system. The suitability of different location estimation technologies for MTBS is discussed and supported with simulation results.

  14. Trajectory Optimization Based on Multi-Interval Mesh Refinement Method

    Directory of Open Access Journals (Sweden)

    Ningbo Li

    2017-01-01

    Full Text Available In order to improve the optimization accuracy and convergence rate for trajectory optimization of an air-to-air missile, a multi-interval mesh refinement Radau pseudospectral method is introduced. This method makes the mesh endpoints converge to the practical nonsmooth points and decreases the overall number of collocation points to improve the convergence rate and computational efficiency. The trajectory is divided into four phases according to the working time of the engine and the handover from midcourse to terminal guidance, and the optimization model is then built. The multi-interval mesh refinement Radau pseudospectral method, with different numbers of collocation points in each mesh interval, is used to solve the trajectory optimization model. Moreover, this method is compared with the traditional h method. Simulation results show that this method can decrease the dimensionality of the nonlinear programming (NLP) problem and therefore improve the efficiency of pseudospectral methods for solving trajectory optimization problems.

  15. An incremental DPMM-based method for trajectory clustering, modeling, and retrieval.

    Science.gov (United States)

    Hu, Weiming; Li, Xi; Tian, Guodong; Maybank, Stephen; Zhang, Zhongfei

    2013-05-01

    Trajectory analysis is the basis for many applications, such as indexing of motion events in videos, activity recognition, and surveillance. In this paper, the Dirichlet process mixture model (DPMM) is applied to trajectory clustering, modeling, and retrieval. We propose an incremental version of a DPMM-based clustering algorithm and apply it to cluster trajectories. An appropriate number of trajectory clusters is determined automatically. When trajectories belonging to new clusters arrive, the new clusters can be identified online and added to the model without any retraining using the previous data. A time-sensitive Dirichlet process mixture model (tDPMM) is applied to each trajectory cluster for learning the trajectory pattern which represents the time-series characteristics of the trajectories in the cluster. Then, a parameterized index is constructed for each cluster. A novel likelihood estimation algorithm for the tDPMM is proposed, and a trajectory-based video retrieval model is developed. The tDPMM-based probabilistic matching method and the DPMM-based model growing method are combined to make the retrieval model scalable and adaptable. Experimental comparisons with state-of-the-art algorithms demonstrate the effectiveness of our algorithm.
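The online identification of new clusters can be caricatured with a much simpler threshold-based incremental assignment. This is not a DPMM (which weighs a concentration parameter rather than a fixed distance, and updates cluster parameters), but it shows the add-a-cluster-without-retraining pattern the abstract describes; all names are illustrative:

```python
def assign_incremental(clusters, traj_feat, new_cluster_dist):
    """clusters: list of centroid vectors (fixed at the first member of each
    cluster in this sketch). Assign traj_feat to the nearest centroid, or
    open a new cluster when every centroid is farther than new_cluster_dist.
    Returns the cluster index; earlier assignments are never revisited."""
    best, best_d = None, float("inf")
    for idx, centroid in enumerate(clusters):
        d = sum((a - b) ** 2 for a, b in zip(centroid, traj_feat)) ** 0.5
        if d < best_d:
            best, best_d = idx, d
    if best is None or best_d > new_cluster_dist:
        clusters.append(list(traj_feat))   # new cluster identified online
        return len(clusters) - 1
    return best
```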

  16. Free Energy Reconstruction from Logarithmic Mean-Force Dynamics Using Multiple Nonequilibrium Trajectories.

    Science.gov (United States)

    Morishita, Tetsuya; Yonezawa, Yasushige; Ito, Atsushi M

    2017-07-11

    Efficient and reliable estimation of the mean force (MF), the derivatives of the free energy with respect to a set of collective variables (CVs), has been a challenging problem because free energy differences are often computed by integrating the MF. Among various methods for computing free energy differences, logarithmic mean-force dynamics (LogMFD) [ Morishita et al., Phys. Rev. E 2012 , 85 , 066702 ] invokes the conservation law in classical mechanics to integrate the MF, which allows us to estimate the free energy profile along the CVs on-the-fly. Here, we present a method called parallel dynamics, which improves the estimation of the MF by employing multiple replicas of the system and is straightforwardly incorporated in LogMFD or a related method. In the parallel dynamics, the MF is evaluated by a nonequilibrium path-ensemble using the multiple replicas based on the Crooks-Jarzynski nonequilibrium work relation. Thanks to the Crooks relation, realizing full-equilibrium states is no longer mandatory for estimating the MF. Additionally, sampling in the hidden subspace orthogonal to the CV space is highly improved with appropriate weights for each metastable state (if any), which is hardly achievable by typical free energy computational methods. We illustrate how to implement parallel dynamics by combining it with LogMFD, which we call logarithmic parallel dynamics (LogPD). Biosystems of alanine dipeptide and adenylate kinase in explicit water are employed as benchmark systems to which LogPD is applied to demonstrate the effect of multiple replicas on the accuracy and efficiency in estimating the free energy profiles using parallel dynamics.

  17. Sampling Based Trajectory Planning for Robots in Dynamic Human Environments

    DEFF Research Database (Denmark)

    Svenstrup, Mikael

    2010-01-01

    Open-ended human environments, such as pedestrian streets, hospital corridors, train stations etc., are places where robots start to emerge. Hence, being able to plan safe and natural trajectories in these dynamic environments is an important skill for future generations of robots. In this work … the problem is formulated as planning a minimal-cost trajectory through a potential field, defined from the perceived position and motion of persons in the environment. A modified Rapidly-exploring Random Tree (RRT) algorithm is proposed as a solution to the planning problem. The algorithm implements a new … for the uncertainty in the dynamic environment. The planning algorithm is demonstrated in a simulated pedestrian street environment…
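The basic RRT growth step that the modified algorithm builds on can be sketched as follows (obstacle-free unit square, no potential-field cost, so this covers only the tree expansion, not the paper's extensions; all names are illustrative):

```python
import random

def rrt(start, goal, step, n_iter, goal_bias=0.2, seed=1):
    """Grow a 2-D RRT from start in the unit square. Samples are biased
    toward the goal; returns (nodes, parent, index of a goal-reaching
    node or None)."""
    random.seed(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(n_iter):
        target = goal if random.random() < goal_bias else (random.random(), random.random())
        # nearest tree node to the sampled target
        i = min(range(len(nodes)),
                key=lambda k: (nodes[k][0] - target[0])**2 + (nodes[k][1] - target[1])**2)
        nx, ny = nodes[i]
        dx, dy = target[0] - nx, target[1] - ny
        d = (dx * dx + dy * dy) ** 0.5
        if d == 0.0:
            continue
        new = (nx + step * dx / d, ny + step * dy / d)  # extend by one step
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if (new[0] - goal[0])**2 + (new[1] - goal[1])**2 < step * step:
            return nodes, parent, len(nodes) - 1
    return nodes, parent, None

nodes, parent, goal_idx = rrt((0.0, 0.0), (0.9, 0.9), step=0.1, n_iter=500)
```

The paper's version would replace the uniform sampling and straight-line extension with cost-aware growth through the person-centred potential field.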

  18. Multiphase Return Trajectory Optimization Based on Hybrid Algorithm

    Directory of Open Access Journals (Sweden)

    Yi Yang

    2016-01-01

    Full Text Available A hybrid trajectory optimization method consisting of Gauss pseudospectral method (GPM and natural computation algorithm has been developed and utilized to solve multiphase return trajectory optimization problem, where a phase is defined as a subinterval in which the right-hand side of the differential equation is continuous. GPM converts the optimal control problem to a nonlinear programming problem (NLP, which helps to improve calculation accuracy and speed of natural computation algorithm. Through numerical simulations, it is found that the multiphase optimal control problem could be solved perfectly.

  19. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    International Nuclear Information System (INIS)

    Kadrmas, Dan J.; Karimi, Seemeen S.; Frey, Eric C.; Tsui, Benjamin M.W.

    1998-01-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with 99mTc tracer, and also using experimentally acquired data with 201Tl tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within the realm of clinically realistic times (under 10 minutes for 64x64x24 image reconstruction). (author)

  20. Global Launcher Trajectory Optimization for Lunar Base Settlement

    NARCIS (Netherlands)

    Pagano, A.; Mooij, E.

    2010-01-01

    The problem of a mission to the Moon to set a permanent outpost can be tackled by dividing the journey into three phases: the Earth ascent, the Earth-Moon transfer and the lunar landing. In this paper we present an optimization analysis of Earth ascent trajectories of existing launch vehicles

  1. Data-based control trajectory planning for nonlinear systems

    International Nuclear Information System (INIS)

    Rhodes, C.; Morari, M.; Tsimring, L.S.; Rulkov, N.F.

    1997-01-01

    An open-loop trajectory planning algorithm is presented for computing an input sequence that drives an input-output system such that a reference trajectory is tracked. The algorithm utilizes only input-output data from the system to determine the proper control sequence, and does not require a mathematical or identified description of the system dynamics. From the input-output data, the controlled input trajectory is calculated in a "one-step-ahead" fashion using local modeling. Since the algorithm is calculated in this fashion, the output trajectories to be tracked can be nonperiodic. The algorithm is applied to a driven Lorenz system and an experimental electrical circuit, and the results are analyzed. Issues of stability associated with the implementation of this open-loop scheme are also examined using an analytic example of a driven Hénon map; problems associated with inverse controllers are illustrated, and solutions to these problems are proposed. copyright 1997 The American Physical Society
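The one-step-ahead local-modeling idea can be sketched on a toy linear plant: fit a local linear model on the nearest recorded input-output samples, then invert it for the control input. The plant, data layout, and all names are illustrative assumptions, not the paper's systems:

```python
def one_step_control(data, y, r, k=5):
    """data: recorded (y_k, u_k, y_next) triples from the plant. Fit a local
    linear model y_next ~ a*y + b*u on the k nearest neighbours of the
    current output y, then invert it for the input that should move the
    output to the reference r. No analytic plant model is used."""
    neigh = sorted(data, key=lambda s: abs(s[0] - y))[:k]
    syy = sum(s[0] * s[0] for s in neigh)
    syu = sum(s[0] * s[1] for s in neigh)
    suu = sum(s[1] * s[1] for s in neigh)
    syt = sum(s[0] * s[2] for s in neigh)
    sut = sum(s[1] * s[2] for s in neigh)
    det = syy * suu - syu * syu          # normal equations, 2 unknowns
    a = (syt * suu - sut * syu) / det
    b = (syy * sut - syu * syt) / det
    return (r - a * y) / b

# toy plant y' = 0.8*y + u, used only to record input-output data
data = [(y, u, 0.8 * y + u)
        for y in (0.0, 0.25, 0.5, 0.75, 1.0) for u in (-0.5, 0.0, 0.5)]
u = one_step_control(data, y=0.5, r=1.0)
```

For a nonlinear plant such as the driven Lorenz system, the local fit would be recomputed around each operating point at every step.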

  2. Trajectory control of an articulated robot with a parallel drive arm based on splines under tension

    Science.gov (United States)

    Yi, Seung-Jong

    Today's industrial robots controlled by mini/micro computers are basically simple positioning devices. The positioning accuracy depends on the mathematical description of the robot configuration to place the end-effector at the desired position and orientation within the workspace, and on following the specified path, which requires a trajectory planner. In addition, the consideration of joint velocity, acceleration, and jerk trajectories is essential for trajectory planning of industrial robots to obtain smooth operation. The newly designed 6 DOF articulated robot with a parallel drive arm mechanism, which permits the joint actuators to be placed in the same horizontal line to reduce the arm inertia and to increase load capacity and stiffness, is selected. First, the forward and inverse kinematic problems are examined. The forward kinematic equations are successfully derived based on Denavit-Hartenberg notation with independent joint angle constraints. The inverse kinematic problems are solved using the arm-wrist partitioned approach with independent joint angle constraints. Three types of curve fitting methods used in trajectory planning, i.e., certain-degree polynomial functions, cubic spline functions, and cubic spline functions under tension, are compared to select the method best able to satisfy both smooth joint trajectories and positioning accuracy for a robot trajectory planner. Cubic spline functions under tension are selected for the new trajectory planner. This method is implemented for a 6 DOF articulated robot with a parallel drive arm mechanism to improve the smoothness of the joint trajectories and the positioning accuracy of the manipulator. This approach is also compared with existing trajectory planners, 4-3-4 polynomials and cubic spline functions, via circular arc motion simulations. The new trajectory planner using cubic spline functions under tension is implemented into the microprocessor based robot controller and

  3. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study aims to develop and test a new improved algorithm, applied to incomplete projection data, that generates a high-quality reconstructed image by reducing artifacts and noise in computed tomography. For the incomplete projections, an augmented Lagrangian method based on compressed sensing is first used in the initial reconstruction for the segmentation step of DART, yielding higher contrast between boundary and non-boundary pixels. Then, the block matching 3D filtering operator is used to suppress noise and improve the gray distribution of the reconstructed image. Finally, simulation studies on a polychromatic spectrum were performed to test the performance of the new algorithm. Study results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of the images reconstructed from incomplete data. The SNRs and AGs of images reconstructed by DART-ALBM were on average 30%-40% and 10% higher, respectively, than those of images reconstructed by DART algorithms. Since the improved DART-ALBM algorithm is more robust in limited-view reconstruction, making the edges of the image clear and improving the gray distribution of non-boundary pixels, it has the potential to improve image quality from incomplete or sparse projections.

  4. Trajectory reshaping based guidance with impact time and angle constraints

    Directory of Open Access Journals (Sweden)

    Zhao Yao

    2016-08-01

    Full Text Available This study presents a novel impact time and angle constrained guidance law for homing missiles. The guidance law is first developed with the prior-assumption of a stationary target, which is followed by the practical extension to a maneuvering target scenario. To derive the closed-form guidance law, the trajectory reshaping technique is utilized and it results in defining a specific polynomial function with two unknown coefficients. These coefficients are determined to satisfy the impact time and angle constraints as well as the zero miss distance. Furthermore, the proposed guidance law has three additional guidance gains as design parameters which make it possible to adjust the guided trajectory according to the operational conditions and missile’s capability. Numerical simulations are presented to validate the effectiveness of the proposed guidance law.

  5. A Segment-Based Trajectory Similarity Measure in the Urban Transportation Systems.

    Science.gov (United States)

    Mao, Yingchi; Zhong, Haishi; Xiao, Xianjian; Li, Xiaofang

    2017-03-06

    With the rapid spread of handheld smart devices with built-in GPS, the trajectory data from GPS sensors has grown explosively. Trajectory data has spatio-temporal characteristics and rich information. Trajectory data processing techniques can mine the patterns of human activities and the moving patterns of vehicles in intelligent transportation systems. A trajectory similarity measure is one of the most important issues in trajectory data mining (clustering, classification, frequent pattern mining, etc.). Unfortunately, the main similarity measure algorithms for trajectory data have been found to be inaccurate, highly sensitive to sampling methods, and of low robustness to noisy data. To solve these problems, three distances and their corresponding computation methods are proposed in this paper. The point-segment distance decreases the sensitivity to point sampling methods. The prediction distance optimizes the temporal distance using the features of trajectory data. The segment-segment distance introduces the trajectory shape factor into the similarity measurement to improve accuracy. The three kinds of distance are integrated with the traditional dynamic time warping (DTW) algorithm to propose a new segment-based dynamic time warping algorithm (SDTW). The experimental results show that the SDTW algorithm achieves about 57%, 86%, and 31% better accuracy than the longest common subsequence (LCSS) algorithm, the edit distance on real sequence (EDR) algorithm, and DTW, respectively, and that its sensitivity to noisy data is lower than that of those algorithms.
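The baseline DTW that SDTW extends is a short dynamic program. A standard implementation for 1-D sequences is sketched below; the paper's segment-based distances would replace the pointwise cost function:

```python
def dtw(a, b, dist=lambda p, q: abs(p - q)):
    """Classic dynamic-time-warping distance between two sequences:
    cumulative cost of the cheapest monotone alignment of a onto b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # extend the best of the three predecessor alignments
            D[i][j] = dist(a[i - 1], b[j - 1]) + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because warping absorbs repeated samples, differently sampled versions of the same path can still match at zero cost, which is the property the abstract's sampling-sensitivity discussion targets.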

  6. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul; Al-Naffouri, Tareq Y.

    2012-01-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical

  7. Tensor-based Dictionary Learning for Dynamic Tomographic Reconstruction

    Science.gov (United States)

    Tan, Shengqi; Zhang, Yanbo; Wang, Ge; Mou, Xuanqin; Cao, Guohua; Wu, Zhifang; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from far fewer projection views. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and a dynamic mouse cardiac imaging demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. PMID:25779991

  8. Tensor-based dictionary learning for dynamic tomographic reconstruction

    International Nuclear Information System (INIS)

    Tan, Shengqi; Wu, Zhifang; Zhang, Yanbo; Mou, Xuanqin; Wang, Ge; Cao, Guohua; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from far fewer projection views. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and a dynamic mouse cardiac imaging demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. (paper)

  9. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  10. UAV-based mapping, back analysis and trajectory modeling of a coseismic rockfall in Lefkada island, Greece

    Science.gov (United States)

    Saroglou, Charalampos; Asteriou, Pavlos; Zekkos, Dimitrios; Tsiambaos, George; Clark, Marin; Manousakis, John

    2018-01-01

    We present field evidence and a kinematic study of a rock block mobilized in the Ponti area by a Mw = 6.5 earthquake near the island of Lefkada on 17 November 2015. A detailed survey was conducted using an unmanned aerial vehicle (UAV) with an ultrahigh definition (UHD) camera, which produced a high-resolution orthophoto and a digital terrain model (DTM). The sequence of impact marks from the rock trajectory on the ground surface was identified from the orthophoto and field verified. Earthquake characteristics were used to estimate the acceleration of the rock slope and the initial condition of the detached block. Using the impact points from the measured rockfall trajectory, an analytical reconstruction of the trajectory was undertaken, which led to insights on the coefficients of restitution (CORs). The measured trajectory was compared with modeled rockfall trajectories using recommended parameters. However, the actual trajectory could not be accurately predicted, revealing limitations of existing rockfall analysis software used in engineering practice.
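The impact model typically used in such trajectory reconstructions splits the velocity at each ground contact into normal and tangential components scaled by the corresponding coefficients of restitution. A minimal sketch for horizontal ground (a deliberate simplification of the paper's 3D terrain case; names are illustrative):

```python
from math import sqrt

def rebound(vx, vy, rn, rt):
    """Impact on horizontal ground: reverse and scale the normal (vertical)
    velocity component by rn, scale the tangential one by rt."""
    return rt * vx, -rn * vy

def apex_height(vy, g=9.81):
    """Rise height reached after leaving the ground with vertical speed vy."""
    return vy * vy / (2.0 * g)

# a block dropped from 2 m with rn = 0.5 rebounds to rn^2 * 2 m = 0.5 m
vy_impact = -sqrt(2.0 * 9.81 * 2.0)
_, vy_up = rebound(0.0, vy_impact, rn=0.5, rt=1.0)
```

Back-analysis as in the abstract works the other way: from surveyed impact marks and reconstructed parabolic arcs, the pre- and post-impact velocities, and hence the CORs, are inferred.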

  11. Generic FMS Platform for Evaluation of Autonomous Trajectory-Based Operation Concepts, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II work is to develop a generic, advanced Flight Management System (FMS) for the evaluation of autonomous 4D-trajectory based operations...

  12. Evaluation of proxy-based millennial reconstruction methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Terry C.K.; Tsao, Min [University of Victoria, Department of Mathematics and Statistics, Victoria, BC (Canada); Zwiers, Francis W. [Environment Canada, Climate Research Division, Toronto, ON (Canada)

    2008-08-15

    A range of existing statistical approaches for reconstructing historical temperature variations from proxy data are compared using both climate model data and real-world paleoclimate proxy data. We also propose a new method for reconstruction that is based on a state-space time series model and Kalman filter algorithm. The state-space modelling approach and the recently developed RegEM method generally perform better than their competitors when reconstructing interannual variations in Northern Hemispheric mean surface air temperature. On the other hand, a variety of methods are seen to perform well when reconstructing surface air temperature variability on decadal time scales. An advantage of the new method is that it can incorporate additional, non-temperature, information into the reconstruction, such as the estimated response to external forcing, thereby permitting a simultaneous reconstruction and detection analysis as well as future projection. An application of these extensions is also demonstrated in the paper. (orig.)
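The Kalman-filter machinery behind the proposed state-space method can be illustrated in its simplest scalar form (random-walk state, direct noisy observation; purely illustrative, not the paper's reconstruction model):

```python
def kalman_step(x, P, Q):
    """Prediction for a random-walk state: mean unchanged, variance grows by Q."""
    return x, P + Q

def kalman_update(x, P, z, R):
    """Measurement update with a direct observation z of variance R."""
    K = P / (P + R)              # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

# filter a stream of noisy proxy observations of a constant signal (here all 1.0)
x, P = 0.0, 1.0
for _ in range(20):
    x, P = kalman_step(x, P, Q=0.01)
    x, P = kalman_update(x, P, z=1.0, R=0.5)
```

In the paper's setting, the state would carry the temperature signal (plus any forced-response terms) and each proxy series would contribute an observation equation.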

  13. Didactic trajectory of research in mathematics education using research-based learning

    Science.gov (United States)

    Charitas Indra Prahmana, Rully; Kusumah, Yaya S.; Darhim

    2017-10-01

    This study describes the role of research-based learning in designing a learning trajectory for research in mathematics education that enhances the research and academic writing skills of pre-service mathematics teachers. The method used is design research with three stages, namely preliminary design, teaching experiment, and retrospective analysis. The research subjects are the pre-service mathematics teacher class of 2012 from one higher education institution in Tangerang, Indonesia. The use of research-based learning in designing the learning trajectory plays a crucial role in triggering improvements in the pre-service teachers' research and academic writing skills. This study also describes the design principles and characteristics of the resulting didactic trajectory generated by the research-based learning syntax.

  14. Interval-based reconstruction for uncertainty quantification in PET

    Science.gov (United States)

    Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis

    2018-02-01

    A new directed interval-based tomographic reconstruction algorithm, called non-additive interval based expectation maximization (NIBEM), is presented. It uses non-additive modeling of the forward operator that provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum-likelihood expectation-maximization algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval-valued reconstruction.

  15. Blind compressed sensing image reconstruction based on alternating direction method

    Science.gov (United States)

    Liu, Qinan; Guo, Shuxu

    2018-04-01

    To reconstruct the original image when the sparse basis is unknown, this paper proposes an image reconstruction method based on a blind compressed sensing model. In this model, the image signal is regarded as the product of a sparse coefficient matrix and a dictionary matrix. Building on existing blind compressed sensing theory, the optimal solution is found by alternating minimization. The proposed method addresses the difficulty of choosing a sparse basis in compressed sensing, suppresses noise, and improves the quality of the reconstructed image. It ensures that the blind compressed sensing problem has a unique solution and can recover the original image signal from a complex environment with strong adaptability. The experimental results show that the proposed blind-compressed-sensing-based reconstruction algorithm recovers high-quality images under under-sampled conditions.
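The alternating scheme the abstract describes can be sketched generically: with the sensing matrix known, hold the dictionary fixed and take a proximal (soft-threshold) step on the sparse code, then hold the code fixed and take a gradient step on the dictionary. All sizes, initializations, and step rules below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k, lam = 20, 12, 20, 0.05
Phi = rng.standard_normal((m, n)) / np.sqrt(m)         # known sensing matrix
D_true = np.linalg.qr(rng.standard_normal((n, k)))[0]  # unknown dictionary
s_true = np.zeros(k); s_true[[2, 7, 11]] = [1.5, -2.0, 1.0]  # sparse code
y = Phi @ D_true @ s_true                              # compressed measurements

D, s = np.eye(n, k), np.zeros(k)                       # initial guesses

def objective(D, s):
    return 0.5 * np.sum((y - Phi @ D @ s) ** 2) + lam * np.sum(np.abs(s))

obj0 = objective(D, s)
for _ in range(100):
    # sparse-code step (ISTA): gradient step + soft threshold, D fixed
    A = Phi @ D
    L = np.linalg.norm(A, 2) ** 2 + 1e-12              # Lipschitz constant
    g = s - A.T @ (A @ s - y) / L
    s = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
    # dictionary step: one Lipschitz-safe gradient step, s fixed
    r = y - Phi @ D @ s
    Ld = (np.linalg.norm(Phi, 2) ** 2) * (s @ s) + 1e-12
    D = D + np.outer(Phi.T @ r, s) / Ld
obj1 = objective(D, s)
```

Both half-steps use step sizes bounded by the corresponding Lipschitz constants, so the composite objective is non-increasing across iterations.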

  16. Quick plasma equilibrium reconstruction based on GPU

    International Nuclear Information System (INIS)

    Xiao Bingjia; Huang, Y.; Luo, Z.P.; Yuan, Q.P.; Lao, L.

    2014-01-01

    A parallel code named P-EFIT, which can complete an equilibrium reconstruction iteration in 250 μs, is described. It is built on the CUDA™ architecture using a graphics processing unit (GPU). The optimization of medium-scale matrix multiplication on the GPU and an efficient parallel algorithm for solving block tri-diagonal linear systems are described. Benchmark tests were conducted: static tests confirm the accuracy of P-EFIT, and simulation tests demonstrate the feasibility of using P-EFIT for real-time reconstruction on 65×65 computation grids. (author)

  17. Knowledge-Based Trajectory Error Pattern Method Applied to an Active Force Control Scheme

    Directory of Open Access Journals (Sweden)

    Endra Pitowarno, Musa Mailah, Hishamuddin Jamaluddin

    2012-08-01

    Full Text Available The active force control (AFC) method is known as a robust control scheme that dramatically enhances the performance of a robot arm, particularly in compensating for disturbance effects. The main task of the AFC method is to estimate the inertia matrix in the feedback loop so as to provide the correct (motor) torque required to cancel out these disturbances. Several intelligent control schemes have already been introduced to enhance the estimation of the inertia matrix, such as those using neural networks, iterative learning, and fuzzy logic. In this paper, we propose an alternative scheme called the Knowledge-Based Trajectory Error Pattern Method (KBTEPM) to suppress the trajectory tracking error of the AFC scheme. The knowledge is developed from the trajectory tracking error characteristics observed in previous experimental results of the crude approximation method, which produces a unique and desirable error pattern when a trajectory command is applied. A simulation study of the AFC scheme with KBTEPM applied to a two-link planar manipulator was performed, in which a set of rule-based algorithms was derived. A number of previous AFC schemes are also reviewed as benchmarks. The simulation results show that the AFC-KBTEPM scheme reduces the trajectory tracking error significantly, even in the presence of the introduced disturbances. Key Words: active force control, estimated inertia matrix, robot arm, trajectory error pattern, knowledge-based.

  18. Quantitative tectonic reconstructions of Zealandia based on crustal thickness estimates

    Science.gov (United States)

    Grobys, Jan W. G.; Gohl, Karsten; Eagles, Graeme

    2008-01-01

    Zealandia is a key piece in the plate reconstruction of Gondwana. The positions of its submarine plateaus are major constraints on the best fit and breakup involving New Zealand, Australia, Antarctica, and associated microplates. As the submarine plateaus surrounding New Zealand consist of extended and highly extended continental crust, classic plate tectonic reconstructions assuming rigid plates and narrow plate boundaries fail to reconstruct these areas correctly. However, if the early breakup history is to be reconstructed, it is crucial to consider crustal stretching in a plate-tectonic reconstruction. We present a reconstruction of the basins around New Zealand (Great South Basin, Bounty Trough, and New Caledonia Basin) based on crustal balancing, an approach that takes into account the rifting and thinning processes affecting continental crust. As a first step, we computed a crustal thickness map of Zealandia using seismic, seismological, and gravity data. The crustal thickness map shows the submarine plateaus to have a uniform crustal thickness of 20-24 km and the basins to have a thickness of 12-16 km. We assumed that a reconstruction of Zealandia should close the basins and lead to a most uniform crustal thickness. We used the standard deviation of the reconstructed crustal thickness as a measure of uniformity. The reconstruction of the Campbell Plateau area shows that the amount of extension in the Bounty Trough and the Great South Basin is far smaller than previously thought. Our results indicate that the extension of the Bounty Trough and Great South Basin occurred simultaneously.
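The uniformity criterion used here can be illustrated in a few lines: given a toy crustal-thickness grid with thinned basins (all values invented for illustration), the reconstruction that closes the basins yields the smaller standard deviation of thickness.

```python
import numpy as np

# Toy crustal-thickness profiles (km): plateaus ~22-24 km, a rift basin ~14-15 km.
before = np.array([22.0, 23.0, 14.0, 15.0, 22.0, 24.0])

# A candidate reconstruction that "closes the basin" restores thinned crust
# (values illustrative only).
after = np.array([22.0, 23.0, 21.0, 22.0, 22.0, 24.0])

# The standard deviation of reconstructed thickness measures uniformity;
# the preferred reconstruction is the one that minimises it.
score_before, score_after = np.std(before), np.std(after)
```

In the paper's workflow this scalar would be evaluated over trial plate rotations, with the rotation minimising the standard deviation taken as the best reconstruction.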

  19. Prepectoral Implant-Based Breast Reconstruction

    Directory of Open Access Journals (Sweden)

    Lyndsey Highton, BMBCh, MA, FRCS(Plast)

    2017-09-01

    Conclusion: Prepectoral implant placement with ADM cover is emerging as an alternative approach for IBR. This method facilitates breast reconstruction with a good cosmetic outcome for patients who want a quick recovery without the potential compromise of pectoral muscle function and its associated problems.

  20. Rapidly 3D Texture Reconstruction Based on Oblique Photography

    Directory of Open Access Journals (Sweden)

    ZHANG Chunsen

    2015-07-01

    Full Text Available This paper proposes a fast city-texture reconstruction method based on oblique aerial images for building three-dimensional city models. Based on photogrammetry and computer vision theory, and using a digital surface model of city buildings obtained by prior processing, the geometric projection between object and image space is computed through the collinearity equations to obtain the three-dimensional structure and texture information. An optimization algorithm then selects the best texture for each object surface, realising automatic extraction of building facade textures and occlusion handling in densely built-up areas. Reconstruction results on real image textures show that the method offers a high degree of automation, vivid visual effects, and low cost, and provides an effective means for rapid, widespread texture reconstruction of 3D city models.

  1. Indoor Modelling from Slam-Based Laser Scanner: Door Detection to Envelope Reconstruction

    Science.gov (United States)

    Díaz-Vilariño, L.; Verbree, E.; Zlatanova, S.; Diakité, A.

    2017-09-01

    Updated and detailed indoor models are increasingly demanded for applications such as emergency management and navigational assistance. The consolidation of new portable and mobile acquisition systems has led to a higher availability of 3D point cloud data from indoors. In this work, we explore the combined use of point clouds and trajectories from a SLAM-based laser scanner to automate the reconstruction of building interiors. The methodology starts with door detection, since doors represent transitions from one indoor space to another, which provides an initial approach to the global configuration of the point cloud into building rooms. For this purpose, the trajectory is used to create a vertical point cloud profile in which doors are detected as local minima of vertical distances. As point cloud and trajectory are related by time stamp, this relation is used to subdivide the point cloud into subspaces according to the locations of the doors. The correspondence between subspaces and building rooms is not unambiguous: one subspace always corresponds to one room, but one room is not necessarily depicted by just one subspace, for example, in the case of a room containing several doors in which the acquisition is performed in a discontinuous way. The labelling problem is formulated as a combinatorial problem solved by minimum-energy optimization. Once the point cloud is subdivided into building rooms, the envelope (formed by walls, ceilings, and floors) is reconstructed for each space. The connectivity between spaces is included by adding the previously detected doors to the reconstructed model. The methodology is tested on a real case study.
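The door-detection step lends itself to a tiny sketch: along the scanner trajectory, the vertical distance to the ceiling dips sharply when the scanner passes under a door lintel, so doors appear as pronounced local minima of that profile. The profile values and threshold below are invented for illustration.

```python
import numpy as np

# Hypothetical vertical-distance profile along the trajectory: distance (m)
# from each trajectory point to the ceiling directly above it. Passing
# under a door lintel shows up as a pronounced local minimum.
profile = np.array([1.6, 1.6, 1.5, 0.9, 1.5, 1.6, 1.6, 0.8, 1.5, 1.6])

def door_candidates(d, drop=0.4):
    """Indices that are local minima and at least `drop` below both neighbours."""
    idx = []
    for i in range(1, len(d) - 1):
        if d[i] < d[i - 1] - drop and d[i] < d[i + 1] - drop:
            idx.append(i)
    return idx

doors = door_candidates(profile)  # trajectory indices of door passages
```

Because each trajectory point carries a time stamp shared with the point cloud, these indices directly induce the subdivision of the cloud into per-room subspaces.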

  2. INDOOR MODELLING FROM SLAM-BASED LASER SCANNER: DOOR DETECTION TO ENVELOPE RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    L. Díaz-Vilariño

    2017-09-01

    Full Text Available Updated and detailed indoor models are increasingly demanded for applications such as emergency management and navigational assistance. The consolidation of new portable and mobile acquisition systems has led to a higher availability of 3D point cloud data from indoors. In this work, we explore the combined use of point clouds and trajectories from a SLAM-based laser scanner to automate the reconstruction of building interiors. The methodology starts with door detection, since doors represent transitions from one indoor space to another, which provides an initial approach to the global configuration of the point cloud into building rooms. For this purpose, the trajectory is used to create a vertical point cloud profile in which doors are detected as local minima of vertical distances. As point cloud and trajectory are related by time stamp, this relation is used to subdivide the point cloud into subspaces according to the locations of the doors. The correspondence between subspaces and building rooms is not unambiguous: one subspace always corresponds to one room, but one room is not necessarily depicted by just one subspace, for example, in the case of a room containing several doors in which the acquisition is performed in a discontinuous way. The labelling problem is formulated as a combinatorial problem solved by minimum-energy optimization. Once the point cloud is subdivided into building rooms, the envelope (formed by walls, ceilings, and floors) is reconstructed for each space. The connectivity between spaces is included by adding the previously detected doors to the reconstructed model. The methodology is tested on a real case study.

  3. 3D dictionary learning based iterative cone beam CT reconstruction

    Directory of Open Access Journals (Sweden)

    Ti Bai

    2014-03-01

    Full Text Available Purpose: This work is to develop a 3D dictionary learning based cone beam CT (CBCT) reconstruction algorithm on graphics processing units (GPU) to improve the quality of sparse-view CBCT reconstruction with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3 × 3 × 3 was trained from a large number of blocks extracted from a high quality volume image. On this basis, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find the sparse representation of each block. To accelerate the time-consuming sparse coding in the 3D case, we implemented the sparse coding in a parallel fashion by taking advantage of the tremendous computational power of the GPU. The conjugate gradient least squares algorithm was adopted to minimize the data fidelity term. Evaluations are performed on a head-neck patient case. FDK reconstruction with the full dataset of 364 projections is used as the reference. We compared the proposed 3D dictionary learning based method with the tight frame (TF) method by performing reconstructions on a subset of 121 projections. Results: Compared to TF based CBCT reconstruction, which shows good overall performance, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures, remove more streaking artifacts, and also induce fewer blocky artifacts. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to capture structural information while suppressing noise, and hence to achieve high quality reconstruction under sparse-view conditions. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application.-------------------------------Cite this article as: Bai T, Yan H, Shi F, Jia X, Lou Y, Xu Q, Jiang S, Mou X. 3D dictionary learning based iterative cone beam CT reconstruction. Int J Cancer Ther Oncol 2014; 2(2):020240. DOI: 10
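Setting aside the Cholesky acceleration and GPU parallelism, the core sparse-coding step the abstract names, orthogonal matching pursuit, can be sketched in plain numpy. The dictionary size and test signal below are hypothetical, chosen only to show the greedy atom-selection / least-squares-refit loop.

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Greedy orthogonal matching pursuit: find a sparse code s with D @ s ~ y."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))  # best-matching atom
        support.append(j)
        # refit all selected coefficients by least squares
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef         # residual shrinks per atom
    s = np.zeros(D.shape[1])
    s[support] = coef
    return s

# Hypothetical dictionary of 8 unit-norm atoms in R^6 and a 2-sparse signal.
rng = np.random.default_rng(1)
D = rng.standard_normal((6, 8))
D /= np.linalg.norm(D, axis=0)
s_true = np.zeros(8); s_true[[1, 5]] = [2.0, -1.0]
y = D @ s_true
s_hat = omp(D, y, 2)
```

In the paper's setting this loop runs once per extracted 3 × 3 × 3 block, which is why a parallel GPU implementation pays off.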

  4. Back to the Future: Consistency-Based Trajectory Tracking

    Science.gov (United States)

    Kurien, James; Nayak, P. Pandurand; Norvig, Peter (Technical Monitor)

    2000-01-01

    Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.

  5. Indoor Trajectory Tracking Scheme Based on Delaunay Triangulation and Heuristic Information in Wireless Sensor Networks.

    Science.gov (United States)

    Qin, Junping; Sun, Shiwen; Deng, Qingxu; Liu, Limin; Tian, Yonghong

    2017-06-02

    Object tracking and detection is one of the most significant research areas for wireless sensor networks. Existing indoor trajectory tracking schemes in wireless sensor networks are based on continuous localization and moving object data mining. Indoor trajectory tracking based on the received signal strength indicator (RSSI) has received increased attention because it has low cost and requires no special infrastructure. However, RSSI tracking introduces uncertainty because of the inaccuracies of measurement instruments and the irregularities (instability, multipath, diffraction) of wireless signal transmission in indoor environments. Heuristic information provides key constraints for the trajectory tracking procedure. This paper proposes a novel trajectory tracking scheme based on Delaunay triangulation and heuristic information (TTDH). In this scheme, the entire field is divided into a series of triangular regions, and the common side of adjacent triangular regions is regarded as a regional boundary. The scheme detects heuristic information related to a moving object's trajectory, including boundaries and triangular regions. The trajectory is then formed by means of a dynamic time-warping position-fingerprint-matching algorithm with heuristic-information constraints. Field experiments show that the average error distance of the scheme is less than 1.5 m, and that error does not accumulate across regions.
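The matching core of such a scheme is classic dynamic time warping: the observed RSSI sequence is compared against stored reference fingerprints, and the reference with the smallest warped distance is selected. The sketch below uses invented RSSI values and plain one-dimensional DTW, not the paper's full position-fingerprint formulation with regional constraints.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two RSSI sequences (dBm)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Hypothetical fingerprints for two stored reference paths and one observation;
# the smaller DTW distance identifies the matching path.
observed = [-60, -62, -70, -75]
path_a = [-61, -63, -69, -74]
path_b = [-40, -45, -50, -52]
best = "a" if dtw_distance(observed, path_a) < dtw_distance(observed, path_b) else "b"
```

The heuristic constraints in TTDH would restrict which reference fingerprints are even considered, based on the boundaries and triangular regions the object has crossed.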

  6. Oceanic Flights and Airspace: Improving Efficiency by Trajectory-Based Operations

    Science.gov (United States)

    Fernandes, Alicia Borgman; Rebollo, Juan; Koch, Michael

    2016-01-01

    Oceanic operations suffer from multiple inefficiencies, including pre-departure planning that does not adequately consider uncertainty in the proposed trajectory, restrictions on the routes that a flight operator can choose for an oceanic crossing, time-consuming processes and procedures for amending en route trajectories, and difficulties exchanging data between Flight Information Regions (FIRs). These inefficiencies cause aircraft to fly suboptimal trajectories, burning fuel and time that could be conserved. A concept to support integration of existing and emerging capabilities and concepts is needed to transition to an airspace system that employs Trajectory Based Operations (TBO) to improve efficiency and safety in oceanic operations. This paper describes such a concept and the results of preliminary activities to evaluate the concept, including a stakeholder feedback activity, user needs analysis, and high level benefits analysis.

  7. Right adrenal vein: comparison between adaptive statistical iterative reconstruction and model-based iterative reconstruction.

    Science.gov (United States)

    Noda, Y; Goshima, S; Nagata, S; Miyoshi, T; Kawada, H; Kawai, N; Tanahashi, Y; Matsuo, M

    2018-06-01

    To compare right adrenal vein (RAV) visualisation and the degree of contrast enhancement on adrenal venous phase images reconstructed using adaptive statistical iterative reconstruction (ASiR) and model-based iterative reconstruction (MBIR) techniques. This prospective study was approved by the institutional review board, and written informed consent was waived. Fifty-seven consecutive patients who underwent adrenal venous phase imaging were enrolled. The same raw data were reconstructed using ASiR 40% and MBIR. An expert and a beginner independently reviewed the computed tomography (CT) images. RAV visualisation rates, background noise, and CT attenuation of the RAV, right adrenal gland, inferior vena cava (IVC), hepatic vein, and bilateral renal veins were compared between the two reconstruction techniques. RAV visualisation rates were higher with MBIR than with ASiR (95% versus 88%, p=0.13 for the expert and 93% versus 75%, p=0.002 for the beginner). RAV visualisation confidence ratings were significantly greater, and background noise significantly lower, with MBIR than with ASiR (p=0.0013 and 0.02). Reconstruction of adrenal venous phase images using MBIR significantly reduces background noise, leading to an improvement in RAV visualisation compared with ASiR. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  8. 3D reconstruction based on light field images

    Science.gov (United States)

    Zhu, Dong; Wu, Chunhong; Liu, Yunluo; Fu, Dongmei

    2018-04-01

    This paper proposes a method of reconstructing a three-dimensional (3D) scene from two light field images captured by a Lytro Illum camera. The work first extracts the sub-aperture images from the light field images and uses the scale-invariant feature transform (SIFT) for feature registration on the selected sub-aperture images. The structure-from-motion (SFM) algorithm is then applied to the registered sub-aperture images to reconstruct the three-dimensional scene, yielding a sparse 3D point cloud. The method shows that 3D reconstruction can be achieved with only two light field captures, rather than the dozen or more captures required by traditional cameras. This effectively addresses the time-consuming and laborious nature of 3D reconstruction with traditional digital cameras, achieving a more rapid, convenient, and accurate reconstruction.

  9. Three Dimensional Dynamic Model Based Wind Field Reconstruction from Lidar Data

    International Nuclear Information System (INIS)

    Raach, Steffen; Schlipf, David; Haizmann, Florian; Cheng, Po Wen

    2014-01-01

    Using the inflowing horizontal and vertical wind shears for an individual pitch controller is a promising method if blade bending measurements are not available. Due to the limited information provided by a lidar system, the reconstruction of shears in real time is a challenging task, especially for the horizontal shear in the presence of changing wind direction. The internal model principle has been shown to be a promising approach for estimating the shears and directions in 10-minute averages with real measurement data. The static model based wind vector field reconstruction is extended in this work by taking into account a dynamic reconstruction model based on Taylor's frozen turbulence hypothesis. The presented method provides time series over several seconds of the wind speed, shears and direction, which can be directly used in advanced optimal preview control. Therefore, this work is an important step towards the application of preview individual blade pitch control under realistic wind conditions. The method is tested using a turbulent wind field and a detailed lidar simulator. For the simulation, the turbulent wind field structure flows towards the lidar system and is continuously misaligned with respect to the horizontal axis of the wind turbine. Taylor's hypothesis is taken into account to model the wind evolution. For the reconstruction, the structure is discretized into several stages, where each stage is reduced to an effective wind speed superposed with a linear horizontal and vertical wind shear. Previous lidar measurements are shifted using, again, Taylor's hypothesis. The wind field reconstruction problem is then formulated as a nonlinear optimization problem, which minimizes the residual between the assumed wind model and the lidar measurements to obtain the misalignment angle and the effective wind speed and the wind shears for each stage. This method shows good results in reconstructing the wind characteristics of a three
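For a fixed misalignment angle, the per-stage estimation reduces to fitting an effective wind speed plus linear horizontal and vertical shears to the lidar measurements. A minimal linear least-squares version is sketched below; the focus-point geometry, wind values, and the omission of the nonlinear misalignment search are all simplifying assumptions.

```python
import numpy as np

# Hypothetical lidar focus points in the rotor plane: (y, z) positions in m.
pts = np.array([(-30.0, 0.0), (30.0, 0.0), (0.0, -30.0), (0.0, 30.0), (0.0, 0.0)])

# Per-stage wind model: u(y, z) = u0 + dh*y + dv*z
# (effective speed plus linear horizontal and vertical shear).
u0_true, dh_true, dv_true = 12.0, 0.02, 0.05
meas = u0_true + dh_true * pts[:, 0] + dv_true * pts[:, 1]  # noise-free demo

# Least-squares fit: minimise the residual between model and measurements.
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])
(u0, dh, dv), *_ = np.linalg.lstsq(A, meas, rcond=None)
```

The paper's formulation additionally estimates the misalignment angle, which couples the stages nonlinearly and is why the full problem is solved as a nonlinear optimization rather than one linear solve per stage.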

  10. Quantum process reconstruction based on mutually unbiased basis

    International Nuclear Information System (INIS)

    Fernandez-Perez, A.; Saavedra, C.; Klimov, A. B.

    2011-01-01

    We study a quantum process reconstruction based on the use of mutually unbiased projectors (MUB projectors) as input states for a D-dimensional quantum system, with D being a power of a prime number. This approach connects the results of quantum-state tomography using mutually unbiased bases with the coefficients of a quantum process, expanded in terms of MUB projectors. We also study the performance of the reconstruction scheme against random errors when measuring probabilities at the MUB projectors.

  11. Trajectory-based understanding of the quantum-classical transition for barrier scattering

    Science.gov (United States)

    Chou, Chia-Chun

    2018-06-01

    The quantum-classical transition of wave packet barrier scattering is investigated using a hydrodynamic description in the framework of a nonlinear Schrödinger equation. The nonlinear equation provides a continuous description for the quantum-classical transition of physical systems by introducing a degree of quantumness. Based on the transition equation, the transition trajectory formalism is developed to establish the connection between classical and quantum trajectories. The quantum-classical transition is then analyzed for the scattering of a Gaussian wave packet from an Eckart barrier and the decay of a metastable state. Computational results for the evolution of the wave packet and the transmission probabilities indicate that classical results are recovered when the degree of quantumness tends to zero. Classical trajectories are in excellent agreement with the transition trajectories in the classical limit, except in some regions where transition trajectories cannot cross because of the single-valuedness of the transition wave function. As the computational results demonstrate, the process that the Planck constant tends to zero is equivalent to the gradual removal of quantum effects originating from the quantum potential. This study provides an insightful trajectory interpretation for the quantum-classical transition of wave packet barrier scattering.

  12. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.

  13. 3D trajectories re-construction of droplets ejected in controlled tungsten melting studies in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Yang, Zhongshi; Krieger, K.; Lunt, T.; Brochard, F.; Briancon, J.-L.; Neu, R.; Dux, R.; Janzer, A.; Potzel, S.; Pütterich, T.

    2013-01-01

    Two fast visible-range camera systems with vertically and tangentially oriented crossed viewing fields in the divertor region of ASDEX Upgrade were used to observe tungsten (W) droplets ejected from a melting W-pin into the divertor plasma. To obtain the spatial (3D) trajectories of the tungsten droplets, the trajectory of a given droplet in the (2D) camera image coordinate system was derived for each view separately. From these data, the 3D droplet position was derived by computing the shortest line segment connecting crossed viewing chords to a given droplet, taking the centre coordinate of the line segment as the actual position. The experimental error of the derived position was estimated from the length of the connecting line segment. In the toroidal direction, the computed trajectories can be described by an analytical transport model for the droplets assuming an initial droplet diameter in the range of 60–100 μm. In the vertical direction, the vertical distance droplets travelled when reaching the edge of the viewing field also agrees with the model predictions. However, discrepancies are found with respect to the time point at which a droplet reverses its initial downward motion due to centrifugal forces exceeding gravity and plasma-generated force components.
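The geometric core of this triangulation, finding the shortest segment between two crossed viewing chords, taking its midpoint as the droplet position and its length as the error estimate, can be sketched directly. The chord coordinates below are made up for illustration.

```python
import numpy as np

def chord_midpoint(p1, d1, p2, d2):
    """Midpoint of the shortest segment connecting two viewing chords.

    Each chord is the line p + t*d. Returns (midpoint, segment length);
    the segment length doubles as an error estimate for the position."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # approaches 0 for near-parallel chords
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1, q2 = p1 + t1 * d1, p2 + t2 * d2  # closest points on each chord
    return (q1 + q2) / 2, np.linalg.norm(q1 - q2)

# Two hypothetical chords that intersect at (1, 1, 1):
mid, err = chord_midpoint(np.array([0.0, 1.0, 1.0]), np.array([1.0, 0.0, 0.0]),
                          np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0]))
```

With real, imperfectly calibrated cameras the two chords never intersect exactly, so `err` is nonzero and quantifies the triangulation uncertainty, as described in the abstract.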

  14. Designing on ICT reconstruction software based on DSP techniques

    International Nuclear Information System (INIS)

    Liu Jinhui; Xiang Xincheng

    2006-01-01

    The convolution back-projection (CBP) algorithm is generally used to realize CT image reconstruction in ICT, and is usually performed on a PC or workstation. In order to add multi-platform operation capability to CT reconstruction software, a CT reconstruction method based on modern digital signal processor (DSP) techniques is proposed and realized in this paper. A hardware system based on TI's C6701 DSP processor was selected to support the software. The CT reconstruction software is compiled using only assembly language specific to the DSP hardware. The software runs on TI's C6701 EVM board, takes CT data as input, and produces CT images that satisfy practical requirements. (authors)
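For readers unfamiliar with CBP, its two stages, ramp-filtering each projection and then smearing (back-projecting) it across the image grid, can be sketched in a few lines of numpy. This generic sketch on a toy point-source sinogram is not the DSP assembly implementation described above.

```python
import numpy as np

def fbp(sinogram, thetas):
    """Minimal filtered (convolution) back-projection on an n x n grid."""
    n = sinogram.shape[1]
    # ramp (Ram-Lak) filter applied per projection in the Fourier domain
    ramp = np.abs(np.fft.fftfreq(n))
    filt = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    xs = np.arange(n) - n // 2
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n, n))
    for proj, th in zip(filt, thetas):
        t = X * np.cos(th) + Y * np.sin(th) + n // 2  # detector coordinate
        recon += np.interp(t, np.arange(n), proj)     # back-project (smear)
    return recon * np.pi / len(thetas)

# Sinogram of a centred point source: a spike at the detector centre
# for every view angle (toy data).
n, n_views = 32, 36
thetas = np.linspace(0.0, np.pi, n_views, endpoint=False)
sino = np.zeros((n_views, n)); sino[:, n // 2] = 1.0
recon = fbp(sino, thetas)
```

The inner loop is dominated by per-projection convolution (here done via FFT) and the back-projection sum, which is exactly the workload mapped onto the DSP in the paper.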

  15. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)]

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
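The kind of parameterised geometric trajectory such a tool generates can be mimicked with a short script: a few variables (point count, radius, total MU) define an entire family of circular paths. The axis names and control-point layout below are illustrative placeholders, not the actual Developer Mode XML schema.

```python
import math

def circle_trajectory(n_points, radius_cm, mu_total):
    """Sample a circular couch path as (MU, lateral, longitudinal) control
    points, with MU accruing linearly along the arc. All field names are
    hypothetical stand-ins for the real schema's axis entries."""
    pts = []
    for i in range(n_points):
        u = i / (n_points - 1)          # normalised arc position in [0, 1]
        phi = 2 * math.pi * u
        pts.append({"mu": mu_total * u,
                    "lat_cm": radius_cm * math.cos(phi),
                    "lng_cm": radius_cm * math.sin(phi)})
    return pts

traj = circle_trajectory(9, 2.0, 100.0)  # one member of the parameterised family
```

Changing `radius_cm` or `n_points` regenerates the whole control-point list, which is the spreadsheet-parameterization idea the abstract describes.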

  16. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  17. Multifractal signal reconstruction based on singularity power spectrum

    International Nuclear Information System (INIS)

    Xiong, Gang; Yu, Wenxian; Xia, Wenxiang; Zhang, Shuning

    2016-01-01

    Highlights: • We propose a novel multifractal reconstruction method based on singularity power spectrum analysis (MFR-SPS). • The proposed MFR-SPS method has better power characteristics than the algorithm in Fraclab. • Further, the SPS-ISE algorithm performs better than the SPS-MFS algorithm. • Based on the proposed MFR-SPS method, we can reconstruct singularity white fractal noise (SWFN) and linear singularity modulation (LSM) multifractal signals that are, in an equivalent sense, analogous to the linear frequency modulation (LFM) signal and WGN in the Fourier domain. - Abstract: Fractal reconstruction (FR) and multifractal reconstruction (MFR) can be considered the inverse problem of singularity spectrum analysis, and it is challenging to reconstruct a fractal signal in accord with a multifractal spectrum (MFS). Because fractal reconstruction has multiple solutions, the traditional FR/MFR methods, such as the FBM-based method, the wavelet-based method and random wavelet series, fail to reconstruct fractal signals deterministically; moreover, those methods neglect the power spectral distribution in the singular domain. In this paper, we propose a novel MFR method based on the singularity power spectrum (SPS). Assuming a consistent uniform covering of the multifractal measurement, we control the traditional power law of each scale of wavelet coefficients based on the instantaneous singularity exponents (ISE) or the MFS, simultaneously control the singularity power law based on the SPS, and deduce the principle and algorithm of SPS-based MFR. Reconstruction simulations and error analysis of the estimated ISE, MFS and SPS show the effectiveness of the proposed methods and their improvement over results obtained with the Fraclab package.

  18. Structured Light-Based 3D Reconstruction System for Plants

    OpenAIRE

    Nguyen, Thuy Tuong; Slaughter, David C.; Max, Nelson; Maloof, Julin N.; Sinha, Neelima

    2015-01-01

    Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends of recent years. Many systems have been built to model different real-world subjects, but there is a lack of a completely robust system for plants. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud regi...

  19. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes up over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with a memory optimization, is presented in this paper. Unlike the usual method, projection lines are located based on each pixel, and the subsequent projection coefficient computation can reuse these results. The correlation between projection lines and pixels can be exploited to optimize the computation. (authors)
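
    As context for the pixel-based coefficient computation above, the SART update itself is compact. A minimal numpy sketch, assuming the projection coefficient matrix A (the quantity whose computation the paper accelerates) has already been built:

```python
import numpy as np

def sart(A, p, n_iter=20, relax=1.0):
    """Simultaneous Algebraic Reconstruction Technique (sketch).

    A : (n_rays, n_pixels) projection coefficient matrix -- the
        quantity whose computation the paper accelerates.
    p : (n_rays,) measured projections.
    """
    x = np.zeros(A.shape[1])
    row_sums = A.sum(axis=1)            # total weight along each ray
    col_sums = A.sum(axis=0)            # total weight through each pixel
    row_sums[row_sums == 0] = 1.0       # guard empty rays/pixels
    col_sums[col_sums == 0] = 1.0
    for _ in range(n_iter):
        residual = (p - A @ x) / row_sums
        x += relax * (A.T @ residual) / col_sums
    return x

# tiny consistent system: two pixels, three rays
A = np.array([[1., 0.], [0., 1.], [1., 1.]])
x_hat = sart(A, A @ np.array([2., 3.]))
```

    Because A enters only through matrix-vector products, any speed-up in assembling or storing A translates directly into faster iterations.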

  20. An Orthogonal Multi-Swarm Cooperative PSO Algorithm with a Particle Trajectory Knowledge Base

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2017-01-01

    Full Text Available A novel orthogonal multi-swarm cooperative particle swarm optimization (PSO) algorithm with a particle trajectory knowledge base is presented in this paper. Different from traditional PSO algorithms and other PSO variants, the proposed orthogonal multi-swarm cooperative PSO algorithm not only introduces an orthogonal initialization mechanism and a particle trajectory knowledge base for multi-dimensional optimization problems, but also conceives a new adaptive cooperation mechanism to accomplish the information interaction among swarms and particles. Experiments are conducted on a set of benchmark functions, and the results show better performance than the traditional PSO algorithm in terms of convergence, computational efficiency and avoidance of premature convergence.
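
    The orthogonal initialization, knowledge base and cooperation mechanism are the paper's contributions; the canonical global-best PSO core that such variants extend can be sketched as follows (the inertia and acceleration constants are common defaults, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    """Plain global-best PSO (the baseline the paper's variant extends)."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))     # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()                                # personal bests
    pbest_f = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()          # global best
    w, c1, c2 = 0.72, 1.49, 1.49                    # common constants
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# minimize the 5-D sphere function
best, val = pso(lambda z: float(np.sum(z ** 2)), dim=5)
```

    The multi-swarm variant would run several such swarms with orthogonally designed initial positions and exchange best solutions between them.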

  1. Fast Dictionary-Based Reconstruction for Diffusion Spectrum Imaging

    Science.gov (United States)

    Bilgic, Berkin; Chatnuntawech, Itthi; Setsompop, Kawin; Cauley, Stephen F.; Yendiki, Anastasia; Wald, Lawrence L.; Adalsteinsson, Elfar

    2015-01-01

    Diffusion Spectrum Imaging (DSI) reveals detailed local diffusion properties at the expense of substantially long imaging times. It is possible to accelerate acquisition by undersampling in q-space, followed by image reconstruction that exploits prior knowledge on the diffusion probability density functions (pdfs). Previously proposed methods impose this prior in the form of sparsity under wavelet and total variation (TV) transforms, or under adaptive dictionaries that are trained on example datasets to maximize the sparsity of the representation. These compressed sensing (CS) methods require full-brain processing times on the order of hours using Matlab running on a workstation. This work presents two dictionary-based reconstruction techniques that use analytical solutions, and are two orders of magnitude faster than the previously proposed dictionary-based CS approach. The first method generates a dictionary from the training data using Principal Component Analysis (PCA), and performs the reconstruction in the PCA space. The second proposed method applies reconstruction using pseudoinverse with Tikhonov regularization with respect to a dictionary. This dictionary can either be obtained using the K-SVD algorithm, or it can simply be the training dataset of pdfs without any training. All of the proposed methods achieve reconstruction times on the order of seconds per imaging slice, and have reconstruction quality comparable to that of the dictionary-based CS algorithm. PMID:23846466

  2. Fast dictionary-based reconstruction for diffusion spectrum imaging.

    Science.gov (United States)

    Bilgic, Berkin; Chatnuntawech, Itthi; Setsompop, Kawin; Cauley, Stephen F; Yendiki, Anastasia; Wald, Lawrence L; Adalsteinsson, Elfar

    2013-11-01

    Diffusion spectrum imaging reveals detailed local diffusion properties at the expense of substantially long imaging times. It is possible to accelerate acquisition by undersampling in q-space, followed by image reconstruction that exploits prior knowledge on the diffusion probability density functions (pdfs). Previously proposed methods impose this prior in the form of sparsity under wavelet and total variation transforms, or under adaptive dictionaries that are trained on example datasets to maximize the sparsity of the representation. These compressed sensing (CS) methods require full-brain processing times on the order of hours using MATLAB running on a workstation. This work presents two dictionary-based reconstruction techniques that use analytical solutions, and are two orders of magnitude faster than the previously proposed dictionary-based CS approach. The first method generates a dictionary from the training data using principal component analysis (PCA), and performs the reconstruction in the PCA space. The second proposed method applies reconstruction using pseudoinverse with Tikhonov regularization with respect to a dictionary. This dictionary can either be obtained using the K-SVD algorithm, or it can simply be the training dataset of pdfs without any training. All of the proposed methods achieve reconstruction times on the order of seconds per imaging slice, and have reconstruction quality comparable to that of the dictionary-based CS algorithm.
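
    The second method's closed form is a standard Tikhonov-regularized pseudoinverse. A minimal sketch, with A standing in for the combined undersampling-plus-dictionary operator (names and sizes are illustrative, not from the paper):

```python
import numpy as np

def tikhonov_recon(A, y, lam=1e-6):
    """Closed-form Tikhonov-regularized pseudoinverse:
    a_hat = (A^T A + lam*I)^{-1} A^T y.
    The bracketed matrix depends only on A, so it can be factored
    once and reused for every voxel -- the source of the speed-up
    over iterative CS solvers.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))        # undersampling x dictionary
a_true = rng.standard_normal(10)
a_hat = tikhonov_recon(A, A @ a_true)    # noiseless recovery
```

    In practice lam trades noise suppression against bias and would be tuned on training data.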

  3. Robotic excavator trajectory control using an improved GA based PID controller

    Science.gov (United States)

    Feng, Hao; Yin, Chen-Bo; Weng, Wen-wen; Ma, Wei; Zhou, Jun-jing; Jia, Wen-hua; Zhang, Zi-li

    2018-05-01

    In order to achieve excellent trajectory tracking performance, an improved genetic algorithm (IGA) is presented to search for the optimal proportional-integral-derivative (PID) controller parameters for a robotic excavator. First, the mathematical models of the kinematics and the electro-hydraulic proportional control system of the excavator are analyzed using the mechanism modeling method. On this basis, the actual model of the electro-hydraulic proportional system is established through an identification experiment. Furthermore, the population, the fitness function, and the crossover and mutation probabilities of the standard genetic algorithm (SGA) are improved: the initial PID parameters are calculated by the Ziegler-Nichols (Z-N) tuning method and the initial population is generated near them; the fitness function is transformed to maintain the diversity of the population; and the crossover and mutation probabilities are adjusted automatically to avoid premature convergence. Moreover, a simulation study is carried out to evaluate the time-response performance of the proposed controller, i.e., the IGA-based PID against the SGA- and Z-N-based PID controllers with a step signal. The simulation study showed that the proposed controller provides the shortest rise time and settling time, 1.23 s and 1.81 s respectively, among the tested controllers. Finally, two types of trajectories are designed to validate the performance of the control algorithms, and experiments are performed on the excavator trajectory control experimental platform. The experimental work demonstrated that the proposed IGA-based PID controller improves the trajectory accuracy of the horizontal-line and slope-line trajectories by 23.98% and 23.64%, respectively, in comparison to the SGA-tuned PID controller. The results further indicate that the proposed IGA-tuned PID controller is effective in improving tracking accuracy and may be employed in the trajectory control of an actual excavator.
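
    The Z-N seeding step uses the classic tuning table. A minimal sketch, assuming the ultimate-sensitivity variant of the rule (the abstract does not specify which Z-N table the authors use):

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic Ziegler-Nichols ultimate-sensitivity rules.

    Ku : ultimate gain at which the closed loop oscillates steadily
    Tu : period of that sustained oscillation (s)
    Returns (Kp, Ki, Kd) for a parallel-form PID controller.
    """
    Kp = 0.6 * Ku
    Ti = 0.5 * Tu            # integral time
    Td = 0.125 * Tu          # derivative time
    return Kp, Kp / Ti, Kp * Td

Kp, Ki, Kd = ziegler_nichols_pid(Ku=10.0, Tu=2.0)
```

    The GA population would then be initialized in a neighborhood of (Kp, Ki, Kd) rather than uniformly over the search space.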

  4. Mobile Robot Based on the Selection of Fuzzy Behaviours for following Trajectories in Crops

    Directory of Open Access Journals (Sweden)

    Claudio Urrea

    2016-06-01

    Full Text Available This article addresses the problem of trajectory tracking in crops by a weed sprayer mobile robot (WSMR). This problem arises because, to fumigate, the robot must follow a predefined path and avoid any obstacles it may encounter. To achieve both trajectory tracking and obstacle avoidance, a control scheme based on different behaviours is proposed, consisting essentially of an adaptive controller with a reference model for trajectory tracking and a fuzzy reactive controller for obstacle avoidance. Each of these controllers is executed according to the selection made by the fuzzy behaviour controller, which uses information delivered by anti-collision sensors located on the robot. As a result of the implementation of this behaviour-based architecture, and by means of computer simulations and experimental laboratory tests, the WSMR demonstrates the capability of autonomously following a desired trajectory between the rows of a crop in the presence of obstacles. The results are evaluated by taking into account the trajectory tracking curves and the operating requirements of each controller, as well as different error indices for quantitatively evaluating the proposed control scheme.

  5. HOTSPOTS DETECTION FROM TRAJECTORY DATA BASED ON SPATIOTEMPORAL DATA FIELD CLUSTERING

    Directory of Open Access Journals (Sweden)

    K. Qin

    2017-09-01

    Full Text Available City hotspots refer to areas that residents visit frequently and where large traffic flows exist; they reflect people's travel patterns and the distribution of urban functional areas. Taxi trajectory data contain abundant information about urban functions and citizen activities, and extracting interesting city hotspots from them can be important for urban planning, traffic management, public travel services, etc. To detect city hotspots and discover the various changing patterns among them, we apply a data field-based cluster analysis technique to the pick-up and drop-off points of taxi trajectory data and improve the method by introducing a time weight, normalized for estimating the potential value in the data field. In the light of the new potential function, short distances and short time differences play a powerful role: a region full of trajectory points, regarded as a hotspot area, has a higher potential value, while a region with sparse trajectory points has a lower potential value. The taxi trajectory data of Wuhan city, China, on May 1, 6 and 9, 2015, are taken as the experimental data. From the results, we identify sustained hotspot areas and transient hotspot areas in Wuhan based on the spatiotemporal data field method. Further study will focus on parameter optimization and the interaction among hotspot areas.
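
    A time-weighted data-field potential of this kind can be sketched as follows; the Gaussian forms and the parameter names sigma and tau are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def potential(queries, pts, times, t0, sigma=0.5, tau=3600.0):
    """Spatiotemporal data-field potential (sketch).

    Each trajectory point contributes a Gaussian term in distance,
    attenuated by a Gaussian time weight, so points close in both
    space and time dominate the potential at a query location.
    """
    d = np.linalg.norm(queries[:, None, :] - pts[None, :, :], axis=2)
    dt = np.abs(t0 - times)[None, :]
    w = np.exp(-(dt / tau) ** 2)               # time weight
    return (w * np.exp(-(d / sigma) ** 2)).sum(axis=1)

# dense pick-up cluster near the origin vs. an isolated point
pts = np.array([[0., 0.], [0.1, 0.], [0., 0.1], [5., 5.]])
times = np.array([0., 10., 20., 30.])
pot = potential(np.array([[0., 0.], [5., 0.]]), pts, times, t0=0.0)
```

    Hotspot detection would then threshold or cluster the high-potential regions of this field.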

  6. Automatic control logics to eliminate xenon oscillation based on Axial Offsets Trajectory Method

    International Nuclear Information System (INIS)

    Shimazu, Yoichiro

    1996-01-01

    We have proposed the Axial Offsets (AO) Trajectory Method for xenon oscillation control in pressurized water reactors. A key feature of this method is that it clearly gives the control operations necessary to eliminate xenon oscillations. Using this feature, automatic control logics for xenon oscillations are expected to be simple and easy to realize. We investigated such automatic control logics. The AO Trajectory Method yielded a very simple logic for eliminating xenon oscillations alone; however, additional considerations were necessary to eliminate a xenon oscillation while attaining a given axial power distribution. Another control logic, based on modern control theory, was also studied for comparison of control performance with the new logic. The results show that the automatic control logics based on the AO Trajectory Method are very simple and effective. (author)

  7. A Near-Term Concept for Trajectory Based Operations with Air/Ground Data Link Communication

    Science.gov (United States)

    McNally, David; Mueller, Eric; Thipphavong, David; Paielli, Russell; Cheng, Jinn-Hwei; Lee, Chuhan; Sahlman, Scott; Walton, Joe

    2010-01-01

    An operating concept and the required system components for trajectory-based operations with air/ground data link for today's en route and transition airspace are proposed. Controllers are fully responsible for separation as they are today, and no new aircraft equipage is required. Trajectory automation computes integrated solutions to problems like metering, weather avoidance, traffic conflicts and the desire to find and fly more time/fuel-efficient flight trajectories. A common ground-based system supports all levels of aircraft equipage and performance, including aircraft equipped and not equipped for data link. User interface functions on the radar controller's display make trajectory-based clearance advisories easy to visualize, modify if necessary, and implement. Laboratory simulations (without human operators) were conducted to test the integrated operation of selected system components with uncertainty modeling. Results are based on 102 hours of Fort Worth Center traffic recordings involving over 37,000 individual flights. The presence of uncertainty had a marginal effect (5%) on minimum-delay conflict resolution performance, and wind-favorable routes had no effect on detection and resolution metrics. Flight plan amendments and clearances were substantially reduced compared to today's operations. Top-of-descent prediction errors are the largest cause of failure, indicating that better descent predictions are needed to reliably achieve fuel-efficient descent profiles in medium to heavy traffic. Improved conflict detection for climbing flights could enable substantially more continuous climbs to cruise altitude. Unlike today's Conflict Alert, tactical automation must alert when an altitude amendment is entered, but before the aircraft starts the maneuver. In every other failure case tactical automation prevented losses of separation. A real-time prototype trajectory-automation system is running now and could be made ready for operational testing at an en route

  8. The trajectory of hope: pathways to find meaning and reconstructing the self after a spinal cord injury.

    Science.gov (United States)

    Parashar, D

    2015-07-01

    This is a qualitative study. To evaluate and track the importance and the continuum of hope, and its trajectory, from the point of view of the individual with a spinal cord injury (SCI) and a rehabilitation psychologist. This study was conducted in the Indian Spinal Injuries Centre, New Delhi, India, and in patients' homes in the National Capital Region, India. Twenty individuals with an SCI were interviewed for the study at intervals of 2 weeks, 6 months, 1 year and 2 years after the time of injury. Semi-structured interviews were conducted, in which the following theoretical research questions were investigated: 'What is the meaning, relevance and significance of hope in the individual's life following an SCI? Does the meaning and subject of hope change at different points in time?' Three distinct themes emerged in the trajectory of hope: (1) hope for a complete recovery; (2) hope for self-reliance despite the injury; and (3) hope for an optimum quality of life. The make-up of each theme, its significance and contribution to recovery and/or rehabilitation, while tracking the influence of time since injury, family and friends, as well as other agencies and pathways, are discussed. After sustaining a life-altering injury, hope becomes the force that spurs individuals on. Psychologists and rehabilitation counselors need to focus on instilling realistic hope, goal setting, sustaining motivation, enabling adaptive appraisals and problem-solving. Further recommendations include developing and testing interventions against the context of the continuum of hope.

  9. Flight trajectory recreation and playback system of aerial mission based on ossimplanet

    OpenAIRE

    Wu, Wu; Hu, Jiulin; Huang, Xiaofang; Chen, Huijie; Sun, Bo

    2014-01-01

    Recreation of flight trajectories is an important research area. The design of a flight trajectory recreation and playback system is presented in this paper. Rather than transferring the flight data into diagrams, graphs and tables, the flight data are visualized on the 3D globe of ossimPlanet. ossimPlanet is an open-source 3D geospatial globe viewer, and the system is realized based on an analysis of it. Users are allowed to choose their flight of interest from an aerial mission. The aerial ...

  10. Study of talcum charging status in parallel plate electrostatic separator based on particle trajectory analysis

    Science.gov (United States)

    Yunxiao, CAO; Zhiqiang, WANG; Jinjun, WANG; Guofeng, LI

    2018-05-01

    Electrostatic separation has been extensively used in mineral processing and has the potential to separate gangue minerals from raw talcum ore. In electrostatic separation, the particle charging status is one of the important influencing factors. To describe the charging status of talcum particles in a parallel plate electrostatic separator accurately, this paper proposes a modern image processing method. Based on the actual trajectories obtained from sequential images of particle movement and an analysis of the physical forces applied to a charged particle, a numerical model is built that can calculate the charge-to-mass ratios representing the charging status of a particle and simulate the particle trajectories. The simulated trajectories agree well with the experimental results obtained by image processing. In addition, chemical composition analysis is employed to reveal the relationship between iron gangue mineral content and charge-to-mass ratios. The results show that the proposed method is effective for describing the particle charging status in electrostatic separation.
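
    The force-balance idea behind such a numerical model can be illustrated with a deliberately simplified sketch: in a uniform field E and with drag neglected, the horizontal acceleration satisfies a_x = (q/m)E, so q/m follows from finite differences of the imaged positions (the uniform-field, no-drag assumptions are ours, not the paper's full model):

```python
import numpy as np

def charge_to_mass(xs, dt, E):
    """Estimate q/m from the horizontal track between parallel plates.

    xs : horizontal positions from the image sequence (m)
    dt : frame interval (s)
    E  : field strength between the plates (V/m)
    """
    ax = np.diff(xs, 2) / dt ** 2     # second differences -> acceleration
    return ax.mean() / E              # q/m = a_x / E, in C/kg

# synthetic track with known a_x = 2.0 m/s^2 and E = 1e5 V/m
t = np.arange(10) * 0.01
qm = charge_to_mass(0.5 * 2.0 * t ** 2, dt=0.01, E=1e5)
```

    With real imaging data, the second differences would be noisy and a least-squares parabola fit to xs(t) would be the more robust estimator.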

  11. Gamma regularization based reconstruction for low dose CT

    International Nuclear Information System (INIS)

    Zhang, Junfeng; Chen, Yang; Hu, Yining; Luo, Limin; Shu, Huazhong; Li, Bicao; Liu, Jin; Coatrieux, Jean-Louis

    2015-01-01

    Reducing radiation exposure in computerized tomography is today a major concern in radiology. Low dose computerized tomography (LDCT) offers a sound way to deal with this problem. However, more severe noise is observed in the reconstructed CT images under low dose scan protocols (e.g. lowered tube current or voltage values). In this paper we propose a Gamma regularization based algorithm for LDCT image reconstruction. This solution is flexible and provides a good balance between the regularizations based on the l0-norm and the l1-norm. We evaluate the proposed approach using projection data from simulated phantoms and scanned Catphan phantoms. Qualitative and quantitative results show that the Gamma regularization based reconstruction performs better in both edge preservation and noise suppression when compared with other norms. (paper)

  12. Analysing co-articulation using frame-based feature trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2010-11-01

    Full Text Available The authors investigate several approaches aimed at a more detailed understanding of co-articulation in spoken utterances. They find that the Euclidean difference between instantaneous frame-based feature values and the mean values of these features...

  13. Information-theoretic discrepancy based iterative reconstructions (IDIR) for polychromatic x-ray tomography

    International Nuclear Information System (INIS)

    Jang, Kwang Eun; Lee, Jongha; Sung, Younghun; Lee, SeongDeok

    2013-01-01

    Purpose: X-ray photons generated from a typical x-ray source for clinical applications exhibit a broad range of wavelengths, and the interactions between individual particles and biological substances depend on particles' energy levels. Most existing reconstruction methods for transmission tomography, however, neglect this polychromatic nature of measurements and rely on the monochromatic approximation. In this study, we developed a new family of iterative methods that incorporates the exact polychromatic model into tomographic image recovery, which improves the accuracy and quality of reconstruction. Methods: The generalized information-theoretic discrepancy (GID) was employed as a new metric for quantifying the distance between the measured and synthetic data. By using special features of the GID, the objective function for polychromatic reconstruction, which contains a double integral over the wavelength and the trajectory of incident x-rays, was simplified to a paraboloidal form without using the monochromatic approximation. More specifically, the original GID was replaced with a surrogate function with two auxiliary, energy-dependent variables. Subsequently, the alternating minimization technique was applied to solve the double minimization problem. Based on the optimization transfer principle, the objective function was further simplified to the paraboloidal equation, which leads to a closed-form update formula. Numerical experiments on the beam-hardening correction and material-selective reconstruction were conducted to compare and assess the performance of conventional methods and the proposed algorithms. Results: The authors found that the GID determines the distance between its two arguments in a flexible manner. In this study, three groups of GIDs with distinct data representations were considered. The authors demonstrated that one type of GIDs that comprises “raw” data can be viewed as an extension of existing statistical reconstructions; under a

  14. Matrix-based image reconstruction methods for tomography

    International Nuclear Information System (INIS)

    Llacer, J.; Meng, J.D.

    1984-10-01

    Matrix methods of image reconstruction have not been used, in general, because of the large size of practical matrices, ill-conditioning upon inversion and the success of Fourier-based techniques. An exception is the work that has been done at the Lawrence Berkeley Laboratory on imaging with accelerated radioactive ions. An extension of that work to more general imaging problems shows that, with a correct formulation of the problem, positron tomography with ring geometries results in well-behaved matrices which can be used for image reconstruction with no distortion of the point response in the field of view and flexibility in the design of the instrument. Maximum Likelihood Estimator (MLE) methods of reconstruction, which use system matrices tailored to specific instruments and do not need matrix inversion, are shown to produce good preliminary images. A parallel processing computer structure based on multiple inexpensive microprocessors is proposed as a system to implement the matrix-MLE methods. 14 references, 7 figures
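
    The inversion-free property comes from the multiplicative MLEM fixed point. A generic sketch of that iteration (not the authors' exact implementation):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """Generic MLEM fixed point: x <- (x / s) * A^T(y / (A x)),
    with sensitivity s = A^T 1. Multiplicative updates keep x
    nonnegative and require no matrix inversion.
    """
    x = np.ones(A.shape[1])
    s = A.T @ np.ones(A.shape[0])        # sensitivity image
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12          # guard division by zero
        x *= (A.T @ (y / proj)) / s
    return x

A = np.array([[1., 0.], [0., 1.], [1., 1.]])   # toy system matrix
x_hat = mlem(A, A @ np.array([2., 3.]))
```

    Since each iteration is just two matrix-vector products, the work parallelizes naturally over rays and pixels, which is what the proposed multi-microprocessor structure exploits.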

  15. An Effective Privacy Architecture to Preserve User Trajectories in Reward-Based LBS Applications

    Directory of Open Access Journals (Sweden)

    A S M Touhidul Hasan

    2018-02-01

    Full Text Available How can training performance data (e.g., running or walking routes) be collected, measured, and published in a mobile program while preserving user privacy? This question is becoming important in the context of the growing use of reward-based location-based service (LBS) applications, which aim to promote employee training activities and to share such data with insurance companies in order to reduce an organization's healthcare insurance costs. One of the main concerns with such applications is the privacy of user trajectories, because the applications normally collect user locations over time together with identities. The leak of identified trajectories often results in personal privacy breaches. For instance, a trajectory would expose user interest in places and behaviors in time through inference and linking attacks. This information can be used for spam advertisements or individual-based assaults. To the best of our knowledge, no existing studies can be directly applied to solve the problem while keeping data utility. In this paper, we identify the personal privacy problem in a reward-based LBS application and propose a privacy architecture with a bounded perturbation technique to protect users' trajectories from privacy breaches. Bounded perturbation uses a global location set (GLS) to anonymize the trajectory data. In addition, bounded perturbation will not generate any visiting points that are impossible to visit in real time. Experimental results on real-world datasets demonstrate that the proposed bounded perturbation can effectively anonymize location information while preserving data utility compared to existing methods.
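
    A minimal sketch of the bounded-perturbation idea: each published point is drawn from the global location set within a distance bound of the true point, so outputs are always real, visitable locations. The uniform-choice rule and all names here are illustrative assumptions, not the paper's exact mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)

def bounded_perturb(trajectory, gls, bound):
    """Replace each trajectory point with a GLS point within `bound`."""
    out = []
    for p in trajectory:
        d = np.linalg.norm(gls - p, axis=1)
        candidates = np.flatnonzero((d > 0) & (d <= bound))
        # fall back to the nearest GLS point if none lies in the bound
        idx = rng.choice(candidates) if candidates.size else d.argmin()
        out.append(gls[idx])
    return np.asarray(out)

# GLS as a grid of known visitable locations
gls = np.array([[i * 0.1, j * 0.1] for i in range(11) for j in range(11)])
traj = np.array([[0.23, 0.41], [0.57, 0.12]])
anon = bounded_perturb(traj, gls, bound=0.15)
```

    The bound caps the utility loss: every published point stays within a known distance of the true one while the exact trace is never revealed.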

  16. Acellular dermal matrix based nipple reconstruction: A modified technique

    Directory of Open Access Journals (Sweden)

    Raghavan Vidya

    2017-09-01

    Full Text Available Nipple areolar reconstruction (NAR) has evolved with advances in breast reconstruction and can improve self-esteem and, consequently, patient satisfaction. Although a variety of reconstruction techniques have been described in the literature, varying from nipple sharing and local flaps to alloplastic and allograft augmentation, loss of nipple projection over time remains a major problem. Acellular dermal matrices (ADM) have revolutionised breast reconstruction more recently. We discuss the use of ADM as a base plate and strut to support the base and provide nipple bulk and projection in a primary NAR procedure with a local clover-shaped dermal flap in 5 breasts (4 patients). We used 5-point Likert scales (1 = highly unsatisfied, 5 = highly satisfied) to assess patient satisfaction. Median age was 46 years (range: 38–55 years). Nipple projections of 8 mm, 7 mm, and 7 mm were achieved in the unilateral cases and 6 mm in the bilateral case over a median 18-month period. All patients reported at least a 4 on the Likert scale. We had no post-operative complications. It appears that NAR using ADM can achieve nipple projection that patients consider aesthetically pleasing.

  17. Analysis of several Boolean operation based trajectory generation strategies for automotive spray applications

    Science.gov (United States)

    Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun

    2018-05-01

    Industrial robots are widely used in various surface manufacturing processes, such as thermal spraying. Established robot programming methods are highly time-consuming and not accurate enough to fulfil actual market demands. Many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principle of several Boolean operation based robot trajectory generation strategies for planar and curved surfaces. Since off-line programming software is widely used, facilitating robot programming and improving trajectory accuracy, the analysis in this work is based on secondary development of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this kind of software extension provides special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the geometry of the coated surface; a series of intersection curves is then employed to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which corresponds to the requirements of automotive spray applications.

  18. Maximum likelihood-based analysis of photon arrival trajectories in single-molecule FRET

    Energy Technology Data Exchange (ETDEWEB)

    Waligorska, Marta [Adam Mickiewicz University, Faculty of Chemistry, Grunwaldzka 6, 60-780 Poznan (Poland); Molski, Andrzej, E-mail: amolski@amu.edu.pl [Adam Mickiewicz University, Faculty of Chemistry, Grunwaldzka 6, 60-780 Poznan (Poland)

    2012-07-25

    Highlights: • We study model selection and parameter recovery from single-molecule FRET experiments. • We examine the maximum likelihood-based analysis of two-color photon trajectories. • The number of observed photons determines the performance of the method. • For long trajectories, one can extract mean dwell times that are comparable to inter-photon times. -- Abstract: When two fluorophores (donor and acceptor) are attached to an immobilized biomolecule, anti-correlated fluctuations of the donor and acceptor fluorescence caused by Förster resonance energy transfer (FRET) report on the conformational kinetics of the molecule. Here we assess the maximum likelihood-based analysis of donor and acceptor photon arrival trajectories as a method for extracting the conformational kinetics. Using computer-generated data, we quantify the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in selecting the true kinetic model. We find that the number of observed photons is the key parameter determining parameter estimation and model selection. For long trajectories, one can extract mean dwell times that are comparable to inter-photon times.
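
    The AIC/BIC comparison can be illustrated with a deliberately simplified photon-color likelihood: a Bernoulli stand-in for the paper's full photon-by-photon likelihood, with all names illustrative:

```python
import numpy as np

def bernoulli_loglik(n_acc, n_tot, E):
    """Log-likelihood of the photon colors for one FRET efficiency E:
    each detected photon is an acceptor photon with probability E
    (a simplified stand-in for the full arrival-time likelihood)."""
    return n_acc * np.log(E) + (n_tot - n_acc) * np.log(1.0 - E)

def aic_bic(loglik, k, n):
    """Akaike and Bayesian information criteria for a fitted model
    with k parameters and n observed photons."""
    return 2 * k - 2 * loglik, k * np.log(n) - 2 * loglik

# one-state model: the MLE of E is the observed acceptor fraction
n_acc, n_tot = 30, 100
ll = bernoulli_loglik(n_acc, n_tot, n_acc / n_tot)
aic, bic = aic_bic(ll, k=1, n=n_tot)
```

    Model selection proceeds by fitting one-state and multi-state models and picking the lowest AIC or BIC; with few photons the criteria penalize the extra parameters of the richer model, which is why photon count dominates performance.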

  19. Tests of the Hardware and Software for the Reconstruction of Trajectories in the Experiment MINERvA

    International Nuclear Information System (INIS)

    Palomino Gallo, Jose Luis

    2009-01-01

    The MINERvA experiment has a highly segmented, high-precision neutrino detector able to record events with high statistics (over 13 million in a four-year run). MINERvA uses the Fermilab NuMI beamline. The detector will allow a detailed study of neutrino-nucleon interactions. Moreover, the detector has a target with different materials, allowing, for the first time, the study of nuclear effects in neutrino interactions. We present here the work done with the MINERvA reconstruction group, which has resulted in: (a) development of new codes to be added to the RecPack package so it can be adapted to the MINERvA detector structure; (b) finding optimum values for two of the MegaTracker reconstruction package variables: PEcut = 4 (minimum number of photoelectrons for a signal to be accepted) and Chi2Cut = 200 (maximum value of χ² for a track to be accepted); and (c) testing of the multi-anode photomultiplier tubes used at MINERvA in order to determine the correlation between different channels and to check the devices' dark counts.
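    The two tuned variables act as simple acceptance thresholds during reconstruction. A schematic illustration of how such cuts filter candidate hits and tracks (the record structures and field names here are invented for the sketch, not MINERvA data formats):

```python
# tuned MegaTracker-style cuts quoted in the abstract
PE_CUT = 4.0      # minimum photoelectrons for a signal to be accepted
CHI2_CUT = 200.0  # maximum track-fit chi-squared for a track to be accepted

# hypothetical reconstructed objects
hits = [{"pe": 2.1}, {"pe": 4.0}, {"pe": 7.3}, {"pe": 3.9}]
tracks = [{"chi2": 57.0}, {"chi2": 312.0}, {"chi2": 199.9}]

# apply the acceptance thresholds
accepted_hits = [h for h in hits if h["pe"] >= PE_CUT]
accepted_tracks = [tr for tr in tracks if tr["chi2"] <= CHI2_CUT]
```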

  20. Tests of the Hardware and Software for the Reconstruction of Trajectories in the Experiment MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Palomino Gallo, Jose Luis; /Rio de Janeiro, CBPF

    2009-05-01

    The MINERvA experiment has a highly segmented, high-precision neutrino detector able to record events with high statistics (over 13 million in a four-year run). MINERvA uses the Fermilab NuMI beamline. The detector will allow a detailed study of neutrino-nucleon interactions. Moreover, the detector has a target with different materials, allowing, for the first time, the study of nuclear effects in neutrino interactions. We present here the work done with the MINERvA reconstruction group, which has resulted in: (a) development of new codes to be added to the RecPack package so it can be adapted to the MINERvA detector structure; (b) finding optimum values for two of the MegaTracker reconstruction package variables: PEcut = 4 (minimum number of photoelectrons for a signal to be accepted) and Chi2Cut = 200 (maximum value of χ² for a track to be accepted); and (c) testing of the multi-anode photomultiplier tubes used at MINERvA in order to determine the correlation between different channels and to check the devices' dark counts.

  1. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  2. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  3. Reconstructing Macroeconomics Based on Statistical Physics

    Science.gov (United States)

    Aoki, Masanao; Yoshikawa, Hiroshi

    We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, weird to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust in the bright future of the new approach to macroeconomics based on statistical physics.

  4. Autocorrelation based reconstruction of two-dimensional binary objects

    International Nuclear Information System (INIS)

    Mejia-Barbosa, Y.; Castaneda, R.

    2005-10-01

    A method for reconstructing two-dimensional binary objects from their autocorrelation functions is discussed. The objects consist of a finite set of identical elements. The reconstruction algorithm is based on the concept of a class of element pairs, defined as the set of element pairs with the same separation vector. This concept allows one to resolve the redundancy introduced by the element pairs of each class. It is also shown that different objects, consisting of an equal number of elements and the same classes of pairs, provide Fraunhofer diffraction patterns with identical intensity distributions. However, the method predicts all the possible objects that produce the same Fraunhofer pattern. (author)
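    The stated link between the object and its Fraunhofer pattern can be checked numerically: by the Wiener-Khinchin relation, the inverse Fourier transform of the diffraction intensity |F|² is the autocorrelation, whose value at each separation vector counts the element pairs of that class. A small sketch with illustrative grid size and element positions:

```python
import numpy as np

def autocorrelation(obj):
    """Autocorrelation via the Wiener-Khinchin relation: the inverse FFT
    of the Fraunhofer intensity |F|^2."""
    F = np.fft.fft2(obj)
    return np.real(np.fft.ifft2(F * np.conj(F)))

# binary object: three identical point-like elements on an 8x8 grid
obj = np.zeros((8, 8))
obj[1, 1] = obj[1, 3] = obj[4, 1] = 1.0

ac = autocorrelation(obj)
# ac[dy, dx] counts the element pairs separated by the vector (dy, dx)
```

    Any other object with the same classes of pairs yields the same `ac`, and hence the same diffraction intensity, which is exactly the ambiguity the reconstruction method enumerates.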

  5. Inference-Based Surface Reconstruction of Cluttered Environments

    KAUST Repository

    Biggers, K.

    2012-08-01

    We present an inference-based surface reconstruction algorithm that is capable of identifying objects of interest among a cluttered scene, and reconstructing solid model representations even in the presence of occluded surfaces. Our proposed approach incorporates a predictive modeling framework that uses a set of user-provided models for prior knowledge, and applies this knowledge to the iterative identification and construction process. Our approach uses a local to global construction process guided by rules for fitting high-quality surface patches obtained from these prior models. We demonstrate the application of this algorithm on several example data sets containing heavy clutter and occlusion. © 2012 IEEE.

  6. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  7. 3D Reconstruction of human bones based on dictionary learning.

    Science.gov (United States)

    Zhang, Binkai; Wang, Xiang; Liang, Xiao; Zheng, Jinjin

    2017-11-01

    An effective method for reconstructing a 3D model of human bones from computed tomography (CT) image data based on dictionary learning is proposed. In this study, the dictionary comprises the vertices of triangular meshes, and the sparse coefficient matrix indicates the connectivity information. For better reconstruction performance, we propose a balance coefficient between the approximation and regularisation terms and a method for its optimisation. Moreover, we apply a local updating strategy and a mesh-optimisation method to update the dictionary and the sparse matrix, respectively. The two updating steps are iterated alternately until the objective function converges. Thus, a reconstructed mesh can be obtained with high accuracy and regularity. The experimental results show that the proposed method has the potential to obtain high-precision, high-quality triangular meshes for rapid prototyping, medical diagnosis, and tissue engineering. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Split-Bregman-based sparse-view CT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Vandeghinste, Bert; Vandenberghe, Stefaan [Ghent Univ. (Belgium). Medical Image and Signal Processing (MEDISIP); Goossens, Bart; Pizurica, Aleksandra; Philips, Wilfried [Ghent Univ. (Belgium). Image Processing and Interpretation Research Group (IPI); Beenhouwer, Jan de [Ghent Univ. (Belgium). Medical Image and Signal Processing (MEDISIP); Antwerp Univ., Wilrijk (Belgium). The Vision Lab; Staelens, Steven [Ghent Univ. (Belgium). Medical Image and Signal Processing (MEDISIP); Antwerp Univ., Edegem (Belgium). Molecular Imaging Centre Antwerp

    2011-07-01

    Total variation minimization has been extensively researched for image denoising and sparse-view reconstruction. These methods show superior denoising performance for simple images with little texture, but result in loss of texture information when applied to more complex images. It could thus be beneficial to use other regularizers within medical imaging. We propose a general regularization method based on a split-Bregman approach. We show results for this framework combined with a total variation denoising operator, in comparison to ASD-POCS. We show that sparse-view reconstruction and noise regularization are possible. This general method will allow us to investigate other regularizers in the context of regularized CT reconstruction, and to decrease the acquisition times in µCT. (orig.)
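    The split-Bregman framework combined with a TV operator can be illustrated in one dimension. Below is a minimal sketch of the Goldstein-Osher splitting for TV denoising; the 1-D setting and all parameter values are illustrative, whereas the paper targets CT projection data:

```python
import numpy as np

def split_bregman_tv_1d(f, mu=10.0, lam=1.0, n_iter=50):
    """Denoise a 1-D signal f by TV minimisation,
        min_u  |Du|_1 + mu/2 ||u - f||^2,
    via split Bregman: auxiliary d ~ Du, Bregman variable b."""
    n = len(f)
    u, d, b = f.copy(), np.zeros(n - 1), np.zeros(n - 1)
    D = np.diff(np.eye(n), axis=0)               # forward-difference matrix
    A = mu * np.eye(n) + lam * D.T @ D           # u-subproblem system matrix
    for _ in range(n_iter):
        # u-subproblem: quadratic, solved as a linear system
        u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))
        Du = np.diff(u)
        # d-subproblem: soft shrinkage (closed form for the L1 term)
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - 1.0 / lam, 0.0)
        b = b + Du - d                           # Bregman update
    return u

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(100)
denoised = split_bregman_tv_1d(noisy)
```

    Swapping the shrinkage step for another proximal operator is what makes the framework generic over regularizers, which is the point of the paper.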

  9. Visual Trajectory-Tracking Model-Based Control for Mobile Robots

    Directory of Open Access Journals (Sweden)

    Andrej Zdešar

    2013-09-01

    Full Text Available In this paper we present a visual-control algorithm for driving a mobile robot along a reference trajectory. The configuration of the system consists of a two-wheeled differentially driven mobile robot that is observed by an overhead camera, which can be placed at an arbitrary, but reasonable, inclination with respect to the ground plane. The controller must be capable of generating appropriate tangential and angular control velocities for the trajectory-tracking problem, based on the information about the robot position obtained in the image. To be able to track the position of the robot through a sequence of images in real time, the robot is marked with an artificial marker that can be distinguishably recognized by the image-recognition subsystem. Using the property of differential flatness, a dynamic feedback compensator can be designed for the system, thereby extending the system into a linear form. The presented control algorithm for reference tracking combines a feedforward and a feedback loop, a structure also known as a two-DOF control scheme. The feedforward part should drive the system to the vicinity of the reference trajectory and the feedback part should eliminate any errors that occur due to noise and other disturbances. The feedforward control can never achieve accurate reference following, but this deficiency can be eliminated with the introduction of the feedback loop. The design of the model predictive control is based on the linear error model. The model predictive control is given in analytical form, so the computational burden is kept at a reasonable level for real-time implementation. The control algorithm requires that the reference trajectory be an at least twice-differentiable function. A suitable approach to designing such a trajectory is to exploit some useful properties of the Bernstein-Bézier parametric curves. The simulation experiments as well as real system experiments on a robot normally used in the

  10. A Trajectory Correction based on Multi-Step Lining-up for the CLIC Main Linac

    CERN Document Server

    D'Amico, T E

    1999-01-01

    In the CLIC main linac it is very important to minimise the trajectory excursion, and consequently the emittance dilution, in order to obtain the required luminosity. Several algorithms have been proposed and lately the ballistic method has proved to be very effective. The trajectory method described in this Note retains the main advantages of the latter while adding some interesting features. It is based on the separation of unknown variables such as the quadrupole misalignments, the offset and slope of the injection straight line, and the misalignments of the beam position monitors (BPM). This is achieved by referring the trajectory relative to the injection line rather than to the average pre-alignment line, and by using two trajectories, each corresponding to slightly different quadrupole strengths. A reference straight line is then derived onto which the beam is bent by a kick obtained by moving the first quadrupole. The other quadrupoles are then aligned on that line. The quality of the correction depends mai...

  11. Association between Adolescent Substance Use and Obesity in Young Adulthood: A Group-based Dual Trajectory Analysis

    Science.gov (United States)

    Huang, David Y.C.; Lanza, H. Isabella; Anglin, M. Douglas

    2013-01-01

    Purpose This study investigated whether and how trajectories of substance use in adolescence were associated with obesity trajectories in young adulthood. We hypothesized that: (1) exposure to persistent substance use throughout adolescence may heighten obesity risk in young adulthood; and (2) such associations may differ once gender, ethnicity, socioeconomic status, and obesity status in adolescence, are considered. Methods The study included 5,141 adolescents from the child sample of the 1979 National Longitudinal Survey of Youth and utilized biennial data across the 12 assessments (1986-2008) to examine trajectories of substance use behaviors (i.e., cigarette smoking, alcohol use, and marijuana use) from ages 12 to 18 and obesity trajectories from ages 20 to 24. Group-based dual trajectory modeling was applied to examine sequential associations of trajectories of each type of substance use behavior with obesity trajectories. Results Three distinctive trajectory patterns were respectively identified for cigarette smoking, alcohol use, and marijuana use from ages 12 to 18, as well as for obesity status (BMI ≥ 30) from ages 20 to 24. Taking into account gender, ethnicity, socioeconomic status, and obesity status in adolescence, adolescents with the most problematic smoking trajectory (High-decreasing) were more likely to exhibit a High-obesity trajectory from ages 20 to 24. Also, adolescents with an Increasing marijuana use trajectory were more likely to exhibit an Increased obesity trajectory in young adulthood. Conclusions The current study demonstrates that adolescent substance use is associated with subsequent obesity in young adulthood. The associations appear to differ based on type of substance use and patterns of use. PMID:23899428

  12. Trajectory Planning of 7-DOF Space Manipulator for Minimizing Base Disturbance

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2016-03-01

    Full Text Available In the free-floating mode, there is intense dynamic coupling between the space manipulator and the base, and the base attitude may change while the manipulator performs a motion. Therefore, it is necessary to reduce the interference resulting from the manipulator movement. For planning trajectories of the space manipulator with 7 degrees of freedom (7-DOF), a simulated annealing particle swarm optimization (SAPSO) algorithm is presented in this paper. Firstly, the kinematics equations are set up. Secondly, the joint functions are parameterized by sinusoidal functions, and the objective function is defined according to the motion constraints of the manipulator and the accuracy requirements of the base attitude. Finally, the SAPSO algorithm is used to search for the optimal trajectory. The simulation results verify the proposed method.
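    The SAPSO search itself can be sketched on a toy objective: standard particle swarm updates combined with a simulated-annealing (Metropolis) acceptance rule. The quadratic cost below merely stands in for the base-disturbance objective built from the sinusoid-parameterised joint functions; all hyperparameters are illustrative:

```python
import numpy as np

def sapso(objective, dim, n_particles=20, n_iter=100, seed=0):
    """Toy SAPSO: PSO velocity/position updates, plus a simulated-annealing
    (Metropolis) acceptance rule when updating each personal best."""
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and pull coefficients
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    i0 = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[i0].copy(), pbest_f[i0]
    T = 1.0                                      # annealing temperature
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([objective(p) for p in x])
        for i in range(n_particles):
            delta = f[i] - pbest_f[i]
            # accept improvements always, worse moves with Boltzmann probability
            if delta < 0 or rng.random() < np.exp(-delta / T):
                pbest[i], pbest_f[i] = x[i].copy(), f[i]
            if f[i] < gbest_f:                   # global best kept monotone
                gbest, gbest_f = x[i].copy(), f[i]
        T *= 0.95                                # cooling schedule
    return gbest, gbest_f

# stand-in cost: in the paper this would be the base-attitude disturbance of a
# sinusoid-parameterised 7-DOF joint trajectory; here a simple quadratic bowl
best, best_f = sapso(lambda p: float(np.sum(p ** 2)), dim=4)
```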

  13. Residual translation compensations in radar target narrowband imaging based on trajectory information

    Science.gov (United States)

    Yue, Wenjue; Peng, Bo; Wei, Xizhang; Li, Xiang; Liao, Dongping

    2018-05-01

    High-velocity translation results in defocused scattering centers in radar imaging. In this paper, we propose a Residual Translation Compensations (RTC) method based on target trajectory information to eliminate the translation effects in radar imaging. In reality, translation cannot simply be regarded as uniformly accelerated motion, so prior knowledge of the target trajectory is introduced to enhance the compensation precision. First, we use the two-body orbit model to compute the radial distance. Then, stepwise compensations are applied to eliminate the residual propagation delay based on the conjugate multiplication method. Finally, tomography is used to confirm the validity of the method. Compared with a translation parameter estimation method based on the spectral peak of the conjugate-multiplied signal, the RTC method in this paper yields a better tomography result. Even when the Signal-to-Noise Ratio (SNR) of the radar echo signal is 4 dB, the scattering centers can still be extracted clearly.

  14. Quadrotor Trajectory Tracking Based on Quasi-LPV System and Internal Model Control

    Directory of Open Access Journals (Sweden)

    ZeFang He

    2015-01-01

    Full Text Available An internal model control (IMC) design method based on a quasi-LPV (Linear Parameter Varying) system is proposed. In this method, the nonlinear model is first transformed into a linear model based on the quasi-LPV method; then, the quadrotor nonlinear motion function is transformed into a transfer function matrix based on the transformation from state space to transfer function; further, the IMC is designed to control the controlled object represented by the transfer function matrix and realize quadrotor trajectory tracking. The performance of the proposed controller is tested by tracking three reference trajectories with drastic changes. The simulation results indicate that the proposed control method has strong robustness to parameter uncertainty and good disturbance rejection performance.

  15. Multivariable Super Twisting Based Robust Trajectory Tracking Control for Small Unmanned Helicopter

    Directory of Open Access Journals (Sweden)

    Xing Fang

    2015-01-01

    Full Text Available This paper presents a highly robust trajectory tracking controller for a small unmanned helicopter with model uncertainties and external disturbances. First, a simplified dynamic model is developed, where the model uncertainties and external disturbances are treated as compounded disturbances. The system is then divided into three interconnected subsystems: an altitude subsystem, a yaw subsystem, and a horizontal subsystem. Second, a disturbance observer based controller (DOBC) is designed, based upon backstepping and the multivariable super twisting control algorithm, to obtain robust trajectory tracking. A sliding mode observer works as an estimator of the compounded disturbances. To lessen the computational burden, a first-order exact differentiator is employed to estimate the time derivative of the virtual control. Moreover, a proof of the stability of the closed-loop system based on the Lyapunov method is given. Finally, simulation results are presented to illustrate the effectiveness and robustness of the proposed flight control scheme.

  16. Simulating the influence of life trajectory events on transport mode behavior in an agent-based system

    NARCIS (Netherlands)

    Verhoeven, M.; Arentze, T.A.; Timmermans, H.J.P.; Waerden, van der P.J.H.J.

    2007-01-01

    This paper describes the results of a study on the impact of lifecycle or life trajectory events on activity-travel decisions. The lifecycle trajectory of individual agents can easily be incorporated in an agent-based simulation system. This paper focuses on two lifecycle events, change in

  17. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

    We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of emerging trajectory data. It offers data management capability and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  18. SU-E-T-436: Fluence-Based Trajectory Optimization for Non-Coplanar VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Smyth, G; Bamber, JC; Bedford, JL [Joint Department of Physics at The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London (United Kingdom); Evans, PM [Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford (United Kingdom); Saran, FH; Mandeville, HC [The Royal Marsden NHS Foundation Trust, Sutton (United Kingdom)

    2015-06-15

    Purpose: To investigate a fluence-based trajectory optimization technique for non-coplanar VMAT for brain cancer. Methods: Single-arc non-coplanar VMAT trajectories were determined using a heuristic technique for five patients. Organ at risk (OAR) volume intersected during raytracing was minimized for two cases: absolute volume and the sum of relative volumes weighted by OAR importance. These trajectories and coplanar VMAT formed starting points for the fluence-based optimization method. Iterative least squares optimization was performed on control points 24° apart in gantry rotation. Optimization minimized the root-mean-square (RMS) deviation of PTV dose from the prescription (relative importance 100), maximum dose to the brainstem (10), optic chiasm (5), globes (5) and optic nerves (5), plus mean dose to the lenses (5), hippocampi (3), temporal lobes (2), cochleae (1) and brain excluding other regions of interest (1). Control point couch rotations were varied in steps of up to 10° and accepted if the cost function improved. Final treatment plans were optimized with the same objectives in an in-house planning system and evaluated using a composite metric - the sum of optimization metrics weighted by importance. Results: The composite metric decreased with fluence-based optimization in 14 of the 15 plans. In the remaining case its overall value, and the PTV and OAR components, were unchanged but the balance of OAR sparing differed. PTV RMS deviation was improved in 13 cases and unchanged in two. The OAR component was reduced in 13 plans. In one case the OAR component increased but the composite metric decreased - a 4 Gy increase in OAR metrics was balanced by a reduction in PTV RMS deviation from 2.8% to 2.6%. Conclusion: Fluence-based trajectory optimization improved plan quality as defined by the composite metric. While dose differences were case specific, fluence-based optimization improved both PTV and OAR dosimetry in 80% of cases.

  19. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    Science.gov (United States)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents a reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle comprising SO, reliability assessment, and constraint updates is repeated in the RBSO until the reliability requirements for constraint satisfaction are met. Finally, the RBSO is compared with traditional DO and with traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.
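    The nonintrusive PCE step can be sketched in a single stochastic dimension: sample the uncertain parameter, evaluate the model, and regress onto probabilists' Hermite polynomials; the mean and variance then follow from the coefficients by orthogonality. The quadratic response below is a toy stand-in for the entry dynamics, and all values are illustrative:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

def nonintrusive_pce(model, degree, n_samples=200, seed=0):
    """Regression-based (nonintrusive) PCE of model(xi), xi ~ N(0, 1),
    in the probabilists' Hermite basis He_0..He_degree."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_samples)
    V = He.hermevander(xi, degree)            # basis evaluated at the samples
    coeffs, *_ = np.linalg.lstsq(V, model(xi), rcond=None)
    return coeffs

# toy uncertain response, exactly representable at degree 3
coeffs = nonintrusive_pce(lambda x: 1.0 + 2.0 * x + 3.0 * x ** 2, degree=3)

# orthogonality of He_k under N(0,1) gives moments directly from coefficients
mean = coeffs[0]
variance = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, len(coeffs)))
```

    In the RBSO cycle this cheap surrogate replaces repeated trajectory propagations when the reliability of each constraint is assessed.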

  20. Robust, Efficient Depth Reconstruction With Hierarchical Confidence-Based Matching.

    Science.gov (United States)

    Sun, Li; Chen, Ke; Song, Mingli; Tao, Dacheng; Chen, Gang; Chen, Chun

    2017-07-01

    In recent years, taking photos and capturing videos with mobile devices have become increasingly popular. Emerging applications based on the depth reconstruction technique have been developed, such as Google lens blur. However, depth reconstruction is difficult due to occlusions, non-diffuse surfaces, repetitive patterns, and textureless surfaces, and it has become more difficult due to the unstable image quality and uncontrolled scene condition in the mobile setting. In this paper, we present a novel hierarchical framework with multi-view confidence-based matching for robust, efficient depth reconstruction in uncontrolled scenes. Particularly, the proposed framework combines local cost aggregation with global cost optimization in a complementary manner that increases efficiency and accuracy. A depth map is efficiently obtained in a coarse-to-fine manner by using an image pyramid. Moreover, confidence maps are computed to robustly fuse multi-view matching cues, and to constrain the stereo matching on a finer scale. The proposed framework has been evaluated with challenging indoor and outdoor scenes, and has achieved robust and efficient depth reconstruction.

  1. QR-decomposition based SENSE reconstruction using parallel architecture.

    Science.gov (United States)

    Ullah, Irfan; Nisar, Habab; Raza, Haseeb; Qasim, Malik; Inam, Omair; Omer, Hammad

    2018-04-01

    Magnetic Resonance Imaging (MRI) is a powerful medical imaging technique that provides essential clinical information about the human body. One major limitation of MRI is its long scan time. Implementation of advanced MRI algorithms on a parallel architecture (to exploit inherent parallelism) has great potential to reduce the scan time. Sensitivity Encoding (SENSE) is a Parallel Magnetic Resonance Imaging (pMRI) algorithm that utilizes receiver coil sensitivities to reconstruct MR images from the acquired under-sampled k-space data. At the heart of SENSE lies the inversion of a rectangular encoding matrix. This work presents a novel implementation of a GPU-based SENSE algorithm, which employs QR decomposition for the inversion of the rectangular encoding matrix. For a fair comparison, the performance of the proposed GPU-based SENSE reconstruction is evaluated against single-core and multi-core CPU implementations using OpenMP. Several experiments at various acceleration factors (AFs) are performed using multichannel (8, 12 and 30) phantom and in-vivo human head and cardiac datasets. Experimental results show that the GPU significantly reduces the computation time of SENSE reconstruction as compared to the multi-core CPU (approximately 12x speedup) and single-core CPU (approximately 53x speedup) without any degradation in the quality of the reconstructed images. Copyright © 2018 Elsevier Ltd. All rights reserved.
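    The core linear-algebra step, inverting the rectangular encoding matrix by QR decomposition, can be sketched with NumPy on a toy two-fold-aliasing problem (coil count, sensitivity values, and sizes are illustrative, and the real implementation runs this per aliased pixel group on the GPU):

```python
import numpy as np

def sense_qr(E, y):
    """Solve the SENSE least-squares system E x = y for the unaliased pixel
    values via QR decomposition of the rectangular encoding matrix E."""
    Q, R = np.linalg.qr(E)                 # E = Q R, Q has orthonormal columns
    return np.linalg.solve(R, Q.conj().T @ y)

# toy 2x acceleration, 4 coils: each aliased pixel mixes two true pixels
rng = np.random.default_rng(1)
x_true = np.array([1.0 + 0.5j, -0.3 + 0.2j])     # two superimposed pixel values
E = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))  # coil sensitivities
y = E @ x_true                                    # folded coil measurements
x_rec = sense_qr(E, y)
```

    QR is preferred over forming the normal equations because the triangular back-substitution avoids squaring the condition number of E.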

  2. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in

  3. Trajectory Evaluation of Rotor-Flying Robots Using Accurate Inverse Computation Based on Algorithm Differentiation

    Directory of Open Access Journals (Sweden)

    Yuqing He

    2014-01-01

    Full Text Available Autonomous maneuvering flight control of rotor-flying robots (RFR) is a challenging problem due to the highly complicated structure of the model and significant uncertainties regarding many aspects of the field. As a consequence, it is difficult in many cases to decide whether or not a flight maneuver trajectory is feasible. It is necessary to conduct an analysis of the flight maneuvering ability of an RFR prior to test flight. Our aim in this paper is to use a numerical method called algorithm differentiation (AD) to solve this problem. The basic idea is to compute the internal state (i.e., attitude angles and angular rates) and input profiles based on predetermined maneuvering trajectory information denoted by the outputs (i.e., positions and yaw angle) and their higher-order derivatives. For this purpose, we first present a model of the RFR system and show that it is flat. We then cast the procedure for obtaining the required state/input based on the desired outputs as a static optimization problem, which is solved using AD and a derivative-based optimization algorithm. Finally, we test our proposed method using a flight maneuver trajectory to verify its performance.
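    The flatness-based inversion of outputs into states and inputs can be illustrated on a far simpler flat system than the RFR: a planar double integrator whose flat outputs are its positions, so the velocities and required inputs follow from the first and second derivatives of the prescribed output polynomials. This is a toy stand-in, not the RFR model or the AD machinery of the paper:

```python
import numpy as np
from numpy.polynomial import polynomial as P

# prescribed flat outputs: x(t) = t^2, y(t) = 2 t (coefficients low-to-high)
cx, cy = np.array([0.0, 0.0, 1.0]), np.array([0.0, 2.0])
t = np.linspace(0.0, 2.0, 50)

# states (velocities) and inputs (accelerations) by exact differentiation
vx = P.polyval(t, P.polyder(cx))        # x' = 2 t
vy = P.polyval(t, P.polyder(cy))        # y' = 2
ux = P.polyval(t, P.polyder(cx, 2))     # x'' = 2 (required input)
uy = P.polyval(t, P.polyder(cy, 2))     # y'' = 0
```

    For the RFR the same inversion is not available in closed form, which is why the paper resorts to AD and a static optimization over the state/input profiles.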

  4. Exploring the propagation of relativistic quantum wavepackets in the trajectory-based formulation

    Science.gov (United States)

    Tsai, Hung-Ming; Poirier, Bill

    2016-03-01

    In the context of nonrelativistic quantum mechanics, Gaussian wavepacket solutions of the time-dependent Schrödinger equation provide useful physical insight. This is not the case for relativistic quantum mechanics, however, for which both the Klein-Gordon and Dirac wave equations result in strange and counterintuitive wavepacket behaviors, even for free-particle Gaussians. These behaviors include zitterbewegung and other interference effects. As a potential remedy, this paper explores a new trajectory-based formulation of quantum mechanics, in which the wavefunction plays no role [Phys. Rev. X, 4, 040002 (2014)]. Quantum states are represented as ensembles of trajectories, whose mutual interaction is the source of all quantum effects observed in nature—suggesting a “many interacting worlds” interpretation. It is shown that the relativistic generalization of the trajectory-based formulation results in well-behaved free-particle Gaussian wavepacket solutions. In particular, probability density is positive and well-localized everywhere, and its spatial integral is conserved over time—in any inertial frame. Finally, the ensemble-averaged wavepacket motion is along a straight line path through spacetime. In this manner, the pathologies of the wave-based relativistic quantum theory, as applied to wavepacket propagation, are avoided.

  5. How to perform 3D reconstruction of skull base tumours.

    Science.gov (United States)

    Bonne, N-X; Dubrulle, F; Risoud, M; Vincent, C

    2017-04-01

    The surgical management of skull base lesions is difficult due to the complex anatomy of the region and the intimate relations between the lesion and adjacent nerves and vessels. Minimally invasive approaches are increasingly used in skull base surgery to ensure an optimal functional prognosis. Three-dimensional (3D) computed tomography (CT) reconstruction facilitates surgical planning by visualizing the anatomical relations of the lesion in all planes (arteries, veins, nerves, inner ear) and by simulating the surgical approach in the operating position. Helical CT angiography is performed with the injection timed for optimal tumour and vessel contrast enhancement. 3D definition of each structure is based on colour coding by automatic thresholding (bone, vessels) or manual segmentation on each slice (tumour, nerves, inner ear). Imaging is generally presented in three planes (superior, coronal, sagittal) with simulation of the surgical procedure (5 to 6 reconstructions in the operating position at different depths). Copyright © 2016. Published by Elsevier Masson SAS.

  6. Handling data redundancy in helical cone beam reconstruction with a cone-angle-based window function and its asymptotic approximation

    International Nuclear Information System (INIS)

    Tang Xiangyang; Hsieh Jiang

    2007-01-01

    A cone-angle-based window function is defined in this manuscript for image reconstruction using helical cone beam filtered backprojection (CB-FBP) algorithms. Rather than defining the window boundaries in a two-dimensional detector acquiring projection data for computed tomographic imaging, the cone-angle-based window function deals with data redundancy by selecting rays with the smallest cone angle relative to the reconstruction plane. To be computationally efficient, an asymptotic approximation of the cone-angle-based window function is also given and analyzed in this paper. A further benefit of using this asymptotic approximation is the avoidance of functional discontinuities that cause artifacts in reconstructed tomographic images. The cone-angle-based window function and its asymptotic approximation provide a way, equivalent to the Tam-Danielsson window, for helical CB-FBP reconstruction algorithms to deal with data redundancy, regardless of whether the helical pitch is constant or dynamically variable during a scan. Taking the cone-parallel geometry as an example, a computer simulation study is conducted to evaluate the proposed window function and its asymptotic approximation for the helical CB-FBP reconstruction algorithm. The computer-simulated Forbild head and thorax phantoms are utilized in the performance evaluation, showing that the proposed cone-angle-based window function and its asymptotic approximation handle data redundancy very well in cone beam image reconstruction from projection data acquired along helical source trajectories. Moreover, a numerical study carried out in this paper reveals that the proposed cone-angle-based window function is actually equivalent to the Tam-Danielsson window, and rigorous mathematical proofs are being investigated.

  7. CT image reconstruction system based on hardware implementation

    International Nuclear Information System (INIS)

    Silva, Hamilton P. da; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, Joao A.P.; Zibetti, Marcelo; Hormaza, Joel M.; Lopes, Ricardo T.

    2009-01-01

    Full text: The timing factor is very important for medical imaging systems, which can nowadays be synchronized by vital human signals, like heartbeats or breath. The use of hardware-implemented devices in such a system has advantages, considering the high speed of information processing combined with an arbitrarily low cost on the market. This article refers to a hardware system based on electronic programmable logic called FPGA, model Cyclone II from ALTERA Corporation. The hardware was implemented on the UP3 ALTERA Kit. A partially connected neural network with unitary weights was programmed. The system was tested with 60 tomographic projections, 100 points each, of the Shepp and Logan phantom created by MATLAB. The main restriction was found to be the memory size available on the device: the dynamic range of the reconstructed image was limited to 0-65535. Also, the normalization factor must be observed so as not to saturate the image during the reconstruction and filtering process. The test demonstrates the feasibility of building CT image reconstruction systems for any reasonable amount of input data by arranging the parallel work of the hardware units as tested here. However, further studies are necessary for better understanding of the error propagation from tomographic projections to the reconstructed image within the implemented method. (author)

  8. Porous media microstructure reconstruction using pixel-based and object-based simulated annealing: comparison with other reconstruction methods

    Energy Technology Data Exchange (ETDEWEB)

    Diogenes, Alysson N.; Santos, Luis O.E. dos; Fernandes, Celso P. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Appoloni, Carlos R. [Universidade Estadual de Londrina (UEL), PR (Brazil)

    2008-07-01

    The physical properties of reservoir rocks are usually obtained in the laboratory through standard experiments, which are often very expensive and time-consuming. Digital image analysis techniques therefore offer a fast, low-cost methodology for predicting physical properties from geometrical parameters measured on thin sections of the rock microstructure. This research analyzes two methods for porous media reconstruction using the simulated annealing relaxation method. Using geometrical parameters measured from rock thin sections, it is possible to construct a three-dimensional (3D) model of the microstructure. We assume statistical homogeneity and isotropy; the 3D model preserves the porosity, spatial correlation, chord size distribution and d3-4 distance transform distribution for the pixel-based reconstruction, and the spatial correlation for the object-based reconstruction. The 2D and 3D preliminary results are compared with microstructures reconstructed by truncated Gaussian methods. As this research is in its early stages, only the 2D results are presented. (author)
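A minimal pixel-based simulated annealing sketch of the kind of reconstruction described: starting from a scrambled image with the correct porosity, pore/solid swaps are accepted by the Metropolis rule so that a horizontal two-point correlation approaches that of a reference microstructure. The descriptor, lattice size, and cooling schedule are illustrative choices, far simpler than the paper's 3D descriptors:

```python
import numpy as np

def two_point_corr(img, max_r=8):
    """Horizontal two-point probability S2(r): both pixels at lag r are pore."""
    return np.array([np.mean(img[:, :-r] * img[:, r:]) if r else np.mean(img)
                     for r in range(max_r)])

rng = np.random.default_rng(1)
n, porosity = 32, 0.3
target = (rng.random((n, n)) < porosity).astype(float)   # reference microstructure
s2_target = two_point_corr(target)

# Start from a random permutation of the reference pixels: porosity is exact
# from the outset, and pixel swaps keep it fixed throughout the annealing.
img = rng.permuted(target.ravel()).reshape(n, n)
energy = float(np.sum((two_point_corr(img) - s2_target) ** 2))
T = 1e-3
for _ in range(2000):
    p = tuple(rng.integers(0, n, 2))
    s = tuple(rng.integers(0, n, 2))
    if img[p] == img[s]:
        continue
    img[p], img[s] = img[s], img[p]
    e_new = float(np.sum((two_point_corr(img) - s2_target) ** 2))
    if e_new > energy and rng.random() > np.exp((energy - e_new) / T):
        img[p], img[s] = img[s], img[p]   # reject uphill move: undo the swap
    else:
        energy = e_new                     # accept (downhill, or lucky uphill)
    T *= 0.999                             # geometric cooling schedule
```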

  9. Parameter Identification of Static Friction Based on An Optimal Exciting Trajectory

    Science.gov (United States)

    Tu, X.; Zhao, P.; Zhou, Y. F.

    2017-12-01

    In this paper, we focus on how to improve the identification efficiency of friction parameters in a robot joint. First, a static friction model that is linear in its parameters is adopted so that the servomotor dynamics can be linearized. The traditional exciting trajectory based on a Fourier series is then modified by replacing the constant term with a quintic polynomial to ensure continuity of speed and acceleration at the trajectory boundaries. The Fourier coefficients are optimized by a genetic algorithm (GA) in which the condition number of the regression matrix serves as the fitness function. Finally, compared with a constant-velocity tracking experiment, the friction parameters identified from the exciting trajectory experiment give similar results while requiring substantially less time.
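The two ingredients above — a Fourier-series exciting trajectory with a quintic polynomial replacing the constant term, and the condition number of the friction regression matrix as the fitness value — can be sketched as follows. The quintic blend, base frequency, and the [viscous, Coulomb, offset] regressor are illustrative assumptions, and the GA search itself is omitted:

```python
import numpy as np

def exciting_trajectory(t, a, b, wf=2 * np.pi * 0.1):
    """Modified Fourier-series trajectory: a quintic polynomial replaces the
    constant term so position blends smoothly between the boundary values."""
    s = t / t[-1]
    q = 10 * s**3 - 15 * s**4 + 6 * s**5          # smooth 0 -> 1 quintic blend
    for k, (ak, bk) in enumerate(zip(a, b), start=1):
        q = q + (ak / (wf * k)) * np.sin(wf * k * t) \
              - (bk / (wf * k)) * np.cos(wf * k * t)
    return q

def friction_condition_number(t, q):
    """Condition number of the regression matrix for the linear-in-parameters
    static friction model  tau = Fv*qdot + Fc*sign(qdot) + offset."""
    qd = np.gradient(q, t)
    W = np.column_stack([qd, np.sign(qd), np.ones_like(qd)])
    return np.linalg.cond(W)

t = np.linspace(0.0, 10.0, 2001)
q = exciting_trajectory(t, a=[1.0, 0.5], b=[0.3, 0.2])
cond = friction_condition_number(t, q)    # the GA would minimise this value
```

A GA would perturb the coefficients `a`, `b` and keep the set with the smallest `cond`, improving the numerical conditioning of the least-squares identification.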

  10. Adaptive Mesh Iteration Method for Trajectory Optimization Based on Hermite-Pseudospectral Direct Transcription

    Directory of Open Access Journals (Sweden)

    Humin Lei

    2017-01-01

    Full Text Available An adaptive mesh iteration method based on Hermite pseudospectral direct transcription is described for trajectory optimization. The method uses the Legendre-Gauss-Lobatto points as interpolation points; the state equations are then approximated by Hermite interpolating polynomials. The method allows for changes in both the number of mesh points and the number of mesh intervals, and produces significantly smaller meshes while meeting the accuracy tolerance. The derived relative error estimate is then used to trade the number of mesh points against the number of mesh intervals. The adaptive mesh iteration method is applied successfully to trajectory optimization examples for a Maneuverable Reentry Research Vehicle, and the simulation results demonstrate its advantages in accuracy and efficiency.
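The Legendre-Gauss-Lobatto interpolation points used by such transcriptions are the endpoints of [-1, 1] plus the roots of the derivative of a Legendre polynomial; a minimal sketch using `numpy.polynomial.legendre`:

```python
import numpy as np
from numpy.polynomial import legendre

def lgl_points(n):
    """n Legendre-Gauss-Lobatto points on [-1, 1]: the two endpoints plus
    the roots of P'_{n-1}, the derivative of the (n-1)th Legendre polynomial."""
    c = np.zeros(n)
    c[-1] = 1.0                                   # coefficient vector of P_{n-1}
    interior = np.sort(np.real(legendre.legroots(legendre.legder(c))))
    return np.concatenate(([-1.0], interior, [1.0]))

pts = lgl_points(5)
```

For n = 5 this gives {-1, -sqrt(3/7), 0, sqrt(3/7), 1}, the collocation points at which the state equations would be enforced.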

  11. Trajectory-based morphological operators: a model for efficient image processing.

    Science.gov (United States)

    Jimeno-Morenilla, Antonio; Pujol, Francisco A; Molina-Carmona, Rafael; Sánchez-Romero, José L; Pujol, Mar

    2014-01-01

    Mathematical morphology has been an area of intensive research over the last few years. Although many remarkable advances have been achieved, there is still great interest in accelerating morphological operations so that they can be implemented in real-time systems. In this work, we present a new model for computing mathematical morphology operations, the so-called morphological trajectory model (MTM), in which a morphological filter is divided into a sequence of basic operations. A trajectory-based morphological operation (such as dilation or erosion) is then defined as the set of points resulting from the ordered application of these basic operations. The MTM approach allows working with different structuring elements, such as disks, and the experiments show that our method is independent of the structuring element size and can be easily applied to industrial systems and high-resolution images.
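The idea of a morphological filter as an ordered sequence of basic operations can be sketched for binary dilation, written as the ordered union of image translates, one basic shift/OR operation per structuring-element point. This is the generic decomposition of dilation, not the MTM algorithm itself:

```python
import numpy as np

def dilate(img, se):
    """Binary dilation as an ordered sequence of basic operations:
    one shifted copy of the image OR-ed in per structuring-element point."""
    cy, cx = se.shape[0] // 2, se.shape[1] // 2
    pad = np.pad(img, ((cy, cy), (cx, cx)))
    out = np.zeros_like(img)
    for dy, dx in zip(*np.nonzero(se)):
        out |= pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

# Demo: a single pixel dilated by a 3x3 square grows into a 3x3 block.
img = np.zeros((5, 5), dtype=bool)
img[2, 2] = True
se = np.ones((3, 3), dtype=bool)
out = dilate(img, se)
```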

  12. Algorithm research for user trajectory matching across social media networks based on paragraph2vec

    Science.gov (United States)

    Xu, Qian; Chen, Hongchang; Zhi, Hongxin; Wang, Yanchuan

    2018-04-01

    Identifying users across different social media networks (SMNs) means linking the accounts that belong to the same individual across SMNs. The problem is fundamental and important, and its results can benefit many applications such as cross-SMN user modeling and recommendation. With the development of GPS technology and mobile communication, more and more social networks provide location services, which offers a new opportunity for cross-SMN user identification. In this paper, we solve the cross-SMN user identification problem in an unsupervised manner by utilizing user trajectory data in SMNs. A paragraph2vec-based algorithm is proposed in which the location sequence features of user trajectories are captured in the temporal and spatial dimensions. Our experimental results validate the effectiveness and efficiency of our algorithm.

  13. Skull defect reconstruction based on a new hybrid level set.

    Science.gov (United States)

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

    Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore, the prosthesis cannot precisely correct the defect. This study presented a new hybrid level set model, taking into account both the global optimization region information and the local accuracy edge information, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that well matched the skull defect with excellent individual adaptation.

  14. Prediction of three-dimensional arm trajectories based on ECoG signals recorded from human sensorimotor cortex.

    Directory of Open Access Journals (Sweden)

    Yasuhiko Nakanishi

    Full Text Available Brain-machine interface techniques have been applied in a number of studies to control neuromotor prostheses and for neurorehabilitation in the hopes of providing a means to restore lost motor function. Electrocorticography (ECoG has seen recent use in this regard because it offers a higher spatiotemporal resolution than non-invasive EEG and is less invasive than intracortical microelectrodes. Although several studies have already succeeded in the inference of computer cursor trajectories and finger flexions using human ECoG signals, precise three-dimensional (3D trajectory reconstruction for a human limb from ECoG has not yet been achieved. In this study, we predicted 3D arm trajectories in time series from ECoG signals in humans using a novel preprocessing method and a sparse linear regression. Average Pearson's correlation coefficients and normalized root-mean-square errors between predicted and actual trajectories were 0.44~0.73 and 0.18~0.42, respectively, confirming the feasibility of predicting 3D arm trajectories from ECoG. We foresee this method contributing to future advancements in neuroprosthesis and neurorehabilitation technology.
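A generic sparse-linear-regression decoder of the kind described above can be sketched with the Lasso solved by iterative soft-thresholding (ISTA). The synthetic "channel" data and the regularization weight are illustrative assumptions, and the paper's novel preprocessing is not reproduced:

```python
import numpy as np

def ista_lasso(X, y, lam=0.1, n_iter=500):
    """Sparse linear regression (Lasso) via iterative soft-thresholding:
    minimises  0.5 * ||X w - y||^2 + lam * ||w||_1 ."""
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = w - (X.T @ (X @ w - y)) / L            # gradient step
        w = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return w

# Toy decoding demo: 64 "ECoG feature" channels, only 5 truly informative,
# predicting one coordinate of the arm trajectory at each time sample.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 64))
w_true = np.zeros(64)
w_true[:5] = [1.5, -1.0, 0.8, 0.6, -0.5]
y = X @ w_true + 0.05 * rng.standard_normal(200)
w_hat = ista_lasso(X, y, lam=2.0)
```

The L1 penalty drives the weights of uninformative channels to zero, which is why sparse regression suits high-channel-count neural decoding.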

  15. Tensor-Based Dictionary Learning for Spectral CT Reconstruction.

    Science.gov (United States)

    Zhang, Yanbo; Mou, Xuanqin; Wang, Ge; Yu, Hengyong

    2017-01-01

    Spectral computed tomography (CT) produces an energy-discriminative attenuation map of an object, extending a conventional image volume with a spectral dimension. In spectral CT, the image in each energy channel is sparsely representable, and the channel images are highly correlated. Exploiting these characteristics, we propose a tensor-based dictionary learning method for spectral CT reconstruction. In our method, tensor patches are extracted from an image tensor, which is reconstructed using filtered backprojection (FBP), to form a training dataset. With the Candecomp/Parafac decomposition, a tensor-based dictionary is trained, in which each atom is a rank-one tensor. The trained dictionary is then used to sparsely represent image tensor patches during an iterative reconstruction process, and the alternating minimization scheme is adapted for optimization. The effectiveness of the proposed method is validated with both numerically simulated and real preclinical mouse datasets. The results demonstrate that the proposed tensor-based method generally produces superior image quality and leads to more accurate material decomposition than currently popular methods.

  16. Tensor-based Dictionary Learning for Spectral CT Reconstruction

    Science.gov (United States)

    Zhang, Yanbo; Wang, Ge

    2016-01-01

    Spectral computed tomography (CT) produces an energy-discriminative attenuation map of an object, extending a conventional image volume with a spectral dimension. In spectral CT, the image in each energy channel is sparsely representable, and the channel images are highly correlated. Exploiting these characteristics, we propose a tensor-based dictionary learning method for spectral CT reconstruction. In our method, tensor patches are extracted from an image tensor, which is reconstructed using filtered backprojection (FBP), to form a training dataset. With the Candecomp/Parafac decomposition, a tensor-based dictionary is trained, in which each atom is a rank-one tensor. The trained dictionary is then used to sparsely represent image tensor patches during an iterative reconstruction process, and the alternating minimization scheme is adapted for optimization. The effectiveness of the proposed method is validated with both numerically simulated and real preclinical mouse datasets. The results demonstrate that the proposed tensor-based method generally produces superior image quality and leads to more accurate material decomposition than currently popular methods. PMID:27541628

  17. Quartet-based methods to reconstruct phylogenetic networks.

    Science.gov (United States)

    Yang, Jialiang; Grünewald, Stefan; Xu, Yifei; Wan, Xiu-Feng

    2014-02-20

    Phylogenetic networks are employed to visualize evolutionary relationships among a group of nucleotide sequences, genes or species when reticulate events like hybridization, recombination, reassortment and horizontal gene transfer are believed to be involved. In comparison to traditional distance-based methods, quartet-based methods consider more information in the reconstruction process and thus have the potential to be more accurate. We introduce QuartetSuite, which includes a set of new quartet-based methods, namely QuartetS, QuartetA, and QuartetM, to reconstruct phylogenetic networks from nucleotide sequences. We tested their performance and compared them with other popular methods on two simulated nucleotide sequence data sets: one generated from a tree topology and the other from a complicated evolutionary history containing three reticulate events. We further applied these methods to two real data sets: a bacterial data set consisting of seven concatenated genes of 36 bacterial species and an influenza data set related to the recently emerging H7N9 low pathogenic avian influenza viruses in China. QuartetS, QuartetA, and QuartetM have the potential to accurately reconstruct evolutionary scenarios from simple branching trees to complicated networks containing many reticulate events. These methods could provide insights into complicated biological evolutionary processes such as bacterial taxonomy and the reassortment of influenza viruses.

  18. Maximum likelihood-based analysis of single-molecule photon arrival trajectories

    Science.gov (United States)

    Hajdziona, Marta; Molski, Andrzej

    2011-02-01

    In this work we explore the statistical properties of the maximum likelihood-based analysis of one-color photon arrival trajectories. This approach does not involve binning and, therefore, all of the information contained in an observed photon trajectory is used. We study the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion and the Bayesian information criterion (BIC) in selecting the true kinetic model. We focus on the low excitation regime where photon trajectories can be modeled as realizations of Markov modulated Poisson processes. The number of observed photons is the key parameter in determining model selection and parameter estimation. For example, the BIC can select the true three-state model from competing two-, three-, and four-state kinetic models even for relatively short trajectories made up of 2 × 10³ photons. When the intensity levels are well-separated and 10⁴ photons are observed, the two-state model parameters can be estimated with about 10% precision and those for a three-state model with about 20% precision.
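A sketch of likelihood-based model selection on photon arrival data, using a two-component exponential interarrival mixture as a deliberately simplified stand-in for the full Markov modulated Poisson likelihood (which requires matrix exponentials). All rates and sizes are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import expon

rng = np.random.default_rng(3)
# Photon interarrival times from a two-intensity source: bright and dim states.
dt = np.concatenate([rng.exponential(1 / 50.0, 5000),   # bright: 50 photons/s
                     rng.exponential(1 / 5.0, 5000)])   # dim:     5 photons/s
rng.shuffle(dt)

def nll_two(params):
    """Negative log-likelihood of a two-component exponential mixture
    (simplified stand-in for the Markov modulated Poisson model)."""
    p = 1.0 / (1.0 + np.exp(-params[0]))                # mixture weight in (0,1)
    k1, k2 = np.exp(params[1]), np.exp(params[2])       # positive rates
    lik = p * k1 * np.exp(-k1 * dt) + (1 - p) * k2 * np.exp(-k2 * dt)
    return -np.sum(np.log(lik))

# One-state model: exponential interarrivals, MLE rate = 1 / mean(dt).
nll1 = -np.sum(expon.logpdf(dt, scale=dt.mean()))
res = minimize(nll_two, x0=[0.0, np.log(40.0), np.log(4.0)], method="Nelder-Mead")

n = dt.size
bic1 = 2 * nll1 + 1 * np.log(n)       # 1 free parameter
bic2 = 2 * res.fun + 3 * np.log(n)    # 3 free parameters
# BIC should favour the two-state description of this trajectory.
```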

  19. Maximum likelihood-based analysis of single-molecule photon arrival trajectories.

    Science.gov (United States)

    Hajdziona, Marta; Molski, Andrzej

    2011-02-07

    In this work we explore the statistical properties of the maximum likelihood-based analysis of one-color photon arrival trajectories. This approach does not involve binning and, therefore, all of the information contained in an observed photon trajectory is used. We study the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion and the Bayesian information criterion (BIC) in selecting the true kinetic model. We focus on the low excitation regime where photon trajectories can be modeled as realizations of Markov modulated Poisson processes. The number of observed photons is the key parameter in determining model selection and parameter estimation. For example, the BIC can select the true three-state model from competing two-, three-, and four-state kinetic models even for relatively short trajectories made up of 2 × 10³ photons. When the intensity levels are well-separated and 10⁴ photons are observed, the two-state model parameters can be estimated with about 10% precision and those for a three-state model with about 20% precision.

  20. Reconstruction of pressure sores with perforator-based propeller flaps.

    Science.gov (United States)

    Jakubietz, Rafael G; Jakubietz, Danni F; Zahn, Robert; Schmidt, Karsten; Meffert, Rainer H; Jakubietz, Michael G

    2011-03-01

    Perforator flaps have been successfully used for reconstruction of pressure sores. Whereas V-Y advancement flaps approximate debrided wound edges, perforator-based propeller flaps allow rotation of healthy tissue into the defect. Perforator-based propeller flaps were planned in 13 patients. Seven pressure sores were over the sacrum, five over the ischial tuberosity, and one on the tip of the scapula. Three patients were paraplegic, six were bedridden, and five were ambulatory. In three patients, no perforators were found. In 10 patients, propeller flaps were transferred. In two patients, total flap necrosis occurred; these defects were reconstructed with local advancement flaps. In two cases, wound dehiscence occurred and had to be revised. One hematoma required evacuation. No further complications were noted. No recurrence at the flap site occurred. Local perforator flaps allow closure of pressure sores without harvesting muscle. The propeller version has the added benefit of transferring tissue from a distant site, avoiding reapproximation of the original wound edges. Twisting of the pedicle may cause torsion and venous obstruction, which can be avoided by dissecting a pedicle of at least 3 cm. Propeller flaps are a safe option for soft tissue reconstruction of pressure sores. © Thieme Medical Publishers.

  1. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    Cserkaszky, Á; Légrády, D.; Wirth, A.; Bükki, T.; Patay, G.

    2011-01-01

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, so superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data, allowing the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort across the iterations in time-limited reconstructions. (author)

  2. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
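Fitting a symmetric Pearson type VII function to a rocking curve can be sketched with `scipy.optimize.curve_fit`. The parameterization below, in which `w` is the half-width at half-maximum, and the synthetic data are illustrative assumptions, not the paper's measured curves:

```python
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(x, amp, x0, w, m):
    """Symmetric Pearson type VII profile: m = 1 gives a Lorentzian and
    m -> infinity approaches a Gaussian, so one family brackets both shapes.
    The (2**(1/m) - 1) factor makes w the half-width at half-maximum."""
    return amp * (1.0 + ((x - x0) / w) ** 2 * (2 ** (1 / m) - 1)) ** (-m)

# Synthetic analyser rocking curve with a little measurement noise.
rng = np.random.default_rng(4)
theta = np.linspace(-20.0, 20.0, 401)    # analyser angle (arbitrary units)
y = pearson_vii(theta, 1.0, 0.5, 4.0, 1.8) + 0.005 * rng.standard_normal(theta.size)

popt, _ = curve_fit(pearson_vii, theta, y, p0=[0.9, 0.0, 3.0, 1.5],
                    bounds=([0.1, -5.0, 0.5, 0.5], [2.0, 5.0, 10.0, 10.0]))
```

The fitted peak position and width are the quantities that feed the two-image phase/attenuation retrieval.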

  3. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  4. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of the dielectric properties of heterogeneous media from tomographically collected data. The hardware system (800-1200 MHz) was built around a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularization. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method coupled with the Bayliss and Turkel radiation boundary conditions was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76 mm diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can currently be fully characterized using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data.
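A Newton iteration with combined Marquardt and Tikhonov regularization can be sketched on a toy nonlinear least-squares problem. The damping values, the toy forward model, and the decay schedule are assumptions for illustration, not the paper's FEM-based implementation:

```python
import numpy as np

def newton_mt(residual, jacobian, x0, lam=1e-2, alpha=1e-3, n_iter=20):
    """Newton/Gauss-Newton iteration with combined Marquardt (lam) and
    Tikhonov (alpha) regularisation:
        dx = (J^T J + lam*diag(J^T J) + alpha*I)^{-1} J^T r
    where J is the forward-model Jacobian and r the data residual."""
    x = x0.copy()
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        H = J.T @ J
        A = H + lam * np.diag(np.diag(H)) + alpha * np.eye(x.size)
        x = x + np.linalg.solve(A, J.T @ r)
        lam *= 0.7                    # relax Marquardt damping as we converge
    return x

# Toy "forward model": data = exp(-x1 * t) + x2; recover x from the data.
t = np.linspace(0.0, 2.0, 50)
x_true = np.array([1.3, 0.4])
y = np.exp(-x_true[0] * t) + x_true[1]

def residual(x):
    return y - (np.exp(-x[0] * t) + x[1])

def jacobian(x):
    # Forward-model Jacobian df/dx, column per parameter.
    return np.column_stack([-t * np.exp(-x[0] * t), np.ones_like(t)])

x_hat = newton_mt(residual, jacobian, np.array([0.5, 0.0]))
```

The Marquardt term stabilises early, poorly conditioned steps, while the Tikhonov term bounds the update when the Jacobian is nearly singular — the same roles they play in the microwave inversion.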

  5. Understanding Random Effects in Group-Based Trajectory Modeling: An Application of Moffitt’s Developmental Taxonomy

    OpenAIRE

    Saunders, Jessica M.

    2010-01-01

    The group-based trajectory modeling approach is a systematic way of categorizing subjects into different groups based on their developmental trajectories using formal and objective statistical criteria. With the recent advancement in methods and statistical software, modeling possibilities are almost limitless; however, parallel advances in theory development have not kept pace. This paper examines some of the modeling options that are becoming more widespread and how they impact both empiric...

  6. Perforator based rectus free tissue transfer for head and neck reconstruction: New reconstructive advantages from an old friend.

    Science.gov (United States)

    Kang, Stephen Y; Spector, Matthew E; Chepeha, Douglas B

    2017-11-01

    To demonstrate three reconstructive advantages of the perforator-based rectus free tissue transfer: a long pedicle, customizable adipose tissue, and volume reconstruction without muscle atrophy within a contained space. Thirty patients with defects of the head and neck were reconstructed with the perforator-based rectus free tissue transfer. Transplant success was 93%. Mean pedicle length was 13.4 cm. Eleven patients (37%) had vessel-poor necks, and the long pedicle provided by this transplant avoided the need for vein grafts in these patients. Adipose tissue was molded in 17 patients (57%). Twenty-five patients (83%) had defects within a contained space, such as the orbit, where it was critical to have a transplant that avoided muscle atrophy. The perforator-based rectus free tissue transfer provides a long pedicle and moldable fat for flap customization, and is useful in reconstruction of defects within a contained space where volume loss due to muscle atrophy must be prevented. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. The impact of covariance misspecification in group-based trajectory models for longitudinal data with non-stationary covariance structure.

    Science.gov (United States)

    Davies, Christopher E; Glonek, Gary Fv; Giles, Lynne C

    2017-08-01

    One purpose of a longitudinal study is to gain a better understanding of how an outcome of interest changes among a given population over time. In what follows, a trajectory will be taken to mean the series of measurements of the outcome variable for an individual. Group-based trajectory modelling methods seek to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Group-based trajectory models generally assume a certain structure in the covariances between measurements, for example, conditional independence, homogeneous variance between groups or stationary variance over time. Violations of these assumptions could be expected to result in poor model performance. We used simulation to investigate the effect of covariance misspecification on the misclassification of trajectories in commonly used models under a range of scenarios, defining a measure of performance relative to the ideal Bayesian correct classification rate. We found that the more complex models generally performed better over a range of scenarios. In particular, an incorrectly specified covariance matrix could significantly bias the results, whereas using a correctly specified but more complicated than necessary covariance matrix incurred little cost.

  8. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    Science.gov (United States)

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology was introduced. The main steps of the technique include: ① in-line collection of process spectra for the different process stages; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines were reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important practical problems that need urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
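Steps ②–④ can be sketched as follows, assuming synthetic batch data and a crude empirical control limit in place of the usual theoretical one; monitoring here uses the Q (squared prediction error) statistic of a PCA model built on the batch-wise unfolded spectra.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated in-line NIR data: 20 normal batches x 50 time points x 30 wavelengths
batches = rng.normal(0.0, 1.0, (20, 50, 30))

# Step 2: batch-wise unfolding of the 3-D spectral array into a 2-D matrix
X = batches.reshape(20, -1)
mu, sd = X.mean(0), X.std(0) + 1e-12
Xc = (X - mu) / sd

# PCA of the unfolded data (3 components retained)
_, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:3]

def spe(x):
    """Squared prediction error (Q statistic) of one unfolded batch."""
    xc = (x - mu) / sd
    resid = xc - (xc @ P.T) @ P
    return float(resid @ resid)

# Step 3: an empirical normal-operation limit (a crude stand-in for the
# usual theoretical control limit)
limit = 1.5 * max(spe(x) for x in X)

# Step 4: monitor a new batch whose spectral baseline has drifted
faulty = (rng.normal(0.0, 1.0, (50, 30)) + 2.0).ravel()
print(spe(faulty) > limit)
```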

  9. Graph-cut based discrete-valued image reconstruction.

    Science.gov (United States)

    Tuysuzoglu, Ahmet; Karl, W Clem; Stojanovic, Ivana; Castañòn, David; Ünlü, M Selim

    2015-05-01

Efficient graph-cut methods have been used with great success for labeling and denoising problems occurring in computer vision. Unfortunately, the presence of linear image mappings has prevented the use of these techniques in most discrete-amplitude image reconstruction problems. In this paper, we develop a graph-cut based framework for the direct solution of discrete-amplitude linear image reconstruction problems cast as regularized energy function minimizations. We first analyze the structure of discrete linear inverse problem cost functions to show that the obstacle to the application of graph-cut methods to their solution is the variable mixing caused by the presence of the linear sensing operator. We then propose to use a surrogate energy functional that overcomes the challenges imposed by the sensing operator yet can be utilized efficiently in existing graph-cut frameworks. We use this surrogate energy functional to devise a monotonic iterative algorithm for the solution of discrete-valued inverse problems. We first provide experiments using local convolutional operators and show the robustness of the proposed technique to noise and stability to changes in regularization parameter. Then we focus on nonlocal, tomographic examples where we consider limited-angle data problems. We compare our technique with state-of-the-art discrete and continuous image reconstruction techniques. Experiments show that the proposed method outperforms state-of-the-art techniques in challenging scenarios involving discrete-valued unknowns.
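For the classic labeling/denoising case the abstract contrasts with (no mixing sensing operator), a minimal min-cut construction looks like this; the observations, unary costs, and smoothness weight are invented, and `networkx` supplies the max-flow/min-cut solver. A node ends up on the sink side of the cut exactly when its label is 1, so the cut value equals the minimized energy.

```python
import networkx as nx

# Noisy scalar observations of a binary (two-level) signal
obs = [0.1, 0.2, 0.8, 0.3, 0.9, 0.8]
lam = 0.6                      # pairwise smoothness weight

G = nx.DiGraph()
for i, y in enumerate(obs):
    G.add_edge("s", i, capacity=(y - 1.0) ** 2)   # paid if pixel takes label 1
    G.add_edge(i, "t", capacity=y ** 2)           # paid if pixel takes label 0
for i in range(len(obs) - 1):
    G.add_edge(i, i + 1, capacity=lam)            # paid if neighbors differ
    G.add_edge(i + 1, i, capacity=lam)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
labels = [0 if i in source_side else 1 for i in range(len(obs))]
print(labels)   # globally optimal binary labeling
```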

  10. Holographic images reconstructed from GMR-based fringe pattern

    Directory of Open Access Journals (Sweden)

    Kikuchi Hiroshi

    2013-01-01

Full Text Available We have developed a magneto-optical spatial light modulator (MOSLM) using giant magneto-resistance (GMR) structures for realizing a holographic three-dimensional (3D) display. For practical applications, the image reconstructed from a hologram consisting of GMR structures should be investigated in order to study the feasibility of the MOSLM. In this study, we fabricated a hologram with a GMR-based fringe pattern and demonstrated a reconstructed image. A fringe pattern convolving a cross-shaped image was calculated by a conventional binary computer-generated hologram (CGH) technique. The CGH pattern has 2,048 × 2,048 pixels with a 5 μm pixel pitch. The GMR stack, consisting of a Tb-Fe-Co/CoFe pinned layer, a Ag spacer, a Gd-Fe free layer for light modulation, and a Ru capping layer, was deposited by dc-magnetron sputtering. The GMR hologram was formed using photo-lithography and Kr ion milling processes, followed by the deposition of a Tb-Fe-Co reference layer with large coercivity and the same Kerr-rotation angle as the free layer, and a lift-off process. The reconstructed image in the ON state was clearly observed and successfully distinguished from the OFF state by switching the magnetization direction of the free layer with an external magnetic field. These results indicate the possibility of realizing a holographic 3D display with the MOSLM using the GMR structures.

  11. Three-dimension reconstruction based on spatial light modulator

    International Nuclear Information System (INIS)

    Deng Xuejiao; Zhang Nanyang; Zeng Yanan; Yin Shiliang; Wang Weiyu

    2011-01-01

Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace, and biology. With this technology, a three-dimensional digital point cloud can be obtained from a two-dimensional image and used to simulate the three-dimensional structure of a physical object for further study. At present, three-dimensional point cloud data are mainly obtained with adaptive optics systems using a Shack-Hartmann sensor and with phase-shifting digital holography. For surface fitting, many methods are available, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems in three-dimensional reconstruction are the extraction of feature points and the curve-fitting arithmetic. To address these problems, the surface normal vector of each pixel is first calculated in the light-source coordinate system; these vectors are then converted to image coordinates, yielding the expected 3D point cloud. After de-noising and repair, feature points are selected and fitted with Zernike polynomials to obtain a fitting function of the surface topography, thereby reconstructing the object's three-dimensional topography. In this paper, a new three-dimensional reconstruction algorithm is proposed with which the topography can be estimated from grayscale values at different sample points. Simulations and experimental results show that the new algorithm has a strong fitting capability, especially for large-scale objects.
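The Zernike-polynomial fitting step can be sketched as a least-squares fit of a few low-order Zernike terms (piston, tilts, defocus, in Cartesian form) over the unit disk; the surface, sample points, and noise level below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample points on the unit disk, where Zernike polynomials are orthogonal
theta = rng.uniform(0, 2 * np.pi, 500)
r = np.sqrt(rng.uniform(0, 1, 500))
x, y = r * np.cos(theta), r * np.sin(theta)

# First few Zernike polynomials: piston, x-tilt, y-tilt, defocus
basis = np.column_stack([np.ones_like(x), x, y, 2 * (x**2 + y**2) - 1])

# Synthetic surface topography: known coefficients plus measurement noise
true_c = np.array([0.5, -1.0, 2.0, 0.8])
z = basis @ true_c + rng.normal(0, 0.01, x.size)

# Least-squares fit of the surface, as in the Zernike fitting step
c_hat, *_ = np.linalg.lstsq(basis, z, rcond=None)
print(np.round(c_hat, 2))
```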

  12. Three-dimension reconstruction based on spatial light modulator

    Energy Technology Data Exchange (ETDEWEB)

    Deng Xuejiao; Zhang Nanyang; Zeng Yanan; Yin Shiliang; Wang Weiyu, E-mail: daisydelring@yahoo.com.cn [Huazhong University of Science and Technology (China)

    2011-02-01

Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace, and biology. With this technology, a three-dimensional digital point cloud can be obtained from a two-dimensional image and used to simulate the three-dimensional structure of a physical object for further study. At present, three-dimensional point cloud data are mainly obtained with adaptive optics systems using a Shack-Hartmann sensor and with phase-shifting digital holography. For surface fitting, many methods are available, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems in three-dimensional reconstruction are the extraction of feature points and the curve-fitting arithmetic. To address these problems, the surface normal vector of each pixel is first calculated in the light-source coordinate system; these vectors are then converted to image coordinates, yielding the expected 3D point cloud. After de-noising and repair, feature points are selected and fitted with Zernike polynomials to obtain a fitting function of the surface topography, thereby reconstructing the object's three-dimensional topography. In this paper, a new three-dimensional reconstruction algorithm is proposed with which the topography can be estimated from grayscale values at different sample points. Simulations and experimental results show that the new algorithm has a strong fitting capability, especially for large-scale objects.

  13. Three-dimension reconstruction based on spatial light modulator

    Science.gov (United States)

    Deng, Xuejiao; Zhang, Nanyang; Zeng, Yanan; Yin, Shiliang; Wang, Weiyu

    2011-02-01

Three-dimensional reconstruction, an important research direction in computer graphics, is widely used in related fields such as industrial design and manufacturing, construction, aerospace, and biology. With this technology, a three-dimensional digital point cloud can be obtained from a two-dimensional image and used to simulate the three-dimensional structure of a physical object for further study. At present, three-dimensional point cloud data are mainly obtained with adaptive optics systems using a Shack-Hartmann sensor and with phase-shifting digital holography. For surface fitting, many methods are available, such as the iterated discrete Fourier transform, convolution and image interpolation, and linear phase retrieval. The main problems in three-dimensional reconstruction are the extraction of feature points and the curve-fitting arithmetic. To address these problems, the surface normal vector of each pixel is first calculated in the light-source coordinate system; these vectors are then converted to image coordinates, yielding the expected 3D point cloud. After de-noising and repair, feature points are selected and fitted with Zernike polynomials to obtain a fitting function of the surface topography, thereby reconstructing the object's three-dimensional topography. In this paper, a new three-dimensional reconstruction algorithm is proposed with which the topography can be estimated from grayscale values at different sample points. Simulations and experimental results show that the new algorithm has a strong fitting capability, especially for large-scale objects.

  14. A Novel Method of Robust Trajectory Linearization Control Based on Disturbance Rejection

    Directory of Open Access Journals (Sweden)

    Xingling Shao

    2014-01-01

Full Text Available A novel method of robust trajectory linearization control for a class of nonlinear systems with uncertainties, based on disturbance rejection, is proposed. Firstly, on the basis of the trajectory linearization control (TLC) method, a feedback-linearization-based control law is designed to transform the original tracking error dynamics into the canonical integral-chain form. To reduce the influence of uncertainties, a linear extended state observer (LESO) with the tracking error as input is constructed to estimate the tracking error vector, as well as the uncertainties, in an integrated manner. Meanwhile, the boundedness of the estimation error is established by theoretical analysis. In addition, a decoupled controller based on the LESO, with a simple and easily tuned form, is synthesized to realize output tracking for the closed-loop system. The closed-loop stability of the system under the proposed LESO-based control structure is established. Simulation results are also presented to illustrate the effectiveness of the control strategy.
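The LESO idea can be sketched on a double integrator with a constant unknown disturbance: the observer carries an extended state that converges to the disturbance. The plant, bandwidth-parameterized gains, and disturbance value below are invented for illustration, not taken from the paper.

```python
# Double-integrator plant with an unknown constant disturbance d:
#   x1' = x2,  x2' = u + d,  y = x1
dt, T = 1e-3, 5.0
d, u = 1.0, 0.0                     # zero input; the LESO must find d

# Observer gains placing all observer poles at -w (bandwidth parameterization)
w = 20.0
b1, b2, b3 = 3 * w, 3 * w**2, w**3

x1 = x2 = 0.0                       # plant states
z1 = z2 = z3 = 0.0                  # observer states (z3 estimates d)
for _ in range(int(T / dt)):
    # plant (forward Euler)
    x1, x2 = x1 + dt * x2, x2 + dt * (u + d)
    # linear extended state observer driven by the output error y - z1
    e = x1 - z1
    z1, z2, z3 = (z1 + dt * (z2 + b1 * e),
                  z2 + dt * (z3 + u + b2 * e),
                  z3 + dt * (b3 * e))

print(round(z3, 3))   # the disturbance estimate converges toward d = 1.0
```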

  15. An equilibrium-point model of electromyographic patterns during single-joint movements based on experimentally reconstructed control signals.

    Science.gov (United States)

    Latash, M L; Goodman, S R

    1994-01-01

    The purpose of this work has been to develop a model of electromyographic (EMG) patterns during single-joint movements based on a version of the equilibrium-point hypothesis, a method for experimental reconstruction of the joint compliant characteristics, the dual-strategy hypothesis, and a kinematic model of movement trajectory. EMG patterns are considered emergent properties of hypothetical control patterns that are equally affected by the control signals and peripheral feedback reflecting actual movement trajectory. A computer model generated the EMG patterns based on simulated movement kinematics and hypothetical control signals derived from the reconstructed joint compliant characteristics. The model predictions have been compared to published recordings of movement kinematics and EMG patterns in a variety of movement conditions, including movements over different distances, at different speeds, against different-known inertial loads, and in conditions of possible unexpected decrease in the inertial load. Changes in task parameters within the model led to simulated EMG patterns qualitatively similar to the experimentally recorded EMG patterns. The model's predictive power compares it favourably to the existing models of the EMG patterns. Copyright © 1994. Published by Elsevier Ltd.

  16. Using rapidly-exploring random tree-based algorithms to find smooth and optimal trajectories

    CSIR Research Space (South Africa)

    Matebese, B

    2012-10-01

Full Text Available Using rapidly-exploring random tree-based algorithms to find smooth and optimal trajectories. B Matebese and S Utete (CSIR Modelling and Digital Science, PO Box 395, Pretoria, South Africa, 0001), MK Banda (Department of Applied Mathematics, Stellenbosch University)... and complex environments. The RRT algorithm is the most popular and has the ability to find a feasible solution faster than other algorithms. The drawback of using RRT is that, as the number of samples increases, the probability that the algorithm converges...
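A bare-bones RRT in an empty 2-D workspace gives the flavor of the algorithm discussed above (without the smoothing or optimality extensions of the report); the workspace bounds, step size, and goal bias are illustrative choices.

```python
import math
import random

random.seed(0)

def rrt(start, goal, step=0.5, goal_tol=0.5, max_iters=5000):
    """Minimal RRT: grow a tree of points toward random samples."""
    nodes, parents = [start], {0: None}
    for _ in range(max_iters):
        # sample a random point, biased toward the goal 10% of the time
        q = goal if random.random() < 0.1 else (random.uniform(0, 10),
                                                random.uniform(0, 10))
        # find the nearest tree node
        i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], q))
        d = math.dist(nodes[i], q)
        if d == 0:
            continue
        # steer from the nearest node toward the sample by one step
        new = tuple(a + step * (b - a) / d for a, b in zip(nodes[i], q))
        nodes.append(new)
        parents[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:   # goal reached: walk back to root
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parents[k]
            return path[::-1]
    return None

path = rrt((0.0, 0.0), (9.0, 9.0))
print(path is not None, len(path))
```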

  17. Trajectory Optimization of Spray Painting Robot for Complex Curved Surface Based on Exponential Mean Bézier Method

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2017-01-01

Full Text Available Automated tool trajectory planning for spray painting robots is still a challenging problem, especially for a large complex curved surface. This paper presents a new method of trajectory optimization for spray painting robots based on the exponential mean Bézier method. The definition and the three theorems of exponential mean Bézier curves are discussed. Then a spatial painting path generation method based on exponential mean Bézier curves is developed. A new simple algorithm for trajectory optimization on complex curved surfaces is introduced. A golden section method is adopted to calculate the values. The experimental results illustrate that the exponential mean Bézier curves enhance the flexibility of path planning and that the trajectory optimization algorithm achieves satisfactory performance. This method can also be extended to other applications.
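The golden section step can be sketched in a few lines; the quadratic objective below is a toy stand-in for the cost being minimized along one optimization parameter, not the paper's actual painting cost function.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Locate the minimizer of a unimodal f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                      # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                                # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Toy objective standing in for the cost along one trajectory parameter
x_star = golden_section_min(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
print(round(x_star, 4))   # minimizer near 2.0
```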

  18. Comprehensive quantification of signal-to-noise ratio and g-factor for image-based and k-space-based parallel imaging reconstructions.

    Science.gov (United States)

    Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A

    2008-10-01

    Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
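The pseudo multiple replica idea can be sketched on a toy 1-D Cartesian SENSE problem: many synthetic-noise replicas are pushed through the linear reconstruction, and the pixelwise output noise is compared with the analytic value sigma * sqrt(diag((S^T S)^-1)). The coil sensitivities, image, and noise level are invented; real implementations work with complex k-space data and measured noise covariance.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D SENSE toy problem: n pixels, 2 coils, acceleration R = 2
n, R = 8, 2
x = rng.normal(1.0, 0.2, n)                          # true image (real-valued)
pos = np.arange(n) / n
sens = np.stack([np.exp(-2 * pos), np.exp(-2 * (1 - pos))])   # coil maps

# Accelerated acquisition: each coil sees an aliased half-FOV image
half = n // R
alias = np.stack([s[:half] * x[:half] + s[half:] * x[half:] for s in sens])

def sense_recon(a):
    """Unfold each aliased pixel pair by least squares (Cartesian SENSE)."""
    out = np.empty(n)
    for p in range(half):
        Smat = np.array([[sens[c, p], sens[c, p + half]] for c in range(2)])
        out[[p, p + half]] = np.linalg.lstsq(Smat, a[:, p], rcond=None)[0]
    return out

# Pseudo multiple replica: reconstruct many synthetic-noise replicas and
# measure the pixelwise noise standard deviation of the output
sigma, n_rep = 0.01, 2000
recons = np.stack([sense_recon(alias + rng.normal(0, sigma, alias.shape))
                   for _ in range(n_rep)])
mc_std = recons.std(axis=0)

# Analytic output noise std for comparison, per aliased pixel pair
ana_std = np.empty(n)
for p in range(half):
    Smat = np.array([[sens[c, p], sens[c, p + half]] for c in range(2)])
    cov = np.linalg.inv(Smat.T @ Smat)
    ana_std[[p, p + half]] = sigma * np.sqrt(np.diag(cov))

print(np.max(np.abs(mc_std / ana_std - 1.0)))   # small relative error
```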

  19. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    International Nuclear Information System (INIS)

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-01

Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory efficient model-based approaches represents then an important challenge to advance on the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  20. Identifying Different Transportation Modes from Trajectory Data Using Tree-Based Ensemble Classifiers

    Directory of Open Access Journals (Sweden)

    Zhibin Xiao

    2017-02-01

Full Text Available Recognition of transportation modes can be used in different applications including human behavior research, transport management and traffic control. Previous work on transportation mode recognition has often relied on using multiple sensors or matching Geographic Information System (GIS) information, which is not possible in many cases. In this paper, an approach based on ensemble learning is proposed to infer hybrid transportation modes using only Global Position System (GPS) data. First, in order to distinguish between different transportation modes, we used a statistical method to generate global features and extract several local features from sub-trajectories after trajectory segmentation, before these features were combined in the classification stage. Second, to obtain a better performance, we used tree-based ensemble models (Random Forest, Gradient Boosting Decision Tree, and XGBoost) instead of traditional methods (K-Nearest Neighbor, Decision Tree, and Support Vector Machines) to classify the different transportation modes. The experimental results have shown the efficacy of our proposed approach. Among the ensemble models, the XGBoost model produced the best performance, with a classification accuracy of 90.77% obtained on the GEOLIFE dataset, and we used a tree-based ensemble method for feature selection to reduce the model complexity.
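A minimal version of the tree-based ensemble classification can be sketched with scikit-learn's GradientBoostingClassifier (standing in for XGBoost) on synthetic global speed features rather than GEOLIFE trajectories; the modes, speeds, and feature set are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

def features(mean_speed, n):
    """Global features per trajectory: mean, std, and max of speed (m/s)."""
    speeds = rng.normal(mean_speed, mean_speed * 0.2, (n, 60)).clip(min=0)
    return np.column_stack([speeds.mean(1), speeds.std(1), speeds.max(1)])

# Three synthetic modes: walk (~1.4 m/s), bike (~4 m/s), car (~12 m/s)
X = np.vstack([features(1.4, 200), features(4.0, 200), features(12.0, 200)])
y = np.repeat([0, 1, 2], 200)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"accuracy: {acc:.2f}")
```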

  1. CC_TRS: Continuous Clustering of Trajectory Stream Data Based on Micro Cluster Life

    Directory of Open Access Journals (Sweden)

    Musaab Riyadh

    2017-01-01

Full Text Available The rapid spread of positioning devices leads to the generation of massive spatiotemporal trajectory data. In some scenarios, spatiotemporal data are received in a stream manner. Clustering of stream data is beneficial for different applications such as traffic management and weather forecasting. In this article, an algorithm for Continuous Clustering of Trajectory Stream Data Based on Micro Cluster Life is proposed. The algorithm consists of two phases. In the online phase, temporal micro clusters are used to store summarized spatiotemporal information for each group of similar segments. The clustering task in the online phase is based on temporal micro cluster lifetime instead of the time window technique, which divides stream data into time bins and clusters each bin separately. In the offline phase, a density-based clustering approach is used to generate macro clusters from the temporal micro clusters. Evaluation of the proposed algorithm on real data sets shows its efficiency and effectiveness and proves it to be an efficient alternative to the time window technique.
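The online phase can be sketched with a toy micro-cluster scheme in which clusters expire by lifetime rather than by time window; the radius, lifetime, and stream below are invented, and the density-based offline phase is omitted.

```python
import math

class MicroCluster:
    """Summary of one group of nearby points: coordinate sums, count, last update."""
    def __init__(self, point, t):
        self.sum, self.n, self.last = list(point), 1, t

    @property
    def center(self):
        return [s / self.n for s in self.sum]

    def absorb(self, point, t):
        self.sum = [s + p for s, p in zip(self.sum, point)]
        self.n += 1
        self.last = t

def stream_cluster(stream, radius=1.0, lifetime=5.0):
    """Online phase: absorb each point into the nearest live micro cluster,
    retiring clusters not updated within `lifetime` time units."""
    live, retired = [], []
    for t, point in stream:
        # retire expired micro clusters instead of cutting time into bins
        retired += [m for m in live if t - m.last > lifetime]
        live = [m for m in live if t - m.last <= lifetime]
        near = [m for m in live if math.dist(m.center, point) <= radius]
        if near:
            min(near, key=lambda m: math.dist(m.center, point)).absorb(point, t)
        else:
            live.append(MicroCluster(point, t))
    return live + retired

# Two spatial groups of trajectory points arriving as a stream
stream = [(t, (0.1 * (t % 3), 0.0)) for t in range(10)] + \
         [(t, (8.0, 8.0 + 0.1 * (t % 3))) for t in range(10, 20)]
clusters = stream_cluster(stream)
print(len(clusters))   # one micro cluster per spatial group
```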

  2. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data are investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding the integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique concerning the verification of people trajectories

  3. Image quality of iterative reconstruction in cranial CT imaging: comparison of model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASiR).

    Science.gov (United States)

    Notohamiprodjo, S; Deak, Z; Meurer, F; Maertz, F; Mueck, F G; Geyer, L L; Wirth, S

    2015-01-01

The purpose of this study was to compare cranial CT (CCT) image quality (IQ) of the MBIR algorithm with standard iterative reconstruction (ASiR). In this institutional review board (IRB)-approved study, raw data sets of 100 unenhanced CCT examinations (120 kV, 50-260 mAs, 20 mm collimation, 0.984 pitch) were reconstructed with both ASiR and MBIR. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated from attenuation values measured in the caudate nucleus, frontal white matter, anterior ventricle horn, fourth ventricle, and pons. Two radiologists, who were blinded to the reconstruction algorithms, evaluated anonymized multiplanar reformations of 2.5 mm with respect to depiction of different parenchymal structures and the impact of artefacts on IQ with a five-point scale (0: unacceptable, 1: less than average, 2: average, 3: above average, 4: excellent). MBIR decreased artefacts more effectively than ASiR, and MBIR images were rated with significantly higher IQ scores than ASiR images. As CCT is an examination that is frequently required, the use of MBIR may allow for substantial reduction of radiation exposure caused by medical diagnostics. • Model-based iterative reconstruction (MBIR) effectively decreased artefacts in cranial CT. • MBIR reconstructed images were rated with significantly higher scores for image quality. • Model-based iterative reconstruction may allow reduced-dose diagnostic examination protocols.

  4. Online Detection of Anomalous Sub-trajectories: A Sliding Window Approach Based on Conformal Anomaly Detection and Local Outlier Factor

    OpenAIRE

    Laxhammar , Rikard; Falkman , Göran

    2012-01-01

    Part 4: First Conformal Prediction and Its Applications Workshop (COPA 2012); International audience; Automated detection of anomalous trajectories is an important problem in the surveillance domain. Various algorithms based on learning of normal trajectory patterns have been proposed for this problem. Yet, these algorithms suffer from one or more of the following limitations: First, they are essentially designed for offline anomaly detection in databases. Second, they are insensitive to loca...
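The Local Outlier Factor component named in the title is available in scikit-learn; a minimal sketch scores a two-point test window against synthetic "normal" trajectory points (the corridor data and parameters are invented, and the conformal sliding-window machinery of the paper is not modeled).

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(5)

# "Normal" trajectory points: a noisy straight corridor in 2-D
normal = np.column_stack([np.linspace(0, 10, 200),
                          rng.normal(0, 0.1, 200)])

# novelty=True lets the fitted model score new points arriving online
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(normal)

window = np.array([[5.0, 0.05],     # consistent with the normal pattern
                   [5.0, 4.00]])    # far off the corridor: anomalous
print(lof.predict(window))          # 1 = normal, -1 = anomaly
```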

  5. A fast image reconstruction technique based on ART

    International Nuclear Information System (INIS)

    Zhang Shunli; Zhang Dinghua; Wang Kai; Huang Kuidong; Li Weibin

    2007-01-01

Algebraic Reconstruction Technique (ART) is an iterative method for image reconstruction. Improving its reconstruction speed has been one of the important research aspects of ART. For the simplified weight-coefficient reconstruction model of ART, a fast grid traverse algorithm is proposed, which can determine the grid index by simple operations such as addition, subtraction and comparison. Since the weight coefficients are calculated in real time during iteration, a large amount of storage is saved and the reconstruction speed is greatly increased. Experimental results show that the new algorithm is very effective and the reconstruction speed is improved about 10 times compared with the traditional algorithm. (authors)
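The ART iteration itself (the Kaczmarz row-by-row projection that the grid-traversal speedup accelerates) can be sketched on a tiny synthetic system; the matrix and image are invented, and the paper's weight-coefficient model is not reproduced.

```python
import numpy as np

def art(A, b, n_sweeps=200, relax=1.0):
    """ART (Kaczmarz): project the current estimate onto each
    measurement hyperplane a_i . x = b_i in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for ai, bi in zip(A, b):
            x += relax * (bi - ai @ x) / (ai @ ai) * ai
    return x

# Tiny consistent system standing in for the projection equations
rng = np.random.default_rng(6)
A = rng.normal(size=(12, 4))          # 12 rays, 4 pixels
x_true = np.array([1.0, 0.5, -0.3, 2.0])
b = A @ x_true

x_rec = art(A, b)
print(np.round(x_rec, 3))
```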

  6. Precise shape reconstruction by active pattern in total-internal-reflection-based tactile sensor.

    Science.gov (United States)

    Saga, Satoshi; Taira, Ryosuke; Deguchi, Koichiro

    2014-03-01

    We are developing a total-internal-reflection-based tactile sensor in which the shape is reconstructed using an optical reflection. This sensor consists of silicone rubber, an image pattern, and a camera. It reconstructs the shape of the sensor surface from an image of a pattern reflected at the inner sensor surface by total internal reflection. In this study, we propose precise real-time reconstruction by employing an optimization method. Furthermore, we propose to use active patterns. Deformation of the reflection image causes reconstruction errors. By controlling the image pattern, the sensor reconstructs the surface deformation more precisely. We implement the proposed optimization and active-pattern-based reconstruction methods in a reflection-based tactile sensor, and perform reconstruction experiments using the system. A precise deformation experiment confirms the linearity and precision of the reconstruction.

  7. Level-set-based reconstruction algorithm for EIT lung images: first clinical results.

    Science.gov (United States)

    Rahmati, Peyman; Soleimani, Manuchehr; Pulletz, Sven; Frerichs, Inéz; Adler, Andy

    2012-05-01

    We show the first clinical results using the level-set-based reconstruction algorithm for electrical impedance tomography (EIT) data. The level-set-based reconstruction method (LSRM) allows the reconstruction of non-smooth interfaces between image regions, which are typically smoothed by traditional voxel-based reconstruction methods (VBRMs). We develop a time difference formulation of the LSRM for 2D images. The proposed reconstruction method is applied to reconstruct clinical EIT data of a slow flow inflation pressure-volume manoeuvre in lung-healthy and adult lung-injury patients. Images from the LSRM and the VBRM are compared. The results show comparable reconstructed images, but with an improved ability to reconstruct sharp conductivity changes in the distribution of lung ventilation using the LSRM.

  8. Level-set-based reconstruction algorithm for EIT lung images: first clinical results

    International Nuclear Information System (INIS)

    Rahmati, Peyman; Adler, Andy; Soleimani, Manuchehr; Pulletz, Sven; Frerichs, Inéz

    2012-01-01

    We show the first clinical results using the level-set-based reconstruction algorithm for electrical impedance tomography (EIT) data. The level-set-based reconstruction method (LSRM) allows the reconstruction of non-smooth interfaces between image regions, which are typically smoothed by traditional voxel-based reconstruction methods (VBRMs). We develop a time difference formulation of the LSRM for 2D images. The proposed reconstruction method is applied to reconstruct clinical EIT data of a slow flow inflation pressure–volume manoeuvre in lung-healthy and adult lung-injury patients. Images from the LSRM and the VBRM are compared. The results show comparable reconstructed images, but with an improved ability to reconstruct sharp conductivity changes in the distribution of lung ventilation using the LSRM. (paper)

  9. Upscaling of dilution and mixing using a trajectory based Spatial Markov random walk model in a periodic flow domain

    Science.gov (United States)

    Sund, Nicole L.; Porta, Giovanni M.; Bolster, Diogo

    2017-05-01

    The Spatial Markov Model (SMM) is an upscaled model that has been used successfully to predict effective mean transport across a broad range of hydrologic settings. Here we propose a novel variant of the SMM, applicable to spatially periodic systems. This SMM is built using particle trajectories, rather than travel times. By applying the proposed SMM to a simple benchmark problem we demonstrate that it can predict mean effective transport, when compared to data from fully resolved direct numerical simulations. Next we propose a methodology for using this SMM framework to predict measures of mixing and dilution, that do not just depend on mean concentrations, but are strongly impacted by pore-scale concentration fluctuations. We use information from trajectories of particles to downscale and reconstruct pore-scale approximate concentration fields from which mixing and dilution measures are then calculated. The comparison between measurements from fully resolved simulations and predictions with the SMM agree very favorably.
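The Markov-chain backbone of an SMM can be sketched by estimating a transition matrix between two velocity classes from simulated particle state sequences; the transition matrix, trajectory lengths, and two-state discretization are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two velocity classes (slow/fast) with a known transition matrix across
# successive cells of the periodic domain
P_true = np.array([[0.8, 0.2],
                   [0.3, 0.7]])

def trajectory(n_steps):
    """One particle trajectory as a sequence of velocity-class states."""
    s, seq = rng.integers(2), []
    for _ in range(n_steps):
        seq.append(s)
        s = rng.choice(2, p=P_true[s])
    return seq

# Estimate the transition matrix from many particle trajectories
counts = np.zeros((2, 2))
for _ in range(500):
    seq = trajectory(50)
    for a, b in zip(seq, seq[1:]):
        counts[a, b] += 1

P_hat = counts / counts.sum(axis=1, keepdims=True)
print(np.round(P_hat, 2))
```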

  10. RGBD Video Based Human Hand Trajectory Tracking and Gesture Recognition System

    Directory of Open Access Journals (Sweden)

    Weihua Liu

    2015-01-01

Full Text Available The task of human hand trajectory tracking and gesture trajectory recognition based on synchronized color and depth video is considered. Toward this end, in the facet of hand tracking, a joint observation model with the hand cues of skin saliency, motion, and depth is integrated into a particle filter in order to move particles to the local peak in the likelihood. The proposed hand tracking method, namely, the salient skin, motion, and depth based particle filter (SSMD-PF), is capable of improving the tracking accuracy considerably, in the context of the signer performing the gesture toward the camera device and in front of moving, cluttered backgrounds. In the facet of gesture recognition, a shape-order context descriptor on the basis of shape context is introduced, which can describe the gesture in the spatiotemporal domain. The efficient shape-order context descriptor can reveal the shape relationship and embed gesture sequence order information into the descriptor. Moreover, the shape-order context leads to a robust score for gesture invariance. Our approach is complemented with experimental results on the settings of the challenging hand-signed digits datasets and the American sign language dataset, which corroborate the performance of the novel techniques.
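The particle filter backbone (without the skin/motion/depth observation model) can be sketched on a 1-D tracking toy problem: propagate particles, weight them by an observation likelihood, estimate, then resample. All noise levels and the motion model are invented.

```python
import numpy as np

rng = np.random.default_rng(8)

# Bootstrap particle filter for a 1-D position observed in noise
n_p, T = 1000, 50
q_std, r_std = 0.3, 0.5             # process / observation noise

# Ground-truth random-walk trajectory and noisy observations
truth = np.cumsum(rng.normal(0, q_std, T))
obs = truth + rng.normal(0, r_std, T)

particles = np.zeros(n_p)
estimates = []
for z in obs:
    particles += rng.normal(0, q_std, n_p)              # propagate
    w = np.exp(-0.5 * ((z - particles) / r_std) ** 2)   # likelihood weights
    w /= w.sum()
    estimates.append(np.dot(w, particles))              # weighted-mean estimate
    particles = particles[rng.choice(n_p, n_p, p=w)]    # resample by weight

rmse = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
print(round(rmse, 3))   # below the raw observation noise of 0.5
```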

  11. Constructing activity–mobility trajectories of college students based on smart card transaction data

    Directory of Open Access Journals (Sweden)

    Negin Ebadi

    2017-12-01

    Full Text Available In this research, we use the UB Card as a convenient source of combined smart transaction data to define a campus-wide model for constructing students’ activity–mobility trajectories in the time–space dimension. The UB Card is a student’s official ID at the University at Buffalo and is used across campus for various activities, including the Stampedes and Shuttles (the on-campus bus system), facilities access, library services, dining, and shopping. Two activity–mobility trajectory construction algorithms are developed. The base algorithm constructs students’ activity–mobility patterns in the space–time dimension using a set of smart card transaction data points as the only inputs. The modified individualized algorithm constructs activity–mobility patterns with prior knowledge of students’ previous patterns, since students have similar patterns on certain days of the week. A database of 37 students’ travel surveys and UB Card transactions covering a period of 5 days is used to illustrate the results of the study. Three measures of error are proposed to capture time allocation, location deviation, and activity sequences. These measures indicate acceptable accuracy (12–25% error ranges for activity types and an average error of 0.04–0.16 miles for location predictions) and show the potential of inferring activity–mobility behaviors from smart card transaction type data sets.
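    The base algorithm's idea, that each transaction anchors an activity episode lasting until the next transaction, can be sketched as follows (the transaction log and field names are invented, not the actual UB Card schema):

```python
from datetime import datetime

# Hypothetical one-day transaction log for one student:
# (transaction type, location, timestamp) -- illustrative only.
transactions = [
    ("bus",    "South Campus",  "2017-10-02 08:05"),
    ("access", "Capen Library", "2017-10-02 09:00"),
    ("dining", "Student Union", "2017-10-02 12:15"),
    ("access", "Capen Library", "2017-10-02 13:10"),
    ("bus",    "South Campus",  "2017-10-02 17:40"),
]

def base_trajectory(txns):
    """Base-algorithm sketch: each transaction anchors an activity episode
    lasting until the next transaction, yielding a time-space trajectory."""
    parsed = sorted(
        (datetime.strptime(t, "%Y-%m-%d %H:%M"), kind, loc)
        for kind, loc, t in txns
    )
    episodes = []
    for (t0, kind, loc), (t1, _, _) in zip(parsed, parsed[1:]):
        hours = (t1 - t0).total_seconds() / 3600
        episodes.append((loc, kind, round(hours, 2)))
    return episodes

for loc, kind, dur in base_trajectory(transactions):
    print(f"{loc:13s} {kind:7s} {dur:5.2f} h")
```

    The modified individualized algorithm would additionally fill gaps between transactions using that student's patterns from previous matching weekdays.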

  12. Vision-Based Leader Vehicle Trajectory Tracking for Multiple Agricultural Vehicles.

    Science.gov (United States)

    Zhang, Linhuan; Ahamed, Tofael; Zhang, Yan; Gao, Pengbo; Takigawa, Tomohiro

    2016-04-22

    The aim of this study was to design a navigation system composed of a human-controlled leader vehicle and a follower vehicle. The follower vehicle automatically tracks the leader vehicle. With such a system, a human driver can control two vehicles efficiently in agricultural operations. The tracking system was developed for the leader and the follower vehicle, and control of the follower was performed using a camera vision system. A stable and accurate monocular vision-based sensing system was designed, consisting of a camera and rectangular markers. Noise in the data acquisition was reduced by using the least-squares method. A feedback control algorithm was used to allow the follower vehicle to track the trajectory of the leader vehicle. A proportional-integral-derivative (PID) controller was introduced to maintain the required distance between the leader and the follower vehicle. Field experiments were conducted to evaluate the sensing and tracking performances of the leader-follower system while the leader vehicle was driven at an average speed of 0.3 m/s. In the case of linear trajectory tracking, the root mean square (RMS) errors were 6.5 cm, 8.9 cm and 16.4 cm for straight, turning and zigzag paths, respectively. For parallel trajectory tracking, the RMS errors were 7.1 cm, 14.6 cm and 14.0 cm for straight, turning and zigzag paths, respectively. The navigation performances indicated that the autonomous follower vehicle was able to follow the leader vehicle, and the tracking accuracy was found to be satisfactory. Therefore, the developed leader-follower system can be implemented for the harvesting of grains, using a combine as the leader and an unloader as the autonomous follower vehicle.
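    The distance-keeping loop can be sketched with a textbook discrete PID controller commanding the follower speed. The gains, target gap, and speed limits below are illustrative, not the paper's values; only the 0.3 m/s leader speed is taken from the abstract:

```python
class PID:
    """Discrete PID controller; gains and geometry here are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, error):
        # Clamp the integral term as simple anti-windup.
        self.integral = max(-5.0, min(5.0, self.integral + error * self.dt))
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

dt, target_gap = 0.1, 5.0            # time step (s), desired gap (m)
leader_v = 0.3                       # leader speed (m/s), as in the field tests
leader_x, follower_x = 10.0, 0.0
pid = PID(kp=0.5, ki=0.1, kd=0.1, dt=dt)

for _ in range(2000):                # 200 s of simulated driving
    leader_x += leader_v * dt
    gap_error = (leader_x - follower_x) - target_gap
    follower_v = max(0.0, min(1.0, pid.update(gap_error)))   # commanded speed
    follower_x += follower_v * dt

final_gap = leader_x - follower_x
print(f"final gap: {final_gap:.2f} m (target {target_gap} m)")
```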

  13. A Control Framework for Anthropomorphic Biped Walking Based on Stabilizing Feedforward Trajectories.

    Science.gov (United States)

    Rezazadeh, Siavash; Gregg, Robert D

    2016-10-01

    Although dynamic walking methods have had notable successes in the control of bipedal robots in recent years, most humanoid robots still rely on quasi-static Zero Moment Point controllers. This work is an attempt to design a highly stable controller for dynamic walking of a human-like model, which can be used both for control of humanoid robots and for prosthetic legs. The method is based on using time-based trajectories that induce a highly stable limit cycle in the bipedal robot. The time-based nature of the controller motivates its use to entrain a model of an amputee walking, which can potentially lead to better coordination of the interaction between the prosthesis and the human. The simulations demonstrate the stability of the controller and its robustness against external perturbations.

  14. Trajectory Generation and Stability Analysis for Reconfigurable Klann Mechanism Based Walking Robot

    Directory of Open Access Journals (Sweden)

    Jaichandar Kulandaidaasan Sheba

    2016-06-01

    Full Text Available Reconfigurable legged robots based on one degree of freedom are highly desired because they are effective on rough and irregular terrains and provide mobility in such terrain with simple control schemes. Reconfigurable legged robots should maintain stability during rest and motion with a minimum number of legs, while maintaining the full range of walking patterns resulting from different gait configurations. In this paper we present a method to generate input trajectories for reconfigurable quadruped robots based on the Klann mechanism so as to properly synchronize movement. Six useful gait cycles based on this reconfigurable Klann mechanism for quadruped robots are clearly shown. The platform stability for these six gait cycles is validated through simulation results, which clearly show the capabilities of the reconfigurable design.

  15. Reconstructing Carotenoid-Based and Structural Coloration in Fossil Skin.

    Science.gov (United States)

    McNamara, Maria E; Orr, Patrick J; Kearns, Stuart L; Alcalá, Luis; Anadón, Pere; Peñalver, Enrique

    2016-04-25

    Evidence of original coloration in fossils provides insights into the visual communication strategies used by ancient animals and the functional evolution of coloration over time [1-7]. Hitherto, all reconstructions of the colors of reptile integument and the plumage of fossil birds and feathered dinosaurs have been of melanin-based coloration [1-6]. Extant animals also use other mechanisms for producing color [8], but these have not been identified in fossils. Here we report the first examples of carotenoid-based coloration in the fossil record, and of structural coloration in fossil integument. The fossil skin, from a 10 million-year-old colubrid snake from the Late Miocene Libros Lagerstätte (Teruel, Spain) [9, 10], preserves dermal pigment cells (chromatophores)-xanthophores, iridophores, and melanophores-in calcium phosphate. Comparison with chromatophore abundance and position in extant reptiles [11-15] indicates that the fossil snake was pale-colored in ventral regions; dorsal and lateral regions were green with brown-black and yellow-green transverse blotches. Such coloration most likely functioned in substrate matching and intraspecific signaling. Skin replicated in authigenic minerals is not uncommon in exceptionally preserved fossils [16, 17], and dermal pigment cells generate coloration in numerous reptile, amphibian, and fish taxa today [18]. Our discovery thus represents a new means by which to reconstruct the original coloration of exceptionally preserved fossil vertebrates. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Task-based optimization of image reconstruction in breast CT

    Science.gov (United States)

    Sanchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2014-03-01

    We demonstrate a task-based assessment of image quality in dedicated breast CT in order to optimize the number of projection views acquired. The methodology we employ is based on the Hotelling Observer (HO) and its associated metrics. We consider two tasks: the Rayleigh task of discerning between two resolvable objects and a single larger object, and the signal detection task of classifying an image as belonging to either a signal-present or signal-absent hypothesis. HO SNR values are computed for 50, 100, 200, 500, and 1000 projection view images, with the total imaging radiation dose held constant. We use the conventional fan-beam FBP algorithm and investigate the effect of varying the width of a Hanning window used in the reconstruction, since this affects both the noise properties of the image and the under-sampling artifacts which can arise in the case of sparse-view acquisitions. Our results demonstrate that fewer projection views should be used in order to increase HO performance, which in this case constitutes an upper bound on human observer performance. However, the impact on HO SNR of using fewer projection views, each with a higher dose, is not as significant as the impact of employing regularization in the FBP reconstruction through a Hanning filter.
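    The HO metric itself is compact: with mean class difference Δs̄ and noise covariance K, the observer template is w = K⁻¹Δs̄ and SNR² = Δs̄ᵀK⁻¹Δs̄. A toy sketch of the signal-detection task, with dimensions far smaller than a real breast-CT region of interest and all numbers hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny illustrative detection task: signal-present vs signal-absent images.
n_pix, n_train = 16, 4000
signal = np.zeros(n_pix)
signal[6:10] = 1.0                                   # known signal profile

def draw(n):
    e = rng.normal(0.0, 1.0, (n, n_pix))
    return e + 0.5 * np.roll(e, 1, axis=1)           # mildly correlated noise

g_absent = draw(n_train)
g_present = draw(n_train) + signal

# Hotelling observer: template w = K^-1 ds, SNR^2 = ds^T K^-1 ds,
# with ds the mean class difference and K the pooled noise covariance.
ds = g_present.mean(axis=0) - g_absent.mean(axis=0)
K = 0.5 * (np.cov(g_absent.T) + np.cov(g_present.T))
w = np.linalg.solve(K, ds)
snr = float(np.sqrt(ds @ w))
print(f"Hotelling observer SNR: {snr:.2f}")
```

    In the study itself, the covariance would come from reconstructed FBP images at each view count, so the SNR reflects both noise correlations and sparse-view artifacts.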

  17. Drinking and smoking patterns during pregnancy: Development of group-based trajectories in the Safe Passage Study.

    Science.gov (United States)

    Dukes, Kimberly; Tripp, Tara; Willinger, Marian; Odendaal, Hein; Elliott, Amy J; Kinney, Hannah C; Robinson, Fay; Petersen, Julie M; Raffo, Cheryl; Hereld, Dale; Groenewald, Coen; Angal, Jyoti; Hankins, Gary; Burd, Larry; Fifer, William P; Myers, Michael M; Hoffman, Howard J; Sullivan, Lisa

    2017-08-01

    Precise identification of drinking and smoking patterns during pregnancy is crucial to better understand the risk to the fetus. The purpose of this manuscript is to describe the methodological approach used to define prenatal drinking and smoking trajectories from a large prospective pregnancy cohort, and to describe maternal characteristics associated with different exposure patterns. In the Safe Passage Study, detailed information regarding quantity, frequency, and timing of exposure was self-reported up to four times during pregnancy and at 1 month post-delivery. Exposure trajectories were developed using data from 11,692 pregnancies (9912 women) where pregnancy outcome was known. Women were from three diverse populations: white (23%) and American Indian (17%) in the Northern Plains, US, and mixed ancestry (59%) in South Africa (other/not specified [1%]). Group-based trajectory modeling was used to identify 5 unique drinking trajectories (1 none/minimal, 2 quitting groups, 2 continuous groups) and 7 smoking trajectories (1 none/minimal, 2 quitting groups, 4 continuous groups). Women with pregnancies assigned to the low- or high-continuous drinking groups were less likely to have completed high school and were more likely to have enrolled in the study in the third trimester, be of mixed ancestry, or be depressed than those assigned to the none/minimal or quit-drinking groups. Results were similar when comparing continuous smokers to none/minimal and quit-smoking groups. Further, women classified as high- or low-continuous drinkers were more likely to smoke at moderate-, high-, and very high-continuous levels, as compared to women classified as non-drinkers and quitters. This is the first study of this size to utilize group-based trajectory modeling to identify unique prenatal drinking and smoking trajectories. These trajectories will be used in future analyses to determine which specific exposure patterns subsequently manifest as poor peri- and postnatal outcomes.
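    Group-based trajectory modeling proper fits a finite mixture of polynomial trajectories by maximum likelihood (e.g. censored-normal models in PROC TRAJ); the sketch below only caricatures that idea by clustering per-subject linear fits of synthetic drinking reports. All values and group shapes are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic drinks-per-week reports at 4 prenatal visits for 300 pregnancies,
# generated from 3 latent patterns (hypothetical, loosely echoing the paper's
# none/minimal, quitting, and continuous groups).
t = np.array([8.0, 16.0, 24.0, 32.0])               # gestational weeks
patterns = {"none": [0, 0, 0, 0], "quit": [6, 3, 0, 0], "cont": [5, 5, 6, 5]}
labels = rng.choice(list(patterns), size=300, p=[0.6, 0.25, 0.15])
Y = np.array([patterns[k] for k in labels]) + rng.normal(0, 0.7, (300, 4))

# Stand-in for mixture fitting: k-means on per-subject (slope, intercept).
X = np.array([np.polyfit(t, y, 1) for y in Y])
k = 3
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(50):
    assign = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[assign == j].mean(0) if np.any(assign == j)
                        else centers[j] for j in range(k)])

for j in range(k):
    print(f"group {j}: n={np.sum(assign == j):3d}, "
          f"mean reported drinks {Y[assign == j].mean(0).round(1)}")
```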

  18. SPECTRAL RECONSTRUCTION BASED ON SVM FOR CROSS CALIBRATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2017-05-01

    Full Text Available The Chinese HY-1C/1D satellites will use a 5 nm/10 nm resolution visible–near-infrared (VNIR) hyperspectral sensor with a solar calibrator to cross-calibrate with other sensors. The hyperspectral radiance data are composed of average radiances in the sensor’s passbands and bear a spectral smoothing effect, so a transform from the hyperspectral radiance data to 1 nm resolution apparent spectral radiance by spectral reconstruction needs to be implemented. To solve the problem of noise accumulation and deterioration after several iterations of the iterative algorithm, a novel regression method based on SVM is proposed, which can closely approximate arbitrarily complex non-linear relationships and provides better generalization capability through learning. From a systems viewpoint, the relationship between the apparent radiance and the equivalent radiance is a nonlinear mapping introduced by the spectral response function (SRF); the SVM transforms the low-dimensional non-linear problem into a high-dimensional linear one through a kernel function, obtaining the global optimal solution by virtue of its quadratic form. The experiment is performed using 6S-simulated spectra that account for the SRF and SNR of the hyperspectral sensor, measured reflectance spectra of water bodies, and different atmospheric conditions. The comparative results show: first, the proposed method achieves higher reconstruction accuracy, especially for high-frequency signals; second, as the spectral resolution of the hyperspectral sensor decreases, the proposed method performs better than the iterative method; finally, the root mean square relative error (RMSRE), which is used to evaluate the difference between the reconstructed spectrum and the real spectrum over the whole spectral range, is reduced by at least a factor of two by the proposed method.
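    The regression step can be illustrated with a kernel method on simulated band-averaged spectra. Kernel ridge regression is used below as a compact stand-in for ε-SVR (both fit the fine-resolution spectrum through an RBF kernel over the band space); the wavelength grid, SRFs, and spectra are all synthetic, not 6S output:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy setup on a normalized wavelength grid: recover a fine-resolution
# spectrum from band-averaged radiances (all shapes and figures hypothetical).
wl = np.linspace(0.0, 1.0, 100)                      # fine "1 nm style" grid
centers = np.linspace(0.05, 0.95, 20)                # 20 instrument bands
SRF = np.exp(-0.5 * ((wl[None] - centers[:, None]) / 0.03) ** 2)
SRF /= SRF.sum(axis=1, keepdims=True)                # band spectral responses

def random_spectra(n):
    """Smooth random spectra from a few sinusoids (stand-in for 6S output)."""
    a = rng.normal(0, 1, (n, 4))
    freqs = np.array([1.0, 2.0, 3.0, 5.0]) * np.pi
    return 1.0 + 0.1 * a @ np.sin(np.outer(freqs, wl))

S_train, S_test = random_spectra(200), random_spectra(20)
X_train, X_test = S_train @ SRF.T, S_test @ SRF.T    # band-averaged radiances

# Kernel ridge regression with an RBF kernel as a stand-in for epsilon-SVR.
def rbf(A, B, gamma=0.5):
    return np.exp(-gamma * ((A[:, None] - B[None]) ** 2).sum(-1))

K = rbf(X_train, X_train)
alpha = np.linalg.solve(K + 1e-4 * np.eye(len(K)), S_train)
S_hat = rbf(X_test, X_train) @ alpha

rmsre = np.sqrt((((S_hat - S_test) / S_test) ** 2).mean())
print(f"RMSRE on held-out spectra: {rmsre:.3f}")
```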

  19. Model-based image reconstruction for four-dimensional PET

    International Nuclear Information System (INIS)

    Li Tianfang; Thorndyke, Brian; Schreibmann, Eduard; Yang Yong; Xing Lei

    2006-01-01

    Positron emission tomography (PET) is useful in diagnosis and radiation treatment planning for a variety of cancers. For patients with cancers in the thoracic or upper abdominal regions, respiratory motion produces large distortions in tumor shape and size, affecting the accuracy of both diagnosis and treatment. Four-dimensional (4D) (gated) PET aims to reduce motion artifacts and to provide accurate measurement of the tumor volume and tracer concentration. A major issue in 4D PET is the lack of statistics: since the collected photons are divided into several frames in a 4D PET scan, the quality of each reconstructed frame degrades as the number of frames increases. The increased noise in each frame heavily degrades the quantitative accuracy of PET imaging. In this work, we propose a method to enhance the performance of 4D PET by developing a new technique of 4D PET reconstruction that incorporates an organ motion model derived from 4D-CT images. The method is based on the well-known maximum-likelihood expectation-maximization (ML-EM) algorithm. During the forward- and backward-projection processes of the ML-EM iterations, all projection data acquired at different phases are combined to update the emission map with the aid of the deformable model, so the statistics are greatly improved. The proposed algorithm was first evaluated with computer simulations using a mathematical dynamic phantom. An experiment with a moving physical phantom was then carried out to demonstrate the accuracy of the proposed method and the increase in signal-to-noise ratio over three-dimensional PET. Finally, the 4D PET reconstruction was applied to a patient case.
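    The ML-EM backbone the paper extends is the multiplicative update x ← x · [Aᵀ(y / Ax)] / Aᵀ1. A toy sketch on a tiny, hypothetical system matrix, without the 4D motion model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Tiny illustrative system: an 8-pixel "organ" imaged through a known
# system matrix A (detector bins x image pixels); all values hypothetical.
n_pix, n_det = 8, 12
A = rng.uniform(0.1, 1.0, (n_det, n_pix))
x_true = np.zeros(n_pix)
x_true[2:5] = [4.0, 10.0, 6.0]                  # emission map with a "tumor"

y = rng.poisson(A @ x_true)                     # measured Poisson counts

# ML-EM: x <- x * A^T (y / (A x)) / A^T 1 -- the same multiplicative update
# the paper extends by pooling phase-sorted projections via a motion model.
x = np.ones(n_pix)
sens = A.sum(axis=0)                            # sensitivity image, A^T 1
for _ in range(100):
    x *= (A.T @ (y / (A @ x))) / sens

print("true :", x_true.round(1))
print("ML-EM:", x.round(1))
```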

  20. Integrating the Base of Aircraft Data (BADA) in CTAS Trajectory Synthesizer

    Science.gov (United States)

    Abramson, Michael; Ali, Kareem

    2012-01-01

    The Center-Terminal Radar Approach Control (TRACON) Automation System (CTAS), developed at NASA Ames Research Center for assisting controllers in the management and control of air traffic in the extended terminal area, supports the modeling of more than four hundred aircraft types. However, 90% of them are supported indirectly by mapping them to one of a relatively few aircraft types for which CTAS has detailed drag and engine thrust models. On the other hand, the Base of Aircraft Data (BADA), developed and maintained by Eurocontrol, supports more than 300 aircraft types, about one third of which are directly supported, i.e. they have validated performance data. All these data were made available for CTAS by integrating BADA version 3.8 into CTAS Trajectory Synthesizer (TS). Several validation tools were developed and used to validate the integrated code and to evaluate the accuracy of trajectory predictions generated using CTAS "native" and BADA Aircraft Performance Models (APM) comparing them with radar track data. Results of these comparisons indicate that the two models have different strengths and weaknesses. The BADA APM can improve the accuracy of CTAS predictions at least for some aircraft types, especially small aircraft, and for some flight phases, especially climb.

  1. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach.

    Directory of Open Access Journals (Sweden)

    Hyunseok Park

    Full Text Available The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous main path analysis approaches have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations: they have a high potential to miss dominant patents from the identified main paths, and the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from high-persistence patents, which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of the dominantly important patents.
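    Main path extraction can be illustrated with the classic search path count (SPC) edge weight, used below as a simple stand-in for the paper's genetic knowledge-persistence score, on a toy citation DAG:

```python
from collections import defaultdict
from functools import lru_cache

# Toy patent citation DAG: edge (a, b) means later patent b builds on a.
edges = [("p1", "p3"), ("p2", "p3"), ("p3", "p4"), ("p3", "p5"),
         ("p4", "p6"), ("p5", "p6"), ("p2", "p5"), ("p6", "p7")]

succ, pred = defaultdict(list), defaultdict(list)
for a, b in edges:
    succ[a].append(b)
    pred[b].append(a)

@lru_cache(maxsize=None)
def paths_from_source(n):          # number of source-to-n citation paths
    return 1 if not pred[n] else sum(paths_from_source(p) for p in pred[n])

@lru_cache(maxsize=None)
def paths_to_sink(n):              # number of n-to-sink citation paths
    return 1 if not succ[n] else sum(paths_to_sink(s) for s in succ[n])

# Search path count (SPC) per edge -- a classic main-path weight, standing
# in here for the paper's genetic knowledge-persistence measure.
spc = {(a, b): paths_from_source(a) * paths_to_sink(b) for a, b in edges}

# Greedy forward traversal from the highest-weight source edge.
start_edges = [(a, b) for a, b in edges if not pred[a]]
path = [max(start_edges, key=spc.get)]
while succ[path[-1][1]]:
    tail = path[-1][1]
    path.append(max(((tail, s) for s in succ[tail]), key=spc.get))

main_path = [path[0][0]] + [b for _, b in path]
print("edge SPC weights:", spc)
print("main path:", " -> ".join(main_path))
```

    The paper's variant instead scores individual patents by persistence and then traces backward and forward from the high-persistence nodes, which is what keeps its paths sparse.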

  2. The didactic situation in geometry learning based on analysis of learning obstacles and learning trajectory

    Science.gov (United States)

    Sulistyowati, Fitria; Budiyono, Slamet, Isnandar

    2017-12-01

    This study aims to design a didactic situation based on an analysis of learning obstacles and the learning trajectory for prism volume. This is a qualitative and quantitative study with the following steps: analyzing the learning obstacles and learning trajectory, preparing the didactic situation, applying the didactic situation in the classroom, and testing the mean difference in problem-solving ability with a t-test. The subjects of the study were 8th grade junior high school students in Magelang in 2016/2017, selected randomly from eight existing classes. The result of this research is a design of didactic situations that can be implemented in prism volume learning. The effectiveness of the designed didactic situations is shown by the mean difference test: students’ problem-solving ability after the application of the didactic situation was better than before. The resulting didactic situation is expected to help teachers design lessons that match the character of learners, classrooms and the teachers themselves, so that learners’ potential thinking can be optimized and the accumulation of learning obstacles avoided.

  3. Trajectory-based nonadiabatic dynamics with time-dependent density functional theory.

    Science.gov (United States)

    Curchod, Basile F E; Rothlisberger, Ursula; Tavernelli, Ivano

    2013-05-10

    Understanding the fate of an electronically excited molecule constitutes an important task for theoretical chemistry, and practical implications range from the interpretation of atto- and femtosecond spectroscopy to the development of light-driven molecular machines, the control of photochemical reactions, and the possibility of capturing sunlight energy. However, many challenging conceptual and technical problems are involved in the description of these phenomena such as 1) the failure of the well-known Born-Oppenheimer approximation; 2) the need for accurate electronic properties such as potential energy surfaces, excited nuclear forces, or nonadiabatic coupling terms; and 3) the necessity of describing the dynamics of the photoexcited nuclear wavepacket. This review provides an overview of the current methods to address points 1) and 3) and shows how time-dependent density functional theory (TDDFT) and its linear-response extension can be used for point 2). First, the derivation of Ehrenfest dynamics and nonadiabatic Bohmian dynamics is discussed and linked to Tully's trajectory surface hopping. Second, the coupling of these trajectory-based nonadiabatic schemes with TDDFT is described in detail with special emphasis on the derivation of the required electronic structure properties. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Improving head and neck CTA with hybrid and model-based iterative reconstruction techniques

    NARCIS (Netherlands)

    Niesten, J. M.; van der Schaaf, I. C.; Vos, P. C.; Willemink, MJ; Velthuis, B. K.

    2015-01-01

    AIM: To compare image quality of head and neck computed tomography angiography (CTA) reconstructed with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and model-based iterative reconstruction (MIR) algorithms. MATERIALS AND METHODS: The raw data of 34 studies were

  5. Low Emissions and Delay Optimization for an Isolated Signalized Intersection Based on Vehicular Trajectories.

    Directory of Open Access Journals (Sweden)

    Ciyun Lin

    Full Text Available A traditional traffic signal control system is established based on vehicular delay, queue length, saturation and other indicators. However, due to the increasing severity of urban environmental pollution issues and the development of a resource-saving and environmentally friendly social philosophy, the development of low-carbon and energy-efficient urban transport is required. This paper first defines vehicular trajectories and the calculation of vehicular emissions based on vehicle specific power (VSP). Next, a regression analysis method is used to quantify the relationship between vehicular emissions and delay, and a traffic signal control model is established to reduce emissions and delay using the enumeration method combined with saturation constraints. Finally, one typical intersection in Changchun is selected to verify the model proposed in this paper; its performance is also compared through simulations in VISSIM. The results of this study show that the proposed model can significantly reduce vehicle delay and traffic emissions simultaneously.
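    The enumeration idea, searching cycle and green times under a saturation constraint to minimize a weighted delay-plus-emissions objective, can be sketched with Webster's uniform-delay term and a delay-proportional emission proxy. All demand figures, weights, and constants below are hypothetical, not the paper's Changchun data:

```python
# Illustrative two-phase timing search (all parameters hypothetical).
flows = {"NS": 900, "EW": 600}        # demand, veh/h per phase
sat_flow = 1800                       # saturation flow, veh/h per phase
lost_time = 8                         # lost time, s per cycle
idle_emission = 0.02                  # g per veh-s of delay (emission proxy)
w_delay, w_emis = 1.0, 10.0           # objective weights

def phase_delay(cycle, green, q):
    """Webster uniform delay (s/veh); None when the phase is oversaturated."""
    lam = green / cycle
    x = q / (sat_flow * lam)          # degree of saturation
    if x >= 0.95:                     # saturation constraint
        return None
    return 0.5 * cycle * (1 - lam) ** 2 / (1 - lam * x)

best = None
for cycle in range(40, 121, 5):                       # enumeration over cycles
    for g_ns in range(10, cycle - lost_time - 9):     # and green splits
        g_ew = cycle - lost_time - g_ns
        d = [phase_delay(cycle, g, flows[p])
             for p, g in [("NS", g_ns), ("EW", g_ew)]]
        if None in d:
            continue
        veh = sum(flows.values())
        mean_delay = (d[0] * flows["NS"] + d[1] * flows["EW"]) / veh
        emissions = idle_emission * mean_delay        # g/veh, delay-linked
        cost = w_delay * mean_delay + w_emis * emissions
        if best is None or cost < best[0]:
            best = (cost, cycle, g_ns, g_ew, mean_delay)

cost, cycle, g_ns, g_ew, delay = best
print(f"cycle={cycle}s, green NS/EW={g_ns}/{g_ew}s, mean delay={delay:.1f} s/veh")
```

    The paper's regression-fitted emission-delay relationship would replace the linear proxy used here.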

  6. Low Emissions and Delay Optimization for an Isolated Signalized Intersection Based on Vehicular Trajectories.

    Science.gov (United States)

    Lin, Ciyun; Gong, Bowen; Qu, Xin

    2015-01-01

    A traditional traffic signal control system is established based on vehicular delay, queue length, saturation and other indicators. However, due to the increasing severity of urban environmental pollution issues and the development of a resource-saving and environmentally friendly social philosophy, the development of low-carbon and energy-efficient urban transport is required. This paper first defines vehicular trajectories and the calculation of vehicular emissions based on vehicle specific power (VSP). Next, a regression analysis method is used to quantify the relationship between vehicular emissions and delay, and a traffic signal control model is established to reduce emissions and delay using the enumeration method combined with saturation constraints. Finally, one typical intersection in Changchun is selected to verify the model proposed in this paper; its performance is also compared through simulations in VISSIM. The results of this study show that the proposed model can significantly reduce vehicle delay and traffic emissions simultaneously.

  7. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    Science.gov (United States)

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  8. Structured Light-Based 3D Reconstruction System for Plants.

    Science.gov (United States)

    Nguyen, Thuy Tuong; Slaughter, David C; Max, Nelson; Maloof, Julin N; Sinha, Neelima

    2015-07-29

    Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends in recent years. Many systems have been built to model different real-world subjects, but there is a lack of a completely robust system for plants. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud registration and plant feature measurement). This paper demonstrates the ability to produce 3D models of whole plants created from multiple pairs of stereo images taken at different viewing angles, without the need to destructively cut away any parts of a plant. The ability to accurately predict phenotyping features, such as the number of leaves, plant height, leaf size and internode distances, is also demonstrated. Experimental results show that, for plants having a range of leaf sizes and a distance between leaves appropriate for the hardware design, the algorithms successfully predict phenotyping features in the target crops, with a recall of 0.97 and a precision of 0.89 for leaf detection and less than a 13-mm error for plant size, leaf size and internode distance.

  9. Structured Light-Based 3D Reconstruction System for Plants

    Directory of Open Access Journals (Sweden)

    Thuy Tuong Nguyen

    2015-07-01

    Full Text Available Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends in recent years. Many systems have been built to model different real-world subjects, but there is a lack of a completely robust system for plants. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud registration and plant feature measurement). This paper demonstrates the ability to produce 3D models of whole plants created from multiple pairs of stereo images taken at different viewing angles, without the need to destructively cut away any parts of a plant. The ability to accurately predict phenotyping features, such as the number of leaves, plant height, leaf size and internode distances, is also demonstrated. Experimental results show that, for plants having a range of leaf sizes and a distance between leaves appropriate for the hardware design, the algorithms successfully predict phenotyping features in the target crops, with a recall of 0.97 and a precision of 0.89 for leaf detection and less than a 13-mm error for plant size, leaf size and internode distance.

  10. Introducing the fit-criteria assessment plot - A visualisation tool to assist class enumeration in group-based trajectory modelling.

    Science.gov (United States)

    Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria

    2017-10-01

    Background and objective Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal changes. Despite its manifold applications in clinical and health sciences, potential problems of the model selection procedure are often overlooked. The choice of the number of latent trajectories (class-enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe. Moreover, the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria, the fit-criteria assessment plot. Methods An R-code that accepts universal data input is presented. The programme condenses relevant group-based trajectory modelling output information of model fit indices in automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate fit-criteria assessment plot's utility. Results Fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. Fit-criteria assessment plot does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions Fit-criteria assessment plot is an exploratory, visualisation tool that can be employed to assist decisions in the initial and decisive phase of group-based trajectory modelling analysis. Considering group-based trajectory modelling's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster group-based trajectory modelling's adequate use.

  11. A Distributed Resilient Autonomous Framework for Manned/Unmanned Trajectory-Based Operations, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Resilient Ops, working in collaboration with Metron Aviation, Inc., proposes to develop a prototype system for planning Unmanned Aircraft Systems (UAS) trajectories...

  12. A Red-Light Running Prevention System Based on Artificial Neural Network and Vehicle Trajectory Data

    Directory of Open Access Journals (Sweden)

    Pengfei Li

    2014-01-01

    Full Text Available The high frequency of red-light running and complex driving behaviors at the yellow onset at intersections cannot be explained solely by the dilemma zone and vehicle kinematics. In this paper, the authors present a red-light running prevention system based on artificial neural networks (ANNs), which approximate the complex driver behaviors during the yellow and all-red clearance intervals and serve as the basis of the innovative system. The artificial neural network and vehicle trajectory data are applied to identify potential red-light runners. The ANN training time was acceptable and its prediction accuracy was over 80%. Lastly, a prototype red-light running prevention system with the trained ANN model is described. This new system can be directly retrofitted into existing traffic signal systems.
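    The classification step can be sketched with a minimal one-hidden-layer network in place of the paper's ANN. The yellow-onset features (speed and distance to the stop line), the kinematic labeling rule, and all thresholds below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic yellow-onset samples: approach speed (m/s) and distance to the
# stop line (m). A vehicle is labeled a potential runner when it can neither
# stop comfortably nor clear before red (toy labels; thresholds illustrative).
n = 2000
speed = rng.uniform(5, 20, n)
dist = rng.uniform(5, 80, n)
can_stop = dist > speed**2 / (2 * 3.0)        # 3 m/s^2 comfortable braking
can_clear = dist < speed * 2.0                # 2 s yellow interval
y = (~can_stop & ~can_clear).astype(float)

X = np.column_stack([speed, dist])
X = (X - X.mean(0)) / X.std(0)                # standardize features

# Minimal one-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))      # predicted runner probability
    g = (p - y[:, None]) / n                  # cross-entropy output gradient
    W2 -= lr * h.T @ g; b2 -= lr * g.sum(0)
    gh = (g @ W2.T) * (1 - h**2)              # backprop through tanh layer
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

acc = ((p[:, 0] > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2%}")
```

    A deployed system would instead feed trajectories measured upstream of the stop line and act on the signal controller when a runner is predicted.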

  15. Multi-criteria ACO-based Algorithm for Ship’s Trajectory Planning

    Directory of Open Access Journals (Sweden)

    Agnieszka Lazarowska

    2017-03-01

    Full Text Available The paper presents a new approach for solving a path planning problem for ships in the environment with static and dynamic obstacles. The algorithm utilizes a heuristic method, classified to the group of Swarm Intelligence approaches, called the Ant Colony Optimization. The method is inspired by a collective behaviour of ant colonies. A group of agents - artificial ants searches through the solution space in order to find a safe, optimal trajectory for a ship. The problem is considered as a multi-criteria optimization task. The criteria taken into account during problem solving are: path safety, path length, the International Regulations for Preventing Collisions at Sea (COLREGs compliance and path smoothness. The paper includes the description of the new multi-criteria ACO-based algorithm along with the presentation and discussion of simulation tests results.
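The selection rule and pheromone update of an ACO planner can be sketched numerically. The toy Python example below (waypoint graph, lengths, and collision-risk values all invented) folds two criteria, path length and collision risk, into a single cost, a simplification of the paper's multi-criteria treatment:

```python
import random

random.seed(1)

# Waypoint graph: edge -> (length_nm, collision_risk in [0, 1]).
edges = {
    ("S", "A"): (4.0, 0.1), ("S", "B"): (3.0, 0.6),
    ("A", "G"): (5.0, 0.1), ("B", "G"): (4.0, 0.5),
    ("A", "B"): (1.0, 0.2), ("B", "A"): (1.0, 0.2),
}
W_RISK = 10.0  # weight turning risk into an equivalent length penalty

def cost(e):
    length, risk = edges[e]
    return length + W_RISK * risk

nbrs = {}
for (u, v) in edges:
    nbrs.setdefault(u, []).append(v)

tau = {e: 1.0 for e in edges}          # pheromone per edge
best_path, best_cost = None, float("inf")
for _ in range(200):                    # one artificial ant per iteration
    node, path, visited = "S", [], {"S"}
    while node != "G":
        options = [v for v in nbrs.get(node, []) if v not in visited]
        if not options:                 # dead end: discard this ant
            path = None
            break
        weights = [tau[(node, v)] / cost((node, v)) for v in options]
        nxt = random.choices(options, weights)[0]
        path.append((node, nxt))
        visited.add(nxt)
        node = nxt
    if path is None:
        continue
    c = sum(cost(e) for e in path)
    if c < best_cost:
        best_path, best_cost = path, c
    for e in tau:                       # evaporation
        tau[e] *= 0.9
    for e in path:                      # deposit, inverse to path cost
        tau[e] += 10.0 / c

print(best_path, round(best_cost, 1))
```

The safe-but-longer route S-A-G (cost 11) beats the short-but-risky S-B-G (cost 18) once risk is priced in, which is the essence of a multi-criteria trajectory search.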

  16. The roles of lesson study in the development of mathematics learning instrument based on learning trajectory

    Science.gov (United States)

    Misnasanti; Dien, C. A.; Azizah, F.

    2018-03-01

    This study aims to describe Lesson Study (LS) activity and its roles in the development of mathematics learning instruments based on a Learning Trajectory (LT). It is a narrative study of teachers' experiences in joining LS activity. Data were collected through three methods: observation, documentation, and in-depth interviews. The collected data were analyzed with Miles and Huberman's model, which consists of reduction, display, and verification. The results show that through LS activity, teachers learn more about how students think. Teachers can also revise their mathematics learning instruments in the form of lesson plans. This means that LS activity is important for building better learning instruments that focus on how students learn, not on how teachers teach.

  17. Long range trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Allen, P. W.; Jessup, E. A.; White, R. E. [Air Resources Field Research Office, Las Vegas, Nevada (United States)

    1967-07-01

    A single air molecule can have a trajectory that can be described with a line, but most meteorologists use single lines to represent the trajectories of air parcels. A single line trajectory has the disadvantage that it is a categorical description of position. Like categorized forecasts it provides no qualification, and no provision for dispersion in case the parcel contains two or more molecules which may take vastly different paths. Diffusion technology has amply demonstrated that an initial aerosol cloud or volume of gas in the atmosphere not only grows larger, but sometimes divides into puffs, each having a different path or swath. Yet, the average meteorologist, faced with the problem of predicting the future motion of a cloud, usually falls back on the line trajectory approach with the explanation that he had no better tool for long range application. In his more rational moments, he may use some arbitrary device to spread his cloud with distance. One such technique has been to separate the trajectory into two or more trajectories, spaced about the endpoint of the original trajectory after a short period of travel, repeating this every so often like a chain reaction. This has the obvious disadvantage of involving a large amount of labor without much assurance of improved accuracy. Another approach is to draw a circle about the trajectory endpoint, to represent either diffusion or error. The problem then is to know what radius to give the circle and also whether to call it diffusion or error. Meteorologists at the Nevada Test Site (NTS) are asked frequently to provide advice which involves trajectory technology, such as prediction of an aerosol cloud path, reconstruction of the motion of a volume of air, indication of the dilution, and the possible trajectory prediction error over great distances. Therefore, we set out, nearly three years ago, to provide some statistical knowledge about the status of our trajectory technology. This report contains some of the

  18. Direct Reconstruction of CT-based Attenuation Correction Images for PET with Cluster-Based Penalties

    Science.gov (United States)

    Kim, Soo Mee; Alessio, Adam M.; De Man, Bruno; Asma, Evren; Kinahan, Paul E.

    2015-01-01

    Extremely low-dose CT acquisitions for the purpose of PET attenuation correction will have a high level of noise and biasing artifacts due to factors such as photon starvation. This work explores a priori knowledge appropriate for CT iterative image reconstruction for PET attenuation correction. We investigate the maximum a posteriori (MAP) framework with cluster-based, multinomial priors for the direct reconstruction of the PET attenuation map. The objective function for direct iterative attenuation map reconstruction was modeled as a Poisson log-likelihood with prior terms consisting of quadratic (Q) and mixture (M) distributions. The attenuation map is assumed to have values in 4 clusters: air+background, lung, soft tissue, and bone. Under this assumption, the mixture prior was a probability density function consisting of one exponential and three Gaussian distributions. The relative proportion of each cluster was jointly estimated during each voxel update of the direct iterative coordinate descent (dICD) method. Noise-free data were generated from the NCAT phantom and Poisson noise was added. Reconstruction with FBP (ramp filter) was performed on the noise-free (ground truth) and noisy data. For the noisy data, dICD reconstruction was performed with combinations of different prior strength parameters (β and γ) for the Q- and M-penalties. The combined quadratic and mixture penalties reduce the RMSE by 18.7% compared to post-smoothed iterative reconstruction and only 0.7% compared to quadratic alone. For direct PET attenuation map reconstruction from ultra-low dose CT acquisitions, the combination of quadratic and mixture priors offers regularization of both variance and bias and is a potential method to derive attenuation maps with negligible patient dose. However, the small improvement in quantitative accuracy relative to the substantial increase in algorithm complexity does not currently justify the use of mixture-based PET attenuation priors for reconstruction of CT

  19. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    Science.gov (United States)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    Combined with digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems in azimuth show great promise for high-resolution and wide-swath imaging, but conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to acquire the multichannel SAR processing filter. This novel method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
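The two steps named in the abstract, a Capon spectrum scan followed by reconstruction of the interference-plus-noise covariance, can be sketched numerically for a toy two-channel array. All powers, angles, and sector bounds below are invented; this shows the generic covariance-reconstruction idea, not the authors' exact algorithm:

```python
import cmath
import math

def mat_inv2(M):
    """Inverse of a 2x2 complex matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_vec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def outer(v, p):
    """p * v v^H for a 2-vector."""
    return [[p * v[i] * v[j].conjugate() for j in range(2)] for i in range(2)]

def madd(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def steer(theta):
    """Steering vector of a 2-element half-wavelength array."""
    return [1.0 + 0j, cmath.exp(-1j * math.pi * math.sin(theta))]

def capon_power(a, Rinv):
    """Capon spatial spectrum value 1 / (a^H R^-1 a)."""
    Ra = mat_vec(Rinv, a)
    q = (a[0].conjugate() * Ra[0] + a[1].conjugate() * Ra[1]).real
    return 1.0 / q

# True covariance: desired signal at 0 rad, strong interferer at 0.5 rad.
a0, ai = steer(0.0), steer(0.5)
R = madd(madd(outer(a0, 10.0), outer(ai, 50.0)),
         [[1.0 + 0j, 0j], [0j, 1.0 + 0j]])
Rinv = mat_inv2(R)

# Rebuild the interference-plus-noise covariance by integrating the
# Capon spectrum over all directions outside an assumed signal sector.
Rin = [[0.05 + 0j, 0j], [0j, 0.05 + 0j]]    # small diagonal loading
n = 90
for k in range(n):
    th = -math.pi / 2 + math.pi * (k + 0.5) / n
    if abs(th) < 0.2:                        # exclude the signal sector
        continue
    a = steer(th)
    Rin = madd(Rin, outer(a, capon_power(a, Rinv) * math.pi / n))

w = mat_vec(mat_inv2(Rin), a0)               # adaptive (MVDR-type) weights
resp_sig = abs(w[0].conjugate() * a0[0] + w[1].conjugate() * a0[1])
resp_int = abs(w[0].conjugate() * ai[0] + w[1].conjugate() * ai[1])
print(round(resp_sig, 3), round(resp_int, 3))
```

Because the look direction is excluded from the integration, the rebuilt covariance contains interference and noise but little signal, which is what makes the resulting filter robust when the scene (here, the scattering coefficient) is nonuniform.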

  20. Trajectories of Depressive Symptoms Among Web-Based Health Risk Assessment Participants.

    Science.gov (United States)

    Bedrosian, Richard; Hawrilenko, Matt; Cole-Lewis, Heather

    2017-03-31

    Health risk assessments (HRAs), which often screen for depressive symptoms, are administered to millions of employees and health plan members each year. HRA data provide an opportunity to examine longitudinal trends in depressive symptomatology, as researchers have done previously with other populations. The primary research questions were: (1) Can we observe longitudinal trajectories in HRA populations like those observed in other study samples? (2) Do HRA variables, which primarily reflect modifiable health risks, help us to identify predictors associated with these trajectories? (3) Can we make meaningful recommendations for population health management, applicable to HRA participants, based on predictors we identify? This study used growth mixture modeling (GMM) to examine longitudinal trends in depressive symptomatology among 22,963 participants in a Web-based HRA used by US employers and health plans. The HRA assessed modifiable health risks and variables such as stress, sleep, and quality of life. Five classes were identified: a "minimal depression" class (63.91%, 14,676/22,963) whose scores were consistently low across time, a "low risk" class (19.89%, 4568/22,963) whose condition remained subthreshold, a "deteriorating" class (3.15%, 705/22,963) who began at subthreshold but approached severe depression by the end of the study, a "chronic" class (4.71%, 1081/22,963) who remained highly depressed over time, and a "remitting" class (8.42%, 1933/22,963) who had moderate depression to start, but crossed into minimal depression by the end. Among those with subthreshold symptoms, individuals who were male (P… ©Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.

  1. Efficient parsimony-based methods for phylogenetic network reconstruction.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2007-01-15

    Phylogenies--the evolutionary histories of groups of organisms--play a major role in representing relationships among biological entities. Although many biological processes can be effectively modeled as tree-like relationships, others, such as hybrid speciation and horizontal gene transfer (HGT), result in networks, rather than trees, of relationships. Hybrid speciation is a significant evolutionary mechanism in plants, fish and other groups of species. HGT plays a major role in bacterial genome diversification and is a significant mechanism by which bacteria develop resistance to antibiotics. Maximum parsimony is one of the most commonly used criteria for phylogenetic tree inference. Roughly speaking, inference based on this criterion seeks the tree that minimizes the amount of evolution. In 1990, Jotun Hein proposed using this criterion for inferring the evolution of sequences subject to recombination. Preliminary results on small synthetic datasets (Nakhleh et al., 2005) demonstrated the criterion's application to phylogenetic network reconstruction in general and HGT detection in particular. However, the naive algorithms used by the authors are inapplicable to large datasets due to their demanding computational requirements. Further, no rigorous theoretical analysis of computing the criterion was given, nor was it tested on biological data. In the present work we prove that the problem of scoring the parsimony of a phylogenetic network is NP-hard and provide an improved fixed-parameter tractable algorithm for it. Further, we devise efficient heuristics for parsimony-based reconstruction of phylogenetic networks. We test our methods on both synthetic and biological data (rbcL gene in bacteria) and obtain very promising results.
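Parsimony scoring is easy to state for trees; the network case the paper proves NP-hard generalises it. A minimal sketch of Fitch's small-parsimony count on a fixed binary tree (one character, invented states):

```python
def fitch_score(tree, states):
    """Fitch small-parsimony score: the minimum number of character
    changes needed on a fixed binary tree.  Leaves are taxon names,
    internal nodes are 2-tuples; states maps taxon -> character state."""
    changes = 0

    def post(node):
        nonlocal changes
        if isinstance(node, str):           # leaf
            return {states[node]}
        s_left, s_right = post(node[0]), post(node[1])
        if s_left & s_right:                # intersection step: no change
            return s_left & s_right
        changes += 1                        # union step costs one change
        return s_left | s_right

    post(tree)
    return changes

# One aligned site for four taxa on the tree ((A,B),(C,D)).
tree = (("A", "B"), ("C", "D"))
print(fitch_score(tree, {"A": "G", "B": "G", "C": "T", "D": "G"}))  # -> 1
```

Summing this score over all sites gives the tree's parsimony length; on a network, every character may take a different embedded tree, which is where the hardness comes from.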

  2. SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.

    Science.gov (United States)

    Chen, Long; Tang, Wen; John, Nigel W; Wan, Tao Ruan; Zhang, Jian Jun

    2018-05-01

    based on a robust 3D calibration. We demonstrate the clinical relevance of our proposed system through two examples: (a) measurement of the surface; (b) depth cues in monocular endoscopy. The performance and accuracy evaluations of the proposed framework consist of two steps. First, we have created a computer-generated endoscopy simulation video to quantify the accuracy of the camera tracking by comparing the results of the video camera tracking with the recorded ground-truth camera trajectories. The accuracy of the surface reconstruction is assessed by evaluating the Root Mean Square Distance (RMSD) of surface vertices of the reconstructed mesh with that of the ground truth 3D models. An error of 1.24 mm for the camera trajectories has been obtained and the RMSD for surface reconstruction is 2.54 mm, which compare favourably with previous approaches. Second, in vivo laparoscopic videos are used to examine the quality of accurate AR based annotation and measurement, and the creation of depth cues. These results show the potential promise of our geometry-aware AR technology to be used in MIS surgical scenes. The results show that the new framework is robust and accurate in dealing with challenging situations such as the rapid endoscopy camera movements in monocular MIS scenes. Both camera tracking and surface reconstruction based on a sparse point cloud are effective and operated in real-time. This demonstrates the potential of our algorithm for accurate AR localization and depth augmentation with geometric cues and correct surface measurements in MIS with monocular endoscopes. Copyright © 2018 Elsevier B.V. All rights reserved.
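The RMSD surface-accuracy measure used in the evaluation is straightforward once reconstructed and ground-truth vertices are matched; a minimal sketch with invented coordinates:

```python
import math

def rmsd(reconstructed, ground_truth):
    """Root Mean Square Distance between matched 3D vertex pairs (mm)."""
    assert len(reconstructed) == len(ground_truth)
    sq = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(reconstructed, ground_truth):
        sq += (x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2
    return math.sqrt(sq / len(reconstructed))

mesh = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
truth = [(0.0, 0.0, 3.0), (10.0, 4.0, 0.0)]
print(rmsd(mesh, truth))   # sqrt((9 + 16) / 2) ≈ 3.54
```

In practice the reconstructed mesh must first be registered to the ground-truth model (e.g. with ICP) so the vertex pairing is meaningful.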

  3. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  4. Model-based iterative reconstruction for reduction of radiation dose in abdominopelvic CT: comparison to adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2013-12-01

    To evaluate dose reduction and image quality of abdominopelvic computed tomography (CT) reconstructed with model-based iterative reconstruction (MBIR) compared to adaptive statistical iterative reconstruction (ASIR). In this prospective study, 85 patients underwent referential-, low-, and ultralow-dose unenhanced abdominopelvic CT. Images were reconstructed with ASIR for low-dose (L-ASIR) and ultralow-dose CT (UL-ASIR), and with MBIR for ultralow-dose CT (UL-MBIR). Image noise was measured in the abdominal aorta and iliopsoas muscle. Subjective image analyses and a lesion detection study (adrenal nodules) were conducted by two blinded radiologists. A reference standard was established by a consensus panel of two different radiologists using referential-dose CT reconstructed with filtered back projection. Compared to low-dose CT, there was a 63% decrease in dose-length product with ultralow-dose CT. UL-MBIR had significantly lower image noise and fewer streak artifacts than L-ASIR and UL-ASIR, and there were no significant differences between UL-MBIR and L-ASIR in diagnostic acceptability (p>0.65) or diagnostic performance for adrenal nodules (p>0.87). MBIR significantly improves image noise and streak artifacts compared to ASIR, and can achieve radiation dose reduction without severely compromising image quality.

  5. Research on compressive sensing reconstruction algorithm based on total variation model

    Science.gov (United States)

    Gao, Yu-xuan; Sun, Huayan; Zhang, Tinghua; Du, Lin

    2017-12-01

    Compressed sensing, which breaks through the Nyquist sampling theorem, provides a strong theoretical basis for carrying out compressive sampling of image signals. In imaging procedures that use compressed sensing theory, not only can the storage space be reduced, but the demand for detector resolution can also be greatly reduced. Using the sparsity of the image signal and solving the mathematical model of inverse reconstruction, super-resolution imaging can be realised. The reconstruction algorithm is the most critical part of compressive sensing and to a large extent determines the accuracy of the reconstructed image. Reconstruction algorithms based on the total variation (TV) model are well suited to the compressive reconstruction of two-dimensional images and preserve edge information well. To verify the performance of the algorithm, the reconstruction results of the TV-based algorithm are simulated and analysed under different coding modes to verify its stability, and typical reconstruction algorithms are compared in the same coding mode. On the basis of the minimum total variation algorithm, an augmented Lagrangian function term is added and the optimal value is solved by the alternating direction method. Experimental results show that, compared with traditional classical TV-based algorithms, the proposed reconstruction algorithm has great advantages and can quickly and accurately recover the target image at low measurement rates.
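A toy 1D version of TV-regularised recovery shows the behaviour described above, edge preservation with noise suppression. This sketch uses plain subgradient descent rather than the augmented-Lagrangian/alternating-direction method mentioned in the abstract, and the signal is invented:

```python
def tv_denoise_1d(y, lam=0.3, step=0.05, iters=500):
    """Subgradient descent on 0.5*||x - y||^2 + lam * TV(x), where
    TV(x) = sum_i |x[i+1] - x[i]|.  A toy 1D stand-in for the TV model
    used in compressive-sensing image reconstruction."""
    def sign(v):
        return (v > 0) - (v < 0)

    x = list(y)
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, y)]       # data-fidelity term
        for i in range(len(x) - 1):
            s = sign(x[i + 1] - x[i])               # TV subgradient
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Noisy piecewise-constant signal: the edge survives, noise is flattened.
y = [0.1, -0.1, 0.05, 5.1, 4.9, 5.05, 5.0, -0.02]
x = tv_denoise_1d(y)
print([round(v, 2) for v in x])
```

The same objective, with the data term replaced by a measurement-consistency term, is what 2D compressive-sensing reconstruction minimises over images.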

  6. Lunar and interplanetary trajectories

    CERN Document Server

    Biesbroek, Robin

    2016-01-01

    This book provides readers with a clear description of the types of lunar and interplanetary trajectories, and how they influence satellite-system design. The description follows an engineering rather than a mathematical approach and includes many examples of lunar trajectories, based on real missions. It helps readers gain an understanding of the driving subsystems of interplanetary and lunar satellites. The tables and graphs showing features of trajectories make the book easy to understand.

  7. Time-based Reconstruction of Free-streaming Data in CBM

    Science.gov (United States)

    Akishina, Valentina; Kisel, Ivan; Vassiliev, Iouri; Zyzak, Maksym

    2018-02-01

    Traditional latency-limited trigger architectures typical of conventional experiments are inapplicable for the CBM experiment. Instead, CBM will ship and collect time-stamped data into a readout buffer in the form of time-slices of a certain length and deliver them to a large computer farm, where online event reconstruction and selection will be performed. Grouping measurements into physical collisions must be performed in software and requires reconstruction not only in space but also in time, the so-called 4-dimensional track reconstruction and event building. The tracks, reconstructed with the 4D Cellular Automaton track finder, are combined into event-corresponding clusters according to the estimated time at the target position and the errors obtained with the Kalman filter method. The reconstructed events are given as inputs to the KF Particle Finder package for short-lived particle reconstruction. The results of time-based reconstruction of simulated collisions in CBM are presented and discussed in detail.
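The grouping of tracks into event candidates by their estimated time at the target can be illustrated with a toy 1D clustering pass. The times, errors, and 3-sigma gate below are invented; in CBM these estimates come from the Kalman-filter track fit:

```python
def build_events(tracks, n_sigma=3.0):
    """Group tracks into event candidates by their estimated time at
    the target.  tracks: list of (t_ns, sigma_t_ns) pairs.  A toy
    stand-in for the time-based clustering after 4D track finding."""
    events = []
    for t, s in sorted(tracks):                  # time-ordered sweep
        if events:
            t_prev, s_prev = events[-1][-1]
            if abs(t - t_prev) <= n_sigma * (s + s_prev):
                events[-1].append((t, s))        # same collision
                continue
        events.append([(t, s)])                  # open a new candidate
    return events

# Three collisions inside one time slice, tracks with ~1 ns resolution.
tracks = [(100.2, 1.0), (99.8, 1.2), (100.5, 0.9),
          (131.0, 1.0), (130.4, 1.1),
          (250.0, 1.0)]
evts = build_events(tracks)
print(len(evts))   # -> 3
```

Real event building must also handle overlapping collisions whose time distributions merge, which is why the estimated errors, not a fixed window, drive the gating.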

  8. Learning from Your Network of Friends: A Trajectory Representation Learning Model Based on Online Social Ties

    KAUST Repository

    Alharbi, Basma Mohammed; Zhang, Xiangliang

    2017-01-01

    Location-Based Social Networks (LBSNs) capture individuals' whereabouts for a large portion of the population. To utilize this data for user (location)-similarity based tasks, one must map the raw data into a low-dimensional uniform feature space. However, due to the nature of LBSNs, many users have sparse and incomplete check-ins. In this work, we propose to overcome this issue by leveraging the network of friends when learning the new feature space. We first analyze the impact of friends on individuals' mobility, and show that individuals' trajectories are correlated with those of their friends and friends of friends (2-hop friends) in an online setting. Based on our observation, we propose a mixed-membership model that infers global mobility patterns from users' check-ins and their network of friends, without impairing the model's complexity. Our proposed model infers global patterns and learns new representations for both users and locations simultaneously. We evaluate the inferred patterns and compare the quality of the new user representation against baseline methods on a social link prediction problem.

  10. POLYANA-A tool for the calculation of molecular radial distribution functions based on Molecular Dynamics trajectories

    Science.gov (United States)

    Dimitroulis, Christos; Raptis, Theophanes; Raptis, Vasilios

    2015-12-01

    We present an application for the calculation of radial distribution functions for molecular centres of mass, based on trajectories generated by molecular simulation methods (Molecular Dynamics, Monte Carlo). When designing this application, the emphasis was placed on ease of use as well as ease of further development. In its current version, the program can read trajectories generated by the well-known DL_POLY package, but it can be easily extended to handle other formats. It is also very easy to 'hack' the program so it can compute intermolecular radial distribution functions for groups of interaction sites rather than whole molecules.
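The core computation, a centre-of-mass radial distribution function with the minimum-image convention, can be sketched as follows. This is a single-frame toy version with invented inputs; POLYANA averages over a whole DL_POLY trajectory:

```python
import math

def com(atoms):
    """Centre of mass of [(mass, x, y, z), ...]."""
    M = sum(a[0] for a in atoms)
    return tuple(sum(a[0] * a[k] for a in atoms) / M for k in (1, 2, 3))

def rdf(coms, box, dr=0.5, r_max=5.0):
    """Molecular centre-of-mass g(r) in a cubic periodic box, one frame."""
    n = len(coms)
    rho = n / box ** 3
    hist = [0] * int(r_max / dr)
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for k in range(3):
                dx = coms[i][k] - coms[j][k]
                dx -= box * round(dx / box)     # minimum-image convention
                d2 += dx * dx
            r = math.sqrt(d2)
            if r < r_max:
                hist[int(r / dr)] += 2          # count i-j and j-i
    g = []
    for b, h in enumerate(hist):
        # normalise by the ideal-gas count in each spherical shell
        shell = 4.0 / 3.0 * math.pi * (((b + 1) * dr) ** 3 - (b * dr) ** 3)
        g.append(h / (n * rho * shell))
    return g

coms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]       # two molecular centres
g = rdf(coms, box=10.0)
print(len(g))
```

Extending this from whole-molecule centres to groups of interaction sites, as the abstract mentions, only changes which coordinates are fed to `com`.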

  11. 3D Reconstruction from UAV-Based Hyperspectral Images

    Science.gov (United States)

    Liu, L.; Xu, L.; Peng, J.

    2018-04-01

    Reconstructing a 3D profile from a set of UAV-based images can provide hyperspectral information as well as the 3D coordinates of any point on the profile. Our images are captured with the Cubert UHD185 (UHD) hyperspectral camera, a new type of high-speed onboard imaging spectrometer that acquires a hyperspectral image and a panchromatic image simultaneously. The panchromatic images have a higher spatial resolution than the hyperspectral images, but each hyperspectral image provides considerable information on the spatial spectral distribution of the object. There is thus an opportunity to derive a high-quality 3D point cloud from the panchromatic images and rich spectral information from the hyperspectral images. The purpose of this paper is to introduce our processing chain, which derives a database providing hyperspectral information and the 3D position of each point. First, we adopt a free and open-source software package, VisualSFM, which is based on the structure-from-motion (SfM) algorithm, to recover a 3D point cloud from the panchromatic images. We then obtain the spectral information of each point from the hyperspectral images with a self-developed program written in MATLAB. The product can be used to support further research and applications.

  12. Design of an Ecological Flow-based Interface for 4D Trajectory Management in Air Traffic Control

    NARCIS (Netherlands)

    Pinto, J.; Klomp, R.E.; Borst, C.; Van Paassen, M.M.; Mulder, M.

    2015-01-01

    The concept of trajectory-based operations as proposed by SESAR and NextGen seeks to increase airspace efficiency and capacity by introducing time as an explicit control variable. Such a form of operations leans heavily on the introduction of higher levels of automation to support the human air traffic

  13. Sliding mode based trajectory linearization control for hypersonic reentry vehicle via extended disturbance observer.

    Science.gov (United States)

    Xingling, Shao; Honglun, Wang

    2014-11-01

    This paper proposes a novel hybrid control framework combining observer-based sliding mode control (SMC) with trajectory linearization control (TLC) for the hypersonic reentry vehicle (HRV) attitude tracking problem. First, lower control effort is achieved using a nonlinear tracking differentiator (TD) in the attitude loop. Second, a novel SMC that employs an extended disturbance observer (EDO) to counteract the effect of uncertainties, using a new sliding surface which includes the estimation error, is integrated to address the tracking error stabilization issues in the attitude and angular rate loops, respectively. In addition, new results associated with the EDO are examined in terms of dynamic response and noise-tolerant performance, as well as estimation accuracy. The key feature of the proposed compound control approach is that chattering-free tracking performance with high accuracy can be ensured for the HRV in the presence of multiple uncertainties under control constraints. Based on finite-time convergence stability theory, the stability of the resulting closed-loop system is well established. Comparisons and extensive simulation results are also presented to demonstrate the effectiveness of the control strategy. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Active disturbance rejection based trajectory linearization control for hypersonic reentry vehicle with bounded uncertainties.

    Science.gov (United States)

    Shao, Xingling; Wang, Honglun

    2015-01-01

    This paper investigates a novel compound control scheme that combines the advantages of trajectory linearization control (TLC) and alternative active disturbance rejection control (ADRC) for the hypersonic reentry vehicle (HRV) attitude tracking system with bounded uncertainties. Firstly, in order to overcome the actuator saturation problem, a nonlinear tracking differentiator (TD) is applied in the attitude loop to achieve lower control effort. Then, linear extended state observers (LESO) are constructed to estimate the uncertainties acting on the LTV system in the attitude and angular rate loops. In addition, feedback linearization (FL) based controllers are designed using the estimates of uncertainties generated by the LESO in each loop, which enable the tracking error for the closed-loop system in the presence of large uncertainties to converge to the residual set of the origin asymptotically. Finally, the compound controllers are derived by integrating the nominal controller for the open-loop nonlinear system with the FL-based controller. Comparisons and simulation results are also presented to illustrate the effectiveness of the control strategy. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
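The linear extended state observer (LESO) at the heart of such a scheme can be illustrated on a toy double-integrator plant. The gains use the common bandwidth parameterisation; the plant, disturbance, and all numbers are invented for illustration, not taken from the paper:

```python
def simulate_leso(wo=20.0, dt=0.001, T=2.0):
    """LESO for x1' = x2, x2' = f + b*u: estimates the states and the
    total disturbance f as an extra observer state z3.  Gains follow
    the bandwidth parameterisation [3*wo, 3*wo^2, wo^3]."""
    b = 1.0
    l1, l2, l3 = 3 * wo, 3 * wo ** 2, wo ** 3
    x1 = x2 = 0.0                     # plant states
    z1 = z2 = z3 = 0.0                # observer states
    u = 0.0                           # open loop: no control applied
    f_true = 1.5                      # constant unknown disturbance
    for _ in range(int(T / dt)):
        # plant (Euler integration)
        x1 += dt * x2
        x2 += dt * (f_true + b * u)
        # observer driven by the measurement error e = x1 - z1
        e = x1 - z1
        z1 += dt * (z2 + l1 * e)
        z2 += dt * (z3 + b * u + l2 * e)
        z3 += dt * (l3 * e)
    return z3, f_true

z3, f_true = simulate_leso()
print(round(z3, 2), f_true)   # z3 converges to the disturbance
```

In ADRC the estimate z3 is then fed back (u = (u0 - z3)/b) to cancel the disturbance, leaving a nominal plant for the feedback-linearization controller.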

  15. Three-dimensional reconstructions in spine and screw trajectory simulation on 3D digital images: a step by step approach by using Mimics software.

    Science.gov (United States)

    Chen, Dong; Chen, Chun-Hui; Tang, Li; Wang, Kai; Li, Yu-Zhe; Phan, Kevin; Wu, Ai-Min

    2017-12-01

    There has been a rapidly increasing amount of literature outlining the use of three-dimensional (3D) reconstruction and printing technologies in recent years. However, precise instructive articles that describe step-by-step methods of reconstructing 3D images from computed tomography (CT) or magnetic resonance imaging (MRI) remain limited. To address this, this article describes a detailed protocol that allows the reader to easily perform 3D reconstruction in future research, to investigate the relevant surgical anatomy, and to support innovative designs of novel screw fixation techniques or preoperative surgical planning.

  16. Breast reconstruction with anatomical implants: A review of indications and techniques based on current literature.

    Science.gov (United States)

    Gardani, Marco; Bertozzi, Nicolò; Grieco, Michele Pio; Pesce, Marianna; Simonacci, Francesco; Santi, PierLuigi; Raposio, Edoardo

    2017-09-01

    One important modality of breast cancer therapy is surgical treatment, which has become increasingly less mutilating over the last century. Breast reconstruction has become an integrated part of breast cancer treatment due to long-term psychosexual health factors and its importance for breast cancer survivors. Both autogenous tissue-based and implant-based reconstruction provides satisfactory reconstructive options due to better surgeon awareness of "the ideal breast size", although each has its own advantages and disadvantages. An overview of the current options in breast reconstruction is presented in this article.

  17. 4D Trajectory Estimation for Air Traffic Control Automation System Based on Hybrid System Theory

    Directory of Open Access Journals (Sweden)

    Xin-Min Tang

    2012-03-01

    Full Text Available To resolve the problem of future airspace management under heavy traffic flow and high density conditions, 4D trajectory estimation has become one of the core technologies of the next-generation air traffic control automation system. According to the flight profile and the dynamics models of different aircraft types under different flight conditions, a hybrid system model is constructed that switches the aircraft from one flight stage to another while the aircraft state changes continuously within each stage. Additionally, air temperature and wind speed are used to correct the aircraft's true airspeed as well as its ground speed, and simulation of the hybrid system evolution is used to estimate the aircraft's 4D trajectory. The case study shows that the 4D trajectory estimated through the hybrid system model can reflect the dynamic flight states of the aircraft and satisfy the needs of the planned flight altitude profile. Keywords: air traffic management, 4D trajectory estimation, hybrid system model, aircraft dynamic model
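The airspeed-to-ground-speed correction mentioned above is, at its core, the standard wind-triangle computation. A minimal sketch follows; the conventions assumed here (heading measured clockwise from north, wind given as the direction it blows from) are common in aviation but are not spelled out in the abstract.

```python
import math

def ground_speed(tas, heading_deg, wind_speed, wind_from_deg):
    """Ground speed from true airspeed and wind (wind triangle).

    heading_deg: aircraft heading, clockwise from north.
    wind_from_deg: direction the wind blows FROM (meteorological convention).
    """
    # air velocity vector (east, north components)
    hx = tas * math.sin(math.radians(heading_deg))
    hy = tas * math.cos(math.radians(heading_deg))
    # wind velocity vector: blows toward the opposite of wind_from_deg
    wx = -wind_speed * math.sin(math.radians(wind_from_deg))
    wy = -wind_speed * math.cos(math.radians(wind_from_deg))
    return math.hypot(hx + wx, hy + wy)
```

A direct tailwind adds to ground speed and a direct headwind subtracts from it, while a pure crosswind increases it only slightly.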

  18. An RRT-Based path planner for use in trajectory imitation

    CSIR Research Space (South Africa)

    Claassens, J

    2010-01-01

    Full Text Available The authors propose a more robust robot programming by demonstration system planner that produces a reproduction path which satisfies statistical constraints derived from demonstration trajectories and avoids obstacles given the freedom in those...

  19. Satellite Images-Based Obstacle Recognition and Trajectory Generation for Agricultural Vehicles

    Directory of Open Access Journals (Sweden)

    Mehmet Bodur

    2015-12-01

    Full Text Available In this study, a method for generating tracking trajectory points and for detecting and positioning obstacles in agricultural fields is presented. Our principal contribution is to produce traceable GPS trajectories for agricultural vehicles to be utilized by path planning algorithms, rather than a new path planning algorithm. The proposed system works with minimal initialization requirements, specifically a single geographical coordinate entry for an agricultural field. The automation of agricultural plantation requires many aspects to be addressed, many of which have been covered in previous studies. Depending on the type of crop, different agricultural vehicles may be used in the field. However, regardless of their application, they all follow a specified trajectory in the field. This study takes advantage of satellite images for the detection and positioning of obstacles and the generation of GPS trajectories in the agricultural realm. A set of image processing techniques is applied in Matlab for detection and positioning.

  20. Complications After Mastectomy and Immediate Breast Reconstruction for Breast Cancer: A Claims-Based Analysis

    Science.gov (United States)

    Jagsi, Reshma; Jiang, Jing; Momoh, Adeyiza O.; Alderman, Amy; Giordano, Sharon H.; Buchholz, Thomas A.; Pierce, Lori J.; Kronowitz, Steven J.; Smith, Benjamin D.

    2016-01-01

    Objective To evaluate complications after post-mastectomy breast reconstruction, particularly in the setting of adjuvant radiotherapy. Summary Background Data Most studies of complications after breast reconstruction have been conducted at centers of excellence; relatively little is known about complication rates in radiated patients treated in the broader community. This information is relevant for breast cancer patients' decision-making. Methods Using the claims-based MarketScan database, we described complications in 14,894 women undergoing mastectomy for breast cancer from 1998-2007 who received immediate autologous reconstruction (n=2637), immediate implant-based reconstruction (n=3007), or no reconstruction within the first two postoperative years (n=9250). We used a generalized estimating equation to evaluate associations between complications and radiotherapy over time. Results Wound complications were diagnosed within the first two postoperative years in 2.3% of patients without reconstruction, 4.4% with implants, and 9.5% with autologous reconstruction (p < […] with implants, and 20.7% with autologous reconstruction (p < […] implant removal in patients with implant reconstruction (OR 1.48, p < […] breast reconstruction differ by approach. Radiation therapy appears to modestly increase certain risks, including infection and implant removal. PMID:25876011

  1. Efficacy of Vancomycin-based Continuous Triple Antibiotic Irrigation in Immediate, Implant-based Breast Reconstruction

    Directory of Open Access Journals (Sweden)

    Lisa M. Hunsicker, MD, FACS

    2017-12-01

    Conclusions: Continuous breast irrigation with a vancomycin-based triple antibiotic solution is a safe and effective accompaniment to immediate implant reconstruction. Use of intramuscular anesthetic injection for postoperative pain control allows the elastomeric infusion pump to be available for local tissue antibiotic irrigation.

  2. An industrial robot singular trajectories planning based on graphs and neural networks

    Science.gov (United States)

    Łęgowski, Adrian; Niezabitowski, Michał

    2016-06-01

    Singular trajectories are rarely used because of issues during their realization. A method of planning trajectories for a given set of points in task space with the use of graphs and neural networks is presented. At every desired point the inverse kinematics problem is solved in order to derive all possible solutions. A graph of solutions is built. The shortest path through it is determined to define the required nodes in joint space. Neural networks are used to define the path between these nodes.
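The graph-of-solutions step can be sketched as dynamic programming over a layered graph: one layer of inverse-kinematics candidates per task-space point, with edge costs between consecutive layers. The Euclidean joint-space edge cost used below is an assumption for illustration.

```python
def shortest_joint_path(layers):
    """layers[i] is a list of candidate joint configurations (tuples) for
    waypoint i; returns the sequence minimizing summed joint-space motion."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # dynamic programming over the layered graph (shortest path)
    cost = [0.0] * len(layers[0])
    back = []
    for prev, cur in zip(layers, layers[1:]):
        new_cost, choices = [], []
        for c in cur:
            best = min(range(len(prev)), key=lambda j: cost[j] + dist(prev[j], c))
            new_cost.append(cost[best] + dist(prev[best], c))
            choices.append(best)
        cost, back = new_cost, back + [choices]

    # reconstruct the minimizing path by walking the back-pointers
    idx = min(range(len(cost)), key=cost.__getitem__)
    path = [idx]
    for choices in reversed(back):
        idx = choices[path[0]]
        path.insert(0, idx)
    return [layer[i] for layer, i in zip(layers, path)], min(cost)
```

Because the graph is layered, this dynamic program finds the same result as Dijkstra's algorithm would, in a single forward sweep.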

  3. Optimal control and optimal trajectories of regional macroeconomic dynamics based on the Pontryagin maximum principle

    Science.gov (United States)

    Bulgakov, V. K.; Strigunov, V. V.

    2009-05-01

    The Pontryagin maximum principle is used to prove a theorem concerning optimal control in regional macroeconomics. A boundary value problem for optimal trajectories of the state and adjoint variables is formulated, and optimal curves are analyzed. An algorithm is proposed for solving the boundary value problem of optimal control. The performance of the algorithm is demonstrated by computing an optimal control and the corresponding optimal trajectories.

  4. A New Track Reconstruction Algorithm suitable for Parallel Processing based on Hit Triplets and Broken Lines

    Directory of Open Access Journals (Sweden)

    Schöning André

    2016-01-01

    Full Text Available Track reconstruction in high track multiplicity environments at current and future high-rate particle physics experiments is a big challenge and very time consuming. The search for track seeds and the fitting of track candidates are usually the most time-consuming steps in track reconstruction. Here, a new and fast track reconstruction method based on hit triplets is proposed which exploits a three-dimensional fit model including multiple scattering and hit uncertainties from the very start, including the search for track seeds. The hit triplet based reconstruction method assumes a homogeneous magnetic field, which allows an analytical solution of the triplet fit. This method is highly parallelizable, needs fewer operations than other standard track reconstruction methods and is therefore ideal for implementation on parallel computing architectures. The proposed track reconstruction algorithm has been studied in the context of the Mu3e experiment and a typical LHC experiment.
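The analytical triplet fit itself is not reproduced in the abstract, but the underlying geometry can be sketched: in a homogeneous solenoidal field, three transverse-plane hits determine a circle, and the circle's radius gives the transverse momentum via the familiar p_T ≈ 0.3·B·R rule (p_T in GeV/c, B in tesla, R in metres). The code below is an illustrative sketch of that geometry, not the Mu3e fit.

```python
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three transverse-plane hits (x, y)."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # twice the triangle area via the cross product magnitude
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    return a * b * c / (2.0 * area2)

def pt_gev(radius_m, b_tesla):
    # p_T [GeV/c] ~ 0.3 * B [T] * R [m] for a singly charged track
    return 0.3 * b_tesla * radius_m
```

Three hits on a unit circle recover R = 1 m, i.e. p_T of 0.6 GeV/c in a 2 T field.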

  5. Edge Artifacts in Point Spread Function-based PET Reconstruction in Relation to Object Size and Reconstruction Parameters

    Directory of Open Access Journals (Sweden)

    Yuji Tsutsui

    2017-06-01

    Full Text Available Objective(s): We evaluated edge artifacts in relation to phantom diameter and reconstruction parameters in point spread function (PSF) based positron emission tomography (PET) image reconstruction. Methods: PET data were acquired from an original cone-shaped phantom filled with 18F solution (21.9 kBq/mL) for 10 min using a Biograph mCT scanner. The images were reconstructed using the baseline ordered subsets expectation maximization (OSEM) algorithm and OSEM with a PSF correction model. The reconstruction parameters included a pixel size of 1.0, 2.0, or 3.0 mm, 1-12 iterations, 24 subsets, and a full width at half maximum (FWHM) of the post-filter Gaussian filter of 1.0, 2.0, or 3.0 mm. We compared both the maximum recovery coefficient (RCmax) and the mean recovery coefficient (RCmean) in the phantom at different diameters. Results: The OSEM images had no edge artifacts, but the OSEM with PSF images had a dense edge delineating the hot phantom at diameters of 10 mm or more and a dense spot at the center at diameters of 8 mm or less. The dense edge was clearly observed on images with a small pixel size, a Gaussian filter with a small FWHM, and a high number of iterations. At a phantom diameter of 6-7 mm, the RCmax for the OSEM and OSEM with PSF images was 60% and 140%, respectively (pixel size: 1.0 mm; FWHM of the Gaussian filter: 2.0 mm; iterations: 2). The RCmean of the OSEM with PSF images did not exceed 100%. Conclusion: PSF-based image reconstruction resulted in edge artifacts, the degree of which depends on the pixel size, number of iterations, FWHM of the Gaussian filter, and object size.

  6. Anisotropic Diffusion based Brain MRI Segmentation and 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    M. Arfan Jaffar

    2012-06-01

    Full Text Available In the medical field, visualization of the organs is very imperative for accurate diagnosis and treatment of any disease. Brain tumor diagnosis and surgery also require impressive 3D visualization of the brain for the radiologist. Detection and 3D reconstruction of brain tumors from MRI is a computationally time-consuming and error-prone task. The proposed system detects and presents a 3D visualization model of the brain and the tumor inside, which greatly helps the radiologist to effectively diagnose and analyze the brain tumor. We propose a multi-phase segmentation and visualization technique which overcomes many problems of 3D volume segmentation methods, such as the lack of fine details. In this system segmentation is done in three different phases, which reduces the chance of error. The system finds contours for the skull, brain and tumor. These contours are stacked, and two novel methods are used to build the 3D visualization models. The results of these techniques, particularly of the interpolation-based one, are impressive. The proposed system is tested against a publicly available data set [41] and MRI datasets available from the MRI & CT center Rawalpindi, Pakistan [42].
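A minimal sketch of anisotropic diffusion in the Perona-Malik style, the pre-processing family named in the title: intensity diffuses within homogeneous regions while an edge-stopping conductance suppresses diffusion across strong gradients. The four-neighbour scheme and exponential conductance below are standard textbook choices, not details confirmed by the abstract.

```python
import math

def anisotropic_diffusion(img, iterations=10, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion on a list-of-lists grayscale image.

    The conductance g = exp(-(grad/kappa)^2) is ~1 in flat regions
    (ordinary smoothing) and ~0 across strong edges (edges preserved).
    """
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]
    for _ in range(iterations):
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                c = img[y][x]
                flux = 0.0
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    d = img[y + dy][x + dx] - c      # directional gradient
                    g = math.exp(-(d / kappa) ** 2)  # edge-stopping term
                    flux += g * d
                out[y][x] = c + lam * flux
        img = out
    return img
```

A flat image is left unchanged, an isolated noise spike is smoothed away, and a sharp 0/100 step survives when kappa is small relative to the step.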

  7. Parallelization of the model-based iterative reconstruction algorithm DIRA

    International Nuclear Information System (INIS)

    Oertenberg, A.; Sandborg, M.; Alm Carlsson, G.; Malusek, A.; Magnusson, M.

    2016-01-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelization of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelization of the model-based iterative reconstruction algorithm DIRA with the aim to significantly shorten the code's execution time. Selected routines were parallelized using the OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelization of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelization with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause is explained. (authors)
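The near-linear OpenMP speedup reported above rests on the simple parallel-for model: loop iterations are split into contiguous, near-equal chunks, one per worker (OpenMP's static schedule). A language-neutral sketch of that split, illustrative rather than taken from DIRA's code:

```python
def static_chunks(n_items, n_workers):
    """OpenMP-style static scheduling: split a loop of n_items iterations
    into near-equal contiguous [start, end) chunks, one per worker."""
    base, extra = divmod(n_items, n_workers)
    bounds, start = [], 0
    for w in range(n_workers):
        # the first `extra` workers take one iteration more
        size = base + (1 if w < extra else 0)
        bounds.append((start, start + size))
        start += size
    return bounds
```

Each worker then processes its own half-open range independently, which is why loops without cross-iteration dependences parallelize so easily.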

  8. A Trajectory Generation Method Based on Edge Detection for Auto-Sealant Cartesian Robot

    Directory of Open Access Journals (Sweden)

    Eka Samsul Maarif

    2014-07-01

    Full Text Available This paper presents an algorithm for generating the trajectory for the sealant process from a captured image. A Cartesian robot used as an auto-sealant in the manufacturing process increases productivity, reduces human error and saves time. However, different sealant paths across engine models mean not only different trajectories but also different programs. Therefore, a robot with the detection ability to generate its own trajectory is needed. This paper describes the best lighting technique for capturing the image and applies edge detection in trajectory generation as the solution. The algorithm comprises image capturing, Canny edge detection, integral projection for localizing the outermost edge, scanning coordinates, and generating vector direction codes. The experimental results show that the best technique is diffuse lighting at 10 Cd. The developed method gives a connected point-to-point trajectory which forms the sealant path, with a point-to-point distance equal to 90° of motor rotation. Directional movement along the point-to-point trajectory is controlled by the generated codes, which are ready to be sent by serial communication to the robot controller as instructions for the motors that actuate the X and Y axes.
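The integral-projection step for localizing the outermost edge can be sketched as row and column sums of a binary edge map: the first and last nonzero projection entries bound the edge region. The toy image below is illustrative only.

```python
def integral_projection(edge_img):
    """Row and column sums of a binary edge map; the first/last nonzero
    entries bound the outermost edge region of the sealant path."""
    rows = [sum(r) for r in edge_img]            # horizontal projection
    cols = [sum(c) for c in zip(*edge_img)]      # vertical projection
    first = lambda v: next(i for i, x in enumerate(v) if x)
    last = lambda v: len(v) - 1 - first(v[::-1])
    return (first(rows), last(rows)), (first(cols), last(cols))
```

The returned row and column bounds give the bounding box from which coordinate scanning can start.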

  9. Socioeconomic differences in children’s television viewing trajectory: A population-based prospective cohort study

    Science.gov (United States)

    van Grieken, Amy; Moll, Henriëtte A.; Jaddoe, Vincent W. V.; Wijtzes, Anne I.; Raat, Hein

    2017-01-01

    We aimed to evaluate the association between family socioeconomic status and repeatedly measured child television viewing time from early childhood to the school period. We analyzed data on 3,561 Dutch children from the Generation R Study, a population-based study in the Netherlands. Parent-reported television viewing time for children aged 2, 3, 4, 6 and 9 years was collected by questionnaires sent from April 2004 until January 2015. Odds ratios of watching television ≥1 hour/day at each age were calculated for children of mothers with low, mid-low, mid-high and high (reference group) education and children from low, middle and high (reference group) income households. A generalized logistic mixed model was used to assess the association between family socioeconomic status and the child television viewing time trajectory. The percentage of children watching television ≥1 hour/day increased from age 2 to 9 years for all children (24.2%-85.0% for children of low-educated mothers; 4.7%-61.4% for children of high-educated mothers; 17.2%-74.9% for children from low income households; 6.2%-65.1% for children from high income households). An independent socioeconomic effect on child television viewing time was found for maternal educational level. The interaction between net household income and child age in the longitudinal analyses was significant (p = 0.01), indicating that the television viewing time trajectories differed across household income subgroups. However, the interaction between maternal educational level and child age was not significant (p = 0.19). Inverse socioeconomic gradients in child television viewing time were found from the preschool period to the late school period. The educational differences between the various educational subgroups remained stable with increasing age, but the differences between household income groups changed over time. Intervention developers and healthcare practitioners need to raise awareness among non-highly educated parents

  10. Socioeconomic differences in children's television viewing trajectory: A population-based prospective cohort study.

    Science.gov (United States)

    Yang-Huang, Junwen; van Grieken, Amy; Moll, Henriëtte A; Jaddoe, Vincent W V; Wijtzes, Anne I; Raat, Hein

    2017-01-01

    We aimed to evaluate the association between family socioeconomic status and repeatedly measured child television viewing time from early childhood to the school period. We analyzed data on 3,561 Dutch children from the Generation R Study, a population-based study in the Netherlands. Parent-reported television viewing time for children aged 2, 3, 4, 6 and 9 years was collected by questionnaires sent from April 2004 until January 2015. Odds ratios of watching television ≥1 hour/day at each age were calculated for children of mothers with low, mid-low, mid-high and high (reference group) education and children from low, middle and high (reference group) income households. A generalized logistic mixed model was used to assess the association between family socioeconomic status and the child television viewing time trajectory. The percentage of children watching television ≥1 hour/day increased from age 2 to 9 years for all children (24.2%-85.0% for children of low-educated mothers; 4.7%-61.4% for children of high-educated mothers; 17.2%-74.9% for children from low income households; 6.2%-65.1% for children from high income households). An independent socioeconomic effect on child television viewing time was found for maternal educational level. The interaction between net household income and child age in the longitudinal analyses was significant (p = 0.01), indicating that the television viewing time trajectories differed across household income subgroups. However, the interaction between maternal educational level and child age was not significant (p = 0.19). Inverse socioeconomic gradients in child television viewing time were found from the preschool period to the late school period. The educational differences between the various educational subgroups remained stable with increasing age, but the differences between household income groups changed over time. Intervention developers and healthcare practitioners need to raise awareness among non-highly educated parents
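The odds ratios reported above come from comparing the odds of viewing ≥1 hour/day in an exposure group against a reference group. A toy computation with made-up counts (not the study's data) illustrates the quantity being modelled:

```python
def odds_ratio(exposed_cases, exposed_noncases, ref_cases, ref_noncases):
    """Odds ratio of watching TV >=1 h/day in an exposure group versus the
    reference group. Counts here are purely illustrative."""
    exposed_odds = exposed_cases / exposed_noncases
    reference_odds = ref_cases / ref_noncases
    return exposed_odds / reference_odds
```

If 80 of 100 children in a low-education group watch ≥1 h/day versus 50 of 100 in the reference group, the odds ratio is (80/20)/(50/50) = 4.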

  11. Reconstruction and 3D visualisation based on objective real 3D based documentation.

    Science.gov (United States)

    Bolliger, Michael J; Buck, Ursula; Thali, Michael J; Bolliger, Stephan A

    2012-09-01

    Reconstructions based directly upon forensic evidence alone are called primary information. Historically this consists of documentation of findings by verbal protocols, photographs and other visual means. Currently modern imaging techniques such as 3D surface scanning and radiological methods (computer tomography, magnetic resonance imaging) are also applied. Secondary interpretation is based on facts and the examiner's experience. Usually such reconstructive expertises are given in written form, and are often enhanced by sketches. However, narrative interpretations can, especially in complex courses of action, be difficult to present and can be misunderstood. In this report we demonstrate the use of graphic reconstruction of secondary interpretation with supporting pictorial evidence, applying digital visualisation (using 'Poser') or scientific animation (using '3D Studio Max', 'Maya') and present methods of clearly distinguishing between factual documentation and examiners' interpretation based on three cases. The first case involved a pedestrian who was initially struck by a car on a motorway and was then run over by a second car. The second case involved a suicidal gunshot to the head with a rifle, in which the trigger was pushed with a rod. The third case dealt with a collision between two motorcycles. Pictorial reconstruction of the secondary interpretation of these cases has several advantages. The images enable an immediate overview, give rise to enhanced clarity, and compel the examiner to look at all details if he or she is to create a complete image.

  12. DataComm in Flight Deck Surface Trajectory-Based Operations

    Science.gov (United States)

    Bakowski, Deborah L.; Foyle, David C.; Hooey, Becky L.; Meyer, Glenn R.; Wolter, Cynthia A.

    2012-01-01

    The purpose of this pilot-in-the-loop aircraft taxi simulation was to evaluate a NextGen concept for surface trajectory-based operations (STBO) in which air traffic control (ATC) issued taxi clearances with a required time of arrival (RTA) by Data Communications (DataComm). Flight deck avionics, driven by an error-nulling algorithm, displayed the speed needed to meet the RTA. To ensure robustness of the algorithm, the ability of 10 two-pilot crews to meet the RTA was tested in nine experimental trials representing a range of realistic conditions including a taxi route change, an RTA change, a departure clearance change, and a crossing traffic hold scenario. In some trials, these DataComm taxi clearances or clearance modifications were accompanied by 'preview' information, in which the airport map display showed a preview of the proposed route changes, including the necessary speed to meet the RTA. Overall, the results of this study show that with the aid of the RTA speed algorithm, pilots were able to meet their RTAs with very little time error in all of the robustness-testing scenarios. Results indicated that when taxi clearance changes were issued by DataComm only, pilots required longer notification distances than with voice communication. However, when the DataComm was accompanied by graphical preview, the notification distance required by pilots was equivalent to that for voice.

  13. DataComm in Flight Deck Surface Trajectory-Based Operations. Chapter 20

    Science.gov (United States)

    Bakowski, Deborah L.; Foyle, David C.; Hooey, Becky L.; Meyer, Glenn R.; Wolter, Cynthia A.

    2012-01-01

    The purpose of this pilot-in-the-loop aircraft taxi simulation was to evaluate a NextGen concept for surface trajectory-based operations (STBO) in which air traffic control (ATC) issued taxi clearances with a required time of arrival (RTA) by Data Communications (DataComm). Flight deck avionics, driven by an error-nulling algorithm, displayed the speed needed to meet the RTA. To ensure robustness of the algorithm, the ability of 10 two-pilot crews to meet the RTA was tested in nine experimental trials representing a range of realistic conditions including a taxi route change, an RTA change, a departure clearance change, and a crossing traffic hold scenario. In some trials, these DataComm taxi clearances or clearance modifications were accompanied by preview information, in which the airport map display showed a preview of the proposed route changes, including the necessary speed to meet the RTA. Overall, the results of this study show that with the aid of the RTA speed algorithm, pilots were able to meet their RTAs with very little time error in all of the robustness-testing scenarios. Results indicated that when taxi clearance changes were issued by DataComm only, pilots required longer notification distances than with voice communication. However, when the DataComm was accompanied by graphical preview, the notification distance required by pilots was equivalent to that for voice.
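The error-nulling speed logic described in this STBO study reduces, in its simplest form, to dividing the remaining taxi distance by the time remaining to the RTA. The sketch below illustrates that idea; the speed cap is an assumed placeholder, not a value from the study.

```python
def rta_speed(dist_remaining_m, time_remaining_s, v_max=15.0):
    """Speed (m/s) needed to arrive exactly at the RTA, capped at an
    assumed taxi speed limit v_max (illustrative, not from the study)."""
    if time_remaining_s <= 0:
        return v_max  # already late: taxi at the limit
    return min(dist_remaining_m / time_remaining_s, v_max)
```

Recomputing this target continuously nulls the accumulated time error: falling behind raises the commanded speed, getting ahead lowers it.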

  14. Towards Designing Graceful Degradation into Trajectory Based Operations: A Human-systems Integration Approach

    Science.gov (United States)

    Edwards, Tamsyn; Lee, Paul

    2017-01-01

    One of the most fundamental changes to the air traffic management system in NextGen is the concept of trajectory based operations (TBO). With the introduction of such change, system safety and resilience is a critical concern, in particular, the ability of systems to gracefully degrade. In order to design graceful degradation into a TBO environment, knowledge of the potential causes of degradation, and appropriate solutions, is required. In addition, previous research has predominantly explored the technological contribution to graceful degradation, frequently neglecting to consider the role of the human operator, specifically, air traffic controllers (ATCOs). This is out of step with real-world operations, and potentially limits an ecologically valid understanding of achieving graceful degradation in an air traffic control (ATC) environment. The following literature review aims to identify and summarize the literature to date on the potential causes of degradation in ATC and the solutions that may be applied within a TBO context, with a specific focus on the contribution of the air traffic controller. A framework of graceful degradation, developed from the literature, is presented. It is argued that in order to achieve graceful degradation within TBO, a human-system integration approach must be applied.

  15. Towards Designing Graceful Degradation into Trajectory Based Operations: A Human-Machine System Integration Approach

    Science.gov (United States)

    Edwards, Tamsyn; Lee, Paul

    2017-01-01

    One of the most fundamental changes to the air traffic management system in NextGen is the concept of trajectory based operations (TBO). With the introduction of such change, system safety and resilience is a critical concern, in particular, the ability of systems to gracefully degrade. In order to design graceful degradation into a TBO environment, knowledge of the potential causes of degradation, and appropriate solutions, is required. In addition, previous research has predominantly explored the technological contribution to graceful degradation, frequently neglecting to consider the role of the human operator, specifically, air traffic controllers (ATCOs). This is out of step with real-world operations, and potentially limits an ecologically valid understanding of achieving graceful degradation in an air traffic control (ATC) environment. The following literature review aims to identify and summarize the literature to date on the potential causes of degradation in ATC and the solutions that may be applied within a TBO context, with a specific focus on the contribution of the air traffic controller. A framework of graceful degradation, developed from the literature, is presented. It is argued that in order to achieve graceful degradation within TBO, a human-system integration approach must be applied.

  16. A population-feedback control based algorithm for well trajectory optimization using proxy model

    Directory of Open Access Journals (Sweden)

    Javad Kasravi

    2017-04-01

    Full Text Available Wellbore instability is one of the main concerns in the field of drilling engineering. This phenomenon is affected by several factors such as azimuth, inclination angle, in-situ stress, mud weight, and rock strength parameters. Among these factors, azimuth, inclination angle, and mud weight are controllable. The objective of this paper is to introduce a new procedure, based on elastoplastic theory in the wellbore stability solution, to determine the optimum well trajectory and the global minimum mud pressure required (GMMPR). A genetic algorithm (GA) was applied as the main optimization engine, employing a proportional feedback controller to obtain the minimum mud pressure required (MMPR). The feedback function repeatedly calculated and updated the error between the simulated value and the set point of the normalized yielded zone area (NYZA). To reduce computation expense, an artificial neural network (ANN) was used as a proxy (surrogate) model to approximate the behavior of the actual wellbore model. The methodology was applied to a directional well in a southwestern Iranian oilfield. The results demonstrated that the error between the predicted GMMPR and the practical safe mud pressure was 4% for the elastoplastic method, and 22% for the conventional elastic solution.
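The proportional feedback loop around the NYZA set point can be sketched as follows. The linear NYZA model, gain, and tolerance below are illustrative stand-ins for the wellbore (or ANN proxy) simulation and the paper's tuning.

```python
def tune_mud_pressure(simulate_nyza, p0, setpoint, kp=0.5,
                      tol=1e-4, max_iter=200):
    """Proportional feedback: adjust mud pressure until the simulated
    normalized yielded zone area (NYZA) matches the set point.

    simulate_nyza stands in for the wellbore (or ANN proxy) model.
    """
    p = p0
    for _ in range(max_iter):
        err = simulate_nyza(p) - setpoint
        if abs(err) < tol:
            break
        p += kp * err   # yielded zone too large -> raise mud pressure
    return p
```

With a toy model in which NYZA falls linearly with pressure, the loop converges to the pressure at which the yielded zone just meets the set point.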

  17. Lane-Level Road Information Mining from Vehicle GPS Trajectories Based on Naïve Bayesian Classification

    Directory of Open Access Journals (Sweden)

    Luliang Tang

    2015-11-01

    Full Text Available In this paper, we propose a novel approach for mining lane-level road network information from low-precision vehicle GPS trajectories (MLIT), including the number of traffic lanes and their turn rules, based on naïve Bayesian classification. First, the proposed method (MLIT) uses an adaptive density optimization method to remove outliers from the raw GPS trajectories based on their space-time distribution and density clustering. Second, MLIT acquires the number of lanes in two steps. The first step establishes a naïve Bayesian classifier according to the trace features of the road plane and road profiles and the real number of lanes, as found in the training samples. The second step confirms the number of lanes for the test samples by applying the naïve Bayesian classifier to their known trace features. Third, MLIT infers the turn rules of each lane by tracking GPS trajectories. Experiments were conducted using the GPS trajectories of taxis in Wuhan, China. Compared with human-interpreted results, the automatically generated lane-level road network information was demonstrated to be of higher quality in terms of displaying detailed road networks with the number of lanes and the turn rules of each lane.
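The lane-number classification step can be sketched with a tiny Gaussian naïve Bayes implementation: fit per-class feature means and variances on training samples, then pick the class with the highest log-posterior. The single road-width feature and the training values below are invented for illustration; MLIT's actual trace features are richer.

```python
import math
from collections import defaultdict

def train(samples):
    """samples: list of (feature_vector, n_lanes). Returns per-class
    feature means, variances, and priors for Gaussian naive Bayes."""
    by_class = defaultdict(list)
    for x, y in samples:
        by_class[y].append(x)
    model = {}
    for y, xs in by_class.items():
        means = [sum(col) / len(col) for col in zip(*xs)]
        varis = [max(sum((v - m) ** 2 for v in col) / len(col), 1e-6)
                 for col, m in zip(zip(*xs), means)]
        model[y] = (means, varis, len(xs) / len(samples))
    return model

def predict(model, x):
    def log_post(means, varis, prior):
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            # Gaussian log-likelihood of feature v under class stats (m, s2)
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll
    return max(model, key=lambda y: log_post(*model[y]))
```

With well-separated training widths (about 7 m for two lanes, 14 m for four), new observations fall cleanly into the nearer class.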

  18. A MapReduce-Based Parallel Frequent Pattern Growth Algorithm for Spatiotemporal Association Analysis of Mobile Trajectory Big Data

    Directory of Open Access Journals (Sweden)

    Dawen Xia

    2018-01-01

    Full Text Available Frequent pattern mining is an effective approach for spatiotemporal association analysis of mobile trajectory big data in data-driven intelligent transportation systems. While existing parallel algorithms have been successfully applied to frequent pattern mining of large-scale trajectory data, two major challenges are how to overcome the inherent defects of Hadoop in coping with taxi trajectory big data, including massive small files, and how to discover implicit spatiotemporal frequent patterns with MapReduce. To conquer these challenges, this paper presents a MapReduce-based Parallel Frequent Pattern growth (MR-PFP) algorithm to analyze the spatiotemporal characteristics of taxi operation using large-scale taxi trajectories, with massive small file processing strategies on a Hadoop platform. More specifically, we first implement three methods, namely Hadoop Archives (HAR), CombineFileInputFormat (CFIF), and Sequence Files (SF), to overcome the existing defects of Hadoop, and then propose two strategies based on their performance evaluations. Next, we incorporate SF into the Frequent Pattern growth (FP-growth) algorithm and implement the optimized FP-growth algorithm on a MapReduce framework. Finally, we analyze the characteristics of taxi operation in both the spatial and temporal dimensions with MR-PFP in parallel. The results demonstrate that MR-PFP is superior to the existing Parallel FP-growth (PFP) algorithm in efficiency and scalability.
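The map and reduce phases of frequent-pattern counting can be sketched on a single machine. Mining co-visited zone pairs with a minimum support, as below, is a simplified stand-in for the full FP-growth algorithm; the zone labels and trips are invented for illustration.

```python
from collections import Counter
from itertools import combinations

def map_phase(trajectory):
    """Map: emit each unordered pair of zones visited in one taxi trip."""
    return [tuple(sorted(p)) for p in combinations(set(trajectory), 2)]

def reduce_phase(emitted, min_support):
    """Reduce: sum counts per pair, keep pairs meeting minimum support."""
    counts = Counter()
    for pairs in emitted:
        counts.update(pairs)
    return {p: c for p, c in counts.items() if c >= min_support}
```

On a cluster, the map phase runs per input split and the shuffle groups identical pairs before the reduce; the logic per key is the same as here.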

  19. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis

    Directory of Open Access Journals (Sweden)

    Huanhuan Li

    2017-08-01

    Full Text Available The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities for navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex than traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, Principal Component Analysis (PCA), a widely used dimensionality reduction method, is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% cumulative contribution rate are extracted by PCA, and the number of centers k is chosen. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is applied to the distance matrix to achieve the final AIS trajectory clustering results. To improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our

  20. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis.

    Science.gov (United States)

    Li, Huanhuan; Liu, Jingxian; Liu, Ryan Wen; Xiong, Naixue; Wu, Kefeng; Kim, Tai-Hoon

    2017-08-04

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities of navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex than traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, Principal Component Analysis (PCA), a widely used dimensionality reduction method, is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% cumulative contribution rate are extracted by PCA, and the number of centers k is chosen accordingly. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is applied to the distance matrix to achieve the final AIS trajectory clustering results. To improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. 
Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our proposed method with
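    The four-step pipeline described above (DTW distances, distance matrix, PCA-based choice of k, center-based clustering) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the farthest-point seeding used here is a simplified stand-in for the paper's improved automatic center selection algorithm, and the trajectories are 1-D toy signals.

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_matrix(trajs):
    """Steps 1-2: pairwise DTW distances collected into a symmetric matrix."""
    N = len(trajs)
    dist = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            dist[i, j] = dist[j, i] = dtw(trajs[i], trajs[j])
    return dist

def select_k(dist, threshold=0.95):
    """Step 3: number of principal components of the distance matrix
    needed to reach the cumulative contribution-rate threshold."""
    X = dist - dist.mean(axis=0)
    eigvals = np.clip(np.linalg.eigvalsh(X.T @ X)[::-1], 0.0, None)
    ratios = np.cumsum(eigvals) / eigvals.sum()
    return int(np.searchsorted(ratios, threshold) + 1)

def assign_clusters(dist, k):
    """Step 4: farthest-point seeding of k centers, then nearest-center
    assignment (a simplified stand-in for the improved center selection)."""
    centers = [0]
    while len(centers) < k:
        centers.append(int(np.argmax(dist[:, centers].min(axis=1))))
    return np.argmin(dist[:, centers], axis=1)

# Toy data: three similar sinusoidal tracks and three similar linear tracks.
t = np.linspace(0.0, 1.0, 30)
trajs = [np.sin(2 * np.pi * t) + 0.05 * i for i in range(3)] + \
        [3.0 + i * t for i in range(1, 4)]
dist = dtw_matrix(trajs)
labels = assign_clusters(dist, 2)
```

    With clean data like this, the two bundles of trajectories fall into separate clusters; on real AIS data the outlier robustness comes from clustering the DTW distance matrix rather than raw points.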

  1. Pediatric 320-row cardiac computed tomography using electrocardiogram-gated model-based full iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Shirota, Go; Maeda, Eriko; Namiki, Yoko; Bari, Razibul; Abe, Osamu [The University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)]; Ino, Kenji [The University of Tokyo Hospital, Imaging Center, Tokyo (Japan)]; Torigoe, Rumiko [Toshiba Medical Systems, Tokyo (Japan)]

    2017-10-15

    A full iterative reconstruction algorithm is available, but its diagnostic quality in pediatric cardiac CT is unknown. Our aim was to compare the imaging quality of two algorithms, full and hybrid iterative reconstruction, in pediatric cardiac CT. We included 49 children with congenital cardiac anomalies who underwent cardiac CT. We compared quality of images reconstructed using the two algorithms (full and hybrid iterative reconstruction) based on a 3-point scale for the delineation of the following anatomical structures: atrial septum, ventricular septum, right atrium, right ventricle, left atrium, left ventricle, main pulmonary artery, ascending aorta, aortic arch including the patent ductus arteriosus, descending aorta, right coronary artery and left main trunk. We evaluated beam-hardening artifacts from contrast-enhancement material using a 3-point scale, and we evaluated the overall image quality using a 5-point scale. We also compared image noise, signal-to-noise ratio and contrast-to-noise ratio between the algorithms. The overall image quality was significantly higher with full iterative reconstruction than with hybrid iterative reconstruction (3.67±0.79 vs. 3.31±0.89, P=0.0072). The evaluation scores for most of the gross structures were higher with full iterative reconstruction than with hybrid iterative reconstruction. There was no significant difference between full and hybrid iterative reconstruction for the presence of beam-hardening artifacts. Image noise was significantly lower with full iterative reconstruction, while signal-to-noise ratio and contrast-to-noise ratio were significantly higher with full iterative reconstruction. Diagnostic image quality was superior in pediatric cardiac CT images reconstructed with electrocardiogram-gated full iterative reconstruction. (orig.)
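    The noise, signal-to-noise and contrast-to-noise comparisons in this study follow the standard ROI-based definitions, which can be sketched as below. The ROI layout and numbers are hypothetical illustrations, not values from the paper.

```python
import numpy as np

def roi_stats(img, mask):
    """Mean and sample standard deviation of the pixels inside an ROI mask."""
    vals = img[mask]
    return float(vals.mean()), float(vals.std(ddof=1))

def snr(mean_roi, sd_background):
    """Signal-to-noise ratio: ROI mean attenuation over background noise SD."""
    return mean_roi / sd_background

def cnr(mean_roi, mean_background, sd_background):
    """Contrast-to-noise ratio: ROI/background contrast over background noise SD."""
    return (mean_roi - mean_background) / sd_background
```

    Lower image noise (the background SD) directly raises both ratios, which is consistent with the abstract's finding that full iterative reconstruction improves SNR and CNR.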

  2. Pediatric 320-row cardiac computed tomography using electrocardiogram-gated model-based full iterative reconstruction

    International Nuclear Information System (INIS)

    Shirota, Go; Maeda, Eriko; Namiki, Yoko; Bari, Razibul; Abe, Osamu; Ino, Kenji; Torigoe, Rumiko

    2017-01-01

    A full iterative reconstruction algorithm is available, but its diagnostic quality in pediatric cardiac CT is unknown. Our aim was to compare the imaging quality of two algorithms, full and hybrid iterative reconstruction, in pediatric cardiac CT. We included 49 children with congenital cardiac anomalies who underwent cardiac CT. We compared quality of images reconstructed using the two algorithms (full and hybrid iterative reconstruction) based on a 3-point scale for the delineation of the following anatomical structures: atrial septum, ventricular septum, right atrium, right ventricle, left atrium, left ventricle, main pulmonary artery, ascending aorta, aortic arch including the patent ductus arteriosus, descending aorta, right coronary artery and left main trunk. We evaluated beam-hardening artifacts from contrast-enhancement material using a 3-point scale, and we evaluated the overall image quality using a 5-point scale. We also compared image noise, signal-to-noise ratio and contrast-to-noise ratio between the algorithms. The overall image quality was significantly higher with full iterative reconstruction than with hybrid iterative reconstruction (3.67±0.79 vs. 3.31±0.89, P=0.0072). The evaluation scores for most of the gross structures were higher with full iterative reconstruction than with hybrid iterative reconstruction. There was no significant difference between full and hybrid iterative reconstruction for the presence of beam-hardening artifacts. Image noise was significantly lower with full iterative reconstruction, while signal-to-noise ratio and contrast-to-noise ratio were significantly higher with full iterative reconstruction. Diagnostic image quality was superior in pediatric cardiac CT images reconstructed with electrocardiogram-gated full iterative reconstruction. (orig.)

  3. A computationally efficient OMP-based compressed sensing reconstruction for dynamic MRI

    International Nuclear Information System (INIS)

    Usman, M; Prieto, C; Schaeffter, T; Batchelor, P G; Odille, F; Atkinson, D

    2011-01-01

    Compressed sensing (CS) methods in MRI are computationally intensive. Thus, designing novel CS algorithms that can perform faster reconstructions is crucial for everyday applications. We propose a computationally efficient orthogonal matching pursuit (OMP)-based reconstruction, specifically suited to cardiac MR data. According to the energy distribution of a y-f space obtained from a sliding window reconstruction, we label the y-f space as static or dynamic. For static y-f space images, a computationally efficient masked OMP reconstruction is performed, whereas for dynamic y-f space images, standard OMP reconstruction is used. The proposed method was tested on a dynamic numerical phantom and two cardiac MR datasets. Depending on the field of view composition of the imaging data, compared to the standard OMP method, reconstruction speedup factors ranging from 1.5 to 2.5 are achieved. (note)
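    The core OMP routine that this work accelerates can be sketched in a few lines: greedily pick the dictionary atom most correlated with the residual, then re-fit all selected atoms by least squares. This is the textbook algorithm, not the paper's masked y-f-space variant.

```python
import numpy as np

def omp(A, y, sparsity, tol=1e-10):
    """Orthogonal Matching Pursuit: recover a sparse x with y ~= A @ x."""
    n_atoms = A.shape[1]
    support, coef = [], np.zeros(0)
    residual = y.astype(float).copy()
    x = np.zeros(n_atoms)
    for _ in range(sparsity):
        if np.linalg.norm(residual) < tol:
            break
        corr = np.abs(A.T @ residual)
        corr[support] = 0.0                      # do not reselect atoms
        support.append(int(np.argmax(corr)))
        # Least-squares re-fit on the current support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Noiseless demo: recover a 2-sparse vector from 20 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 30))
x_true = np.zeros(30)
x_true[3], x_true[17] = 1.5, -2.0
x_hat = omp(A, A @ x_true, sparsity=2)
```

    The paper's speedup comes from running a cheaper masked version of this loop on y-f locations labeled static, keeping the full loop only for dynamic locations.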

  4. A Support-Based Reconstruction for SENSE MRI

    Directory of Open Access Journals (Sweden)

    Bradley S. Peterson

    2013-03-01

    Full Text Available A novel, rapid algorithm to speed up and improve the reconstruction of sensitivity encoding (SENSE) MRI was proposed in this paper. The essence of the algorithm was that it iteratively solved the model of simple SENSE on a pixel-by-pixel basis in the region of support (ROS). The ROS was obtained from scout images of eight channels by morphological operations such as opening and filling. All the pixels in the FOV were paired and classified into four types according to their spatial locations with respect to the ROS, each type with its corresponding procedure for solving the inverse problem for image reconstruction. The sensitivity maps, used for the image reconstruction and covering only the ROS, were obtained by a polynomial regression model without extrapolation to keep the estimation errors small. The experiments demonstrate that the proposed method improves the reconstruction of SENSE in terms of speed and accuracy. The mean square error (MSE) of our reconstruction is reduced by 16.05% for a 2D brain MR image, and the mean MSE over the whole slices in a 3D brain MRI is reduced by 30.44% compared to those of the traditional methods. The computation time is only 25%, 45%, and 70% of the traditional method for images with numbers of pixels on the orders of 10^3, 10^4, and 10^5–10^7, respectively.
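    The pixel-by-pixel SENSE model referred to above can be illustrated for the simplest case: with reduction factor R, each aliased pixel is a coil-sensitivity-weighted sum of R true pixels, so unfolding is a small least-squares solve per pixel group. This is a generic textbook sketch (1-D aliasing, no ROS classification), not the paper's four-type ROS-aware scheme.

```python
import numpy as np

def sense_unfold(aliased, sens, R=2):
    """Pixel-wise SENSE unfolding for 1-D aliasing with reduction factor R.
    aliased: (n_coils, ny // R, nx) folded coil images.
    sens:    (n_coils, ny, nx) coil sensitivity maps."""
    n_coils, ny, nx = sens.shape
    step = ny // R
    recon = np.zeros((ny, nx))
    for y in range(step):
        for x in range(nx):
            # Sensitivities of the R true pixels that fold onto (y, x).
            S = sens[:, y::step, x]              # (n_coils, R)
            b = aliased[:, y, x]                 # (n_coils,)
            rho, *_ = np.linalg.lstsq(S, b, rcond=None)
            recon[y::step, x] = rho
    return recon
```

    Restricting these per-pixel solves to the ROS, as the paper does, skips the empty background and is where much of its speedup comes from.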

  5. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    International Nuclear Information System (INIS)

    DeMarco, J; McCloskey, S; Low, D; Moran, J

    2014-01-01

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value of the maximum RMS MLC error was 0.067±0.001 mm and 0.066±0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
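    The maximum-RMS MLC error metric reported above, and the chronological sorting step, can be sketched as follows. The real trajectory log is a proprietary binary format; the record layout and timestamp field here are hypothetical, and expected/actual positions are assumed to be already parsed into arrays.

```python
import numpy as np

def max_rms_leaf_error(expected, actual):
    """RMS of (expected - actual) per leaf over all control points, then
    the maximum across leaves. Arrays: (n_control_points, n_leaves)."""
    rms_per_leaf = np.sqrt(np.mean((expected - actual) ** 2, axis=0))
    return float(rms_per_leaf.max())

def sort_sessions(records):
    """Order parsed log records chronologically before generating the
    sequential report (the 'timestamp' key is a hypothetical field name)."""
    return sorted(records, key=lambda rec: rec["timestamp"])
```

    Applied per leaf bank, `max_rms_leaf_error` yields the kind of per-plan summary quoted in the abstract (0.067 mm and 0.066 mm for banks A and B).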

  6. Parametric Human Body Reconstruction Based on Sparse Key Points.

    Science.gov (United States)

    Cheng, Ke-Li; Tong, Ruo-Feng; Tang, Min; Qian, Jing-Ye; Sarkis, Michel

    2016-11-01

    We propose an automatic parametric human body reconstruction algorithm which can efficiently construct a model using a single Kinect sensor. A user needs to stand still in front of the sensor for a couple of seconds to measure the range data. The user's body shape and pose will then be automatically constructed in several seconds. Traditional methods optimize dense correspondences between range data and meshes. In contrast, our proposed scheme relies on sparse key points for the reconstruction. It employs regression to find the corresponding key points between the scanned range data and some annotated training data. We design two kinds of feature descriptors as well as corresponding regression stages to make the regression robust and accurate. Our scheme concludes with a dense refinement stage, in which a pre-factorization method is applied to improve the computational efficiency. Compared with other methods, our scheme achieves similar reconstruction accuracy but significantly reduces runtime.

  7. Prepectoral Implant-Based Breast Reconstruction and Postmastectomy Radiotherapy: Short-Term Outcomes

    Directory of Open Access Journals (Sweden)

    Steven Sigalove, MD

    2017-12-01

    Conclusions: Immediate implant-based prepectoral breast reconstruction followed by PMRT appears to be well tolerated, with no excess risk of adverse outcomes, at least in the short term. Longer follow-up is needed to better understand the risk of PMRT in prepectorally reconstructed breasts.

  8. Registration-based Reconstruction of Four-dimensional Cone Beam Computed Tomography

    DEFF Research Database (Denmark)

    Christoffersen, Christian; Hansen, David Christoffer; Poulsen, Per Rugaard

    2013-01-01

    We present a new method for reconstruction of four-dimensional (4D) cone beam computed tomography from an undersampled set of X-ray projections. The novelty of the proposed method lies in utilizing optical flow based registration to facilitate that each temporal phase is reconstructed from the full...

  9. Fourier-based reconstruction via alternating direction total variation minimization in linear scan CT

    International Nuclear Information System (INIS)

    Cai, Ailong; Wang, Linyuan; Yan, Bin; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Li, Jianxin

    2015-01-01

    In this study, we consider a novel form of computed tomography (CT), that is, linear scan CT (LCT), which applies a straight line trajectory. Furthermore, an iterative algorithm is proposed for pseudo-polar Fourier reconstruction through total variation minimization (PPF-TVM). Considering that the sampled Fourier data are distributed in pseudo-polar coordinates, the reconstruction model minimizes the TV of the image subject to the constraint that the estimated 2D Fourier data for the image are consistent with the 1D Fourier transform of the projection data. PPF-TVM employs the alternating direction method (ADM) to develop a robust and efficient iteration scheme, which ensures stable convergence provided that appropriate parameter values are given. In the ADM scheme, PPF-TVM applies the pseudo-polar fast Fourier transform and its adjoint to iterate back and forth between the image and frequency domains. Thus, there is no interpolation in the Fourier domain, which makes the algorithm both fast and accurate. PPF-TVM is particularly useful for limited angle reconstruction in LCT and it appears to be robust against artifacts. The PPF-TVM algorithm was tested with the FORBILD head phantom and real data in comparisons with state-of-the-art algorithms. Simulation studies and real data verification suggest that PPF-TVM can reconstruct higher accuracy images with lower time consumption.
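    The Fourier-domain data-consistency step that such ADM schemes iterate against can be illustrated with a plain Cartesian FFT (a stand-in for the pseudo-polar transform used in PPF-TVM): measured frequency samples overwrite the current estimate's spectrum wherever data exist, and the estimate fills the gaps.

```python
import numpy as np

def data_consistency(u, measured, mask):
    """Enforce consistency with measured Fourier samples.
    Where mask is True, replace the estimate's spectrum with the data;
    elsewhere keep the current estimate. Cartesian FFT stands in for the
    pseudo-polar FFT of the actual PPF-TVM algorithm."""
    U = np.fft.fft2(u)
    U[mask] = measured[mask]
    return np.real(np.fft.ifft2(U))
```

    PPF-TVM alternates a step of this kind with a TV-minimization step; doing both exactly in their native domains is what lets it avoid Fourier-domain interpolation.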

  10. Back-trajectory-based source apportionment of airborne sulfur and nitrogen concentrations at Rocky Mountain National Park, Colorado, USA

    Science.gov (United States)

    Gebhart, Kristi A.; Schichtel, Bret A.; Malm, William C.; Barna, Michael G.; Rodriguez, Marco A.; Collett, Jeffrey L., Jr.

    2011-01-01

    The Rocky Mountain Atmospheric Nitrogen and Sulfur Study (RoMANS), conducted during the spring and summer of 2006, was designed to assess the sources of nitrogen and sulfur species that contribute to wet and dry deposition and visibility impairment at Rocky Mountain National Park (RMNP), Colorado. Several source apportionment methods were utilized for RoMANS, including the Trajectory Mass Balance (TrMB) Model, a receptor-based method in which the hourly measured concentrations are the dependent variables and the residence times of back trajectories in several source regions are the independent variables. The regression coefficients are estimates of the mean emissions, dispersion, chemical transformation, and deposition between the source areas and the receptors. For RoMANS, a new ensemble technique was employed in which input parameters were varied to explore the range, variability, and model sensitivity of source attribution results and statistical measures of model fit over thousands of trials for each set of concentration measurements. Results showed that carefully chosen source regions dramatically improved the ability of TrMB to reproduce temporal patterns in the measured concentrations, and source attribution results were also very sensitive to source region choices. Conversely, attributions were relatively insensitive to trajectory start height, trajectory length, minimum endpoints per source area, and maximum endpoint height, as long as the trajectories were long enough to reach contributing source areas and were not overly restricted in height or horizontal location. Source attribution results estimated that more than half the ammonia and 30-45% of sulfur dioxide and other nitrogen-containing species at the RoMANS core site were from sources within the state of Colorado. Approximately a quarter to a third of the sulfate was from within Colorado.
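    The TrMB regression described above can be sketched as an ordinary least-squares fit: measured concentrations are the dependent variable, per-region back-trajectory residence times are the regressors, and the coefficients estimate each region's effective contribution per unit residence time. This is a bare-bones illustration with synthetic data, omitting the non-negativity constraints and ensemble machinery of the actual study.

```python
import numpy as np

def trmb_fit(residence_times, concentrations):
    """Regress concentrations on source-region residence times.
    residence_times: (n_samples, n_regions); concentrations: (n_samples,)."""
    coeffs, *_ = np.linalg.lstsq(residence_times, concentrations, rcond=None)
    return coeffs

def attribution(residence_times, coeffs):
    """Fractional source attribution implied by the fitted coefficients."""
    contrib = residence_times.mean(axis=0) * coeffs
    return contrib / contrib.sum()

# Synthetic demo: 3 source regions, region 0 dominant, region 2 inert.
rng = np.random.default_rng(2)
T = rng.random((100, 3))                 # residence times per sample
true = np.array([2.0, 0.5, 0.0])         # contribution per unit residence time
coeffs = trmb_fit(T, T @ true)
```

    The study's ensemble approach reruns fits like this thousands of times with varied source regions and trajectory parameters to gauge sensitivity.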

  11. NextGen Flight Deck Surface Trajectory-Based Operations (STBO): Contingency Holds

    Science.gov (United States)

    Bakowski, Deborah Lee; Hooey, Becky Lee; Foyle, David C.; Wolter, Cynthia A.; Cheng, Lara W. S.

    2013-01-01

    The purpose of this pilot-in-the-loop taxi simulation was to investigate a NextGen Surface Trajectory-Based Operations (STBO) concept called "contingency holds." The contingency-hold concept parses a taxi route into segments, allowing an air traffic control (ATC) surface traffic management (STM) system to hold an aircraft when necessary for safety. Under nominal conditions, if the intersection or active runway crossing is clear, the hold is removed, allowing the aircraft to continue taxiing without slowing, thus improving taxi efficiency, while minimizing the excessive brake use, fuel burn, and emissions associated with stop-and-go taxi. However, when a potential traffic conflict exists, the hold remains in place as a fail-safe mechanism. In this departure operations simulation, the taxi clearance included a required time of arrival (RTA) to a specified intersection. The flight deck was equipped with speed-guidance avionics to aid the pilot in safely meeting the RTA. On two trials, the contingency hold was not released, and pilots were required to stop. On two trials the contingency hold was released 15 sec prior to the RTA, and on two trials the contingency hold was released 30 sec prior to the RTA. When the hold remained in place, all pilots complied with the hold. Results also showed that when the hold was released at 15-sec or 30-sec prior to the RTA, the 30-sec release allowed pilots to maintain nominal taxi speed, thus supporting continuous traffic flow; whereas, the 15-sec release did not. The contingency-hold concept, with at least a 30-sec release, allows pilots to improve taxiing efficiency by reducing braking, slowing, and stopping, but still maintains safety in that no pilots "busted" the clearance holds. Overall, the evidence suggests that the contingency-hold concept is a viable concept for optimizing efficiency while maintaining safety.


  12. An Examination of Selected Datacom Options for the Near-Term Implementation of Trajectory Based Operations

    Science.gov (United States)

    Johnson, Walter W.; Lachter, Joel B.; Battiste, Vernol; Lim, Veranika; Brandt, Summer L.; Koteskey, Robert W.; Dao, Arik-Quang V.; Ligda, Sarah V.; Wu, Shu-Chieh

    2011-01-01

    A primary feature of the Next Generation Air Transportation System (NextGen) is trajectory based operations (TBO). Under TBO, aircraft flight plans are known to computer systems on the ground that aid in scheduling and separation. The Future Air Navigation System (FANS) was developed to support TBO, but relatively few aircraft in the US are FANS-equipped. Thus, any near-term implementation must provide TBO procedures for non-FANS aircraft. Previous research has explored controller clearances, but any implementation must also provide procedures for aircraft requests. The work presented here aims to surface issues surrounding TBO communication procedures for non-FANS aircraft and for aircraft requesting deviations around weather. Three types of communication were explored: Voice, FANS, and ACARS (Aircraft Communications Addressing and Reporting System). ACARS and FANS are datacom systems that differ in that FANS allows uplinked flight plans to be loaded into the Flight Management System (FMS), while ACARS delivers flight plans as text that must be entered manually via the Control Display Unit (CDU). Sixteen pilots (eight two-person flight decks) and four controllers participated in 32 20-minute scenarios that required the flight decks to navigate through convective weather as they approached their top of descents (TODs). Findings: The rate of non-conformance was higher than anticipated, with aircraft off path more than 20% of the time. Controllers did not differentiate between the ACARS and FANS datacom, and were mixed in their preference for Voice vs. datacom (ACARS and FANS). Pilots uniformly preferred Voice to datacom, particularly ACARS. Much of their dislike appears to result from the slow response times in the datacom conditions. As a result, participants frequently resorted to voice communication. These results imply that, before implementing TBO in environments where pilots make weather deviation requests, further research is needed to develop communication

  13. Identification of aerosol types over an urban site based on air-mass trajectory classification

    Science.gov (United States)

    Pawar, G. V.; Devara, P. C. S.; Aher, G. R.

    2015-10-01

    Columnar aerosol properties retrieved from MICROTOPS II Sun Photometer measurements during 2010-2013 over Pune (18°32′N, 73°49′E, 559 m amsl), a tropical urban station in India, are analyzed to identify aerosol types in the atmospheric column. Identification/classification is carried out on the basis of dominant airflow patterns, and aerosol types are discriminated on the basis of the relation between aerosol optical depth (AOD500 nm) and Ångström exponent (AE, α). Five potential advection pathways, viz. NW/N, SW/S, N, SE/E and L, have been identified over the observing site by employing the NOAA-HYSPLIT air mass back trajectory analysis. Based on the AE versus AOD500 nm scatter plot and the advection pathways followed, five major aerosol types, viz. continental average (CA), marine continental average (MCA), urban/industrial and biomass burning (UB), desert dust (DD) and indeterminate or mixed type (MT), have been identified. In winter, sector SE/E, representative of air masses that traversed the Bay of Bengal and the eastern continental Indian region, has relatively small AOD (τpλ = 0.43 ± 0.13) and high AE (α = 1.19 ± 0.15). These values imply the presence of accumulation/sub-micron size anthropogenic aerosols. During pre-monsoon, aerosols from the NW/N sector have high AOD (τpλ = 0.61 ± 0.21) and low AE (α = 0.54 ± 0.14), indicating an increase in the loading of coarse-mode particles over Pune. Dominance of the UB type in the winter season for all the years (i.e. 2010-2013) may be attributed to both local and transported aerosols. During pre-monsoon seasons, MT is the dominant aerosol type followed by UB and DD, while the background aerosols are insignificant.
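    AOD-AE scatter-plot discrimination of this kind amounts to a threshold classifier in the (AOD, AE) plane. The sketch below shows the idea; the threshold values are hypothetical placeholders chosen only for illustration, not the cutoffs used in this study.

```python
def classify_aerosol(aod, ae, aod_lo=0.3, aod_hi=0.6, ae_lo=0.7, ae_hi=1.0):
    """Illustrative AOD / Angstrom-exponent threshold classifier.
    All four thresholds are hypothetical, not the paper's values."""
    if aod < aod_lo and ae_lo <= ae <= ae_hi:
        return "CA"    # continental average: low loading, moderate AE
    if aod < aod_lo and ae < ae_lo:
        return "MCA"   # marine continental average: low loading, coarse
    if aod >= aod_hi and ae > ae_hi:
        return "UB"    # urban/industrial and biomass burning: fine, high load
    if aod >= aod_hi and ae < ae_lo:
        return "DD"    # desert dust: coarse, high loading
    return "MT"        # indeterminate / mixed type
```

    For example, the pre-monsoon NW/N values quoted above (AOD ≈ 0.61, AE ≈ 0.54) fall in the high-AOD, low-AE corner that this rule labels desert dust.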

  14. Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.

    Science.gov (United States)

    Lyons, Rhonda

    2012-01-01

    According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and traffic is forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of our system. Today's ATM system is complex. It is designed to safely, economically, and efficiently provide air traffic services through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to this vision, the system is loosely integrated and suffers tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is designed to transform the current system into an agile, robust and responsive set of operations that can safely manage the growing needs of the projected increasingly complex, diverse set of air transportation system users and massive projected worldwide traffic rates. This new revolutionary technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work attempts to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations (TBO). Complex human factors interactions within NextGen will be analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if analyses were conducted in isolation. Suggestions will be made along with a proposal for future human factors research in the TBO safety-critical NextGen environment.

  15. A globally nonsingular quaternion-based formulation for all-electric satellite trajectory optimization

    Science.gov (United States)

    Libraro, Paola

    The general electric propulsion orbit-raising maneuver of a spacecraft must contend with four main limiting factors: the longer time of flight, multiple eclipses prohibiting continuous thrusting, long exposure to radiation from the Van Allen belt and the high power requirement of the electric engines. In order to optimize a low-thrust transfer with respect to these challenges, the choice of coordinates and corresponding equations of motion used to describe the kinematical and dynamical behavior of the satellite is of critical importance. This choice can potentially affect the numerical optimization process as well as limit the set of mission scenarios that can be investigated. To increase the ability to determine the feasible set of mission scenarios able to address the challenges of an all-electric orbit-raising, a set of equations free of any singularities is required to consider a completely arbitrary injection orbit. For this purpose a new quaternion-based formulation of a spacecraft translational dynamics that is globally nonsingular has been developed. The minimum-time low-thrust problem has been solved using the new set of equations of motion inside a direct optimization scheme in order to investigate optimal low-thrust trajectories over the full range of injection orbit inclinations between 0 and 90 degrees with particular focus on high inclinations. The numerical results consider a specific mission scenario in order to analyze three key aspects of the problem: the effect of the initial guess on the shape and duration of the transfer, the effect of Earth oblateness on transfer time and the roles played by radiation damage and power degradation in all-electric minimum-time transfers. Finally, trade-offs between mass and cost savings are introduced through a test case.
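    The appeal of quaternion formulations is that, unlike Euler-angle parameterizations, the kinematics q_dot = 0.5 * q ⊗ [0, ω] have no singular orientation. The generic propagation sketch below illustrates this property; it is standard attitude kinematics, not the paper's translational-dynamics formulation.

```python
import numpy as np

def quat_mult(q, p):
    """Hamilton product, scalar-first convention [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def propagate(q, omega, dt):
    """One explicit-Euler step of q_dot = 0.5 * q x [0, omega], followed by
    renormalization so the unit-norm constraint cannot drift. Well defined
    for every orientation: there is no gimbal-lock configuration."""
    q_dot = 0.5 * quat_mult(q, np.concatenate(([0.0], omega)))
    q_new = q + dt * q_dot
    return q_new / np.linalg.norm(q_new)

# Integrate a constant pi/2 rad/s z-rotation for 1 s (1000 Euler steps).
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(1000):
    q = propagate(q, np.array([0.0, 0.0, np.pi / 2]), 1e-3)
```

    After one second the quaternion matches the closed-form rotation of angle pi/2 about z, with the norm held at exactly one by the renormalization step.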

  16. Ranking of tree-ring based temperature reconstructions of the past millennium

    Science.gov (United States)

    Esper, Jan; Krusic, Paul J.; Ljungqvist, Fredrik C.; Luterbacher, Jürg; Carrer, Marco; Cook, Ed; Davi, Nicole K.; Hartl-Meier, Claudia; Kirdyanov, Alexander; Konter, Oliver; Myglan, Vladimir; Timonen, Mauri; Treydte, Kerstin; Trouet, Valerie; Villalba, Ricardo; Yang, Bao; Büntgen, Ulf

    2016-08-01

    Tree-ring chronologies are widely used to reconstruct high- to low-frequency variations in growing season temperatures over centuries to millennia. The relevance of these timeseries in large-scale climate reconstructions is often determined by the strength of their correlation against instrumental temperature data. However, this single criterion ignores several important quantitative and qualitative characteristics of tree-ring chronologies. Those characteristics are (i) data homogeneity, (ii) sample replication, (iii) growth coherence, (iv) chronology development, and (v) climate signal, including the correlation with instrumental data. Based on these 5 characteristics, a reconstruction-scoring scheme is proposed and applied to 39 published, millennial-length temperature reconstructions from Asia, Europe, North America, and the Southern Hemisphere. Results reveal that no reconstruction scores highest in every category; each has its own strengths and weaknesses. Reconstructions that perform better overall include N-Scan and Finland from Europe, E-Canada from North America, and Yamal and Dzhelo from Asia. Reconstructions performing less well include W-Himalaya and Karakorum from Asia, Tatra and S-Finland from Europe, and Great Basin from North America. By providing a comprehensive set of criteria to evaluate tree-ring chronologies we hope to improve the development of large-scale temperature reconstructions spanning the past millennium. All reconstructions and their corresponding scores are provided at http://www.blogs.uni-mainz.de/fb09climatology.
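    A multi-criteria scoring scheme of this kind can be sketched as a composite of the five criterion scores. The unweighted mean below is an assumption made for illustration; the paper defines its own scoring and does not necessarily weight the criteria equally.

```python
def score_reconstruction(criteria):
    """Composite score as the unweighted mean of the five criterion scores
    (data homogeneity, sample replication, growth coherence, chronology
    development, climate signal), each assumed to be on a common scale.
    Equal weighting is an illustrative assumption, not the paper's scheme."""
    keys = ("homogeneity", "replication", "coherence", "development", "signal")
    return sum(criteria[k] for k in keys) / len(keys)
```

    Ranking reconstructions by such a composite, rather than by instrumental correlation alone, is what allows a chronology strong in replication but weak in signal to be distinguished from the opposite case.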

  17. Homotopy based Surface Reconstruction with Application to Acoustic Signals

    DEFF Research Database (Denmark)

    Sharma, Ojaswa; Anton, François

    2011-01-01

    reconstruct information between any pair of successive cross sections are derived. The zero level set of the resulting homotopy field generates the desired surface. Four types of homotopies are suggested that are well suited to generate a smooth surface. We also provide derivation of necessary higher order...

  18. A Total Variation-Based Reconstruction Method for Dynamic MRI

    Directory of Open Access Journals (Sweden)

    Germana Landi

    2008-01-01

    Full Text Available In recent years, total variation (TV) regularization has become a popular and powerful tool for image restoration and enhancement. In this work, we apply TV minimization to improve the quality of dynamic magnetic resonance images. Dynamic magnetic resonance imaging is an increasingly popular clinical technique used to monitor spatio-temporal changes in tissue structure. Fast data acquisition is necessary in order to capture the dynamic process. Most commonly, the requirement of high temporal resolution is fulfilled by sacrificing spatial resolution. Therefore, the numerical methods have to address the issue of image reconstruction from limited Fourier data. One of the most successful techniques for dynamic imaging applications is the reduced-encoded imaging by generalized-series reconstruction method of Liang and Lauterbur. However, even if this method utilizes a priori data for optimal image reconstruction, the produced dynamic images are degraded by truncation artifacts, most notably Gibbs ringing, due to the spatial low resolution of the data. We use a TV regularization strategy in order to reduce these truncation artifacts in the dynamic images. The resulting TV minimization problem is solved by the fixed point iteration method of Vogel and Oman. The results of test problems with simulated and real data are presented to illustrate the effectiveness of the proposed approach in reducing the truncation artifacts of the reconstructed images.

  19. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic programming based parallel algorithm adapted to on-board heterogeneous computers for simulation based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called the multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation based trajectory optimization problems. These computers can be considered micro high performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation based approach can potentially give highly accurate results since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and use of OpenCL.
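The record describes a dynamic programming search over a discretized trajectory space but gives no algorithmic detail. Below is a minimal sketch of stagewise dynamic programming over a layered graph, with a toy Euclidean step cost standing in for the simulator-evaluated performance measure; the function name, stage layout, and cost are illustrative assumptions, not the paper's multi-spline representation.

```python
import math

def dp_min_cost_path(stages, cost):
    """Dynamic programming over a layered graph.
    stages: list of lists of node values (e.g. lateral positions per stage).
    cost(a, b): edge cost between nodes of consecutive stages."""
    best = [0.0] * len(stages[0])   # best cost to reach each node of stage 0
    back = []                       # backpointers per stage for path recovery
    for s in range(1, len(stages)):
        new, bp = [], []
        for b in stages[s]:
            c, arg = min((best[i] + cost(a, b), i)
                         for i, a in enumerate(stages[s - 1]))
            new.append(c)
            bp.append(arg)
        back.append(bp)
        best = new
    j = best.index(min(best))       # cheapest terminal node, then backtrack
    path = [j]
    for bp in reversed(back):
        j = bp[j]
        path.append(j)
    return min(best), list(reversed(path))

# toy example: 4 stages, 3 lateral positions; cost = Euclidean step length
stages = [[0, 1, 2]] * 4
total, path = dp_min_cost_path(stages, lambda a, b: math.hypot(1.0, b - a))
```

In the paper's setting each edge evaluation would be a full simulation run, which is what makes the parallel, heterogeneous-computer implementation worthwhile.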

  20. An Improved Backstepping-Based Controller for Three-Dimensional Trajectory Tracking of a Midwater Trawl System

    Directory of Open Access Journals (Sweden)

    Zhao Yan

    2016-01-01

    Full Text Available An improved backstepping control method for three-dimensional trajectory tracking of a midwater trawl system is investigated. A new mathematical model of the trawl system that accounts for the horizontal expansion effect of the two otter boards is presented based on the Newton-Euler method. Subsequently, an active path tracking strategy for the trawl system based on the backstepping method is proposed. To handle the nonstrict-feedback structure of the proposed model, a control allocation method and several parallel nonlinear PID (proportion integration differentiation) controllers are employed to eliminate the high-order state variables. Stability analysis using Lyapunov theory then shows that the proposed controller maintains the stability of the trawl system even in the presence of external disturbances. To validate the proposed controller, a simulation comparison with a linear PID controller was conducted. The simulation results illustrate that the improved backstepping controller is effective for three-dimensional trajectory tracking of the midwater trawl system.

  1. Error Propagation dynamics: from PIV-based pressure reconstruction to vorticity field calculation

    Science.gov (United States)

    Pan, Zhao; Whitehead, Jared; Richards, Geordie; Truscott, Tadd; USU Team; BYU Team

    2017-11-01

    Noninvasive data from velocimetry experiments (e.g., PIV) have been used to calculate vorticity and pressure fields. However, noise, error, or uncertainty in the PIV measurements propagates to the calculated pressure or vorticity field through the reconstruction schemes. Despite the vast applications of pressure and/or vorticity fields calculated from PIV measurements, studies on the error propagation from the velocity field to the reconstructed fields (PIV-pressure and PIV-vorticity) are few. In the current study, we break down the inherent connections between PIV-based pressure reconstruction and PIV-based vorticity calculation. Similar error propagation dynamics, which involve competition between physical properties of the flow and numerical errors from reconstruction schemes, are found in both PIV-pressure and PIV-vorticity reconstructions.

  2. Reflexive Language and Ethnic Minority Activism in Hong Kong: A Trajectory-Based Analysis

    Science.gov (United States)

    Pérez-Milans, Miguel; Soto, Carlos

    2016-01-01

    This article engages with Archer's call to further research on reflexivity and social change under conditions of late modernity (2007, 2010, 2012) from the perspective of existing work on reflexive discourse in the language disciplines (Silverstein 1976, Lucy 1993). Drawing from a linguistic ethnography of the networked trajectories of a group of…

  3. Trajectories of suicidal ideation in people seeking web-based help for suicidality

    DEFF Research Database (Denmark)

    Madsen, Trine; Van Spijker, Bregje; Karstoft, Karen Inge

    2016-01-01

    Background: Suicidal ideation (SI) is a common mental health problem. Variability in intensity of SI over time has been linked to suicidal behavior, yet little is known about the temporal course of SI.  Objective: The primary aim was to identify prototypical trajectories of SI in the general popu...

  4. Design Decisions in Developing Learning Trajectories-Based Assessments in Mathematics: A Case Study

    Science.gov (United States)

    Penuel, William R.; Confrey, Jere; Maloney, Alan; Rupp, André A.

    2014-01-01

    This article analyzes the design decisions of a team developing diagnostic assessments for a learning trajectory focused on rational number reasoning. The analysis focuses on the design rationale for key decisions about how to develop the cognitive assessments and related validity arguments within a fluid state and national policy context. The…

  5. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification

    International Nuclear Information System (INIS)

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy-cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification with UL-MBIR was high (0.67–0.89) compared to L-ASIR or UL-ASIR (0.11–0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818–0.860) was comparable to that for L-ASIR (0.696–0.844). The specificity was lower with UL-MBIR (0.79–0.92) than with L-ASIR or UL-ASIR (0.96–0.99), and a significant difference was seen for one reader (P < 0.01). In UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, attention should be paid to the slightly lower specificity.

  7. Accelerated gradient methods for total-variation-based CT image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Joergensen, Jakob H.; Hansen, Per Christian [Technical Univ. of Denmark, Lyngby (Denmark). Dept. of Informatics and Mathematical Modeling; Jensen, Tobias L.; Jensen, Soeren H. [Aalborg Univ. (Denmark). Dept. of Electronic Systems; Sidky, Emil Y.; Pan, Xiaochuan [Chicago Univ., Chicago, IL (United States). Dept. of Radiology

    2011-07-01

    Total-variation (TV)-based CT image reconstruction has been shown experimentally to be capable of producing accurate reconstructions from sparse-view data. In particular TV-based reconstruction is well suited for images with piecewise nearly constant regions. Computationally, however, TV-based reconstruction is demanding, especially for 3D imaging, and the reconstruction from clinical data sets is far from real-time. This is undesirable from a clinical perspective, and thus there is an incentive to accelerate the solution of the underlying optimization problem. The TV reconstruction can in principle be found by any optimization method, but in practice the large scale of the systems arising in CT image reconstruction precludes the use of memory-intensive methods such as Newton's method. The simple gradient method has much lower memory requirements, but exhibits prohibitively slow convergence. In the present work we address the question of how to reduce the number of gradient method iterations needed to achieve a high-accuracy TV reconstruction. We consider the use of two accelerated gradient-based methods, GPBB and UPN, to solve the 3D-TV minimization problem in CT image reconstruction. The former incorporates several heuristics from the optimization literature such as Barzilai-Borwein (BB) step size selection and nonmonotone line search. The latter uses a cleverly chosen sequence of auxiliary points to achieve a better convergence rate. The methods are memory efficient and equipped with a stopping criterion to ensure that the TV reconstruction has indeed been found. An implementation of the methods (in C with interface to Matlab) is available for download from http://www2.imm.dtu.dk/~pch/TVReg/. We compare the proposed methods with the standard gradient method, applied to a 3D test problem with synthetic few-view data. We find experimentally that for realistic parameters the proposed methods significantly outperform the standard gradient method. (orig.)
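The abstract names Barzilai-Borwein (BB) step size selection as one ingredient of GPBB. As a hedged sketch of plain BB gradient descent on a small quadratic test problem (without the nonmonotone line search or the TV objective of the paper), where the test problem and iteration count are assumptions:

```python
import numpy as np

def bb_gradient(grad, x0, iters=50):
    """Gradient descent with Barzilai-Borwein (BB1) step sizes;
    the first step uses a small fixed size to bootstrap s and y."""
    x_prev = x0
    g_prev = grad(x_prev)
    x = x_prev - 1e-3 * g_prev
    for _ in range(iters):
        g = grad(x)
        s = x - x_prev            # change in iterate
        y = g - g_prev            # change in gradient
        denom = s @ y
        step = (s @ s) / denom if denom > 0 else 1e-3   # BB1 step size
        x_prev, g_prev = x, g
        x = x - step * g
    return x

# minimize f(x) = 0.5 x^T A x - b^T x  (minimizer solves A x = b)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(2))
```

The BB step mimics a scalar approximation of the inverse Hessian, which is why it accelerates the plain gradient method at no extra memory cost.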

  8. Amplitude-based data selection for optimal retrospective reconstruction in micro-SPECT

    Science.gov (United States)

    Breuilly, M.; Malandain, G.; Guglielmi, J.; Marsault, R.; Pourcher, T.; Franken, P. R.; Darcourt, J.

    2013-04-01

    Respiratory motion can blur the tomographic reconstruction of positron emission tomography or single-photon emission computed tomography (SPECT) images, which subsequently impairs quantitative measurements, e.g. in the upper abdomen area. Respiratory signal phase-based gated reconstruction addresses this problem, but deteriorates the signal-to-noise ratio (SNR) and other intensity-based quality measures. This paper proposes a 3D reconstruction method dedicated to micro-SPECT imaging of mice. From a 4D acquisition, the phase images exhibiting motion are identified and the associated list-mode data are discarded, which enables the reconstruction of a 3D image without respiratory artefacts. The proposed method allows a motion-free reconstruction exhibiting both satisfactory count statistics and measurement accuracy. With respect to standard 3D reconstruction (non-gated 3D reconstruction) without breathing motion correction, an increase of 14.6% of the mean standardized uptake value has been observed, while, with respect to a gated 4D reconstruction, up to 60% less noise and an increase of up to 124% of the SNR have been demonstrated.

  9. High-performance simulation-based algorithms for an alpine ski racer’s trajectory optimization in heterogeneous computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2014-09-01

    Full Text Available Effective, simulation-based trajectory optimization algorithms adapted to heterogeneous computers are studied with reference to a problem taken from alpine ski racing (the presented solution is probably the most general one published so far). The key idea behind these algorithms is to use a grid-based discretization scheme to transform the continuous optimization problem into a search problem over a specially constructed finite graph, and then to apply dynamic programming to find an approximation of the global solution. In the analyzed example this is the minimum-time ski line, represented as a piecewise-linear function (a method of eliminating unfeasible solutions is proposed). Serial and parallel versions of the basic optimization algorithm are presented in detail (pseudo-code, time and memory complexity). Possible extensions of the basic algorithm are also described. The implementation of these algorithms is based on OpenCL. The included experimental results show that contemporary heterogeneous computers can be treated as μ-HPC platforms: they offer high performance (the best speedup was equal to 128) while remaining energy and cost efficient (which is crucial in embedded systems, e.g., trajectory planners of autonomous robots). The presented algorithms can be applied to many trajectory optimization problems, including those having a black-box represented performance measure.

  10. Adaptive robust motion trajectory tracking control of pneumatic cylinders with LuGre model-based friction compensation

    Science.gov (United States)

    Meng, Deyuan; Tao, Guoliang; Liu, Hao; Zhu, Xiaocong

    2014-07-01

    Friction compensation is particularly important for motion trajectory tracking control of pneumatic cylinders at low speed movement. However, most of the existing model-based friction compensation schemes use simple classical models, which are not enough to address applications with high-accuracy position requirements. Furthermore, the friction force in the cylinder is time-varying, and there exist rather severe unmodelled dynamics and unknown disturbances in the pneumatic system. To deal with these problems effectively, an adaptive robust controller with LuGre model-based dynamic friction compensation is constructed. The proposed controller employs on-line recursive least squares estimation (RLSE) to reduce the extent of parametric uncertainties, and utilizes the sliding mode control method to attenuate the effects of parameter estimation errors, unmodelled dynamics and disturbances. In addition, in order to realize LuGre model-based friction compensation, the modified dual-observer structure for estimating immeasurable friction internal state is developed. Therefore, a prescribed motion tracking transient performance and final tracking accuracy can be guaranteed. Since the system model uncertainties are unmatched, the recursive backstepping design technology is applied. In order to solve the conflicts between the sliding mode control design and the adaptive control design, the projection mapping is used to condition the RLSE algorithm so that the parameter estimates are kept within a known bounded convex set. Finally, the proposed controller is tested for tracking sinusoidal trajectories and smooth square trajectory under different loads and sudden disturbance. The testing results demonstrate that the achievable performance of the proposed controller is excellent and is much better than most other studies in literature. Especially when a 0.5 Hz sinusoidal trajectory is tracked, the maximum tracking error is 0.96 mm and the average tracking error is 0.45 mm. 
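For readers unfamiliar with the LuGre model referenced in the record above, the following is a hedged sketch of its standard form (internal bristle state z, Stribeck curve g(v)); the parameter values are invented for illustration and are not those of the pneumatic system in the paper.

```python
import math

def lugre_force(v_profile, dt=1e-4, sigma0=1e5, sigma1=300.0, sigma2=0.4,
                Fc=1.0, Fs=1.5, vs=0.01):
    """Simulate the LuGre friction model for a given velocity profile.
    z is the average bristle deflection, i.e. the immeasurable internal
    friction state that the paper's dual-observer structure estimates."""
    z, forces = 0.0, []
    for v in v_profile:
        g = Fc + (Fs - Fc) * math.exp(-(v / vs) ** 2)   # Stribeck curve
        zdot = v - sigma0 * abs(v) / g * z              # bristle dynamics
        z += zdot * dt                                  # explicit Euler step
        forces.append(sigma0 * z + sigma1 * zdot + sigma2 * v)
    return forces

# constant sliding velocity: the force settles near the steady-state value
# g(v) + sigma2 * v  (here about 1.0 + 0.4 * 0.05 = 1.02)
F = lugre_force([0.05] * 20000)
```

At steady state zdot vanishes, so the force reduces to the Stribeck level plus viscous friction, which is the regime where classical static models and LuGre agree; the dynamic state matters at velocity reversals and low speeds, exactly where the paper reports classical compensation falls short.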

  11. Trajectory Browser Website

    Science.gov (United States)

    Foster, Cyrus; Jaroux, Belgacem A.

    2012-01-01

    The Trajectory Browser is a web-based tool developed at the NASA Ames Research Center to be used for the preliminary assessment of trajectories to small-bodies and planets and for providing relevant launch date, time-of-flight and delta-V requirements. The site hosts a database of transfer trajectories from Earth to asteroids and planets for various types of missions such as rendezvous, sample return or flybys. A search engine allows the user to find trajectories meeting desired constraints on the launch window, mission duration and delta-V capability, while a trajectory viewer tool allows the visualization of the heliocentric trajectory and the detailed mission itinerary. The anticipated user base of this tool consists primarily of scientists and engineers designing interplanetary missions in the context of pre-phase A studies, particularly for performing accessibility surveys to large populations of small-bodies. The educational potential of the website is also recognized for academia and the public with regard to trajectory design, a field that has generally been poorly understood by the public. The website is currently hosted on NASA-internal URL http://trajbrowser.arc.nasa.gov/ with plans for a public release as soon as development is complete.

  12. Low-dose CT image reconstruction using gain intervention-based dictionary learning

    Science.gov (United States)

    Pathak, Yadunath; Arya, K. V.; Tiwari, Shailendra

    2018-05-01

    The computed tomography (CT) approach is extensively utilized in clinical diagnoses. However, X-ray exposure of the human body may cause somatic damage such as cancer. Owing to this radiation risk, research has focused on the radiation exposure delivered to patients through CT investigations. Therefore, low-dose CT has become a significant research area. Many researchers have proposed different low-dose CT reconstruction techniques. But these techniques suffer from various issues such as over-smoothing, artifacts, noise, etc. Therefore, in this paper, we have proposed a novel integrated low-dose CT reconstruction technique. The proposed technique utilizes global dictionary-based statistical iterative reconstruction (GDSIR) and adaptive dictionary-based statistical iterative reconstruction (ADSIR)-based reconstruction techniques. In case the dictionary (D) is predetermined, then GDSIR can be used, and if D is adaptively defined then ADSIR is the appropriate choice. The gain intervention-based filter is also used as a post-processing technique for removing the artifacts from low-dose CT reconstructed images. Experiments have been done by considering the proposed and other low-dose CT reconstruction techniques on well-known benchmark CT images. Extensive experiments have shown that the proposed technique outperforms the available approaches.

  13. A compressed sensing based approach on Discrete Algebraic Reconstruction Technique.

    Science.gov (United States)

    Demircan-Tureyen, Ezgi; Kamasak, Mustafa E

    2015-01-01

    Discrete tomography (DT) techniques are capable of computing better results, even using fewer projections than continuous tomography techniques. The Discrete Algebraic Reconstruction Technique (DART) is an iterative reconstruction method proposed to achieve this goal by exploiting prior knowledge of the gray levels and assuming that the scanned object is composed of a few different densities. In this paper, the DART method is combined with an initial total variation minimization (TvMin) phase to ensure a better initial guess and extended with a segmentation procedure in which the threshold values are estimated from a finite set of candidates to minimize both the projection error and the total variation (TV) simultaneously. The accuracy and robustness of the algorithm are compared with the original DART by simulation experiments under (1) limited number of projections, (2) limited view and (3) noisy projection conditions.
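DART builds on algebraic reconstruction. As a hedged illustration of its ART/Kaczmarz building block followed by a gray-level segmentation step (a drastic simplification for a two-pixel toy system, not the authors' algorithm), consider:

```python
import numpy as np

def kaczmarz(A, b, x0, sweeps=50):
    """ART/Kaczmarz row-action sweeps for the projection system A x = b:
    each update projects x onto the hyperplane of one measurement."""
    x = x0.astype(float).copy()
    for _ in range(sweeps):
        for a, bi in zip(A, b):
            x += (bi - a @ x) / (a @ a) * a
    return x

# toy "scan": two projections of a two-pixel object with gray levels {0, 1}
A = np.array([[1.0, 1.0],    # sum of both pixels
              [1.0, -1.0]])  # difference of the pixels
b = np.array([1.0, 1.0])     # consistent with the object (1, 0)
x = kaczmarz(A, b, np.zeros(2))

# DART-style discretization: snap each pixel to the nearest known gray level
gray_levels = np.array([0.0, 1.0])
segmented = gray_levels[np.abs(x[:, None] - gray_levels).argmin(axis=1)]
```

DART proper alternates such continuous updates with segmentation and re-solves only for boundary pixels; the snippet shows just the two ingredients being alternated.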

  14. Anisotropic Diffusion based Brain MRI Segmentation and 3D Reconstruction

    OpenAIRE

    M. Arfan Jaffar; Sultan Zia; Ghaznafar Latif; AnwarM. Mirza; Irfan Mehmood; Naveed Ejaz; Sung Wook Baik

    2012-01-01

    In the medical field, visualization of organs is imperative for accurate diagnosis and treatment of any disease. Brain tumor diagnosis and surgery also require impressive 3D visualization of the brain for the radiologist. Detection and 3D reconstruction of brain tumors from MRI is a computationally time consuming and error-prone task. The proposed system detects and presents a 3D visualization model of the brain and the tumor inside, which greatly helps the radiologist to effectively diagnose and ...

  15. Sound field reconstruction based on the acousto-optic effect

    DEFF Research Database (Denmark)

    Torras Rosell, Antoni; Barrera Figueroa, Salvador; Jacobsen, Finn

    2011-01-01

    be measured with a laser Doppler vibrometer; furthermore, it can be exploited to characterize an arbitrary sound field using tomographic techniques. This paper briefly reviews the fundamental principles governing the acousto-optic effect in air, and presents an investigation of the tomographic reconstruction...... within the audible frequency range by means of simulations and experimental results. The good agreement observed between simulations and measurements is further confirmed with representations of the sound field obtained with traditional microphone array measurements....

  16. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms

    International Nuclear Information System (INIS)

    Tang Jie; Nett, Brian E; Chen Guanghong

    2009-01-01

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms for a constant undersampling factor comparing different algorithms at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.

  17. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    International Nuclear Information System (INIS)

    Chen, G; Pan, X; Stayman, J; Samei, E

    2014-01-01

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical

  18. Time Reversal Reconstruction Algorithm Based on PSO Optimized SVM Interpolation for Photoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Mingjian Sun

    2015-01-01

    Full Text Available Photoacoustic imaging is an innovative technique for imaging biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on particle swarm optimization (PSO) optimized support vector machine (SVM) interpolation is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of the nearest neighbor interpolation, linear interpolation, and cubic convolution interpolation based time reversal algorithms, providing higher imaging quality while using significantly fewer measurement positions or scanning times.
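The record combines PSO-tuned SVM interpolation with time reversal but details neither. Below is a minimal generic particle swarm optimizer of the kind used for such hyperparameter tuning, demonstrated on a 2D quadratic; the coefficients (w, c1, c2, swarm size, bounds) are conventional textbook defaults, not values from the paper.

```python
import random

def pso(f, dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimization: particles track their personal
    best and the global best, with inertia w and attraction weights c1, c2."""
    random.seed(0)
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    gbest = pbest[pval.index(min(pval))][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pval[i]:               # update personal best
                pval[i], pbest[i] = val, xs[i][:]
                if val < f(gbest):          # update global best
                    gbest = xs[i][:]
    return gbest

# quadratic bowl with minimum at (1, 2)
best = pso(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2, dim=2)
```

In the paper's setting, f would wrap a cross-validation score of the SVM interpolator over its hyperparameters rather than an analytic function.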

  19. A Spectral Reconstruction Algorithm of Miniature Spectrometer Based on Sparse Optimization and Dictionary Learning.

    Science.gov (United States)

    Zhang, Shang; Dong, Yuhan; Fu, Hongyan; Huang, Shao-Lun; Zhang, Lin

    2018-02-22

    The miniaturization of spectrometers can broaden the application area of spectrometry, which has huge academic and industrial value. Among various miniaturization approaches, filter-based miniaturization is a promising implementation that utilizes broadband filters with distinct transmission functions. Mathematically, filter-based spectral reconstruction can be modeled as solving a system of linear equations. In this paper, we propose an algorithm for spectral reconstruction based on sparse optimization and dictionary learning. To verify the feasibility of the reconstruction algorithm, we design and implement a simple prototype of a filter-based miniature spectrometer. The experimental results demonstrate that sparse optimization is well applicable to spectral reconstruction whether the spectra are directly sparse or not. As for the non-directly sparse spectra, their sparsity can be enhanced by dictionary learning. In conclusion, the proposed approach has a bright application prospect in fabricating a practical miniature spectrometer.

  1. Detecting phase singularities and rotor center trajectories based on the Hilbert transform of intraatrial electrograms in an atrial voxel model

    Directory of Open Access Journals (Sweden)

    Unger Laura Anna

    2015-09-01

    Full Text Available This work aimed at the detection of rotor centers within the atrial cavity during atrial fibrillation on the basis of phase singularities. A voxel based method was established which employs the Hilbert transform and the phase of unipolar electrograms. The method provides a 3D overview of phase singularities at the endocardial surface and within the blood volume. Mapping those phase singularities from the inside of the atria at the endocardium yielded rotor center trajectories. We discuss the results for an unstable and a more stable rotor. The side length of the areas covered by the trajectories varied from 1.5 mm to 10 mm. These results are important for cardiologists who target rotors with RF ablation in order to cure atrial fibrillation.
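The phase used for singularity detection is that of the analytic signal of each unipolar electrogram. A minimal FFT-based Hilbert transform computing this instantaneous phase (the per-signal building block only; the 3D singularity search is not shown) can be sketched as:

```python
import numpy as np

def instantaneous_phase(x):
    """Phase of the analytic signal via the FFT-based Hilbert transform:
    zero out negative frequencies, double positive ones, take the angle."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0                  # Nyquist bin for even length
    analytic = np.fft.ifft(X * h)
    return np.angle(analytic)

# pure sinusoid: the recovered phase advances linearly at the signal frequency
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
phase = instantaneous_phase(np.cos(2 * np.pi * 5 * t))
```

A phase singularity is then a point around which this phase accumulates a full 2π along a closed path, which is the criterion applied per voxel in the model.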

  2. Spatial and Temporal Extrapolation of Disdrometer Size Distributions Based on a Lagrangian Trajectory Model of Falling Rain

    Science.gov (United States)

    Lane, John E.; Kasparis, Takis; Jones, W. Linwood; Metzger, Philip T.

    2009-01-01

    Methodologies to improve disdrometer processing, loosely based on mathematical techniques common to the field of particle flow and fluid mechanics, are examined and tested. The inclusion of advection and vertical wind field estimates appears to produce significantly improved results in a Lagrangian hydrometeor trajectory model, in spite of the very strict assumptions of non-interacting hydrometeors, constant vertical air velocity, and time-independent advection during the scan time interval. Wind field data can be extracted from each radar elevation scan by plotting and analyzing reflectivity contours over the disdrometer site and by collecting the radar radial velocity data to obtain estimates of advection. Specific regions of disdrometer spectra (drop size versus time) often exhibit strong gravitational sorting signatures, from which estimates of vertical velocity can be extracted. These independent wind field estimates become inputs and initial conditions for the Lagrangian trajectory simulation of falling hydrometeors.
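
    Under the abstract's simplifying assumptions (non-interacting drops, constant vertical air velocity, time-independent advection), a single hydrometeor's trajectory is easy to integrate. A hedged sketch with hypothetical values (a 2 mm drop with a terminal fall speed of roughly 6.5 m/s, released at 3 km):

```python
def fall_trajectory(z0, u_adv, w_air, v_t, dt=0.1):
    """Euler-integrate a drop's path from release height z0 (m) to the ground.

    u_adv : horizontal advection velocity (m/s), assumed time independent
    w_air : constant vertical air velocity (m/s, positive up)
    v_t   : terminal fall speed of the drop (m/s)
    Returns (horizontal displacement in m, fall time in s).
    """
    x, z, t = 0.0, z0, 0.0
    while z > 0.0:
        z += (w_air - v_t) * dt     # net fall speed = terminal speed minus updraft
        x += u_adv * dt             # drift with the horizontal wind
        t += dt
    return x, t

# Where does a drop seen aloft by radar actually land relative to the disdrometer?
x_land, t_fall = fall_trajectory(z0=3000.0, u_adv=10.0, w_air=0.0, v_t=6.5)
```

    Running the model backward from the disdrometer is what lets the drop size spectrum be extrapolated to the radar observation volume aloft.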

  3. Rapid space trajectory generation using a Fourier series shape-based approach

    Science.gov (United States)

    Taheri, Ehsan

    With the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipment. In the field of space trajectory optimization, fundamental advances in using low-thrust propulsion and exploiting multi-body dynamics have played a pivotal role in designing efficient space mission trajectories. The former provides a larger cumulative momentum change than conventional chemical propulsion, whereas the latter yields almost ballistic trajectories requiring negligible amounts of propellant. However, space trajectory design translates into an optimal control problem that is, in general, time-consuming and very difficult to solve. The goal of this thesis is therefore to develop a methodology that simplifies and facilitates the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution not only provides mission designers with a better understanding of the problem and solution but also serves as a good initial guess for high-fidelity optimal control solvers and increases their convergence rate. Almost all high-fidelity solvers benefit from an initial guess that already satisfies the equations of motion and the most important constraints. Despite the nonlinear nature of the problem, we seek a robust technique for a wide range of typical low-thrust transfers with reduced computational cost. Another important aspect of the developed methodology is the representation of low-thrust trajectories by Fourier series, which significantly reduces the number of design variables. Emphasis is placed on simplifying the equations of motion as far as possible and on avoiding approximation of the controls. These choices contribute to speeding up the solution-finding procedure. Several example
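
    The Fourier-series representation mentioned above can be illustrated by fitting a truncated series to one sampled trajectory component with ordinary least squares; the shape below is synthetic and the harmonic count is an arbitrary choice, not the thesis's parameterization.

```python
import numpy as np

def fourier_design(t, n_harm, T):
    """Design matrix [1, cos(2πkt/T), sin(2πkt/T)] for k = 1..n_harm."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harm + 1):
        w = 2.0 * np.pi * k * t / T
        cols += [np.cos(w), np.sin(w)]
    return np.column_stack(cols)

T = 1.0
t = np.linspace(0.0, T, 200)
# Sampled radius history of a hypothetical transfer (two harmonics of the period)
r = 1.0 + 0.1 * np.sin(2 * np.pi * t / T) + 0.02 * np.cos(6 * np.pi * t / T)
Phi = fourier_design(t, n_harm=3, T=T)
coeffs, *_ = np.linalg.lstsq(Phi, r, rcond=None)  # 7 design variables instead of 200 samples
r_fit = Phi @ coeffs
```

    The 200 trajectory samples collapse to 2·3 + 1 = 7 coefficients, which is the sense in which the shape-based approach shrinks the design space handed to the optimizer.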

  4. CT of the chest with model-based, fully iterative reconstruction: comparison with adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Ichikawa, Yasutaka; Kitagawa, Kakuya; Nagasawa, Naoki; Murashima, Shuichi; Sakuma, Hajime

    2013-08-09

    The recently developed model-based iterative reconstruction (MBIR) enables significant reduction of image noise and artifacts, compared with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP). The purpose of this study was to evaluate the lesion detectability of low-dose chest computed tomography (CT) with MBIR in comparison with ASIR and FBP. Chest CT was acquired with 64-slice CT (Discovery CT750HD) under standard-dose (5.7 ± 2.3 mSv) and low-dose (1.6 ± 0.8 mSv) conditions in 55 patients (aged 72 ± 7 years) who were suspected of lung disease on chest radiographs. Low-dose CT images were reconstructed with MBIR, ASIR 50% and FBP, and standard-dose CT images were reconstructed with FBP, using a reconstructed slice thickness of 0.625 mm. Two observers evaluated the image quality of abnormal lung and mediastinal structures on a 5-point scale (score 5 = excellent, score 1 = non-diagnostic). The objective image noise was also measured as the standard deviation of CT intensity in the descending aorta. The image quality score of enlarged mediastinal lymph nodes on low-dose MBIR CT (4.7 ± 0.5) was significantly improved in comparison with low-dose FBP and ASIR CT (3.0 ± 0.5, p = 0.004; 4.0 ± 0.5, p = 0.02, respectively), and was nearly identical to the score of the standard-dose FBP image (4.8 ± 0.4, p = 0.66). Concerning decreased lung attenuation (bulla, emphysema, or cyst), the image quality score on low-dose MBIR CT (4.9 ± 0.2) was slightly better than on low-dose FBP and ASIR CT (4.5 ± 0.6, p = 0.01; 4.6 ± 0.5, p = 0.01, respectively). There were no significant differences in the image quality scores for visualization of consolidation or mass, ground-glass attenuation, or reticular opacity among the low- and standard-dose CT series. Image noise with low-dose MBIR CT (11.6 ± 1.0 Hounsfield units (HU)) was significantly lower than with low-dose ASIR CT (21.1 ± 2.6 HU) and standard-dose FBP CT (16.6 ± 2.3 HU). Even at a dose reduction of about 70%, MBIR can provide

  5. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding comparable image quality as ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing
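
    A compact parallel-beam analogue of the filtered back-projection idea (ramp filtering of each projection in Fourier space, then back-projection) can be sketched in a few lines; this is the classical 2D algorithm, not the paper's 3D variant with curved back-projection lines for coherent scatter data.

```python
import numpy as np

def radon(img, thetas):
    """Parallel-beam sinogram: rotate the sampling grid, sum columns (nearest neighbour)."""
    n = img.shape[0]
    c = np.arange(n) - (n - 1) / 2.0
    X, Y = np.meshgrid(c, c)
    sino = np.empty((len(thetas), n))
    for i, th in enumerate(thetas):
        Xr = X * np.cos(th) + Y * np.sin(th)
        Yr = -X * np.sin(th) + Y * np.cos(th)
        ix = np.clip(np.round(Xr + (n - 1) / 2.0).astype(int), 0, n - 1)
        iy = np.clip(np.round(Yr + (n - 1) / 2.0).astype(int), 0, n - 1)
        sino[i] = img[iy, ix].sum(axis=0)
    return sino

def fbp(sino, thetas):
    """Ramp-filter each projection in Fourier space, then back-project."""
    n = sino.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))                 # ideal ramp filter |w|
    filt = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))
    c = np.arange(n) - (n - 1) / 2.0
    X, Y = np.meshgrid(c, c)
    recon = np.zeros((n, n))
    for i, th in enumerate(thetas):
        s = X * np.cos(th) + Y * np.sin(th)          # detector coordinate hit by each pixel
        idx = np.clip(np.round(s + (n - 1) / 2.0).astype(int), 0, n - 1)
        recon += filt[i][idx]
    return recon * np.pi / len(thetas)

n = 64
c = np.arange(n) - (n - 1) / 2.0
X, Y = np.meshgrid(c, c)
phantom = (X**2 + Y**2 <= 12.0**2).astype(float)     # centred disk phantom
thetas = np.linspace(0.0, np.pi, 90, endpoint=False)
recon = fbp(radon(phantom, thetas), thetas)
```

    The speed advantage quoted in the abstract comes from exactly this structure: one filtering pass and one back-projection pass, instead of the repeated forward/backward projections of an algebraic (ART) scheme.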

  6. Optimization-based reconstruction for reduction of CBCT artifact in IGRT

    Science.gov (United States)

    Xia, Dan; Zhang, Zheng; Paysan, Pascal; Seghers, Dieter; Brehm, Marcus; Munro, Peter; Sidky, Emil Y.; Pelizzari, Charles; Pan, Xiaochuan

    2016-04-01

    Kilo-voltage cone-beam computed tomography (CBCT) plays an important role in image-guided radiation therapy (IGRT) by providing 3D spatial information about the tumor that is potentially useful for optimizing treatment planning. In current IGRT CBCT systems, reconstructed images obtained with analytic algorithms, such as the FDK algorithm and its variants, may contain artifacts. In an attempt to compensate for the artifacts, we investigate optimization-based reconstruction algorithms such as the ASD-POCS algorithm for potentially reducing artifacts in IGRT CBCT images. In this study, using data acquired with a physical phantom and a patient subject, we demonstrate that ASD-POCS reconstruction can significantly reduce the artifacts observed in clinical reconstructions. Moreover, patient images reconstructed by use of the ASD-POCS algorithm show a soft-tissue contrast level improved over that of the clinical reconstruction. We have also performed reconstructions from sparse-view data, and observe that, for current clinical imaging conditions, ASD-POCS reconstructions from data collected at one half of the current clinical projection views appear to show image quality, in terms of spatial and soft-tissue-contrast resolution, higher than that of the corresponding clinical reconstructions.
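
    The alternating structure of ASD-POCS, a data-consistency projection step followed by adaptive steepest descent on the image's total variation (TV), can be sketched as follows. This is a simplified toy on a tiny synthetic system, not the full ASD-POCS algorithm or a CBCT geometry.

```python
import numpy as np

def tv_gradient(u, eps=1e-8):
    """Gradient of the smoothed TV seminorm sum sqrt(dx^2 + dy^2 + eps)."""
    dxu = np.zeros_like(u); dxu[:, :-1] = u[:, 1:] - u[:, :-1]
    dyu = np.zeros_like(u); dyu[:-1, :] = u[1:, :] - u[:-1, :]
    m = np.sqrt(dxu**2 + dyu**2 + eps)
    g = -(dxu + dyu) / m
    g[:, 1:] += dxu[:, :-1] / m[:, :-1]
    g[1:, :] += dyu[:-1, :] / m[:-1, :]
    return g

def asd_pocs(A, b, shape, n_outer=300, n_tv=3):
    """Simplified ASD-POCS: a SIRT-like data step, nonnegativity, then TV descent
    with a step size tied to the size of the data update (a sketch only)."""
    x = np.zeros(A.shape[1])
    row_norm = (A**2).sum(axis=1) + 1e-12
    for _ in range(n_outer):
        upd = A.T @ ((b - A @ x) / row_norm) / A.shape[0]   # move toward Ax = b
        x = np.maximum(x + upd, 0.0)                        # nonnegativity constraint
        dtv = 0.2 * np.linalg.norm(upd)                     # adaptive TV step size
        u = x.reshape(shape)
        for _ in range(n_tv):
            g = tv_gradient(u)
            u = u - dtv * g / (np.linalg.norm(g) + 1e-12)   # normalized descent step
        x = u.ravel()
    return x

rng = np.random.default_rng(2)
shape = (8, 8)
x_true = np.zeros(shape)
x_true[2:6, 2:6] = 1.0                         # piecewise-constant phantom
A = rng.normal(size=(40, 64))                  # underdetermined "projection" matrix
b = A @ x_true.ravel()
x_hat = asd_pocs(A, b, shape)
```

    The TV term is what suppresses streak-like artifacts and supports the sparse-view reconstructions mentioned in the abstract, since piecewise-smooth images remain consistent with far fewer measured views.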

  7. Temporalis Myofascial Flap for Primary Cranial Base Reconstruction after Tumor Resection

    OpenAIRE

    Eldaly, Ahmed; Magdy, Emad A.; Nour, Yasser A.; Gaafar, Alaa H.

    2008-01-01

    Objective: To evaluate the use of the temporalis myofascial flap in primary cranial base reconstruction following surgical tumor ablation and to explain technical issues, potential complications, and donor site consequences along with their management. Design: Retrospective case series. Setting: Tertiary referral center. Participants: Forty-one consecutive patients receiving primary temporalis myofascial flap reconstructions following cranial base tumor resections in a 4-year period. Main Out...

  8. GNSS troposphere tomography based on two-step reconstructions using GPS observations and COSMIC profiles

    Directory of Open Access Journals (Sweden)

    P. Xia

    2013-10-01

    Full Text Available Traditionally, balloon-based radiosonde soundings are used to study the spatial distribution of atmospheric water vapour. However, this approach cannot be employed frequently because of its high cost. In contrast, the GPS tomography technique can retrieve water vapour at high temporal resolution. In the tomography technique, an iterative or non-iterative reconstruction algorithm is usually utilised to overcome the rank deficiency of the observation equations for water vapour inversion. However, each algorithm on its own has limitations: the iterative reconstruction algorithm requires accurate initial values of water vapour, while the non-iterative reconstruction algorithm needs proper constraint conditions. To overcome these drawbacks, we present a combined iterative and non-iterative reconstruction approach for three-dimensional (3-D) water vapour inversion using GPS observations and COSMIC profiles. In this approach, the non-iterative reconstruction algorithm is first used to estimate water vapour density based on a priori water vapour information derived from COSMIC radio occultation data. The estimates are then employed as initial values in the iterative reconstruction algorithm. The largest advantage of this approach is that the precise initial values of water vapour density that are essential to the iterative reconstruction algorithm can be obtained. This combined reconstruction algorithm (CRA) is evaluated using 10 days of GPS observations in Hong Kong and COSMIC profiles. The test results indicate that the water vapour accuracy of the CRA is 16 and 14% higher than that of the iterative and non-iterative reconstruction approaches, respectively. In addition, the tomography results obtained from the CRA are further validated using radiosonde data. Results indicate that water vapour densities derived from the CRA agree very well with radiosonde results at altitudes above 2.5 km. The average RMS value of their

  9. Reconstruction of a digital core containing clay minerals based on a clustering algorithm

    Science.gov (United States)

    He, Yanlong; Pu, Chunsheng; Jing, Cheng; Gu, Xiaoyu; Chen, Qingdong; Liu, Hongzhi; Khan, Nasir; Dong, Qiaoling

    2017-10-01

    It is difficult to obtain core samples and related information for digital core reconstruction of mature sandstone reservoirs around the world, especially for unconsolidated sandstone reservoirs. Meanwhile, the reconstruction and division of clay minerals play a vital role in digital core reconstruction, since two-dimensional data-based reconstruction methods are the microstructure simulation methods most applicable to sandstone reservoirs; the reconstruction of various clay minerals in digital cores nevertheless remains a research challenge. In the present work, the content of clay minerals was considered on the basis of two-dimensional information about the reservoir. After application of the hybrid method, the output was a digital core containing clay clusters without labels for cluster number, size, or texture; compared with a model reconstructed by the process-based method, the statistics and geometry of the reconstructed model were similar to those of the reference model. In addition, the Hoshen-Kopelman algorithm was used to label the various connected, unclassified clay clusters in the initial model, and the number and size of the clay clusters were recorded. The K-means clustering algorithm was then applied to divide the labeled, large connected clusters into smaller clusters on the basis of differences in the clusters' characteristics. According to the clay minerals' characteristics, such as type, texture, and distribution, the digital core containing clay minerals was reconstructed by means of the clustering algorithm and a judgment of the clay clusters' structure. The distributions and textures of the clay minerals in the digital core were reasonable. The clustering algorithm improved the digital core reconstruction and provides an alternative method for simulating different clay minerals in digital cores.
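
    The Hoshen-Kopelman labelling step mentioned above is essentially a union-find pass over the grid. A minimal 2D, 4-connected sketch on a toy binary "clay" map (real digital cores are 3D volumes):

```python
import numpy as np

def label_clusters(grid):
    """Hoshen-Kopelman-style labelling of 4-connected clusters in a binary grid."""
    labels = np.zeros_like(grid, dtype=int)
    parent = [0]                                  # union-find forest; index 0 unused

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]         # path halving
            a = parent[a]
        return a

    nxt = 1
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            if not grid[i, j]:
                continue
            up = labels[i - 1, j] if i > 0 else 0
            left = labels[i, j - 1] if j > 0 else 0
            if up == 0 and left == 0:             # new cluster seed
                parent.append(nxt)
                labels[i, j] = nxt
                nxt += 1
            elif up and left:                     # merge the two neighbouring clusters
                ru, rl = find(up), find(left)
                parent[max(ru, rl)] = min(ru, rl)
                labels[i, j] = min(ru, rl)
            else:                                 # extend the single occupied neighbour
                labels[i, j] = find(up or left)
    # second pass: flatten to compact labels and count distinct clusters
    roots = {}
    for i in range(rows):
        for j in range(cols):
            if labels[i, j]:
                r = find(labels[i, j])
                roots.setdefault(r, len(roots) + 1)
                labels[i, j] = roots[r]
    return labels, len(roots)

clay = np.array([[1, 1, 0, 0],
                 [0, 1, 0, 1],
                 [0, 0, 0, 1],
                 [1, 0, 0, 0]], dtype=int)
labels, n_clusters = label_clusters(clay)
```

    The per-cluster sizes recorded here are what a subsequent K-means step could use to split overly large connected clusters, as in the paper's workflow.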

  11. Multi-criteria ACO-based Algorithm for Ship’s Trajectory Planning

    OpenAIRE

    Agnieszka Lazarowska

    2017-01-01

    The paper presents a new approach to solving the path planning problem for ships in environments with static and dynamic obstacles. The algorithm utilizes a heuristic method from the group of Swarm Intelligence approaches called Ant Colony Optimization, which is inspired by the collective behaviour of ant colonies. A group of agents (artificial ants) searches the solution space for a safe, optimal trajectory for a ship. The problem is considered as a ...
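
    The Ant Colony Optimization procedure described above can be sketched on a toy weighted graph; the parameter values and the graph are illustrative, and the paper's collision constraints with moving obstacles are omitted.

```python
import random

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=50,
                      alpha=1.0, beta=2.0, rho=0.5, seed=1):
    """Toy Ant Colony Optimization for a short path on a weighted digraph.
    graph: {node: {neighbour: edge_cost}}. A sketch of the method only."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}    # pheromone per edge
    best_path, best_cost = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            node, path, visited = src, [src], {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:                              # dead end: abandon this ant
                    path = None
                    break
                # transition rule: pheromone^alpha * (1/edge_cost)^beta
                weights = [tau[(node, v)] ** alpha * (1.0 / graph[node][v]) ** beta
                           for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                path.append(node); visited.add(node)
            if path is not None:
                cost = sum(graph[a][b] for a, b in zip(path, path[1:]))
                tours.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        for e in tau:                                        # evaporation
            tau[e] *= 1.0 - rho
        for path, cost in tours:                             # deposit, stronger on short tours
            for e in zip(path, path[1:]):
                tau[e] += 1.0 / cost
    return best_path, best_cost

# Small graph standing in for a discretized sea area with an obstacle detour
g = {"A": {"B": 1.0, "C": 4.0}, "B": {"D": 2.0, "C": 1.0},
     "C": {"D": 1.0}, "D": {}}
path, cost = aco_shortest_path(g, "A", "D")
```

    Pheromone reinforcement on short tours plus evaporation is what lets the colony converge toward low-cost trajectories without enumerating all paths.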

  12. TrajAnalytics: A Web-Based Visual Analytics Software of Urban Trajectory

    OpenAIRE

    Zhao, Ye; AL-Dohuki, Shamal; Eynon, Thomas; Kamw, Farah; Sheets, David; Ma, Chao; Ye, Xinyue; Hu, Yueqi; Feng, Tinghao; Yang, Jing

    2017-01-01

    Advanced technologies in sensing and computing have created urban trajectory datasets of humans and vehicles travelling over urban road networks. Understanding and analyzing the large-scale, complex data reflecting city dynamics is of great importance to enhance both human lives and urban environments. Domain practitioners, researchers, and decision-makers need to store, manage, query and visualize such big datasets. We develop a software system named TrajAnalytics, which explicitly supports ...

  13. Commercial Aircraft Trajectory Planning based on Multiphase Mixed-Integer Optimal Control

    OpenAIRE

    Soler Arnedo, Manuel Fernando

    2017-01-01

    The main goal of this dissertation is to develop optimal control techniques for aircraft trajectory planning aimed at reducing fuel consumption, emissions, and overflight charges in flight plans. The calculation of a flight plan involves the consideration of multiple factors. These can be classified as either continuous or discrete, and include nonlinear aircraft performance, atmospheric conditions, wind conditions, airspace structure, amount of departure fuel, and operational...

  14. Robust Trajectory Option Set planning in CTOP based on Bayesian game model

    KAUST Repository

    Li, Lichun; Clarke, John-Paul; Feron, Eric; Shamma, Jeff S.

    2017-01-01

    The Federal Aviation Administration (FAA) rations capacity to reduce en route delay, especially delay caused by bad weather. This is accomplished via the Collaborative Trajectory Options Program (CTOP), which was recently developed to provide a mechanism for flight operators to communicate their route preferences for each flight via a Trajectory Option Set (TOS), as well as a mechanism for the FAA to assign the best possible route within the set of trajectories in the TOS for a given flight, i.e. the route with the lowest adjusted cost after consideration of system constraints and the requirements of all flights. The routes assigned to an airline depend not only on the TOS's for its own flights but also on the TOS's of all other flights in the CTOP, which are unknown. This paper aims to provide a detailed algorithm for the airline to design a TOS plan that is robust to the uncertainties of its competitors' TOS's. To this end, we model the CTOP problem as a Bayesian game and use a linear program (LP) to compute the security strategy of the Bayesian game model. This security strategy guarantees the airline an upper bound on the sum of the assigned times. The numerical results demonstrate the robustness of the strategy, which is not achieved by any other tested strategy.
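
    At its core, the LP-based security strategy is a minimax computation over a cost matrix: choose a mixed strategy that minimizes the worst-case assigned cost over the competitors' unknown choices. A sketch with a hypothetical 2x2 cost matrix, using scipy's linprog (not the paper's full Bayesian-game formulation, which also conditions on competitor types):

```python
import numpy as np
from scipy.optimize import linprog

def security_strategy(C):
    """Minimax mixed strategy for the row player of cost matrix C:
    minimize v subject to C^T x <= v * 1, sum(x) = 1, x >= 0."""
    m, n = C.shape
    c = np.r_[np.zeros(m), 1.0]                 # decision vector [x_1..x_m, v]; minimize v
    A_ub = np.c_[C.T, -np.ones(n)]              # one worst-case constraint per column
    b_ub = np.zeros(n)
    A_eq = np.r_[np.ones(m), 0.0].reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * m + [(None, None)])
    return res.x[:m], res.x[m]

# Hypothetical costs: rows = our candidate TOS plans, columns = competitor scenarios
C = np.array([[3.0, 0.0],
              [0.0, 3.0]])
x, v = security_strategy(C)
```

    The returned `v` is the guaranteed upper bound on cost regardless of what the competitors submit, which mirrors the guarantee on the sum of assigned times described in the abstract.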

  15. Unconventional Constraints on Nitrogen Chemistry using DC3 Observations and Trajectory-based Chemical Modeling

    Science.gov (United States)

    Shu, Q.; Henderson, B. H.

    2017-12-01

    Chemical transport models underestimate nitrogen dioxide observations in the upper troposphere (UT). Previous research in the UT succeeded in combining model predictions with field campaign measurements to demonstrate that the nitric acid formation rate (HO + NO2 → HNO3 (R1)) is overestimated by 22% (Henderson et al., 2012). A subsequent publication (Seltzer et al., 2015) demonstrated that this single chemical constraint alters ozone and aerosol formation/composition. This work attempts to replicate the previous chemical constraints with newer observations and a different modeling framework. We apply the previously successful constraint framework to Deep Convective Clouds and Chemistry (DC3), a more recent field campaign in which simulated nitrogen imbalances still exist. Freshly convected air parcels identified in the DC3 dataset serve as initial coordinates for Lagrangian trajectories. Along each trajectory, we simulate the air parcel's chemical state. Samples along the trajectories form ensembles that represent possible realizations of UT air parcels. We then apply Bayesian inference to constrain nitrogen chemistry and compare the results to the existing literature. We anticipate that the results will confirm the overestimation of the HNO3 formation rate found in previous work and provide further constraints on other nitrogen reaction rate coefficients that affect the terminal products of NOx. We will particularly focus on organic nitrate chemistry, which the laboratory literature has yet to fully address. The results will provide useful insights into nitrogen chemistry that affects climate and human health.

  16. Robust Trajectory Option Set planning in CTOP based on Bayesian game model

    KAUST Repository

    Li, Lichun

    2017-07-10

    The Federal Aviation Administration (FAA) rations capacity to reduce en route delay, especially delay caused by bad weather. This is accomplished via the Collaborative Trajectory Options Program (CTOP), which was recently developed to provide a mechanism for flight operators to communicate their route preferences for each flight via a Trajectory Option Set (TOS), as well as a mechanism for the FAA to assign the best possible route within the set of trajectories in the TOS for a given flight, i.e. the route with the lowest adjusted cost after consideration of system constraints and the requirements of all flights. The routes assigned to an airline depend not only on the TOS's for its own flights but also on the TOS's of all other flights in the CTOP, which are unknown. This paper aims to provide a detailed algorithm for the airline to design a TOS plan that is robust to the uncertainties of its competitors' TOS's. To this end, we model the CTOP problem as a Bayesian game and use a linear program (LP) to compute the security strategy of the Bayesian game model. This security strategy guarantees the airline an upper bound on the sum of the assigned times. The numerical results demonstrate the robustness of the strategy, which is not achieved by any other tested strategy.

  17. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H.; Broersen, Alexander

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Methods...

  18. Lane Detection in Video-Based Intelligent Transportation Monitoring via Fast Extracting and Clustering of Vehicle Motion Trajectories

    Directory of Open Access Journals (Sweden)

    Jianqiang Ren

    2014-01-01

    Full Text Available Lane detection is a crucial process in video-based transportation monitoring systems. This paper proposes a novel method to detect the lane center via rapid extraction and high-accuracy clustering of vehicle motion trajectories. First, we use an activity map to automatically extract the road region, calibrate the dynamic camera, and set three virtual detection lines. Second, the three virtual detection lines and a local background model with traffic flow feedback are used to extract and group vehicle feature points per vehicle. The feature point groups are then described accurately by an edge-weighted dynamic graph and corrected by a motion-similarity Kalman filter during sparse feature point tracking. After the vehicle trajectories are obtained, a rough k-means incremental clustering with Hausdorff distance is designed for rapid online extraction of the lane center with high accuracy. The use of rough sets effectively limits the loss of accuracy caused by irregularly running trajectories. Experimental results prove that the proposed method can detect the lane center position efficiently, that the time required by subsequent tasks can be reduced considerably, and that the safety of traffic surveillance systems can be enhanced significantly.
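
    Clustering trajectories as above requires a distance between polylines; the Hausdorff distance used in the paper can be computed directly from pairwise point distances. A small sketch on synthetic tracks (the lane geometries are illustrative):

```python
import numpy as np

def hausdorff(P, Q):
    """Symmetric Hausdorff distance between two polylines given as (n, 2) point arrays."""
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)   # pairwise distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())         # worst nearest-point gap

# Two nearly parallel vehicle tracks in one lane, and one track from another lane
t = np.linspace(0.0, 1.0, 50)
lane1_a = np.c_[t, 0.0 * t]
lane1_b = np.c_[t, 0.0 * t + 0.1]
lane2 = np.c_[t, 0.0 * t + 3.5]
```

    Tracks from the same lane sit within a small Hausdorff distance of each other, while tracks from different lanes are separated by roughly the lane spacing, which is what makes the incremental clustering of lane centers work.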

  19. l1- and l2-Norm Joint Regularization Based Sparse Signal Reconstruction Scheme

    Directory of Open Access Journals (Sweden)

    Chanzi Liu

    2016-01-01

    Full Text Available Many problems in signal processing and statistical inference involve finding a sparse solution to an underdetermined linear system of equations. This is also the setting of compressive sensing (CS), which recovers the sparse solution from far fewer measurements than the length of the original signal. In this paper, we propose an l1- and l2-norm joint regularization based reconstruction framework to approach the original l0-norm based sparseness-inducing constrained sparse signal reconstruction problem. First, it is shown that, by employing the simple conjugate gradient algorithm, the new formulation provides an effective framework for deducing the solution of the original sparse signal reconstruction problem with the l0-norm regularization term. Second, an upper bound on the reconstruction error is presented for the proposed sparse signal reconstruction framework, and it is shown that a smaller reconstruction error than that of l1-norm relaxation approaches can be achieved with the proposed scheme in most cases. Finally, simulation results are presented to validate the proposed sparse signal reconstruction approach.

  20. A shape-based quality evaluation and reconstruction method for electrical impedance tomography.

    Science.gov (United States)

    Antink, Christoph Hoog; Pikkemaat, Robert; Malmivuo, Jaakko; Leonhardt, Steffen

    2015-06-01

    Linear methods of reconstruction play an important role in medical electrical impedance tomography (EIT) and there is a wide variety of algorithms based on several assumptions. With the Graz consensus reconstruction algorithm for EIT (GREIT), a novel linear reconstruction algorithm as well as a standardized framework for evaluating and comparing methods of reconstruction were introduced that found widespread acceptance in the community. In this paper, we propose a two-sided extension of this concept by first introducing a novel method of evaluation. Instead of being based on point-shaped resistivity distributions, we use 2759 pairs of real lung shapes for evaluation that were automatically segmented from human CT data. Necessarily, the figures of merit defined in GREIT were adjusted. Second, a linear method of reconstruction that uses orthonormal eigenimages as training data and a tunable desired point spread function are proposed. Using our novel method of evaluation, this approach is compared to the classical point-shaped approach. Results show that most figures of merit improve with the use of eigenimages as training data. Moreover, the possibility of tuning the reconstruction by modifying the desired point spread function is shown. Finally, the reconstruction of real EIT data shows that higher contrasts and fewer artifacts can be achieved in ventilation- and perfusion-related images.

  1. A shape-based quality evaluation and reconstruction method for electrical impedance tomography

    International Nuclear Information System (INIS)

    Antink, Christoph Hoog; Pikkemaat, Robert; Leonhardt, Steffen; Malmivuo, Jaakko

    2015-01-01

    Linear methods of reconstruction play an important role in medical electrical impedance tomography (EIT) and there is a wide variety of algorithms based on several assumptions. With the Graz consensus reconstruction algorithm for EIT (GREIT), a novel linear reconstruction algorithm as well as a standardized framework for evaluating and comparing methods of reconstruction were introduced that found widespread acceptance in the community. In this paper, we propose a two-sided extension of this concept by first introducing a novel method of evaluation. Instead of being based on point-shaped resistivity distributions, we use 2759 pairs of real lung shapes for evaluation that were automatically segmented from human CT data. Necessarily, the figures of merit defined in GREIT were adjusted. Second, a linear method of reconstruction that uses orthonormal eigenimages as training data and a tunable desired point spread function are proposed. Using our novel method of evaluation, this approach is compared to the classical point-shaped approach. Results show that most figures of merit improve with the use of eigenimages as training data. Moreover, the possibility of tuning the reconstruction by modifying the desired point spread function is shown. Finally, the reconstruction of real EIT data shows that higher contrasts and fewer artifacts can be achieved in ventilation- and perfusion-related images. (paper)

  2. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    International Nuclear Information System (INIS)

    Choo, Ji Yung; Goo, Jin Mo; Park, Chang Min; Park, Sang Joon; Lee, Chang Hyun; Shim, Mi-Suk

    2014-01-01

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), and airway measurements of the lumen and wall area as well as average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU was significantly different among the three algorithms in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR showed the most accurate values for the airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
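
    The emphysema index used above is simply the percentage of lung voxels below a -950 HU threshold. A sketch on synthetic voxel values (the Gaussian HU distribution is an illustrative assumption, not patient data):

```python
import numpy as np

def emphysema_index(hu, threshold=-950.0):
    """Percentage of lung voxels below the HU threshold (relative low-attenuation volume)."""
    hu = np.asarray(hu, dtype=float)
    return 100.0 * np.mean(hu < threshold)

rng = np.random.default_rng(7)
lung = rng.normal(-860.0, 40.0, size=100_000)   # synthetic lung-voxel HU values
ei = emphysema_index(lung)
```

    Because the EI is a simple threshold count, it inherits any systematic shift in reconstructed HU values, which is exactly why the abstract finds different EIs for FBP, ASIR and MBIR on the same raw data.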

  3. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area, as well as average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR gave the most accurate airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
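
    The emphysema index used above is simply the percentage of lung voxels whose attenuation falls below a fixed threshold (-950 HU here). A minimal sketch of that computation, assuming an already-segmented lung volume in Hounsfield units (the volume below is synthetic, not patient data):

```python
import numpy as np

def emphysema_index(lung_hu, threshold=-950):
    """Percentage of lung voxels with attenuation below the threshold."""
    return 100.0 * np.mean(lung_hu < threshold)

# Synthetic stand-in for a segmented lung volume in Hounsfield units.
rng = np.random.default_rng(0)
lung_hu = rng.normal(-850.0, 60.0, size=(64, 64, 64))

ei = emphysema_index(lung_hu)   # EI at the -950 HU threshold, in percent
```

    Because FBP, ASIR and MBIR change the noise texture of the same scan, the voxel histogram near the threshold shifts, which is why the reported EI differs between algorithms.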

  4. Fast neural-net based fake track rejection in the LHCb reconstruction

    CERN Document Server

    De Cian, Michel; Seyfert, Paul; Stahl, Sascha

    2017-01-01

    A neural-network based algorithm to identify fake tracks in the LHCb pattern recognition is presented. This algorithm, called ghost probability, retains more than 99 % of well reconstructed tracks while reducing the number of fake tracks by 60 %. It is fast enough to fit into the CPU time budget of the software trigger farm and thus reduces the combinatorics of the decay reconstructions, as well as the number of tracks that need to be processed by the particle identification algorithms. As a result, it strongly contributes to achieving the same reconstruction online and offline in the LHCb experiment in Run II of the LHC.

  5. Reconstruction algorithm in compressed sensing based on maximum a posteriori estimation

    International Nuclear Information System (INIS)

    Takeda, Koujin; Kabashima, Yoshiyuki

    2013-01-01

    We propose a systematic method for constructing a sparse data reconstruction algorithm in compressed sensing at a relatively low computational cost for a general observation matrix. It is known that the cost of ℓ1-norm minimization using a standard linear programming algorithm is O(N³). We show that this cost can be reduced to O(N²) by applying the approach of posterior maximization. Furthermore, in principle, the algorithm from our approach is expected to achieve the widest successful reconstruction region, as evaluated by theoretical arguments. We also discuss the relation between the belief-propagation-based reconstruction algorithm introduced in preceding works and our approach.
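
    For comparison with the O(N³) linear-programming baseline mentioned above, iterative soft thresholding (ISTA) is a generic ℓ1 solver (not the authors' MAP algorithm) whose per-iteration cost is dominated by two matrix-vector products, i.e. O(MN). A sketch on a synthetic sparse-recovery instance:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 100, 60, 5                      # signal length, measurements, sparsity
A = rng.normal(size=(M, N)) / np.sqrt(M)  # random observation matrix
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.normal(size=K)
y = A @ x_true

def ista(A, y, lam=1e-3, n_iter=3000):
    """Iterative soft thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L     # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(A, y)
```

    With a small penalty `lam`, the iterate approaches a sparse solution consistent with the measurements; the message-passing approach of the abstract aims at the same fixed point with better theoretical recovery thresholds.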

  6. Reconstruction of chaotic signals with applications to chaos-based communications

    CERN Document Server

    Feng, Jiu Chao

    2008-01-01

    This book provides a systematic review of the fundamental theory of signal reconstruction and the practical techniques used in reconstructing chaotic signals. Specific applications of signal reconstruction methods in chaos-based communications are expounded in full detail, along with examples illustrating the various problems associated with such applications. The book serves as an advanced textbook for undergraduate and graduate courses in electronic and information engineering, automatic control, physics and applied mathematics. It is also highly suited for general nonlinear scientists who wi

  7. Reconstruction for limited-projection fluorescence molecular tomography based on projected restarted conjugate gradient normal residual.

    Science.gov (United States)

    Cao, Xu; Zhang, Bin; Liu, Fei; Wang, Xin; Bai, Jing

    2011-12-01

    Limited-projection fluorescence molecular tomography (FMT) can greatly reduce the acquisition time, which is suitable for resolving fast biological processes in vivo, but it suffers from severe ill-posedness because the reconstruction uses only limited projections. To overcome the severe ill-posedness, we report a reconstruction method based on the projected restarted conjugate gradient normal residual. The reconstruction results of two phantom experiments demonstrate that the proposed method is feasible for limited-projection FMT. © 2011 Optical Society of America
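
    CGNR applies conjugate gradients to the normal equations AᵀAx = Aᵀb without ever forming AᵀA explicitly. A bare-bones sketch, omitting the projection and restart steps of the paper's variant, on a random stand-in for the FMT weight matrix:

```python
import numpy as np

def cgnr(A, b, n_iter=50, tol=1e-12):
    """Conjugate gradient on the normal equations A^T A x = A^T b (CGNR)."""
    x = np.zeros(A.shape[1])
    r = b - A @ x                  # data-space residual
    z = A.T @ r                    # normal-equation residual
    p = z.copy()
    for _ in range(n_iter):
        zz = z @ z
        if zz < tol:               # converged: avoid a 0/0 step
            break
        w = A @ p
        alpha = zz / (w @ w)
        x += alpha * p
        r -= alpha * w
        z = A.T @ r
        p = z + (z @ z / zz) * p
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(30, 20))      # stand-in for the FMT system matrix
x_true = rng.normal(size=20)
b = A @ x_true                     # consistent, noise-free measurements
x_hat = cgnr(A, b)
```

    The projected restarted variant would additionally clamp nonphysical (negative) fluorophore concentrations to zero and restart the iteration, which is what stabilizes the severely ill-posed limited-projection case.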

  8. An Optimized Method for Terrain Reconstruction Based on Descent Images

    Directory of Open Access Journals (Sweden)

    Xu Xinchao

    2016-02-01

    Full Text Available An optimization method is proposed to perform high-accuracy terrain reconstruction of the landing area of Chang’e III. First, feature matching is conducted using geometric model constraints. Then, the initial terrain is obtained and the initial normal vector of each point is solved on the basis of the initial terrain. By changing the vector around the initial normal vector in small steps, a set of new vectors is obtained. By combining these vectors with the directions of the light and camera, functions are set up on the basis of a surface reflection model. Then, a series of gray values is derived by solving the equations. The new optimized vector is recorded when the obtained gray value is closest to that of the corresponding pixel. Finally, the optimized terrain is obtained after iteration over the vector field. Experiments were conducted using laboratory images and descent images of Chang’e III. The results showed that the performance of the proposed method was better than that of the classical feature matching method. It can provide a reference for terrain reconstruction of the landing area in subsequent moon exploration missions.

  9. Reconstruction of Banknote Fragments Based on Keypoint Matching Method.

    Science.gov (United States)

    Gwo, Chih-Ying; Wei, Chia-Hung; Li, Yue; Chiu, Nan-Hsing

    2015-07-01

    Banknotes may be shredded by a scrap machine, ripped up by hand, or damaged in accidents. This study proposes an image registration method for reconstruction of multiple sheets of banknotes. The proposed method first constructs different scale spaces to identify keypoints in the underlying banknote fragments. Next, the features of those keypoints are extracted to represent their local patterns around keypoints. Then, similarity is computed to find the keypoint pairs between the fragment and the reference banknote. The banknote fragments can determine the coordinate and amend the orientation. Finally, an assembly strategy is proposed to piece multiple sheets of banknote fragments together. Experimental results show that the proposed method causes, on average, a deviation of 0.12457 ± 0.12810° for each fragment while the SIFT method deviates 1.16893 ± 2.35254° on average. The proposed method not only reconstructs the banknotes but also decreases the computing cost. Furthermore, the proposed method can estimate relatively precisely the orientation of the banknote fragments to assemble. © 2015 American Academy of Forensic Sciences.

  10. History matters: childhood weight trajectories as a basis for planning community-based obesity prevention to adolescents.

    Science.gov (United States)

    Ekberg, J; Angbratt, M; Valter, L; Nordvall, M; Timpka, T

    2012-04-01

    To use epidemiological data and a standardized economic model to compare projected costs for obesity prevention in late adolescence accrued using a cross-sectional weight classification for selecting adolescents at age 15 years compared with a longitudinal classification. All children born in a Swedish county (population 440 000) in 1991 who participated in all regular measurements of height and weight at ages 5, 10 and 15 years (n=4312) were included in the study. The selection strategies were compared by calculating the projected financial load resulting from the supply of obesity prevention services by providers at all levels in the health care system. The difference in marginal cost per 1000 children was used as the primary end point for the analyses. Using the cross-sectional selection strategy, 3.8% of adolescents at age 15 years were selected for evaluation by a pediatric specialist, and 96.2% were chosen for population-based interventions. In the trajectory-based strategy, 2.4% of the adolescents were selected for intensive pediatric care, 1.4% for individual clinical interventions in primary health care, 14.0% for individual primary obesity prevention using the Internet and 82.1% for population-based interventions. Costs for the cross-sectional selection strategy were projected at USD 463,581 per 1000 adolescents and for the trajectory-based strategy at USD 302,016 per 1000 adolescents. Using projections from epidemiological data, we found that by basing the selection of adolescents for obesity prevention on weight trajectories, the load on highly specialized pediatric care can be reduced by one-third and total health service costs for obesity management among adolescents reduced by one-third. Before use in policies and prevention program planning, our findings warrant confirmation in prospective cost-benefit studies.

  11. PET image reconstruction with rotationally symmetric polygonal pixel grid based highly compressible system matrix

    International Nuclear Information System (INIS)

    Yu Yunhan; Xia Yan; Liu Yaqiang; Wang Shi; Ma Tianyu; Chen Jing; Hong Baoyu

    2013-01-01

    To achieve maximum compression of the system matrix in positron emission tomography (PET) image reconstruction, we proposed a polygonal image pixel division strategy in accordance with the rotationally symmetric PET geometry. A geometrical definition and indexing rule for polygonal pixels were established. Image conversion from the polygonal pixel structure to the conventional rectangular pixel structure was implemented using a conversion matrix. A set of test images were analytically defined in the polygonal pixel structure, converted to conventional rectangular pixel based images, and correctly displayed, which verified the correctness of the image definition and conversion for the polygonal pixel structure. A compressed system matrix for PET image reconstruction was generated by a tap model and tested by forward-projecting three different distributions of radioactive sources to the sinogram domain and comparing them with theoretical predictions. On a practical small animal PET scanner, a compression ratio of 12.6:1 of the system matrix size was achieved with the polygonal pixel structure, compared with the conventional rectangular pixel based tap-mode one. OS-EM iterative image reconstruction algorithms with the polygonal and conventional Cartesian pixel grids were developed. A hot rod phantom was detected and reconstructed based on these two grids with reasonable time cost. The resolution of the reconstructed images was 1.35 mm for both grids. We conclude that it is feasible to reconstruct and display images in a polygonal image pixel structure based on a compressed system matrix in PET image reconstruction. (authors)

  12. Group-based developmental BMI trajectories, polycystic ovary syndrome, and gestational diabetes: a community-based longitudinal study.

    Science.gov (United States)

    Kakoly, Nadira Sultana; Earnest, Arul; Moran, Lisa J; Teede, Helena J; Joham, Anju E

    2017-11-06

    Obesity is common in young women, increasing insulin resistance (IR) and worsening pregnancy complications, including gestational diabetes (GDM). Women with polycystic ovary syndrome (PCOS) are commonly obese, which aggravates the severity of PCOS clinical expression. Relationships between these common insulin-resistant conditions, however, remain unclear. We conducted a secondary analysis of the Australian Longitudinal Study on Women's Health (ALSWH) database, including data from 8009 women aged 18-36 years across six surveys. We used latent-curve growth modelling to identify distinct body mass index (BMI) trajectories and multinomial logistic regression to explore sociodemographic and health variables characterizing BMI group membership. Logistic regression was used to assess independent risk of GDM. A total of 662 women (8.29%, 95% CI 7.68-8.89) reported PCOS. Three distinct BMI trajectories emerged, namely low stable (LSG) (63.8%), defined as an average trajectory remaining at ~25 kg/m²; moderately rising (MRG) (28.8%), a curvilinear trajectory commencing in a healthy BMI and terminating in the overweight range; and high-rising (HRG) (7.4%), a curvilinear trajectory starting and terminating in the obese range. A high BMI in early reproductive life predicted membership in higher trajectories. The HRG BMI trajectory was independently associated with GDM (OR 2.50, 95% CI 1.80-3.48) and was a stronger correlate than PCOS (OR 1.89, 95% CI 1.41-2.54), maternal age, socioeconomic status, or parity. Our results suggest heterogeneity in BMI change among Australian women of reproductive age, with and without PCOS. Reducing early adult life weight represents an ideal opportunity to intervene at an early stage of reproductive life and to decrease the risk of long-term metabolic complications such as GDM.

  13. A Superresolution Image Reconstruction Algorithm Based on Landweber in Electrical Capacitance Tomography

    Directory of Open Access Journals (Sweden)

    Chen Deyun

    2013-01-01

    Full Text Available Because image reconstruction accuracy in electrical capacitance tomography is affected by the "soft field" nature of the sensing field and the ill-conditioning of the inverse problem, a superresolution image reconstruction algorithm based on Landweber is proposed in this paper, built on the working principle of the electrical capacitance tomography system. The method regularizes the solution and derives a closed-form solution by fast Fourier transform of the convolution kernel. This ensures the uniqueness of the solution and improves the stability and quality of the image reconstruction results. Simulation results show that the imaging precision and real-time performance of the algorithm are better than those of the Landweber algorithm, and this algorithm provides a new approach to electrical capacitance tomography image reconstruction.
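
    The Landweber baseline the paper builds on is a relaxed gradient descent on the least-squares data term: for a sensitivity matrix S and capacitance vector c it iterates g ← g + ω·Sᵀ(c − S·g). A sketch with a random stand-in matrix (a real ECT sensitivity matrix is far more ill-conditioned):

```python
import numpy as np

def landweber(S, c, n_iter=5000, relax=None):
    """Landweber iteration: relaxed gradient descent on 0.5*||S g - c||^2."""
    if relax is None:
        relax = 1.0 / np.linalg.norm(S, 2) ** 2   # step size guaranteeing convergence
    g = np.zeros(S.shape[1])
    for _ in range(n_iter):
        g += relax * (S.T @ (c - S @ g))
    return g

# Random stand-ins for the ECT sensitivity matrix and permittivity image.
rng = np.random.default_rng(3)
S = rng.normal(size=(40, 25))
g_true = rng.random(25)
c = S @ g_true                  # simulated noise-free capacitance data
g_hat = landweber(S, c)
```

    On noise-free, well-conditioned data the iteration converges to the least-squares solution; the paper's contribution is a closed-form, FFT-accelerated regularized variant of this scheme.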

  14. Silhouette-based approach of 3D image reconstruction for automated image acquisition using robotic arm

    Science.gov (United States)

    Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.

    2017-06-01

    This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. The silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, an analysis of the effect of different numbers of sequential images on the accuracy of the 3D model reconstruction was carried out with a fixed projection angle of the camera. The elements affecting the 3D reconstruction are discussed and the overall result of the analysis is concluded according to the prototype of the imaging platform.
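
    The silhouette (visual hull) idea can be sketched in a few lines: a voxel survives only if its projection falls inside the object's silhouette in every view. A toy version with two orthogonal views and idealized parallel projection (a real rig needs the calibrated camera matrices):

```python
import numpy as np

n = 32
grid = np.ones((n, n, n), dtype=bool)      # voxel grid, axes ordered (y, x, z)

# The same circular silhouette as seen along the z axis and along the x axis.
yy, xx = np.mgrid[0:n, 0:n]
circle = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 <= (n / 4) ** 2

grid &= circle[:, :, None]   # view along z constrains (y, x)
grid &= circle[:, None, :]   # view along x constrains (y, z)
carved = int(grid.sum())     # voxels surviving both silhouettes
```

    Each additional view removes more voxels, which is why the paper's accuracy analysis varies the number of sequential images: the hull converges toward the true shape as views are added.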

  15. Signal reconstruction in wireless sensor networks based on a cubature Kalman particle filter

    International Nuclear Information System (INIS)

    Huang Jin-Wang; Feng Jiu-Chao

    2014-01-01

    For solving the issues of signal reconstruction of nonlinear non-Gaussian signals in wireless sensor networks (WSNs), a new signal reconstruction algorithm based on a cubature Kalman particle filter (CKPF) is proposed in this paper. We model the reconstruction signal first and then use the CKPF to estimate the signal. The CKPF uses a cubature Kalman filter (CKF) to generate the importance proposal distribution of the particle filter and integrates the latest observation, which can better approximate the true posterior distribution and improve the estimation accuracy. The CKPF uses fewer cubature points than the unscented Kalman particle filter (UKPF) and has lower computational overhead. Meanwhile, the CKPF uses the square root of the error covariance for iterating and is more stable and accurate than its UKPF counterpart. Simulation results show that the algorithm can reconstruct the observed signals quickly and effectively, while consuming less computation time and achieving higher accuracy than the method based on the UKPF. (general)
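
    The third-degree spherical-radial cubature rule underlying the CKF propagates 2n equally weighted points x ± √n·S·eᵢ, where P = S·Sᵀ is the state covariance; this is the point set the abstract contrasts with the UKF's 2n+1 sigma points. A sketch of the point generation:

```python
import numpy as np

def cubature_points(x_mean, P):
    """2n equally weighted CKF cubature points: x +/- sqrt(n) * S e_i, P = S S^T."""
    n = len(x_mean)
    S = np.linalg.cholesky(P)                   # square root of the covariance
    offsets = np.sqrt(n) * np.hstack([S, -S])   # shape (n, 2n)
    return x_mean[:, None] + offsets            # each column is one point

x = np.array([1.0, 2.0])
P = np.array([[0.5, 0.1],
              [0.1, 0.3]])
pts = cubature_points(x, P)
```

    The point set exactly matches the first two moments of the state distribution, which is what makes it usable as a proposal-generating step inside the particle filter.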

  16. Coronary artery plaques: Cardiac CT with model-based and adaptive-statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Scheffel, Hans; Stolzmann, Paul; Schlett, Christopher L.; Engel, Leif-Christopher; Major, Gyöngi Petra; Károlyi, Mihály; Do, Synho; Maurovich-Horvat, Pál; Hoffmann, Udo

    2012-01-01

    Objectives: To compare image quality of coronary artery plaque visualization at CT angiography with images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. Methods: The coronary arteries of three ex vivo human hearts were imaged by CT and reconstructed with FBP, ASIR and MBIR. Coronary cross-sectional images were co-registered between the different reconstruction techniques and assessed for qualitative and quantitative image quality parameters. Readers were blinded to the reconstruction algorithm. Results: A total of 375 triplets of coronary cross-sectional images were co-registered. Using MBIR, 26% of the images were rated as having excellent overall image quality, which was significantly better as compared to ASIR and FBP (4% and 13%, respectively, all p < 0.001). Qualitative assessment of image noise demonstrated a noise reduction by using ASIR as compared to FBP (p < 0.01) and further noise reduction by using MBIR (p < 0.001). The contrast-to-noise ratio (CNR) using MBIR was better as compared to ASIR and FBP (44 ± 19, 29 ± 15, 26 ± 9, respectively; all p < 0.001). Conclusions: Using MBIR improved image quality, reduced image noise and increased CNR as compared to the other available reconstruction techniques. This may further improve the visualization of coronary artery plaque and allow radiation reduction.
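
    The contrast-to-noise ratio reported above is, in one common convention (definitions vary between papers), the absolute difference of the two ROI means divided by the noise standard deviation. A minimal sketch on hypothetical HU samples:

```python
import numpy as np

def cnr(roi_a, roi_b):
    """Contrast-to-noise ratio: absolute mean difference over background noise SD."""
    return abs(np.mean(roi_a) - np.mean(roi_b)) / np.std(roi_b)

# Hypothetical HU samples from a contrast-filled lumen ROI and a plaque ROI.
rng = np.random.default_rng(4)
lumen = rng.normal(400.0, 20.0, 500)
plaque = rng.normal(200.0, 20.0, 500)
value = cnr(lumen, plaque)      # roughly (400 - 200) / 20 = 10
```

    Because iterative reconstruction lowers the noise SD in the denominator while largely preserving the mean HU values, CNR rises from FBP to ASIR to MBIR, as in the results above.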

  17. Linearized image reconstruction method for ultrasound modulated electrical impedance tomography based on power density distribution

    International Nuclear Information System (INIS)

    Song, Xizi; Xu, Yanbin; Dong, Feng

    2017-01-01

    Electrical resistance tomography (ERT) is a promising measurement technique with important industrial and clinical applications. However, with limited effective measurements, it suffers from poor spatial resolution due to the ill-posedness of the inverse problem. Recently, there has been an increasing research interest in hybrid imaging techniques, utilizing couplings of physical modalities, because these techniques obtain much more effective measurement information and promise high resolution. Ultrasound modulated electrical impedance tomography (UMEIT) is one of the newly developed hybrid imaging techniques, which combines electric and acoustic modalities. A linearized image reconstruction method based on power density is proposed for UMEIT. The interior data, the power density distribution, is adopted to reconstruct the conductivity distribution with the proposed image reconstruction method. At the same time, relating the power density change to the change in conductivity, the Jacobian matrix is employed to convert the nonlinear problem into a linear one. The analytic formulation of this Jacobian matrix is derived and its effectiveness is also verified. In addition, different excitation patterns are tested and analyzed, and opposite excitation provides the best performance with the proposed method. Also, multiple power density distributions are combined to implement image reconstruction. Finally, image reconstruction is implemented with the linear back-projection (LBP) algorithm. Compared with ERT, with the proposed image reconstruction method, UMEIT can produce reconstructed images with higher quality and better quantitative evaluation results. (paper)

  18. A wavelet-based regularized reconstruction algorithm for SENSE parallel MRI with applications to neuroimaging

    International Nuclear Information System (INIS)

    Chaari, L.; Pesquet, J.Ch.; Chaari, L.; Ciuciu, Ph.; Benazza-Benyahia, A.

    2011-01-01

    To reduce scanning time and/or improve spatial/temporal resolution in some Magnetic Resonance Imaging (MRI) applications, parallel MRI acquisition techniques with multiple coils have emerged since the early 1990s as powerful imaging methods that allow a faster acquisition process. In these techniques, the full-FOV image has to be reconstructed from the resulting acquired undersampled k-space data. To this end, several reconstruction techniques have been proposed, such as the widely used Sensitivity Encoding (SENSE) method. However, the reconstructed image generally presents artifacts when perturbations occur in both the measured data and the estimated coil sensitivity profiles. In this paper, we aim at achieving accurate image reconstruction under degraded experimental conditions (low magnetic field and high reduction factor), in which neither the SENSE method nor Tikhonov regularization in the image domain gives convincing results. To this end, we present a novel method for SENSE-based reconstruction which proceeds with regularization in the complex wavelet domain by promoting sparsity. The proposed approach relies on a fast algorithm that enables the minimization of regularized non-differentiable criteria including more general penalties than a classical ℓ1 term. To further enhance the reconstructed image quality, local convex constraints are added to the regularization process. In vivo human brain experiments carried out on Gradient-Echo (GRE) anatomical and Echo Planar Imaging (EPI) functional MRI data at 1.5 T indicate that our algorithm provides reconstructed images with reduced artifacts for high reduction factors. (authors)
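
    Promoting sparsity in a wavelet domain amounts to soft-thresholding the wavelet coefficients inside the reconstruction iterations. A one-level orthonormal Haar version on a 1-D signal, as a stand-in for the complex wavelet frames and more general penalties used in the paper:

```python
import numpy as np

def haar_soft_threshold(signal, lam):
    """One-level orthonormal Haar analysis, soft-threshold the details, synthesize."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)     # approximation band
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)     # detail band
    d = np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)  # sparsity-promoting shrinkage
    out = np.empty_like(signal)
    out[0::2] = (a + d) / np.sqrt(2)                   # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(7)
clean = np.repeat(np.array([0.0, 1.0, 0.0, 2.0]), 16)  # piecewise-constant row
noisy = clean + 0.1 * rng.normal(size=64)
denoised = haar_soft_threshold(noisy, lam=0.2)
```

    With lam = 0 the transform round-trips exactly (it is orthonormal); with a positive threshold it suppresses the noise-dominated detail coefficients of a piecewise-smooth signal, which is the same sparsity prior the SENSE regularizer exploits.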

  19. Cone-beam local reconstruction based on a Radon inversion transformation

    International Nuclear Information System (INIS)

    Wang Xian-Chao; Yan Bin; Li Lei; Hu Guo-En

    2012-01-01

    The local reconstruction from truncated projection data is one area of interest in image reconstruction for computed tomography (CT), which creates the possibility for dose reduction. In this paper, a filtered-backprojection (FBP) algorithm based on the Radon inversion transform is presented to deal with three-dimensional (3D) local reconstruction in the circular geometry. The algorithm achieves the data filtering in two steps. The first step is the derivative of projections, which acts locally on the data and can thus be carried out accurately even in the presence of data truncation. The second step is the nonlocal Hilbert filtering. Numerical simulations and real data reconstructions have been conducted to validate the new reconstruction algorithm. Compared with the approximate truncation resistant algorithm for computed tomography (ATRACT), not only does it have a comparable ability to restrain truncation artifacts, but its reconstruction efficiency is also improved; it is about twice as fast as the ATRACT. Therefore, this work provides a simple and efficient approach for the approximate reconstruction from truncated projections in circular cone-beam CT
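
    The second, nonlocal filtering step is a Hilbert transform, which is conveniently applied in the Fourier domain, where its transfer function is −i·sign(f). A discrete 1-D sketch (the paper applies the filter to derivative projection data; here a cosine makes the result easy to check, since the Hilbert transform maps cos to sin):

```python
import numpy as np

def hilbert_filter(signal):
    """Discrete Hilbert transform via FFT: multiply the spectrum by -i*sign(f)."""
    n = len(signal)
    h = -1j * np.sign(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(signal) * h))

# Integer frequency bin, so the discrete result matches the continuous identity.
t = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
out = hilbert_filter(np.cos(4.0 * t))
```

    Because this filter is nonlocal (its kernel 1/(pi*t) has unbounded support), it is the step that limits exactness under truncation, whereas the preceding derivative step remains purely local.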

  20. Capturing the Diversity of Successful Aging: An Operational Definition Based on 16-Year Trajectories of Functioning.

    Science.gov (United States)

    Kok, Almar A L; Aartsen, Marja J; Deeg, Dorly J H; Huisman, Martijn

    2017-04-01

    To determine the prevalence and extent of successful aging (SA) when various suggestions proposed in the previous literature for improving models of SA are incorporated into one holistic operational definition. These suggestions include defining and measuring SA as a developmental process, including subjective indicators alongside more objective ones, and expressing SA on a continuum. Data were used from 2,241 respondents in the Longitudinal Aging Study Amsterdam, a multidisciplinary study in a nationally representative sample of older adults in the Netherlands. Latent class growth analysis was used to identify successful 16-year trajectories within nine indicators of physical, cognitive, emotional, and social functioning. SA was quantified as the number of indicators in which individual respondents showed successful trajectories (range 0-9). Successful trajectories were characterized by stability, limited decline, or even improvement of functioning over time. Of the respondents, 39.6% of men and 29.3% of women were successful in at least seven indicators; 7% of men and 11% of women were successful in less than three indicators. Proportions of successful respondents were largest in life satisfaction (>85%) and smallest in social activity (<25%). Correlations of success between separate indicators were low to moderate (range r = .02-.37). Many older adults age relatively successfully, but the character of successful functioning over time varies between indicators, and the combinations of successful indicators vary between individuals. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. What Is the End of Life Period? Trajectories and Characterization Based on Primary Caregiver Reports.

    Science.gov (United States)

    Cohen-Mansfield, Jiska; Cohen, Rinat; Skornick-Bouchbinder, Michal; Brill, Shai

    2018-04-17

    As the population lives longer, end of life (EOL) is emerging as a distinct life phase, about which there is still limited understanding. Characterizing this important period is vital for clarifying issues regarding trajectory and decline at EOL and for health service planning on an institutional, communal, and societal level. In this article, we aim to characterize the EOL period, examining the duration and number of EOL stages, as well as functional, attitudinal, and emotional trajectories. In this cross-sectional study, 70 primary caregivers of deceased persons were interviewed. Standardized rates of functional, attitudinal, and emotional change across the EOL period were calculated. Frequencies were compared using the McNemar statistical test. The EOL period was found to have a median length of 3.25 years and an average of approximately three progressive stages. The duration of EOL stages tended to decrease as death approached. Unexpected events (e.g., a new medical diagnosis or accident) served as the precipitating event for the EOL period for approximately half of the deceased persons, and changes in existing conditions (e.g., health status or cognitive state) were also reported to precipitate EOL for a similar proportion. Reports of functionality across stages found the steepest decline in the "physical" domain and the most moderate decline in the "social" domain. With each stage, positive indicators, such as "will to live," showed a progressive decline, whereas negative indicators, including "suffering" and "dependence level," progressively increased. Results help characterize EOL trajectories and should inform care planning and decision making at various levels. In addition, they suggest a methodology for better understanding EOL.

  2. Super-resolution reconstruction of 4D-CT lung data via patch-based low-rank matrix reconstruction

    Science.gov (United States)

    Fang, Shiting; Wang, Huafeng; Liu, Yueliang; Zhang, Minghui; Yang, Wei; Feng, Qianjin; Chen, Wufan; Zhang, Yu

    2017-10-01

    Lung 4D computed tomography (4D-CT), which is a time-resolved CT data acquisition, plays an important role in explicitly including respiratory motion in treatment planning and delivery. However, the radiation dose is usually reduced at the expense of inter-slice spatial resolution to minimize radiation-related health risk. Therefore, resolution enhancement along the superior-inferior direction is necessary. In this paper, a super-resolution (SR) reconstruction method based on patch low-rank matrix reconstruction is proposed to improve the resolution of lung 4D-CT images. Specifically, a low-rank matrix related to every patch is constructed by using a patch searching strategy. Thereafter, singular value shrinkage is employed to recover the high-resolution patch under the constraints of the image degradation model. The output high-resolution patches are finally assembled to output the entire image. This method is extensively evaluated using two public data sets. Quantitative analysis shows that the proposed algorithm decreases the root mean square error by 9.7%-33.4% and the edge width by 11.4%-24.3%, relative to linear interpolation, back projection (BP) and Zhang et al’s algorithm. A new algorithm has been developed to improve the resolution of 4D-CT. In all experiments, the proposed method outperforms various interpolation methods, as well as BP and Zhang et al’s method, thus indicating the effectiveness and competitiveness of the proposed algorithm.
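
    The core recovery step here is singular value shrinkage, the proximal operator of the nuclear norm: soft-threshold the singular values of each patch matrix so that noise-dominated trailing components vanish while the dominant low-rank structure survives. A minimal sketch on a synthetic rank-2 patch matrix:

```python
import numpy as np

def sv_shrink(M, tau):
    """Singular value soft thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# A rank-2 patch matrix plus noise: shrinkage zeroes the small noise-driven
# singular values while keeping the dominant structure.
rng = np.random.default_rng(5)
low_rank = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 20))
noisy = low_rank + 0.05 * rng.normal(size=(20, 20))
denoised = sv_shrink(noisy, tau=0.7)
```

    In the full method, the matrix is built from similar patches gathered by the search strategy, and the shrinkage is applied under the degradation-model constraint rather than in isolation.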

  3. Duality-Based Nonlinear Quadratic Control: Application to Mobile Robot Trajectory-Following

    Czech Academy of Sciences Publication Activity Database

    Arnesto, L.; Girbés, V.; Sala, A.; Zima, M.; Šmídl, Václav

    2015-01-01

    Roč. 23, č. 4 (2015), s. 1494-1504 ISSN 1063-6536 R&D Projects: GA ČR(CZ) GAP102/11/0437 Grant - others:GA MŠk(CZ) CZ.1.05/2.1.00/03.0094 Institutional support: RVO:67985556 Keywords : trajectory planning * duality of estimation and control Subject RIV: BC - Control Systems Theory Impact factor: 2.818, year: 2015 http://library.utia.cas.cz/separaty/2015/AS/smidl-0445192.pdf

  4. Dynamic Weather Routes: A Weather Avoidance Concept for Trajectory-Based Operations

    Science.gov (United States)

    McNally, B. David; Love, John

    2011-01-01

    The integration of convective weather modeling with trajectory automation for conflict detection, trial planning, direct routing, and auto resolution has uncovered a concept that could help controllers, dispatchers, and pilots identify improved weather routes that result in significant savings in flying time and fuel burn. Trajectory automation continuously and automatically monitors aircraft in flight to find those that could potentially benefit from improved weather reroutes. Controllers, dispatchers, and pilots then evaluate reroute options to assess their suitability given current weather and traffic. In today's operations aircraft fly convective weather avoidance routes that were implemented often hours before aircraft approach the weather, and automation does not exist to automatically monitor traffic to find improved weather routes that open up due to changing weather conditions. The automation concept runs in real time and employs two key steps. First, a direct routing algorithm automatically identifies flights with large dog legs in their routes and therefore potentially large savings in flying time. These are common, and usually necessary, during convective weather operations, and analysis of Fort Worth Center traffic shows many aircraft with short cuts that indicate savings on the order of 10 flying minutes. The second and most critical step is to apply trajectory automation with weather modeling to determine what savings could be achieved by modifying the direct route such that it avoids weather and traffic and is acceptable to controllers and flight crews. Initial analysis of Fort Worth Center traffic suggests a savings of roughly 50% of the direct route savings could be achievable. The core concept is to apply trajectory automation with convective weather modeling in real time to identify a reroute that is free of weather and traffic conflicts and indicates enough time and fuel savings to be considered. The concept is interoperable with today

  5. Magneto-acousto-electrical Measurement Based Electrical Conductivity Reconstruction for Tissues.

    Science.gov (United States)

    Zhou, Yan; Ma, Qingyu; Guo, Gepu; Tu, Juan; Zhang, Dong

    2018-05-01

    Based on the interaction of ultrasonic excitation and magnetoelectrical induction, magneto-acousto-electrical (MAE) technology was demonstrated to have the capability of differentiating conductivity variations along the acoustic transmission. By applying the characteristics of the MAE voltage, a simplified algorithm of MAE measurement based conductivity reconstruction was developed. With the analyses of acoustic vibration, ultrasound propagation, Hall effect, and magnetoelectrical induction, theoretical and experimental studies of MAE measurement and conductivity reconstruction were performed. The formula of MAE voltage was derived and simplified for the transducer with strong directivity. MAE voltage was simulated for a three-layer gel phantom and the conductivity distribution was reconstructed using the modified Wiener inverse filter and Hilbert transform, which was also verified by experimental measurements. The experimental results are basically consistent with the simulations, and demonstrate that the wave packets of MAE voltage are generated at tissue interfaces with the amplitudes and vibration polarities representing the values and directions of conductivity variations. With the proposed algorithm, the amplitude and polarity of conductivity gradient can be restored and the conductivity distribution can also be reconstructed accurately. The favorable results demonstrate the feasibility of accurate conductivity reconstruction with improved spatial resolution using MAE measurement for tissues with conductivity variations, especially suitable for nondispersive tissues with abrupt conductivity changes. This study demonstrates that the MAE measurement based conductivity reconstruction algorithm can be applied as a new strategy for nondestructive real-time monitoring of conductivity variations in biomedical engineering.
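    The reconstruction step can be illustrated in one dimension: a Wiener inverse filter recovers conductivity-gradient impulses from a simulated MAE voltage trace. The pulse shape, SNR parameter, and interface positions below are illustrative assumptions, not the authors' actual excitation or phantom:

```python
import numpy as np


def wiener_deconvolve(y, h, snr=1e4):
    """Wiener inverse filter: X = Y * conj(H) / (|H|^2 + 1/snr)."""
    n = len(y)
    H = np.fft.fft(h, n)
    Y = np.fft.fft(y, n)
    return np.real(np.fft.ifft(Y * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)))


# Hypothetical acoustic pulse: a Gaussian-windowed tone burst.
t = np.arange(64)
pulse = np.exp(-((t - 32) / 8.0) ** 2) * np.sin(2 * np.pi * t / 16)

# Conductivity gradient: +1 at one tissue interface (step up),
# -1 at the next (step down), as in a layered gel phantom.
grad = np.zeros(300)
grad[100], grad[200] = 1.0, -1.0

voltage = np.convolve(grad, pulse)[:300]   # simulated MAE voltage trace
recovered = wiener_deconvolve(voltage, pulse)
```

    The sign of each recovered peak encodes the direction of the conductivity change at that interface; the paper additionally applies a Hilbert-transform envelope to localize the wave packets.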

  6. CT angiography after carotid artery stenting: assessment of the utility of adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Kuya, Keita; Shinohara, Yuki; Fujii, Shinya; Ogawa, Toshihide [Tottori University, Division of Radiology, Department of Pathophysiological Therapeutic Science, Faculty of Medicine, Yonago (Japan); Sakamoto, Makoto; Watanabe, Takashi [Tottori University, Division of Neurosurgery, Department of Brain and Neurosciences, Faculty of Medicine, Yonago (Japan); Iwata, Naoki; Kishimoto, Junichi [Tottori University, Division of Clinical Radiology Faculty of Medicine, Yonago (Japan); Kaminou, Toshio [Osaka Minami Medical Center, Department of Radiology, Osaka (Japan)

    2014-11-15

    Follow-up CT angiography (CTA) is routinely performed for post-procedure management after carotid artery stenting (CAS). However, the stent lumen tends to be underestimated because of stent artifacts on CTA reconstructed with the filtered back projection (FBP) technique. We assessed the utility of new iterative reconstruction techniques, such as adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR), for CTA after CAS in comparison with FBP. In a phantom study, we evaluated the differences among the three reconstruction techniques with regard to the relationship between the stent luminal diameter and the degree of underestimation of stent luminal diameter. In a clinical study, 34 patients who underwent follow-up CTA after CAS were included. We compared the stent luminal diameters among FBP, ASIR, and MBIR, and performed visual assessment of low attenuation area (LAA) in the stent lumen using a three-point scale. In the phantom study, stent luminal diameter was increasingly underestimated as luminal diameter became smaller in all CTA images. Stent luminal diameter was larger with MBIR than with the other reconstruction techniques. Similarly, in the clinical study, stent luminal diameter was larger with MBIR than with the other reconstruction techniques. LAA detectability scores of MBIR were greater than or equal to those of FBP and ASIR in all cases. MBIR improved the accuracy of assessment of stent luminal diameter and LAA detectability in the stent lumen when compared with FBP and ASIR. We conclude that MBIR is a useful reconstruction technique for CTA after CAS. (orig.)

  7. CT angiography after carotid artery stenting: assessment of the utility of adaptive statistical iterative reconstruction and model-based iterative reconstruction

    International Nuclear Information System (INIS)

    Kuya, Keita; Shinohara, Yuki; Fujii, Shinya; Ogawa, Toshihide; Sakamoto, Makoto; Watanabe, Takashi; Iwata, Naoki; Kishimoto, Junichi; Kaminou, Toshio

    2014-01-01

    Follow-up CT angiography (CTA) is routinely performed for post-procedure management after carotid artery stenting (CAS). However, the stent lumen tends to be underestimated because of stent artifacts on CTA reconstructed with the filtered back projection (FBP) technique. We assessed the utility of new iterative reconstruction techniques, such as adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR), for CTA after CAS in comparison with FBP. In a phantom study, we evaluated the differences among the three reconstruction techniques with regard to the relationship between the stent luminal diameter and the degree of underestimation of stent luminal diameter. In a clinical study, 34 patients who underwent follow-up CTA after CAS were included. We compared the stent luminal diameters among FBP, ASIR, and MBIR, and performed visual assessment of low attenuation area (LAA) in the stent lumen using a three-point scale. In the phantom study, stent luminal diameter was increasingly underestimated as luminal diameter became smaller in all CTA images. Stent luminal diameter was larger with MBIR than with the other reconstruction techniques. Similarly, in the clinical study, stent luminal diameter was larger with MBIR than with the other reconstruction techniques. LAA detectability scores of MBIR were greater than or equal to those of FBP and ASIR in all cases. MBIR improved the accuracy of assessment of stent luminal diameter and LAA detectability in the stent lumen when compared with FBP and ASIR. We conclude that MBIR is a useful reconstruction technique for CTA after CAS. (orig.)

  8. The developmental trajectory of children's auditory and visual statistical learning abilities: modality-based differences in the effect of age.

    Science.gov (United States)

    Raviv, Limor; Arnon, Inbal

    2017-09-12

    Infants, children and adults are capable of extracting recurring patterns from their environment through statistical learning (SL), an implicit learning mechanism that is considered to have an important role in language acquisition. Research over the past 20 years has shown that SL is present from very early infancy and found in a variety of tasks and across modalities (e.g., auditory, visual), raising questions about the domain generality of SL. However, while SL is well established for infants and adults, little is known about its developmental trajectory during childhood, leaving two important questions unanswered: (1) Is SL an early-maturing capacity that is fully developed in infancy, or does it improve with age like other cognitive capacities (e.g., memory)? and (2) Will SL have similar developmental trajectories across modalities? Only a few studies have looked at SL across development, with conflicting results: some find age-related improvements while others do not. Importantly, no study to date has examined auditory SL across childhood, nor compared it to visual SL to see if there are modality-based differences in the developmental trajectory of SL abilities. We addressed these issues by conducting a large-scale study of children's performance on matching auditory and visual SL tasks across a wide age range (5-12y). Results show modality-based differences in the development of SL abilities: while children's learning in the visual domain improved with age, learning in the auditory domain did not change in the tested age range. We examine these findings in light of previous studies and discuss their implications for modality-based differences in SL and for the role of auditory SL in language acquisition. A video abstract of this article can be viewed at: https://www.youtube.com/watch?v=3kg35hoF0pw. © 2017 John Wiley & Sons Ltd.

  9. Reduced aliasing artifacts using shaking projection k-space sampling trajectory

    Science.gov (United States)

    Zhu, Yan-Chun; Du, Jiang; Yang, Wen-Chao; Duan, Chai-Jie; Wang, Hao-Yu; Gao, Song; Bao, Shang-Lian

    2014-03-01

    Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main sources which degrade radial imaging quality. For a given fixed number of k-space projections, the data distributions along the radial and angular directions influence the level of aliasing and streaking artifacts. The conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of the point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory was proposed to reduce aliasing artifacts in MR images. The SP sampling trajectory alternately shifts projections along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on the conventional and SP sampling trajectories were compared with the same number of projections. A significant reduction of aliasing artifacts was observed using the SP sampling trajectory. The two trajectories were also compared at different sampling frequencies. An SP trajectory has the same aliasing character when using half the sampling frequency (or half the data) for reconstruction. SNR comparisons at different white noise levels show that the two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce the aliasing artifact without decreasing SNR and also provides a way for undersampling reconstruction. Furthermore, this method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts.
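    A minimal sketch of the sampling geometry, with coordinates normalized to the sampling interval. The half-sample shift of alternate projections implements the idea described above, though the exact shift pattern used by the authors may differ:

```python
import numpy as np


def radial_trajectory(n_proj, n_samples, shaking=False):
    """k-space sample coordinates for radial (PR) sampling.

    With shaking=True, every other projection is shifted by half a
    radial sample step along its spoke, separating k-space data in
    the azimuthal direction.
    """
    angles = np.pi * np.arange(n_proj) / n_proj
    r = np.linspace(-0.5, 0.5, n_samples, endpoint=False)
    dr = r[1] - r[0]
    traj = np.empty((n_proj, n_samples, 2))
    for i, th in enumerate(angles):
        ri = r + (0.5 * dr if (shaking and i % 2) else 0.0)
        traj[i, :, 0] = ri * np.cos(th)
        traj[i, :, 1] = ri * np.sin(th)
    return traj
```

    Gridding reconstruction of such non-Cartesian data would then proceed as usual; only the sample positions change.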

  10. Reduced aliasing artifacts using shaking projection k-space sampling trajectory

    International Nuclear Information System (INIS)

    Zhu Yan-Chun; Yang Wen-Chao; Wang Hao-Yu; Gao Song; Bao Shang-Lian; Du Jiang; Duan Chai-Jie

    2014-01-01

    Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main sources which degrade radial imaging quality. For a given fixed number of k-space projections, data distributions along radial and angular directions will influence the level of aliasing and streaking artifacts. Conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory was proposed to reduce aliasing artifacts in MR images. SP sampling trajectory shifts the projection alternately along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on conventional and SP sampling trajectories were compared with the same number projections. A significant reduction of aliasing artifacts was observed using the SP sampling trajectory. These two trajectories were also compared with different sampling frequencies. A SP trajectory has the same aliasing character when using half sampling frequency (or half data) for reconstruction. SNR comparisons with different white noise levels show that these two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce the aliasing artifact without decreasing SNR and also provide a way for undersampling reconstruction. Furthermore, this method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts

  11. A three-step reconstruction method for fluorescence molecular tomography based on compressive sensing

    DEFF Research Database (Denmark)

    Zhu, Yansong; Jha, Abhinav K.; Dreyer, Jakob K.

    2017-01-01

    Fluorescence molecular tomography (FMT) is a promising tool for real time in vivo quantification of neurotransmission (NT) as we pursue in our BRAIN initiative effort. However, the acquired image data are noisy and the reconstruction problem is ill-posed. Further, while spatial sparsity of the NT...... matrix coherence. The resultant image data are input to a homotopy-based reconstruction strategy that exploits sparsity via ℓ1 regularization. The reconstructed image is then input to a maximum-likelihood expectation maximization (MLEM) algorithm that retains the sparseness of the input estimate...... and improves upon the quantitation by accurate Poisson noise modeling. The proposed reconstruction method was evaluated in a three-dimensional simulated setup with fluorescent sources in a cuboidal scattering medium with optical properties simulating human brain cortex (reduced scattering coefficient: 9.2 cm-1...

  12. A deep learning-based reconstruction of cosmic ray-induced air showers

    Science.gov (United States)

    Erdmann, M.; Glombitza, J.; Walz, D.

    2018-01-01

    We describe a method of reconstructing air showers induced by cosmic rays using deep learning techniques. We simulate an observatory consisting of ground-based particle detectors with fixed locations on a regular grid. The detectors' responses to traversing shower particles are signal amplitudes as a function of time, which provide information on transverse and longitudinal shower properties. In order to take advantage of convolutional network techniques specialized in local pattern recognition, we convert all information to the image-like grid of the detectors. In this way, multiple features, such as arrival times of the first particles and optimized characterizations of time traces, are processed by the network. The reconstruction quality of the cosmic ray arrival direction turns out to be competitive with an analytic reconstruction algorithm. The reconstructed shower direction, energy and shower depth show the expected improvement in resolution for higher cosmic ray energy.

  13. Image Reconstruction Based on Homotopy Perturbation Inversion Method for Electrical Impedance Tomography

    Directory of Open Access Journals (Sweden)

    Jing Wang

    2013-01-01

    Full Text Available The image reconstruction problem for electrical impedance tomography (EIT) is mathematically a typical nonlinear ill-posed inverse problem. In this paper, a novel iteration regularization scheme based on the homotopy perturbation technique, namely, the homotopy perturbation inversion method, is applied to investigate the EIT image reconstruction problem. To verify its feasibility and effectiveness, simulations of image reconstruction have been performed considering different locations, sizes, and numbers of inclusions, as well as robustness to data noise. Numerical results indicate that this method can overcome numerical instability and is robust to data noise in EIT image reconstruction. Moreover, compared with the classical Landweber iteration method, our approach improves the convergence rate. The results are promising.
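    The classical Landweber iteration used as the baseline comparison is easy to sketch for a linearized problem Ax = y; the matrix below is a toy stand-in for an EIT sensitivity matrix, not one from the paper:

```python
import numpy as np


def landweber(A, y, n_iter=200, relax=None):
    """Landweber iteration x_{k+1} = x_k + w * A^T (y - A x_k).

    Choosing the relaxation w below 2 / ||A||_2^2 guarantees convergence
    for a consistent linear system.
    """
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    if relax is None:
        relax = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + relax * (A.T @ (y - A @ x))
    return x
```

    For ill-posed problems, the iteration count itself acts as the regularization parameter: stopping early suppresses the noise-dominated components that the homotopy perturbation scheme handles differently.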

  14. Error Evaluation in a Stereovision-Based 3D Reconstruction System

    Directory of Open Access Journals (Sweden)

    Kohler Sophie

    2010-01-01

    Full Text Available The work presented in this paper deals with the performance analysis of the whole 3D reconstruction process of imaged objects, specifically of the set of geometric primitives describing their outline and extracted from a pair of images knowing their associated camera models. The proposed analysis focuses on error estimation for the edge detection process, the starting step for the whole reconstruction procedure. The fitting parameters describing the geometric features composing the workpiece to be evaluated are used as quality measures to determine error bounds and finally to estimate the edge detection errors. These error estimates are then propagated up to the final 3D reconstruction step. The suggested error analysis procedure for stereovision-based reconstruction tasks further allows evaluating the quality of the 3D reconstruction. The resulting final error estimates finally make it possible to state whether the reconstruction results fulfill a priori defined criteria (for example, dimensional constraints including tolerance information) in vision-based quality control applications.

  15. Intensity-based bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the ℓ1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)
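    The ML-EM update that the proposed prior modifies has a compact multiplicative form. The sketch below shows the plain, unpenalized update on a toy system; the matrix and data are illustrative, and the paper's intensity prior would add a penalty term to this update:

```python
import numpy as np


def ml_em(A, y, n_iter=100):
    """Unregularized ML-EM: x <- x / (A^T 1) * A^T (y / (A x)).

    A is the system matrix, y the measured projections; the
    multiplicative form keeps the estimate nonnegative throughout.
    """
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    sens = A.T @ np.ones(len(y))          # sensitivity image A^T 1
    x = np.ones(A.shape[1])               # strictly positive start
    for _ in range(n_iter):
        proj = np.maximum(A @ x, 1e-12)   # forward project, avoid /0
        x = x / sens * (A.T @ (y / proj))
    return x
```

    With angularly undersampled data the system becomes rank-deficient, which is where the ℓ1 intensity prior of the paper constrains the solution.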

  16. Late summer temperature reconstruction based on tree-ring density for Sygera Mountain, southeastern Tibetan Plateau

    Science.gov (United States)

    Li, Mingyong; Duan, Jianping; Wang, Lily; Zhu, Haifeng

    2018-04-01

    Although several tree-ring density-based summer/late summer temperature reconstructions have been developed on the Tibetan Plateau (TP), the understanding of the local/regional characteristics of summer temperature fluctuations on a long-term scale in some regions is still limited. To improve our understanding in these aspects, more local or regional summer temperature reconstructions extending back over several centuries are required. In this study, a new mean latewood density (LWD) chronology from Abies georgei var. smithii from the upper tree line of Sygera Mountain on the southeastern TP was developed to reconstruct the late summer temperature variability since 1820 CE. The bootstrapped correlation analysis showed that the LWD chronology index was significantly and positively correlated with the late summer (August-September) mean temperatures (r1950-2008 = 0.63, p < 0.001) recorded at the nearest meteorological station and that this reconstruction has considerable potential to represent the late summer temperature variability at the regional scale. Our late summer temperature reconstruction revealed three obvious cold periods (i.e., 1872-1908, 1913-1937 and 1941-1966) and two relatively warm phases (i.e., 1821-1871 and 1970-2008) over the past two centuries. Comparisons of our reconstruction with other independent tree-ring-based temperature reconstructions, glacier fluctuations and historical documental records from neighboring regions showed good agreement in these relatively cold and warm intervals. Our reconstruction exhibits an overall increasing temperature trend since the 1960s, providing new evidence supporting the recent warming of the TP. Moreover, our results also indicate that the late summer temperature variability of Sygera Mountain on the southeastern TP has potential links with the Pacific Decadal Oscillation (PDO).
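    The calibration behind such a reconstruction is a least-squares transfer function fitted over the instrumental overlap period. The numbers below are synthetic toy data, not the Sygera Mountain density or temperature series:

```python
import numpy as np


def calibrate(proxy, temp):
    """Fit temp ~ a * proxy + b over the overlap period and return the
    transfer function used to reconstruct pre-instrumental values."""
    a, b = np.polyfit(proxy, temp, 1)
    return lambda x: a * np.asarray(x, dtype=float) + b


# Synthetic overlap period: latewood density index vs. Aug-Sep mean
# temperature (perfectly linear toy values for illustration).
proxy = np.array([0.8, 0.9, 1.0, 1.1, 1.2])
temp = np.array([9.0, 9.5, 10.0, 10.5, 11.0])
reconstruct = calibrate(proxy, temp)
```

    In practice the skill of such a regression is judged by the calibration correlation (r = 0.63 over 1950-2008 in the paper) and by verification statistics on withheld data, which this sketch omits.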

  17. Trajectory Tracking of a Tri-Rotor Aerial Vehicle Using an MRAC-Based Robust Hybrid Control Algorithm

    Directory of Open Access Journals (Sweden)

    Zain Anwar Ali

    2017-01-01

    Full Text Available In this paper, a novel Model Reference Adaptive Control (MRAC)-based hybrid control algorithm is presented for the trajectory tracking of a tri-rotor Unmanned Aerial Vehicle (UAV). The mathematical model of the tri-rotor is based on the Newton–Euler formula, whereas the MRAC-based hybrid controller consists of Fuzzy Proportional Integral Derivative (F-PID) and Fuzzy Proportional Derivative (F-PD) controllers. MRAC is used as the main controller for the dynamics, while the parameters of the adaptive controller are fine-tuned by the F-PD controller for the altitude control subsystem and the F-PID controller for the attitude control subsystem of the UAV. The stability of the system is ensured and proven by Lyapunov stability analysis. The proposed control algorithm is tested and verified using computer simulations for the trajectory tracking of the desired path as an input. The effectiveness of our proposed algorithm is compared with F-PID and the Fuzzy Logic Controller (FLC). Our proposed controller exhibits much less steady-state error and quick error convergence in the presence of disturbances, noise, and model uncertainties.
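    The MRAC idea can be illustrated on a first-order plant with the classical MIT adaptation rule. This is a generic textbook sketch under assumed gains, not the paper's tri-rotor controller or its fuzzy tuning loops:

```python
def mrac_mit(k_p=2.0, k_m=1.0, gamma=0.5, dt=0.01, t_end=200.0):
    """MIT-rule MRAC for plant y' = -y + k_p*u, model y_m' = -y_m + k_m*r.

    The adjustable feedforward gain theta is adapted so that
    theta * k_p approaches k_m (here theta should approach 0.5).
    """
    y = ym = theta = 0.0
    for i in range(int(t_end / dt)):
        r = 1.0 if (i * dt) % 20.0 < 10.0 else -1.0  # square-wave reference
        e = y - ym                                   # model-following error
        theta += dt * (-gamma * e * ym)              # MIT adaptation law
        y += dt * (-y + k_p * theta * r)             # plant (Euler step)
        ym += dt * (-ym + k_m * r)                   # reference model
    return theta
```

    In the paper's scheme, the adaptation gains themselves are tuned online by the fuzzy F-PD/F-PID loops rather than fixed as here, and stability is shown by Lyapunov analysis rather than the heuristic MIT rule.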

  18. A Minimum Fuel Based Estimator for Maneuver and Natural Dynamics Reconstruction

    Science.gov (United States)

    Lubey, D.; Scheeres, D.

    2013-09-01

    The vast and growing population of objects in Earth orbit (active and defunct spacecraft, orbital debris, etc.) offers many unique challenges when it comes to tracking these objects and associating the resulting observations. Complicating these challenges are the inaccurate natural dynamical models of these objects, the active maneuvers of spacecraft that deviate them from their ballistic trajectories, and the fact that spacecraft are tracked and operated by separate agencies. Maneuver detection and reconstruction algorithms can help with each of these issues by estimating mismodeled and unmodeled dynamics through indirect observation of spacecraft. It also helps to verify the associations made by an object correlation algorithm or aid in making those associations, which is essential when tracking objects in orbit. The algorithm developed in this study applies an Optimal Control Problem (OCP) Distance Metric approach to the problems of Maneuver Reconstruction and Dynamics Estimation. This was first developed by Holzinger, Scheeres, and Alfriend (2011), with a subsequent study by Singh, Horwood, and Poore (2012). This method estimates the minimum fuel control policy rather than the state as a typical Kalman Filter would. This difference ensures that the states are connected through a given dynamical model and allows for automatic covariance manipulation, which can help to prevent filter saturation. Using a string of measurements (either verified or hypothesized to correlate with one another), the algorithm outputs a corresponding string of adjoint and state estimates with associated noise. Post-processing techniques are implemented, which when applied to the adjoint estimates can remove noise and expose unmodeled maneuvers and mismodeled natural dynamics. Specifically, the estimated controls are used to determine spacecraft dependent accelerations (atmospheric drag and solar radiation pressure) using an adapted form of the Optimal Control based natural dynamics

  19. Practical considerations for image-based PSF and blobs reconstruction in PET

    International Nuclear Information System (INIS)

    Stute, Simon; Comtat, Claude

    2013-01-01

    Iterative reconstructions in positron emission tomography (PET) need a model relating the recorded data to the object/patient being imaged, called the system matrix (SM). The more realistic this model, the better the spatial resolution in the reconstructed images. However, a serious concern when using a SM that accurately models the resolution properties of the PET system is the undesirable edge artefact, visible through oscillations near sharp discontinuities in the reconstructed images. This artefact is a natural consequence of solving an ill-conditioned inverse problem, where the recorded data are band-limited. In this paper, we focus on practical aspects when considering image-based point-spread function (PSF) reconstructions. To remove the edge artefact, we propose to use a particular case of the method of sieves (Grenander 1981 Abstract Inference New York: Wiley), which simply consists in performing a standard PSF reconstruction, followed by a post-smoothing using the PSF as the convolution kernel. Using analytical simulations, we investigate the impact of different reconstruction and PSF modelling parameters on the edge artefact and its suppression, in the case of noise-free data and an exactly known PSF. Using Monte-Carlo simulations, we assess the proposed method of sieves with respect to the choice of the geometric projector and the PSF model used in the reconstruction. When the PSF model is accurately known, we show that the proposed method of sieves succeeds in completely suppressing the edge artefact, though after a number of iterations higher than typically used in practice. When applying the method to realistic data (i.e. unknown true SM and noisy data), we show that the choice of the geometric projector and the PSF model does not impact the results in terms of noise and contrast recovery, as long as the PSF has a width close to the true PSF one. Equivalent results were obtained using either blobs or voxels in the same conditions (i.e. the blob
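    The proposed variant of the method of sieves reduces, in practice, to one extra convolution after the PSF-based reconstruction. A minimal 1-D sketch with a hypothetical 3-tap PSF:

```python
import numpy as np


def sieve_post_smooth(recon, psf):
    """Post-smooth a PSF-based reconstruction with the PSF itself,
    suppressing edge-artefact oscillations (method of sieves).

    psf should be normalized to unit sum so total activity is preserved.
    """
    return np.convolve(recon, psf, mode='same')
```

    The key point of the paper is that the smoothing kernel equals the PSF used in the reconstruction model; any mismatch in PSF width changes the noise/contrast trade-off.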

  20. Household Food Insecurity and Children's Behaviour Problems: New Evidence from a Trajectories-Based Study

    Science.gov (United States)

    Huang, Jin; Vaughn, Michael G.

    2016-01-01

    This study examined the association between household food insecurity (insufficient access to adequate and nutritious food) and trajectories of externalising and internalising behaviour problems in children from kindergarten to fifth grade using longitudinal data from the Early Childhood Longitudinal Study—Kindergarten Cohort (ECLS-K), a nationally representative study in the USA. Household food insecurity was assessed using the eighteen-item standard food security scale, and children's behaviour problems were reported by teachers. Latent growth curve analysis was conducted on 7,348 children in the ECLS-K, separately for boys and girls. Following adjustment for an extensive array of confounding variables, results suggest that food insecurity generally was not associated with developmental change in children's behaviour problems. The impact of food insecurity on behaviour problems may be episodic or interact with certain developmental stages. PMID:27559210

  1. Novel Fourier-based iterative reconstruction for sparse fan projection using alternating direction total variation minimization

    International Nuclear Information System (INIS)

    Jin Zhao; Zhang Han-Ming; Yan Bin; Li Lei; Wang Lin-Yuan; Cai Ai-Long

    2016-01-01

    Sparse-view x-ray computed tomography (CT) imaging is an interesting topic in the CT field and can efficiently decrease radiation dose. Compared with spatial-domain reconstruction, a Fourier-based algorithm has advantages in reconstruction speed and memory usage. A novel Fourier-based iterative reconstruction technique that utilizes the non-uniform fast Fourier transform (NUFFT) is presented in this work, along with advanced total variation (TV) regularization, for fan-beam sparse-view CT. The introduction of a selective matrix contributes to improved reconstruction quality. The new method employs the NUFFT and its adjoint to iterate back and forth between the Fourier and image spaces. The performance of the proposed algorithm is demonstrated through a series of digital simulations and experimental phantom studies. Results of the proposed algorithm are compared with those of existing TV-regularized techniques based on the compressed sensing method, as well as the basic algebraic reconstruction technique. Compared with the existing TV-regularized techniques, the proposed Fourier-based technique significantly improves the convergence rate and reduces memory usage. (paper)

  2. Environment-based pin-power reconstruction method for homogeneous core calculations

    International Nuclear Information System (INIS)

    Leroyer, H.; Brosselard, C.; Girardi, E.

    2012-01-01

    Core calculation schemes are usually based on a classical two-step approach associated with assembly and core calculations. During the first step, infinite-lattice assembly calculations relying on a fundamental mode approach are used to generate cross-section libraries for PWR core calculations. This fundamental mode hypothesis may be questioned when dealing with loading patterns involving several types of assemblies (UOX, MOX), burnable poisons, control rods and burn-up gradients. This paper proposes a calculation method able to take into account the heterogeneous environment of the assemblies when using homogeneous core calculations and an appropriate pin-power reconstruction. This methodology is applied to MOX assemblies, computed within an environment of UOX assemblies. The new environment-based pin-power reconstruction is then used on various clusters of 3x3 assemblies showing burn-up gradients and UOX/MOX interfaces, and compared to reference calculations performed with APOLLO-2. The results show that UOX/MOX interfaces are much better calculated with the environment-based calculation scheme than with the usual pin-power reconstruction method. The power peak is always better located and calculated with the environment-based pin-power reconstruction method in every cluster configuration studied. This study shows that taking the environment into account in transport calculations can significantly improve the pin-power reconstruction insofar as it is consistent with the core loading pattern. (authors)

  3. Analysis of Spatio-Temporal Traffic Patterns Based on Pedestrian Trajectories

    Science.gov (United States)

    Busch, S.; Schindler, T.; Klinger, T.; Brenner, C.

    2016-06-01

    For driver assistance and autonomous driving systems, it is essential to predict the behaviour of other traffic participants. Usually, standard filter approaches are used to this end, however, in many cases, these are not sufficient. For example, pedestrians are able to change their speed or direction instantly. Also, there may be not enough observation data to determine the state of an object reliably, e.g. in case of occlusions. In those cases, it is very useful if a prior model exists, which suggests certain outcomes. For example, it is useful to know that pedestrians are usually crossing the road at a certain location and at certain times. This information can then be stored in a map which then can be used as a prior in scene analysis, or in practical terms to reduce the speed of a vehicle in advance in order to minimize critical situations. In this paper, we present an approach to derive such a spatio-temporal map automatically from the observed behaviour of traffic participants in everyday traffic situations. In our experiments, we use one stationary camera to observe a complex junction, where cars, public transportation and pedestrians interact. We concentrate on the pedestrians' trajectories to map traffic patterns. In the first step, we extract trajectory segments from the video data. These segments are then clustered in order to derive a spatial model of the scene, in terms of a spatially embedded graph. In the second step, we analyse the temporal patterns of pedestrian movement on this graph. We are able to derive traffic light sequences as well as the timetables of nearby public transportation. To evaluate our approach, we used a 4-hour video sequence. We show that we are able to derive traffic light sequences as well as timetables of nearby public transportation.
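    The clustering of trajectory segments into a spatial model can be sketched with a plain k-means over start/end coordinates. The two synthetic pedestrian flows below are illustrative, not the junction data from the paper, and the feature choice is a simplification of the authors' graph construction:

```python
import numpy as np


def segment_features(segments):
    """One feature vector per segment: [x_start, y_start, x_end, y_end]."""
    return np.array([[s[0][0], s[0][1], s[-1][0], s[-1][1]] for s in segments])


def kmeans(X, k, n_iter=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers, dtype=float)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

    Each resulting cluster corresponds to one edge of the spatially embedded graph, on which the temporal patterns (traffic light cycles, timetables) are then analysed.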

  4. ANALYSIS OF SPATIO-TEMPORAL TRAFFIC PATTERNS BASED ON PEDESTRIAN TRAJECTORIES

    Directory of Open Access Journals (Sweden)

    S. Busch

    2016-06-01

    Full Text Available For driver assistance and autonomous driving systems, it is essential to predict the behaviour of other traffic participants. Usually, standard filter approaches are used to this end; however, in many cases these are not sufficient. For example, pedestrians are able to change their speed or direction instantly. Also, there may not be enough observation data to determine the state of an object reliably, e.g. in case of occlusions. In those cases, it is very useful if a prior model exists which suggests certain outcomes. For example, it is useful to know that pedestrians usually cross the road at a certain location and at certain times. This information can then be stored in a map, which can be used as a prior in scene analysis or, in practical terms, to reduce the speed of a vehicle in advance in order to minimize critical situations. In this paper, we present an approach to derive such a spatio-temporal map automatically from the observed behaviour of traffic participants in everyday traffic situations. In our experiments, we use one stationary camera to observe a complex junction, where cars, public transportation and pedestrians interact. We concentrate on the pedestrians' trajectories to map traffic patterns. In the first step, we extract trajectory segments from the video data. These segments are then clustered in order to derive a spatial model of the scene in terms of a spatially embedded graph. In the second step, we analyse the temporal patterns of pedestrian movement on this graph, from which we are able to derive the traffic light sequences as well as the timetables of nearby public transportation. We evaluate our approach on a 4-hour video sequence.

  5. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogeneous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogeneous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.
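
In a generic MBIR formulation, the reconstruction is the minimizer of a data-fit term under a forward model plus a prior term. The sketch below uses a linear forward operator and a quadratic (Tikhonov) prior purely for illustration; the paper's actual acoustic forward model and prior are more involved:

```python
import numpy as np

def mbir(y, A, lam=0.01, iters=2000):
    """Toy MBIR: minimize ||A x - y||^2 / 2 + lam * ||x||^2 / 2
    by gradient descent, with A standing in for the forward model."""
    x = np.zeros(A.shape[1])
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)  # 1 / Lipschitz constant
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + lam * x  # data-fit gradient + prior gradient
        x -= step * grad
    return x

# Simulated measurements from a known object
A = np.array([[2.0, 0.0, 0.0], [0.0, 1.5, 0.0], [0.0, 0.0, 1.0]])
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true
x_hat = mbir(y, A, lam=0.01)
```

For this quadratic toy problem the iterations converge to the closed-form regularized solution; real MBIR replaces both the forward operator and the prior with physics-based models.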

  6. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2016-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogeneous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogeneous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  7. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but noise and the compression ratio limit reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem of compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficients for different regularizations and frames are resolved from the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  8. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    Science.gov (United States)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogeneous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogeneous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  9. Reference Information Based Remote Sensing Image Reconstruction with Generalized Nonconvex Low-Rank Approximation

    Directory of Open Access Journals (Sweden)

    Hongyang Lu

    2016-06-01

    Full Text Available Because of the contradiction between the spatial and temporal resolution of remote sensing images (RSI) and quality loss in the process of acquisition, it is of great significance to reconstruct RSI in remote sensing applications. Recent studies have demonstrated that reference image-based reconstruction methods have great potential for higher reconstruction performance, while lacking accuracy and quality of reconstruction. For this application, a new compressed sensing objective function incorporating a reference image as prior information is developed. We resort to the reference prior information inherent in interior and exterior data simultaneously to build a new generalized nonconvex low-rank approximation framework for RSI reconstruction. Specifically, the innovation of this paper consists of the following three respects: (1) we propose a nonconvex low-rank approximation for reconstructing RSI; (2) we inject reference prior information to overcome over-smoothed edges and texture detail losses; (3) on this basis, we combine conjugate gradient algorithms and singular value thresholding (SVT) to solve the proposed algorithm. The performance of the algorithm is evaluated both qualitatively and quantitatively. Experimental results demonstrate that the proposed algorithm improves the peak signal-to-noise ratio (PSNR) by several dB and preserves image details significantly compared to most current approaches that do not use reference images as priors. In addition, the generalized nonconvex low-rank approximation of our approach is naturally robust to noise; therefore, the proposed algorithm can handle low resolution with noisy inputs in a more unified framework.
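
The singular value thresholding step mentioned above can be sketched as a generic SVT operator (the proximal operator of the nuclear norm); this is not the paper's full low-rank reconstruction pipeline:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: soft-threshold the singular values of M,
    which shrinks weak components and lowers the effective rank."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Rank-2 matrix whose weak second component is removed by thresholding
M = 5.0 * np.outer([1.0, 0.0, 0.0], [1.0, 0.0]) \
    + 0.1 * np.outer([0.0, 1.0, 0.0], [0.0, 1.0])
M_low = svt(M, tau=1.0)
```

Here the singular values 5 and 0.1 become 4 and 0, so the thresholded matrix is exactly rank 1.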

  10. 3.5D dynamic PET image reconstruction incorporating kinetics-based clusters

    International Nuclear Information System (INIS)

    Lu Lijun; Chen Wufan; Karakatsanis, Nicolas A; Rahmim, Arman; Tang Jing

    2012-01-01

    Standard 3D dynamic positron emission tomography (PET) imaging consists of independent image reconstructions of individual frames followed by application of an appropriate kinetic model to the time-activity curves at the voxel or region-of-interest (ROI) level. The emerging field of 4D PET reconstruction, by contrast, seeks to move beyond this scheme and incorporate information from multiple frames within the image reconstruction task. Here we propose a novel reconstruction framework aiming to enhance the quantitative accuracy of parametric images via the introduction of priors based on voxel kinetics, as generated via clustering of preliminary reconstructed dynamic images to define clustered neighborhoods of voxels with similar kinetics. This is then followed by straightforward maximum a posteriori (MAP) 3D PET reconstruction as applied to individual frames; as such, the method is labeled ‘3.5D’ image reconstruction. The use of cluster-based priors has the advantage of further enhancing quantitative performance in dynamic PET imaging, because: (a) there are typically more voxels in clusters than in conventional local neighborhoods, and (b) neighboring voxels with distinct kinetics are less likely to be clustered together. Using realistic simulated 11C-raclopride dynamic PET data, the quantitative performance of the proposed method was investigated. Parametric distribution-volume (DV) and DV ratio (DVR) images were estimated from dynamic image reconstructions using (a) maximum-likelihood expectation maximization (MLEM), and MAP reconstructions using (b) the quadratic prior (QP-MAP), (c) the Green prior (GP-MAP) and (d, e) two proposed cluster-based priors (CP-U-MAP and CP-W-MAP), followed by graphical modeling, and were qualitatively and quantitatively compared for 11 ROIs. Overall, the proposed dynamic PET reconstruction methodology resulted in substantial visual as well as quantitative accuracy improvements (in terms of noise versus bias performance) for parametric DV and DVR images.
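
The cluster-based prior can be illustrated with a simple quadratic energy that penalizes each voxel's deviation from the mean of its kinetic cluster; this is a hypothetical stand-in for the paper's CP-U/CP-W priors, not their exact form:

```python
import numpy as np

def cluster_prior_energy(x, labels):
    """Quadratic cluster-based prior: sum over clusters of squared deviations
    of member voxels from the cluster mean (voxels with similar kinetics are
    encouraged to take similar values)."""
    energy = 0.0
    for c in np.unique(labels):
        members = x[labels == c]
        energy += ((members - members.mean()) ** 2).sum()
    return energy

# Two kinetic clusters; a within-cluster-homogeneous image has zero prior energy
labels = np.array([0, 0, 0, 1, 1, 1])
flat = np.array([2.0, 2.0, 2.0, 5.0, 5.0, 5.0])
bumpy = np.array([2.0, 2.0, 3.0, 5.0, 5.0, 5.0])
```

Note that `flat` incurs no penalty even though its two clusters have very different values, which is exactly the point: voxels with distinct kinetics are not smoothed toward each other.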

  11. Efficient L1 regularization-based reconstruction for fluorescent molecular tomography using restarted nonlinear conjugate gradient.

    Science.gov (United States)

    Shi, Junwei; Zhang, Bin; Liu, Fei; Luo, Jianwen; Bai, Jing

    2013-09-15

    For the ill-posed fluorescent molecular tomography (FMT) inverse problem, L1 regularization can preserve high-frequency information such as edges while effectively reducing image noise. However, the state-of-the-art L1 regularization-based algorithms for FMT reconstruction are memory-expensive, especially for large-scale problems. An efficient L1 regularization-based reconstruction algorithm, based on nonlinear conjugate gradient with a restart strategy, is proposed to increase computational speed with low memory consumption. The reconstruction results from phantom experiments demonstrate that the proposed algorithm can obtain high spatial resolution and a high signal-to-noise ratio, as well as high localization accuracy for fluorescence targets.
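
A restarted nonlinear conjugate gradient scheme for an L1-regularized least-squares problem can be sketched as below. The smoothing of |x|, the Fletcher-Reeves update, and the backtracking line search are generic choices assumed here, not the paper's exact algorithm:

```python
import numpy as np

def ncg_restart(A, y, lam=1e-3, eps=1e-6, iters=200, restart=20):
    """L1-regularized least squares via nonlinear conjugate gradient with
    periodic restart; |x| is smoothed as sqrt(x^2 + eps) so the objective
    is differentiable."""
    def f(x):
        r = A @ x - y
        return 0.5 * r @ r + lam * np.sum(np.sqrt(x * x + eps))
    def grad(x):
        return A.T @ (A @ x - y) + lam * x / np.sqrt(x * x + eps)

    x = np.zeros(A.shape[1])
    g = grad(x)
    d = -g
    for k in range(iters):
        if k % restart == 0 or g @ d >= 0:  # restart keeps d a descent direction
            d = -g
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                        # backtracking (Armijo) line search
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves update
        d = -g_new + beta * d
        g = g_new
    return x

# Underdetermined sparse-recovery toy problem
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 20))
x_true = np.zeros(20)
x_true[[2, 7, 11]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = ncg_restart(A, y)
```

The restart both controls memory (only a few vectors are stored) and guards against the conjugate direction degrading, which is the practical appeal cited for this family of methods.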

  12. A Hybrid Model Based on Wavelet Decomposition-Reconstruction in Track Irregularity State Forecasting

    Directory of Open Access Journals (Sweden)

    Chaolong Jia

    2015-01-01

    Full Text Available The wavelet transform adapts automatically to the requirements of time-frequency signal analysis: it can focus on any detail of a signal and decompose a function into a representation over a series of simple basis functions, which is of theoretical and practical significance. This paper therefore subdivides track irregularity time series based on the idea of wavelet decomposition-reconstruction and seeks the best-fitting forecast models for the detail and approximation signals obtained through wavelet decomposition of the track irregularity time series. On this basis, the piecewise gray-ARMA recursive model based on wavelet decomposition and reconstruction (PG-ARMARWDR) and the piecewise ANN-ARMA recursive model based on wavelet decomposition and reconstruction (PANN-ARMARWDR) are proposed. Comparison and analysis of the two models show that both can achieve higher accuracy.
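
The decomposition-reconstruction idea can be sketched with a one-level Haar wavelet split (the choice of the Haar wavelet here is illustrative; the paper does not specify it): the approximation signal carries the trend, the detail signal the high-frequency irregularities, and each can be fitted with its own forecast model before recombining.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet split of an even-length series into
    approximation (trend) and detail coefficients."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_reconstruct(a, d):
    """Invert haar_decompose, recombining trend and detail."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

series = np.arange(8.0)           # stand-in for a track irregularity series
a, d = haar_decompose(series)
recovered = haar_reconstruct(a, d)
```

Perfect reconstruction of the original series from the two components is what makes it safe to forecast each component separately and sum the forecasts.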

  13. Anterior Cranial Base Reconstruction with a Reverse Temporalis Muscle Flap and Calvarial Bone Graft

    Directory of Open Access Journals (Sweden)

    Seung Gee Kwon

    2012-07-01

    Full Text Available Background: Cranial base defects are challenging to reconstruct without serious complications. Although free tissue transfer has been used widely and efficiently, it still has the limitation of requiring a long operation time along with the burden of microanastomosis and donor site morbidity. We propose using a reverse temporalis muscle flap and calvarial bone graft as an alternative option to a free flap for anterior cranial base reconstruction. Methods: Between April 2009 and February 2012, cranial base reconstructions using an autologous calvarial split bone graft combined with a reverse temporalis muscle flap were performed in five patients. Medical records were retrospectively analyzed and postoperative computed tomography scans, magnetic resonance imaging, and angiography findings were examined to evaluate graft survival and flap viability. Results: The mean follow-up period was 11.8 months and the mean operation time for reconstruction was 8.4±3.36 hours. The defects involved the anterior cranial base, including the orbital roof and the frontal and ethmoidal sinus. All reconstructions were successful. Viable flap vascularity and bone survival were observed. There were no serious complications except for acceptable donor site depressions, which were easily corrected with minor procedures. Conclusions: The reverse temporalis muscle flap could provide sufficient bulkiness to fill dead space and sufficient vascularity to endure infection. The calvarial bone graft provides a rigid framework, which is critical for maintaining the cranial base structure. Combined anterior cranial base reconstruction with a reverse temporalis muscle flap and calvarial bone graft could be a viable alternative to free tissue transfer.

  14. Sample selection based on kernel-subclustering for the signal reconstruction of multifunctional sensors

    International Nuclear Information System (INIS)

    Wang, Xin; Wei, Guo; Sun, Jinwei

    2013-01-01

    The signal reconstruction methods based on inverse modeling for the signal reconstruction of multifunctional sensors have been widely studied in recent years. To improve accuracy, the reconstruction methods have become more and more complicated because of the increase in model parameters and sample points. However, there is another factor that affects the reconstruction accuracy: the position of the sample points, which has not been studied. A reasonable selection of the sample points could improve the signal reconstruction quality in at least two ways: improved accuracy with the same number of sample points, or the same accuracy obtained with a smaller number of sample points. Both ways are valuable for improving accuracy and decreasing the workload, especially for large batches of multifunctional sensors. In this paper, we propose a sample selection method based on kernel-subclustering to distill groupings of the sample data and produce a representation of the data set for inverse modeling. The method calculates the distance between two data points based on the kernel-induced distance instead of the conventional distance. The kernel function is a generalization of the distance metric, mapping data that are non-separable in the original space into homogeneous groups in a high-dimensional space. The method obtained the best results compared with the other three methods in the simulation. (paper)
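
The kernel-induced distance mentioned above follows from expanding the feature-space norm: ||φ(x) − φ(y)||² = K(x,x) − 2K(x,y) + K(y,y). A minimal sketch with a Gaussian kernel (the specific kernel is an assumption, not taken from the paper):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel K(x, y)."""
    diff = x - y
    return np.exp(-(diff @ diff) / (2.0 * sigma ** 2))

def kernel_distance(x, y, kernel=gaussian_kernel):
    """Distance in the feature space induced by the kernel:
    ||phi(x) - phi(y)|| = sqrt(K(x,x) - 2 K(x,y) + K(y,y))."""
    val = kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)
    return np.sqrt(max(val, 0.0))  # clamp tiny negatives from round-off

p0 = np.zeros(2)
p1 = np.array([1.0, 0.0])
p3 = np.array([3.0, 0.0])
```

For the Gaussian kernel this distance is zero for identical points, grows monotonically with separation, and saturates at √2, which compresses the influence of far outliers during subclustering.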

  15. A volume of fluid method based on multidimensional advection and spline interface reconstruction

    International Nuclear Information System (INIS)

    Lopez, J.; Hernandez, J.; Gomez, P.; Faura, F.

    2004-01-01

    A new volume of fluid method for tracking two-dimensional interfaces is presented. The method involves a multidimensional advection algorithm based on the use of edge-matched flux polygons to integrate the volume fraction evolution equation, and a spline-based reconstruction algorithm. The accuracy and efficiency of the proposed method are analyzed using different tests, and the results are compared with those obtained recently by other authors. Despite its simplicity, the proposed method represents a significant improvement, and compares favorably with other volume of fluid methods as regards the accuracy and efficiency of both the advection and reconstruction steps

  16. Color Doppler Ultrasonography-Targeted Perforator Mapping and Angiosome-Based Flap Reconstruction

    DEFF Research Database (Denmark)

    Gunnarsson, Gudjon Leifur; Tei, Troels; Thomsen, Jørn Bo

    2016-01-01

    Knowledge about perforators and angiosomes has inspired new and innovative flap designs for reconstruction of defects throughout the body. The purpose of this article is to share our experience using color Doppler ultrasonography (CDU)-targeted perforator mapping and angiosome-based flap reconstruction.

  17. Surface reconstruction and deformation monitoring of stratospheric airship based on laser scanning technology

    Science.gov (United States)

    Guo, Kai; Xie, Yongjie; Ye, Hu; Zhang, Song; Li, Yunfei

    2018-04-01

    Due to the uncertainty of a stratospheric airship's shape and the safety problems this uncertainty causes, surface reconstruction and surface deformation monitoring of the airship were conducted based on laser scanning technology, and a √3-subdivision scheme based on Shepard interpolation was developed. A comparison was then conducted between our subdivision scheme and the original √3-subdivision scheme. The result shows that our subdivision scheme reduces surface shrinkage and the number of narrow triangles while keeping sharp features. Surface reconstruction and surface deformation monitoring of the airship can therefore be conducted precisely with our subdivision scheme.
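
Shepard interpolation, the ingredient of the modified √3-subdivision scheme, is inverse-distance weighting; a minimal sketch (the weighting exponent and how it is used inside the subdivision scheme are assumptions):

```python
import numpy as np

def shepard(x_query, x_known, y_known, p=2.0, tol=1e-12):
    """Shepard (inverse-distance-weighted) interpolation: the value at the
    query point is a weighted average of known values, with weights 1/d^p."""
    d = np.linalg.norm(x_known - x_query, axis=1)
    if d.min() < tol:                    # query coincides with a data point
        return y_known[int(np.argmin(d))]
    w = 1.0 / d ** p
    return (w @ y_known) / w.sum()

# Known scan values at two surface points; interpolate at their midpoint
x_known = np.array([[0.0, 0.0], [2.0, 0.0]])
y_known = np.array([0.0, 4.0])
mid = shepard(np.array([1.0, 0.0]), x_known, y_known)
```

The scheme interpolates exactly at the data points and stays within the range of the known values, which helps avoid the surface shrinkage the abstract mentions.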

  18. Computed tomography depiction of small pediatric vessels with model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Koc, Gonca; Courtier, Jesse L.; Phelps, Andrew; Marcovici, Peter A.; MacKenzie, John D. [UCSF Benioff Children' s Hospital, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States)

    2014-07-15

    Computed tomography (CT) is extremely important in characterizing blood vessel anatomy and vascular lesions in children. Recent advances in CT reconstruction technology hold promise for improved image quality and also reductions in radiation dose. This report evaluates potential improvements in image quality for the depiction of small pediatric vessels with model-based iterative reconstruction (Veo™), a technique developed to improve image quality and reduce noise. To evaluate Veo™ as an improved method when compared to adaptive statistical iterative reconstruction (ASIR™) for the depiction of small vessels on pediatric CT. Seventeen patients (mean age: 3.4 years, range: 2 days to 10.0 years; 6 girls, 11 boys) underwent contrast-enhanced CT examinations of the chest and abdomen in this HIPAA-compliant and institutional review board-approved study. Raw data were reconstructed into separate image datasets using Veo™ and ASIR™ algorithms (GE Medical Systems, Milwaukee, WI). Four blinded radiologists subjectively evaluated image quality. The pulmonary, hepatic, splenic and renal arteries were evaluated for the length and number of branches depicted. Datasets were compared with parametric and non-parametric statistical tests. Readers stated a preference for Veo™ over ASIR™ images when subjectively evaluating image quality criteria for vessel definition, image noise and resolution of small anatomical structures. The mean image noise in the aorta and fat was significantly less for Veo™ vs. ASIR™ reconstructed images. Quantitative measurements of mean vessel lengths and number of branch vessels delineated were significantly different for Veo™ and ASIR™ images. Veo™ consistently showed more of the vessel anatomy: longer vessel length and more branching vessels. When compared to the more established adaptive statistical iterative reconstruction algorithm, model-based iterative reconstruction improved the depiction of small pediatric vessels.

  19. ℓ0 Gradient Minimization Based Image Reconstruction for Limited-Angle Computed Tomography.

    Directory of Open Access Journals (Sweden)

    Wei Yu

    Full Text Available In medical and industrial applications of computed tomography (CT) imaging, limited by the scanning environment and the risk of excessive X-ray radiation exposure imposed on the patients, reconstructing high-quality CT images from limited projection data has become a hot topic. X-ray imaging over a limited scanning angular range is an effective imaging modality for reducing the radiation dose to the patients. As the projection data available in this modality are incomplete, limited-angle CT image reconstruction is actually an ill-posed inverse problem. Images reconstructed by the conventional filtered back projection (FBP) algorithm frequently exhibit conspicuous streak artifacts and gradual-change artifacts near edges. Image reconstruction based on total variation minimization (TVM) can significantly reduce streak artifacts in few-view CT, but it suffers from the gradual-change artifacts near edges in limited-angle CT. To suppress this kind of artifact, we develop an image reconstruction algorithm based on ℓ0 gradient minimization for limited-angle CT in this paper. The ℓ0-norm of the image gradient is taken as the regularization function in the framework of the developed reconstruction model. We transformed the optimization problem into a few optimization sub-problems and then solved these sub-problems by alternating iteration. Numerical experiments are performed to validate the efficiency and feasibility of the developed algorithm. The statistical analysis of the performance evaluations, peak signal-to-noise ratio (PSNR) and normalized root mean square distance (NRMSD), shows significant statistical differences between different algorithms over different scanning angular ranges (p < 0.0001). The experimental results also indicate that the developed algorithm outperforms classical reconstruction algorithms in suppressing the streak artifacts and the gradual-change artifacts near edges.
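
The alternating sub-problem strategy for ℓ0 gradient minimization can be sketched on a 1-D signal via half-quadratic splitting, in the spirit of Xu et al.'s ℓ0 image smoothing; this is a sketch of the general technique, not the paper's CT algorithm (which couples the prior with a projection data-fit term):

```python
import numpy as np

def l0_gradient_1d(f, lam=0.01, kappa=2.0, beta_max=1e5):
    """L0 gradient minimization for a 1-D signal:
    minimize ||u - f||^2 + lam * #{i : (Du)_i != 0}
    via an auxiliary gradient variable h and an increasing penalty beta.
    Produces a piecewise-constant approximation of f."""
    f = np.asarray(f, float)
    n = len(f)
    D = np.diff(np.eye(n), axis=0)        # forward-difference operator
    u = f.copy()
    beta = 2.0 * lam
    while beta < beta_max:
        g = D @ u
        # h-subproblem: keep a gradient only if it is worth its L0 cost
        h = np.where(g * g > lam / beta, g, 0.0)
        # u-subproblem: quadratic, solved exactly as a linear system
        u = np.linalg.solve(np.eye(n) + beta * D.T @ D, f + beta * D.T @ h)
        beta *= kappa
    return u

# A clean step edge survives, while small gradients would be zeroed out
f = np.concatenate([np.zeros(8), np.ones(8)])
u = l0_gradient_1d(f, lam=0.01)
```

Because the ℓ0 term counts rather than weighs gradients, a genuine edge is preserved at full height instead of being rounded off, which is the behavior the abstract contrasts with TVM's gradual-change artifacts.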

  20. Computed tomography depiction of small pediatric vessels with model-based iterative reconstruction

    International Nuclear Information System (INIS)

    Koc, Gonca; Courtier, Jesse L.; Phelps, Andrew; Marcovici, Peter A.; MacKenzie, John D.

    2014-01-01

    Computed tomography (CT) is extremely important in characterizing blood vessel anatomy and vascular lesions in children. Recent advances in CT reconstruction technology hold promise for improved image quality and also reductions in radiation dose. This report evaluates potential improvements in image quality for the depiction of small pediatric vessels with model-based iterative reconstruction (Veo™), a technique developed to improve image quality and reduce noise. To evaluate Veo™ as an improved method when compared to adaptive statistical iterative reconstruction (ASIR™) for the depiction of small vessels on pediatric CT. Seventeen patients (mean age: 3.4 years, range: 2 days to 10.0 years; 6 girls, 11 boys) underwent contrast-enhanced CT examinations of the chest and abdomen in this HIPAA-compliant and institutional review board-approved study. Raw data were reconstructed into separate image datasets using Veo™ and ASIR™ algorithms (GE Medical Systems, Milwaukee, WI). Four blinded radiologists subjectively evaluated image quality. The pulmonary, hepatic, splenic and renal arteries were evaluated for the length and number of branches depicted. Datasets were compared with parametric and non-parametric statistical tests. Readers stated a preference for Veo™ over ASIR™ images when subjectively evaluating image quality criteria for vessel definition, image noise and resolution of small anatomical structures. The mean image noise in the aorta and fat was significantly less for Veo™ vs. ASIR™ reconstructed images. Quantitative measurements of mean vessel lengths and number of branch vessels delineated were significantly different for Veo™ and ASIR™ images. Veo™ consistently showed more of the vessel anatomy: longer vessel length and more branching vessels. When compared to the more established adaptive statistical iterative reconstruction algorithm, model-based iterative reconstruction improved the depiction of small pediatric vessels.

  1. Evaluation of knowledge-based reconstruction for magnetic resonance volumetry of the right ventricle in tetralogy of Fallot

    International Nuclear Information System (INIS)

    Nyns, Emile Christian Arie; Dragulescu, Andreea; Yoo, Shi-Joon; Grosse-Wortmann, Lars

    2014-01-01

    Cardiac magnetic resonance using the Simpson method is the gold standard for right ventricular volumetry. However, this method is time-consuming and not without sources of error. Knowledge-based reconstruction is a novel post-processing approach that reconstructs the right ventricular endocardial shape based on anatomical landmarks and a database of various right ventricular configurations. To assess the feasibility, accuracy and labor intensity of knowledge-based reconstruction in repaired tetralogy of Fallot (TOF). The short-axis cine cardiac MR datasets of 35 children and young adults (mean age 14.4 ± 2.5 years) after TOF repair were studied using both knowledge-based reconstruction and the Simpson method. Intraobserver, interobserver and inter-method variability were assessed using Bland-Altman analyses. Knowledge-based reconstruction was feasible and highly accurate as compared to the Simpson method. Intra- and inter-method variability for knowledge-based reconstruction measurements showed good agreement. Volumetric assessment using knowledge-based reconstruction was faster when compared with the Simpson method (10.9 ± 2.0 vs. 7.1 ± 2.4 min, P < 0.001). In patients with repaired tetralogy of Fallot, knowledge-based reconstruction is a feasible, accurate and reproducible method for measuring right ventricular volumes and ejection fraction. The post-processing time of right ventricular volumetry using knowledge-based reconstruction was significantly shorter when compared with the routine Simpson method. (orig.)
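
The Bland-Altman analyses used for the variability assessments reduce to computing the bias (mean difference) and the 95% limits of agreement between two methods; a minimal sketch with hypothetical volume measurements:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics between two measurement methods:
    bias (mean difference) and 95% limits of agreement (bias +/- 1.96 SD)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)     # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical RV volumes (mL): knowledge-based reconstruction vs. Simpson
kbr = np.array([101.0, 95.0, 110.0, 88.0])
simpson = np.array([100.0, 94.0, 109.0, 87.0])
bias, lo, hi = bland_altman(kbr, simpson)
```

In this constructed example every difference is exactly 1 mL, so the bias is 1 and the limits of agreement collapse onto it; with real data the spread of the differences widens the limits.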

  2. Evaluation of knowledge-based reconstruction for magnetic resonance volumetry of the right ventricle in tetralogy of Fallot

    Energy Technology Data Exchange (ETDEWEB)

    Nyns, Emile Christian Arie; Dragulescu, Andreea [University of Toronto, The Labatt Family Heart Centre, The Hospital for Sick Children, Toronto (Canada); Yoo, Shi-Joon; Grosse-Wortmann, Lars [University of Toronto, The Labatt Family Heart Centre, The Hospital for Sick Children, Toronto (Canada); University of Toronto, Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto (Canada)

    2014-12-15

    Cardiac magnetic resonance using the Simpson method is the gold standard for right ventricular volumetry. However, this method is time-consuming and not without sources of error. Knowledge-based reconstruction is a novel post-processing approach that reconstructs the right ventricular endocardial shape based on anatomical landmarks and a database of various right ventricular configurations. To assess the feasibility, accuracy and labor intensity of knowledge-based reconstruction in repaired tetralogy of Fallot (TOF). The short-axis cine cardiac MR datasets of 35 children and young adults (mean age 14.4 ± 2.5 years) after TOF repair were studied using both knowledge-based reconstruction and the Simpson method. Intraobserver, interobserver and inter-method variability were assessed using Bland-Altman analyses. Knowledge-based reconstruction was feasible and highly accurate as compared to the Simpson method. Intra- and inter-method variability for knowledge-based reconstruction measurements showed good agreement. Volumetric assessment using knowledge-based reconstruction was faster when compared with the Simpson method (10.9 ± 2.0 vs. 7.1 ± 2.4 min, P < 0.001). In patients with repaired tetralogy of Fallot, knowledge-based reconstruction is a feasible, accurate and reproducible method for measuring right ventricular volumes and ejection fraction. The post-processing time of right ventricular volumetry using knowledge-based reconstruction was significantly shorter when compared with the routine Simpson method. (orig.)

  3. Linking community, parenting, and depressive symptom trajectories: testing resilience models of adolescent agency based on race/ethnicity and gender.

    Science.gov (United States)

    Williams, Amanda L; Merten, Michael J

    2014-09-01

    Family stress models illustrate how communities affect youth outcomes through effects on parents and studies consistently show the enduring effects of early community context. The present study takes a different approach identifying human agency during adolescence as a potentially significant promotive factor mediating the relationship between community, parenting, and mental health. While agency is an important part of resilience, its longitudinal effects are unknown, particularly based on gender and race/ethnicity. The purpose of this research was to model the long-term effects of community structural adversity and social resources as predictors of adolescent depressive symptom trajectories via indirect effects of parental happiness, parent-child relationships, and human agency. Latent growth analyses were conducted with 1,796 participants (53% female; 56% White) across four waves of the National Longitudinal Study of Adolescent Health spanning adolescence (Wave 1) through adulthood (Wave 4). The results identified agency as an important promotive factor during adolescence with long-term mental health benefits, but only for White and male participants. For these individuals, community social resources and the quality of the parent-child relationship were related to higher levels of agency and more positive mental health trajectories. Although community social resources similarly benefitted parenting and agency among females and non-White participants, there were no significant links between agency and depressive symptoms for these youth. The results suggest that agency remains an important, but poorly understood concept and additional work is necessary to continue unpacking its meaning for diverse groups of youth.

  4. Bio-inspired varying subspace based computational framework for a class of nonlinear constrained optimal trajectory planning problems.

    Science.gov (United States)

    Xu, Y; Li, N

    2014-09-01

    Biological species have produced many simple but efficient rules in their complex and critical survival activities such as hunting and mating. A common feature observed in several biological motion strategies is that the predator only moves along paths in a carefully selected or iteratively refined subspace (or manifold), which might be able to explain why these motion strategies are effective. In this paper, a unified linear algebraic formulation representing such a predator-prey relationship is developed to simplify the construction and refinement process of the subspace (or manifold). Specifically, the following three motion strategies are studied and modified: motion camouflage, constant absolute target direction and local pursuit. The framework constructed based on this varying subspace concept could significantly reduce the computational cost in solving a class of nonlinear constrained optimal trajectory planning problems, particularly for the case with severe constraints. Two non-trivial examples, a ground robot and a hypersonic aircraft trajectory optimization problem, are used to show the capabilities of the algorithms in this new computational framework.

  5. Bio-inspired varying subspace based computational framework for a class of nonlinear constrained optimal trajectory planning problems

    International Nuclear Information System (INIS)

    Xu, Y; Li, N

    2014-01-01

    Biological species have produced many simple but efficient rules in their complex and critical survival activities such as hunting and mating. A common feature observed in several biological motion strategies is that the predator only moves along paths in a carefully selected or iteratively refined subspace (or manifold), which might be able to explain why these motion strategies are effective. In this paper, a unified linear algebraic formulation representing such a predator–prey relationship is developed to simplify the construction and refinement process of the subspace (or manifold). Specifically, the following three motion strategies are studied and modified: motion camouflage, constant absolute target direction and local pursuit. The framework constructed based on this varying subspace concept could significantly reduce the computational cost in solving a class of nonlinear constrained optimal trajectory planning problems, particularly for the case with severe constraints. Two non-trivial examples, a ground robot and a hypersonic aircraft trajectory optimization problem, are used to show the capabilities of the algorithms in this new computational framework. (paper)

  6. High-SNR spectrum measurement based on Hadamard encoding and sparse reconstruction

    Science.gov (United States)

    Wang, Zhaoxin; Yue, Jiang; Han, Jing; Li, Long; Jin, Yong; Gao, Yuan; Li, Baoming

    2017-12-01

    We investigate the denoising capabilities of the H-matrix and the cyclic S-matrix combined with sparse reconstruction, as employed in the Pixel of Focal Plane Coded Visible Spectrometer for spectrum measurement, where the spectrum is sparse in a known basis. In the measurement process, the digital micromirror device plays an important role, implementing the Hadamard coding. In contrast with Hadamard transform spectrometry, this spectrometer, which exploits shift invariance, may have the advantage of higher efficiency. Simulations and experiments show that the nonlinear solution with sparse reconstruction has a better signal-to-noise ratio than the linear solution, and that the H-matrix outperforms the cyclic S-matrix whether the reconstruction method is nonlinear or linear.
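
The Hadamard multiplexing itself, with the linear decoding that serves as the baseline in the abstract, can be sketched as follows; the nonlinear sparse solver is not reproduced here, and the toy spectrum and noise level are illustrative:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 64
H = hadamard(n)                           # encoding matrix: H @ H.T == n * I
rng = np.random.default_rng(0)
spectrum = np.zeros(n)
spectrum[[5, 17, 40]] = [1.0, 0.6, 0.3]   # sparse toy spectrum

# Each measurement is a +/-1-weighted sum of all channels (multiplex advantage)
measured = H @ spectrum + rng.normal(0, 0.01, n)
decoded = H.T @ measured / n              # linear Hadamard inverse
```

Because H is orthogonal up to the factor n, the linear inverse is a single matrix product; the sparse (nonlinear) reconstruction replaces this step with an l1-type solver.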

  7. Feature reconstruction of LFP signals based on PLSR in the neural information decoding study.

    Science.gov (United States)

    Yonghui Dong; Zhigang Shang; Mengmeng Li; Xinyu Liu; Hong Wan

    2017-07-01

    To address the problems of signal-to-noise ratio (SNR) and multicollinearity that arise when local field potential (LFP) signals are used for decoding animal motion intention, this paper proposes a feature reconstruction of LFP signals based on partial least squares regression (PLSR). Firstly, the feature information of the LFP coding band is extracted using the wavelet transform. Then the PLSR model is constructed from the extracted LFP coding features. Given the multicollinearity among the coding features, several latent variables that contribute greatly to the steering behavior are obtained, and new LFP coding features are reconstructed from them. Finally, the K-Nearest Neighbor (KNN) method is used to classify the reconstructed coding features to verify the decoding performance. The results show that the proposed method achieves the highest accuracy of the four methods compared, and that its decoding performance is robust.
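
The PLSR-plus-KNN pipeline can be sketched with a minimal NIPALS-style PLS1 extraction and a plain nearest-neighbour vote. This is a generic stand-in on synthetic, deliberately collinear features, not the authors' LFP data or parameters:

```python
import numpy as np

def pls_scores(X, y, n_comp=2):
    """Minimal NIPALS-style PLS1: latent-variable scores of X against a
    (centered) response y, extracted by deflation."""
    X = X - X.mean(0)
    y = y - y.mean()
    T = []
    for _ in range(n_comp):
        w = X.T @ y
        w /= np.linalg.norm(w)
        t = X @ w                      # latent variable (score)
        p = X.T @ t / (t @ t)          # loading
        X = X - np.outer(t, p)         # deflate before the next component
        T.append(t)
    return np.column_stack(T)

def knn_predict(train_scores, train_labels, test_scores, k=3):
    """Plain k-nearest-neighbour vote in the latent space."""
    d = np.linalg.norm(test_scores[:, None, :] - train_scores[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return np.array([np.bincount(train_labels[idx]).argmax() for idx in nearest])

rng = np.random.default_rng(1)
labels = np.repeat([0, 1], 40)
base = labels + rng.normal(0, 0.3, 80)           # informative signal
X = np.column_stack([base,                        # two strongly collinear
                     0.9 * base + rng.normal(0, 0.05, 80),   # "coding features"
                     rng.normal(0, 1.0, (80, 6))])           # plus noise features
scores = pls_scores(X, labels.astype(float), n_comp=2)
pred = knn_predict(scores, labels, scores, k=3)
accuracy = (pred == labels).mean()
```

The latent variables absorb the collinear pair into one direction, which is the motivation for PLSR over ordinary regression here.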

  8. SENSOR-TOPOLOGY BASED SIMPLICIAL COMPLEX RECONSTRUCTION FROM MOBILE LASER SCANNING

    Directory of Open Access Journals (Sweden)

    S. Guinard

    2018-05-01

    We propose a new method for the reconstruction of simplicial complexes (combining points, edges and triangles) from 3D point clouds from Mobile Laser Scanning (MLS). Our main goal is to produce a reconstruction of a scene that is adapted to the local geometry of objects. Our method uses the inherent topology of the MLS sensor to define a spatial adjacency relationship between points. We then investigate each possible connexion between adjacent points and filter them by searching collinear structures in the scene, or structures perpendicular to the laser beams. Next, we create triangles for each triplet of self-connected edges. Last, we improve this method with a regularization based on the co-planarity of triangles and collinearity of remaining edges. We compare our results to a naive simplicial complexes reconstruction based on edge length.

  9. Sensor-Topology Based Simplicial Complex Reconstruction from Mobile Laser Scanning

    Science.gov (United States)

    Guinard, S.; Vallet, B.

    2018-05-01

    We propose a new method for the reconstruction of simplicial complexes (combining points, edges and triangles) from 3D point clouds from Mobile Laser Scanning (MLS). Our main goal is to produce a reconstruction of a scene that is adapted to the local geometry of objects. Our method uses the inherent topology of the MLS sensor to define a spatial adjacency relationship between points. We then investigate each possible connexion between adjacent points and filter them by searching collinear structures in the scene, or structures perpendicular to the laser beams. Next, we create triangles for each triplet of self-connected edges. Last, we improve this method with a regularization based on the co-planarity of triangles and collinearity of remaining edges. We compare our results to a naive simplicial complexes reconstruction based on edge length.
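
The collinearity filter on sensor-adjacent point triples can be sketched as follows; the tolerance and the point coordinates are illustrative, not the authors' parameters:

```python
import numpy as np

def collinear(p, q, r, tol=0.01):
    """True if three 3-D points are collinear within tolerance: the cross
    product of (q-p) and (r-p) is near zero relative to the edge lengths."""
    u, v = q - p, r - p
    area = np.linalg.norm(np.cross(u, v))
    scale = np.linalg.norm(u) * np.linalg.norm(v) + 1e-12
    return area / scale < tol

def keep_edges(points, triples, tol=0.01):
    """Keep the leading edge (i, j) of each sensor-adjacent triple (i, j, k)
    only when the triple forms a collinear structure."""
    return [(i, j) for i, j, k in triples
            if collinear(points[i], points[j], points[k], tol)]

pts = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0],          # a collinear run
                [0, 1, 0], [1, 1, 0], [1, 2, 1.5]], float)  # a bent one
edges = keep_edges(pts, [(0, 1, 2), (3, 4, 5)])
```

In the paper the adjacency triples come from the sensor topology (neighbouring pulses), so no spatial search structure is needed.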

  10. A Total Variation Regularization Based Super-Resolution Reconstruction Algorithm for Digital Video

    Directory of Open Access Journals (Sweden)

    Zhang Liangpei

    2007-01-01

    Super-resolution (SR) reconstruction technique is capable of producing a high-resolution image from a sequence of low-resolution images. In this paper, we study an efficient SR algorithm for digital video. To effectively deal with the intractable problems in SR video reconstruction, such as inevitable motion estimation errors, noise, blurring, missing regions, and compression artifacts, the total variation (TV) regularization is employed in the reconstruction model. We use the fixed-point iteration method and preconditioning techniques to efficiently solve the associated nonlinear Euler-Lagrange equations of the corresponding variational problem in SR. The proposed algorithm has been tested in several cases of motion and degradation. It is also compared with the Laplacian regularization-based SR algorithm and other TV-based SR algorithms. Experimental results are presented to illustrate the effectiveness of the proposed algorithm.
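
The role of the TV penalty can be illustrated on the simplest related problem, TV denoising, solved by gradient descent on a smoothed (Charbonnier) TV term. This is a didactic stand-in for the paper's fixed-point/preconditioned SR solver; all parameters are illustrative:

```python
import numpy as np

def tv_denoise(noisy, lam=0.15, step=0.1, iters=200, eps=1e-3):
    """Gradient descent on 0.5*||u - noisy||^2 + lam * TV_eps(u),
    where TV_eps uses sqrt(|grad u|^2 + eps) to keep the gradient smooth."""
    u = noisy.copy()
    for _ in range(iters):
        gx = np.diff(u, axis=1, append=u[:, -1:])   # forward differences
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx**2 + gy**2 + eps)
        px, py = gx / mag, gy / mag                 # normalized gradient field
        # divergence of (px, py); -div is the TV gradient
        div = (np.diff(px, axis=1, prepend=px[:, :1])
               + np.diff(py, axis=0, prepend=py[:1, :]))
        u -= step * ((u - noisy) - lam * div)
    return u

rng = np.random.default_rng(2)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0   # piecewise-constant image
noisy = clean + rng.normal(0, 0.2, clean.shape)
denoised = tv_denoise(noisy)
```

TV suppresses noise in flat regions while preserving the sharp edges of the square, which is why it suits images with motion-estimation errors and compression artifacts.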

  11. Restoring the lattice of Si-based atom probe reconstructions for enhanced information on dopant positioning.

    Science.gov (United States)

    Breen, Andrew J; Moody, Michael P; Ceguerra, Anna V; Gault, Baptiste; Araullo-Peters, Vicente J; Ringer, Simon P

    2015-12-01

    The following manuscript presents a novel approach for creating lattice based models of Sb-doped Si directly from atom probe reconstructions for the purposes of improving information on dopant positioning and directly informing quantum mechanics based materials modeling approaches. Sophisticated crystallographic analysis techniques are used to detect latent crystal structure within the atom probe reconstructions with unprecedented accuracy. A distortion correction algorithm is then developed to precisely calibrate the detected crystal structure to the theoretically known diamond cubic lattice. The reconstructed atoms are then positioned on their most likely lattice positions. Simulations are then used to determine the accuracy of such an approach and show that improvements to short-range order measurements are possible for noise levels and detector efficiencies comparable with experimentally collected atom probe data. Copyright © 2015 Elsevier B.V. All rights reserved.
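
Snapping reconstructed atom positions to their most likely lattice sites can be sketched as a nearest-site search over the diamond-cubic basis, assuming the reconstruction has already been distortion-corrected to a known cell; the lattice constant is Si's, everything else is illustrative:

```python
import numpy as np

def snap_to_lattice(points, cell, basis):
    """Snap each 3-D point to the nearest site of a crystal described by
    lattice vectors `cell` (3x3, rows) and fractional `basis` atoms (Mx3)."""
    inv = np.linalg.inv(cell)
    frac = points @ inv                          # to fractional coordinates
    best = best_d = None
    for b in basis:
        cand = (np.round(frac - b) + b) @ cell   # nearest site on this sublattice
        d = np.linalg.norm(cand - points, axis=1)
        if best is None:
            best, best_d = cand.copy(), d.copy()
        else:
            swap = d < best_d
            best[swap] = cand[swap]
            best_d[swap] = d[swap]
    return best

a = 0.543                                        # Si lattice constant (nm)
cell = a * np.eye(3)
basis = np.array([[0, 0, 0], [0.25, 0.25, 0.25],     # diamond cubic: FCC
                  [0.5, 0.5, 0], [0.75, 0.75, 0.25], # + 2-atom basis
                  [0.5, 0, 0.5], [0.75, 0.25, 0.75],
                  [0, 0.5, 0.5], [0.25, 0.75, 0.75]])

rng = np.random.default_rng(3)
true_sites = (rng.integers(0, 4, (50, 3)) + basis[rng.integers(0, 8, 50)]) @ cell
noisy = true_sites + rng.normal(0, 0.02, true_sites.shape)   # reconstruction noise
snapped = snap_to_lattice(noisy, cell, basis)
```

As long as the positional noise stays well below half the nearest-neighbour spacing (about 0.235 nm in Si), every atom snaps back to its true site.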

  12. Does Acellular Dermal Matrix Thickness Affect Complication Rate in Tissue Expander Based Breast Reconstruction?

    Directory of Open Access Journals (Sweden)

    Jessica F. Rose

    2016-01-01

    Background. While the benefits of using acellular dermal matrices (ADMs) in breast reconstruction are well described, their use has been associated with additional complications. The purpose of this study was to determine if ADM thickness affects complications in breast reconstruction. Methods. A retrospective chart review was performed including all tissue expander based breast reconstructions with AlloDerm (LifeCell, Branchburg, NJ) over 4 years. We evaluated preoperative characteristics and assessed postoperative complications including seroma, hematoma, infection, skin necrosis, and need for reintervention. We reviewed ADM thickness and time to Jackson-Pratt (JP) drain removal. Results. Fifty-five patients underwent 77 ADM-associated tissue expander based breast reconstructions, with average age of 48.1 years and average BMI of 25.9. Average ADM thickness was 1.21 mm. We found higher complication rates in the thick ADM group. Significant associations were found between smoking and skin necrosis (p<0.0001) and between seroma and prolonged JP drainage (p=0.0004); radiated reconstructed breasts were more likely to suffer infections (p=0.0085), and elevated BMI was a significant predictor of increased infection rate (p=0.0037). Conclusion. We found a trend toward increased complication rates with thicker ADMs. In the future, larger prospective studies evaluating thickness may provide more information.

  13. Comparing and improving reconstruction methods for proxies based on compositional data

    Science.gov (United States)

    Nolan, C.; Tipton, J.; Booth, R.; Jackson, S. T.; Hooten, M.

    2017-12-01

    Many types of studies in paleoclimatology and paleoecology involve compositional data. Often, these studies aim to use compositional data to reconstruct an environmental variable of interest; the reconstruction is usually done via the development of a transfer function. Transfer functions have been developed using many different methods. Existing methods tend to relate the compositional data and the reconstruction target in very simple ways. Additionally, the results from different methods are rarely compared. Here we seek to address these two issues. First, we introduce a new hierarchical Bayesian multivariate Gaussian process model; this model allows the relationship between each species in the compositional dataset and the environmental variable to be modeled in a way that captures the underlying complexities. Then, we compare this new method to machine learning techniques and commonly used existing methods. The comparisons are based on reconstructing the water table depth history of Caribou Bog (an ombrotrophic Sphagnum peat bog in Old Town, Maine, USA) from a new 7,500-year-long record of testate amoebae assemblages. The resulting reconstructions from different methods diverge in both their means and their uncertainties. In particular, uncertainty tends to be drastically underestimated by some common methods. These results will help to improve inference of water table depth from testate amoebae. Furthermore, this approach can be applied to test and improve inferences of past environmental conditions from a broad array of paleo-proxies based on compositional data.
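
One of the commonly used existing methods such a comparison covers is the weighted-averaging (WA) transfer function, which can be sketched in a few lines; the taxa, optima and response widths below are synthetic illustrations, not the Caribou Bog record:

```python
import numpy as np

def wa_fit(comp, env):
    """Weighted-averaging transfer function: each taxon's optimum is the
    abundance-weighted mean of the environmental variable in the training set."""
    comp = comp / comp.sum(1, keepdims=True)     # close compositions to sum to 1
    return (comp * env[:, None]).sum(0) / comp.sum(0)

def wa_predict(comp, optima):
    """Predict the environment as the abundance-weighted mean of taxon optima."""
    comp = comp / comp.sum(1, keepdims=True)
    return comp @ optima

rng = np.random.default_rng(4)
n, m = 120, 10
env = rng.uniform(0, 40, n)                      # e.g. water-table depth (cm)
taxon_opt = np.linspace(2, 38, m)                # true taxon optima
# Gaussian unimodal responses give plausible compositional abundances
abund = np.exp(-0.5 * ((env[:, None] - taxon_opt) / 6.0) ** 2) + 0.01
optima = wa_fit(abund, env)
pred = wa_predict(abund, optima)
```

WA relates composition and target through a single weighted mean per taxon, which is exactly the "very simple" relationship the abstract's hierarchical Gaussian process model is designed to go beyond.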

  14. Comparison of Outcomes in Immediate Implant-Based Breast Reconstruction Versus Mastectomy Alone.

    Science.gov (United States)

    Sousa, Janelle; Sood, Ravi; Liu, Daniel; Calhoun, Kristine; Louie, Otway; Neligan, Peter; Said, Hakim; Mathes, David

    2018-02-01

    Immediate implant-based techniques are common practice in post-mastectomy breast reconstruction. Previous studies have shown an increased complication rate in the setting of immediate versus delayed reconstruction. We aimed to quantify any additional risk in complications when implant-based immediate breast reconstruction (IBR) is performed versus mastectomy alone. We retrospectively reviewed all IBR cases and all mastectomies without reconstruction from 2007 to 2011. Patient characteristics, operative details, and complication rates were reviewed and analyzed. IBR was performed in 315 consecutive women; mastectomy alone was performed in 401 women. Patients undergoing mastectomy alone were more often older, diabetic, and more frequently underwent neoadjuvant chemotherapy or radiation. Overall complications were higher in the IBR group, most commonly reoperation and delayed wound healing. In a multivariate analysis, IBR, increasing age, body mass index, history of radiation therapy, smoking, and nipple-sparing mastectomy were independently associated with increased risk of complications. However, IBR was only independently associated with increased risk of major complications such as reoperation or readmission for intravenous antibiotics, not minor complications. Patients selected for IBR are inherently different than those undergoing mastectomy alone. After adjusting for these differences, the increased risk of complications seen in IBR is moderately increased over the risk of complications in mastectomy alone. The observed increased risk of major complications after IBR is largely due to the aggressive management of complications in the setting of a prosthetic implant. IBR is a safe reconstructive strategy with only a slightly increased risk over mastectomy alone.

  15. Interleaved EPI diffusion imaging using SPIRiT-based reconstruction with virtual coil compression.

    Science.gov (United States)

    Dong, Zijing; Wang, Fuyixue; Ma, Xiaodong; Zhang, Zhe; Dai, Erpeng; Yuan, Chun; Guo, Hua

    2018-03-01

    To develop a novel diffusion imaging reconstruction framework based on iterative self-consistent parallel imaging reconstruction (SPIRiT) for multishot interleaved echo planar imaging (iEPI), with computation acceleration by virtual coil compression. As a general approach for autocalibrating parallel imaging, SPIRiT improves the performance of traditional generalized autocalibrating partially parallel acquisitions (GRAPPA) methods in that the formulation with self-consistency is better conditioned, suggesting SPIRiT to be a better candidate in k-space-based reconstruction. In this study, a general SPIRiT framework is adopted to incorporate both coil sensitivity and phase variation information as virtual coils and then is applied to 2D navigated iEPI diffusion imaging. To reduce the reconstruction time when using a large number of coils and shots, a novel shot-coil compression method is proposed for computation acceleration in Cartesian sampling. Simulations and in vivo experiments were conducted to evaluate the performance of the proposed method. Compared with the conventional coil compression, the shot-coil compression achieved higher compression rates with reduced errors. The simulation and in vivo experiments demonstrate that the SPIRiT-based reconstruction outperformed the existing method, realigned GRAPPA, and provided superior images with reduced artifacts. The SPIRiT-based reconstruction with virtual coil compression is a reliable method for high-resolution iEPI diffusion imaging. Magn Reson Med 79:1525-1531, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
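
The virtual-coil (coil-compression) idea can be sketched with an SVD along the coil dimension; this is the standard SVD coil-compression construction, not the paper's specific shot-coil variant, and the multi-coil data below are synthetic:

```python
import numpy as np

def compress_coils(kspace, n_virtual):
    """SVD-based coil compression: combine physical coils into the
    n_virtual dominant virtual coils. `kspace` has shape (n_coils, n_samples)."""
    U, s, Vh = np.linalg.svd(kspace, full_matrices=False)
    A = U[:, :n_virtual].conj().T        # compression matrix (virtual x physical)
    return A @ kspace, A

rng = np.random.default_rng(5)
n_coils, n_samples, rank = 16, 500, 4
# Synthetic multi-coil data that is exactly rank-4 across the coil dimension,
# mimicking the redundancy between physical coil sensitivities
mix = rng.normal(size=(n_coils, rank)) + 1j * rng.normal(size=(n_coils, rank))
src = rng.normal(size=(rank, n_samples)) + 1j * rng.normal(size=(rank, n_samples))
kspace = mix @ src
compressed, A = compress_coils(kspace, n_virtual=5)
back_projected = A.conj().T @ compressed   # map back to the physical coils
```

Because real coil arrays are highly redundant, a handful of virtual coils retains nearly all the signal, which is what makes the downstream SPIRiT reconstruction cheaper.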

  16. Bohmian mechanics with complex action: A new trajectory-based formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Goldfarb, Yair; Degani, Ilan; Tannor, David J.

    2006-01-01

    In recent years there has been a resurgence of interest in Bohmian mechanics as a numerical tool because of its local dynamics, which suggest the possibility of significant computational advantages for the simulation of large quantum systems. However, closer inspection of the Bohmian formulation reveals that the nonlocality of quantum mechanics has not disappeared: it has simply been swept under the rug into the quantum force. In this paper we present a new formulation of Bohmian mechanics in which the quantum action, S, is taken to be complex. This leads to a single equation for complex S, and ultimately complex x and p, but there is a reward for this complexification: a significantly higher degree of localization. The quantum force in the new approach vanishes for Gaussian wave packet dynamics, and its effect on barrier tunneling processes is orders of magnitude lower than that of the classical force. In fact, the current method is shown to be a rigorous extension of generalized Gaussian wave packet dynamics to give exact quantum mechanics. We demonstrate tunneling probabilities that are in virtually perfect agreement with the exact quantum mechanics down to 10^-7, calculated from strictly localized quantum trajectories that do not communicate with their neighbors. The new formulation may have significant implications for fundamental quantum mechanics, ranging from the interpretation of nonlocality to measures of quantum complexity.
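
The single equation for the complex action can be reconstructed from the standard derivation: substituting the ansatz psi = exp(iS/hbar) into the time-dependent Schroedinger equation yields the complex quantum Hamilton-Jacobi equation (this is the textbook form; the paper's own notation may differ):

```latex
\frac{\partial S}{\partial t} + \frac{1}{2m}\left(\nabla S\right)^{2} + V
  = \frac{i\hbar}{2m}\,\nabla^{2} S
```

Setting the right-hand side to zero recovers the classical Hamilton-Jacobi equation; keeping S complex folds the hbar-dependent quantum term into this single equation instead of splitting it into the real-valued Bohmian equations with their quantum potential.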

  17. Establishing Base Elements of Perspective in Order to Reconstruct Architectural Buildings from Photographs

    Science.gov (United States)

    Dzwierzynska, Jolanta

    2017-12-01

    The use of perspective images, especially historical photographs, for retrieving information about the architectural environment they depict has recently become a fast-developing field. A photograph is a perspective image with a secure geometrical connection to reality, so it is possible to reverse the imaging process. The aim of the present study is to establish the requirements that a photographic perspective representation should meet for reconstruction purposes, and to determine the base elements of perspective, such as the horizon line and the circle of depth, which is a key issue in any reconstruction. The starting point in the reconstruction process is a geometrical analysis of the photograph, especially determination of the kind of perspective projection applied, which is defined by the building's location relative to the projection plane. Next, the proper constructions can be used. The paper addresses the problem of establishing base elements of perspective from the photographic image when camera calibration is impossible. It presents different geometric construction methods, selected depending on the starting assumptions; the methods described in the paper are therefore broadly applicable. Moreover, they can be used even in the case of poor-quality photographs with poor perspective geometry. Such constructions can be realized with computer aid when the photographs are in digital form, as presented in the paper. The accuracy of the applied methods depends on the accuracy of the photographic image as well as on drawing accuracy; however, it is sufficient for further reconstruction. Establishing the base elements of perspective as presented in the paper is especially useful in difficult reconstruction cases, when information about the reconstructed architectural form is lacking and it is necessary to rely on solid geometry.
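
One such construction, finding the horizon line, can be sketched with homogeneous coordinates: each pencil of imaged parallel horizontal edges meets at a vanishing point, and the horizon is the line through two such vanishing points. The image coordinates below are illustrative:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points given as (x, y)."""
    return np.cross([*p, 1.0], [*q, 1.0])

def intersect(l1, l2):
    """Intersection of two homogeneous lines, returned as an (x, y) point."""
    p = np.cross(l1, l2)
    return p[:2] / p[2]

# Two pencils of imaged parallel horizontal edges (e.g. two building facades)
v1 = intersect(line_through((0, 0), (4, 1)), line_through((0, 2), (4, 2.5)))
v2 = intersect(line_through((10, 0), (6, 1)), line_through((10, 2), (6, 2.5)))
horizon = np.cross([*v1, 1.0], [*v2, 1.0])   # homogeneous horizon line
```

Working in homogeneous coordinates turns both "line through two points" and "intersection of two lines" into a single cross product, which is why this construction needs no camera calibration.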

  18. Optimising Aesthetic Reconstruction of Scalp Soft Tissue by an Algorithm Based on Defect Size and Location.

    Science.gov (United States)

    Ooi, Adrian Sh; Kanapathy, Muholan; Ong, Yee Siang; Tan, Kok Chai; Tan, Bien Keem

    2015-11-01

    Scalp soft tissue defects are common and result from a variety of causes. Reconstructive methods should maximise cosmetic outcomes by maintaining hair-bearing tissue and aesthetic hairlines. This article outlines an algorithm based on a diverse clinical case series to optimise scalp soft tissue coverage. A retrospective analysis of scalp soft tissue reconstruction cases performed at the Singapore General Hospital between January 2004 and December 2013 was conducted. Forty-one patients were included in this study. The algorithm, based on defect size and location, aims to optimise aesthetic outcome while minimising complications and repeat procedures.

  19. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

    A popular approach for medical image reconstruction has been through sparsity regularization, assuming the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is such a widely used system due to its capability for sparsely approximating piecewise-smooth functions, such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, the method based on the adaptive over-complete dictionary that is specific to structures of the targeted images has demonstrated its superiority for image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs the adaptive wavelet tight frame that is task specific, and then reconstructs the image of interest by solving an l1-regularized minimization problem using the constructed adaptive tight frame system. The proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves the reconstructed CT image quality over the traditional tight frame method. (paper)
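
The l1-regularized step at the heart of such schemes can be sketched with plain ISTA (iterative shrinkage-thresholding). This is a generic stand-in, not the paper's solver; the identity "frame" W below is a placeholder where an orthonormal wavelet tight frame would go, and A stands in for the CT system matrix:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, W, lam=0.01, iters=1000):
    """ISTA for min_x 0.5*||A x - b||^2 + lam*||W x||_1, with an orthonormal
    sparsifying transform W so the prox is soft-thresholding in that domain."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)                    # gradient of the data term
        x = W.T @ soft(W @ (x - step * g), step * lam)
    return x

rng = np.random.default_rng(6)
n = 64
W = np.eye(n)                                    # placeholder orthonormal frame
x_true = np.zeros(n); x_true[[3, 20, 45]] = [1.5, -1.0, 0.8]
A = rng.normal(size=(32, n)) / np.sqrt(32)       # underdetermined measurements
b = A @ x_true
x_hat = ista(A, b, W, lam=0.01)
```

An adaptive tight frame simply swaps W for a frame learned from the data, leaving the iteration otherwise unchanged.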

  20. Metal artifact reduction using a patch-based reconstruction for digital breast tomosynthesis

    Science.gov (United States)

    Borges, Lucas R.; Bakic, Predrag R.; Maidment, Andrew D. A.; Vieira, Marcelo A. C.

    2017-03-01

    Digital breast tomosynthesis (DBT) is rapidly emerging as the main clinical tool for breast cancer screening. Although several reconstruction methods for DBT are described by the literature, one common issue is the interplane artifacts caused by out-of-focus features. For breasts containing highly attenuating features, such as surgical clips and large calcifications, the artifacts are even more apparent and can limit the detection and characterization of lesions by the radiologist. In this work, we propose a novel method of combining backprojected data into tomographic slices using a patch-based approach, commonly used in denoising. Preliminary tests were performed on a geometry phantom and on an anthropomorphic phantom containing metal inserts. The reconstructed images were compared to a commercial reconstruction solution. Qualitative assessment of the reconstructed images provides evidence that the proposed method reduces artifacts while maintaining low noise levels. Objective assessment supports the visual findings. The artifact spread function shows that the proposed method is capable of suppressing artifacts generated by highly attenuating features. The signal difference to noise ratio shows that the noise levels of the proposed and commercial methods are comparable, even though the commercial method applies post-processing filtering steps, which were not implemented on the proposed method. Thus, the proposed method can produce tomosynthesis reconstructions with reduced artifacts and low noise levels.

  1. Development of a North American paleoclimate pollen-based reconstruction database application

    Science.gov (United States)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts to synthesize paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available in order to facilitate data-model comparisons and hence improve our estimates of future climate change. The methodology and programs behind a reconstruction often make it challenging for other researchers to reproduce its results; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, using the open source R language and North American pollen databases (e.g. NAPD, NEOTOMA), in which an application can easily be used to perform new reconstructions and to quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect reconstruction results, allowing users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; selection among the 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.

  2. Pollen-based continental climate reconstructions at 6 and 21 ka: a global synthesis

    Energy Technology Data Exchange (ETDEWEB)

    Bartlein, P.J. [University of Oregon, Department of Geography, Eugene, Oregon (United States); Harrison, S.P. [University of Bristol, School of Geographical Sciences, Bristol (United Kingdom); Macquarie University, School of Biological Sciences, North Ryde, NSW (Australia); Brewer, S. [University of Wyoming, Botany Department, Wyoming (United States); Connor, S. [University of the Algarve, Centre for Marine and Environmental Research, Faro (Portugal); Davis, B.A.S. [Ecole Polytechnique Federale de Lausanne, School of Architecture, Civil and Environmental Engineering, Lausanne (Switzerland); Gajewski, K.; Viau, A.E. [University of Ottawa, Department of Geography, Ottawa, ON (Canada); Guiot, J. [CEREGE, Aix-en-Provence cedex 4 (France); Harrison-Prentice, T.I. [GTZ, PAKLIM, Jakarta (Indonesia); Henderson, A. [University of Minnesota, Department of Geology and Geophysics, Minneapolis, MN (United States); Peyron, O. [Laboratoire Chrono-Environnement UMR 6249 CNRS-UFC UFR Sciences et Techniques, Besancon Cedex (France); Prentice, I.C. [Macquarie University, School of Biological Sciences, North Ryde, NSW (Australia); University of Bristol, QUEST, Department of Earth Sciences, Bristol (United Kingdom); Scholze, M. [University of Bristol, QUEST, Department of Earth Sciences, Bristol (United Kingdom); Seppae, H. [University of Helsinki, Department of Geology, P.O. Box 65, Helsinki (Finland); Shuman, B. [University of Wyoming, Department of Geology and Geophysics, Laramie, WY (United States); Sugita, S. [Tallinn University, Institute of Ecology, Tallinn (Estonia); Thompson, R.S. [US Geological Survey, PO Box 25046, Denver, CO (United States); Williams, J. [University of Wisconsin, Department of Geography, Madison, WI (United States); Wu, H. [Chinese Academy of Sciences, Key Laboratory of Cenozoic Geology and Environment, Institute of Geology and Geophysics, Beijing (China)

    2011-08-15

    Subfossil pollen and plant macrofossil data derived from ¹⁴C-dated sediment profiles can provide quantitative information on glacial and interglacial climates. The data allow climate variables related to growing-season warmth, winter cold, and plant-available moisture to be reconstructed. Continental-scale reconstructions have been made for the mid-Holocene (MH, around 6 ka) and Last Glacial Maximum (LGM, around 21 ka), allowing comparison with palaeoclimate simulations currently being carried out as part of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. The synthesis of the available MH and LGM climate reconstructions and their uncertainties, obtained using modern-analogue, regression and model-inversion techniques, is presented for four temperature variables and two moisture variables. Reconstructions of the same variables based on surface-pollen assemblages are shown to be accurate and unbiased. Reconstructed LGM and MH climate anomaly patterns are coherent, consistent between variables, and robust with respect to the choice of technique. They support a conceptual model of the controls of Late Quaternary climate change whereby the first-order effects of orbital variations and greenhouse forcing on the seasonal cycle of temperature are predictably modified by responses of the atmospheric circulation and surface energy balance. (orig.)

  3. L{sub 1/2} regularization based numerical method for effective reconstruction of bioluminescence tomography

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xueli, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn; Yang, Defu; Zhang, Qitan; Liang, Jimin, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn [School of Life Science and Technology, Xidian University, Xi' an 710071 (China); Engineering Research Center of Molecular and Neuro Imaging, Ministry of Education (China)

    2014-05-14

    Even though bioluminescence tomography (BLT) exhibits significant potential and wide applications in macroscopic in vivo imaging of small animals, the inverse reconstruction is still a tough problem that has plagued researchers in related areas. The ill-posedness of the inverse reconstruction arises from insufficient measurements and modeling errors, so the problem cannot be solved directly. In this study, an l{sub 1/2} regularization based numerical method was developed for effective reconstruction of BLT. In the method, the inverse reconstruction of BLT was cast as an l{sub 1/2} regularization problem, and the weighted interior-point algorithm (WIPA) was then applied to solve it by transforming it into a sequence of l{sub 1} regularization problems. The feasibility and effectiveness of the proposed method were demonstrated with numerical simulations on a digital mouse. Stability verification experiments further illustrated the robustness of the proposed method for different levels of Gaussian noise.
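    The key idea of solving an l{sub 1/2} problem through a sequence of l{sub 1} problems can be sketched generically: each weighted-l1 subproblem is solved by proximal-gradient (ISTA) steps, and the weights are then refreshed so that w_i·|x_i| locally mimics |x_i|^(1/2). This is an iteratively-reweighted stand-in under illustrative parameters, not the authors' WIPA implementation.

    ```python
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]

    def soft(v, t):
        """Soft-thresholding, the proximal operator of a weighted l1 term."""
        return max(v - t, 0.0) if v > 0 else min(v + t, 0.0)

    def l_half_reconstruct(A, b, lam=0.01, outer=5, inner=50, step=0.1, eps=1e-3):
        AT = [list(col) for col in zip(*A)]
        x = [0.0] * len(A[0])
        w = [1.0] * len(x)               # first pass is a plain l1 problem
        for _ in range(outer):
            for _ in range(inner):       # ISTA steps for the weighted-l1 problem
                r = [yi - bi for yi, bi in zip(matvec(A, x), b)]
                g = matvec(AT, r)        # gradient of 0.5*||Ax - b||^2
                x = [soft(xi - step * gi, step * lam * wi)
                     for xi, gi, wi in zip(x, g, w)]
            # reweight so that w_i*|x_i| locally mimics the |x_i|^(1/2) penalty
            w = [1.0 / (abs(xi) ** 0.5 + eps) for xi in x]
        return x

    # Sparse-recovery demo: true signal (1, 0) observed through 3 linear probes.
    A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    x = l_half_reconstruct(A, [1.0, 0.0, 1.0])
    # x is approximately (1, 0); the small component is thresholded to zero.
    ```

    The growing weight on near-zero coefficients is what makes the composite penalty sparser than plain l1, which is the motivation for l{sub 1/2} in under-determined BLT source recovery.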

  4. Crime event 3D reconstruction based on incomplete or fragmentary evidence material--case report.

    Science.gov (United States)

    Maksymowicz, Krzysztof; Tunikowski, Wojciech; Kościuk, Jacek

    2014-09-01

    Using our own experience in 3D analysis, the authors demonstrate the possibilities of 3D crime scene and event reconstruction in cases where the originally collected material evidence is largely insufficient. The necessity to repeat a forensic evaluation often stems from the emergence of new facts in the course of case proceedings. Even when a crime scene and its surroundings have undergone partial or complete transformation with regard to elements significant to the case, or when the scene was not satisfactorily secured, it is still possible to reconstruct it in a 3D environment based on the originally collected, even incomplete, material evidence. In particular cases where no image of the crime scene is available, its partial or even full reconstruction remains potentially feasible, and such a reconstruction can still satisfy evidentiary requirements in court. Reconstruction of the missing elements of the crime scene is also possible using information obtained from current publicly available databases; in the study, we demonstrate that these can include Google Maps®, Google Street View® and available construction and architecture archives. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    Science.gov (United States)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using DXF (Drawing Exchange Format), a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the appropriate grammar and rules are selected and applied to drive the reconstruction process automatically. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.
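    A formal grammar for buildings can be pictured as a set of productions that expand a start symbol into admissible building elements; segments whose labels the grammar can derive are kept for model assembly. The rule set and labels below are a hypothetical toy, far simpler than the paper's grammar, but they show the derivation mechanism.

    ```python
    # Toy production rules: non-terminals expand into building parts.
    RULES = {
        "Building": ["Facade", "Roof"],
        "Facade": ["Wall", "Window", "Door"],
        "Roof": ["RoofPlane"],
    }

    def derive(symbol, rules):
        """Expand a symbol into the terminal building elements it allows."""
        if symbol not in rules:
            return [symbol]                # terminal element
        out = []
        for child in rules[symbol]:
            out.extend(derive(child, rules))
        return out

    def classify(segments, rules, start="Building"):
        """Keep only segments whose label the grammar can derive from `start`."""
        allowed = set(derive(start, rules))
        return [s for s in segments if s["label"] in allowed]

    segments = [{"label": "Wall"}, {"label": "Window"},
                {"label": "Vegetation"}, {"label": "RoofPlane"}]
    print([s["label"] for s in classify(segments, RULES)])
    # -> ['Wall', 'Window', 'RoofPlane']
    ```

    In a full pipeline each production would also carry geometric constraints (adjacency, planarity, symmetry) that steer how the accepted shapes are assembled into the final model.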

  6. Spectral CT metal artifact reduction with an optimization-based reconstruction algorithm

    Science.gov (United States)

    Gilat Schmidt, Taly; Barber, Rina F.; Sidky, Emil Y.

    2017-03-01

    Metal objects cause artifacts in computed tomography (CT) images. This work investigated the feasibility of a spectral CT method to reduce metal artifacts. Spectral CT acquisition combined with optimization-based reconstruction is proposed to reduce artifacts by modeling the physical effects that cause metal artifacts and by providing the flexibility to selectively remove corrupted spectral measurements in the spectral-sinogram space. The proposed Constrained `One-Step' Spectral CT Image Reconstruction (cOSSCIR) algorithm directly estimates the basis material maps while enforcing convex constraints. The incorporation of constraints on the reconstructed basis material maps is expected to mitigate undersampling effects that occur when corrupted data is excluded from reconstruction. The feasibility of the cOSSCIR algorithm to reduce metal artifacts was investigated through simulations of a pelvis phantom. The cOSSCIR algorithm was investigated with and without the use of a third basis material representing metal. The effects of excluding data corrupted by metal were also investigated. The results demonstrated that the proposed cOSSCIR algorithm reduced metal artifacts and improved CT number accuracy. For example, CT number error in a bright shading artifact region was reduced from 403 HU in the reference filtered backprojection reconstruction to 33 HU using the proposed algorithm in simulation. In the dark shading regions, the error was reduced from 1141 HU to 25 HU. Of the investigated approaches, decomposing the data into three basis material maps and excluding the corrupted data demonstrated the greatest reduction in metal artifacts.
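    The benefit of selectively excluding corrupted measurements can be illustrated with a toy masked least-squares reconstruction: rays flagged as metal-corrupted contribute nothing to the data term, and a constraint (here simple nonnegativity) stabilises the resulting undersampled problem. This is a generic sketch under a simplified linear model, not the cOSSCIR algorithm itself.

    ```python
    def forward(A, x):
        return [sum(a * xi for a, xi in zip(row, x)) for row in A]

    def reconstruct(A, y, keep, iters=500, step=0.05):
        """Projected gradient descent on a masked least-squares objective:
        measurements with keep[j] == False are ignored, and a nonnegativity
        projection plays the role of the convex constraint."""
        n = len(A[0])
        x = [0.0] * n
        for _ in range(iters):
            r = [(fj - yj) if kj else 0.0
                 for fj, yj, kj in zip(forward(A, x), y, keep)]
            g = [sum(A[j][i] * r[j] for j in range(len(A))) for i in range(n)]
            x = [max(xi - step * gi, 0.0) for xi, gi in zip(x, g)]
        return x

    # Two clean rays pin down the two unknowns; the third ray is "corrupted"
    # (value 99.0) and is simply excluded instead of poisoning the solution.
    A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    x = reconstruct(A, [2.0, 1.0, 99.0], keep=[True, True, False])
    print([round(v, 3) for v in x])  # -> [2.0, 1.0]
    ```

    Without the mask, the corrupted ray would pull the least-squares solution far from the truth, which is the linear-algebra analogue of a metal streak artifact.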

  7. A low-count reconstruction algorithm for Compton-based prompt gamma imaging

    Science.gov (United States)

    Huang, Hsuan-Ming; Liu, Chih-Chieh; Jan, Meei-Ling; Lee, Ming-Wei

    2018-04-01

    The Compton camera is an imaging device which has been proposed to detect prompt gammas (PGs) produced by proton–nuclear interactions within tissue during proton beam irradiation. Compton-based PG imaging has been developed to verify proton ranges because PG rays, particularly characteristic ones, have strong correlations with the distribution of the proton dose. However, accurate image reconstruction from characteristic PGs is challenging because the detector efficiency and resolution are generally low. Our previous study showed that point spread functions can be incorporated into the reconstruction process to improve image resolution. In this study, we proposed a low-count reconstruction algorithm to improve the image quality of a characteristic PG emission by pooling information from other characteristic PG emissions. PGs were simulated from a proton beam irradiated on a water phantom, and a two-stage Compton camera was used for PG detection. The results show that the image quality of the reconstructed characteristic PG emission is improved with our proposed method in contrast to the standard reconstruction method using events from only one characteristic PG emission. For the 4.44 MeV PG rays, both methods can be used to predict the positions of the peak and the distal falloff with a mean accuracy of 2 mm. Moreover, only the proposed method can improve the estimated positions of the peak and the distal falloff of 5.25 MeV PG rays, and a mean accuracy of 2 mm can be reached.

  8. Model-based respiratory motion compensation for emission tomography image reconstruction

    International Nuclear Information System (INIS)

    Reyes, M; Malandain, G; Koulibaly, P M; Gonzalez-Ballester, M A; Darcourt, J

    2007-01-01

    In emission tomography imaging, respiratory motion causes artifacts in reconstructed lung and cardiac images, which lead to misinterpretations, imprecise diagnosis, impaired fusion with other modalities, etc. Solutions such as respiratory gating, correlated dynamic PET techniques, list-mode data based techniques and others have been tested; these improve the recovered spatial activity distribution in lung lesions, but have the disadvantage of requiring additional instrumentation or discarding part of the projection data used for reconstruction. The objective of this study is to incorporate respiratory motion compensation directly into the image reconstruction process, without any additional acquisition protocol considerations. To this end, we propose an extension to the maximum likelihood expectation maximization (MLEM) algorithm that includes a respiratory motion model, which takes into account the displacements and volume deformations produced by respiratory motion during the data acquisition process. We present results from synthetic simulations incorporating real respiratory motion as well as from phantom and patient data.
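    The structure of a motion-aware MLEM update can be sketched on a tiny 1D problem. The sketch below folds the motion model into the system matrix (A = Σ_t f_t·P·M_t for phase fractions f_t, detection model P and per-phase warp M_t) and then runs the plain multiplicative MLEM update; this is an illustrative simplification of the paper's approach, not its actual code.

    ```python
    def mlem(A, y, iters=200):
        """Plain MLEM for y ≈ A x with nonnegative x; here the respiratory
        motion model is assumed to be pre-folded into A."""
        m, n = len(A), len(A[0])
        x = [1.0] * n
        sens = [sum(A[j][i] for j in range(m)) for i in range(n)]   # A^T 1
        for _ in range(iters):
            yb = [sum(a * xi for a, xi in zip(row, x)) for row in A]
            ratio = [yj / ybj if ybj > 0 else 0.0 for yj, ybj in zip(y, yb)]
            back = [sum(A[j][i] * ratio[j] for j in range(m)) for i in range(n)]
            x = [xi * bi / si if si > 0 else 0.0
                 for xi, bi, si in zip(x, back, sens)]
        return x

    # Demo: 1D object (0, 4, 0, 0); motion = 50% identity + 50% one-voxel
    # shift, ideal detector, so A[j][i] = 0.5*(j == i) + 0.5*(j == i + 1).
    A = [[0.5 * (j == i) + 0.5 * (j == i + 1) for i in range(4)]
         for j in range(4)]
    y = [0.0, 2.0, 2.0, 0.0]
    x = mlem(A, y)
    print([round(v, 2) for v in x])  # -> [0.0, 4.0, 0.0, 0.0]
    ```

    Note that naive reconstruction of the blurred data would smear the point source over two voxels; modeling the motion inside the system matrix lets MLEM recover the sharp source.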

  9. Internet2-based 3D PET image reconstruction using a PC cluster

    International Nuclear Information System (INIS)

    Shattuck, D.W.; Rapela, J.; Asma, E.; Leahy, R.M.; Chatzioannou, A.; Qi, J.

    2002-01-01

    We describe an approach to fast iterative reconstruction from fully three-dimensional (3D) PET data using a network of Pentium III PCs configured as a Beowulf cluster. To facilitate the use of this system, we have developed a browser-based interface using Java. The system compresses PET data on the user's machine, sends these data over a network, and instructs the PC cluster to reconstruct the image. The cluster implements a parallelized version of our preconditioned conjugate gradient method for fully 3D MAP image reconstruction. We report on the speed-up factors achieved with the Beowulf approach and the impact of communication latencies in the local cluster network and in the network connection between the user's machine and our PC cluster. (author)

  10. Fast data reconstructed method of Fourier transform imaging spectrometer based on multi-core CPU

    Science.gov (United States)

    Yu, Chunchao; Du, Debiao; Xia, Zongze; Song, Li; Zheng, Weijian; Yan, Min; Lei, Zhenggang

    2017-10-01

    An imaging spectrometer acquires a two-dimensional spatial image and a one-dimensional spectrum at the same time, which makes it highly useful in color and spectral measurement, true-color image synthesis, military reconnaissance and so on. To achieve fast reconstruction of Fourier transform imaging spectrometer data, this paper designs an optimized reconstruction algorithm based on OpenMP parallel computing, which was further applied to the optimization process for the HyperSpectral Imager of the `HJ-1' Chinese satellite. The results show that the method based on multi-core parallel computing can fully exploit the multi-core CPU hardware resources and significantly improve the efficiency of spectrum reconstruction processing. If the technique is applied in parallel on workstations with more cores, it should become possible to complete real-time processing of Fourier transform imaging spectrometer data on a single computer.
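    The reconstruction parallelises naturally because each pixel's interferogram is transformed independently. The sketch below is a Python stand-in for the paper's OpenMP/C approach: a naive per-pixel DFT distributed over a thread pool (threads chosen for brevity; OpenMP spreads the same loop over CPU cores).

    ```python
    import cmath
    import math
    from concurrent.futures import ThreadPoolExecutor

    def spectrum(interferogram):
        """Magnitude DFT of one pixel's interferogram (a naive O(n^2)
        transform standing in for the FFT used in practice)."""
        n = len(interferogram)
        return [abs(sum(v * cmath.exp(-2j * cmath.pi * k * t / n)
                        for t, v in enumerate(interferogram)))
                for k in range(n)]

    def reconstruct_cube(pixels, workers=4):
        # Each pixel is independent, so the loop is embarrassingly parallel.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(spectrum, pixels))

    # Demo: 16 pixels sharing a pure cosine interferogram -> spectral peak at k=1.
    pixels = [[math.cos(2 * math.pi * t / 8) for t in range(8)]
              for _ in range(16)]
    cube = reconstruct_cube(pixels)
    print(round(cube[0][1], 3))  # -> 4.0
    ```

    In C/OpenMP the equivalent would be a `#pragma omp parallel for` over the pixel loop around an FFT call, which is where the multi-core speed-up reported in the abstract comes from.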

  11. Segmenting Trajectories by Movement States

    NARCIS (Netherlands)

    Buchin, M.; Kruckenberg, H.; Kölzsch, A.; Timpf, S.; Laube, P.

    2013-01-01

    Dividing movement trajectories according to different movement states of animals has become a challenge in movement ecology, as well as in algorithm development. In this study, we revisit and extend a framework for trajectory segmentation based on spatio-temporal criteria for this purpose. We adapt

  12. Reconstruction of biological networks based on life science data integration

    Directory of Open Access Journals (Sweden)

    Kormeier Benjamin

    2010-06-01

    Full Text Available For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we present the applications BioDWH - an integration toolkit for building life science data warehouses, CardioVINEdb - an information system for biological data on cardiovascular disease, and VANESA - a network editor for modeling and simulation of biological networks. Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease related gene-regulated biological network is also presented.

  13. Reconstruction of biological networks based on life science data integration.

    Science.gov (United States)

    Kormeier, Benjamin; Hippe, Klaus; Arrigo, Patrizio; Töpel, Thoralf; Janowski, Sebastian; Hofestädt, Ralf

    2010-10-27

    For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we present the applications BioDWH--an integration toolkit for building life science data warehouses, CardioVINEdb--an information system for biological data on cardiovascular disease, and VANESA--a network editor for modeling and simulation of biological networks. Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease related gene-regulated biological network is also presented.

  14. Deviation from Trajectory Detection in Vision based Robotic Navigation using SURF and Subsequent Restoration by Dynamic Auto Correction Algorithm

    Directory of Open Access Journals (Sweden)

    Ray Debraj

    2015-01-01

    Full Text Available Speeded Up Robust Features (SURF) are used to position a robot with respect to its environment and to aid vision-based robotic navigation. During navigation, irregularities in the terrain, especially in an outdoor environment, may cause the robot to deviate from its track; unequal speeds of the left and right wheels are another source of deviation. Hence it is essential to detect such deviations and perform corrective operations to bring the robot back onto the track. In this paper we propose a novel algorithm that uses image matching with SURF to detect deviation of a robot from its trajectory, followed by restoration through corrective operations. This algorithm is executed in parallel with the positioning and navigation algorithms by distributing tasks among different CPU cores using the Open Multi-Processing (OpenMP) API.
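    One simple way such a deviation check can work: SURF yields matched keypoint pairs between a stored reference view and the live camera view, and a consistent lateral offset of the matches indicates drift. The sketch below is a hypothetical simplification (the function names, thresholds and the steering sign convention are assumptions, not the authors' implementation).

    ```python
    def median(vals):
        s = sorted(vals)
        m = len(s) // 2
        return s[m] if len(s) % 2 else 0.5 * (s[m - 1] + s[m])

    def deviation_command(matches, threshold=10.0):
        """matches: ((x_ref, y_ref), (x_cur, y_cur)) pairs of keypoints
        matched between the reference view and the current view. A consistent
        lateral offset of the matches signals drift off the track."""
        dx = median([x_cur - x_ref for (x_ref, _), (x_cur, _) in matches])
        if dx > threshold:
            return "steer_left"    # sign convention assumed for this toy
        if dx < -threshold:
            return "steer_right"
        return "on_track"

    matches = [((100, 50), (115, 52)), ((200, 80), (214, 81)),
               ((50, 40), (66, 40))]
    print(deviation_command(matches))  # -> steer_left
    ```

    Using the median rather than the mean makes the offset estimate robust to the occasional mismatched feature pair, which is important with real SURF matches.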

  15. Low dose CBCT reconstruction via prior contour based total variation (PCTV) regularization: a feasibility study

    Science.gov (United States)

    Chen, Yingxuan; Yin, Fang-Fang; Zhang, Yawei; Zhang, You; Ren, Lei

    2018-04-01

    Purpose: Compressed sensing reconstruction using total variation (TV) tends to over-smooth edge information by uniformly penalizing the image gradient. The goal of this study is to develop a novel prior contour based TV (PCTV) method to enhance edge information in compressed sensing reconstruction for CBCT. Methods: The edge information is extracted from the prior planning CT via edge detection. The prior CT is first registered with the on-board CBCT reconstructed with the TV method, using rigid or deformable registration. The edge contours in the prior CT are then mapped to the CBCT and used as the weight map for TV regularization to enhance edge information in the CBCT reconstruction. The PCTV method was evaluated using the extended-cardiac-torso (XCAT) phantom, a physical CatPhan phantom and brain patient data. Results were compared with both the TV and edge-preserving TV (EPTV) methods, which are commonly used for limited-projection CBCT reconstruction. Relative error was used to quantify pixel value differences, and edge cross correlation was defined as the similarity of edge information between reconstructed images and ground truth in the quantitative evaluation. Results: Compared to TV and EPTV, PCTV enhanced the edge information of bone, lung vessels and tumor in the XCAT reconstruction and of complex bony structures in the brain patient CBCT. In the XCAT study using 45 half-fan CBCT projections, relative errors compared with ground truth were 1.5%, 0.7% and 0.3%, and edge cross correlations were 0.66, 0.72 and 0.78 for TV, EPTV and PCTV, respectively. PCTV is more robust to reduction of the projection number. Edge enhancement was reduced slightly with noisy projections, but PCTV remained superior to the other methods. PCTV can maintain resolution while reducing noise in the low-mAs CatPhan reconstruction. Low-contrast edges were preserved better with PCTV than with TV and EPTV. Conclusion: PCTV preserved edge information as well as reduced streak artifacts and noise in low dose CBCT reconstruction.
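    The effect of a contour-derived weight map on TV can be shown in one dimension: setting the TV weight to zero across a known edge exempts that jump from the penalty, so it survives smoothing. The toy below is a generic weighted-TV denoiser under illustrative parameters, not the paper's CBCT reconstruction pipeline.

    ```python
    def pctv_denoise(y, edge_weight, lam=1.0, iters=200, step=0.1):
        """1D weighted-TV denoising by (sub)gradient descent; edge_weight[i]
        weights the penalty on the jump between samples i and i+1."""
        x = list(y)
        n = len(x)
        for _ in range(iters):
            g = [xi - yi for xi, yi in zip(x, y)]        # data-fidelity term
            for i in range(n - 1):
                d = x[i + 1] - x[i]
                s = 1 if d > 0 else (-1 if d < 0 else 0)
                # weight 0 on a prior contour lets that edge survive untouched
                g[i] -= lam * edge_weight[i] * s
                g[i + 1] += lam * edge_weight[i] * s
            x = [xi - step * gi for xi, gi in zip(x, g)]
        return x

    step_signal = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0]
    keep_edge = [1.0, 1.0, 0.0, 1.0, 1.0]   # prior contour between samples 2,3
    print(pctv_denoise(step_signal, keep_edge) == step_signal)  # -> True
    ```

    With a uniform weight map (all ones) the same routine shrinks the jump, which is exactly the TV over-smoothing the paper sets out to avoid.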

  16. A compressed sensing based reconstruction algorithm for synchrotron source propagation-based X-ray phase contrast computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Melli, Seyed Ali, E-mail: sem649@mail.usask.ca [Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, SK (Canada); Wahid, Khan A. [Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, SK (Canada); Babyn, Paul [Department of Medical Imaging, University of Saskatchewan, Saskatoon, SK (Canada); Montgomery, James [College of Medicine, University of Saskatchewan, Saskatoon, SK (Canada); Snead, Elisabeth [Western College of Veterinary Medicine, University of Saskatchewan, Saskatoon, SK (Canada); El-Gayed, Ali [College of Medicine, University of Saskatchewan, Saskatoon, SK (Canada); Pettitt, Murray; Wolkowski, Bailey [College of Agriculture and Bioresources, University of Saskatchewan, Saskatoon, SK (Canada); Wesolowski, Michal [Department of Medical Imaging, University of Saskatchewan, Saskatoon, SK (Canada)

    2016-01-11

    Synchrotron source propagation-based X-ray phase contrast computed tomography is increasingly used in pre-clinical imaging. However, it typically requires a large number of projections, and consequently a large radiation dose, to produce high quality images. To improve the applicability of this imaging technique, reconstruction algorithms are needed that can reduce the radiation dose and acquisition time without degrading image quality. The proposed research focused on using a novel combination of Douglas–Rachford splitting and randomized Kaczmarz algorithms to solve large-scale total variation based optimization in a compressed sensing framework to reconstruct 2D images from a reduced number of projections. Visual assessment and quantitative performance evaluations on a synthetic abdomen phantom and a real reconstructed image of an ex vivo slice of canine prostate tissue demonstrate that the proposed algorithm is competitive with other well-known reconstruction algorithms. An additional potential benefit of reducing the number of projections is the shorter acquisition, which reduces the opportunity for motion artifacts if the sample moves during image acquisition. Use of this reconstruction algorithm to reduce the required number of projections in synchrotron source propagation-based X-ray phase contrast computed tomography is an effective form of dose reduction that may pave the way for imaging of in vivo samples.
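    The randomized Kaczmarz ingredient of the proposed scheme is easy to state on its own: each step projects the current iterate onto the hyperplane of one randomly chosen measurement equation. The sketch below shows only this building block (with uniform row sampling for brevity; the classical variant samples rows proportionally to their squared norms), not the full Douglas–Rachford/TV algorithm.

    ```python
    import random

    def randomized_kaczmarz(A, b, iters=2000, seed=0):
        """Solve a consistent system Ax = b by repeated projection of the
        iterate onto the hyperplane <a_i, x> = b_i of a random row i."""
        rng = random.Random(seed)
        n = len(A[0])
        x = [0.0] * n
        norms = [sum(a * a for a in row) for row in A]
        for _ in range(iters):
            i = rng.randrange(len(A))
            row = A[i]
            resid = (b[i] - sum(a * xi for a, xi in zip(row, x))) / norms[i]
            x = [xi + resid * a for xi, a in zip(x, row)]
        return x

    # Consistent 2x2 system with solution (1, 2).
    x = randomized_kaczmarz([[2.0, 1.0], [1.0, 3.0]], [4.0, 7.0])
    print([round(v, 6) for v in x])  # -> [1.0, 2.0]
    ```

    Because each step touches only one row, the method scales to sinogram-sized systems where forming the full normal equations would be impractical, which is why it pairs well with splitting methods in compressed sensing CT.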

  17. Reconstruction of the cranial base in surgery for jugular foramen tumors.

    Science.gov (United States)

    Ramina, Ricardo; Maniglia, Joao J; Paschoal, Jorge R; Fernandes, Yvens B; Neto, Mauricio Coelho; Honorato, Donizeti C

    2005-04-01

    The surgical removal of a jugular foramen (JF) tumor presents the neurosurgeon with a complex management problem that requires an understanding of the natural history, diagnosis, surgical approaches, and postoperative complications. Cerebrospinal fluid (CSF) leakage is one of the most common complications of this surgery. Different surgical approaches and management concepts to avoid this complication have been described, mainly in the ear, nose, and throat literature. The purpose of this study was to review the results of CSF leakage prevention in a series of 66 patients with JF tumors operated on by a multidisciplinary cranial base team using a new technique for cranial base reconstruction. We retrospectively studied 66 patients who had JF tumors with intracranial extension and who underwent surgical treatment in our institutions from January 1987 to December 2001. Paragangliomas were the most frequent lesions, followed by schwannomas and meningiomas. All patients were operated on using the same multidisciplinary surgical approach (neurosurgeons and ear, nose, and throat surgeons). A surgical strategy for reconstruction of the cranial base using vascularized flaps was carried out. The closure of the surgical wound was performed in three layers. A specially developed myofascial flap (temporalis fascia, cervical fascia, and sternocleidomastoid muscle) associated to the inferior rotation of the posterior portion of the temporalis muscle was used to reconstruct the cranial base with vascularized flaps. In this series of 66 patients, postoperative CSF leakage developed in three cases. These patients presented with very large or recurrent tumors, and the postoperative CSF fistulae were surgically closed. The cosmetic result obtained with this reconstruction was classified as excellent or good in all patients. Our results compare favorably with those reported in the literature. The surgical strategy used for cranial base reconstruction presented in this article has

  18. Does acellular dermal matrix really improve aesthetic outcome in tissue expander/implant-based breast reconstruction?

    Science.gov (United States)

    Ibrahim, Ahmed M S; Koolen, Pieter G L; Ganor, Oren; Markarian, Mark K; Tobias, Adam M; Lee, Bernard T; Lin, Samuel J; Mureau, Marc A M

    2015-06-01

    The expectation for improved results by women undergoing postmastectomy reconstruction has steadily risen. A majority of these operations are tissue expander/implant-based breast reconstructions. Acellular dermal matrix (ADM) offers numerous advantages in these procedures. Thus far, the evidence to justify improved aesthetic outcome has solely been based on surgeon opinion. The purpose of this study was to assess aesthetic outcome following ADM use in tissue expander/implant-based breast reconstruction by a panel of blinded plastic surgeons. Mean aesthetic results of patients who underwent tissue expander/implant-based breast reconstruction with (n = 18) or without ADM (n = 20) were assessed with objective grading of preoperative and postoperative photographs by five independent blinded plastic surgeons. Absolute observed agreement as well as weighted Fleiss Kappa (κ) test statistics were calculated to assess inter-rater variability. When ADM was incorporated, the overall aesthetic score was improved by an average of 12.1 %. In addition, subscale analyses revealed improvements in breast contour (35.2 %), implant placement (20.7 %), lower pole projection (16.7 %), and inframammary fold definition (13.8 %). Contour (p = 0.039), implant placement (p = 0.021), and overall aesthetic score (p = 0.022) reached statistical significance. Inter-rater reliability showed mostly moderate agreement. Mean aesthetic scores were higher in the ADM-assisted breast reconstruction cohort including the total aesthetic score which was statistically significant. Aesthetic outcome alone may justify the added expense of incorporating biologic mesh. Moreover, ADM has other benefits which may render it cost-effective. Larger prospective studies are needed to provide plastic surgeons with more definitive guidelines for ADM use. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the

  19. Reconstruction for Skull Base Defect Using Fat-Containing Perifascial Areolar Tissue.

    Science.gov (United States)

    Choi, Woo Young; Sung, Ki Wook; Kim, Young Seok; Hong, Jong Won; Roh, Tai Suk; Lew, Dae Hyun; Chang, Jong Hee; Lee, Kyu Sung

    2017-06-01

    Skull base reconstruction is a challenging task. The method depends on the anatomical complexity and size of the defect. We harvested fat-containing perifascial areolar tissue (PAT) and demonstrate it as an effective option for reconstruction of limited skull base defects and for volume augmentation. From October 2013 to November 2015, 5 patients underwent operations using fat-containing PAT to fill a skull base defect and/or perform volume replacement in the forehead. PAT with 5- to 10-mm fat thickness was harvested from the inguinal region. The fat-containing PAT was grafted into the defect in contact with the vascularized wound bed. Patients were followed up in terms of their clinical symptoms and postoperative magnetic resonance imaging findings. Four patients were treated using fat-containing PAT after tumor resection. One patient was treated for a posttraumatic forehead depression deformity. The fat-containing PAT included 5- to 9-mm fat thickness in all cases. The mean size of the grafted PAT was 65.6 cm{sup 2} (28-140 cm{sup 2}). The mean follow-up period was 18.6 months (12-31 months). There were no notable complications and no donor site morbidity. PAT with fat can be harvested easily and in sufficient volume to treat the defect. It can also be combined with other reconstructive methods, such as a free or regional flap, to fill residual dead space. Fat-containing PAT is therefore an additional option for reconstruction of skull base defects.

  20. Trajectories of delinquency and parenting styles

    NARCIS (Netherlands)

    Hoeve, M.; van Blokland, A.; Dubas, J.S.; Loeber, R; Gerris, J.R.M.; van der Laan, P.H.

    2008-01-01

    We investigated trajectories of adolescent delinquent development using data from the Pittsburgh Youth Study and examined the extent to which these different trajectories are differentially predicted by childhood parenting styles. Based on self-reported and official delinquency seriousness, covering

  1. Trajectories of Delinquency and Parenting Styles

    NARCIS (Netherlands)

    Hoeve, M.; Blokland, A.A.J.; Dubas, J.S.; Loeber, R.; Gerris, J.R.M.; Laan, P.H. van der

    2008-01-01

    We investigated trajectories of adolescent delinquent development using data from the Pittsburgh Youth Study and examined the extent to which these different trajectories are differentially predicted by childhood parenting styles. Based on self-reported and official delinquency seriousness, covering

  2. Image-Based 3d Reconstruction and Analysis for Orthodontia

    Science.gov (United States)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of tooth parameters and on designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets attached to the teeth and a wire of given shape clamped by these brackets, producing the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying a standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach provides accurate measurement of the tooth parameters needed for adequate planning, design of correct tooth positions and monitoring of the treatment process. The developed technique applies photogrammetric methods to 3D model generation of the dental arch, determination of bracket positions and analysis of tooth movement.

  3. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment against a simulation based on the treatment plan. In this work, the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy in PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β + activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm in the numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data, and further clinical validation is envisioned. (paper)

  4. Design and Development of a Rapid Research, Design, and Development Platform for In-Situ Testing of Tools and Concepts for Trajectory-Based Operations

    Science.gov (United States)

    Underwood, Matthew C.

    2017-01-01

    To provide justification for equipping a fleet of aircraft with avionics capable of supporting trajectory-based operations, significant flight testing must be accomplished. However, equipping aircraft with these avionics and with the enabling technologies to communicate the clearances required for trajectory-based operations is costly using conventional avionics approaches. This paper describes an approach that minimizes the costs and risks of flight testing these technologies in situ, discusses the test-bed platform developed, and highlights results from a proof-of-concept flight test campaign that demonstrates the feasibility and efficiency of this approach.

  5. Reconstruction of Sky Illumination Domes from Ground-Based Panoramas

    Science.gov (United States)

    Coubard, F.; Lelégard, L.; Brédif, M.; Paparoditis, N.; Briottet, X.

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.
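    The Perez-type sky radiance model referenced here is a product of a gradation term, depending on the zenith angle of the sky element, and an indicatrix term, depending on the angular distance to the sun. The sketch below implements that general functional form; the coefficient values a..e are illustrative defaults, not the fitted values estimated in the paper.

    ```python
    import math

    def perez_relative_radiance(Z, gamma,
                                a=-1.0, b=-0.32, c=10.0, d=-3.0, e=0.45):
        """Perez all-weather relative sky radiance for a sky element at
        zenith angle Z (rad) and angular distance gamma (rad) from the sun.
        Coefficients a..e encode the sky condition (clear, overcast, ...)."""
        gradation = 1.0 + a * math.exp(b / max(math.cos(Z), 1e-6))
        indicatrix = 1.0 + c * math.exp(d * gamma) + e * math.cos(gamma) ** 2
        return gradation * indicatrix

    # Radiance rises towards the sun (circumsolar brightening).
    near_sun = perez_relative_radiance(0.5, 0.1)
    far_sky = perez_relative_radiance(0.5, 1.5)
    print(near_sun > far_sky)  # -> True
    ```

    In the paper's pipeline, the coefficients are fitted to the blue-sky pixels of the panoramas, and the fitted model then supplies radiance values where the images are saturated or bloomed.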

  7. Surgical Reconstruction of Charcot Foot Neuroarthropathy, a Case Based Review

    Directory of Open Access Journals (Sweden)

    Tomáš Kučera

    2014-01-01

    Our case-based review focuses on limb salvage through operative management of Charcot neuroarthropathy of the diabetic foot. We describe a case in which a below-knee amputation was considered in a patient with chronic Charcot foot with a rocker-bottom deformity and chronic plantar ulceration. Conservative treatment had failed. Targeted antibiotic therapy and operative management (Tendo-Achilles lengthening, resectional arthrodesis of the Lisfranc and midtarsal joints, fixation with large-diameter axial screws, and a plaster cast) were performed. On the basis of this case, we discuss the options and drawbacks of surgical management. Our approach led to healing of the ulcer and correction of the deformity. Two years after surgery, we observed a significant improvement in the patient's quality of life. Advanced diagnostic and imaging techniques, a better understanding of the biomechanics and biology of Charcot neuroarthropathy, and suitable osteosynthetic material enable diabetic limb salvage.

  8. Reconstruction of complicated skull base defects utilizing free tissue transfer.

    Science.gov (United States)

    Djalilian, Hamid R; Gapany, Markus; Levine, Samuel C

    2002-11-01

    We managed five patients with large skull base defects complicated by complex infections with microvascular free tissue transfer. The first patient developed an infection, cerebrospinal fluid (CSF) leak, and meningitis after undergoing a translabyrinthine resection of an acoustic neuroma. The second patient had a history of a gunshot wound to the temporal bone, with a large defect and an infected cholesteatoma that caused several episodes of meningitis. The third through fifth patients had persistent CSF leakage and infection refractory to conventional therapy. In all cases, prior attempts at closure with fat grafts or regional flaps had failed. A rectus abdominis myofascial free flap, a radial forearm free flap, or a gracilis muscle free flap was used after debridement of the infected cavities. The CSF leaks, local infections, and meningitis were controlled within a week. In our experience, microvascular free tissue transfer provides the necessary bulk of viable, well-vascularized tissue, which not only assures a mechanical seal but also helps clear the local infection.

  9. Model simulations and proxy-based reconstructions for the European region in the past millennium (Invited)

    Science.gov (United States)

    Zorita, E.

    2009-12-01

    One of the objectives when comparing simulations of past climates to proxy-based climate reconstructions is to assess the skill of climate models in simulating climate change. This comparison may be accomplished at large spatial scales, for instance the evolution of simulated and reconstructed Northern Hemisphere annual temperature, or at regional or point scales. In both approaches a 'fair' comparison has to take into account different aspects that affect the inevitable uncertainties and biases in the simulations and in the reconstructions. These efforts face a trade-off: climate models are believed to be more skillful at large hemispheric scales, but climate reconstructions at these scales are burdened by the spatial distribution of available proxies and by methodological issues surrounding the statistical method used to translate the proxy information into large-spatial averages. Furthermore, the internal climatic noise at large hemispheric scales is low, so that the sampling uncertainty also tends to be low. On the other hand, the skill of climate models at regional scales is limited by their coarse spatial resolution, which hinders a faithful representation of aspects important for the regional climate. At small spatial scales, the reconstruction of past climate probably faces fewer methodological problems if information from different proxies is available. The internal climatic variability at regional scales is, however, high. In this contribution some examples of the different issues faced when comparing simulations and reconstructions at small spatial scales in the past millennium are discussed. These examples comprise reconstructions from dendrochronological data and from historical documentary data in Europe, and climate simulations with global and regional models. They indicate that centennial climate variations can offer a reasonable target for assessing the skill of global climate models and of proxy-based reconstructions, even at small spatial scales.

  10. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    Science.gov (United States)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decreasing gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations of the radiative transfer equation are not optimal for this problem. To address this gastric-cancer-specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. Radiosity theory, integrated with the diffusion equation to form the hybrid light transport model, is used to describe light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem that incorporates an l1-norm regularization term to reflect the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital-mouse-based simulation, with a reconstruction error of less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The preliminary result demonstrates the ability of the novel BLT reconstruction algorithm in early gastric cancer detection.
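
    The l1-regularized minimization that the discretized model is converted into is typically of the form min 0.5*||Ax - y||^2 + lam*||x||_1, with x the source distribution and A the discretized forward operator. A minimal sketch of a generic solver for this problem class (iterative soft-thresholding, ISTA) is shown below on a toy forward model; this is an illustration of the problem structure, not the paper's actual solver or operator.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1,
    a generic solver for the kind of l1-regularized problem the BLT
    reconstruction is cast into (illustrative; not the paper's solver)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz const.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrinkage
    return x

# Toy forward model: a few point-like "sources" seen through a random,
# underdetermined sensitivity matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 81]] = [1.5, -2.0, 1.0]            # sparse source distribution
y = A @ x_true
x_hat = ista(A, y)
```

    Even with fewer measurements than unknowns, the l1 term drives the iterate toward the sparse source configuration, which is the rationale for the regularization choice in the abstract.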

  11. Trajectories of health-related quality of life among family caregivers of individuals with dementia: A home-based caregiver-training program matters.

    Science.gov (United States)

    Kuo, Li-Min; Huang, Huei-Ling; Liang, Jersey; Kwok, Yam-Ting; Hsu, Wen-Chuin; Liu, Chin-Yi; Shyu, Yea-Ing L

    To determine distinct courses of change in health-related quality of life (HRQoL) among family caregivers of individuals with dementia and how participating in a home-based caregiver-training program affects the probability of belonging to each course. Sixty-three caregivers were in the intervention group, and 66 caregivers were in the control group of a single-blinded randomized clinical trial. Two distinct trajectories of HRQoL were identified: a well-functioning trajectory and a poor-functioning trajectory. Caregivers who received the training program were more likely than those who did not to have a well-functioning trajectory of HRQoL over 18 months. This applied to bodily pain (b = 1.02, odds ratio [OR] = 2.76), general health perception (b = 1.28, OR = 3.60), social functioning (b = 1.12, OR = 3.05), vitality (b = 1.51, OR = 4.49), general mental health (b = 1.08, OR = 2.94), and the mental component summary (b = 1.27, OR = 3.55). Home-based caregiver training can be considered as part of the protocol for managing patients with dementia and their caregivers. Trial registration: NCT02667951. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Identification of digitized particle trajectories

    CERN Document Server

    Grote, H; Lassalle, J C; Zanella, P

    1973-01-01

    High-energy Physics Laboratories make increasing use of particle detectors which directly produce digital measurements of trajectories at very high rates. Data collected in vast amounts during experiments are then analysed by computer programs whose first task is the recognition of tracks and reconstruction of the interesting events. This paper discusses the applicability of various Pattern Recognition approaches. Examples are given of the problems and the practical achievements in this field.

  13. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method to a known-public-key attack is also provided.

  14. 3D reconstruction based on compressed-sensing (CS)-based framework by using a dental panoramic detector.

    Science.gov (United States)

    Je, U K; Cho, H M; Hong, D K; Cho, H S; Park, Y O; Park, C K; Kim, K S; Lim, H W; Kim, G A; Park, S Y; Woo, T H; Cho, S I

    2016-01-01

    In this work, we propose a practical method that combines dental panoramic and cone-beam CT (CBCT) functionality in a single system by using one panoramic detector. We implemented a CS-based reconstruction algorithm for the proposed method and performed a systematic simulation to demonstrate its viability for 3D dental X-ray imaging. We successfully reconstructed volumetric images of considerably high accuracy by using a panoramic detector with an active area of 198.4 mm × 6.4 mm and evaluated the reconstruction quality as a function of the pitch (p) and the angle step (Δθ). Our simulation results indicate that the CS-based reconstruction almost completely recovered the phantom structures, as in CBCT, for p ≤ 2.0 and Δθ ≤ 6°, indicating that it is very promising for accurate image reconstruction even for large-pitch and few-view data. We expect the proposed method to be applicable to developing a cost-effective, volumetric dental X-ray imaging system. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Effective electric fields along realistic DTI-based neural trajectories for modelling the stimulation mechanisms of TMS

    International Nuclear Information System (INIS)

    De Geeter, N; Crevecoeur, G; Dupré, L; Leemans, A

    2015-01-01

    In transcranial magnetic stimulation (TMS), an applied alternating magnetic field induces an electric field in the brain that can interact with the neural system. It is generally assumed that this induced electric field is the crucial effect exciting a certain region of the brain. More specifically, it is the component of this field parallel to the neuron's local orientation, the so-called effective electric field, that can initiate neuronal stimulation. Deeper insight into the stimulation mechanisms can be acquired through extensive TMS modelling. Most models study simple representations of neurons with assumed geometries, whereas we embed realistic neural trajectories computed using tractography based on diffusion tensor images. This way of modelling ensures a more accurate spatial distribution of the effective electric field that is, in addition, patient and case specific. The case study of this paper focuses on single-pulse stimulation of the left primary motor cortex with a standard figure-of-eight coil. Including realistic neural geometry in the model demonstrates the strong and localized variations of the effective electric field between the tracts themselves and along them, due to the interplay of factors such as the tract's position and orientation in relation to the TMS coil, the neural trajectory, and its course along the white and grey matter interface. Furthermore, the influence of changes in the coil orientation is studied. Investigating the impact of tissue anisotropy confirms that its contribution is not negligible. Moreover, assuming isotropic tissues leads to errors of the same size as rotating or tilting the coil by 10 degrees. In contrast, the model proves to be less sensitive to the poorly known tissue conductivity values.
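
    The effective electric field is simply the projection of the induced field onto the local fiber orientation. Given a tract as an ordered polyline, that projection can be sketched as below; the field and tract here are toy stand-ins, not a TMS coil model or DTI-derived trajectory.

```python
import numpy as np

def effective_field_along_tract(points, e_field):
    """Effective electric field: the component of the induced field parallel
    to the local tract orientation, evaluated at each segment midpoint of a
    tract given as an ordered polyline of 3-D points."""
    points = np.asarray(points, float)
    seg = np.diff(points, axis=0)
    tangents = seg / np.linalg.norm(seg, axis=1, keepdims=True)
    mids = 0.5 * (points[:-1] + points[1:])
    E = np.array([e_field(p) for p in mids])
    return np.einsum('ij,ij->i', E, tangents)     # E . t per segment

# Toy case (hypothetical field, not a coil model): a uniform field along x
# and a quarter-circle tract in the x-y plane.
t = np.linspace(0.0, np.pi / 2, 50)
tract = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)]
E_par = effective_field_along_tract(tract, lambda p: np.array([1.0, 0.0, 0.0]))
```

    Even with a uniform field, the effective component varies strongly along the curved tract (from near zero to the full field magnitude), illustrating why the tract's course relative to the coil dominates the spatial pattern reported above.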

  16. [Vegetation spatial and temporal dynamic characteristics based on NDVI time series trajectories in grassland opencast coal mining].

    Science.gov (United States)

    Jia, Duo; Wang, Cang Jiao; Mu, Shou Guo; Zhao, Hua

    2017-06-18

    The spatiotemporal dynamic patterns of vegetation in mining areas are still unclear. This study applied a time-series trajectory segmentation algorithm to fit Landsat NDVI time series generated from fusion images at the peak of the growing season based on the ESTARFM algorithm. Combining the shape features of the fitted trajectories, this paper extracted five vegetation dynamic patterns: pre-disturbance, continuous disturbance, stabilization after disturbance, stabilization between disturbance and recovery, and recovery after disturbance. The results indicated that recovery after disturbance was the dominant vegetation change pattern among the five types, accounting for 55.2% of the total number of pixels. It was followed by stabilization after disturbance and continuous disturbance, accounting for 25.6% and 11.0%, respectively. The pre-disturbance and stabilization-between-disturbance-and-recovery types accounted for 3.5% and 4.7%, respectively. Vegetation disturbance mainly occurred from 2004 to 2009 in the Shengli mining area. The stable state mainly began in 2008, with its spatial locations mainly distributed in the open-pit stope and waste dump. The recovery state mainly started in 2008 and 2010, while the areas were small and mainly distributed at the periphery of the open-pit stope and waste dump. The duration of disturbance was mainly 1 year, and the stable period usually lasted 7 years. The recovery phase of the stabilization-between-disturbance-and-recovery type lasted 2 to 5 years, while that of the recovery-after-disturbance type often lasted 8 years.
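
    Trajectory segmentation of this kind amounts to fitting a small number of line segments to each pixel's NDVI series and reading disturbance and recovery off the segment slopes. A deliberately simplified single-breakpoint version is sketched below (the real algorithm fits several segments per trajectory; the synthetic pixel is an assumption for illustration).

```python
import numpy as np

def best_breakpoint(years, ndvi):
    """Fit two least-squares line segments joined at one breakpoint and
    return (break_year, slope_before, slope_after).  A crude stand-in for
    the trajectory-segmentation step, which in practice fits several
    segments per NDVI trajectory."""
    years = np.asarray(years, float)
    ndvi = np.asarray(ndvi, float)
    best = None
    for k in range(2, len(years) - 2):            # >= 3 points per segment
        sse, slopes = 0.0, []
        for xs, ys in ((years[:k + 1], ndvi[:k + 1]),
                       (years[k:], ndvi[k:])):
            coef = np.polyfit(xs, ys, 1)
            sse += np.sum((np.polyval(coef, xs) - ys) ** 2)
            slopes.append(coef[0])
        if best is None or sse < best[0]:
            best = (sse, years[k], slopes[0], slopes[1])
    return best[1:]

# Synthetic pixel: stable cover, abrupt disturbance in 2006, slow recovery.
yrs = np.arange(2000, 2014)
ndvi = np.where(yrs < 2006, 0.7, 0.3 + 0.04 * (yrs - 2006))
bp, s_before, s_after = best_breakpoint(yrs, ndvi)
```

    A negative slope before the breakpoint and a positive slope after it would be labelled "recovery after disturbance" in the typology above; flat or declining post-break slopes map to the other classes.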

  17. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Saha, Krishnendu [Ohio Medical Physics Consulting, Dublin, Ohio 43017 (United States); Straus, Kenneth J.; Glick, Stephen J. [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States); Chen, Yu. [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States)

    2014-08-28

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo simulation software was utilized to accurately model the system matrix for a breast PET system. To increase the count statistics in the system matrix computation and to reduce the system element storage space, only a subset of matrix elements was calculated and the rest were estimated by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% reduced noise level and a 1.5- to 3-fold improvement in resolution when compared to MLEM reconstruction using a simple line-integral model. The GATE-based system matrix reconstruction technique promises to improve resolution and noise performance and to reduce image distortion at the FOV periphery compared to line-integral-based system matrix reconstruction.
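
    The MLEM baseline referred to above is the standard multiplicative update x <- x / (A^T 1) * A^T(y / (A x)); the contribution of the paper is the accuracy of the system matrix A fed into it. A toy-sized sketch of the iteration (dense random A standing in for the GATE-computed matrix, an assumption for illustration):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-likelihood EM iteration for emission tomography:
    x <- x / (A^T 1) * A^T (y / (A x)).  This is the baseline algorithm
    into which a Monte Carlo-derived system matrix A would be plugged."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                          # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x = x / sens * (A.T @ ratio)
    return x

# Toy problem: 10 polar "voxels" seen by 30 detector pairs.
rng = np.random.default_rng(1)
A = rng.random((30, 10))
x_true = rng.random(10) + 0.1
y = A @ x_true                                    # noiseless projections
x_hat = mlem(A, y)
```

    The update preserves non-negativity and, for consistent data, drives the reprojection A x toward the measurements; in the paper the same iteration is run with the block-circulant Monte Carlo matrix instead of a line-integral model.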

  18. Target 3-D reconstruction of streak tube imaging lidar based on Gaussian fitting

    Science.gov (United States)

    Yuan, Qingyu; Niu, Lihong; Hu, Cuichun; Wu, Lei; Yang, Hongru; Yu, Bing

    2018-02-01

    Streak images obtained with the streak tube imaging lidar (STIL) contain the distance-azimuth-intensity information of a scanned target, and a 3-D reconstruction of the target can be carried out by extracting the characteristic data of multiple streak images. Peak detection alone introduces significant errors into the reconstruction because of noise and other factors. To obtain a more precise 3-D reconstruction, a peak detection method based on trust-region Gaussian fitting is proposed in this work. Gaussian modeling is performed on the return waveform of each time channel of each frame; the fitted result, which effectively suppresses noise interference and possesses a unique peak, is taken as the new return waveform, and its feature data are then extracted by peak detection. Experimental data from an aerial target were used to verify this method. This work shows that the Gaussian-fitting-based peak detection method reduces the extraction error of the feature data to less than 10%; using this method to extract the feature data and reconstruct the target makes it possible to achieve a spatial resolution of down to 30 cm in the depth direction and improves the 3-D imaging accuracy of the STIL.
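
    The reason a Gaussian model beats raw peak detection is that it localizes the pulse maximum between samples. A lightweight stand-in for the paper's trust-region fit is a parabola fit to the log of the three samples around the maximum, which is exact for a noise-free Gaussian (its log is a parabola); the pulse parameters below are assumptions for illustration.

```python
import numpy as np

def gaussian_peak_location(y):
    """Sub-sample peak position of a sampled Gaussian pulse, from a parabola
    fit to the log of the three samples around the maximum.  A lightweight
    stand-in for a full Gaussian fit (exact for a pure, noise-free Gaussian,
    since its log is a parabola)."""
    k = int(np.argmax(y))
    la, lb, lc = np.log(y[k - 1]), np.log(y[k]), np.log(y[k + 1])
    delta = 0.5 * (la - lc) / (la - 2.0 * lb + lc)
    return k + delta

# Noise-free return pulse whose true peak (range bin 20.3) falls off-grid.
t = np.arange(64, dtype=float)
pulse = np.exp(-0.5 * ((t - 20.3) / 2.5) ** 2)
peak = gaussian_peak_location(pulse)
```

    Plain argmax would return bin 20; the fit recovers the fractional bin, which is what turns time-channel samples into sub-bin (here, sub-30-cm-scale) range estimates.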

  19. Probability- and curve-based fractal reconstruction on 2D DEM terrain profile

    International Nuclear Information System (INIS)

    Lai, F.-J.; Huang, Y.M.

    2009-01-01

    Data compression and reconstruction have long played important roles in information science and engineering. Within this field, image compression and reconstruction, which mainly deal with reducing image data sets for storage or transmission and restoring them with the least loss, remain topics that deserve a great deal of attention. In this paper we propose a new scheme, compared against the well-known Improved Douglas-Peucker (IDP) method, to extract characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile in order to compress the data set. For reconstruction by fractal interpolation, we propose a probability-based method to speed up the fractal interpolation execution to a rate as high as three or even nine times the regular one. In addition, a curve-based method is proposed to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, so as to significantly improve the reconstruction performance. Finally, an evaluation is made to show the advantage of employing the proposed method to extract characteristic points in combination with our novel fractal interpolation scheme.
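
    Fractal interpolation reconstructs a profile from its feature points via an iterated function system of affine maps, one per interval, whose vertical scaling factors control the roughness of the curve. A minimal sketch of the standard affine construction follows (the toy profile and d values are assumptions; choosing d well is exactly what the paper's curve-based method addresses).

```python
import numpy as np

def fractal_interpolate(xs, ys, d, n_levels=8):
    """Points on the affine fractal interpolation function through (xs, ys)
    with one vertical scaling factor d[i] (|d[i]| < 1) per interval."""
    xs = np.asarray(xs, float)
    ys = np.asarray(ys, float)
    x0, xN, y0, yN = xs[0], xs[-1], ys[0], ys[-1]
    pts = np.c_[xs, ys]
    for _ in range(n_levels):
        images = []
        for i in range(1, len(xs)):               # one affine map per interval
            a = (xs[i] - xs[i - 1]) / (xN - x0)
            e = (xN * xs[i - 1] - x0 * xs[i]) / (xN - x0)
            c = (ys[i] - ys[i - 1] - d[i - 1] * (yN - y0)) / (xN - x0)
            f = (xN * ys[i - 1] - x0 * ys[i]
                 - d[i - 1] * (xN * y0 - x0 * yN)) / (xN - x0)
            images.append(np.c_[a * pts[:, 0] + e,
                                c * pts[:, 0] + d[i - 1] * pts[:, 1] + f])
        pts = np.unique(np.vstack(images), axis=0)
    return pts

profile_x = [0.0, 1.0, 2.0, 3.0]                  # toy terrain feature points
profile_y = [0.0, 0.8, 0.3, 0.6]
curve = fractal_interpolate(profile_x, profile_y, d=[0.3, 0.3, 0.3])
```

    The attractor always passes through the feature points; larger |d| yields a rougher curve between them, so the vertical scaling factor directly governs how well the reconstruction matches the original terrain's texture.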

  20. Limiting CT radiation dose in children with craniosynostosis: phantom study using model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Kaasalainen, Touko; Lampinen, Anniina [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); University of Helsinki, Department of Physics, Helsinki (Finland); Palmu, Kirsi [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); School of Science, Aalto University, Department of Biomedical Engineering and Computational Science, Helsinki (Finland); Reijonen, Vappu; Kortesniemi, Mika [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); Leikola, Junnu [University of Helsinki and Helsinki University Hospital, Department of Plastic Surgery, Helsinki (Finland); Kivisaari, Riku [University of Helsinki and Helsinki University Hospital, Department of Neurosurgery, Helsinki (Finland)

    2015-09-15

    Medical professionals need to exercise particular caution when developing CT scanning protocols for children who require multiple CT studies, such as those with craniosynostosis. Our aim was to evaluate the utility of ultra-low-dose CT protocols with model-based iterative reconstruction techniques for craniosynostosis imaging. We scanned two pediatric anthropomorphic phantoms with a 64-slice CT scanner using different low-dose protocols for craniosynostosis. We measured organ doses in the head region with metal-oxide-semiconductor field-effect transistor (MOSFET) dosimeters. Numerical simulations served to estimate organ and effective doses. We objectively and subjectively evaluated the quality of images produced by adaptive statistical iterative reconstruction (ASiR) 30%, ASiR 50% and Veo (all by GE Healthcare, Waukesha, WI). Image noise and contrast were determined for different tissues. Mean organ dose with the newborn phantom was decreased by up to 83% compared to the routine protocol when using ultra-low-dose scanning settings. Similarly, for the 5-year phantom the greatest radiation dose reduction was 88%. The numerical simulations supported the findings of the MOSFET measurements. The image quality remained adequate with Veo reconstruction, even at the lowest dose level. Craniosynostosis CT with model-based iterative reconstruction could be performed with a 20-μSv effective dose, corresponding to the radiation exposure of plain skull radiography, without compromising the required image quality.

  1. Single-resolution and multiresolution extended-Kalman-filter-based reconstruction approaches to optical refraction tomography.

    Science.gov (United States)

    Naik, Naren; Vasu, R M; Ananthasayanam, M R

    2010-02-20

    The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed, and these biases and covariances are adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of the EKF for ORT in single-resolution and multiresolution formulations, as well as the adaptive estimation of the EKF's noise covariances.
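
    The EKF machinery underlying both approaches is the standard predict/update cycle with linearized models. A generic textbook-form sketch is given below, exercised on a toy stationary estimation problem (the state and measurement models here are illustrative assumptions, not the ORT state-variable model).

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter with state
    transition f (Jacobian F) and measurement function h (Jacobian H).
    Generic textbook form; the paper applies this machinery to the ORT
    state-variable model with adaptively estimated Q and R."""
    x_pred = f(x)                                  # predict
    P_pred = F(x) @ P @ F(x).T + Q
    Hx = H(x_pred)
    innov = z - h(x_pred)                          # innovation
    S = Hx @ P_pred @ Hx.T + R                     # innovation covariance
    K = P_pred @ Hx.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ Hx) @ P_pred
    return x_new, P_new

# Toy stationary problem: estimate a constant 2-vector from repeated noisy
# linear measurements.
rng = np.random.default_rng(2)
x_true = np.array([1.0, -0.5])
Hmat = np.array([[1.0, 0.3], [0.0, 1.0]])
x, P = np.zeros(2), np.eye(2)
for _ in range(200):
    z = Hmat @ x_true + 0.01 * rng.standard_normal(2)
    x, P = ekf_step(x, P, z,
                    f=lambda s: s, F=lambda s: np.eye(2),
                    h=lambda s: Hmat @ s, H=lambda s: Hmat,
                    Q=1e-9 * np.eye(2), R=1e-4 * np.eye(2))
```

    In the MR-EKF variant the same cycle runs on the wavelet-transformed state, and Q and R are not fixed as here but estimated adaptively along with the state.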

  2. Study on the effects of sample selection on spectral reflectance reconstruction based on the algorithm of compressive sensing

    International Nuclear Information System (INIS)

    Zhang, Leihong; Liang, Dong

    2016-01-01

    To address the problem that the efficiency and precision of spectral reflectance reconstruction are not high, in this paper different training samples are selected for reconstructing spectral reflectance, and a new spectral reflectance reconstruction method based on the compressive sensing algorithm is provided. Four matte color cards with different numbers of color patches (the ColorChecker Color Rendition Chart, the ColorChecker SG, the Pantone copperplate-paper spot color card, and the Munsell color card) are chosen as training samples; the spectral image is reconstructed with the compressive sensing, pseudo-inverse, and Wiener algorithms in turn, and the results are compared. These spectral reconstruction methods are evaluated by root mean square error and color difference accuracy. The experiments show that the cumulative contribution rate and color difference of the Munsell color card are better than those of the other three color cards under the same reconstruction conditions, and that the accuracy of the spectral reconstruction is affected by the training samples drawn from the different color cards. The uniformity and representativeness of the training sample selection are therefore of key importance for reconstruction. In this paper, the influence of sample selection on spectral image reconstruction is studied. The precision of the spectral reconstruction based on the compressive sensing algorithm is higher than that of the traditional spectral reconstruction algorithms, and the MATLAB simulation results show that the reconstruction precision and efficiency are affected by the number of colors in the training sample.
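
    The pseudo-inverse baseline mentioned above learns a single linear map from camera responses to reflectance spectra over the training samples. A minimal sketch on synthetic data follows (the sensor model and reflectance basis are hypothetical, standing in for a real color card and camera):

```python
import numpy as np

def pseudo_inverse_reconstruction(R_train, C_train, c_new):
    """Classic pseudo-inverse spectral reconstruction: learn a linear map
    from camera responses to reflectance spectra over the training samples,
    then apply it to a new response.  One of the baseline methods the
    compressive-sensing approach is compared against."""
    M = R_train @ np.linalg.pinv(C_train)          # reflectance ~ M @ response
    return M @ c_new

# Toy data (hypothetical sensors, not a real color card): 31-band
# reflectances observed through 3 broadband channels.
rng = np.random.default_rng(3)
basis = rng.random((31, 6))                        # low-dim reflectance model
R_train = basis @ rng.random((6, 120))             # 120 training reflectances
S = rng.random((3, 31))                            # sensor sensitivities
C_train = S @ R_train                              # training responses

r_new = basis @ rng.random(6)
r_hat = pseudo_inverse_reconstruction(R_train, C_train, S @ r_new)
```

    With only three channels the learned map reproduces the camera responses exactly but the 31-band spectrum only approximately, which is why both spectral (RMSE) and colorimetric (color difference) error measures are reported, and why the choice of training samples matters so much.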

  3. Cup Implant Planning Based on 2-D/3-D Radiographic Pelvis Reconstruction-First Clinical Results.

    Science.gov (United States)

    Schumann, Steffen; Sato, Yoshinobu; Nakanishi, Yuki; Yokota, Futoshi; Takao, Masaki; Sugano, Nobuhiko; Zheng, Guoyan

    2015-11-01

    We present a newly developed X-ray calibration phantom and its integration for 2-D/3-D pelvis reconstruction and subsequent automatic cup planning. Two different cup planning strategies were applied and evaluated with clinical data. The first planning strategy is based on a combined pelvis and cup statistical atlas: the pelvis part of the combined atlas is matched to the reconstructed pelvis model, resulting in an optimized cup planning. The second planning strategy analyzes the morphology of the reconstructed pelvis model to determine the best-fitting cup implant. The first planning strategy was compared to 3-D CT-based planning. Digitally reconstructed radiographs of THA patients with pathologies of differing severity were used to evaluate the accuracy of predicting the cup size and position. Within a discrepancy of one cup size, the size was correctly identified in 100% of the cases for Crowe type I datasets and in 77.8% of the cases for Crowe type II, III, and IV datasets. The second planning strategy was analyzed with respect to the eventually implanted cup size. In seven patients, the estimated cup diameter was correct within one cup size, while the estimation for the remaining five patients differed by two cup sizes. While both planning strategies showed the same prediction rate within a discrepancy of one cup size (87.5%), the prediction of the exact cup size was better for the statistical atlas-based strategy (56%) than for the anatomically driven approach (37.5%). The proposed approach demonstrated the clinical validity of using a 2-D/3-D reconstruction technique for cup planning.

  4. Track reconstruction for the Mu3e experiment based on a novel Multiple Scattering fit

    Directory of Open Access Journals (Sweden)

    Kozlinskiy Alexandr

    2017-01-01

    The Mu3e experiment is designed to search for the lepton flavor violating decay μ+ → e+e+e−. The aim of the experiment is to reach a branching ratio sensitivity of 10^-16. In a first phase the experiment will be performed at an existing beam line at the Paul Scherrer Institute (Switzerland) providing 10^8 muons per second, which will allow a sensitivity of 2 · 10^-15 to be reached. The muons, with a momentum of about 28 MeV/c, are stopped in a target and decay at rest. The decay products (positrons and electrons with energies below 53 MeV) are measured by a tracking detector consisting of two double layers of 50 μm thin silicon pixel sensors. The high granularity of the pixel detector, with a pixel size of 80 μm × 80 μm, allows for precise track reconstruction in the high-multiplicity environment of the Mu3e experiment, reaching 100 tracks per 50 ns reconstruction frame in the final phase of the experiment. To deal with such high rates and combinatorics, the Mu3e track reconstruction uses a novel fit algorithm that in the simplest case takes into account only multiple scattering, which allows for fast online tracking on a GPU-based filter farm. An implementation of the 3-dimensional multiple scattering fit based on hit triplets is described. An extension of the fit that takes into account energy losses and the pixel size is used for offline track reconstruction. The algorithm and performance of the offline track reconstruction, based on a full Geant4 simulation of the Mu3e detector, are presented.
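
    The geometric core of a hit-triplet fit is the circle through three hits in the bending plane, whose radius fixes the transverse momentum via pT ≈ 0.3 B R. A sketch of that core step follows; the actual Mu3e fit is 3-dimensional and weights the triplets by multiple-scattering uncertainties, and the hit positions and field here are illustrative assumptions.

```python
import numpy as np

def triplet_circle_radius(p1, p2, p3):
    """Radius of the circle through three transverse-plane hits: the
    geometric core of a hit-triplet fit.  (The actual Mu3e fit is
    3-dimensional and folds in multiple-scattering uncertainties.)"""
    a = np.linalg.norm(p2 - p1)
    b = np.linalg.norm(p3 - p2)
    c = np.linalg.norm(p3 - p1)
    cross = ((p2[0] - p1[0]) * (p3[1] - p1[1])
             - (p2[1] - p1[1]) * (p3[0] - p1[0]))  # 2 * signed triangle area
    return a * b * c / (2.0 * abs(cross))          # circumradius abc/(4*Area)

# Three hits on a track of transverse radius 0.1 m; with
# pT[GeV/c] ~= 0.3 * B[T] * R[m] this is about 30 MeV/c at B = 1 T,
# i.e. the momentum scale of Mu3e decay products.
R_true = 0.1
hits = [np.array([R_true * np.cos(t), R_true * np.sin(t)])
        for t in np.radians([80.0, 60.0, 40.0])]
R_fit = triplet_circle_radius(*hits)
pT = 0.3 * 1.0 * R_fit                             # GeV/c
```

    Because only three hits and a handful of arithmetic operations are needed per triplet, this kind of fit parallelizes naturally, which is what makes the GPU-based online tracking mentioned above feasible.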

  5. Abdominal- versus thigh-based reconstruction of perineal defects in patients with cancer.

    Science.gov (United States)

    Pang, John; Broyles, Justin M; Berli, Jens; Buretta, Kate; Shridharani, Sachin M; Rochlin, Danielle H; Efron, Jonathan E; Sacks, Justin M

    2014-06-01

    An abdominoperineal resection is an invasive procedure that leaves the patient with extensive pelvic dead space. Traditionally, the vertical rectus abdominus myocutaneous flap is used to reconstruct these defects. Oftentimes, this flap cannot be used because of multiple ostomy placements or previous abdominal surgery. The anterolateral thigh flap can be used instead; however, the efficacy of this flap has been questioned. We report a single surgeon's experience with perineal reconstruction in patients with cancer using either the vertical rectus abdominus myocutaneous flap or the anterolateral thigh flap, to demonstrate acceptable outcomes with either repair modality. From 2010 to 2012, 19 consecutive patients with perineal defects secondary to cancer underwent flap reconstruction. A retrospective chart review of prospectively entered data was conducted to determine the frequency of short-term and long-term complications. The study was conducted at an academic, tertiary-care cancer center on patients with cancer who were receiving perineal reconstruction. Interventions were surgical and included either abdomen- or thigh-based reconstruction. The main outcome measures included infection, flap failure, length of stay, and time to radiotherapy. Of the 19 patients included in our study, 10 underwent anterolateral thigh flaps and 9 underwent vertical rectus abdominus myocutaneous flaps for reconstruction. There were no significant differences in demographics between the groups (p > 0.05). Surgical outcomes and complications demonstrated no significant differences in the rates of infection, hematoma, bleeding, or necrosis. The mean length of stay after reconstruction was 9.7 ± 3.4 days (± SD) in the anterolateral thigh flap group and 13.4 ± 7.7 days in the vertical rectus abdominus myocutaneous flap group (p > 0.05). The limitations of this study include a relatively small sample size and retrospective evaluation. This study suggests that the anterolateral thigh flap achieves outcomes comparable to those of the vertical rectus abdominus myocutaneous flap for perineal reconstruction in patients with cancer.

  6. Efficient conservative ADER schemes based on WENO reconstruction and space-time predictor in primitive variables

    Science.gov (United States)

    Zanotti, Olindo; Dumbser, Michael

    2016-01-01

    schemes provide less oscillatory solutions than ADER finite volume schemes based on reconstruction in conserved variables, especially for the RMHD and Baer-Nunziato equations. For the RHD and RMHD equations, the overall accuracy is improved and the CPU time is reduced by about 25%. Because of its increased accuracy and reduced computational cost, we recommend using this version of ADER as the standard one in the relativistic framework. At the end of the paper, the new approach has also been extended to ADER-DG schemes on space-time adaptive grids (AMR).
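To make the reconstruction step concrete, a minimal fifth-order WENO reconstruction at a cell interface can be sketched as below. This is only the classic finite-volume WENO5 building block, not the ADER space-time predictor or the primitive-variable formulation the abstract describes; coefficients follow the standard Jiang-Shu formulation.

```python
import numpy as np

def weno5_reconstruct(v):
    """Left-biased WENO5 reconstruction of v at interface i+1/2,
    given the five cell averages v[i-2], ..., v[i+2]."""
    eps = 1e-6
    v0, v1, v2, v3, v4 = v

    # Smoothness indicators for the three candidate stencils.
    b0 = 13/12 * (v0 - 2*v1 + v2)**2 + 1/4 * (v0 - 4*v1 + 3*v2)**2
    b1 = 13/12 * (v1 - 2*v2 + v3)**2 + 1/4 * (v1 - v3)**2
    b2 = 13/12 * (v2 - 2*v3 + v4)**2 + 1/4 * (3*v2 - 4*v3 + v4)**2

    # Third-order candidate reconstructions on each stencil.
    q0 = (2*v0 - 7*v1 + 11*v2) / 6
    q1 = (-v1 + 5*v2 + 2*v3) / 6
    q2 = (2*v2 + 5*v3 - v4) / 6

    # Non-linear weights: the ideal weights d, biased away from
    # oscillatory (non-smooth) stencils.
    d = np.array([0.1, 0.6, 0.3])
    alpha = d / (eps + np.array([b0, b1, b2]))**2
    w = alpha / alpha.sum()
    return w @ np.array([q0, q1, q2])

# For smooth (here linear) data, all candidate stencils agree and the
# reconstruction recovers the exact interface value 2.5.
print(weno5_reconstruct(np.array([0.0, 1.0, 2.0, 3.0, 4.0])))
```

In an ADER scheme this reconstruction would be applied variable-by-variable (in conserved or, as advocated above, primitive variables) before the space-time predictor step.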

  7. Endoscopic endonasal double flap technique for reconstruction of large anterior skull base defects: technical note.

    Science.gov (United States)

    Dolci, Ricardo Landini Lutaif; Todeschini, Alexandre Bossi; Santos, Américo Rubens Leite Dos; Lazarini, Paulo Roberto

    2018-04-19

    One of the main concerns in endoscopic endonasal approaches to the skull base has been the high incidence and morbidity associated with cerebrospinal fluid leaks. The introduction and routine use of vascularized flaps allowed a marked decrease in this complication, followed by a great expansion in the indications and techniques used in endoscopic endonasal approaches, extending to defects from huge tumours and previously inaccessible areas of the skull base. To describe the technique of performing endoscopic double flap multi-layered reconstruction of the anterior skull base without craniotomy. Step-by-step description of the endoscopic double flap technique (nasoseptal and pericranial vascularized flaps and a fascia lata free graft) as used and illustrated in two patients with an olfactory groove meningioma who underwent an endoscopic approach. Both patients achieved a gross total resection; subsequent reconstruction of the anterior skull base was performed with the nasoseptal and pericranial flaps onlay and a fascia lata free graft inlay. Both patients showed an excellent recovery, with no signs of cerebrospinal fluid leak, meningitis, flap necrosis, chronic meningeal or sinonasal inflammation, or cerebral herniation developing. The endoscopic double flap technique we have described is a viable, versatile and safe option for anterior skull base reconstruction, decreasing the incidence of complications in endoscopic endonasal approaches. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  8. Reproducibility of UAV-based earth topography reconstructions based on Structure-from-Motion algorithms

    Science.gov (United States)

    Clapuyt, Francois; Vanacker, Veerle; Van Oost, Kristof

    2016-05-01

    Combination of UAV-based aerial pictures and the Structure-from-Motion (SfM) algorithm provides an efficient, low-cost and rapid framework for remote sensing and monitoring of dynamic natural environments. This methodology is particularly suitable for repeated topographic surveys in remote or poorly accessible areas. However, temporal analysis of landform topography requires high accuracy of measurements and reproducibility of the methodology, as differencing of digital surface models leads to error propagation. In order to assess the repeatability of the SfM technique, we surveyed a study area characterized by gentle topography with a UAV platform equipped with a standard reflex camera, and varied the focal length of the camera and the location of georeferencing targets between flights. Comparison of the different SfM-derived topography datasets shows that the precision of measurements is on the order of centimetres for identical replications, which highlights the excellent performance of the SfM workflow, all parameters being equal. The measurement error is one order of magnitude higher for 3D topographic reconstructions involving independent sets of ground control points, which results from the fact that the accuracy of the localisation of ground control points strongly propagates into the final results.
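The error-propagation concern when differencing digital surface models can be illustrated with a toy computation. All values below (grid size, the 2 cm and 20 cm per-survey precisions, the 1.96-sigma threshold) are invented for this sketch and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated DSMs of the same (unchanged) surface, with different
# per-survey vertical precisions: a tight "identical replication" and a
# looser survey with independent ground control points.
true_surface = np.zeros((50, 50))   # gentle topography, flat for simplicity
sigma_1 = 0.02                      # 2 cm precision (identical setup)
sigma_2 = 0.20                      # 20 cm precision (independent GCPs)

dsm_1 = true_surface + rng.normal(0.0, sigma_1, true_surface.shape)
dsm_2 = true_surface + rng.normal(0.0, sigma_2, true_surface.shape)

# DSM of difference: independent errors add in quadrature.
dod = dsm_2 - dsm_1
sigma_dod = np.sqrt(sigma_1**2 + sigma_2**2)

# Only cells exceeding a confidence threshold should be read as real change;
# here nothing changed, so roughly 5% of cells are false positives.
significant = np.abs(dod) > 1.96 * sigma_dod
print(f"expected sigma of difference: {sigma_dod:.3f} m")
print(f"fraction of cells flagged as change: {significant.mean():.3f}")
```

The quadrature sum shows why the less precise survey dominates the detectable-change threshold: with 2 cm and 20 cm inputs, the difference map can only resolve changes of roughly 0.4 m at 95% confidence.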

  9. Images from the Mind: BCI image reconstruction based on Rapid Serial Visual Presentations of polygon primitives

    Directory of Open Access Journals (Sweden)

    Luís F Seoane

    2015-04-01

    Full Text Available We provide a proof of concept for an EEG-based reconstruction of a visual image which is on a user's mind. Our approach is based on the Rapid Serial Visual Presentation (RSVP) of polygon primitives and Brain-Computer Interface (BCI) technology. In an experimental setup, subjects were presented with bursts of polygons: some of them contributed to building a target image (because they matched the shape and/or color of the target), while some of them did not. The presentation of the contributing polygons triggered attention-related EEG patterns. These Event-Related Potentials (ERPs) could be detected using BCI classification and matched to the stimuli that elicited them. These stimuli (i.e., the ERP-correlated polygons) were accumulated in the display until a satisfactory reconstruction of the target image was reached. As more polygons were accumulated, finer visual details were attained, resulting in more challenging classification tasks. In our experiments, we observe an average classification accuracy of around 75%. An in-depth investigation suggests that many of the misclassifications were not misinterpretations of the BCI concerning the users' intent, but rather were caused by ambiguous polygons that could contribute to reconstructing several different images. When we put our BCI image reconstruction in perspective with other RSVP BCI paradigms, there is large room for improvement both in speed and accuracy. These results invite us to be optimistic. They open a plethora of possibilities for exploring non-invasive BCIs for image reconstruction in both healthy and impaired subjects and, accordingly, suggest interesting recreational and clinical applications.

  10. Electron and photon reconstruction and performance in ATLAS using a dynamical, topological cell clustering-based approach

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    The electron and photon reconstruction in ATLAS has moved towards the use of a dynamical, topological cell-based approach for cluster building, owing to advancements in the calibration procedure which allow for such a method to be applied. The move to this new technique allows for improved measurements of electron and photon energies, particularly in situations where an electron radiates a bremsstrahlung photon, or a photon converts to an electron-positron pair. This note details the changes to the ATLAS electron and photon reconstruction software, and assesses its performance under current LHC luminosity conditions using simulated data. Changes to the converted photon reconstruction are also detailed, which improve the reconstruction efficiency of double-track converted photons, as well as reducing the reconstruction of spurious one-track converted photons. The performance of the new reconstruction algorithm is also presented in a number of important topologies relevant to precision Standard Model physics,...

  11. TH-EF-BRB-10: Dosimetric Validation of a Trajectory Based Cranial SRS Treatment Technique On a Varian TrueBeam Linac

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, B [University of British Columbia, Vancouver, BC (Canada); Vancouver Cancer Centre, Vancouver, BC (Canada); Gete, E [Vancouver Cancer Centre, Vancouver, BC (Canada)

    2016-06-15

    Purpose: This work investigates the dosimetric accuracy of a trajectory based delivery technique in which an optimized radiation beam is delivered along a couch-gantry trajectory formed by simultaneous rotation of the linac gantry and the treatment couch. Methods: Nine trajectory based cranial SRS treatment plans were created using in-house optimization software. The plans were calculated for delivery on the TrueBeam STx linac with a 6 MV photon beam. Dose optimization was performed along a user-defined trajectory using MLC modulation, dose rate modulation and jaw tracking. The pre-defined trajectory chosen for this study is formed by a couch rotation through its full range of 180 degrees while the gantry makes four partial arc sweeps of 170 degrees each. For final dose calculation, the trajectory based plans were exported to the Varian Eclipse Treatment Planning System. The plans were calculated on a homogeneous cube phantom measuring 18.2×18.2×18.2 cm3 with the analytical anisotropic algorithm (AAA) using a 1 mm3 calculation voxel. The plans were delivered on the TrueBeam linac via the developer’s mode. Point dose measurements were performed for 9 patients with the IBA CC01 mini-chamber, which has a sensitive volume of 0.01 cc. Gafchromic film measurements along the sagittal and coronal planes were performed for three of the 9 treatment plans. Point dose values were compared with ion chamber measurements. Gamma analysis comparing film measurements and AAA calculations was performed using FilmQA Pro. Results: The AAA calculations and measurements were in good agreement. The point dose differences between AAA and ion chamber measurements were within 2.2%. Gamma analysis pass rates (2%, 2 mm passing criteria) for the Gafchromic film measurements were >95%. Conclusion: We have successfully tested TrueBeam’s ability to deliver accurate trajectory based treatments involving simultaneous gantry and couch rotation with MLC and dose rate modulation along the
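The film-versus-calculation comparison above relies on the gamma index. A minimal one-dimensional, globally normalized version with the 2%/2 mm criteria can be sketched as follows; the Gaussian dose profiles are synthetic stand-ins, not the measured data:

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, x, dose_tol=0.02, dist_tol=2.0):
    """Gamma value at each reference point (1-D, global normalization).

    dose_tol is a fraction of the reference maximum (2%); dist_tol is the
    distance-to-agreement criterion in mm. A point passes if gamma <= 1.
    """
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - di) / (dose_tol * d_max)  # dose-difference term
        dx = (x - xi) / dist_tol                    # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()    # search over all eval points
    return gammas

x = np.linspace(0, 40, 201)                   # positions in mm (0.2 mm grid)
reference = np.exp(-((x - 20) / 8.0) ** 2)    # "measured" profile, normalized
evaluated = np.exp(-((x - 20.5) / 8.0) ** 2)  # "calculated" profile, 0.5 mm shift

g = gamma_index(reference, evaluated, x)
pass_rate = (g <= 1.0).mean() * 100
print(f"gamma pass rate (2%/2mm): {pass_rate:.1f}%")
```

A small spatial shift that fails a pure dose-difference test still passes gamma, because the distance-to-agreement term absorbs it; this is exactly the tolerance the >95% pass rates above are quoted against.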

  12. Consideration of safety of implant-based breast reconstruction with postreconstruction radiotherapy for breast cancer

    International Nuclear Information System (INIS)

    Aomatsu, Naoki; Tei, Seika; Haraoka, Goichi

    2016-01-01

    There is controversy as to whether immediate autologous breast reconstruction followed by postoperative radiotherapy has acceptable complications and aesthetic outcomes. To evaluate the interval between surgery and adjuvant chemotherapy and radiation in patients treated with mastectomy and immediate expander-implant reconstruction, and to evaluate locoregional and distant control and cosmesis in these patients. Between 2011 and 2015, 9 patients with breast cancer were treated at our institution with definitive mastectomy and axillary lymph node dissection followed by immediate tissue expander placement and postreconstruction radiotherapy. We reviewed the complications of implant-based breast reconstruction followed by postreconstruction radiotherapy. The timing of irradiation was after implant insertion for 8 patients and after tissue expander insertion for 1 patient. The mean follow-up was 601 days. There were no unacceptable complications or local recurrences. For the majority of patients, overall symmetry, aesthetic results, and patient satisfaction were high. Breast reconstruction using tissue expansion and implants is an acceptable option for the subset of patients who may undergo postreconstruction radiotherapy. (author)

  13. THE STUDY OF SPECTRUM RECONSTRUCTION BASED ON FUZZY SET FULL CONSTRAINT AND MULTIENDMEMBER DECOMPOSITION

    Directory of Open Access Journals (Sweden)

    Y. Sun

    2017-09-01

    Hyperspectral imaging systems can obtain spectral and spatial information simultaneously, with bandwidths at the level of 10 nm or even less. Hyperspectral remote sensing can therefore detect objects that cannot be detected by wide-band remote sensing, making it one of the most active topics in the field. In this study, under fully constrained fuzzy-set conditions, a Normalized Multi-Endmember Decomposition Method (NMEDM) for vegetation, water, and soil was proposed to reconstruct hyperspectral data using a large number of high-quality multispectral data and auxiliary spectral library data. This study considered spatial and temporal variation and decreased the calculation time required to reconstruct the hyperspectral data. The results of spectral reconstruction based on NMEDM show that the reconstructed data are of good quality and have practical applications, making spectral feature identification possible. This method also extends the depth and breadth of application of remote sensing data, helping to explore the relationship between multispectral and hyperspectral data.
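The multi-endmember decomposition idea rests on linear spectral unmixing with a sum-to-one constraint. The sketch below shows that generic building block only, not the NMEDM algorithm itself; the six-band endmember spectra for vegetation, water and soil are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

bands = 6
E = np.column_stack([  # endmember matrix: vegetation, water, soil
    np.array([0.05, 0.08, 0.45, 0.40, 0.30, 0.25]),  # vegetation
    np.array([0.10, 0.08, 0.05, 0.03, 0.02, 0.01]),  # water
    np.array([0.20, 0.25, 0.30, 0.35, 0.40, 0.45]),  # soil
])

true_a = np.array([0.6, 0.1, 0.3])  # true fractions, summing to one
y = E @ true_a + rng.normal(0, 1e-3, bands)  # observed pixel spectrum

# Sum-to-one constrained least squares (closed form via a Lagrange
# multiplier on 1'a = 1).
G = np.linalg.inv(E.T @ E)
a_u = G @ E.T @ y                   # unconstrained least-squares estimate
ones = np.ones(3)
a = a_u + G @ ones * (1 - ones @ a_u) / (ones @ G @ ones)

# Crude non-negativity handling: clip and renormalize. A fully constrained
# solver would instead re-solve with an active-set of zeroed fractions.
a = np.clip(a, 0, None)
a /= a.sum()
print("estimated fractions:", np.round(a, 3))
```

Per-pixel, this recovers abundance fractions that sum to one; NMEDM layers normalization and multi-endmember selection on top of this kind of constrained decomposition.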

  14. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and from the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem, transforming the joint MAP estimation into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
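As a generic illustration of MAP estimation solved by conjugate gradient (and only that: the thesis's actual model is non-linear and polychromatic with an unknown noise variance, whereas this toy keeps the variance fixed and the forward operator linear), consider a Gaussian likelihood with a Gaussian prior, whose MAP estimate solves a symmetric positive-definite linear system:

```python
import numpy as np

rng = np.random.default_rng(5)

n, m = 40, 60
H = rng.normal(size=(m, n))            # toy linear forward (projection) operator
x_true = rng.normal(size=n)
sigma = 0.05                           # known noise level (unknown in the thesis)
y = H @ x_true + rng.normal(0, sigma, m)

lam = 1e-2                             # weight of the zero-mean Gaussian prior
# MAP minimizes ||y - Hx||^2 / (2 sigma^2) + lam ||x||^2 / 2, i.e. solves
# (H'H / sigma^2 + lam I) x = H'y / sigma^2.
A = H.T @ H / sigma**2 + lam * np.eye(n)
b = H.T @ y / sigma**2

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Standard CG for a symmetric positive-definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_map = conjugate_gradient(A, b)
rel_err = np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true)
print(f"relative error vs truth: {rel_err:.3f}")
```

The thesis replaces this quadratic cost with a non-quadratic one (from the non-linear polychromatic model and the adaptive variance prior), which is why a monotone CG variant with carefully chosen descent steps is needed there instead of plain CG.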

  15. Reconstruction of various perinasal defects using facial artery perforator-based nasolabial island flaps.

    Science.gov (United States)

    Yoon, Tae Ho; Yun, In Sik; Rha, Dong Kyun; Lee, Won Jai

    2013-11-01

    Classical flaps for perinasal defect reconstruction, such as forehead or nasolabial flaps, have some disadvantages involving limitations of the arc of rotation and two stages of surgery. However, a perforator-based flap is more versatile and allows freedom in flap design. We introduced our experience with reconstruction using a facial artery perforator-based propeller flap on the perinasal area. We describe the surgical differences between different defect subtypes. Between December 2005 and August 2013, 10 patients underwent perinasal reconstruction in which a facial artery perforator-based flap was used. We divided the perinasal defects into types A and B, according to location. The operative results, including flap size, arc of rotation, complications, and characteristics of the perforator were evaluated by retrospective chart review and photographic evaluation. Eight patients were male and 2 patients were female. Their mean age was 61 years (range, 35-75 years). The size of the flap ranged from 1 cm×1.5 cm to 3 cm×6 cm. Eight patients healed uneventfully, but 2 patients presented with mild flap congestion. However, these 2 patients healed by conservative management without any additional surgery. All of the flaps survived completely with aesthetically pleasing results. The facial artery perforator-based flap allowed for versatile customized flaps, and the donor site scar was concealed using the natural nasolabial fold.

  16. Towards Efficient Search for Activity Trajectories

    DEFF Research Database (Denmark)

    Zheng, Kai; Shang, Shuo; Yuan, Jing

    2013-01-01

    Recent proliferation in location-based web applications (e.g., Foursquare, Facebook) has given rise to large amounts of trajectories associated with activity information, called activity trajectories. In this paper, we study the problem of efficient similarity search on activity trajectory databases. Given

  17. Automated retinofugal visual pathway reconstruction with multi-shell HARDI and FOD-based analysis.

    Science.gov (United States)

    Kammen, Alexandra; Law, Meng; Tjan, Bosco S; Toga, Arthur W; Shi, Yonggang

    2016-01-15

    Diffusion MRI tractography provides a non-invasive modality to examine the human retinofugal projection, which consists of the optic nerves, optic chiasm, optic tracts, the lateral geniculate nuclei (LGN) and the optic radiations. However, the pathway has several anatomic features that make it particularly challenging to study with tractography, including its location near blood vessels and bone-air interface at the base of the cerebrum, crossing fibers at the chiasm, somewhat-tortuous course around the temporal horn via Meyer's Loop, and multiple closely neighboring fiber bundles. To date, these unique complexities of the visual pathway have impeded the development of a robust and automated reconstruction method using tractography. To overcome these challenges, we develop a novel, fully automated system to reconstruct the retinofugal visual pathway from high-resolution diffusion imaging data. Using multi-shell, high angular resolution diffusion imaging (HARDI) data, we reconstruct precise fiber orientation distributions (FODs) with high order spherical harmonics (SPHARM) to resolve fiber crossings, which allows the tractography algorithm to successfully navigate the complicated anatomy surrounding the retinofugal pathway. We also develop automated algorithms for the identification of ROIs used for fiber bundle reconstruction. In particular, we develop a novel approach to extract the LGN region of interest (ROI) based on intrinsic shape analysis of a fiber bundle computed from a seed region at the optic chiasm to a target at the primary visual cortex. By combining automatically identified ROIs and FOD-based tractography, we obtain a fully automated system to compute the main components of the retinofugal pathway, including the optic tract and the optic radiation. We apply our method to the multi-shell HARDI data of 215 subjects from the Human Connectome Project (HCP). Through comparisons with post-mortem dissection measurements, we demonstrate the retinotopic

  18. Missing texture reconstruction method based on error reduction algorithm using Fourier transform magnitude estimation scheme.

    Science.gov (United States)

    Ogawa, Takahiro; Haseyama, Miki

    2013-03-01

    A missing texture reconstruction method based on an error reduction (ER) algorithm, including a novel estimation scheme for Fourier transform magnitudes, is presented in this brief. In our method, the Fourier transform magnitude is estimated for a target patch including missing areas, and the missing intensities are estimated by retrieving its phase based on the ER algorithm. Specifically, by monitoring the errors converged in the ER algorithm, known patches whose Fourier transform magnitudes are similar to that of the target patch are selected from the target image. The Fourier transform magnitude of the target patch is then estimated from those of the selected known patches and their corresponding errors. Consequently, by using the ER algorithm, both the Fourier transform magnitudes and phases can be estimated to reconstruct the missing areas.
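The core ER iteration, stripped of the magnitude-estimation scheme above, alternates between a Fourier-domain constraint (impose the known magnitude, keep the phase) and an image-domain constraint (reset the known intensities). A minimal sketch on a synthetic patch, assuming the magnitude is already known exactly:

```python
import numpy as np

rng = np.random.default_rng(3)

patch = rng.random((16, 16))            # synthetic "texture" patch
magnitude = np.abs(np.fft.fft2(patch))  # assume the magnitude was estimated correctly

mask = np.ones((16, 16), dtype=bool)    # True where intensities are known
mask[5:9, 5:9] = False                  # a 4x4 missing region

estimate = patch * mask                 # initialize missing area with zeros
for _ in range(300):
    spectrum = np.fft.fft2(estimate)
    # Fourier-domain constraint: keep the current phase, impose the magnitude.
    spectrum = magnitude * np.exp(1j * np.angle(spectrum))
    estimate = np.real(np.fft.ifft2(spectrum))
    # Image-domain constraint: reset the known pixels to their true values.
    estimate[mask] = patch[mask]

err = np.abs(estimate - patch)[~mask].mean()
print(f"mean absolute error in missing area: {err:.4f}")
```

In the method above, of course, the true magnitude is not available; it is estimated from similar known patches, and errors in that estimate limit how well this iteration can fill the hole.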

  19. Fast reconstruction of off-axis digital holograms based on digital spatial multiplexing.

    Science.gov (United States)

    Sha, Bei; Liu, Xuan; Ge, Xiao-Lu; Guo, Cheng-Shan

    2014-09-22

    A method for fast reconstruction of off-axis digital holograms based on a digital multiplexing algorithm is proposed. Instead of the existing angular multiplexing (AM), the new method utilizes a spatial multiplexing (SM) algorithm, in which four off-axis holograms recorded in sequence are synthesized into one SM function by multiplying each hologram with a tilted plane wave and then adding them up. In comparison with conventional methods, the SM algorithm simplifies the two-dimensional (2-D) Fourier transforms (FTs) of four N×N arrays into a 1.25-D FT of one N×N array. Experimental results demonstrate that, using the SM algorithm, the computational efficiency can be improved while the reconstructed wavefronts keep the same quality as those retrieved with the existing AM method. This algorithm may be useful in the design of a fast preview system for dynamic wavefront imaging in digital holography.
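The multiplexing idea itself is easy to demonstrate: modulating each frame by a different tilted plane wave shifts its spectrum to a distinct carrier frequency, so one Fourier transform of the sum keeps the four contributions separable. The toy below illustrates only this principle; the carrier frequencies and hologram contents are invented, and it does not reproduce the paper's 1.25-D transform:

```python
import numpy as np

N = 128
yy, xx = np.mgrid[0:N, 0:N]
rng = np.random.default_rng(7)

# Four "recorded frames" (random stand-ins for off-axis holograms).
holograms = [rng.random((N, N)) for _ in range(4)]
carriers = [(20, 0), (-20, 0), (0, 20), (0, -20)]  # (fx, fy) in cycles/frame

# Synthesize the spatially multiplexed function: each frame times its own
# tilted plane wave, all summed into one composite array.
composite = np.zeros((N, N), dtype=complex)
for holo, (fx, fy) in zip(holograms, carriers):
    tilt = np.exp(2j * np.pi * (fx * xx + fy * yy) / N)
    composite += holo * tilt

# A single FFT of the composite shows each frame's spectrum centred at its
# own carrier, so one transform replaces four separate ones.
spectrum = np.fft.fftshift(np.fft.fft2(composite))
peak = np.unravel_index(np.argmax(np.abs(spectrum)), spectrum.shape)
print("strongest carrier located at (row, col):", peak)
```

Each frame's DC term lands at its carrier location (offsets of ±20 bins from the centre at (64, 64)), which is what allows the four spectra to be demultiplexed by windowing around each carrier.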

  20. Reconstructing the early 19th-century Waal River by means of a 2D physics-based numerical model

    NARCIS (Netherlands)

    Montes Arboleda, A.; Crosato, A.; Middelkoop, H.

    2010-01-01

    Suspended-sediment concentration data are a missing link in reconstructions of the River Waal in the early 1800s. These reconstructions serve as a basis for assessing the long-term effects of major interventions carried out between 1850 AD and the early 20th century. We used a 2D physics-based