We propose an effective 4D image registration algorithm for dynamic volumetric lung images. The registration constructs a deforming 3D model with a continuous trajectory and smooth spatial deformation, and this model interpolates the region of interest in the 4D (3D+T) CT images. The resulting non-rigid transformation is represented by two 4D B-spline functions, encoding a forward and an inverse 4D parameterization, respectively. The registration process solves for these two functions by minimizing an objective function that penalizes intensity matching error, feature alignment error, spatial and temporal non-smoothness, and inverse inconsistency. We test our algorithm for respiratory motion estimation on public benchmarks and on clinical lung CT data. The experimental results demonstrate the efficacy of our algorithm.
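To make the deformation model above concrete, the following is a minimal sketch of evaluating a 4D (3D space + time) free-form deformation driven by a cubic B-spline control-point lattice. The grid shape, spacing, and function names are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np

def cubic_bspline_weights(u):
    """Uniform cubic B-spline basis weights for a fractional offset u in [0, 1)."""
    return np.array([
        (1 - u) ** 3 / 6.0,
        (3 * u ** 3 - 6 * u ** 2 + 4) / 6.0,
        (-3 * u ** 3 + 3 * u ** 2 + 3 * u + 1) / 6.0,
        u ** 3 / 6.0,
    ])

def ffd_displacement(point, control_grid, spacing):
    """Displacement at a 4D point (x, y, z, t).

    control_grid: array of shape (Nx, Ny, Nz, Nt, 3) holding 3D displacement
                  vectors on a regular 4D lattice of control points.
    spacing:      control-point spacing along each of the four axes.
    """
    idx = np.floor(np.asarray(point) / np.asarray(spacing)).astype(int)
    frac = np.asarray(point) / np.asarray(spacing) - idx
    weights = [cubic_bspline_weights(f) for f in frac]  # 4 axes x 4 weights each

    disp = np.zeros(3)
    for a in range(4):
        for b in range(4):
            for c in range(4):
                for d in range(4):
                    w = weights[0][a] * weights[1][b] * weights[2][c] * weights[3][d]
                    i, j, k, l = idx + np.array([a, b, c, d]) - 1
                    # Clamp indices so the sketch also works near the lattice border.
                    i = np.clip(i, 0, control_grid.shape[0] - 1)
                    j = np.clip(j, 0, control_grid.shape[1] - 1)
                    k = np.clip(k, 0, control_grid.shape[2] - 1)
                    l = np.clip(l, 0, control_grid.shape[3] - 1)
                    disp += w * control_grid[i, j, k, l]
    return disp

# Usage: a tiny random lattice, displacement queried at one spatio-temporal point.
grid = np.random.randn(8, 8, 8, 6, 3) * 0.5           # control-point displacements (mm)
print(ffd_displacement((20.0, 15.0, 10.0, 2.5), grid, spacing=(5.0, 5.0, 5.0, 1.0)))
```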
This paper presents a consistent feature-aligned 4D image registration algorithm and its medical application. The matching across a temporal sequence of volumetric images is based on a 4D (3D spatial + 1D temporal) free-form B-spline deformation model, which ensures interpolated motions with both spatial and temporal smoothness. We first develop the forward and inverse matching models with feature alignment constraints, then iteratively refine the registration results by incorporating an additional inverse consistency constraint. Experimental results show that our method achieves better registration accuracy than previous 3D and 4D registration methods. This algorithm can be used to parameterize temporal CT lung volume images for motion analysis and tracking.
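The objective coupling the forward and inverse models can be sketched as a weighted sum of the penalty terms named above. The sketch below assumes precomputed intensity and smoothness terms and placeholder weights; none of the names correspond to the authors' code.

```python
import numpy as np

def inverse_consistency(forward, inverse, sample_points):
    """Mean squared round-trip error || inverse(forward(x)) - x ||^2."""
    pts = np.asarray(sample_points, dtype=float)
    round_trip = np.array([inverse(forward(p)) for p in pts])
    return np.mean(np.sum((round_trip - pts) ** 2, axis=1))

def feature_alignment(forward, source_landmarks, target_landmarks):
    """Mean squared distance between mapped source landmarks and their targets."""
    mapped = np.array([forward(p) for p in source_landmarks])
    return np.mean(np.sum((mapped - np.asarray(target_landmarks)) ** 2, axis=1))

def objective(forward, inverse, intensity_term, smoothness_term,
              source_landmarks, target_landmarks, sample_points,
              w_feat=1.0, w_smooth=0.1, w_inv=0.5):
    """Weighted sum of the four penalty terms; the weights are placeholders."""
    return (intensity_term
            + w_feat * feature_alignment(forward, source_landmarks, target_landmarks)
            + w_smooth * smoothness_term
            + w_inv * inverse_consistency(forward, inverse, sample_points))

# Toy usage with a small translation and its exact inverse (objective should be ~0).
fwd = lambda p: np.asarray(p) + np.array([1.0, 0.0, 0.0])
inv = lambda p: np.asarray(p) - np.array([1.0, 0.0, 0.0])
pts = np.random.rand(50, 3) * 100.0
print(objective(fwd, inv, intensity_term=0.0, smoothness_term=0.0,
                source_landmarks=pts[:10], target_landmarks=pts[:10] + [1.0, 0.0, 0.0],
                sample_points=pts))
```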
A computational framework for modeling the respiratory motion of lung tumors. It provides a 4D parametric representation that tracks, analyzes, and models tumor movement, enabling more accurate guidance in the planning and delivery of lung tumor radiotherapy.
We propose a biharmonic model for cross-object volumetric mapping. This new computational model aims to facilitate the mapping of solid models with complicated geometry or heterogeneous inner structure. We demonstrate its effectiveness on m
We develop a feature-aware 4D spatiotemporal image registration method. Our model is based on a 4D (3D+time) free-form B-spline deformation model that has both spatial and temporal smoothness. We first introduce an automatic 3D feature extraction and matching method based on an improved 3D SIFT descriptor, which is scale- and rotation-invariant. We then use the resulting feature correspondences to guide an intensity-based deformable image registration. Experimental results show that our method yields smooth temporal registration with good matching accuracy; this registration model is therefore potentially suitable for dynamic tumor tracking.
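As a concrete illustration of the matching step, here is a minimal sketch of nearest-neighbour descriptor matching with Lowe's ratio test. The descriptors are assumed to come from a 3D SIFT-style extractor, which is not reproduced here, and the array names are hypothetical.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Return index pairs (i, j) where desc_a[i] matches desc_b[j].

    A match is accepted only if the nearest neighbour is sufficiently closer
    than the second-nearest one (the classic ratio test).
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, best))
    return matches

# Toy usage: 128-D descriptors, where desc_b is a noisy copy of desc_a.
rng = np.random.default_rng(0)
desc_a = rng.random((40, 128))
desc_b = desc_a + rng.normal(scale=0.01, size=desc_a.shape)
print(len(match_descriptors(desc_a, desc_b)))   # expect close to 40
```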
This video visualizes our 4D parameterization, i.e., a continuously deforming 3D geometry, constructed from a sequence of 3D CT volume images.
We first extract the lung contour through image segmentation of the first frame, then reconstruct a 3D surface model. Next, we compute the 4D registration across all frames. To visualize the result, we deform the reconstructed 3D model following the parameterization, which tracks the lung contour over time. We verify the accuracy of our registration and motion tracking against the original volume scans by showing the intersection of our 3D model with the image cross-sections.
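The tracking step can be sketched as carrying the mesh reconstructed from the first frame through time by evaluating the forward 4D transform at every vertex. The transform callable below is a stand-in assumption; in the pipeline it would be the 4D B-spline registration result.

```python
import numpy as np

def track_mesh(vertices, forward, times):
    """Return one deformed copy of `vertices` (N x 3) per time point."""
    tracked = []
    for t in times:
        tracked.append(np.array([forward(v, t) for v in vertices]))
    return tracked

# Toy usage: a small vertex set driven by a sinusoidal "breathing" translation.
verts = np.random.rand(100, 3) * 50.0
breathing = lambda v, t: v + np.array([0.0, 0.0, 5.0 * np.sin(2 * np.pi * t / 10.0)])
frames = track_mesh(verts, breathing, times=range(10))
print(len(frames), frames[0].shape)   # 10 deformed meshes, each 100 x 3
```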
Our experiments are performed on the publicly available DIR-Lab and POPI benchmark datasets. Our landmark prediction errors are within 1.2 mm, roughly half the error of existing popular 3D registration algorithms and software.
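For reference, a minimal sketch of how such a landmark error can be computed: the mean target registration error (TRE) between predicted and reference landmark positions, converted from voxel indices to millimetres. The landmark arrays and voxel spacing are illustrative assumptions; DIR-Lab and POPI distribute the reference landmarks in their own formats.

```python
import numpy as np

def mean_tre(predicted_vox, reference_vox, spacing_mm):
    """Mean Euclidean distance in millimetres between corresponding landmarks."""
    diff_mm = (np.asarray(predicted_vox) - np.asarray(reference_vox)) * np.asarray(spacing_mm)
    return float(np.mean(np.linalg.norm(diff_mm, axis=1)))

# Toy usage: 300 landmarks, ~half a voxel of prediction noise, 1 x 1 x 2.5 mm voxels.
rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(300, 3)).astype(float)
pred = ref + rng.normal(scale=0.5, size=ref.shape)
print(round(mean_tre(pred, ref, spacing_mm=(1.0, 1.0, 2.5)), 2))
```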