Adaptive Curve Region based Motion Estimation and Motion Visualization of Cardiac Ultrasound Imaging

Tian Cao, Chaowei Tan and Dong C. Liu
School of Computer Science, Sichuan University, Chengdu, China
[email protected], [email protected]

Abstract—This paper presents a novel method for motion estimation and visualization of cardiac ultrasound images. The motion vectors are derived from an adaptive curve region based matching algorithm. The size of the curve regions is adjusted adaptively based on ultrasound system parameters and the position of each region's center, which lies on the scan lines as a sampling point. A Curve Minimum Mean of Absolute Difference (CMMAD) method is then applied to the curve regions to estimate the motion vector field between two successive images. After obtaining the vector field at the sampling points, we employ the Thin-Plate Spline (TPS) transformation function to recover the non-rigid motion and interpolate motion vectors for each pixel in the region of interest (ROI) of the ultrasound image. Finally, we apply Unsteady Flow Line Integral Convolution (UFLIC) to the vector field for motion visualization.

Keywords—Cardiac Ultrasound; Curve Motion Estimation; Thin-Plate Spline; Unsteady Flow; Motion Visualization

I. INTRODUCTION

Ultrasound imaging is one of the most widely used techniques in clinical diagnosis and treatment of illness and injury, and cardiac ultrasound imaging is an important application of medical ultrasound. As proposed in [1], cardiac structure and tissue motion estimated from an ultrasonic image sequence is an important aid for quantifying tissue properties such as elasticity and structural contractility, and visualization technology can extract the motion dynamics and offer more direct information to the clinician analyzing cardiac tissue motion. Speckle tracking is one of the methods for motion estimation in ultrasound images and is commonly implemented by block matching. A series of block-matching algorithms for motion estimation have been proposed in recent years, and most of them are based on rectangular blocks. These methods become less effective when applied to cardiac ultrasound video images acquired by a phased array probe. In this paper, a Curve Minimum Mean of Absolute Difference (CMMAD) method is presented for motion estimation, using adaptive curve regions as the matching criterion. After the displacement of each sampling point is extracted from two successive images, we apply the Thin-Plate Spline (TPS) interpolation algorithm [2] to recover the motion of the heart structure and reconstruct the local motion field. TPS is a basis-function interpolation

method and is widely used in non-rigid image registration and shape matching [3]. Visualization of vector fields is a challenging problem in scientific visualization and computer graphics. The best-known technique is perhaps Line Integral Convolution (LIC), developed by Cabral and Leedom [4]. The basic idea of LIC is to display a texture in which each output pixel value is computed by convolving a white-noise texture along the streamline of the vector field in both directions from the pixel location. The output of the LIC algorithm provides a dense global representation of the flow. However, LIC follows streamline traces in a steady flow field and cannot readily be used for visualizing unsteady flow data. Forssell and Cohen [5] proposed an extension that changes the convolution path from streamlines to pathlines for visualizing unsteady vector fields; however, they reported that flow lines in the output images become ambiguous when the convolution window is set too wide. Shen and Kao [6] proposed a new algorithm for visualizing unsteady flow called Unsteady Flow Line Integral Convolution (UFLIC). UFLIC uses a time-accurate value-scattering scheme and a successive texture feed-forward approach to achieve high temporal and spatial coherence. In this paper, we use the UFLIC method to visualize the time-varying vector fields.

This paper is organized as follows: the adaptive curve region based matching algorithm is described in Section II, and TPS interpolation is discussed in Section III. Unsteady flow line integral convolution is presented in Section IV, and results are demonstrated in Section V. Conclusions are given in Section VI.

II. ADAPTIVE CURVE REGION BASED MOTION ESTIMATION

In this section, the algorithms for adaptive curve region generation and for CMMAD motion estimation over a sequence of successive images are described in detail. Most applications of cardiac ultrasound imaging use a phased array probe to acquire the diagnostic information, because the ribs block ultrasound transmission when large curved or linear probes are used. The sector-shaped B-mode image of heart structures and tissue, such as valves and chambers, is displayed after scan conversion. Because of these special physical and scanning characteristics, the speckle size differs along the axial direction. Cardiac motion estimation using speckle tracking therefore needs an adaptive and robust method.

978-1-4244-2902-8/09/$25.00 ©2009 IEEE


Instead of applying fixed blocks as the matching criterion, we develop an approach that dynamically decides the size of the comparison regions at different positions and uses the CMMAD method on ultrasound sector images. The adaptive comparison region-size decision is described in Part A, and the CMMAD method is presented in Parts B and C.

A. Adaptive Comparison Region Size Decision

To decide the comparison region size, we use the following equation:

s = f_c e_d   (1)

In (1), f_c is a constant coefficient determined from experiments, e_d is a dynamic coefficient of speckle information derived from the current ultrasound system parameters and the sampling-point position, and s is the computed size. To obtain reliable speckle statistics, e_d should be large enough to cover a region with a high density of scatterers. It is shown in [7] that the scatterer density should be at least 5 per pulse width in order to approach a fully developed speckle region with a Rayleigh distribution. The pulse width ΔT is proportional to the standard deviation of a Gaussian-enveloped pulse in the time domain. For a scanning depth of Z mm and N pixels in the axial direction, e_d is:

e_d = \frac{N \Delta T C_0}{2Z}   (2)

where C_0 is the speed of sound.

B. Curve Region as Matching Criterion

According to the region size calculated in Part A, we can construct the curve region used for matching. For example, for a sampling point t(x_0, y_0) with size s_0, we first transform it into polar coordinates t'(r_0, A_0), treating position (w/2, 0) of the original image coordinate system as the origin of the polar coordinates and the y-axis as the polar axis, where w is the width of the image, r_0 is the scanning radius, and A_0 is the scanning angle in radians. We can then compute four points to define the curve region R_0 with center point t':

t'_1(r_0 - s_0/2, A_0 - s_0/(2r_0)),  t'_2(r_0 + s_0/2, A_0 - s_0/(2r_0)),
t'_3(r_0 + s_0/2, A_0 + s_0/(2r_0)),  t'_4(r_0 - s_0/2, A_0 + s_0/(2r_0))
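As a minimal sketch (not the authors' implementation; the numeric inputs are purely illustrative), the adaptive size of Eqs. (1)-(2) and the four corner points above can be computed as:

```python
def adaptive_region_size(f_c, N, delta_T, C0, Z):
    # Eq. (1)-(2): s = f_c * e_d with e_d = N * delta_T * C0 / (2 * Z).
    # All quantities are assumed to be in mutually consistent units.
    e_d = N * delta_T * C0 / (2.0 * Z)
    return f_c * e_d

def curve_region_corners(r0, A0, s0):
    # Corners of the curve region R0 around center t'(r0, A0): the radius
    # varies by +/- s0/2 and the angle by +/- s0/(2*r0), so the arc spanned
    # at radius r0 has length approximately s0.
    dr, dA = s0 / 2.0, s0 / (2.0 * r0)
    return [(r0 - dr, A0 - dA), (r0 + dr, A0 - dA),
            (r0 + dr, A0 + dA), (r0 - dr, A0 + dA)]
```

A region centered deeper in the image (larger r_0) thus spans a smaller angular width for the same physical size, matching the sector geometry of the phased array probe.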

We then transform the curve region R_0 into a regular s_0 × s_0 block B_0 that is convenient for comparison. First, several sampling points with an equal interval s_0 are generated along each sampling line, and the angle between two adjacent sampling lines equals 1/r_0. Second, all sampling points located in R_0 are computed by bilinear interpolation from the original image to obtain the block B_0. If two candidate curve regions R_1 and R_2 are chosen, we need to estimate their similarity. Because the two regions may have different sizes, for example s_1 and s_2, we normalize the two images to the same size, and we therefore use the Mean of Absolute Difference (MAD) instead of

the Sum of Absolute Differences (SAD) to compare the two regions.

C. CMMAD Search Strategy in Polar Coordinates

Using the methods of Parts A and B, we implement the curve region based search in polar coordinates. The search strategy of CMMAD follows the Improved Minimum Sum of Absolute Differences (IMSAD) [8] but is implemented in the polar coordinates defined in Part B. The curve region matching first searches along the scan-line (radial) direction and then along the arc (angular) direction to find the matching region. This search conforms to the scanning process of the phased array probe, in which all scan lines are arranged in a fan with the same interval angle and interpolated into the final B-mode image, so the information used for matching is more accurate.
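A simplified sketch of the MAD comparison step (assuming the candidate regions have already been resampled to a common block size; the helper names are illustrative, not the authors' code):

```python
import numpy as np

def mad(block_a, block_b):
    # Mean of Absolute Differences: unlike SAD, the mean normalizes by
    # block area, so regions resampled from different original sizes
    # remain directly comparable.
    a = block_a.astype(np.float64)
    b = block_b.astype(np.float64)
    return float(np.mean(np.abs(a - b)))

def best_match(ref_block, candidates):
    # CMMAD picks the candidate block with the minimum MAD score.
    scores = [mad(ref_block, c) for c in candidates]
    idx = int(np.argmin(scores))
    return idx, scores[idx]
```

In the full CMMAD search, `candidates` would be the curve regions along the radial and then angular search directions, each regularized to an s_0 × s_0 block by bilinear interpolation.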

III. THIN-PLATE SPLINE FOR NON-RIGID INTERPOLATION

From the above section, the motion vector field at the sampling points is obtained; the whole motion field must then be reconstructed for motion visualization. However, the motion pattern of the heart is very irregular, and local tissue motion, speckle decorrelation, and out-of-plane motion frequently occur during motion estimation and pose a significant challenge for whole motion field reconstruction. Traditional rigid and linear transformation models are therefore not precise enough for this complex cardiac motion pattern. To interpolate the motion vectors of pixels among the sampling points, we introduce the TPS method [2]. TPS has an elegant algebraic expression of the dependence of the physical bending energy of a thin metal plate on point constraints, and has been widely used as the non-rigid transformation model in image registration and shape matching. TPS also has a number of advantages [9]:

• The interpolation is smooth, with derivatives of any order.
• The model has no free parameters that need manual tuning.
• The method has closed-form solutions for both warping and parameter estimation.
• There is a physical explanation of the bending energy of a thin sheet of metal.

Hence this method can better simulate the motion of the heart. In the TPS method, n one-to-one corresponding points, decided in advance and denoted p_i and q_i, i = 1, …, n, satisfy (3):

q_i = u(p_i),  i = 1, …, n   (3)

where u is the transformation function that minimizes the bending energy J_m^d(u) for each component u_k, k = 1, …, d, of the transformation u, where d is the number of dimensions and m is the order of derivatives [10]. We then have the following equation:


J_m^d(u) = \sum_{k=1}^{d} J_m^d(u_k)   (4)

and when d = 2 and m = 2, we get,

J_2^2 = \iint_{\Re^2} \left[ \left( \frac{\partial^2 u_x}{\partial x^2} \right)^2 + 2 \left( \frac{\partial^2 u_x}{\partial x \partial y} \right)^2 + \left( \frac{\partial^2 u_x}{\partial y^2} \right)^2 \right] dx\,dy + \iint_{\Re^2} \left[ \left( \frac{\partial^2 u_y}{\partial x^2} \right)^2 + 2 \left( \frac{\partial^2 u_y}{\partial x \partial y} \right)^2 + \left( \frac{\partial^2 u_y}{\partial y^2} \right)^2 \right] dx\,dy   (5)
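Minimizing this bending energy reduces to one small linear solve per displacement component. The following is a standard numerical sketch of that formulation (illustrative only, not the authors' implementation), using the TPS radial basis U(r) = r² ln(r²) with affine side conditions:

```python
import numpy as np

def tps_kernel(d2, eps=1e-12):
    # U(r) = r^2 ln(r^2) expressed in squared distance d2, with U(0) = 0
    # taken by continuity.
    return np.where(d2 > eps, d2 * np.log(np.maximum(d2, eps)), 0.0)

def tps_fit(points, values):
    # Solve [[K, P], [P^T, 0]] [w; a] = [values; 0] for the TPS weights w
    # and the affine part a = (a_c, a_x, a_y); 'points' has shape (n, 2).
    n = points.shape[0]
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    K = tps_kernel(d2)
    P = np.hstack([np.ones((n, 1)), points])  # columns [1, x, y]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.concatenate([values, np.zeros(3)])
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n:]  # weights w, affine coefficients a

def tps_eval(points, w, a, q):
    # Evaluate one displacement component at query point q = (x, y).
    d2 = np.sum((points - np.asarray(q)) ** 2, axis=-1)
    return a[0] + a[1] * q[0] + a[2] * q[1] + tps_kernel(d2) @ w
```

In this application, `values` would be one component (x or y) of the motion vectors at the sampling points, and the fit is performed once per component; the solution interpolates the control points exactly.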

The solution minimizing J_2^2 in (5) can be written as:

u_x(x, y) = a_{c0} + a_{x0} x + a_{y0} y + \sum_{i=1}^{n} w_{xi} U(\|(x_i, y_i) - (x, y)\|)   (6)

u_y(x, y) = a_{c1} + a_{x1} x + a_{y1} y + \sum_{i=1}^{n} w_{yi} U(\|(x_i, y_i) - (x, y)\|)   (7)

where the parameters a_{c0}, a_{c1}, a_{x0}, a_{x1}, a_{y0} and a_{y1} define the linear affine transformation, and w_{xi} and w_{yi} are the weights of the non-linear elastic interpolation function U:

U(r) = r^2 \ln(r^2)   (8)

Using (6) and (7), we obtain the interpolated point p' = (u_x, u_y) from the original point p = (x, y), and the corresponding motion vector v = p' - p.

IV. UNSTEADY FLOW LINE INTEGRAL CONVOLUTION FOR ULTRASOUND IMAGING

The cardiac motion pattern acquired via motion estimation is quite complicated. Visualizing it is challenging because coherence between the spatial and temporal domains is hard to maintain when the motion is inherently unsteady. The original LIC method is a texture synthesis technique for visualizing steady vector fields. Taking as input a vector field and a white-noise image, the algorithm performs a one-dimensional convolution on the texels using a low-pass filter kernel along both directions of the streamlines. Given a streamline σ, the intensity I for a pixel located at x_0 = σ(s_0) is [4]:

I(x_0) = \int_{s_0 - L/2}^{s_0 + L/2} k(s - s_0) T(\sigma(s)) \, ds   (9)

where T is the input noise texture, k is the filter kernel, s is the arc length parameterizing the streamline curve, and L is the filter kernel length [4].

UFLIC [6] is a LIC method for visualizing time-varying vector fields. Starting from a white-noise texture, the UFLIC algorithm successively advects the texture to create a sequence of flow images. The algorithm has two main components: the time-accurate value scattering scheme and the successive texture feed-forward strategy.

A. Time-Accurate Value Scattering

The time-accurate value scattering scheme is a convolution method modified from the traditional value-gathering convolution. It computes a convolution image for each step of the unsteady flow data by advecting the input texture over time. The input texture can be either a white-noise image or the convolution result from the preceding step. Given an input texture, every pixel serves as a seed particle. From its pixel position at starting time t, the seed particle advects forward in the field, both in space and in time, following a pathline defined as:

p(t + \Delta t) = p(t) + \int_{t}^{t + \Delta t} v(p(t), t) \, dt   (10)

where p(t) is the position of the particle at time t and v(p(t), t) is the vector at p(t) at time t. Particle traces are generated by integrating this equation. At each time step, a scattering process occurs in which a seed released from each pixel scatters the texture value to the pixels along the newly advected pathline. The scattering length of the particle is defined by its life span, which usually covers several time steps. To receive the scattered values, each pixel keeps a buffer with several buckets corresponding to different times. Each bucket holds two values: the accumulated pixel value I_accum and the accumulated weight W_accum. Scattering at the n-th integration step adds the image value I and its corresponding weight W to the bucket of the pixel:

I_{accum} = I_{accum} + I,  W_{accum} = W_{accum} + W

The final convolution value C is computed as:

C = I_{accum} / W_{accum}   (11)

An output image is obtained from the values that each pixel has received and stored in its buckets.

B. Successive Feed-Forward

To enhance temporal coherence, a successive feed-forward process is used in UFLIC. Instead of reusing the noise texture, the output of the previous convolution is used as the input texture for the next convolution. In this process the contrast among flow lines is gradually reduced over time, because each convolution step acts as a low-pass filter. UFLIC therefore applies a 2D high-pass filter to the input texture, which enhances the flow lines and maintains the contrast. Fig. 1 gives an overview of the UFLIC algorithm.
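The pathline integration of Eq. (10) can be sketched with a simple forward-Euler step (a coarse stand-in for illustration; UFLIC itself uses time-accurate integration, and the velocity callback here is hypothetical):

```python
import numpy as np

def advect_pathline(p0, t0, dt, n_steps, velocity):
    # Forward-Euler integration of Eq. (10):
    # p(t + dt) = p(t) + integral over [t, t + dt] of v(p(t), t) dt,
    # approximated by one Euler step per dt. 'velocity(p, t)' returns a
    # 2D vector; a production tracer would use a higher-order scheme.
    path = [np.asarray(p0, dtype=float)]
    t = t0
    for _ in range(n_steps):
        p = path[-1]
        path.append(p + dt * np.asarray(velocity(p, t), dtype=float))
        t += dt
    return np.array(path)
```

In the value-scattering step, each seed pixel would scatter its texture value to the pixels visited along such a path over its life span.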


ACKNOWLEDGMENT The authors thank Alex Huang and Xiaohui Zuo of Saset Healthcare Inc. for obtaining the cardiac ultrasound images.

Figure 2. Vector Field 1

Figure 5. Motion visualization 1

Figure 3. Vector Field 2

Figure 6. Motion visualization 2

Figure 4. Vector Field 3

Figure 7. Motion visualization 3

Figure 1. The UFLIC algorithm flowchart

V. RESULTS AND DISCUSSION

To verify the proposed adaptive curve region based motion estimation and motion visualization algorithms, we tested a series of successive cardiac ultrasound images using in vivo data acquired with a modern digital ultrasound scanner. The standard test set contains 30 images. The vector fields generated by our algorithms are displayed in Figs. 2 to 4. The detected motion vectors are superimposed on the original images. The motion pattern described by the vector fields matches the motion between the test data frames well. The visualization of the corresponding vector fields using UFLIC is shown in Figs. 5 to 7. The lifespan of the UFLIC is 4 time steps, meaning the pixel value from the current image frame scatters forward about 4 image frames. The UFLIC image combined with the motion fields displays motion patterns with more information about direction and singularities. It also displays the interface between different velocity fields, which may not be obtainable from conventional B-mode and color-flow ultrasound imaging [1].

VI. CONCLUSIONS AND FUTURE WORK

In this paper, we present an adaptive curve region matching algorithm for motion estimation. Each curve region is regularized to a rectangular block for matching. This CMMAD motion estimation method is well suited to cardiac ultrasound images acquired by a phased array probe. We use TPS non-linear interpolation to obtain a precise vector field from the sampling points, and UFLIC to visualize the time-varying vector fields and create highly coherent flow animations. In future work, more robust motion estimation methods and more accurate interpolation algorithms need to be considered for situations of speckle decorrelation and tissue deformation. Moreover, new visualization methods could be adopted for displaying the motion vector fields of ultrasound imaging, and high-performance computational resources, such as the Graphics Processing Unit (GPU), should be employed for real-time processing.

REFERENCES

[1] D. C. Liu, L. L. Hou and P. S. Liu, “Motion Visualization of Ultrasound Imaging,” LNCS, Advances in Visual Computing, vol. 3804, pp. 653-658, Nov. 2005.
[2] F. L. Bookstein, “Principal Warps: Thin-Plate Splines and the Decomposition of Deformations,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 11, no. 6, pp. 567-585, June 1989.
[3] B. Zitová and J. Flusser, “Image registration methods: a survey,” Image and Vision Computing, vol. 21, no. 11, pp. 977-1000, Oct. 2003.
[4] B. Cabral and C. Leedom, “Imaging Vector Fields Using Line Integral Convolution,” Proc. ACM SIGGRAPH ’93, pp. 263-270, 1993.
[5] L. K. Forssell and S. D. Cohen, “Using Line Integral Convolution for Flow Visualization: Curvilinear Grids, Variable-Speed Animation, and Unsteady Flows,” IEEE Trans. Visualization and Computer Graphics, vol. 1, no. 2, pp. 133-141, June 1995.
[6] H.-W. Shen and D. L. Kao, “A New Line Integral Convolution Algorithm for Visualizing Time-Varying Flow Fields,” IEEE Trans. Visualization and Computer Graphics, vol. 4, no. 2, pp. 98-108, Apr.-June 1998.
[7] T. A. Tuthill, R. H. Sperry and K. J. Parker, “Deviation from Rayleigh Statistics in Ultrasonic Speckle,” Ultrasonic Imaging, vol. 10, no. 2, pp. 81-89, Apr. 1988.
[8] C. Tan and D. C. Liu, “Image Registration Based Wide-Field-of-View Method in Ultrasound Imaging,” Second International Conference on Bioinformatics and Biomedical Engineering, Shanghai, China, 2008.
[9] Wikipedia, “Thin Plate Spline,” retrieved Nov. 2, 2008, from http://en.wikipedia.org/wiki/Thin_plate_spline
[10] K. Rohr, H. S. Stiehl et al., “Landmark-Based Elastic Registration Using Approximating Thin-Plate Splines,” IEEE Trans. Medical Imaging, vol. 20, no. 6, pp. 526-534, June 2001.

