ADAPTIVE MULTISCALE OPTICAL FLOW ESTIMATION

Jian Li¹, Christopher P. Benton¹, Stavri G. Nikolov², Nicholas E. Scott-Samuel¹

¹ Department of Experimental Psychology, 12A Priory Road, BS8 1TU, U.K.
² Department of Electrical and Electronic Engineering, MVB, BS8 1UB, U.K.
University of Bristol, U.K.
ABSTRACT
The current paper presents a novel adaptive multiscale scheme for estimating optical flow from image sequences. The scheme models estimation uncertainties and uses them to reduce the influence of unreliable intermediate estimates on accuracy. Experimental results show that the proposed method provides more accurate estimates, for both small and large motions, than a standard multiscale scheme in which an increment is added to an intermediate estimate regardless of estimation certainty.
Index Terms— Optical flow, multiscale, pyramid, least squares, uncertainty
1. INTRODUCTION
The current paper presents a novel adaptive multiscale scheme
to recover optical flows from image sequences. In a stan-
dard multiscale scheme, for example [1], a warped image at a
finer pyramid level is produced using estimates from a coarser
pyramid level. By using the warped image and video image at
the same level, a velocity increment is estimated which is used
as a correction to the velocity estimate from a coarser level.
In the standard scheme, an increment can be corrupted by noise at the finer scale, and once an increment is erroneously estimated, the scheme cannot recover from the error [2]. To address this problem, Simoncelli [2] modeled cross-scale refinement as a stochastic process and applied Kalman filtering to ensure the optimality of intermediate estimates.
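The coarse-to-fine propagation in the standard scheme can be sketched as follows. This is a minimal illustration of the general idea, not the implementation from [1]: `estimate_increment` is a hypothetical per-level estimator supplied by the caller, and the pyramid here uses simple 2×2 block averaging in place of proper Gaussian smoothing.

```python
import numpy as np

def gaussian_pyramid(img, levels):
    """Coarse stand-in for a Gaussian pyramid via 2x2 block averaging."""
    pyr = [img]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape
        pyr.append(pyr[-1][:h - h % 2, :w - w % 2]
                   .reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyr

def coarse_to_fine_flow(img1, img2, levels, estimate_increment):
    """Standard multiscale scheme: the coarse estimate is upsampled and
    doubled (a pixel one level up spans two finer pixels), then corrected
    by the increment estimated at the current level."""
    p1 = gaussian_pyramid(img1, levels)
    p2 = gaussian_pyramid(img2, levels)
    flow = np.zeros((2,) + p1[-1].shape)        # start at the coarsest level
    for lvl in range(levels - 1, -1, -1):
        if flow.shape[1:] != p1[lvl].shape:     # move to the next finer level
            flow = 2.0 * np.repeat(np.repeat(flow, 2, axis=1), 2, axis=2)
        flow = flow + estimate_increment(p1[lvl], p2[lvl], flow)
    return flow
```

Note that the increment is added unconditionally at every level; this is exactly the behavior the adaptive scheme below revises.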
A second problem which is less frequently addressed is
the influence of the number of pyramid levels on estimation
accuracy, especially for small displacements. In real applications, the largest number of pyramid levels should generally be used (within the limit of the image size) to cover all possible displacements. However, because of the down-sampling procedure used in constructing a Gaussian pyramid [3], a small velocity in the original images can be scaled down to a tiny velocity at the coarsest scale. In this case, image noise may introduce large errors into the coarsest estimate, and these errors persist through the refinements at finer scales. As we show below, the standard scheme cannot produce accurate estimates in this case.

This work was funded by the UK MOD Data and Information Fusion Defence Technology Centre.
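A back-of-the-envelope illustration (not a result from the paper): each 2× downsampling halves the apparent displacement, so a sub-pixel motion at the finest level shrinks geometrically with the number of levels.

```python
# Each 2x downsampling halves apparent displacement: d_coarse = d / 2**(L-1).
def coarsest_displacement(d, levels):
    return d / 2 ** (levels - 1)

# A 0.5-pixel motion seen through a 5-level pyramid is ~0.031 px at the top,
# small enough to be swamped by derivative noise.
print(coarsest_displacement(0.5, 5))  # 0.03125
```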
The proposed adaptive scheme addresses the above problems by improving the accuracy of intermediate estimates at all levels. The scheme assumes a stochastic process for the cross-scale velocity refinement, in which estimation uncertainties are modeled as the variances of intermediate estimates obtained from a least-squares estimation scheme. By adaptively reducing these variances, improved accuracy can be achieved. Our experiments show that the proposed technique produces more accurate estimates than the standard scheme for both small and large displacements. Moreover, the proposed scheme ensures that using a large number of pyramid levels does not introduce serious errors into small-displacement estimates, and the scheme is suitable for a procedure in which both cross-scale and same-scale refinements are adopted.
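The flavor of such an uncertainty-weighted refinement can be sketched with a generic Kalman-style combination. This is an illustration of the idea, not the paper's exact update: `var_prior` and `var_meas` are assumed scalar variances of the propagated coarse estimate and the new increment-based measurement.

```python
def fuse(prior, var_prior, meas, var_meas):
    """Variance-weighted combination: the less certain input gets less
    weight, so an unreliable increment cannot drag a confident coarse
    estimate far from its value."""
    k = var_prior / (var_prior + var_meas)   # gain in [0, 1]
    fused = prior + k * (meas - prior)
    var_fused = (1.0 - k) * var_prior        # variance shrinks after fusion
    return fused, var_fused
```

With equal variances the inputs are averaged, e.g. `fuse(1.0, 4.0, 3.0, 4.0)` returns `(2.0, 2.0)`; with a confident prior and a noisy measurement, the estimate barely moves.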
2. ADAPTIVE MULTISCALE ESTIMATION
2.1. Optical Flow Estimation
If a pixel moves from (x, y, t) to (x + u, y + v, t + 1), we assume:

I(x + u, y + v, t + 1) + c = I(x, y, t),   (1)
∇I(x + u, y + v, t + 1) = ∇I(x, y, t),   (2)
where I denotes image intensity, and u and v are the velocities in the x and y directions, respectively. ∇ is the spatial gradient operator, and c is a parameter compensating for temporal variation of intensity. Eq.(1) models the constraint on image intensities, in which an intensity variation c is allowed [4], while Eq.(2) models the constraint on the spatial derivatives, which are also assumed to be conserved over time. Applying a Taylor expansion to the above models, we obtain a linear expression in the unknown parameters w = [u, v, c]^T:
\[
A w = \begin{bmatrix} I_x & I_y & 1 \\ I_{xx} & I_{xy} & 0 \\ I_{xy} & I_{yy} & 0 \end{bmatrix} w \approx - \begin{bmatrix} I_t \\ I_{xt} \\ I_{yt} \end{bmatrix} = -b, \tag{3}
\]
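A per-pixel solve of Eq.(3) might look like the following sketch. In practice the constraints would be pooled over a neighborhood before the least-squares solve; the scalar derivative arguments are assumed precomputed, e.g. by finite differences.

```python
import numpy as np

def solve_flow_pixel(Ix, Iy, It, Ixx, Ixy, Iyy, Ixt, Iyt):
    """Solve A w ~ -b of Eq.(3) for w = [u, v, c] at a single pixel."""
    A = np.array([[Ix,  Iy,  1.0],
                  [Ixx, Ixy, 0.0],
                  [Ixy, Iyy, 0.0]])
    b = np.array([It, Ixt, Iyt])
    w, *_ = np.linalg.lstsq(A, -b, rcond=None)
    return w  # estimated [u, v, c]
```

For derivatives generated from a known w = [1, 2, 0.5], the solve recovers that vector exactly when A is well conditioned.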
II - 509 1-4244-1437-7/07/$20.00 ©2007 IEEE ICIP 2007