COMPUTATIONAL FILTER-APERTURE APPROACH FOR SINGLE-VIEW MULTI-FOCUSING
Vivek Maik, Dohee Cho, Sangjin Kim, Donghwan Har and Joonki Paik
Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 156-756,
South Korea. (vivek5681@wm.cau.ac.kr)
ABSTRACT
Most focusing techniques must estimate depth information to ensure
that the object of interest is at an appropriate distance for full
frontal focus. Computational cameras that can variably focus
different regions of a scene with a large depth of field have been
proposed. In this paper we propose a full auto-focusing algorithm
that uses a computational camera, requires only a single input
image, and involves no digital image restoration. The proposed
computational camera uses multiple filter apertures, one per color
channel, which acquire three shifted views of the scene in the RGB
color planes. Any region can be brought into focus by shifting the
color channels into alignment. Depth map estimation extracts
different regions from these channel-shifted images, which are then
fused to produce a final image free of focal blur. Experimental
results show the performance and feasibility of the proposed
algorithm for auto-focusing images containing one or more
differently out-of-focus objects.
Index Terms— Image restoration, image classification.
1. INTRODUCTION
Demand for digital auto-focusing techniques is rapidly increasing
in many visual applications, such as camcorders, digital cameras,
and video surveillance systems. Conventional cameras have come
a long way in dealing with problems associated with focal settings
and blur. Nevertheless, focal blur caused by the varying distance of
objects from the lens remains a problem that conventional cameras
must deal with. With focus set at the near, mid, or far region of the
scene, the captured image tends to have only that particular region
in focus, whereas the remaining regions are out of focus. Post-
processing steps in the form of blur restoration and multiple-image
fusion have been proposed to deal with the focusing problem.
Recently, computational cameras have been developed that capture
additional information about the scene which, when combined with
post-processing, can overcome several limitations of conventional
imaging and enable applications including refocusing, increased
dynamic range, depth-guided editing, and variable lighting and
reflectance. The scope of this paper is the first of these factors,
namely the focal blur caused by the mismatch between the distance
of the object and the focal length of the lens. In this paper we
propose a combined hardware-software approach that models focal
blur as channel-dependent depth maps, which are then used to
remove the focal blur from images. The former (hardware) refers
to the computational camera, which employs a novel multiple
filter-aperture (FA) model to separate and distribute the blur into
different color channels, as shown in Fig. 1.
Fig. 1. (a) The modified camera configuration with the proposed
multiple color filter-aperture. A normal aperture in a traditional
camera, shown in (b), is replaced with the modified aperture
consisting of red (R), green (G), and blue (B) color filters as shown
in (c).
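The refocusing idea behind the filter aperture can be sketched as follows. This is an illustrative Python/NumPy sketch, not the authors' implementation; the function name and shift parameters are hypothetical. Because each color filter sees the scene through a laterally displaced sub-aperture, defocus appears as a per-channel disparity, and undoing that disparity for a chosen depth brings objects at that depth into alignment and focus:

```python
import numpy as np

def refocus_by_channel_shift(rgb, dx_r, dx_b):
    """Refocus by aligning the R and B channels to G with horizontal shifts.

    rgb: H x W x 3 image captured through the color filter aperture.
    dx_r, dx_b: horizontal disparities (in pixels) of the R and B
    channels for the depth to be brought into focus (hypothetical
    values; in practice they depend on object distance).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    r_aligned = np.roll(r, dx_r, axis=1)  # shift red channel horizontally
    b_aligned = np.roll(b, dx_b, axis=1)  # shift blue channel horizontally
    return np.stack([r_aligned, g, b_aligned], axis=-1)
```

Sweeping the shift values over a range of disparities yields the stack of differently focused images referred to in Fig. 2.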
The latter (software) refers to the algorithmic part, which involves
variable focusing, depth map estimation, and fusion to generate a
fully focused image from a single FA input. The block diagram of
the proposed single-view auto-focusing algorithm is shown in
Fig. 2.
Fig. 2. (a) Block diagram of the proposed algorithm using the
computational camera, (b) five objects, including a bunny (orange),
a flower (red), a star (yellow), a tree (green), and a cloud (sapphire
green), located at different distances from the camera, (c)-(f)
captured image and its respective color channels, (g)-(k) four
differently focused images with optimal focusing at the bunny, the
flower, the tree, and the cloud, respectively, and the channel depth.
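The fusion step in the block diagram can likewise be sketched. The following is a minimal illustration, not the paper's exact method: assuming a stack of differently refocused images, a per-pixel sharpness measure selects, for each pixel, the image in which that pixel is best focused; the resulting index map doubles as a coarse depth map, since each refocus setting corresponds to a scene depth.

```python
import numpy as np

def local_sharpness(gray):
    """Per-pixel focus measure: squared discrete Laplacian, box-smoothed."""
    lap = (np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
           + np.roll(gray, 1, 1) + np.roll(gray, -1, 1) - 4.0 * gray)
    sq = lap ** 2
    # 3x3 box smoothing via shifted sums (edges wrap; acceptable for a sketch)
    for ax in (0, 1):
        sq = (np.roll(sq, 1, ax) + sq + np.roll(sq, -1, ax)) / 3.0
    return sq

def fuse_focus_stack(stack):
    """Fuse differently refocused images by per-pixel sharpest selection.

    stack: list of H x W x 3 float images, each focused at a different
    depth. Returns (fused image, index map); the index map serves as
    a coarse depth map.
    """
    measures = np.stack([local_sharpness(img.mean(axis=-1)) for img in stack])
    depth = measures.argmax(axis=0)  # index of the sharpest image per pixel
    fused = np.zeros_like(stack[0])
    for i, img in enumerate(stack):
        fused[depth == i] = img[depth == i]  # copy pixels won by image i
    return fused, depth
```

A real implementation would regularize the index map (e.g., by smoothing or region-wise voting) before fusion to avoid isolated mislabeled pixels.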
1541 978-1-4244-5654-3/09/$26.00 ©2009 IEEE ICIP 2009