Enabling Warping on Stereoscopic Images
Yuzhen Niu, Wu-Chi Feng, and Feng Liu
Portland State University
Abstract
Warping is a basic image processing technique. Directly applying existing monocular image warping techniques to stereoscopic images is problematic as it often introduces vertical disparities and damages the original disparity distribution. In this paper, we show that these problems can be solved by appropriately warping both the disparity map and the two images of a stereoscopic image. We accordingly develop a technique for extending existing image warping algorithms to stereoscopic images. This technique divides stereoscopic image warping into three steps. Our method first applies the user-specified warping to one of the two images. Our method then computes the target disparity map according to the user-specified warping. The target disparity map is optimized to preserve the perceived 3D shape of image content after image warping. Our method finally warps the other image using a spatially-varying warping method guided by the target disparity map. Our experiments show that our technique enables existing warping methods, ranging from parametric global warping to non-parametric spatially-varying warping, to be effectively applied to stereoscopic images.
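To make the three-step pipeline concrete, the sketch below shows one possible reading of it in Python (NumPy/SciPy), assuming a rectified grayscale stereo pair, a known left-view disparity map, and a user warp given as a 3x3 homography. The function names, the simple disparity heuristic in step 2, and the per-pixel resampling in step 3 are illustrative assumptions; they stand in for the paper's shape-preserving disparity optimization and spatially-varying warp rather than reproduce them.

```python
import numpy as np
from scipy.ndimage import map_coordinates


def warp_by_homography(image, H):
    """Backward-warp a single-channel image by a 3x3 homography H
    (H maps source coordinates to output coordinates)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)]).astype(float)
    src = np.linalg.inv(H) @ pts                  # output pixel -> source position
    src /= src[2]
    return map_coordinates(image, [src[1].reshape(h, w),
                                   src[0].reshape(h, w)], order=1)


def warp_stereo_pair(left, right, disparity, H):
    """Illustrative three-step stereoscopic warping (all arrays are HxW floats).
    `disparity` holds left-view disparities d, i.e. x_right = x_left - d."""
    h, w = left.shape
    Hinv = np.linalg.inv(H)

    # Step 1: apply the user-specified warp to the left view.
    left_warped = warp_by_homography(left, H)

    # Step 2: compute a target disparity map for the warped pair. Here we
    # simply relocate the disparities by the same warp and scale them by the
    # warp's horizontal scale; the paper instead solves an optimization that
    # preserves the perceived 3D shape of the content.
    disp_target = warp_by_homography(disparity, H) * H[0, 0]

    # Step 3: resample the original right view so that, for every pixel of the
    # warped left view, its correspondence lands at the target disparity
    # (a per-pixel lookup standing in for the paper's spatially-varying warp).
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xl = xs + disp_target                         # matching warped-left column
    pts = np.stack([xl.ravel(), ys.ravel(), np.ones(h * w)])
    src = Hinv @ pts                              # back to the original left view
    src /= src[2]
    sx, sy = src[0].reshape(h, w), src[1].reshape(h, w)
    d_src = map_coordinates(disparity, [sy, sx], order=1)
    right_warped = map_coordinates(right, [sy, sx - d_src], order=1)

    return left_warped, right_warped, disp_target
```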
Paper
Yuzhen Niu, Wu-Chi Feng, and Feng Liu. Enabling Warping on Stereoscopic Images.
ACM Transactions on Graphics (Proceedings of ACM SIGGRAPH Asia 2012). PDF
Video
Supplementary material: zip html
Acknowledgements
We would like to thank Rob Crockett and Flickr users, including -ytf-, turbguy, Dan (aka firrs), jaysdesk, tanj3d, and fossilmike, for letting us use their photos under a Creative Commons license or with their permission. The “Elephants Dream” frames are from Blender Foundation / Netherlands Media Art Institute / www.elephantsdream.org, used under a Creative Commons license; the stereo version is from YouTube user geekboydischead, also under a Creative Commons license. This work was supported by NSF CNS-1205746 and a Portland State University Faculty Enhancement Grant. The video demo is voiced by Jeremy Silver.