

Poster

Novel View Synthesis with View-Dependent Effects from a Single Image

Juan Luis Gonzalez Bello · Munchurl Kim

Arch 4A-E Poster #72
Thu 20 Jun, 10:30 a.m. – noon PDT

Abstract:

In this paper, we are the first to consider view-dependent effects (VDE) in the problem of novel view synthesis (NVS) from a single image. To this end, we propose to exploit camera motion priors in NVS and to model VDE as negative disparity in the scene. Recognizing that specularities "follow" the camera motion, we infuse VDE into the input images by aggregating input pixel colors along the negative-depth regions of the epipolar lines. We also propose a "relaxed volumetric rendering" approximation that computes the densities in a single pass, improving the efficiency of single-image NVS. Our method can learn single-image NVS from image sequences alone, making it fully self-supervised; to the best of our knowledge, it is the first method to require neither depth nor camera pose annotations. We present extensive experimental results showing that our method can learn NVS with VDE, outperforming SOTA single-view NVS methods on the RealEstate10k and MannequinChallenge datasets.
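
To make the VDE-infusion idea concrete, the following is a minimal, hypothetical PyTorch sketch that averages colors sampled along the negative-disparity side of each pixel's epipolar line. The function name, tensor shapes, the fixed disparity range, and the simple mean aggregation are all illustrative assumptions; the paper's actual method derives the epipolar geometry from estimated camera motion and learns the aggregation rather than using a plain mean.

```python
import torch
import torch.nn.functional as F

def infuse_vde(image, epi_dir, num_samples=8, max_neg_disp=0.05):
    # image:   (B, 3, H, W) single input view
    # epi_dir: (B, 2) per-image 2D epipolar direction in normalized
    #          coordinates, assumed to come from a camera-motion prior
    B, _, H, W = image.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1.0, 1.0, H, device=image.device),
        torch.linspace(-1.0, 1.0, W, device=image.device),
        indexing="ij",
    )
    base = torch.stack([xs, ys], dim=-1).unsqueeze(0).expand(B, H, W, 2)

    acc = torch.zeros_like(image)
    for i in range(1, num_samples + 1):
        # Negative disparity: step *against* the epipolar direction, so the
        # aggregated highlights appear to "follow" the camera motion.
        offset = -(max_neg_disp * i / num_samples) * epi_dir.view(B, 1, 1, 2)
        acc = acc + F.grid_sample(
            image, base + offset, align_corners=True, padding_mode="border"
        )
    return acc / num_samples  # simple mean; the paper learns the aggregation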
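
The efficiency claim about computing densities in a single pass can be illustrated with standard MPI-style volumetric compositing, where all per-plane densities come from one network forward pass rather than per-sample ray queries. This is a generic sketch of that pattern, not the paper's exact "relaxed volumetric rendering" formula, which may differ.

```python
def relaxed_volume_render(colors, sigmas, deltas):
    # colors: (B, D, 3, H, W) color per depth plane
    # sigmas: (B, D, 1, H, W) densities, all predicted in one network pass
    # deltas: (D,) spacing between consecutive depth planes (assumed)
    alphas = 1.0 - torch.exp(-sigmas * deltas.view(1, -1, 1, 1, 1))
    # Transmittance: probability a ray reaches plane d unoccluded.
    trans = torch.cumprod(
        torch.cat([torch.ones_like(alphas[:, :1]), 1.0 - alphas[:, :-1]],
                  dim=1),
        dim=1,
    )
    weights = trans * alphas               # contribution of each plane
    return (weights * colors).sum(dim=1)   # (B, 3, H, W) rendered view
```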
