Poster

ESR-NeRF: Emissive Source Reconstruction Using LDR Multi-view Images

Jinseo Jeong · Junseo Koo · Qimeng Zhang · Gunhee Kim

Arch 4A-E Poster #432
Wed 19 Jun 10:30 a.m. PDT — noon PDT

Abstract:

Existing NeRF-based inverse rendering methods assume that scenes are illuminated exclusively by distant light sources, neglecting the potential influence of emissive sources within a scene. In this work, we confront this limitation using LDR multi-view images captured with emissive sources turned on and off. Two key issues must be addressed: 1) ambiguity arising from the limited dynamic range combined with unknown lighting details, and 2) the high computational cost of backtracing, in volume rendering, the light paths that produce the final object colors. We present a novel approach, ESR-NeRF, which leverages neural networks as learnable functions to represent ray-traced fields. By training networks to satisfy light transport segments, we regulate outgoing radiances, progressively identifying emissive sources while remaining aware of reflection areas. Results on scenes containing emissive sources with various properties demonstrate the superiority of ESR-NeRF both qualitatively and quantitatively. Our approach also extends to scenes without emissive sources, achieving lower Chamfer distance (CD) on the DTU dataset.
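To make the setting concrete, the sketch below shows standard NeRF volume-rendering quadrature where each sample's outgoing radiance is split into a reflected and an emitted term. This decomposition and all names are illustrative assumptions for exposition; the paper's actual ray-traced-field formulation and light-transport constraints are more involved.

```python
import numpy as np

def composite(sigmas, reflected, emitted, deltas):
    """Volume-rendering quadrature along one ray.

    sigmas:    (N,) densities at the samples
    reflected: (N, 3) reflected radiance per sample (illustrative split)
    emitted:   (N, 3) emitted radiance per sample (illustrative split)
    deltas:    (N,) distances between adjacent samples
    """
    # Opacity of each ray segment: 1 - exp(-sigma * delta)
    alpha = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance up to each sample (product of survival probabilities)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    # Outgoing radiance is the sum of the two components
    radiance = reflected + emitted
    return (weights[:, None] * radiance).sum(axis=0)

# A fully opaque first sample dominates the composited color
color = composite(
    sigmas=np.array([1e9, 1.0]),
    reflected=np.array([[0.2, 0.0, 0.0], [1.0, 1.0, 1.0]]),
    emitted=np.array([[0.3, 0.0, 0.0], [0.0, 0.0, 0.0]]),
    deltas=np.array([1.0, 1.0]),
)
```

Backtracing which emitted terms contribute to a pixel requires evaluating many such integrals per bounce, which is the computational cost the abstract refers to.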
