Poster
TMO: Textured Mesh Acquisition of Objects With a Mobile Device by Using Differentiable Rendering
Jaehoon Choi · Dongki Jung · Taejae Lee · Sangwook Kim · Youngdong Jung · Dinesh Manocha · Donghwan Lee
West Building Exhibit Halls ABC 019
We present a new pipeline for acquiring a textured mesh in the wild with a single smartphone that provides access to images, depth maps, and valid camera poses. Our method first introduces RGBD-aided structure from motion, which yields filtered depth maps and refines camera poses guided by the corresponding depth. Then, we adopt a neural implicit surface reconstruction method that allows for high-quality meshes and develop a new training process that applies regularization derived from classical multi-view stereo methods. Moreover, we apply differentiable rendering to fine-tune incomplete texture maps and generate textures that are perceptually closer to the original scene. Our pipeline can be applied to common objects in the real world without the need for either in-the-lab environments or accurate mask images. We demonstrate results on captured objects with complex shapes and validate our method quantitatively against existing 3D reconstruction and texture mapping methods.
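To illustrate the texture fine-tuning step, the following is a minimal sketch (not the authors' code) of optimizing a texture atlas through a differentiable rendering loss. It assumes the mesh has already been rasterized offline, so each captured image comes with per-pixel UV coordinates into the atlas and a visibility mask; under that assumption, the differentiable "render" reduces to bilinear texture sampling, and the texture is updated by a photometric loss against the captured photos. The function name and tensor layout are hypothetical.

```python
# Minimal sketch of texture fine-tuning via differentiable rendering.
# Assumption: per-pixel UV coordinates and visibility masks were precomputed
# by rasterizing the reconstructed mesh into each calibrated camera view.
import torch
import torch.nn.functional as F

def fine_tune_texture(texture, uv_maps, images, masks, steps=500, lr=1e-2):
    """
    texture : (1, 3, Ht, Wt) initial (possibly incomplete) texture atlas
    uv_maps : (N, H, W, 2)   per-pixel UV coords in [-1, 1] from rasterization
    images  : (N, 3, H, W)   captured RGB images
    masks   : (N, 1, H, W)   1 where the object is visible, 0 elsewhere
    """
    texture = texture.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([texture], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        # Differentiable "rendering": sample the texture at each pixel's UV.
        rendered = F.grid_sample(
            texture.expand(len(images), -1, -1, -1),
            uv_maps, mode="bilinear", align_corners=True)
        # Photometric L1 loss against the photos, restricted to the object.
        loss = (masks * (rendered - images).abs()).mean()
        loss.backward()
        optimizer.step()
    return texture.detach()
```

Because the sampling is differentiable, gradients of the photometric loss flow back into the texture atlas, filling in and sharpening regions that the initial texturing left incomplete; the paper's actual pipeline differentiates through the full mesh rendering rather than precomputed UV maps.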