

Poster

SPU-PMD: Self-Supervised Point Cloud Upsampling via Progressive Mesh Deformation

Yanzhe Liu · Rong Chen · Yushi Li · Yixi Li · Xuehou Tan

Arch 4A-E Poster #34
Wed 19 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract:

Despite the success of recent upsampling approaches, generating high-resolution point sets with uniform distribution and meticulous structures remains challenging. Unlike existing methods that take only the spatial information of the raw data into account, we regard point cloud upsampling as generating dense point clouds from a deformable topology. Motivated by this, we present SPU-PMD, a self-supervised topological mesh deformation network for 3D densification. As a cascaded framework, our architecture comprises a series of coarse mesh interpolators and mesh deformers. At each stage, the mesh interpolator first produces an initial dense point cloud via mesh interpolation, which allows the model to better perceive the primitive topology. Meanwhile, the deformer infers the morphing by estimating the movements of mesh nodes and reconstructs the descriptive topology structure. By associating mesh deformation with feature expansion, this module progressively refines the surface uniformity and structural details of the point clouds. To demonstrate the effectiveness of the proposed method, we conduct extensive quantitative and qualitative experiments on synthetic and real-scanned 3D data, and we compare against state-of-the-art techniques to further illustrate the superiority of our network. The project page is: https://github.com/lyz21/SPU-PMD
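The cascaded "interpolate, then deform" idea can be illustrated with a toy NumPy sketch. This is not the authors' implementation: SPU-PMD learns the deformation with a neural network, whereas here the interpolator simply inserts midpoints between nearest neighbours (a crude stand-in for mesh interpolation) and the "deformer" nudges each point toward its local neighbourhood centroid; the function names, the neighbour count `k`, and the step size are all illustrative assumptions.

```python
import numpy as np

def interpolate_stage(points, k=4):
    """Coarse densification: insert midpoints between each point and its
    k nearest neighbours (a stand-in for the paper's mesh interpolation)."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    idx = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours
    mids = (points[:, None] + points[idx]) / 2  # midpoints, shape (N, k, 3)
    return np.concatenate([points, mids.reshape(-1, 3)])

def deform_stage(points, step=0.5, k=4):
    """Toy 'deformer': pull each point toward its local neighbourhood
    centroid to smooth the surface. SPU-PMD instead *learns* these
    per-node offsets from features; this hand-crafted rule is only a
    placeholder to show where the learned morphing would act."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    idx = np.argsort(d, axis=1)[:, :k]
    centroids = points[idx].mean(axis=1)
    return points + step * (centroids - points)

def upsample(points, stages=2):
    """Cascade: each stage densifies, then refines point positions."""
    for _ in range(stages):
        points = deform_stage(interpolate_stage(points))
    return points

sparse = np.random.rand(64, 3)
dense = upsample(sparse)
print(sparse.shape, dense.shape)  # each stage multiplies the count by 1 + k
```

With `k=4`, each stage grows N points to 5N, so two stages take 64 points to 1600; the real network would also propagate features alongside the coordinates.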
