

Poster

Neural 3D Strokes: Creating Stylized 3D Scenes with Vectorized 3D Strokes

Haobin Duan · Miao Wang · Yanxun Li · Yong-Liang Yang

Arch 4A-E Poster #39
[ Project Page ] [ Paper PDF ] [ Slides ] [ Poster ]
Wed 19 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract:

We present Neural 3D Strokes, a novel technique for generating stylized images of a 3D scene at arbitrary novel views from multi-view 2D images. Unlike existing methods, which apply stylization to trained neural radiance fields at the voxel level, our approach draws inspiration from image-to-painting methods, simulating the progressive painting process of human artwork with vector strokes. We develop a palette of stylized 3D strokes from basic primitives and splines, and cast 3D scene stylization as a multi-view reconstruction process based on these 3D stroke primitives. Instead of directly searching for the parameters of these 3D strokes, which would be too costly, we introduce a differentiable renderer that allows stroke parameters to be optimized with gradient descent, and propose a training scheme to alleviate the vanishing gradient issue. Extensive evaluation demonstrates that our approach effectively synthesizes 3D scenes with significant geometric and aesthetic stylization while maintaining a consistent appearance across different views. Our method can be further integrated with style losses and image-text contrastive models to extend its applications, including color transfer and text-driven 3D scene drawing. Results and code are available at http://buaavrcg.github.io/Neural3DStrokes.
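The central mechanism, optimizing stroke parameters through a differentiable renderer against photographs rather than searching for them directly, can be sketched in a few lines. Below is a minimal, illustrative PyTorch example that fits projected isotropic Gaussian blobs, a toy stand-in for the paper's vectorized 3D strokes, to a single target image by gradient descent. Every function and parameter name here is hypothetical and does not reflect the authors' actual API, stroke palette, or training scheme.

```python
import torch
import torch.nn.functional as F

def render_strokes(centers, colors, radii, K, image_hw=(64, 64)):
    """Differentiably splat Gaussian 'strokes' onto the image plane.

    Toy stand-in for a stroke renderer: each stroke is an isotropic
    Gaussian blob whose 3D center is projected through a pinhole
    camera K, then additively blended over the pixel grid.
    """
    H, W = image_hw
    proj = centers @ K.T                      # (N, 3) pinhole projection
    uv = proj[:, :2] / proj[:, 2:3]           # perspective divide -> pixels
    ys, xs = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                            torch.arange(W, dtype=torch.float32),
                            indexing="ij")
    grid = torch.stack([xs, ys], dim=-1)      # (H, W, 2) pixel coordinates
    d2 = ((grid[None] - uv[:, None, None]) ** 2).sum(-1)    # (N, H, W)
    w = torch.exp(-d2 / (2.0 * radii[:, None, None] ** 2))  # Gaussian falloff
    img = (w[..., None] * colors[:, None, None]).sum(0)     # (H, W, 3)
    return img.clamp(0.0, 1.0)

def fit_strokes(target, K, num_strokes=32, steps=500, lr=5e-2):
    """Optimize stroke parameters by gradient descent against one image."""
    centers = (torch.randn(num_strokes, 3) * 0.1
               + torch.tensor([0.0, 0.0, 2.0])).requires_grad_(True)
    colors = torch.rand(num_strokes, 3, requires_grad=True)
    radii = torch.full((num_strokes,), 4.0, requires_grad=True)
    opt = torch.optim.Adam([centers, colors, radii], lr=lr)
    for _ in range(steps):
        loss = F.mse_loss(render_strokes(centers, colors, radii, K,
                                         target.shape[:2]), target)
        opt.zero_grad()
        loss.backward()   # gradients flow through the renderer to the strokes
        opt.step()
    return centers, colors, radii

# Usage: a 64x64 target and a simple pinhole intrinsic matrix.
K = torch.tensor([[50.0, 0.0, 32.0],
                  [0.0, 50.0, 32.0],
                  [0.0, 0.0, 1.0]])
target = torch.rand(64, 64, 3)   # placeholder; use a real photograph
params = fit_strokes(target, K)
```

Extending this toy to the paper's setting would mean replacing the additive 2D splats with 3D stroke primitives composited along camera rays, cycling the loss over multiple posed views, and adding a mechanism like the paper's training scheme to keep gradients alive when strokes barely overlap the target.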
