DreamSpace

Dreaming Your Room Space with Text-Driven
Panoramic Texture Propagation

IEEE VR 2024

1ByteDance, 2State Key Lab of CAD & CG, Zhejiang University

DreamSpace allows users to personalize the appearance of real-world scene reconstructions with text prompts, and delivers immersive VR experiences on head-mounted displays (HMDs).

Abstract

Diffusion-based methods have achieved prominent success in generating 2D media. However, achieving similar proficiency for scene-level mesh texturing in 3D spatial applications, e.g., XR/VR, remains constrained, primarily due to the intricate nature of 3D geometry and the necessity of immersive free-viewpoint rendering. In this paper, we propose a novel indoor scene texturing framework that delivers text-driven texture generation with enchanting details and authentic spatial coherence. The key insight is to first imagine a stylized 360° panoramic texture from the central viewpoint of the scene, and then propagate it to the remaining areas with inpainting and imitating techniques. To ensure textures that are meaningful and aligned with the scene, we develop a novel coarse-to-fine panoramic texture generation approach with dual texture alignment, which considers both the geometry and texture cues of the captured scenes. To cope with cluttered geometries during texture propagation, we design a separated strategy that conducts texture inpainting in high-confidence regions and then learns an implicit imitating network to synthesize textures in occluded and tiny structural areas. Extensive experiments and an immersive VR application on real-world indoor scenes demonstrate the high quality of the generated textures and the engaging experience on VR headsets.

Video


Key Insight

The key insight of our work is to generate and propagate textures in the panoramic space. First, we translate the panorama of the scene reconstruction into high-resolution stylized images that align with the real-world geometry. Then, we propagate the stylized panorama to the rest of the scene with inpainting and imitating techniques, while accounting for occlusions in complex real-world geometries.
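The panoramic space referred to above is typically an equirectangular image, where each pixel corresponds to a viewing ray from the central viewpoint. The paper's exact projection conventions are not given here, so the following is only a minimal sketch of the standard equirectangular pixel-to-ray mapping (function name and the y-up axis convention are our own assumptions) that such panorama-based pipelines commonly build on:

```python
import math

def equirect_to_ray(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit viewing ray.

    Assumed convention (not taken from the paper): longitude spans
    [-pi, pi) left to right, latitude spans [pi/2, -pi/2] top to
    bottom, and the ray is expressed in a y-up world frame.
    """
    # Sample at the pixel center, hence the +0.5 offsets.
    lon = (u + 0.5) / width * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v + 0.5) / height * math.pi
    # Spherical-to-Cartesian conversion; the result has unit length.
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center of a 1024x512 panorama looks straight ahead (+z).
print(equirect_to_ray(511.5, 255.5, 1024, 512))  # → (0.0, 0.0, 1.0) up to rounding
```

Casting these rays against the reconstructed mesh from the scene's central viewpoint yields the depth and texture cues that the stylized panorama must stay aligned with.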


Scene Mesh Texturing on DreamSpot Dataset

We show scene mesh texturing on our captured DreamSpot Dataset. Here, we first reconstruct the real-world scenes from images captured with an iPhone. Then, we transform each scene into several styles, such as cyberpunk, anime landscape, nebula, or Harry Potter.


Scene Mesh Texturing on Replica Dataset

We show scene mesh texturing on the Replica Dataset, including office 0, room 0, and room 1.



BibTeX


    @article{yang2023dreamspace,
      title={DreamSpace: Dreaming Your Room Space with Text-Driven Panoramic Texture Propagation},
      author={Yang, Bangbang and Dong, Wenqi and Ma, Lin and Hu, Wenbo and Liu, Xiao and Cui, Zhaopeng and Ma, Yuewen},
      journal={arXiv preprint arXiv:2310.13119},
      year={2023}
    }