

Poster

External Knowledge Enhanced 3D Scene Generation from Sketch

Zijie Wu · Mingtao Feng · Yaonan Wang · He Xie · Weisheng Dong · Bo Miao · Ajmal Mian

Thu 3 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Generating realistic 3D scenes is challenging due to the complexity of room layouts and object geometries. We propose a sketch-based, knowledge-enhanced diffusion architecture (SEK) for generating customized, diverse, and plausible 3D scenes. SEK conditions the denoising process on a hand-drawn sketch of the target scene and on cues from an object relationship knowledge base. We first construct an external knowledge base containing object relationships and then leverage knowledge-enhanced graph reasoning to help our model interpret hand-drawn sketches. A scene is represented as a combination of 3D objects and their relationships, and is then incrementally diffused toward a Gaussian distribution. We propose a 3D denoising scene transformer that learns to reverse the diffusion process, conditioned on a hand-drawn sketch along with knowledge cues, to regressively generate the scene, including the 3D object instances as well as their layout. Experiments on the 3D-FRONT dataset show that our model improves FID and CKL by 17.41% and 37.18% in 3D scene generation, and FID and KID by 19.12% and 20.06% in 3D scene completion, compared to the nearest competitor, DiffuScene.
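To make the conditioning described above concrete, below is a minimal, hypothetical sketch (assuming PyTorch, a fixed-size set of object tokens encoding class and layout parameters, and pre-encoded sketch and knowledge-base conditioning vectors) of one conditional reverse-diffusion step of the kind the abstract outlines. All module names, dimensions, and the DDPM schedule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical illustration of a denoising scene transformer conditioned on
# sketch features and knowledge-graph cues; NOT the authors' code.
import torch
import torch.nn as nn

class DenoisingSceneTransformer(nn.Module):
    """Predicts the noise added to a set of object tokens, conditioned on
    sketch features and knowledge cues (both assumed pre-encoded)."""
    def __init__(self, obj_dim=62, cond_dim=256, n_layers=4, n_heads=8):
        super().__init__()
        self.obj_proj = nn.Linear(obj_dim, cond_dim)
        self.time_emb = nn.Sequential(nn.Linear(1, cond_dim), nn.SiLU(),
                                      nn.Linear(cond_dim, cond_dim))
        layer = nn.TransformerEncoderLayer(cond_dim, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.out = nn.Linear(cond_dim, obj_dim)

    def forward(self, noisy_objs, t, sketch_feat, knowledge_feat):
        # noisy_objs: (B, N, obj_dim) noised object tokens (class + layout params)
        # sketch_feat, knowledge_feat: (B, cond_dim) global conditioning vectors
        h = self.obj_proj(noisy_objs)
        cond = self.time_emb(t[:, None].float()) + sketch_feat + knowledge_feat
        h = h + cond[:, None, :]            # broadcast conditioning over all objects
        return self.out(self.backbone(h))   # predicted noise, same shape as input

def reverse_step(model, x_t, t, sketch_feat, knowledge_feat, betas):
    """One reverse (denoising) step under a standard DDPM schedule (assumed)."""
    beta_t = betas[t]
    alpha_t = 1.0 - beta_t
    alpha_bar_t = torch.cumprod(1.0 - betas, dim=0)[t]
    t_batch = torch.full((x_t.size(0),), t)
    eps = model(x_t, t_batch, sketch_feat, knowledge_feat)
    mean = (x_t - beta_t / torch.sqrt(1.0 - alpha_bar_t) * eps) / torch.sqrt(alpha_t)
    noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
    return mean + torch.sqrt(beta_t) * noise
```

Iterating `reverse_step` from pure Gaussian noise down to t = 0 would yield the denoised object tokens (instances plus layout); the sketch and knowledge cues steer every step, which is the role the abstract assigns to them.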
