Poster

Wear-Any-Way: Manipulable Virtual Try-on via Sparse Correspondence Alignment

Mengting Chen · Xi Chen · Zhonghua Zhai · Chen Ju · Xuewen Hong · Jinsong Lan · Shuai Xiao

Thu 3 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

This paper introduces a novel framework for virtual try-on, termed Wear-Any-Way. Unlike previous methods, Wear-Any-Way is "customizable": besides generating high-fidelity results, it allows users to precisely control the wearing style. To achieve this, we first construct a strong pipeline that supports single/multiple-garment try-on and model-to-model try-on in complicated scenarios. To make the generation manipulable, we propose sparse correspondence alignment and introduce point-based control to guide the generation. Wear-Any-Way achieves state-of-the-art performance in the standard setting and provides a novel interaction form for customizing the wearing style. For instance, it allows users to drag a sleeve to roll it up, drag a coat to open it, and use clicks to control the tuck style. Wear-Any-Way enables freer and more flexible expression of attire, which holds profound implications for the fashion industry.