Poster

Hierarchical Unsupervised Relation Distillation for Source Free Domain Adaptation

Bowei Xing · Xianghua Ying · Ruibin Wang · Ruohao Guo · Ji Shi · Wenzhen Yue

Strong Double Blind: this paper was not made available on public preprint services during the review process.
Wed 2 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Source-free domain adaptation (SFDA) aims to transfer a model trained on a labeled source domain to an unlabeled target domain without access to the source data. Recent SFDA methods predominantly rely on self-training, which supervises the model with pseudo-labels generated from individual data samples. However, this ignores the data-structure information and inter-sample relationships that benefit adaptive training. In this paper, we propose a novel hierarchical relation distillation framework that establishes multi-level relations across samples in an unsupervised manner, fully exploiting the inherent data structure to guide sample training instead of relying on isolated pseudo-labels. During training, we first identify source-like samples by prediction reliability, then distill knowledge to the remaining target-specific samples by transferring both a local clustering relation and a global semantic relation. Specifically, the local relation leverages affinity with nearest-neighbor samples, while the global relation measures similarity to category-wise Gaussian mixtures; the two offer complementary supervision that facilitates student learning. To validate the effectiveness of our approach, we conduct extensive experiments on four diverse benchmarks, achieving better performance than previous methods.
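The abstract describes two relation signals, a local one built from nearest-neighbor affinities and a global one built from similarity to category-wise Gaussian mixtures, distilled from reliable (source-like) samples to target-specific ones. Below is a minimal, illustrative sketch of how such quantities could be computed in PyTorch. It is not the authors' implementation: the confidence threshold, the neighbor count k, the temperature tau, and the single-Gaussian-per-class simplification of the category-wise mixtures are all assumptions made for this example.

```python
# Illustrative sketch only; hyperparameters and the one-Gaussian-per-class
# stand-in for the paper's category-wise Gaussian mixtures are assumptions.
import torch
import torch.nn.functional as F

def split_source_like(logits: torch.Tensor, threshold: float = 0.9):
    """Split a batch into source-like (reliable) and target-specific samples
    using prediction confidence as one plausible reliability criterion."""
    conf, _ = logits.softmax(dim=1).max(dim=1)
    reliable = conf >= threshold
    return reliable, ~reliable

def local_relation(features: torch.Tensor, bank: torch.Tensor,
                   k: int = 5, tau: float = 0.1) -> torch.Tensor:
    """Soft affinity of each sample to its k nearest neighbors in a feature
    bank: a distribution playing the role of the 'local clustering relation'."""
    sim = F.normalize(features, dim=1) @ F.normalize(bank, dim=1).T
    topk, _ = sim.topk(k, dim=1)          # k highest cosine similarities
    return F.softmax(topk / tau, dim=1)   # distribution over neighbors

def global_relation(features: torch.Tensor, means: torch.Tensor,
                    vars_: torch.Tensor) -> torch.Tensor:
    """Posterior-like similarity of each sample to per-class Gaussians with
    diagonal covariance: a stand-in for the 'global semantic relation'."""
    diff = features.unsqueeze(1) - means.unsqueeze(0)        # (B, C, D)
    log_prob = -0.5 * (diff ** 2 / vars_.unsqueeze(0)).sum(dim=2)
    return F.softmax(log_prob, dim=1)                        # over classes

def relation_distill_loss(student_rel: torch.Tensor,
                          teacher_rel: torch.Tensor) -> torch.Tensor:
    """KL divergence pulling the student's relation distribution toward the
    (detached) teacher relation, one common form of relation distillation."""
    return F.kl_div(torch.log(student_rel + 1e-8),
                    teacher_rel.detach(), reduction="batchmean")

# Toy usage: 8 target features (64-dim), a 100-item feature bank, 10 classes.
feats = torch.randn(8, 64)
bank = torch.randn(100, 64)
means, vars_ = torch.randn(10, 64), torch.ones(10, 64)
loss = relation_distill_loss(local_relation(feats, bank),
                             local_relation(feats.roll(1, 0), bank))
```

In this reading, the source-like samples (or a teacher branch computed from them) would supply `teacher_rel`, while `student_rel` comes from the target-specific samples, so the two relation losses complement the usual pseudo-label objective rather than replace it.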
