

Poster

Relaxed Contrastive Learning for Federated Learning

Seonguk Seo · Jinkyu Kim · Geeho Kim · Bohyung Han

Arch 4A-E Poster #257
Thu 20 Jun 10:30 a.m. PDT — noon PDT

Abstract:

We propose a novel contrastive learning framework to effectively address the challenges of data heterogeneity in federated learning. We first analyze the inconsistency of gradient updates across clients during local training and establish its dependence on the distribution of feature representations, which leads to the derivation of a supervised contrastive learning (SCL) objective that mitigates local deviations. In addition, we show that a naïve adoption of SCL leads to representation collapse, resulting in slow convergence and limited performance gains. To address this issue, we introduce a relaxed contrastive learning loss that imposes a divergence penalty on excessively similar sample pairs within each class. This strategy prevents collapsed representations and enhances feature transferability, facilitating collaborative training and leading to significant performance improvements. Extensive experiments show that our framework outperforms existing federated learning approaches by large margins on standard benchmarks. We plan to release the source code of our work for better reproducibility.
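To make the idea described above concrete, the sketch below pairs a standard supervised contrastive loss with a simple penalty on same-class pairs whose cosine similarity exceeds a threshold, which discourages collapsed representations. It is an illustrative approximation based only on the abstract: the function name, the penalty form, and the hyperparameters tau, sim_threshold, and penalty_weight are assumptions, not the authors' released implementation.

```python
# Illustrative sketch (assumption, not the authors' code): supervised contrastive
# loss plus a relaxation penalty on overly similar same-class pairs.
import torch
import torch.nn.functional as F


def relaxed_supervised_contrastive_loss(features, labels, tau=0.1,
                                        sim_threshold=0.9, penalty_weight=1.0):
    """features: (N, D) embeddings; labels: (N,) integer class ids."""
    z = F.normalize(features, dim=1)          # work in cosine-similarity space
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)

    # Positives: samples sharing the anchor's label, excluding the anchor itself.
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~eye

    # Standard supervised contrastive term.
    logits = (z @ z.t() / tau).masked_fill(eye, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    scl = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts

    # Relaxation: penalize same-class pairs that are already above the
    # similarity threshold, discouraging collapsed (near-identical) features.
    raw_sim = z @ z.t()
    excess = (raw_sim - sim_threshold).clamp(min=0)[pos_mask]
    penalty = excess.mean() if excess.numel() > 0 else raw_sim.new_zeros(())

    return scl.mean() + penalty_weight * penalty


# Toy usage: 8 samples, 16-dim features, 3 classes.
feats = torch.randn(8, 16, requires_grad=True)
lbls = torch.randint(0, 3, (8,))
loss = relaxed_supervised_contrastive_loss(feats, lbls)
loss.backward()
```

In a federated setting, a loss of this shape would be added to each client's local objective; the clamped penalty only activates for pairs that are already very close, so ordinary positive pairs are still pulled together while fully collapsed features are pushed apart.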
