Federated Learning (FL) is a machine learning paradigm that allows
decentralized clients to learn collaboratively without sharing their private
data. However, excessive computation and communication demands pose challenges
to current FL frameworks, especially when training large-scale models. To
prevent these issues from hindering the deployment of FL systems, we propose a
lightweight framework where clients jointly learn to fuse the representations
generated by multiple fixed pre-trained models rather than training a
large-scale model from scratch. This leads to a more practical FL problem: how
to capture more client-specific and class-relevant information from the
pre-trained models, and how to jointly improve each client's ability to
exploit these off-the-shelf models. In this work, we design a Federated
Prototype-wise Contrastive Learning (FedPCL) approach that shares knowledge
across clients through their class prototypes and builds client-specific
representations in a prototype-wise contrastive manner. Sharing prototypes
rather than learnable model parameters allows each client to fuse the
representations in a personalized way while keeping the shared knowledge in a
compact form for efficient communication. We perform a thorough evaluation of
the proposed FedPCL in the lightweight framework, measuring and visualizing its
ability to fuse various pre-trained models on popular FL datasets.
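To make the mechanism concrete, below is a minimal PyTorch sketch of the two
ingredients the abstract describes: fusing representations from fixed
pre-trained backbones through a small trainable projection network, and a
prototype-wise contrastive loss that pulls each embedding toward its class
prototype. All names (FusionNet, class_prototypes, proto_contrastive_loss),
dimensions, and the InfoNCE-style loss form are illustrative assumptions, not
the paper's actual implementation.

```python
# Illustrative sketch only; module and function names are assumptions,
# not FedPCL's real API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionNet(nn.Module):
    """Fuses embeddings from several fixed pre-trained models.

    Only this small projection head is trained; the backbones stay
    frozen, which keeps per-client computation light.
    """
    def __init__(self, backbones, embed_dims, proj_dim=256):
        super().__init__()
        self.backbones = nn.ModuleList(backbones)
        for b in self.backbones:
            for p in b.parameters():
                p.requires_grad = False  # pre-trained models stay fixed
        self.proj = nn.Sequential(
            nn.Linear(sum(embed_dims), proj_dim),
            nn.ReLU(),
            nn.Linear(proj_dim, proj_dim),
        )

    def forward(self, x):
        with torch.no_grad():
            feats = [b(x) for b in self.backbones]  # frozen representations
        z = self.proj(torch.cat(feats, dim=-1))     # learned, client-specific fusion
        return F.normalize(z, dim=-1)               # unit norm for cosine similarity

def class_prototypes(z, y, num_classes):
    """Mean embedding per class; these compact vectors are what clients share."""
    protos = torch.zeros(num_classes, z.size(1), device=z.device)
    for c in range(num_classes):
        mask = (y == c)
        if mask.any():
            protos[c] = z[mask].mean(dim=0)
    return F.normalize(protos, dim=-1)

def proto_contrastive_loss(z, y, prototypes, tau=0.07):
    """InfoNCE-style loss: pull each embedding toward its class prototype
    and push it away from the other classes' prototypes."""
    logits = z @ prototypes.t() / tau  # (batch, num_classes) similarities
    return F.cross_entropy(logits, y)
```

Under these assumptions, a round would look like: each client trains its
projection head with the loss above against the prototypes received from the
server, then uploads its updated local prototypes; only the prototype matrix
(num_classes x proj_dim floats) crosses the network, rather than full model
parameters.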