Does Flower support Federated Transfer Learning (FTL)?

FTL enables transferring knowledge between domains with little to no overlap in features and users. Does the Flower framework currently support this approach?


Hi @ahmedkubba , this seems like a form of training that can be done through Flower. But it would probably be best if you gave additional details about what you mean by FTL exactly (the “Transfer” part can be understood in various ways). An example would be super useful.

As an example, let’s think of federating the training of a multimodal model that uses both images and audio. In a regular (non-FL) setup you’d use particular techniques to train such a model (e.g. ensuring batches contain training examples from both data domains). In the FL setup it might be that some clients only have images and some clients only have audio training examples. Still, there is nothing at the Flower API level stopping you from aggregating models that have each seen only one modality in their previous rounds of training.
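To make that concrete, here is a framework-agnostic sketch of the aggregation step: as long as clients share the same model architecture (same parameter shapes), a FedAvg-style weighted average applies unchanged, regardless of which modality each client trained on. All names below are illustrative, not actual Flower API calls.

```python
def fedavg(updates):
    """Weighted average of client updates (FedAvg-style).

    `updates` is a list of (parameters, num_examples) pairs, where
    `parameters` is a flat list of floats standing in for model weights.
    """
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    return [
        sum(params[i] * n for params, n in updates) / total
        for i in range(dim)
    ]

# One client trained only on images, another only on audio. The server
# aggregates them without knowing which modality each client saw.
image_client_update = ([1.0, 2.0, 3.0], 60)   # update from 60 image examples
audio_client_update = ([3.0, 4.0, 5.0], 40)   # update from 40 audio examples

global_params = fedavg([image_client_update, audio_client_update])
print(global_params)  # [1.8, 2.8, 3.8]
```

In Flower this weighting is what the built-in `FedAvg` strategy does with the parameters and example counts each client returns from `fit`; the strategy never needs to know which data domain a client trained on.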

Maybe some ideas can be extracted from this recent ICLR’23 paper: [2302.08888] Multimodal Federated Learning via Contrastive Representation Ensemble

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.