Hierarchical Federated Learning

Hi everyone,

We are exploring the idea of introducing edge servers into a federated learning system to sit between clients and the central server. This aligns with hierarchical federated learning (HFL), where edge servers aggregate client models locally before sending updates to the global server. Potential benefits include:

• Improved scalability and reduced communication overhead.

• Better handling of client heterogeneity through localized aggregation.
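To make the two-level idea concrete, here is a minimal, framework-agnostic sketch of the aggregation math: each edge server computes a sample-count-weighted average of its own clients' models, and the global server then averages the edge-level models, weighting each edge by the samples it represents. The toy models and the `edge_groups` grouping are made up purely for illustration; this is not Flower code.

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_average(models, num_samples):
    """Sample-count-weighted average; each model is a list of layer arrays."""
    total = sum(num_samples)
    return [
        sum(n / total * layer for n, layer in zip(num_samples, layers))
        for layers in zip(*models)
    ]

# Toy setup: 2 edge servers, each with 3 clients holding a 2-layer model.
def random_model():
    return [rng.normal(size=(4, 4)), rng.normal(size=(4,))]

edge_groups = [
    ([random_model() for _ in range(3)], [120, 80, 200]),  # clients of edge 0
    ([random_model() for _ in range(3)], [50, 150, 100]),  # clients of edge 1
]

# Tier 1: each edge server aggregates only its own clients.
edge_models = [weighted_average(models, sizes) for models, sizes in edge_groups]

# Tier 2: the global server aggregates the edge-level models, weighting
# each edge by the total number of samples it represents.
edge_sizes = [sum(sizes) for _, sizes in edge_groups]
global_model = weighted_average(edge_models, edge_sizes)
print([layer.shape for layer in global_model])
```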

Opportunities to Contribute

We’d love to hear your input! Here’s how you can contribute:

  1. Ideas and Design: Share thoughts on implementing edge servers or suggest design improvements.

  2. Code and Prototypes: Help experiment with hierarchical aggregation using Flower.

  3. Research and Resources: Share papers, tools, or examples relevant to hierarchical federated learning.

Key Questions

• How can hierarchical federated learning be implemented using Flower? (One possible direction is sketched after this list.)

• What modifications or extensions would be needed?

• Are there best practices for multi-level aggregation strategies (client-edge-server-global)?
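On the Flower side, one direction worth discussing (just a sketch under assumptions, not an official pattern) is to run a regular Flower server at each edge whose strategy aggregates its own clients with FedAvg and then hands the edge-level model to whatever channel connects the edge to the global tier. The `forward_to_global` callback below is hypothetical; the `aggregate_fit` signature shown matches Flower 1.x and may differ in other versions.

```python
from typing import Dict, List, Optional, Tuple, Union

from flwr.common import FitRes, Parameters, Scalar
from flwr.server.client_proxy import ClientProxy
from flwr.server.strategy import FedAvg


class EdgeFedAvg(FedAvg):
    """Edge-tier strategy: FedAvg over local clients, then push the result upward."""

    def __init__(self, forward_to_global, **kwargs):
        super().__init__(**kwargs)
        # Hypothetical callback: how the edge reaches the global server
        # (acting as a Flower client, gRPC, REST, ...) is deployment-specific.
        self.forward_to_global = forward_to_global

    def aggregate_fit(
        self,
        server_round: int,
        results: List[Tuple[ClientProxy, FitRes]],
        failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
    ) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
        # Standard FedAvg over this edge's clients.
        aggregated, metrics = super().aggregate_fit(server_round, results, failures)
        if aggregated is not None:
            # Forward the edge-level aggregate one tier up.
            self.forward_to_global(server_round, aggregated)
        return aggregated, metrics
```

The global tier could then run a second weighted average over the edge-level models (as in the earlier sketch), possibly after several edge rounds per global round.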

Let’s collaborate to explore this concept!


Hi, is this in the works? I’d appreciate an update. This would be very useful!

Hi @marykor, we recently had a presentation about it during our Flower AI Summit 2025.

I’ll try to get the paper if it’s public and link it here. From what I know, the author might implement the baseline for hierarchical FL. I’ll keep you updated.
