How to Train on a Fixed Number of Mini-Batches per Round

Hi everyone,

I’m working on a federated learning setup using Flower and would like to train each client on a fixed number of mini-batches per round instead of a full epoch. I noticed that the pyproject.toml file allows setting local_epochs, and the Flower documentation mentions that training can be limited to “as little as a few steps (mini-batches)” in each round.

However, I’m unsure how to implement this efficiently:

  1. Is there a built-in way to specify a fixed number of mini-batches per round using pyproject.toml or client configuration?
  2. Or would I need to manually define an iterator within the client class to handle this?

Any guidance or best practices would be greatly appreciated!

Thanks in advance!

Hi there, welcome to the Flower community!

Instead of specifying local_epochs in pyproject.toml, you could specify num_batches and then use that value in your local training pipeline to control the number of mini-batches (e.g., in your PyTorch training function). Flower doesn’t control the local training function; it merely provides a way to communicate hyperparameters from the server to the clients.
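
For example, here’s a minimal sketch, assuming a recent Flower version where your ClientApp receives a Context whose run_config exposes values from [tool.flwr.app.config] in pyproject.toml. The num-batches key and the PyTorch training function are purely illustrative, not a fixed Flower convention:

```python
import torch

# In pyproject.toml, alongside (or instead of) local-epochs:
#
# [tool.flwr.app.config]
# num-batches = 10
#
# In your ClientApp's client_fn, read the value from the run config:
#   num_batches = context.run_config["num-batches"]


def train(net, trainloader, num_batches, device):
    """Train for a fixed number of mini-batches instead of a full epoch."""
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
    net.train()
    for batch_idx, (images, labels) in enumerate(trainloader):
        if batch_idx == num_batches:
            break  # stop once the configured number of mini-batches is reached
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(net(images), labels)
        loss.backward()
        optimizer.step()
```

Note that if the DataLoader holds fewer batches than num-batches, this loop simply ends early, which is one reason you might wrap the loader in an iterator that restarts it.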

Does this answer your question?

Thanks for the reply.
I solved it as you proposed, using an iterator class.
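
In case it helps anyone else, here is a rough sketch of what such an iterator class could look like (illustrative only, not exactly my code): it restarts the DataLoader when it runs out, so each round always yields the requested number of mini-batches.

```python
from torch.utils.data import DataLoader


class FixedBatchIterator:
    """Yield exactly `num_batches` mini-batches, restarting the loader if needed."""

    def __init__(self, dataloader: DataLoader, num_batches: int):
        self.dataloader = dataloader
        self.num_batches = num_batches

    def __iter__(self):
        produced = 0
        while produced < self.num_batches:
            got_any = False
            for batch in self.dataloader:
                got_any = True
                yield batch
                produced += 1
                if produced == self.num_batches:
                    return
            if not got_any:
                return  # empty loader: avoid looping forever


# Usage in the training loop:
#   for images, labels in FixedBatchIterator(trainloader, num_batches):
#       ...
```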