Hyperparameter Optimization with Optuna and Flower

Hi all,
does someone have a minimal example of how to use Optuna with Flower for hyperparameter optimization? The official paper is Federated Hyperparameter Optimisation with Flower and Optuna — University of Bristol, but some of the instructions are unclear; maybe someone has an idea of what they implemented. In the paper, they provide a listing for the fit_config method, like this:

import optuna

study = optuna.create_study(direction='maximize')
HPO_rate = 1

def fit_config(self, server_round: int):
    if (server_round - 1) % HPO_rate == 0:
        trial = self.study.ask()
    config = { … }
    return config

My question is: in the Flower documentation, fit_config is a plain function that is called by the strategy's configure_fit method. Usually we would implement fit_config outside any class, but here it is written as if it were a method of a class, as indicated by the self parameter. So what is going on here, and what class do we have to implement for this?

Thanks in advance.


Hi, because we don’t have access to the complete code of the paper you mentioned, we can’t determine how their code interacts with its other parts. It appears that in their code, fit_config is integrated into a class.
In Flower, this is generally unnecessary (though there might be a specific reason for it in that paper, so it would be better to ask the authors for the full code), and you don’t need to implement fit_config as part of a class.


Hi @vikwal, did you successfully manage to implement the paper in the Flower framework for hyperparameter tuning?