Hi @ all,
does someone have a minimal example of how to use Optuna with Flower for hyper-parameter optimization? The official paper is "Federated Hyperparameter Optimisation with Flower and Optuna" (University of Bristol), but some of the instructions are unclear; maybe someone has an idea of how they implemented it. In the paper, they provide a listing for the fit_config method, like this:
import optuna

study = optuna.create_study(direction='maximize')
HPO_rate = 1

def fit_config(self, server_round: int):
    if (server_round - 1) % HPO_rate == 0:
        trial = self.study.ask()
    config = { ... }
    return config
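For comparison, this is how I would normally define fit_config in plain Flower, as a free function handed to the strategy via on_fit_config_fn (a minimal sketch; the hyper-parameter names lr and local_epochs are just placeholders I made up):

import flwr as fl

def fit_config(server_round: int):
    # A free function: the strategy calls it once per round via
    # on_fit_config_fn and sends the returned dict to every client.
    return {"lr": 0.01, "local_epochs": 1}

strategy = fl.server.strategy.FedAvg(on_fit_config_fn=fit_config)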
My question is: in the Flower documentation, fit_config is a plain function that gets passed to the strategy (e.g. as on_fit_config_fn) and is then called by the strategy's configure_fit method. Usually we would implement fit_config outside any class, but in the paper it looks as if fit_config were a method of a class, as indicated by the "self" parameter. So what is going on here, and what class are we supposed to implement for this?
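My current guess is that they subclass the strategy and keep the Optuna study on it, so that fit_config can reach it through self, roughly like this (this is only my interpretation; the class name OptunaFedAvg and the learning-rate search space are my own invention):

import flwr as fl
import optuna

class OptunaFedAvg(fl.server.strategy.FedAvg):
    def __init__(self, **kwargs):
        # Register the bound method as the config function, so that
        # configure_fit ends up calling self.fit_config(server_round).
        super().__init__(on_fit_config_fn=self.fit_config, **kwargs)
        self.study = optuna.create_study(direction="maximize")
        self.HPO_rate = 1
        self.trial = None

    def fit_config(self, server_round: int):
        # Ask Optuna for a new trial every HPO_rate rounds.
        if (server_round - 1) % self.HPO_rate == 0:
            self.trial = self.study.ask()
        # The search space below is made up just for illustration.
        return {"lr": self.trial.suggest_float("lr", 1e-4, 1e-1, log=True)}

Is something like that what they mean, or is fit_config supposed to be a method of a completely different class?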
Thanks in advance.