Yes! We have a quickstart example with the XGBoost bagging aggregation strategy, and a comprehensive example supporting both bagging and cyclic training.
The bagging strategy used in these examples is not based on a specific paper. Bagging is a common and effective way to build tree ensembles, and it can be applied to federated XGBoost (a minimal sketch of the idea follows the list below). Here are some related materials to help understand this method:
- This paper describes how to perform FL with gradient boosting decision trees (GBDT); XGBoost is a GBDT implementation.
- The Wikipedia article on bootstrap aggregating (bagging): Bootstrap aggregating - Wikipedia.
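Below is a minimal, self-contained sketch of the bagging idea applied to XGBoost. It is not the actual federated implementation used in the examples (which exchange trees between clients and the server each round); it only illustrates the ensemble-averaging principle by training one booster per simulated "client" on its local data shard and averaging their predictions. The dataset, parameters, and the three-way split are all illustrative assumptions.

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Build a toy dataset and split the training portion across three simulated clients.
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
client_splits = np.array_split(np.arange(len(X_train)), 3)

params = {"objective": "binary:logistic", "max_depth": 4, "eta": 0.3}
dtest = xgb.DMatrix(X_test)

# Each client trains its own booster on its local shard
# (the per-client data partition plays the role of the bootstrap sample).
client_boosters = []
for idx in client_splits:
    dtrain = xgb.DMatrix(X_train[idx], label=y_train[idx])
    client_boosters.append(xgb.train(params, dtrain, num_boost_round=20))

# Bagging aggregation: average the per-client predicted probabilities.
avg_pred = np.mean([b.predict(dtest) for b in client_boosters], axis=0)
accuracy = np.mean((avg_pred > 0.5) == y_test)
print(f"bagged accuracy: {accuracy:.3f}")
```

In the federated setting the same principle applies, except that the aggregation of the per-client trees happens on the server rather than at prediction time.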