FedTune: Automatic Tuning of Federated Learning Hyper-Parameters from System Perspective (2110.03061v6)
Published 6 Oct 2021 in cs.LG
Abstract: Federated learning (FL) hyper-parameters significantly affect the training overheads in terms of computation time, transmission time, computation load, and transmission load. However, the current practice of manually selecting FL hyper-parameters places a heavy burden on FL practitioners, since different applications have different training preferences. In this paper, we propose FedTune, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements for FL training. FedTune is lightweight and flexible, achieving an 8.48%-26.75% improvement across different datasets compared to fixed FL hyper-parameters.
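To make the system perspective concrete, the sketch below shows one way an application's training preferences could be expressed as a weighted cost over the four overhead dimensions named in the abstract (computation time, transmission time, computation load, transmission load). This is an illustrative assumption, not the paper's actual algorithm; all names, weights, and values are hypothetical.

```python
# Minimal sketch (assumed, not FedTune's actual formulation): score a
# hyper-parameter setting by combining the four system overheads from the
# abstract with application-specific preference weights.

from dataclasses import dataclass


@dataclass
class SystemOverhead:
    comp_time: float   # computation time (e.g., seconds per round)
    trans_time: float  # transmission time (e.g., seconds per round)
    comp_load: float   # computation load (e.g., FLOPs per round)
    trans_load: float  # transmission load (e.g., bytes per round)


def weighted_cost(o: SystemOverhead, baseline: SystemOverhead,
                  alpha: float, beta: float, gamma: float, delta: float) -> float:
    """Lower is better: each overhead is normalized by a baseline run so the
    four quantities are comparable, then weighted by the application's
    training preferences (weights assumed to sum to 1)."""
    return (alpha * o.comp_time / baseline.comp_time
            + beta * o.trans_time / baseline.trans_time
            + gamma * o.comp_load / baseline.comp_load
            + delta * o.trans_load / baseline.trans_load)


if __name__ == "__main__":
    baseline = SystemOverhead(100.0, 50.0, 1e12, 5e8)     # fixed hyper-parameters
    candidate = SystemOverhead(90.0, 60.0, 0.9e12, 6e8)   # tuned hyper-parameters
    # Example: a latency-sensitive application weights time over load.
    print(weighted_cost(candidate, baseline,
                        alpha=0.4, beta=0.4, gamma=0.1, delta=0.1))
```

A cost below 1.0 would indicate that the candidate hyper-parameters improve on the baseline under that application's preferences; an automatic tuner could use such a score to compare settings across training runs.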
- Huanle Zhang (12 papers)
- Mi Zhang (85 papers)
- Xin Liu (821 papers)
- Prasant Mohapatra (44 papers)
- Michael DeLucia (1 paper)