Cost-Effective Federated Learning: A Unified Approach to Device and Training Scheduling

Document Type

Conference Proceeding

Publication Date

1-1-2024

Abstract

Federated learning enables decentralized model training across numerous devices without centralizing data: devices exchange model updates rather than raw data, which enhances privacy and reduces communication overhead. Despite these advantages, federated learning systems must be optimized for cost efficiency, given the limited computational capability and battery life of edge devices. Existing research typically minimizes either time cost or energy cost, rarely both, and does not jointly optimize device and training scheduling in the presence of system and data heterogeneity. In this paper, we formulate a novel joint device and training scheduling problem that minimizes the total cost of federated learning while guaranteeing model convergence. We propose a new device scheduling scheme, Group Scheduling on Orthogonal Frequency-Division Multiple Access (GS-OFDMA), to improve time efficiency, and develop an iterative algorithm to solve the resulting mixed-integer nonlinear programming problem. Experimental results show that, compared with random participant selection, our approach reduces total cost by at least 35% across different real-world datasets and data distributions.
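The abstract describes minimizing a total cost that combines time and energy under a group-based device schedule. The paper's exact formulation is not reproduced here; the sketch below is only an illustrative interpretation, assuming devices within a group train in parallel (so the group waits for its slowest member) while groups run sequentially, and assuming a simple scalar weight `alpha` trading off time against energy. The function name and all parameters are hypothetical.

```python
# Illustrative sketch (NOT the paper's algorithm): weighted time+energy
# cost of one federated-learning round under a hypothetical group schedule.

def round_cost(groups, round_time, energy, alpha=0.5):
    """Cost of one round for a given device grouping.

    groups     : list of lists of device ids (hypothetical schedule)
    round_time : dict id -> per-round time (compute + upload), seconds
    energy     : dict id -> per-round energy consumption, joules
    alpha      : assumed scalar weight trading off time vs. energy
    """
    # Groups proceed one after another; within a group the round waits
    # for the slowest device, so stragglers dominate group time.
    total_time = sum(max(round_time[d] for d in g) for g in groups)
    # Energy is additive over all participating devices.
    total_energy = sum(energy[d] for g in groups for d in g)
    return alpha * total_time + (1 - alpha) * total_energy


# Example: two groups; group [0, 1] is bounded by device 1's 4 s.
times = {0: 2.0, 1: 4.0, 2: 1.0}
joules = {0: 1.0, 1: 2.0, 2: 3.0}
print(round_cost([[0, 1], [2]], times, joules))  # 0.5*(4+1) + 0.5*6 = 5.5
```

A joint optimizer in the spirit of the paper would search over such groupings (and training parameters) to minimize the sum of this cost across rounds, subject to a convergence constraint; that combinatorial search is what makes the problem a mixed-integer nonlinear program.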

Publication Source (Journal or Book title)

IEEE International Conference on Communications

First Page

3488

Last Page

3493

