TY - JOUR
T1 - Optimized Federated Multitask Learning in Mobile Edge Networks
T2 - A Hybrid Client Selection and Model Aggregation Approach
AU - Hamood, Moqbel
AU - Albaseer, Abdullatif
AU - Abdallah, Mohamed
AU - Al-Fuqaha, Ala
AU - Mohamed, Amr
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/11
Y1 - 2024/11
AB - Clustered federated multitask learning has been introduced as an effective strategy to tackle statistical challenges, including non-independent and identically distributed data among clients. This is achieved by clustering clients based on the similarity of their data distributions and assigning a specialized model to each cluster. However, this approach encounters complexities when applied in hierarchical wireless networks due to hierarchical two-level model aggregation and resource-based client selection. This slows the convergence rate and deprives clients that share similar data distributions but reside in different mobile edge networks of the best-fit specialized model. To this end, we propose a framework comprising two-phase client selection and two-level model aggregation schemes designed for IoT devices and intelligent vehicles. For client selection, the cloud ensures fairness among all clients by granting them equal priority to participate in training, contributing to more accurate clustering. Once a particular cluster reaches a stopping point, the related edge server applies two client selection methods separately: greedy and round-robin. The greedy algorithm prioritizes clients with lower latency and superior resources, while the round-robin algorithm allows cyclic participation in training. The cloud then executes model aggregation in two distinct ways: one triggered upon reaching a pre-determined number of aggregation rounds (round-based model aggregation) and the other upon at least one split being performed at the edge servers (split-based model aggregation). We conduct extensive experiments to evaluate the proposed approach. Our results show that the proposed algorithms significantly improve the convergence rate, minimize training time, and reduce energy consumption by up to 60% while providing every client with a specialized model attuned to its data distribution.
KW - Client selection
KW - Collaboration
KW - Computational modeling
KW - Convergence
KW - Data models
KW - Hierarchical mobile wireless networks
KW - Internet of Things
KW - Model aggregation
KW - Resource allocation
KW - Servers
KW - Training
KW - clustered federated learning (CFL)
UR - https://www.scopus.com/pages/publications/105002087938
DO - 10.1109/TVT.2024.3427349
M3 - Article
AN - SCOPUS:105002087938
SN - 0018-9545
VL - 73
SP - 17613
EP - 17629
JO - IEEE Transactions on Vehicular Technology
JF - IEEE Transactions on Vehicular Technology
IS - 11
ER -