Article: CFARMS: A clustered federated learning framework with recursive model selection

Title: CFARMS: A clustered federated learning framework with recursive model selection
Authors: Nanor, Ebenezer; Cobbinah, M. Bernard; Yang, Qinli; Shao, Junming; Meng, Nan; Cheung, Jason; Adjei, K. Philip; Wang, Leo
Keywords: Classical federated learning; Clustered federated learning; Deep learning; Efficient communication; Model convergence; Model selection
Issue Date: 12-Jul-2025
Publisher: Elsevier
Citation: Expert Systems with Applications, 2025, v. 296, n. Part B
Abstract: Federated learning, originally devised to train a single global model over diverse client populations while upholding data privacy, faces challenges due to variations in data distributions among clients: the learning and generalization performance of the global model tends to be suboptimal. To address this, we propose CFARMS (Clustered Federated Learning Algorithm with Recursive Model Selection), a novel framework that enhances model learning within clusters. CFARMS clusters clients iteratively, primarily leveraging the gradients of the loss function from each client's local training; in the event of a tie, it also considers clients' local model losses to inform clustering decisions. In contrast to existing frameworks, CFARMS produces highly personalized cluster-specific models, mitigating the generalization challenges of a single global model. It also significantly reduces communication costs within the federated learning network by recursively refining the number of local models per client and updating the cluster models periodically. During training, CFARMS uniquely facilitates a decentralized search for the optimal model for each client. Extensive experimental evaluations on benchmark image and tabular datasets with non-convex models demonstrate the framework's superior performance: it achieved higher prediction accuracy and a marked reduction in communication overhead, exceeding 80% and 49% respectively, compared to state-of-the-art federated learning frameworks including IFCA, FedProx and FedAvg. Notably, the framework also exhibits faster convergence and scales effectively to larger numbers of models and clients.
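The clustering step described in the abstract — assigning clients to clusters primarily by their local training gradients, with local model losses breaking ties — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the use of cosine similarity against per-cluster gradient centroids, the tolerance for detecting ties, and all function and parameter names are assumptions.

```python
import numpy as np

def assign_clusters(client_grads, client_losses, centroids, tol=1e-9):
    """Toy gradient-based cluster assignment with a loss tie-break.

    client_grads:  (n_clients, d) local training gradients, one per client
    client_losses: (n_clients, k) each client's local loss under each of
                   the k cluster models
    centroids:     (k, d) per-cluster gradient centroids

    Each client joins the cluster whose gradient centroid is most similar
    (cosine similarity); when several clusters are tied within `tol`, the
    client picks the tied cluster whose model gave it the lowest local loss.
    """
    # Normalize rows so the dot product below is cosine similarity.
    g = client_grads / np.linalg.norm(client_grads, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    sims = g @ c.T  # (n_clients, k) similarity of every client to every cluster

    assignments = np.empty(len(client_grads), dtype=int)
    for i, row in enumerate(sims):
        tied = np.flatnonzero(row >= row.max() - tol)  # clusters within tolerance
        # Tie-break: among tied clusters, choose the one with the lowest local loss.
        assignments[i] = tied[np.argmin(client_losses[i, tied])]
    return assignments
```

In this sketch, a client whose gradient is unambiguously closest to one centroid is assigned by gradient alone, and the loss matrix only matters for clients equidistant from two or more centroids — mirroring the abstract's description of losses as a secondary, tie-breaking signal.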
Persistent Identifier: http://hdl.handle.net/10722/358563
ISSN: 0957-4174
2023 Impact Factor: 7.5
2023 SCImago Journal Rankings: 1.875

 

DC Field: Value
dc.contributor.author: Nanor, Ebenezer
dc.contributor.author: Cobbinah, M. Bernard
dc.contributor.author: Yang, Qinli
dc.contributor.author: Shao, Junming
dc.contributor.author: Meng, Nan
dc.contributor.author: Cheung, Jason
dc.contributor.author: Adjei, K. Philip
dc.contributor.author: Wang, Leo
dc.date.accessioned: 2025-08-07T00:33:03Z
dc.date.available: 2025-08-07T00:33:03Z
dc.date.issued: 2025-07-12
dc.identifier.citation: Expert Systems with Applications, 2025, v. 296, n. Part B
dc.identifier.issn: 0957-4174
dc.identifier.uri: http://hdl.handle.net/10722/358563
dc.language: eng
dc.publisher: Elsevier
dc.relation.ispartof: Expert Systems with Applications
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Classical federated learning
dc.subject: Clustered federated learning
dc.subject: Deep learning
dc.subject: Efficient communication
dc.subject: Model convergence
dc.subject: Model selection
dc.title: CFARMS: A clustered federated learning framework with recursive model selection
dc.type: Article
dc.identifier.doi: 10.1016/j.eswa.2025.128931
dc.identifier.scopus: eid_2-s2.0-105010882732
dc.identifier.volume: 296
dc.identifier.issue: Part B
dc.identifier.eissn: 1873-6793
dc.identifier.issnl: 0957-4174
