Links for fulltext (May Require Subscription)
- Publisher Website (DOI): 10.1016/j.eswa.2025.128931
- Scopus: eid_2-s2.0-105010882732

Citations:
- Scopus: 0
Article: CFARMS: A clustered federated learning framework with recursive model selection
| Title | CFARMS: A clustered federated learning framework with recursive model selection |
|---|---|
| Authors | Nanor, Ebenezer; Cobbinah, M. Bernard; Yang, Qinli; Shao, Junming; Meng, Nan; Cheung, Jason; Adjei, K. Philip; Wang, Leo |
| Keywords | Classical federated learning; Clustered federated learning; Deep learning; Efficient communication; Model convergence; Model selection |
| Issue Date | 12-Jul-2025 |
| Publisher | Elsevier |
| Citation | Expert Systems with Applications, 2025, v. 296, n. Part B |
| Abstract | Federated learning, originally devised to train a single global model over diverse client populations while upholding data privacy, faces challenges due to variations in data distributions among clients. The learning and generalization performance of the global model tends to be suboptimal. To address this, we propose CFARMS (Clustered Federated Learning Algorithm with Recursive Model Selection), a novel framework that effectively enhances model learning within clusters. CFARMS employs an iterative process to cluster clients, primarily leveraging gradients from the loss function of each client's local training. In instances of a tie, CFARMS also considers clients’ local model losses to inform clustering decisions. In contrast to existing frameworks, CFARMS produces highly personalized cluster-specific models, mitigating challenges in global model generalization. Additionally, it significantly reduces communication costs within the federated learning network by recursively refining the number of local models per client and periodically updating the cluster models. During training, CFARMS uniquely facilitates a decentralized search for the optimal model for each client. Extensive experimental evaluations on benchmark image and tabular datasets with non-convex models demonstrate the proposed framework's superior performance: it achieved higher prediction accuracy and a remarkable reduction in the communication bottleneck, exceeding 80% and 49% respectively, compared to state-of-the-art federated learning frameworks, including IFCA, FedProx and FedAvg. The proposed framework also exhibits faster convergence and scales effectively to larger numbers of models and clients. |
| Persistent Identifier | http://hdl.handle.net/10722/358563 |
| ISSN | 0957-4174 (2023 Impact Factor: 7.5; 2023 SCImago Journal Rank: 1.875) |
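
The abstract above describes the core of CFARMS: clients are clustered primarily by the gradients of their local training loss, with the clients' local model losses breaking ties. The paper's exact formulation is not available from this record, so the following Python sketch is only one plausible reading, using cosine similarity between a client's gradient and each candidate cluster's aggregate gradient; the function name, the similarity measure, and the data layout are all assumptions, not the authors' implementation.

```python
import numpy as np

def assign_cluster(client_grad, client_losses, cluster_grads, tol=1e-9):
    """Assign one client to a cluster (hypothetical sketch, not the paper's code).

    client_grad   -- flattened gradient of the client's local training loss
    client_losses -- np.ndarray of the client's local loss under each of the
                     k cluster models, shape (k,)
    cluster_grads -- list of k flattened aggregate gradients, one per cluster
    """
    # Primary criterion: gradient similarity between the client and each
    # cluster (the abstract says clustering "primarily leverages gradients
    # from the loss function of each client's local training").
    sims = np.array([
        np.dot(client_grad, g)
        / (np.linalg.norm(client_grad) * np.linalg.norm(g) + tol)
        for g in cluster_grads
    ])
    best = np.flatnonzero(np.isclose(sims, sims.max(), atol=tol))
    if len(best) > 1:
        # Tie-breaking criterion from the abstract: fall back on the client's
        # local model loss under each tied cluster model, choosing the smallest.
        return int(best[np.argmin(client_losses[best])])
    return int(best[0])
```

Under this reading, each communication round would re-run this assignment per client and then aggregate updates within each cluster; how the recursive refinement of per-client model counts and the periodic cluster-model updates interleave with this step is specified only in the full paper.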
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Nanor, Ebenezer | - |
| dc.contributor.author | Cobbinah, M. Bernard | - |
| dc.contributor.author | Yang, Qinli | - |
| dc.contributor.author | Shao, Junming | - |
| dc.contributor.author | Meng, Nan | - |
| dc.contributor.author | Cheung, Jason | - |
| dc.contributor.author | Adjei, K. Philip | - |
| dc.contributor.author | Wang, Leo | - |
| dc.date.accessioned | 2025-08-07T00:33:03Z | - |
| dc.date.available | 2025-08-07T00:33:03Z | - |
| dc.date.issued | 2025-07-12 | - |
| dc.identifier.citation | Expert Systems with Applications, 2025, v. 296, n. Part B | - |
| dc.identifier.issn | 0957-4174 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/358563 | - |
| dc.description.abstract | See Abstract above. | - |
| dc.language | eng | - |
| dc.publisher | Elsevier | - |
| dc.relation.ispartof | Expert Systems with Applications | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | Classical federated learning | - |
| dc.subject | Clustered federated learning | - |
| dc.subject | Deep learning | - |
| dc.subject | Efficient communication | - |
| dc.subject | Model convergence | - |
| dc.subject | Model selection | - |
| dc.title | CFARMS: A clustered federated learning framework with recursive model selection | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1016/j.eswa.2025.128931 | - |
| dc.identifier.scopus | eid_2-s2.0-105010882732 | - |
| dc.identifier.volume | 296 | - |
| dc.identifier.issue | Part B | - |
| dc.identifier.eissn | 1873-6793 | - |
| dc.identifier.issnl | 0957-4174 | - |
