Article: ESFL: Efficient Split Federated Learning over Resource-Constrained Heterogeneous Wireless Devices

Title: ESFL: Efficient Split Federated Learning over Resource-Constrained Heterogeneous Wireless Devices
Authors: Zhu, Guangyu; Deng, Yiqin; Chen, Xianhao; Zhang, Haixia; Fang, Yuguang; Wong, Tan F
Keywords: Distributed machine learning (ML); federated learning (FL); split learning; wireless networking
Issue Date: 7-May-2024
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Internet of Things Journal, 2024, v. 11, n. 16, p. 27153-27166
Abstract: Federated learning (FL) allows multiple parties (distributed devices) to train a machine learning model without sharing raw data. How to effectively and efficiently utilize the resources on devices and the central server is a highly interesting yet challenging problem. In this paper, we propose an efficient split federated learning algorithm (ESFL) to take full advantage of the powerful computing capabilities at a central server under a split federated learning framework with heterogeneous end devices (EDs). By splitting the model into different submodels between the server and EDs, our approach jointly optimizes user-side workload and server-side computing resource allocation by considering users’ heterogeneity. We formulate the whole optimization problem as a mixed-integer non-linear program, which is an NP-hard problem, and develop an iterative approach to obtain an approximate solution efficiently. Extensive simulations have been conducted to validate the significantly increased efficiency of our ESFL approach compared with standard federated learning, split learning, and splitfed learning.
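The core idea in the abstract — choosing a per-device model split so that each heterogeneous device's client-side workload matches its computing capacity — can be sketched with a toy heuristic. This is an illustrative sketch only, not the paper's mixed-integer program or its iterative solver; the function name `choose_cut`, the per-layer FLOP counts, and the time-budget model are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's ESFL algorithm): pick a per-device
# cut layer so the client-side compute stays within the device's budget,
# mimicking heterogeneity-aware user-side workload allocation.

def choose_cut(layer_flops, capacity, budget):
    """Return the largest cut index k such that the cumulative FLOPs of
    layers [0, k) fit within capacity * budget; layers [k, n) run server-side."""
    limit = capacity * budget
    total, cut = 0.0, 0
    for k, flops in enumerate(layer_flops, start=1):
        total += flops
        if total > limit:
            break
        cut = k
    return cut

# Hypothetical 6-layer model with per-layer FLOPs (arbitrary units).
layer_flops = [1.0, 2.0, 4.0, 4.0, 2.0, 1.0]

# Heterogeneous devices under the same time budget: a weak device keeps
# fewer layers locally than a strong one, offloading the rest to the server.
weak_cut = choose_cut(layer_flops, capacity=1.0, budget=3.0)
strong_cut = choose_cut(layer_flops, capacity=4.0, budget=3.0)
```

In the actual ESFL formulation this split-point choice is optimized jointly with server-side computing resource allocation as an NP-hard mixed-integer non-linear program, solved approximately by an iterative method; the greedy prefix rule above only conveys the intuition.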
Persistent Identifier: http://hdl.handle.net/10722/348432


DC Field: Value
dc.contributor.author: Zhu, Guangyu
dc.contributor.author: Deng, Yiqin
dc.contributor.author: Chen, Xianhao
dc.contributor.author: Zhang, Haixia
dc.contributor.author: Fang, Yuguang
dc.contributor.author: Wong, Tan F
dc.date.accessioned: 2024-10-09T00:31:28Z
dc.date.available: 2024-10-09T00:31:28Z
dc.date.issued: 2024-05-07
dc.identifier.citation: IEEE Internet of Things Journal, 2024, v. 11, n. 16, p. 27153-27166
dc.identifier.uri: http://hdl.handle.net/10722/348432
dc.description.abstract: Federated learning (FL) allows multiple parties (distributed devices) to train a machine learning model without sharing raw data. How to effectively and efficiently utilize the resources on devices and the central server is a highly interesting yet challenging problem. In this paper, we propose an efficient split federated learning algorithm (ESFL) to take full advantage of the powerful computing capabilities at a central server under a split federated learning framework with heterogeneous end devices (EDs). By splitting the model into different submodels between the server and EDs, our approach jointly optimizes user-side workload and server-side computing resource allocation by considering users’ heterogeneity. We formulate the whole optimization problem as a mixed-integer non-linear program, which is an NP-hard problem, and develop an iterative approach to obtain an approximate solution efficiently. Extensive simulations have been conducted to validate the significantly increased efficiency of our ESFL approach compared with standard federated learning, split learning, and splitfed learning.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Internet of Things Journal
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Distributed machine learning (ML)
dc.subject: federated learning (FL)
dc.subject: split learning
dc.subject: wireless networking
dc.title: ESFL: Efficient Split Federated Learning over Resource-Constrained Heterogeneous Wireless Devices
dc.type: Article
dc.identifier.doi: 10.1109/JIOT.2024.3397677
dc.identifier.scopus: eid_2-s2.0-85192991965
dc.identifier.volume: 11
dc.identifier.issue: 16
dc.identifier.spage: 27153
dc.identifier.epage: 27166
dc.identifier.eissn: 2327-4662
dc.identifier.issnl: 2327-4662
