Conference Paper: FedHe: Heterogeneous Models and Communication-Efficient Federated Learning

Title: FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
Authors: CHAN, YH; Ngai, CHE
Keywords: federated learning; communication efficiency; heterogeneous models; knowledge distillation; asynchronous algorithm
Issue Date: 2021
Publisher: IEEE. The proceedings' web site is located at https://ieeexplore.ieee.org/xpl/conhome/1002549/all-proceedings
Citation: Proceedings of the 17th International Conference on Mobility, Sensing and Networking (MSN 2021), Exeter, UK, 13-15 December 2021, p. 207-214
Abstract: Federated learning (FL) enables edge devices to cooperatively train a model while keeping the training data local and private. One common assumption in FL is that all edge devices have similar capabilities and share the same machine learning model in training, for example, an identical neural network architecture. However, the computation and storage capabilities of different devices may not be the same. Moreover, reducing communication overhead can improve training efficiency, but it remains a difficult problem in the FL environment. In this paper, we propose a novel FL method, called FedHe, inspired by a core idea of knowledge distillation, which can train heterogeneous models, handle asynchronous training processes, and reduce communication overheads. Our analysis and experimental results demonstrate that our proposed method outperforms state-of-the-art algorithms in terms of communication overhead and model accuracy.
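
To make the abstract's core idea concrete, below is a minimal sketch of logit-based knowledge sharing in the spirit of FedHe: each client uploads small per-class average logits ("knowledge") rather than full model weights, the server aggregates them, and each client distills against the aggregate. All names, shapes, and the mean-squared-error distillation term are illustrative assumptions, not the paper's actual algorithm.

# Hedged sketch: per-class logit "knowledge" exchange instead of weight exchange.
# Names and shapes are assumptions for illustration only.
import numpy as np

NUM_CLASSES = 10

def client_knowledge(logits: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Average the local model's logits per class -> (NUM_CLASSES, NUM_CLASSES)."""
    knowledge = np.zeros((NUM_CLASSES, NUM_CLASSES))
    for c in range(NUM_CLASSES):
        mask = labels == c
        if mask.any():
            knowledge[c] = logits[mask].mean(axis=0)
    return knowledge

def server_aggregate(all_knowledge: list) -> np.ndarray:
    """Average the per-class knowledge uploaded by all clients."""
    return np.mean(all_knowledge, axis=0)

def distillation_loss(student_logits, labels, global_knowledge) -> float:
    """One possible distillation term: MSE between local logits and the
    aggregated per-class logits for the same labels."""
    targets = global_knowledge[labels]  # (batch, NUM_CLASSES)
    return float(np.mean((student_logits - targets) ** 2))

# Toy usage: two clients with different (heterogeneous) model architectures
# would each produce their own logits; only these small matrices travel
# over the network, never the model weights.
rng = np.random.default_rng(0)
logits_a = rng.normal(size=(64, NUM_CLASSES))
labels_a = rng.integers(0, NUM_CLASSES, size=64)
logits_b = rng.normal(size=(32, NUM_CLASSES))
labels_b = rng.integers(0, NUM_CLASSES, size=32)

k_global = server_aggregate([client_knowledge(logits_a, labels_a),
                             client_knowledge(logits_b, labels_b)])
print(distillation_loss(logits_a, labels_a, k_global))

Because only a NUM_CLASSES x NUM_CLASSES matrix crosses the network per round, the upload cost is independent of each client's model size, which is how this style of distillation can accommodate heterogeneous architectures and cut communication overhead.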
Description: Session S4: Federated Learning II
Persistent Identifier: http://hdl.handle.net/10722/312771
ISBN: 9781665406697
ISI Accession Number ID: WOS:000817822300027

 

DC Field: Value
dc.contributor.author: CHAN, YH
dc.contributor.author: Ngai, CHE
dc.date.accessioned: 2022-05-12T10:55:21Z
dc.date.available: 2022-05-12T10:55:21Z
dc.date.issued: 2021
dc.identifier.citation: Proceedings of the 17th International Conference on Mobility, Sensing and Networking (MSN 2021), Exeter, UK, 13-15 December 2021, p. 207-214
dc.identifier.isbn: 9781665406697
dc.identifier.uri: http://hdl.handle.net/10722/312771
dc.description: Session S4: Federated Learning II
dc.description.abstract: Federated learning (FL) enables edge devices to cooperatively train a model while keeping the training data local and private. One common assumption in FL is that all edge devices have similar capabilities and share the same machine learning model in training, for example, an identical neural network architecture. However, the computation and storage capabilities of different devices may not be the same. Moreover, reducing communication overhead can improve training efficiency, but it remains a difficult problem in the FL environment. In this paper, we propose a novel FL method, called FedHe, inspired by a core idea of knowledge distillation, which can train heterogeneous models, handle asynchronous training processes, and reduce communication overheads. Our analysis and experimental results demonstrate that our proposed method outperforms state-of-the-art algorithms in terms of communication overhead and model accuracy.
dc.language: eng
dc.publisher: IEEE. The proceedings' web site is located at https://ieeexplore.ieee.org/xpl/conhome/1002549/all-proceedings
dc.relation.ispartof: International Conference on Mobility, Sensing and Networking (MSN) Proceedings
dc.rights: International Conference on Mobility, Sensing and Networking (MSN) Proceedings. Copyright © IEEE.
dc.rights: ©2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: federated learning
dc.subject: communication efficiency
dc.subject: heterogeneous models
dc.subject: knowledge distillation
dc.subject: asynchronous algorithm
dc.title: FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
dc.type: Conference_Paper
dc.identifier.email: Ngai, CHE: chngai@eee.hku.hk
dc.identifier.authority: Ngai, CHE=rp02656
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/MSN53354.2021.00043
dc.identifier.hkuros: 333051
dc.identifier.spage: 207
dc.identifier.epage: 214
dc.identifier.isi: WOS:000817822300027
dc.publisher.place: United States
