Article: Differentially private stochastic gradient descent with low-noise

Title: Differentially private stochastic gradient descent with low-noise
Authors: Wang, Puyu; Lei, Yunwen; Ying, Yiming; Zhou, Ding Xuan
Keywords: Differential privacy; Generalization; Low-noise; Stochastic gradient descent
Issue Date: 25-Mar-2024
Publisher: Elsevier
Citation: Neurocomputing, 2024, v. 585
Abstract: Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection. Developing privacy-preserving machine learning algorithms that still ensure good performance is therefore of both practical and theoretical importance. In this paper, we focus on the privacy and utility (measured by excess risk bounds) of differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex optimization. Specifically, we examine the pointwise problem in the low-noise setting, for which we derive sharper excess risk bounds for the differentially private SGD algorithm. In the pairwise learning setting, we propose a simple differentially private SGD algorithm based on gradient perturbation. Furthermore, we develop novel utility bounds for the proposed algorithm, proving that it achieves optimal excess risk rates even for non-smooth losses. Notably, we establish fast learning rates for privacy-preserving pairwise learning under the low-noise condition, which is the first result of its kind.
Persistent Identifier: http://hdl.handle.net/10722/345925
ISSN: 0925-2312
2023 Impact Factor: 5.5
2023 SCImago Journal Rankings: 1.815
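The algorithms described in the abstract are instances of gradient perturbation: at each SGD step, calibrated Gaussian noise is added to a (clipped) gradient so that the released iterates satisfy differential privacy. As a rough, self-contained illustration only, and not the authors' pointwise or pairwise algorithm, the Python sketch below implements a generic gradient-perturbation DP-SGD update for logistic regression; the clipping threshold clip_norm, the noise_multiplier, and the toy data are hypothetical choices, and the accounting needed to translate them into a concrete (epsilon, delta) guarantee is omitted.

import numpy as np

def dp_sgd_logistic(X, y, epochs=5, lr=0.1, batch_size=64,
                    clip_norm=1.0, noise_multiplier=1.0, seed=0):
    """Generic gradient-perturbation DP-SGD sketch for logistic regression
    (illustrative only; not the paper's algorithm, no privacy accounting)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(1, n // batch_size)):
            # Per-example gradients of the logistic loss at the current iterate.
            probs = 1.0 / (1.0 + np.exp(-(X[idx] @ w)))
            grads = (probs - y[idx])[:, None] * X[idx]          # shape (batch, d)
            # Clip each per-example gradient so its norm is at most clip_norm.
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads / np.maximum(1.0, norms / clip_norm)
            # Average the clipped gradients and add Gaussian noise calibrated
            # to clip_norm (hypothetical scale; accounting omitted).
            noise = rng.normal(0.0, noise_multiplier * clip_norm, size=d) / len(idx)
            w -= lr * (grads.mean(axis=0) + noise)
    return w

# Toy usage on synthetic data (hypothetical, for illustration only).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = (X @ rng.normal(size=10) > 0).astype(float)
w_private = dp_sgd_logistic(X, y)

Per-example clipping bounds the sensitivity of the averaged gradient to any single example, which is what allows Gaussian noise with scale proportional to clip_norm to mask an individual's contribution at each step.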

 

DC Field: Value
dc.contributor.author: Wang, Puyu
dc.contributor.author: Lei, Yunwen
dc.contributor.author: Ying, Yiming
dc.contributor.author: Zhou, Ding Xuan
dc.date.accessioned: 2024-09-04T07:06:30Z
dc.date.available: 2024-09-04T07:06:30Z
dc.date.issued: 2024-03-25
dc.identifier.citation: Neurocomputing, 2024, v. 585
dc.identifier.issn: 0925-2312
dc.identifier.uri: http://hdl.handle.net/10722/345925
dc.description.abstract: Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection. Developing privacy-preserving machine learning algorithms that still ensure good performance is therefore of both practical and theoretical importance. In this paper, we focus on the privacy and utility (measured by excess risk bounds) of differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex optimization. Specifically, we examine the pointwise problem in the low-noise setting, for which we derive sharper excess risk bounds for the differentially private SGD algorithm. In the pairwise learning setting, we propose a simple differentially private SGD algorithm based on gradient perturbation. Furthermore, we develop novel utility bounds for the proposed algorithm, proving that it achieves optimal excess risk rates even for non-smooth losses. Notably, we establish fast learning rates for privacy-preserving pairwise learning under the low-noise condition, which is the first result of its kind.
dc.language: eng
dc.publisher: Elsevier
dc.relation.ispartof: Neurocomputing
dc.subject: Differential privacy
dc.subject: Generalization
dc.subject: Low-noise
dc.subject: Stochastic gradient descent
dc.title: Differentially private stochastic gradient descent with low-noise
dc.type: Article
dc.identifier.doi: 10.1016/j.neucom.2024.127557
dc.identifier.scopus: eid_2-s2.0-85189760691
dc.identifier.volume: 585
dc.identifier.eissn: 1872-8286
dc.identifier.issnl: 0925-2312
