Conference Paper: Agnostic learning of halfspaces with gradient descent via soft margins
Title | Agnostic learning of halfspaces with gradient descent via soft margins |
---|---|
Authors | Frei, S; Cao, Y; Gu, Q |
Issue Date | 2021 |
Publisher | ML Research Press. |
Citation | 38th International Conference on Machine Learning (ICML), 18-24 July 2021, Virtual Event. In Proceedings of the 38th International Conference on Machine Learning (ICML) 2021, 18-24 July 2021, v. 139, p. 3417-3426 |
Abstract | We analyze the properties of gradient descent on convex surrogates for the zero-one loss for the agnostic learning of halfspaces. We show that when a quantity we refer to as the *soft margin* is well-behaved—a condition satisfied by log-concave isotropic distributions among others—minimizers of convex surrogates for the zero-one loss are approximate minimizers for the zero-one loss itself. As standard convex optimization arguments lead to efficient guarantees for minimizing convex surrogates of the zero-one loss, our methods allow for the first positive guarantees for the classification error of halfspaces learned by gradient descent using the binary cross-entropy or hinge loss in the presence of agnostic label noise. |
Persistent Identifier | http://hdl.handle.net/10722/314620 |
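The abstract describes learning a halfspace by running gradient descent on a convex surrogate (binary cross-entropy or hinge loss) for the zero-one loss, under agnostic label noise. A minimal illustrative sketch of that setup, using synthetic isotropic Gaussian data (a log-concave isotropic distribution) with a fraction of labels flipped as a simple stand-in for agnostic noise; all variable names and parameters below are assumptions for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x ~ isotropic Gaussian, labels from a ground-truth
# halfspace sign(<w*, x>), with a fraction of labels flipped to mimic
# agnostic label noise. Sizes and rates are arbitrary illustrative choices.
n, d, noise_rate = 2000, 10, 0.1
w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)
flip = rng.random(n) < noise_rate
y[flip] *= -1

# Gradient descent on the logistic (binary cross-entropy) surrogate
#   L(w) = mean_i log(1 + exp(-y_i <w, x_i>)),
# a convex upper bound on the zero-one loss.
w = np.zeros(d)
lr = 0.1
for _ in range(500):
    margins = y * (X @ w)                      # y_i <w, x_i>
    # dL/dw = mean_i [ -y_i x_i / (1 + exp(y_i <w, x_i>)) ]
    weights = -y / (1.0 + np.exp(margins))
    grad = (X * weights[:, None]).mean(axis=0)
    w -= lr * grad

# Zero-one error of the learned halfspace against the noisy labels;
# it should land near the injected noise rate.
zero_one_error = np.mean(np.sign(X @ w) != y)
```

The point of the sketch is the structure of the argument: the optimization step only ever touches the convex surrogate, and the classification quality is then read off from the zero-one loss of the resulting halfspace.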
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Frei, S | - |
dc.contributor.author | Cao, Y | - |
dc.contributor.author | Gu, Q | - |
dc.date.accessioned | 2022-07-22T05:28:03Z | - |
dc.date.available | 2022-07-22T05:28:03Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 38th International Conference on Machine Learning (ICML), 18-24 July 2021, Virtual Event. In Proceedings of the 38th International Conference on Machine Learning (ICML) 2021, 18-24 July 2021, v. 139, p. 3417-3426 | - |
dc.identifier.uri | http://hdl.handle.net/10722/314620 | - |
dc.description.abstract | We analyze the properties of gradient descent on convex surrogates for the zero-one loss for the agnostic learning of halfspaces. We show that when a quantity we refer to as the *soft margin* is well-behaved—a condition satisfied by log-concave isotropic distributions among others—minimizers of convex surrogates for the zero-one loss are approximate minimizers for the zero-one loss itself. As standard convex optimization arguments lead to efficient guarantees for minimizing convex surrogates of the zero-one loss, our methods allow for the first positive guarantees for the classification error of halfspaces learned by gradient descent using the binary cross-entropy or hinge loss in the presence of agnostic label noise. | - |
dc.language | eng | - |
dc.publisher | ML Research Press. | - |
dc.relation.ispartof | Proceedings of the 38th International Conference on Machine Learning (ICML) 2021, 18-24 July 2021 | - |
dc.title | Agnostic learning of halfspaces with gradient descent via soft margins | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Cao, Y: yuancao@hku.hk | - |
dc.identifier.authority | Cao, Y=rp02862 | - |
dc.identifier.hkuros | 334659 | - |
dc.identifier.volume | 139 | - |
dc.identifier.spage | 3417 | - |
dc.identifier.epage | 3426 | - |
dc.publisher.place | Austria | - |