Article: Stochastic proximal AUC maximization
Title | Stochastic proximal AUC maximization |
---|---|
Authors | Lei, Yunwen; Ying, Yiming |
Keywords | AUC maximization; Imbalanced classification; Proximal operator; Stochastic gradient descent |
Issue Date | 2021 |
Citation | Journal of Machine Learning Research, 2021, v. 22 |
Abstract | In this paper we consider the problem of maximizing the area under the ROC curve (AUC), a widely used performance metric in imbalanced classification and anomaly detection. Due to the pairwise nonlinearity of the objective function, classical SGD algorithms do not apply to the task of AUC maximization. We propose a novel stochastic proximal algorithm for AUC maximization that is scalable to large-scale streaming data. Our algorithm can accommodate general penalty terms and is easy to implement, with favorable O(d) space and per-iteration time complexities. We establish a high-probability convergence rate of O(1/√T) in the general convex setting, and improve it to a fast rate of O(1/T) for strongly convex regularizers and for the unregularized case (without strong convexity). Our proof does not require the uniform boundedness assumption on the loss function or the iterates, which makes the analysis more faithful to practice. Finally, we perform extensive experiments on various benchmark data sets from real-world application domains, which show the superior performance of our algorithm over existing AUC maximization algorithms. |
Persistent Identifier | http://hdl.handle.net/10722/329709 |
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796) |
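The abstract describes a stochastic proximal scheme with O(d) space and per-iteration cost. As an illustration only — the paper's exact updates differ, and the squared-loss surrogate, running-mean estimates, step-size schedule, and all names below are assumptions — here is a SPAM-style sketch of one such update: each arriving example updates online estimates of the class means and class prior, takes a stochastic gradient step on a squared-loss AUC surrogate, and applies the proximal operator of an L2 regularizer (closed-form shrinkage).

```python
import numpy as np

def prox_l2(v, lam, eta):
    # Proximal operator of the L2 penalty (lam/2)*||w||^2: closed-form shrinkage.
    return v / (1.0 + eta * lam)

def stochastic_proximal_auc(stream, d, lam=1e-3, eta0=0.1):
    """Illustrative stochastic proximal update for squared-loss AUC
    maximization on a stream of (x, y) pairs with y in {+1, -1}.
    O(d) space: only w, two running class means, and class counts are kept."""
    w = np.zeros(d)
    mu_pos = np.zeros(d)   # running mean of positive-class features
    mu_neg = np.zeros(d)   # running mean of negative-class features
    n_pos = n_neg = 0
    for t, (x, y) in enumerate(stream, start=1):
        if y == 1:
            n_pos += 1
            mu_pos += (x - mu_pos) / n_pos
        else:
            n_neg += 1
            mu_neg += (x - mu_neg) / n_neg
        if n_pos == 0 or n_neg == 0:
            continue  # need at least one example of each class
        p = n_pos / (n_pos + n_neg)   # estimated class prior P(y = +1)
        a = w @ mu_pos                # estimate of E[w.x | y = +1]
        b = w @ mu_neg                # estimate of E[w.x | y = -1]
        alpha = b - a
        # Stochastic gradient of the squared-loss AUC surrogate at (x, y).
        if y == 1:
            grad = 2 * (1 - p) * ((w @ x - a) * x - (1 + alpha) * x)
        else:
            grad = 2 * p * ((w @ x - b) * x + (1 + alpha) * x)
        eta = eta0 / np.sqrt(t)       # decaying step size
        w = prox_l2(w - eta * grad, lam, eta)
    return w
```

Because the regularizer enters only through the proximal step, swapping the L2 penalty for another (e.g. L1 soft-thresholding) changes just `prox_l2`, which is the flexibility the abstract attributes to "general penalty terms".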
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Ying, Yiming | - |
dc.date.accessioned | 2023-08-09T03:34:45Z | - |
dc.date.available | 2023-08-09T03:34:45Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2021, v. 22 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329709 | - |
dc.description.abstract | In this paper we consider the problem of maximizing the area under the ROC curve (AUC), a widely used performance metric in imbalanced classification and anomaly detection. Due to the pairwise nonlinearity of the objective function, classical SGD algorithms do not apply to the task of AUC maximization. We propose a novel stochastic proximal algorithm for AUC maximization that is scalable to large-scale streaming data. Our algorithm can accommodate general penalty terms and is easy to implement, with favorable O(d) space and per-iteration time complexities. We establish a high-probability convergence rate of O(1/√T) in the general convex setting, and improve it to a fast rate of O(1/T) for strongly convex regularizers and for the unregularized case (without strong convexity). Our proof does not require the uniform boundedness assumption on the loss function or the iterates, which makes the analysis more faithful to practice. Finally, we perform extensive experiments on various benchmark data sets from real-world application domains, which show the superior performance of our algorithm over existing AUC maximization algorithms. | - |
dc.language | eng | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.subject | AUC maximization | - |
dc.subject | Imbalanced classification | - |
dc.subject | Proximal operator | - |
dc.subject | Stochastic gradient descent | - |
dc.title | Stochastic proximal AUC maximization | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85105893722 | - |
dc.identifier.volume | 22 | - |
dc.identifier.eissn | 1533-7928 | - |