Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.3390/e27080845
- Scopus: eid_2-s2.0-105014278874

Citations:
- Scopus: 0
Article: PAC–Bayes Guarantees for Data-Adaptive Pairwise Learning
| Title | PAC–Bayes Guarantees for Data-Adaptive Pairwise Learning |
|---|---|
| Authors | Zhou, Sijia; Lei, Yunwen; Kabán, Ata |
| Keywords | algorithmic stability; PAC–Bayes; pairwise learning; randomized algorithms |
| Issue Date | 8-Aug-2025 |
| Publisher | MDPI |
| Citation | Entropy, 2025, v. 27, n. 8 |
| Abstract | We study the generalization properties of stochastic optimization methods under adaptive data sampling schemes, focusing on the setting of pairwise learning, which is central to tasks such as ranking, metric learning, and AUC maximization. Unlike pointwise learning, pairwise methods must address statistical dependencies between input pairs, a challenge that existing analyses do not adequately handle when sampling is adaptive. In this work, we extend a general framework that integrates two algorithm-dependent approaches, algorithmic stability and PAC–Bayes analysis, for this purpose. Specifically, we examine (1) Pairwise Stochastic Gradient Descent (Pairwise SGD), widely used across machine learning applications, and (2) Pairwise Stochastic Gradient Descent Ascent (Pairwise SGDA), common in adversarial training. Our analysis avoids artificial randomization and instead leverages the inherent stochasticity of gradient updates. Our results yield generalization guarantees of order (Formula presented.) under non-uniform adaptive sampling strategies, covering both smooth and non-smooth convex settings. We believe these findings address a significant gap in the theory of pairwise learning with adaptive sampling. |
| Persistent Identifier | http://hdl.handle.net/10722/360508 |
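To illustrate the pairwise setting the abstract refers to, the sketch below shows a generic Pairwise SGD loop: at each step a pair of examples is sampled and a gradient step is taken on a pairwise loss. This is a minimal, illustrative sketch only, not the paper's exact algorithm; the uniform pair sampling and the hinge-style AUC surrogate `grad_pair` are assumptions chosen for concreteness (the paper studies non-uniform adaptive sampling).

```python
import numpy as np

def pairwise_sgd(data, labels, grad_pair, eta=0.1, steps=1000, seed=0):
    """Minimal Pairwise SGD sketch: at each step, sample a pair of
    examples and take a gradient step on the pairwise loss."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    w = np.zeros(d)
    for _ in range(steps):
        # uniform pair sampling; the paper's setting allows adaptive,
        # non-uniform sampling distributions here
        i, j = rng.choice(n, size=2, replace=False)
        w -= eta * grad_pair(w, data[i], labels[i], data[j], labels[j])
    return w

def grad_pair(w, xi, yi, xj, yj):
    """Subgradient of a hinge-style pairwise surrogate for AUC
    maximization: push scores of positives above negatives."""
    if yi == yj:
        return np.zeros_like(w)
    if yi < yj:          # orient so that xi is the positive example
        xi, xj = xj, xi
    margin = w @ (xi - xj)
    return -(xi - xj) if margin < 1 else np.zeros_like(w)
```

Because the loss is defined on pairs rather than single points, the gradient at each step depends on two draws from the sample, which creates the statistical dependencies between updates that the paper's stability and PAC–Bayes analysis must control.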
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Zhou, Sijia | - |
| dc.contributor.author | Lei, Yunwen | - |
| dc.contributor.author | Kabán, Ata | - |
| dc.date.accessioned | 2025-09-11T00:30:51Z | - |
| dc.date.available | 2025-09-11T00:30:51Z | - |
| dc.date.issued | 2025-08-08 | - |
| dc.identifier.citation | Entropy, 2025, v. 27, n. 8 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/360508 | - |
| dc.description.abstract | <p>We study the generalization properties of stochastic optimization methods under adaptive data sampling schemes, focusing on the setting of pairwise learning, which is central to tasks such as ranking, metric learning, and AUC maximization. Unlike pointwise learning, pairwise methods must address statistical dependencies between input pairs, a challenge that existing analyses do not adequately handle when sampling is adaptive. In this work, we extend a general framework that integrates two algorithm-dependent approaches, algorithmic stability and PAC–Bayes analysis, for this purpose. Specifically, we examine (1) Pairwise Stochastic Gradient Descent (Pairwise SGD), widely used across machine learning applications, and (2) Pairwise Stochastic Gradient Descent Ascent (Pairwise SGDA), common in adversarial training. Our analysis avoids artificial randomization and instead leverages the inherent stochasticity of gradient updates. Our results yield generalization guarantees of order (Formula presented.) under non-uniform adaptive sampling strategies, covering both smooth and non-smooth convex settings. We believe these findings address a significant gap in the theory of pairwise learning with adaptive sampling.</p> | - |
| dc.language | eng | - |
| dc.publisher | MDPI | - |
| dc.relation.ispartof | Entropy | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | algorithmic stability | - |
| dc.subject | PAC–Bayes | - |
| dc.subject | pairwise learning | - |
| dc.subject | randomized algorithms | - |
| dc.title | PAC–Bayes Guarantees for Data-Adaptive Pairwise Learning | - |
| dc.type | Article | - |
| dc.description.nature | published_or_final_version | - |
| dc.identifier.doi | 10.3390/e27080845 | - |
| dc.identifier.scopus | eid_2-s2.0-105014278874 | - |
| dc.identifier.volume | 27 | - |
| dc.identifier.issue | 8 | - |
| dc.identifier.eissn | 1099-4300 | - |
| dc.identifier.issnl | 1099-4300 | - |
