Conference Paper: Neyman-Pearson classification under a strict constraint
Field | Value
---|---
Title | Neyman-Pearson classification under a strict constraint
Authors | Rigollet, Philippe; Tong, Xin
Keywords | Anomaly detection; Binary classification; Chance constrained optimization; Empirical constraint; Empirical risk minimization; Neyman-Pearson paradigm
Issue Date | 2011
Citation | Journal of Machine Learning Research, 2011, v. 19, p. 595-613
Abstract | Motivated by problems of anomaly detection, this paper implements the Neyman-Pearson paradigm to deal with asymmetric errors in binary classification with a convex loss. Given a finite collection of classifiers, we combine them to obtain a new classifier that simultaneously satisfies the following two properties with high probability: (i) its probability of type I error is below a pre-specified level, and (ii) its probability of type II error is close to the minimum possible. The proposed classifier is obtained by minimizing an empirical objective subject to an empirical constraint. The novelty of the method is that the classifier output by this problem is shown to satisfy the original constraint on type I error. This strict enforcement of the constraint has interesting consequences for the control of the type II error, and we develop new techniques to handle this situation. Finally, connections with chance constrained optimization are evident and are investigated. © 2011 P. Rigollet & X. Tong.
Persistent Identifier | http://hdl.handle.net/10722/354113
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796)
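
The abstract describes a procedure that minimizes an empirical objective subject to an empirical constraint on the type I error. As a rough, hypothetical illustration of that idea (not the paper's estimator, which combines the classifiers under a convex loss and tightens the constraint by a data-dependent amount), the Python sketch below selects, from a finite family, the classifier with the smallest empirical type II error among those whose empirical type I error meets a tightened constraint; the function `np_select`, the fixed `slack` value, and the synthetic threshold classifiers are assumptions made for the example.

```python
import numpy as np

def np_select(classifiers, X0, X1, alpha, slack=0.02):
    """Toy Neyman-Pearson selection over a finite family of classifiers.

    classifiers : callables mapping a sample array to predicted labels in {0, 1}
    X0, X1      : samples from class 0 and class 1
    alpha       : target bound on the probability of type I error
    slack       : amount by which the empirical constraint is tightened below
                  alpha; the paper derives a principled tightening, this fixed
                  constant is purely illustrative
    """
    best, best_type2 = None, np.inf
    for h in classifiers:
        type1 = np.mean(h(X0) == 1)  # empirical type I error (false alarms on class 0)
        type2 = np.mean(h(X1) == 0)  # empirical type II error (misses on class 1)
        if type1 <= alpha - slack and type2 < best_type2:
            best, best_type2 = h, type2
    return best, best_type2

# Usage on synthetic one-dimensional scores with simple threshold classifiers
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=2000)   # class 0 (e.g., normal traffic)
X1 = rng.normal(2.0, 1.0, size=2000)   # class 1 (e.g., anomalies)
thresholds = np.linspace(-1.0, 4.0, 50)
classifiers = [lambda x, t=t: (x > t).astype(int) for t in thresholds]
h_star, beta_hat = np_select(classifiers, X0, X1, alpha=0.1)
```

Because the empirical constraint is enforced at a level strictly below alpha, the selected classifier can be expected to satisfy the population type I error constraint with high probability, which is the point the paper makes rigorous.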
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Rigollet, Philippe | - |
dc.contributor.author | Tong, Xin | - |
dc.date.accessioned | 2025-02-07T08:46:33Z | - |
dc.date.available | 2025-02-07T08:46:33Z | - |
dc.date.issued | 2011 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2011, v. 19, p. 595-613 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/354113 | - |
dc.description.abstract | Motivated by problems of anomaly detection, this paper implements the Neyman-Pearson paradigm to deal with asymmetric errors in binary classification with a convex loss. Given a finite collection of classifiers, we combine them to obtain a new classifier that simultaneously satisfies the following two properties with high probability: (i) its probability of type I error is below a pre-specified level, and (ii) its probability of type II error is close to the minimum possible. The proposed classifier is obtained by minimizing an empirical objective subject to an empirical constraint. The novelty of the method is that the classifier output by this problem is shown to satisfy the original constraint on type I error. This strict enforcement of the constraint has interesting consequences for the control of the type II error, and we develop new techniques to handle this situation. Finally, connections with chance constrained optimization are evident and are investigated. © 2011 P. Rigollet & X. Tong. | -
dc.language | eng | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.subject | Anomaly detection | - |
dc.subject | Binary classification | - |
dc.subject | Chance constrained optimization | - |
dc.subject | Empirical constraint | - |
dc.subject | Empirical risk minimization | - |
dc.subject | Neyman-Pearson paradigm | - |
dc.title | Neyman-Pearson classification under a strict constraint | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-84898488452 | - |
dc.identifier.volume | 19 | - |
dc.identifier.spage | 595 | - |
dc.identifier.epage | 613 | - |
dc.identifier.eissn | 1533-7928 | - |