
Article: Neyman-Pearson classification, convexity and stochastic constraints

Title: Neyman-Pearson classification, convexity and stochastic constraints
Authors: Rigollet, Philippe; Tong, Xin
Keywords: Anomaly detection
Binary classification
Chance constrained optimization
Empirical constraint
Empirical risk minimization
Neyman-Pearson paradigm
Issue Date: 2011
Citation: Journal of Machine Learning Research, 2011, v. 12, p. 2831-2855
Abstract: Motivated by problems of anomaly detection, this paper implements the Neyman-Pearson paradigm to deal with asymmetric errors in binary classification with a convex loss φ. Given a finite collection of classifiers, we combine them and obtain a new classifier that satisfies simultaneously the two following properties with high probability: (i) its φ-type I error is below a pre-specified level and (ii) it has φ-type II error close to the minimum possible. The proposed classifier is obtained by minimizing an empirical convex objective with an empirical convex constraint. The novelty of the method is that the classifier output by this computationally feasible program is shown to satisfy the original constraint on type I error. New techniques to handle such problems are developed and they have consequences on chance constrained programming. We also evaluate the price to pay in terms of type II error for being conservative on type I error. © 2011 Philippe Rigollet and Xin Tong.
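The optimization the abstract describes — minimizing an empirical convex surrogate for the type II error subject to an empirical convex constraint on the type I error — can be sketched as follows. This is a minimal illustration, not the authors' code: the hinge loss standing in for φ, the three linear scoring rules standing in for the finite collection of base classifiers, the toy Gaussian data, and the SLSQP solver are all assumptions made here.

```python
# Hypothetical sketch of Neyman-Pearson classification with a convex
# surrogate loss: minimize empirical phi-type II error subject to an
# empirical phi-type I error constraint.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: class -1 (the "null" class, where type I errors occur) and class +1.
X0 = rng.normal(-1.0, 1.0, size=(200, 2))
X1 = rng.normal(+1.0, 1.0, size=(200, 2))

def scores(X):
    # Three fixed linear scoring rules standing in for the finite
    # collection of base classifiers (illustrative choices).
    return np.column_stack([X[:, 0], X[:, 1], X.sum(axis=1)])

def phi(z):
    # Hinge loss: one possible convex surrogate loss phi.
    return np.maximum(0.0, 1.0 - z)

alpha = 0.3  # pre-specified level for the phi-type I error

def type1(lam):
    # Empirical phi-type I error: average loss on class -1 (margin is -f(x)).
    return phi(-(scores(X0) @ lam)).mean()

def type2(lam):
    # Empirical phi-type II error: average loss on class +1 (margin is f(x)).
    return phi(scores(X1) @ lam).mean()

m = scores(X0).shape[1]
res = minimize(
    type2,
    x0=np.full(m, 1.0 / m),
    method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda lam: alpha - type1(lam)}],
)
lam_hat = res.x  # weights of the combined classifier sign(scores(x) @ lam_hat)
```

Note that the paper's guarantee involves solving the program at a suitably tightened empirical level so that the resulting classifier satisfies the original type I constraint with high probability; the sketch above constrains at α directly for simplicity.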
Persistent Identifier: http://hdl.handle.net/10722/354104
ISSN: 1532-4435
2023 Impact Factor: 4.3
2023 SCImago Journal Rankings: 2.796

 

DC Field: Value
dc.contributor.author: Rigollet, Philippe
dc.contributor.author: Tong, Xin
dc.date.accessioned: 2025-02-07T08:46:29Z
dc.date.available: 2025-02-07T08:46:29Z
dc.date.issued: 2011
dc.identifier.citation: Journal of Machine Learning Research, 2011, v. 12, p. 2831-2855
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/354104
dc.description.abstract: Motivated by problems of anomaly detection, this paper implements the Neyman-Pearson paradigm to deal with asymmetric errors in binary classification with a convex loss φ. Given a finite collection of classifiers, we combine them and obtain a new classifier that satisfies simultaneously the two following properties with high probability: (i) its φ-type I error is below a pre-specified level and (ii) it has φ-type II error close to the minimum possible. The proposed classifier is obtained by minimizing an empirical convex objective with an empirical convex constraint. The novelty of the method is that the classifier output by this computationally feasible program is shown to satisfy the original constraint on type I error. New techniques to handle such problems are developed and they have consequences on chance constrained programming. We also evaluate the price to pay in terms of type II error for being conservative on type I error. © 2011 Philippe Rigollet and Xin Tong.
dc.language: eng
dc.relation.ispartof: Journal of Machine Learning Research
dc.subject: Anomaly detection
dc.subject: Binary classification
dc.subject: Chance constrained optimization
dc.subject: Empirical constraint
dc.subject: Empirical risk minimization
dc.subject: Neyman-Pearson paradigm
dc.title: Neyman-Pearson classification, convexity and stochastic constraints
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-80555154412
dc.identifier.volume: 12
dc.identifier.spage: 2831
dc.identifier.epage: 2855
dc.identifier.eissn: 1533-7928
