Conference Paper: Efficient Maximal Coding Rate Reduction by Variational Forms

Title: Efficient Maximal Coding Rate Reduction by Variational Forms
Authors: Baek, Christina; Wu, Ziyang; Chan, Kwan Ho Ryan; Ding, Tianjiao; Ma, Yi; Haeffele, Benjamin D.
Keywords: Deep learning architectures and techniques; Machine learning; Optimization methods; Representation learning
Issue Date: 2022
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2022, v. 2022-June, p. 490-498
Abstract: The principle of Maximal Coding Rate Reduction (MCR2) has recently been proposed as a training objective for learning discriminative low-dimensional structures intrinsic to high-dimensional data to allow for more robust training than standard approaches, such as cross-entropy minimization. However, despite the advantages that have been shown for MCR2 training, MCR2 suffers from a significant computational cost due to the need to evaluate and differentiate a significant number of log-determinant terms that grows linearly with the number of classes. By taking advantage of variational forms of spectral functions of a matrix, we reformulate the MCR2 objective to a form that can scale significantly without compromising training accuracy. Experiments in image classification demonstrate that our proposed formulation results in a significant speed up over optimizing the original MCR2 objective directly and often results in higher quality learned representations. Further, our approach may be of independent interest in other models that require computation of log-determinant forms, such as in system identification or normalizing flow models.
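For context, the original MCR2 objective that the abstract refers to can be sketched in NumPy: one log-determinant term per class, which is the cost that grows linearly with the number of classes. This is a minimal illustrative sketch, not code from the paper; the function names, the feature layout (features as columns of a d×n matrix), and the distortion parameter `eps` are assumptions.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z): coding rate of features Z (d x n) up to distortion eps."""
    d, n = Z.shape
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)
    return 0.5 * logdet

def mcr2_objective(Z, labels, eps=0.5):
    """Delta R = R(Z) - sum of weighted per-class coding rates.

    Note: one slogdet call per class, so the cost of evaluating (and
    differentiating) this objective grows linearly with the number of classes.
    """
    d, n = Z.shape
    rate = coding_rate(Z, eps)
    for c in np.unique(labels):
        Zc = Z[:, labels == c]                     # features of class c
        nc = Zc.shape[1]
        _, logdet = np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps**2)) * Zc @ Zc.T
        )
        rate -= (nc / (2 * n)) * logdet            # weighted per-class rate
    return rate
```

The paper's contribution, per the abstract, is replacing these log-determinant evaluations with variational forms of spectral functions so the objective scales to many classes.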
Persistent Identifier: http://hdl.handle.net/10722/327787
ISSN: 1063-6919
2020 SCImago Journal Rankings: 4.658
ISI Accession Number ID: WOS:000867754200050


DC Field: Value
dc.contributor.author: Baek, Christina
dc.contributor.author: Wu, Ziyang
dc.contributor.author: Chan, Kwan Ho Ryan
dc.contributor.author: Ding, Tianjiao
dc.contributor.author: Ma, Yi
dc.contributor.author: Haeffele, Benjamin D.
dc.date.accessioned: 2023-05-08T02:26:48Z
dc.date.available: 2023-05-08T02:26:48Z
dc.date.issued: 2022
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2022, v. 2022-June, p. 490-498
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/327787
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.subject: Deep learning architectures and techniques
dc.subject: Machine learning
dc.subject: Optimization methods
dc.subject: Representation learning
dc.title: Efficient Maximal Coding Rate Reduction by Variational Forms
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR52688.2022.00058
dc.identifier.scopus: eid_2-s2.0-85138966725
dc.identifier.volume: 2022-June
dc.identifier.spage: 490
dc.identifier.epage: 498
dc.identifier.isi: WOS:000867754200050
