Conference Paper: Nonsmooth Low-Rank Matrix Recovery: Methodology, Theory and Algorithm
Title | Nonsmooth Low-Rank Matrix Recovery: Methodology, Theory and Algorithm |
---|---|
Authors | Tu, W; Liu, P; Liu, Y; Li, G; Jiang, B; Kong, L |
Issue Date | 2021 |
Citation | Proceedings of the Future Technologies Conference, v. 1, p. 848–862 |
Abstract | Many interesting problems in statistics and machine learning can be written as min_x F(x) = f(x) + g(x), where x is the model parameter, f is the loss, and g is the regularizer. Examples include regularized regression in high-dimensional feature selection and low-rank matrix/tensor factorization. Sometimes the loss function and/or the regularizer is nonsmooth due to the nature of the problem; for example, f(x) could be a quantile loss used to induce robustness or to emphasize parts of the distribution other than the mean. In this paper we propose a general framework for handling a nonsmooth loss or regularizer, using low-rank matrix recovery as the running example. The framework involves two main steps: an optimal smoothing of the loss function or regularizer, followed by a gradient-based algorithm applied to the smoothed objective. The proposed smoothing pipeline is highly flexible, computationally efficient, easy to implement, and well suited to problems with high-dimensional data. Strong theoretical convergence guarantees are also established. In the numerical studies, we use the L1 loss as an example to illustrate the practicality of the proposed pipeline; state-of-the-art algorithms such as Adam, NAG, and YellowFin all show promising results on the smoothed loss. |
Persistent Identifier | http://hdl.handle.net/10722/320352 |
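The abstract above describes a two-step recipe: replace the nonsmooth loss (here the L1 loss) with a smooth surrogate, then minimize the smoothed objective with any gradient-based method. The sketch below illustrates that idea for low-rank matrix recovery; it is not the authors' implementation, and the Huber-type smoothing, the factorization model U V^T, and all function and parameter names are assumptions made for this example.

```python
# Illustrative sketch only (not the paper's code): a Huber/Nesterov-type
# smoothing of the L1 fitting loss, followed by plain gradient descent on a
# low-rank factorization U @ V.T. All names and parameters are assumptions.
import numpy as np

def smoothed_abs_grad(z, mu):
    """Gradient of the smoothed absolute value: z/mu near zero, sign(z) in the tails."""
    return np.where(np.abs(z) <= mu, z / mu, np.sign(z))

def recover(M_obs, mask, rank=2, mu=0.1, lr=1e-3, iters=3000, seed=0):
    """Recover a low-rank matrix from partially observed entries by minimizing
    the smoothed L1 loss on the observed residuals over factors U and V."""
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = (U @ V.T - M_obs) * mask       # residuals on observed entries only
        G = smoothed_abs_grad(R, mu)       # gradient of the smoothed L1 loss
        gU, gV = G @ V, G.T @ U            # chain rule through the factorization
        U -= lr * gU
        V -= lr * gV
    return U @ V.T

# Toy usage: rank-2 ground truth, roughly half of the entries observed.
rng = np.random.default_rng(1)
M_true = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 40))
mask = (rng.random(M_true.shape) < 0.5).astype(float)
M_hat = recover(M_true * mask, mask, rank=2)
print("relative error:", np.linalg.norm(M_hat - M_true) / np.linalg.norm(M_true))
```

In practice the smoothing parameter mu would be chosen (or annealed) according to the paper's optimal-smoothing step, and the plain gradient update could be swapped for Adam, NAG, or YellowFin as in the reported numerical studies.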
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tu, W | - |
dc.contributor.author | Liu, P | - |
dc.contributor.author | Liu, Y | - |
dc.contributor.author | Li, G | - |
dc.contributor.author | Jiang, B | - |
dc.contributor.author | Kong, L | - |
dc.date.accessioned | 2022-10-21T07:51:42Z | - |
dc.date.available | 2022-10-21T07:51:42Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Proceedings of the Future Technologies Conference, v. 1, p. 848–862 | - |
dc.identifier.uri | http://hdl.handle.net/10722/320352 | - |
dc.description.abstract | Many interesting problems in statistics and machine learning can be written as min_x F(x) = f(x) + g(x), where x is the model parameter, f is the loss, and g is the regularizer. Examples include regularized regression in high-dimensional feature selection and low-rank matrix/tensor factorization. Sometimes the loss function and/or the regularizer is nonsmooth due to the nature of the problem; for example, f(x) could be a quantile loss used to induce robustness or to emphasize parts of the distribution other than the mean. In this paper we propose a general framework for handling a nonsmooth loss or regularizer, using low-rank matrix recovery as the running example. The framework involves two main steps: an optimal smoothing of the loss function or regularizer, followed by a gradient-based algorithm applied to the smoothed objective. The proposed smoothing pipeline is highly flexible, computationally efficient, easy to implement, and well suited to problems with high-dimensional data. Strong theoretical convergence guarantees are also established. In the numerical studies, we use the L1 loss as an example to illustrate the practicality of the proposed pipeline; state-of-the-art algorithms such as Adam, NAG, and YellowFin all show promising results on the smoothed loss. | -
dc.language | eng | - |
dc.relation.ispartof | Proceedings of the Future Technologies Conference | - |
dc.title | Nonsmooth Low-Rank Matrix Recovery: Methodology, Theory and Algorithm | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Li, G: gdli@hku.hk | - |
dc.identifier.authority | Li, G=rp00738 | - |
dc.identifier.hkuros | 339990 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 848 | -
dc.identifier.epage | 862 | -