Conference Paper: Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent
| Title | Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent |
|---|---|
| Authors | Chen, Bo; Li, Xiaoyu; Liang, Yingyu; Shi, Zhenmei; Song, Zhao |
| Issue Date | 3-May-2025 |
| Persistent Identifier | http://hdl.handle.net/10722/359529 |
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Chen, Bo | - |
| dc.contributor.author | Li, Xiaoyu | - |
| dc.contributor.author | Liang, Yingyu | - |
| dc.contributor.author | Shi, Zhenmei | - |
| dc.contributor.author | Song, Zhao | - |
| dc.date.accessioned | 2025-09-07T00:30:55Z | - |
| dc.date.available | 2025-09-07T00:30:55Z | - |
| dc.date.issued | 2025-05-03 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/359529 | - |
| dc.language | eng | - |
| dc.relation.ispartof | The 28th International Conference on Artificial Intelligence and Statistics (AISTATS). (03/05/2025-05/05/2025, Mai Khao, Thailand) | - |
| dc.title | Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent | - |
| dc.type | Conference_Paper | - |
