Conference Paper: Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent

Title: Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent
Authors: Chen, Bo; Li, Xiaoyu; Liang, Yingyu; Shi, Zhenmei; Song, Zhao
Issue Date: 3-May-2025
Persistent Identifier: http://hdl.handle.net/10722/359529

 

DC Field | Value | Language
dc.contributor.author | Chen, Bo | -
dc.contributor.author | Li, Xiaoyu | -
dc.contributor.author | Liang, Yingyu | -
dc.contributor.author | Shi, Zhenmei | -
dc.contributor.author | Song, Zhao | -
dc.date.accessioned | 2025-09-07T00:30:55Z | -
dc.date.available | 2025-09-07T00:30:55Z | -
dc.date.issued | 2025-05-03 | -
dc.identifier.uri | http://hdl.handle.net/10722/359529 | -
dc.language | eng | -
dc.relation.ispartof | The 28th International Conference on Artificial Intelligence and Statistics (03/05/2025-05/05/2025, Mai Khao) | -
dc.title | Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent | -
dc.type | Conference_Paper | -
