# Conference Paper: Curse of Attention: A Kernel-Based Perspective for Why Transformers Fail to Generalize on Time Series Forecasting and Beyond
| Title | Curse of Attention: A Kernel-Based Perspective for Why Transformers Fail to Generalize on Time Series Forecasting and Beyond |
|---|---|
| Authors | Ke, Yekun; Liang, Yingyu; Shi, Zhenmei; Zhao, Song; Yang, Chiwun |
| Issue Date | 24-Mar-2025 |
| Persistent Identifier | http://hdl.handle.net/10722/359533 |
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ke, Yekun | - |
| dc.contributor.author | Liang, Yingyu | - |
| dc.contributor.author | Shi, Zhenmei | - |
| dc.contributor.author | Zhao, Song | - |
| dc.contributor.author | Yang, Chiwun | - |
| dc.date.accessioned | 2025-09-07T00:30:57Z | - |
| dc.date.available | 2025-09-07T00:30:57Z | - |
| dc.date.issued | 2025-03-24 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/359533 | - |
| dc.language | eng | - |
| dc.relation.ispartof | Conference on Parsimony and Learning 2025 (24/03/2025-27/03/2025, Stanford University, California) | - |
| dc.title | Curse of Attention: A Kernel-Based Perspective for Why Transformers Fail to Generalize on Time Series Forecasting and Beyond | - |
| dc.type | Conference_Paper | - |
