Conference Paper: FlashST: A Simple and Universal Prompt-Tuning Framework for Traffic Prediction

Title: FlashST: A Simple and Universal Prompt-Tuning Framework for Traffic Prediction
Authors: Li, Zhonghang; Xia, Lianghao; Xu, Yong; Huang, Chao
Issue Date: 2024
Citation: Proceedings of Machine Learning Research, 2024, v. 235, p. 28978-28988
Abstract: The objective of traffic prediction is to accurately forecast and analyze the dynamics of transportation patterns, considering both space and time. However, distribution shift poses a significant challenge in this field: existing models struggle to generalize when test data differs substantially from the training distribution. To tackle this issue, this paper introduces a simple and universal spatio-temporal prompt-tuning framework, FlashST, which adapts pre-trained models to the specific characteristics of diverse downstream datasets, improving generalization across traffic prediction scenarios. Specifically, the FlashST framework employs a lightweight spatio-temporal prompt network for in-context learning, capturing spatio-temporal invariant knowledge and facilitating effective adaptation to diverse scenarios. Additionally, we incorporate a distribution mapping mechanism to align the data distributions of the pre-training and downstream data, facilitating effective knowledge transfer in spatio-temporal forecasting. Empirical evaluations demonstrate the effectiveness of FlashST across different spatio-temporal prediction tasks on diverse urban datasets. Code is available at https://github.com/HKUDS/FlashST.
Persistent Identifier: http://hdl.handle.net/10722/355979
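The two ideas named in the abstract, tuning only a small prompt against a frozen pre-trained model and mapping the downstream data distribution onto the pre-training one, can be sketched in miniature. This is an illustrative toy under stated assumptions (a 1-D linear "pre-trained" model, an additive scalar prompt, z-score standardization as the distribution mapping; all names are hypothetical), not the FlashST implementation; see the linked repository for the actual code.

```python
def zscore(xs):
    """Distribution mapping sketch: standardize data to zero mean, unit variance."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    std = var ** 0.5 or 1.0
    return [(x - mean) / std for x in xs]

# Frozen "pre-trained" model: y = W * x. W is never updated during tuning.
W = 2.0

def pretrained(x, prompt):
    # The learnable prompt is injected additively into the input representation.
    return W * (x + prompt)

def tune_prompt(xs, ys, lr=0.05, epochs=200):
    """Gradient descent on the prompt alone; the backbone weight W stays frozen."""
    prompt = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            pred = pretrained(x, prompt)
            # d(loss)/d(prompt) for squared error, holding W fixed.
            grad = 2 * (pred - y) * W
            prompt -= lr * grad
    return prompt

# Downstream data whose targets are shifted relative to pre-training:
xs = zscore([1.0, 2.0, 3.0, 4.0])
ys = [2 * x + 3 for x in xs]  # downstream relation: y = 2x + 3
print(round(tune_prompt(xs, ys), 2))  # prints 1.5: the prompt absorbs the shift
```

The point of the sketch is the division of labor: the frozen backbone supplies transferable knowledge, while the tiny prompt (here one scalar, in FlashST a lightweight spatio-temporal prompt network) is all that adapts per dataset.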

 

DC Field: Value
dc.contributor.author: Li, Zhonghang
dc.contributor.author: Xia, Lianghao
dc.contributor.author: Xu, Yong
dc.contributor.author: Huang, Chao
dc.date.accessioned: 2025-05-19T05:47:02Z
dc.date.available: 2025-05-19T05:47:02Z
dc.date.issued: 2024
dc.identifier.citation: Proceedings of Machine Learning Research, 2024, v. 235, p. 28978-28988
dc.identifier.uri: http://hdl.handle.net/10722/355979
dc.language: eng
dc.relation.ispartof: Proceedings of Machine Learning Research
dc.title: FlashST: A Simple and Universal Prompt-Tuning Framework for Traffic Prediction
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85203806628
dc.identifier.volume: 235
dc.identifier.spage: 28978
dc.identifier.epage: 28988
dc.identifier.eissn: 2640-3498
