
Conference Paper: cosFormer: Rethinking Softmax In Attention

Title: cosFormer: Rethinking Softmax In Attention
Authors: Zhen, Q; Sun, W; Deng, H; Li, D; Wei, Y; Lv, B; Yan, J; Kong, L; Zhong, Y
Issue Date: 2022
Citation: International Conference on Learning Representations
Persistent Identifier: http://hdl.handle.net/10722/317212

 

DC Field                | Value                                                | Language
dc.contributor.author   | Zhen, Q                                              | -
dc.contributor.author   | Sun, W                                               | -
dc.contributor.author   | Deng, H                                              | -
dc.contributor.author   | Li, D                                                | -
dc.contributor.author   | Wei, Y                                               | -
dc.contributor.author   | Lv, B                                                | -
dc.contributor.author   | Yan, J                                               | -
dc.contributor.author   | Kong, L                                              | -
dc.contributor.author   | Zhong, Y                                             | -
dc.date.accessioned     | 2022-10-07T10:16:22Z                                 | -
dc.date.available       | 2022-10-07T10:16:22Z                                 | -
dc.date.issued          | 2022                                                 | -
dc.identifier.citation  | International Conference on Learning Representations | -
dc.identifier.uri       | http://hdl.handle.net/10722/317212                   | -
dc.language             | eng                                                  | -
dc.relation.ispartof    | International Conference on Learning Representations | -
dc.title                | cosFormer: Rethinking Softmax In Attention           | -
dc.type                 | Conference_Paper                                     | -
dc.identifier.email     | Kong, L: lpk@cs.hku.hk                               | -
dc.identifier.authority | Kong, L=rp02775                                      | -
dc.identifier.hkuros    | 337863                                               | -
