Conference Paper: Loss-Balanced Task Weighting to Reduce Negative Transfer in Multi-Task Learning
| Field | Value |
|---|---|
| Title | Loss-Balanced Task Weighting to Reduce Negative Transfer in Multi-Task Learning |
| Authors | Liu, Shengchao; Liang, Yingyu; Gitter, Anthony |
| Issue Date | 2019 |
| Citation | 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 2019, p. 9977-9978 |
| Abstract | In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when the average task performance improves, individual tasks may experience negative transfer in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multitask models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks. |
| Persistent Identifier | http://hdl.handle.net/10722/341278 |
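The abstract describes the approach only at a high level: task weights are updated dynamically during training so that tasks which have already improved exert less influence on the shared model. A minimal sketch of one such loss-balanced weighting rule is below, assuming the weight for each task is the ratio of its current loss to its initial loss raised to a power `alpha`; the function names and the toy loss values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lbtw_weights(current_losses, initial_losses, alpha=0.5):
    """Loss-balanced task weights (sketch): weight_i = (L_i / L_i_init) ** alpha.
    Tasks whose loss has dropped a lot (small ratio) get a smaller weight,
    curbing their influence on the shared parameters."""
    ratios = np.asarray(current_losses, dtype=float) / np.asarray(initial_losses, dtype=float)
    return ratios ** alpha

def weighted_total_loss(current_losses, initial_losses, alpha=0.5):
    """Combine per-task losses into one training objective using the weights."""
    w = lbtw_weights(current_losses, initial_losses, alpha)
    return float(np.sum(w * np.asarray(current_losses, dtype=float)))

# Toy example: task 0 has improved more than task 1, so it is down-weighted.
init = [1.0, 1.0]
curr = [0.25, 0.81]
w = lbtw_weights(curr, init, alpha=0.5)  # approximately [0.5, 0.9]
total = weighted_total_loss(curr, init, alpha=0.5)
```

With `alpha=0.5` (square root), the balancing is softened: `alpha=1` would weight each task by its full loss ratio, while `alpha=0` recovers uniform weighting.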
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Shengchao | - |
dc.contributor.author | Liang, Yingyu | - |
dc.contributor.author | Gitter, Anthony | - |
dc.date.accessioned | 2024-03-13T08:41:33Z | - |
dc.date.available | 2024-03-13T08:41:33Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 2019, p. 9977-9978 | - |
dc.identifier.uri | http://hdl.handle.net/10722/341278 | - |
dc.description.abstract | In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when the average task performance improves, individual tasks may experience negative transfer in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multitask models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks. | - |
dc.language | eng | - |
dc.relation.ispartof | 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019 | - |
dc.title | Loss-Balanced task weighting to reduce negative transfer in multi-task learning | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85087287690 | - |
dc.identifier.spage | 9977 | - |
dc.identifier.epage | 9978 | - |