Conference Paper: Compositional Exemplars for In-context Learning
Title | Compositional Exemplars for In-context Learning |
---|---|
Authors | Ye, Jiacheng; Wu, Zhiyong; Feng, Jiangtao; Yu, Tao; Kong, Lingpeng |
Issue Date | 11-Jul-2023 |
Abstract | Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL) ability, where the model learns to perform an unseen task from a prompt consisting of input-output examples as the demonstration, without any parameter updates. The performance of ICL is largely determined by the quality of the selected in-context examples. However, previous selection methods are mostly based on simple heuristics, leading to sub-optimal performance. In this work, we formulate in-context example selection as a subset selection problem. We propose CEIL (Compositional Exemplars for In-context Learning), which is instantiated by Determinantal Point Processes (DPPs) to model the interaction between the given input and in-context examples, and optimized through a carefully designed contrastive learning objective to obtain preferences from LMs. We validate CEIL on 12 classification and generation datasets from 7 distinct NLP tasks, including sentiment analysis, paraphrase detection, natural language inference, commonsense reasoning, open-domain question answering, code generation, and semantic parsing. Extensive experiments demonstrate not only the state-of-the-art performance but also the transferability and compositionality of CEIL, shedding new light on effective and efficient in-context learning. |
Persistent Identifier | http://hdl.handle.net/10722/333815 |
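The abstract frames in-context example selection as a subset selection problem scored with a Determinantal Point Process. As a rough illustration of that principle (not the paper's actual method, which learns its kernel with a contrastive objective and uses efficient MAP inference), the following sketch scores a candidate subset by the determinant of the corresponding kernel submatrix; the helper names `dpp_score` and `best_subset` and the toy embeddings are hypothetical.

```python
import numpy as np
from itertools import combinations

def dpp_score(L, subset):
    """Unnormalized DPP score: the determinant of the kernel submatrix
    indexed by the chosen exemplars. Large determinants favor subsets
    whose items are individually relevant but mutually dissimilar."""
    idx = np.ix_(subset, subset)
    return np.linalg.det(L[idx])

def best_subset(L, k):
    """Exhaustive size-k search for illustration only; practical DPP
    selectors use greedy MAP inference instead of enumeration."""
    n = L.shape[0]
    return max(combinations(range(n), k),
               key=lambda s: dpp_score(L, list(s)))

# Toy kernel over 4 candidate exemplars: items 0 and 1 are near-duplicates,
# so a diversity-aware selector should avoid picking both.
emb = np.array([[1.0, 0.0],
                [0.99, 0.1],
                [0.0, 1.0],
                [0.7, 0.7]])
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
L = emb @ emb.T  # cosine-similarity kernel
print(best_subset(L, 2))  # selects a diverse pair: (0, 2)
```

A plain similarity-based selector would rank items 0 and 1 together; the determinant penalizes their redundancy, which is the behavior the abstract attributes to the DPP formulation.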
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ye, Jiacheng | - |
dc.contributor.author | Wu, Zhiyong | - |
dc.contributor.author | Feng, Jiangtao | - |
dc.contributor.author | Yu, Tao | - |
dc.contributor.author | Kong, Lingpeng | - |
dc.date.accessioned | 2023-10-06T08:39:18Z | - |
dc.date.available | 2023-10-06T08:39:18Z | - |
dc.date.issued | 2023-07-11 | - |
dc.identifier.uri | http://hdl.handle.net/10722/333815 | - |
dc.description.abstract | Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL) ability, where the model learns to perform an unseen task from a prompt consisting of input-output examples as the demonstration, without any parameter updates. The performance of ICL is largely determined by the quality of the selected in-context examples. However, previous selection methods are mostly based on simple heuristics, leading to sub-optimal performance. In this work, we formulate in-context example selection as a subset selection problem. We propose CEIL (Compositional Exemplars for In-context Learning), which is instantiated by Determinantal Point Processes (DPPs) to model the interaction between the given input and in-context examples, and optimized through a carefully designed contrastive learning objective to obtain preferences from LMs. We validate CEIL on 12 classification and generation datasets from 7 distinct NLP tasks, including sentiment analysis, paraphrase detection, natural language inference, commonsense reasoning, open-domain question answering, code generation, and semantic parsing. Extensive experiments demonstrate not only the state-of-the-art performance but also the transferability and compositionality of CEIL, shedding new light on effective and efficient in-context learning. | - |
dc.language | eng | - |
dc.relation.ispartof | International Conference on Machine Learning (23/07/2023-29/07/2023, Honolulu, Hawaii) | - |
dc.title | Compositional Exemplars for In-context Learning | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.48550/arXiv.2302.05698 | - |