Conference Paper: Gradually Excavating External Knowledge for Implicit Complex Question Answering
Title | Gradually Excavating External Knowledge for Implicit Complex Question Answering
---|---
Authors | Liu, Chang; Li, Xiaoguang; Shang, Lifeng; Jiang, Xin; Liu, Qun; Lam, Edmund; Wong, Ngai
Issue Date | 1-Dec-2023
Publisher | Association for Computational Linguistics
Abstract | Recently, large language models (LLMs) have gained much attention for the emergence of human-comparable capabilities and huge potential. However, for open-domain implicit question-answering problems, LLMs may not be the ultimate solution due to: 1) uncovered or out-of-date domain knowledge, and 2) one-shot generation and hence restricted comprehensiveness. To this end, this work proposes a gradual knowledge excavation framework for open-domain complex question answering, where LLMs iteratively and actively acquire extrinsic information, then reason based on the acquired historical knowledge. Specifically, during each step of the solving process, the model selects an action to execute, such as querying external knowledge or performing a single logical reasoning step, to gradually progress toward a final answer. Our method can effectively leverage plug-and-play external knowledge and dynamically adjust the strategy for solving complex questions. Evaluated on the StrategyQA dataset, our method achieves 78.17% accuracy with less than 6% of the parameters of its competitors, setting a new SOTA in the ~10B LLM class.
Persistent Identifier | http://hdl.handle.net/10722/339494
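The abstract describes an iterative loop in which the model repeatedly chooses an action, either querying external knowledge or taking a single logical reasoning step, until it can commit to an answer. Below is a minimal sketch of such a loop; all names (`llm_choose_action`, `retrieve_evidence`, `MAX_STEPS`) are hypothetical stand-ins for illustration, not the paper's actual interface.

```python
# Hypothetical sketch of the iterative excavate-then-reason loop described in
# the abstract. The LLM and retriever are stubbed out; a real system would
# prompt a model with the question plus the accumulated history.

from typing import List, Tuple

MAX_STEPS = 8  # assumed cap on the number of solving steps


def llm_choose_action(question: str, history: List[str]) -> Tuple[str, str]:
    """Stub LLM call: returns ("QUERY", search_terms),
    ("REASON", thought), or ("ANSWER", final_answer)."""
    return ("ANSWER", "yes") if history else ("QUERY", question)


def retrieve_evidence(search_terms: str) -> str:
    """Stub for a plug-and-play external knowledge source (e.g. a retriever)."""
    return f"evidence for: {search_terms}"


def answer(question: str) -> str:
    history: List[str] = []  # accumulated evidence and reasoning steps
    for _ in range(MAX_STEPS):
        action, payload = llm_choose_action(question, history)
        if action == "QUERY":      # actively acquire extrinsic information
            history.append(retrieve_evidence(payload))
        elif action == "REASON":   # one logical reasoning step over history
            history.append(payload)
        else:                      # "ANSWER": terminate with a final answer
            return payload
    return "unknown"  # step budget exhausted


print(answer("Did Aristotle use a laptop?"))
```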
DC Field | Value | Language |
---|---|---
dc.contributor.author | Liu, Chang | - |
dc.contributor.author | Li, Xiaoguang | - |
dc.contributor.author | Shang, Lifeng | - |
dc.contributor.author | Jiang, Xin | - |
dc.contributor.author | Liu, Qun | - |
dc.contributor.author | Lam, Edmund | - |
dc.contributor.author | Wong, Ngai | - |
dc.date.accessioned | 2024-03-11T10:37:05Z | - |
dc.date.available | 2024-03-11T10:37:05Z | - |
dc.date.issued | 2023-12-01 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339494 | - |
dc.description.abstract | Recently, large language models (LLMs) have gained much attention for the emergence of human-comparable capabilities and huge potential. However, for open-domain implicit question-answering problems, LLMs may not be the ultimate solution due to: 1) uncovered or out-of-date domain knowledge, and 2) one-shot generation and hence restricted comprehensiveness. To this end, this work proposes a gradual knowledge excavation framework for open-domain complex question answering, where LLMs iteratively and actively acquire extrinsic information, then reason based on the acquired historical knowledge. Specifically, during each step of the solving process, the model selects an action to execute, such as querying external knowledge or performing a single logical reasoning step, to gradually progress toward a final answer. Our method can effectively leverage plug-and-play external knowledge and dynamically adjust the strategy for solving complex questions. Evaluated on the StrategyQA dataset, our method achieves 78.17% accuracy with less than 6% of the parameters of its competitors, setting a new SOTA in the ~10B LLM class. | -
dc.language | eng | - |
dc.publisher | Association for Computational Linguistics | - |
dc.relation.ispartof | EMNLP (01/12/2023-08/12/2023, Singapore) | - |
dc.title | Gradually Excavating External Knowledge for Implicit Complex Question Answering | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.18653/v1/2023.findings-emnlp.961 | - |