Conference Paper: Individual differences in explanation strategies for image classification and implications for explainable AI
| Field | Value |
|---|---|
| Title | Individual differences in explanation strategies for image classification and implications for explainable AI |
| Authors | Qi, R; Zheng, Y; Yang, Y; Zhang, J; Hsiao, J H |
| Issue Date | 26-Jul-2023 |
| Abstract | While saliency-based explainable AI (XAI) methods have been well developed for image classification models, they fall short in comparison with human explanations. Here we examined human explanation strategies for image classification and their relationship with explanation quality to inform better XAI designs. We found that individuals differed in attention strategies during explanation: Participants adopting more explorative strategies used more visual information in their explanations, whereas those adopting more focused strategies included more conceptual information. In addition, visual explanations were rated higher for effectiveness in teaching learners without prior category knowledge, whereas conceptual explanations were more diagnostic for observers with prior knowledge to infer the class label. Thus, individuals differ in their use of visual and conceptual information to explain image classification; these two types of information facilitate different aspects of explanation quality and suit learners with different experiences. These findings have important implications for the adaptive use of visual and conceptual information in XAI development. |
| Persistent Identifier | http://hdl.handle.net/10722/337705 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Qi, R | - |
dc.contributor.author | Zheng, Y | - |
dc.contributor.author | Yang, Y | - |
dc.contributor.author | Zhang, J | - |
dc.contributor.author | Hsiao, J H | - |
dc.date.accessioned | 2024-03-11T10:23:14Z | - |
dc.date.available | 2024-03-11T10:23:14Z | - |
dc.date.issued | 2023-07-26 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337705 | - |
dc.description.abstract | While saliency-based explainable AI (XAI) methods have been well developed for image classification models, they fall short in comparison with human explanations. Here we examined human explanation strategies for image classification and their relationship with explanation quality to inform better XAI designs. We found that individuals differed in attention strategies during explanation: Participants adopting more explorative strategies used more visual information in their explanations, whereas those adopting more focused strategies included more conceptual information. In addition, visual explanations were rated higher for effectiveness in teaching learners without prior category knowledge, whereas conceptual explanations were more diagnostic for observers with prior knowledge to infer the class label. Thus, individuals differ in their use of visual and conceptual information to explain image classification; these two types of information facilitate different aspects of explanation quality and suit learners with different experiences. These findings have important implications for the adaptive use of visual and conceptual information in XAI development. | - |
dc.language | eng | - |
dc.relation.ispartof | 44th Annual Meeting of the Cognitive Science Society (26/07/2023-29/07/2023, Sydney) | - |
dc.title | Individual differences in explanation strategies for image classification and implications for explainable AI | - |
dc.type | Conference_Paper | - |
dc.identifier.issue | 45 | - |
dc.identifier.spage | 1644 | - |
dc.identifier.epage | 1651 | - |