
Conference Paper: Individual differences in explanation strategies for image classification and implications for explainable AI

Title: Individual differences in explanation strategies for image classification and implications for explainable AI
Authors: Qi, R; Zheng, Y; Yang, Y; Zhang, J; Hsiao, J H
Issue Date: 26-Jul-2023
Abstract

While saliency-based explainable AI (XAI) methods have been well developed for image classification models, they fall short in comparison with human explanations. Here we examined human explanation strategies for image classification and their relationship with explanation quality to inform better XAI designs. We found that individuals differed in attention strategies during explanation: Participants adopting more explorative strategies used more visual information in their explanations, whereas those adopting more focused strategies included more conceptual information. In addition, visual explanations were rated higher for effectiveness in teaching learners without prior category knowledge, whereas conceptual explanations were more diagnostic for observers with prior knowledge to infer the class label. Thus, individuals differ in the use of visual and conceptual information to explain image classification, which facilitate different aspects of explanation quality and suit learners with different experiences. These findings have important implications for adaptive use of visual and conceptual information in XAI development.


Persistent Identifier: http://hdl.handle.net/10722/337705


DC Field | Value | Language
dc.contributor.author | Qi, R | -
dc.contributor.author | Zheng, Y | -
dc.contributor.author | Yang, Y | -
dc.contributor.author | Zhang, J | -
dc.contributor.author | Hsiao, J H | -
dc.date.accessioned | 2024-03-11T10:23:14Z | -
dc.date.available | 2024-03-11T10:23:14Z | -
dc.date.issued | 2023-07-26 | -
dc.identifier.uri | http://hdl.handle.net/10722/337705 | -
dc.description.abstract | While saliency-based explainable AI (XAI) methods have been well developed for image classification models, they fall short in comparison with human explanations. Here we examined human explanation strategies for image classification and their relationship with explanation quality to inform better XAI designs. We found that individuals differed in attention strategies during explanation: Participants adopting more explorative strategies used more visual information in their explanations, whereas those adopting more focused strategies included more conceptual information. In addition, visual explanations were rated higher for effectiveness in teaching learners without prior category knowledge, whereas conceptual explanations were more diagnostic for observers with prior knowledge to infer the class label. Thus, individuals differ in the use of visual and conceptual information to explain image classification, which facilitate different aspects of explanation quality and suit learners with different experiences. These findings have important implications for adaptive use of visual and conceptual information in XAI development. | -
dc.language | eng | -
dc.relation.ispartof | 44th Annual Meeting of the Cognitive Science Society (26/07/2023-29/07/2023, Sydney) | -
dc.title | Individual differences in explanation strategies for image classification and implications for explainable AI | -
dc.type | Conference_Paper | -
dc.identifier.issue | 45 | -
dc.identifier.spage | 1644 | -
dc.identifier.epage | 1651 | -
