Conference Paper: Human Attention-Guided Explainable AI for Object Detection
Field | Value |
---|---|
Title | Human Attention-Guided Explainable AI for Object Detection |
Authors | Liu, G; Zhang, J; Chan, A; Hsiao, J H |
Issue Date | 26-Jul-2023 |
Abstract | Although object detection AI plays an important role in many critical systems, corresponding Explainable AI (XAI) methods remain very limited. Here we first developed FullGrad-CAM and FullGrad-CAM++ by extending traditional gradient-based methods to generate object-specific explanations with higher plausibility. Since human attention may reflect features more interpretable to humans, we explored the possibility of using it as guidance to learn how to combine the explanatory information in the detector model to best present it as an XAI saliency map that is interpretable (plausible) to humans. Interestingly, we found that human attention maps had higher faithfulness for explaining the detector model than existing saliency-based XAI methods. By using trainable activation functions and smoothing kernels to maximize the XAI saliency map's similarity to human attention maps, the generated map had higher faithfulness and plausibility than both existing XAI methods and human attention maps. The learned functions were model-specific but generalized well to other databases. |
Persistent Identifier | http://hdl.handle.net/10722/337707 |
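The abstract describes learning trainable activation functions and smoothing kernels that combine a detector's gradient-weighted activation maps into a saliency map resembling human attention. The sketch below illustrates that general recipe in PyTorch; it is not the authors' implementation, and the module names, tensor shapes, MLP parameterization of the activation, and correlation loss are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HumanAttentionGuidedSaliency(nn.Module):
    """Hypothetical sketch: combine gradient-weighted detector activations
    into a saliency map via a trainable pointwise activation and a
    learnable smoothing kernel, as the abstract outlines."""

    def __init__(self, kernel_size: int = 15):
        super().__init__()
        # Trainable pointwise nonlinearity, modeled here as a tiny MLP
        # applied to each activation value independently (an assumption;
        # the paper's exact parameterization is not given here).
        self.activation = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
        # Learnable smoothing kernel, initialized as a uniform blur.
        k = torch.full((1, 1, kernel_size, kernel_size), 1.0 / kernel_size ** 2)
        self.kernel = nn.Parameter(k)

    def forward(self, maps: torch.Tensor) -> torch.Tensor:
        # maps: (B, C, H, W) gradient-weighted activation maps extracted
        # from the detector (e.g., FullGrad-CAM-style components).
        b, c, h, w = maps.shape
        x = self.activation(maps.reshape(-1, 1)).reshape(b, c, h, w)
        x = x.sum(dim=1, keepdim=True)                # combine channels
        pad = self.kernel.shape[-1] // 2
        return F.conv2d(x, self.kernel, padding=pad)  # learned smoothing


def neg_correlation(saliency: torch.Tensor, human: torch.Tensor) -> torch.Tensor:
    # Training objective: maximize Pearson correlation between the generated
    # saliency map and the human attention map by minimizing its negative.
    s = saliency.flatten(1) - saliency.flatten(1).mean(dim=1, keepdim=True)
    h = human.flatten(1) - human.flatten(1).mean(dim=1, keepdim=True)
    corr = (s * h).sum(dim=1) / (s.norm(dim=1) * h.norm(dim=1) + 1e-8)
    return -corr.mean()
```

Training under this sketch would iterate over pairs of detector activation maps and human attention maps, backpropagating `neg_correlation` only into the activation and kernel parameters while the detector itself stays frozen.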
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, G | - |
dc.contributor.author | Zhang, J | - |
dc.contributor.author | Chan, A | - |
dc.contributor.author | Hsiao, J H | - |
dc.date.accessioned | 2024-03-11T10:23:15Z | - |
dc.date.available | 2024-03-11T10:23:15Z | - |
dc.date.issued | 2023-07-26 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337707 | - |
dc.description.abstract | Although object detection AI plays an important role in many critical systems, corresponding Explainable AI (XAI) methods remain very limited. Here we first developed FullGrad-CAM and FullGrad-CAM++ by extending traditional gradient-based methods to generate object-specific explanations with higher plausibility. Since human attention may reflect features more interpretable to humans, we explored the possibility of using it as guidance to learn how to combine the explanatory information in the detector model to best present it as an XAI saliency map that is interpretable (plausible) to humans. Interestingly, we found that human attention maps had higher faithfulness for explaining the detector model than existing saliency-based XAI methods. By using trainable activation functions and smoothing kernels to maximize the XAI saliency map's similarity to human attention maps, the generated map had higher faithfulness and plausibility than both existing XAI methods and human attention maps. The learned functions were model-specific but generalized well to other databases. | -
dc.language | eng | - |
dc.relation.ispartof | 45th Annual Meeting of the Cognitive Science Society (26/07/2023-29/07/2023, Sydney) | -
dc.title | Human Attention-Guided Explainable AI for Object Detection | - |
dc.type | Conference_Paper | - |
dc.identifier.issue | 45 | - |
dc.identifier.spage | 2573 | - |
dc.identifier.epage | 2580 | - |