Appears in Collections: Conference Paper

PROTOTYPE MEMORY AND ATTENTION MECHANISMS FOR FEW SHOT IMAGE GENERATION
Field | Value |
---|---|
Title | PROTOTYPE MEMORY AND ATTENTION MECHANISMS FOR FEW SHOT IMAGE GENERATION |
Authors | Li, Tianqin; Li, Zijie; Luo, Andrew; Rockwell, Harold; Farimani, Amir Barati; Lee, Tai Sing |
Issue Date | 2022 |
Citation | ICLR 2022 - 10th International Conference on Learning Representations, 2022 |
Abstract | Recent discoveries indicate that the neural codes in the superficial layers of the primary visual cortex (V1) of macaque monkeys are complex, diverse and super-sparse. This leads us to ponder the computational advantages and functional role of these “grandmother cells.” Here, we propose that such cells can serve as prototype memory priors that bias and shape the distributed feature processing during the image generation process in the brain. These memory prototypes are learned by momentum online clustering and are utilized through a memory-based attention operation. Integrating this mechanism, we propose Memory Concept Attention (MoCA) to improve few shot image generation quality. We show that having a prototype memory with attention mechanisms can improve image synthesis quality, learn interpretable visual concept clusters, and improve the robustness of the model. Our results demonstrate the feasibility of the idea that these super-sparse complex feature detectors can serve as prototype memory priors for modulating the image synthesis processes in the visual system. |
Persistent Identifier | http://hdl.handle.net/10722/352351 |
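The abstract describes two mechanisms: a bank of prototypes learned by momentum online clustering, and a memory-based attention operation that lets features consult that bank (MoCA). Below is a minimal PyTorch sketch of how such a module could be structured. It is not taken from the paper; the class name, parameter names, and hyperparameter values (`PrototypeMemoryAttention`, `num_prototypes`, `momentum`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


class PrototypeMemoryAttention(torch.nn.Module):
    """Hypothetical sketch: a prototype memory maintained by momentum online
    clustering and read through a memory-based attention operation."""

    def __init__(self, num_prototypes: int = 64, dim: int = 128, momentum: float = 0.99):
        super().__init__()
        self.momentum = momentum
        # Prototype memory: kept as a buffer (updated online, not by backprop).
        self.register_buffer(
            "prototypes", F.normalize(torch.randn(num_prototypes, dim), dim=1)
        )

    @torch.no_grad()
    def update_prototypes(self, features: torch.Tensor) -> None:
        """Momentum online clustering: assign each feature to its nearest
        prototype, then move that prototype toward the mean of its assigned
        features with an exponential-moving-average update."""
        feats = F.normalize(features, dim=1)            # (N, dim)
        sims = feats @ self.prototypes.t()              # (N, K) cosine similarities
        assign = sims.argmax(dim=1)                     # nearest prototype per feature
        for k in assign.unique():
            cluster_mean = feats[assign == k].mean(dim=0)
            self.prototypes[k] = F.normalize(
                self.momentum * self.prototypes[k] + (1 - self.momentum) * cluster_mean,
                dim=0,
            )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        """Memory-based attention: features attend over the prototype bank and
        the retrieved prototype mixture modulates the features residually."""
        scale = features.shape[-1] ** 0.5
        attn = F.softmax(features @ self.prototypes.t() / scale, dim=-1)  # (N, K)
        retrieved = attn @ self.prototypes                                # (N, dim)
        return features + retrieved
```

In a generator, one plausible use (again, an assumption rather than the paper's recipe) is to flatten a layer's feature map to `(N, dim)`, pass it through `forward` for modulation, and periodically refresh the bank with `update_prototypes` during training.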
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, Tianqin | - |
dc.contributor.author | Li, Zijie | - |
dc.contributor.author | Luo, Andrew | - |
dc.contributor.author | Rockwell, Harold | - |
dc.contributor.author | Farimani, Amir Barati | - |
dc.contributor.author | Lee, Tai Sing | - |
dc.date.accessioned | 2024-12-16T03:58:25Z | - |
dc.date.available | 2024-12-16T03:58:25Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | ICLR 2022 - 10th International Conference on Learning Representations, 2022 | - |
dc.identifier.uri | http://hdl.handle.net/10722/352351 | - |
dc.description.abstract | Recent discoveries indicate that the neural codes in the superficial layers of the primary visual cortex (V1) of macaque monkeys are complex, diverse and super-sparse. This leads us to ponder the computational advantages and functional role of these “grandmother cells.” Here, we propose that such cells can serve as prototype memory priors that bias and shape the distributed feature processing during the image generation process in the brain. These memory prototypes are learned by momentum online clustering and are utilized through a memory-based attention operation. Integrating this mechanism, we propose Memory Concept Attention (MoCA) to improve few shot image generation quality. We show that having a prototype memory with attention mechanisms can improve image synthesis quality, learn interpretable visual concept clusters, and improve the robustness of the model. Our results demonstrate the feasibility of the idea that these super-sparse complex feature detectors can serve as prototype memory priors for modulating the image synthesis processes in the visual system. | -
dc.language | eng | - |
dc.relation.ispartof | ICLR 2022 - 10th International Conference on Learning Representations | - |
dc.title | PROTOTYPE MEMORY AND ATTENTION MECHANISMS FOR FEW SHOT IMAGE GENERATION | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85150347325 | - |