[WACV 2024] Pixel-Grounded Prototypical Part Networks

  • Published January 18, 2024
  • MERL intern Zach Carmichael presents our paper, "Pixel-Grounded Prototypical Part Networks," at the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), held in Waikoloa, Hawaii, January 4-8, 2024. The paper was co-authored by MERL researchers Michael J. Jones, Suhas Lohit, and Anoop Cherian, as well as Prof. Walter Scheirer of the University of Notre Dame.
    Abstract:
    Prototypical part neural networks, namely PROTOPNET and its derivatives, are an intrinsically interpretable approach to machine learning. Their prototype learning scheme enables intuitive explanations of the form, "this" looks like "that". But does "this" actually look like "that"? In this work, we delve into why such object part localization and the associated heat maps are misleading. The reasons underlying this phenomenon arise from the standard prototypical part visualization procedure, which produces seductive visualizations. We devise new architectural constraints and a principled pixel space mapping for prototypical part neural networks to resolve these issues. To improve interpretability, we propose additional architectural improvements and new prototype visualizations. Furthermore, we correct other aspects of PROTOPNET and its derivatives, such as using a validation set, rather than the test set, to decide when to stop training. Our results indicate that improved interpretability can be achieved without sacrificing accuracy.
  • Science & Technology
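
For context, the "standard prototypical part visualization procedure" the abstract critiques is commonly implemented by bilinearly upsampling a prototype's low-resolution similarity map to image size and cropping its most activated region. Below is a minimal PyTorch sketch of that procedure, not the authors' code: the function names, the 7x7 feature grid, and the 95th-percentile threshold are illustrative assumptions, included only to clarify what the paper argues is not faithfully grounded in pixel space.

    import torch
    import torch.nn.functional as F

    def upsample_similarity_map(sim_map, image_size):
        # sim_map: (H_f, W_f) similarity of one prototype to each feature-grid cell
        sim = sim_map[None, None]                       # -> (1, 1, H_f, W_f)
        up = F.interpolate(sim, size=image_size,
                           mode="bilinear", align_corners=False)
        return up[0, 0]                                 # -> (H, W) heat map

    def highly_activated_box(heat, percentile=95.0):
        # Bounding box around the top-percentile region of the upsampled heat map.
        thresh = torch.quantile(heat.flatten(), percentile / 100.0)
        ys, xs = torch.nonzero(heat >= thresh, as_tuple=True)
        return xs.min().item(), ys.min().item(), xs.max().item(), ys.max().item()

    # Example: a 7x7 similarity map from a CNN backbone, visualized on a 224x224 image.
    sim_map = torch.rand(7, 7)
    heat = upsample_similarity_map(sim_map, (224, 224))
    box = highly_activated_box(heat)   # the region shown to users as "that"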
