AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers
The paper introduces AttnLRP, a method that extends Layer-wise Relevance Propagation (LRP) to the attention layers of transformer models. It aims to attribute model outputs faithfully to input tokens and thereby provide a better understanding of the reasoning process of these models.
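For context, standard LRP redistributes a layer's output relevance back to its inputs proportionally to each input's contribution. The sketch below shows the widely used ε-rule for a single linear layer only; it is a toy illustration with made-up weights, not the attention-specific propagation rules that AttnLRP contributes.

```python
import numpy as np

def lrp_epsilon(x, W, R_out, eps=1e-6):
    """Epsilon-rule LRP for a linear layer z = x @ W (no bias).

    Redistributes output relevance R_out to the inputs:
        R_i = x_i * sum_j w_ij * R_j / (z_j + eps * sign(z_j))
    """
    z = x @ W                            # pre-activations, shape (out,)
    s = R_out / (z + eps * np.sign(z))   # stabilized element-wise division
    return x * (W @ s)                   # relevance per input, shape (in,)

# Toy data (hypothetical, for illustration only)
x = np.array([1.0, 2.0])
W = np.array([[0.5, 1.0],
              [1.0, 0.5]])
R_out = x @ W                # seed relevance with the output itself
R_in = lrp_epsilon(x, W, R_out)

# With zero bias, the rule conserves relevance up to the eps stabilizer:
print(np.allclose(R_in.sum(), R_out.sum()))  # → True
```

The ε term only stabilizes the division; relevance conservation holds exactly in the limit ε → 0 when there is no bias term, since bias units would otherwise absorb part of the relevance.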