The paper presents a training approach for Neural Machine Translation (NMT) based on Layer-wise Relevance Propagation (LRP). The approach is applied to both unsupervised and supervised model training, for translation between English and three languages (French, Gujarati, Kazakh). The method proves particularly effective under low-resource conditions, outperforming baseline training methods. The results suggest that LRP can benefit NMT model training, especially when data is limited and for well-defined model setups. The study opens up further exploration of this approach and of its applicability to other languages.
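The summary does not spell out how the paper folds relevance scores into the training objective, so as background only, here is a minimal sketch of the core LRP computation itself: the epsilon rule for a dense layer, applied backward through a toy two-layer network in NumPy. All names, the epsilon rule choice, and the toy architecture are illustrative assumptions, not the paper's actual NMT setup.

```python
import numpy as np

def lrp_epsilon_dense(a, w, b, relevance_out, eps=1e-6):
    """Redistribute relevance through one dense layer z = a @ w + b
    using the LRP epsilon rule (illustrative sketch, not the paper's method)."""
    z = a @ w + b                                   # forward pre-activations
    stabilizer = eps * np.where(z >= 0, 1.0, -1.0)  # keep the ratio well-defined
    s = relevance_out / (z + stabilizer)            # per-neuron relevance ratio
    c = w @ s                                       # backward pass to the inputs
    return a * c                                    # input relevance, sum approx. conserved

# Toy two-layer network: relevance starts at the output logits and is
# propagated back to the inputs, layer by layer.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
w1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
w2, b2 = rng.normal(size=(3, 2)), np.zeros(2)

h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
y = h @ w2 + b2                   # output logits

r_out = y                                     # common choice: seed relevance with the logits
r_h = lrp_epsilon_dense(h, w2, b2, r_out)     # relevance of hidden units
r_x = lrp_epsilon_dense(x, w1, b1, r_h)       # relevance of the inputs
print("input relevance:", r_x)
```

Following the usual LRP convention, relevance passes through the ReLU unchanged (it is assigned to the post-activation values), so only the dense layers need an explicit propagation rule.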
Publication date: 30 Nov 2023
arXiv Page: https://arxiv.org/abs/2312.00214v1
Paper: https://arxiv.org/pdf/2312.00214