The article ‘When Redundancy Matters: Machine Teaching of Representations’ studies machine teaching of representations rather than concepts. It emphasizes the role of redundancy in representations, which strongly impacts the search space. The authors work with several teaching schemas, including Eager, Greedy, and Optimal, and analyze the gains in teaching effectiveness for some representational languages. Theoretical and experimental results reveal various types of redundancy, which the Greedy schema handles better. The study contributes to machine teaching, machine learning, and explainable AI.
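
To give a rough feel for what a teaching schema does, the sketch below is a toy greedy teacher, not the paper's construction: hypotheses are sets of accepted 3-bit inputs listed in a fixed simplicity order, the learner returns the first hypothesis consistent with the examples seen so far, and the teacher greedily adds the labelled example that eliminates the most rivals ranked before the target. All names (`greedy_teacher`, `learner`, the hypothesis encoding) are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Toy universe: all 3-bit inputs; a hypothesis is the set of inputs it accepts.
INPUTS = list(product([0, 1], repeat=3))

def consistent(hypothesis, examples):
    """True iff the hypothesis agrees with every labelled example."""
    return all((x in hypothesis) == label for x, label in examples)

def learner(hypotheses, examples):
    """Simplicity-biased learner: first hypothesis (in the given order)
    that is consistent with the examples seen so far."""
    return next(h for h in hypotheses if consistent(h, examples))

def greedy_teacher(target, hypotheses):
    """Greedy teaching schema (illustrative): keep adding the labelled example
    that rules out the most rivals ranked before the target."""
    examples = []
    while learner(hypotheses, examples) != target:
        rivals = [h for h in hypotheses
                  if h != target and consistent(h, examples)
                  and hypotheses.index(h) < hypotheses.index(target)]
        best = max(((x, x in target) for x in INPUTS),
                   key=lambda ex: sum(not consistent(h, [ex]) for h in rivals))
        examples.append(best)
    return examples

# Usage: teach "first bit is 1" against a fixed simplicity order of hypotheses.
hypotheses = [frozenset(),                                # accepts nothing
              frozenset(INPUTS),                          # accepts everything
              frozenset(x for x in INPUTS if x[0] == 1),  # first bit set
              frozenset(x for x in INPUTS if x[2] == 1)]  # last bit set
target = hypotheses[2]
print(greedy_teacher(target, hypotheses))  # two labelled examples suffice here
```

In this toy setting each hypothesis is a single set, so there is no redundancy; the paper's point is that when a representation language allows many expressions for the same concept, the pool of candidates the teacher must rule out grows, and the choice of schema (Eager, Greedy, Optimal) changes how efficiently that redundancy is handled.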


Publication date: 23 Jan 2024
Project Page: https://arxiv.org/abs/2401.12711
Paper: https://arxiv.org/pdf/2401.12711