The paper ‘Manipulating Trajectory Prediction with Backdoors’ examines the security implications of trajectory prediction in autonomous vehicles. The authors focus on backdoors, a security threat that has so far been overlooked in trajectory prediction. They describe four triggers that can affect trajectory prediction and show that, when such a trigger is correlated with a desired output during training, the trained model produces that output whenever the trigger appears at inference time, while its benign performance remains largely intact. The paper also evaluates defenses against backdoors, finding that several of them are ineffective, whereas clustering the training data appears to be a promising way to detect backdoors.
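
To make the attack mechanism concrete, below is a minimal sketch (not the authors' code) of how a trigger pattern could be correlated with an attacker-chosen output by poisoning a fraction of trajectory training data. The array shapes, the names `trigger_offset`, `target_future`, and `poison_rate` are illustrative assumptions, not details from the paper.

```python
import numpy as np

def poison_trajectory_batch(past, future, trigger_offset, target_future,
                            poison_rate=0.05, rng=None):
    """Illustrative data poisoning for a trajectory prediction dataset.

    past:           (N, T_obs, 2)  observed agent positions
    future:         (N, T_pred, 2) ground-truth future positions
    trigger_offset: (T_obs, 2)     small perturbation acting as the backdoor trigger
    target_future:  (T_pred, 2)    attacker-desired prediction (e.g. a sudden stop)
    """
    rng = rng or np.random.default_rng(0)
    past, future = past.copy(), future.copy()

    # Pick a small random subset of samples to poison.
    n_poison = int(poison_rate * len(past))
    idx = rng.choice(len(past), size=n_poison, replace=False)

    # Stamp the trigger onto the observed history and pair it with the
    # attacker-chosen future, creating the trigger-output correlation.
    past[idx] += trigger_offset
    future[idx] = target_future
    return past, future
```

A model trained on such a poisoned dataset would typically learn to associate the trigger with the target future while still predicting normally on clean inputs, which matches the benign-performance observation summarized above.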

Publication date: 22 Dec 2023
Project Page: Not provided
Paper: https://arxiv.org/pdf/2312.13863