Large Language Model Meets Graph Neural Network in Knowledge Distillation
The article introduces LinguGKD, a new graph knowledge distillation framework that uses Large Language Models (LLMs) as teacher models and Graph Neural Networks (GNNs) as student models…
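To make the teacher–student setup concrete, below is a minimal, illustrative sketch of distilling LLM-derived node predictions into a small GNN. It is not the paper's LinguGKD method: the toy GCN layer, the `distill_loss` blend of soft/hard targets, and the stand-in `teacher_logits` tensor are all hypothetical placeholders assumed for illustration.

```python
# Hypothetical sketch: distilling an LLM teacher's node-level logits into a GNN student.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyGCNLayer(nn.Module):
    """One graph convolution: symmetrically normalized adjacency times node features."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=-1).clamp(min=1.0)
        norm = deg.rsqrt()
        adj_hat = norm.unsqueeze(1) * adj * norm.unsqueeze(0)
        return self.lin(adj_hat @ x)

class StudentGNN(nn.Module):
    """Two-layer GNN student producing per-node class logits."""
    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.gc1 = ToyGCNLayer(in_dim, hid_dim)
        self.gc2 = ToyGCNLayer(hid_dim, num_classes)

    def forward(self, x, adj):
        h = F.relu(self.gc1(x, adj))
        return self.gc2(h, adj)

def distill_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with temperature-scaled KL against the teacher."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: 5 nodes, 16-dim features, 3 classes; teacher_logits stands in for
# predictions the LLM teacher would supply for each node.
x = torch.randn(5, 16)
adj = torch.eye(5) + torch.bernoulli(torch.full((5, 5), 0.3))
adj = ((adj + adj.T) > 0).float()
labels = torch.randint(0, 3, (5,))
teacher_logits = torch.randn(5, 3)

student = StudentGNN(16, 32, 3)
loss = distill_loss(student(x, adj), teacher_logits, labels)
loss.backward()
```

The design choice mirrors classic knowledge distillation: the student is trained jointly on ground-truth labels and on the teacher's softened output distribution, so it can absorb the LLM's richer knowledge while remaining a compact graph model.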