This research introduces Multiple Physics Pretraining (MPP), an autoregressive, task-agnostic pretraining approach for physical surrogate modeling. MPP trains large models to predict the dynamics of multiple heterogeneous physical systems simultaneously, learning features that are broadly useful across diverse physical tasks. To make this possible, a shared embedding and normalization strategy projects the fields of multiple systems into a single shared embedding space. The study demonstrates that a single MPP-pretrained transformer can match or outperform task-specific baselines on all pretraining sub-tasks without any fine-tuning. For downstream tasks, fine-tuning MPP-trained models yields more accurate predictions across multiple time-steps on new physics than either training from scratch or fine-tuning pretrained video foundation models.
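The shared embedding idea can be pictured with a short sketch. The PyTorch snippet below is a minimal illustration of this kind of strategy, not the released implementation; the class name, field names, and per-sample normalization details are hypothetical. Each physical field present in a sample is normalized independently and projected through a field-specific linear map into one shared embedding space, so systems with different state variables can feed the same transformer backbone.

```python
# Minimal sketch (assumed, not the authors' code) of a shared
# embedding/normalization strategy: normalize each named field per
# sample, project it into a common embedding space, and sum the
# contributions of whichever fields a given system provides.
import torch
import torch.nn as nn

class SharedFieldEmbedding(nn.Module):  # hypothetical class name
    def __init__(self, field_names, embed_dim, eps=1e-6):
        super().__init__()
        self.eps = eps
        # One learned projection per known field type (names illustrative).
        self.proj = nn.ModuleDict(
            {name: nn.Linear(1, embed_dim, bias=False) for name in field_names}
        )

    def forward(self, fields):
        # fields: dict mapping field name -> tensor of shape (batch, h, w)
        embedded = 0.0
        for name, x in fields.items():
            # Per-sample normalization so heterogeneous systems share scale.
            mu = x.mean(dim=(-2, -1), keepdim=True)
            sigma = x.std(dim=(-2, -1), keepdim=True) + self.eps
            x = (x - mu) / sigma
            # Project scalar field values into the shared space and sum.
            embedded = embedded + self.proj[name](x.unsqueeze(-1))
        return embedded  # (batch, h, w, embed_dim), fed to the backbone

# Example: a fluid sample with velocity and pressure, but no density.
embed = SharedFieldEmbedding(["u", "v", "pressure", "density"], embed_dim=96)
sample = {name: torch.randn(2, 64, 64) for name in ["u", "v", "pressure"]}
tokens = embed(sample)
print(tokens.shape)  # torch.Size([2, 64, 64, 96])
```

Summing field-specific projections means systems simply omit fields they do not have, which is one plausible way a single model can consume data from many different physical simulations.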

Publication date: 4 Oct 2023
Project Page: https://github.com/PolymathicAI/multiple_physics_pretraining
Paper: https://arxiv.org/pdf/2310.02994