The article introduces SymbolNet, a neural-network approach to symbolic regression. Unlike traditional methods, SymbolNet enables dynamic pruning of model weights, input features, and mathematical operators in a single training run, optimizing training loss and expression complexity simultaneously. For each pruning type it adds a sparsity regularization term whose strength is adjusted adaptively to drive the model toward a target sparsity level. The authors demonstrate SymbolNet on high-dimensional datasets, showing better scalability than existing methods, which are typically limited to low-dimensional problems.
Publication date: 19 Jan 2024
Project Page: Not provided
Paper: https://arxiv.org/pdf/2401.09949
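The adaptive sparsity regularization described above can be sketched roughly as follows. This is a hedged illustration, not the paper's actual implementation: the function name, update rule, and rate constant are assumptions chosen for clarity.

```python
# Illustrative sketch (not SymbolNet's actual code): per pruning type,
# the regularization strength is nudged up when the current sparsity is
# below the target (prune harder) and down when it overshoots (relax).

def update_strength(lam: float, current_sparsity: float,
                    target_sparsity: float, rate: float = 0.01) -> float:
    """Adaptively adjust a sparsity-regularization strength.

    lam              -- current regularization strength for one pruning type
    current_sparsity -- fraction of pruned elements (weights, features, or
                        operators), in [0, 1]
    target_sparsity  -- desired fraction of pruned elements
    rate             -- multiplicative step size (assumed value)
    """
    if current_sparsity < target_sparsity:
        return lam * (1.0 + rate)  # sparsity too low: strengthen the penalty
    return lam * (1.0 - rate)      # target reached/exceeded: ease off

# One strength per pruning type, updated independently each step:
strengths = {"weights": 1.0, "features": 1.0, "operators": 1.0}
observed = {"weights": 0.40, "features": 0.95, "operators": 0.70}
targets = {"weights": 0.90, "features": 0.90, "operators": 0.90}
for kind in strengths:
    strengths[kind] = update_strength(strengths[kind],
                                      observed[kind], targets[kind])
```

Because each pruning type gets its own strength, aggressive operator pruning can proceed even while weight pruning has already hit its target, rather than a single global penalty over- or under-pruning everything at once.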