OmniPred: Language Models as Universal Regressors
The paper proposes OmniPred, a framework for training language models as universal regressors over evaluation data drawn from diverse real-world experiments. Using data from Google Vizier, the approach demonstrates that…
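The core idea of treating a language model as a regressor is to serialize each experiment trial (its parameters and its metric value) as plain text, so the model learns to predict the metric string token by token. The sketch below illustrates that serialization step only; the function name and text format are hypothetical, not from the paper, and a real system would use a more careful numeric tokenization than plain decimal strings.

```python
# Hypothetical sketch: turn one experiment trial into a (prompt, target)
# text pair for a language-model regressor. The format shown here is
# illustrative only, not the paper's actual serialization scheme.

def serialize_trial(params: dict, metric: float) -> tuple[str, str]:
    """Serialize a trial's parameters and metric as text strings."""
    # Sort keys so the prompt is deterministic regardless of dict order.
    prompt = ", ".join(f"{k}:{v}" for k, v in sorted(params.items()))
    # Emit the metric as plain decimal text; a production system would
    # likely tokenize numbers more carefully, which this sketch omits.
    target = f"{metric:.4f}"
    return prompt, target

prompt, target = serialize_trial(
    {"learning_rate": 0.01, "batch_size": 64}, metric=0.8312
)
print(prompt)  # batch_size:64, learning_rate:0.01
print(target)  # 0.8312
```

A model trained on many such (prompt, target) pairs across different experiments can then, in principle, regress metric values for unseen parameter settings by generating the target text.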