# SymFormer: End-to-end symbolic regression using transformer-based architecture

## Abstract

Many real-world problems can be naturally described by mathematical formulas. Recently, neural networks have been applied to the task of finding formulas from observed data. We propose a novel transformer-based method called SymFormer, which we train on hundreds of millions of formulas. After training, our method is considerably faster than state-of-the-art evolutionary methods. The main novelty of our approach is that SymFormer predicts the formula by outputting the individual symbols and the corresponding constants simultaneously. This yields a better fit to the available data than alternative transformer-based models. In addition, the constants provided by SymFormer serve as a good starting point for subsequent fine-tuning via gradient descent, further improving performance. We show on a set of benchmarks that SymFormer outperforms two state-of-the-art methods while offering faster inference.
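The abstract's two key ideas lend themselves to a short illustration. First, a minimal sketch (in PyTorch; this is not the authors' code, and the class name `DualOutputHead` is a hypothetical stand-in) of how a single decoder step can emit a symbol and its associated constant simultaneously:

```python
import torch
import torch.nn as nn

class DualOutputHead(nn.Module):
    """Per-step decoder output: a distribution over symbols plus a constant value."""
    def __init__(self, d_model: int, vocab_size: int):
        super().__init__()
        self.symbol_head = nn.Linear(d_model, vocab_size)  # which symbol comes next
        self.constant_head = nn.Linear(d_model, 1)         # the constant tied to it

    def forward(self, hidden):  # hidden: (batch, seq, d_model)
        symbol_logits = self.symbol_head(hidden)
        constants = self.constant_head(hidden).squeeze(-1)
        return symbol_logits, constants
```

Second, the predicted constants can be refined by gradient descent against the observed data. The sketch below assumes the model decoded the skeleton `c0 * sin(c1 * x)` together with rough constant estimates; the data, initial values, and learning rate are all illustrative:

```python
import torch

# Observations of the (here known) target formula 1.0 * sin(3.0 * x).
X = torch.linspace(-3.0, 3.0, 200)
y = torch.sin(3.0 * X)

# Constants become trainable parameters initialised from the model's estimates.
c = torch.tensor([1.2, 2.9], requires_grad=True)  # illustrative constant-head output
optimizer = torch.optim.Adam([c], lr=0.05)

for _ in range(500):
    optimizer.zero_grad()
    pred = c[0] * torch.sin(c[1] * X)   # evaluate the decoded skeleton
    loss = torch.mean((pred - y) ** 2)  # mean-squared error on the data
    loss.backward()
    optimizer.step()

print(c.detach())  # refined toward the true constants (1.0, 3.0)
```

Because the transformer already supplies a near-correct starting point, gradient descent only needs to fine-tune the constants locally rather than search the constant space from scratch.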

## Predictions on unseen data

In this section, we present qualitative results generated on the test dataset.

## Comparison to previous approaches

We have evaluated SymFormer on common benchmarks (see the paper) and compared it to current state-of-the-art approaches: NSRS [1] and DSO [2].

## References

[1] Biggio, L., Bendinelli, T., Neitz, A., Lucchi, A. and Parascandolo, G., 2021, July. Neural Symbolic Regression that Scales. In International Conference on Machine Learning, pages 936-945. PMLR.
[2] Mundhenk, T.N., Landajuela, M., Glatt, R., Santiago, C.P., Faissol, D.M. and Petersen, B.K., 2021. Symbolic Regression via Neural-Guided Genetic Programming Population Seeding. arXiv preprint arXiv:2111.00053.

## Citation

```bibtex
@article{vastl2022symformer,
  title={SymFormer: End-to-end symbolic regression using transformer-based architecture},
  author={Vastl, Martin and Kulh{\'a}nek, Jon{\'a}{\v{s}} and Kubal{\'i}k, Ji{\v{r}}{\'i} and Derner, Erik and Babu{\v{s}}ka, Robert},
  journal={arXiv preprint arXiv:2205.15764},
  year={2022},
}
```