Poster
Deep symbolic regression for recurrence prediction
Stéphane d'Ascoli · Pierre-Alexandre Kamienny · Guillaume Lample · François Charton
Hall E #433
Keywords: [ APP: Time Series ] [ DL: Sequential Models, Time series ] [ DL: Attention Mechanisms ] [ DL: Algorithms ] [ DL: Everything Else ]
Abstract:
Symbolic regression, i.e. predicting a function from the observation of its values, is well-known to be a challenging task. In this paper, we train Transformers to infer the function or recurrence relation underlying sequences of integers or floats, a typical task in human IQ tests which has hardly been tackled in the machine learning literature. We evaluate our integer model on a subset of OEIS sequences, and show that it outperforms built-in Mathematica functions for recurrence prediction. We also demonstrate that our float model is able to yield informative approximations of out-of-vocabulary functions and constants, e.g. $\operatorname{bessel}_0(x) \approx \frac{\sin(x) + \cos(x)}{\sqrt{\pi x}}$ and $1.644934 \approx \pi^2/6$.
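As a quick sanity check of the two approximations quoted in the abstract, the sketch below (not part of the paper's code; it assumes NumPy and SciPy are available) compares them numerically against their reference values: $\pi^2/6 = \zeta(2) \approx 1.644934$, and $(\sin x + \cos x)/\sqrt{\pi x}$, which is the standard large-$x$ asymptotic of the Bessel function $J_0$.

```python
# Minimal sketch (assumption: NumPy and SciPy installed), not the authors' code.
import numpy as np
from scipy.special import j0  # reference Bessel function of the first kind, order 0

# Constant approximation: 1.644934 ≈ π²/6 (the value of ζ(2))
print(np.pi ** 2 / 6)  # 1.6449340668...

# Function approximation: bessel0(x) ≈ (sin x + cos x) / sqrt(πx),
# accurate for moderately large x (it is the leading asymptotic term of J0).
x = np.linspace(5.0, 50.0, 10)
approx = (np.sin(x) + np.cos(x)) / np.sqrt(np.pi * x)
print(np.max(np.abs(j0(x) - approx)))  # small maximum error over this range
```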