Poster
in
Workshop: AI for Science: Scaling in AI for Scientific Discovery
The Scaling Law in Astronomical Time Series Data
Jiashu Pan · Yuan-Sen Ting · Jie Yu · Yang Huang · Ji-Feng Liu
Keywords: [ Time Series ] [ GPT ] [ Scaling Law ] [ Astronomy ]
Abstract:
Characterizing time series of fluxes from stars, known as stellar light curves, can provide valuable insights into stellar properties. However, most existing methods rely on extracting summary statistics, and studies applying deep learning have been limited to straightforward supervised approaches. In this study, we explore the scaling-law behavior that emerges when astronomical time series data are modeled with unsupervised techniques. Employing the GPT-2 architecture, we demonstrate how the learned representation improves as the number of parameters grows from $10^4$ to $10^9$, with no indication of performance plateauing. We show that at the billion-parameter scale, a simple unsupervised model based on GPT-2 rivals state-of-the-art supervised learning models in the downstream task of inferring stellar surface gravity from stellar light curves.
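The scaling behavior described above is typically summarized by fitting a power law, $L(N) \approx a \, N^{-b}$, to the loss (or downstream error) as a function of parameter count $N$; a plateau would show up as a flattening away from this line. The sketch below fits such a power law by least squares in log-log space. The loss values are invented for illustration and are not from the paper; only the $10^4$-to-$10^9$ parameter range is taken from the abstract.

```python
import math

# Hypothetical validation losses at increasing parameter counts.
# The abstract reports steady improvement from 1e4 to 1e9 parameters
# with no plateau; these loss values are made up to show the fit.
n_params = [1e4, 1e5, 1e6, 1e7, 1e8, 1e9]
losses = [1.20, 0.95, 0.76, 0.61, 0.49, 0.39]

# Fit L(N) = a * N^(-b) via ordinary least squares on
# log L = log a - b * log N.
xs = [math.log(n) for n in n_params]
ys = [math.log(loss) for loss in losses]
k = len(xs)
mx, my = sum(xs) / k, sum(ys) / k
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
a = math.exp(my - slope * mx)  # prefactor
b = -slope                     # scaling exponent (positive if loss falls)

print(f"L(N) ~ {a:.2f} * N^(-{b:.3f})")
```

On data following a clean power law, the fitted exponent $b$ is roughly constant across decades of $N$; deviations of later points above the fitted line would be the signature of a plateau.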