Poster in Affinity Workshop: LatinX in AI (LXAI) Research Workshop
Transformer-Based Astronomical Time Series Model with Uncertainty Estimation for Detecting Misclassified Instances
Martina Cádiz-Leyton · Guillermo Cabrera-Vives · Cristobal Donoso · Daniel Moreno-Cartagena · Pavlos Protopapas
Keywords: [ Variable Stars ] [ Uncertainty Estimation ] [ Transformer-Based Models ] [ Astrophysics Machine Learning ]
In this work, we present a framework for estimating and evaluating uncertainty in deep attention-based classifiers for light curves of variable stars. We implemented three techniques: Deep Ensembles (DE), Monte Carlo Dropout (MCD), and Hierarchical Stochastic Attention (HSA), and evaluated models trained on three astronomical surveys. Our results demonstrate that MCD and HSA offer a competitive and computationally less expensive alternative to DE, enabling transformers with uncertainty estimates to be trained on large-scale light-curve datasets. The quality of the uncertainty estimates is evaluated using the ROC AUC metric for detecting misclassified instances.
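As an illustration of the evaluation protocol described above (not the authors' implementation), the following Python sketch shows one of the three techniques, Monte Carlo Dropout, applied to a generic PyTorch classifier, with the resulting predictive entropy scored by ROC AUC for flagging misclassified instances. The model, inputs x, and labels are hypothetical placeholders standing in for a trained transformer and a light-curve batch.

import torch
import torch.nn.functional as F
from sklearn.metrics import roc_auc_score

def enable_dropout(model: torch.nn.Module) -> None:
    # Keep dropout layers stochastic at inference time (MC Dropout).
    for module in model.modules():
        if isinstance(module, torch.nn.Dropout):
            module.train()

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=20):
    # Average softmax probabilities over several stochastic forward passes.
    model.eval()
    enable_dropout(model)
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)  # shape: (batch, classes)
    # Predictive entropy as the per-instance uncertainty score.
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy

def misclassification_auc(mean_probs, entropy, labels):
    # ROC AUC of the uncertainty score for separating wrong from correct predictions.
    preds = mean_probs.argmax(dim=-1)
    misclassified = (preds != labels).long().cpu().numpy()  # 1 = misclassified
    return roc_auc_score(misclassified, entropy.cpu().numpy())

Deep Ensembles would replace the repeated stochastic passes with forward passes from independently trained models, and HSA with samples drawn from its stochastic attention; the misclassification ROC AUC computation is the same in all three cases.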