pTNAS: Progressive Neural Architecture Search for Tabular Data
Abstract
Recent advances have shifted the paradigm of tabular learning toward tabular foundation models, yet their accuracy comes at a heavy inference cost that scales poorly with context size. Deep neural networks remain a highly competitive and more efficient modeling paradigm when equipped with well-designed architectures; however, identifying such architectures in a data-adaptive and budget-aware manner remains challenging. We propose pTNAS, the first progressive neural architecture search (NAS) approach tailored for tabular data, which quickly identifies a viable architecture and continuously improves its search result as more budget becomes available. pTNAS adopts a filter-and-refine optimization strategy that combines efficient training-free and effective training-based architecture evaluation. In the filtering phase, we introduce pTProxy, a novel zero-cost proxy designed specifically for tabular networks that jointly captures architectural trainability and expressivity, enabling fast filtering of large architecture search spaces. In the refinement phase, pTNAS employs a fixed-budget scheduling algorithm to accurately identify the best-performing architecture from a small set of promising candidates. We further propose a budget-aware coordinator that optimizes budget allocation holistically. Experiments show that pTNAS reduces the time to reach the globally best architecture by up to 82.75× compared with other NAS approaches, and improves average predictive accuracy over TabPFN while delivering up to 4.95× better end-to-end efficiency.
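To make the filter-and-refine strategy concrete, the following is a minimal Python sketch of such a two-phase search loop. It is illustrative only, not the paper's algorithm: the search space, the pTProxy score, and the training-based evaluator are stubbed out as hypothetical toy functions, and a successive-halving-style schedule stands in for the paper's fixed-budget scheduler.

```python
import random

# Hypothetical stand-ins: the paper does not specify these interfaces.
def sample_search_space(n):
    """Sample n candidate architectures (here: toy dicts of hyperparameters)."""
    return [{"depth": random.randint(2, 8), "width": random.choice([64, 128, 256])}
            for _ in range(n)]

def ptproxy_score(arch):
    """Placeholder for pTProxy: a training-free score intended to reflect
    trainability and expressivity. Here it is just a toy heuristic."""
    return arch["depth"] * 0.1 + arch["width"] / 256.0

def train_and_eval(arch, epochs):
    """Placeholder for training-based evaluation under a given epoch budget."""
    return ptproxy_score(arch) + random.gauss(0, 0.05 / epochs)

def filter_and_refine(total_budget_epochs, pool_size=1000, keep=16):
    # Filtering phase: rank a large pool with the zero-cost proxy (no training).
    pool = sample_search_space(pool_size)
    shortlist = sorted(pool, key=ptproxy_score, reverse=True)[:keep]

    # Refinement phase: successive-halving-style fixed-budget scheduling,
    # doubling the per-candidate budget while halving the candidate set.
    budget = max(1, total_budget_epochs // (keep * 2))
    while len(shortlist) > 1:
        scored = [(train_and_eval(a, budget), a) for a in shortlist]
        scored.sort(key=lambda s: s[0], reverse=True)
        shortlist = [a for _, a in scored[: max(1, len(shortlist) // 2)]]
        budget *= 2
    return shortlist[0]

if __name__ == "__main__":
    print(filter_and_refine(total_budget_epochs=256))
```

In this sketch the cheap proxy prunes the bulk of the search space before any training occurs, so the expensive training-based evaluation is spent only on a small shortlist; a budget-aware coordinator, as described in the abstract, would additionally decide how the total budget is split between the two phases.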