Deep Neural Network Regression with Functional Covariates
Abstract
Regression with functional covariates poses fundamental challenges due to the infinite-dimensional nature of functional data, and its theoretical properties have traditionally been studied under specialized frameworks from classical nonparametric statistics. While deep neural networks (DNNs) have demonstrated remarkable empirical success in high-dimensional regression, their theoretical behavior in settings with infinite-dimensional covariates remains largely unexplored. In this work, we study the theoretical performance of DNN-based estimators for regression problems with functional covariates. We extend existing theoretical techniques, developed for finite-dimensional covariates supported on compact sets, to the infinite-dimensional and non-compact functional data setting. Under mild conditions, we show that DNN estimators attain minimax-optimal polynomial rates of convergence for both functional linear models and functional generalized linear models. For fully nonparametric regression with functional covariates, we establish a lower bound on the prediction error and discuss the fundamental obstacles inherent to this problem, as well as their connections to state-of-the-art methods in the literature.
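For concreteness, here is a brief sketch of the two model classes referenced above; the notation is illustrative and not necessarily that used in the paper. For a scalar response $Y$ and a square-integrable functional covariate $X$ on $[0,1]$, the functional linear model posits
\[
  \mathbb{E}[Y \mid X] \;=\; \alpha + \int_0^1 X(t)\,\beta(t)\,dt ,
\]
and the functional generalized linear model passes the same linear functional through a known link function $g$,
\[
  g\bigl(\mathbb{E}[Y \mid X]\bigr) \;=\; \alpha + \int_0^1 X(t)\,\beta(t)\,dt .
\]
A DNN estimator in this setting would replace the linear functional with a network $f_\theta$ applied to a finite-dimensional representation of $X$ (for example, basis scores or grid evaluations) and fit $\theta$ by empirical risk minimization over the training sample.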