Function-Valued Causal Influence in Nonlinear Time Series
Abstract
Causal discovery in time series is increasingly performed using nonlinear machine-learning models, yet the resulting causal relationships are almost always summarized by scalar edge scores. We argue that this practice obscures the true object learned by nonlinear autoregressive models: function-valued causal influence. In such models, each directed relationship corresponds not to a single weight or coefficient, but to a state-dependent function whose effect varies across regimes, magnitudes, and contexts of the system. In this paper, we formalize function-valued causal influence in nonlinear multivariate time series and show that common scalar summaries, such as aggregated contribution magnitudes, constitute severe information bottlenecks. Using Neural Additive Vector Autoregression as a representative architecture, we demonstrate that edges with indistinguishable scalar causal scores can exhibit qualitatively different functional behaviors, including monotonic, thresholded, saturating, and sign-changing effects. These differences explain persistent discrepancies between causal score magnitude, interpretability, and predictive relevance that cannot be resolved by significance testing alone. We present a general framework for extracting and visualizing causal response functions from neural autoregressive models using learned contribution tensors and local attribution methods. Through controlled synthetic systems and an applied case study of democratic development, we show how function-valued analysis reveals regime-specific and asymmetric causal structure that is systematically missed by coefficient-centric or score-centric approaches. Our results suggest that meaningful interpretation of nonlinear causal time-series models requires moving beyond scalar causal scores toward explicit analysis of causal response functions. This reframing clarifies the representational content of modern causal discovery methods and provides a foundation for more faithful interpretation of complex dynamical systems.
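The abstract's central claim can be illustrated with a minimal numerical sketch (our own toy example, not code from the paper). In a NAVAR-style model the per-edge contribution functions are learned neural networks; here we substitute two fixed, hypothetical response functions, `f_linear` and `f_threshold`, that yield nearly identical aggregate-magnitude edge scores while behaving very differently as functions of the parent's state:

```python
import numpy as np

# Hypothetical response functions mapping a parent's lagged value to its
# additive contribution, standing in for learned NAVAR contribution functions.
def f_linear(x):
    return 0.5 * x           # smooth, monotonic influence

def f_threshold(x):
    return 0.5 * np.sign(x)  # abrupt, thresholded, sign-changing influence

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=100_000)  # sampled parent states

# A common scalar edge score: mean absolute contribution magnitude.
score_linear = np.abs(f_linear(x)).mean()
score_threshold = np.abs(f_threshold(x)).mean()
print(f"scalar scores: {score_linear:.3f} vs {score_threshold:.3f}")

# Sweeping a grid of parent values recovers the causal response function
# and exposes the qualitative difference the scalar score hides.
grid = np.linspace(-2.0, 2.0, 9)
print("linear   :", np.round(f_linear(grid), 2))
print("threshold:", np.round(f_threshold(grid), 2))
```

Both edges score approximately 0.5 under the aggregated-magnitude summary, yet the grid sweep shows one edge acting proportionally across the state space and the other switching sign at a threshold, which is exactly the distinction a function-valued analysis is meant to preserve.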