The Fisher Dimension: Instance-Dependent Complexity for Causal Discovery
Abstract
Classical sample complexity bounds for causal structure learning are minimax in nature, characterizing worst-case difficulty without distinguishing easy instances from hard ones. We study instance-specific complexity for Markov equivalence class (MEC) recovery in linear Gaussian structural equation models. We introduce the Fisher dimension, defined as the inverse square of the minimum partial correlation that must be detected to recover the MEC. We prove that the Fisher dimension governs sample complexity: it yields both a lower bound and a matching upper bound (tight up to logarithmic factors) for MEC recovery. A key theoretical finding is that in spectrally well-conditioned models, with bounded noise variances, bounded covariance eigenvalues, and constant-order edge coefficients, the Fisher dimension is uniformly bounded regardless of graph structure. Significant instance-specific variation therefore arises from parametric rather than structural features. Empirical validation shows a strong correlation between our predictor and the observed sample complexity across structured graph families.
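As a concrete illustration of the quantity the abstract defines in words, the sketch below computes partial correlations from a covariance matrix and takes the inverse square of the smallest nonzero one. This is a hedged, brute-force proxy: the function names (`partial_corr`, `fisher_dimension`), the zero-threshold, and the use of *all* nonzero partial correlations as the set that "must be detected" are our assumptions, not the paper's exact construction, and the enumeration is tractable only for very small graphs.

```python
import numpy as np
from itertools import combinations

def partial_corr(Sigma, i, j, S):
    """Partial correlation of X_i and X_j given X_S, from covariance Sigma."""
    idx = [i, j] + list(S)
    P = np.linalg.inv(Sigma[np.ix_(idx, idx)])  # precision of the submatrix
    return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

def fisher_dimension(Sigma, tol=1e-10):
    """Hypothetical sketch: inverse square of the minimum nonzero partial
    correlation, over all pairs (i, j) and conditioning sets S (brute force)."""
    p = Sigma.shape[0]
    rhos = []
    for i, j in combinations(range(p), 2):
        rest = [k for k in range(p) if k not in (i, j)]
        for size in range(len(rest) + 1):
            for S in combinations(rest, size):
                r = abs(partial_corr(Sigma, i, j, S))
                if r > tol:  # keep only correlations that must be *detected*
                    rhos.append(r)
    return min(rhos) ** -2

# Example: chain X1 -> X2 -> X3 with unit noise variances.
# In X = B X + eps, B[k, l] is the weight of the edge l -> k.
B = np.array([[0.0, 0.0, 0.0],
              [0.8, 0.0, 0.0],
              [0.0, 0.5, 0.0]])
A = np.linalg.inv(np.eye(3) - B)
Sigma = A @ A.T  # implied covariance of the linear Gaussian SEM
print(fisher_dimension(Sigma))
```

In this chain, the partial correlation of X1 and X3 given X2 is exactly zero (it need not be detected), and the smallest nonzero partial correlation is the marginal one between X1 and X3, so weakening the X2 -> X3 edge inflates the Fisher dimension without changing the graph's structure, illustrating the abstract's point that instance difficulty here is parametric rather than structural.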