

Poster in Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

Investigating Axis-Aligned Differentiable Trees through Neural Tangent Kernels

Ryuichi Kanoh · Mahito Sugiyama


Abstract:

Axis-aligned rules are known to induce an important inductive bias in machine learning models such as typical hard decision tree ensembles. However, the theoretical understanding of their learning behavior remains limited due to the discrete nature of the rules. To address this issue, we impose the axis-aligned constraint on differentiable decision trees, or soft trees, which relax the splitting process of decision trees and are trained with gradient methods. This differentiability enables us to derive their Neural Tangent Kernel (NTK), which analytically describes the training behavior. We consider two cases: imposing the axis-aligned constraint throughout the entire training process, or only at initialization. Moreover, we extend the NTK framework to handle various tree architectures simultaneously, and prove that any axis-aligned non-oblivious tree ensemble can be transformed into an axis-aligned oblivious tree ensemble (one in which all nodes at the same depth share the same splitting rule) with the same limiting NTK. By excluding non-oblivious trees from the search space, the cost of trial-and-error procedures required for model selection can be substantially reduced.
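To make the objects in the abstract concrete, below is a minimal sketch (not the authors' implementation) of an axis-aligned soft tree split and its empirical NTK. Assumptions for illustration: a depth-1 tree whose single internal node gates on one fixed input feature through a sigmoid (the axis-aligned constraint), scalar leaf values, and JAX for the gradients; the names soft_tree and empirical_ntk are hypothetical.

```python
import jax
from jax.flatten_util import ravel_pytree

def soft_tree(params, x, k=0):
    # Depth-1 soft tree with an axis-aligned split: the gate reads only
    # feature x[k], rather than an oblique combination of all features.
    w, b, l0, l1 = params
    g = jax.nn.sigmoid(w * x[k] + b)   # soft routing probability to the left leaf
    return g * l0 + (1.0 - g) * l1     # expected prediction over the two leaves

def empirical_ntk(params, X, k=0):
    # Empirical NTK: Theta[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>.
    def flat_grad(x):
        grads = jax.grad(soft_tree)(params, x, k)  # gradients w.r.t. all parameters
        return ravel_pytree(grads)[0]              # flatten the pytree of gradients
    G = jax.vmap(flat_grad)(X)   # (n_samples, n_params)
    return G @ G.T               # (n_samples, n_samples) kernel matrix

# Toy usage: 4 samples with 3 features, splitting on feature 0.
params = (1.0, 0.0, 0.5, -0.5)   # (split weight w, bias b, leaf values l0, l1)
X = jax.random.normal(jax.random.PRNGKey(0), (4, 3))
print(empirical_ntk(params, X, k=0))
```

In the paper's setting the NTK is derived analytically in the infinite-ensemble limit; the empirical Gram matrix of parameter gradients above is the finite counterpart that the limiting kernel describes.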
