Efficient Statistical Tests: A Neural Tangent Kernel Approach
Sheng Jia · Ehsan Nezhadarya · Yuhuai Wu · Jimmy Ba

Thu Jul 22 09:00 PM -- 11:00 PM (PDT) @ Virtual

For machine learning models to make reliable predictions in deployment, one needs to ensure that previously unseen test samples are sufficiently similar to the training data. Commonly used shift-invariant kernels lack compositionality and fail to capture invariances in high-dimensional computer vision data. We propose a shift-invariant convolutional neural tangent kernel (SCNTK) based outlier detector and two-sample tests with maximum mean discrepancy (MMD) that run in O(n) time in the number of samples by using a random feature approximation. On MNIST and CIFAR10 under various types of dataset shift, we empirically show that statistical tests with such compositional kernels, inherited from infinitely wide neural networks, achieve higher detection accuracy than existing non-parametric methods. Our method also provides a competitive alternative to adapted kernel methods that require a training phase.
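The O(n) cost comes from the random feature approximation: instead of evaluating an n-by-n kernel matrix, each sample is mapped to a finite feature vector and the MMD reduces to the distance between the two feature means. The sketch below illustrates this mechanism with generic random Fourier features for an RBF kernel, not the SCNTK itself; the function names (`mmd_rff`, `random_fourier_features`) and the bandwidth/feature-count choices are my own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_fourier_features(x, w, b):
    # Map samples to random Fourier features approximating a
    # shift-invariant (RBF) kernel: phi(x) = sqrt(2/D) * cos(xW + b).
    return np.sqrt(2.0 / w.shape[1]) * np.cos(x @ w + b)

def mmd_rff(x, y, num_features=512, bandwidth=1.0, seed=0):
    """O(n) MMD estimate via random features:
    ||mean(phi(x)) - mean(phi(y))|| over a shared random projection."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    w = rng.normal(scale=1.0 / bandwidth, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    mu_x = random_fourier_features(x, w, b).mean(axis=0)
    mu_y = random_fourier_features(y, w, b).mean(axis=0)
    return np.linalg.norm(mu_x - mu_y)

# A shifted test distribution should yield a larger MMD than an
# identically distributed one; each call touches every sample once (O(n)).
rng = np.random.default_rng(1)
x = rng.normal(size=(1000, 10))
y_same = rng.normal(size=(1000, 10))
y_shift = rng.normal(loc=1.0, size=(1000, 10))
print(mmd_rff(x, y_same) < mmd_rff(x, y_shift))  # expect True
```

In the paper, the feature map would come from the (infinite-width) convolutional NTK rather than Fourier features, but the linear-time test statistic is computed the same way.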

Author Information

Sheng Jia (University of Toronto)
Ehsan Nezhadarya (LG Electronics)
Yuhuai Wu (Stanford University / Google)
Jimmy Ba
