Uncertainty quantification (UQ) of neural networks (NNs) is an area of ongoing development. Key methods for UQ involve a Gaussian Process (GP) with the Neural Tangent Kernel (NTK) or Conjugate Kernel (CK) as the kernel function, i.e. an NTK-GP or a CK-GP respectively. While the CK can be viewed as a last-layer approximation to the NTK (a common approximation in UQ), there has to date been no direct comparison of the NTK and CK in the context of UQ. Towards this goal, we compute the limiting expected marginal likelihood of both an NTK-GP and a CK-GP under the double-asymptotic limit of Random Matrix Theory. We find that both kernel functions display phenomena such as robustness and descent curves when over-parameterized; however, the NTK is clearly preferable in under-parameterized regimes at initialization. Interestingly, we show empirically that this advantage is merely an artifact of the NN at initialization: once the network is trained, the CK offers UQ performance equivalent to the NTK. This is significant for practical UQ, as the CK is substantially cheaper to compute than the NTK.
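To make the objects in the abstract concrete, below is a minimal sketch (not the authors' code) of the empirical NTK and CK of a small MLP in JAX, together with the GP log marginal likelihood each kernel induces on a toy dataset. The architecture, tanh nonlinearity, noise level, and data are all illustrative assumptions; the CK is taken to be the Gram matrix of last-hidden-layer features, which coincides with the last-layer-weights block of the empirical NTK.

    import jax
    import jax.numpy as jnp
    from jax.flatten_util import ravel_pytree

    def init_params(key, sizes=(4, 32, 32, 1)):
        keys = jax.random.split(key, len(sizes) - 1)
        return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
                for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

    def features(params, x):
        # Forward pass up to the last hidden layer (the CK's feature map).
        for W, b in params[:-1]:
            x = jnp.tanh(x @ W + b)
        return x

    def f(params, x):
        # Scalar-output network: linear readout on top of the hidden features.
        W, b = params[-1]
        return (features(params, x) @ W + b).squeeze(-1)

    def empirical_ntk(params, X):
        # NTK entries are inner products of per-example parameter gradients.
        def grad_flat(x):
            g = jax.grad(lambda p: f(p, x[None, :])[0])(params)
            return ravel_pytree(g)[0]
        J = jax.vmap(grad_flat)(X)        # (N, num_params) Jacobian
        return J @ J.T

    def conjugate_kernel(params, X):
        # CK: Gram matrix of last-hidden-layer features; this is exactly
        # the last-layer-weights block of the empirical NTK above.
        Phi = features(params, X)
        return Phi @ Phi.T

    def gp_log_marginal_likelihood(K, y, noise=1e-2):
        # Standard GP evidence: -0.5 y^T (K + noise I)^{-1} y
        #                       -0.5 log det(K + noise I) - 0.5 N log 2 pi.
        N = y.shape[0]
        L = jnp.linalg.cholesky(K + noise * jnp.eye(N))
        alpha = jax.scipy.linalg.cho_solve((L, True), y)
        return (-0.5 * y @ alpha - jnp.sum(jnp.log(jnp.diag(L)))
                - 0.5 * N * jnp.log(2.0 * jnp.pi))

    key = jax.random.PRNGKey(0)
    params = init_params(key)
    X = jax.random.normal(jax.random.fold_in(key, 1), (20, 4))
    y = jnp.sin(2.0 * X[:, 0])            # toy regression targets
    for name, K in (("NTK", empirical_ntk(params, X)),
                    ("CK ", conjugate_kernel(params, X))):
        print(name, float(gp_log_marginal_likelihood(K, y)))

The sketch also illustrates the cost asymmetry noted at the end of the abstract: the empirical NTK requires one gradient over all parameters per data point, whereas the CK needs only the forward-pass features, which is why the CK is so much cheaper to compute.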
Joseph Wilson completed a Bachelor of Mathematics (Honours) at The University of Queensland (UQ) in 2022. He is currently a PhD student in Mathematics at UQ, under the supervision of Fred Roosta (UQ), Liam Hodgkinson (University of Melbourne) and Chris van der Heide (University of Melbourne). Joseph's work focuses on Bayesian uncertainty quantification for neural networks, as well as predicting the generalization performance of neural networks within probabilistic frameworks.