
Impact of Noise on Calibration and Generalisation of Neural Networks
Martin Ferianc · Ondrej Bohdal · Timothy Hospedales · Miguel Rodrigues
Event URL: https://openreview.net/forum?id=QzlN0rUJVi

Noise injection and data augmentation strategies have been effective for enhancing the generalisation and robustness of neural networks (NNs). Certain types of noise, such as label smoothing and MixUp, have also been shown to improve calibration. Since noise can be added at various stages of an NN's training, this motivates the question of when and where noise is most effective. We study a variety of noise types to determine how much they improve calibration and generalisation, and under what conditions. More specifically, we evaluate various noise-injection strategies in both in-distribution (ID) and out-of-distribution (OOD) scenarios. The findings highlight that activation noise was the most transferable and effective in improving generalisation, while input augmentation noise was prominent in improving calibration on OOD but not necessarily ID data.
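To make the noise categories in the abstract concrete, here is a minimal NumPy sketch of three of them: activation noise, label smoothing, and MixUp. The function names, defaults, and noise distributions are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def add_activation_noise(activations, sigma=0.1, rng=None):
    """Inject zero-mean Gaussian noise into hidden activations (training only).
    sigma is an assumed noise scale, not a value from the paper."""
    rng = np.random.default_rng(0) if rng is None else rng
    return activations + rng.normal(0.0, sigma, size=activations.shape)

def smooth_labels(one_hot, eps=0.1):
    """Label smoothing: blend one-hot targets with the uniform distribution
    over the k classes."""
    k = one_hot.shape[-1]
    return one_hot * (1.0 - eps) + eps / k

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """MixUp: take a convex combination of two inputs and their labels,
    with the mixing weight drawn from a Beta(alpha, alpha) distribution."""
    rng = np.random.default_rng(0) if rng is None else rng
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam * y1 + (1.0 - lam) * y2
```

Each helper applies noise at a different stage: `add_activation_noise` inside the network, `smooth_labels` and `mixup` at the data/label level, mirroring the "when and where" axis the abstract studies.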

Author Information

Martin Ferianc (University College London, University of London)
Ondrej Bohdal (University of Edinburgh)
Timothy Hospedales (Samsung AI Centre / University of Edinburgh)
Miguel Rodrigues (University College London)
