Poster
Topology-aware Generalization of Decentralized SGD
Tongtian Zhu · Fengxiang He · Lan Zhang · Zhengyang Niu · Mingli Song · Dacheng Tao
Hall E #1221
Keywords: [ DL: Theory ] [ T: Deep Learning ]
Abstract:
This paper studies the algorithmic stability and generalizability of decentralized stochastic gradient descent (D-SGD). We prove that the consensus model learned by D-SGD is O(m/N + 1/m + λ^2)-stable in expectation in the non-convex non-smooth setting, where N is the total sample size of the whole system, m is the number of workers, and 1 − λ is the spectral gap that measures the connectivity of the communication topology. These results then deliver an O(1/N + ((m^{-1}λ^2)^{α/2} + m^{-α})/N^{1-α/2}) in-average generalization bound, which is non-vacuous even when λ is close to 1, in contrast to the vacuous bounds suggested by existing literature on the projected version of D-SGD. Our theory indicates that the generalizability of D-SGD is positively correlated with the spectral gap, and can explain why consensus control in the initial training phase can ensure better generalization. Experiments with VGG-11 and ResNet-18 on CIFAR-10, CIFAR-100 and Tiny-ImageNet justify our theory. To the best of our knowledge, this is the first work on the topology-aware generalization of vanilla D-SGD. Code is available at \url{https://github.com/Raiden-Zhu/Generalization-of-DSGD}.
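The quantities in the bound above can be made concrete with a minimal sketch of D-SGD and the spectral gap of its communication topology. This is an illustrative example, not the paper's code: the helper names (`ring_mixing_matrix`, `spectral_gap`, `dsgd_step`) are hypothetical, and the ring topology is just one possible choice of communication graph.

```python
import numpy as np

def ring_mixing_matrix(m):
    """Doubly stochastic mixing matrix W for a ring of m workers:
    each worker averages equally with itself and its two neighbours."""
    W = np.zeros((m, m))
    for i in range(m):
        W[i, i] = 1.0 / 3
        W[i, (i - 1) % m] = 1.0 / 3
        W[i, (i + 1) % m] = 1.0 / 3
    return W

def spectral_gap(W):
    """Return 1 - lambda, where lambda is the second-largest eigenvalue
    magnitude of W. A larger gap means a better-connected topology;
    lambda close to 1 (gap close to 0) means poor connectivity."""
    mags = np.sort(np.abs(np.linalg.eigvals(W)))[::-1]
    return 1.0 - mags[1]

def dsgd_step(params, grads, W, lr):
    """One D-SGD iteration (adapt-then-combine form): each of the m
    workers takes a local gradient step, then gossip-averages with its
    neighbours via W. params, grads: (m, d) arrays of per-worker state."""
    return W @ (params - lr * grads)
```

For the complete graph (W with all entries 1/m) the gap is exactly 1, while for a ring it shrinks toward 0 as m grows, matching the intuition that sparser topologies slow consensus.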