Poster
in
Workshop: Neural Compression: From Information Theory to Applications

Practical Random Tree Generation using Spanning Trees: Entropy and Compression

Amirmohammad Farzaneh


Abstract:

Tree structures appear in many learning-related problems, most notably in Graph Neural Networks. The appearance of these data structures can be modeled and simulated using random tree generators. However, random models that capture the dynamics of networks have received very little study. We introduce the random spanning tree model, a random tree generator that produces a tree from an existing network topology. We then analyse the Shannon entropy of this model and derive upper bounds on it. Because large trees are complex, compression can be beneficial, so we introduce a universal approach to compressing trees generated by the spanning tree model. We show that the proposed compression method introduces a redundancy that tends to zero as trees grow larger.
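The abstract does not specify the paper's exact sampling procedure, but as one concrete illustration of generating a random tree from an existing topology, a spanning tree can be sampled uniformly at random with the Aldous–Broder random walk (a sketch, not the authors' method; the graph and function names are assumptions for this example):

```python
import random

def random_spanning_tree(adj, seed=None):
    """Sample a spanning tree of a connected graph uniformly at random
    using the Aldous-Broder walk: run a random walk and keep the edge
    by which each vertex is first visited."""
    rng = random.Random(seed)
    nodes = list(adj)
    current = rng.choice(nodes)
    visited = {current}
    tree_edges = set()
    while len(visited) < len(nodes):
        nxt = rng.choice(adj[current])  # random neighbour
        if nxt not in visited:
            # first visit: the traversed edge joins the spanning tree
            visited.add(nxt)
            tree_edges.add(frozenset((current, nxt)))
        current = nxt
    return tree_edges

# Hypothetical example topology: a 4-cycle with one chord.
# Any spanning tree of it has |V| - 1 = 3 edges.
graph = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
tree = random_spanning_tree(graph, seed=1)
```

Under the uniform distribution, the Shannon entropy of the sampled tree equals the logarithm of the number of spanning trees of the topology, which is one natural baseline when bounding the entropy of such a model.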

Virtual talk: https://drive.google.com/file/d/1ZnivpgeQss0ftLPjb2TNtlbAZQH6EmbN/view?usp=drive_link
