

Poster

Meta-Learning Neural Bloom Filters

Jack Rae · Sergey Bartunov · Timothy Lillicrap

Pacific Ballroom #43

Keywords: [ Systems and Software ] [ Deep Sequence Models ] [ Algorithms ]


Abstract:

There has been a recent trend in training neural networks to replace data structures that have been crafted by hand, aiming for faster execution, better accuracy, or greater compression. In this setting, a neural data structure is instantiated by training a network over many epochs of its inputs until convergence. In applications where inputs arrive at high throughput, or are ephemeral, training a network from scratch is not practical. This motivates the need for few-shot neural data structures. In this paper we explore the learning of approximate set membership over a set of data in one shot via meta-learning. We propose a novel memory architecture, the Neural Bloom Filter, which is able to achieve significant compression gains over classical Bloom Filters and existing memory-augmented neural networks.
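For readers unfamiliar with the classical baseline the paper compresses against, the following is a minimal sketch of a standard Bloom filter for approximate set membership. It is not the paper's Neural Bloom Filter; the class name, parameters, and SHA-256-based index derivation are illustrative choices, not anything specified by the authors.

```python
import hashlib

class BloomFilter:
    """Classical Bloom filter: an m-bit array with k hash functions.
    Queries have one-sided error: false positives are possible,
    false negatives are not."""

    def __init__(self, m: int, k: int):
        self.m = m                 # number of bits
        self.k = k                 # number of hash functions
        self.bits = [False] * m

    def _indices(self, item: str):
        # Derive k bit indices by salting one cryptographic hash;
        # double hashing is a common alternative.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item: str) -> None:
        for idx in self._indices(item):
            self.bits[idx] = True

    def __contains__(self, item: str) -> bool:
        # True may be a false positive; False is always correct.
        return all(self.bits[idx] for idx in self._indices(item))

# Usage: store a small set, then query.
bf = BloomFilter(m=1024, k=4)
for word in ["apple", "banana", "cherry"]:
    bf.add(word)
assert "apple" in bf          # stored items are always found
print("durian" in bf)         # almost certainly False at this load factor
```

Note that the classical structure above needs no training: items are inserted directly. The paper's contribution is a learned, meta-trained analogue that can exploit structure in the data distribution to use fewer bits than this hand-crafted baseline.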
