Poster in Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

Learning Set Functions with Implicit Differentiation

Gözde Özcan · Chengzhi Shi · Stratis Ioannidis

Keywords: [ Deep Learning ] [ Probabilistic Methods ] [ learning set functions ] [ implicit differentiation ] [ neural set functions ]


Abstract:

Ou et al. [1] introduce the problem of learning set functions from data generated by a so-called optimal subset oracle. Their approach approximates the underlying utility function with an energy-based model, under which inference reduces to a sequence of fixed-point update steps during mean-field variational inference. However, as the number of iterations grows, automatic differentiation through the unrolled updates quickly becomes computationally prohibitive due to the Jacobians stacked during backpropagation. We address this challenge with implicit differentiation and examine the convergence conditions for the fixed-point iterations. We empirically demonstrate the efficiency of our method on synthetic and real-world subset selection applications, including product recommendation and set anomaly detection tasks.
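To illustrate the core idea, the JAX sketch below differentiates through a generic fixed-point solver via the implicit function theorem instead of unrolling the iterations. This is a minimal sketch, not the authors' implementation: the iteration count NUM_ITERS, the update map `update`, and the toy tanh model are illustrative assumptions standing in for the paper's mean-field updates.

```python
import jax
import jax.numpy as jnp
from functools import partial

NUM_ITERS = 100  # illustrative solver budget for forward and backward passes

@partial(jax.custom_vjp, nondiff_argnums=(0,))
def fixed_point(f, params, z_init):
    # Forward pass: iterate z <- f(params, z) toward z* = f(params, z*).
    z = z_init
    for _ in range(NUM_ITERS):
        z = f(params, z)
    return z

def fixed_point_fwd(f, params, z_init):
    z_star = fixed_point(f, params, z_init)
    return z_star, (params, z_star)

def fixed_point_bwd(f, res, v):
    params, z_star = res
    # Linearize the update map at the fixed point only.
    _, vjp_z = jax.vjp(lambda z: f(params, z), z_star)
    # Implicit function theorem: solve w = v + (df/dz)^T w iteratively,
    # so no per-iteration Jacobians from the forward pass are stored.
    w = v
    for _ in range(NUM_ITERS):
        w = v + vjp_z(w)[0]
    _, vjp_params = jax.vjp(lambda p: f(p, z_star), params)
    # z_init gets a zero cotangent: an attractive fixed point does not
    # depend on the initialization.
    return vjp_params(w)[0], jnp.zeros_like(z_star)

fixed_point.defvjp(fixed_point_fwd, fixed_point_bwd)

# Toy usage: a contractive update z <- tanh(W z + b).
def update(params, z):
    W, b = params
    return jnp.tanh(W @ z + b)

key = jax.random.PRNGKey(0)
params = (0.1 * jax.random.normal(key, (4, 4)), jnp.zeros(4))
loss = lambda p: jnp.sum(fixed_point(update, p, jnp.zeros(4)) ** 2)
grads = jax.grad(loss)(params)  # memory cost independent of NUM_ITERS
```

Because the backward pass solves a single linear system at the fixed point, its memory footprint does not grow with the number of forward iterations, which is the efficiency gain the abstract describes.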
