
Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

From Perception to Programs: Regularize, Overparameterize, and Amortize

Hao Tang · Kevin Ellis


We develop techniques for synthesizing neurosymbolic programs, which mix discrete symbolic processing with continuous neural computation. We relax this mixed discrete/continuous problem so that all modules can be learned jointly with gradient descent, and we further incorporate amortized inference, overparameterization, and a differentiable penalty on program length. Collectively, this toolbox improves the stability of gradient-guided program search and suggests ways of learning both how to parse continuous input into discrete abstractions and how to process those abstractions via symbolic code.
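To make the core idea concrete, below is a minimal sketch (not the paper's actual method) of how a discrete program choice can be relaxed for gradient descent, and how program length can be penalized differentiably. The operation library `OPS` and the per-step halting-logit scheme are hypothetical illustrations chosen for this sketch; the relaxation shown is a plain temperature-controlled softmax.

```python
import math

def softmax(logits, temperature=1.0):
    # Relax a discrete choice: as temperature -> 0 this approaches
    # a one-hot argmax, recovering the discrete program.
    exps = [math.exp(l / temperature) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

# Hypothetical library of symbolic operations a program step could apply.
OPS = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

def soft_step(x, logits, temperature):
    # Instead of selecting one op (non-differentiable), take the
    # expected output under the relaxed distribution over ops.
    probs = softmax(logits, temperature)
    return sum(p * op(x) for p, op in zip(probs, OPS))

def expected_length(halt_logits):
    # A differentiable length penalty: with a halting logit per step,
    # the expected program length is the sum over steps of the
    # probability that the program is still running.
    running = 1.0
    length = 0.0
    for h in halt_logits:
        p_halt = 1.0 / (1.0 + math.exp(-h))  # sigmoid halting probability
        length += running                     # count this step if still running
        running *= (1.0 - p_halt)             # probability of surviving to next step
    return length
```

With a strongly peaked logit vector and a low temperature, `soft_step` closely matches the discrete op it favors (e.g. `soft_step(5.0, [10.0, 0.0, 0.0], 0.1)` is approximately `6.0`, the output of `x + 1`), while remaining differentiable in the logits; `expected_length` can then be added to the loss to penalize lengthy programs.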
