

Poster

Conditional Temporal Neural Processes with Covariance Loss

Boseon Yoo · Jiwoo Lee · Janghoon Ju · Seijun Chung · Soyeon Kim · Jaesik Choi

Virtual

Keywords: [ Optimization for Deep Networks ]


Abstract:

We introduce a novel loss function, Covariance Loss, which is conceptually equivalent to conditional neural processes and takes the form of a regularization term, making it applicable to many kinds of neural networks. With the proposed loss, the mappings from input variables to target variables are shaped not only by the mean activations and the mean dependencies between input and target variables, but also by the dependencies among the target variables themselves. This property makes the resulting neural networks more robust to noisy observations and able to recapture missing dependencies from prior information. To demonstrate the validity of the proposed loss, we conduct extensive experiments on real-world datasets with state-of-the-art models and discuss the benefits and drawbacks of the proposed Covariance Loss.
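The abstract does not give the exact formulation, but one plausible reading of a covariance-based regularizer is a standard prediction loss plus a penalty that matches the batch covariance of the model's outputs to that of the targets. The sketch below is a hypothetical PyTorch illustration under that assumption; the function name `covariance_loss` and the weight `lam` are illustrative, not the paper's definitions.

```python
import torch

def covariance_loss(pred: torch.Tensor, target: torch.Tensor, lam: float = 0.1) -> torch.Tensor:
    """Hypothetical sketch: MSE plus a penalty that aligns the batch
    covariance of predictions with that of the targets. `lam` is an
    assumed trade-off weight, not a value from the paper.

    pred, target: tensors of shape (batch, dim).
    """
    mse = torch.mean((pred - target) ** 2)
    # Center each variable over the batch dimension.
    p = pred - pred.mean(dim=0, keepdim=True)
    t = target - target.mean(dim=0, keepdim=True)
    n = pred.shape[0]
    cov_p = p.T @ p / (n - 1)  # empirical covariance of predictions
    cov_t = t.T @ t / (n - 1)  # empirical covariance of targets
    # Penalize discrepancy between the two covariance structures, so the
    # network is encouraged to reproduce dependencies among target variables.
    cov_penalty = torch.mean((cov_p - cov_t) ** 2)
    return mse + lam * cov_penalty
```

Used as a drop-in replacement for a plain MSE objective, e.g. `loss = covariance_loss(model(x), y)`, the extra term acts purely as a regularizer, which is consistent with the abstract's claim that the loss is applicable to many kinds of neural networks.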
