Real data tensors are usually high-dimensional, but their intrinsic information is often preserved in a low-dimensional space, which motivates the use of tensor decompositions such as the Tucker decomposition. Moreover, real data tensors are often not only low-dimensional but also smooth, meaning that adjacent elements are similar or change continuously; this is typical of spatial and temporal data. To incorporate the smoothness property, we propose the smoothed Tucker decomposition (STD). STD leverages smoothness by representing each factor as the sum of a few basis functions, which reduces the number of parameters. The objective function is formulated as a convex problem, and an algorithm based on the alternating direction method of multipliers (ADMM) is derived to solve it. We show theoretically that, under the smoothness assumption, STD achieves a better error bound. The theoretical results and the performance of STD are verified numerically.
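To make the core idea concrete, here is a minimal numpy sketch of the smoothness mechanism the abstract describes: each mode of a Tucker tensor is assumed to lie in the span of a few smooth basis functions (cosine functions here, chosen for illustration), so projecting a noisy observation onto that small basis suppresses noise. This is only an illustration of the parameter-reduction idea, not the paper's convex formulation or its ADMM solver; all function names and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_basis(n, k):
    """k smooth cosine basis functions sampled at n grid points, orthonormalized."""
    t = (np.arange(n) + 0.5) / n
    B = np.cos(np.pi * np.outer(t, np.arange(k)))  # shape (n, k)
    Q, _ = np.linalg.qr(B)                         # orthonormal columns
    return Q

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode (mode-n product)."""
    T = np.moveaxis(T, mode, 0)
    shape = T.shape
    T = M @ T.reshape(shape[0], -1)
    return np.moveaxis(T.reshape((M.shape[0],) + shape[1:]), 0, mode)

n, k, r = 30, 4, 2  # grid size, number of basis functions, Tucker rank
bases = [cosine_basis(n, k) for _ in range(3)]

# Ground truth: a Tucker tensor whose factor matrices are smooth,
# i.e. each factor is a combination of the few basis functions.
core = rng.normal(size=(r, r, r))
factors = [B @ rng.normal(size=(k, r)) for B in bases]
X = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
Y = X + 0.1 * rng.normal(size=X.shape)  # noisy observation

# Smoothing step: orthogonally project each mode onto its basis (B @ B.T).
# The smooth signal is unchanged; most of the noise lies outside the span.
X_hat = Y
for mode, B in enumerate(bases):
    X_hat = mode_multiply(X_hat, B @ B.T, mode)

err_noisy = np.linalg.norm(Y - X) / np.linalg.norm(X)
err_smooth = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
print(err_smooth, err_noisy)
```

With only k = 4 basis functions per mode versus n = 30 grid points, the projection keeps a (k/n)^3 fraction of the noise energy in expectation, which is the intuition behind the improved error bound claimed for STD.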
Masaaki Imaizumi (Institute of Statistical Mathematics)
Kohei Hayashi (AIST / RIKEN)
Received D.Eng. from Nara Institute of Science and Technology, Japan in 2012. Interested in approximate Bayesian inference, tensor and matrix decomposition, and web data mining.
Related Events (a corresponding poster, oral, or spotlight)
2017 Talk: Tensor Decomposition with Smoothness
Mon Aug 7th 06:42 -- 07:00 AM Room C4.4