In the context of supervised tensor learning, preserving the structural information and exploiting the discriminative nonlinear relationships of tensor data are crucial for improving the performance of learning tasks. Based on tensor factorization theory and kernel methods, we propose a novel Kernelized Support Tensor Machine (KSTM) which integrates kernelized tensor factorization with the maximum-margin criterion. Specifically, the kernelized factorization technique is introduced to approximate the tensor data in kernel space such that the complex nonlinear relationships within tensor data can be explored. Further, dual structural preserving kernels are devised to learn the nonlinear boundary between tensor data. As a result of joint optimization, the kernels obtained in KSTM exhibit better generalization power for discriminative analysis. Experimental results on real-world neuroimaging datasets show the superiority of KSTM over state-of-the-art techniques.
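The abstract only sketches the form of these kernels. As a rough illustration (not the paper's joint-optimization procedure), the Python snippet below shows one common way a structure-preserving kernel between two tensors can be computed from their CP (rank-one) factor matrices, comparing components mode by mode with an RBF kernel. The rbf helper, the gamma parameter, the assumption of equal CP rank, and the random toy factors are illustrative assumptions; in practice the factor matrices would come from a CP decomposition of each input tensor.

import numpy as np

def rbf(u, v, gamma=1.0):
    # Gaussian (RBF) kernel between two vectors.
    return np.exp(-gamma * np.sum((u - v) ** 2))

def structural_kernel(factors_x, factors_y, gamma=1.0):
    # Illustrative structure-preserving kernel between two tensors,
    # each represented by a list of CP factor matrices (one (dim_m x rank)
    # matrix per mode). Every pair of rank-one components is compared
    # mode by mode in RBF feature space and the results are summed.
    # Assumes both tensors share the same CP rank (sketch only).
    rank = factors_x[0].shape[1]
    k = 0.0
    for i in range(rank):
        for j in range(rank):
            prod = 1.0
            for Xm, Ym in zip(factors_x, factors_y):
                prod *= rbf(Xm[:, i], Ym[:, j], gamma)
            k += prod
    return k

# Toy example: two order-3 tensors of shape (10, 10, 5) with rank-2 factors.
rng = np.random.default_rng(0)
fx = [rng.standard_normal((d, 2)) for d in (10, 10, 5)]
fy = [rng.standard_normal((d, 2)) for d in (10, 10, 5)]
print(structural_kernel(fx, fy, gamma=0.5))

The resulting kernel values can be assembled into a Gram matrix and passed to any kernel classifier (e.g. an SVM with a precomputed kernel); KSTM itself additionally learns the factorization jointly with the classifier, which this sketch does not attempt.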
Author Information
Lifang He (University of Illinois at Chicago/Shenzhen University)
Chun-Ta Lu (University of Illinois at Chicago)
Guixiang Ma
Shen Wang (University of Illinois at Chicago)
Linlin Shen
Philip Yu (UIC)
Ann Ragin (Northwestern University)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Kernelized Support Tensor Machines »
  Mon. Aug 7th 01:42 -- 02:00 AM Room C4.6 & C4.7
More from the Same Authors
- 2022: Deoscillated Adaptive Graph Collaborative Filtering »
  Zhiwei Liu · Lin Meng · Fei Jiang · Jiawei Zhang · Philip Yu
- 2018 Poster: PredRNN++: Towards A Resolution of the Deep-in-Time Dilemma in Spatiotemporal Predictive Learning »
  Yunbo Wang · Zhifeng Gao · Mingsheng Long · Jianmin Wang · Philip Yu
- 2018 Oral: PredRNN++: Towards A Resolution of the Deep-in-Time Dilemma in Spatiotemporal Predictive Learning »
  Yunbo Wang · Zhifeng Gao · Mingsheng Long · Jianmin Wang · Philip Yu