
RaFM: Rank-Aware Factorization Machines
Xiaoshuang Chen · Yin Zheng · Jiaxing Wang · Wenye Ma · Junzhou Huang

Thu Jun 13 09:30 AM -- 09:35 AM (PDT) @ Room 201

Factorization machines (FMs) are a popular model class that learns pairwise feature interactions through a low-rank approximation. Unlike existing FM-based approaches, which use a fixed rank for all features, this paper proposes a Rank-Aware FM (RaFM) model that adopts pairwise interactions from FMs of different ranks. On one hand, the proposed model achieves better performance on real-world datasets, where different features usually have significantly varying frequencies of occurrence. On the other hand, we prove that the RaFM model can be stored, evaluated, and trained as efficiently as a single FM, and under some reasonable conditions it can be even significantly more efficient than an FM. RaFM improves the performance of FMs on both regression and classification tasks while incurring less computational burden, and therefore also has attractive potential for industrial applications.
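To make the idea concrete, here is a minimal sketch of the FM pairwise interaction term and a naive rank-aware variant. This is an illustration of the general idea only, not the authors' exact RaFM formulation; the per-feature rank assignment and the `min(r_i, r_j)` truncation rule below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features = 6

# Standard FM: every feature i gets an embedding v_i of the same rank k,
# and the model scores sum_{i<j} <v_i, v_j> x_i x_j.
k = 4
V = rng.normal(size=(n_features, k))

def fm_pairwise(x, V):
    """Sum of <v_i, v_j> x_i x_j over all pairs i < j, via the O(nk) trick:
    0.5 * (||sum_i x_i v_i||^2 - sum_i x_i^2 ||v_i||^2)."""
    s = V.T @ x                   # shape (k,): sum_i x_i v_i
    s2 = (V ** 2).T @ (x ** 2)    # shape (k,): sum_i x_i^2 v_i^2 (per dim)
    return 0.5 * float(np.sum(s ** 2 - s2))

# Rank-aware variant (illustrative): frequent features get a larger rank,
# rare features a smaller one, and a pair interacts through the first
# min(r_i, r_j) embedding dimensions. Hypothetical rank assignment:
ranks = np.array([4, 4, 2, 2, 2, 1])

def rank_aware_pairwise(x, V, ranks):
    """Brute-force pairwise sum with per-feature ranks (for clarity,
    not efficiency)."""
    total = 0.0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            r = min(ranks[i], ranks[j])
            total += float(V[i, :r] @ V[j, :r]) * x[i] * x[j]
    return total

x = rng.normal(size=n_features)
print(fm_pairwise(x, V))
print(rank_aware_pairwise(x, V, ranks))
```

When all features share the same rank k, the rank-aware sum reduces to the standard FM term; the paper's contribution is showing that the mixed-rank case can still be stored and trained with single-FM efficiency.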

Author Information

Xiaoshuang Chen (Tsinghua University)
Yin Zheng (Tencent AI Lab)

Yin Zheng received his Ph.D. degree from Tsinghua University under the supervision of Prof. Yu-Jin Zhang (Tsinghua University) and Prof. Hugo Larochelle (Google Brain). His research interests include deep learning, machine learning, computer vision, artificial intelligence, and recommender systems. After graduating in 2015, he worked as a researcher on the recommendation team of Hulu LLC. He then joined Tencent AI Lab as a researcher in the Machine Learning Center.

Jiaxing Wang (Institute of Automation, Chinese Academy of Sciences)
Wenye Ma (Tencent)
Junzhou Huang (University of Texas at Arlington / Tencent AI Lab)
