
Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation

Hui Wang, Kun Zhou, Wayne Xin Zhao, Jingyuan Wang*, Ji-Rong Wen

ACM Transactions on Information Systems, 2022.

To characterize complex and heterogeneous side information in recommender systems, the heterogeneous information network (HIN) has shown superior performance and attracted much research attention. In an HIN, the rich entities, relations, and paths can be utilized to model the correlations between users and items; this task setting is often called HIN-based recommendation. Although HIN provides a general approach to modeling rich side information, it lacks special consideration of the goal of the recommendation task. The context aggregated from the heterogeneous graph is likely to incorporate irrelevant information, and the learned representations are not specifically optimized for the recommendation task. Therefore, there is a need to rethink how to leverage useful information from HIN to accomplish the recommendation task.

  To address the above issues, we propose a Curriculum pre-training based HEterogeneous Subgraph Transformer (called CHEST) with a new data characterization, representation model, and learning algorithm. Specifically, we extract useful information from the HIN to compose an interaction-specific heterogeneous subgraph, containing highly relevant context information for recommendation. Then, we capture the rich semantics (e.g., graph structure and path semantics) within the subgraph via a heterogeneous subgraph Transformer, which encodes the subgraph into multi-slot sequence representations. Besides, we design a curriculum pre-training strategy to provide an elementary-to-advanced learning process: the elementary course focuses on capturing local context information within the subgraph, while the advanced course aims to learn global context information. In this way, we gradually capture useful semantic information from the HIN for modeling user-item interactions. Extensive experiments on four real-world datasets demonstrate the superiority of our proposed method over a number of competitive baselines, especially when only limited training data is available.
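The elementary-to-advanced learning process described in the abstract can be pictured as a staged training schedule that runs easier (local-context) pre-training courses before harder (global-context) ones. The sketch below is only an illustration of such a curriculum schedule; the stage names and epoch counts are assumptions, not the paper's actual pre-training objectives.

```python
# Illustrative curriculum schedule: elementary (local) courses run before
# advanced (global) courses. Stage names are hypothetical placeholders.

def curriculum_schedule(stages, epochs_per_stage):
    """Yield (stage, epoch) pairs in elementary-to-advanced order."""
    for stage in stages:
        for epoch in range(epochs_per_stage):
            yield stage, epoch

# Hypothetical courses, ordered from local to global context.
stages = ["node-level", "path-level", "subgraph-level"]
plan = list(curriculum_schedule(stages, epochs_per_stage=2))
# Each stage's pre-training loss would be optimized before moving on
# to the next, gradually shifting from local to global semantics.
```

In an actual implementation, each stage would attach its own pre-training loss to the subgraph Transformer, with the model weights carried over from one course to the next.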




@article{wang2022chest,
  author  = {Wang, Hui and Zhou, Kun and Zhao, Wayne Xin and Wang, Jingyuan and Wen, Ji-Rong},
  title   = {Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation},
  year    = {2022},
  journal = {ACM Trans. Inf. Syst.}
}