Collaborative distillation for top-N recommendation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

60 Scopus citations

Abstract

Knowledge distillation (KD) is a well-known method to reduce inference latency by compressing a cumbersome teacher model into a small student model. Despite the success of KD in classification tasks, applying KD to recommender models is challenging due to the sparsity of positive feedback, the ambiguity of missing feedback, and the ranking problem associated with top-N recommendation. To address these issues, we propose a new KD model for the collaborative filtering approach, namely collaborative distillation (CD). Specifically, (1) we reformulate a loss function to deal with the ambiguity of missing feedback. (2) We exploit probabilistic rank-aware sampling for top-N recommendation. (3) To train the proposed model effectively, we develop two training strategies for the student model, called the teacher-guided and student-guided training methods, which select the most useful feedback from the teacher model. Experimental results demonstrate that the proposed model outperforms the state-of-the-art method by 5.5-29.7% and 4.8-27.8% in hit rate (HR) and normalized discounted cumulative gain (NDCG), respectively. Moreover, the proposed model achieves performance comparable to that of the teacher model.
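The abstract's core idea — distilling a teacher recommender's rankings into a student by sampling unobserved items with probability decreasing in the teacher's rank, then matching the teacher's soft scores on those items — can be sketched as follows. This is an illustrative reconstruction under assumptions (the exponential rank-decay distribution, the sigmoid soft labels, and the binary cross-entropy matching loss are choices made here for illustration, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_aware_sample(teacher_scores, n_samples, temperature=10.0):
    """Sample items with probability decaying in teacher rank, so the
    student focuses on items the teacher considers most relevant.
    The exponential decay over ranks is an illustrative assumption."""
    ranks = np.argsort(-teacher_scores).argsort()  # 0 = teacher's top item
    probs = np.exp(-ranks / temperature)
    probs /= probs.sum()
    return rng.choice(len(teacher_scores), size=n_samples,
                      replace=False, p=probs)

def distillation_loss(student_scores, teacher_scores, sampled):
    """Binary cross-entropy of the student's scores against the teacher's
    soft (sigmoid) scores on the sampled items -- one simple way to learn
    from the teacher's knowledge on unobserved feedback."""
    s = 1.0 / (1.0 + np.exp(-student_scores[sampled]))
    t = 1.0 / (1.0 + np.exp(-teacher_scores[sampled]))
    eps = 1e-12
    return float(-np.mean(t * np.log(s + eps)
                          + (1.0 - t) * np.log(1.0 - s + eps)))

# Toy example: scores of teacher and student over 100 candidate items.
teacher = rng.normal(size=100)
student = rng.normal(size=100)
sampled = rank_aware_sample(teacher, n_samples=10)
loss = distillation_loss(student, teacher, sampled)
```

In a full training loop, this distillation term would be added to the student's usual collaborative-filtering loss on observed positive feedback, which is the overall structure the abstract describes.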

Original language: English
Title of host publication: Proceedings - 19th IEEE International Conference on Data Mining, ICDM 2019
Editors: Jianyong Wang, Kyuseok Shim, Xindong Wu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 369-378
Number of pages: 10
ISBN (Electronic): 9781728146034
DOIs
State: Published - Nov 2019
Event: 19th IEEE International Conference on Data Mining, ICDM 2019 - Beijing, China
Duration: 8 Nov 2019 – 11 Nov 2019

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
Volume: 2019-November
ISSN (Print): 1550-4786

Conference

Conference: 19th IEEE International Conference on Data Mining, ICDM 2019
Country/Territory: China
City: Beijing
Period: 8/11/19 – 11/11/19

Keywords

  • Collaborative filtering
  • Knowledge distillation
  • Recommender systems
