Leveraging Title-Abstract Attentive Semantics for Paper Recommendation
Abstract
Paper recommendation aims to provide users with personalized papers of interest. However, most existing approaches treat the title and abstract equally as input when learning the representation of a paper, ignoring their semantic relationship. In this paper, we regard the abstract as a sequence of sentences and propose a two-level attentive neural network to capture: (1) how well each word within a sentence semantically aligns with the words in the title, and (2) the relevance of each sentence in the abstract to the title, which is often a good summary of the abstract. Specifically, we propose a Long Short-Term Memory (LSTM) network with attention to learn sentence representations, and integrate a Gated Recurrent Unit (GRU) network with a memory network to learn the long-term sequential sentence patterns of interacted papers for both user and item (paper) modeling. We conduct extensive experiments on two real-world datasets and show that our approach outperforms other state-of-the-art approaches in terms of accuracy.
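The following is a minimal PyTorch sketch of the two-level title-abstract attention described above. All layer sizes, the mean-pooled title summary, the concatenation-based word scoring, and the dot-product sentence scoring are illustrative assumptions, not the authors' implementation; the GRU and memory network used for user/item modeling are omitted.

# Sketch (assumed design): word-level attention guided by the title,
# then sentence-level attention over sentence vectors, also guided by the title.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TitleGuidedSentenceEncoder(nn.Module):
    """Word level: encode one abstract sentence with an LSTM, then weight
    each word by how close it is to a (mean-pooled) title summary."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Assumed scoring function: a linear layer over [word state; title summary].
        self.score = nn.Linear(hidden_dim + embed_dim, 1)

    def forward(self, sentence_ids, title_ids):
        # sentence_ids: (batch, sent_len); title_ids: (batch, title_len)
        words, _ = self.lstm(self.embed(sentence_ids))        # (B, L, H)
        title = self.embed(title_ids).mean(dim=1)             # (B, E)
        title_exp = title.unsqueeze(1).expand(-1, words.size(1), -1)
        # Attention weight of each word against the title summary.
        alpha = F.softmax(
            self.score(torch.cat([words, title_exp], dim=-1)).squeeze(-1), dim=1
        )
        return (alpha.unsqueeze(-1) * words).sum(dim=1)       # (B, H)


class TitleGuidedPaperEncoder(nn.Module):
    """Sentence level: weight each sentence vector by its relevance to the
    title, then pool into a single paper representation."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.sent_enc = TitleGuidedSentenceEncoder(vocab_size, embed_dim, hidden_dim)
        self.title_proj = nn.Linear(embed_dim, hidden_dim)
        self.embed = self.sent_enc.embed  # share word embeddings across levels

    def forward(self, abstract_ids, title_ids):
        # abstract_ids: (batch, n_sents, sent_len)
        n_sents = abstract_ids.size(1)
        sents = torch.stack(
            [self.sent_enc(abstract_ids[:, i], title_ids) for i in range(n_sents)],
            dim=1,
        )                                                     # (B, S, H)
        title = self.title_proj(self.embed(title_ids).mean(dim=1))   # (B, H)
        # Sentence-level attention: dot-product relevance to the title.
        beta = F.softmax((sents * title.unsqueeze(1)).sum(dim=-1), dim=1)  # (B, S)
        return (beta.unsqueeze(-1) * sents).sum(dim=1)        # (B, H)


# Example: 2 papers, abstracts of 3 sentences x 10 words, titles of 6 words.
model = TitleGuidedPaperEncoder(vocab_size=5000)
abstract = torch.randint(0, 5000, (2, 3, 10))
title = torch.randint(0, 5000, (2, 6))
paper_vec = model(abstract, title)  # (2, 128) paper representation

In the paper, such paper representations would feed the GRU-plus-memory-network component that models each user's sequence of interacted papers; that stage is not sketched here.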
BibTeX
@conference{Guo-2020-126832,
  author    = {Guibing Guo and Bowei Chen and Xiaoyan Zhang and Zhirong Liu and Zhenhua Dong and Xiuqiang He},
  title     = {Leveraging Title-Abstract Attentive Semantics for Paper Recommendation},
  booktitle = {Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI '20)},
  year      = {2020},
  month     = {April},
  pages     = {67--74},
}