Attention-Based and Positional-Aware Neural Networks for Next-Item Recommendation

Document Type : Original Article

Authors

1 Department of Artificial Intelligence Engineering, University of Isfahan, Isfahan, Iran

2 Department of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran

Abstract

Next-item recommendation aims to predict a user's interest in the next item given their sequence of historical behaviors. Much prior work on next-item recommendation is based on Markov Chains (MCs) and Recurrent Neural Networks (RNNs): MCs tend to perform better on sparse datasets, while RNNs perform better on denser ones. Several recommendation systems have been built on MC and RNN architectures to improve these approaches. However, such methods struggle to capture long-range dependencies and to uncover complex relationships in next-item recommendation. To address these limitations, we apply self-attention with neural networks. The proposed method captures long-range dependencies through self-attention and uncovers complex relationships to learn an effective feature representation using neural networks designed for sequence modeling, such as long short-term memory (LSTM), bidirectional LSTM (Bi-LSTM), and convolutional neural networks (CNNs). To choose the best model, we evaluate several neural network variants and select the one that performs best among the self-attention-based networks. At each time step, the method identifies which items in a user's action history are 'relevant' and uses them to predict the next item. The proposed method achieves high accuracy and significantly outperforms state-of-the-art next-item recommendation methods on sequential datasets.
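To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of self-attention with positional embeddings over a user's item sequence for next-item scoring; the layer sizes, class name, and scoring head are illustrative assumptions.

```python
# Illustrative sketch only: self-attention over a user's action history to score
# the next item. Hyperparameters and the dot-product scoring head are assumptions;
# a causal attention mask is omitted here for brevity.
import torch
import torch.nn as nn

class NextItemSelfAttention(nn.Module):
    def __init__(self, num_items, embed_dim=64, max_len=50, num_heads=2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, embed_dim, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, embed_dim)           # positional awareness
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, item_seq):                                  # item_seq: (B, L) item ids
        positions = torch.arange(item_seq.size(1), device=item_seq.device)
        x = self.item_emb(item_seq) + self.pos_emb(positions)     # (B, L, D)
        # Self-attention lets each step weigh the whole history, so the model can
        # pick out the 'relevant' past items regardless of how far back they are.
        attn_out, _ = self.attn(x, x, x)
        h = self.norm(x + attn_out)[:, -1]                        # representation of the last step
        return h @ self.item_emb.weight.t()                       # scores over all candidate items

model = NextItemSelfAttention(num_items=1000)
scores = model(torch.randint(1, 1001, (4, 20)))                   # (4, 1001) next-item scores
```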

Keywords