[4Rin1-45] Sparse Contextual Sentence Representation for Fast Information Retrieval
Keywords: Information Retrieval, Deep Learning, Sparse Representation
Most existing neural ranking models are used to re-rank a small set of candidate documents for a given query, provided by an efficient first-stage retrieval model. To obtain a standalone neural ranking model, other studies have proposed sparse neural ranking models. In previous work, the sparsity of a representation is measured by its L0 norm and is usually induced via L1 regularization. However, such sparse learning risks producing a model that maps documents into a low-dimensional subspace. In this study, we propose a two-stage neural ranking model that maps documents into a high-dimensional space by learning an α-pseudo sparse representation. Our model achieves a higher score than previous models when ranking all the documents in a large-scale collection.
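For concreteness, the sketch below illustrates the baseline recipe mentioned in the abstract, not the proposed α-pseudo sparse model: a sentence encoder whose output sparsity is encouraged with an L1 penalty (a differentiable surrogate) and measured with the L0 norm. The encoder class, dimensions, and penalty weight are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SparseSentenceEncoder(nn.Module):
    """Hypothetical encoder mapping a dense sentence embedding to a
    high-dimensional sparse representation (names are illustrative)."""
    def __init__(self, dense_dim: int = 768, sparse_dim: int = 10000):
        super().__init__()
        self.proj = nn.Linear(dense_dim, sparse_dim)

    def forward(self, dense_emb: torch.Tensor) -> torch.Tensor:
        # ReLU keeps activations non-negative, which allows exact zeros.
        return torch.relu(self.proj(dense_emb))

def l1_penalty(reps: torch.Tensor, weight: float = 1e-3) -> torch.Tensor:
    # L1 regularization: the common differentiable surrogate used to push
    # the (non-differentiable) L0 norm of the representation down.
    return weight * reps.abs().sum(dim=-1).mean()

def l0_norm(reps: torch.Tensor) -> torch.Tensor:
    # L0 "norm": number of non-zero dimensions, used here only to
    # measure how sparse the learned representations actually are.
    return (reps != 0).float().sum(dim=-1)

# Toy usage with one batch of dense sentence embeddings.
encoder = SparseSentenceEncoder()
dense = torch.randn(4, 768)
sparse = encoder(dense)
loss = l1_penalty(sparse)   # added to the ranking loss during training
print(l0_norm(sparse))      # active dimensions per sentence
```

In this setup the L1 term drives many dimensions to zero so that inverted-index-style retrieval stays fast, which is exactly the regime where, as the abstract notes, representations can collapse into a subspace.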