JSAI2020

Presentation information

Interactive Session

[4Rin1] Interactive 2

Fri. Jun 12, 2020 9:00 AM - 10:40 AM Room R01 (jsai2020online-2-33)

[4Rin1-45] Sparse Contextual Sentence Representation for Fast Information Retrieval

〇Taku Hasegawa1, Kyosuke Nishida1, Soichiro Kaku1, Junji Tomita1 (1.NTT Media Intelligence Laboratories, NTT Corporation)

Keywords: Information Retrieval, Deep Learning, Sparse representation

Most existing neural ranking models are employed to re-rank a small set of relevant documents for a given query, provided by an efficient retrieval model. In other studies, sparse neural ranking models have been proposed to achieve a standalone neural ranking model. The sparsity of the representation is measured by the L0 norm and, in previous work, is often achieved via L1 regularization. However, such sparse learning carries the risk of producing a model that maps documents into a subspace. In this study, we propose a two-stage neural ranking model that can map documents into a high-dimensional space by learning an α-pseudo sparse representation. Our model achieves a higher score than previous models over all the documents in a large-scale collection.
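
As a point of reference for the background the abstract describes (sparsity measured by the L0 norm and typically encouraged via L1 regularization in prior sparse neural ranking work), below is a minimal, hypothetical PyTorch sketch. It is not the authors' α-pseudo sparse method; the names (ToySparseEncoder, l1_weight) and dimensions are illustrative assumptions.

```python
# Sketch only: a toy encoder whose output is pushed toward sparsity with an
# L1 penalty, while achieved sparsity is reported via the L0 "norm"
# (the count of non-zero dimensions). Not the authors' implementation.
import torch
import torch.nn as nn

class ToySparseEncoder(nn.Module):
    def __init__(self, in_dim: int = 768, out_dim: int = 10000):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU keeps the representation non-negative, so exact zeros are possible.
        return torch.relu(self.proj(x))

encoder = ToySparseEncoder()
dense_input = torch.randn(4, 768)      # stand-in for contextual sentence vectors
sparse_repr = encoder(dense_input)

ranking_loss = torch.zeros(())         # placeholder for a query-document ranking loss
l1_weight = 1e-3                       # regularization strength (assumed value)
loss = ranking_loss + l1_weight * sparse_repr.abs().sum(dim=-1).mean()

# L0 "norm": how many dimensions are actually non-zero per sentence.
l0_per_sentence = (sparse_repr != 0).float().sum(dim=-1)
print(l0_per_sentence)
```

In this kind of setup, sparser representations make inverted-index-style retrieval faster, which is the motivation the abstract gives for standalone sparse neural ranking models.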
