3:40 PM - 4:00 PM
[2S5-IS-2c-02] Symbolic piano music understanding from large-scale pre-training
Work in progress
Keywords: pre-training, music understanding, NLP
Pre-training on vast amounts of data has shown great power in natural language understanding. Existing works that apply pre-training to symbolic music, however, are not general enough to tackle the full range of music information retrieval tasks. To address this gap and enable comparison with prior work, we employed a BERT-like masked language pre-training approach to train a stacked Music Transformer on polyphonic piano MIDI files from the MAESTRO dataset. We then fine-tuned the pre-trained model on several symbolic music understanding tasks. In our current work in progress, we have completed several note-level tasks, including next-token prediction, melody extraction, velocity prediction, and chord recognition, and compared our model against previous works.
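The BERT-like masked pre-training objective mentioned in the abstract can be sketched as follows. This is a minimal illustration of the standard BERT masking scheme applied to a MIDI token sequence, not the authors' code; `MASK_ID`, `vocab_size`, and the 80/10/10 split are assumptions carried over from BERT.

```python
import random

MASK_ID = 0  # hypothetical id reserved for the [MASK] token

def mask_tokens(tokens, mask_prob=0.15, vocab_size=512, seed=None):
    """BERT-style masking over a token sequence (e.g. MIDI events).

    Roughly mask_prob of the positions become prediction targets; of
    those, 80% are replaced by [MASK], 10% by a random token, and 10%
    are left unchanged. Labels hold the original token at target
    positions and -100 (the usual ignore index) elsewhere.
    """
    rng = random.Random(seed)
    inputs, labels = list(tokens), [-100] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # this position is a prediction target
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID  # replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(1, vocab_size)  # random token
            # else: keep the original token unchanged
    return inputs, labels
```

A Transformer is then trained to recover the original tokens at the masked positions, after which its weights are reused for fine-tuning on the downstream note-level tasks.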