JSAI2024

Presentation information

Poster Session

[3Xin2] Poster session 1

Thu. May 30, 2024 11:00 AM - 12:40 PM Room X (Event hall 1)

[3Xin2-61] Evaluating the Capability of Transformers in Computational Algebra

〇Yuta Sato1, Kazuhiko Kawamoto2, Hiroshi Kera2 (1.Chiba University, 2.Graduate School of Engineering, Chiba University)

Keywords: Transformer, computational algebra, machine learning

Recent studies have revealed that Transformers, among the most successful deep neural network architectures in recent machine learning, can learn to solve algebraic computation problems from many pairs of problems and solutions. For example, symbolic integration can be learned through supervised learning on many functions and their corresponding primitive functions. Unlike conventional mathematical approaches, this learning approach requires no explicit algorithm design. Most prior studies address individual challenging algebraic problems, including NP-hard ones, but a comprehensive investigation of the learning approach is still lacking. In this study, we analyze the ability of a standard Transformer to learn algebraic computations across several computational algebra problems. Our experiments show that Transformers successfully learn polynomial factorization, polynomial greatest common divisor computation, and polynomial factor counting when the polynomials have integer coefficients. Interestingly, we observe a significant performance degradation when finite field coefficients are used. We also observe that the input encoding scheme affects performance.
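
The approach described above is essentially sequence-to-sequence supervised learning: each training example pairs an algebraic problem with its solution, both serialized as token sequences. The following is a minimal sketch, not taken from the paper, of how such problem-solution pairs could be generated for polynomial factorization over the integers, assuming SymPy is available; the plain string serialization stands in for whichever input encoding scheme the study actually uses, and the function names are illustrative only.

# Hypothetical data-generation sketch: (expanded polynomial, factored form)
# pairs with small integer coefficients, serialized as strings for a
# sequence-to-sequence Transformer.
import random
import sympy as sp

x = sp.symbols("x")

def random_factor(max_degree=2, coeff_bound=3):
    # Random polynomial factor with small integer coefficients.
    degree = random.randint(1, max_degree)
    coeffs = [random.randint(-coeff_bound, coeff_bound) for _ in range(degree + 1)]
    if coeffs[0] == 0:
        coeffs[0] = 1  # keep the leading coefficient nonzero
    return sum(c * x ** (degree - i) for i, c in enumerate(coeffs))

def make_pair(num_factors=2):
    # The problem is the expanded product; the target is its factorization,
    # recomputed with sympy.factor so input and output are always consistent.
    product = sp.Mul(*[random_factor() for _ in range(num_factors)])
    problem = sp.expand(product)
    solution = sp.factor(problem)
    return str(problem), str(solution)

if __name__ == "__main__":
    for _ in range(3):
        src, tgt = make_pair()
        print("input :", src)
        print("target:", tgt)

A finite-field variant of this sketch could draw coefficients modulo a prime and factor with SymPy's modulus argument, which would roughly correspond to the setting in which the reported performance degradation is observed.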
