JSAI2023

Presentation information

General Session

[2A6-GS-2] Machine learning

Wed. Jun 7, 2023 5:30 PM - 7:10 PM Room A (Main hall)

Chair: 森 隼基 (NEC) [On-site]

6:50 PM - 7:10 PM

[2A6-GS-2-05] A Study on Performance and Efficiency Improvement of FT-Transformer

〇Tokimasa Isomura1, Tomoki Amano1, Ryotaro Shimizu1, Masayuki Goto1 (1. Waseda University)

Keywords:Transformer, Attention, Tabular Data, Explainable AI, DNN

In recent studies, deep learning (DL) models have demonstrated high performance on various datasets. FT-Transformer (FTT), which applies the Transformer model to tabular data, has been proposed as an effective DL model for tabular data. FTT outperformed gradient boosting models, the current mainstream for tabular data, on several datasets. Originally proposed for unstructured data, the Transformer achieves high performance by exhaustively modeling the relationships between all features (e.g., words and image patches) through the attention mechanism. However, the relationships between features in tabular data can be considered less complex than those in unstructured data such as documents and images. Therefore, we propose an improved FTT suited to tabular data that avoids excessively modeling unnecessary relationships between features in the Transformer's attention mechanism, improving both performance and computational efficiency. We conduct evaluation experiments on regression, binary classification, and multi-class classification tasks and show our model's effectiveness.

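The sketch below is not the authors' implementation; it only illustrates the two ingredients the abstract refers to: FT-Transformer-style feature tokenization (each tabular column becomes its own token alongside a [CLS] token) and an attention layer whose feature-to-feature interactions can be restricted by a mask. The class names (FeatureTokenizer, MaskedFTBlock) and the particular sparsity pattern (features attend only to themselves and to [CLS]) are assumptions for illustration, since the abstract does not specify how the unnecessary relationships are pruned.

```python
# Minimal sketch (assumed, not the paper's method) of tokenizing tabular
# features and restricting feature-to-feature attention with a boolean mask.
import torch
import torch.nn as nn

class FeatureTokenizer(nn.Module):
    """Maps each numerical feature x_j to its own d-dimensional token W_j * x_j + b_j."""
    def __init__(self, n_features: int, d_token: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_features, d_token) * 0.02)
        self.bias = nn.Parameter(torch.zeros(n_features, d_token))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features) -> tokens: (batch, n_features, d_token)
        return x.unsqueeze(-1) * self.weight + self.bias

class MaskedFTBlock(nn.Module):
    """One Transformer encoder block whose attention can be masked per token pair."""
    def __init__(self, d_token: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_token, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_token)
        self.norm2 = nn.LayerNorm(d_token)
        self.ffn = nn.Sequential(nn.Linear(d_token, 4 * d_token), nn.GELU(),
                                 nn.Linear(4 * d_token, d_token))

    def forward(self, tokens, attn_mask=None):
        h, _ = self.attn(tokens, tokens, tokens, attn_mask=attn_mask)
        tokens = self.norm1(tokens + h)
        return self.norm2(tokens + self.ffn(tokens))

if __name__ == "__main__":
    batch, n_features, d_token = 32, 8, 64
    x = torch.randn(batch, n_features)

    tokenizer = FeatureTokenizer(n_features, d_token)
    cls = torch.zeros(1, 1, d_token)
    tokens = torch.cat([cls.expand(batch, -1, -1), tokenizer(x)], dim=1)  # (batch, 1+n_features, d)

    # Hypothetical sparsity pattern: [CLS] attends to everything, feature tokens
    # attend only to themselves and to [CLS]. True entries mean "do NOT attend"
    # (PyTorch convention for boolean attention masks).
    n_tokens = n_features + 1
    mask = torch.ones(n_tokens, n_tokens, dtype=torch.bool)
    mask[0, :] = False                 # [CLS] sees all tokens
    mask[:, 0] = False                 # every token sees [CLS]
    mask.fill_diagonal_(False)         # every token sees itself

    block = MaskedFTBlock(d_token)
    out = block(tokens, attn_mask=mask)
    print(out.shape)  # torch.Size([32, 9, 64])
```

Compared with full self-attention over all feature pairs, a mask like this reduces the interactions the model must fit, which is the kind of trade-off between performance and computational efficiency the abstract discusses.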