Keywords: Nonlinear Feature Selection, Structured Regularization Learning, Multi-Task Learning, Independence Criterion, Multiple Kernel Learning
To capture non-linear dependencies, feature selection methods based on regularized learning and independence criteria have attracted much attention. In particular, HSIC Lasso is one of the most effective sparse non-linear feature selection methods, built on the Hilbert-Schmidt independence criterion. However, feature selection based on a single basis kernel function tends to produce ambiguous results, and relevant features may be missed in certain problem settings. In this study, we propose a multi-task learning method that uses multiple basis kernel functions together with a non-negatively constrained Group Lasso, with the kernels for each feature forming a group, so that useful features can be clearly selected on the basis of multiple independence measures. We applied the method to several synthetic and real-world datasets and verified its effectiveness in terms of redundancy, sparsity, and the classification and prediction accuracy obtained with the selected features. The results indicate that the method removes irrelevant features far more aggressively while retaining the relevant ones.
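The HSIC Lasso baseline that this work extends can be sketched as a non-negative Lasso over vectorized, centered Gram matrices: each feature contributes one kernel, and a feature's weight is non-zero only if its kernel explains dependence with the output. The sketch below is a minimal single-kernel illustration under our own assumptions (Gaussian kernel with a fixed width, a simple coordinate-descent solver, and the function names); it is not the paper's multi-kernel, group-structured implementation.

```python
import numpy as np

def gaussian_gram(v, sigma=1.0):
    # Centered Gaussian Gram matrix H K H of a 1-D variable
    # (sigma is an assumed fixed kernel width).
    d = v[:, None] - v[None, :]
    K = np.exp(-d**2 / (2.0 * sigma**2))
    n = len(v)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def hsic_lasso(X, y, lam=1.0, n_iter=200):
    # Non-negative Lasso over vectorized centered Gram matrices:
    #   min_a 0.5 * || Lbar - sum_k a_k Kbar_k ||_F^2 + lam * sum(a),  a >= 0
    n, d = X.shape
    L = gaussian_gram(y).ravel()
    Ks = np.stack([gaussian_gram(X[:, k]).ravel() for k in range(d)])
    a = np.zeros(d)
    sq = np.einsum('ij,ij->i', Ks, Ks)  # squared Frobenius norm per feature
    for _ in range(n_iter):
        for k in range(d):
            # Residual with feature k's contribution removed,
            # then a non-negative soft-threshold update.
            r = L - Ks.T @ a + Ks[k] * a[k]
            a[k] = max(0.0, (Ks[k] @ r - lam) / sq[k])
    return a

# Toy check: y depends non-linearly on x0 only; x1 is pure noise.
rng = np.random.default_rng(0)
x0 = rng.normal(size=60)
X = np.column_stack([x0, rng.normal(size=60)])
y = np.sin(x0)
w = hsic_lasso(X, y)
```

On such data the relevant feature receives a clearly larger weight than the noise feature; the proposed method replaces the single Gram matrix per feature with a group of matrices from multiple basis kernels and applies a non-negative Group Lasso penalty over each group.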