2:40 PM - 3:00 PM
[1S3-GS-2-04] Dynamic Negative Correlation Learning in Deep Ensemble Learning
Keywords: Deep Ensemble Learning, Negative Correlation Learning
Deep ensemble learning, in which multiple models are combined, is widely used to improve the generalization performance of deep learning models. To maximize the effect of ensemble learning, "diversity," meaning that each model produces a different output distribution, is important, so both ensuring diversity and improving accuracy are required. In this study, we propose a new method called "Dynamic Negative Correlation Learning (DNCL)" that dynamically adjusts the balance between diversity and accuracy during training. In conventional methods, the parameter controlling this balance is fixed, making it difficult to respond flexibly to the state of training. In contrast, DNCL aims to improve ensemble performance by dynamically adjusting the parameter as training progresses. Experiments on an image recognition task confirmed that DNCL outperforms conventional methods. We also found that the parameter trajectory varied with model size and task difficulty, suggesting that the method adapts to a variety of models and datasets.
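As an illustration of the idea, the sketch below shows the standard negative correlation learning loss with the penalty weight supplied per training step instead of fixed, so that a schedule can vary it. It is a minimal PyTorch-style sketch assuming the usual NCL penalty form; the linear schedule `lam_schedule`, its `lam_max` parameter, and the specific adjustment rule are illustrative assumptions, since the abstract does not specify how DNCL updates the parameter.

    import torch

    def ncl_loss(outputs, target, lam):
        # outputs: list of per-model predictions, each shaped like `target`
        # lam: penalty weight; fixed in plain NCL, passed per step here so a
        #      schedule can change it during training (DNCL-style adjustment)
        ensemble = torch.stack(outputs).mean(dim=0)        # ensemble mean prediction
        loss = 0.0
        for f_i in outputs:
            mse = torch.mean((f_i - target) ** 2)          # accuracy term
            penalty = -torch.mean((f_i - ensemble) ** 2)   # diversity (negative correlation) term
            loss = loss + mse + lam * penalty
        return loss / len(outputs)

    # Hypothetical schedule: grow the penalty weight linearly over training.
    # The actual DNCL update rule is not given in the abstract.
    def lam_schedule(step, total_steps, lam_max=0.5):
        return lam_max * step / total_steps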