7:00 PM - 7:20 PM
[1F5-GS-10-05] A Dynamic Ensemble Method for Adapting to Distribution Shifts without Retraining
Keywords: Machine Learning, Ensemble, Concept Drift, MLOps
In real-world deployment, machine learning models often experience concept drift, which degrades predictive performance. Existing dynamic ensembles use a single set of weights, limiting their ability to address drifts localized to particular parts of the feature space. We propose a Two-Layer Conditional Dynamic Ensemble (TL-CDE) that partitions the feature space into subregions defined by multiple conditions via a two-stage process and assigns distinct weights to each subregion. By leveraging a single pretrained model and its modulated variants, TL-CDE maintains performance without retraining, while its weight-update logs provide detailed insight into where and how drift occurs. Experiments on synthetic and real-world datasets demonstrate that TL-CDE outperforms existing methods.
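The following is a minimal sketch of the general idea of region-wise dynamic weighting described in the abstract, not the authors' TL-CDE implementation: the class name, the exponential-weight update rule, and the threshold-based region function are all illustrative assumptions. It shows per-subregion weights over a pretrained base model and its shifted variants, updated online without retraining, with a weight-update log.

```python
# Illustrative sketch only; names and update rule are assumptions, not the paper's method.
import numpy as np

class RegionWiseDynamicEnsemble:
    def __init__(self, predictors, region_fn, n_regions, lr=0.1):
        self.predictors = predictors          # list of callables: X -> predictions
        self.region_fn = region_fn            # callable: X -> region index per sample
        self.weights = np.full((n_regions, len(predictors)), 1.0 / len(predictors))
        self.lr = lr
        self.log = []                         # weight-update log for drift inspection

    def predict(self, X):
        regions = self.region_fn(X)
        preds = np.stack([p(X) for p in self.predictors], axis=1)  # (n, k)
        w = self.weights[regions]                                  # (n, k)
        return (w * preds).sum(axis=1)

    def update(self, X, y):
        """Exponentially reweight predictors per region from squared errors."""
        regions = self.region_fn(X)
        preds = np.stack([p(X) for p in self.predictors], axis=1)
        for r in np.unique(regions):
            mask = regions == r
            err = ((preds[mask] - y[mask, None]) ** 2).mean(axis=0)
            new_w = self.weights[r] * np.exp(-self.lr * err)
            new_w /= new_w.sum()
            self.log.append((r, self.weights[r].copy(), new_w.copy()))
            self.weights[r] = new_w

# Toy usage: one pretrained linear model plus two bias-shifted variants,
# regions defined by a simple threshold on the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.0, -0.5]) + 0.3 * (X[:, 0] > 0)   # drift localized to x0 > 0
base = lambda X: X @ np.array([1.0, -0.5])
ens = RegionWiseDynamicEnsemble(
    predictors=[base, lambda X: base(X) + 0.3, lambda X: base(X) - 0.3],
    region_fn=lambda X: (X[:, 0] > 0).astype(int),
    n_regions=2,
)
ens.update(X, y)
print(ens.weights)   # region 1 should come to favor the +0.3-shifted variant
```

In this sketch, only the per-region weights change in response to localized drift; the base predictors are never retrained, and the log records which region's weights moved and by how much.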