JSAI2019

Presentation information

General Session

[1J3-J-2] Machine learning: Bayesian models

Tue. Jun 4, 2019 3:20 PM - 4:40 PM Room J (201B Medium meeting room)

Chair: Ichigaku Takigawa, Reviewer: Satoshi Oyama

3:40 PM - 4:00 PM

[1J3-J-2-02] Scalable Bayesian Optimization with Memory Retention

〇Hidetaka Ito1, Tatsushi Matsubayashi1, Takeshi Kurashima1, Hiroyuki Toda1 (1. NTT Service Evolution Laboratories, NTT Corporation)

Keywords: Bayesian Optimization, Gaussian Process

Bayesian optimization is a method for the global optimization of black-box functions using as few evaluations as possible. It uses Gaussian processes to efficiently select the parameters to be evaluated. However, it does not scale, because Gaussian process inference grows cubically with the number of iterations. In this work, we propose a method for scalable Bayesian optimization that leverages models used in past iterations, which we call past memory. This technique enables us to fit Gaussian processes only to the input-output pairs near the previously selected input parameter. In experiments, we show that our proposed method outperforms naive Bayesian optimization in terms of optimization performance under a limited time budget.
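To illustrate the idea described in the abstract, the sketch below runs a Bayesian optimization loop in which the Gaussian process is fit only to the observations nearest the previously selected input, keeping the per-iteration cost bounded instead of growing cubically. This is a minimal illustration under assumptions, not the authors' implementation: the toy `objective`, the neighborhood size `N_NEIGHBORS`, and the expected-improvement acquisition maximized over random candidates are all hypothetical choices.

```python
# Sketch: Bayesian optimization with a GP fit only to a local "memory" of
# observations near the previously selected input (illustrative assumptions,
# not the paper's actual algorithm).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
N_NEIGHBORS = 30                   # assumed size of the local subset
N_CANDIDATES = 500                 # random candidates for acquisition maximization
BOUNDS = np.array([[-2.0, 2.0]])   # 1-D toy search space


def objective(x):
    # Hypothetical black-box function to minimize.
    return np.sin(3 * x) + x ** 2


def expected_improvement(mu, sigma, best):
    # Standard expected improvement for minimization.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)


X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(3, 1))  # initial design
y = objective(X).ravel()
x_prev = X[np.argmin(y)]                                  # last selected input

for _ in range(30):
    # Fit the GP only to the observations closest to the previously selected
    # input, so the fit stays O(N_NEIGHBORS^3) rather than growing with iterations.
    dist = np.linalg.norm(X - x_prev, axis=1)
    idx = np.argsort(dist)[:N_NEIGHBORS]
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X[idx], y[idx])

    # Maximize EI over random candidates (a simple stand-in for a real optimizer).
    cand = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(N_CANDIDATES, 1))
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]

    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next)[0])
    x_prev = x_next

print("best value found:", y.min())
```

The design choice sketched here is only the local-subset aspect: selecting the nearest past observations approximates the "past memory" notion, while the acquisition function and candidate search are generic placeholders.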