The 81st JSAP Autumn Meeting, 2020

Presentation information

Oral presentation


[9p-Z10-1~11] 【CS.3】 Code-sharing Session of 3.3 & 4.4

Wed. Sep 9, 2020 1:00 PM - 5:00 PM Z10

Hiroyuki Suzuki (Gunma Univ.), Kazuya Nakano (Univ. of Miyazaki), Kenji Harada (Kitami Inst. of Tech.)

2:15 PM - 2:45 PM

[9p-Z10-4] [INVITED] Deep Depth from Aberration Map

Masako Kashiwagi¹, Nao Mishima¹, Tatsuo Kozakaya¹, Shinsaku Hiura² (1. Toshiba, 2. Univ. of Hyogo)

Keywords: computational photography, lens aberration, depth estimation

Passive and convenient depth estimation from a single-shot image is still an open problem. Existing depth-from-defocus methods require multiple input images or special hardware customization, and recent deep monocular depth estimation is limited to images with sufficient contextual information. In this work, we propose a novel method that realizes single-shot deep depth measurement from a physical depth cue using only an off-the-shelf camera and lens. A defocused image captured by a camera contains various types of aberrations that depend on the distance to the object and the position in the image plane. We name these complexly compounded aberrations the Aberration Map (A-Map) and show that the A-Map can be used as a reliable physical depth cue. We also propose a deep network, the A-Map Analysis Network (AMA-Net), which effectively learns and estimates depth from the A-Map. To evaluate the validity and robustness of our approach, we conducted extensive experiments on real outdoor scenes. The qualitative results show that our approach achieves highly accurate depth measurement and highly robust performance.
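
The following is a minimal, hypothetical sketch of the idea described above: a network that maps a single defocused RGB image to a per-pixel depth map, with image-plane coordinates appended as extra input channels because the aberration blur varies with position in the image plane. The abstract does not specify the AMA-Net architecture, so the encoder-decoder layout, channel sizes, and coordinate channels here are assumptions for illustration only, not the authors' actual network.

import torch
import torch.nn as nn


class DepthFromAberrationSketch(nn.Module):
    """Hypothetical single-shot depth estimator (not the actual AMA-Net)."""

    def __init__(self):
        super().__init__()
        # Input: 3 RGB channels + 2 normalized (x, y) coordinate channels,
        # since aberrations depend on position in the image plane.
        self.encoder = nn.Sequential(
            nn.Conv2d(5, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),  # 1-channel depth map
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # Append normalized image-plane coordinates so the network can learn
        # position-dependent aberration cues.
        b, _, h, w = rgb.shape
        ys = torch.linspace(-1.0, 1.0, h, device=rgb.device)
        xs = torch.linspace(-1.0, 1.0, w, device=rgb.device)
        yy, xx = torch.meshgrid(ys, xs, indexing="ij")
        coords = torch.stack([yy, xx]).expand(b, -1, -1, -1)
        x = torch.cat([rgb, coords], dim=1)
        return self.decoder(self.encoder(x))


if __name__ == "__main__":
    # Example: one defocused 256x256 RGB image -> per-pixel depth map.
    net = DepthFromAberrationSketch()
    image = torch.rand(1, 3, 256, 256)
    depth = net(image)
    print(depth.shape)  # torch.Size([1, 1, 256, 256])

In practice such a model would be trained on pairs of defocused images and ground-truth depth; the sketch only illustrates how a single image plus positional information can be turned into a dense depth prediction.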