An Asynchronous Kalman Filter for Hybrid Event Cameras
Z. Wang, Y. Ng, C. Scheerlinck, R. Mahony
International Conference on Computer Vision (ICCV), 2021
Abstract. Event cameras are ideally suited to capturing high dynamic range (HDR) visual information without blur, but perform poorly on static or slowly changing scenes. Conversely, conventional image sensors measure the absolute intensity of slowly changing scenes effectively but perform poorly on HDR or quickly changing scenes. In this paper, we present an event-based video reconstruction pipeline for HDR scenarios. The proposed algorithm includes a frame augmentation pre-processing step that deblurs and temporally interpolates frame data using events. The augmented frame and event data are then fused using a novel asynchronous Kalman filter under a unifying uncertainty model for both sensors. We evaluate our method on publicly available datasets with challenging lighting conditions and fast motion, as well as on our new dataset with HDR reference images. The proposed algorithm outperforms state-of-the-art methods in both absolute intensity error (48% reduction) and image similarity indexes (average 11% improvement).
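To illustrate the fusion idea at a high level, the sketch below shows a simplified per-pixel asynchronous Kalman filter in Python: events propagate the log-intensity state and inflate its uncertainty, while (augmented) frame measurements correct it. This is a discrete-time sketch under assumed noise parameters, not the paper's continuous-time gain formulation; the class name `PixelAKF` and all parameter names are illustrative.

```python
import numpy as np

class PixelAKF:
    """Minimal per-pixel asynchronous Kalman filter sketch (illustrative only,
    not the paper's exact continuous-time formulation).

    State: log-intensity of a single pixel.
    """

    def __init__(self, x0, P0, q_event, r_frame):
        self.x = x0          # log-intensity estimate
        self.P = P0          # estimate variance
        self.q = q_event     # noise variance added per event (assumed)
        self.r = r_frame     # frame measurement noise variance (assumed)

    def propagate_event(self, polarity, contrast_threshold):
        # Each event shifts the state by +/- one contrast threshold in
        # log-intensity and inflates the uncertainty by the event noise.
        self.x += polarity * contrast_threshold
        self.P += self.q

    def update_frame(self, log_frame_value):
        # Standard scalar Kalman update using the (deblurred, interpolated)
        # frame value as an absolute-intensity measurement.
        K = self.P / (self.P + self.r)
        self.x += K * (log_frame_value - self.x)
        self.P = (1.0 - K) * self.P
        return self.x

# Example usage with made-up values:
pix = PixelAKF(x0=np.log(0.5), P0=1.0, q_event=1e-3, r_frame=1e-2)
pix.propagate_event(polarity=+1, contrast_threshold=0.1)  # asynchronous event
estimate = pix.update_frame(np.log(0.6))                  # frame arrives later
```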
Available on the Computer Vision Foundation Open Access page (including PDF).
Reference:
- Z. Wang, Y. Ng, C. Scheerlinck, R. Mahony, “An Asynchronous Kalman Filter for Hybrid Event Cameras”, International Conference on Computer Vision (ICCV), October 2021.