SHIFT: A Synthetic Driving Dataset for Continuous Multi-Task Domain Adaptation


Abstract

Adapting to a continuously evolving environment is a safety-critical challenge inevitably faced by all autonomous-driving systems. Existing image- and video-based driving datasets, however, fall short of capturing the mutable nature of the real world. In this paper, we introduce SHIFT, the largest synthetic dataset for autonomous driving. It presents discrete and continuous shifts in cloudiness, rain and fog intensity, time of day, and vehicle and pedestrian density. Featuring a comprehensive sensor suite and annotations for several mainstream perception tasks, SHIFT enables investigating how a perception system's performance degrades at increasing levels of domain shift, fostering the development of continuous adaptation strategies to mitigate this problem and to assess the robustness and generality of a model. Our dataset and benchmark toolkit are publicly available at https://vis.xyz/shift.

Publication
In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2022
Tao Sun
M.Sc. in Computer Science