Datasets
Our research group works on a range of topics in Robotics and Autonomous Vehicles, and we are happy to share our data with other researchers. Please cite the respective publication when using this data.
Adver-City is the first open-source collaborative perception dataset focused on adverse weather conditions. It comprises 110 scenarios with multi-modal data from LiDARs, cameras, GNSS, and IMU, captured from five viewpoints. Simulated in CARLA with OpenCDA, the dataset contains over 29 thousand frames and 890 thousand annotations. Click here for more!
The QueensCAMP dataset is a collection of RGB-D images designed to evaluate the robustness of VSLAM systems in real-world indoor environments. Its sequences feature dynamic objects, motion blur, lighting changes, and other challenges common to indoor scenes, and additionally include emulated lens failures. Click here for more!