Open Datasets

Datasets

  • Cruise open source data viewer and sample data
  • Waymo Open Dataset. High-resolution lidar and camera data collected by self-driving cars across a diverse range of situations.
  • Boxy vehicle detection dataset. 1.99 million annotated vehicles in 200,000 images, with axis-aligned bounding box (AABB) and keypoint labels.
  • Unsupervised LLAMAS dataset. A lane marker detection and segmentation dataset of 100,000 images with 3D lines, pixel-level dashed markers, and curves for individual lines.
  • The Bosch Small Traffic Lights Dataset (BSTLD). A dataset for traffic light detection, tracking, and classification (a minimal annotation-parsing sketch follows this list).
  • DIY Robocars Oakland warehouse track
  • AMUSE – the automotive multi-sensor (AMUSE) dataset, recorded in real traffic scenes during multiple test drives. (Philipp Koschorrek et al.)
  • Autonomous Driving – semantic segmentation, pedestrian detection, virtual-world data, far infrared, stereo, driver monitoring (CVC research center and the UAB and UPC universities)
  • Joint Attention in Autonomous Driving (JAAD) – instances of pedestrians and cars intended primarily for behavioural studies and detection in the context of autonomous driving. (Iuliia Kotseruba, Amir Rasouli and John K. Tsotsos)
  • Oxford RobotCar dataset – a 10 km route driven 100 times in all conditions of traffic, light, and weather.
  • German road signs dataset – 50,000 unique images of ~300 different road signs in a wide variety of conditions: angle, distance, clarity, resolution, and light.
  • LISA Vehicle Detection Dataset – colour first-person driving video under various lighting and traffic conditions (Sivaraman, Trivedi)
  • Lost and Found Dataset – addresses the problem of detecting unexpected small road hazards (often caused by lost cargo) for autonomous driving applications. (Sebastian Ramos, Peter Pinggera, Stefan Gehrig, Uwe Franke, Rudolf Mester, Carsten Rother)
  • SHRP2 Study – a naturalistic driving dataset that includes approximately 2,000,000 vehicle miles, almost 43,000 hours of data, 241 primary and secondary drivers, 12 to 13 months of data collection per vehicle, and data from a highly capable instrumentation system including 5 channels of video and many vehicle state and kinematic sensors.
  • UDRIVE – a ‘naturalistic driving’ dataset collected across 7 European countries; a very large project spanning multiple vehicle types, aiming for 100,000 hours of driving data and concluding in June 2017.
  • SYNTHIA (the SYNTHetic collection of Imagery and Annotations) – a large set (~half a million) of virtual-world images for training autonomous cars to see, aimed at semantic segmentation and related scene understanding problems in the context of driving scenarios. (ADAS Group at the Computer Vision Center, UAB)
  • Self Racing Cars – data collected at Self Racing Cars events, for possible use by competitors.
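
For anyone who wants to get hands-on quickly, here is a minimal Python sketch for summarising a BSTLD-style YAML annotation file. It is a sketch under assumptions, not an official loader: the field names below (path, boxes, label, x_min, y_min, x_max, y_max) are assumed from the dataset's published YAML layout, so check the dataset's README for the authoritative schema.

    # Minimal sketch: tally traffic-light labels in a BSTLD-style YAML annotation file.
    # Field names ('path', 'boxes', 'label', 'x_min', ...) are assumptions; consult the
    # dataset's README for the authoritative schema before relying on this.
    from collections import Counter

    import yaml  # pip install pyyaml

    def summarize_annotations(yaml_path):
        """Return a Counter of box labels found in the annotation file."""
        with open(yaml_path) as f:
            entries = yaml.safe_load(f)  # expected: a list of per-image records

        label_counts = Counter()
        for entry in entries:
            for box in entry.get("boxes", []):
                label_counts[box["label"]] += 1
                # Axis-aligned box corners, if needed downstream:
                # (box["x_min"], box["y_min"], box["x_max"], box["y_max"])
        return label_counts

    if __name__ == "__main__":
        counts = summarize_annotations("train.yaml")
        for label, n in counts.most_common():
            print(f"{label}: {n}")

The same pattern (parse the annotation file once, then iterate over per-image box records) carries over to the other detection datasets above, with the file format and field names swapped for whatever each dataset documents.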
