Researchers have created a new testing and training tool to inspire solutions to the particularly complex challenges of developing autonomous vehicles (AVs) capable of handling winter weather.

Released today, the free, open-source dataset - dubbed the Canadian Adverse Driving Conditions Dataset (CADC) - is a collaboration of research teams at Waterloo Engineering and the University of Toronto that are working to advance AV perception algorithms.

Their hope is that researchers around the world will use the dataset - essentially a collection of over 100 sophisticated, annotated video clips - to develop, test and improve their own software systems that enable AVs to see and recognize their surroundings in nasty weather.

"We want to engage the research community to generate new ideas and enable innovation," says Krzysztof Czarnecki, an electrical and computer engineering professor who leads the Waterloo team. "This is how you can solve really hard problems, the problems that are just too big for anyone to solve on their own."

Krzysztof Czarnecki is director of the Waterloo Intelligent Systems Engineering Lab (WISE Lab).

Researchers partnered on the project with Scale AI, a San Francisco-based AI infrastructure company that labelled vehicles, pedestrians and cyclists in data from 33 kilometres of driving in all kinds of winter conditions in Waterloo Region.

The data was recorded in 2018 and 2019 using eight cameras, a 360-degree lidar scanner and a GPS tracker aboard Autonomoose, the highly automated Lincoln MKZ hybrid that has been under development at Waterloo for more than three years.
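Lidar sweeps like the ones Autonomoose records are commonly distributed as flat binary files of float32 (x, y, z, intensity) records, a convention popularized by the KITTI dataset. The sketch below parses that layout; the field order, record size and file format are assumptions for illustration, and the actual CADC layout may differ.

```python
import struct

def load_sweep(raw: bytes):
    """Parse a lidar sweep stored as flat float32 (x, y, z, intensity)
    records -- a common convention (e.g. KITTI-style files); the actual
    CADC file layout may differ.
    """
    n = len(raw) // 16  # 4 floats * 4 bytes per point
    flat = struct.unpack(f"<{n * 4}f", raw[:n * 16])
    # Group the flat float stream into one 4-tuple per point.
    return [flat[i:i + 4] for i in range(0, len(flat), 4)]

# Demo with synthetic bytes instead of a real capture file.
raw = struct.pack("<8f", 1.0, 2.0, 0.5, 0.9, -3.0, 0.1, 0.2, 0.4)
points = load_sweep(raw)
print(len(points))  # 2
```

In practice the same parsing is usually done in one line with `numpy.frombuffer(...).reshape(-1, 4)`; the pure-stdlib version above just makes the byte layout explicit.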

"We're hoping that both industry and academia go nuts with it," says Steven Waslander, a former Waterloo Engineering professor who now heads the Toronto Robotics and AI Laboratory (TRAIL). "We want the world to be working on driving everywhere, and bad weather is a condition that is going to happen. We don't want Canada to be 10 or 15 years behind simply because conditions can be a bit tougher up here."

The first open-source tool of its kind, the dataset was created to give AV software developers a way to test how well their perception algorithms perform when everything looks a little different with snow on the ground and more falling from the sky.

"These differences mean a self-driving car might miss a vehicle, might miss a pedestrian," says Czarnecki, director of the Waterloo Intelligent Systems Engineering Lab (WISE Lab). "Accurate perception is absolutely crucial. Without it, nothing else works."


Coloured bounding boxes are used to identify vehicles, pedestrians and cyclists in a new dataset developed by researchers to test and train autonomous vehicles for winter driving.
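Annotations like the ones in the image are typically stored as 3D cuboids: a class label plus a centre position, box dimensions and a heading angle. The sketch below shows one plausible in-memory representation; the field names are illustrative, not the dataset's actual schema.

```python
from dataclasses import dataclass
import math

@dataclass
class Cuboid:
    """One annotated object as a 3D bounding box in the lidar frame.

    Field names and conventions are illustrative assumptions, not the
    dataset's actual annotation schema.
    """
    label: str     # e.g. "Car", "Pedestrian", "Cyclist"
    x: float       # centre position (metres)
    y: float
    z: float
    length: float  # box dimensions (metres)
    width: float
    height: float
    yaw: float     # heading around the vertical axis (radians)

    def corners_bev(self):
        """Four box corners in bird's-eye view (x, y), rotated by yaw."""
        c, s = math.cos(self.yaw), math.sin(self.yaw)
        half = [(self.length / 2, self.width / 2),
                (self.length / 2, -self.width / 2),
                (-self.length / 2, -self.width / 2),
                (-self.length / 2, self.width / 2)]
        return [(self.x + c * dx - s * dy, self.y + s * dx + c * dy)
                for dx, dy in half]

car = Cuboid("Car", x=10.0, y=-2.0, z=0.9,
             length=4.5, width=1.8, height=1.5, yaw=0.0)
print(car.corners_bev()[0])  # front-left corner: (12.25, -1.1)
```

Projecting these cuboids into each camera image is what produces the coloured boxes shown above.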

The dataset also allows researchers to train deep-learning artificial intelligence (AI) algorithms to detect objects in adverse conditions by exposing them to numerous annotated and verified examples.

Next steps in the project include adding labels for other objects, including traffic signals and road features, to the video segments, which vary from about 15 seconds to a minute in length.

"Data is a critical bottleneck in current machine learning research," says Alexandr Wang, founder and CEO of Scale. "Without reliable, high-quality data that captures the reality of driving in winter, it simply won't be possible to build self-driving systems that work safely in these environments."

In addition to the dataset, researchers have posted an academic article on the project on arXiv, with plans to seek publication in the International Journal of Robotics Research.

The Canadian Adverse Driving Conditions Dataset (CADC) is available online.