Researchers from ETH Zurich have launched a new competition for autonomous driving. The playing field for the competition is Duckietown. In this model town, small self-driving taxis equipped with a minicomputer, a camera, and a few LEDs transport ducklings from A to B. The platform was created at the Massachusetts Institute of Technology (MIT) and developed further at ETH Zurich.
The participants in the “Artificial Intelligence Driving Olympics”, abbreviated as “AI-DO”, have to teach these little robot taxis to stay in their lane, to recognise and avoid objects, and to interact with the rest of the taxi fleet in Duckietown.
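The lane-keeping task can be illustrated with a toy feedback controller: steer in proportion to how far the robot has drifted from the lane centre and how far its heading has turned away from it. This is only a hedged sketch under a simplified kinematic model; the function names, gains, and model are assumptions for illustration, not Duckietown's actual code.

```python
# Toy sketch of lane keeping (illustrative only, not the Duckietown API):
# a proportional controller steers based on the robot's lateral offset
# and heading error relative to the lane centre line.

def steering_command(lateral_offset, heading_error,
                     k_offset=2.0, k_heading=1.0):
    """Return an angular-velocity command that drives both errors to zero."""
    return -(k_offset * lateral_offset + k_heading * heading_error)

def simulate(steps=200, dt=0.05, speed=0.2):
    """Advance a simple kinematic model; the robot should settle on the centre."""
    offset, heading = 0.10, 0.0  # start 10 cm off-centre, heading straight
    for _ in range(steps):
        omega = steering_command(offset, heading)
        heading += omega * dt
        offset += speed * heading * dt  # small-angle approximation
    return offset, heading

final_offset, final_heading = simulate()
```

In the real competition the lateral offset and heading error would have to be estimated from the robot's camera images, which is exactly where the machine-learning part of the challenge comes in.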
Andrea Censi and Jacopo Tani, assistants to ETH Professor Emilio Frazzoli, came up with the idea for the competition. They would like to use the event to test the limits and possibilities of machine learning in physical robots.
Can artificial intelligence steer cars?
The research question to be investigated in the competition: can artificial intelligence (AI) soon make an active contribution to steering autonomous vehicles on the streets? Today, machine learning helps cars to do things like detect objects, but it does not actively make driving decisions. “Some robotics researchers think that this will soon be possible.” Censi remains sceptical. “But if my colleagues believe that they can implement that on real streets, then they should be able to do that in Duckietown easily,” he says with a chuckle. “They can prove it in our competition.”
Participants receive access to basic code, simulators in the Amazon cloud, and the so-called robotarium: an autonomous Duckietown platform with remote access that Jacopo Tani and his team have developed and built in the mechanics lab at ETH Zurich. In the qualification phase, teams from around the world work on the code for the different tasks. Using “containers” – packages of code that can be deployed on the robots at the push of a button – they can test and ultimately submit their code.
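A container of this kind might look roughly like the following generic Dockerfile. This is a sketch under assumptions: the base image, file names, and entry point are hypothetical and do not reflect the actual AI-DO submission format.

```dockerfile
# Hypothetical container for a competition entry (illustrative only;
# the real AI-DO tooling and base images may differ).
FROM python:3.6-slim

# Copy the team's agent code into the image and install its dependencies.
COPY agent/ /agent/
WORKDIR /agent
RUN pip install -r requirements.txt

# The same packaged agent can then run in the cloud simulator
# or on a physical robot in the robotarium.
CMD ["python", "run_agent.py"]
```

Packaging the agent this way is what lets identical code move between the Amazon-cloud simulators and the physical robots without modification.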