How the LAMBDA-V project could help make driverless cars more human

The Engineer

March 2019

David Fowler explores how the LAMBDA-V project is helping driverless cars to understand and embrace the unwritten rules of human driving behavior.

 

LAMBDA-V: artificial intelligence starts with human data

Last month the government announced moves to update regulations for testing autonomous vehicles, aiming to pave the way for fully self-driving vehicles to be introduced on UK roads by 2021.

Observers believe that cars able to steer, brake and accelerate by themselves on sections of motorway could become a reality within a few years.

But what about driving on the residential streets near your home: when will an autonomous vehicle (AV) be able to manage that without human intervention? How often do you find yourself in a situation where the rules governing driving – a combination of the law and the Highway Code – don’t really apply, and you have to use your own judgement? What will AVs do in such situations?

A new project, LAMBDA-V (Learning through AMBient Driving styles for Autonomous Vehicles), is looking into that question. Starting last November, a one-year £244,000 feasibility study, with funding from Innovate UK, is collecting data from “ambient” driver behaviour to see whether it can be codified into a set of rules for AVs to follow.
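What "codifying ambient behaviour into rules" might look like can be sketched in code. The records and thresholds below are purely illustrative — the project's actual data schema and methods are not described in the article — but the basic idea of mining frequently repeated situation/action pairs into candidate rules could look like this:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record of one "ambient" driving decision; the fields
# are illustrative, not the project's real data schema.
@dataclass(frozen=True)
class Observation:
    situation: str   # e.g. "narrow road, oncoming car"
    action: str      # e.g. "pull into passing space"

def candidate_rules(observations, min_support=0.8):
    """Propose a rule for each situation where most observed drivers
    take the same action (agreement above min_support)."""
    by_situation = {}
    for obs in observations:
        by_situation.setdefault(obs.situation, Counter())[obs.action] += 1
    rules = {}
    for situation, actions in by_situation.items():
        action, count = actions.most_common(1)[0]
        if count / sum(actions.values()) >= min_support:
            rules[situation] = action
    return rules

# Nine drivers pull into the passing space; one reverses instead.
obs = [Observation("narrow road, oncoming car", "pull into passing space")] * 9
obs.append(Observation("narrow road, oncoming car", "reverse"))
print(candidate_rules(obs))
# {'narrow road, oncoming car': 'pull into passing space'}
```

A real system would of course work from telematics signals rather than labelled strings, but the principle — turning repeated human judgements into explicit, followable rules — is the same.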

The project is led by machine learning specialist CloudMade, with telematics and Big Data analytics firm Trakm8, traffic modelling specialist Aimsun (a Siemens company), and Birmingham City Council, which wants to know what the implications will be for how its road network is operated. Andy Graham of White Willow Consulting is project manager.

An observed drive Graham took from his home to the M25 illustrates why the project is needed. In the 3.1 miles, there were 23 situations in which he took decisions that could prove challenging to an AV because they were not strictly in compliance with the Highway Code.

Incidents included a car parked just before a road junction, making it necessary to approach the junction on the right-hand side of the road; a place where another parked car made it necessary to drive with two wheels on the kerb to get past; a stretch of road wide enough for only one vehicle, with a blind bend and just one passing space midway along, posing the dilemma of what happens if you meet something coming the other way; and a junction where, to turn left and then right at the junction immediately following, it is necessary to make the left turn in the right-hand lane in order to be positioned for the right turn.

In situations such as the narrow road, human drivers will signal by hand or flash their headlights to invite another vehicle to go first. It is unclear how AVs will react to such situations and to this kind of informal driver behaviour.

With the average age of the car fleet at around eight years, it would take at least that long for half the fleet to become autonomous even if every new vehicle were an AV. AVs will therefore have to mix with human drivers for a considerable period. “Humans drive vehicles: there is a need to understand human behaviour and how AVs react, to tailor the early AVs so they drive like humans,” said Graham. If they don’t, the result could be confusing and unsettling both for human drivers and for AV passengers.
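The fleet-turnover arithmetic behind that estimate can be sketched with a back-of-the-envelope model. Assuming, simplistically, a uniform age distribution (so an average age of eight years implies a typical lifetime of about sixteen) and that every replacement car is an AV:

```python
# Toy fleet-turnover model. If the average age of cars on the road is
# about 8 years, a roughly uniform age distribution implies a typical
# lifetime of ~16 years, so about 1/16 of the fleet is retired each
# year. Even if every replacement were an AV, reaching a given AV
# share of the fleet takes:
def years_until_av_share(target_share, avg_fleet_age=8):
    lifetime = 2 * avg_fleet_age       # uniform-age assumption
    annual_turnover = 1 / lifetime     # fraction of fleet replaced per year
    return target_share / annual_turnover

print(years_until_av_share(0.5))  # → 8.0 years, matching the article's estimate
```

The assumptions are deliberately crude — real scrappage rates vary with age and AV uptake would be gradual — so the true mixed-traffic period would be longer still, reinforcing the article's point.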

The partners in LAMBDA-V have complementary skills to bring to bear on the problem.

Read the full article for more detail.