Monday, 5 December 2016

How would you treat a driverless car?

Scenario: Say you're driving down a two-way street and there's a vehicle parked in the opposite lane. The oncoming traffic therefore needs to pull out into your lane to overtake.

What do you do?
Many of us just drive on as we have right of way. But eventually one of us feels charitable and slows down to allow the oncoming car to overtake, giving permission with a quick flash of headlights or a beckoning wave.

Now consider: what if that oncoming car were a driverless, or autonomous, vehicle (AV)?
Would it be able to understand what you mean when you flash your lights or frantically wave your hands?

Its sensors could decide that it's only safe to overtake when there's no oncoming traffic at all, which on a busy road may be never, leading to increasingly exasperated passengers and increasingly angry drivers queuing behind.
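
To make that worry concrete, here is a minimal, purely illustrative sketch (in Python, with made-up numbers and names, not anything a real AV actually runs) of such an over-cautious gap-acceptance rule: the car pulls out only when the gap in oncoming traffic exceeds a very large threshold, and on a busy road that gap may never appear.

```python
# Illustrative sketch only -- hypothetical values, not any manufacturer's logic.
# An overly cautious overtaking rule: pull out only when the gap in
# oncoming traffic exceeds a (very large) safety threshold.

def safe_to_overtake(gap_to_next_oncoming_car_s: float,
                     required_gap_s: float = 30.0) -> bool:
    """Return True only if the next oncoming car is far enough away.

    gap_to_next_oncoming_car_s: seconds until the next oncoming vehicle
        reaches the parked obstacle (a hypothetical sensor estimate).
    required_gap_s: the margin the planner insists on; set very high
        here to mimic a 'safety-first' policy.
    """
    return gap_to_next_oncoming_car_s >= required_gap_s


# On a busy road the observed gaps rarely exceed a few seconds,
# so an ultra-conservative threshold means the car may never move.
observed_gaps = [4.0, 7.5, 3.2, 9.8, 6.1]
print(any(safe_to_overtake(g) for g in observed_gaps))  # False -> stuck
```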

These safety-first robot cars could become victims of their own politeness and end up being bullied and ignored by aggressive, impatient humans.

This, at any rate, is one of the conclusions to be drawn from research carried out by Dr Chris Tennant of the psychological and behavioural science department at the London School of Economics.

His Europe-wide survey, commissioned by tyre-maker Goodyear, finds that nearly two-thirds of drivers think machines won't have enough common sense to interact with human drivers.

And more than two-fifths think a robot car would remain stuck behind our hypothetical parked vehicle for a long time.

Robot v. human
Driving isn't just about technology and engineering, it's about human interactions and psychology.

"The road is a social space," as Carlos Cipolitti, general director of the Goodyear Innovation Centre in Luxembourg, puts it.

And it is this social aspect that makes many people sceptical about driverless cars.

"If you view the road as a social space, you will consciously negotiate your journey with other drivers. People who like that negotiation process appear to feel less comfortable engaging with AVs than with human drivers," says Mr Tennant in his report.

Of course, humans are always sceptical about new technologies of which they have little experience. That scepticism usually diminishes with usage, however.

And even many sceptics accept that emotionless AVs could cause fewer accidents than we humans, with our propensity to road rage, tiredness and lack of concentration.

A statistic often cited is that human error is responsible for more than 90% of accidents.

But 70% of the 12,000 people Dr Tennant and his team interviewed agreed: "As a point of principle, humans should be in control of their vehicles."

And an even greater proportion - 80% - thought an autonomous vehicle should always have a steering wheel.

AV pioneer Google - which aims to develop cars without steering wheels - reckons it can meet most of these real-world challenges.

It has already filed patent applications for technology that it claims will be able to identify aggressive or reckless driving and respond to it, and to recognise and react to the flashing lights of police cars and other emergency vehicles.

In time, then, it may well be able to program its cars to recognise the different meanings of headlight flashes, and to interpret the intentions of human drivers from their behaviour.

In the latest Google self-driving car project monthly report, head honcho Dmitri Dolgov says: "Over the last year, we've learned that being a good driver is more than just knowing how to safely navigate around people, [it's also about] knowing how to interact with them."

These interactions are "a delicate social dance", he writes, claiming that Google cars can now "often mimic these social behaviours and communicate our intentions to other drivers, while reading many cues that tell us if we're able to pass, cut in or merge."

Google's test cars have now racked up more than two million fully autonomous miles on public roads in California, Arizona, Texas and Washington, reporting a handful of minor accidents to the Californian authorities.

Interestingly, quite a few of these accidents have involved human-driven vehicles going into the back of the Google cars, suggesting perhaps that the ultra-cautious robots, with safety as their first priority, are more timid in their approach than we're used to.

Mr Dolgov admits that the self-driving software is not yet ready for commercial release.



Source: BBC
