David Fritz
2017-10-12 20:02:32 UTC
Last year, a dark historical landmark was reached. Joshua Brown became the
first confirmed person to die in a crash where the car was, at least in
part, driving itself. On a Florida highway, his Tesla Model S ploughed
underneath a white truck trailer that was straddling the road, devastating
the top half of the car.
Brown's crash is well known. But more mundane bugs are finding their way
into increasingly software-dependent, semi-autonomous cars. Software
problems accounted for nearly 15 per cent of US car recalls in 2015, up
from less than five per cent in 2011, according to the most recent report
from financial advisors Stout Risius Ross.
Last year, to name a few examples, Toyota recalled around 320,000 cars
after it found that improper programming could cause airbags and seatbelt
pretensioners to activate unbidden. Ford had to recall 23,000 cars because
software problems meant their electronic windows closed with excessive
force.
Despite the latest wave of excitement about artificial intelligence, the
fear among some of those in the industry is that bugs could prove a
serious hurdle to mass adoption, not least because of the weird,
unexpected nature of the accidents they can cause.
Philip Koopman, an associate professor at Carnegie Mellon University and
an expert on autonomous vehicle safety, told The Reg: "I look at the
errors, and almost always say: 'Wow, that should not have happened.' And
the most likely explanation is that they did not follow a safety
standard."
The continuous stream of defects in the car industry signals underlying
problems: "They just don't want to spend the time and effort to get it
right," he argues.
Car manufacturers contacted by The Reg were unwilling to talk.
Significantly, many developing autonomous vehicles are hiring developers
from Silicon Valley whose backgrounds are in general-purpose software:
software that, of course, crashes with reasonable frequency. People are
not hiring from among the ranks of the airline safety industry.
"Knowing how to code is not knowing how to be safe," Koopman says.
Allegations of poor code go back years. Koopman was an expert witness for
plaintiffs in a 2013 court case in Oklahoma that looked into whether
computer problems had caused a Toyota Camry to accelerate uncontrollably
and crash, killing a passenger in 2007.
He and another investigator found Toyota's electronic throttle control
system was "just awful". An 18-month investigation found numerous problems
in the software [PDF], including the potential for stack overflow and no
protection against bit flips, where ambient radiation in the outside
environment can switch a bit. The report concluded Toyota's code was
"spaghetti".
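To give a flavour of what such protection looks like, here is a minimal
sketch in C; it is illustrative only and not drawn from Toyota's code. One
common defence is to store a safety-critical value alongside its bitwise
complement and check the pair before every read, so a single flipped bit
is detected rather than silently acted on.

    /* Illustrative sketch of redundant storage as a bit-flip defence;
     * not Toyota's code. A critical value is mirrored by its bitwise
     * complement, and any single-bit corruption breaks the pairing. */
    #include <stdint.h>
    #include <stdbool.h>

    typedef struct {
        uint16_t value;        /* e.g. a commanded throttle position */
        uint16_t complement;   /* always maintained as ~value */
    } guarded_u16;

    void guarded_write(guarded_u16 *g, uint16_t v)
    {
        g->value = v;
        g->complement = (uint16_t)~v;
    }

    /* Returns false if the pair no longer matches, meaning a bit has
     * flipped since the last write; the caller should fall back to a
     * safe default instead of trusting the corrupted value. */
    bool guarded_read(const guarded_u16 *g, uint16_t *out)
    {
        if ((uint16_t)(g->value ^ g->complement) != 0xFFFFu) {
            return false;
        }
        *out = g->value;
        return true;
    }

Redundant storage of this kind is broadly the sort of memory-corruption
countermeasure automotive safety standards expect for critical state.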
The jury decided the electronics had been at fault and awarded $3m in
compensation. Toyota stands by the safety of its throttle system, a
company spokesman said, pointing out that an earlier official
investigation, partly carried out by NASA, did not find any faults.
Yoav Hollander, founder of Foretellix, a company looking to develop new
ways to find bugs in engineered systems, has for a number of years been
attending conferences about verifying the safety of autonomous cars (and
other autonomous systems). He was not impressed by progress initially,
although he thinks the situation is now improving.
One of the issues, Hollander says, is that companies are overly focused on
preventing what he calls "expected bugs", where engineers anticipate a
problem. This might include making sure that the car's cameras can
correctly identify a pedestrian wearing a black coat at night.
But then there are "unexpected bugs": problems that no one has thought of
or situations that have been overlooked. A car designed in the US but
driven in the UK could set off on the right-hand side of the road simply
because its default location is the US, all because a developer forgot to
include an instruction to check location after a memory reset.
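To make that failure mode concrete, here is a hypothetical sketch in C;
the names, the region_from_gps() helper and the reset handler are invented
for illustration and not taken from any real vehicle codebase.

    /* Hypothetical illustration of the class of bug Hollander
     * describes; not code from any real car. After a memory reset the
     * driving region falls back to a compiled-in default (the US)
     * unless the software re-derives it from the car's actual position. */
    #include <stdbool.h>

    typedef enum { REGION_US, REGION_UK } region_t;

    #define DEFAULT_REGION REGION_US   /* baked-in assumption */

    static region_t current_region = DEFAULT_REGION;

    /* Invented stub: a real system would ask the GPS/map stack which
     * jurisdiction the car is actually in. */
    region_t region_from_gps(void);

    void on_memory_reset(void)
    {
        current_region = DEFAULT_REGION;
        /* The bug: the check below was never written, so a car reset
         * in the UK keeps planning to drive on the right. */
        /* current_region = region_from_gps(); */
    }

    bool drive_on_right(void)
    {
        return current_region == REGION_US;
    }

The fix is a single line, re-deriving the region from the car's actual
position after the reset, but that line only gets written if somebody
thinks to ask the question.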
These kinds of autonomous-vehicle-only bugs, mistakes that no human
would ever make, will be big news, Hollander says. People will say:
"Hey, I'm at the mercy of the vehicle."
The Joshua Brown crash, driving at full speed into a clearly visible
trailer, is arguably one such example, as it would never happen to a
human being, Hollander says.
After the Florida accident, Tesla reportedly wasn't immediately sure why
its Autopilot system hadn't braked. The company probed the possibility
that the system deliberately ignored the trailer to avoid braking when
approaching objects like overhead bridges, an idea supported by an
investigation [PDF] by the National Highway Traffic Safety Administration.
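Purely as an illustration of that kind of heuristic, and emphatically not
Tesla's actual logic, a naive filter of this class might look like the
sketch below: stationary radar returns that the camera does not confirm
are assumed to be bridges or overhead signs and are ignored.

    /* Speculative sketch of the filtering heuristic described above;
     * not Tesla's code. Radar alone struggles to tell an overhead sign
     * or bridge from a trailer spanning the lane, so a naive rule that
     * drops stationary, unconfirmed returns discards both. */
    #include <stdbool.h>

    typedef struct {
        bool stationary;        /* no motion relative to the road */
        bool camera_confirmed;  /* vision also reports an obstacle */
    } radar_target;

    /* Returns true if this radar return should trigger braking. */
    bool should_brake(const radar_target *t)
    {
        if (t->stationary && !t->camera_confirmed) {
            return false;   /* assumed to be a bridge or overhead sign */
        }
        return true;
    }

The same rule that suppresses false braking under a bridge would also
suppress braking for a white trailer the camera fails to pick out against
a bright sky.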
A Reg request for clarification from Tesla went unanswered.
https://www.theregister.co.uk/2017/10/09/bugs_in_autonomous_vehicles/