"Operational limitations" in the Autopilot self-driving system built into electric car maker Tesla's vehicles played a "major role" in a fatal 2016 crash, the US National Transportation Safety Board has concluded in its report into the first fatal crash involving an autonomous driving system.
The Board met today to announce what it believes was the probable cause of the fatal crash on 7 May 2016 on a highway near Williston, Florida.
In the accident, the driver, Joshua Brown, had switched his 2015 Tesla Model S to Autopilot, and there is evidence that, despite the immaturity of the self-driving system, Brown was not watching the road when the crash occurred.
Brown was killed when his self-driving Tesla failed to detect the white side of a tractor-trailer that had pulled out across the highway.
In a statement at the time, the company admitted: "Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied."
In September 2016, just four months after the accident, Tesla founder Elon Musk effectively admitted responsibility, saying that a subsequent update to the Autopilot software could have prevented the crash.
Later that month, the company issued another update after a Tesla car hack was demonstrated by researchers in China.
However, a recall of the cars was averted following the publication of a preliminary report in January, which suggested that the driver should have been paying attention.
Today the Board also suggested that the nature of Tesla's self-driving systems means that drivers can become disengaged and fail to watch the road when Autopilot is switched on for long periods.
"Today's automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes," warned NTSB chairman Robert Sumwalt.
Tesla's Autopilot, he added, operated as designed in Brown's vehicle, but the system itself did not do enough to ensure that drivers paid sufficient attention to take control when it failed.
The system could not reliably detect "cross traffic" and "did little to constrain the use of autopilot to roadways for which it was designed", according to the Board.
At the public hearing today, the Board suggested that the driver had "at least 10 seconds" to notice the truck and apply the brakes, but eyewitnesses have suggested that Brown was watching a DVD at the time of the accident.