Former Tesla employee deems Full Self-Driving program unsafe
Safety issues surrounding the role of artificial intelligence (AI) in the electric car maker’s ‘Full Self-Driving’ program have been raised by a whistleblower.
A former Tesla employee has blasted the electric car maker, suggesting he had evidence it had not followed safety requirements during the real-world testing of its so-called ‘Full Self-Driving’ (FSD) semi-autonomous system.
Lukasz Krupski, a former service technician for the car maker in Norway, said during a BBC interview that Tesla’s FSD system was not ready for public roads, and that he was concerned about the role of artificial intelligence (AI) in its development.
“I don’t think the hardware is ready and the software is ready,” he told BBC technology editor and reporter, Zoe Kleinman.
“It affects all of us because we are essentially experiments in public roads. So even if you don’t have a Tesla, your children still walk in the footpath.”
Last month, Krupski was revealed as the whistleblower behind a May 2023 leak of confidential Tesla company information to German newspaper Handelsblatt.
The former employee says he leaked the information after becoming frustrated with the company’s relaxed approach to safety, having raised concerns directly with Tesla CEO, Elon Musk, following a workplace fire.
The leak included personal information on Tesla employees globally and, more critically, thousands of accident reports – including collisions involving the FSD system – among internal company communications.
Krupski told the BBC the data included evidence the carmaker did not follow the required protocols for the safe operation of vehicles with a set level of autonomous driving capability.
The system – which Tesla CEO Elon Musk has said would be “safer than a human” – evolved from ‘Autopilot’, introduced in 2014, which includes driver assistance features that steer, brake and accelerate a vehicle using cameras and sensors.
In 2016, Joshua Brown died while using Autopilot in his Tesla Model S in what was the first Autopilot/FSD-related death on public roads, with the system coming under scrutiny from the NHTSA and the US National Transportation Safety Board (NTSB).
The most recent iteration, Version 12, was introduced to Tesla employees – who received it ahead of customers – in late November 2023.
Autonomous driving capability is categorised into levels, with Tesla Autopilot classed as a Level 2 autonomous system.
Tesla’s unverified data says that by the end of 2022, US Tesla drivers using Autopilot travelled an average of eight million kilometres per collision, compared with 2.4 million kilometres when not using the system.
The large amount of information leaked by Krupski in May 2023 sparked an exposé on the Tesla FSD system in Handelsblatt and tech publication Wired before spreading to mainstream media outlets.
Krupski also provided information to the NHTSA (National Highway Traffic Safety Administration) which is currently investigating the Autopilot system.
In February 2023, the NHTSA recalled 363,000 Teslas fitted with the system after finding that it presented an unreasonably high risk of a fatal collision.
The post Former Tesla employee deems Full Self-Driving program unsafe appeared first on Drive.