Tesla’s Autopilot does not sufficiently protect drivers and their passengers, according to the American highway safety regulator. Put in the spotlight more than two years ago by the National Highway Traffic Safety Administration (NHTSA), which conducted an investigation lasting more than twenty-four months, Tesla’s driver assistance system, called Autopilot, must now be reviewed on more than two million vehicles in the United States. The vehicles affected are certain Model S cars produced between 2012 and 2023 and equipped with the system, as well as Model 3, Model X and Model Y cars. Accused of not sufficiently ensuring drivers’ attention during the use of driving assistance systems in its electric cars, Tesla is therefore forced to recall, or more precisely to remotely update, its famous control system.

Three options are currently offered by the manufacturer. The standard Autopilot system, included at no extra charge, bundles cruise control and lane-keeping assist. Two other options, paid this time, are also offered: the “enhanced” Autopilot, which additionally allows the car to change lanes thanks to a component called Autosteer, but also to exit or enter a parking space without a driver. Finally, Tesla offers a last option called “full self-driving capability” (FSD). For the latter, Elon Musk’s firm promises that the vehicle will recognize and react to traffic lights and stop signs. Faced with an increased “collision risk”, however, Tesla had to deactivate a feature of this option that allowed cars not to come to a complete stop under certain conditions.

NHTSA opened an investigation into Autopilot in August 2021 after identifying more than a dozen accidents in which Tesla vehicles collided with stationary emergency vehicles. At the same time, the American regulator has opened more than three dozen special investigations since 2016 into Tesla accidents in which driver assistance systems were suspected of being in use, involving 23 deaths.

NHTSA now says that there may be an increased risk of crashes in situations where Autopilot is engaged but the driver is not alert: either because the driver does not maintain control of the vehicle and is not ready to intervene, or because he or she fails to recognize whether the system has been disengaged. In short, drivers tend to wrongly entrust driving to the system and no longer keep their hands on the wheel. It is true that Tesla maintains ambiguity about the degree of autonomy of its software-equipped vehicles, which contributes largely to their dangerousness. The company also acknowledged last October that the US Department of Justice had issued subpoenas concerning its Full Self-Driving (FSD) and Autopilot systems.

Tesla executives said they disagreed with NHTSA’s analysis. They will nevertheless roll out an over-the-air software update that will “incorporate additional controls and alerts to those that already exist on affected vehicles in order to encourage the driver to adhere to their driving responsibility” whenever Autosteer, a component of Autopilot, is engaged.