Tesla recalls nearly 54,000 self-driving vehicles programmed to break the law

The autonomous, electric cars and SUVs were programmed to illegally roll through certain stop signs at up to nine kilometres per hour.

The recall covers some Model S, X, 3 and Y vehicles from the 2016 to 2022 model years fitted with Full Self-Driving (FSD) software.

The rolling stop feature applied at all-way, or four-way, stop signs, where every vehicle approaching the intersection must stop.

In Australia, all-way stop signs are usually found only on remote, rural roads with poor visibility, but they are common at American intersections, especially near schools where there is no crossing guard.

Tesla’s Full Self-Driving software could only be used when the driver enabled it in an area with a maximum speed limit of 48 kilometres per hour (30 mph). The vehicle had to be approaching the intersection at below nine kilometres per hour and, most importantly, Tesla claimed that the software could detect “relevant” moving cars, pedestrians or bicyclists nearby.

However, Tesla’s machine learning system can mistakenly identify objects, according to Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University.

“What happens when the FSD decides a child crossing the street is not ‘relevant’ and fails to stop?” he asked.

“This is an unsafe behaviour and should never have been put in vehicles.”

After Tesla’s CEO, Elon Musk, met twice with the US National Highway Traffic Safety Administration (NHTSA), Tesla removed the rolling stop software.

The NHTSA has also challenged the non-disclosure agreements Tesla requires of its beta testers during its investigation of crashes.

A Tesla beta driver in California, an untrained member of the public testing the software on public roads, complained to the NHTSA that the Full Self-Driving software caused a crash in November.

The driver said that a Model Y SUV moved into the wrong lane and gave an alert halfway through the turn. The driver tried to steer away from other traffic, but the car took control and “forced itself into the incorrect lane”, colliding with another vehicle, the driver reported. Nobody was injured in the crash.

Last year, the NHTSA opened a formal investigation into Tesla’s ‘Autopilot’ driver-assist software after approximately a dozen crashes into parked emergency vehicles.

Despite a Tesla statement that the cars cannot drive themselves and that drivers must be ready to take action at all times, ‘Passenger Play’ video games were installed on the centre touch screens, supposedly for passengers, and could be used while the vehicles were in motion. In December, Tesla complied with the NHTSA’s directive to update the software so the games could not be played while the vehicle was moving.

There have been many Tesla crashes worldwide, and reported dangers include bike lanes not being recognised, electric door handles failing to extend, cars breaking apart, and damaged batteries bursting into flames and reigniting.

The self-driving car industry is in its infancy and there are other safety issues ahead, according to Paul Ausick in ‘10 Dangers of Self-Driving Cars’.

These include hacking, computer glitches, complacent drivers and autonomous cars failing to recognise the road because of GPS error or heavy rain obscuring the white road lines.

