
Self-driving illusion puts Aussies at risk

By Danielle Collis

Leading safety experts are warning that Tesla's Full Self-Driving (FSD) system risks making drivers complacent.

Tesla has confirmed that the system, which allows cars to steer, brake, change lanes, merge, and navigate streets with minimal driver input, is coming to Australia.

While drivers are legally required to keep their hands on the wheel, the system’s high level of automation makes it feel closer to a fully driverless technology.

Carnegie Mellon University Professor Phil Koopman, a leading expert on automated vehicle safety, told news.com.au that while Tesla’s system is far more advanced than typical Level 2 driver-assist features, it is still classified in the same category as basic lane-keep assist.

“So-called Level 2+ systems are especially dangerous because they can give the illusion of a complete driving capability that is not really there,” he said.

“The car drives itself, until it doesn’t. At that point, the driver might struggle to respond to a crisis due to automation complacency.”

Koopman argues that regulators should create a new category between Level 2 and Level 3, subjecting highly capable systems like Tesla's FSD to the same oversight as more advanced automation.

“Requiring drivers to remain in control and responsible for monitoring highly automated vehicles amounts to denying the humanity of the drivers,” he said.

Professor Koopman warns that this technology puts drivers in what he calls a “moral crumple zone,” where they are blamed for crashes even though prolonged monitoring of near-perfect automation is beyond human capability.

“Expecting a normal human being to pay continuous, hawk-like attention for hours while a car drives itself almost perfectly is beyond credibility,” he explained.

“And it’s dangerous because things might seem fine for lots and lots of miles until the crash comes out of the blue, and the driver is blamed for not preventing it. Simply telling people to pay attention isn’t going to cut it.”

Decades of research, Koopman said, show that people are poor at supervising automation for extended periods.

“We have known since the 1990s that when steering is automated, drivers tend to stop paying attention. That is not a lack of moral character — it is fundamental human nature,” he said.

“Nobody can maintain the degree of vigilance needed to ensure a car almost driving itself is safe hour after hour on a long, boring drive.”

Tesla maintains its FSD technology is safer than human drivers, citing its fleet data. However, experts and the government remain cautious.

Australasian College of Road Safety CEO Dr. Ingrid Johnson agrees that the technology is outpacing Australia’s regulations.

“Today’s ‘supervised self-driving’ is Level 2, and Level 2 oversight remains fragmented,” she said.

“The new Automated Vehicle Safety Law (AVSL) chiefly targets higher-automation (L3+), not the Level 2 systems most Australians encounter now. There’s a gap on uniform requirements for driver-monitoring performance, misuse prevention, and data transparency for L2 across brands.”

Johnson warned that while features like autonomous emergency braking can be lifesaving, “partial automation should be treated as driver-assist, not a safety guarantee.”

She called for the urgent adoption of a mandatory crash-reporting hub for advanced driver-assist and automation systems.

“This would give the country a clear, evidence-based picture to guide policy, enforcement, and consumer safety,” she said, citing the U.S. National Highway Traffic Safety Administration (NHTSA) as an example.

“The NHTSA requires manufacturers to report crashes involving Level 2 systems, which produces a transparent national dataset that has informed recalls and ongoing investigations.”