How bad is Tesla's Full Self-Driving feature, actually? Third-party testing bodes ill

According to testing firm AMCI, Tesla’s FSD software couldn’t average more than 13 miles without needing driver intervention.

We’re just weeks out from Tesla’s big RoboTaxi presentation, where the automaker's self-driving shuttle will be revealed, and independent research firm AMCI Testing has some bad news that could hang over the event like a cloud. AMCI just completed what it claims is “the most extensive real world test” of Tesla’s Full Self-Driving (FSD) software, ostensibly the technology that would underpin the RoboTaxi's driverless tech, and the results are not confidence-inspiring.

AMCI says its test covered over 1,000 miles of use and, in short, showed that the performance of Tesla’s FSD software is “suspect.” This isn’t the first time FSD has drawn criticism; the software has been a source of controversy for the automaker for years, with Tesla dealing with everything from being called out by the California DMV for false advertising to being investigated by NHTSA.

There have been so many incidents involving Tesla’s Autopilot and FSD that we had to build a megathread to keep track of them all. It's worth noting that Tesla claims FSD is still in "beta," and therefore incomplete, yet the automaker also sells the feature as a five-figure option on its current lineup of EVs, allowing owners to opt into being, essentially, real-world test dummies for the system. Those owners must acknowledge that the system requires driver oversight and is not, as its name implies, a fully self-driving system today. Still, Tesla is essentially offloading the kind of testing other automakers conduct scientifically, with engineers and oversight, onto customers in the real world. And AMCI's findings on how reliable FSD is, or rather is not, are just the latest road bump for Tesla and FSD.

AMCI says it conducted its tests in a Tesla Model 3 running FSD versions 12.5.1 and 12.5.3 across four driving environments: city streets, rural two-lane highways, mountain roads, and freeways. AMCI was impressed with FSD’s ability to rely solely on cameras. (Tesla is the only automaker whose driver-assistance systems of FSD's ambition operate using only cameras and, essentially, short-distance parking sensors, rather than a more complex, and more expensive, combination of cameras, ultrasonic sensors, radar, and lidar, which can paint a much clearer picture with more redundancy than Tesla's camera array.) However, AMCI found that, on average, human intervention was required at least once every 13 miles to maintain safe operation; over a 1,000-plus-mile test, that works out to more than 75 interventions.

“With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public. When this technology is offered, the public is largely unaware of the caveats (such as ‘monitor’ or ‘supervise’) and the tech is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global. “Although it positively impresses in some circumstances, you simply cannot reliably rely on the accuracy or reasoning behind its responses.”

You can see the full results of the test for yourself, but here is the gist from AMCI.

If you think 13-mile intervals between instances where a driver must grab the wheel or tap the brakes sounds pretty good, consider that it's not just the number of interventions required that matters, but the way those situations unfold. AMCI’s final point is the most eyebrow-raising (emphasis theirs): “When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident—or possibly a fatality.”

To back up its report, AMCI released three videos showing some of the instances in which FSD performed unsafely. Tesla has yet to publicly respond to the report, though we wouldn't hold our breath for that. Again, the automaker can fall back on the idea that the software is still in development. Common sense, however, suggests caution about putting a feature with the FSD name and purported future self-driving capabilities into the hands of regular people now, when the decisions the system makes, or flubs, can have dire consequences; AMCI's testing proves that FSD's shortcomings rear their heads quite often.
