Tesla Full Self-Driving Clips Still Show Flaws and Problems



  • Tesla launched the latest version of its Full Self-Driving beta this month after a long delay.
  • Beta testers immediately began posting clips of the driver-assistance system in action.
  • The software is impressive and advanced, but it still puts drivers in dangerous situations.

Earlier this month, Tesla rolled out a long-awaited update to its Full Self-Driving software for beta testers. It’s impressive, but it still doesn’t make the cars truly self-driving.

The electric-vehicle maker first opened access to the pre-production technology in October, and it’s now in the hands of a few thousand loyal Tesla owners. It takes Tesla’s existing driver-assistance system, which is primarily suited to predictable highway driving, and adds the ability to automate driving tasks on more complicated non-highway streets.

Videos of the updated software in action show that it can handle some challenging driving situations impressively, but they also reveal many dangerous flaws.

In one clip, a Tesla confidently manages a narrow, unmarked road with an oncoming car. The computer does pretty much exactly what a human would do: slow down and stop to let the oncoming car pass first, then move forward once it’s clear the other driver is giving way.

Another shows the system navigating stop-and-go traffic:

Another shows that it can detect stop signs and make turns on dark, though empty, city streets as well. Some videos also show cars stopping for pedestrians and other vehicles.


But the system still struggles with very basic driving tasks, putting drivers and bystanders in risky situations. In a clip documenting a drive through downtown San Francisco, the car drunkenly swerves into a striped median, forcing the driver to take control.

In the same video, the car hesitates during a left turn and nearly runs into a parked car.

In a clip set in Chicago, the car slowly traverses intersections, stops at random, and only notices a road closure at the last second. A cluster of orange barricades is something any average human driver would recognize before attempting the turn.

All of these dangerous hiccups show how far Tesla is from mimicking human driving. But a particularly alarming clip from Seattle takes the cake.

In the nighttime video, the beta fails to recognize the huge concrete columns supporting the city’s monorail, and the car twice nearly drives into them while attempting to change lanes.

If a highly automated car should be able to do one thing, it’s recognize large stationary objects and avoid them. But judging by the visualization on the center screen, the car had no idea the pillars were even there.

People in the car wonder whether the failure stems from Tesla’s switch to a camera-only system that doesn’t use radar, and that is certainly a possibility. Automakers, including Tesla, have relied on radar for years for features like emergency braking and adaptive cruise control. But Tesla decided in May to stop using the sensors and remove them from future cars.

Tesla has taken a fast-and-loose approach to its automated-driving technology that other automakers have avoided. Safety advocates have challenged Tesla’s strategy of asking amateur drivers to test unproven technology on public roads. Pedestrians, cyclists, and other drivers never signed up to participate in this experiment, they say.

But the company is under increasing pressure to deliver a final version of Full Self-Driving to customers, who have shelled out as much as $10,000 over the years for the add-on on the promise that it would eventually allow their Teslas to drive themselves. It increasingly seems that won’t happen anytime soon.
