Tesla FSD status: still utter crap

Status
Not open for further replies.

joec
Moderator
Verified Owner
Supporting Member
Joined: May 1, 2022
Messages: 5,230
Location: Boulder, CO
Cars: Air Touring
Referral Code: MX1KDTYY
So I finally got another update for FSD on my Model 3 today. It's been about 2 months since the last one. My friends who follow such things on reddit swear this is a BIG one. (They say that about every update, by the way.) Took it for a little spin over to the local hardware store to check out what's changed. Lucky I made it home in one piece.

First, the good:

- The car no longer brakes in the middle of turns at intersections, like a frightened first-time driver.
- Someone at Tesla finally figured out that when you change lanes, you should accelerate, not decelerate.
- The little car cartoons are more detailed.

Then the bad:

- Had to disengage getting onto an overpass ramp, because it refused to continue, although there were no cars in the merging lane.
- Cyclist in a marked bike lane. First, the car just kept braking, as if the bike were darting out in front of me, even though he wasn't. Then the car turned INTO the bike lane, in an apparent move to hit the cyclist. Had to disengage again.
- Right turn at a basic intersection. Luckily, no other cars were around, because the car completed the turn perfectly, then immediately gunned it, swerved into the middle lane for no reason, then hit the brakes hard. Actually laughed out loud at that one.

Keep in mind, this trip is about 1.2 miles. Maybe a total of 5 turns involved.

Over the course of the past several months, there have been four major updates to FSD. Not a single one of them has made the actual city driving any better for me. It would be comical if it weren't so dangerous.

Seriously, it should be illegal to engage FSD on the streets of Boulder.

So, for all of you who are worried Lucid will lag behind on the self-driving front: you have nothing to worry about. Tesla won't be approaching good for quite a while. And it's likely going to get people killed in the meantime. So be thankful you can't even play with it.

Highway driving is sometimes okay (if they would just fix the darn phantom braking). But we are not going to see Teslas without steering wheels for a very, very long time. As a shareholder, I'm glad Lucid is not focusing on this very much. If they can get highway lane assist done in the next year, they'll be good to go for at least another 5 years.
 
Maybe the sensor suite with LiDAR will ultimately prove more capable than Tesla's camera-only system. Time will tell. I paid for FSD 7 years ago and still didn't get it.
 
I always see this Twitter account @WholeMarsBlog continually hype up how perfect FSD is every update and then I actually experience the new version myself....
 
I'm not sure. From the videos I've seen, it seems to "see" what is around it OK but just makes really dumb a** decisions. I'm willing to bet that in @joec's bike lane issue, the car perfectly displayed the cyclist in the visualization as it made the decision to try to hit it. At least at this point, it seems the software is the issue rather than the sensors.

Did they really “release” so-called FSD 7 years ago?
 
So what you are saying is that Tesla's FSD is Jeremy Clarkson and has an innate hatred towards cyclists?
 
No, but they were selling it and promising release "any month now." That's one of the reasons I'm driving a Lucid and not a Plaid. The other reasons are the yoke, fit and finish, the overall cheap-looking interior, and customer service.
 
It's a fair point, and Elon has always been late delivering on FSD. I got the most recent beta update last night, and it's two steps back on my drive to work this morning, with some odd abrupt stopping behavior. Overall (I've had FSD Beta for about 6 months now) it's getting better and better. What that translates to in terms of safety is that the car, supervised by me, is almost definitely better than just me driving in all scenarios. From a comfort perspective, some of the behavior is odd and abrupt, and certain decisions are not the ones I would make. Achieving safety greater than that of a human driver will likely be solved this year. How long it takes for the car to drive as smoothly as a human, with some factor of greater safety and accident avoidance, is anybody's guess.
 

I like your last two sentences very much. Statistically and proportionally, vehicles on "autopilot" have had a lower accident rate than cars driven by humans. I know, hard to believe.

I'm NEVER going to get FSD Beta even though I've long since paid for it. I'm Asian. (joke alluding to Tesla's silly "safety score").
 
Is it really FSD that is making cars safer, though? What novel thing has FSD introduced or pioneered to make cars safer to drive? I could easily be wrong about this, but it seems a lot of the safety features in FSD were already available on the MB S-Class, no?
 
There's no question in my mind my Tesla is a LOT safer when I am driving manually than it will be anytime soon with FSD engaged. If I have to take over every few turns because the car wants to hit cyclists, how could that possibly be an improvement over me just driving myself?

It's like saying a trained professional driver taking out a first-time student driver is safer than just the pro driver on their own. There's so much anxiety that comes along with having to babysit this computerized lunatic that I'm never in a good frame of mind while letting it drive.

Now, maybe if I were extremely tired, or had a few too many drinks, a car with FSD engaged might save me from doing something stupid. But I've already done something stupid by getting behind the wheel at that point. And again, in that situation, I'm likely to react poorly when the car has one of its little "moments."

I simply don't buy this "safer than humans" argument. Not yet. Someday, sure. When all of the cars on the road are driving themselves, and all of them are communicating with each other in perfect harmony, so they can all anticipate each other's next move.

But Tesla is likely to keep their software proprietary forever, anyway. So that's unlikely to happen.

I get that people think having to monitor the car makes you more attentive. There's another easy way to solve that. Just be more attentive.
 
You are correct. The little cartoon of the bike was shown perfectly riding in the bike lane. Not veering at all. But the car was braking anyway.

The graphic designers at Tesla are doing a great job.

There are other times, though, when I notice items appearing and disappearing in the view. And that's not good. Clearly the cameras do have their limitations, just as our eyes do.

I've gone back and forth on whether cameras alone can do the job. I think the constraint forces Tesla's engineers to really work within that limitation, and perhaps the AI is improving more rapidly under those conditions. But that doesn't mean more sensors would be a bad thing. Would we be better drivers if we had built-in LiDAR in our skulls? Maybe. My guess is our brains (which are far more efficient and capable than any computer) would know how to parse that extra info and make better decisions.
 
I suppose on a miles-per-accident basis. The baseline is the overall vehicle fleet, though, so I would assume other cars with advanced driver-assistance features would also have lower accident rates than the overall fleet average.
 
Maybe Neuralink could develop a chip implant to provide us with a LiDAR connection to our brains :)
 
I tried FSD for a few minutes on surface streets near my house. It was abrupt enough to be frightening. It made a sharp turn and suddenly accelerated toward a raised median divider full of large trees. I hit the brakes and opted out of FSD as soon as I got home. I'm selling the Model 3 to Shift.com today.
 
“More than 750 Tesla owners have complained to U.S. safety regulators that cars operating on the automaker’s partially automated driving systems have suddenly stopped on roadways for no apparent reason.”

Emphasis mine; that seems pretty serious!
 
It is, when you experience it. Especially the first time.

Known issue for years now. Tesla has never addressed it. In words or actions.
 
I mean- are they actually stopping in the middle of the road (which would mostly be on highways / freeways presumably as that’s where autopilot is mostly used) of their own accord?
 