Tesla FSD

MGRMLN

My Uber was a Tesla Model Y, and the driver, with my approval, was using FSD for my ride to the airport. I must say, comparing the latest Highway Assist in my AGT on this same route to FSD, it isn't a contest. I'm shocked how well Tesla FSD negotiated everything hands-free. I hope we get something close to this.
 
My Uber was a Tesla Model Y, and the driver, with my approval, was using FSD for my ride to the airport. I must say, comparing the latest Highway Assist in my AGT on this same route to FSD, it isn't a contest. I'm shocked how well Tesla FSD negotiated everything hands-free. I hope we get something close to this.
FSD is very risky. I tried it a few times in our Model 3, and it was very stressful. There were a few instances where I had to intervene because I wasn't sure the car would navigate safely. Until it is safer than me, I'm never using it. I actually trust Waymo more. With no redundancy, relying on just cameras that can get obstructed, FSD will never get beyond Level 2.

Why put your life in tech that is not foolproof? The way I think of it is this: getting hit by another driver is a risk I accept, but getting injured or killed by a computer's mistake is totally unacceptable. The bar is much higher. Also, Tesla refuses to take liability in accidents when on FSD. That shows you how much trust they have in their own software, even though they brag that robotaxi is coming next year. They've been saying this since 2016, and they got beaten by Waymo and Mercedes.

I might take a look in a decade, but I will never use anything like FSD until the car manufacturer takes full liability.
 
they brag that robotaxi is coming next year,
This whole thing was a farce. It's like he just thinks he can submit an application for autonomous driving to the state and two weeks later it's approved.

Also, NHTSA has a 2,500-vehicle limit for cars without steering wheels. Does Elon really believe that a government body is going to just magically change this rule in less than a year?

It seems investors have finally woken up to the lies, but damn, those fanboys just breathe it all in and believe every word he says. Even Sandy Munro drank the Kool-Aid. That whole event was a bunch of nothing; even the capabilities of what the robots were doing are seriously questionable. Smoke and mirrors!
 
I still dislike Tesla and especially Elon, and I voted with my dollars, but the experience this morning showed me how far things have come.

At the end of the day, I prefer a car that is enjoyable to drive over one that drives itself, which is why I went with my AGT over a Plaid.

But it was much more impressive than I expected. That's all. I'd love my Lucid to be that composed in its lane and when navigating lane changes. I laughed that one setting on the Tesla is called "assertive".
 
My Uber was a Tesla Model Y, and the driver, with my approval, was using FSD for my ride to the airport. I must say, comparing the latest Highway Assist in my AGT on this same route to FSD, it isn't a contest. I'm shocked how well Tesla FSD negotiated everything hands-free. I hope we get something close to this.
Tesla ADAS is just like a lottery: it's inconsistent. It works great this time, but it might not next time. That inconsistency extends to the Summon feature: it might work great the first time it comes to you from the parking lot, but in the exact same scenario the second time, it might destroy your tires or crash into other cars.

When Apple engineer Walter Huang died after crashing into the median in 2018, Tesla said:

"Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of. There are over 200 successful Autopilot trips per day on this exact stretch of road."

Yes, the numbers are impressive, but they illustrate the lottery: the system works sometimes and not other times. Tesla has since settled the case.

I prefer predictability: with Highway Assist, self-steering doesn't work in unapproved areas, every single time. At this early stage, self-steering might not keep up with sharp curves, so I keep my hands ready to steer.

I owned FSD from 2017 to the end of 2023. It was very stressful to use, so I just turned it off despite having paid thousands of dollars for it.

Highway Assist will work better, but it just takes time to do it safely and consistently.
 
FSD is very risky. I tried it a few times in our Model 3, and it was very stressful. There were a few instances where I had to intervene because I wasn't sure the car would navigate safely. Until it is safer than me, I'm never using it. I actually trust Waymo more. With no redundancy, relying on just cameras that can get obstructed, FSD will never get beyond Level 2.

Why put your life in tech that is not foolproof? The way I think of it is this: getting hit by another driver is a risk I accept, but getting injured or killed by a computer's mistake is totally unacceptable. The bar is much higher. Also, Tesla refuses to take liability in accidents when on FSD. That shows you how much trust they have in their own software, even though they brag that robotaxi is coming next year. They've been saying this since 2016, and they got beaten by Waymo and Mercedes.

I might take a look in a decade, but I will never use anything like FSD until the car manufacturer takes full liability.
Hmm - works great for me, but I only use it on the freeways. I have no desire to use self-driving anywhere else. I like driving too much. 🙂 On long trips it is definitely nice to have.
 

Attachments: two images
The price dropped straight after the event, so clearly they didn't see anything new or fantastic to drive the price up. The recent price increase was a result of the earnings call. No idea why you felt the need to include the Lucid stock price...
 
Also, Tesla refuses to take liability in accidents when on FSD. That shows you how much trust they have in their own software, even though they brag that robotaxi is coming next year. They've been saying this since 2016, and they got beaten by Waymo and Mercedes.
For me that is a clear signal that no Tesla robotaxi will come to the road any time soon: who is going to be liable? It can't be anyone in the car, as a robotaxi won't have any controls inside. It can't be Tesla, as Elon will do everything to avoid any liability. Who then? Some third-party "taxi" companies? Why would they agree to become liable for such an unreliable system that they have no control over?
 
Earnings are more important than show-and-tell or future “promises” - and that applies to Lucid as well. Wouldn’t you agree that earnings are what is driving the stock prices of both?
 
Hmm - works great for me, but I only use it on the freeways. I have no desire to use self-driving anywhere else...
Walter Huang was killed by Autopilot on the busy 101 freeway in the middle of Silicon Valley.
 
I was a Model S owner with FSD. Highway is okay; city is terrible. Is it ahead of Lucid? Yes.

There is something you can't put in writing or videos. FSD is inconsistent, as TAM mentioned. Tesla is using everyone as sheep for road testing. Is it 99% safe? Yes, but with 350k+ people running FSD, a 1% error rate is not acceptable. FSD almost killed me once, and that was enough for me to never use it again. I've sold the Model S.
 
This whole thing was a farce. It's like he just thinks he can submit an application for autonomous driving to the state and two weeks later it's approved.

Also, NHTSA has a 2,500-vehicle limit for cars without steering wheels. Does Elon really believe that a government body is going to just magically change this rule in less than a year?

It seems investors have finally woken up to the lies, but damn, those fanboys just breathe it all in and believe every word he says. Even Sandy Munro drank the Kool-Aid. That whole event was a bunch of nothing; even the capabilities of what the robots were doing are seriously questionable. Smoke and mirrors!
I lost all respect for Sandy Munro... he became famous because of Tesla and now he's part of the cult.
 
FSD is very risky. I tried it a few times in our Model 3, and it was very stressful. There were a few instances where I had to intervene because I wasn't sure the car would navigate safely. Until it is safer than me, I'm never using it. I actually trust Waymo more. With no redundancy, relying on just cameras that can get obstructed, FSD will never get beyond Level 2.

Why put your life in tech that is not foolproof? The way I think of it is this: getting hit by another driver is a risk I accept, but getting injured or killed by a computer's mistake is totally unacceptable. The bar is much higher. Also, Tesla refuses to take liability in accidents when on FSD. That shows you how much trust they have in their own software, even though they brag that robotaxi is coming next year. They've been saying this since 2016, and they got beaten by Waymo and Mercedes.

I might take a look in a decade, but I will never use anything like FSD until the car manufacturer takes full liability.
Automakers or the fleet owner are liable only if it is Level 4 driverless, as with Waymo. I can never trust just cameras and AI in the cloud.
 
My Uber was a Tesla Model Y, and the driver, with my approval, was using FSD for my ride to the airport. I must say, comparing the latest Highway Assist in my AGT on this same route to FSD, it isn't a contest. I'm shocked how well Tesla FSD negotiated everything hands-free. I hope we get something close to this.
It'll work well on highways and straight roads. It should be geofenced. When it doesn't get things right, it's detrimental.
 
Here in greater Phoenix we were Waymo's pioneering locality, and I now see lots of Waymo vehicles on the road. One aspect of them is that they are programmed not to exceed the speed limit (as they would have to be). Not only does pretty much everyone out here drive above the speed limit, but what happens if a Waymo needs to pass someone? Can it be programmed to accelerate past the speed limit? Would that expose the company to liability? It can be a pain in the butt to be driving behind a Waymo.

Why would anyone purchase a car with the amount of HP and torque that is in even the most basic Lucid if the car is never going to exceed the speed limit? The same would apply to Teslas, Mach-Es, etc. I know... it is an option in the vehicle that can be turned on or off. But when would someone turn it on? On the highway? Well, highway driving pretty much takes care of that in most new EVs. On city streets, where even Waymos sometimes get confused?

I think it is a solution in search of a problem.

And, by the way, we haven't even talked about all the sensors that Waymo vehicles use to maneuver safely on city streets.
 
I still dislike Tesla and especially Elon, and I voted with my dollars, but the experience this morning showed me how far things have come.

At the end of the day, I prefer a car that is enjoyable to drive over one that drives itself, which is why I went with my AGT over a Plaid.

But it was much more impressive than I expected. That's all. I'd love my Lucid to be that composed in its lane and when navigating lane changes. I laughed that one setting on the Tesla is called "assertive".
It is inconsistent, and that's the issue. I'm glad it worked well for you that day. That very same trip may not go the same way tomorrow or in an hour. That is what is scary about FSD, and why it is unreliable.

HA isn't perfect, but it is *consistent*.

Waymo is *consistent*.
 
what happens if a Waymo needs to pass someone?
Following the letter of the law, you don't pass someone unless you can do it without exceeding the speed limit, so that's naturally what autonomous cars will do for liability. People have this strange assumption that passing is an exception where you are allowed to go as fast as possible, but that's just not true.
Why would anyone purchase a car with the amount of HP and torque that is in even the most basic Lucid if the car is never going to exceed the speed limit?
HP and torque have to do with acceleration and maneuverability, not speed. It is safer to drive a car with more HP and torque slowly than it is to drive a car with less HP and torque slowly, because it can more effectively maneuver to avoid accidents. Just like how it's safer to use sharp knives in the kitchen. You still don't want to hand a powerful car or a sharp knife to a child or a drunk, but in modestly competent hands they are both safer by doing what is expected of them quickly and without resistance.
 
Following the letter of the law, you don't pass someone unless you can do it without exceeding the speed limit, so that's naturally what autonomous cars will do for liability. People have this strange assumption that passing is an exception where you are allowed to go as fast as possible, but that's just not true.

HP and torque have to do with acceleration and maneuverability, not speed. It is safer to drive a car with more HP and torque slowly than it is to drive a car with less HP and torque slowly, because it can more effectively maneuver to avoid accidents. Just like how it's safer to use sharp knives in the kitchen. You still don't want to hand a powerful car or a sharp knife to a child or a drunk, but in modestly competent hands they are both safer by doing what is expected of them quickly and without resistance.
This is a fantastic analogy. I’m stealing this for the future.
 
FSD is an ADAS feature like Highway Assist, but it does more. Yes, it makes mistakes, but so can Highway Assist. You can't tell me you don't occasionally have to disengage HA because someone jumps in front of you or traffic stops abruptly.

You guys are acting like you’re supposed to trust FSD with your lives. As long as you stay vigilant and keep your hands on the wheel, it’s easy to monitor. If it starts going the wrong way, just grab the wheel a little tighter and it’ll disengage.

I had it on my last Model 3 back in the very early v10 days, when it was really unreliable, but I never so much as scratched a wheel. Monitor and disengage as needed. It's easy 🤷‍♂️
 