Gravity Orders Discussion

In the Phoenix metro area we have a lot of Waymo self-driving cars. When one of them is next to me and I look at all the spinning radars, lidars, sonars, and other sensors all over the exterior of those electric Jaguars, I wonder how Tesla can supposedly accomplish essentially the same task with so many fewer sensors. Is it a case of Tesla's FSD sensors being better, smaller, and less visible? Or is it that Tesla doesn't need to care as much, since Tesla's FSD only operates with a human driver behind the wheel, whereas there are literally times when Waymos don't have a human inside, much less driving?
It's not that Tesla is using better sensors; it's simply using far fewer of them. Tesla's big bet is that since humans drive using only their eyes, self-driving cars should be able to do the same. In fact, Tesla has eliminated sensors like the forward-looking radar (which came standard on my 2018 Model 3) in order to cut system costs. Meanwhile, the Waymo cars use not only cameras but also radar and lidar (those spinning things project laser beams that measure distance with great precision, rather than estimating the way humans do).
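To put rough numbers on that precision gap, here's a back-of-envelope Python sketch of how lidar turns time-of-flight into distance (the timing figures are illustrative, not from any vendor's spec sheet):

```python
# Back-of-envelope: lidar ranges by time-of-flight; cameras only infer depth.
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance from a time-of-flight echo: the pulse travels out and back."""
    return C * round_trip_s / 2.0

# A 1-nanosecond timing error corresponds to ~15 cm of range error, so even
# modest timing hardware pins distance down to centimeters.
print(lidar_range_m(400e-9))                           # ~60 m target
print(lidar_range_m(401e-9) - lidar_range_m(400e-9))   # ~0.15 m per ns
```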

Long term, Tesla should be right: if humans can drive using only their eyesight, then robots should be able to as well, or in fact better, since humans have only two eyes while robots can have many more cameras. Short term, however, dropping sensors makes some parts of the problem much harder. In most of the frontal accidents Teslas have had, the computer vision misses an important, often hard-to-see object: a flatbed trailer across the road in a color similar to the background; a person crouched in a similar-colored bush with a hat on; a poorly lit vehicle stopped in the fast lane in fog; a shadow that looks like the edge of the road (the thing that tried to kill me).

Currently Tesla is using its customers as guinea pigs: every time a person takes the wheel to correct an FSD Tesla, that event can be logged. Tesla has data centers full of thousands of CPUs identical to the ones in the cars, so as the FSD software is updated, they can feed the new version either simulated or actual camera footage from real cars to see whether near misses and accidents that previously occurred would be caught next time. Thus, much as ChatGPT trains on data from the internet, Tesla trains on data from its customers.
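Nobody outside Tesla knows exactly what that replay harness looks like, but the core idea (re-run logged disengagements against a new build and count how many it now handles) is simple to sketch. Every name below is invented for illustration; this is not Tesla's actual pipeline:

```python
# Hypothetical sketch of replaying logged disengagements against a new
# software build. All names are made up for illustration.
from dataclasses import dataclass

@dataclass
class Disengagement:
    clip_id: str
    frames: list          # logged camera frames around the event
    human_action: str     # what the driver did, e.g. "hard_brake"

def replay(build, events: list[Disengagement]) -> float:
    """Fraction of past disengagements the new build now handles itself."""
    handled = 0
    for ev in events:
        planned = build.run(ev.frames)   # feed recorded footage to the stack
        if planned == ev.human_action:   # did it match the human correction?
            handled += 1
    return handled / len(events)
```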

Unfortunately this method carries a substantial amount of risk for the testers (i.e., you and me). The forward-looking radar Tesla used to install would reliably see big, solid objects in the road every time. The computer could even make some judgments nearly impossible for a human (is that paper bag in the road empty, or full of concrete?). So while Tesla's AI system, trained by millions of customers, is a good way to make self-driving tech cheaper (radar and lidar are not inexpensive), it has inherent flaws that will make getting to high nines of reliability very challenging. It produces very fluid, natural car motion, but it can be hard to remember that the system relies 100% on inherently bad eyesight. That is why Waymo and others have gone another way and are well further along in certifications, despite having a fraction of the cars on the road to train their systems with.
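For what it's worth, the cross-check that the old radar enabled boils down to something like this toy fusion gate (the thresholds and names are made up, not from any shipping system):

```python
# Toy radar/vision cross-check: brake if either sensor is confident a solid
# object is in the path. Thresholds here are invented for illustration only.
def should_brake(vision_conf: float, radar_return: float | None) -> bool:
    if vision_conf > 0.8:          # camera is sure something is there
        return True
    if radar_return is not None and radar_return > 1.0:
        # Radar returns scale with how solid the target is, which is how
        # it can tell a concrete-filled bag from an empty paper one.
        return True
    return False
```

Drop the radar input and the whole decision rests on that single camera confidence number, which is exactly the failure mode in the accidents above.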
 
The Y has always been the ugliest Tesla. The new refresh doesn't help. Making the headlights smaller just emphasizes that the car is vertically stretched in an unpleasant way. The proportions are still all out of whack. And it's their worst-handling car as well.

But it is their best seller, because the price and the form factor are what people want.

There’s so much more competition now and coming soon. I would not bet on a modest redesign moving the needle much on sales. It’ll do fine. Don’t get me wrong. I just don’t see it taking the world by storm.
It is actually the best-selling car in the world.
 
Perhaps after 67 pages, it's time to consider we've said all there is to say about Gravity orders? At least until we hear something new from Lucid.

All in favor of closing this one down?
 
I'm sorry to keep taking this thread off track. I listened to Rivian's earnings call, during which they announced many upcoming improvements for their driver-assistance system, which is being trained end to end. I hope Lucid will provide on-par capabilities as well and share some updates in the upcoming earnings call.

I recently purchased a comma.ai device running openpilot for my 2020 Lexus RX, making it hands-free for around $1k. I love it. It uses a computer-vision approach like Tesla's, but it also integrates my vehicle's existing safety features, such as radar, blind-spot monitoring, etc., all by simply plugging into my LKAS camera connector and sending more accurate signals to the car. Highway drives are totally hands-free. It changes lanes when I put on the blinker, as long as no vehicle is in my blind spot. In experimental mode, it stops at stop signs and red lights and proceeds on green. It also does active curve speed control.
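The lane-change gating is a nice example of how it leans on the car's own sensors. Logically it boils down to something like this (a simplified sketch of the idea, not openpilot's actual code):

```python
# Simplified sketch of blinker-initiated lane changes gated on the stock
# blind-spot monitor. Illustrative logic only, not openpilot source.
def allow_lane_change(blinker_on: bool, blind_spot_clear: bool,
                      speed_mph: float, min_speed_mph: float = 40.0) -> bool:
    """Only steer into the next lane when the driver asks for it, the
    factory blind-spot radar sees no one, and we're at highway speed."""
    return blinker_on and blind_spot_clear and speed_mph >= min_speed_mph
```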

It shows what is possible without lidar, which I don't have on my Lexus. I would argue that it is safer than Tesla's approach, since it uses radar, not just vision. People have successfully ported it to Rivian as well, and I hope a few of us can spend some cycles porting it over to Lucid, too.

Lucids without DD Pro are capable of such features, I believe, and I hope someday Lucid will offer a subscription DA service for folks who didn't opt for DD Pro. It could be a decent revenue stream for them once they achieve more scale. You can't blame a man for hoping!!
 