Tesla is yet again in hot water with regulators over its decision to rely exclusively on cameras for its "Full Self-Driving" feature.
futurism.com
FSD is great! …unless it’s raining, or foggy, or too sunny, or snowing, or you have a leaf on your camera, or there’s a bug stuck on the lens…
With no redundancy, you’re just screwed. That’s not rocket science.
Can it be done with just cameras? Maybe. I don’t think AI is the right approach, because I’d rather the system be predictable, but maybe. I’m the first to admit nearly anything is possible with enough engineering time and investment.
The more important question is whether it *should* be done with just cameras, and to that the answer is a hard no. You need redundancy. You need sensors that can tell you things cameras can’t see. This should not be hard to grasp: different senses provide different information, just like in humans (to oversimplify), and it is the synthesis of that information that allows us to make sense of the world.
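To make the synthesis point concrete, here is a minimal sketch in Python of inverse-variance weighting, the textbook way to combine two independent noisy measurements of the same quantity. It is not any particular carmaker’s stack, and every sensor, number, and variance in it is made up for illustration.

```python
# Minimal sketch of what "synthesis" buys you: inverse-variance weighting,
# the standard way to fuse two independent noisy estimates of one quantity.
# All sensors, numbers, and variances here are invented for illustration.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Combine two independent estimates; return the fused estimate and variance.

    The fused variance is always lower than either input's, i.e., adding an
    independent sensor never makes the estimate worse.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Camera says the obstacle is 40 m away, but it's foggy, so its error bars
# are huge; radar doesn't care about fog and says 35 m with tight error bars.
dist, var = fuse(40.0, 25.0, 35.0, 1.0)
print(f"fused distance: {dist:.1f} m, variance: {var:.2f}")
# fused distance: 35.2 m, variance: 0.96 -- the radar dominates, as it should.
```

The weights shift automatically toward whichever sensor is trustworthy right now, which is exactly what a camera-only system cannot do.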
Deaf people do just fine in life, and many of them hate being called disabled. But at the end of the day, they glean less information than hearing people do, and have to make sense of the world with less, which is harder.
Blind people develop workarounds like using a cane, or echolocation… why? Because they have redundant sensors. If sight fails them, they can use their sense of hearing to make sense of the world around them; less perfectly, but well enough to manage.
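The same fallback pattern, sketched as toy Python: a hypothetical camera that declines to answer when the image is too degraded, and a coarser but independent radar standing in as the “cane.” The names, thresholds, and readings are all invented for illustration.

```python
# Toy sketch of graceful degradation: when the primary sensor's reading is
# unusable, fall back to a coarser independent one instead of guessing.
# Everything here (names, thresholds, readings) is hypothetical.

from typing import Optional

def camera_distance(frame_quality: float) -> Optional[float]:
    # Pretend the camera refuses to answer when the image is too degraded
    # (fog, glare, a bug stuck on the lens...).
    return 38.7 if frame_quality > 0.5 else None

def radar_distance() -> float:
    # Coarser, but indifferent to fog: the "cane" of this analogy.
    return 36.0

def obstacle_distance(frame_quality: float) -> float:
    reading = camera_distance(frame_quality)
    if reading is not None:
        return reading
    return radar_distance()  # degraded but workable, instead of flying blind

print(obstacle_distance(frame_quality=0.9))  # 38.7 -- camera is fine
print(obstacle_distance(frame_quality=0.1))  # 36.0 -- fog, fall back to radar
```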
I do not understand why anyone would decide that hamstringing your vehicle with *less* useful information would somehow make it better. The counterargument, that synthesizing conflicting sensor inputs is a liability, is trash: other manufacturers have shown it isn’t an issue.
So, Occam’s razor says it’s purely cost-cutting, and it will screw them in the end.
We shall see.