Yes, I watched this video a couple of days ago when it was posted.
I think @borski is reacting to a prior post in which I asked, "How does Lucid use the LiDAR sensors installed in the Air?"
I am NOT a Tesla fan. Even though I am an EV enthusiast, and I think Tesla made EVs real to the public and built out the EV industry infrastructure (e.g., charging stations), I have never owned a Tesla because I think Tesla over-promises and under-delivers. I don't trust FSD.
My Lucid Air GT is my first EV. I had a Honda Clarity (PHEV) before. It is now my son's car. Great car!
Let me clarify a few points:
> (typical) humans see using binocular vision. Binocular vision can detect depth and movement, and, to a large extent, most trained and experienced human drivers navigate pretty well with it. Yes, humans can be fooled. But achieving even human-level visual discrimination is not an easy feat for a machine.
> Tesla, which originally investigated the use of LiDAR to augment binocular vision for FSD, eventually abandoned LiDAR (and other augmentations), declaring that it could implement FSD with just binocular vision from two offset cameras. I don't know their real reasons; I suspect it might be related to cost and marginal benefit.
> some car manufacturers, Lucid as an example, decided to stay the course with LiDAR.
With respect to my original post, I was asking HOW Lucid is using the installed LiDAR and what the benefits/differentiation/limitations are compared to binocular vision using cameras.
I was in no way endorsing one over the other.
To be clear, I am skeptical about binocular-vision-only FSD. But I am also puzzled by the real benefits of a simple front-facing LiDAR array and Lucid's implementation, hence my question.
I will explain below.
> LiDAR is implied to be a quantum leap over binocular-vision FSD. This is based on LiDAR's ability to QUANTITATIVELY range the obstructions ahead of it, like RADAR. And a fully decked-out vehicle like Waymo's seems to navigate traffic gracefully. I live in the Phoenix area and have seen LiDAR-equipped autonomous cars roaming the streets for the past decade. I have used Waymo before. They are truly impressive!
> Lucid's linear, front-facing LiDAR array is a small subset of what Waymo has done.
> as implied in the video @borski referred to, LiDAR can (potentially) discern ambiguous scenes better than a binocular, camera-only implementation. I believe that is true.
But that wasn't my question.
> my question was: how is LUCID using its front-only LiDAR sensor, and how is it integrated into the ADAS system to improve driving safety?
I have yet to get an informed reply to my question.
I am not expert enough to give a dissertation on LiDAR. But I will point out a few relevant factors:
> the reason your eyes, with binocular vision, can discern depth is not your individual eyeballs. Your BRAIN is the "computer" that makes that determination, by comparing the (slightly) spatially offset images returned from your eyeballs (or from binocular camera sensors). Thus, Tesla is not wrong: if you have a powerful-enough computer to process the binocular images from the two cameras, IN THEORY, FSD can be as good as a human driver.
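That "compare two offset images" step boils down to the classic stereo relation: depth = focal length x baseline / disparity. A minimal sketch, with made-up camera parameters (not Tesla's actual specs):

```python
# Stereo depth from disparity: the "brain as computer" calculation.
# Camera parameters below are illustrative assumptions only.

def stereo_depth(disparity_px: float,
                 focal_length_px: float = 1000.0,  # assumed focal length, pixels
                 baseline_m: float = 0.3) -> float:  # assumed camera offset, meters
    """Depth of a point from its pixel disparity between two offset cameras."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point is at 'infinity' or unmatched")
    return (focal_length_px * baseline_m) / disparity_px

# A feature that shifts 15 px between the two images is ~20 m away:
print(stereo_depth(15.0))  # -> 20.0
```

Note the catch: disparity shrinks with distance, so small matching errors blow up into large depth errors for far objects, which is exactly why the "powerful-enough computer" caveat matters.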
> LiDAR, on the other hand, generates quantitative range-return images. It could map Space Mountain, and it would detect a flat landscape mural as a barrier rather than be fooled by the false "road" painted on it.
But both of these scenarios require a powerful "computer", or your brain, to process the information in real-time. Does Lucid have the capability to do that?
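The mural case can be reduced to a toy discriminator: open road gives range returns that grow toward the horizon, while a painted wall returns nearly the same range everywhere. A minimal sketch with an assumed, made-up flatness threshold:

```python
# Toy sketch of why LiDAR sees through a painted "road" mural:
# every laser return from a flat wall comes back at roughly the same range.
# The 0.5 m spread threshold is an illustrative assumption.

def looks_like_barrier(ranges_m: list[float], spread_m: float = 0.5) -> bool:
    """True if a sweep of range returns is nearly flat, i.e. a wall-like surface."""
    return max(ranges_m) - min(ranges_m) <= spread_m

# Open road ahead: ranges grow toward the horizon across the sweep.
print(looks_like_barrier([18.0, 25.0, 40.0, 90.0, 150.0]))  # -> False
# Mural on a wall ~30 m out: every return is ~30 m.
print(looks_like_barrier([29.9, 30.0, 30.1, 30.0, 29.8]))  # -> True
```

A real system would fit planes to a full point cloud rather than eyeball a spread, but the principle is the same: geometry, not appearance.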
Footnotes:
> I use LiDAR in my non-vehicular activities. I am planning to build an ADU attached to my existing home. In this project, we are going to use LiDAR scans to map the construction area, generate a topo map, and ensure the mating of the new ADU to the existing house.
> like mapping Space Mountain, my ADU project requires LiDAR scans, but not real-time analysis or real-time decisions based on the LiDAR data. It is "pedestrian".
> In case you are not aware, the last couple of generations of iPhones and iPads (top SKUs) have built-in LiDAR. But you have to buy an app to use it, and it is not "real-time" either (in terms of analytics).
> using LiDAR data to trigger actions (e.g., collision avoidance) would require real-time data processing. Part of the question I raised is whether Lucid (and other EV vendors) have the on-board computing power to use LiDAR in real time, beyond just ranging.
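To put a number on "real-time": a back-of-envelope time-to-collision and stopping-distance check shows how little slack there is. All figures below (reaction latency, deceleration) are illustrative assumptions, not Lucid's actual parameters:

```python
# Back-of-envelope for why latency matters in collision avoidance.
# reaction_s and max_decel_mps2 are made-up, illustrative values.

def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at constant closing speed."""
    return float("inf") if closing_speed_mps <= 0 else range_m / closing_speed_mps

def must_brake(range_m: float, closing_speed_mps: float,
               reaction_s: float = 0.3,        # assumed sensing + compute latency
               max_decel_mps2: float = 8.0) -> bool:  # assumed hard braking
    """True if braking must start now to stop within the measured range."""
    stopping_m = (closing_speed_mps * reaction_s
                  + closing_speed_mps ** 2 / (2 * max_decel_mps2))
    return stopping_m >= range_m

# A barrier 30 m out, closing at 25 m/s (~56 mph): 1.2 s to impact...
print(time_to_collision_s(30.0, 25.0))  # -> 1.2
# ...and stopping needs ~46.6 m (7.5 m of latency travel + 39.1 m braking),
# so the decision to brake must already have been made.
print(must_brake(30.0, 25.0))  # -> True
```

In other words, at highway speed every tenth of a second of processing latency eats meters of stopping distance, which is why raw ranging alone is not enough without fast on-board compute.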