DreamDrive ADAS

Dortreo

Active Member
Verified Owner
Joined
Oct 12, 2021
Messages
911
Location
Baahstan, MA
Cars
AGT
What did you all think? I found the DreamDrive video underwhelming. DreamDrive offers adaptive cruise control, lane centering, and parking assistance: nothing much different from, or more advanced than, what many late-model cars offer. Are they deliberately under-promising? (As opposed to Tesla FSD, which may as well be vaporware.)

I'm concerned about whether Lucid has the software expertise to craft a unique, differentiating ADAS package in-house. That said, the sensor and network suite is impressive, provided the software can take full advantage of it. What will it take to get there?
 
Yes, I was underwhelmed by the video. They mentioned some items slated for future software updates, like lane changing, but said nothing about city streets and kept talking about highways. Yes, I will need this on the highway, but I do 80% of my driving in the city.

They have had 5 years since they demonstrated a better system with Mobileye. I just hope this does not turn into the same wait-and-see as with Tesla. They are not promising much for future software, which doesn't sound very ambitious; maybe they are being realistic. I would rather they were optimistic and extremely competent.
 
I wasn't expecting anything like Level 3 driving but was hoping to see some kind of application that takes advantage of all the sensors on the car. It seems like the car has more than enough sensors but doesn't have the software to interpret and act upon the incoming data, except in the most basic way. That said, data gathering from the sensors should help refine and advance the car's software functionality over time.

Though the company is refreshingly conservative in its claims, this PR quote does not inspire confidence in the eventual capabilities of the system: "the planned rollout of the Highway Pilot system for conditional automated driving on select roadways in the coming years."
 
Unfortunately, it's reflected in the stock... As a reservation holder, I'm trying so hard not to nitpick. Hopefully, there will be something revolutionary introduced soon.
 
I have read that ADAS systems that use lidar do not use the lidar for real-time piloting of the car. That is done primarily with standard cameras. The lidar data is used to build digital maps for future reference and, in some systems, as a back-up data stream to fill gaps in the input from cameras that algorithms cannot fill in.

I get the drift that Lucid's software will be dependent on the evolution of the digital map database, and that will take some time. Remember that Tesla has been accumulating and digesting such data for some years now. Lucid is reportedly buying digital maps from the Dutch firm Here from which BMW and other European manufacturers source the maps for their ADAS systems, but I don't know how developed Here's maps are for the U.S.

I was very disappointed: the DreamDrive reveal was nothing but a piece of marketing fluff. While I didn't expect Lucid to divulge proprietary information, I was hoping they might at least give a broad-brush view of how their ADAS approach is structured.
 
So that's interesting. LIDAR is really meant to be a data-gathering instrument and otherwise has little immediate use to the driver? Is this why Tesla is decidedly anti-LIDAR? Because it doesn't add anything to its years of data gathering? And why no other car manufacturers have opted into the technology? I would hope that LIDAR could be used to resolve discrepancies among what the cameras are sensing, so that the car doesn't, say, run into stopped emergency vehicles in low light like some Teslas did until the recent update. Let's see.

It's interesting that the Air is so close to being delivered but no one has extensively driven a production model yet.
 
I suspect different groups working on ADAS use lidar differently.

Lucid is no longer working with Mobileye for ADAS, but my understanding is that Mobileye's system pilots cars based primarily on camera and radar sensor inputs but continuously runs a second tier of inputs, including lidar, in the background as a reference check to resolve conflicts or gaps in the primary system inputs.

The highly technical aspects of some of what I've read elude me, but Tesla is a real outlier in thinking ADAS can best be done only with optical cameras. Some writers say that there may ultimately be more than one approach to which type of sensors to use for full self-driving, but few buy into Musk's view that lidar is useless.

My own suspicion is that, since lidar was exceedingly expensive when Musk first started pushing autonomous driving, he needed to find a way to keep the cost down as he was trying to broaden his market by moving down the price curve. So he started down the path he's now on, not because he thought it was the best technical path, but because he had to find a path that was not as costly as using lidar. Of course, now that lidar prices have come down considerably and are still dropping, that cost imperative might have weakened. But Musk being Musk, he's not likely to back up and start down the lidar path.

Instead, he just keeps promising true self-driving is just months away and then keeps moving the arrival date out and out and out, year after year . . . and, more recently, just labeling his system as "full self driving" when it is demonstrably something well short of it.

But who knows? Musk proved all the experts wrong over a decade ago when he delivered an EV using laptop cells by showing that many cells could, in fact, be effectively kept in balance over a range of operating conditions. Maybe he'll be proved right that AI rather than sensor technology is the key to true self driving.
 
It's interesting that the Air is so close to being delivered but no one has extensively driven a production model yet.

Well, it wasn't quite a production car, but Jonny Lieberman of "Motor Trend" did have a Dream Edition R for two full days to drive on the Angeles Crest Highway and from Los Angeles to the Bay Area. And "Road & Track" subsequently was given a car to drive for 8 hours.

However, no one has yet been given a final production car with all the software programs and features operational as far as I know.
 
@hmp10 : I've been reading more about ADAS systems and autonomous cars. It sounds like the rest of the world has unified under a standard set of principles (UNECE) but the US is, as usual, lagging, due to state differences. Also, when is Full Self Driving NOT full self-driving? :)

So, this suggests that even if Lucid has an advanced set of sensors, it can only go so far with them until NHTSA and state guidelines catch up. Still, it would have been nice to see something different from Lucid given the in-car network and sensors.

Barriers Fall Unleashing Autonomous Cars in 2021, Says IDTechEx

Despite the US being a leader in autonomous mobility as a service (MaaS) testing, with the likes of Waymo and Cruise making huge strides, the US has become bottlenecked in the deployment of level 3 applications.

The main problem is achieving consistency across all states, with individual states and even cities adopting their own stance on autonomous testing. It is unlikely that the deployment of level 3 applications could be controlled by individual states, as customers would find it unacceptable if a primary selling point of the vehicle was unavailable in their own state.

It would also cause difficulty if the car's operation needed to change as it crossed state lines. Harmony is required to bring level 3 applications to the US. The National Highway Traffic Safety Administration (NHTSA), however, does not indicate that this is on the near horizon. The NHTSA indicates that highway autopilot (a level 3 technology) will be available on the roads from "2025+", as well as fully automated safety features. It is therefore unlikely that the US will be seeing high levels of private autonomy on the roads soon. This is also considered within IDTechEx's "Autonomous Cars, Robotaxis & Sensors 2022-2042" forecast, with the US not having market entry until 2025, up to 3 years behind other regions.

What about Tesla "Full Self Driving" (FSD)? Despite the name, Tesla FSD is no more than a level 2 system, although a particularly sophisticated and advanced one. This has been established in correspondence between Tesla and the California DMV, as well as cautions with the latest FSD update that the driver should always stay in control of the vehicle with their hands on the wheel. Tesla FSD is an example of a level 3 vehicle in waiting. It may be technically capable of these levels of operation but is being sandbagged by current legislation.
 
Using the excuse of "NHTSA and state guidelines need to catch up" to justify not having a complete suite of functions is a cop-out for auto manufacturers. I was told by a Lucid sales advisor that Lucid will have Level 2+/3 functionality once there are proper guidelines and regulations in place, as if that's why their ADAS system is so lame now. You can have a "fully" functioning system similar to Tesla's Beta FSD without any regulations; you just can't call it Level 3 or imply that it will drive for you without intervention. Basically, Lucid currently does not provide Level 2+/3 features because they haven't implemented them successfully. They emphasize highway driving, as if that's all people want. Even my wife's BMW with xDrive has lane keeping and even automatic driver-initiated lane changes.
 
I have read that ADAS systems that use lidar do not use the lidar for real-time piloting of the car. That is done primarily with standard cameras. The lidar data is used to build digital maps for future reference and, in some systems, as a back-up data stream to fill gaps in the input from cameras that algorithms cannot fill in.
I am a bit skeptical of this. While LIDAR data may be used to build digital maps, I would be very surprised if it was not also used for real-time driver assistance. LIDAR has several advantages over radar, optical, and ultrasonic: it sees farther, detects smaller objects, and provides more precise distance information, and sequential scans accurately determine relative movement. However, the image maps from LIDAR are not well suited to identifying objects. Hence, radar and optical are used to identify objects such as cars, trucks, motorcycles, people, etc. Optical is not well suited to determining distance because the cameras are not far enough apart to get good geometry where the images from the two cameras converge, especially at greater distances. Radar is more precise than optical for distance but has difficulty detecting smaller objects. Ultrasonic is good for very short distances only.

While Elon Musk may be trying to prove that optical only will work, most of the industry is looking at a fusion from all sensors. Each sensor type has its own unique advantages and disadvantages.
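To make the fusion idea concrete, here's a toy sketch (purely my own illustration, nothing from Lucid's or anyone's actual stack): one common approach is to weight each sensor's range estimate inversely to its variance, so the most precise sensor (here, lidar) dominates the fused result. All the numbers below are invented.

```python
# Toy inverse-variance fusion of range estimates from three sensor types.
# Real ADAS fusion is far more complex (Kalman filters, object-level
# association, etc.); this only shows the basic weighting idea.

def fuse(measurements):
    """measurements: list of (range_m, variance) tuples -> fused range (m)."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    return sum(r * w for (r, _), w in zip(measurements, weights)) / total

# Hypothetical readings of the same object from each modality:
lidar = (50.2, 0.01)    # lidar: very precise distance
radar = (50.9, 0.25)    # radar: good, somewhat less precise
camera = (48.0, 4.0)    # mono camera: coarse depth estimate

fused = fuse([lidar, radar, camera])
print(round(fused, 2))  # close to the low-variance lidar reading
```

Because the lidar variance is tiny, the fused estimate lands within a few centimeters of the lidar reading, which is exactly why adding lidar to the suite tightens up the distance picture.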
 
It's true that some of the industry is using a fusion approach, but Mobileye is on a slightly different path. This article from Mobileye discusses the role of cameras vs. radar and LiDAR:


"From the outset, Mobileye’s philosophy has been that if a human can drive a car based on vision alone – so can a computer. Meaning, cameras are critical to allow an automated system to reach human-level perception/actuation: there is an abundant amount of information (explicit and implicit) that only camera sensors with full 360 degree coverage can extract, making it the backbone of any automotive sensing suite . . . . While other sensors such as radar and LiDAR may provide redundancy for object detection – the camera is the only real-time sensor for driving path geometry and other static scene semantics (such as traffic signs, on-road markings, etc.)." [my emphasis]

A more recent article explains that, as they move up the ladder into Level 3 and Level 4 autonomy, they're now also using lidar and radar as primary sensors, but they are still running those data streams separately from camera inputs to create redundancy:


"Mobileye’s differentiated approach of True Redundancy creates two parallel AV sub-systems, with two independent models of the driving environment – one from cameras, one from radar and LiDAR – each operating independently of the other." [my emphasis]

Dr. Eugene Lee, Lucid's head of ADAS, was the primary force behind GM's Super Cruise system. While that system uses LiDAR, it's used for digital mapping but is not part of the real-time data stream:

"Similar to other assisted driving systems, the Cadillac’s gathers real-time data from cameras, GPS, and radars. But Super Cruise adds another element with LiDAR-scanned map data. By precision mapping controlled-access highways (e.g., divided freeways that require on- and off-ramps), Cadillac Super Cruise is beneficial on long-distance trips because factors like intersections are eliminated from the equation." (source: https://www.jdpower.com/cars/shopping-guides/how-does-cadillac-super-cruise-work)

Lucid is apparently not going to give out too much detail about their ADAS system, but we know they worked first with Mobileye and now have the father of Super Cruise leading their effort . . . and neither of those systems fuse LiDAR-generated data with camera-generated data. This isn't a definitive answer to the question of how Lucid uses LiDAR data, but it's all we've got to go on until Lucid reveals more.
 
From Green Car Reports July 2020

Lucid says that it worked with the suppliers Continental and Bosch, as well as Here (for mapping and telematics), although the integration for the systems was done entirely in-house.

“Lucid chose to include lidar in its sensor suite for Level 2+ ADAS not just because it’s the optimal technology for long-range sensing, but also because it adds a high level of redundancy to the sensor suite overall.” According to Lee, the lidar the company has chosen extends to 150 meters, with a relatively wide field of view, “so it captures a tremendous amount of data and allows for an increased level of safety in assisted driving modes.”

DreamDrive's three sensing modalities—cameras, radar, and lidar—are supplemental to each other and provide redundancy in the system. “Lucid incorporated lidar technology into its ADAS and AD sensor suite because we want to minimize any potential misperception by other sensors such as camera and radar,” said Lee.


It sounds like lidar is being used to check that the camera and radar systems are accurately sensing the environment. As for what to do with the data, it sounds like the integration is being done in-house, but Continental could be supplying the development kit. And maybe more?

And I thought Continental only made tires…

Continental ADAS: This reads almost exactly like the DreamDrive press release.

Continental LIDAR: And guess who makes a solid-state flash lidar?
 
Continental ADAS: This reads almost exactly like the DreamDrive press release.
Thanks for the info. You are right. It does sound like the DreamDrive press release. So basically Lucid just did some system and UI integration and called it a day. Sounds like their future level 2+ software still needs to be developed, if it hasn't already been done.
 