The pop-up displays are neat, but what Lucid is pitching is more like the MB AR-HUD. Here's a good article (with videos) that describes what that is like:
https://spectrum.ieee.org/augmented-reality-car-hud
Being able to register eye position against the windshield means that actual things in your line of sight can be highlighted -- hazards, other vehicles, turn-by-turn directions. Someone in my household often misses turns on dense city streets and in rotaries ("roundabouts" to some of you), even with voice navigation and the mini-map on the dash. Having the actual turn highlighted right in front of you would make a huge difference.
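The core trick of registering a highlight against the real world boils down to geometry: given the tracked eye position and the world position of the thing to highlight, find where the sight line crosses the windshield and draw the graphic there. Here's a toy sketch of that math -- the coordinates, the flat-plane windshield model, and all the names are made up for illustration, not any vendor's actual geometry:

```python
# Sketch: place an AR-HUD highlight by intersecting the driver's sight line
# (eye -> real-world hazard) with the windshield, modeled as a flat plane.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def line_plane_intersection(eye, target, plane_point, plane_normal):
    """Point where the eye->target line crosses the plane, or None if parallel."""
    direction = sub(target, eye)
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # sight line runs parallel to the windshield plane
    t = dot(plane_normal, sub(plane_point, eye)) / denom
    return tuple(e + t * d for e, d in zip(eye, direction))

# Hypothetical car-frame coordinates in meters: x forward, y left, z up.
eye = (0.0, 0.0, 1.2)             # eye position from the driver-monitoring camera
hazard = (30.0, 2.0, 0.5)         # e.g. a stop line 30 m ahead
windshield_pt = (1.0, 0.0, 1.0)   # a point on the (simplified, flat) windshield
windshield_n = (1.0, 0.0, 0.0)    # windshield normal, facing the driver

print(line_plane_intersection(eye, hazard, windshield_pt, windshield_n))
```

A real HUD has a curved windshield and an optical combiner, so the production version is far hairier, but this is why the eye tracking matters: move the eye point and the intersection (and thus the drawn highlight) moves with it.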
I have really high hopes for this technology. As you can see in some of the concept videos on the page above, you could literally never miss a red light or a stop sign, because you'd see a virtual red barrier exactly at the stop line. Never be in the wrong lane or miss an exit. It's really cool stuff. The big challenge is integrating all the software. It doesn't make sense to have multiple camera systems and multiple computers each doing scene analysis for a different purpose (nav, ADAS, HUD), but that's the classic supply-chain model. It's much harder, however, to make the software work right so that everything runs centrally and is displayed through different output channels.
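That "one perception pipeline, many consumers" architecture is basically a publish/subscribe pattern. A toy sketch of the idea, with all the names and the event shape invented for illustration:

```python
# Sketch: a single scene-analysis pipeline publishes detections, and the
# nav, ADAS, and HUD output channels subscribe, instead of each running
# its own camera stack -- the centralized model described above.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    kind: str          # e.g. "red_light", "pedestrian", "exit_lane"
    distance_m: float  # distance ahead of the vehicle

class PerceptionBus:
    def __init__(self) -> None:
        self._subscribers: List[Callable[[Detection], None]] = []

    def subscribe(self, handler: Callable[[Detection], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, det: Detection) -> None:
        # One analysis pass fans out to every output channel.
        for handler in self._subscribers:
            handler(det)

log = []
bus = PerceptionBus()
bus.subscribe(lambda d: log.append(f"HUD: highlight {d.kind} at {d.distance_m} m"))
bus.subscribe(lambda d: log.append(f"ADAS: evaluate braking for {d.kind}"))
bus.subscribe(lambda d: log.append(f"NAV: check whether {d.kind} affects the route"))

bus.publish(Detection("red_light", 42.0))
print("\n".join(log))
```

The hard part isn't the fan-out, of course -- it's that every consumer now depends on one pipeline's latency, failure modes, and release schedule, which is exactly why the fragmented supply-chain model has been so sticky.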