The Air's LiDAR resolution and field of view are still very much state of the art: 150 m range, 120 degrees horizontal field of view, 25 degrees vertical, and 0.05 degrees of angular resolution. I think the bigger issue for becoming obsolete is the NVIDIA compute power. Lucid is using a newer sensor fusion strategy, referred to as early fusion, compared to the legacy automakers. In early fusion, all of the sensor information is combined before target identification. The advantage is that all sensor information, rather than filtered targets, is available to the perception algorithm. The disadvantage is that it requires a very high-power central processor to do the fusion. Since it is a newer strategy, the perception algorithms are not as well developed, hence the delay in rolling out new Dream Drive features. It is also the reason that NVIDIA processing power is key to making Dream Drive improvements.
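To make the early fusion idea concrete, here's a toy Python sketch. The grids, feature channels, and threshold are all my own illustrative assumptions, not anything from Lucid's actual pipeline; the point is just that the raw evidence from every sensor is pooled before any target decision is made:

```python
import numpy as np

def early_fusion(lidar, camera, radar):
    """Toy early fusion: concatenate raw sensor features per spatial cell
    BEFORE any object detection, so the perception model sees everything.
    A real system would first project each sensor into a shared frame."""
    return np.concatenate([lidar, camera, radar], axis=-1)

def detect_objects(fused_grid, threshold=1.5):
    """Stand-in for the (heavy) central perception model: flag grid cells
    whose combined sensor evidence exceeds a threshold."""
    return np.argwhere(fused_grid.sum(axis=-1) > threshold)

# Hypothetical 4x4 scene, one feature channel per sensor
lidar = np.zeros((4, 4, 1));  lidar[1, 2] = 0.9   # strong LiDAR return
camera = np.zeros((4, 4, 1)); camera[1, 2] = 0.8  # camera sees it too
radar = np.zeros((4, 4, 1));  radar[1, 2] = 0.4   # weak radar echo

targets = detect_objects(early_fusion(lidar, camera, radar))
print(targets)
```

Note that no single sensor's evidence here is strong enough on its own; the object is only confirmed because all of the raw information reaches the central processor, which is exactly why that processor has to be so powerful.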
Legacy automakers tend to use late fusion, where target recognition is done separately for each sensor (camera, radar, LiDAR) and the resulting targets are overlaid for the final perception algorithm. This strategy distributes the processing across many smaller processors and reduces the workload on the central processor. It also filters out and simplifies a lot of sensor data before it is passed to the central processor for combining.
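For contrast, here's the same toy scene run through a late-fusion sketch (again, the thresholds and voting scheme are illustrative assumptions on my part, not any automaker's actual implementation). Each sensor makes its own detection first, and the center only sees the filtered target lists:

```python
import numpy as np
from collections import Counter

def detect_per_sensor(grid, threshold=0.7):
    """Per-sensor detector running on its own smaller processor:
    each sensor hands over a filtered target list, not raw data."""
    return [tuple(c) for c in np.argwhere(grid[..., 0] > threshold)]

def late_fusion(target_lists, min_votes=2):
    """The central processor only overlays the per-sensor target lists,
    keeping cells confirmed by at least `min_votes` sensors."""
    votes = Counter(t for targets in target_lists for t in targets)
    return {cell for cell, n in votes.items() if n >= min_votes}

# Same hypothetical scene as before
lidar = np.zeros((4, 4, 1));  lidar[1, 2] = 0.9
camera = np.zeros((4, 4, 1)); camera[1, 2] = 0.8
radar = np.zeros((4, 4, 1));  radar[1, 2] = 0.4   # below the per-sensor threshold

per_sensor = [detect_per_sensor(g) for g in (lidar, camera, radar)]
fused = late_fusion(per_sensor)
print(fused)
```

The central workload is tiny (just counting votes), but notice that the radar's weak evidence was thrown away at the sensor before fusion ever happened, which is the trade-off the early-fusion approach avoids.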