Tesla & Google Disagree About LIDAR — Which Is Right?
Tesla rather famously has chosen not to use LIDAR as one of the sensors on its cars to assist with its autonomous features, at least so far. Google uses LIDAR as one of its dominant sensors and insists it’s necessary. With the recent fatality in a Tesla that was operating under Autopilot, Tesla’s choice is under attack. Assessing the competing claims requires understanding the strengths, weaknesses, and compromises inherent in the different sensor types.
There are four types of sensors that provide external, immediate information to autonomous and semi-autonomous vehicles:
- LIDAR — a surveying technology that measures distance by illuminating a target with laser light. LIDAR is an acronym of Light Detection And Ranging (sometimes Light Imaging, Detection, And Ranging) and was originally created as a portmanteau of “light” and “radar.”
- Radar — an object-detection system that uses radio waves to determine the range, angle, or velocity of objects.
- Ultrasonic — an object-detection system that emits ultrasonic sound waves and detects their return to determine distance.
- Passive Visual — the use of passive cameras and sophisticated object-detection algorithms to interpret what the cameras can see.
Each technology has different strengths and weaknesses.
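As a rough illustration of how these modalities differ at the data level, here is a minimal sketch, in Python, of how a perception stack might represent a detection from each sensor type. The class and field names are hypothetical and not drawn from any particular vendor’s software.

```python
from dataclasses import dataclass
from enum import Enum, auto

class SensorType(Enum):
    LIDAR = auto()       # laser time-of-flight: precise 3D point cloud
    RADAR = auto()       # radio waves: range and relative velocity, low resolution
    ULTRASONIC = auto()  # sound waves: short-range distance only
    CAMERA = auto()      # passive visual: rich imagery that needs interpretation

@dataclass
class Detection:
    """One object report from one sensor (hypothetical, simplified)."""
    sensor: SensorType
    range_m: float        # distance to the object, in meters
    bearing_deg: float    # angle relative to the vehicle's heading
    confidence: float     # 0.0-1.0, sensor-specific estimate
    classification: str   # e.g. "vehicle", "pedestrian", "unknown"
```

The interesting differences lie in how reliably each sensor can fill in those last two fields under different conditions, which is where the strengths and weaknesses show up.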
Representative LIDAR manufacturers include Continental AG, LeddarTech, Quanergy, and Velodyne.
Google’s self-driving car solution uses LIDAR as its primary sensor, but uses the other sensor types as well. Tesla’s current solution does not incorporate LIDAR (although its sister firm SpaceX does use it), and past and current statements indicate that Elon Musk and his team do not believe it is necessary for autonomous cars.
Quanergy is demonstrating a near-production solid-state LIDAR system that is expected to have 150 meters of range, a $250 cost, and adequate resolution. The unit is still much larger and much more expensive than the other sensor types, but its expected price and performance would make it a very competitive sensor if it reaches production. This price/performance point might make its inclusion on Teslas more likely, and a Tesla Model S with a conventional LIDAR mounted on top has already been spotted.
Representative radar manufacturers include Delphi, Kyocera, Valeo, and Visteon.
Representative ultrasonic sensor manufacturers include Bosch, Valeo, Murata, and SensorTec.
Representative passive visual manufacturers include Mobileye, Delphi, Honeywell, and Toshiba.
Most recently, Mobileye has announced that it will no longer supply Tesla with its solution due to disagreements about how it is used, and will instead focus on fully autonomous solutions. Mobileye’s current implementation has acknowledged limitations in resolution and side-collision detection, the second of which is expected to be addressed in an upcoming product release.
Performance Variance in Different Conditions
It’s worth looking at the significant variance in sensor performance under different conditions. The charts below (which I created) provide rough approximations of range and acuity for each sensor type, based on averages across various technical implementations.
Range is in meters. Acuity is an asserted value based on a combination of resolution, contrast detection, and colour detection.
Obviously, passive visual has the longest range and best acuity when conditions allow it to be used, and, equally obviously, the quality of information it can provide degrades rapidly under adverse conditions.
What becomes apparent from this assessment is that radar, while not the best sensor under all conditions, degrades the least at the ranges necessary to detect vehicles and other objects at higher speeds. LIDAR is better until significant atmospheric murkiness arrives in the form of fog, snow, or heavy rain, at which point it degrades as well.
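One way to make that degradation concrete is to treat each condition as a retention factor applied to a sensor’s nominal range and then ask which sensors still cover the distance needed at a given speed. The sketch below does exactly that; every number in it is an illustrative placeholder, not a value from the charts above or from any datasheet.

```python
# Hypothetical degradation model. Nominal ranges and per-condition retention
# factors are illustrative placeholders, not measured values.
NOMINAL_RANGE_M = {
    "passive_visual": 300.0,
    "lidar": 150.0,
    "radar": 150.0,
    "ultrasonic": 5.0,
}

RANGE_RETAINED = {
    # fraction of nominal range retained under each condition (illustrative)
    "clear_day": {"passive_visual": 1.0, "lidar": 1.0, "radar": 1.0, "ultrasonic": 1.0},
    "night":     {"passive_visual": 0.3, "lidar": 1.0, "radar": 1.0, "ultrasonic": 1.0},
    "heavy_fog": {"passive_visual": 0.1, "lidar": 0.3, "radar": 0.9, "ultrasonic": 1.0},
}

def usable_sensors(condition: str, required_range_m: float) -> list[str]:
    """Return the sensors whose degraded range still covers the required distance."""
    factors = RANGE_RETAINED[condition]
    return [
        sensor
        for sensor, nominal in NOMINAL_RANGE_M.items()
        if nominal * factors[sensor] >= required_range_m
    ]

# Roughly 110 m is used here as an illustrative highway-speed stopping distance.
print(usable_sensors("heavy_fog", 110.0))  # ['radar']
```

In this toy model, only radar still covers a highway-speed stopping distance in heavy fog, which mirrors the observation above that radar degrades the least at the ranges that matter at higher speeds.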
This assessment raises the question of how much reliance should be placed on what quality of information for autonomous cars. Is the lower acuity of radar sufficient to identify the majority of objects and vehicles, allowing the car to proceed safely at higher speeds than a human driver could? Or is the acuity too low, so that speed would have to be reduced to allow the other sensors to gather enough information to respond in time?
The other question is whether autonomy should be designed around the sensor that provides the most reliable information under all conditions, radar, or around sensor sets whose performance degrades so substantially under more adverse conditions.
LIDAR, radar, camera, and headlight technologies are all advancing, so this assessment reflects a point in time in mid-2016 and may change substantially in the coming years.
A Less-Expensive, Full-Featured Compromise
A combination of radar, ultrasonic, and passive visual sensors, the set Tesla currently ships, is a less expensive yet largely full-featured compromise. What it would appear to lack is good-resolution imaging in the dark, where LIDAR has an edge over lower-resolution radar.
Regardless of whether LIDAR is required, no solution based on a single sensor type, or even two, is likely to be viable. Each sensor type has strengths and weaknesses, and amalgamating a single representation of reality from multiple sensors is required in order to avoid both false positives and false negatives.
Some early statements from Tesla seemed to indicate that one of its sensors recognized the truck side in Florida, but another either didn’t or classified it as a suspended sign. The system resolved the conflict in favour of avoiding a false positive, and Tesla is expected to release a software update shortly that makes better use of the radar system to avoid this edge condition.
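Tesla has not published the details of its fusion logic, so the following is only a generic sketch of the trade-off just described: when sensors disagree, a fusion layer has to weigh the cost of braking for a phantom object (a false positive) against the cost of missing a real one (a false negative). The confidence values and the threshold here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    sensor: str        # e.g. "camera", "radar"
    sees_obstacle: bool
    confidence: float  # 0.0-1.0

def should_brake(reports: list[SensorReport], threshold: float = 0.6) -> bool:
    """Generic confidence-weighted vote (hypothetical). A high threshold
    suppresses false positives (phantom braking) at the cost of more
    false negatives (missed obstacles)."""
    weight_for = sum(r.confidence for r in reports if r.sees_obstacle)
    weight_against = sum(r.confidence for r in reports if not r.sees_obstacle)
    total = weight_for + weight_against
    return total > 0 and (weight_for / total) >= threshold

# The Florida conflict as described above: one sensor reports an obstacle,
# another classifies it as an overhead sign and reports a clear path.
reports = [
    SensorReport("camera", sees_obstacle=False, confidence=0.7),  # "suspended sign"
    SensorReport("radar", sees_obstacle=True, confidence=0.5),
]
print(should_brake(reports))  # False: the conflict resolves toward avoiding a false positive
```

Raising the threshold or lowering the weight given to any one sensor reduces phantom braking but makes this kind of miss more likely; a software update that gives radar more weight in exactly this conflict shifts the balance the other way.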
No single solution is perfect. Every combination involves compromises, even if those compromises are only ones of size or of differing degrees of awareness in different directions. Sensor technologies will be combined in different ways at different vehicle price points to arrive at effective solutions.
For now, it appears as if Tesla is correct. With the advent of solid-state LIDAR at steadily decreasing price points and with better performance characteristics, that may change soon. [Editor’s Note: I assume developers of any breakthrough LIDAR system would be extremely eager to have Tesla as a client and would be pushing for trial use of their tech, which I imagine Tesla would be happy to implement. In other words, as soon as such potentially breakthrough LIDAR is ready for production, I assume Tesla will be one of the quickest (if not the quickest) to be offered it and to actually implement it in production vehicles.]
→ Related: Tesla Has The Right Approach To Self-Driving Cars