One step closer to self-driving cars thanks to new sensors
Self-driving cars have drawn growing public attention over the last few years, with companies like Tesla and Google at the forefront. But the new technology has also raised safety concerns.
A recent study from the Massachusetts Institute of Technology (MIT) highlights a potential breakthrough in improving the safety of self-driving cars.
In their paper published in IEEE Access, members of the Camera Culture group at MIT’s Media Lab discuss their new approach to time-of-flight imaging – a technique that measures distance by timing how long light projected into a scene takes to bounce back to a sensor.
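The basic time-of-flight principle described above can be sketched in a few lines: distance is half the round-trip travel time of a light pulse multiplied by the speed of light. This is a minimal illustration of the general technique, not the MIT group's system; the function name and example timing value are hypothetical.

```python
# Basic time-of-flight distance calculation (illustrative sketch only).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface, given the light pulse's round-trip time.

    The pulse travels to the surface and back, so the one-way distance
    is half the total path length: d = c * t / 2.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after about 13.34 nanoseconds indicates a surface
# roughly 2 meters away.
print(tof_distance(13.34e-9))
```

Real sensors measure this timing indirectly (for example via the phase of a modulated signal), since directly clocking nanosecond-scale round trips at high precision is hard.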
The team’s new approach increases the depth resolution by 1,000-fold, which is the type of resolution needed to make self-driving cars practical. Furthermore, their technology could allow for accurate distance measurements through fog, which has been a significant obstacle in the development of these vehicles.
PhD student and first author of the paper, Achuta Kadambi, explains why increasing the resolution is so important:
“As you increase the range, your resolution goes down exponentially. Let’s say you have a long-range scenario, and you want your car to detect an object further away so it can make a fast update decision. You may have started at 1 centimeter, but now you’re back down to [a resolution of] a foot or even 5 feet. And if you make a mistake, it could lead to loss of life.”
However, at a distance of 2 meters, the MIT team’s new system has a depth resolution of a whopping (or tiny) 3 micrometers. And at 500 meters, the system theoretically would have a depth resolution of only a centimeter. This is an enormous improvement over the systems currently in place.
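A quick back-of-envelope calculation shows why 3 micrometers is so striking. If a system timed light pulses directly, a depth resolution Δd would require a round-trip timing precision of Δt = 2·Δd/c. This sketch is an illustration of that scaling, not a description of the MIT system, which achieves its resolution by other means.

```python
# How precisely would round-trip time need to be measured to achieve a
# given depth resolution, if pulses were timed directly? (Illustration
# only; function name is hypothetical.)
C = 299_792_458.0  # speed of light in m/s

def timing_precision_needed(depth_resolution_m: float) -> float:
    """Required round-trip timing precision: delta_t = 2 * delta_d / c."""
    return 2.0 * depth_resolution_m / C

# 3 micrometers of depth corresponds to roughly 20 femtoseconds (2e-14 s)
# of timing precision -- far beyond ordinary electronics.
print(timing_precision_needed(3e-6))
```

The impracticality of femtosecond-scale direct timing is precisely why indirect approaches, such as the gigahertz modulation scheme mentioned below, are attractive.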
Fog has been an issue for these time-of-flight systems in the past, as it scatters light and deflects returning light signals so that they arrive back at the sensor late and at odd angles. The researchers are currently working on a new gigahertz optical system that combines ideas from interferometry, light detection and ranging (LIDAR), and acoustics. Theoretical analyses suggest this system can produce a usable signal in the presence of fog.