AEye unveils iDAR advanced perception and planning for driverless cars

AEye, a San Francisco computer vision start-up backed by Airbus Ventures and Intel Capital, has launched iDAR, a next-generation vision technology offering advanced perception and motion planning for autonomous vehicles.

The majority of autonomous car manufacturers consider LiDAR (light detection and ranging) technology a vital part of the self-driving fleets of the future. These laser scanners act as the ‘eyes’ of onboard computers by building a 360-degree image of the world around the vehicle.

However, LiDAR technology is yet to be perfected. It is bulky, expensive and in short supply in a demand-heavy market, and questions remain over its long-term future as the eyes of autonomous vehicles.

Even when working in tandem with onboard computers, cameras and radar systems, LiDAR can struggle in adverse weather: fog, rain and even dust interfere with its ability to build a comprehensive image of the world.

Enter AEye

In a similar move to industry giant Velodyne, AEye has launched a second-generation ‘solid-state’ LiDAR system, which aims to solve these issues.

Its MOEMS (micro-opto-electromechanical system) LiDAR has been integrated with a low-light camera and artificial intelligence. The result, according to AEye, is vision hardware that can dynamically adapt in real time to “deliver higher accuracy, longer range, and more intelligent information to optimize path planning software.”

AEye’s iDAR system does this by overlaying 2D camera images onto the 3D point cloud data captured by the LiDAR. The embedded AI then works through thousands of computer vision algorithms over the fused data to feed efficient path-planning software.
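
AEye has not published the internals of that fusion step, but the operation it describes, draping a 2D camera image over a 3D point cloud, is a standard projection. Here is a minimal sketch of the idea, assuming a calibrated pinhole camera; the function name and the `K` and `T_cam_from_lidar` parameters are illustrative, not AEye's API:

```python
import numpy as np

def colorize_point_cloud(points_lidar, image, K, T_cam_from_lidar):
    """Attach RGB values from a 2D camera image to 3D LiDAR points.

    points_lidar:     (N, 3) xyz coordinates in the LiDAR frame
    image:            (H, W, 3) camera frame captured at the same instant
    K:                (3, 3) camera intrinsic matrix
    T_cam_from_lidar: (4, 4) rigid transform from LiDAR to camera frame
    """
    # Move the points into the camera's coordinate frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]

    # Keep points in front of the camera, then apply the pinhole projection.
    in_front = pts_cam[:, 2] > 0
    uv = (K @ pts_cam[in_front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)

    # Discard projections that fall outside the image bounds.
    h, w = image.shape[:2]
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)

    colors = image[uv[valid, 1], uv[valid, 0]]  # one RGB triple per surviving point
    return points_lidar[in_front][valid], colors
```

In a production pipeline this projection runs for every frame, and precise time synchronisation between camera and LiDAR does much of the heavy lifting; any colour attached to a point is only as good as the calibration behind `K` and `T_cam_from_lidar`.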

“AEye’s unique architecture has allowed us to address many of the fundamental limitations of first-generation spinning or raster scanning LiDAR technologies,” said Luis Dussan, AEye founder and CEO.

“These first-generation systems are siloed sensors that use rigid, asymmetrical data collection, either oversampling or undersampling information. This dynamic exposes an inherent trade-off between density and latency in legacy sensors, which restricts or eliminates the ability to do intelligent sensing.”

With AEye’s intelligent sensing, he said, iDAR can selectively revisit any chosen object twice within 30 microseconds, which the company says equates to a 3,000-fold improvement over conventional LiDAR revisit rates. “This embedded intelligence optimizes data collection, so we can transfer less data while delivering better quality, more relevant content.”
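
The unstated baseline appears to be a conventional spinning LiDAR, which typically refreshes the whole scene at around 10 Hz, so any single point is revisited roughly once per 100 ms frame. On that assumption (ours, not a figure AEye has stated), the arithmetic checks out:

```python
# Back-of-the-envelope check of the claimed ~3,000-fold revisit improvement.
# Assumption: a conventional spinning LiDAR refreshes the scene at 10 Hz,
# so a given point is revisited roughly once every 100 ms.
conventional_revisit_s = 1 / 10   # 100 ms between visits at 10 Hz
idar_revisit_s = 30e-6            # 30 microseconds, per AEye's claim

speedup = conventional_revisit_s / idar_revisit_s
print(f"Revisit speedup: ~{speedup:,.0f}x")  # ~3,333x, consistent with 3,000-fold
```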

Mimicking the visual cortex at the ‘edge’ with iDAR

AEye’s iDAR system mimics the human visual cortex: it focuses on and evaluates potential driving hazards, relying on a distributed architecture and edge processing to track objects of interest.
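
AEye has not disclosed how its scheduler allocates attention, but the foveation principle is easy to illustrate: regions flagged as potential hazards are revisited far more often than background scenery, just as the visual cortex devotes more processing to salient stimuli. The toy priority scheduler below is our own sketch; every class name and threshold in it is hypothetical:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class ScanRequest:
    next_scan_time_us: float                     # when to scan this region next
    region_id: int = field(compare=False)
    hazard_score: float = field(compare=False)   # 0 = background, 1 = imminent hazard

class FoveatedScheduler:
    """Toy scheduler: the higher a region's hazard score, the more often it is scanned."""
    BASE_PERIOD_US = 100_000   # background regions: revisit every 100 ms
    MIN_PERIOD_US = 30         # critical hazards: revisit as often as every 30 us

    def __init__(self):
        self.queue: list[ScanRequest] = []

    def add_region(self, region_id, hazard_score, now_us=0.0):
        heapq.heappush(self.queue, ScanRequest(now_us, region_id, hazard_score))

    def next_scan(self):
        req = heapq.heappop(self.queue)
        # Interpolate the revisit period: higher hazard -> shorter period.
        period = self.MIN_PERIOD_US + (1 - req.hazard_score) * (
            self.BASE_PERIOD_US - self.MIN_PERIOD_US
        )
        heapq.heappush(self.queue, ScanRequest(req.next_scan_time_us + period,
                                               req.region_id, req.hazard_score))
        return req

scheduler = FoveatedScheduler()
scheduler.add_region(region_id=1, hazard_score=0.95)  # e.g. pedestrian near the kerb
scheduler.add_region(region_id=2, hazard_score=0.05)  # e.g. distant building facade
for _ in range(5):
    req = scheduler.next_scan()
    print(f"t = {req.next_scan_time_us:>9.0f} us -> scan region {req.region_id}")
```

Run it and region 1, the simulated pedestrian, dominates the scan schedule, which is exactly the trade the quoted engineers describe: spend the sensor's bandwidth where the hazards are, not uniformly across the scene.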

“Humans have an instinctive ability to respond to visual cues. By fusing intelligence within the data collection process, iDAR takes us a step closer to this instinctive response,” said AEye’s director of software, Jon Lareau.

“AEye’s iDAR is also an open and extensible platform, allowing us to integrate best-of-breed sensors to improve performance, increase redundancy and reduce cost. Most importantly, iDAR should help our customers streamline their development process and bring better autonomous vehicles to market, faster.”

AEye has also announced the iDAR Development Partner Program, a move that will no doubt be of interest to the many automotive manufacturers developing autonomous cars. The start-up plans to demo iDAR alongside its automotive product suite at CES 2018 in Las Vegas this January.
