AEye's iDAR Leverages Biomimicry To Enable First Solid-State LiDAR With 100Hz Scan Rate
Breakthrough speed to be discussed today at Western Automotive Journalists event "Silicon Valley Reinvents the Wheel"
PLEASANTON, Calif., Oct. 1, 2018 /PRNewswire/ -- Following on the heels of being honored as "Most Exciting Start-Up" at the prestigious AutoSens awards in Brussels, AEye, a leader in artificial perception systems, today announced that it has established a new speed benchmark for 300m+ class solid-state LiDAR sensors, achieving scan rates of 100Hz or greater, a 10x improvement over currently deployed competitive systems, which typically scan at 10Hz. AEye's artificial perception platform, iDAR, is designed to mimic the performance of the human visual cortex. This biomimicry enables software-definable search patterns that can be optimized for specific driving situations, delivering far more precise and actionable information at speeds never before seen in commercially available LiDAR sensors.
iDAR's biomimicry enables autonomous perception engineers to create situationally specific scan patterns capable of searching a scene 4X-5X faster than the human eye. This scanning speed is matched by superior spatial coverage that breaks a scene down into Dynamic Vixels, a data type unique to iDAR that combines X, Y, Z spatial data with R, G, B color data.
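Purely as an illustration of what such a fused spatial-plus-color record could look like, the sketch below defines a minimal data structure along those lines; the class name, fields, and units are assumptions for readability, not AEye's actual Dynamic Vixel format.

```python
from dataclasses import dataclass

@dataclass
class DynamicVixel:
    """Illustrative fused sample: a LiDAR point plus a camera color.

    Field names and units are assumptions for illustration only;
    they are not AEye's actual Dynamic Vixel format.
    """
    x: float  # meters, lateral offset
    y: float  # meters, forward range
    z: float  # meters, height
    r: int    # red channel, 0-255
    g: int    # green channel, 0-255
    b: int    # blue channel, 0-255

# Example: a point 40 m ahead and slightly left, tagged with its pixel color
sample = DynamicVixel(x=-1.2, y=40.0, z=0.8, r=200, g=30, b=30)
```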
By finding and locating objects as fast as or faster than a human, iDAR enables perception that can intelligently classify and track objects at unprecedented rates, including the unique ability to calculate the vector and velocity of each object within a frame. Much of this can be done at the sensor level within the same frame, bypassing the hundreds of milliseconds of latency seen in currently deployed systems. This ability to modulate both spatial and temporal dimensions simultaneously, as humans do, is what is needed to achieve Level 4 and Level 5 autonomy.
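To illustrate why a higher scan rate matters for this kind of estimate, the sketch below computes a simple finite-difference velocity vector from two revisits of the same object; the function, names, and numbers are assumptions for illustration, not iDAR's actual method.

```python
import math

def estimate_velocity(prev_pos, curr_pos, dt):
    """Finite-difference velocity estimate between two revisits of an object.

    prev_pos, curr_pos: (x, y, z) object centroids in meters from consecutive scans.
    dt: time between revisits in seconds (e.g. 0.01 s at a 100Hz scan rate).
    Returns the velocity vector in m/s and its magnitude (speed).
    """
    vx = (curr_pos[0] - prev_pos[0]) / dt
    vy = (curr_pos[1] - prev_pos[1]) / dt
    vz = (curr_pos[2] - prev_pos[2]) / dt
    speed = math.sqrt(vx**2 + vy**2 + vz**2)
    return (vx, vy, vz), speed

# An object that moved 0.3 m forward between scans 10 ms apart is doing roughly 30 m/s
vector, speed = estimate_velocity((0.0, 50.0, 0.5), (0.0, 50.3, 0.5), dt=0.01)
print(vector, speed)  # approximately (0.0, 30.0, 0.0) and 30 m/s
```

At a 10Hz scan rate the same object would only be revisited every 100 ms, which is the latency gap the higher frame rate is meant to close.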
"iDAR is based on a revolutionary new agile LiDAR design that allows autonomous vehicles to perceive far beyond the limits of human perception," said Blair LaCorte, AEye's Chief of Staff. "This powerful software-driven sensor system allows vehicle perception engines to actively interrogate their environment to identify the precise information they need at speeds that will radically improve safety."
On Monday afternoon in Mountain View, CA, Mr. LaCorte and Dr. James Doty, Clinical Professor of Neurosurgery at Stanford University, will present a discussion on "Making Sense of the Sensor: Applying Biomimicry to Vehicle Autonomy." In this session, they will explore why the human brain and visual cortex are the ideal models for autonomous perception and how their performance could best be replicated with existing sensor technologies.
"AEye has taken some of the most elegant lessons from human brain science and combined them with cutting edge technology," said James Doty. "This integration created something that I believe will allow autonomous vehicles to process data like a computer but perceive like a human."
The annual Western Automotive Journalists event is being held at the Computer History Museum in Mountain View, CA on October 1 from 10am to 5pm. For more information, see waj.org.
About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Airbus Ventures and Intel Capital. For more information, please visit www.aeye.ai.
SOURCE AEye