Autonomy requires exceptional sensors

Yes, better sensors will always improve AV performance. But only when AVs can intentionally interrogate the driving scene will human-like performance be achieved.

Autonomous Vehicles

To fulfill the true promise of autonomous ground vehicles, AVs need machine perception that improves on, or at least matches, human perception. That will happen in part by using more sensors and information, such as LIDAR, radar, cameras, V2X communication, external data resources, and high-resolution 3D maps.

Humans don’t drive using static sensors. Our brains, attached to multiple sensors (eyes & ears among them), tell us where to look and when based on data being collected and processed in real time. Machine perception must behave similarly if it is to improve upon human perception.

This requires that sensors attached to the fusion and decision layers of the AV stack expose control capabilities, so those layers can direct sensor resources to collect the pertinent data that most improves situational awareness of the driving scene.
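As a rough illustration of the idea, the sketch below shows what such a control surface between the AV stack and a taskable sensor might look like. All names here (RadarTask, SensorControl) are hypothetical, invented for illustration; they are not part of any real EchoDrive API.

```python
from dataclasses import dataclass

@dataclass
class RadarTask:
    """A request from the fusion/decision layer to interrogate a scene region."""
    azimuth_deg: float    # center of the region of interest
    elevation_deg: float
    dwell_ms: int         # extra measurement time to spend on this region

class SensorControl:
    """Control surface a cognitively taskable sensor might expose to the AV stack."""
    def __init__(self):
        self._queue = []

    def task(self, task: RadarTask) -> None:
        # The fusion/decision layer pushes high-priority interrogation
        # requests; the sensor interleaves them with its background scan.
        self._queue.append(task)

    def next_task(self):
        # FIFO: oldest pending interrogation request first, or None if idle.
        return self._queue.pop(0) if self._queue else None

ctrl = SensorControl()
ctrl.task(RadarTask(azimuth_deg=-15.0, elevation_deg=2.0, dwell_ms=40))
print(ctrl.next_task().dwell_ms)  # 40
```

The point is the direction of the arrow: instead of data flowing only from sensor to stack, the stack can push requests back down to the sensor.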

EchoDrive Overview

This video demonstrates some of the advanced imaging and adaptive interrogation functionality of EchoDrive Cognitive AV Radar. 

Dynamic Sensor Control

No matter the sensor, manufacturer, resolution, or data rate, one-way data flows from sensor through fusion to decision will never achieve human-like perception. When humans hear a sound or glimpse something in peripheral vision, the brain directs sensory resources toward the object or scene area to resolve the ambiguity. When continuous, passive data consumption fails to resolve ambiguities in the driving scene, algorithms struggle to reach sufficient confidence to activate vehicle controls.
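The ambiguity-resolution loop described above can be sketched as follows. This is a toy model under stated assumptions, not a real AV algorithm: the confidence threshold, the `refine` helper, and the dwell callback are all hypothetical, and a real stack would update confidence through a tracker or fusion filter rather than a single number.

```python
# Assumed threshold at which the stack trusts a track enough to act on it.
CONFIDENCE_THRESHOLD = 0.9

def refine(track, request_dwell, max_dwells=3):
    """Interrogate an ambiguous track until confident or out of dwell budget.

    request_dwell(bearing_deg) stands in for a focused sensor measurement
    and returns an updated confidence for the track.
    """
    dwells = 0
    while track["confidence"] < CONFIDENCE_THRESHOLD and dwells < max_dwells:
        track["confidence"] = request_dwell(track["bearing_deg"])
        dwells += 1
    return track["confidence"] >= CONFIDENCE_THRESHOLD

# Toy stand-in for the sensor: each focused look adds information.
readings = iter([0.7, 0.85, 0.95])
resolved = refine({"bearing_deg": 12.0, "confidence": 0.6},
                  lambda bearing: next(readings))
print(resolved)  # True
```

Contrast this with the one-way case: a passive pipeline can only wait for the next background scan, while a closed loop spends its sensing budget exactly where confidence is lowest.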

Learn more about EchoDrive Cognitive AV Radar

Whitepaper: Highly Adaptive Radar for Cognitive Imaging

EchoDrive is a new type of AV sensor: it delivers cognitive functionality by placing radar control in the AV stack itself. This allows the vehicle AI to resolve ambiguities and discrepancies by dynamically tasking the radar to measure specific aspects of the driving scene. We have authored a white paper on the topic and invite you to request your copy.


In the News

Echodyne steers its high-tech radar beam on autonomous cars with EchoDrive

January 6, 2020

The EchoDrive system meets all the requirements set out by the company’s automotive partners and testers, with up to 60 Hz refresh rates, higher resolution than any other automotive radar, and all the other goodies.

Most self-driving companies say this tech is crucial. Elon Musk disagrees

June 18, 2019

"The more sensors you use, the better chance you have of getting it right," said Eben Frankenberg, CEO of Echodyne, a Kirkland, Washington-based company developing radar.

All the things that still baffle self-driving cars, starting with seagulls

October 1, 2018

This article captures the mismatch between static sensor data flows and the dynamic needs of neural networks. AV designers need to dynamically task sensors to interrogate uncertain aspects of the driving scene, and sensor companies need to respond.

Echodyne CTO to Speak on Revolutionary Machine Perception Radar for the Autonomous Age

April 30, 2018

Dr. Tom Driscoll to Speak on the Game-Changing Technology of MESA™ Radars. Echodyne, the manufacturer of innovative high-performance radars for autonomous machines, announces today that its co-founder and Chief Technology Officer, Dr. Tom Driscoll, will speak on the company’s patented Metamaterial Electronically Scanning Array (MESA) radar technology at AUVSI’s XPonential.
