Radar for autonomous ground vehicles should look to human drivers for inspiration. The way the human eyes and brain work together to understand their surroundings and resolve ambiguity is elegant and effective. When a person’s peripheral vision detects movement, the brain directs the eye toward it to examine it further and resolve what it is. This rapid interplay of sensing, movement, and decision-making creates robust situational awareness from a relatively poor sensor, the human eye.
Autonomous vehicles cannot yet mimic this basic human behaviour, but the same principle applies: ambiguity is best resolved by directing sensor resources at specific areas of the scene to gather the data that raises probabilistic confidence. More, deeper information from one area is worth more than a sustained, uniform flow of data from everywhere.
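To make the idea concrete, the sketch below shows one simple way such an "attend where you are unsure" loop could look for a radar with a steerable beam: regions are ranked by how ambiguous their current detection probability is, the next dwell is spent on the most ambiguous one, and confidence is updated with a Bayesian step. The region names, confidence values, and likelihood ratio are illustrative assumptions, not taken from any particular system.

```python
import math

def entropy(p: float) -> float:
    """Binary entropy of a detection probability: highest when p is near 0.5."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update P(object present) after one extra dwell on a region."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

# A coarse scan gives each scene region a rough probability that something is there
# (hypothetical values for illustration).
regions = {"left_kerb": 0.55, "lane_ahead": 0.95, "right_merge": 0.40}

for step in range(3):
    # Direct the next dwell at the most ambiguous region (highest entropy),
    # rather than spreading dwell time evenly across the whole scene.
    target = max(regions, key=lambda r: entropy(regions[r]))
    # Assume the extra dwell returns a stronger measurement (likelihood ratio > 1);
    # in a real system this would come from the radar return itself.
    regions[target] = bayes_update(regions[target], likelihood_ratio=4.0)
    print(f"step {step}: dwelled on {target}, confidence now {regions[target]:.2f}")
```

In this toy loop the confidently detected lane ahead is left alone while the ambiguous kerb and merge regions receive the extra dwells, which is exactly the trade the paragraph above describes: concentrate sensing where it buys the most confidence.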