Aurora Flight Sciences is working with the University of Maryland on an Air Force Research Laboratory contract to demonstrate a vision-based guidance system for micro air vehicles that combines optics and sonar and allows small UAVs to navigate autonomously down city streets and through urban canyons. (AvWeek story here.)
Bat-inspired echolocation will allow the MAV to detect and dodge obstacles like trees, poles and wires. This is needed because the visual sensor, which relies on a phenomenon called optical flow to let the MAV navigate relative to its surroundings, is not sensitive enough to pick out small objects. Optical flow is a technique being explored for machine vision applications, so far mainly for ground robotics. The most easily understood explanation I can find is here.
Optical flow is the apparent movement of an object relative to the observer. Here's how Wikipedia describes it: "This allows a person to judge how close he is to certain objects, and how quickly he is approaching them. It is also useful for avoiding obstacles: if an object in front of an observer appears to be expanding but not moving, he is probably headed straight for it, but if it is expanding but moving slowly to the side, he will probably pass by it."
Intuitively I can see how it would work. If you run down a street, cycle down a path, or drive along a road, you avoid bumping into things more by sensing their motion in your peripheral vision than by actually looking at them. The MAV doesn't need to know its exact position to be able to navigate, only its position relative to its surroundings. Aurora says bees, flies and even pilots use optical flow when flying. Looking on YouTube to find a video illustrating optical flow, I stumbled on this, which has some interesting moments:
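The expanding-versus-drifting rule from the Wikipedia quote can be turned into a tiny heuristic. This is just my own illustrative sketch, not Aurora's actual algorithm: given flow vectors sampled across a patch of the image, the mean flow measures sideways drift, and the average outward component of the remaining flow measures expansion. Pure expansion with little drift suggests a head-on approach; expansion plus strong drift suggests the object will pass by. The function name and threshold are invented for the example.

```python
import math

def collision_cue(flow_vectors):
    """Classify a patch of optical-flow samples as 'collision' or 'pass-by'.

    flow_vectors: list of (x, y, u, v) tuples, where (x, y) is the sample
    position relative to the patch centre and (u, v) is the flow there.
    Purely illustrative heuristic; the 0.5 threshold is arbitrary.
    """
    n = len(flow_vectors)
    # Mean flow = net sideways drift of the object in the image plane.
    mean_u = sum(u for _, _, u, _ in flow_vectors) / n
    mean_v = sum(v for _, _, _, v in flow_vectors) / n
    drift = math.hypot(mean_u, mean_v)
    # Expansion = average outward (radial) component of the residual flow.
    expansion = 0.0
    for x, y, u, v in flow_vectors:
        r = math.hypot(x, y)
        if r > 0:
            expansion += ((u - mean_u) * x + (v - mean_v) * y) / r
    expansion /= n
    # Expanding but not moving sideways -> probably headed straight for it.
    if expansion > 0 and drift < 0.5 * expansion:
        return "collision"
    return "pass-by"
```

Feeding it a field of vectors that all point straight away from the centre returns "collision"; adding a common sideways component to every vector flips the answer to "pass-by".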
Video: Federal University of Minas Gerais (UFMG)