It's harder than it looks to build a small perch-and-stare unmanned aircraft, even with the help of a crowd. That's the conclusion of Darpa's UAVForge crowdsourcing competition, which failed to produce a winner for the $100,000 prize.
Team Halo (all photos DARPA)
Team Halo from the U.K.'s Middlesex University scored highest, but no one was able to complete the perch-and-stare reconnaissance mission, which required a vertical takeoff, navigation to an area beyond line of sight, a landing on a structure to capture video, and a return to the starting point.
With the dust settled after the May flyoff between the nine finalists, program manager Jim McCormick has provided some answers to my questions. I was particularly interested in how well the crowdsourcing part of the competition worked, as it is a key feature of an increasing number of Darpa programs.
"UAVForge demonstrated the willingness of a global community of non-traditional developers to participate in a compelling challenge," he says. "The ability to build a community, foster collaboration and overcome inhibitions such as intellectual property concerns proved valuable. The ability to recognize and filter sources of bias in crowd voting was surprisingly important and well received."
His sentiment is echoed in a post on uavforge.net by a member of one of the finalist teams, Phase Analytic: "The collaboration among teams to solve common, and individual, problems was incredible. This was especially so, given that it was a competitive event with a large purse. The selflessness with which teams helped each other was the antithesis to personal interests and validated that elements of the event's structure truly attracted and provided an atmosphere in which 'crowd sourcing' was possible."
Or as a member of Team Halo put it in another post: "Great meeting you guys and sharing ideas. Like inmates at Colditz Castle trying to think of ways to escape...we nearly made it, however, the war ended before we could get out."
Technically, the biggest problem the flyoff competitors had was maintaining communications with their UAVs. "The primary limitation was communications. Most teams lost control and video links before they could descend to a perch," says McCormick.
"None of the teams that made it to the observation area had sufficient autonomy to operate without an RF link," he says. "One team maintained communications via cell-phone, but its platform wasn't stable enough to reach an effective observation post. Consequently, we never ran the target area scenario, so it is unknown whether any team's submission had the endurance or sensor performance to pull it off."
The take-away? It's harder than it looks. McCormick says: "The main lesson learned is that even though small UAVs and cameras are readily available off the shelf, it takes more than that to deliver something practical for use in the field."
Congratulations, nonetheless, to Team Halo and its 'Y6'-configuration hexacopter UAV. Notably, the team's design ranked highest in the independent manufacturability assessment that was an integral part of the UAVForge competition, scoring 27 out of a possible 30 points. This would have been key had the program moved to the next planned step, producing 15 of the winning UAVs to participate in a real military exercise.
Darpa has no plans (or funds) to stage another UAVForge competition, but several of the participants are keen on a rematch next year and are trying to find sponsors and a venue.
And what happened with Singapore's Gremlion, my personal favorite going into the competition? Well, the team made it to the finals, but came in last on points. It didn't fare too well in the manufacturability assessment...or the flyoff.