Aviation automation climbs new heights with ALIAS
The Aircrew Labor In-Cockpit Automation System is expected to complete its first fly-by-wire experiment as early as May.
The Defense Department's investment in artificial intelligence and battlefield autonomy starts in the lab, but it isn't worth much if it stays there.
"This is all a part of operationalizing autonomy," Lt. Col. Philip Root, acting deputy director of the Defense Advanced Research Projects Agency's Tactical Technology Office, told FCW. "Those words really mean something to me. "
Autonomy, which Root called AI in practice, "means experimenting with [the application] until it actually becomes useful. That takes a different kind of commitment."
DARPA's Aircrew Labor In-Cockpit Automation System (ALIAS) project sets out to make autonomy the battlefield norm and is expected to complete its first fly-by-wire experiment led by Sikorsky in May or June. A demo will come later this fall.
The technology aims to improve flight safety and performance and reduce the number of onboard crew members through a customizable, drop-in, removable kit that would allow advanced automation to be easily added to existing aircraft. DARPA successfully tested the effectiveness of ALIAS' sense-and-avoid capabilities in 2016 with a Cessna 172G aircraft approaching an unmanned aerial system from multiple angles.
Fly-by-wire isn't the capability, but the mechanism needed for autonomy, Root said. Moreover, it'll be a first for Army aviation.
"We need the fly-by-wire to add a computer in the middle that helps and augments a human," he said. "Once we prove that works, now we can begin adding the autonomy flight controls -- operating in the background like a lane assist [feature in cars that helps] the human operator avoid a tree."
ALIAS expects to do its first zero-pilot test in early 2020 with an unmanned Black Hawk helicopter. If successful, the initiative would allow the military to make better use of pilots' time and get more use out of the aircraft.
Root described the scenario: "Two human pilots go on a dangerous mission, come back, and during the day that aircraft can be doing safe logistics runs, low-risk missions while the pilots sleep."
A single-pilot test was the most challenging to pull off, according to Root. There's no feedback loop for the pilot because the co-pilot is a largely invisible, silent machine.
"You now have a co-pilot that's not there, and [pilots] don't know how to rely on someone who's not there," he said. "How does a machine have the same contextual understanding of when to talk, so to speak, and what information is relevant? And how does the human pilot know when to trust the autonomy?"
The situation is akin to adaptive cruise control, which automatically slows a car when it approaches another vehicle but may not slow down when the driver thinks it should.
ALIAS is working on this issue, even though Root admitted it might not get it right. "We could absolutely do this wrong. We could have an autonomous co-pilot that's supposed to allow the human pilot to do other things, and in actuality we reduce the effectiveness because the guy or gal is so concerned that they never relinquish control."
But despite humans' natural distrust of machines, ALIAS aims to bridge that gap by having pilots train with the technology, becoming as familiar with it as they would with a new iPhone.
"Pilots, operators, Marines trust those things that work," Root said.
"Trust is two parts: You have to believe the system can actually deliver, and you have to see it deliver routinely. Those two things are separate," he said. "We can provide ability to trust a machine if we develop it from the ground up to foster that trust."
Whether that trust is earned remains to be seen.
"On ALIAS, the jury is still out" because there haven't been many tests, Root said. "Let's talk in a year, and I'll tell you exactly how it worked out."
Video: ALIAS Testing