DARPA LongShot takes a step towards putting unmanned vehicles in the thick of warfare by firing weapons
THE MIL & AERO COMMENTARY – Firing weapons from unmanned aerial vehicles (UAVs) is a relatively new and somewhat ticklish subject. U.S. military leaders typically are uncomfortable with enabling autonomous systems to shoot missiles or bullets without a human being in the loop who ultimately makes the decision to fire.
At issue is the so-called "human-in-the-loop" doctrine, in which military leaders want humans -- not machines -- making life-or-death decisions.
Unclear, however, is how long this human-in-the-loop requirement can stay in place, given the speed and ferocity that modern technology brings to the battlefield. Wait for a human to make a fire-or-no-fire decision, and valuable targets could vanish or move out of range, electronic defenses could be activated, or enemy weapons seemingly could come out of nowhere. Give a human the responsibility to fire, in other words, and crucial life-critical decisions could come too late.
It's not a direct violation of the human-in-the-loop doctrine to place weapons on unmanned vehicles; it's been done for quite a while. The Reaper UAV, for example, carries the GBU-12 Paveway II laser-guided bomb, the AGM-114 Hellfire II air-to-ground missile, the AIM-9 Sidewinder air-to-air missile, and the GBU-38 Joint Direct Attack Munition (JDAM).
Safety measures can be put in place to prevent the machine itself from pulling the trigger. Still, launching weapons from unmanned systems where there's no human actually there to put eyes on procedures and results is a step away from this long-held doctrine, and another step toward putting warfare in the hands of machines.
Now comes another development with the U.S. Defense Advanced Research Projects Agency (DARPA) LongShot project, which seeks to develop a UAV that is launched from aircraft, like a missile, but with the ability to deploy several of its own air-to-air weapons. The idea is to extend aircraft engagement ranges while keeping manned aircraft beyond the reach of enemy weapons, reducing risks to their crews.
The three companies working on the first phase of the LongShot project are General Atomics Aeronautical Systems Inc. in Poway, Calif.; Northrop Grumman Corp. in Falls Church, Va.; and Lockheed Martin Corp. in Bethesda, Md.
The LongShot aircraft, essentially, will be an unmanned jet fighter-bomber with missiles attached to hardpoints underneath the wings, on the fuselage, or possibly in internal weapons bays for enhanced stealthiness.
Military air superiority today relies on advanced manned fighter aircraft to provide a penetrating counter-air capability to deliver weapons effectively, DARPA officials say. The LongShot will enable piloted aircraft to fire the UAV from standoff ranges far away from enemy threats. The unmanned LongShot, meanwhile, can fly closer to enemy targets to increase precision, while keeping human pilots out of harm's way.
In later phases of the program, the LongShot project calls for building a flyable full-scale air-launched demonstration system capable of controlled flight, before, during, and after firing its weapons.
Would the LongShot UAV take a human out of the loop in making the decision on whether or not to fire weapons? Probably not -- at least in its early stages. Yet does this kind of unmanned aircraft move us closer to a day when machine automation and learning take a bigger role in crucial battlefield decisions? Maybe yes.
Look at artificial intelligence (AI) technology today. It doesn't represent human-quality thinking yet, but it's getting closer all the time. Ever-more-powerful general-purpose graphics processing units (GPGPUs) are making embedded parallel processing at supercomputer speeds a reality today. And what about the new frontier of quantum computing that's under development right now?
It's not a far leap for trusted embedded supercomputers with the ability to reason and learn to move from complex logistics tasks to firing weapons from unmanned aircraft. Given the likelihood that AI technology will become more powerful and trusted in the near future, letting machines do the fighting most likely will be too great a temptation to ignore.
John Keller | Editor-in-Chief
John Keller is the Editor-in-Chief of Military & Aerospace Electronics Magazine, which provides extensive coverage and analysis of enabling electronics and optoelectronic technologies in military, space, and commercial aviation applications. John has been a member of the Military & Aerospace Electronics staff since 1989 and chief editor since 1995.