NASHUA, N.H. - Uncrewed vehicles for military and civilian use are garnering attention thanks to the new technologies and capabilities they afford warfighters, commanders, commercial interests, and end-users, but their development has been underway for more than a century.
While armies and navies have long worked to keep their warfighters out of harm's way while inflicting damage on the enemy, the birth of systems we would recognize as uncrewed -- also known as unmanned -- goes back to the latter days of World War I.
The British military employed radio frequency (RF) technology to control Archibald Low's "Aerial Target" drone, while their American allies developed their own radio-controlled (RC) aircraft, including the Hewitt-Sperry Automatic Airplane, a biplane equipped with explosives. Neither country fielded its RC flying bombs in combat before Armistice Day in 1918, but the foundation for uncrewed aircraft, ground, and sea vehicles had been laid.
In the intervening century, developments came across the air, land, and sea spectrum in nascent uncrewed platforms -- including remotely piloted aircraft for nuclear testing and reconnaissance -- but the technology arguably reached public consciousness most forcefully when the United States used uncrewed aerial vehicles (UAVs) to deliver missile strikes on targets in the early days of its War on Terror following the attacks of 11 Sept. 2001.
On their own
Uncrewed vehicle capabilities have grown to the point where human pilots can be taken out of the equation and the machines can act autonomously. Whether performing underwater reconnaissance, transporting passengers through the air, or carrying warfighters to the front, the possibilities for autonomous vehicles are nearly endless.
Vehicle autonomy is made possible through artificial intelligence (AI) technology. Air, land, and sea vehicles need to operate not only with human safety in mind, but also with self-preservation in mind. The technology can get a vehicle from Point A to Point B, but what happens if there is an unexpected object in the way? And why take the decisions out of human hands in the first place?
Human brains are remarkably adept at solving problems. After all, humans are among the few animals that worked out how to develop and use tools -- the only meaningful way to alter nature beyond picking up and using what is nearby, like rocks or sticks. Why take all that brainpower away from people just to make sure a UAV doesn't run into a tree, or to enable an unmanned ground vehicle to avoid a boulder?
For the military, uncrewed vehicles, also called "drones," are a good way to reduce risk and expense. Putting pilots in the air is inherently risky, as is sending crews to the crushing depths of the ocean.
Drones also avoid the biological realities of human operators and can operate for extended periods without the need for rest, food, or other human necessities. This enables longer mission durations and increased range compared to manned vehicles.
Drones equipped with surveillance capabilities can provide persistent and real-time monitoring of areas of interest. This is valuable for intelligence gathering and situational awareness. Without a crew to deploy, uncrewed vehicles can also be rapidly sent out to aid missions.
Uncrewed vehicles may yield cost savings in training, personnel, and infrastructure. They eliminate the need for crew facilities, life-support systems, and other components necessary for human presence. A similar staff reduction appeals to commercial aviation as the industry grapples with a shortage of more than 17,000 pilots over the past year.
Of course, all of the promised benefits of autonomous vehicles will be for naught if they cannot operate safely in all manner of conditions and be trusted to do so with expensive materiel and, most of all, human lives.
Electronic eyes
The U.S. Federal Aviation Administration (FAA) in Washington requires that all aircraft flying within the U.S. National Airspace System (NAS) "remain well clear" of other aircraft. With a pilot in the cockpit, vision and onboard technology ensure safe operation. With autonomous drones, sensors and associated technology must act as the eyes and decision-making center to keep the vehicle moving safely. This is done with detect-and-avoid (DAA) systems, which the FAA mandates have an "equivalent level of safety, comparable to see-and-avoid requirements for manned aircraft."
Andrew Baker, principal systems engineer at Honeywell Aerospace Technologies in Phoenix, notes that most DAA systems use passive visual [camera], microwave [radar], light wave [lidar], and ultrasonic technologies to keep the vehicle out of harm's way.
Radar DAA sensors detect the presence and location of other aircraft, obstacles, or terrain. Radar can provide information about the distance, speed, and direction of potential threats.
Light detection and ranging (lidar) sensors use laser beams to measure distances and create detailed three-dimensional maps of the surroundings. Lidar is effective for accurate obstacle detection and mapping the environment.
Optical cameras capture visual information about the surroundings, helping the DAA system recognize objects and make decisions based on visual data. Computer vision algorithms are often employed to analyze camera feeds.
Sensor data is then analyzed by sophisticated algorithms to determine the potential threats or obstacles in the vehicle's path. These algorithms take into account factors such as distance, speed, heading, and trajectory of other objects.
Decision-making algorithms assess the level of risk of detected obstacles and generate avoidance strategies. These strategies may include altering the vehicle's course, adjusting altitude in the case of aircraft, or slowing down.
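To make the decision step concrete, here is a minimal sketch of the geometry such algorithms rest on: estimating the closest point of approach (CPA) between the vehicle and a sensed track, then flagging whether an avoidance maneuver is warranted. The thresholds, straight-line motion model, and data structures are illustrative assumptions, not values from any fielded DAA system.

```python
# Minimal DAA risk-check sketch; not any vendor's actual algorithm.
from dataclasses import dataclass
import math

@dataclass
class Track:
    x: float   # east position, meters
    y: float   # north position, meters
    vx: float  # east velocity, m/s
    vy: float  # north velocity, m/s

def closest_point_of_approach(own: Track, intruder: Track) -> tuple[float, float]:
    """Return (time_to_cpa_s, separation_at_cpa_m) assuming straight-line motion."""
    rx, ry = intruder.x - own.x, intruder.y - own.y        # relative position
    vx, vy = intruder.vx - own.vx, intruder.vy - own.vy    # relative velocity
    speed_sq = vx * vx + vy * vy
    if speed_sq < 1e-9:                                    # no relative motion
        return 0.0, math.hypot(rx, ry)
    t_cpa = max(0.0, -(rx * vx + ry * vy) / speed_sq)      # time of minimum range
    sep = math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)     # range at that time
    return t_cpa, sep

# Illustrative thresholds only; real minima come from regulation and testing.
WELL_CLEAR_M = 150.0
LOOKAHEAD_S = 30.0

def needs_avoidance(own: Track, intruder: Track) -> bool:
    t_cpa, sep = closest_point_of_approach(own, intruder)
    return t_cpa <= LOOKAHEAD_S and sep < WELL_CLEAR_M
```

If the check trips, a planner would then select among the strategies above: a course change, an altitude change, or a speed reduction.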
“Passive visual [sensors] have excellent range and resolution, however at the expense of lighting conditions,” Honeywell’s Baker says. “In nighttime or bad weather, optical is not going to work well. Radar and lidar work excellent in the dark, but you lose resolution that a camera would provide. Ultrasonic works well in the dark but has very limited range. It is best for proximity detection. There is no one sensor that does all. The environment and use cases will determine what sensor is best. Ultimately using a passive visual with radar will provide the best of all worlds.”
DAA systems often incorporate communications capabilities to exchange information with other nearby vehicles or air traffic control. This communication helps coordinate movements and avoid conflicts in shared airspace.
"ADS-B [Automatic Dependent Surveillance-Broadcast] is another piece of data which is highly used," Baker says. "While UAVs cannot yet broadcast on ADS-B, we can receive and use this information along with the sensor to achieve greater accuracy. In general, DAA sensors are used for non-cooperative traffic, which are entities that are not transmitting their locations."
Working it out
While the lidar, radar, and vision technologies that make up DAA systems are mature enough for some uncrewed vehicles to operate now, the tricky question is how to integrate UAVs safely into the NAS.
"The FAA, NASA, and other partner agencies, including industry are currently defining those data exchange requirements," says Honeywell's Baker. "UTM [unmanned traffic management] is the ecosystem for uncontrolled operations that will complement the FAA’s air traffic management (ATM) system. Existing RF ground control station technology can be used, but secure LTE communications are being evaluated. A common concern is security and encryption of these networks for unmanned systems."
In preparation for air taxis and other aircraft flying passengers in and out of airports, the National Aeronautics and Space Administration (NASA) and industry partners are working with the FAA to demonstrate how creative use of existing tools and airspace procedures can support safe integration of air taxi operations into the national airspace. Robust DAA systems are crucial to scaling up the nascent urban air mobility (UAM) industry.
“Most sensors not only make detections but generate track information of objects that are moving,” Baker says. “It is through the track information that determines whether an object is stationary or moving. Cameras, Lidars, and radar all have this capability built into their software. Due to the resolution of cameras, they can go one step further and classify what that object is. The Honeywell radar can track up to 30 objects at once.”
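As a rough illustration of how track history, rather than a single detection, separates stationary objects from moving ones, the sketch below runs a simple alpha-beta filter over successive position measurements and thresholds the estimated velocity. The gains, update rate, and threshold are assumed for illustration; fielded trackers are considerably more sophisticated.

```python
# Minimal alpha-beta track filter, assuming 1 Hz detections along one axis.
ALPHA, BETA, DT = 0.85, 0.3, 1.0   # illustrative gains and update period

def update(pos_est: float, vel_est: float, measured_pos: float):
    predicted = pos_est + vel_est * DT        # propagate the track forward
    residual = measured_pos - predicted       # how far off the prediction was
    pos_est = predicted + ALPHA * residual    # correct position estimate
    vel_est = vel_est + (BETA / DT) * residual  # correct velocity estimate
    return pos_est, vel_est

pos, vel = 0.0, 0.0
for measurement in [0.1, 4.9, 10.2, 15.1, 19.8]:   # meters, once per second
    pos, vel = update(pos, vel, measurement)

MOVING_THRESHOLD = 0.5   # m/s; assumed
print("moving" if abs(vel) > MOVING_THRESHOLD else "stationary")  # -> moving
```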
An air traffic management integration simulation developed by NASA's Ames Research Center and Joby Aviation in Santa Cruz, Calif., will provide useful air traffic controller data to the FAA and industry for integrating these aircraft into operations.
Conducted at NASA’s Future Flight Central, a high-fidelity virtual tower facility providing a real-time simulation of an airport with a 360-degree view, the activity involved a team of NASA and Joby engineers, pilots, and air traffic controllers. The simulation focused on traffic patterns at Dallas Love Field (DAL) and Dallas-Fort Worth (DFW) representing intricate and bustling airspace conditions.
In the simulation, teams of controllers virtually tested the feasibility of integrating as many as 120 electric vertical take-off and landing (eVTOL) aircraft operations -- either arrivals or departures -- per hour from DFW’s Central Terminal area, alongside existing airport traffic. At the peak of the activity, up to 45 simulated eVTOL aircraft were concurrently airborne in DFW’s Class B airspace.
"Working alongside our NASA colleagues, we have now demonstrated in a real-world simulation how air taxi operations can take place in today’s airspace system, alongside active airport traffic, using tools and procedures currently available to air traffic controllers," says Tom Prevot, air taxi product lead at Joby. "These successful simulations were made possible by years of careful planning and collaboration between two organizations committed to redefining what is possible, and we’re proud to be paving the way towards the scaled commercialization of air taxis in the National Airspace System."
Related: AFWERX, NASA collaborate to develop digital Advanced Air Mobility operations center
NASA's initial assessment of the simulation suggests that the procedures developed for operating eVTOL vehicles could be scaled for implementation in airports across the country. This scaling has the potential to alleviate the workload on air traffic controllers. NASA intends to release a comprehensive analysis of the simulation results in 2024. The newly generated data will be shared with the FAA, the commercial industry, and airports to assist in identifying tools and procedures for air traffic controllers. These tools and procedures aim to facilitate the integration of eVTOLs into current and future airport operations at a high tempo. The envisioned future use of eVTOLs as a taxi service for passenger transportation to and from airports holds the promise of reducing carbon emissions and significantly enhancing the overall commuting experience for passengers.
"This simulation validates the idea that we can find a way to safely integrate these vehicles into the airspace at scale," says NASA researcher Ken Freeman.
While American government agencies and eVTOL industry experts work to integrate next-generation technology into an existing structure, UAM is taking off elsewhere. Early this year, Chinese eVTOL aircraft company EHang announced that it was the first UAM company to carry paying passengers. Its aircraft, the EH216-S, carried passengers on a pre-planned route surveyed by the company. In the event of an anomaly, the aircraft's DAA system reroutes to another pre-approved landing site.
While advancements in this field have primarily concentrated on future civil and commercial airspace navigation, military applications are crucial for ensuring the secure passage of military uncrewed aircraft systems (UAS) through the NAS and over international waters, minimizing the risk of collision with other aircraft.
On the ground
Like autonomous aircraft, self-driving ground vehicles use a suite of sensors to analyze information at a breakneck speed to sense, think, and react. The Society of Automotive Engineers (SAE) defines six levels of sophistication for autonomous vehicles, from zero to five.
For consumers, level two autonomy could mean adaptive cruise control combined with lane centering, where the vehicle slows itself, speeds back up, and holds its lane automatically. SAE categorizes a level five vehicle as one requiring no human input. To achieve this, many more sensors are required. Simply put, the more autonomous a vehicle, the more data it needs to collect, sort, and prioritize.
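For reference, the six SAE J3016 levels can be summarized compactly, as in this sketch; the one-line descriptions are paraphrases, not the standard's full definitions.

```python
# SAE J3016 driving automation levels, paraphrased one-liners.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed support (e.g., adaptive cruise)",
    2: "Partial automation: steering AND speed support; human supervises",
    3: "Conditional automation: system drives in limited conditions; human on standby",
    4: "High automation: no human fallback needed within its operating domain",
    5: "Full automation: drives everywhere, no human input required",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```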
Siemens in Munich notes that level two vehicles require approximately 17 sensors, while a level five vehicle will have around 50, including ultrasonic sensors, surround cameras, long- and short-range radar, long-range and stereo cameras, lidar, and dead-reckoning sensors.
Stephan Heinrich of commercial carmaker Lucid Motors in Newark, Calif., estimates that a lower-level vehicle's sensors produce 3 gigabits of data per second, or about 1.4 terabytes per hour. Siemens extrapolates that at high levels of autonomy, sensors will deliver approximately 40 gigabits per second, or 19 terabytes per hour.
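Those figures are straightforward to sanity-check with a unit conversion from gigabits per second to terabytes per hour (using decimal units, 1 TB = 10^12 bytes):

```python
# Convert a sensor data rate in gigabits per second to terabytes per hour.
def gbps_to_tb_per_hour(gbps: float) -> float:
    bytes_per_second = gbps * 1e9 / 8          # 8 bits per byte
    return bytes_per_second * 3600 / 1e12      # seconds per hour, bytes per TB

print(gbps_to_tb_per_hour(3))   # ~1.35 TB/h, matching the ~1.4 TB/h cited
print(gbps_to_tb_per_hour(40))  # ~18 TB/h, in line with the ~19 TB/h estimate
```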
Whether in commercial or military vehicles, DAA systems for autonomous vehicles (AVs) will need to employ edge computing with massive onboard processing power, rather than offloading computation elsewhere, to operate safely.
"While AVs are the main subject of discussion regarding the evolution of how we get from point A to point B, other technologies may prove even more vital. Vehicular communications technologies, known broadly as vehicle-to-everything (V2X) communication, will form the foundation of future mobility systems," writes Piyush Karkare, director of global automotive industry solutions at Siemens Digital Industries Software for his company in a piece titled The Data Deluge: What do we do with the data generated by AVs? "V2X technologies, such as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I), enable real-time communication between vehicles, and between vehicles and their surroundings to provide a safe, convenient, connected, and affordable mobility experience. Critically, these real-time communications allow vehicles to interact with each other and with the environment directly, in a way that humans cannot replicate. Such interaction can improve road safety, avoid traffic congestion, and reduce fuel consumption to enhance the overall mobility experience."
Karkare continues, "Here is a use case: an autonomous vehicle recognizes a fallen tree in the road and applies the brakes to avoid a collision. The vehicle can simultaneously warn vehicles behind it to decelerate, ensuring all vehicles safely come to a stop. The leading vehicle can even inform the local network of vehicles about the tree, allowing them to avoid that specific road until it can be cleared. To make this vision a reality, automotive manufacturers and suppliers are working together to develop vehicular communications based on cellular networks. The technology can use today’s mobile network and future 5G networks, enabling transmission times in the millisecond range."
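As a toy illustration of that fallen-tree scenario, the sketch below broadcasts a hazard warning to nearby listeners. The message fields and UDP transport are hypothetical placeholders for a real V2X stack such as C-V2X, not an actual protocol.

```python
# Toy V2V hazard broadcast; message format and transport are made up.
import json
import socket
import time

def broadcast_hazard(lat: float, lon: float, hazard: str, port: int = 47000):
    msg = json.dumps({
        "type": "hazard_warning",
        "hazard": hazard,            # e.g. "fallen_tree"
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
        "action_hint": "decelerate", # receiving vehicles decide locally
    }).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", port))

broadcast_hazard(32.8471, -96.8518, "fallen_tree")
```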
The U.S. Department of Defense (DOD) tapped Kodiak Robotics Inc. in Mountain View, Calif., to build a test vehicle equipped with the company's autonomous system dubbed the Kodiak Driver. Built into a Ford F-150 half-ton pickup truck, the Kodiak Driver-equipped vehicle is designed to handle complex military environments, diverse operational conditions, and areas with degraded GPS, as well as off-road variables like rocks, dust, mud, and water. The Kodiak Driver also provides the Army the ability to remotely operate vehicles when necessary.
The Kodiak Driver is a vehicle-agnostic autonomous system that runs the same software as Kodiak's autonomous long-haul trucks and features Kodiak DefensePods, an adapted version of Kodiak's modular, swappable SensorPods, designed for defense applications. A technician can swap out a DefensePod in the field, which the company says can be done in 10 minutes or less, with no specialized training required.
"Kodiak's new autonomous vehicle shows the maturity and portability of our autonomous system, which we call the Kodiak Driver," says Don Burnette, Founder and CEO, Kodiak. "We have built a comprehensive autonomous system that can be integrated into any vehicle, from a Class 8 truck, to a pickup, to a next-generation defense vehicle. Integrating Kodiak's technology into an off-road capable vehicle shows the potential for commercial and dual-use technology to revolutionize national security, just as the Department of Defense is looking to ramp up its focus on autonomous technology. We are proud to support the military and look forward to the day that Kodiak Driver-powered vehicles can provide the U.S. military with more mission options and technical superiority, all while keeping our servicemen and women out of harm's way."
Down deep
Autonomous underwater vehicles (AUVs) are similar to their airborne and land technology cousins in needing lots of data to operate, but due to their operating environment, differences abound.
Like surface and air drones, AUVs are designed for various military applications, including reconnaissance, surveillance, mine countermeasures, and environmental monitoring; the vehicles also see heavy civilian use in natural-resource extraction, where they map the sea floor.
Setting AUVs apart from their above-surface cousins are sensors shared with other maritime platforms, including surface ships and submarines: inertial measurement units, Doppler velocity logs, depth sensors, and magnetometers.
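With GPS unavailable below the surface, those sensors feed dead reckoning: the sketch below, using made-up values, rotates Doppler velocity log (DVL) body-frame velocity through an IMU-supplied heading and integrates to propagate a position estimate.

```python
# Hedged sketch of underwater dead reckoning from DVL velocity and IMU heading.
import math

def dead_reckon(x: float, y: float, heading_rad: float,
                forward_mps: float, starboard_mps: float, dt: float):
    """Rotate body-frame DVL velocity into the navigation frame and integrate."""
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    north = forward_mps * cos_h - starboard_mps * sin_h
    east = forward_mps * sin_h + starboard_mps * cos_h
    return x + east * dt, y + north * dt

x, y = 0.0, 0.0          # meters east, north of the dive point
for _ in range(600):     # ten minutes of 1 Hz updates
    x, y = dead_reckon(x, y, heading_rad=math.radians(45.0),
                       forward_mps=1.5, starboard_mps=0.0, dt=1.0)
print(f"estimated offset: {x:.0f} m east, {y:.0f} m north")
```

In practice the estimate drifts with sensor error, which is why AUVs periodically surface for a GPS fix or use acoustic positioning to re-anchor the solution.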
In December, the U.S. Navy accepted delivery of its first Extra Large Unmanned Undersea Vehicle (XLUUV) test asset system from the Boeing Company in Arlington, Virginia. The craft, dubbed "Orca," is a new class of autonomous diesel-electric submarine that can perform long-duration critical missions in changing environments and contested waters. Its modular payload section accommodates a variety of missions critical to enhancing the Navy's undersea prowess, allowing the seamless integration of sensors, communication systems, and other mission-specific components.
"This has been a very busy year for the XLUUV team and their hard work is culminating in delivery of the Navy’s first-ever unmanned diesel-electric submarine," says Capt. Scot Searles, program manager of the Unmanned Maritime Systems (PMS 406) program office. "We look forward to continued success with our Boeing teammates in fielding this important capability for the warfighter."
Jamie Whitney
Jamie Whitney joined the staff of Military & Aerospace Electronics and Intelligent Aerospace. He brings seven years of print newspaper experience to the aerospace and defense electronics industry.
Whitney oversees editorial content for the Intelligent Aerospace website, produces news and features for Military & Aerospace Electronics, attends industry events, produces Webcasts, oversees print production of Military & Aerospace Electronics, and expands the Intelligent Aerospace and Military & Aerospace Electronics franchises with new and innovative content.