Making sense out of chaotic skies

Feb. 18, 2025

As the skies and battlefields grow increasingly crowded, the ability of crewed and uncrewed vehicles to avoid collisions and operate safely in dynamic environments has become a cornerstone of modern vehicle design. Detect-and-avoid systems are instrumental to operational success in the military and commercial sectors on the ground, in the air, and at sea. These systems combine advanced sensors, embedded computing, and signal processing to provide real-time situational awareness and autonomous decision-making, with and without people in the loop.

The rise of uncrewed vehicles has reached a stage where autonomous systems are replacing human operators, allowing machines to perform complex tasks independently. From underwater reconnaissance to aerial passenger transport and battlefield logistics, the potential applications for these autonomous vehicles are vast.

Artificial intelligence (AI) underpins vehicle autonomy, enabling operations across air, land, and sea with human safety and self-preservation in mind. While these systems are adept at navigating from Point A to Point B, challenges arise when unexpected obstacles appear. This raises the question: why shift decision-making from human operators to machines?

Humans possess extraordinary problem-solving abilities, driven by their capacity to innovate and use tools to alter their environment. Yet, for tasks like ensuring a drone avoids trees or enabling an unmanned ground vehicle to bypass obstacles, autonomy offers clear advantages.

In military contexts, uncrewed vehicles -- commonly referred to as drones -- reduce risk and cost. Human-piloted missions, whether in the air or deep underwater, inherently carry dangers. Drones eliminate these risks while also overcoming the physical limitations of human operators. Autonomous systems can perform extended missions without requiring rest, food, or other biological necessities, offering greater endurance and operational range compared to manned vehicles.

In the military, detect-and-avoid technology plays a critical role in ensuring mission success and protecting assets. Uncrewed aerial vehicles (UAVs) rely on sophisticated algorithms to navigate hostile environments, while fighter jets use advanced avionics to avoid collisions in complex combat scenarios. These systems must operate seamlessly, often in contested and GPS-denied environments, where accuracy and speed are paramount.

At the core of these capabilities are embedded computing systems and signal processing technologies. Embedded systems enable real-time data integration from multiple sensors, allowing vehicles to turn enormous amounts of data into split-second decisions. Meanwhile, signal processing filters and interprets this data, ensuring reliable detection of threats in cluttered and fast-changing environments.

The impact of these innovations extends beyond military applications. In commercial aviation, detect-and-avoid systems safeguard millions of flights annually, while in the burgeoning advanced air mobility (AAM) industry, they facilitate autonomous deliveries -- and, soon, flights -- across crowded urban environments. The automotive and maritime sectors are leveraging similar technologies to push the boundaries of autonomy and safety.

Making decisions

The process begins with sensors that detect obstacles, other vehicles, or potential hazards in the environment. In many cases, a mix of sensors is used to build as complete an operational picture as possible and to provide redundancy. Andrew Baker, principal systems engineer at Honeywell Aerospace Technologies in Phoenix, notes that most detect-and-avoid systems use passive visual [camera], microwave [radar], light wave [lidar], and ultrasonic technologies to keep the vehicle out of harm’s way.

"Passive visual [sensors] have excellent range and resolution, however at the expense of lighting conditions," Honeywell’s Baker told Military + Aerospace Electronics in 2024. "In nighttime or bad weather, optical is not going to work well. Radar and lidar work excellent in the dark, but you lose resolution that a camera would provide. Ultrasonic works well in the dark but has very limited range. It is best for proximity detection. There is no one sensor that does all. The environment and uses cases will determine what sensor is best. Ultimately using a passive visual with radar will provide the best of all worlds."

Detect-and-avoid systems often incorporate communications capabilities to exchange information with other nearby vehicles or air traffic control. This communication helps coordinate movements and avoid conflicts in shared airspace.

"Most sensors not only make detections but generate track information of objects that are moving," Baker says. "It is through the track information that determines whether an object is stationary or moving. Cameras, Lidars, and radar all have this capability built into their software. Due to the resolution of cameras, they can go one step further and classify what that object is. The Honeywell radar can track up to 30 objects at once."

"ADS-B [Automatic Dependent Surveillance-Broadcast] is another piece of data which is highly used," Baker says. "While UAVs cannot yet broadcast on ADS-B, we can receive and use this information along with the sensor to achieve greater accuracy. In general, detect-and-avoid sensors are used for non-cooperative traffic, which are entities that are not transmitting their locations."

In crewed aircraft, detect-and-avoid systems often broadcast information about the aircraft's location and altitude to other nearby aircraft. Relatively lower-tech than automated detect-and-avoid systems, the Traffic Collision Avoidance System (TCAS) provides pilots with suggestions to climb or descend to avoid encroaching into another aircraft's airspace.
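
A highly simplified sketch of that climb-or-descend logic follows; real TCAS II advisories rely on time-based (tau) criteria, sensitivity levels, and coordination between aircraft, so the thresholds and structure here are illustrative assumptions only.

```python
# Highly simplified sketch of TCAS-style vertical advisory logic. Real TCAS II
# uses time-based (tau) criteria, sensitivity levels, and coordinated advisories
# between aircraft; the thresholds and structure here are illustrative only.

def vertical_advisory(own_alt_ft, intruder_alt_ft, closure_rate_kt, range_nm,
                      alert_time_s=25.0, min_separation_ft=600.0):
    """Suggest 'climb' or 'descend' if the intruder is projected to pass too
    close within the alert window; otherwise 'monitor' or 'clear'."""
    if closure_rate_kt <= 0:
        return "clear"                          # the aircraft are diverging
    time_to_closest_approach_s = range_nm / closure_rate_kt * 3600.0
    if time_to_closest_approach_s > alert_time_s:
        return "monitor"
    if abs(own_alt_ft - intruder_alt_ft) >= min_separation_ft:
        return "monitor"
    # Resolve in the direction that increases the existing vertical separation.
    return "climb" if own_alt_ft >= intruder_alt_ft else "descend"

print(vertical_advisory(own_alt_ft=10_000, intruder_alt_ft=9_900,
                        closure_rate_kt=400, range_nm=2.0))   # -> climb
```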

The Automatic Ground Collision Avoidance System (Auto-GCAS) also keeps an eye on altitude and location but provides, as the name implies, an automatic response from the aircraft to avoid crashing into terrain.

This technology, which uses precise navigation, aircraft performance, and on-board digital terrain data, is used on American military aircraft like the F-35 and F-16, and was developed by Lockheed Martin's Skunk Works division, the Air Force Research Laboratory, and the National Aeronautics and Space Administration (NASA).

According to Ed Griffin, the Lockheed Martin Skunk Works program manager for the Automatic Collision Avoidance Technologies (ACAT) Fighter Risk Reduction Program, the system consists of a set of complex collision avoidance and autonomous decision-making algorithms that use precise navigation, aircraft performance and on-board digital terrain data to determine if a ground collision is imminent. "If the system predicts an imminent collision, an autonomous avoidance maneuver—a roll to wings-level and +5g pull—is commanded at the last instance to prevent ground impact," Lockheed Martin says in its description of Auto-GCAS.
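
The sketch below captures the general shape of that prediction -- projecting the flight path forward against stored terrain data -- under assumed sample rates and margins; it is not Lockheed Martin's algorithm.

```python
# Minimal sketch of the kind of prediction Auto-GCAS performs: project the
# flight path forward and compare it against stored terrain elevations. The
# flat projection, one-second terrain samples, and recovery margin are
# illustrative assumptions, not Lockheed Martin's algorithm.

def ground_collision_imminent(alt_ft, vert_speed_fpm, terrain_ahead_ft,
                              lookahead_s=10.0, recovery_margin_ft=150.0):
    """Return True if the projected flight path penetrates terrain (plus a
    recovery margin) within the lookahead window.

    terrain_ahead_ft: terrain elevations (ft) sampled at one-second intervals
    along the projected ground track.
    """
    for t, terrain_ft in enumerate(terrain_ahead_ft[: int(lookahead_s)], start=1):
        predicted_alt_ft = alt_ft + vert_speed_fpm * t / 60.0
        if predicted_alt_ft <= terrain_ft + recovery_margin_ft:
            return True   # would trigger the roll-to-wings-level and pull
    return False

# A steep descent toward rising terrain trips the check.
print(ground_collision_imminent(alt_ft=3_000, vert_speed_fpm=-6_000,
                                terrain_ahead_ft=[2_000, 2_100, 2_300, 2_600, 2_900]))
```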

Since the roll-out of the Auto-GCAS system began in 2014, Lockheed Martin says that it has been credited with saving 13 pilots and 12 aircraft. "Based on the data we’ve seen so far, the Auto GCAS is doing exactly what it was designed to do: save priceless lives and valuable military aircraft," said Griffin. "Many aviation professionals believe autonomy is emerging as the new frontier in aviation and Auto-GCAS currently represents the leading edge of autonomy as it applies to manned platforms."

On the ground

When power demands, mass, or size constraints of the vehicle will not allow the compute power necessary to handle all of the sensor processing on-board, uncrewed vehicles can be assisted by off-board, ground-based detect-and-avoid (GBDAA) systems. Like their on-board counterparts, GBDAA systems sift through sensor data, often including radar, which provides long-range detection of objects and tracks their position and movement, and vision systems, which offer visual confirmation and help identify obstacles. Additionally, Automatic Dependent Surveillance-Broadcast (ADS-B) receivers detect and track ADS-B-equipped aircraft by capturing real-time position data broadcast from those aircraft. Some systems also use radio communication intercepts to gather information from pilots or other sources via traditional radio frequencies. Collectively, these sensors create a comprehensive picture of the airspace, identifying the positions, velocities, and trajectories of aircraft and other objects.
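
One small piece of that picture-building can be sketched as follows: correlating a radar track with an ADS-B report that describes the same aircraft. The nearest-neighbor gating and 500-meter gate are illustrative assumptions; operational GBDAA fusion uses statistical association and covariance-weighted distances.

```python
import math

# Sketch of how a ground-based system might correlate a radar track with an
# ADS-B report describing the same aircraft. The nearest-neighbor gating and
# 500 m gate are illustrative; real GBDAA fusion uses statistical association
# and covariance-weighted distances.

def correlate(radar_tracks, adsb_reports, gate_m=500.0):
    """Pair each radar track with the closest ADS-B report inside the gate.
    Tracks and reports are dicts with 'id', 'x', and 'y' positions in meters."""
    pairs = []
    for track in radar_tracks:
        best, best_dist = None, gate_m
        for report in adsb_reports:
            dist = math.hypot(track["x"] - report["x"], track["y"] - report["y"])
            if dist < best_dist:
                best, best_dist = report, dist
        # Radar tracks with no ADS-B match are treated as non-cooperative traffic.
        pairs.append((track["id"], best["id"] if best else "non-cooperative"))
    return pairs

radar = [{"id": "R1", "x": 1200.0, "y": 300.0}, {"id": "R2", "x": -4000.0, "y": 900.0}]
adsb = [{"id": "N123AB", "x": 1150.0, "y": 340.0}]
print(correlate(radar, adsb))   # -> [('R1', 'N123AB'), ('R2', 'non-cooperative')]
```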

For operations in controlled airspace, GBDAA systems often coordinate directly with air traffic control (ATC). This ensures that conflict resolution aligns with broader airspace management and complies with regulatory standards. Such coordination is vital for maintaining safe and efficient airspace operations.

GBDAA systems are particularly advantageous in military applications where unmanned aircraft systems (UAS) frequently operate in complex airspace environments. These systems enable operations in areas with limited radar coverage or air traffic services, facilitate coordination between unmanned and manned aircraft during missions, and enhance situational awareness for operators in remote locations. By centralizing the detection and avoidance process, GBDAA systems can reduce costs and streamline operations for vehicles that do not have onboard detect-and-avoid capabilities.

As advanced air mobility (AAM) aircraft, also known as "flying taxis," look to take off in 2025, a mix of on-board and GBDAA systems may provide the level of safety necessary to operate in congested urban environments, says Jia Xu, CEO of SkyGrid, a Boeing company focused on integrating these aircraft into existing systems.

"SkyGrid’s ground-based detect-and-avoid is being developed to meet the required integrity and availability to satisfy the operational requirement to remain well clear of other traffic, even without a pilot onboard," Xu says. "In case a degradation of performance in our ground-based traffic surveillance is detected, the system will alert the operator in a way like how operators are notified of GPS unavailability today. We anticipate that a hybrid detect-and-avoid solution -- e.g., onboard and ground-based detect-and-avoid -- may provide adequate mitigation against the failure or performance degradation of either type of detect-and-avoid system. The system is designed to ingest and correlate data from non-cooperative sources that are resilient to the effect of GPS degradations. It should also be said that non-cooperative and independent traffic surveillance functions can also be used to support positioning and navigation independent of GNSS and onboard systems to directly mitigate the effect of GPS degradation and denial.

Xu explains that third-party, ground-based technology services for AAM like the ones SkyGrid provides will offer a common operating picture. "A ground-based third-party service can act as a provider of a high-fidelity and high-integrity digital model of the operating environment, shared among all airspace users. When operators can rely on the same data for aeronautical decision-making, the predictability of operations can be increased, and in the future certain aspects of decision-making can be automated."

In addition, Xu says that third-party services can reduce reliance on ATC through scalability, and provide redundancy to onboard detect-and-avoid technology.

Regarding efficiency, Xu remarks that "A third-party service provider can provide highly automated strategic deconfliction to airspace users. For example, in future AAM networks, third-party service providers such as SkyGrid will help operators plan and schedule flights in ways that prevent airborne conflicts and ground delays. This will be achieved by third-party service providers acting as highly automated systems connecting operators, aerodromes, and air navigation service provider."

Ground down

Like their airborne cousins, detect-and-avoid systems on ground vehicles, crewed and uncrewed, rely on data from numerous sensors that use vision systems and radio and light waves to ascertain what is going on in the environment around them. Depending on the scenario, land vehicles can be tasked with avoiding collisions with terrain, other vehicles, and pedestrians, all while keeping passengers safe along a pre-planned route or one generated on the fly.

Because ground vehicles will likely encounter many more obstacles than their airborne counterparts, cameras play a larger role in detect-and-avoid systems by enabling the recognition of objects, lane markings, and traffic signs. Lidar, on the other hand, creates a high-resolution 3D map of the surroundings using laser pulses, making it particularly effective in determining object distance and geometry. Radar systems provide accurate detection of object speed and distance, especially in adverse weather conditions, while ultrasonic sensors are instrumental in close-range obstacle detection, often during parking and low-speed scenarios.

Localization is an essential component of ground-based autonomous detect-and-avoid systems, enabling the vehicle to determine its precise position on the road. High-definition maps, enhanced global positioning system (GPS) technology, and inertial measurement units (IMUs) work together to achieve centimeter-level accuracy. Engineers must ensure that the system can handle discrepancies between map data and real-world conditions, such as construction zones or temporary road closures.
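
A one-dimensional sketch of that fusion, with an assumed blend gain standing in for a full Kalman filter, shows the basic idea of IMU dead reckoning corrected by periodic GPS fixes.

```python
# One-dimensional sketch of GPS/IMU fusion for localization: the IMU propagates
# position at a high rate and each GPS fix pulls the estimate back toward truth.
# Real localization stacks fuse a full 3D state with HD-map constraints; the
# blend gain and interfaces here are illustrative assumptions.

def fuse_gps_imu(initial_pos_m, imu_velocities_mps, gps_fixes_m, dt_s=0.01,
                 gps_gain=0.2):
    """Dead-reckon with IMU-derived velocity, blending in sparse GPS fixes.
    gps_fixes_m maps sample index -> GPS position in meters."""
    pos = initial_pos_m
    history = []
    for i, vel in enumerate(imu_velocities_mps):
        pos += vel * dt_s                     # high-rate IMU propagation
        if i in gps_fixes_m:                  # low-rate GPS correction
            pos += gps_gain * (gps_fixes_m[i] - pos)
        history.append(pos)
    return history

# 1 m/s of true motion measured by a slightly biased IMU, with two GPS fixes.
estimate = fuse_gps_imu(0.0, [1.05] * 200, {99: 1.0, 199: 2.0})
print(round(estimate[-1], 2))   # ends close to the 2 m GPS fix despite IMU bias
```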

AI and machine learning are integral to self-driving systems. Engineers employ AI for object recognition, decision-making, and adaptive learning. Neural networks, for example, classify objects and predict the behavior of other road users, while reinforcement learning improves system performance through iterative simulations and real-world data analysis. The challenge for engineers lies in training these models to handle edge cases, such as rare or unpredictable scenarios, to enhance system robustness.
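
Learned behavior models of this kind are commonly compared against a simple constant-velocity baseline; the sketch below shows such a baseline flagging a pedestrian whose predicted path crosses the vehicle's lane, with illustrative geometry and thresholds.

```python
# Learned behavior models are often judged against a simple constant-velocity
# baseline like this one: project a detected road user's recent motion forward
# and flag predicted paths that cross the ego vehicle's lane. The straight-lane
# geometry and thresholds below are illustrative assumptions.

def predict_path(pos_xy, vel_xy, horizon_s=3.0, step_s=0.5):
    """Constant-velocity prediction of future (x, y) positions in meters."""
    steps = int(horizon_s / step_s)
    return [(pos_xy[0] + vel_xy[0] * step_s * k,
             pos_xy[1] + vel_xy[1] * step_s * k) for k in range(1, steps + 1)]

def crosses_lane(path, lane_y_min=-1.5, lane_y_max=1.5, max_x_ahead_m=30.0):
    """True if any predicted position falls inside the ego lane within
    max_x_ahead_m meters of the vehicle."""
    return any(0.0 <= x <= max_x_ahead_m and lane_y_min <= y <= lane_y_max
               for x, y in path)

# A pedestrian 10 m ahead and 4 m to the side, walking toward the lane.
pedestrian_path = predict_path(pos_xy=(10.0, 4.0), vel_xy=(0.0, -1.4))
print(crosses_lane(pedestrian_path))   # -> True
```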

Self-driving technology is categorized into six levels of autonomy, from Level 0 -- no automation -- to Level 5, which is full vehicle autonomy under all conditions. Lower-level detect-and-avoid integration can include lane-keeping assistance and blind-spot warning systems. Engineers must address challenges such as sensor calibration, computational load, software reliability, and system safety to achieve higher levels of automation.

Commercial breakthroughs are helping drive development in uncrewed military vehicles, Kevin O'Brien, technical director for the U.S. Defense Department's Defense Innovation Unit autonomy portfolio, said in late 2022.

"There has been a revolution in the techniques and capabilities of uncrewed ground vehicles occurring in the private sector over the past two decades. We're eager to bring these matured technologies back into the Department of Defense where initial work was inspired by the DARPA Grand Challenges," O'Brien said, referring to the Defense Advanced Research Projects Agency's unmanned vehicle competitions.

Sea legs

A constant in detect-and-avoid technology is the use of integrated sensors to place the vehicle in its surroundings and keep it on mission -- but what if there are virtually no landmarks to aid in decision-making? Waves and currents create dynamic conditions that require robust filtering and adaptive algorithms to distinguish between obstacles and natural water motion. Unlike land navigation, where fixed landmarks are common, maritime navigation relies heavily on GPS, inertial navigation systems (INS), and, in some cases, celestial navigation due to the lack of consistent reference points. Engineers must also account for weather variability, which can affect sensor performance, particularly for cameras.

One distinctive feature of seaborne detect-and-avoid systems is their need to navigate in three dimensions. While air vehicles also operate in 3D, maritime navigation is uniquely influenced by buoyancy, water currents, and, for submersibles, the ability to operate underwater. Engineers must incorporate sonar and hydrodynamic modeling for underwater navigation. Additionally, maritime vessels move slower than land or air vehicles, requiring advanced prediction and planning for obstacle avoidance due to their low maneuverability. While aircraft detect-and-avoid systems can react in seconds, seaborne systems must anticipate hazards far in advance to ensure timely course corrections.

Marine environments also feature a wide variety of obstacles, ranging from stationary ones like buoys and reefs to mobile ones like other vessels and wildlife. Submerged hazards, invisible to surface-based sensors, demand hybrid sensor arrays for effective detection. Furthermore, the constantly shifting conditions of waves and tides require real-time calibration and adaptive algorithms, as the environment beneath a vessel is always changing.

Unlike land and air systems, which rely on traffic signals or air traffic control, seaborne systems must comply with decentralized maritime laws such as the Convention on the International Regulations for Preventing Collisions at Sea, 1972 (COLREGs). These laws govern right-of-way, overtaking, and crossing paths, requiring nuanced algorithmic interpretation. Engineers also face communication challenges, as long-range VHF radio and satellite communications are critical for data sharing due to the absence of land-based infrastructure, making off-board computing assistance for detect-and-avoid more difficult.
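
As an example of that algorithmic interpretation, the sketch below encodes a simplified version of the COLREGs crossing rule (Rule 15); the bearing sectors and suggested responses are assumptions for illustration, not a certified implementation.

```python
# Sketch of how the COLREGs crossing rule (Rule 15) mentioned above might be
# encoded: a power-driven vessel with another vessel on its starboard side is
# the give-way vessel. The bearing sectors and the suggested responses are
# simplified assumptions for illustration, not a certified implementation.

def relative_bearing_deg(own_heading_deg, bearing_to_target_deg):
    """Relative bearing of the target, 0-360 degrees clockwise from the bow."""
    return (bearing_to_target_deg - own_heading_deg) % 360.0

def crossing_action(own_heading_deg, bearing_to_target_deg):
    """Return a simplified COLREGs-inspired action for a crossing situation."""
    rb = relative_bearing_deg(own_heading_deg, bearing_to_target_deg)
    if 5.0 <= rb <= 112.5:        # target on our starboard side: we give way
        return "give way: alter course to starboard and pass astern"
    if 247.5 <= rb <= 355.0:      # target on our port side: we stand on
        return "stand on: hold course and speed, monitor the give-way vessel"
    return "head-on or overtaking geometry: apply other COLREGs rules"

print(crossing_action(own_heading_deg=0.0, bearing_to_target_deg=45.0))
```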

Processing power

Whether on-board or off, the effectiveness of these systems depends on the quality of the sensors, communication infrastructure, and algorithms that underpin their design, alongside the ability to process the signals. Of course, the data a vehicle's sensors produce is only useful if it can be processed quickly enough for the detect-and-avoid system to act or make recommendations.

The large volume of data from these sensors must be processed in real time with nearly no margin for error. Embedded computing systems give vehicles the processing power to keep the mission on track and out of preventable danger. While detect-and-avoid systems operate in environments as varied as the crushing depths of the ocean and the edge of space, the building blocks of the decision-making systems that keep them moving are often made with modularity in mind.

"Detect-and-avoid systems are designed to be modular and interoperable, enabling seamless integration with a variety of sensors, including radar, lidar, and electro-optical/infrared systems," says Aneesh Kothari, president of Systel, Inc. in Sugar Land, Texas. "We design with a modular open standards approach (MOSA), using industry standard interfaces, protocols, and computing architectures to ensure compatibility with existing sensor and communication architectures.

Like everything in the world of aerospace and military electronics, size, weight, and power (SWaP) concerns are top-of-mind in uncrewed technologies, Kothari says.

"SWaP optimization is vital for uncrewed platforms where space is at a premium and weight savings can be measured in ounces. Systel has led the market, bringing innovative products like Sparrow-Strike -- launched in 2024 -- to market to meet the emerging demands of next-gen uncrewed platforms. Sparrow-Strike is an ultra-small form factor (USFF) MIL-SPEC rugged edge compute solution, integrating the NVIDIA Orin Jetson NX embedded edge AI processing module or an Intel x86 based CPU," Kothari says.

Of course, processing a torrent of sensor data necessary to keep autonomous vehicles moving draws a lot of power -- and creates a lot of heat. William Pilaud, chief solutions architect at LCR Embedded Systems in Norristown, Pa., says that managing thermal performance in next-generation ruggedized systems pushes the limits of traditional conduction-cooled systems.

VPX modules, widely used in high-performance systems, are reaching the thermal limits of conduction cooling, which has traditionally relied on the VITA 48.2 standard. “We’re stuck between a rock and a hard place,” Pilaud said, highlighting that components now often draw more than 150 watts of power.
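
A back-of-the-envelope estimate shows why that figure matters: with assumed, purely illustrative thermal resistances for the wedge lock, frame, and component interface, a conduction-cooled card that is comfortable at 75 or 100 watts runs out of margin near 150.

```python
# Back-of-the-envelope sketch of why 150-plus-watt modules strain conduction
# cooling: estimate the hottest component's temperature from the card-edge
# temperature and the stack-up of thermal resistances. All resistance,
# card-edge, and limit values here are illustrative assumptions, not figures
# from any VITA standard or vendor datasheet.

def component_temp_c(card_edge_temp_c, power_w, wedge_lock_c_per_w=0.05,
                     frame_c_per_w=0.10, interface_c_per_w=0.05):
    """Estimate component temperature for a conduction-cooled module."""
    total_resistance_c_per_w = wedge_lock_c_per_w + frame_c_per_w + interface_c_per_w
    return card_edge_temp_c + power_w * total_resistance_c_per_w

for power_w in (75, 100, 150, 200):
    temp_c = component_temp_c(card_edge_temp_c=71.0, power_w=power_w)
    status = "OK" if temp_c <= 100.0 else "exceeds a 100 degC component limit"
    print(f"{power_w:>3} W -> {temp_c:5.1f} degC ({status})")
```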

Emerging cooling solutions, such as air-flow-through (AFT) cooling in VITA 48.8 and liquid cooling in VITA 48.4, offer a path forward, according to Pilaud. Liquid cooling stands out for its efficiency. However, Pilaud acknowledged the practical difficulties, noting, "In the far future, you’re going to be dragged kicking and screaming to liquid."

Pilaud also discussed the growing adoption of VITA 48.4, which has seen significant momentum over the past year. He predicted further advancements as the industry works to address gaps in the current standards. Sidewall liquid cooling was mentioned as a potential retrofit solution for older systems, though Pilaud cautioned it may not match the performance of module-level cooling.

Designing VPX backplanes for sensor-heavy systems presents its own set of hurdles, Pilaud says. LCR Embedded Systems’ expertise in backplane development, dating back to the early 2000s, has been instrumental in addressing these issues. “LCR has a long track record with designing VPX backplanes…and has a proven track record of signal integrity for industry-standard interfaces such as PCIe," Pilaud says.

Key challenges include maintaining signal integrity despite minor differences in trace routing and limited space for input/output (I/O) within the chassis. "Space for I/O is much more limited in the chassis,” Pilaud notes, adding that higher-speed sensors and parallel bus interfaces exacerbate the problem. While SOSA (the Sensor Open Systems Architecture standard) is improving standardization, Pilaud emphasized, "It’s not fully standardized at the moment."

About the Author

Jamie Whitney

Jamie Whitney joined the staff of Military & Aerospace Electronics and Intelligent Aerospace. He brings seven years of print newspaper experience to the aerospace and defense electronics industry.

Whitney oversees editorial content for the Intelligent Aerospace website, produces news and features for Military & Aerospace Electronics, attends industry events, produces webcasts, oversees print production of Military & Aerospace Electronics, and expands the Intelligent Aerospace and Military & Aerospace Electronics franchises with new and innovative content.
