There was a time when sensors were bolted onto platforms for a single use like infrared surveillance, radar coverage, or signals intelligence. These one-purpose sensors most often were designed on proprietary closed-system technology and were difficult, if not impossible, to blend with other sensors.
Today it’s different. Sensors on platforms like unmanned and manned aircraft, ground vehicles, surface ships, and submarines often are designed from the ground up to work together with different sensors. Modern open-systems sensor, sensor-processing, and networking hardware, moreover, is combining with open-systems software to make sensor interoperability a dominant trend among systems designers.
The growing ease of blending a wide variety of sensors is giving rise to new generations of integrated packages with small size, weight, and power consumption (SWaP); unprecedented computer and signal-processing power; and the flexibility to add new sensors to the mix with a minimum of additional integration work.
“We are seeing a paradigm shift,” says John Bratton, product marketing director at embedded computing specialist Mercury Systems in Andover, Mass. “In the past, sensors were ad-hoc. Now we see interoperability between suppliers, platforms, and domains.”
Benefits of interoperability
“Interoperability is one of the growth areas that I think is interesting,” says Shaun McQuaid, director of product management at Mercury Systems. “Today we are seeing multifunction systems, or those that have one large, wide aperture, and do radar processing, some jamming, electronic attack, or electronic countermeasures. The actual application is a hot-swap for different needs.”
As good as it sounds, however, sensor and sensor-processing interoperability brings with it a special set of design challenges — namely the imperative to deal with a lot of data very quickly. “It results in much larger data flows from the aperture to the processors, and among different processing elements that are doing different things with sensor fusion going on,” McQuaid says.
As an example, McQuaid points to future generations of persistent-surveillance blended radar and electro-optical sensor systems. “On the radar side there is a desire for 360-degree coverage, which requires a whole bunch more data,” he says. “This open-architecture approach is showing some real growth. Electro-optical and infrared sensors are showing growth because as the RF spectrum becomes more crowded, and as jamming becomes more effective, we can fuse E/O and IR with the radar systems to confirm targets.”
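The target-confirmation idea behind that kind of fusion can be illustrated with a toy example. The detection format, the 25-meter gate, and the confidence math below are assumptions made purely for illustration, not a description of any fielded system: a radar detection is kept only when an electro-optical or infrared detection corroborates it.

```python
# Illustrative sketch only: a simplified cross-modality confirmation gate.
# The detection record, gate distance, and confidence combination are assumptions.
from dataclasses import dataclass
from math import hypot

@dataclass
class Detection:
    sensor: str      # "radar", "eo", or "ir"
    x: float         # position estimate on a notional ground plane, meters
    y: float
    confidence: float

def confirm_targets(radar, optical, gate_m=25.0):
    """Keep a radar detection only if an EO/IR detection falls inside the gate."""
    confirmed = []
    for r in radar:
        for o in optical:
            if hypot(r.x - o.x, r.y - o.y) <= gate_m:
                # Naive independent-evidence combination of the two confidences.
                fused = 1.0 - (1.0 - r.confidence) * (1.0 - o.confidence)
                confirmed.append((r, o, fused))
                break
    return confirmed

if __name__ == "__main__":
    radar_hits = [Detection("radar", 100.0, 200.0, 0.7),
                  Detection("radar", 900.0, 50.0, 0.6)]
    eoir_hits = [Detection("ir", 110.0, 195.0, 0.8)]
    for r, o, fused in confirm_targets(radar_hits, eoir_hits):
        print(f"confirmed target near ({r.x:.0f}, {r.y:.0f}) "
              f"with fused confidence {fused:.2f}")
```

In this sketch the uncorroborated radar hit is dropped, which is the sense in which fusing E/O and IR with radar "confirms" targets.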
Interoperability among sensors and processing is becoming the norm, rather than the exception. “In the future sensors will have to talk to each other,” says Doug Rombough, vice president of business development at persistent surveillance expert Logos Technologies LLC in Fairfax, Va. “We need to put multiple sensors in one package or on one platform and allow them to be paired together and make it as autonomous as possible.”
Interoperability solutions
Sensors and sensor processing today handle more data than ever before, and that trend almost certainly will continue. “I/O bandwidth between the sensor and among processors continues to grow,” says Mercury’s McQuaid. Making components interoperable to take advantage of the latest powerful commercial off-the-shelf digital signal processors is one way that systems integrators are doing this.
Designers at North Atlantic Industries in Bohemia, N.Y., have come up with a framework for blending off-the-shelf hardware and software in sensor and signal processing applications. It’s called the Configurable Open System Architecture, or COSA for short.
“We deliver commercial off-the-shelf I/O, communications, simulation, and measurement,” says Lino Massafra, vice president of sales and marketing at North Atlantic. “If I have a system and a payload on one platform, and one payload on another platform, it makes it easy to go from one platform to another without a lot of software development.”
COSA represents a modular portfolio of rugged embedded smart modules, I/O boards, single-board computers, power supplies, and ruggedized systems, all pre-engineered to work together and be easily changed or reused in the future.
The architecture uses field-programmable gate arrays (FPGAs) and systems on chip to help engineers create smart modules for configurable mission systems rapidly while reducing or eliminating embedded computing overhead.
COSA enables systems designers to select components and customize them in modular fashion by choosing from more than 70 high-density I/O, communications, measurement and simulation, and smart-function modules. Designers can place OpenVPX, VME, CompactPCI, and PCI Express boards into rugged systems ranging in size from one module to high-density systems supporting as many as 10 motherboards and 60 smart modules.
“Designers can build multiple payloads on the same platform without re-designing the box with interfaces to those payloads,” Massafra says. “We play in any kind of I/O communications under 250 kHz.” Among the projects using COSA is the NASA Lockheed Martin Quiet Supersonic Technology (QueSST) experimental aircraft, Massafra says.
Industry standards
The rise of, and growing adherence to, open-systems industry standards is helping interoperable technologies gain momentum. “People more and more are going with an Ethernet interface,” for example, says David Pepper, senior product manager at embedded computing specialist Abaco Systems in Huntsville, Ala. “With multiple camera inputs you are starting to see a need for 10-gigabit Ethernet. Interoperability comes down to a ubiquitous interface like Ethernet.”
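What that ubiquity buys in practice can be shown with a deliberately simple sketch. The packet layout, port number, and frame fields below are assumptions made for illustration, not any vendor’s or standard’s format; the point is only that once every sensor speaks plain UDP over Ethernet, dissimilar payloads can share one network fabric.

```python
# Illustrative sketch only: streaming notional sensor frames over plain UDP/Ethernet.
import socket
import struct
import time

SENSOR_PORT = 50000               # assumed port, not drawn from any standard
HEADER = struct.Struct("!HIdI")   # sensor id, frame number, timestamp, payload length

def send_frame(sock, sensor_id, frame_no, payload: bytes,
               addr=("127.0.0.1", SENSOR_PORT)):
    """Prefix a payload with a small common header and send it as one datagram."""
    header = HEADER.pack(sensor_id, frame_no, time.time(), len(payload))
    sock.sendto(header + payload, addr)

def receive_frame(sock):
    """Read one datagram and split it back into header fields and payload."""
    data, _ = sock.recvfrom(65535)
    sensor_id, frame_no, timestamp, nbytes = HEADER.unpack_from(data)
    return sensor_id, frame_no, timestamp, data[HEADER.size:HEADER.size + nbytes]

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", SENSOR_PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_frame(tx, sensor_id=1, frame_no=42, payload=b"radar return block")
    print(receive_frame(rx))
```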
Another open-systems industry standard lending itself to interoperability is OpenVPX, Pepper says. “Open systems are driving us to OpenVPX,” he says. “It’s not as ubiquitous as Ethernet, but if a designer uses a computer board, he probably will ask for OpenVPX these days. That’s kind of the terms of interoperability: the data planes and backplanes I know of can interoperate with a number of boards from a number of different vendors.”
Even with the notoriety of Ethernet and OpenVPX as technology drivers for interoperability, one of today’s most widely anticipated and talked-about industry standards is the embryonic Sensor Open Systems Architecture (SOSA), which seeks to address issues such as affordability, versatility, and capabilities as sensor systems increase in number, applications, cost, and complexity. SOSA seeks to make sensor systems rapidly reconfigurable and reusable by as many systems designers as possible.
The SOSA Consortium, sponsored by the Open Group in San Francisco, enables government and industry to develop open standards and best practices collaboratively to ease and accelerate deployment of affordable, capable, interoperable sensor systems. The consortium is creating open-system reference architectures that employ modular design and use widely supported, consensus-based, nonproprietary standards for key interfaces.
“When I think SOSA, I think about a capability like a mission processor,” says Abaco’s Pepper. “Those interfaces will be clearly defined, like a 38999 connector. I want high assurance that I can unplug this cable from this box and plug it into another box, and it will work.”
Pepper cautions, however, that despite enthusiasm for SOSA in the sensor and sensor-processing industry, the standard “is still a work in progress.” Still, companies are introducing products that seek to meet emerging SOSA guidelines and are marketing those products as “SOSA-aligned.”
The influence of SOSA
Mark Littlefield, defense vertical product manager at Kontron America in San Diego, told an embedded computing conference earlier this year that SOSA may be the best thing in a long time to enable the U.S. military to specify embedded computing systems that are economical, powerful, upgradable, and competitive. Littlefield has been a key member of several standards committees that promote military use of commercial off-the-shelf (COTS) components and subsystems.
SOSA will create “an open-systems architecture for defense that works,” Littlefield said last January in a presentation at the 2019 VITA Embedded Tech Trends (ETT) conference in San Diego. SOSA will enable military embedded systems designers to create new systems, and make substantial upgrades to existing systems, in a fraction of the time that today’s open-systems standards allow. “Today this takes months to years, and this could be reduced to weeks to months,” Littlefield said.
Several embedded computing companies have introduced SOSA-aligned products — among them Kontron; Pentek Inc. in Upper Saddle River, N.J.; Elma Electronic in Fremont, Calif.; and Annapolis Micro Systems Inc. in Annapolis, Md. Additional companies are expected to launch SOSA-aligned products over the next few months.
Among the big advantages of SOSA is its reduction of OpenVPX circuit card slot profiles essentially to three — two basic profiles and a secondary profile for 3U and 6U OpenVPX, Littlefield points out. SOSA “is a complete common infrastructure for developing and maintaining sensor systems over several generations,” Littlefield says.
In answer to critics who say SOSA guidelines may be too limiting and threaten to reduce powerful enabling technologies to commodities, Littlefield said, “I would trade commoditization for something that doubles or triples our market.”
The promise of SOSA cannot be overestimated. Whereas previous generations of sensors and sensor-processing were stand-alone systems, in SOSA “sensors become a plug-in affair,” says Mercury’s Bratton. “SOSA is ahead of the curve, and we are spending a lot of time working with SOSA and looking at its roadmap.”
North Atlantic’s Massafra points out that his company’s COSA architecture “enables customers to create systems that lend themselves to the SOSA-aligned systems that support the changes in profiles that go from platform to platform.”
Enabling technologies for interoperability
Open-systems industry standards represent one enabling technology for sensor and sensor-processing interoperability. Logos Technologies is taking a different approach. “We have a customized processing system that enables co-visualization, or the ability to see three sensor modalities on one computer screen,” says Logos’s Rombough.
The company’s RedKite-I WAMI sensor for platforms like the Boeing-Insitu Integrator small unmanned aerial vehicle (UAV) combines a hyperspectral imager that blends many different electro-optical spectra and a narrow-field high-definition sensor to zoom in on areas of interest.
The sensor requires unique processing because all video processing is done on board, Rombough says. “The processor used for this sensor weighs less than two pounds, uses less than 50 watts of power, and can process 1 billion spectra per second,” Rombough explains.
The processor, called the Multi-Modal Edge Processor (MMEP), uses off-the-shelf general-purpose processors, FPGAs, and GPGPUs. “Data comes in on the FPGA, and we can kick it over to the CPU or GPGPU to make the most of our throughput, keep the cost of the sensor down, and keep up with technology,” he says. The MMEP does the processing on all four different sensor payloads that Logos has developed for the Integrator UAV.
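The pattern Rombough describes, an FPGA front end handing work to whichever back-end resource suits the job, can be sketched in a few lines. The size threshold and the thread-based workers below are assumptions chosen to show the general dispatch idea, not how the MMEP firmware actually routes data.

```python
# Illustrative sketch only: routing incoming sensor blocks to different compute
# resources. The cutoff and worker model are assumptions, not MMEP internals.
import queue
import threading

CPU_QUEUE: "queue.Queue" = queue.Queue()
GPU_QUEUE: "queue.Queue" = queue.Queue()
LARGE_BLOCK = 64 * 1024  # assumed cutoff: bulk blocks favor the accelerator

def dispatch(block: bytes) -> None:
    """Route a block: small jobs to the CPU path, bulk processing to the GPGPU path."""
    (GPU_QUEUE if len(block) >= LARGE_BLOCK else CPU_QUEUE).put(block)

def worker(name: str, q: "queue.Queue") -> None:
    """Drain a queue until a None sentinel arrives, standing in for real processing."""
    while True:
        block = q.get()
        if block is None:
            break
        print(f"{name} processed {len(block)} bytes")

if __name__ == "__main__":
    threads = [threading.Thread(target=worker, args=("cpu", CPU_QUEUE)),
               threading.Thread(target=worker, args=("gpgpu", GPU_QUEUE))]
    for t in threads:
        t.start()
    dispatch(b"\0" * 1024)          # small block goes to the CPU path
    dispatch(b"\0" * (256 * 1024))  # large block goes to the GPGPU path
    CPU_QUEUE.put(None)
    GPU_QUEUE.put(None)
    for t in threads:
        t.join()
```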
John Keller | Editor-in-Chief
John Keller is the Editor-in-Chief of Military & Aerospace Electronics magazine, which provides extensive coverage and analysis of enabling electronics and optoelectronic technologies in military, space, and commercial aviation applications. John has been a member of the Military & Aerospace Electronics staff since 1989 and chief editor since 1995.