The number-one activity of the U.S. military is training — sharpening old skills, learning new ones, mastering new equipment, concepts of operation (CONOPs), and tactics, techniques and procedures (TTPs) — all to be at their best should they be deployed into combat.
Just before a new deployment, military leaders also conduct numerous mission-rehearsal exercises, some using simulators or virtual reality, others using physical “towns,” “villages” or natural environments.
With all this preparation, using some of the most advanced methods and technologies available, U.S. warfighters arrive at their designated stations overseas or aboard ship at the peak of their capabilities, honed to the highest degree.
But with each passing week those skills go unused — because the expected combat scenarios are on hold, or because the skills actually called upon are not what they trained for and leave most of their training on the shelf — their edge fades. The longer warfighters are away from training, or engaged in operations that call on only a small portion of their skills, the less their pre-deployment training carries over.
While efforts have been made to make training systems available in the field — especially aboard larger ships and at semi-permanent bases in combat zones — these systems are not truly portable and require the warfighter to go to them, something not always possible.
“The big thing you hear from the military now is having a point-of-need requirement where you train at both fixed facilities and in the field,” says Darren Shavers, director of business development and foreign military sales for Meggitt Training Systems in Suwanee, Ga. “We’re working with the Army and Marine Corps to figure out how they get to having the training capabilities they need anywhere.” The enabling technology for that, he adds, is a more affordable instrumented or surrogate weapon to be used with a COTS head-mounted display.
“There are a number of field-deployable simulations out there, some of them are classified, but multiple different training on a wide range of devices and platforms is coming together so it can be used anywhere,” Shavers says. “The military — and not just the U.S. — is looking at more than just weapons. You need the total force to train together to have bloodless training battles before going into actual combat. That includes tanks, convoys, aircraft, etc. The Squad/Soldier Virtual Trainer [S/SVT] will enable everyone to work together on everything they have to do in the real world, so they can train on real equipment with the real people they will be working with in combat.”
Squad/Soldier Virtual Trainer
The S/SVT is an integrated Soldier/Squad simulation training capability for close-combat, multi-domain battle in diverse, complex operational environments, according to the U.S. Department of Defense (DOD) January 2019 Live Training Environment Statement of Need. The deployable simulation and training system provides training, qualification and certification for soldiers to conduct precision targeting with precision munitions; replicate and integrate unmanned aerial systems (UAS) and counter-UAS operations; and conduct clearance of fires and airspace de-confliction. It also provides virtual training capability for weapon-skill development and for use of force, to maintain requirements for federal law-enforcement certification.
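To make one of those tasks concrete, the short Python sketch below illustrates the kind of clearance-of-fires check an S/SVT-style trainer could grade — whether a simulated indirect-fire round would climb into airspace reserved for a UAS orbit. The function names, altitudes and drag-free physics are illustrative assumptions, not drawn from the actual S/SVT design.

```python
# Illustrative only: a toy airspace-deconfliction check of the kind an
# S/SVT-style trainer might grade. All names and numbers are assumptions.
import math
from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2

@dataclass
class AltitudeBlock:
    """Airspace reserved over the target area; floor and ceiling in meters AGL."""
    floor_m: float
    ceiling_m: float

def max_ordinate(muzzle_velocity_mps: float, elevation_deg: float) -> float:
    """Peak height of a drag-free ballistic trajectory fired at the given elevation."""
    vertical_velocity = muzzle_velocity_mps * math.sin(math.radians(elevation_deg))
    return vertical_velocity ** 2 / (2 * G)

def conflicts(block: AltitudeBlock, muzzle_velocity_mps: float,
              elevation_deg: float) -> bool:
    """True if the round climbs into (or through) the reserved altitude block."""
    return max_ordinate(muzzle_velocity_mps, elevation_deg) >= block.floor_m

# Example: a mortar fired at 60 degrees under a UAS orbit at 300-1,200 m AGL
uas_orbit = AltitudeBlock(floor_m=300.0, ceiling_m=1200.0)
print(conflicts(uas_orbit, muzzle_velocity_mps=200.0, elevation_deg=60.0))  # True
```

A fielded system would fly the full trajectory against real terrain and airspace control measures; the point here is only that deconfliction is a computation a trainer can score automatically.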
In March 2019, the U.S. Army released the service’s Common Synthetic Environment (CSE) statement of need, which outlined the Synthetic Training Environment (STE) the Army sees as its future training capability.
The Synthetic Training Environment enables tough, iterative, dynamic and realistic multi-echelon/combined-arms maneuver, mission rehearsal and mission-command collective training in support of multi-domain operations, the statement reads. The training environment will provide units the repetitions necessary to accelerate skill and collective-task proficiency from the individual soldier through the unit, resulting in achieving and sustaining training readiness. It provides complex operational-environment representations anytime, anywhere in the world. The Synthetic Training Environment will deliver collective training accessible at the Point-of-Need (PoN) in the operational, self-development and institutional training domains.
The focus is one interconnected training capability that provides a Common Synthetic Environment that delivers a comprehensive, collective training and mission rehearsal capability, the statement continues. The Common Synthetic Environment is composed of three foundational capabilities: One World Terrain (OWT), Training Management Tool (TMT) and Training Simulation Software (TSS). The Common Synthetic Environment enables the convergence of the live, virtual, constructive and gaming environments into the Synthetic Training Environment.
The Common Synthetic Environment, targeted for initial operational capability by September 2021 and full operational capability by September 2023, will provide the software, applications, and services necessary to enable and support next generation systems, including the Reconfigurable Virtual Collective Trainer, Soldier/Squad Virtual Trainer, and Live Training Environment.
The Common Synthetic Environment vendor will need to monitor the S/SVT/Integrated Visual Augmentation System (IVAS) solution set and will be required to integrate with the Adaptive Soldier Architecture. The vendor also will need to expand the IVAS prototype (expected to be delivered November 20) to meet Synthetic Training Environment requirements for the S/SVT, to include TSS, TMT and OWT.
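As a rough way to picture how the three foundational capabilities divide the work, the hypothetical sketch below stands up OWT, TMT and TSS as separate components behind a single point-of-need session. The interfaces and class names are assumptions for illustration only, not the Army’s or any vendor’s actual software design.

```python
# Hypothetical composition of the three CSE foundational capabilities.
# Interfaces are illustrative assumptions, not the actual CSE design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OneWorldTerrain:
    """OWT: serves terrain for any location in the world."""
    def load_tile(self, lat: float, lon: float) -> str:
        return f"terrain tile @ ({lat:.3f}, {lon:.3f})"

@dataclass
class TrainingManagementTool:
    """TMT: plans the exercise and records results for after-action review."""
    results: List[str] = field(default_factory=list)
    def record(self, event: str) -> None:
        self.results.append(event)

@dataclass
class TrainingSimulationSoftware:
    """TSS: runs the collective simulation on top of OWT terrain."""
    terrain: OneWorldTerrain
    def run_event(self, lat: float, lon: float) -> str:
        return f"simulated engagement on {self.terrain.load_tile(lat, lon)}"

# One point-of-need session stitches the three capabilities together.
owt = OneWorldTerrain()
tmt = TrainingManagementTool()
tss = TrainingSimulationSoftware(terrain=owt)
tmt.record(tss.run_event(31.138, -85.714))
print(tmt.results)
```

Each capability is, in practice, a major program in its own right; the takeaway is simply that terrain, simulation and training management are separable services the Common Synthetic Environment ties together.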
Upgrading technology
The Synthetic Training Environment will resolve shortcomings of the Multiple Integrated Laser Engagement System (MILES) that has been used to support direct-fire force-on-force training since 1980.
“MILES was a device used for training outside, but if I hide behind a bush, you can’t shoot through the bush, which is not realistic,” says Meggitt’s Shavers. “Or you had to use blanks or simulation; now they want more weapons represented, where hiding behind something isn’t an automatic block.”
The Army today cannot simulate realistic multi-domain operations training from the individual soldier through the brigade combat team at home station, at Maneuver Combat Training Centers (MCTCs), or at deployed locations in the live training environments that will be integrated into the Synthetic Training Environment. MILES cannot replicate the ballistic trajectory of munitions, simulate a munition’s effect on impact, or engage targets with indirect fire such as artillery or mortars. As a result, only half of the small arms and munitions assigned to a light infantry platoon can be represented accurately in live force-on-force training; the same is true for 40 percent of brigade combat team weapons effects.
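The trajectory gap is easy to see in code. In the minimal Python sketch below — illustrative numbers, drag ignored — a MILES-style laser engagement reduces to a line-of-sight check, while even the simplest ballistic model shows a rifle round dropping about two meters over 600 meters, the kind of effect a synthetic trainer can represent and a laser cannot.

```python
# Illustrative contrast between a flat laser 'shot' (MILES-style) and a
# simple drag-free ballistic arc of the kind a synthetic trainer can model.
G = 9.81  # m/s^2

def laser_hit(range_to_target_m: float, line_of_sight_clear: bool) -> bool:
    """MILES-style logic: range does not matter; a hit only needs clear line of sight."""
    return line_of_sight_clear

def ballistic_drop(range_to_target_m: float, muzzle_velocity_mps: float) -> float:
    """Vertical drop (meters) of a round fired level, ignoring drag."""
    time_of_flight = range_to_target_m / muzzle_velocity_mps
    return 0.5 * G * time_of_flight ** 2

# At 600 m, the laser says 'hit'; a simulated round at ~900 m/s muzzle
# velocity (drag ignored) has already dropped about 2.2 meters.
print(laser_hit(600.0, line_of_sight_clear=True))
print(round(ballistic_drop(600.0, muzzle_velocity_mps=900.0), 2))
```

A real trainer model would add drag, wind and terrain, but the contrast with a flat laser beam already holds at this level of simplification.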
“We’ve been dealing with deployable systems for some time, but with improved computational power and new COTS technologies, we can provide high fidelity software in a much smaller footprint and reduced cost,” says Lenny Genna, president of the military training sector at L-3 Harris Technologies in Arlington, Texas. “The technology provides the capability to do a lot, but in some cases, you want tactile feel that isn’t fully there yet.”
Experts say there are several ways to expand this capability. “Not only are we providing the capability to train, but we also are embedding adaptive learning for mission rehearsal with technologies that can help determine how the students are doing,” says L-3 Harris’s Genna. “Our job is to be as hardware agnostic as possible, so when new technology comes along, we can still work with it. The intent is to be plugged in and set up in 30 minutes.”
The Synthetic Training Environment is being designed to simulate not only weapons effects at all ranges, but also the feel of each weapon’s discharge, enabling warfighters to have confidence in their training and mission rehearsals on deployment, before entering combat.
Combining live environment training with the Synthetic Training Environment ecosystem enables users to measure training goals against actual performance. “That’s a huge part of being able to collect that information and provide that information back to the soldier, not only objectively but also with their trainers so they have the objective and the subjective information together,” says Kevin Hellman, capabilities developer for the Synthetic Training Environment at the Army Combined Arms Center - Training (CAC-T) at Fort Leavenworth, Kan.
A significant change made possible by machine learning and artificial intelligence (AI) is personalized training, where the system analyzes the performance of individuals and units and adjusts the speed and level of training accordingly, ensuring each warfighter receives precisely the training he or she requires. The computer will determine, with high degrees of accuracy, what areas require additional training, intervention or acceleration for each warfighter and unit and quickly advise the instructor of action that needs to be taken.
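A minimal sketch of that kind of personalization logic is shown below; the task names, score scale and thresholds are assumptions for illustration, not any fielded adaptive-learning algorithm.

```python
# Toy personalization logic: adjust the next training block from recent scores.
# Task names, score scale and thresholds are illustrative assumptions only.
from statistics import mean
from typing import Dict, List

def recommend(scores_by_task: Dict[str, List[float]]) -> Dict[str, str]:
    """Map each task to 'accelerate', 'sustain' or 'remediate' for the instructor."""
    plan = {}
    for task, scores in scores_by_task.items():
        avg = mean(scores[-5:])          # weight the most recent repetitions
        if avg >= 0.90:
            plan[task] = "accelerate"    # move to harder scenarios
        elif avg >= 0.70:
            plan[task] = "sustain"       # keep current difficulty
        else:
            plan[task] = "remediate"     # flag for instructor intervention
    return plan

squad_member = {
    "react_to_contact": [0.95, 0.92, 0.97],
    "call_for_fire":    [0.55, 0.60, 0.62],
}
print(recommend(squad_member))
# {'react_to_contact': 'accelerate', 'call_for_fire': 'remediate'}
```

An actual system would fold in far richer telemetry — time under fire, data from instrumented weapons, instructor observations — but the output is the same kind of per-task recommendation an instructor can act on.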
Simulation cyber security
Such complexity and thorough knowledge of weapons systems and individual/unit warfighter capabilities, however, make the computers driving these trainers major targets for hackers. Thus the need for high levels of security, from the point-of-need to the cloud, is a major component of any advanced field-deployable high-fidelity simulation and mission rehearsal system.
“Right now one of the things we’re working through is security. The closer you push for mission rehearsal and high fidelity, the higher the need for protection. To practice team training, you need to share classified information over a network, so it is important the construction, the cyber security access, is very high so you don’t compromise the technology or the system,” L-3 Harris’s Genna says.
“Whether someone is stealing it or hacking into it to try to change it, the security is the same. You build in anti-tamper plans so they can’t be messed with,” Genna continues. “The fidelity you want and access to the cloud requires working through the ability to have it secure no matter where it is. I think we now have worked it out so the students have a device that is not classified but connects to classified data once hooked up to the cloud.”
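One way to read that unclassified-device model is as a thin client that keeps nothing sensitive at rest and only streams classified content over a mutually authenticated, encrypted session once it connects to the cloud. The generic Python sketch below illustrates the pattern with standard TLS; the host name, certificate files and protocol are placeholders, not L-3 Harris’s architecture.

```python
# Generic thin-client pattern: the trainee's device keeps no classified data
# at rest and only streams it over a mutually authenticated TLS session.
# Host names and certificate paths are placeholders, not a real system.
import socket
import ssl

def stream_scenario(host: str, port: int = 8443) -> None:
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    # A client certificate lets the server authenticate the device as well.
    context.load_cert_chain(certfile="device.crt", keyfile="device.key")
    with socket.create_connection((host, port)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(b"REQUEST scenario\n")
            while chunk := tls.recv(65536):
                render(chunk)        # consume in memory; never write to disk

def render(chunk: bytes) -> None:
    """Placeholder for feeding the streamed scenario into the display."""
    pass

# stream_scenario("training-cloud.example.mil")  # placeholder host
```

Real accreditation involves far more — anti-tamper hardware, key management, cross-domain controls — but the design intent is the same: the device itself stays unclassified until it is hooked up.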
Wearable technologies
Being field-deployable will mean using head-mounted displays and other wearable technologies that can blend real and computer-generated images using virtual reality, augmented reality, and mixed reality — collectively known as XR capabilities. As resolution and field-of-view continue to improve, XR systems are expected to become smaller and more deployable, enabling high-fidelity training in the field as well as reducing some of the demand on large, home-station trainers that then can be used for more advanced mission rehearsal.
One determination not yet made is whether field-deployable simulation should be incorporated into or added onto real weapons and platforms, or whether the training device itself should be a simulated weapon.
“When you use a real weapon, training could damage it, but it is a lot better in terms of realism. However, a simulated weapon can be equipped with a variety of sensors to get a wide range of information on cant and squeeze and other things. If you try to do that on a real weapon, it can get in the way of what the warfighter is trying to do,” Meggitt’s Shavers says. “So there are pros and cons on both sides; a new trainee needs a lot of sensors, where an experienced warfighter does not. It’s still open which direction the military is going to take.”
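The sensor data Shavers describes — cant and trigger squeeze — maps naturally onto simple coaching logic. The sketch below assumes hypothetical sensor fields and thresholds; it is not Meggitt’s instrumentation.

```python
# Hypothetical coaching check on an instrumented surrogate weapon.
# Sensor fields and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class ShotSample:
    cant_deg: float         # left/right tilt of the weapon at trigger break
    trigger_force_n: float  # force applied through the trigger pull

def coach(shot: ShotSample) -> List[str]:
    """Return coaching cues an instructor display might surface."""
    cues = []
    if abs(shot.cant_deg) > 3.0:
        cues.append(f"weapon canted {shot.cant_deg:+.1f} deg; level the sights")
    if shot.trigger_force_n > 35.0:
        cues.append("trigger jerked; squeeze smoothly straight to the rear")
    return cues or ["good shot process"]

print(coach(ShotSample(cant_deg=5.2, trigger_force_n=41.0)))
```

The trade-off Shavers describes falls out directly: an instrumented surrogate can capture every one of these signals, while bolting the same sensors onto a service weapon risks getting in the shooter’s way.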
L-3 Harris’s Genna agrees, saying that other factors also come into play when deciding whether to use real weapons or simulations for training.
“When the assets are being used [in combat, for example], you have no capability to do training. Operating a separate simulation is low cost versus running a tank and the wear and tear on it. There are places where embedded makes sense, but the lower the cost of a deployable system and the lower footprint so you can take it wherever it is needed are key,” Genna notes. “We don’t just deliver tech for tech’s sake, but how it can best be used in training and mission rehearsal. By no means will it replace the actual system or instructors, but it is another tool in your toolkit.”
The aim is to make the trainee feel like he or she actually is on the battlefield. “The whole training thing is about immersion,” Genna continues. “When they put on a set of goggles for the F-18 [jet fighter-bomber], for example, they need high-quality vision and the fidelity of the goggles. They want to be able to fly an F-18 simulator exactly the way they would the actual aircraft. How well the models behave and fly is extremely important to getting the proper training. The goggles’ resolution is not yet where we want it. The other piece is haptic response: you want the same force feedback when you touch a button that you get in the actual platform. We are working to get good haptics, even when wearing gloves.”
These advances also will influence multi-domain and multi-national training as improvements in wireless technologies and cloud computing enable broad-scale cyber-safe synthetic training environments for large-force virtual constructive training at scales not possible in the past. Developing them quickly and getting them into the field also relies on new approaches to acquisition and making warfighters an integral part of the process from the start.
“Other transaction authorities are being leveraged heavily now. That involves prototypes that allow them to not have as stringent a requirement for the device up front, because they aren’t always clear about what technology is out there. How you get to what they want is open to interpretation,” Meggitt’s Shavers says.
“They then bring in the end users — warfighters — as part of the review, what’s called a touchpoint, to tell the developers what they have right and what needs to be changed before the prototype is completed. The warfighters come in every month or two to see what changes have been made and whether anything else is needed. So when the prototype is complete, it is exactly what the warfighter wants, and you can move to production and deployment much more quickly than through the normal acquisition process.”
Involving warfighters
Involving warfighters throughout the development process will ensure they receive more realistic, more current training sooner than in the past.
“The need to keep our warfighters as influencers is one of the most important aspects. The Marines and soldiers participating in this are a lot more involved and are looking forward to these devices,” Shavers continues. “Another was going outside the typical military training channel to Microsoft, Hollywood and computer games. So we now look at how to do realistic games for the military, using technologies and resources that haven’t been used in the past. A lot of this has been used in the civilian world for years — now it is open for the military.”
Industry experts predict the rapid pace of technology will bring more change in the next five years than in the past two decades, improving the level and portability of training while reducing costs in a time of ever-tightening budgets. Leading those technologies are advances in artificial intelligence, improved helmet-mounted displays, greater use of cloud computing (with accompanying increases in cyber security), and the integration of large-force virtual constructive training — rather than individual trainers not linked to each other — creating a more real-world training environment.
L-3 Harris’s Genna says he agrees this fast-paced development cycle for new technologies makes having the warfighter involved from the beginning all the more important.
“Warfighters are a key part of our development and deployment. When we are looking at where we want to spend our independent research and development, we bring in the warfighter to tell us what they need,” he says. “We have warfighter subject matter experts — former military and active reservists — on our payroll who can be the customer’s eyes throughout development. We also have active duty warfighters come in as needed and available. And they are crucial to ensure what we deploy meets actual requirements and expectations.”
Field-deployable systems
To be field-deployable, simulation systems must meet the same environmental requirements as the weapons or platforms they simulate or are mounted on.
“Using it in adverse weather conditions — high heat, deep cold, mud, sand, etc. — means hardening the devices so they can be used in different environments instead of just inside,” Meggitt’s Shavers says. “To meet the point-of-need, they have to be able to function in all environments.”
A key element in making simulation and mission rehearsal field-deployable is integrating cyber, which has become a major part of all military activities. Cyber training is conducted today, but not as an integrated part of weapons and tactics simulation and mission rehearsal. The military’s concern is that without full cyber integration, field-deployable training omits both a major threat and an offensive capability of growing importance, leaving trainees without the experience they need to respond to cyber attacks — and to launch them — as an integral part of the military’s kit.
The Cyber Operations and Training Simulation (COATS) effort is part of the drive to comply with the 2015 DOD Cyber Defense Strategy, which calls for the development of an individual and collective training capability … to conduct joint training (including exercises and mission rehearsals), experimentation and certification, as well as the assessment and development of cyber capabilities and tactics, techniques and procedures for missions that cross boundaries and networks.
Part of the problem is convincing leaders that the value of introducing cyber into existing simulation and mission rehearsal systems outweighs any perceived risks, especially as that integration is improved and made more realistic, giving warfighters the full-scale training they need to operate efficiently and effectively in a 21st-century battlespace.
Meanwhile, researchers continue to push the boundaries of field-deployable simulation and mission rehearsal, with a goal of making them as realistic and high-fidelity as possible, while also making them smaller, lighter, less expensive and easy to operate and upgrade in the field.
“Later generations will use more software and less hardware, but currently some changes require both hardware and software, which is not easy to do in the field,” Meggitt’s Shavers says.