Air Force researchers ponder Field of Light Display (FoLD) technology for future DOD needs
WRIGHT-PATTERSON AFB, Ohio – Advanced display technology and the needs of the U.S. Department of Defense (DOD) and U.S. Department of Energy (DOE) were discussed during a two-day Integrated Product Team (IPT) meeting and workshop on advanced displays hosted by Dr. Hopper, an engineer at the Air Force Research Laboratory (AFRL).
The plenary sessions of the workshop, conducted on 16 Nov 2016, featured a morning of presentations by representatives from across DOD and DOE who described their operations and advanced display needs. The afternoon presentations by current AFRL contractors provided these government users with information on the state of development and future plans for holographic and light field displays.
The Field of Light Display (FoLD) class of visualization systems provides true 3D without the need for special eyewear/glasses. Types within the FoLD class include holographic, integral ray (hogel-based), integral image (plenoptic-based), and volumetric.
The FoLD class of information visualization systems all create 3D images without glasses and without the eyestrain and discomfort of the Stereo 3D (S3D) class. Some FoLD systems offer full parallax, meaning viewers can see around objects when moving the head left/right or up/down, and some offer horizontal parallax only. Two of the three display-focused teams that presented at this workshop are working on full-parallax displays: (a) TIPD Ltd. teamed with the Massachusetts Institute of Technology (MIT) Media Lab and Brigham Young Univ. (BYU); and (b) FoVI3D Inc. (previously known as Zebra Imaging Inc. and Rattan Software Inc.). A third team is working on a horizontal-parallax-only display that shows promise for near-term fielding: (c) Third Dimension Technologies Inc. (TDT) teamed with the Oak Ridge National Laboratory and Insight Media. For details on the work of these performers, the reader is referred to TIPD (Dr. Lloyd J. LaComb Jr., [email protected]); MIT (Prof. V. Michael Bove Jr., [email protected]); BYU (Prof. Daniel E. Smalley, [email protected]); FoVI3D Inc. (Mr. Thomas L. Burnett III, [email protected]); and TDT (Dr. Clarence E. (Tommy) Thomas, [email protected]).
Dr. Hopper described a range of programs on advanced 3D displays (FoLD systems) that have been funded – mostly by DOD – over the past 30 years. Recently, from 2005-2013, the government invested some $65M in integral-ray (hogel-based) display research, comprising $45.5M for three Defense Advanced Research Projects Agency (DARPA) programs (Urban Photonic Sandtable Display (UPSD), High Resolution Affordable Emissive Display (EMD), and Non-Visible-Light Emissive Micro Display) and $19.5M for two Intelligence Advanced Research Projects Agency (IARPA) programs (3D Holographic Display Technology and Synthetic Holographic Observation). AFRL is the technology transition agent for these programs and continues to fund activities around holographic, integral ray/image, and volumetric 3D displays through a series of extramural Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) efforts in conjunction with its in-house Battlespace Visualization Lab.
Hopper then gave a presentation describing the needs of the Combined Air Operations Center (CAOC), which have driven the development of ever more sophisticated non-eyewear, full-parallax 3D visualization prototypes over the past 30 years. To date, none of these prototypes has passed the usability test: sufficient image quality at an affordable price.
The CAOC comprises a joint and coalition team that executes day-to-day combined air and space operations and provides rapid reaction, positive control, coordination, and de-confliction of weapon systems.
Source: http://www.afcent.af.mil/About/FactSheets/Display/tabid/4822/Article/217803/combined-air-operations-center-caoc.aspx
Hopper said that the CAOC has tried several 3D display solutions, both operationally and as research prototypes. Ultimately, they want a situational display that officers can stand around to see the battle space with full parallax. Toward that end, AFRL has worked with the Air Force Life Cycle Management Center (AFLCMC), DARPA, and IARPA to develop glasses-free, light-field-based 3D prototypes with Zebra Imaging Inc. and Ostendo Technologies Inc. Zebra delivered a tabletop system, branded the Zspace Motion Display (ZMD) Gen 1, that offered an effective spatial image resolution of about qVGA at each 5-mm pupil position in a 90-deg conical viewing zone with about 3 in. of depth. Ostendo developed and demonstrated prototypes of their quantum photonic imager (QPI) chips, both 2D and 3D.
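Those ZMD figures hint at why driving a light field display is so demanding. A rough back-of-envelope estimate (the viewing radius below and the calculation itself are illustrative assumptions, not published ZMD specifications) of the number of distinct rays such a display must generate per frame:

```python
import math

# Illustrative assumptions: a tabletop display delivering a qVGA view to
# every 5-mm pupil position across a 90-deg conical viewing zone,
# evaluated at an assumed 0.5 m viewing radius.
VIEW_RADIUS_M = 0.5          # assumed viewing distance (not a published spec)
CONE_HALF_ANGLE_DEG = 45.0   # half-angle of the 90-deg conical viewing zone
PUPIL_PITCH_M = 0.005        # one distinct view per 5-mm pupil position
QVGA_PIXELS = 320 * 240      # effective spatial resolution per view

# Area of the spherical cap that the viewing cone cuts out at that radius.
cap_area = 2 * math.pi * VIEW_RADIUS_M**2 * (
    1 - math.cos(math.radians(CONE_HALF_ANGLE_DEG)))

# Number of 5 mm x 5 mm view zones tiling that cap, and total rays per frame.
view_zones = cap_area / PUPIL_PITCH_M**2
total_rays = view_zones * QVGA_PIXELS

print(f"~{view_zones:,.0f} view zones, ~{total_rays:,.0f} rays per frame")
```

Under these assumptions the display must synthesize on the order of a billion rays per frame, which helps explain both the image-quality gap Hopper described and the follow-on push to double pixel density.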
Currently, under SBIR funding, Zebra’s successor company, FoVI3D Inc., has a follow-on contract to double the ZMD pixel density while seeking to reduce the cost and weight of the Gen 2 version. Ostendo designed a high-resolution monochrome QPI chip. TIPD, teamed with MIT and BYU, is developing a novel hybrid acousto-electro-optic device technology for a future light field display prototype. FoVI3D, TIPD, and Adi-Display Consulting LLC are developing metrics for FoLD systems. FoVI3D is developing a tool and methodology for measuring FoLD systems. And TDT and FoVI3D are developing a standard Streaming Model for sending video from a client computer to any type of FoLD system (holographic, integral-ray, integral-image, volumetric).
Lt. Col. Jeremy Raley of the Defense Advanced Research Projects Agency (DARPA) said that space enterprise command and control requires cognizance of assets in Earth orbit. Orbital congestion is increasing and will likely experience a marked uptick in the next 5-10 years if any of several commercial proposals move forward and place thousands of additional satellites on orbit.
Source: http://www.darpa.mil/news-events/2016-06-17a
Understanding the motion of these assets around the Earth and communicating what-if scenarios involving them is difficult due to the non-intuitive way that space objects move and interact. Having 3D displays that facilitate the visualization of these assets would enhance communication and reduce operator workload to create a better decision-making environment for commanders of space forces.
Raley is managing a program called Hallmark with a focus on advancing technologies relevant to future space operations centers. Hallmark seeks to develop and deliver a testbed environment to determine whether new technologies improve effectiveness – and new visualization software and hardware technologies are a key tool for improving effectiveness. Raley says he is looking for ideas that help with situational awareness, indications and warnings, evaluating courses of action, and executing commanders’ decisions. His job is to then develop evaluation methods to see if new tools facilitate improvement or not. He noted that proposals have already been received and he expects to make initial awards in early 2017.
Nilo Maniquis of the Naval Sea Systems Command (NAVSEA) Program Executive Office for Integrated Warfare Systems (PEO IWS) described the layout and needs of the sailors manning the Combat Information Center (CIC) onboard the DDG 51 ARLEIGH BURKE AEGIS-class destroyers. Here, there are many workstations consisting of three ruggedized monitors, along with one larger multi-mission display system for the commanding officers to view – all packed into a fairly tight space.
Source: NAVSEA 17-015.
Maniquis is working to develop visualization solutions for deployment in 2023. He would entertain new visualization solutions to replace the 3-screen desktop displays, but they need to be operational for 60 years! However, what he really wants is the same situational “sand table” concept described for CAOC with naval officers standing around a glasses-free, full parallax 3D display.
Matthew Hackett is a Science and Technology Manager at the Army Research Laboratory. His focus is on medical applications where he wants to see a transition from 2D and glasses-based 3D displays to glasses-free 3D displays. Holographic or light field displays will fit the bill because lots of medical image sets (CT, MRI, etc.) are already 3D in nature so better visualization of the data will clearly impact diagnosis and treatment.
Currently, Mr. Hackett is evaluating the use of static anatomy holograms made by Zebra Imaging in joint operations medical training centers. He wants to know if such images help to reduce the cognitive load of medical personnel and enhance understanding by presenting true 3D models. Early results showed improved knowledge gains, increased understanding of complex 3D anatomical structures, and a trend toward cognitive load reduction.
Credit: Aaron Harlan, Zebra Imaging, Inc., www.zebraimaging.com
In addition, Mr. Hackett wants to better understand the features and benefits of many other types of advanced 3D display systems across a range of applications. For example, what resolution, what color information, and what depth cues are needed to improve performance? Is a head mounted display, CAVE environment, or sand table type 3D display best for certain tasks? Ultimately, can such advanced display systems become good enough for telesurgery? If providers can make the case that their advanced visualization technique can offer a benefit, he wants to talk to you.
Aljith Curtis represented the National Geospatial-Intelligence Agency (NGA) Research Directorate. She noted that Research is entrusted with the responsibility to deliver state-of-the-art technologies and capabilities to the NGA mission. The directorate was recently restructured into seven focused research areas: Radar; Automation; Geophysics; Spectral; Environment and Culture; Geospatial Cyber; and Anticipatory Analytics. Research also has the responsibility for test, evaluation, transition, and protection of new capabilities to be delivered to the mission and community.
Source: [email protected] , Approved for Public Release, NGA 16-389
All research focus areas present challenges in visualization, from representing non-literal static and dynamic information to effectively eliciting the intended response from the analyst to the decision maker. Effectively interfacing with and visualizing data from the seven focus areas poses even greater challenges when combined with co-contributor information from multiple civilian agencies operating worldwide sensors.
Technology trends in 3D visualization have already been proven useful to the warfighter and parts of the IC community. NGA continues its interest in those trends and movements in industry where standards and open source code make it easier to integrate modular, scalable technology and stream disparate data types to the customer interface rapidly and securely.
Finally, Jason Rugolo, a program manager in the Department of Energy (DOE) Advanced Research Projects Agency-Energy (ARPA-E), talked about his Digital Transportation initiative. He noted that most transportation research focuses on making vehicles more efficient and reducing emissions, but his mission is to reduce energy use and emissions by creating more effective real-time, high-fidelity telepresence capabilities. In 2014, transportation consumed about 27% of total U.S. energy use (around 100 quadrillion BTU), with nearly all of that energy coming from petroleum sources. So why do people travel, he asked, and then presented a busy pie chart showing about 30 different reasons. Some of this travel can’t be eliminated, like meeting friends and having dinner, but some activities could be reduced or eliminated if viable alternatives were available.
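To put the quoted figures on a common scale, a quick sanity check converts transportation's share of U.S. primary energy into terawatt-hours (the round totals are the article's, and the conversion factor is the standard 1 quad ≈ 293 TWh):

```python
# Scale check on the 2014 figures quoted above.
TOTAL_US_ENERGY_QUADS = 100    # ~100 quadrillion BTU total, per the article
TRANSPORT_SHARE = 0.27         # ~27% attributed to transportation
QUAD_TO_TWH = 293.07           # 1 quad = 1.055e18 J ≈ 293.07 terawatt-hours

transport_quads = TOTAL_US_ENERGY_QUADS * TRANSPORT_SHARE
transport_twh = transport_quads * QUAD_TO_TWH

print(f"{transport_quads:.0f} quads ≈ {transport_twh:,.0f} TWh per year")
```

Roughly 27 quads, or nearly 8,000 TWh per year, which frames why even modest travel substitution via telepresence would be energetically significant.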
The alternatives may take multiple forms. For example, persistent virtual worlds are now starting to be built, such as the Oculus social world and Second Life. These are virtual computer-generated worlds where a digital representation of you – an avatar – can interact with others and objects. The technology to motion-capture and scan 3D objects is already quite sophisticated, and Rugolo predicts it will improve and spread to the masses. Interaction today is through virtual reality headsets, and these may be a fine substitute for some travel-based social activity, shopping, and entertainment, he suggested.
Sources: www.secondlife.com and https://dpo.si.edu/blog/smithsonian-creates-first-ever-3d-presidential-portrait
Rugolo also touched upon one of the key problems with 3D displays today – including 3D virtual reality (VR) headsets, which are typically instantiations of the S3D class of visualization systems. One of the main reasons that 3D causes eyestrain is the mismatch between vergence and accommodation, which is a characteristic of all S3D display systems. Vergence refers to how the eyes toe in to look at closer objects; accommodation is the focusing of the eye. The human brain is very sensitive to slight muscle changes (both intra- and extra-ocular) because these are important depth cues.
On most 3D displays, the image emits light from only one plane, so our eyes must focus there. But objects can appear well in front of the screen, so our eyes want to converge on that nearer point, creating a conflict between vergence and accommodation, and hence eye strain. Holographic and light field displays do not have this mismatch, so viewing 3D on them is much more natural. This is why Rugolo is very interested in the technology and hopes it will become a good option to replace travel. He is also interested in techniques that can create convincing digital humans.
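The size of that vergence-accommodation conflict can be quantified. A minimal sketch, using a typical (assumed) 63-mm interpupillary distance and illustrative screen/object distances, computes the vergence angle and the conflict in diopters (the optometric unit, 1/meters):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight for a target at distance_m.
    ipd_m is the interpupillary distance; 63 mm is a typical adult value."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

def va_conflict_diopters(screen_m, virtual_m):
    """S3D mismatch: the eyes accommodate (focus) at the screen plane but
    converge at the simulated object depth. Expressed in diopters (1/m)."""
    return abs(1 / screen_m - 1 / virtual_m)

# Example: a screen 2 m away rendering an object that appears 0.5 m away.
print(vergence_angle_deg(0.5))          # vergence demand at the virtual object
print(vergence_angle_deg(2.0))          # vergence demand at the screen plane
print(va_conflict_diopters(2.0, 0.5))   # conflict in diopters: 1.5
```

On a FoLD system the light appears to originate at the object's actual depth, so `screen_m` equals `virtual_m` and the conflict term goes to zero, which is the "more natural" viewing Rugolo described.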
Rugolo concluded by noting that his telepresence program is focused only on improved audio-video solutions, not haptic, touch, or speech interfaces, and that he has modest funds for those with good ideas.
What is clear from the briefings by these key military and government agencies is a desire for more advanced visualization. The data sets are complex, growing, and ripe for advanced 3D visualization.
However, a big gap exists between what visualization technology is available today and what is needed. All these activities and user needs suggest we will be seeing improvements in display technology for quite some time.