Abstract

Extended reality (XR) technologies have realized significant value for design, manufacturing, and sustainment processes. However, industrial XR, or XR implemented within industrial applications, suffers from scalability and flexibility challenges due to fundamental gaps in interoperability between data, models, and workflows. Though there have been a number of recent efforts to improve the interoperability of industrial XR technologies, progress has been hindered by an innate separation between domain-specific models (e.g., manufacturing execution data, material specifications, and product manufacturing information) and XR (often-standard) processes (e.g., multiscale spatial representations and data formats optimized for run-time presentation). In this paper, we elaborate on promising research directions and opportunities around which the manufacturing and visualization academic communities can rally. To establish such research directions, we (1) conducted a meta-review of well-established state-of-the-art review articles that have already presented in-depth surveys on application areas for industrial XR, such as maintenance, assembly, and inspection and (2) mapped those findings to publicly published priorities from across the US Department of Defense. We hope that our presented research agenda will spur interdisciplinary work across academic silos, i.e., the manufacturing and visualization communities, and engage either community within work groups led by the other, e.g., within standards development organizations.

1 Introduction

Industrial extended reality (XR) can be defined as the use of augmented reality (AR), mixed reality (MR), and/or virtual reality (VR) within industrial applications. Industrial XR can provide a human-centered mechanism for the delivery of data in austere, complex, and unpredictable environments. The Department of Defense (DoD) has already leveraged advanced visualization systems, such as XR, for a number of industrial and industry-similar applications, including (i) warfighter training in complex scenarios [1], (ii) manufacturing and sustainment of weapon systems [2], and (iii) geospatial overlay on the battlefield [3].

Figure 1 presents the perspective of Milgram et al. on the “reality-virtuality (RV) continuum” [4] relating the real (or physical) environment and objects to their virtual (or digital) counterparts. This continuum provides a framework for defining the scopes of VR, AR, and MR. VR is a simulated experience that relates user behavior (such as eye movement and hand gestures) to a defined virtual world, eliciting a sense of immersion. Contrary to VR, AR anchors the user experience within a real environment. AR scenes often spatially locate digital data and/or objects relative to salient physical features such that these data “live” in the real world. MR is any point on the RV continuum that blends the physical anchoring of AR’s real environment and the immersion of VR’s virtual environment to unlock more intuitive human engagement and interaction.

Fig. 1
Simplified representation of the RV continuum. Adapted from Milgram et al. [4].

One primary challenge for the use of industrial XR is the lack of interoperability. For each of the XR categories, developers must coordinate real-world data with virtual data, each possessing (or lacking) their own coordinate systems, modeling paradigms, and governing physics. Any errors in coordination between virtual and real data ultimately affect embodiment and presence, i.e., the user’s sense of natural engagement in an XR scene. Issues with interoperability in manufacturing and sustainment scenarios can occur at multiple levels, which can be summarized as follows:

  • Data or entity: Manufacturing data comprise a huge variety of types, with additional variability originating from the breadth of specific use cases. Such data may be nominal, ordinal, continuous, and/or discrete, and may consist of any arbitrary composition (e.g., scalars, vectors, meshes, and images). Data can also span a variety of physical descriptions of the world, for example spatial versus temporal entities. As a result of such differences, a fundamental lack of interoperability must be overcome by highly flexible storage schemas, data registration, mapping, or other data-related techniques.

  • Model or representation: The manufacturing community is a leading adopter of standards for domain-specific model definitions [5]. However, most manufacturing standards are developed in silos with quite narrow perspectives in mind. As a result, coordinating models and representations within a single XR scene remains challenging, even for those developed by internationally recognized standards development organizations (SDOs), and all the more so when dealing with proprietary formats.

  • Workflow: Meaningful industrial XR scenes often merge highly specialized models that are native to domain-specific workflows, such as computational fluid dynamics and collision detection for robotic planning. Such software is critical to the manufacturing workflow. However, robust communication and exchange capabilities across these various workflow platforms are significantly lacking, partly due to a lack of expressiveness in the available application programming interfaces (APIs). The lack of such robust APIs hinders interoperability for real-world use cases.

Early XR adopters have identified the grand challenge of realizing interoperability for industrial data and systems [6]. Most efforts have been targeted at developing standards per industrial use case, such as standardizing machine controller data readouts, e.g., MTConnect [7], and three-dimensional (3D) data presentation, e.g., ASME Y14.41 [8]. While there have been a number of recent efforts to improve the interoperability of industrial XR technologies, progress has been hindered by an innate separation between domain-specific models, e.g., manufacturing execution data, material specifications, and product manufacturing information (PMI), and XR (often-standard) components, e.g., multiscale spatial representations and data formats optimized for run-time presentation, such as glTF 2.0 [9].

In this paper, we elaborate on promising research directions and opportunities around which the manufacturing and visualization academic community can rally. To establish such research directions, we (1) conducted a meta-review on well-established state-of-the-art reviews that have already presented in-depth surveys on application areas for industrial XR, such as maintenance, assembly, and inspection and (2) mapped those findings to publicly published priorities from across the DoD. Note that our presented perspectives are underpinned by our focus on manufacturing and sustainment. Other application spaces for industrial XR are important yet might not necessarily fall under our primary focus.

2 Background and Related Work

Several prior studies have documented the technical gaps, trends, and opportunities associated with implementing industrial XR. To provide context, we summarized seven well-recognized review papers in the area of industrial AR [10–16]. Note that we focus on AR, since AR provides the most value to in-field DoD applications, such as manufacturing and sustainment use cases. That being said, remote training through VR remains a focal point for the DoD at large; however, in our opinion, VR-based training is a mature technology offered as a service by a number of specialized companies.

Table 1 summarizes key points delivered by the papers, ordered from highest to lowest consensus. Amid a steep uptake by early adopters, Nee et al. [10] presented an early review on benefits and trends of AR and VR applications in design and manufacturing. Fraga-Lamas et al. [11] conducted a more focused review of AR introduction into the shipyard, focusing on low-level attributes of successful implementations. Palmarini et al. [12] conducted a use-case-centric review of XR installations for maintenance. Bottani et al. [13] conducted an analysis of 10 years (i.e., 2006–2017) of publications related to industrial AR and found that most papers are related to the areas of assembly, maintenance, and training/learning. De Souza Cardoso et al. [14] conducted a systematic review of AR and MR across industrial applications and found that the main challenges and constraints relate to the following (in order of frequency): hardware; projection quality, accuracy, and interaction; tracking methods; users’ health and acceptance; and development complexity. Egger et al. [15] conducted a holistic review on the use of XR within “intelligent manufacturing” spanning all domains and use cases. Baroroh et al. [16] presented a recent review on AR and MR implementations associated with smart manufacturing (i.e., Industry 4.0) and focused on the human-in-the-loop elements of the XR experience. High-level commonalities across the reviews can be reviewed in Table 1.

Table 1

Summarized gaps, trends, and opportunities across the seven review papers studied

Nee et al. (2012) [10] | Fraga-Lamas et al. (2018) [11] | Palmarini et al. (2018) [12] | Bottani et al. (2019) [13] | De Souza Cardoso et al. (2020) [14] | Egger et al. (2020) [15] | Baroroh et al. (2021) [16]
Opportunities exist for assembly, disassembly, maintenance, and design
Mobile devices (HMDs or hand-held) are most common for AR hardware
Human-centric software and hardware approaches are needed
Marker-based tracking is most common/mature for virtual object positioning
Task completion time is a key measure of success
Pure marker tracking is not ideal for an industrial environment
Industrial AR use has unique challenges
AR is a key component of Industry 4.0
Testing limited to labs, not the field
Organizational culture is a barrier to adopt AR in the workplace

Note: A black dot signals that the paper (column) presented the concept (row).

2.1 The Grand Challenge of Interoperability.

The IEEE Standard Computer Dictionary defines interoperability as “the ability of two or more systems or components to exchange information and to use the information that has been exchanged” [17]. Most work aimed at improving interoperability targets specific applications. For example, interoperability across digital systems for production and sustainment environments has been a significant area of research for decades [18,19]. One common approach to improving interoperability across manufacturing systems is through standards [5,20]. However, a limitation of standards is that they are usually scoped to particular use cases. For example, standards that specify design representations do not directly correlate with controller readouts from the machines that manufacture physical instances of those designs. As a result, mappings must be manually encoded to enable a true digital thread [21]. Since these mappings are normally created for specific applications and system configurations, they are brittle to changes in equipment or modifications of the manufacturing process.

Achieving more interoperable and integrated software-based pipelines is especially challenging when two or more different application areas are involved, such as the integration of gaming-based models with engineering-based information [22]. For industrial XR, the goals of the underlying data formats and models fundamentally differ between the design and XR communities. For example, the design and manufacturing community values mathematically defined geometries and leans towards expressiveness, i.e., encoding high-quality geometric data to ensure watertight definitions. On the other hand, XR model expressions, such as glTF [9] and FBX [23], value efficiency and light-weighting. As a result, integrating annotations (and other critical authoring information) is left in the hands of those who develop XR scenes. This means developers are often scrambling to collect (often incomplete) data from disparate sources to construct scenes by “hand-prepping” authoring annotations.

2.1.1 Challenges With Industrial Extended Reality Data.

One key element of XR scenes is maintaining a reference frame to establish a common coordinate system. When utilizing XR in manufacturing, registering digital assets into a common reference frame presents additional challenges since other physical assets, such as robotic arms and computer numerical control (CNC) machines, may retain their own coordinate systems. Figure 2 provides an example of an Air Force-funded project that overcame this challenge for a demonstration [24]. In this example, an operator (Fig. 2(a)) interacts with a 6-DOF robotic arm to perform a repair process on a sample. Figure 2(a) illustrates the real-world coordinate system that must be considered to maintain safe collaboration between the human and robot. Figure 2(b) shows the operator’s view through the Microsoft HoloLens, with planning interactions overlaid. The coordinate system of the real world must be registered with the digital objects that the operator views in the headset to realize efficient interaction. Figure 2(c) provides a screenshot of the digital simulation of the robot’s process plan that must be coordinated with the human interactions from the physical world. Though coordination across multiple spatial and temporal scales is possible, the integration is often done on a one-off basis with limited applicability to other use cases.
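To make the registration problem concrete, the chain of transforms in a setup like Fig. 2 can be sketched with homogeneous coordinates. The frame names and poses below are hypothetical illustrations, not values from the project; the point is that once the robot and headset are each calibrated against a shared room frame, any pose can be re-expressed in any other frame by composing transforms:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibrated poses, both expressed in the shared room frame
room_T_robot = pose(np.eye(3), [2.0, 0.5, 0.0])    # robot base in the room
room_T_headset = pose(np.eye(3), [0.0, 1.5, 0.0])  # tracked headset in the room

# Robot base as seen from the headset: invert one chain and compose
headset_T_robot = np.linalg.inv(room_T_headset) @ room_T_robot

# A point at the robot's base origin, mapped into the headset frame
p_robot = np.array([0.0, 0.0, 0.0, 1.0])
p_headset = headset_T_robot @ p_robot
```

The fragility noted above enters through the calibrations themselves: any error in `room_T_robot` or `room_T_headset` propagates directly into where digital content appears to the operator.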

Fig. 2
Completed Air Force Manufacturing Technology project aimed to expedite the process planning for robotic-assisted thermal spray. The project represents an example of the challenge with coordinating process planning data across three coordinate systems: the physical room, the XR scene, and the robot’s controller [24].

Other data-related interoperability challenges arise from misalignment in goals between XR and manufacturing data. Currently, XR hardware is limited by the amount of data that can be presented to the user at any one time. As a result, 3D models must be decimated to support more efficient interaction and practical visualization. In manufacturing, the precision of 3D presentation is paramount for most use cases. This issue can be the root cause of errors in the presentation of annotations and other critical data, e.g., the bounding box of the repair region on which the robot in Fig. 2 must act. A negotiation between the two perspectives must take place, whether that means the XR installation must introduce additional computing power offline or encode an additional step in the virtual space to check the placement of planning instructions (i.e., Fig. 2(c)). Note that the tradeoff between model precision and XR device performance is driven by human perception and cognition [25]. Kruijff et al. [25] classified the causes of perceptual issues into five categories: environment, capturing, augmentation, display device, and user. Quantifying user-acceptable thresholds for model precision and device performance is very much use-case dependent. Ultimately, well-structured user studies must be designed and executed to set requirements for effective XR scenes [26].
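One pragmatic check in that negotiation can be sketched as follows: after decimating a model for XR presentation, verify that a precision-critical quantity, here the bounding box of a region of interest, survives within a use-case tolerance. The naive every-Nth-vertex decimation and the tolerance value below are purely illustrative assumptions:

```python
def bbox(points):
    """Axis-aligned bounding box of a point cloud as (min corner, max corner)."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def within_tolerance(b_full, b_dec, tol):
    """True if every bounding-box coordinate moved by at most tol."""
    return all(abs(a - b) <= tol
               for full_corner, dec_corner in zip(b_full, b_dec)
               for a, b in zip(full_corner, dec_corner))

# Hypothetical dense model vs. naive every-10th-vertex decimation
full = [(i * 0.001, (i % 7) * 0.002, 0.0) for i in range(10_001)]
decimated = full[::10]

ok = within_tolerance(bbox(full), bbox(decimated), tol=0.005)
```

In practice the checked quantity would be whatever the use case deems critical (a datum feature, an annotation anchor), and the tolerance would come from the user studies mentioned above rather than a fixed constant.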

2.1.2 Challenges With Industrial Extended Reality Models.

To realize interoperability in design and manufacturing systems, SDOs have released standards that address specific exchange and curation requirements per use case. For example, the STandard for the Exchange of Product Data (STEP), or the ISO 10303 series, is one of the most widely used representations (or data models) for exchanging product definitions, including geometries, kinematics, and PMI [27]. Though the affordances provided by STEP can cover most exchange challenges, computer-aided design (CAD) software packages lag in implementing the full capabilities provided therein. As a result, information loss often occurs at the stage of model packaging for exchange.

Figure 3 presents a depiction of a workflow that is affected by such model exchange challenges. In this example, we show a pipeline for developing animation-driven industrial XR scenes. Note that the complexity of such a pipeline can be greatly reduced by remaining in a single product lifecycle management (PLM) platform; however, in most DoD scenarios, multiple organizations are involved and remaining in a single platform with full access to all data structures cannot be ensured. In the scenario conveyed in Fig. 3, a set of developers interact with disparate software tools to generate assembly (.STP), animation (.DAE), and XR scenes. Often, information is lost at each translation and exchange step, introducing additional human labor. Also, the loss of information in this forward-feeding process eliminates the possibility of updating upstream processes if something changes. This example is meant to convey that current exchange models for industrial XR are not robust enough to ensure content adaptability.
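The one-way information loss in such a pipeline can be illustrated with a toy example. The dictionary keys below loosely mirror the kinds of content a rich product model carries versus what a lightweight run-time representation retains; the names and the split are our own simplification, not any actual format's schema:

```python
# Hypothetical rich product model, loosely mirroring what a STEP file can carry
design_model = {
    "geometry": {"type": "brep", "faces": 124},
    "kinematics": {"joints": ["J1", "J2"]},
    "pmi": {"flatness": "0.05 mm", "datum": "A"},  # annotations critical downstream
}

# Run-time formats tend to keep presentation-oriented content, little else
LIGHTWEIGHT_KEYS = {"geometry"}

def export_lightweight(model):
    """One-way, lossy export: only presentation-oriented keys survive."""
    return {k: v for k, v in model.items() if k in LIGHTWEIGHT_KEYS}

xr_asset = export_lightweight(design_model)
lost = set(design_model) - set(xr_asset)  # content unrecoverable downstream
```

Because `lost` cannot be reconstructed from `xr_asset`, any change discovered while authoring the XR scene has no path back to the upstream model, which is exactly the adaptability gap the figure conveys.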

Fig. 3
An example of disparate data exchange between software applications for industrial XR. In this example, developers step through different software tools to encode animations for the scene.

2.1.3 Challenges With Industrial Extended Reality Workflows.

To date, the overwhelming majority of workflows and use case demonstrations in engineering with XR utilize professional gaming engines, namely Unreal Engine [28] and Unity [29], for content creation. Engineers have used these powerful frameworks to create, develop, and deploy a wide range of applications. While offering a turnkey entry point for XR visualizations on a wide variety of platforms, there is a cost associated with data formatting and conversions into files more easily consumable by the engines. In some cases, these conversions may prove trivial, but for other, more exotic data types they can consume a significant part of the application’s development time. Furthermore, the learning curve for even the simplest form of content development may prove too large a barrier for a typical engineer with a limited-to-nonexistent game-development background looking to explore the use of XR for their application. As such, we see a need in the engineering community for the continued development of (semi-)generic yet accessible entry points into content creation geared towards engineers (as opposed to game developers). These new access points could come in the form of on-boarding applications with inputs for data formats commonly used within a given engineering discipline or, conversely, the increased support of XR ports from within existing software tools used inside the given discipline.

2.2 Approaches for Improving Interoperability for Industrial Extended Reality.

As previously mentioned, the community is aware of the significant interoperability challenge for industrial XR, having been noted [30] through industrial consortium activities, academic workshops, and research contributions. Specific efforts include (1) open-source efforts for mapping PMI from authoritative engineering design data, i.e., STP, to lightweight AR-amenable representations [31], (2) leveraging known techniques for encoding industrial animations in a faster, more updateable manner [32], and (3) developing open-source connectors to enable quick viewing of engineering simulations on XR [33]. Figure 4 provides a high-level perspective of AFRL’s Aerospace Analysis and Design in Virtual and Augmented Reality toolKit (AArDVARK), an in-house solution for porting simulation design quickly and robustly to XR environments. Figure 4(a) conveys that AArDVARK is able to convert a variety of design and engineering model types into AR-amenable representations for intuitive presentation. Figure 4(b) illustrates the web-based architecture of AArDVARK, specifically designed to minimize the software and hardware infrastructure necessary to step into an XR experience.

Fig. 4
Use case diagram (a) and architecture diagram (b) of AArDVARK, an Air Force Research Laboratory (AFRL) in-house development project to port design and analysis models into AR/VR [33]

All three efforts are supported with follow-on efforts by AFRL and anticipate significant buy-in from the other DoD services and partners. In short, additional work in this space is required to highlight platform-agnostic benefits for industrial XR installations. As standard data, model, and workflow structures continue to proliferate across industry, pressure on the large engineering software providers will continue to grow. Eventually, better conformance to global standards by the software providers can make significant impacts on the industrial XR scene development process.

For the remainder of the paper, we highlight some key research directions and opportunities for working towards more interoperable industrial XR systems.

3 Research Opportunities and Future Directions

To identify trends and potential technical roadmaps for industrial XR, we systematically recorded the many future directions proposed by the review papers conveyed in Table 1. We then manually (and subjectively) tagged each potential opportunity to uncover potential commonalities. We classified each research opportunity as near-term, mid-term, or far-term. Table 2 presents 40 research and development (R&D) directions conveyed across the review papers studied. These R&D efforts are not necessarily XR-driven opportunities; take “5G Infrastructure” for example. However, technological improvements in adjacent fields, e.g., optics, machine vision, and sensors, will enhance XR experiences. We identified 17 near-term, 13 mid-term, and 10 far-term opportunities.

Table 2

Alignment of future R&D opportunities with technology tags

Tags: Tracking | Testing | Hardware | Software | Ergonomics | Business | Human Machine Inter. | Industrial XR Uses | Computing Architecture | Real Time Performance | Big Data | User Experience

Near term:
Industrial Environment Field Testing [13,14]
Industrial Environment Marker Tracking [10,15]
Online Sensorization [11,16]
5G Infrastructure [16]
AR with Sensors [10,12]
Standardized Data Protocols [15]
Machine Learning [16]
Improved Processing Speed [15]
Hardware Miniaturization [15]
Immobile Devices for AR [10,13]
Extended Use Testing [14,15]
Lighter Mobile Devices [14]
Exterior Supported Handheld Devices [12]
Highly Interactive Interfaces [10]
AR with Haptics [12]
Gamification of AR Systems [14]
Improved Content Visibility [15]

Mid term:
Natural Marker Tracking [15]
Markerless Tracking [15]
CAD Feature Extraction [12]
Edge Computing [11,16]
Content Adaptability [15]
Predictive Maintenance [15]
Content Authoring Systems [12,15]
Natural Gesture Mapping [13,15]
Remote Collaboration [13]
Head Mounted Display Visual Fatigue [15]
Comfortable Head Mounted Displays with better FOV [12]
Easily Maintainable and Modifiable AR [12]
Task Customized User-Friendly Interfaces [10]

Far term:
Human Collaboration with Computational Intelligence [16]
Real Time Data Utilization from Manufacturing Systems [15]
AR for Smart Manufacturing Prognostics [16]
Human Machine Interaction Safety Through Path Planning [15]
Integrating AR into a Broader Data Infrastructure [15]
AR Data Collection for Process Improvement [12]
AR Contact Lenses [12]
Virtual Retinal Displays [12]
3D Hologram Projectors [12]
Adaptive AR Interfaces [10]

Leveraging our colleagues’ recommendations, we built out technology vignettes that mirror programmatic visions, linking the suggested R&D objectives as building blocks. The outcome of this exercise is conveyed in Figs. 5 and 6. Figure 5 illustrates potential research paths to improve data-driven processes for (and/or with) XR. Figure 6 captures potential research paths for improving human factors and ergonomics for XR projects. Below, we elaborate on far-term research goals that can help realize interoperability for industrial XR, indicated as the eventual end state of the bold-lined paths in Figs. 5 and 6. Those goals include the following:

  • human collaboration with computational intelligence,

  • AR for smart manufacturing prognostics,

  • integrating AR into a broader data infrastructure, and

  • adaptive AR interfaces.

Fig. 5
Potential research and development opportunities for industrial XR related to data-driven processes. Placement of boxes signaling opportunities is intentional (i.e., nearer-term goals are placed farther left on the x-axis). Shaded boxes indicate commonalities with Fig. 6.
Fig. 6
Potential research and development opportunities for industrial XR related to human factors and ergonomics. Placement of boxes signaling opportunities is intentional (i.e., nearer-term goals are placed farther left on the x-axis). Shaded boxes indicate commonalities with Fig. 5.

3.1 Human Collaboration With Computational Intelligence.

Human collaboration with computational intelligence is about leveraging AR so that human operators can easily interface with the automated systems present in a manufacturing environment. AR content that adapts to individual human operators and to the many systems present in a manufacturing environment enables easier real-time interfacing with computational intelligence. Such adaptability can only be achieved through standardized data protocols that coordinate the virtual data/content existing in AR with the real data of manufacturing systems. Hence, to achieve such a goal, we recommend additional work on leveraging existing and new standardized data protocols and representations to develop approaches that encode XR content more amenable to chain-to-chain updates [32].

3.2 Augmented Reality for Smart Manufacturing Prognostics.

Standardized data protocols provide coordination between and across digital environments and the real data of manufacturing systems. Implementing standardized data protocols within manufacturing activities, such as execution and control, facilitates the introduction of more advanced decision-making, such as the incorporation of data-driven modeling. Leveraging data-driven processes can provide the opportunity for greater industrial efficiency, e.g., improving prognostics and health management (PHM) with predictive maintenance, where machine learning is used to define maintenance epochs. Adding AR to predictive maintenance introduces a data presentation module on the floor and can avoid unnecessary time sinks.
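As a toy illustration of defining a maintenance epoch from floor data, consider flagging the first time a smoothed health indicator crosses a threshold. The signal, window size, and threshold below are notional stand-ins for what a learned model would actually provide; an AR layer would then surface the flagged epoch to the operator at the machine:

```python
def moving_average(signal, window):
    """Trailing moving average, shrinking the window at the start of the signal."""
    return [sum(signal[max(0, i - window + 1): i + 1]) /
            len(signal[max(0, i - window + 1): i + 1])
            for i in range(len(signal))]

def maintenance_epoch(signal, threshold, window=3):
    """Return the first index where the smoothed signal exceeds threshold, else None."""
    for i, v in enumerate(moving_average(signal, window)):
        if v > threshold:
            return i
    return None

# Notional vibration-based health indicator sampled from a machine controller
vibration = [0.1, 0.12, 0.11, 0.2, 0.45, 0.62, 0.8]
epoch = maintenance_epoch(vibration, threshold=0.4)
```

A production PHM system would replace the threshold rule with a trained degradation model, but the interface, a stream in and an epoch out, is what a standardized data protocol would need to carry to the AR presentation layer.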

3.3 Integrating Augmented Reality Into a Broader Data Infrastructure.

Standardized data protocols provide coordination between and across digital environments and the real data of manufacturing systems. Content authoring systems are the tools used to create AR scenes, including domain-specific annotations such as GD&T callouts and other PMI. A robust content authoring system that is built in and around standard data representations can facilitate a future state of industrial AR systems that are flexible, adaptable, and more updateable. Standard communication protocols facilitate more innovation through the discovery of underutilized data streams. That being said, we recommend additional work with user-centric experimentation for a deeper understanding of which data are appropriate to collect, modify, and visualize per use case.

3.4 Adaptive Augmented Reality Interfaces.

Per Fig. 6, we can summarize the future opportunities relevant to interoperability and human factors as adaptive AR interfaces.

3.4.1 Task Customized User-Friendly Interfaces.

AR haptics provides a mode of physical interaction between the user and the interface, including hand gestures, voice commands, and better peripheral proxies. Some AR experiences in design or training would be improved if haptics were utilized to provide physical feedback. This is the goal of task-customized user-friendly interfaces. The next step would be interfaces that are not only customized to a task but also adaptive to the current environment. One example of potential future work is the development of physics-based simulations coupled with live AR interactions to provide feedback to the user through various sensory modes. For such ideas, it is critical to keep the application space in mind to avoid safety hazards in dynamic and austere environments, such as DoD sustainment depots.

3.4.2 Computer-Aided Design Feature Extraction Tracking.

Compared to research lab settings, conditions within industrial environments are unpredictable and suboptimal, e.g., inconsistent lighting and visibility. These characteristics pose unique challenges for the marker tracking required in much of AR. Ultimately, the insight developed from overcoming lighting challenges will allow advanced tracking techniques, such as CAD feature extraction tracking, to be used in an industrial setting. An AR system could use this improvement to provide interfacing options to the user that adapt to the part and feature displayed in front of them. Note that feature-based tracking exists to some degree yet remains computationally expensive and unreliable. Tradeoff studies between feature identification modes are warranted, specifically under the conditions presented in a manufacturing setting.
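The computational expense noted above is easy to see in the matching step alone: brute-force descriptor matching is O(N·M) distance evaluations per frame before any pose estimation happens. The sketch below uses random arrays as stand-ins for real feature descriptors (e.g., ORB or SIFT output) and applies a Lowe-style ratio test to reject ambiguous matches.

```python
import numpy as np

rng = np.random.default_rng(0)
model_desc = rng.normal(size=(500, 32))  # stand-in: descriptors from the CAD model
frame_desc = rng.normal(size=(400, 32))  # stand-in: descriptors from the camera frame

def match_ratio_test(a: np.ndarray, b: np.ndarray, ratio: float = 0.75):
    """Brute-force L2 matching with a Lowe-style ratio test: keep a match
    only when the best candidate is clearly better than the runner-up."""
    # Pairwise distance matrix (N, M) -- the per-frame cost driver.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    best, second = order[:, 0], order[:, 1]
    rows = np.arange(len(a))
    keep = d[rows, best] < ratio * d[rows, second]
    return [(i, best[i]) for i in np.flatnonzero(keep)]

matches = match_ratio_test(model_desc, frame_desc)
print(f"{len(matches)} matches from {500 * 400} distance evaluations")
```

Under poor lighting, fewer descriptors survive the ratio test, which is one reason tradeoff studies across feature identification modes matter in manufacturing settings.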

4 Call to Action

Interoperability continues to hinder progress for industrial XR systems. One challenge for improvement is the lack of “ownership” of the problem itself. Commonizing data structures for industrial XR does not nicely fit into a single swimlane of activities. Often, the work itself is far too engineering-focused to be considered as a prioritized objective of an academic laboratory. At the same time, the incentive/value structure is lacking for software providers to fit the necessary effort into their development roadmaps. That being said, we believe that as industrial XR continues to proliferate, the cost burden of development time to overcome data exchange challenges could incentivize significant change.

As a result, industrial XR scene developers are the stakeholders who face the most pain. Much like the adoption and dissemination of smart manufacturing (or Industry 4.0) best practices, full realization of interoperable industrial XR requires internationally recognized standards that offer practical solutions without hindering innovation. We recommend additional testing and development of ecosystems that leverage standards across application domains. Standards groups of particular merit include the Open Geospatial Consortium and the Khronos Group. Historically, the communities associated with these SDOs, i.e., the geospatial information science industry and the gaming/entertainment industry, have not worked together as much as their alignment of technical interests would suggest. Recently, the launch of the Metaverse Standards Forum [34] seems promising, as it provides an informal mechanism for cross-domain collaboration.

Additional development in the integration and linking of such standard data representations will uncover more scalable solutions for industrial XR. We hope for additional engagements that incorporate more domain-specific challenges, such as those faced by exchanging data from secure engineering systems to the edge. Designing the right balance between cloud-based (i.e., more centralized) and edge-based (i.e., less centralized) decision-making is an active research area [35]. Reducing up-front (or at-the-edge) work enhances system scalability and reduces the overall cost of computation. Satyanarayanan et al. [36] presented the GigaSight framework, an edge architecture that leverages cloudlets to perform computer vision analytics in near real time. Extension of such work in a secure environment would be a significant leap for DoD applications. Similar momentum can be seen in communities interested in delivering complex machine learning algorithms on small edge devices, such as the tinyML consortium.
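The cloud-versus-edge balance is, at its core, a latency budget. The sketch below compares transfer-plus-cloud time against local edge compute time; all constants are illustrative, and a real deployment would also weigh security boundaries, link variability, and energy.

```python
def offload_to_cloud(payload_mb: float, uplink_mbps: float,
                     edge_s: float, cloud_s: float) -> bool:
    """True when shipping the payload and computing in the cloud is faster
    than computing locally at the edge."""
    transfer_s = payload_mb * 8.0 / uplink_mbps  # MB -> Mb, then divide by rate
    return transfer_s + cloud_s < edge_s

# A 50 MB frame batch over a 100 Mb/s link: the 4 s transfer dominates,
# so the edge wins even though the cloud computes faster.
print(offload_to_cloud(50.0, 100.0, edge_s=3.0, cloud_s=0.5))
```

For secure DoD environments, the same comparison shifts further toward the edge, since crossing a security boundary adds both latency and accreditation cost to the transfer term.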

If the relevant research disciplines appropriately rally, we envision a frictionless XR ecosystem wherein data are passed in real time with no loss of information or fidelity. To achieve this vision, two routes are possible: (1) the deployment of a single platform “to rule them all” or (2) the wide adoption of neutral and open data formats. The latter seems more tenable and sustainable, as evidenced by the wide adoption of open XR-relevant standards, such as OpenXR and CityGML. Additionally, we believe that the aforementioned manufacturing standards should also sit at the heart of discussions moving forward. Per Fig. 5, Standardized Data Protocols is the R&D opportunity that presents the best potential return on investment, as it could represent the start of a path to each of the six captured far-term opportunities:

  • Human Collaboration with Computational Intelligence,

  • Real Time Data Utilization from Intelligent Manufacturing Information Systems,

  • AR for Smart Manufacturing Prognostics,

  • Human Machine Interaction Safety Through Path Planning Algorithms,

  • Integrating AR into a Broader Data Infrastructure, and

  • AR Data Collection for Process Improvement.

To improve the adoption of standards, further development of robust mappings across standard data formats is required. Such mappings could be achieved through structured queries [37], inference making through ontologies [38], and ad hoc semantic-based language interpretation [39], to name a few directions.
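At its simplest, such a mapping is a structured renaming between the vocabularies of two standards. The sketch below moves a STEP-style PMI record into the keys an XR run-time might expect; the field names on both sides are illustrative, not taken from either standard.

```python
# Hypothetical field correspondence between a STEP-style PMI record and the
# keys an XR run-time consumes (e.g., inside glTF "extras").
STEP_TO_GLTF = {
    "tolerance_type": "type",
    "tolerance_value": "value_mm",
    "datum_references": "datums",
}

def map_pmi(step_record: dict, mapping: dict = STEP_TO_GLTF) -> dict:
    """Rename STEP-style PMI fields into run-time keys; unmapped fields
    are preserved under their original names."""
    return {mapping.get(k, k): v for k, v in step_record.items()}

record = {"tolerance_type": "flatness", "tolerance_value": 0.02,
          "datum_references": ["A"]}
print(map_pmi(record))
```

Structured queries, ontologies, and language interpretation each generalize this table: they derive or infer the correspondence rather than hand-coding it.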

At the Air Force Research Laboratory, we intend to press forward in illuminating fundamental challenges related to industrial XR coupled with agile robotic systems. We anticipate that achieving more interoperable industrial XR systems will be a key goal and focus of our work.

Acknowledgment

We recognize those who have contributed in the standards community, dedicating time and effort to improve interoperability. Standards are a key aspect of a path forward to realizing interoperable, scalable, and agile industrial XR.

Disclaimer

Distribution A. Approved for public release: Distribution unlimited under AFRL-2023-0707 determined on 08 Feb 2023.

Conflict of Interest

There are no conflicts of interest.

Data Availability Statement

The authors attest that all data for this study are included in the paper.

Nomenclature

AFRL = Air Force Research Laboratory
AP = application protocol
API = application programming interface
AR = augmented reality
AArDVARK = Aerospace Analysis and Design in Virtual and Augmented Reality toolKit
CAD = computer-aided design
CNC = computer numerical control
DoD = Department of Defense
DOF = degree of freedom
FBX = FilmBox 3D File Format
glTF = Graphics Language Transmission Format
GD&T = geometric dimensioning and tolerancing
HMD = head mounted display
ISO = International Organization for Standardization
MR = mixed reality
OGC = Open Geospatial Consortium
PHM = prognostics and health management
PMI = product manufacturing information
RDF = resource description framework
RV = reality-virtuality
SDO = standards development organization
STEP = STandard for the Exchange of Product Model Data
VR = virtual reality
XR = extended reality
3D = three-dimensional

References

1. Lele, A., 2013, “Virtual Reality and Its Military Utility,” J. Ambient Intell. Humanized Comput., 4(1), pp. 17–26.
2. Gilbert, A., 2016, “Augmented Reality for the US Air Force,” International Conference on Virtual, Augmented and Mixed Reality (VAMR 2016), Toronto, Canada, July 17–22, Springer, pp. 375–385.
3. Swann, D., 2005, “Chapter 63,” Geographical Information Systems: Principles, Techniques, Management and Applications, 2nd Edition, Abridged, P. A. Longley, M. F. Goodchild, D. J. Maguire, and D. W. Rhind, eds., Vol. 2, Wiley, Hoboken, NJ, pp. 889–899.
4. Milgram, P., Takemura, H., Utsumi, A., and Kishino, F., 1995, “Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum,” Proc. SPIE 2351, Telemanipulator and Telepresence Technologies, Boston, MA, Oct. 31–Nov. 4, 1994, Vol. 2351, SPIE, pp. 282–292.
5. Lu, Y., Morris, K. C., and Frechette, S., 2016, “Current Standards Landscape for Smart Manufacturing Systems,” National Institute of Standards and Technology, NISTIR, Gaithersburg, MD, 8107(3).
6. Perey, C., 2015, “Open and Interoperable Augmented Reality and the IEEE [Standards],” IEEE Consumer Electron. Mag., 4(4), pp. 133–135.
7. MTConnect Institute, 2014, MTConnect Standard. Accessed Mar. 31, 2017.
8. ASME Y14.41-2012, 2012, Digital Product Definition Data Practices, American Society of Mechanical Engineers, New York.
9. Bhatia, S., Cozzi, P., Knyazev, A., and Parisi, T., 2021, glTF 2.0 Specification, Tech. Rep., Khronos Group, 2017, https://www.khronos.org/gltf/.
10. Nee, A., Ong, S., Chryssolouris, G., and Mourtzis, D., 2012, “Augmented Reality Applications in Design and Manufacturing,” CIRP Ann., 61(2), pp. 657–679.
11. Fraga-Lamas, P., Fernández-Caramés, T. M., Blanco-Novoa, O., and Vilar-Montesinos, M. A., 2018, “A Review on Industrial Augmented Reality Systems for the Industry 4.0 Shipyard,” IEEE Access, 6(1), pp. 13358–13375.
12. Palmarini, R., Erkoyuncu, J. A., Roy, R., and Torabmostaedi, H., 2018, “A Systematic Review of Augmented Reality Applications in Maintenance,” Rob. Comput.-Integr. Manuf., 49(1), pp. 215–228.
13. Bottani, E., and Vignali, G., 2019, “Augmented Reality Technology in the Manufacturing Industry: A Review of the Last Decade,” IISE Trans., 51(3), pp. 284–310.
14. de Souza Cardoso, L. F., Mariano, F. C. M. Q., and Zorzal, E. R., 2020, “A Survey of Industrial Augmented Reality,” Comput. Ind. Eng., 139(1), p. 106159.
15. Egger, J., and Masood, T., 2020, “Augmented Reality in Support of Intelligent Manufacturing ‘A Systematic Literature Review’,” Comput. Ind. Eng., 140(1), p. 106195.
16. Baroroh, D. K., Chu, C.-H., and Wang, L., 2021, “Systematic Literature Review on Augmented Reality in Smart Manufacturing: Collaboration Between Human and Computational Intelligence,” J. Manuf. Syst., 61(1), pp. 696–711.
17. Janssen, M., Estevez, E., and Janowski, T., 2014, “Interoperability in Big, Open, and Linked Data—Organizational Maturity, Capabilities, and Data Portfolios,” Computer, 47(10), pp. 44–49.
18. Ray, S. R., and Jones, A. T., 2006, “Manufacturing Interoperability,” J. Intell. Manuf., 17(6), pp. 681–688.
19. Zeid, A., Sundaram, S., Moghaddam, M., Kamarthi, S., and Marion, T., 2019, “Interoperability in Smart Manufacturing: Research Challenges,” Machines, 7(2), p. 21.
20. Lu, Y., Xu, X., and Wang, L., 2020, “Smart Manufacturing Process and System Automation—A Critical Review of the Standards and Envisioned Scenarios,” J. Manuf. Syst., 56(1), pp. 312–325.
21. Hedberg, T., Lubell, J., Fischer, L., Maggiano, L., and Barnard Feeney, A., 2016, “Testing the Digital Thread in Support of Model-Based Manufacturing and Inspection,” ASME J. Comput. Inf. Sci. Eng., 16(2), p. 021001.
22. Scholz, J., Bernstein, W. Z., and Radkowski, R., 2022, “Research Directions for Merging Geospatial Technologies With Smart Manufacturing Systems,” Smart Sustain. Manuf. Syst., 6(1), p. 226.
23. Autodesk, 2023, FBX – FilmBox 3D File Format.
24. ARM Institute, 2022, “Project Highlight: Virtual Part Repair Programming for Robotic Thermal Spray Applications,” https://arminstitute.org/news/project-highlight-virutal-part-repair/.
25. Kruijff, E., Swan, J. E., and Feiner, S., 2010, “Perceptual Issues in Augmented Reality Revisited,” 2010 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Seoul, South Korea, Oct. 13–16, IEEE, pp. 3–12.
26. Elmqvist, N., and Yi, J. S., 2012, “Patterns for Visualization Evaluation,” BELIV ’12: Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors – Novel Methods for Visualization, Seattle, WA, Oct. 14–15, pp. 1–8.
27. Kemmerer, S. J., 1999, “STEP: The Grand Experience,” NIST Special Publication (NIST SP).
28. Epic Games, 2023, The Unreal Engine, https://www.unrealengine.com.
29. Unity Technologies, 2023, Unity3D Game Engine, https://unity.com/.
30. Perey, C., and Bernstein, W. Z., 2022, “A Research Agenda for Enterprise Augmented Reality,” 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, Mar. 12–16, IEEE, pp. 477–480.
31. Vernica, T., Lipman, R., Kramer, T., Kwon, S., and Bernstein, W. Z., 2022, “Visualizing Standardized Model-Based Design and Inspection Data in Augmented Reality,” ASME J. Comput. Inf. Sci. Eng., 22(4), p. 041001.
32. Mirzaiee, R., Vernica, T., Scheuringer, K., and Bernstein, W. Z., 2022, “Towards Retargetable Animations for Industrial Augmented Reality,” 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, Mar. 12–16, IEEE, pp. 872–873.
33. Durscher, R., Pankonien, A. M., and Bhagat, N., 2019, “AArDVARK: Aerospace Analysis and Design in Virtual and Augmented Reality ToolKit,” Proceedings of the AIAA Aviation 2019 Forum, Dallas, TX, June 17–21, p. 3560.
34. Metaverse Standards Forum, 2023, “Metaverse Standards Forum,” https://metaverse-standards.org/.
35. Satyanarayanan, M., 2017, “The Emergence of Edge Computing,” Computer, 50(1), pp. 30–39.
36. Satyanarayanan, M., Simoens, P., Xiao, Y., Pillai, P., Chen, Z., Ha, K., Hu, W., and Amos, B., 2015, “Edge Analytics in the Internet of Things,” IEEE Pervasive Comput., 14(2), pp. 24–31.
37. Kwon, S., Monnier, L. V., Barbau, R., and Bernstein, W. Z., 2020, “Enriching Standards-Based Digital Thread by Fusing As-Designed and As-Inspected Data Using Knowledge Graphs,” Adv. Eng. Inform., 46(1), p. 101102.
38. Kulvatunyou, B., Wallace, E., Kiritsis, D., Smith, B., and Will, C., 2018, “The Industrial Ontologies Foundry Proof-of-Concept Project,” Advances in Production Management Systems, Smart Manufacturing for Industry 4.0: IFIP WG 5.7 International Conference, APMS 2018, Seoul, Korea, Aug. 26–30, Proceedings, Part II, Springer, pp. 402–409.
39. Brundage, M. P., Sexton, T., Hodkiewicz, M., Dima, A., and Lukens, S., 2021, “Technical Language Processing: Unlocking Maintenance Knowledge,” Manuf. Lett., 27(1), pp. 42–46.