The story of Alicia, Bob, Carrie and Dave. Credit: N. Hanacek/NIST

Industrial XR: Fulfilling Human Potential in Smart Factories

--

Bill Bernstein, Engineer, National Institute of Standards and Technology

Teodor Vernica, Research Associate, National Institute of Standards and Technology

Step into the factory of the future. Alicia, an operations manager, sits at her workstation viewing a digitally enhanced video feed of the facility captured by cameras installed in strategic locations. Wearing safety gear, a maintenance engineer named Bob checks his tablet for the next machine to fix. Equipped with a headset and controllers, Dave, a software engineer at HQ, serves as a virtual tour guide for Carrie, the company’s lead executive. Wearing an augmented reality (AR) headset, Carrie surveys her machines as she walks through the facility. With Dave’s guidance, she sees digital information, such as a machine’s status, appearing in her view.

Because each of them can experience a virtual overlay on the physical environment that provides context relevant to their job, these co-workers can realize their potential as a team through industrial extended reality (XR), an umbrella term that encompasses a spectrum of technologies from partially immersive AR to completely immersive virtual reality.

This factory might be hard to imagine, but each technology already exists. What’s missing are standard formats, protocols and guidelines for them to work seamlessly with one another. In other words, the communication channels among these technologies remain shut.

The National Institute of Standards and Technology (NIST) is helping U.S. factories realize this vision through advanced information technology. Vital to this transformation are standards. Standards enable efficient organization of data and create a common language for machines and people to communicate. The factory of the future will require people to work closely and collaboratively with complex machines, but many unanswered questions remain. How do we pin down the precise locations of people and machines? How do we present the right information at the right time about machines to people? And, how can we do it all safely?

Leveraging industrial XR, manufacturers can present data in a spatially relevant way that takes advantage of people’s senses and natural ability to instantly recognize objects. Our fondest childhood memories, for example, often start with a specific time and place: one of us (Bill) will never forget sitting on his father’s shoulders during an afternoon at Disney World, watching his hero Mickey Mouse lead the parade. In the same way, we can easily describe the spatial context of our workplace.

Data standards play a key role in supporting the pragmatic use of industrial XR. Our Product Lifecycle Data Exploration and Visualization project explores new approaches to bridging the communication gap between human and machine, using XR as a platform for presenting standard data representations to workers on the factory floor. By building on existing standards from both the manufacturing and IT communities, we develop more precise, interoperable and efficient approaches for industrial XR experiences.

Currently, our research focuses on delivering contextual views of machines to human workers by correlating location and performance data. In doing so, we have already realized many of the activities performed by the stars of our story: Alicia, Bob, Carrie and Dave.

Tablet view of navigable path in a factory. Credit: B. Bernstein/T. Vernica/NIST

We recently published an article that demonstrates the use of standard geospatial representations to calculate the best path for a worker to travel — just as Carrie is being led through the facility by Dave.
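
To make the idea concrete, here is a minimal sketch of the kind of path computation involved, assuming the facility’s rooms and corridors have already been extracted into a weighted cell-adjacency graph. The cell names and distances below are purely illustrative and are not drawn from our published work; in practice the graph would be derived from a standard geospatial representation of the building rather than hard-coded.

```python
import heapq

# Illustrative cell-adjacency graph: nodes are indoor cells (rooms, corridors),
# edges are traversable connections weighted by walking distance in meters.
# These names and weights are hypothetical placeholders.
GRAPH = {
    "lobby":      {"corridor_a": 12.0},
    "corridor_a": {"lobby": 12.0, "assembly": 8.5, "corridor_b": 15.0},
    "corridor_b": {"corridor_a": 15.0, "milling": 6.0},
    "assembly":   {"corridor_a": 8.5},
    "milling":    {"corridor_b": 6.0},
}

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over the cell-adjacency graph."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, dist in graph[node].items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + dist, neighbor, path + [neighbor]))
    return float("inf"), []

# Guide a visitor from the lobby to the milling machine.
cost, path = shortest_path(GRAPH, "lobby", "milling")
print(f"{' -> '.join(path)}  ({cost:.1f} m)")
```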

Central to our approach is the use of standards for digitally representing geospatial layouts. Our project builds on standards published by the Open Geospatial Consortium (OGC) that provide guidelines for location-based services. An OGC standard called IndoorGML makes it possible for multiple devices to access the same digital content relative to a shared spatial context. Our approach enables the integration of a variety of technologies, including hardware such as webcams, tablets and headsets, and software such as computer vision toolkits and game engines. Note that our approach requires up-front time to create a spatial representation of a facility. Much like blueprints for a construction project, well-defined plans are critical to robust communication.
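
As a rough illustration of what a shared spatial context buys us, the sketch below shows how a virtual annotation defined once in a common facility coordinate frame could be mapped into each device’s local frame, assuming every device tracks its own pose in that shared frame. The device names, poses and anchor coordinates are hypothetical, not taken from our demonstration.

```python
import numpy as np

def pose_matrix(translation, yaw_deg):
    """Build a 4x4 device-to-facility pose from a translation and a
    rotation about the vertical axis. Illustrative values only."""
    yaw = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(yaw), -np.sin(yaw), 0],
                 [np.sin(yaw),  np.cos(yaw), 0],
                 [0,            0,           1]]
    T[:3, 3] = translation
    return T

# A virtual label anchored next to a machine, expressed once in the shared
# facility coordinate frame (the frame the spatial model is defined in).
anchor_facility = np.array([4.2, 7.5, 1.6, 1.0])  # homogeneous coordinates

# Each device tracks its own pose in that same facility frame.
device_poses = {
    "carrie_headset": pose_matrix([3.0, 6.0, 1.7], yaw_deg=90),
    "bob_tablet":     pose_matrix([8.0, 7.0, 1.4], yaw_deg=180),
}

# To render the label, each device maps the shared anchor into its local
# frame by inverting its own device-to-facility pose.
for name, pose in device_poses.items():
    local = np.linalg.inv(pose) @ anchor_facility
    print(name, "sees the anchor at local coordinates", np.round(local[:3], 2))
```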

Demonstration of multiple devices referencing a combination of virtual-physical objects defined by a standard geospatial representation. Credit: B. Bernstein/T. Vernica/NIST

Manufacturing is just one domain that benefits from XR technologies. At NIST, researchers are leveraging XR to help solve other important problems, including understanding the spread of wildfires, enabling AR for first responders using smart helmets, and easing the training of industrial robots. NISTers interested in XR belong to the recently announced NIST XR Community of Interest, where we share our cross-disciplinary experiences and skills. We encourage you to check out our website to learn more.

We anticipate that XR technologies will soon become a staple of modern daily life. Smartphones, for example, already incorporate embedded hardware that makes AR ubiquitous. To support the efficient and safe use of such technologies, standards are critical. XR technologies, like AR, are inherently designed to augment people’s awareness of what’s around them. Fulfilling that technological potential requires proper placement of things, both virtual and physical, and efficient communication between them. In turn, the technology will help people fulfill their own potential: to do more, and to do it better and more safely.

This post originally appeared on Taking Measure, the official blog of the National Institute of Standards and Technology (NIST) on July 14, 2020.

To make sure you never miss our blog posts or other news from NIST, sign up for our email alerts.

About the Authors

Bill Bernstein leads the Product Lifecycle Data Exploration and Visualization project as part of the Model-Based Enterprise Program. Recently, Bill deployed the Digital Information Visualization and Exploration (DIVE) Lab to explore the use of visualization modalities in manufacturing environments. In 2019, Bill received the Young Engineer Award from the ASME Computers in Engineering Division for his research on computer-supported tools that ease the integration of manufacturing and product design knowledge.

Teodor Vernica is a research associate at the National Institute of Standards and Technology (NIST). He received his M.Sc. in computer science from Aarhus University in Denmark, where he focused on augmented reality (AR) applications. At NIST, Teodor is studying how standards can facilitate the development and adoption of Industrial AR solutions for smart manufacturing.
