Rendering of a fully automatic coffee machine with a display, and a smartphone showing the same display.

Mixed Prototyping: Test your UI with AR

Alyona Morozova, UX Designer

21/10/2020 • 8 minutes reading time

Imagine this…

You have an idea or a product in an early development stage. Now you want to know if it fulfills its purpose and gain a better understanding of its future look and feel. You also want to test its acceptance with real users.

Not yet developed, your product leaves no chance to be tested conventionally. But you would still like to know in advance whether the design is worth further development.

What would you do?

"The technologies that we design do not, and never will, exist in a vacuum." — Bill Buxton

Leaves room for improvement: classic prototyping

Typically, usability professionals start by defining the product goals and user groups, and continue with developing prototypes for testing. Prototypes, as opposed to the final product, need not be fully functional. Basically, they only need to fulfill two goals: look like the product and let researchers test a selected number of scenarios.

So far, so good.

But imagine you’re prototyping software or an embedded UI for an industrial machine or a complex environment. In this context, the impression the machine makes and the interaction with it might be crucial to the success of your interface.

Will testing your idea in a neutral environment deliver a realistic experience?
How could you efficiently test an interface without its respective hardware?

Well… Mixed Reality might be the answer.

When the project started, we used the HoloLens 1 — the second generation was not available yet

Exploring new possibilities: Prototyping in Mixed Reality

There are various ways of prototyping, ranging from completely physical to completely virtual scenarios.

Before Microsoft’s HoloLens 2 was released, we explored augmenting the hardware to the testing scenario using the first generation of HoloLens.

The idea was to give the user a more realistic understanding of the size and appearance of a product, i.e., the hardware, plus a real display to interact with.

Imagine the basic workflow as follows:

  1. A designer creates a digital UI prototype in a prototyping tool and exports it to a web format (HTML, CSS, JS).

  2. The resulting export is used as the input for a hologram that represents a physical device.

  3. During the testing, the user wears a HoloLens. He or she interacts with the interface through the browser on a smartphone (or any other web-enabled device) and gets instant animation or sound feedback.

  4. The machine surrounding the interface is added as a hologram around the display via the HoloLens.

  5. Both the hologram and the interface are now perceived as one system, so the interaction and the interface can be tested more realistically.

Hence, designers and developers can test a ready-to-interact product more quickly.

Of course, we have tested the concept at Ergosign. Here’s how:

Our testing scenario: a new interface for a coffee machine

Working in the background: the technology

We set up an IPC network for the interaction between the two devices (the screen and the HoloLens) and the user. It includes three major components:

  1. NodeJS Server

  2. Socket.io Webclient

  3. TCP .NET Client

The NodeJS server runs on a computer, and two clients run on the smartphone and the HoloLens.
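
To give an idea of how little glue code such a bridge needs, here is a minimal sketch of the server side. The port numbers and the "ui-event"/"ack" message names are our illustrative assumptions, not the original project code:

  // server.js: bridges the Socket.io web client and the TCP (.NET/UWP) client.
  // Ports and the "ui-event"/"ack" names are illustrative assumptions.
  const net = require('net');
  const { Server } = require('socket.io');

  const io = new Server(3000, { cors: { origin: '*' } });

  // Latest UI state; the HoloLens client polls it over raw TCP.
  let state = { action: 'idle' };

  io.on('connection', (socket) => {
    socket.on('ui-event', (data) => {
      state = data;              // e.g. { action: 'brew-coffee' }
      socket.emit('ack', state); // instant feedback for the web UI
    });
  });

  // Plain TCP endpoint for the Unity/UWP client on the HoloLens.
  net.createServer((conn) => {
    conn.on('data', () => {
      // The UWP client requests the current state; answer with JSON.
      conn.write(JSON.stringify(state) + '\n');
    });
  }).listen(3001);

Splitting the transport this way is convenient: the browser speaks WebSockets natively, while a plain TCP socket is straightforward to consume from .NET code running in a UWP app.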

Schematic drawing of the communication between the server, the clients, and the HoloLens

We used WebSockets, a TCP-based network protocol, for the web browser client, while the Unity/UWP (Universal Windows Platform) client on the HoloLens communicates over a plain TCP connection.


The web client sends a message to the server, while the UWP client constantly requests state updates from the server.
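
On the smartphone side, the exported prototype only needs a few lines to talk to the server. A minimal sketch, assuming the server above (the address and event names are again our own):

  // client.js: runs inside the browser-based UI prototype on the smartphone.
  // Assumes the Socket.io client library is loaded via a <script> tag.
  const socket = io('http://192.168.0.10:3000'); // example server address

  // Forward a user interaction to the server ...
  function send(action) {
    socket.emit('ui-event', { action: action, at: Date.now() });
  }

  // ... and play instant feedback (animation or sound) once it is acknowledged.
  socket.on('ack', (state) => {
    document.body.classList.add('busy-' + state.action); // triggers a CSS animation
  });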

The input received from the WebSocket client serves as a trigger for the animation of the virtual prototype. If a user starts a machine process, the machine hologram reacts accordingly.

Our 3D models are created in Blender or Autodesk 3ds Max; all interactions and particle systems are implemented in Unity.

We also used HammerJS for recognizing touch interactions from the web client and mapping a virtual model to the UI with the help of Vuforia SDK in Unity.
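
As an illustration, the gesture handling with HammerJS could look like the following sketch; the button id and action names are hypothetical, but the mapping mirrors the scenarios in the showcase below:

  // touch.js: maps HammerJS gestures to prototype events,
  // reusing the send() helper from the client sketch above.
  const button = document.getElementById('coffee-button'); // hypothetical element
  const hammer = new Hammer(button);

  hammer.on('tap', () => send('brew-coffee'));        // short tap: regular coffee
  hammer.on('press', () => send('brew-coffee-milk')); // long press: coffee with milk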

This project started before the second generation of the HoloLens was released and delivered, so the machine is rendered as a hologram with the first generation of the Microsoft HoloLens. We chose this headset because it is wireless and hands-free, offers a decent resolution, and can be considered an industry standard. And because technology can have a big impact, we’re curious about how the HoloLens 2 will change the game!

Let’s go: our showcase

We built a prototype of a coffee machine to demonstrate how the whole setup should function:

The hologram of a coffee machine is anchored around the physical display of a smartphone. The smartphone is mounted on a tripod, with the UI opened in its browser. Because the interface is designed for a touch display, the user interaction experience is as close as possible to a real interaction with an embedded display.

The user is welcome to play around with the prototype and is offered classic scenarios, such as

  • making a regular cup of coffee (pressing a button and observing coffee pouring into the cup)

  • adding milk to it (pressing the button longer)

  • refilling the coffee container after getting an error message.

This setup also allows testing how the digital UI accounts for possible safety issues. For example, having a user make herself a cup of tea requires observing that it is poured from another tap.

Summary: the numerous benefits

Remote usability testing has a high chance of becoming the new standard, and it requires a reusable, mobile, yet robust setup for smooth and effortless interaction.

It will provide the main stakeholders, namely designers, researchers, clients, and system users, with common ground.

It will also equip designers with a tool for better understanding the underlying physical product and for testing design ideas and interaction techniques early on.
At the same time, this setup allows researchers to test whether the interface meets users’ needs.

Our project is the first step towards implementing virtual prototypes in a normal environment.

Could remote testing be interesting for your project, too? Feel free to contact us!