Trail planner designed for the AR era
Brisa is an experimental app developed in Unity. It was built to test the boundaries of AR technology and to see how intuitive it feels to everyday users. As a designer specializing in mobile, I was happy to take on that challenge.

The main challenge was figuring out how to translate UI patterns into 3D space. I didn't want to simply copy 2D interfaces; that approach, while practical, ignores most of AR's potential. Augmented reality apps can be so much more than a couple of iPads hanging in the air.
Four stages of UI design, starting with a flat interface and progressing towards a fully 3D UI
Prototyping
To tackle this problem, I started with a flat UI and gradually worked my way towards a fully 3D interface. Each phase started with simple sketches, which were then recreated in ShapesXR. The final mockup was then imported into Unity.
Research plan and app development
The main objective was to understand how users interact with 3D interfaces. The secondary goal was to test how various UI patterns translate into 3D space.

After an initial competitive analysis to understand the AR space and platform standards, I moved on to the development phase.
Research participants
Selected patterns
I consulted game developers and built a working front-end in Unity to make the experience as realistic as possible. I chose the Meta Quest 3 for deployment because of its ability to show apps over a passthrough video feed.
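For context, passthrough itself needs very little code on the Unity side. The sketch below is only an illustration assuming the Meta XR (Oculus Integration) SDK, where an OVRPassthroughLayer component composites the camera feed behind rendered content; exact component names and setup steps vary between SDK versions.

```csharp
// Sketch: making the Quest 3 camera feed visible behind the app.
// Assumes the Meta XR / Oculus Integration SDK; passthrough support must also be
// enabled on the OVRManager component, and names can differ between SDK versions.
using UnityEngine;

public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        // OVRPassthroughLayer composites the headset's camera feed with rendered content;
        // its placement (underlay vs. overlay) is configured on the component itself.
        gameObject.AddComponent<OVRPassthroughLayer>();
    }
}
```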

I recruited 6 participants, chosen based on their AR/VR literacy. Their skill levels ranged from “I’ve never seen a headset before” to “cooking in AR to watch YouTube on the wall”.

In-depth interviews were followed by usability testing. I wrote a simple test scenario relying on basic UI patterns.
Key findings
Research revealed that less advanced users struggle with the transition between distance grab and direct grab. While one of those interactions is in progress, the competing interaction scripts should be disabled.
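In Unity terms, that arbitration can be a tiny script. The sketch below is only an illustration: DirectGrab and DistanceGrab are hypothetical stand-ins for whatever grab interactors the XR SDK actually provides.

```csharp
using UnityEngine;

// Hypothetical stand-ins for the SDK's grab components; a real project would use
// the interactors shipped with the XR SDK in question.
public abstract class GrabInteraction : MonoBehaviour
{
    public bool IsGrabbing { get; protected set; }
}

public class DirectGrab : GrabInteraction { /* near-field, hands-on grabbing */ }
public class DistanceGrab : GrabInteraction { /* ray-based grabbing from afar */ }

// While one grab interaction is in progress, disable the competing one
// so the two gestures never fight over the same object.
public class GrabArbiter : MonoBehaviour
{
    [SerializeField] private DirectGrab directGrab;
    [SerializeField] private DistanceGrab distanceGrab;

    void Update()
    {
        directGrab.enabled = !distanceGrab.IsGrabbing;
        distanceGrab.enabled = !directGrab.IsGrabbing;
    }
}
```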

Beginners often have trouble entering and exiting a grab gesture. Experienced users know they need to be deliberate with their gestures, exaggerating them so they are recognized correctly. Disabling recognition for all fingers other than the index makes a big difference here.
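For illustration, with hand tracking this can be as simple as polling only the index finger's pinch state. The sketch below assumes the Oculus Integration's OVRHand component; the rest of the gesture pipeline is omitted.

```csharp
// Sketch: treat only an index-finger pinch as a grab gesture, ignoring the
// other fingers so sloppy hand poses don't produce false positives.
// Assumes the Oculus Integration's OVRHand component for hand tracking.
using UnityEngine;

public class IndexPinchGrab : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    public bool IsGrabbing { get; private set; }

    void Update()
    {
        // Middle, ring, and pinky fingers are deliberately not checked.
        IsGrabbing = hand.IsTracked &&
                     hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
    }
}
```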

Interestingly, people tend to switch between direct touch and ray-casting depending on the circumstances. Direct manipulation is preferred when precision is required.

When it comes to UI patterns, it turns out we don’t have to be so cautious. Everything is easily recognizable in 3D space, and the new interactions are just as intuitive. Participants had no trouble dragging the map, pressing buttons, switching tabs, and rearranging windows.

The page controller proved to be problematic, though. All participants tried pointing at the page and making a click gesture to move it. Less experienced users gave up at that point, never discovering the swipe interaction. Pro users correctly tried to drag the page, but only after realizing that tapping does nothing.

Adding a slight bounce animation after a tap, to indicate that the UI can be dragged, would improve the experience here.
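A rough sketch of that hint as a plain Unity coroutine, nudging the page along its swipe axis and easing it back; the distance, timing, and the way OnTapped gets wired into the input system are placeholder assumptions.

```csharp
// Sketch: a small bounce when the page is tapped, hinting that it can be dragged.
// Plain Unity animation code; how OnTapped gets called depends on the input stack.
using System.Collections;
using UnityEngine;

public class PageBounceHint : MonoBehaviour
{
    [SerializeField] private float bounceDistance = 0.02f; // metres along the swipe axis
    [SerializeField] private float duration = 0.25f;

    private Vector3 restPosition;

    void Awake() => restPosition = transform.localPosition;

    // Call this from the tap handler of whichever input system is in use.
    public void OnTapped()
    {
        StopAllCoroutines();
        StartCoroutine(Bounce());
    }

    private IEnumerator Bounce()
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // Sine half-wave: push out along the local x axis (the swipe direction),
            // then settle back to the rest position.
            float offset = Mathf.Sin(t / duration * Mathf.PI) * bounceDistance;
            transform.localPosition = restPosition + Vector3.right * offset;
            yield return null;
        }
        transform.localPosition = restPosition;
    }
}
```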

The positive reactions I got made me realize that a proper AR app feels like a place to work: seen through the headset, it should resemble an organized desk. Another thing became obvious pretty quickly: people prefer curved screens, as they keep everything at the same distance from the viewer.
The importance of platform standards
To understand the differences between guidelines, I created designs for three major platforms: visionOS, Android XR, and MRTK3.

Even when the UI is fully custom-made, imitating platform patterns adds a lot of value. It's worth considering, especially since it's usually just a matter of tweaking a few shapes and materials.

Aside from colors and corner radius, Android’s carousel was the most distinctive change. It’s unique to the platform and should be taken into account.
Platform comparison: visionOS, Android XR, and MRTK3
“I was amazed I instinctively knew how to move windows and use the app” Research Participant
“It’s like the whole room is an app now. I really see a potential in this technology” Research Participant