Could glasses replace your phone?
This question has been bothering me for a while now. What would it take for an AR OS to make us consider that switch?

The release of the Meta XR Interaction SDK finally pointed me towards a solution. It gave us the ability to use the index finger as a touchpad – so we have physical feedback, precision, and a set of gestures we already know well from our phones.
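
Here is a minimal sketch of what listening for those thumb gestures can look like. I'm assuming the Microgestures API from recent Meta XR Core SDK releases (OVRMicrogestureEventSource and OVRHand.MicrogestureType) – the exact component and enum names may differ between SDK versions.

```csharp
using UnityEngine;

// Minimal sketch: routing thumb-on-index microgestures into app logic.
// ASSUMPTION: OVRMicrogestureEventSource / OVRHand.MicrogestureType come
// from recent Meta XR Core SDK releases; names may differ in your version.
public class ThumbSwipeListener : MonoBehaviour
{
    // One event source per tracked hand, assigned in the inspector.
    [SerializeField] private OVRMicrogestureEventSource gestureSource;

    private void OnEnable() =>
        gestureSource.GestureRecognizedEvent.AddListener(OnGesture);

    private void OnDisable() =>
        gestureSource.GestureRecognizedEvent.RemoveListener(OnGesture);

    private void OnGesture(OVRHand.MicrogestureType gesture)
    {
        switch (gesture)
        {
            case OVRHand.MicrogestureType.SwipeLeft:
            case OVRHand.MicrogestureType.SwipeRight:
                Debug.Log($"Thumb swiped along the index finger: {gesture}");
                break;
            case OVRHand.MicrogestureType.ThumbTap:
                Debug.Log("Thumb tapped the index finger");
                break;
        }
    }
}
```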

I just had to create a project that would give me a glimpse into the future of AR.

I built a prototype in Unity on the Meta XR SDK, running on a Quest 3. It explores two ideas:
Microinteractions: physical feedback for multitasking
Palm keyboard: physical, more natural input
Working around the lack of physical feedback
Swiping your thumb sideways along your index finger translates perfectly to the multitasking interactions we have on mobile – swiping along the bottom of the screen to switch or close apps.
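
To make the mapping concrete, here is a sketch of an app switcher driven by those swipe events. This is plain Unity code; the OnSwipe* methods are my own naming, meant to be wired to whatever gesture source you use, not an SDK API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the multitasking mapping: thumb swipes cycle through open
// "app" panels, mirroring the bottom-edge swipe on a phone. The OnSwipe*
// entry points are hypothetical – wire them to your gesture events.
public class AppSwitcher : MonoBehaviour
{
    [SerializeField] private List<GameObject> appPanels = new List<GameObject>();
    private int activeIndex;

    public void OnSwipeRight() => Activate(activeIndex + 1);
    public void OnSwipeLeft()  => Activate(activeIndex - 1);

    // A forward swipe (away from the thumb) closes the current app,
    // like flicking a card off the phone's app switcher.
    public void OnSwipeForward()
    {
        if (appPanels.Count == 0) return;
        Destroy(appPanels[activeIndex]);
        appPanels.RemoveAt(activeIndex);
        if (appPanels.Count > 0) Activate(activeIndex);
    }

    private void Activate(int index)
    {
        if (appPanels.Count == 0) return;
        // Wrap around in both directions so swipes always do something.
        activeIndex = (index % appPanels.Count + appPanels.Count) % appPanels.Count;
        for (int i = 0; i < appPanels.Count; i++)
            appPanels[i].SetActive(i == activeIndex);
    }
}
```

Wrapping the index keeps the swipe direction meaningful no matter how many apps are open.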

To replace our phones, AR interfaces need to get more comfortable – this is definitely a good start.

With multitasking out of the way, the keyboard was the next blocker to deal with. Text input is a serious problem in AR, so to improve the typing experience I came up with a Palm Keyboard.

Typing on your palm, feeling the resistance of your own hand as you push a button, is a game changer.
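
A minimal sketch of how a single key can work: a small trigger collider parented to a bone transform on the open hand (sourced from hand tracking, e.g. an OVRSkeleton bone), pressed by a fingertip that carries a kinematic Rigidbody. The Fingertip tag below is my own convention, not an SDK one.

```csharp
using UnityEngine;

// Sketch of one palm-keyboard key. The BoxCollider must be set as a
// trigger; the typing fingertip needs its own collider, a kinematic
// Rigidbody, and the (hypothetical) "Fingertip" tag. Your own palm
// provides the press feedback, so no haptics are needed.
[RequireComponent(typeof(BoxCollider))]
public class PalmKey : MonoBehaviour
{
    [SerializeField] private string character = "a";

    // Subscribers (e.g. a text field) receive the typed character.
    public static event System.Action<string> KeyPressed;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Fingertip"))
            KeyPressed?.Invoke(character);
    }
}
```
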
AR interfaces have to feel natural
The gesture for bringing up the UI simulates the act of picking up a phone, which makes using the app intuitive. Grabbing the UI to switch hands, or to show it to another AR user (sharing your screen, essentially), is another example of making interactions more natural.
Hand tracking and grab interaction
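
Here is a sketch of that grab behaviour using OVRHand's pinch query from the Meta XR SDK. In a production build the Interaction SDK's grab components would handle this more robustly; this just shows the idea.

```csharp
using UnityEngine;

// Sketch of the grab interaction: pinching near the panel re-parents it
// to that hand, so it can be moved, swapped between hands, or held out
// to another person. Uses OVRHand.GetFingerIsPinching from the Meta XR
// SDK; the distance threshold is a stand-in for proper grab detection.
public class GrabbablePanel : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;
    [SerializeField] private OVRHand rightHand;
    [SerializeField] private float grabRadius = 0.15f; // meters

    private OVRHand holdingHand;

    private void Update()
    {
        if (holdingHand == null)
        {
            TryGrab(leftHand);
            TryGrab(rightHand);
        }
        // Release when the holding hand stops pinching: drop the panel
        // back into world space wherever the hand left it.
        else if (!holdingHand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            transform.SetParent(null, worldPositionStays: true);
            holdingHand = null;
        }
    }

    private void TryGrab(OVRHand hand)
    {
        if (hand == null || holdingHand != null) return;
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        bool inReach = Vector3.Distance(hand.transform.position, transform.position) < grabRadius;
        if (pinching && inReach)
        {
            holdingHand = hand;
            transform.SetParent(hand.transform, worldPositionStays: true);
        }
    }
}
```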