
GESTURE VIZ

Giving users the freedom to control their world

UX Design | AR Design | Adobe XD | After Effects

CHALLENGE

Traditional AR and VR headsets that use hand tracking rely on mechanical, static gestures that do not calibrate to individual user input. Design a system that allows users to interact with natural, customizable gestures.

BACKGROUND

My love for AR/VR has led me to work in the field for the past 4-5 years, and I'm still considered an early adopter. As an early adopter, I took it upon myself to give the field more exposure by nudging my friends and family to use Instagram filters I created, my Google Cardboard, the university HoloLens, and my Oculus Quest.

While doing so, I observed their experience, their satisfaction, and their pain points. This project is a reflection of those observations, and they served as the foundation for my research.

ROLE

RESEARCH

01

I conducted observations in a variety of settings, ranging from AR interfaces on the HoloLens to controller and hand-tracking input on the Oculus Rift with Leap Motion. I then conducted a literature review and semi-structured interviews to gain further insights and to look at examples from other fields and their adaptation strategies.

ANALYSIS

02

I analyzed these observations to generate insights, which shaped my design direction and allowed me to narrow my scope toward a solution.

IDEATION

03

Based on the analysis, I moved ahead with prospective design iterations, creating paper prototypes and testing the usability of the interactions.

DESIGN

04

For the high-fidelity concept, I produced a video showcasing the solution, with the intention of using it as a reference for future development once the technology becomes available.

SOLUTION

Gives users the ability to create and use personalized and natural gestures without the need to code.

OBSERVATIONS

ONBOARDING

01

First-time users often had a difficult time using both controllers and hand tracking for different purposes

I had a tough time orienting myself to the finger allotment of the controller

Mechanical gestures broke the immersion of the experience

When the gesture does not work I need to think about doing it correctly and try again... it just kills the mood

GESTURES

02

The gestures did not seem intuitive and forced users to make a conscious effort to complete tasks

I do not click on things by creating an “L” in other cases, so it just feels awkward

- HoloLens 1 (Non-instinctual interaction)

Discrepancy between interaction with visual elements and real-world objects

I always want to touch the objects and it bothers me that I can’t use them like I would in the real world

- Elixir Demo, Oculus Connect 6

SIMULATION

COMPARISON

01

To gain a better understanding of interactions in VR/AR versus the real world, I created low-fidelity simulations that helped me focus on just the interactions.

I hung a piece of paper to simulate the floating UI in VR. Doing so allowed me to rapidly test multiple UIs by simply replacing the paper, or turning it around for the next page, and so on.

INSIGHTS

INTERACTIONS

01

Based on the observations and simulation, I understood the difference in interaction between the two worlds.

This allowed me to identify problems with current input techniques. These are my insights:

Traditional gestures were more mechanical and did not adjust to user habits

There is a gamut of methods by which people interact with objects

Gestures should correspond to their counterparts in the real world

Figure: Traditional orientation vs. natural orientation

INTERFACES

02

For a truly immersive experience, it is critical to create experiences that never raise the question:

How should I use this?

This can be mitigated by mimicking real-life objects, which instinctively nudges users to interact with them in a natural manner.

IDEATION

I used affinity diagramming to brainstorm possible ideas and narrow down my prospects.

CONCEPT

GESTURE VIZ

A gesture input system that learns user gestures through use, narrowing the gap between traditional and user gestures, and allows users to visualize their custom gestures while creating them.

By utilizing these gestures, users can interact with UI elements in a natural and personalized manner. This is assisted by the fact that the UI elements themselves are three-dimensional, which allows users to interact with the system the way they would with real objects.

RECOGNITION

01

To design a system that recognizes user inputs and learns from them, it needs to differentiate between gestures. To do so, I conceptualized a technique that recognizes the physical displacement of finger joints between the start and end states of a gesture.
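To make the idea concrete, here is a minimal Python sketch of displacement-based matching, assuming a hypothetical hand-tracking API that reports 3D joint positions; the names, template format, and tolerance value are illustrative assumptions, not an existing implementation.

```python
import math

# One joint's tracked 3D position (x, y, z), as a hypothetical
# hand-tracking API might report it.
Joint = tuple[float, float, float]

def displacement(start: list[Joint], end: list[Joint]) -> list[float]:
    # Per-joint Euclidean displacement between the gesture's
    # start and end states.
    return [math.dist(s, e) for s, e in zip(start, end)]

def matches(start: list[Joint], end: list[Joint],
            template: list[float], tolerance: float = 0.02) -> bool:
    # Compare observed joint displacements against a stored gesture
    # template. Widening the tolerance per user over time is one way
    # the system could calibrate to individual habits (an assumption).
    observed = displacement(start, end)
    return all(abs(o - t) <= tolerance for o, t in zip(observed, template))
```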

CATEGORIZATION

02

The displacement technique required me to study and categorize gestures in a way that would be scalable to any gesture.

Gestures can be categorized into:

start-start gestures

start-end gestures

START-START

Start-start gestures are gestures that begin and end at virtually the same point. These need not be one-step gestures and can have multiple checkpoints that ultimately return to the start point.

For example:

SINGLE TAP

DOUBLE TAP

START-END

Start-end gestures are gestures that begin at one point and end at a different point. Like start-start gestures, these need not be one-step gestures and can have multiple checkpoints. A sketch of this two-way categorization follows the examples below.

For example:

KNOB TURNS

SWIPE LEFT/RIGHT
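Here is a rough sketch of that categorization, under the assumption that each gesture is recorded as an ordered list of checkpoint positions; the function name and the "same point" radius are my own illustrative choices.

```python
import math

def categorize(checkpoints: list[tuple[float, float, float]],
               same_point_radius: float = 0.01) -> str:
    # Label a gesture by whether it returns to (roughly) its starting
    # point: a double tap passes through several checkpoints but ends
    # where it began (start-start), while a swipe ends somewhere else
    # (start-end).
    start, end = checkpoints[0], checkpoints[-1]
    if math.dist(start, end) <= same_point_radius:
        return "start-start"
    return "start-end"
```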

REGRESSION

03

Finally, both categories of gestures can be scaled infinitely by using regression at any point between the first and last point of displacement.

This regression can be categorized into:

LOOPED

UNLOOPED

LOOPED

Looped regression is a cyclic process that occurs within, or including, the first and last points of displacement. It can include anywhere from 1 to n different points and must occur at least twice to complete the gesture.

UNLOOPED

Unlooped regression is a non-cyclic process that occurs within, or including, the first and last points of displacement. It can include anywhere from 1 to n different points and occurs at least once to complete the gesture.

Using this categorization technique, any gesture can be identified and broken down into steps, irrespective of size (n).

For example, this figure shows an n-step start-end gesture.
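As a last sketch, looped and unlooped regression can be told apart by checking whether any run of checkpoints immediately repeats; the checkpoint labels and function name below are hypothetical, and a real recognizer would operate on tracked positions rather than strings.

```python
def regression_kind(steps: list[str]) -> str:
    # Looped regression repeats a run of checkpoints at least twice
    # (think of a knob turned around and around); unlooped regression
    # passes through its intermediate checkpoints only once.
    n = len(steps)
    for size in range(1, n // 2 + 1):          # candidate loop lengths
        for i in range(n - 2 * size + 1):      # candidate loop starts
            if steps[i:i + size] == steps[i + size:i + 2 * size]:
                return "looped"
    return "unlooped"

# regression_kind(["pinch", "turn", "turn", "release"])  -> "looped"
# regression_kind(["touch", "drag", "release"])          -> "unlooped"
```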

FINAL

A gesture setup system that allows users to visualize their custom gestures while creating them. Using these gestures, users can interact with UI elements in a natural and personal way. This is assisted by the fact that the UI elements themselves are three-dimensional, which allows users to interact with the system the way they would with real objects.

USE CASES

ACCESSIBILITY

01

Providing users, especially those with physical disabilities, with the power to create custom gestures through visualization will broaden the user pool. For example, a user with Parkinson's disease could use such a system to train their headset and take advantage of VR and AR.

POWER USERS

02

As VR and AR grow, they will give rise to processes and tasks exclusive to the medium, and with every industry that arises, a new set of power users is created. The Gesture Viz system will allow these users to customize their process to improve efficiency.

DEV AND DESIGN

03

One of the major problems that designers and developers face is prototyping in AR and VR. My solution can help solve that problem by providing rapid, customizable gestures for prototyping.