SCROLL
Giving users the freedom to control their world
UX Design | AR Design | Adobe XD | After Effects
Traditional AR and VR headsets that use hand tracking for interaction rely on mechanical, static gestures that do not calibrate to users' input. Design a system that allows users to interact with natural and customizable gestures.
RESEARCH
I conducted observations in a variety of settings, ranging from using AR interfaces on the HoloLens to using controllers as well as hand tracking on the Oculus Rift with Leap Motion. I then conducted a literature review and semi-structured interviews to gain further insights and to look at examples from other fields and their adaptation strategies.
ANALYSIS
I synthesized these observations into insights that shaped my design direction and helped me narrow my scope toward a solution.
IDEATION
Based on the analysis, I iterated on prospective designs by creating paper prototypes and testing the usability of the interactions.
DESIGN
For the high-fidelity concept, I produced a video showcasing the solution, with the intention of using it as a reference for future development once the technology becomes available.
Gives users the ability to create and use personalized and natural gestures without the need to code.
ONBOARDING
First-time users often had a difficult time using both controllers and hand tracking for different purposes
Mechanical gestures broke the immersion of the experience
GESTURES
The gestures did not feel intuitive and forced users to make a conscious effort to complete tasks
- HoloLens 1 (Non-instinctual interaction)
Discrepancy between interaction with visual elements and real-world objects
- Elixir Demo, Oculus Connect 6
COMPARISON
To gain a better understanding of interactions in VR/AR versus the real world, I created low-fidelity simulations that helped me focus on just the interactions
I hung a piece of paper to simulate the floating UI in VR. Doing so allowed me to rapidly test multiple UIs by simply replacing the paper or turning it over for the next page, and so on.
INTERACTIONS
Based on the observations and simulation, I understood the difference in interaction between the two worlds.
This allowed me to identify problems with current input techniques. These are my insights:
traditional gestures were mechanical and did not adjust to user habits
people interact with objects in a wide gamut of ways
gestures should correspond to their counterparts in the real world
Traditional Orientation
Natural Orientation
INTERFACES
For a truly immersive experience, it is critical to create interfaces that do not raise the question of how to interact with them.
This can be mitigated by mimicking real-life objects, which instinctively nudges users to interact with them in a natural manner
I used affinity diagramming to brainstorm possible ideas and narrow down my prospects
A gesture input system that learns user gestures through use to narrow the gap between traditional and user gestures, and that allows users to visualize their custom gestures while creating them.
By utilizing these gestures, users can interact with UI elements in a natural and personalized manner. This is assisted by the fact that the UI elements themselves are 3-dimensional, which allows users to interact with the system as they would with real objects.
To recognize user inputs and learn from them, the system would need to differentiate between gestures. To do so, I conceptualized a technique that recognizes the physical displacement of finger joints between the start and end states of a gesture.
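To make the idea concrete, here is a minimal sketch of the displacement measurement, assuming a hypothetical hand-tracking API that reports each finger joint as a 3D position (the HandPose type and joint names are illustrative, not from any specific SDK):

```python
import numpy as np

# Hypothetical pose format: joint name -> 3D position from the tracker.
HandPose = dict[str, np.ndarray]

def joint_displacements(start: HandPose, end: HandPose) -> dict[str, float]:
    """Euclidean displacement of each tracked finger joint between
    the start and end states of a gesture."""
    return {joint: float(np.linalg.norm(end[joint] - start[joint]))
            for joint in start}
```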
The displacement technique required me to study and categorize gestures in a way that would be scalable to any gesture.
Gestures can be categorized into:
Start-Start gestures
Start-End gestures
Start-Start gestures begin and end at virtually the same point. They need not be one-step gestures and can have multiple checkpoints that ultimately return to the start point.
For example:
SINGLE TAP
DOUBLE TAP
Start-End gestures begin at one point and end at a different one. Like Start-Start gestures, they need not be one-step gestures and can have multiple checkpoints.
For example:
KNOB TURNS
SWIPE LEFT/RIGHT
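Continuing the sketch above, the two categories could be told apart by checking whether every joint returns to (virtually) its starting point; the tolerance value here is an assumed tuning parameter, not part of the original design:

```python
def categorize(start: HandPose, end: HandPose,
               tolerance: float = 0.01) -> str:
    """Label a gesture Start-Start if every joint ends virtually
    where it began, otherwise Start-End.

    tolerance: assumed threshold (in meters) for "virtually the
    same point"; a real system would calibrate this per user.
    """
    displacements = joint_displacements(start, end)
    if all(d < tolerance for d in displacements.values()):
        return "start-start"  # e.g. single tap, double tap
    return "start-end"        # e.g. swipe left/right, knob turn
```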
Finally, both categories of gestures can be scaled infinitely by using regression at any point between the first and last point of displacement.
This regression can be categorized into:
LOOPED
UNLOOPED
Looped regression is a cyclic process that occurs within, or coincides with, the first and last points of displacement. It can include anywhere from 1 to n different points and occurs at least 2 times to complete the gesture.
Unlooped regression is a non-cyclic process that occurs within, or coincides with, the first and last points of displacement. It can include anywhere from 1 to n different points and occurs at least 1 time to complete the gesture.
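As a rough illustration of the looped/unlooped distinction, the sketch below checks whether the recorded checkpoints begin with a cycle of 1 to n points repeated at least twice. The quantized pose labels are an assumption; a real recognizer would operate on raw joint data:

```python
def regression_type(checkpoints: list[str]) -> str:
    """Classify the motion between the first and last displacement
    points as looped or unlooped.

    checkpoints: assumed sequence of quantized pose labels recorded
    during the gesture, e.g. ["pinch", "open", "pinch", "open"].
    """
    n = len(checkpoints)
    # Looped: some cycle of 1..n points repeats at least 2 times.
    for cycle_len in range(1, n // 2 + 1):
        cycle = checkpoints[:cycle_len]
        repeats = n // cycle_len
        if repeats >= 2 and checkpoints[:cycle_len * repeats] == cycle * repeats:
            return "looped"
    return "unlooped"
```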
Using this categorization technique, any gesture can be identified and categorized into steps, irrespective of size (n).
For example, this figure shows an n-step Start-End gesture.
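Putting the sketches together, a hypothetical one-step swipe to the right would be classified like so (the coordinates are made-up sample data):

```python
start = {"index_tip": np.array([0.00, 0.0, 0.0])}
end   = {"index_tip": np.array([0.20, 0.0, 0.0])}

print(categorize(start, end))        # -> "start-end"
print(regression_type(["extend"]))   # -> "unlooped" (single pass)
```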
A gesture setup system that allows users to visualize their custom gestures while creating them, then use those gestures to interact with 3-dimensional UI elements as naturally and personally as they would with real objects.
ACCESSIBILITY
Giving users, especially those with physical disabilities, the power to create custom gestures through visualization will broaden the user pool. For example, a user with Parkinson's disease could use such a system to train their headset and use VR and AR.
POWER USERS
As VR and AR grow, they will give rise to processes and tasks exclusive to the medium. With every industry that arises, a new set of power users is created. The gesture visualization system will allow these users to customize their processes to improve efficiency.
DEV AND DESIGN
One of the major problems that designers and developers face is prototyping in AR and VR. My solution can help solve that problem by providing rapid, customizable gestures for prototyping.