Haptic Vision
#UX Design #VR/AR #Unity
Haptic Vision is an inclusive technology that allows individuals to navigate their physical surroundings with greater ease and understanding.
Timeline
Jan 13 - 15, 2023 (2.5 days)
Project Type
MIT Reality Hackathon, Semi-Finalist
Team
Winny Wang (UX designer), Christine Saderr (UX designer), Kyle Diaz-Castro (UX designer), Leon Kipkoech (Developer), Malcom Powers (Developer)
Tools
Figma, Illustrator, Premiere Pro, HaptX haptic gloves, HTC Vive Pro Eye (with Lighthouses), Unity
My Contribution
UX Research & Design: user interviews; literature research; interaction design
Prototype & Test: experimenting with the haptic gloves for prototyping; conducting user tests and documenting feedback
Branding: designing visual content and the presentation
Background
Overview
I joined MIT Reality Hack 2023, where over 350 participants from around the world came together to develop creative VR/AR projects in teams of five. In 2.5 days, our team created Haptic Vision, an inclusive technology that allows individuals to navigate their physical surroundings with greater ease and understanding. It was named a Semi-Finalist and was featured in the organizers' video.
Challenge
Problem Statement
How might we utilize XR technology and design a smart navigation tool for the visually impaired?
Design Process
We defined the Problem
On the morning of Day 1, we spent time writing down the different problems we identified across different types of visual impairment. We narrowed the project down to one specific problem: visually impaired users have a difficult time navigating their environment confidently.
We collaborated to come up with Solutions
For the rest of Day 1, we collaborated on different ideas for approaching a solution. Some of the ideas explored enhancing vision through improved visuals, audio systems, and haptic feedback. During this phase, we also conducted user research by talking with visually impaired people to learn what pain points they experience, how they "see", what tools they use, and what they want. From this, we were also able to create a persona.
We decided on the MVP Solution
We narrowed the problem down to a specific solution involving haptic feedback, and we defined the MVP of the product to make sure we didn't sidetrack ourselves in such a short time frame. The idea is to detect nearby objects and use haptics to let users feel how far away each object is.
Tritanopia-friendly (color-blind-safe) color palette
Prototype process
We developed a prototype
The project was developed in Unity with the HTC Vive Pro Eye (with Lighthouses) and HaptX haptic gloves, whose hand tracking is compatible with any Windows VR HMD that uses Lighthouses. The following list summarizes the key features and technologies used:
Technologies: HaptX gloves, HTC Vive Pro Eye, Unity 2019.4.31f1
SDKs: HaptX 2.0.0 beta 8, SRWorks
Physical Environment and Mixed Reality: Using SRWorks, we created a 3D model of the environment. However, we faced challenges implementing hand tracking with it.
Hand Tracking: We used the HaptX SDK, which is compatible with the same version of Unity as SRWorks.
Object Detection: SRWorks was used for object detection; its AI model can identify common objects such as chairs and tables.
Distance Approximation: Unity's built-in raycasting determined the distance from the hand to an object (see the sketch after this list).
Haptics: Based on that distance, we set the frequency and amplitude of the gloves' vibration, from a minimum of 15 Hz up to a maximum of 30 Hz. The amplitude mapping remains untested at this time.
Testing: The project was tested on Windows 11, which supports VR development.
Staging: The project was hosted on GitHub, with the "works" branch used for development and testing and the main code published under the "main" branch.
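To make the distance-to-haptics mapping concrete, below is a minimal Unity C# sketch of the approach described above, assuming a ray cast from the palm into the environment mesh reconstructed by SRWorks. Physics.Raycast and the Mathf helpers are standard Unity APIs; the IGlove interface and its SetVibration method are hypothetical stand-ins for the HaptX SDK, whose actual API differs.

```csharp
using UnityEngine;

// Minimal sketch of Haptic Vision's distance-to-vibration mapping.
// Attach to a hand-tracked object. IGlove is a hypothetical stand-in
// for the HaptX SDK's vibration interface, not its real API.
public class HapticDistanceFeedback : MonoBehaviour
{
    // Hypothetical glove interface; the real HaptX SDK differs.
    public interface IGlove { void SetVibration(float frequencyHz, float amplitude); }

    public IGlove glove;             // assigned elsewhere at runtime
    public float maxRange = 2.0f;    // metres; beyond this, no feedback
    public float minFrequency = 15f; // Hz, for an object at maxRange
    public float maxFrequency = 30f; // Hz, for an object touching the hand

    void Update()
    {
        // Cast a ray from the palm along its forward direction into the
        // reconstructed environment mesh produced by SRWorks.
        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, maxRange))
        {
            // proximity: 0 at maxRange, 1 when the object touches the hand.
            float proximity = 1f - Mathf.Clamp01(hit.distance / maxRange);

            // Map proximity onto the 15-30 Hz band used in the prototype.
            float frequency = Mathf.Lerp(minFrequency, maxFrequency, proximity);

            // Amplitude mapping was untested in the prototype; a linear
            // ramp is only a placeholder here.
            glove?.SetVibration(frequency, proximity);
        }
        else
        {
            glove?.SetVibration(0f, 0f); // nothing in range: no vibration
        }
    }
}
```

Clamping the ray to a maximum range keeps distant geometry silent, so the full 15-30 Hz band is reserved for nearby obstacles.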
We tested with actual users
In the end, we had a visually impaired user test out the project, and the feedback was amazing. Wearing the gloves and headset, the user can sense the room with their hands through the haptic feedback on the gloves. In the Unity view, an AI model captures depth data from the surroundings. The closer they get to objects, the stronger the vibration.
Final outcome
Accomplishments that we're proud of
We're the first team to build a haptic XR experience with the HaptX gloves!
By utilizing sound and haptic feedback, Haptic Vision is an extension of the traditional seeing cane, enabling users to sense the presence of nearby objects and furniture. The user wears a VR headset and HaptX gloves, and as their hands approach objects like furniture, they feel a vibration as if they were physically touching the object. The closer they get to the object, the stronger and more detailed the sensation becomes, providing a clear understanding of its location and size.
Our product is designed for individuals of all abilities. For those with visual impairments, Haptic Vision is an essential tool that can assist in understanding and navigating their environment. For those without visual impairments, Haptic Vision offers a unique, empathetic experience, allowing them to better understand the challenges faced by people with visual impairments.
We envision Haptic Vision as a pioneering solution for the future of navigation for individuals with visual impairments. By leveraging cutting-edge technologies such as XR, we aim to enhance the senses and empower those with visual impairments to navigate confidently and independently, even around obstacles. Haptic Vision is a glimpse into the next 5-10 years, when innovative technologies will revolutionize how people with visual impairments interact with the world around them.