Microsoft Garage | UI/UX Design
The Chromati is a pair of glasses we designed to facilitate communication between a hearing-impaired person and those around them through speech and text.
Microsoft Design Team
"Design a product, service, or solution to solve for exclusion in the workplace."
Visuals and Prototyping
Team of 5
(Advised by Professor Dana Karwas)
In some industries, communication is one of the most essential factors for success, if not the most essential. Yet nearly 2% of all Americans have a hearing problem, and more than 1% of those are considered deaf. As a result, people with hearing problems can be excluded from such occupations.
Current communication systems do not account for hearing disabilities, which worsens the employment situation of hearing-impaired people.
So the design challenge is ...
How might we solve exclusion in the workplace for hearing-impaired people, and extend the solution to a broader range of users?
Understanding the User
To gain insight into the disability we were designing for, we visited the Center for Hearing and Communication and met with an advisor, who introduced us to different kinds of hearing aids and to information about people with hearing loss. We summarized three main problems that impede normal communication for people with hearing impairment.
Reliance on Lip Reading
Since we aimed to design for a diversity of situations and include more people who may face hearing problems in various deskless jobs, we created a persona spectrum based on the Microsoft Inclusive Design Toolkit manual. It helped us understand who we were designing for throughout the design process.
To define our target users and better understand their main problems and needs, we chose three typical occupations that can exclude hearing-impaired people: surgeon, customer service representative, and lecturer. We researched and interviewed people who had been excluded from these occupations, and created the following personas.
We also mapped a user journey for each persona to decide which solutions could work across these varying users and contexts.
User persona and user journey
Our final solution is a pair of mixed reality glasses, portable and comfortable to wear during conversations. The Chromati facilitates communication between a hearing-impaired person and those around them through speech and text, and can be operated through eye and hand movements.
To organize and clarify the product's functions, we created an information architecture and summarized them as follows:
1. Eye Tracking
This would be used to select words, phrases, and settings in the Chromati interface.
2. Keyboard with autocomplete phrases
The user would type out what they would like to say with a virtual keyboard. Hand movements would be registered through the camera on the front of the frames.
3. Text to speech
The words that the user types would be spoken by an AI voice incorporated in the glasses.
4. Speech to text
The Chromati can pick up what other people are saying, and convert their dialogue into screen-based text on the interface.
5. Hand movement tracker
The glasses track hand movement so that the user can select specific words/phrases and select settings in the interface.
Concept map for Chromati's functions
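To make these functions concrete, here is a minimal sketch of how the autocomplete keyboard, speech-to-text captions, and text-to-speech output might fit together. All names (`ChromatiSession`, `PHRASES`, the method names) are hypothetical, invented for illustration only; this is not Chromati's actual implementation.

```python
# Illustrative sketch of Chromati's core text functions.
# Every identifier here is a placeholder, not a real API.

PHRASES = [
    "good morning",
    "good to see you",
    "could you repeat that?",
    "call for help",
]

class ChromatiSession:
    def __init__(self, phrases=PHRASES):
        self.phrases = phrases
        self.captions = []  # speech-to-text lines shown on the lenses

    def autocomplete(self, prefix):
        """Suggest stored phrases matching what the user has typed so far."""
        prefix = prefix.lower()
        return [p for p in self.phrases if p.startswith(prefix)]

    def hear(self, speaker, words):
        """Speech-to-text: turn nearby speech into an on-screen caption."""
        caption = f"{speaker}: {words}"
        self.captions.append(caption)
        return caption

    def speak(self, typed_text):
        """Text-to-speech: hand the typed text to the AI voice (stubbed)."""
        return f"[AI voice] {typed_text}"
```

For example, typing "good" on the virtual keyboard would surface the stored phrases "good morning" and "good to see you" as one-tap suggestions.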
We began the product's visual design with sketches. We chose this glasses form factor because it is important that the device does not cover too much of the face and is not uncomfortable. The glasses also have a webcam above the lenses and connect to hearing aids via Bluetooth. The webcam addresses the distance and lip-reading problems: it captures people's lip movements and translates their words more accurately, especially when speakers are far away. The Bluetooth connection lets the hearing aids filter different noise frequencies as the user switches between modes on the glasses.
To make the Chromati look more discreet, we gave it a sleek, all-black finish. The glasses sit lightly on the user's face, so there is little physical distraction when wearing them, yet the design is noticeable enough that a person interacting with the user can tell they are wearing the Chromati.
Sketches of physical prototype
Physical prototype of Chromati
Wireframes and Iterations
We considered three main tasks as our design priority:
Providing different modes for users. We defined four communication scenarios, and the hearing aids filter different noise frequencies based on the user's choice. This lets users cope with different contexts, such as a one-on-one conversation or a group talk in a crowd.
The speech-to-text and text-to-speech system. The interface visualizes incoming speech as on-screen text, and users can type on the virtual keyboard to convert their words to voice.
AI responses. The AI response system suggests possible replies when users want to type a response. This makes it easier for users to convey their words, especially in an emergency.
To user test our prototype, we compiled our wireframes into the Marvel app so that users could try the interactions. Click here to try the first version of the prototype.
Low fidelity prototype of Chromati's interface
After the first round of user testing, we learned that the initial prototype was too confusing because its symbols were not easy to understand. Moving forward, the interface design needed to be cleaned up, and the labels in our design needed to indicate when someone is wearing the glasses. Overall, we wanted to improve the ease of use of our device, since its many functionalities can get confusing. Feedback and suggestions from our users included:
1. Include more back buttons.
2. The ability to change the keyboard should be more visible.
3. Having to press the home button twice is tedious.
4. The icons did not make any sense. Navigating the interface was a bit confusing.
For the second round of user testing, we improved the design by simplifying the interface: we deleted some screens and removed unnecessary features. Users noticed the difference. We received higher ratings in the second round than in the first, and when asked what they liked about the design, most said it was simple and easy to understand.
Based on the testing results, we set the final style direction. Users can use eye tracking or hand movements to select and swipe elements on the interface. The interface offers four modes to choose from depending on the situation: one-on-one mode, small group mode, crowd mode, and phone call mode.
The homepage of Chromati: select elements, or slide the whole page to another
Simplified speech bubbles for group communication mode
One-on-one mode, with AI-suggested responses and the virtual keyboard
Phone call mode requires users to connect their phone to the Chromati through the device settings; they can then see real-time text transcribed from the voice on the call.
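As a rough illustration of the mode logic, the sketch below maps each of the four modes to a hearing-aid filter profile. Both the structure and the numeric cutoffs are invented placeholders for illustration, not real audiology settings or any part of the actual product.

```python
# Hypothetical mapping from Chromati's four modes to hearing-aid
# noise-filter profiles. All values are illustrative placeholders.

MODE_FILTERS = {
    "one_on_one":  {"focus": "single speaker", "suppress_hz_above": 8000},
    "small_group": {"focus": "nearby voices",  "suppress_hz_above": 6000},
    "crowd":       {"focus": "directional",    "suppress_hz_above": 4000},
    "phone_call":  {"focus": "phone audio",    "suppress_hz_above": None},
}

def apply_mode(mode):
    """Return the filter profile the connected hearing aids should use."""
    if mode not in MODE_FILTERS:
        raise ValueError(f"unknown mode: {mode}")
    return MODE_FILTERS[mode]
```

Switching modes on the glasses would amount to sending the selected profile to the paired hearing aids over Bluetooth.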
Demo videos simulate how the Chromati helps hearing-impaired people work in three scenarios: performing surgery, communicating with customers, and giving lectures to students.