style it: IoT + productivity application for visually impaired users

One’s outward appearance becomes a dominating factor in how a person is represented in social situations such as job interviews, first dates, and social events. With all of the visual indicators that identify stylish and socially appropriate clothing (pattern, color, and texture), choosing an outfit for the day is a simple task for most people. However, people with visual impairment must use alternative methods to complete this same task, relying mostly on touch, memory, and sound. Vision loss often limits one’s ability to identify colors and patterns, making appropriate decisions about clothing combinations challenging.

People make decisions about how they will visually represent themselves by choosing the most appropriate outfit to wear for the day’s activities. Envision the process and steps of choosing an outfit for the day: what do you think and see when choosing an outfit? How do you make choices about matching colors or patterns? Now imagine completing this same daily task with little or no functional vision. This is the scenario my uncle Bob went through after losing his vision in a hospital room following a diabetic surgery. He was in his mid-forties and had to relearn how to complete daily living tasks, including selecting an outfit for the day. His adaptation to a new way of life was the inspiration for this project.

current methods & research

observing current methods of clothing choice

Outside of manual methods of choosing an outfit, such as touch, memory, tactile tags, basic fashion knowledge, articulate organization systems, and sighted assistance, people with visual impairment can explore technological approaches. Technical solutions range from clothing specifically designed for the visually impaired community to assistive technology alternatives. Study participants for the preliminary research phase of this study were selected from the iBug (iOS iBlind Users Group), a nonprofit organization headquartered in Houston, Texas, that promotes individual technological independence among its monthly meeting attendees.

Visually impaired participants included two members with low vision and four members with no functional vision. The group consisted of five females and one male. Two research subjects were employed while the other four were unemployed. Four of the participants lived alone while the other two lived together as a couple. Meetings were arranged at each participant’s place of dwelling to understand and observe how they went about choosing an outfit for the day.

Data were gathered through an immersive design approach combined with task analysis: in their homes, where they could evaluate all of the items they own and make the best choice for the occasion using their current methods, participants completed a task list that was read out loud to them in person. Outfit occasion requests included a casual outfit, a social outfit, a professional outfit, and a special outfit. Time-on-task and accuracy of identification were documented while tasks were completed. This documented time-on-task provided a baseline for measuring improvements or flaws in the mobile application digital prototype user experience.
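
As a rough illustration of the protocol described above, the sketch below shows one way such task-analysis observations could be logged in Swift. The record fields, occasion names, and example values are hypothetical and are not the study’s actual instrument or raw data.

```swift
import Foundation

// Hypothetical record of one task-analysis observation: the requested
// occasion, the measured time-on-task, and whether the chosen item was
// identified accurately. Illustrative only.
enum Occasion: String {
    case casual, social, professional, special
}

struct TaskObservation {
    let participantID: Int
    let occasion: Occasion
    let timeOnTaskSeconds: Int
    let identifiedAccurately: Bool
}

// Example entries for one participant's session (made-up values).
let observations = [
    TaskObservation(participantID: 1, occasion: .professional, timeOnTaskSeconds: 125, identifiedAccurately: true),
    TaskObservation(participantID: 1, occasion: .casual, timeOnTaskSeconds: 48, identifiedAccurately: false)
]

// Aggregate accuracy across the logged observations.
let accuracy = Double(observations.filter { $0.identifiedAccurately }.count) / Double(observations.count)
print("Accuracy: \(Int(accuracy * 100))%")
```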

The evaluation of current methods resulted in 78% of items chosen accurately. The slowest recorded time for selecting one item was four minutes and five seconds, the fastest was 25 seconds, and the average time for choosing a complete outfit was between one and a half and two minutes.

control group outcome graphs

phase I prototype

For this prototype study, the individuals used the VoiceOver accessibility feature, available on the iPhone since the 3GS model, which provides speech output of text information on the phone. The solution from the ideation phase user-testing was a mobile application that required the user to take a photo and add a voice description to the item. This information would then be synced to a digital hanger.
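
To make the Prototype 1 flow concrete, here is a minimal Swift sketch of the photo-plus-voice-description model synced to a digital hanger. The type names (ClothingItem, DigitalCloset), the hanger identifier format, and the sample item are assumptions for illustration, not the prototype’s actual implementation.

```swift
import Foundation

// Hypothetical model of the Prototype 1 data flow: a photo and a recorded
// voice description are captured for each garment, then linked ("synced")
// to a digital hanger identifier.
struct ClothingItem {
    let id: UUID
    let photoFileName: String        // photo taken by the user
    let voiceDescription: String     // transcript of the user's spoken description
    let hangerID: String             // digital hanger the item is synced to
}

// A simple in-memory "digital closet" keyed by hanger ID.
final class DigitalCloset {
    private var itemsByHanger: [String: ClothingItem] = [:]

    func add(_ item: ClothingItem) {
        itemsByHanger[item.hangerID] = item
    }

    // VoiceOver reads whatever text the app surfaces, so returning the stored
    // description is enough for speech output at lookup time.
    func description(forHanger hangerID: String) -> String {
        itemsByHanger[hangerID]?.voiceDescription ?? "No item found on this hanger."
    }
}

// Usage example with a made-up item.
let closet = DigitalCloset()
closet.add(ClothingItem(id: UUID(),
                        photoFileName: "blue-striped-shirt.jpg",
                        voiceDescription: "Blue button-down shirt with thin white stripes",
                        hangerID: "hanger-014"))
print(closet.description(forHanger: "hanger-014"))
```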

digital prototype testing: prototype 1

User-testing for Prototype 1 was completed at Lighthouse San Francisco in San Francisco, California, where the organization actively participates in user experience testing for other companies in the surrounding area. Erin Lauridsen, the Director of Access Technology at Lighthouse San Francisco, shared this opportunity with members, who showed strong interest in assistive technology and were open to sharing their opinions and feedback. Positive outcomes from the Prototype 1 user-testing included an improvement in the accuracy of identifying items to 100%, versus the 78% accuracy observed during the manual methods of the control group, and improvement in the successful identification of clothing items with patterns or multiple colors, decreasing time-on-task to an average of 23 seconds per item. In Prototype 1 testing, the longest time recorded out of the five participants was 2 minutes and 23 seconds and the fastest time recorded was 47 seconds. Post user-testing feedback provided some clear evaluations of the mobile application concept and experience. When asked whether they would adopt this method, participants indicated that a simplified user flow for entering clothing into the digital closet was needed.

prototype 1 user data

prototype 2

Improvements sought in the interaction for the Prototype 2 user-testing focused on decreasing the amount of time it took to add clothing to the digital closet. Prototype 2 solutions included removing the sync from the hanger to the actual item of clothing, removing the need to take a photo, and using a scan plus VoiceOver implementation to identify items accurately. After implementing the changes in Prototype 2, participants were recruited from the Texas State University Offices of Disability and The National Federation of the Blind Austin Chapter. There was less variation in time-on-task during Prototype 2 testing than in the widely varying times of the control group data. The fastest time for selecting an outfit using the digital Prototype 2 was recorded at 20 seconds, with the slowest time recorded at 1 minute and 22 seconds. All seven participants (100%) were very interested in having this mobile application to improve their current methods for choosing clothing on their own, said minimal improvements were needed, acknowledged that the mobile application would help them match clothing they already own, and thought they would use it every day for outfit assistance.
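
A minimal sketch of the Prototype 2 scan-plus-VoiceOver idea is shown below, assuming the scan step returns a simple tag value that can be looked up and then spoken through VoiceOver’s announcement notification. The lookup table, tag values, and function name are hypothetical, not the prototype’s actual code.

```swift
import UIKit

// Hypothetical lookup from scanned tag values (for example from an NFC or
// barcode scan) to stored item descriptions. Values are illustrative only.
let itemDescriptionsByTag: [String: String] = [
    "tag-0017": "Black slacks, flat front, suitable for professional wear",
    "tag-0042": "Red floral blouse with short sleeves"
]

func announceScannedItem(tagValue: String) {
    let description = itemDescriptionsByTag[tagValue]
        ?? "Item not recognized. Try scanning again."

    // Posting the .announcement notification asks VoiceOver to speak the
    // supplied text immediately, without requiring the user to focus a view.
    UIAccessibility.post(notification: .announcement, argument: description)
}

// Example call from a successful scan callback:
announceScannedItem(tagValue: "tag-0042")
```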

add voice description or select provided item description

grab outfit or check the match

prototype 2 user data

results

When selecting an outfit for the day, time-on-task decreased using Prototype 2, versus the control group data, by 13% for the social outfit, 44% for the professional outfit, 35% for the casual outfit, and 37% for the special occasion outfit. All seven participants (100%) were impressed with the ease of use and accessible design of Prototype 2, saying it was really clear and easy to understand, the labeling was on point, and all buttons were well labeled. Many participants were eager to use this tool as soon as possible.
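
For clarity, the percentages above reflect a standard percent-decrease calculation: percent decrease = (control time - prototype time) / control time. The sketch below demonstrates the arithmetic with hypothetical example times rather than the study’s raw data.

```swift
import Foundation

// Percent decrease in time-on-task relative to the control (manual) baseline.
func percentDecrease(controlSeconds: Double, prototypeSeconds: Double) -> Double {
    ((controlSeconds - prototypeSeconds) / controlSeconds) * 100
}

// Example: a 90-second manual selection reduced to about 50 seconds with the
// prototype corresponds to roughly a 44% decrease (illustrative numbers only).
print(String(format: "%.0f%% decrease", percentDecrease(controlSeconds: 90, prototypeSeconds: 50)))
```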

results data

conclusion

By designing this mobile application user experience for people with visual impairment as the initial audience, the goal of a final inclusive design, one that is accessible to as many people as possible, is achievable. Through immersive research, the visually impaired participants were able to demonstrate how they complete tasks independently and provided insight into target areas this design could improve. Data gathered in this thesis research study indicate that a mobile application and scan solution can decrease time-on-task and achieve successful identification when assisting people with visual impairment in choosing an outfit for the day.
