LikeThat Fashion App

Hired to ideate on apps that could incorporate a patented visual search technology, I drove research, discovery, UX and final visual design, always with the goal of providing user delight. I worked closely with the team as we began by understanding users' goals, needs and desires in the world of women's fashion. Early on, I conducted competitive product reviews and led primary research, including expert interviews, demographic focus groups and analysis of fashion-oriented online content and apps.

Research Phase

Target Demographic: Fashion-conscious women, ages 16-39. Other SMEs and stakeholders included algorithm scientists, fashion bloggers, sales clerks and women shoppers. To round out the research, I met with each stakeholder to gather their vision for the company and products, held phone calls with professional fashion bloggers and talked with sales clerks.

Persona Development

The Fashionista – She's into fashion big time! She follows fashion-forward stars on Instagram and YouTube, and her friends ask her advice before they buy something. She posts to her own blog or sites like The Hunt and has tons of fashion photos in her phone's album. She tolerates a higher designer price point on key items.

The Fashion Hunter – She has come across a fashion item she's madly in love with. It could be from last season, belong to a stranger, be out of stock or unavailable in her size. She needs a way to find this item or something close to it, and she's willing to look online and spend considerable time searching.

Fashion on a Budget – She's younger and still developing her own style. She wants plenty of inspiration, is budget conscious and is seeking a great deal. She has several store apps that send her email coupons, and she shops online most of the time at stores she trusts.

Competitive Review

To better understand user desires, needs and pain points, we conducted a competitive product review. We looked for insights into the problems these apps were solving and how we could potentially do it better. The main apps we reviewed were The Hunt, ASAP54, Instagram and ShopStyle.

Focusing on User Goals

Since photographs are the heart of JustVisual's search technology, I led the team in exploring ideas around user intent and photo journey lines. To gain a clearer understanding and build user empathy, I also had the team create problem statements. Contribution – Used UX exercises to lead the team through a deeper understanding of user goals and problems to be solved. Ideation included user journey lines, assumptions gathering, problem statements and storyboarding.

Hypotheses to Test

We brainstormed and voted on our most promising hypotheses to test based on our problem statements.

  • If we provide a camera utility, women will take a photo of a fashion item they want to find

  • If we search hundreds of retail stores (online and offline) for similar items, users will find what they want and save time

  • If we connect users to similar products that cost less than the original, they will be delighted

  • If we give users a fashion gallery, they'll browse, find something that interests them and tap on it

  • If we curate hundreds of fashion photos by similarity, women will use them for outfit inspiration

Narrowing in on Assumptions

  • Women often see an item that is unavailable

  • Women find fashion items they like, but the items are too expensive

  • Women have photos of fashion they like or want to buy in their mobile phone album

Discovery through User Research

I next created some quick paper tests to run with users. Prior to our onsite meeting, I asked each user to email me photos of products they wanted. In each user session we'd test a hypothesis and measure its success metric, which would inform whether we should persevere or pivot.

Hypothesis – Participants have a photo of something they want and would be delighted with our product results. 

In one case, I mocked up the final result by hand, without the algorithm's help; it still let us measure whether users liked the idea. In a later session, I had our engineers manually run the photo through the algorithm and printed out the results. In both sessions, users expressed surprise and excitement when shown the results. Our success metric came in at 95% positive.

Field Research

Follow-me-homes and observing users in their environment provided powerful insights into user pain points in the real world, revealing what users "say" versus what they "do". To quickly generate insights, I spent time in retail spaces observing and talking with sales clerks and shoppers about their experiences, watching for how photos played a role in fashion exploration.

Another main hypothesis we wanted to test was around inspiration; Instagram has a huge following for photos of complete outfits.

Hypothesis – Women want outfit inspiration!

Again we had participants submit photos, this time of a piece of clothing they currently owned. Since the algorithm was not trained on complete outfits, I simulated the concept by manually assembling a collection of matching outfits and printing the results. To the team's surprise, the final usability metric was only 30% positive: most women felt confident in their style, didn't necessarily agree that these outfits were something they'd wear and felt they already had enough inspiration.

Pivoted Hypothesis

Sketching

Based on our findings thus far, I began sketching the UI, data elements and overall functions, presenting a wide variety of ideas. Some designs emphasized the phone's camera, while others asked permission to access the user's photo album. The vision came together as we considered user goals, simplicity and a delightful experience.

The app went through several wireframe iterations as we considered different ways of leading the user through a best-success path.

Wireframing

Prototyping

Pixate proved to be the best prototyping tool. I started with InVision but needed better transition animations. These animations helped the app feel real for user testing and also showed engineers how I envisioned gestures and transitions. Because of tight timelines, the team wanted to develop a high-fidelity prototype, so I quickly put together some designs and continued to refine the visuals prior to cutting assets for the engineers. Contribution – Developed a number of working prototypes that I used in the field, conducting impromptu user tests at malls and with friends and family.

Challenges, Ideas & Camera Usage 

The team continued to build out the prototype while the algorithm team released version 1.0 of the visual search program. Our friends and family sent in images for us to test, and it became apparent that most of these images were poor quality: shot from an odd angle, poorly lit, items on hangers, or clothing cropped or obscured. In response, the algorithm team began training on poor-quality images to compensate.
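The quality problems above (poor lighting, washed-out or obscured shots) are the kind of thing a simple client-side pre-check could flag before a photo ever reaches the search engine. As a purely hypothetical sketch, not part of the actual app or JustVisual's algorithm, a minimal brightness/contrast heuristic might look like:

```python
import numpy as np

def photo_quality_flags(pixels, dark_thresh=40, flat_thresh=20):
    """Flag common quality problems in a grayscale image array (values 0-255).

    Hypothetical heuristic for illustration only; in practice the
    algorithm team trained on poor images rather than rejecting them.
    """
    flags = []
    if pixels.mean() < dark_thresh:
        flags.append("too_dark")      # poor lighting
    if pixels.std() < flat_thresh:
        flags.append("low_contrast")  # flat, washed-out shot
    return flags

# Toy examples: a near-black frame vs. a well-exposed one.
dark = np.full((100, 100), 10.0)
good = np.random.default_rng(0).uniform(0, 255, (100, 100))
```

A check like this could drive the contextual camera tips described below, surfacing advice only when a submitted photo trips one of the flags.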

To address these issues, I proposed contextual training to help users take better photos. Ideas included camera tips shown after a poor or null result, and a camera overlay prompting the user (with or without text) to align the item being photographed in a particular way. Contribution – Committed to the best user experience, I designed a number of contextual solutions to help improve user-generated photos.

With the team hesitant to include any camera education, I explored more radical ways of improving the user's experience. Using the algorithm, I tested professionally shot store images against user-generated images; the professional photos fared better. I then proposed downplaying camera and album usage and introducing a filterable product feed whose images could serve as search queries. Contribution – I experimented with all types of photos (user-generated shots, blog-style shots, magazine spreads, Instagram photos and merchant photos), finding that professionally shot merchant images returned the best results.
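The patented search technology itself isn't described here, but the comparison above, ranking catalog items by visual similarity to a query photo, can be sketched in principle. The following is a hypothetical illustration using toy feature embeddings and cosine similarity, not JustVisual's actual method; all item names and vectors are invented:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_catalog(query_vec, catalog):
    # catalog: dict mapping item name -> feature vector.
    # Returns (name, score) pairs sorted best-match first.
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in catalog.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy 3-d vectors standing in for learned image embeddings (hypothetical).
catalog = {
    "red_dress_merchant": np.array([0.9, 0.1, 0.0]),
    "blue_jeans_merchant": np.array([0.1, 0.9, 0.2]),
    "red_dress_user_photo": np.array([0.7, 0.3, 0.4]),
}
query = np.array([0.95, 0.05, 0.05])  # e.g., a well-lit shot of a red dress
results = rank_catalog(query, catalog)
```

In this toy setup the clean merchant shot of the matching item scores highest, mirroring the finding that professionally shot merchant images made the best queries.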

[Images: Data Set, Camera Help, Feed]
Image 1: Data set results showing how a poorly shot image returns mismatches. Images 2 & 3: Ideas for improving user images, shown in use. Images 5 & 6: A user feed of images used as search queries.

Visual Design

Initial visual design treatments included “luxury” blog photos with beautiful environments. However, we pivoted on such photos, as they didn’t always return the best results. Later I altered the design to have a two-column layout to accommodate smaller photo sizes scraped from online merchants.

[Visual design mockups: tap-to-start, community and try-it screens]

Launch, Usage, Retention and Awards

The LikeThat Style and StylePlus apps launched in the App Store, and the team monitored app growth over the next two months. We created Facebook ads to drive installs. Users appeared excited about the app; many would tap 10-15 images in one session and return a few days later with more taps and submissions of their own photos. Retention looked promising.


The app was featured in articles and fashion blogs and won first prize at the TNW Conference. The entire app suite, including LikeThat Decor (home decor), LikeThat Pet and LikeThat Garden, was also nominated for a Webby in 2016.

Full LikeThat App Suite

Success with the LTS Fashion App led to a larger app suite with experimental apps in other verticals, such as home decor, plant identification, finding a pet in local shelters and games involving face matching. Visual designs for these new verticals were never finalized, and for financial and legal reasons unrelated to the apps' success, JustVisual, a Palo Alto startup, closed in 2015. Sadly, the apps are no longer available in the app store.

© 2020 Kelly Harrison. All rights reserved.
