Tasty

Tasty is a company specializing in simple recipes accessible to all ages; they launched their Android app in 2017 and expanded to iPhone users in 2018. For my senior project in college in spring 2017, I focused on Tasty's app, exploring mobile interactions that could change how we engage with recipes.

What made Tasty different.

While Tasty is now a well-known brand, when I began this case study they were primarily a social media venture. Their video content was loved for its instructional cooking guidance, with overhead shots for step-by-step visual learning and precise ingredient measurements. As they developed their mobile app, I took the opportunity to explore how it could stay user-friendly and enjoyable for people cooking on the go.

Observing how users follow recipes.

Through interviews and user testing, I discovered that 7 out of 10 people gather their ingredients before cooking, and 9 out of 10 people go back to the ingredient list twice before moving on to the next steps.

This surfaced 3 pain points:

  • Users had to jump back and forth between ingredients and instructions, often losing their place.
  • Users had to repeatedly wash their hands before touching the phone to move on to the next step.
  • Users had to locate and measure each ingredient correctly before moving on, which put following the recipe accurately at risk.

Going back to how we currently follow recipes: books, articles, and mobile apps

Since recipes were first written down, ingredients have been listed first and instructions second, because gathering the ingredients was always the first part of the cooking process.

Over time we've seen a few different ways of laying out that information: across two pages, in two columns, or with the ingredients above the instructions.

Cooking up a solution

I hypothesized that if users could swipe left and right through a recipe instead of scrolling up and down, it would be easier to follow. I also believed that hands-free controls would free users from having to touch the device and let them focus on the cooking.

Swipe, tap, voice, or all of the above?

As I began sketching solutions to my hypothesis, I wondered what would work best for users while cooking. I created a storyboard to sketch out how a user might interact with the different features and to identify the best interaction to test in a low-fidelity wireframe.

Testing the idea in the kitchen.

Once I chose an idea, I brought the sketch into Adobe XD to create a low-fidelity prototype, which I tested with 3 people in the kitchen.

The results told a different story:

  • Users continued to go back and forth between ingredients and instructions.
  • The swiping interaction was not effective while cooking.

Cooking up a NEW solution.

My new hypothesis was that if we gave users a step-by-step feature combining instructions, ingredients, and video, they could spend more time enjoying cooking the recipe and less time trying to understand it.

Ingredients and instructions

Even though the swipe feature doesn't work well while the user is cooking, it still has potential when users are browsing for a recipe to cook, helping them quickly scan servings, time frames, and difficulty.

Step by step experience

Not only does the user get a step-by-step experience in the video, but each step also includes its ingredients and instructions, giving users all the information they need a little at a time. To move to the next step, users can either tap Next or swipe, which means less interaction is needed to follow a recipe.
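For anyone curious how this paged, step-by-step pattern could be wired up on Android, here is a minimal sketch using Jetpack Compose's HorizontalPager. The RecipeStep model and the layout are hypothetical, and this is not Tasty's actual implementation; it only illustrates tap and swipe driving the same pager.

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.pager.HorizontalPager
import androidx.compose.foundation.pager.rememberPagerState
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.rememberCoroutineScope
import kotlinx.coroutines.launch

// Hypothetical model: each step bundles its instruction, the ingredients
// it needs, and a short video clip, so the cook never has to jump back
// to a separate ingredient list.
data class RecipeStep(
    val instruction: String,
    val ingredients: List<String>,
    val videoUrl: String
)

@Composable
fun StepByStepRecipe(steps: List<RecipeStep>) {
    val pagerState = rememberPagerState(pageCount = { steps.size })
    val scope = rememberCoroutineScope()

    Column {
        // Swiping left or right moves between steps.
        HorizontalPager(state = pagerState) { page ->
            val step = steps[page]
            Column {
                Text("Step ${page + 1} of ${steps.size}")
                Text(step.instruction)
                step.ingredients.forEach { Text("• $it") }
                // A real screen would embed the step's video player here.
                Text("Video: ${step.videoUrl}")
            }
        }
        // Tapping Next advances the same pager, so tap and swipe
        // are interchangeable ways to move forward.
        Button(onClick = {
            scope.launch {
                pagerState.animateScrollToPage(pagerState.currentPage + 1)
            }
        }) {
            Text("Next")
        }
    }
}
```

Because the Next button and the swipe gesture both move the same pager state, tap and swipe stay interchangeable, which is what keeps the interaction cost low while cooking.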

Delicious results!

I tested the idea with a high-fidelity prototype, with 5 people each cooking one recipe from Tasty. The results were promising, based on how users interacted with the prototype and how much less often they stopped to figure out the next step. One even said, "this is why I loved Tasty in the first place."