Increasing transparency of video recommendations on YouTube — a UX case study

All eyes on it.

Aosheng Ran
UX Collective

--

What does a more transparent YouTube look like? Over the last three weeks, I set out to re-imagine a YouTube experience that helps us understand algorithmic personalization and make better sense of the world.

Agency

YouTube is the most popular video-sharing website in the world. In 2019, more than 500 hours of video were uploaded to YouTube every minute, available for billions of people to watch.

Thanks to Google’s deep neural network, only dozens of videos show up on our home page. The algorithm is capable of sifting through billions of videos and serving different videos to each individual in real time. Google described it as “one of the largest scale and most sophisticated industrial recommendation systems in existence.”
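
Google’s 2016 paper on the system (“Deep Neural Networks for YouTube Recommendations”) describes a two-stage pipeline: a candidate-generation step narrows billions of videos down to a shortlist of a few hundred, and a ranking step orders that shortlist for each viewer. Here is a rough, purely illustrative sketch of that shape; all names and scoring heuristics below are hypothetical placeholders, not YouTube’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    category: str
    popularity: float  # e.g., a normalized view count

def generate_candidates(corpus: list[Video],
                        watched_categories: set[str],
                        k: int = 200) -> list[Video]:
    """Stage 1: cheaply narrow billions of videos to a shortlist
    using coarse signals such as the viewer's watch history."""
    relevant = [v for v in corpus if v.category in watched_categories]
    return sorted(relevant, key=lambda v: v.popularity, reverse=True)[:k]

def rank(candidates: list[Video], predict_engagement) -> list[Video]:
    """Stage 2: order the shortlist with a finer-grained model that
    predicts how likely this viewer is to engage with each video."""
    return sorted(candidates, key=predict_engagement, reverse=True)

# Only the top few dozen of the ranked list reach the home page.
```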

The algorithm allows us to find the most viral cat videos, the epic fails on The Voice, and vlogs of the most expensive foods. It also allows autoplay to serve children inappropriate videos after their favorite animated series. It also allows pedophiles to discover homemade videos of kids in swimsuits playing in the backyard pool. It also allows conspiracy believers to binge-watch a daily dose of radical political theories.

What videos are shown to us? Many of us already know that YouTube recommends videos based on our activity history, but our understanding ends there. When the impact is amplified to a global scale, that gap leaves us in a vulnerable position.

We don’t have agency over these video recommendations, or over many other streams of information in our digital lives today.

Transparency

Transparency is the first step toward agency. To start gaining control over what’s served to us, we could try to answer:

How might we make the process and implications of video recommendations more transparent to viewers?

Defining Transparency for YouTube Viewers

In an iterative control cycle, a personalization algorithm should demonstrate transparency at each of the moments below.

Define: What is the goal of the algorithm?
The algorithm should state its objective: to find the videos I’m most likely to engage with.

Plan: What attributes are considered?
The algorithm should tell me what influenced the video recommendations. Based on Google’s explanation and YouTube Help, the reasons might include:

  • My activity and watch history, including implicit feedback (e.g., staying on a page or watching most of a video).
  • My demographic information.
  • The features of videos, such as their categories and popularity.

Act: Can I see the process and output of the algorithm?
For each video, I should be able to see YouTube’s prediction of my engagement and the specific attributes behind it.

Measure: How does the algorithm change as I watch on YouTube?
I should be able to see, in real time, how my actions influence YouTube’s understanding of my watching habits.
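
To make these four moments concrete, here is a minimal sketch of the data a transparent recommendation would need to carry. The names and fields are hypothetical, inferred from the principles above rather than from YouTube’s actual system; the Measure step is sketched later, in the Concepts section.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeLabel:
    """Plan: one factor that influenced the recommendation."""
    name: str            # e.g., "Watch history" or "Video popularity"
    description: str     # a short explanation shown to the viewer
    contribution: float  # Act: this factor's share of the prediction

@dataclass
class TransparentRecommendation:
    """What a viewer would see alongside each recommended video."""
    goal: str                    # Define: the algorithm's stated objective
    predicted_engagement: float  # Act: the algorithm's output for this video
    attributes: list[AttributeLabel] = field(default_factory=list)
```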

Concepts

Here are two concepts that aim to help viewers gain a better understanding of video recommendations within the existing YouTube experience.

When viewers visit YouTube, they see attention scores under the recommended videos: a metric representing YouTube’s prediction of how likely they are to engage with each video or topic.

Attention scores declare the presence of algorithms. They provide entry points into understanding video recommendations.

Attention scores are understandable. They transform the numeric values of algorithmic output into percentages that viewers can understand and compare across videos.
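
As an illustration of that transformation (a sketch of one plausible mapping, not YouTube’s actual formula), a raw model output could be squashed into a comparable percentage:

```python
import math

def attention_score(raw_logit: float) -> str:
    """Map a raw engagement prediction (a logit) to a viewer-facing
    percentage via the logistic function. Purely illustrative."""
    probability = 1.0 / (1.0 + math.exp(-raw_logit))
    return f"{probability:.0%}"

print(attention_score(1.2))  # -> "77%"
```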

Hovering over an attention score reveals attribute labels: the factors that influence the recommendation and contribute to the score of each video.

When playing a video, viewers can also see the attention score and the attribute labels at the side.

Hovering over labels reveals short descriptions and relevant statistics.

Attribute labels add clarity to why and how videos are recommended. They demonstrate the capabilities and limitations of the algorithm.

Attribute labels are contextual. They surface just enough behind-the-scenes information, based on viewers’ preferences, without disrupting their exploration.

The attention score stays on the page and updates in real time as viewers continue to interact with the video (watching, liking or disliking, commenting, etc.).

Through long-term use, viewers could come to understand how their different behaviors refine the recommendations.
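
To illustrate that feedback loop (again a hypothetical sketch, with made-up event names and weights rather than YouTube’s real signals), each interaction could nudge the on-page score:

```python
# Hypothetical weights each interaction adds to the displayed score.
INTERACTION_WEIGHTS = {
    "watch_most_of_video": 0.15,  # implicit feedback
    "like": 0.10,
    "dislike": -0.20,
    "comment": 0.12,
}

def update_attention_score(score: float, event: str) -> float:
    """Recompute the on-page attention score after an interaction,
    clamped to the displayable 0-1 range."""
    return max(0.0, min(1.0, score + INTERACTION_WEIGHTS.get(event, 0.0)))

score = 0.77
score = update_attention_score(score, "like")     # -> 0.87
score = update_attention_score(score, "dislike")  # -> 0.67
```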

Design Process

YouTube developed a sophisticated machine-learning algorithm that remains largely opaque to most people. Explaining the algorithm and its impact to non-engineers is a challenge in itself. How can we make this technical information more understandable to viewers?

Listing attributes

Focus

To define an appropriate scope, I drafted principles that helped me unpack the meaning of transparency from the perspective of YouTube viewers. Inspired by the design frameworks that teams like Oracle Design have used for automation, I based the principles above on popular iterative feedback-loop models such as plan-do-check-act and build-measure-learn.

Brainstorming

From the principles, I started looking for opportunities on the YouTube homepage, where most videos are recommended to us.

Brainstorming concepts
Brainstorming UX

When brainstorming concepts and key uses, I found it essential to treat the new features as an organic addition to the current YouTube experience. The behind-the-scenes information should not change the way viewers prefer to explore videos.

Prototype & Test

To complement a design process driven by secondary research, I sought early feedback on the concept and its usability. I reached out to six people in my school community who use YouTube frequently.

I believed that people would understand the concept and its implications better if they could see their own data in the prototype. Each conversation therefore started with an exploratory interview, during which I built a low-fidelity prototype populated with the interviewee’s data for testing.

Low-fidelity prototype. Each set includes four screens, from discovering a video to providing feedback.

Interviewees were asked to pick a video they wanted to watch at that moment and explain why. I populated the attribute labels based on the watching habits they described.

Most interviewees found the features difficult to understand at a glance, given the limited interactions available in the prototype. The next iteration therefore focused on providing additional explanations and statistics to make the information more useful.

Stories

How would seeing the process behind video recommendations change our watching habits? The interviewees answered this question as they shared stories about their lives with YouTube.

“I will find a feature like this interesting in the beginning, but after a while, it might feel creepy when I get to see all the things YouTube knows about me.”

Julie, a fine arts college student, frequently browses the home feed when she’s looking for videos to watch with a meal. She subscribes to multiple vloggers who travel and explore food.

Julie’s homepage (Taiwanese) with attention scores.

When I asked Julie to show me her homepage, she started explaining why the videos were there. “My friend recommended this food channel to me, so I searched for it, and it has been on my homepage since.” “Why is this video even here?” I then asked her to pick the video she most wanted to watch at that moment, and she pointed to a food vlog in Taipei. “This was from my favorite vlogger, but I haven’t looked at her stuff recently. This looks delicious.”

Julie picked a food vlog.

Speaking of what would happen if features like these existed in reality, she said, “I don’t know how it would change my habits in the future, but I guess ultimately I would have a more dynamic relationship with YouTube, because I’d know what it thinks of me when I use it.”

Compared to Julie, Lynn is much more aware of her time spent on YouTube. A busy architect, she turns to YouTube for tutorials and immediate relaxation. She is very tactical in searching for the right episodes of drama shows, or for white-noise videos to help her fall asleep.

Lynn’s homepage with attention scores.

She said, “I feel guilty that I spent time watching mostly useless stuff. I’ve wanted to watch news or something more meaningful on YouTube, but I haven’t found a reliable channel yet. It doesn’t help that the platform keeps pushing the same things to my homepage.”

Lynn picked a drama show video.

Lynn continued, “I think I spend a lot of time on YouTube not knowing what I’m doing. YouTube keeps recommending me video after video, as if it keeps feeding me while I’m trapped in this place.”

“If you’re conscious of it, this feature could act like a report card, a summary of the things I watched over the last month. I might treat it like the Screen Time feature. When I look at the labels, I would wonder: what have I watched?”

Taylor, an industrial designer and music lover, avoids interacting with YouTube as much as possible.

She describes the videos she has watched as information she has already processed. “I’m tired of searching for something random just for research, and then seeing similar videos on my homepage. Seeing a large pile of them thrown back at me makes me feel like I’m being bombarded with trash; it’s overwhelming.”

At the same time, she finds the videos’ intimate connection to her life uncomfortable. “It somehow seems that all the videos on the homepage have something to do with my life. Opening the homepage, especially in front of others, is almost like exposing my private life out in the open.”

Taylor’s homepage with attention scores.

She’s aware that many of her actions leave an impression on YouTube. “I tend to linger on the music videos that I like for a bit longer and hope YouTube will learn about my interests.”

“Sometimes, it feels like I’m not using a platform. It feels like I’m talking to an AI. Every action I take will be recorded and searched for meaning. It feels scary sometimes that it’s become more human-like and passes judgment on my activity.”

Taylor picked a music performance video.

“If the feature existed, I would run away from YouTube and force myself to use it less. It feels creepy when my personal info hangs next to the video I’m watching, because it suggests that my information is in someone else’s hands. It feels like I have no control over the situation.”

“It’s brutal. I’m already concerned about the tracking behaviors from companies like Google. People like me will probably be scared away when the camouflage of these behaviors disappears, and everything gets exposed.”

Future

In June 2019, YouTube shared updates on its efforts to deal with harmful content on its platform. These are mostly changes to the algorithms that promote, demote, and remove videos based on its policies. In October 2019, Mozilla questioned the effectiveness of YouTube’s approach and stated that “the era of ‘trust us’ is over,” listing recommendations for YouTube to give independent researchers better tools to understand the problem. Meanwhile, Mozilla’s #YouTubeRegrets project shed light on how people’s lives are affected by dangerous recommendations from YouTube. These individual stories are often dismissed in our public discourse, but they are reminders that designers aren’t only building experiences: design at this scale has a social and cultural impact on all of us.

When we make products with transparency and honesty in mind, we can create a culture where more dialogue exists between humans and technology, and a future where we are collectively empowered to understand our decisions.

Acknowledgements

Especially grateful to Kate Rutter for being an amazing mentor who motivates and challenges me to push forward.

Special thanks to Apurva Shah and Weiwei Hsu for giving enlightening feedback on drafts of this article.

Thanks to the lovely anonymous people who agreed to have conversations with me and share vulnerable moments of their lives.

Thanks to the Oracle Design team for sharing insightful knowledge with the CCA community.

Thanks to Nathalia Kasman for many inspiring discussions.
