Millions of young people use TikTok to de-stress from daily responsibilities and routines. The ease of watching videos on the For You Page is what sets TikTok apart from other social media apps. Yet it’s often hard to unwind on TikTok, since toxic content still slips onto users’ screens.
tl;dr Users feel uncomfortable viewing negative videos and comments.
So, how can users de-stress while continuing to interact with the platform?
To get the clearest insights into the user experience on TikTok, I set goals for my user research.
User Research Goals
1. Find out why people use TikTok — what motivates them to open the app?
2. Determine how comfortable users are expressing themselves on TikTok
3. Learn about the types of content users (dis)like seeing
I interviewed four TikTok users and conducted a survey to validate my interview insights.
Common user-reported pain points:
“Some communities are filled with hate and bigotry. I do not like to see that content on my feed.”
“Sometimes a certain Tik Tok will be too specific and relatable about a past trauma. Other times I see homophobic content.”
“People are so mean on there sometimes :(”
The more data I collected, the clearer it became that this was a mental health issue.
With these insights in mind, I started to define the people problem.
The People Problem
Many people use TikTok to de-stress by zoning out on the For You Page. Their goal is to relax and take a break from everyday responsibilities.
Part of the TikTok experience is reading comments, which tend to build on the video’s content. Yet, this clashes with the de-stressing experience, because:
- Comments often become hateful and combative, which discourages users from commenting. This fear keeps users from having the full TikTok experience; some end up avoiding the comments section altogether.
- Negative or even triggering content can end up on the FYP and harm the user’s mental health. Even though TikTok’s algorithm quickly identifies a user’s unique interests, people still sometimes see videos that upset them.
After brainstorming dozens of solutions, here are my top three:
I wanted a solution with room for growth. Toxic content on social media is a big problem that goes beyond TikTok. I felt that designing for features 1 or 3 would not let the app address the problem head-on; they were band-aid fixes, not fundamental changes.
I considered how similar “community spaces” on other social media platforms shaped user content. On those sites, community-specific moderators were quick to remove rule-violating content. Translating this effect to TikTok would best address my people problem.
This feature is deeply integrated into the app. Creating an information hierarchy made it much easier to visualize where each interaction with a Space fits:
After deciding where different parts of a community space would reside, it was time to implement.
For this case study, I focused on the flows that address my people problem’s biggest themes: comfort and safety. In the following flows, I focused on how my solution helps moderate content and gives users more power over what they see on TikTok.
Flow 1 uses the Search feature on the Discover page to find a new Space to follow. I added a Space tab to the top navigation bar. From there, the user can look at spaces that relate to the Search phrase. The bottom row shows how to hide a Space from its page and view other features of the Space.
Flow 2 lets users hide an entire space from their For You Page. Instead of just pressing “Not interested,” TikTok gets more specific feedback. This way, the app can better learn what users don’t like, since a Space tends to have a certain type of content in it.
Flow 3 shows how Space moderators review reported content. This feature is essential to maintaining a safe, pleasant Space. Moderation is hard at TikTok’s scale, with millions of videos uploaded each day, but each Space has its own moderators to handle rule-breaking videos.
Flow 4’s report feature mirrors the existing flow for reporting a user profile. By letting users report whole Spaces, it lessens the risk of a Space becoming a harmful filter bubble. As part of TikTok, Spaces still need accountability.
Flow 5 shows how a user would report a video for violating the rules of the Space it belongs to. It mostly follows the current reporting system, but I added a step to choose which guidelines the video violates.
Next, I’ll make moderation convenient and easy by focusing on:
- Managing users within a space
- Adding another entry point for moderators to review reports
I will also improve the creator side of a Space. Content creators will contribute to Spaces, unlike the passive viewers I designed for in this case study. I plan to design flows for:
- Creating a space
- Adding a video to an existing space