How TikTok Decides Who To Make Famous

Once you have a product, distributing it becomes the next challenge for any entrepreneur. At TRASH (one-tap video editing), we looked to TikTok as a potential marketing channel. As early learnings started to roll in, we decided to share what’s going on inside this exploding and mysterious beast.

First appeared on TechCrunch – Feb 17, 2020
By Hannah Donovan & Geneviève Patterson

The advantage of having a deep tech company that uses AI to help speed the process of editing video is that we can do it for “free.” This is pretty cool when you consider that editing a semi-pro video will run you a minimum of $1,500 and six hours in post-production. When we started working on distribution and how to hack our CAC (customer acquisition costs), TikTok was first.

When posting to TikTok, there are three key areas to pay attention to:
• What contributes to your authority score
• The review process and making it to the For You Page
• Making better content (and what you might be doing wrong)


The most critical part of posting to TikTok is your authority ranking, which is: “how much of an influencer are you?” Your authority ranking is directly tied to your verticals (the styles you’re making videos in).

  1. New accounts. Like your Uber five-star passenger rating, every post you make contributes to your score.
  2. Multiple accounts. TikTok allows for multiple accounts, but pro tip: multiple accounts from one phone will flag you as a business account and like many platforms, they’ll de-prioritize you unless you’re a paying advertiser. If you’re giving some of these things a try, limit your account login to one device.
  3. The first five videos you post. TikTok wants you to create types of videos that stay in the same vertical. So if you are making meme videos in your first five, TikTok will basically say, “this is a meme account.” So, the first five are critical: you need to have a plan and focus.
  4. Verticality. TikTok doesn’t want you being experimental. Pick a content vertical and stay with it. Content that varies or doesn’t have a specific theme won’t weigh well. If you start to make videos that fall into a different category, it’s like starting over because you don’t have authority on that vertical yet.
  5. Views. If your videos get 100 or fewer views, you’re going to have a zombie account, so delete and start again. Videos that get 1,000–3,000 views mean you have a mid-tier account. Videos that get 10,000+ views mean you have a “head” account.
  6. Viewing completion. This is one of the most important factors. Your video needs to be viewed from start to finish to count for this metric (a rough sketch of how completion and view tiers might be scored follows this list). The key things that help with this are:
  • Short videos. Videos can be up to 60 seconds long, but TikTok recommends to their advertisers that they be 9–15 seconds (the internet thinks the average length of a TikTok is 30 seconds).
  • Looping videos. If the video is watched repeatedly, then its Completion Ratio will be over 100% and will increase the overall performance rating of the video. A common practice is to create seamless loops in the video so that viewers are tricked into watching it multiple times.
  • Format. Often there will be a challenge format with a punchline at the end. People understand this format so they’ll stick around to see the punchline.
  • Matching action to music. Always more satisfying to watch.
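None of these weights are public, so take the following as a toy illustration only. The Python sketch below shows how a completion ratio (which loops can push above 100%) and the rough view tiers above could be folded into a simple per-account score; every function name, threshold and weight here is our own guess, not anything TikTok has confirmed.

```python
from dataclasses import dataclass

@dataclass
class VideoStats:
    length_s: float        # video length in seconds
    total_watch_s: float   # watch time summed across all plays, loops included
    views: int

def completion_ratio(v: VideoStats) -> float:
    """Average watch time per view divided by video length; loops push this above 1.0."""
    if v.views == 0 or v.length_s == 0:
        return 0.0
    return v.total_watch_s / (v.length_s * v.views)

def view_tier(views: int) -> str:
    """Rough tiers from the list above: <=100 zombie, ~1k-3k mid, 10k+ head."""
    if views >= 10_000:
        return "head"
    if views >= 1_000:
        return "mid"
    return "zombie"

def authority_score(history: list[VideoStats]) -> float:
    """Hypothetical running score: average completion, boosted by view tier."""
    tier_weight = {"zombie": 0.2, "mid": 1.0, "head": 2.0}
    if not history:
        return 0.0
    return sum(completion_ratio(v) * tier_weight[view_tier(v.views)]
               for v in history) / len(history)

# A 12-second looping video, watched ~18s on average by 2,500 viewers:
looping = VideoStats(length_s=12, total_watch_s=18 * 2_500, views=2_500)
print(view_tier(looping.views))                 # mid
print(round(completion_ratio(looping), 2))      # 1.5 -> a 150% completion ratio
print(round(authority_score([looping]), 2))     # 1.5 under these made-up weights
```

The takeaway is simply that watch time relative to length, not raw view count, is what the completion metric rewards.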

So, now that you know all the ways you can eff up your authority score, have a plan for the type of account you want to create and have made five killer videos, you’re ready to start posting. Here’s what happens next, including how to get onto the coveted FYP (For You Page).

  1. Authority-based automatic distribution. Based on your score, your video goes out to a geo-local network of about 300–500 viewers. At this point, there are no real checks on your content.
  2. Integrity-based AI review and data collection. Shortly after this initial fan-out to a few hundred people, it’s being checked frame-by-frame by an AI for inappropriate content, copyright issues, etc. It’s then given a new weighting (integrity rating) and is either de-listed or distributed again.
  3. Delayed explosion. This is one of the biggest differences between TikTok and other platforms, and it gives you a second chance of getting onto the FYP. Delayed explosion is also why you should think carefully before deleting old content, regardless of how well or poorly it did before. Periodically (it’s unclear what timescale this happens on; it could be weeks or months), TikTok re-tests your older content. The FYP hides publish dates, so viewers can’t tell a post is old. The re-test restarts a cycle that looks something like this: a small batch of viewers for about two hours; then a medium batch, where the AI looks at the key metrics that feed into your authority rating; finally a large batch that also factors in your integrity rating (no content they consider “bad”). At that point, the video is effectively treated as, “hey, we’re a top 5% video.”
  4. Human review. A human reviewer will see the video with these scores and decide if it has the potential to be a super-viral video. They’ll also double-check for copyright and “bad” content that may have slipped past the AI in step two. To be promoted to the FYP, the content must fit TikTok’s (and, as a Beijing-based company, inevitably China’s) idea of what is nice and popular in the geo-local region. Common things that have been noticed: people who represent conventional beauty standards (though this may also be algorithmic bias trained on human bias), no strong political opinions (unless they’re joking or meme-y in nature, and in certain countries only… though probably not Winnie The Pooh) and no violations of the most reactive local social norms. There’s definitely a degree of homogeneity going on here. This might offer some insight into why TikTok wants you to create content based on copying: it makes content easier to review and keeps it within the format of not just what “works” (i.e. is going viral) but what aligns with their opinion of what is okay. (A rough sketch of this whole pipeline follows below.)

This vid says it all.
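To pull the four steps above together, here is a minimal Python sketch of the whole cycle as we understand it: a small authority-based fan-out, an AI integrity gate, progressively larger re-test batches, and a final human check before FYP promotion. The batch sizes, thresholds and scoring stand-ins are all assumptions made up for illustration, not TikTok’s real logic.

```python
from dataclasses import dataclass
import random

@dataclass
class Post:
    creator_authority: float  # 0..1, the creator's running authority score
    integrity: float          # 0..1, the AI's "integrity rating" (1.0 = clean)
    appeal: float             # 0..1, hidden quality used to fake engagement

def fan_out(post: Post, batch_size: int) -> float:
    """Serve one test batch and return the fraction who watch to completion."""
    completions = sum(random.random() < post.appeal for _ in range(batch_size))
    return completions / batch_size

def run_pipeline(post: Post) -> str:
    # Step 1: authority-based fan-out to a small geo-local batch, no checks yet.
    first_batch = int(300 + 200 * post.creator_authority)   # ~300-500 viewers
    fan_out(post, first_batch)

    # Step 2: AI integrity review; clearly "bad" content is de-listed here.
    if post.integrity < 0.5:
        return "de-listed"

    # Step 3: delayed explosion -- small, medium, then large re-test batches,
    # each gated on engagement and (for the last one) the integrity rating.
    for batch in (2_000, 20_000, 200_000):                  # sizes are guesses
        score = fan_out(post, batch)
        if score < 0.4 or (batch == 200_000 and post.integrity < 0.9):
            return "normal distribution"

    # Step 4: human review decides whether a top-performing post hits the FYP.
    human_ok = post.integrity > 0.9                          # moderator stand-in
    return "promoted to FYP" if human_ok else "normal distribution"

print(run_pipeline(Post(creator_authority=0.8, integrity=0.95, appeal=0.6)))
```

The real gates, batch sizes and ordering are opaque; what matters is the shape: engagement and integrity are re-checked at every stage before a human ever sees the video.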

Making better content (and what you might be doing wrong!)

Pick a format. Because verticalization is key to your authority score, you need to pick a format and work within it. If you want to have different personalities, use different accounts. This will boost your authority score as well as help with gaining followers, because their expectations will be set for the type of content you make. Examples of verticals that do well are comedy, memes, dance, vlogs, creation/DIY, and hacks.

Copy the format. TikTok encourages many forms of co-creation such as reactions, collaboration/remix and mimicking. This has created formats, trends and memes throughout the platform. Rather than seeing it as ripping off other creators, audiences enjoy trends and are inspired to create their own versions. TikTokers like Charli D’Amelio create unofficial choreography for pop songs, and just copying those dance moves can send a song to the top of the charts. The next iconic dances like “Thriller,” “Single Ladies” or “Gangnam Style” will be created by someone who may have no real connection to, or ownership of, the original song.

In general, Gen Z is known for being less “solo” in their pursuits than Millennials. We think this collaborative approach to creation is a sign of the times not just for social entertainment, but the next wave of creation tools and platforms.

Know your music. Songs are one of the best ways to get people to understand your meme content. A lot of viewers will already know what your content is going to be about just based on the song, so picking the right song for the format you want to copy is key — this can’t be an afterthought and might be a place you’re going wrong!

Get ready to sell. If Instagram is QVC for Millennials, TikTok is the line outside the Supreme store for Gen Z. Instead of glossy, in-your-face advertisements for fitness and beauty, the shopping is going to be more “authentic” and narrative. Shoppable video is already a major thing in Asia and it’s reportedly being tested on TikTok to come to the rest of the world soon.

We suspect Gen Z will simply treat Amazon like the Google Search of Stuff & Things, while the new social platforms become the virtual mall. We also suspect TikTok will weigh shoppable content more highly in the FYP algo, because money.

What’s it going to look like? We don’t know, but maybe something like this:

[Image: a mockup of what shoppable video on TikTok might look like]

So what do we mean when we say “the post is reviewed by an AI”? TikTok operates the most extensive content moderation system that has ever existed.

The 411 on content moderation

To make that claim, we need to understand the status quo in moderation today. Content moderation was one of the first product problems of user-generated content that computer-vision scientists were tasked with solving (i.e. filtering out porn). With very little content moderation on one end of the spectrum (like 4chan) and heavy moderation on the other end (like TikTok), “whatʼs the right balance?” is a complex product question that touches all the major platforms today — especially when you as a consumer might not even be aware that itʼs being filtered out because of the “content bubbles” we live in.

From a product perspective, designing for content usually only involves three variations on a theme:

  1. Search: goal-oriented, I know what Iʼm looking for (Google)
  2. Browse: aimless, not sure what I want, anything good? (Netflix)
  3. Contextual: finding something else along the way (Wikipedia)

These systems are present in some form or another in almost every piece of software we use. When browse-based systems are the priority in a product (algorithmic feeds or “discover” screens like the FYP), the possibility of users living in a content bubble at scale is inherent in the design. We know that platforms will influence your consumption of content if itʼs viable and profitable for them to do so… and then, of course, they can influence public opinion.
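The “content bubble” effect of a browse-first design is easy to see in a toy simulation. The Python below is our own illustration, not any platform’s actual ranker: a greedy recommender keeps serving whatever topic a user has engaged with most, and with little exploration the feed collapses onto a single vertical.

```python
import random
from collections import Counter

TOPICS = ["comedy", "dance", "diy", "news", "pets"]

def recommend(history: Counter, explore: float = 0.1) -> str:
    """Mostly exploit the user's top topic; occasionally explore something else."""
    if not history or random.random() < explore:
        return random.choice(TOPICS)
    return history.most_common(1)[0][0]

def simulate(steps: int = 500, explore: float = 0.1) -> Counter:
    history: Counter = Counter()
    # The user has only a slight initial preference for comedy.
    prefs = {t: 0.5 for t in TOPICS}
    prefs["comedy"] = 0.6
    for _ in range(steps):
        topic = recommend(history, explore)
        if random.random() < prefs[topic]:   # engagement feeds back into ranking
            history[topic] += 1
    return history

print(simulate(explore=0.1))   # heavily skewed toward one topic
print(simulate(explore=0.5))   # more exploration -> a flatter distribution
```

Run it with a higher explore rate and the distribution flattens out; that single parameter is, roughly, the product decision every feed team is making.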

Due to these content bubbles (like the FYP), in the last few years the word “algorithm” has worked its way into the vernacular of non-nerds talking about their Facebook feeds, why their Insta post isnʼt doing well, or what Netflix and Spotify are serving them up to enjoy.

Censorship in Entertainment

When it comes to entertainment, content moderation has an interesting history. TikTok, after all, is a social entertainment platform — itʼs not a news service that tells you what is happening in the world like Twitter or a visual social network depicting which fire party your friends attended (Instagram). Entertainment is designed so that we can lean into a mood or change our mood. This is why itʼs so hard to pick a movie with friends: what mood are you all feeling/do you want to feel?

Not too long ago, comics and movies were heavily moderated in the U.S. The Motion Picture Production Code lasted from the 1920s to the 60s and the Comics Code of the 1950s regulated the themes and storylines young Boomers were allowed to read.

But seriously — if TikTok is the entertainment network of the next generation, we should all be questioning these things. Not just because TikTok is under national security review for its influence on America, but because these are the very questions we as product people, scientists and engineers need to consider when we sit down at our jobs to create our futures with code.

First of all, itʼs massive. TikTok moderation is happening on a huge scale with “shadow banning” for 500 million users. As the BBC reports,

Shadow banning is the act of partially blocking content so it doesn't reach the platformʼs entire community of users. It will not be obvious to the user that the creatorʼs content is not being promoted.
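Mechanically, a shadow ban is just a silent cap on distribution: the post still looks live to its creator, while the feed quietly drops or down-weights it for everyone else. Here is a minimal sketch with made-up field names (not TikTok’s actual data model):

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    creator: str
    shadow_banned: bool = False

def serve_feed(posts: list[Post], viewer: str) -> list[str]:
    """Return the post ids shown to `viewer`. Shadow-banned posts are still
    shown to their own creator, so nothing looks wrong from the inside."""
    visible = []
    for p in posts:
        if p.shadow_banned and viewer != p.creator:
            continue          # silently excluded; the creator is never told
        visible.append(p.id)
    return visible

posts = [Post("a", "alice"), Post("b", "alice", shadow_banned=True)]
print(serve_feed(posts, viewer="alice"))   # ['a', 'b'] -- the creator sees both
print(serve_feed(posts, viewer="bob"))     # ['a']      -- everyone else does not
```

The creator sees normal-looking posts and a quiet drop in views, which is exactly why shadow bans are hard to prove.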

So, what’s getting shadow banned?

  1. Political content. While a lighthearted joke or a meme will probably be fine in the US (political content is allowed on the viral video app as long as it’s ‘creative and joyful’), plenty of things get removed, especially if they don’t please Beijing, as the Guardian recently reported. Here’s a BBC World piece on exactly that: Indian TikTok star Ajay Barman was allegedly shadow-banned for making videos on the theme of Hindu-Muslim unity. What to moderate (but do we really mean censor?) in the US is “still in question,” and TikTok has hired very nice lawyers to frame this in a positive light.
  2. Pro-LGBTQ+ content. TikTok censors pro-LGBTQ+ content. On a purely scientific level, identifying content that should be removed from platforms (i.e. child abuse and porn) is still a difficult computer vision problem (which we go into in more detail below), and as a result, children doing adult things or being abused often make it through the same AI fence that blocks legitimate LGBTQ+ expression. However, given the list of things being moderated here, itʼs a reasonable assumption that human moderation is at work shadow-banning this content too.
  3. People with disabilities, marginalized individuals. Recently, TikTok moderators were told to hide all videos made by marginalized people. TikTok moderators say they are doing this to “prevent bullying,” but the world loses brilliant content and is ultimately a less diverse place when bullies get to be tastemakers.
  4. “Anything they don’t like.” TikTok also censors at its discretion. For example, one Turkish moderatorʼs account states that “overtly gay content, politics, religious content or drinking would be chief reasons to take a video down” (alcohol and homosexuality are legal in Turkey), not to mention content of villages and shantytowns they didnʼt like the look of.

A political/moral content experiment

While testing what makes content pop, we did a little growth-hack experiment using content that touches “political” and “moral” topics just to see what would happen. Recently, this TikTok mocking a pro-life bumper sticker appeared on our growth expertʼs FYP.

He got curious about this because we know that TikTok censors political content. So, true to how we just walked you through making a TikTok, he copied the format to make his own.

Rather than use the version of Lil Dickyʼs song “Pillow Talk” from the original video, he opted to create a shortened version for his post. Still a widely familiar song, the shorter cut would get higher completion rates and more loops. The original post referenced a pro-life bumper sticker, but for his experiment, he decided to create a fake, yet believable, Trump tweet with the same message. This not only further politicized the content but also crossed into fake news and misinformation.

The results: 500k views, 70k likes and over 50 recreations in 3 days. It made it onto the FYP. So, while TikTok is clearly moderating a lot of things, itʼs interesting to see what slips through the cracks in this particular experiment involving fake news.


What this kind of moderation means for creativity

While TikTok has weird and unexpected videos, the aperture for what constitutes weird and unexpected still only allows for a narrow set of things that TikTok has trained its AI and human moderators to recognize.

A certain middle-of-the-road, not-rocking-the-boat flavor of entertainment is the norm here. While some of it is incredible content and we respect the creators who make it, itʼs important to understand the leash on creativity and what is okay here, especially if creative entrepreneurs are the new artists. If so, art as we know it is about to transform, as it may cease to ask us tough questions in ways that are visually, sonically and emotionally arresting.

Itʼs important that there are entertainment platforms, like Byte, that put creativity first.

Beyond content: moderating behavior

The social credit system. Part of TikTok’s appeal is that it feels like “the last sunny corner of the Internet,” according to The New Yorker. But how do TikTok moderators and AIs keep it that way? Part of the strategy is adhering to social and cultural norms in the local region. An emerging approach is the social credit system pioneered by municipalities in TikTokʼs home country.

This system awards greater freedom to participate in society (in this case, TikTokʼs platform) to people who donʼt break the rules or make other peopleʼs experiences worse. This clearly influences content moderation and FYP promotion (as discussed earlier in the review process). In China, this system is even starting to cross the boundary from virtual to real. TikTok has partnered with local courts in Nanning, Guangxi to “display photographs of blacklisted people as advertisements between videos, in some cases offering reward payments for information about these peopleʼs whereabouts that are a percentage of the amount of money the person owes.”

While using social media to guilt people into paying debts seems pretty heavy handed, spotlighting social behaviors that have consequences in the digital town square also promotes trust among the people who live there. Maybe the Chinese approach to being a decent digital citizen deserves some credit? Moderating behavior with rules vs. complete freedom of expression has been a major existential issue for people working at social platforms.

While we are considering AI-supported content moderation at TRASH for certain things (for example, p*rn), there are a few computer vision pitfalls weʼre trying to avoid:

  • Person re-identification: being able to find the same person in all the videos they appear in. This is something TikTokʼs parent company ByteDance has invested considerable R&D in over the last few years and is making available to Chinese police services. A few days ago, the New York Times reported on a similar service, Clearview AI, which over 600 law enforcement agencies have started using in the last year.
  • Gender, race and ‘criminality’ prediction: Unfortunately, this tech is plagued by training data bias and is very difficult to debug. As a result, people from certain groups, especially women and minorities, are often confused with each other. We donʼt want to get into the game of accidentally misidentifying the wrong people. Even classifying whether a person in a video looks like they are committing a crime is as likely to be mistaken as correct at this time.
  • Tracking social credit: Content moderation is incredibly difficult. Making a fun, safe space is as hard online as it is in the real world. While we use some AI to determine video content at TRASH, we arenʼt classifying peopleʼs intentions or calculating social scores.
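For what it’s worth, the narrower approach described above can be sketched in a few lines: sample frames, score only the content category you care about (here, an NSFW score), and never compute anything about who the person in the frame is. This is a hypothetical illustration with made-up names and thresholds, not TRASH’s actual pipeline; a real system would plug a trained classifier into `nsfw_score`.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

Frame = bytes   # stand-in for decoded image data

@dataclass
class ModerationResult:
    flagged: bool
    worst_score: float   # highest per-frame NSFW score seen

def moderate_video(frames: Iterable[Frame],
                   nsfw_score: Callable[[Frame], float],
                   threshold: float = 0.8,
                   sample_every: int = 30) -> ModerationResult:
    """Sample roughly one frame per second (at 30 fps) and flag the video if any
    sampled frame scores above `threshold`. Deliberately classifies content only:
    no face matching, no identity, no demographic attributes."""
    worst = 0.0
    for i, frame in enumerate(frames):
        if i % sample_every:
            continue
        worst = max(worst, nsfw_score(frame))
        if worst >= threshold:
            return ModerationResult(True, worst)
    return ModerationResult(False, worst)

# Usage with a dummy scorer (a real system would use a trained classifier):
fake_frames = [b"frame"] * 300
print(moderate_video(fake_frames, nsfw_score=lambda f: 0.1))
```

Keeping the classifier scoped to content categories rather than people is how we try to stay clear of the re-identification and demographic-prediction pitfalls above.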

The Future of the Video Internet

TikTok managed to create a ridiculously fun place on the internet. As video becomes the dominant medium and streaming tips past 70 percent of global mobile traffic, the choices of TikTok and other large UGC media platforms will have an immense impact on our culture. While the current political climate of “fake news” might be bringing more awareness to how we discover and consume content, the topic is still under-appreciated. Tarleton Gillespie (Microsoft Research New England, Cornell University) writes in his book “Custodians of the Internet”:

Content moderation receives too little public scrutiny even as it shapes social norms and creates consequences for public discourse, cultural production, and the fabric of society.

In the next 5 years, weʼll feel the effects of AI in our life indirectly through the videos we watch every day. Itʼs important for all of us to know what future is coming for us and to be thoughtful about how we build for it and consume it, even when it seems like just a meme.
