Would TikTok still be TikTok if it didn’t have its all-knowing algorithm feeding you more and more of the content that you want to watch every time you log in?
That could be a key question in the next stage for the app, as it works to come up with alternatives that would allow it to remain in operation in the U.S., after the U.S. Senate voted to force TikTok to be sold into U.S. ownership, or face a national ban, over national security concerns.
Chinese officials have reportedly already vetoed any potential sale of TikTok’s algorithmic code under China’s revised export-control rules, which stipulate that any sale involving such source code would require government approval.
Which means that a sale of TikTok as we know it is unlikely, and now, according to reports, TikTok’s owner ByteDance is working on another proposal.
As reported by Reuters:
“TikTok is working on a clone of its recommendation algorithm for its 170 million U.S. users that may result in a version that operates independently of its Chinese parent and be more palatable to American lawmakers who want to ban it, according to sources with direct knowledge of the efforts.”
According to the report, TikTok has been working on this alternative feed algorithm for more than a year, which was originally slated to be part of its broader “Project Texas” initiative designed to appease U.S. authorities.
Which may be another path to TikTok remaining in operation in America, though I remain skeptical that it’s even possible to replicate TikTok’s algorithms in any lesser form, given the various parameters and qualifiers that are built into its system.
Yet, at the same time, if any U.S. company were able to buy the whole app, algorithms and all, that could also be problematic, and lead to further queries and concerns from U.S. authorities.
Back in 2020, an investigation found that TikTok had been advising its moderation teams to suppress uploads from users with physical “flaws” including “abnormal body shapes,” “ugly facial looks,” dwarfism, and “obvious beer belly,” among other traits.
As reported by The Intercept:
“One moderation document outlining physical features, bodily and environmental elements deemed too unattractive spells out a litany of flaws that could be grounds for invisibly barring a given clip from the “For You” section of the app. The document reveals that uploads by unattractive, poor, or otherwise undesirable users could “decrease the short-term new user retention rate.”
Wrinkles, eye disorders and various other “low quality” physical concerns were included in the censorship list, as well as videos created in poor shooting environments, like “slummy” style housing and “disreputable decorations.”
TikTok management has repeatedly stated that these qualifiers were never used on TikTok itself, and had only ever been implemented in the Chinese version of the app, called Douyin, with the advisory notes merely ported over as templates as part of TikTok’s global expansion. But a bigger concern, largely overlooked at the time, is that TikTok’s moderation team was only able to reject clips depicting people with these traits because its visual identification process is advanced enough to flag uploads in which such elements are potentially present in the first place.
At 750 million users, there’s no way that Douyin’s moderation team could manually filter every video upload to detect and reject those that fail on these parameters. The only workable process for removing videos that include these traits is TikTok’s visual ID system, which points to the fact that a key element of TikTok’s addictive algorithm is its ability to identify very specific physical traits of the people in clips, along with other elements, in order to show you more of what you like.
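To make that scale point concrete, here’s a minimal sketch, in Python, of what upload-time trait tagging and policy filtering could look like. It’s purely illustrative: the `tag_video` function, the label names, and the confidence threshold are hypothetical stand-ins for demonstration, not anything drawn from TikTok’s actual systems.

```python
# Illustrative sketch only, not TikTok's pipeline: the point is that an automated
# visual classifier is the only way trait-based rules could be applied to every
# upload at this scale. `tag_video` is a hypothetical stand-in for a vision model
# that returns trait labels with confidence scores.

# Example suppression labels, echoing the kinds of criteria in the leaked documents.
POLICY_SUPPRESS = {"abnormal_body_shape", "slummy_environment"}

def tag_video(video_path: str) -> dict[str, float]:
    """Hypothetical visual classifier: returns {trait_label: confidence}."""
    # A real system would sample frames and run a vision model here.
    return {"dancing": 0.97, "slummy_environment": 0.12}

def is_eligible_for_feed(video_path: str, threshold: float = 0.8) -> bool:
    """Keep a clip out of the 'For You' pool if any suppressed trait is detected."""
    tags = tag_video(video_path)
    return not any(tags.get(label, 0.0) >= threshold for label in POLICY_SUPPRESS)

print(is_eligible_for_feed("upload.mp4"))  # True: no suppressed trait above threshold
```

The same automated tagging that makes this kind of filtering feasible is also what makes the traits available as signals downstream.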
Which is a concern, for many reasons.
If TikTok knows, for example, that you watched a video of a young, blonde girl with blue eyes dancing, it’ll show you more of that, down to more specifics than you probably expect.
If it knows you like chunky guys with dark hair, guess what you’ll get shown more of? If it knows that you like seeing naked hips, you’ll get more of that.
The depth of TikTok’s content matching, based on a broad range of visual traits along with the regular text and topic cues, is why it’s so compelling, but it’s also why anyone taking on that algorithm needs to be aware that such specific matching will likely not be viewed favorably by U.S. authorities.
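As a rough illustration of that kind of matching, here’s a toy sketch of trait-based scoring: a user profile built up from the tags on clips they watched through, used to rank candidate clips. The tag names, the `update_profile` and `score` helpers, and the simple counting approach are assumptions for demonstration only, not a description of TikTok’s real recommendation system.

```python
# Illustrative sketch only: a toy version of trait-based matching, not TikTok's
# recommendation system. Each clip is assumed to carry visual/topic tags, and the
# user profile is just a running count of tags from clips watched to completion.

from collections import Counter

def update_profile(profile: Counter, watched_clip_tags: list[str]) -> None:
    """Reinforce every trait present in a clip the user watched through."""
    profile.update(watched_clip_tags)

def score(profile: Counter, candidate_tags: list[str]) -> float:
    """Rank candidates by how strongly their traits overlap the user's history."""
    total = sum(profile.values()) or 1
    return sum(profile[t] for t in candidate_tags) / total

profile = Counter()
update_profile(profile, ["dancing", "blonde_hair", "blue_eyes"])
update_profile(profile, ["dancing", "blonde_hair"])

candidates = {
    "clip_a": ["dancing", "blonde_hair"],  # closely matches watch history
    "clip_b": ["cooking", "dark_hair"],    # little overlap
}
ranked = sorted(candidates, key=lambda c: score(profile, candidates[c]), reverse=True)
print(ranked)  # ['clip_a', 'clip_b']
```

The real system is vastly more sophisticated, but the principle is the same: the more granular the traits it can detect, the more precisely it can match you with what you’ve already shown it you want.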
TikTok has seemingly watered this down over time, while on Douyin, the Chinese Government also now plays a role in deciding what gains traction in the app.
But there is a reason why TikTok’s algorithmic matching is so much more compelling than that of U.S. apps. And I’m not sure that people really want to confront the actual answer.
Which is also why a U.S.-only version of its algorithm won’t work, and would see TikTok lose ground very fast if it’s forced to enact a far more sanitized version of its matching process.
The question may be moot either way, because what I’m hearing from observers in China is that the forced TikTok sale has become a point of national pride, with the Chinese government opposing what it sees as overreach by U.S. authorities.
As such, it seems increasingly likely that they’ll refuse any compromise, which will mean that, as of January next year, TikTok will be switched off for U.S. users.
So while discussions on solutions will continue, it may come down to international diplomacy, and a standoff between global superpowers.
Yes, TikTok, the app that gained popularity on the back of viral dance trends, is now at the center of geopolitical tensions. What a time to be alive.