Researchers seek to unravel secrets of TikTok’s addictive algorithm

Researchers claim to have decoded how TikTok’s algorithm keeps users hooked.

The app, used mainly to share short videos, was launched in 2017 and quickly became one of the world’s most popular social media platforms, largely due to its unique algorithm. By 2019, however, several countries had begun to ban it.

How TikTok personalises the user experience and promotes engagement is not entirely clear, with researchers calling the platform’s algorithm a “black box”.

TikTok, like many private social media companies, has provided few details about how its algorithm works.

“The algorithm is such a black box to the public and regulators. And to some extent, it probably is to TikTok itself,” Franziska Roesner, a computer scientist at the University of Washington, said.

The app is banned in India, Nepal, Afghanistan and Somalia, and it could also be banned in the United States unless its Chinese owner, ByteDance, sells it within the next nine months to a year.

Despite these challenges, TikTok has been so successful that rival social media platforms such as Instagram, YouTube and X have tweaked their designs to incorporate similar features.

Two new studies, which will be presented at conferences next month, claim to have unravelled the secrets of the recommendation algorithm that has driven TikTok’s success.

In the first study, researchers recruited about 350 TikTok users who downloaded their data from the app. The researchers then analysed 9.2 million video recommendations made to these users to better understand how TikTok personalised their feeds.

Assessing the first 1,000 videos that TikTok showed these users, the researchers found that a third to half of them were shown based on the app’s predictions of what the users liked.

The second study found that, over users’ first 120 days on the app, the average time they spent on it each day increased from about 30 minutes to 50 minutes.

The researchers labelled each video on a user’s timeline as either an “exploration video” or an “exploitation video”.

Exploration videos weren’t linked to any video that the user had seen before, sharing no similar hashtags or creators, while exploitation videos were based on the user’s established preferences.

“Exploitation videos are the ones that are more like, ‘We know what you like, we’re going to show you more videos that are related to these,’” Dr Roesner, who was involved with the studies, explained.

“We found that in the first 1,000 videos users saw, TikTok exploited user interests between 30 per cent and 50 per cent of the time,” Dr Roesner added.
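To make that labelling scheme concrete, here is a minimal sketch of how such a rule could be applied to a timeline. It is an illustration only, not the researchers’ actual method; the Video structure, its fields and the label strings are assumptions.

    # Illustrative sketch: a recommended video counts as "exploitation" if it shares
    # a hashtag or a creator with something the user has already seen, and
    # "exploration" otherwise. Field names and labels are assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        creator: str
        hashtags: set[str] = field(default_factory=set)

    def label_timeline(videos: list[Video]) -> list[str]:
        seen_creators: set[str] = set()
        seen_hashtags: set[str] = set()
        labels = []
        for v in videos:
            # Any overlap with previously seen creators or hashtags marks the
            # video as exploitation of known interests.
            if v.creator in seen_creators or v.hashtags & seen_hashtags:
                labels.append("exploitation")
            else:
                labels.append("exploration")
            seen_creators.add(v.creator)
            seen_hashtags |= v.hashtags
        return labels

    # Example: the third clip reuses a hashtag from the first, so it is
    # labelled as exploitation.
    timeline = [
        Video("dancer01", {"dance"}),
        Video("chef_ana", {"cooking"}),
        Video("dancer02", {"dance", "tutorial"}),
    ]
    print(label_timeline(timeline))  # ['exploration', 'exploration', 'exploitation']

Under this kind of rule, the share of exploitation labels in a user’s first 1,000 videos would correspond to the 30 to 50 per cent figure the researchers report.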

Another finding was that people seemed to watch only about 55 per cent of the videos recommended to them, suggesting that quickly scrolling past a video did not affect the algorithm’s recommendations as much as might be expected.

The researchers sought more transparency from TikTok about how it used people’s data to personalise their feeds.

“Even if that information doesn’t change an individual’s behaviour, it’s vital to be able to do studies that show, for example, how a vulnerable population is being disproportionately targeted with a certain type of content,” they said.