Andrew Tate content was pushed to the account of a 13-year-old boy after just over an hour of watching videos on YouTube Shorts — without his profile seeking out any clips of the controversial influencer.

Sky News set up a fake account for a young teenage boy to see whether YouTube Shorts, Instagram and TikTok would promote videos featuring Tate or his brother Tristan unprompted.

Despite Andrew Tate being banned from all three platforms, our investigation found that almost 30 videos featuring the Tates were served to the account of “Ollie Smith” within a two-hour period on YouTube Shorts.

Once the first two Tate videos appeared on Ollie’s screen, a huge volume of other Tate videos followed on the fictional teenager’s feed.

At times, Tate videos were shown one after the other or with only one or two clips in between them.

The number of videos showing Tate content was significantly higher on YouTube Shorts than on Instagram Reels, with none appearing on TikTok.

As a result of our investigation, YouTube removed at least nine accounts that served Ollie’s feed with Tate content.

Known for their controversial online comments, both Tate brothers had been held in a Romanian jail since the end of December, facing allegations of sexual assault, exploitation, organised crime and human trafficking. They were released and placed under house arrest on 1 April. The pair deny any wrongdoing.

The moment the algorithm changed

The first Tate content to appear on Ollie’s YouTube Shorts feed was a video of Andrew Tate fighting in a professional match against another influencer.

It appeared after Ollie had spent one hour and 12 minutes on the platform, with videos watched in one-hour sessions.

Around 15 minutes (or 32 videos) later, a second, similar Tate clip was shown.

After that, the algorithm appeared to shift significantly.

Ollie was served a large number of Tate videos and the gaps in between the clips got smaller.

The yellow bars in this chart indicate every time Ollie was shown a Tate video in a one-hour period, with the grey bars being other videos.

After a break, Ollie continued to be served Tate videos until the end of the experiment, which was limited to three hours of scrolling.

The later Tate videos often featured more mature themes and controversial points of view.

This included Andrew Tate talking about how he physically removed a woman who worked for his sex webcam business from his house, threw her clothes out of the window and withheld her pay. Another featured him talking about how to use money to manipulate a girlfriend.

The YouTube Shorts videos came from 19 different accounts, with seven accounts serving multiple videos to Ollie.

The overwhelming majority of these accounts were entirely or almost entirely dedicated to Andrew and Tristan Tate content, often with usernames paying homage to Andrew. Nearly all of them focussed on making YouTube Shorts videos, rather than longer, traditional YouTube videos.

Between them, the accounts had uploaded more than 1,600 Shorts videos, with 198 of their videos clocking up more than a million views. Three of the accounts had videos with more than 20 million views each.

Sara McCorquodale, chief executive and founder of influencer intelligence company CORQ, told Sky News these accounts could be earning up to six-figure sums on their most-viewed videos if enough views are generated within a 90-day period.

“People are creating these cash cow accounts because there is an evident desire for Andrew Tate content and this is what generates revenue on YouTube,” she explained.

“YouTubers gain success through responding to demand with supply. I suppose there may be an element of opportunism in this too. Channel creators are keen to ride the wave of Andrew Tate as a potentially addictive and cultural phenomenon while it lasts.”

Some of the Tate videos appeared to have been tailored for a younger audience by using a ‘split-screen technique’.

This is when the screen is split in two and shows two different videos at the same time, such as a clip of Andrew Tate speaking above footage from a video game.

A recent report from Ofcom highlighted this type of video as an example of how children are gravitating to online videos “which appear designed to maximise stimulation but require minimal effort and focus”.

Tim Squirrell, the head of communications at the Institute for Strategic Dialogue, told Sky News: “Andrew Tate has been extremely effective at gaming the algorithms of a variety of different platforms, with YouTube Shorts being just the latest in this.”

He said Tate has found a way to encourage “an army of boys” to post his content across social media by luring them with the money they can earn from advertising bonuses on these clips.

He believes TikTok has put in some work to improve its platform after being scrutinised over this issue, but that YouTube Shorts has suffered from attempting to quickly replicate the short video format that has made TikTok popular.

“[It hasn’t been] tested as extensively, their content moderation isn’t up to scratch, their algorithm is a little bit shifty in terms of how it aggressively optimises towards what it thinks you want,” he explained.

Tate accounts removed after Sky News’ investigation

Sky News spoke to YouTube about the findings of the investigation and a spokesperson said: “YouTube has strict policies that prohibit hate speech and harassment, and we remove content that targets or threatens individuals or groups based on protected attributes, such as their gender identity.”

YouTube terminated almost half of the accounts Sky News flagged to the platform, adding: “We terminated channels associated with Andrew Tate for multiple violations of our community guidelines and terms of service, including our hate speech policy. If a channel is terminated, the uploader is unable to use, own or create any other YouTube channels.”

Sky News also found that another five accounts that had served content to Ollie went offline before YouTube was contacted. It is not clear whether they were removed by the platform or deleted by their owners.

Accounts for Ollie Smith were also set up on Instagram and TikTok, which, like YouTube, are social media platforms popular with 13-year-olds in the UK.

Instagram and TikTok’s algorithms

YouTube Shorts and Instagram Reels show short videos and, like TikTok, use an algorithm to decide what videos are promoted on to a user’s feed.

While Ollie was not served any Tate videos on TikTok, he was served two Andrew Tate videos on Instagram Reels towards the end of the investigation.

The clips showed Andrew Tate giving advice on how to manipulate and control women.

Callum Hood, head of research at the Centre for Countering Digital Hate, reviewed Sky News’ investigation and said: “These findings show that social media algorithms are still giving Andrew Tate an enormous boost on some platforms, helping his content reach an audience of young men.”

He added: “The combination of controversial content and weapons-grade algorithms means Andrew Tate’s extremist messaging is being seen by billions worldwide.

“Despite his arrest and partial deplatforming, his popularity amongst young men remains, and his followers are now flooding social media with a counter-narrative which portrays him as a victim of a conspiracy. His views are deeply hateful to women and there is a real risk that they could lead to violence in the real world.”


Searching for content

While no Tate videos were served to Ollie unprompted on TikTok, once the investigation into the teenager’s feed was finished, Sky News tested what would happen if Ollie searched for the influencer.

Searching for “Andrew Tate” on TikTok brought up a large number of Tate videos. This included an account calling for Tate to be freed from prison, which was subsequently removed by TikTok after Sky News contacted the platform for comment.

The search results were accompanied by a clear warning message about hateful content and a link to Andrew Tate’s Wikipedia page.

However, these warnings did not come up when “Cobra Tate” (a moniker of Andrew Tate) was entered, nor for Tristan Tate, nor for misspellings of Andrew Tate’s name.

Searching on Instagram and YouTube also brought up a number of Tate videos, as well as suggestions for news content and quotes from the British influencer.

Both Meta, which owns Instagram, and TikTok say they take misogynistic and hateful content on their platforms seriously.

A Meta spokesperson told Sky News: “We’ve worked with women’s safety experts to develop strict rules against gender-based hate, sexualised or misogynistic language, and threats of sexual violence, and we take action whenever we become aware of it. Andrew Tate was banned from Facebook and Instagram last year, and we’re working to improve our technology to avoid recommending his content.

“We’re also continuing our focus on making sure teens have positive and age-appropriate experiences on Instagram, which includes limiting the amount of sensitive content they see.”

A TikTok spokesperson said: “Misogyny is a hateful ideology that is not tolerated on TikTok. We have dedicated significant resources to finding and removing content of this nature that violates our policies. We have also taken a number of steps to help our community make informed choices, including the launch of a search intervention – so that anyone who searches for certain words or phrases relating to Andrew Tate or misogyny will be reminded of the dangers of hateful language.”

How we conducted the investigation

It was important for the investigation to mimic how teenagers behave on these apps, so that our results reflected reality rather than artificial conditions.

We created a profile of a 13-year-old boy as this is the youngest age allowed on the platforms, although recent Ofcom research shows almost all children aged 3 to 17 (96%) watch videos on video-sharing sites and apps, bypassing the apps’ simple minimum age requirements.

Guided by Ofcom research into what social video content is popular with this age group, Ollie either ‘liked’, watched or skipped videos. The more popular a type of content was, the more we engaged with it. This meant Ollie watched prank or gaming videos in full and gave them a ‘like’, but skipped videos that didn’t fall into any category popular with him.

We also made sure Ollie was active on the platforms at times a real 13-year-old would be – on the weekend, or after ‘homework’ and dinner. He was also limited to three one-hour sessions to reflect that many teenagers have their screen time restricted by their parents.

We also wanted to make sure the algorithm wasn’t being unduly influenced. We used a phone that had been reset to factory settings and was used only for the experiment. Ollie’s username, details and even profile picture did not contain any information that could sway the algorithm towards a particular topic.


The Data and Forensics team is a multi-skilled unit dedicated to providing transparent journalism from Sky News. We gather, analyse and visualise data to tell data-driven stories. We combine traditional reporting skills with advanced analysis of satellite images, social media and other open source information. Through multimedia storytelling we aim to better explain the world while also showing how our journalism is done.
