Facebook has apologised after an algorithm generated a prompt that asked users if they would like to “keep seeing videos about primates” on a video of black men.

The video, which was shared on the platform by the Daily Mail on 27 June, featured clips of black men in altercations with white civilians and police officers.

A spokesperson for the social media giant said on Saturday that the prompt was “clearly an unacceptable error”.

They added that Facebook is investigating the cause and has disabled the entire topic recommendation feature in the meantime.

“While we have made improvements to our AI we know it’s not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations,” the spokesperson added.

It comes after Darci Groves, a former content design manager at Facebook, told The New York Times that a friend had sent her a screenshot of the prompt.

Ms Groves said she then posted the screenshot to a product feedback forum for current and former Facebook employees, prompting the company to look into the “root cause”.

Last year, Twitter launched an investigation after users claimed that its image cropping feature favoured the faces of white people.

Over the summer, Facebook, Twitter and Instagram were all criticised after three England football players, Marcus Rashford, Jadon Sancho and Bukayo Saka, were targeted by racist social media posts for missing penalties in the Euro 2020 final.

Meanwhile, hundreds of Facebook employees staged a virtual walkout last year in protest at the company’s decision to take no action against a post by President Donald Trump about the killing of George Floyd in Minneapolis.