Meta’s oversight board said the company was wrong to bow to police pressure to remove a clip of drill music, forcing Instagram to reinstate the post.
The U.K.’s Metropolitan police requested the removal of a snippet of the song “Secrets Not Safe” by Chinx (OS), arguing the track could lead to “retaliatory violence.” Cops claimed the song referenced a 2017 shooting and contained a “veiled threat,” prompting Meta to act. The company removed the song wherever it appeared on its platforms.
However, the watchdog said the track does not break any of Facebook or Instagram’s rules. It also said that allowing a police operation to censor a musician in secret breaches principles of equality and transparency.
“While law enforcement can sometimes provide context and expertise, not every piece of content that law enforcement would prefer to have taken down should be taken down,” the board said in its ruling, as reported by the Guardian.
“It is therefore critical that Meta evaluates these requests independently, particularly when they relate to artistic expression from individuals in minority or marginalised groups for whom the risk of cultural bias against their content is acute.”
Meta Removed 255 Drill Music Posts At Cops’ Request
The oversight board found that the Met police filed 286 requests to take down or review posts about drill music in a single year. The overwhelming majority of those cases, 255, resulted in removal. The force made no such requests about any other music genre.
“This intensive focus on one music genre among many that include reference to violence raises serious concerns of potential over-policing of certain communities,” the board argued.
Cops did not argue that the drill music broke any U.K. law; rather, they claimed the song violated Facebook and Instagram’s community standards. Despite providing no evidence, they argued the drill music in question was “potentially threatening or likely to contribute to imminent violence or physical harm.”
In a statement, Meta said: “We do not remove content simply because law enforcement requests it – we take action if content is found to violate our policies or local law.”
The board also recommended an overhaul of Meta’s removal policy, calling for transparency around content removal requests from law enforcement and other state actors. It also advised Meta to audit its decisions for systemic bias against minority communities.