UK police unit helps remove drill music videos from the web


“CENSORED” BEGINS like most pieces of drill music, a kind of rap. Dressed in black down jackets and balaclavas, the members of the “Zone 2” group, who hail from Peckham in south London, spit lines at their “opps” (enemies). But listen to the track on YouTube, where it has garnered 2.8 million views, and you’ll notice something unusual. A buzzer obscures certain words, such as the names of deceased gang rivals. It is probably not an artistic choice. Some rappers are now toning down or masking their lyrics to avoid falling on the wrong side of the law.


Their particular concern is a Metropolitan Police Service unit called “Project Alpha”, which patrols the internet for “gang-related content”. About 30 officers spend most of their time scrutinizing drill songs, which occasionally reference real-life violence and gang feuds (though many tracks are non-autobiographical and some entirely fictional). They dissect slang to work out which lyrics might prompt offline attacks. Alpha’s officers then tell social-media platforms, above all YouTube, where the songs attract the most attention, that a video violates the site’s own rules prohibiting harmful content. YouTube generally agrees to remove the clips.

More than 350 pieces of online content, mostly YouTube videos, have been removed this year at Alpha’s request; just 130 were removed in 2019. The true numbers are probably higher still. Industry insiders say officers sometimes notify YouTube channels that they plan to flag videos on the platform, in which case channel owners can take the clips down themselves. Some, like Zone 2, omit words or upload censored tracks.

Civil-liberties groups view Alpha as a disturbing creep towards censorship. Rappers feel unfairly singled out; heavy-metal bands have been singing about death and destruction for decades. Many musicians also doubt that police officers can properly parse hyperlocal drill slang. “People at the Met don’t come from these areas [where drill is recorded]. There’s not a lot of training you can do,” says Toby Egekwu (“TK”), co-founder of a record label that signs drill artists.

The Met denies that Alpha limits free speech. James Seager, who leads the unit, says it doesn’t track all drill songs, only those from people involved in real violent conflicts. To keep up with the slang, Alpha has hired people familiar with the sorts of neighbourhoods where drill music is often made.

The big question is whether Alpha does any good. If rappers taunt rivals by belittling victims or referencing past murders, that crosses a line, Seager says. “The intention is to incite a response, which is often violent… If we remove the content, it prevents an escalation.”

But there is little evidence that drill songs directly lead to offline aggression. “In most of these feuds, people don’t need a YouTube video to get hurt. They were going to do it anyway,” says Stanford University’s Forrest Stuart, who has studied drill music in Chicago, where the genre was born. Social media can even reduce violence in some cases, says Stuart, because gang members can now build a tough reputation without stabbing people in the streets.

In some cases, Alpha’s work can be counterproductive. Deleting the videos, and the potential earnings, of up-and-coming drill artists can “deter them from making music”, so some focus on “street tricks” (crime) to make money instead, says TK. Banning videos may just make them more alluring. And when songs are removed from YouTube, some rappers have simply re-uploaded them elsewhere. “Censored” is available, in its entirety, on Spotify.

Nevertheless, the number of videos deleted from the internet is likely to grow. YouTube says it currently relies on the Met to interpret slang and identify real threats in drill music videos. But the Online Safety Bill, which is currently going through Parliament, is designed to hold tech giants directly accountable for harmful content. Bosses risk hefty fines and even jail time if they fail to police their platforms properly. Social-media moderators may also have to spend time brushing up on their “bars” (lyrics).
