
Neo-Nazis And Incels Are Using Pop Songs To Mask Extremist Propaganda On TikTok

Far-right groups are using Gotye, Kate Bush, and MGMT hits to stop their videos being taken down.


Far-right groups are masking hate speech behind catchy TikTok songs to avoid their videos being taken down.

The newly released ‘Hatescape’ report by an independent British counter-extremism thinktank found that racist, anti-Semitic, misogynistic, and queerphobic videos were getting millions of views on the app using sneaky workarounds.

After looking at more than 1,000 examples over a three-month period, researchers found that nearly half of the videos used pro-Nazi language while also perpetuating harmful stereotypes about minority and marginalised groups.

A decent chunk of the TikTok clips were also found to be made and peddled by Australians. When the study concluded, over 80 percent of the videos examined were still online, although the ABC reported that TikTok has since taken them down.

The popular app still can’t pick up on evasions like altered spellings of banned keywords and hashtags, manipulation of tools like stitches and duets, and trending sounds playing in the background of problematic videos, letting creators avoid detection for longer and reach wider audiences.

The report found that MGMT’s ‘Little Dark Age’ was “by far the most popular sound amongst extremist creators” despite having “no extremist connotations”. The track originally went viral with a trend comparing people’s features to historical artworks, but was co-opted to glorify white supremacists, fascists, and dictators. Other tracks, including Gotye’s ‘Somebody That I Used To Know’ and Kate Bush’s ‘Running Up That Hill’, were also used in videos spreading Neo-Nazi symbols and memes.

Brainwashing Audiences

Senior intelligence analyst Mollie Saltskog told Crikey that far-right groups intentionally use social media apps like TikTok to rope in and brainwash young audiences. “If you have created something you also need to figure out how to make sure that your platform cannot be used to co-ordinate or organise and conduct violent attacks against innocent civilians,” she said.

TikTok’s recommendation algorithm also came under scrutiny in Australia earlier this month for spreading COVID lockdown conspiracies. An investigation by Four Corners and triple j’s Hack found it was a slippery slope from the odd misinformation video popping up on a user’s feed to deep radicalisation and indoctrination.

TikTok user Mitch from Cairns told the program that it wasn’t long after signing up that he started getting anti-vaccination content spruiked by One Nation on his For You page.

“TikTok categorically prohibits violent extremism and hateful behaviour, and our dedicated team will remove any such content as it violates our policies and undermines the creative and joyful experience people expect on our platform,” the video platform said in a media statement.