As calls grow louder for government and big tech to act with greater urgency to counter violent extremism, some experts say thwarting these types of attacks is a difficult and complex balancing act.
The shooting massacre earlier this month at a Buffalo grocery store brought social media companies back into the spotlight, some of which are under investigation by the attorney general, who called on the industry to "take responsibility" and to "be more vigilant" in monitoring content.
The suspect targeted Black shoppers at the Tops Friendly Market on Jefferson Avenue and live-streamed his rampage online. He also left behind at least two screeds, totaling more than 700 pages, filled with anti-Semitism and white supremacist theories. Ten people were killed and three others were injured in the attack.
Twitch, a live-streaming service popular with gamers, said it removed the live video within two minutes, but that was still enough time for others to copy it and share it on other websites.
"Once those are there, it's impossible to put the genie back in the bottle," said Richard McNeil-Willson, a researcher at Leiden University in the Netherlands who studies extremism and counter-terrorism.
Discord, a chat app with invite-only channels, said the suspect created a private server as a personal diary, and about 30 minutes before the attack invited a “small group” to join the server. The Washington Post reported that 15 people had accepted the invitation. Meanwhile, The Buffalo News first reported and News 4 independently confirmed that a former federal agent believed to be from Texas was being investigated to determine if the individual had been given advance notice of the attack plans.
“Wild West” of hate
The Buffalo shooter revealed in his writings that the social media websites 4chan and Reddit were his main sources of information about the anti-Semitic, anti-immigrant and racist theories he espoused.
As a result, State Attorney General Letitia James is investigating social media platforms that she says are "obviously responsible for feeding this young man all his hate."
“The fact that an individual could publish detailed plans to commit such an act of hate without consequence and then broadcast it for the world to see is chilling and unfathomable,” James said.
Some experts believe thwarting these types of attacks is a difficult and complex balancing act that is further complicated by the absence of a unified approach.
"But also, tech companies need to stand up and take responsibility for the fact that we know there is toxic and life-threatening activity on their platforms," said Jonathan Lacey, a former FBI agent who operates a private security consultancy.
Experts interviewed by News 4 Investigates all agreed that social media platforms must be held accountable and that the government must consider different approaches to addressing the problems caused by the rise of lone wolf actors and far-right extremism.
“These social media sites are like the Wild West of hate and extremist speech and planning,” Lacey said. “And law enforcement needs the tools to be able to deal with this threat adequately.”
Who to hold responsible?
Arun Vishwanath, a cybersecurity expert, said a national social media accountability policy was overdue, but he would also like to see action taken against those who amplify hate speech.
"So it's not just the platforms," Vishwanath said. "There are also individuals. Let's hold them both accountable, and it's time to do it, because there's going to be another shooting. Like it or not, unfortunately, that's the saddest thing to say, but that's the reality we live in."
McNeil-Willson has co-authored two articles on the Buffalo mass shooting: an in-depth examination of the lessons to be learned from this horrific act when crafting new responses to violent white supremacy, and an analysis of the writings the shooter left behind.
"Current government and private sector responses struggle to adequately address these documents due to a combination of factors: an over-reliance on takedowns, difficulties in coordination between governments and social media platforms, but also because current approaches treat far-right violence as exceptional and disconnected from mainstream discourse," McNeil-Willson's paper determined. "Challenging this conceptualization can pave the way for a more effective and less security-focused response to far-right violence."
McNeil-Willson said the current focus is on preventative measures to try to stop attacks before they happen, which he says is not always possible.
“I think there should be more concern and more funding and support around broader issues of anti-racism, fighting white supremacy in local communities,” he said.
In European countries, he said, there is a broader discussion about community responses to violent extremism and violent extremist ideologies. Programs that use community work to divert young people and violent groups away from racist narratives and toward more positive forms of social engagement could be a better use of already limited resources. Social support networks that help communities become more resilient to extremism should be considered, he said, rather than a prevention narrative that focuses mainly on stopping individuals from becoming involved in violence.
He suggested creating community centers in areas where far-right extremism is more prevalent to help young people develop responses and resilience through other skills, such as positive engagement through the democratic process.
"It is better to ask why the majority of the community does not engage in violence, or why the majority of people do not engage in violence, and what that can teach us about how we can better build resilient communities in response to violent extremism," McNeil-Willson said.
Further, he suggests that communities and policymakers strive to have meaningful discussions about what constitutes hate speech and how to respond to it with a more holistic approach.
“It doesn’t mean banning it, but it does mean having a serious discussion about the implications of this language and how it is used,” he said. “A lot of far-right language is often dressed up in order to make it palatable to the general public.”
Lacey said big tech needs to understand that it can strike a balance between privacy and public safety.
“This case is a terrible reminder that these sites are used to plan horrific attacks against people in our communities,” Lacey said. “Something must be done.”
Dan Telvock is an award-winning producer and investigative journalist who has been part of the News 4 team since 2018.
Luke Moretti is an award-winning investigative journalist who has been part of the News 4 team since 2002.