Tech company responsibility
While some reactions on social media were less than sensitive in the aftermath of the attacks, influencers who rely on social media for their income are in a precarious position. On the one hand, these platforms provide their livelihood; on the other, they are run by the very companies that have failed to ban extremist behaviour and, in Facebook’s case, even hosted a live stream of the Christchurch attack.
Prime Minister Jacinda Ardern said in her House Statement on 19 March that social media platforms need to take responsibility for the content they give people access to.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”
She said the role of international companies such as Facebook and Google does not take away from the responsibility New Zealand must show as a nation to confront racism, violence and extremism.
Christchurch’s tragedy is far from the first time tech companies have been exposed for both providing platforms for extremist views and further harming victims of attacks.
On 14 December 2012, a gunman entered Sandy Hook Elementary School in Newtown, Connecticut and fatally shot 20 children aged six and seven, as well as six adult staff members. The families of the victims suffered unthinkable grief and a global outpouring of shock and sympathy ensued. However, almost immediately after the massacre, attacks on the victims’ families began online.
Conspiracy and anti-government groups started using Facebook to spread claims the massacre was a hoax. Not only were the child victims labelled “crisis actors”, but parents and families were asked directly how much they were paid to “pretend to grieve”, and some received death threats. Lenny Pozner, whose six-year-old son Noah was killed in the massacre, devoted countless hours to attempting to clear Noah’s name – individually reporting posts about the conspiracy theories to Facebook, Twitter, YouTube and Google.
Pozner faced years of threats and was forced to move multiple times to different gated communities after his address was posted on online conspiracy groups. By July 2018, Pozner was moved to pen an open letter published in The Guardian pleading with Facebook founder and CEO Mark Zuckerberg to crack down on hate speech and protect victims of mass shootings and other tragedies within Facebook policy.
Pozner’s letter was responding to Zuckerberg’s interview with American journalist Kara Swisher, in which he said he didn’t believe Facebook should remove conspiracy theories – including Sandy Hook hoax claims and Holocaust denial – from the platform, but rather push them down the site’s rankings where fewer people would see them. In an email to Swisher after the interview was released on her podcast Recode, Zuckerberg defended his position, saying:
“Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed.”
The Social Club’s McGillivray says it is completely unacceptable that Facebook failed to remove the live stream of the attack until after it had finished. In 2017, Zuckerberg promised Facebook would improve its filtering and handling of violent videos after a murder was filmed and posted on Facebook in Cleveland, United States. The graphic video of the killing of Robert Godwin remained online for two hours before it was removed by Facebook.
“It is disappointing that this promise wasn’t met,” McGillivray says. “Given the resources they have available, there is no reason for it not to be the top priority.”
Not only does Facebook provide a platform for extremist ideas to be discussed, but it also allows heinous acts of violence and terror to be broadcast live with seemingly no moderation. The Christchurch attack was streamed live for 17 minutes on Facebook and it wasn’t until a further 12 minutes had passed that the website finally pulled the coverage. By that time it had been viewed by 4000 people.
Seven Sharp hosts Hilary Barry and Jeremy Wells questioned the legitimate need for platforms that live stream content without the broadcast delay or monitoring we have come to expect from live television.
“I just can’t comprehend the fact that the gunman was able to live stream his attack for 17 minutes on Facebook Live… Why don’t we just disable Facebook Live? Is it that vital, do we need it? They don’t seem to be able to control it,” Barry said on the 21 March show.
While Wells pointed out that using social media is a choice people make, Barry said that did not remove the need to monitor content on the website.
“You do expect to be safe there [on Facebook]. I don’t expect for videos to pop up, equally I don’t expect people to be able to live stream violent attacks like that,” she said.