World News

Social media firms under scrutiny over NZ shooting footage

On Friday, a brutal massacre at two mosques in Christchurch, New Zealand, left dozens of people dead.

A gunman who killed 49 people at two New Zealand mosques live-streamed the attacks on Facebook for 17 minutes using an app designed for extreme sports enthusiasts, with copies still being shared on social media hours later.

The live-streamed footage purportedly showed a gunman walking into a mosque and opening fire on worshippers.

Facebook said it will also remove all edited versions of the video that do not show graphic content, according to a Twitter post citing the company's New Zealand spokesperson, Mia Garlick.

The gunman's seemingly incongruous reference to PewDiePie, the Swedish vlogger known for his video game commentaries as well as his racist references, was instantly recognizable to many of the YouTuber's 86 million followers.

Prime Minister Jacinda Ardern said Monday that the tech companies have "a lot of work" to do to curb the proliferation of content that incites hate and violence. "Ultimately, though, it has been up to those platforms to facilitate their removal and support their removal", Ardern said. "I think this will add to all the calls around the world for more effective regulation of social media platforms", she added.

The modern age has brought with it the power of social media. "This is an issue that goes well beyond New Zealand", she said.

According to the New Zealand Herald, some major firms are considering pulling their ads from Facebook, and the anger is evident in an op-ed by one of the paper's business writers.

Facebook CEO Mark Zuckerberg is yet to make a public statement about the Christchurch attack.

"Since the attack happened, teams from across Facebook have been working around the clock to respond to reports and block content, proactively identify content which violates our standards", she said.

Facebook has hired about 20,000 moderators but critics say they are not doing enough. "Social media firms have made the decision not to invest in adopting it".

Companies including Facebook, Twitter and YouTube scrambled to take it down, but once something goes viral on social media, it's hard to stop its spread. "That is the problem".

Restrictions on sharing the footage have also applied to news media. "Take some ownership. Enough is enough".

UK Labour's deputy leader, Tom Watson, hit out at Google for not taking the video down from YouTube immediately. Under New Zealand law, anyone who shares the footage could face up to 14 years in jail if convicted. In 2017, a video of a man shooting and killing another in Cleveland, Ohio, also shocked viewers.

But yesterday morning, the video footage was back online. "It is not in line with our policy relating to terrorist propaganda videos", its editor tweeted.

Jacqueline Helfgott, a professor of criminal justice at Seattle University, said that for some, social media can be a motivator when it comes to committing a crime.

"Newsrooms, platforms, and public officials need to think about how to avoid playing into the hands of people whose deadly actions are created to trigger maximum exposure for their message, and set off new cycles of violence and radicalisation", wrote Silverman.