After New Zealand Attacks, Calls Mount for Social Media Companies to Take Action

March 18, 2019

By Carlo Versano and Brian Henry

It was what Facebook didn't say that was so shocking.

In the aftermath of the New Zealand mosque attack, the social network said its automated technology stopped 80 percent of the videos of the massacre at the point of upload. But with 1.5 million attempted uploads, that still left roughly 300,000 copies that slipped through the initial safety net and could be viewed in the immediate wake of the tragedy, which left 50 people dead.

It was a reminder of the sheer scale of the company that began as a way for friends to share updates and photos and has morphed into perhaps the world's biggest megaphone, one that can be used for good or evil.

"When I saw the statistic that 1.5 million of these video uploads were blocked, I was astounded," SocialFlow CEO Jim Anderson told Cheddar.

"At the risk of just stating the blindingly obvious, what in the world are 1.5 million people trying to do uploading such a horrific video?"

As New Zealand grieves its dead, its political and business leaders are calling on the social media giants Facebook ($FB), Twitter ($TWTR), and Alphabet-owned YouTube ($GOOG) to take action to prevent hate speech and violence from continuing to spread through their channels.

New Zealand's prime minister, Jacinda Ardern, has taken the position that it's time for social media giants to start acting like the media companies they've disrupted. "They are the publisher, not just the postman," she said.

She was joined by Australian Prime Minister Scott Morrison, who called on social media companies to implement better protections against terrorism and hate speech on their platforms and asked Japan's Prime Minister Shinzo Abe to make it a major point of discussion during the next G20 summit.

New Zealand's three major internet service providers also asked Facebook, Twitter, and Google to take part in an "urgent discussion" industry-wide to prevent the spread of harmful content.

And a growing number of New Zealand companies say they are considering a boycott of Facebook advertising in the wake of the attack. In a statement, the country's ad council said: "We challenge Facebook and other platform owners to immediately take steps to effectively moderate hate content before another tragedy can be streamed online."

But the same technological advances that powered the viral spread of the video make it extremely difficult to control, SocialFlow's Anderson said.

Smart as it has become, artificial intelligence is still unable to fully recognize context, which is why so many of the New Zealand videos were able to slip through the machine-learning filters.

"They'll re-cut the videos, they'll put other things in the video to make them look different," Anderson said, adding that it seems to be a type of "game" for people who take pride in their ability to evade the platforms' control.

Bootleggers have long sought to spread and profit from content that isn't theirs. But as the New Zealand attack highlighted, they have never before had platforms at their disposal that make it so easy to publish to a global audience.

"The fact of the matter is now that you can share a video in less than 30 seconds to the other side of the world, whereas before it was very difficult to share copies of those kind of things," Anderson said. "Technology has enabled the spread of this information in ways that seem to be very difficult to control."

The sheer size of sites like Facebook and YouTube means that if even a small percentage of content designed to fuel hate or violence makes it through the filters, and even if it's then only available for minutes before a human takes it down, thousands of people can still see it, download it, and share it again on sites like 8chan, as the New Zealand video was.

It is perhaps the biggest game of Whack-a-Mole in history, happening in real time at an unprecedented scale and with virtually no regulation.

"In general, there are going to be places on the web and in the world where there are few, if any, rules," Anderson said.

But Anderson also noted that the biggest sites, like Facebook, are also "justifiably" held to a higher standard of what they can control.

He said that even partial steps by Facebook, like disabling share buttons on some content, can help slow its viral spread.

"Whether you're talking about medicine or social media, if you can slow down the progression of something that's viral, if it's negative, that's one of the ways you combat it."
