Facebook’s inability to control live streaming ‘not good enough’ says Guardian boss

The boss of The Guardian has lashed Facebook for failing to have tools in place that could have prevented the rapid spread of the live streamed video of the Christchurch terror attack.

Pemsel: Monetising terror videos – even accidentally – is abhorrent

Speaking at the Advertising Week Europe conference in London, David Pemsel, CEO of Guardian Media Group, said it was abhorrent that such material could be monetised by appearing on ad-funded platforms or sites.

Part of last Friday’s attack on two mosques was live streamed by the perpetrator via Facebook. Extracts from the opening stages, before he entered the mosque, were later rebroadcast by some Australian news outlets. The Australian Communications and Media Authority, which regulates Australia’s broadcasters but not Facebook or other digital players, has already launched an investigation into those editorial decisions.

During the session, Pemsel was asked about The Guardian’s relationship with Facebook. Along with The New York Times, Guardian Media Group was part of last March’s investigation which triggered Facebook’s most challenging year, when it emerged that British political consulting firm Cambridge Analytica had been mining data on the platform’s users.

Talking about the organisation’s “complicated” relationship with Facebook, Pemsel said: “As a chief executive of a legacy media company you can sound as if it’s all unfair and they’re taking our money.

“It’s not really about that. The Guardian holds them to account.

“As a business person, whenever I meet anyone from Facebook you just want to say ‘look, with scale and impact on the world comes responsibility’. And I think that responsibility about what they do has come very, very late.”

Asked whether he accepted that Facebook has now reached an understanding about its responsibilities to society, Pemsel hesitated before saying: “I do think they do now have a clearer understanding of the unintended consequences of the success of that platform, and I do think they understand that when they say ‘Hey, we’re just a platform, we connect people’ – well, you can connect good people and you can connect bad people.

“And the idea that a platform can connect people and at the moment you can have no real arbitration or no real sense of responsibility about connecting the bad as well as the good, you end up with the circumstances we are in now.”

“These businesses are new. However, they should have got to their sense of responsibility faster.”

Pemsel said that The Guardian had decided not to run any video or images from the attack video.

“It’s a shame and really disappointing that you need something of such horror to make people now start to talk about the role of these technology companies.”

He suggested that if an organisation such as The Guardian, with relatively limited resources compared to the tech giants, was able to avoid making use of the video, Facebook should have been able to take more steps to prevent its spread in the first place.

He said: “There is a profound sense of responsibility when these things happen and it’s really difficult for an organisation such as ours to look an advertiser in the eye or talk to our readers and say that type of content has no role in our ecosystem. It will not be distributed across any platforms.

“The idea that somehow – accidentally or otherwise – somebody should monetise that is abhorrent.”

“Whereas you then get these companies of such scale and such resources and huge amounts of money seemingly unable to create the technology and the tools to be able to stop that from happening. It’s just deeply disappointing.

“And the idea ‘We’re getting to that; we’re on that, we’re trying harder, we must do better’ I just don’t think it’s good enough.”

A spokesman for Facebook referred Mumbrella to a blog post from the company’s lawyer Chris Sonderby on the issue. In part, it stated:

  • The video was viewed fewer than 200 times during the live broadcast. No users reported the video during the live broadcast. Including the views during the live broadcast, the video was viewed about 4000 times in total before being removed from Facebook.
  • The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.
  • Before we were alerted to the video, a user on 8chan posted a link to a copy of the video on a file-sharing site.
  • We designated both shootings as terror attacks, meaning that any praise, support and representation of the events violates our Community Standards and is not permitted on Facebook.
  • We removed the personal accounts of the named suspect from Facebook and Instagram, and are actively identifying and removing any imposter accounts that surface.
  • We removed the original Facebook Live video and hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram.
  • Some variants such as screen recordings were more difficult to detect, so we expanded to additional detection systems including the use of audio technology.
  • In the first 24 hours, we removed about 1.5 million videos of the attack globally. More than 1.2 million of those videos were blocked at upload, and were therefore prevented from being seen on our services.
