
Facebook whistleblower tells Senate Inquiry polarising ads are cheaper to run

Facebook whistleblower Frances Haugen told the Senate Inquiry into Social Media and Online Safety today that polarising advertisements are cheaper to run and attract more engagement.

Haugen told the Inquiry: “Facebook says an ad that gets more engagements is a higher quality ad. And research has shown that angry and divisive, polarising ads are cheaper to run than compassionate [ads], or ones that are trying to find a middle ground or educate people. We can’t have a democracy when divisive, polarising and extreme speech is five-to-10 times cheaper than speech that is trying to figure out how those facts work.”

Frances Haugen appeared before the Inquiry

Facebook’s Advertising Policies state: “Ads must not contain content that exploits crises or controversial political or social issues for commercial purposes.” The advertising policies also prohibit inflammatory content and implications of personal attributes. According to Facebook’s Business Help Centre, the metrics Facebook uses to determine an ad’s success include engagement, the quality of the creative assets, and the conversion rate.

The Senate Inquiry is part of Prime Minister Scott Morrison’s examination of mental health and social media, and is separate from the bill targeting anonymous users over potentially defamatory comments.

In 2021, Haugen disclosed internal Facebook documents to The Wall Street Journal, alleging the company prioritised profit over user safety. Facebook has categorically denied this.

Haugen reiterated her claims to the Inquiry, stating: “The thing that I saw over and over again, inside of Facebook, was that Facebook faced trade-offs. Little decisions about: are you willing to lose a sliver of, half a percentage point, 0.1% in exchange for decreases in misinformation? Facebook chose over and over again to make those trade-offs on the side of its own balance sheet.”

“Mark Zuckerberg’s white paper in 2018 – it’s called ‘Demoting Borderline Content and Integrity Strategy for the 21st Century’,” added Haugen. “Facebook was aware even then, in 2018, that more extreme content was viewed as more attractive by the algorithms. That it elicited reactions and drew in engagement more than more moderate content. It doesn’t matter if it’s on the left, on the right, if it’s hate speech, nudity, people are attracted to interact with more extreme content because it’s just part of our brains.”

Haugen also expressed cynicism about Facebook’s ability to self-regulate.

“We don’t let children grade their own test,” she said.

The former Facebook employee advocated for what she called the One Two Three Model.

“Part of why I care about this process is about healing the public’s relationship with Facebook, because the way the public right now has a lot of hostility to Facebook, because they don’t trust what Facebook says. So I’ve been an advocate for something I call the One Two Three, or company, community accountability plan. So the first step is company. The company should have to do a risk assessment, a public risk assessment that says, we believe these are the risks and harms of our products… that’s not enough because Facebook is not diverse. It’s geographically isolated, very privileged. We have to pair that with something, from say a regulator where we need to go and [speak to] community groups. When you talk to parents, when you talk to pediatricians, we need to talk to NGOs, civil society groups, talk to children and say, what do you believe the harms are of this product?”

Haugen continued: “So company, community gives us a pretty robust picture of what we perceive to be the harms and that needs to be paired with accountability. And so accountability for me takes two forms. One is, Facebook currently does not have to disclose, in a way that it is accountable for; what is it actually doing to solve problems? It’s a cliché at this point of what Facebook does when a new leak comes out. They apologise, they say this is really hard and we’re working on it, but they say the same thing every single time… we need to pair that with data…

“What data, if it were released, will allow us to see that Facebook is making progress on this issue?”

Facebook publishes a Community Standards Enforcement Report “on a quarterly basis to more effectively track our progress and demonstrate our continued commitment to making Facebook and Instagram safe and inclusive”.

Meta, Facebook’s parent company, appeared before the Inquiry on 22 January, represented by Mia Garlick, its regional director of policy for Australia, New Zealand and the Pacific Islands, and Josh Machin, its head of public policy for Australia.

On the accusation that Facebook prioritises profits over user safety, Garlick stated “that is categorically untrue”.

Garlick said: “Safety is at the core of our business. If people do not have a positive experience, if they do not find our services useful, then they will not continue to use them. That is why we have been steadily increasing our investment in safety.”

In terms of the detection of harmful content, Machin said: “There’s been a lot of progress in the ability of technology to detect harmful content, but it’s not perfect. There’s a long way to go. There are plenty of instances where technology might miss content but also instances where it may over-enforce or remove material which is not violating our community standards because it’s incorrectly understood what it’s applying. As well as having technology that can detect as much harmful content as possible, we want technology that’s also precise and that minimises the number of appeals people need to go to.”

It is the latest controversy to engulf Facebook. This morning, Andrew ‘Twiggy’ Forrest launched criminal proceedings against Facebook over the alleged misuse of his image in cryptocurrency scam advertisements. Forrest is also pursuing a civil claim in the United States.
