News

MSIX: ‘You can’t make free choices if you are addicted’ – How ethics in advertising are used for good… and evil

A respected professor has addressed the ethics that drive the decisions behind advertising, and how the science can be used for good or evil – often unintentionally so.

Speaking at the Mumbrella Marketing Science Ideas Xchange (MSIX) in Sydney on Wednesday morning, executive director of the Ethics Centre, Dr. Simon Longstaff, used the example of a gambling addict whose banking and wagering app data both showed how he burnt through $400,000 in workers’ compensation through a series of bets placed at an average of one every six seconds.

Both the bank and the wagering company could see these transactions, but neither one acted in the best interest of the gambling addict, who was quickly draining his finances, and couldn’t work due to injury.

“Now I’ve worked with a number of very large companies including in areas where there’s potential risk like alcohol and other things, looking at the responsible use of data, the use of large language models and things like that, trying to create systems by which they can actually exercise the kind of ethical restraint that stops it becoming the kind of situation in which a person is gambling to the point where it’s once every six seconds until their one financial lifeline is brought to an end,” Longstaff said.

“I’ve seen it where they wrestle with questions and not necessarily come down on the right side about the extent to which they should take into account the vulnerability of people who are addicted, for example, to drink.

“I mean, eventually the idea was they wanted people to be able to make free choices and you can’t make free choices if you are addicted to anything because your ability to make an informed choice is compromised by your addiction.”

But what do you do when your marketing science allows you to improve the outcomes you want, but only at the potential risk of exploitation?

“Now people like to get around this by saying, ‘Oh well, no one’s going to do that because eventually it will be disclosed, and eventually it will be bad for the brand and it will be bad for business,'” Longstaff explained.

“This is the so-called conundrum of good ethics, good business, which is actually supported by the evidence – which we’re only quite slowly able to study – showing that more ethical companies do better, largely because they build trust, which lowers internal and external costs.

“Just because you can do something doesn’t mean that you should do something,” Longstaff declared. “It’s not enough to make the calculation that it’s ultimately in your self-interest from an economic or other perspective to do the right thing – there must be something intrinsic that you believe in, which you do for its own sake, in the hope that there will be a consistency of application across the full spectrum of relationships that you actually develop over time.

“Because that’s where I think actually the marketplace is itself developing. During the course of the 20th century we saw an end of a period in which the distinction between companies was based on what they did — a new product in terms of better features, better pricing, time to market, all of those things used to confer advantage.”

Now that technology allows competitors to spot new advantages and replicate them, a company’s products and services alone aren’t enough.

“What they cannot replicate is some underlying cultural proposition which is deep-seated and expressed over time or certain attributes that go beyond just the features in the product,” Longstaff continued, saying the ecology within which businesses and brands operate has changed — “from one in which the ecology is based on what you do, to one based on what you stand for, what you need. It’s an ecology of needing rather than an ecology of doing.”

“And in that sense I think ethics has an important role to play, because ethics in its proper sense – when it’s well understood – is ultimately not about a question of whether we should have voluntary assisted dying and things like that. It’s actually the attempt by human beings over thousands of years to understand the fundamental structure of human choice itself, and the way in which core values and principles, almost at an axiomatic level, shape what we choose.”

Longstaff said we mostly aren’t aware of this process, and are influenced by ‘shadow values’ “embodied in systems, policies and structures which pump out messages in the workplace, but also in the wider community, which can be calibrating messages”.

“It’s the ability to bring all of those things into alignment, and to make the ethical choice about what you do or don’t do, which becomes the cornerstone, I think, for future success in this particular area.”

Longstaff said, in the midst of powerful science and technology, “our choices will determine what we are doing.” And reckless indifference can be just as dangerous as intent.

“Our choices, ultimately, are a reflection of our core values and principles – and we need to work out what those are, at the centre of how this science is being deployed.

“The same conversation is taking place globally now around the deployment of AI, and whether the choices we make will lead to a better or a worse world.

“It really is as simple and as complicated as that. As it has always been.”
