Opinion

Facebook has paused new Messenger bots – here’s what you need to know

Facebook has paused approvals of new Messenger chatbots as concerns around privacy mount. Here, Decoded CEO Chris Monk unpacks everything you need to know.

We’re becoming increasingly familiar with chatbots popping up in places where we used to speak to real people. Whether it’s digesting the daily news, ordering a pizza or putting questions to customer service, we are turning more and more to Facebook Messenger and other platforms for automated answers.

Well, we were until Facebook threw a rather large spanner in the works.

What’s happened?

On 26 March, Facebook announced that – in light of the recent Cambridge Analytica revelations – it is pausing the review of new chatbot integrations.

As a result, no new chatbots can be published to the Messenger platform until Facebook has completed a full review of the platform and made some changes. Those changes will include stricter terms and conditions, the naming and shaming of developers who violate those terms, and a bounty program that rewards researchers for bringing vulnerabilities to the social network’s attention.

Further changes are likely to come out of the review, but Facebook has yet to publish the details. It has, however, begun sharing on its blog some of the other changes it is making to the platform to better protect users’ data.

It’s important to note that the pause does not affect existing Messenger integrations; only new integrations will not be approved. That said, there have been reports that this is not strictly the case and that certain new integrations are being allowed through – so if you have one in the pipeline, you can always submit it for review and see what happens.

Facebook has also stated that it will be re-reviewing previously approved integrations and chatbots to investigate how they are using data and what data they are collecting.

Why did this happen?

Facebook’s actions are an eventual consequence of the Cambridge Analytica scandal reported in March.

In that incident, the personal data of approximately 87 million Facebook users was allegedly misused by the data analytics company, which claims on its website that it “use(s) data to change audience behaviour”. Cambridge Analytica is believed to have been key to both Donald Trump’s election to office and the result of the UK’s Brexit referendum.

The data was harvested by an app published to the Facebook platform that collected personal information about each user’s location, content they had “liked” and information about their friends.

The data was initially collected for research purposes, but was later sold to Cambridge Analytica. Facebook claims that the app’s developer, Aleksandr Kogan, breached its terms and conditions. Kogan denies this, stating that he changed the purpose of his app from “Research” to “Commercial” on Facebook’s own platform, and that the blame should therefore lie with Facebook. Cambridge Analytica also denies any wrongdoing.

The resulting furore has seen Facebook fighting a PR war and being pressured to get its privacy house in order. While neither the Messenger platform nor chatbots were implicated in the original incident, it would appear that the social network has some concerns about how developers might be using the data they are able to collect through the platform.

If you are a publisher of chatbots or other Messenger integrations, there is little to be gained by panicking. If you have a new integration ready to go, then submit it for review anyway and see what happens. Crucially, if you have a service out there and running on the platform, do not disconnect it, as you will likely not be able to get it re-approved.

So what?

The impact of the (temporary) ban has yet to be seen and will largely depend on how long it lasts – anything from one to four weeks from the initial announcement. Of more interest will be the impact of the changes Facebook makes to the platform, and what skeletons it might dig up when re-reviewing the currently approved integrations.

The other (perhaps more important) impact is the effect this will have on users’ perceptions of Facebook’s trustworthiness, which is currently at an all-time low.

Personally, I am impressed that, alongside many fine words, Facebook appears to be taking firm and robust action to move towards a more private world. It will not have taken this decision lightly, and the move will have a real financial impact on the company.

Chris Monk is the CEO of technology education specialist Decoded APAC.
