Calling bullshit on the internet 101

Quit blaming Facebook and the fakers feeding you news that you believe without cross-checking; it's time to take responsibility for your own critical thinking in a digital age, says strategist Dave Bathur in this guest post.

Abasche Tunde wasn’t having a good day – and considering he was an astronaut, a day could last a while.


Abasche, a Nigerian working on the Salyut 8T secret Soviet space station, was finishing his rotation and was looking forward to making the long drop back to Earth.

It wasn’t to be. His place on the Earth-bound shuttle was instead taken by cargo, and due to rather unfair contract requirements, Abasche needed to raise three million dollars to nab a ride home. Which is why his cousin may have emailed you already…

This story is, of course, not true – one of the more imaginative email scams doing the rounds.

The point? Personalised fakery is a fact of the internet. It’s the trade-off for all the benefits of an interconnected world. What we do with it is what counts.

If you choose to give money to a stranded Nigerian astronaut, fine. If you read fake news sites and decide to vote for an orange demagogue, then that’s fine too. As long as it’s your informed choice. That’s democracy, right?

The problem arises when we turn off our critical reasoning and let ourselves be manipulated by emotionally compelling guff. The even larger problem is that too many of us lack the tools to spot fake news – or the willingness to assess it sceptically.

Blaming Facebook for manipulation misses the Mark. Just look at the images liked by Mark Zuckerberg: if he were trying to sway the election result, he failed.


Just as when we’re asked to give money to Nigerian astronauts, it’s up to us individually to critically assess the information we receive in our newsfeeds if we plan to base decisions on it.

Luckily, there are some really simple things we can all do when faced with stories that honk a bit…

1. Be by-curious. Who’s the story by? Does their bio indicate an agenda? Could the sender account be fake (e.g. recently opened, or following far more Twitter accounts than follow it back)? Does the URL look legit? Did you actually read the article before you shared it?

Most of these things can be determined by a glance or a single click.

2. Try triangulating. Want to be really sure? See if you can verify a piece of information from other sources. For instance, a Google reverse image search will show you where else an image appears, its likely source, and whether it has been doctored from a stock shot. Try it next time you receive a shockingly shareable photo.

3. Snopes it. You’re probably familiar with Snopes.com – so use it. Many of the most persistent fake or inaccurate news stories, like the Heineken dog fight, keep cropping up – and often still get a run.

4. Talk to the other side. This is the big one – and the most difficult. In my view, fake news is less the issue than the echo chamber that personalised content algorithms create.

In ‘On Rumours’, Cass Sunstein describes how echo chamber effects don’t just reinforce misinformed beliefs, but amplify them. Other recent research shows that echo chambers also increase the rate at which such beliefs spread.

However, personalisation makes Google, Facebook and lotsa others lotsa money. Our filter bubbles are here to stay – so we need to know how to pierce them.

This means actively searching for different views: looking for news sites you may not necessarily agree with, and seeking out friends who you know hold different views on issues, asking them why, and really listening.

This will all start to puncture the hermetic seal of your newsfeed as Facebook and Google recognise the wider range of sites and profiles you have affinity with. It’s also common sense.

These tips offer a few obvious starting points – and if you want more information on managing misinformation then there are some truly magnificent resources.

It’s worth exploring. ‘Post-truth campaigns’ are now conducted at such scale that they sway presidential elections. But misinformation and propaganda are old cons given new life by the fact that technology progresses faster than our education systems and cultural instincts.

The convenience of the internet comes at a price. We should not totally outsource responsibility for critical reasoning to journalists. Nor should we rely on social media businesses to fix the gaming of their algorithms. Their responses – such as censorship or increased human editorial guidance – often come with their own issues.

Our best defence – and frankly democracy’s best hope – is curiosity, and our capacity to question. There’s no security patch for gullibility.

This article appeared originally here, and is republished with the author’s permission.

Dave Bathur is a founding partner, strategy at Simpatico, a Sydney-based consultancy that provides digital training and transformation support to brands and agencies.  

