Airplanes and adtech: The simple lesson that aviation has for digital advertising

Mike McGarry draws an unlikely parallel between adtech's fraud problem and the data-based solutions invented by the aviation industry.

Have you ever been afraid that your plane was going to crash? If you’re anything like me, even a little bit of turbulence might turn your palms sweaty and cause your anxiety to spike.

Perhaps I’m just a wuss? Or perhaps I’ve watched too many episodes of Air Crash Investigations and Seconds from Disaster? Either way, I tend to get this feeling that the plane might just drop out of the sky (sorry if you’re reading this at an airport…).

To be fair, if you look at plane crash statistics, the fear of crashing is quite an irrational one. In 2016, 325 people died in 19 airplane crashes. Although that may seem like a lot, 2,500 left-handed people are killed every year using equipment designed for the right-handed…

Beware, left-handers: right-handed scissors are much more dangerous than an airplane

But if plane crashes are so rare, the question becomes: how do airlines know if a plane is close to failure, and how do they predict and prevent crashes so effectively?

The answer is that although there is very little data on planes falling out of the sky, there is a great deal of data on planes and their engines operating correctly.

In fact, there is a machine provided by Amazon called a “Snowball” that some airlines use on every flight. It works like a mini data-processing facility: it collects data from the airplane, processes it, and stores it for collection upon landing, when it is uploaded to the cloud.

Amazon’s Snowball Edge

Once they have the data available, rather than looking for specific indicators of failure, data scientists use a technique known as “outlier detection”. Essentially, they don’t look for anything in particular; rather, they look for anything that is “abnormal”.
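To make the idea concrete, here is a minimal sketch of outlier detection in Python. The metric (engine temperature) and the z-score threshold are invented for illustration; real airline telemetry pipelines are far more sophisticated. The point is that the detector never needs to know what failure looks like in advance:

```python
import statistics

def find_outliers(readings, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean -- i.e. "abnormal" relative to the
    rest of the data, with no predefined failure signature."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # all readings identical: nothing stands out
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# 99 normal engine-temperature readings plus one spike the
# detector has never "seen" before
temps = [620.0] * 99 + [900.0]
print(find_outliers(temps))  # → [99]
```

Nothing here encodes what a failing engine looks like; the spike is flagged purely because it is far from everything else.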

Similar techniques are also used when it comes to identifying invalid traffic and preventing ad fraud.

However, something that is not often discussed is the difference in how companies identify what a “normal operator” looks like.

One way is to use a group of humans – a panel – who you are 100% sure are human.

Just like with an aircraft, you can then monitor what normal behaviour looks like.

For example, comScore has a panel of over 2,000,000 such people across the world.

Then, when comScore sees a user who acts in a way that is not congruent with its normal user base, that user is flagged as an outlier.
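The panel approach can be sketched the same way: learn the range of “normal” from users known to be human, then flag visitors who fall outside it. The metric (clicks per minute) and thresholds below are invented for illustration and are not comScore’s actual methodology:

```python
import statistics

def build_baseline(panel_clicks_per_min, k=3.0):
    """From a panel of verified humans, derive an acceptable
    range: mean +/- k standard deviations of clicks per minute."""
    mean = statistics.mean(panel_clicks_per_min)
    stdev = statistics.pstdev(panel_clicks_per_min)
    return (mean - k * stdev, mean + k * stdev)

def is_outlier(clicks_per_min, baseline):
    """Flag any visitor whose rate falls outside the human range."""
    low, high = baseline
    return not (low <= clicks_per_min <= high)

panel = [2.0, 3.0, 2.5, 3.5, 2.0, 4.0, 3.0, 2.5]  # verified humans
baseline = build_baseline(panel)
print(is_outlier(3.0, baseline))    # → False (a human-like rate)
print(is_outlier(250.0, baseline))  # → True (a bot-like click rate)
```

The crucial point is where the baseline comes from: here it is derived only from users you are certain are human, not from the overall traffic mix, which may already be polluted by bots.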

You may think that this is erring on the safe side; however, keep in mind that the very nature of sophisticated invalid traffic and ad fraud is that it has yet to be discovered. You don’t know what you don’t know.

See below for schemes that have been discovered in the last 24 months:

The question that has to be raised is whether it is even possible to establish a robust baseline of normality without a human panel.

Going back to our airplane example, imagine if, despite collecting billions of data points on your aircraft, you didn’t know whether the plane had landed safely or crashed into a nearby mountain. Quite a blind spot…

So, next time you hear terms such as “anomalous pattern detection” or “outlier detection”, dig a little deeper and see if you can uncover exactly what methodology is being used to identify “normal” or “baseline”.

It can make all the difference in the world.

Mike McGarry is a senior sales engineer at Beyond Intent. This post first appeared on his LinkedIn.

