How do you solve a problem like measurement? Media buyers speak about the future of Nielsen

For a number of months, Australian publishers have endured discussions with the IAB’s measurement company, Nielsen, about flaws in its metrics. But what do agencies think of the solutions? Mumbrella’s Zoe Samios speaks with some of Australia’s media agency leaders to understand the influence of Nielsen data on trading decisions.

It would be hard to find a publisher in the Australian media without qualms about the industry measurement provider, Nielsen, in recent months.

And while tensions had been stirring long before this year, the past six months have been particularly testing for the company, which first announced its partnership with the Interactive Advertising Bureau (IAB) in 2011.

It’s been a long few months of discussions between Nielsen and publishers, but what do the buyers make of the solutions?

Now, Nielsen and some of Australia’s best-known publishers are trying to reconcile their disagreements, in a desperate attempt to uphold a standard of measurement for the benefit of the wider industry.

Over the past few months, Nielsen has been drawn into discussions with small, medium and large publishers. The discussions have questioned not only the company’s place in the market, but also its capabilities and methodology.

Some of the many issues Nielsen has been forced to address include its inability to measure Google AMP and https traffic despite promising it could, the seemingly inconceivable growth of some publishers’ audiences following the introduction of Facebook video to its metrics, and the disparity in audience numbers between its two measurement solutions: Digital Ratings Monthly (DRM) and Digital Content Ratings (DCR).

Fairfax Media, one of Nielsen’s highest-paying customers, withdrew from DCR back in March.

Fairfax Media pulled from DCR after concerns with its methodology

The discussions over the past few months have escalated to the point where Nielsen has offered discounted prices to a number of publishers, just to retain them.

But why has this all happened? Why haven’t publishers walked? And how much of an influence does the data have on media buyers, anyway?

What has happened?

It’s been three months since a number of publishers first came forward to Mumbrella, expressing concerns around Nielsen’s new digital measurement solution: Digital Content Ratings.

The metric, which has been in market since July last year, was failing to meet the standards of a number of publishers.

In the first instance, the issue was around Google’s Accelerated Mobile Pages (AMP), which were going unrecorded by Nielsen. Weeks later, Fairfax Media decided to withdraw from its contract with Nielsen, citing the lack of measurement of Google AMP and https.

At the same time, publishers were provoked by Nielsen’s decision to count Facebook video views towards total audiences. In some cases, audiences ballooned to more than 14 times their original size.

But the latest dispute is over Nielsen’s more established measurement solution, Digital Ratings Monthly (DRM).

Unlike DCR, which is measured using tags inserted by publishers, DRM is a hybrid method – made up of panel-based data and tagging.

For a number of years, Mumbrella has published Nielsen’s top 10 news website results, monthly.

When those numbers came out, Mumbrella began requesting small to medium publishers’ data. But Nielsen decided to stop publishing these numbers. Just two weeks ago, Nielsen told Mumbrella it only ever published the top 10 sites, with those outside the top 10 only being released on an ‘ad hoc basis’.

The company would not comment on whether publishers had asked for these numbers to be masked, whether it believed DRM’s methodology was accurate, or what media agencies were trading on. It also would not say why the top 10 was still being released when the smaller publishers’ numbers weren’t.

A glimpse into the audience increases the night after the new Facebook crediting launched

Some publishers are claiming the DRM methodology is inaccurate, arguing the small panel does not provide an accurate representation of small or niche publishers.

On top of that, publishers argue the DRM data does not correlate with other third-party measurement solutions.

For some, Nielsen isn’t worth the price they pay at the moment.

Nielsen withdrawing DRM from the public eye suggests that it too believes its data may not be an accurate representation of audience for some publishers. And with flaws in methodology, there’s reason for concern around how the numbers affect the way brands and advertisers perceive them.

One of the biggest concerns from publishers is that media planning and buying decisions are made off the back of these numbers.

To understand the influence of Nielsen’s audience measurement data on agencies, Mumbrella spoke with a number of agencies and their digital bosses.

What audience metrics do buyers look at?

When Nielsen sends out the DRM figures to Mumbrella, it provides three pieces of information: unique audience, time spent on site and number of sessions.

Each piece of data tells a different story about a publisher. Unique audience, for instance, gives buyers a topline view of how a website is performing. Number of sessions or time spent can give a better indicator of quality of engagement.

For a company like Fairfax Media, which is still reported in DRM, number of sessions and time spent are affected, because these are measured through tagging, and Fairfax no longer has Nielsen tags on its website. On the other hand, unique audience is calculated purely on panel-based data.

From a buyer’s perspective, there’s more at play than simply overall audience, time spent on site and number of sessions.

Mike Wilson, CEO of Havas Media, explains the data itself is not of much use without additional information: “We would also want to understand more demographic and location data to help us understand quality.”

A number of different data sets are taken into account when trading decisions are made, says Havas’ Wilson

He says a number of different data sets are taken into account when making trading decisions.

“What is making the audience come to the publisher’s site – quality content, a trusted source of information, etc? This type of information must be taken into account,” he says.

Peter Wilson, head of digital at Nunn Media, says Nielsen’s data will “never provide the full story”.

Peter Wilson says Nielsen can help tell you you are ‘fishing in the right pond’

“All Nielsen can do is help tell you that you are fishing in the right pond,” he says. “The most common use case is looking at the propensity of an individual audience to visit a certain site. For this I would use unique audience over a certain time period.”

Havas’ Wilson agrees, pointing out no data is “100% accurate”.

“If the only consistent choice you have is only partially accurate, then you still have to use it, with obvious caveats. So long as you understand the strong and weak points of any data set and can take measures to normalise it, you’ll be doing well,” he says.

What are trading and planning decisions based on?

Media planning and buying is based on a number of decisions. Data can help inform the buyer of the ideal publication for their client. Given Nielsen is the industry measurement standard, it has a high level of influence.

But in an agency like OMD, the planning and buying process draws on more than 35 data sets, tools and third-party sources, as head of digital Sian Whitnall explains.

Whitnall says OMD’s decisions are based on the ‘most accurate data available’

“We further verify our strategy and data sources by overlaying clients’ first-party data and segmentation to ensure that we are basing our decisions on the most accurate data available,” Whitnall says.

And while buyers understand the new DCR measurement solution and how it works, they are divided on its benefit to the industry.

For some buyers, like Nunn Media’s Wilson, what Nielsen’s DRM and DCR provide isn’t granular enough, and trading decisions simply can’t be made solely on these numbers.

“Nielsen is unlikely to ever include information such as costs and specific placements which have just as great an impact (probably more) on the potential performance of a campaign. It doesn’t matter how many people you can reach, unless it can be done in a cost-effective manner and with the right type of ad placement or integration to meet the goals of a campaign,” he says.

“Some sites may be misrepresented in the data and you should always sense check what you are looking at. Nielsen data will never hold the key to a successful campaign and can’t tell me if the people that visit a certain site will become my client’s next customer and never will.”

Are DRM and DCR data accurate?

Simply put, the buyers see gaps in the methodology, but are careful to point out every methodology will be flawed, to an extent.

For the buyers like Nunn Media’s Wilson, it’s about questioning the numbers, and understanding that Nielsen measurement solutions are part of a bigger picture.

He says while DRM is good due to its panel and “stability”, buyers should never use a single source of information as their truth.

“This goes for any advertising-related metric, all we can do is use them as indicators. Quite often these metrics can also only be compared within their own universes. It is one of the major issues we have in this industry. We believe too readily what is put in front of us,” he says.

Havas’ Wilson says if it’s all a buyer has to measure audiences with, whether it’s flawed is “immaterial”.

“That said, our market is incredibly innovative when it comes to data. There are some companies we work with who are doing outstanding work in terms of improving quality and depth of insight. Panel-based data is one view of an audience, and we’d prefer more quantitative data. As soon as a provider brings it to market in a reliable way, we’ll introduce our clients to it,” he says.

In the new DCR measurement solution, software development kits (SDKs) are a focus. SDKs track and measure in-app activity, which means off-platform audiences can be measured.

Amplifi’s head of digital and operations, Amelia Elston, says DCR is just one piece of a major evolution in digital measurement.

Amelia Elston says Amplifi values the fact Nielsen’s data is IAB certified

“The end goal is a standardised set of metrics across all channels. Harmonisation of metrics means that agencies and brands need to look beyond reach – but understanding precision in targeting, engagement and attention,” she says.

And she argues the DCR metric provides better insights for clients and avoids the complexities of legacy measurement.

“We value the fact the data is IAB certified, and therefore independent. The tool is able to provide near real-time analysis on digital content, which changes the way Amplifi analyses and manages campaigns on a day-to-day basis,” she adds.

However, she says that should there be a “significant discrepancy” between data platforms, Amplifi would not look at just one month of data in isolation.

“We would need to look over time to see if it was consistent before deciding on our source of truth,” she says.

Earlier this year, Nielsen announced it would add Facebook video views towards a total audience number, under its new DCR metric, for those that chose to opt in. It caused a stir among publishers, because Nielsen’s definition of a video view is less than one second.

The opt-in Facebook secondary crediting was introduced earlier this year

Amplifi’s Elston wasn’t too concerned by that decision: “We support IAB and Nielsen in their endeavour to ensure the most accurate and reliable figures can be produced for clients across all digital properties. This includes social media.

“The best case scenario would be to reach a level playing field when analysing screen consumption and behaviour, which we aren’t quite seeing yet.”

But Nunn Media’s Wilson has expressed more concern, noting the relationship between Nielsen and Facebook is part of a wider industry problem.

He explains the common belief that everything is measurable and comparable is “misheld”, and simply wrong.

“Digital is inherently unstandardised and under constant change so we need to accept that we will never have a complete picture or accuracy. If you accept this then you start to understand there is value in using such tools and metrics, but they are also immensely limited,” he says.

He explains how this causes some publishers to ‘game’ the metric or standard.

“Take viewability as an example – adding a small video that autoplays wherever you are on a screen has made many of the leading publishers be able to claim 100% viewability. But of course the problem is that the video inventory is still low quality, as it is auto playing the ad on a screen size equivalent of a thumbnail image and likely with no sound,” he says.

“As media buyers we therefore always need to apply a sense of caution and sense check everything we do.”

He disputes the idea of DCR as a metric, describing the situation as a classic “chicken and egg scenario”.

“Publishers want their data available to buyers and buyers want easy access to the data.

“Neither are particularly enamoured with paying for the service at the rates that are currently applied. Further to this if the quality of the data is in doubt the value of the service diminishes, if the value of the service diminishes then less buyers will use the service and therefore less publishers want to be involved,” he says.

Another agency head of digital, who wished to remain anonymous, adds: “It’s acknowledged by Nielsen that the site-tagging component of their hybrid ratings methodology is having some troubles on secure domains (ie – https sites). I’m sure that’s fundamentally why some publishers decide not to get tagged up for the daily ratings tool.”

What happens if publishers withdraw from Nielsen?

Regardless of flaws or cracks in DRM or DCR methodology, not a single buyer thinks it is a good idea for the publishers to withdraw.

The industry standard is needed, they say, and without it things could be much worse.

OMD’s Whitnall notes while there are “gaps” in the methodology, it is the only independent measurement offering.

“With publishers such as Fairfax pulling out of Nielsen, this only adds to the complexity of creating an industry tool for digital planning and works to make digital more, not less, fragmented,” she says.

And it is also important for Amplifi’s Elston and Havas’ Wilson.

Elston says understanding people’s overall motivations and attitudes is important when identifying how to communicate and connect with Australians.

“So having publishers on board is important but there are many other tools we use to make decisions on digital content,” she says.

Ultimately, if more publishers withdraw from Nielsen, it would become a big concern, says Nunn Media’s Wilson.

“It is not an issue at present and I assume DRM won’t go away, so the panel results are here to stay in some form or another. If they do go and DCR just used tagged data, then I doubt we will be using Nielsen going forward.”

Mumbrella approached Nielsen for comment.

Measurement will be one of the many issues tackled at Publish this year, with the measurement panel taking place on September 20.
