DoubleVerify Advocates for More ‘Meaningful’ Attention Research


AdAge recently published an article titled “Attention Metrics Don’t Work,” citing two studies that show “little or no correlation between those [attention] measures and sales or other target outcomes.” In response, DoubleVerify’s head of attention, Daniel Slotwiner, addressed the piece in a LinkedIn post, pushing back.

According to Slotwiner, the findings cited don’t prove that attention metrics, in general, are ineffective. Instead, Slotwiner suggests that the study should “lead to more questions about the specifics of the research, the hypothesis being tested, and the underlying dataset.” He adds it should also “pique interest around when and how attention metrics might be suitable (or not) for certain brands and marketing objectives.”

Additionally, Slotwiner critiques the “case study paradigm” in attention measurement, calling it a marketing arms race rather than an effort toward meaningful analysis. He argues that “bad results” are only truly “bad” if they don’t lead to more insights and exploration. His broader point is that dismissing an entire metric on the basis of one study misses the purpose of research—to identify connections, evaluate them, and use the findings to advance understanding.

Why This Matters:

Advertisers increasingly rely on attention metrics to evaluate campaign effectiveness beyond traditional KPIs like impressions, viewability, and click-through rates. Attention metrics instead aim to capture how audiences actively engage with ads, using tools ranging from eye-tracking and biometric data to tag-based and AI analysis.

Attention metrics have grown popular due to three main factors: (1) media fragmentation makes capturing consumer focus more valuable than ever, (2) there is growing awareness that conventional metrics often fail to capture an ad’s true influence on behavior, and (3) the phasing out of traditional tracking mechanisms that supported legacy measurement has created a need for alternative performance metrics. Advances in technology also make it more feasible to collect and analyze attention data at scale, underpinning this growth.

With this shift, more studies are emerging to demonstrate a link between attention and performance. For instance, a TVision and Upwave analysis found a 1:1 correlation between TV viewer attention and brand lift, while a DoubleVerify study reported a 5% increase in purchase intent for Mondelez after using attention metrics to evaluate and optimize campaigns.

Experts React:

Here’s more from DV’s Slotwiner:

The efficacy and role of Attention metrics in today’s media landscape continues to be a hot topic of discussion. This is good! This is a healthy and necessary debate for our industry. Advertisers need new ways to compare ad inventory, optimize creative, make buying decisions and keep up with continued channel fragmentation and new ad formats. Attention metrics hold the promise of addressing many, and perhaps all, of these needs if (and only if) they can reliably be linked to driving value for advertisers.

To refine these metrics and responsibly recommend their use for high-stakes decision-making, it’s critical that we conduct high-quality research across all types of campaigns, marketing objectives, channels and formats. Doing this takes time and requires a combination of broad, “large N” studies, and bespoke brand-by-brand research. The industry is currently engaged in this lengthy process, as evidenced by the ARF studies (https://bit.ly/3A1HYZs), our own research (https://bit.ly/40nxhLc) and studies by other attention providers. Unfortunately, the “case study paradigm” we are engaged in is more of a marketing arms race than a serious attempt to understand, refine and share the nuances of an emerging measurement capability. 

When engaged in meaningful research, one should expect to encounter null findings, counterintuitive results and sometimes face more questions than one started with.

See his full post here, plus check out the responses in his thread. 

Our Take:

Attention metrics are still in their early stages, with standards just beginning to emerge. As new ways of measuring engagement develop, attention is becoming a crucial KPI for the industry. More studies—both supportive and critical—are ultimately beneficial, providing valuable insights for refining these metrics. As Slotwiner suggests, increased analysis will only bring more data, helping the industry establish best practices and identify the most effective use cases.
