Viewpoint: Does Brand Safety Really Matter?

Earlier this year, the ad industry spent a good few months tied up in concerns about brand safety, after a Times front page exposed ads for brands like Mercedes-Benz, Waitrose and Marie Curie being shown in front of extremist content.

It all seems to have quietened down of late, at least in terms of headlines about big brands pulling their budgets from the offending platforms, but it’s left a lasting mark on the industry. Every other announcement now seems to be about an ad company launching its brand safety offering, in an attempt to save digital advertising’s reputation.

The scare exposed some long-standing industry issues, especially around transparency and our blind faith in the power of the algorithm, and that can only be a good thing. But looking back on it now, I feel compelled to ask: is ‘brand safety’ really worth all the fuss?

The thought was triggered by a piece of research we wrote about recently, which claimed that 75 per cent of the British public would be dissuaded from buying a brand whose ads appeared next to inappropriate content.

Not wishing to question the wisdom of crowds, I find this claim fairly spurious. For a number of reasons, but most simply: people don’t tend to visit the corners of the internet that they consider ‘inappropriate’. If you’re on a porn site and shocked to see an ad for your local supermarket, you might want to consider that it was you who visited the porn site in the first place.

And porn does appear to be the biggest problem. According to this study, the most common definition of ‘inappropriate’ was sexually explicit content – above and beyond racism, violence and, most troublingly and by the biggest margin, extremism.

If those are your values, fair enough. But if they are, surely you’re unlikely to be visiting sites or using apps with sexual content, and so unlikely to see the ads shown there in the first place?

Let’s take sex and violence out of the equation, because they have a tendency to cloud the issue, and look instead at the other reason people tend to object to online content: politics. The precursor to the big controversy came last December, when advertisers started pulling out of far-right news site Breitbart following the election of Donald Trump.

I have to confess, I was as gleeful as anyone to see Kellogg’s stop giving money to the site. But before that, while I knew about the existence of Breitbart, I’d never actually visited it – so how was I to know that these brands were advertising there? Breitbart’s regular readers, meanwhile, were surely delighted to find that there was finally a cereal out there that supported their views.

This isn’t to say that there’s no moral imperative to withdraw advertising from venues that host offensive content, but let’s be realistic – we’re talking about enormous companies here, which can’t really be said to have a single political alignment. And when you start to consider user-generated content on an enormous platform like YouTube, it becomes even more complicated.

Ultimately, these things happen because you can fashion a story out of them – whether because you can get positive PR from announcing your brand has disassociated itself from hate speech or because, as a journalist, declaring ‘Waitrose funds terrorists’ makes for an exciting article. Without us journalists there to report on these programmatic mismatches, I suspect no one would give a second thought to the ads they see while looking at inappropriate content, because to them it’s probably not inappropriate.

As an industry, we can be guilty of convincing ourselves that people pay a lot more attention to advertising than they actually do. This is bad news in some ways – however great and relevant your banner campaign is, the average user probably barely registers it, let alone the chain of agencies and ad tech that put it in front of them. But when it comes to issues like brand safety, it’s also a positive – if one link in that chain slips and serves an ad for a wholesome family brand next to an explicit video, the user may well never notice. They’re probably, ahem, distracted by the content itself.