afaqs! news bureau

IPG Mediabrands ranks YouTube as the most responsible social medium

In a first, IPG Mediabrands has released a Media Responsibility Audit of the different social platforms.

IPG Mediabrands has just released its first-of-its-kind Media Responsibility Audit as part of a larger effort to enhance brand safety and media responsibility in advertising. YouTube's head start in brand safety efforts earned it the top spot in the audit rankings.

The social media platform audit was based on the Media Responsibility Principles that Mediabrands recently released to the public. The principles are geared toward protecting brands and the communities a brand serves, weighing the impact of harmful content, and evaluating each platform's policies and their enforcement.


The audit comprehensively assessed all the primary social media platforms (Facebook, LinkedIn, Pinterest, Reddit, Snapchat, TikTok, Twitch, Twitter, and YouTube) against the 10 principles, checking each platform's current status and accountability on each principle. The audit comprised 250 questions in total and focused on establishing a benchmark for what a responsible platform looks like.

Led by Mediabrands’ performance agency Reprise, the audit showed that many platforms are taking steps to improve their media responsibility performance. Major findings revealed what average versus great looks like, as well as who is leading and setting the standards for the industry.

Every media partner was benchmarked against the best-in-class result, and Mediabrands was able to create tables that ranked media partners overall and how they performed against the industry average.

A key finding, that YouTube tops the overall rankings and performs best against several principles, is a testament to the changes YouTube made in response to advertiser brand safety concerns three years ago. The audit will be repeated quarterly to enable platforms to demonstrate progress and to help clients hold media partners accountable.

“The Mediabrands’ Media Responsibility Audit comes on the heels of the challenge to the industry to ensure we are all taking part in safeguarding the media channels that are used in advertising, and furthermore, making sure they do not result in or contribute to harm,” said Joshua Lowcock, chief digital officer, UM, and global brand safety officer, Mediabrands.

“What this audit shows is that there is work to be done across all platforms from a media responsibility perspective, and that the different platforms each need to earn their place on a brand’s marketing plan,” said Elijah Harris, global head of social, Reprise.

“Our audit aims to deliver transparency to advertisers and consumers about the specifics of each platform with regard to our 10 principles. The audit is a tool to hold platforms accountable for improving their media responsibility policies and enforcement, and to ensure we can track progress over time. We hope the audit resonates with our industry and we can all work towards creating a greater good together,” adds Harris.

The key findings of the audit are:

  • Policy enforcement matters

    Platforms fall short by not backing up their policies with consistent enforcement of those policies. Most platforms have some level of enforcement reporting, but these are inconsistent and limited in scope. They rarely focus on the platforms holding themselves accountable for their own enforcement of policies. There is a need to better define expectations and metrics to be included within future policy enforcement reporting.

  • Lack of consistency across platforms

    Given the broad regulations surrounding anti-discrimination and data privacy (for example, GDPR/CCPA), there are opportunities to be more consistent in how data collection policies are enacted across the various social platforms.

  • Eradicating hate speech is a common goal

    There is a shared recognition across platforms that eliminating hate speech is important, but there are inconsistent definitions of what qualifies as hate speech, inconsistent identification of protected classes of people, and a lack of prevalence reporting and independent auditing of hate speech reports. GARM’s proposed work to resolve these issues will be critical.

  • Misinformation is a challenge

    Misinformation is a challenge across most platforms. While certain platforms work with many organisations to combat misinformation, others work with none at all. Some platforms cited their unique engagement models as reason to de-prioritise fact-checking, but our desktop research shows that even minor instances can lead to unsafe ad placement for advertisers.

  • Non-registered user experiences vary

    For platforms that allow access to their services without user registration, there is an opportunity to be more consistent with that user experience. Some platforms still allow certain advertising placements to be viewed by a non-registered user, which may not result in responsible media delivery.

  • Urgent need for third-party verification

    Only a few partners have specific controls for protecting advertisers from adjacency to content in objectionable or harmful categories (as defined in GARM’s brand safety framework). The industry needs to promote and use third-party verification partners more widely, so advertisers are not left at the mercy of a platform’s lack of controls.

“With this effort, we are seeking to raise the bar in the industry by holding platforms to a higher standard of responsibility: keeping brands, and the communities they serve, safe,” said Daryl Lee, global CEO, Mediabrands.

“By taking this objective, data-driven approach, I am confident clients will see improved accountability for media responsibility and, as a result, improved business performance of social media advertising,” he concludes.
