‘Artificiality is an affront to the authenticity of influencer marketing’: on AI
Jordan Carroll of influencer marketing shop The Fifth shares a ‘humanist’ approach to the industry’s obsession with AI, including a damning indictment of so-called ‘AI influencers’
The influencer marketing industry has long been known for volatile supply chains, ever-changing algorithms, fluctuating talent prices, and rapidly evolving measurement demands.
As The Fifth’s innovation director, I’m tasked with finding solutions to these challenges. While I embrace emerging technologies and their ability to help us solve problems, there’s one area of technological advancement that raises concerns about its potential impact on the influencer space: artificial intelligence (AI).
Keeping it real
Many thought leaders currently see AI as a tool that can augment our existing processes and deliver significant efficiencies, freeing up time for the areas where we marketers are indisputably most valuable: creative and strategy.
However, there’s an unsettling amount of LinkedIn noise around a different approach. Rather than using AI to augment the process, some want it to displace influencers altogether, disregarding the human element of influencer marketing that creates authenticity and builds community.
While governing regulatory bodies, such as the European Commission, strive for a human-centric approach to AI, naive ambition and the desire to see how far technology can take society are leading some to remove the human aspect entirely. AI is still in its early stages and far from super-intelligence, yet its lifelike output has already deceived us into believing AI-generated media is real (think Pope Francis in a Balenciaga puffer jacket). As the technology grows more sophisticated, the line between synthetic, AI-generated influencers and real human content will only become more blurred.
Synthetic influencers stand in direct opposition to the humanist principles that good actors in the industry have fought for. AI-generated influence undermines the push for more realistic depictions of human bodies on social media and in advertising, and it threatens representation, diversity, and transparency. AI can be used to perpetuate harmful beauty standards, deprive diverse creators of opportunities, and present inauthentic depictions of our lives online. We must reckon with how this could bring our industry into disrepute, and with the increased urgency it creates for comprehensive AI regulation.
Whose fight is this?
The responsibility to regulate AI ultimately lies with Silicon Valley and big tech. However, their focus is on leveraging this groundbreaking technology for financial gain rather than addressing its potential threats. Several organizations, such as the Coalition for Content Provenance and Authenticity (C2PA), are already doing necessary work to establish standards for labeling AI-generated content, attaching cryptographically signed provenance metadata to media in order to address digital rights, body-image concerns, and fake news.
A solution like this is necessary if all stakeholders are to understand when content or an influencer is an AI creation (and when it is not), although the involvement of social platforms and the clarity of the initiative's roadmap remain uncertain.
While some platforms are taking initial steps to annotate AI-generated content, these initiatives are often not scalable. Twitter, for example, has rolled out Community Notes, which rely on crowd-sourced context rather than systematic labeling. Structurally, this could be read as a departure from the EU’s voluntary code of practice, and it suggests a lack of commitment to identifying AI-generated media.
‘AI influencer’: a contradiction in terms?
The consumer perspective here is critical. Users play a significant role in shaping the future of influencer marketing. However, consumers often prioritize entertainment over authenticity on social media, which blurs the line between real and AI-generated content.
“AI influencer” is a paradoxical term: anything artificial is an affront to the authenticity of influencer marketing. And the push for authenticity was never driven by the industry in the first place; it has always come from consumers themselves. If influencers unite against the misuse of AI on social media, they can galvanize consumers and prompt change.
Governments are already discussing AI-related issues, but comprehensive solutions that directly address the challenges AI poses to influencer marketing must be implemented before problems escalate.
My feeling is that good sense will prevail. Governments, tech companies, and industry stakeholders must work together to develop robust regulations and solutions that protect the industry from the potentially detrimental effects of AI in influencer marketing. In the meantime, I turn to T.S. Eliot, who reminds us: “Only those who will risk going too far can possibly find out how far one can go.”