Change my mind
New research calls Facebook's role in political polarisation into question. It also reminds us how entrenched people's opinions can be
2024 already has a packed calendar of political and cultural events to hang your campaigns around. Men's football Euros. Paris Olympics. Plus two elections, the big ones on either side of the pond. In fact, you'd be forgiven for thinking it's 2016 all over again, only with a UK General Election in place of the Brexit referendum.
There's already been a slew of long reads and thought pieces opining on the potential for generative AI to alter how parties fight these elections online. The scale of these election campaigns, particularly on social media and video-sharing platforms, gets more mind-boggling with every cycle. Looking at the Conservatives' Facebook and Instagram election ads in 2019, you could keep scrolling through the ad library for what felt like days and still not reach the end. ChatGPT and Midjourney (plus any new tools that appear in the next six months or so) can exponentially increase both that scale and the speed with which parties can mobilise creative around hot-button election issues.
And there's little doubt that candidates on both sides of the Atlantic will still rely heavily on social media advertising to mobilise their core support and convince swing voters in battleground areas to switch sides. But the online landscape looks different from 2019/20, for two big reasons. One is TikTok's prevalence and cultural influence; the other is the slow decline of the two "traditional" social platforms, Facebook and Twitter. The mind boggles at the potential for political parties to get TikTok horribly wrong. As any brand or agency knows, making TikToks requires a fundamentally different skillset to auto-generating tailored versions of the same image with tweaked copy for Facebook and Instagram. And, of course, none of the parties or candidates will discount those previously successful methods. Facebook may be a cultural wasteland these days, but it is still a heavily used wasteland where people (particularly older demographics) love to argue in the comments about the political news of the day.
There's long been a narrative that this increasingly polarised political debate directly stems from Facebook's business model. That Meta trains its algorithms to give us news and posts that make us more entrenched and polarised in our views. That narrative has been under threat over the past few weeks, with the release of research commissioned by Meta into how its platforms affect American political views.
The headline finding, seized upon as a triumph by Nick Clegg, was that there was very little evidence that Facebook and Instagram led to any noticeable change in people's political views. People were extremely unlikely to change their opinions based on seeing political content, and removing political content from newsfeeds made little difference to the intensity of users' political views. In one of the tests, where researchers reduced the amount of "like-minded content" people saw (i.e. political posts that chimed with their existing opinions), those same people actively went looking for the very stories they were likely to agree with.
Casey Newton, in Platformer, runs through some of the limitations of this research - it's certainly not a study that exonerates Meta of fundamentally altering the political landscape. There's no way that the Jan 6th insurrection in the US would have happened without Facebook, for example. But it does provide evidence that the quick fixes certain commentators and sections of the media cry out for (chronological timelines, a broader range of views on show, removing re-shares) simply don't work.
Interestingly, users who researchers nudged onto a chronological timeline spent less time on Facebook and Instagram. Meta's recommendation algorithms may be blunt instruments, but they're successful in their primary aim of maximising dwell time on platforms.
The research forms a powerful argument for the effectiveness of social media platforms in mobilising your base - giving your most fervent supporters additional ammunition that fuels their beliefs and ensures they'll remain strong advocates for your cause, whether you're a political party or a business. And the ability to activate an army of advocates should never be underestimated, particularly if they're noisy enough to move a topic from social to mainstream media.
But the part of the research that stood out most for me was that people actively sought out stories they were likely to agree with when their newsfeeds didn't provide them. It reminds us how bound up our sense of self is with our beliefs. You've probably all heard of the studies showing that giving someone information that disproves their view on a topic only makes that view more entrenched. The same dynamic is at play on social media channels - we don't want to be challenged; we want to feel part of the herd, our views all chiming together.
It means that when we're developing campaigns that aim to shift perceptions on a topic, it's pointless going too big, too soon. We need to start with an in-depth understanding of where audiences are before even contemplating where we want to get them to. Then we need to map out the change we aim to encourage - and clearly, this has to be scaled to how emotive the topic is. The closer an issue sits to people's identity, the smaller the nudge in behaviour needs to be. On issues where people hold colder, more objective views, there's more scope for substantive change.
But the research suggests that getting this understanding wrong risks having the opposite effect. Instead of changing hearts and minds, you're encasing them in concrete. Instead of engaging with your carefully crafted, well-thought-out creative, your audience will go looking for something that chimes with their existing views. You only need to look back at 2016 for a prime example of a campaign completely misreading the mood of the voters it targeted. Messaging built on the rational arguments for staying in the EU alienated the very people it was trying to sway, while the emotional message, untethered from reason, nudged them towards a leave vote.
In comms, we're lucky that we rarely deal with binary choices - stay or go, this guy or the other guy. But even so, the lessons of 2016 and this research into the effects of social media on behaviour are helpful reminders never to assume you know how your audience will feel.