
The Impact Of Meta’s Algorithms During The 2020 US Election – Expert Reaction

US researchers worked with Meta to analyse how the social media company’s algorithms influenced what people saw and believed during the 2020 US election period.

The series of four papers, published in Science and Nature, found algorithms are “extremely influential” in terms of what people see on Facebook and Instagram and that there is a significant ideological split in political news exposure. Some of the experiments found that tweaking the algorithms changed what people saw and how much they engaged with the platforms, but these tweaks didn’t do much to change political attitudes over the three-month study period.

The Science Media Centre asked local experts to comment on the research.

Dr John Kerr, Senior Research Fellow, Department of Public Health, University of Otago, Wellington, comments:

“There is a lot to unpack in these studies, but one of the key takeaways is that some features of Facebook and Instagram, such as their newsfeed algorithms, have a substantial impact on the kinds of political news we see in our social media feeds. However, it seems that this has very little effect on people’s actual political attitudes and beliefs, essentially zero in fact!

“Researchers have been concerned for some time that social media newsfeed algorithms are making people more politically extreme by only serving up content that supports ‘their side’, politically speaking. This new US research shows that Facebook and Instagram algorithms do bring up more liberal news sources for liberals and more conservative news sources for conservatives, compared to a basic feed that simply shows the most recent posts from friends and groups. However, this doesn’t translate into more extreme or polarised political views. So that’s good news really: these algorithms are not pushing people into more entrenched political camps.

“What makes this new research especially impressive is that Meta allowed a select group of independent researchers to run experiments on their platforms and analyse users’ data. However, this type of access is incredibly rare. In fact, social media companies have been making it harder for scientists to study and analyse their data. Facebook, Twitter, and Reddit have all reduced access to content and data for research. This lack of transparency means we will have even less insight into the prevalence, spread and impact of content like hate speech and disinformation. As noted in the Policy Forum article accompanying the new research, regulatory requirements for major platforms to share data may be necessary to ‘foster opportunities of path-breaking, comprehensive scholarship that does not require a social media platform’s permission’.

“It’s not clear how well this new research will translate to the Aotearoa New Zealand context. New Zealand is a very different kettle of fish in terms of politics. We tend to be less politically polarised as a society. We don’t have a big Democrat-Republican type split like you see in the US, partly due to our MMP system. Our mainstream news media is also less polarised. Unlike the US, where you get major news outlets with a very right-wing or left-wing stance—think of channels such as Fox News or MSNBC—the bigger players here tend more towards the centre of the political spectrum.

“Acknowledging these differences, the new research suggests that as New Zealand heads into the election, we are more likely to see political stories on social media that match our own views, rather than those that challenge them. And that is thanks in part to newsfeed algorithms. These algorithms are geared towards showing us content we like and engage with, to keep us glued to the newsfeed. Indeed, when the researchers in one of these studies switched off users’ algorithm-based newsfeed and gave them just a basic, uncurated newsfeed, people spent less time on the platform.

“If you are worried about getting trapped in filter bubbles or echo chambers—situations where you are only seeing a slice of the news that suits your worldview—there is a relatively easy solution. Read widely. Actively seek out multiple articles on an issue and different viewpoints. Don’t rely on your social media feed to give you the full picture. Hopefully, you’ll gain a wider understanding of what is going on and, as a bonus, you will sound like you know what you are talking about at parties.”

No conflicts of interest.

Dr Andrew Lensen, Senior Lecturer in Artificial Intelligence, School of Engineering and Computer Science, Victoria University of Wellington, comments:

The Study

“It is great to see outside academics finally being enabled to study the effects of Meta’s algorithms on political polarisation. Tech companies are notoriously siloed and protective of their data, so it is a big and exciting step to see this collaborative research take place. The outside academics did not get ‘free rein’, though – they weren’t allowed to see the data themselves (due to Meta’s privacy concerns) and so had to trust that Meta was supplying the data faithfully.”

Meta’s Angle

“In Meta’s own media release, their President of Global Affairs Nick Clegg claims the studies show ‘there is little evidence that social media causes harmful ‘affective’ polarisation’, whereas the study authors more accurately state that changing the recommendation algorithm for only a few months isn’t likely to change political attitudes. This highlights a danger in these studies where the social media company has control over the study length – Clegg’s comments suggest that Facebook and Instagram play no part in polarisation, which isn’t what the studies say. We must be careful that Big Tech does not abuse the reputation of academics (as trusted independent experts) to create their own disinformation about results.

“We still don’t know how Meta’s algorithm works, as it is a closely guarded trade secret. I would trust Meta more if they allowed independent academics to audit their algorithm to look for bias and polarisation.”

Wider Implications for Aotearoa

“The studies showed that using reverse chronological ordering (i.e. most recent posts first) instead of the current algorithm means users spend ‘dramatically less time on Facebook and Instagram’. In New Zealand, we hear about how our rangatahi are spending more time than any other generation on social media, with flow-on effects on their schooling and wider lives. A simple change like this by Meta could reduce the addictiveness of these technologies. Social media companies won’t do this willingly: time on a platform is income, and they are businesses, after all.

“This election campaign in Aotearoa is already very tech heavy. We’ve seen political parties use AI image generation for political attack ads, and I strongly believe they will be looking to ChatGPT to write social media posts, individualise letters to constituents, and more. We need to decide as a society what role we allow AI algorithms (including the ones in this study) to have in our democracy, as our existing campaigning laws were not built for the AI era.”

No conflicts of interest.

© Scoop Media

 
 
 