Reinforcing Ideological Walls with Facebook’s News Feed Algorithm


In a recent academic paper from Facebook, researchers described how their news feed algorithm presents users with content related to their ideological standpoint and removes some “cross-cutting content” from sources they are less likely to agree with. (Cross-cutting content consists of stories that are more likely to have been shared by people strongly committed to a different ideology than your own.) The paper has raised concerns about the role that algorithms play in the kind of content Facebook users are exposed to.

One general issue with the algorithms that control the selection and display of content in our social network news feeds is that we do not actually know what else these algorithms base their selections on, or how widespread their effects are. For example, there was the experiment carried out by Facebook in 2012 and published last year, in which the display of happy and sad stories was manipulated for 150,000 users to see whether they would in turn share happy or sad content. It may have been an isolated test, but the attitude behind carrying out such a study caused me to stop using my own personal Facebook profile.

Some argue that an algorithmic ranking is much the same as an editor choosing what we see in a newspaper: most people know, when they pick up a certain newspaper, that they are going to see stories aligned with the ideology of that newspaper and its readers. However, an editor can also decide on any given day that it is in readers’ interests to see a more diverse range of news stories around an important breaking topic.

When it comes to social networks and how algorithms work on sites like Facebook, there is often an assumption of neutrality: many people think they are being shown the same types of content that others would see from their own sets of friends. That is, everyone would see a balanced set of content items overall, perhaps reordered based on “Likes” but not so much on one’s own profile characteristics (apart perhaps from the ads in the sidebar, which a lot of people realise are tailored). This became apparent after the aforementioned emotion manipulation study, when a lot of people stated that they hadn’t realised the Facebook news feed was filtered at all.

Most social networks (and also non-social services such as search) try to personalise your content and make it more relevant, creating the so-called “Filter Bubble” as a result, so in this respect Facebook is similar to many other platforms. What you click on determines what you will see, although the researchers in this paper treat the news feed selection algorithm and user choices as if they were semi-independent yet comparable factors; in fact, one drives the other.
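To make that feedback loop concrete, here is a minimal sketch in Python. It is not Facebook’s actual ranking code; the source names and the 1.1 boost per click are invented for illustration. The point is only that every click raises the clicked source’s affinity score, and higher-affinity sources are ranked higher on the next visit, so whatever you engage with is increasingly what you see.

```python
from collections import defaultdict

# Hypothetical per-user affinity scores, keyed by news source; all start equal.
affinity = defaultdict(lambda: 1.0)

def rank_feed(stories):
    """Order candidate stories by the user's learned affinity for their source."""
    return sorted(stories, key=lambda s: affinity[s["source"]], reverse=True)

def record_click(story):
    """Clicking a story nudges its source's affinity upward,
    so that source surfaces higher on the next visit."""
    affinity[story["source"]] *= 1.1

# Toy usage: a user who keeps clicking the top item entrenches that source.
stories = [
    {"title": "Budget vote passes", "source": "liberal_outlet"},
    {"title": "Tax plan criticised", "source": "conservative_outlet"},
]

for _ in range(5):
    feed = rank_feed(stories)
    record_click(feed[0])  # the user clicks the top-ranked item each time

print([s["source"] for s in rank_feed(stories)])
```

After a few iterations the source the user happened to click first dominates the top of the feed, which is the reinforcement effect in miniature: the algorithm’s choices shape the user’s choices, and vice versa.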

As for the other findings from this study, some of the results made sense, albeit from a sample group with selection issues. What the researchers called “hard content” – national or world news and politics – was highly polarised in terms of how it was shared: liberals shared stories from liberal news sources, conservatives from conservative ones. Also, a story’s placement in the news feed has a significant effect on its clickthrough rate (no surprise there).

In terms of the numbers, self-identified conservatives are shown 5% less cross-cutting hard news, while self-identified liberals are shown 8% less (who does that anger more?!), and conservatives click on about 30% of the cross-cutting hard news shown in their feed compared to about 20% for liberals. I would have guessed the reverse.

The problem revealed by this study is that some social networks are now effectively increasing political polarisation, accelerated by algorithms such as the one Facebook uses to curate your news feed for you. These news feed algorithms can be changed to suit different conditions, but no one outside knows how much they change over time, if at all. Right now, all we know is that these algorithms are increasing a user’s selective exposure to news before the user even gets to decide what to select themselves.
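As a back-of-the-envelope illustration (with hypothetical suppression rates, not the study’s exact figures), the two stages compound multiplicatively: the algorithm first removes a share of cross-cutting stories, and the user’s own choices then remove more of what survives.

```python
# Toy calculation with hypothetical rates: if the feed suppresses 10% of
# cross-cutting stories and the user then skips 30% of what remains,
# only about 63% of the original cross-cutting content is ever read.

def remaining_share(algorithmic_suppression: float, user_suppression: float) -> float:
    """Fraction of cross-cutting stories surviving both filtering stages."""
    return (1 - algorithmic_suppression) * (1 - user_suppression)

print(round(remaining_share(0.10, 0.30), 2))  # 0.63
```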

Such news selectivity is generally accepted as being counter to democracy. The reinforcement of ideological walls outlined here may be part of a social network’s plan for the future of content consumption, but it is not a plan we have to go along with.
