Data is power: how our new informational ecosystem on social media defies democracy

The European Pavlovian conditioning

by Inés Flor García

Photo by visuals on Unsplash

DEMOCRACY UNDER PRESSURE.

This article is part of our feature investigating challenges to democracy in Europe. From 22nd March to 4th April, we are digging into the most troublesome threats to citizens’ freedoms and participation in democratic society across our continent – from data privacy to the right to protest, from constrained access to a free press to violence against women.

The European Commission’s Joint Research Centre (JRC) has declared that ‘the democratic foundations of our societies are under pressure from the influence that social media has on our political opinions and our behaviours’. According to the report, almost half of European citizens use social media platforms daily. While these platforms enable people’s voices to be heard, this promised liberation did not come without side effects. Social platforms allow people to communicate and associate freely, yet their underlying technology favours polarisation, ‘yellow’ journalism, and unreliability, causing dysfunction in the informational ecosystem. The JRC report identifies four key pressure points challenging our democracy: the attention economy, choice architectures, algorithmic content curation, and misinformation and disinformation.

Encouraging addiction

The attention economy model seeks to capture people’s attention in the face of abundant competition, and it does so using techniques that are extremely harmful to the proper functioning of our society. Big tech companies use deceptive techniques such as the exploitation of personal data, clickbait, identity theft, or micro-targeted advertising [1] to manipulate the actions of users, both within digital societies and in the physical world. Clickbait, for instance, packages highly emotional content in ways designed to capture people’s attention. Our every movement online, every click, like, comment, share, and scroll, is recorded and monitored by private enterprises vying to sell our attention and engagement to advertisers for profit. As such, the user becomes the commodity and their personal data big tech’s property. Yet what, precisely, is our data?

On ‘The Current’ podcast with Matt Galloway, Ronald J. Deibert, a professor of political science and director of the Citizen Lab at the Munk School of Global Affairs and Public Policy at the University of Toronto, addresses the extent to which social media is undermining democracy in light of his book “Reset: Reclaiming the Internet for Civil Society”. He argues that our data encompasses everything about us, from friendships and social relationships of all kinds to preferences and dreams. There has been a fundamental shift in the relationship between companies and consumers, bringing social dysfunction and surveillance capitalism into our democracies. According to the JRC, by analysing 300 likes, Facebook’s algorithm can know more about a user than their own spouse does. This paves the way to ‘micro-targeting’, which, when used politically on individuals, can undermine democratic discourse. Micro-targeting has been defined as a form of political direct marketing based on a combination of online and offline datasets targeting narrow categories of voters ‘for conducting outreach, persuasion, and mobilisation in the service of electing, furthering, or opposing a candidate, a policy or legislation’ [2].

Manipulating users

Choice architectures are the design choices that guide users’ actions and promote engagement on a given platform. For instance, a user may find it entirely simple to sign up for Amazon, yet nearly impossible to cancel their account. Most importantly, the JRC report suggests that users are generally unaware of how their data is being collected, stored, and used when performing basic tasks online. Prof. Deibert, by contrast, argues that most users are aware of big tech’s ongoing manipulation across their platforms. Yet, because of the addictive character of both devices and platforms, which is generally shaped at the design stage, people do not realise how harmful these online experiences are for them and for society at large. Deibert compares users to Pavlov’s dogs, which learned to associate the stimulus of food with the ringing of a bell: every time the dogs heard the bell, they began to salivate, believing it was feeding time. But psychological processes of association and reinforcement are not always positive or neutral. Not finding our mobile devices in our hands, for instance, may trigger panic, a sense of loss, and loneliness, pointing to cognitive transformations and effects that can no longer be overlooked.

In coders we (don’t) trust

Algorithmic content curation presents another challenge to democratic processes. The algorithms that organise content for users on social media platforms are so complex that even their developers may struggle to anticipate their effects. Developers should be aware of the impact the algorithms they create can have, and the JRC report argues that they, too, should be held accountable for the level of informational toxicity on social media. Without accountability or transparency, we lack control over decision-making policies and principles. One important property of these algorithms is that they feed on content with a high level of engagement. This begs the question: can platforms such as Twitter and Facebook ever be democratic and just? Shortly after former president Donald J. Trump’s election defeat, both tech companies moved to censor Trump’s tweets and ban his user account. Yet this exposes an underlying contradiction, since their algorithms keep pushing conspiratorial and sensationalist content that further harms people’s understanding of other people, of political participation, and of world events.

One plus one equals three

The fourth key pressure point, misinformation and disinformation, is documented by a recent Eurobarometer survey across all EU countries, which found that almost half of the population comes across fake news at least once a week. Behavioural science shows that ‘people have a predisposition to orient towards negative news’. Hence, when coupled with algorithmic content curation, the proliferation of highly emotional and sensationalist political information is to be expected. The issue deepens if we consider intelligence agencies that undertake cyberespionage and disinformation campaigns to further confuse the public. Such techniques thrive in the communication ecosystem of social platforms, breeding yet more mistrust. The JRC report states that our era is marked by ‘post-truth’: facts no longer dictate what counts as true; emotions and personal beliefs do. Core aspects of democracy and democratic representation cannot be understood and assimilated through one’s personal views alone, but should be shaped by facts, statistics, grounded opinions, and a panoramic view of world affairs.

David P. Reed, a pioneering architect of the internet and an expert in networking, spectrum, and internet policy, summarises these concerns about democracy in the digital age. He fears that democracy has been jeopardised by mechanisms of widespread corporate surveillance of user behaviour and by behaviour-modification techniques such as political micro-targeting. Representation can become meaningless in countries where highly targeted behaviour modification is used to deceive and manipulate citizens’ choices.

One for all

Ultimately, an underlying issue of social platforms is a power struggle between governments, non-governmental actors, and the people. At present, technology is controlled by a few, empowering the already powerful and disempowering the many. As long as power remains in the hands of a few, the outcome is not good for the many, or for democracy [3]. It becomes clear that the growth of surveillance technologies will destroy the private sphere of social life across platforms and fundamentally condition democratic processes.

Sources

[1] Fernández-Caramés, T. M., & Fraga-Lamas, P. (2020). Towards post-quantum blockchain: A review on blockchain cryptography resistant to quantum computing attacks. IEEE Access, 8, 21091-21116.

[2] Dobber, T. (2020). Data & Democracy: Political microtargeting: A threat to electoral integrity? p. 10.

[3] Anderson, J., & Rainie, L. (2020). Many tech experts say digital disruption will hurt democracy. Pew Research Center, 22.

https://www.cbc.ca/listen/live-radio/1-63-the-current

https://www.wordstream.com/blog/ws/2014/07/15/clickbait

https://ec.europa.eu/jrc/en/news/social-media-influences-our-political-behaviour-and-puts-pressure-our-democracies-new-report-finds
