How Facebook's structure as a data-gathering, behaviour-manipulating communication conduit stokes its right wing bias - Tech 2 Asia


Tuesday, September 1, 2020


We are by now familiar with the broken promise of social media. Digital communication platforms were meant to be harbingers of freedom, uncontrolled and uncontrollable by governments. The Arab Spring strengthened this view — academic articles found that social media increased people’s ability to “create and consume political content independent of social elites” and was a “critical part of the toolkit to achieve freedom”. In 2012, The New York Times hailed the “borderless digital movement…set to continually disrupt powerful institutions, be they corporate enterprises or political regimes”.

Several years later, we now know that social media platforms ended up creating political echo chambers and a new era of mass surveillance. Or as a 2016 Wired article put it while reflecting on the aftermath of the Arab Spring, “As it turns out, bad people are also very good at social media.”

But we have moved beyond even this understanding of social media as a double-edged sword. What is increasingly clear today is that social media has not merely produced neutral polarisation or comparable echo chambers on the left and the right. Owing both to global political trends and to the structure of platform business models, social media platforms like Facebook and Twitter disproportionately foster and strengthen the right wing worldwide, and particularly in India.

Recent revelations about Facebook India’s top public policy executive Ankhi Das’s favouritism towards the BJP have brought this tendency to the fore. A piece in the Wall Street Journal shows that Das shielded BJP politicians from punitive action over hate speech. Earlier pieces in Newsclick show that Facebook actively aided Narendra Modi’s rise to the office of Prime Minister.


We know that Facebook survives on user attention. People are likely to stay on the platform for longer if they encounter views they already agree with. This dependence on user attention is a direct incentive for Facebook to keep people locked into echo chambers instead of exposing them to news and views as they occur. Facebook crunches a great deal of data to keep us within these walls, consuming and creating content endlessly.

But what is it about Facebook that makes it conducive to the rise of right wing demagoguery? Why have these echo chambers not led to the rise of the left? The first and most immediate reason is that Facebook makes money from advertising. In India, it is the right wing today that has the ability to spend money on massive Facebook advertising. Article 14 has shown that the official BJP page is Facebook’s biggest ad spender in India, and that several other pages informally linked to the party spend significant amounts on Facebook advertising. Much like its political donations, the ruling party’s digital ad spends overshadow those of other parties.

The second reason, which has now come to light, is that Facebook India can be directly charged with an active bias towards the right wing through some of its executives. Their support for the BJP predates the party’s ascent to power and goes beyond the friendliness a corporation ordinarily shows a ruling party. There have been more than enough instances of Facebook applying different standards of content regulation to right wing posts than to other posts.

This is certainly not the first time that right wing extremism has been aided by private technology. In the book IBM and the Holocaust, Edwin Black has shown how IBM officially aided the Nazi regime in Germany through its punch cards and card sorting system. These solutions were custom-designed for automating genocide: IBM helped the regime register, starve, enslave, transport and murder Jewish people at scale. The company was aware of the uses to which its technology was being put. It actively ensured that it made the most profit possible from the Holocaust.


There is more to this than the tendency of private enterprises to further oppression when it is profitable. Black has pointed out that the Nazis developed a data lust; that “massively organised information” became a means of social control. He adds, “Unless we understand how the Nazis acquired the names, more lists will be compiled against more people.” During the CAA-NRC debate, we saw the Indian ruling party’s proclivity to create lists against whole groups of people, and private technologists’ willing collusion with such plans. Similarly, it is no surprise that a data accumulating project like Facebook would find itself on the side of right wing extremism even if it does not explicitly intend this.

We must also note that Facebook, with its ownership of Instagram and WhatsApp, controls much of digital communication in India today. People turn directly to Facebook for news and other content instead of looking for it on news websites or other parts of the internet. This means that Facebook captures most of the revenue generated from content, while news organisations garner precious little even as their content drives the use of Facebook. The situation is so poor that media organisations are finding their very survival difficult, and many are shutting shop. It is for this reason that in Australia, proposed competition regulations would make it mandatory for Facebook to share advertising revenues with news organisations.

Any self-interested government would be irrational if it did not try to control such a platform. By shaping what content people are able to access, Facebook plays an important role in shaping public opinion. A reactionary government that can shape public opinion during elections through such platforms can more or less guarantee its own continuation even under a democracy.

What then is to be done about Facebook’s right wing bias? The answer lies in tackling the structure of Facebook itself as a data-gathering, behaviour-manipulating communication conduit. When we acknowledge this structure, a few options for action open up to us. Some of them are mild. Consider, for example, that Facebook has the ability to expand or limit the reach of content. It often uses this ability to limit the reach of content that violates its community standards. But how do we, as citizens, know that Facebook does not use this ability arbitrarily? At the very least, there has to be an independent audit process to demonstrate that Facebook’s reach-limiting algorithms (and other content moderation algorithms) are not deployed unless legitimised through internal policy and process.

What still remains is the need to deal with Facebook’s extraordinary power to decide which content is worthy of public utterance and which is not. To be fair to the company, it has consistently maintained that it cannot arbitrate on the truth. However, its impact, often life-or-death, on millions of people means that Facebook is compelled to try different methods of regulating content. One of these is a third-party Oversight Board it has set up to decide which content stays up and which comes down. It is difficult to see how the Oversight Board, a committee of 20 members handpicked by Facebook, is an improvement over the status quo of content regulation, however strong those members’ human rights records may be. It is also worth noting that Facebook is a US company beholden to US foreign policy interests, a fact that everyone who wants to avoid US global hegemony over communication must keep in mind.

The only answer is to ensure that Facebook or any company like it must no longer exist. This is not a fanciful suggestion — we made policy choices in the past that allowed Facebook to exist in its current form. Facebook’s acquisition of WhatsApp should never have been allowed for reasons that are clear now. Its investment in Jio raises further concerns about its entrenchment in the Indian economy through the digitalisation of small businesses, concerns that are exacerbated by Jio’s impending purchase of the Future Group retail business. Facebook’s various arms should be broken up, but that should be just the beginning. The business model of advertising through attention and data hoarding must be disallowed for the safety of our common future. All people interested in democracy, across different nations, must demand public policy that makes such business models unattractive. This kind of public policy can include making data hoarding illegal, making deep discounting through digital platforms financially unfeasible through taxation, banning digital advertising through the sale of personal data rather than group data, and so on.

Our governments will not take this step on their own, entrenched as their power is through the use of social media. We are far from the days when people dreamed that Facebook and other social media platforms would democratise communication. If we truly want to ensure that communication has a chance at democracy, we must demand the end of Facebook.

Jai Vipra is a technology policy researcher focusing on the economics of digital platforms in the Global South



from Firstpost Tech Latest News https://ift.tt/2YYbnOd
