Facebook Is Messing Up The Democratic Process



Hija Kamran digs deep into why the world was jolted by Facebook's convenient passing-on of the data of millions of its users, why it is relevant to Pakistanis, and what we can do to protect our data.

 

In 2014, Christopher Wylie, a 28-year-old Canadian data scientist, co-founded and helped build the UK-based political data mining and analysis firm Cambridge Analytica, with the financial backing of Robert Mercer, a billionaire who made his fortune in tech and has funded right-wing causes. Cambridge Analytica played an arguably pivotal role in two of 2016's most life-changing political events, Brexit and the US presidential election, by combining data mining and data analytics with strategic communications for electoral processes.

In the case of the former, the UK-based firm assisted Leave.EU, which campaigned for the UK to leave the European Union, by reportedly not only providing the pro-Brexit campaign with voter data but also by working to influence how people voted during the landmark national referendum. Cambridge Analytica utilised its heady mix of what Vox's Sean Illing referred to as "social psychology with data analytics" to similarly analyse and possibly influence the 2016 US election.

On March 16, Wylie revealed how Trump's election campaign was heavily influenced by a vast trove of data that had been collected to build psychological and political profiles of 50 million United States-based Facebook users, based on their online activity, through research outsourced to UK-based academic Aleksandr Kogan.

Kogan developed an app, “mydigitallife”, built around a personality quiz that focused on collecting psychometric data in exchange for $1 to $2 per download. The app was downloaded about 270,000 times. At the end of the survey, users would be asked by Kogan to grant him permission to access their Facebook profiles and those of their friends. This data was then forwarded to Cambridge Analytica, where it was used to develop targeted political advertisements for Donald Trump's presidential campaign and distribute them widely on social media.
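
To make the mechanics less abstract, here is a minimal sketch of how a pre-2015 Facebook app could pull a consenting quiz-taker's profile along with data about their friends once the user approved the login prompt. It is an illustration only: the endpoints and fields approximate the long-retired Graph API v1.0, the access token is a placeholder, and this is not Kogan's actual code.

```python
# Illustrative sketch only: how a pre-2015 Facebook app could harvest a
# quiz-taker's data and that of their friends. Endpoints and fields are an
# approximation of the retired Graph API v1.0; the token is a placeholder.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
USER_ACCESS_TOKEN = "PLACEHOLDER_TOKEN"  # issued when the quiz-taker logs in

def get(path, **params):
    params["access_token"] = USER_ACCESS_TOKEN
    resp = requests.get(f"{GRAPH}/{path}", params=params)
    resp.raise_for_status()
    return resp.json()

# 1. The quiz-taker's own profile and page likes (they did consent to this).
me = get("me", fields="id,name,likes")

# 2. Their friends list -- under v1.0, apps granted friends_* permissions
#    could also read fields about friends who never installed the app.
friends = get("me/friends", fields="id,name,likes").get("data", [])

print(f"Collected data on 1 installer and {len(friends)} of their friends")
```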

It is telling that although Facebook's security protocols were triggered by the exceedingly large data harvesting, Kogan (whom Wylie had contracted on the basis that his services were far cheaper than those of other potential researchers) was able to convince Facebook that his academic research was well-meaning, to which the company said "Fine." This approach by the Silicon Valley giant, treating the mass harvesting of data as harmless data collection because, hey, the users gave him their permission, has since caused Facebook to come under fire, even as it claims that this was not a data breach.

Why does this matter, then?

This extraction of vast amounts of data means more than just the collection of user profiles that Facebook stacks and monetises whenever its interests, financial or otherwise, see fit. The Kogan incident sheds light on just one aspect, and a disturbing one at that, of the technology used by corporations to make profits off the lives of millions of people, with or without their consent or knowledge.

This example of data mining did not, for instance, just collect the information of people who had signed up to use the app; it also collected the data of their friends, which was permitted by Facebook's data collection policies up until 2015 (the permission has since been revoked). Even those users who had given the app (and by extension, Kogan) permission were not properly and fully informed as to where and how their data would be used.

The collected data profiled people based on a 60-question survey that included some perturbing images and asked respondents to rate how important the quoted statements and images should be to Americans; Wylie shared examples of these survey items with The Observer.

The data collected and subsequently exploited by the app and its developer was, by data protection ethical guidelines, stolen from Facebook rather than bought, the latter being a common practice of data mining corporations worldwide, in cooperation with Facebook, which is one of the largest private sector collectors of user data. The data of 50 million people that Kogan et al harvested came from an app that only about 270,000 people actively downloaded; the rest was mined through those users' shared connections on Facebook, reaffirming that many of those 50 million did not know that their data was being used at all. If you were one of the 50 million who primarily used Facebook for social networking, you were secretly made the target of extensive corporate surveillance for the micro-targeting of ads.

It must be reaffirmed again and again that this did not constitute informed consent; it was, in fact, the exploitation of the data and trust of Facebook users, regardless of political affiliation. That the data harvesting triggered Facebook's security alarms, only for the company to essentially approve it, also shows how casually a multi-billion-dollar corporation permitted and effectively endorsed the theft of data by an app developer, with political ramifications that continue to make waves.
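
A rough back-of-the-envelope check, assuming the figures reported above of about 270,000 installers and 50 million affected profiles, shows how friend-harvesting scales:

```python
# Back-of-the-envelope only, using the figures reported in this piece.
installers = 270_000        # people who actually installed the quiz app
affected = 50_000_000       # Facebook profiles reportedly harvested in total
print(round(affected / installers))  # roughly 185 friends swept in per installer
```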

Donald Trump's presidential campaign team was able to exploit the data collected through “mydigitallife” to gain an advantage, thanks to Cambridge Analytica's analytic research, as well as the firm's ability to manipulate the perceptions of potential voters through social psychology tools. The psychometric information collected on people through the “mydigitallife” app was also used as a benchmark to determine what would be said in campaign speeches. Cambridge Analytica, it would appear, found a new way to undermine the democratic process via technology, one that would not become clear to the greater public or to governments until very recently.

Theoretically, this would suggest nothing different from the regular marketing practices used by corporate entities worldwide. Through the lens of public democracy, however, the very principles and institutions upon which democratic and fair elections rely, namely giving people the liberty to choose who they want to elect as their representative through informed decision-making, come under fire and are thrown out of balance, with many being none the wiser.

It should be pointed out, however, that this is not the first time that social media data has been used for election campaign or referendum purposes. Former US President Barack Obama's campaign team made canny and successful use of social media during the 2008 US election, to the extent that it has been used as an instructive model for election campaign strategy worldwide.

 

So, should we be worried?

This is where everything gets real. The short answer? Yes. Very.

Waking up to a software update on a Monday morning, whether it installed automatically or you triggered it manually, could mean lots of new features on your smartphone to enjoy. It could mean that the thousands of apps that were previously not available on your older operating system are now acquirable, with nothing but good things to enhance your online persona. More Snapchat filters! In the rush to download these apps, however, many of us won't bother checking the data permissions they require from the user, because we're too excited, happy or distracted to care.

It's Friday and you are ready to bask in the weekend vibes, and as you open Twitter you see news of a successful hacking attempt on that new app you downloaded on Monday. You shrug it off, knowing that nothing could possibly go wrong because you don't have anything to hide and the hackers can only do so much with your data, so you go back to that newly discovered world of apps, hacked as they may be.

Just like you, people justify using all these social media apps, whenever digital privacy is brought up, by saying that they don't have anything to hide. A frequent counter-argument is that if you have nothing to hide, you might as well leave your house unlocked. But privacy, online or offline, is not about having something to hide or being a criminal. It's about the right, your right, to control the things and the data that you own. Just as you wouldn't want anyone to steal your toaster, you should not let someone steal your Facebook data and make money out of it.

The Cambridge Analytica revelations are also a good opportunity to discuss why and how the online world is apparently free to access, and how these billion-dollar platforms make money. The simple answer, as we have come to learn, is by selling our data. We are how they make money. We are the product that they sell.

Facebook was built on a surveillance model that works really hard to sell its data to advertisers, political actors, and others with an interest in user demographics. This is its primary business model, one that it has been working to improve and streamline from day one. The myth that many users hold onto is that it lets them generate and/or share content for free. The reality, however, is that there is a hefty price paid by users, in the form of their data being collected through every click on the site, as well as off-site and elsewhere on the internet. The data collected comprises users' browsing history, personal and financial information, and whatever else is available online. Facebook has also expressed interest in merging users' offline data, i.e. the activities they do in the physical world, with their online content to generate a merged pool of total data, all in one place.

This collected data can further be used to determine an individual's whole personality, much in the vein of an episode of Black Mirror. Lest we consider this fiction, algorithms (which are how Cambridge Analytica's major investor Robert Mercer made his fortune) are already used to ascertain a person's likes and dislikes, political affiliation, drug addiction, sexual orientation, mental health status, and so forth. And as technology advances, this algorithmic application will keep on advancing without anyone knowing, if developers and researchers choose to secretly exploit people's data and sell it off for financial benefit.
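
To make that claim concrete, here is a minimal, hedged sketch of the general technique such profiling rests on: training a simple classifier to infer a trait from which pages a user has liked. The page names, labels and data below are entirely synthetic and for illustration only; no real model or dataset from Facebook or Cambridge Analytica is implied.

```python
# Illustrative sketch of trait prediction from page likes. All data below is
# synthetic; this is not any company's actual model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Each "document" is the list of pages a user likes; the label is a trait we
# pretend to know for a small training set (e.g. self-reported affiliation).
train_likes = [
    "gun_rights_page pickup_trucks country_music",
    "climate_action_page indie_films public_radio",
    "hunting_page nascar country_music",
    "yoga_page indie_films climate_action_page",
]
train_labels = ["conservative", "liberal", "conservative", "liberal"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(train_likes)        # users x pages matrix
model = LogisticRegression().fit(X, train_labels)

# Score a user who never disclosed their affiliation: the model infers it
# from their likes alone, which is the core of psychographic targeting.
new_user = vectorizer.transform(["country_music pickup_trucks"])
print(model.predict(new_user), model.predict_proba(new_user))
```

At real-world scale, the same idea is applied to millions of users and thousands of trait labels, which is why seemingly trivial likes can reveal sensitive attributes.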

What we're talking about isn't just the infringement of your right to privacy; it can also pose serious threats to people's physical safety, given the sensitive socio-political situation in Pakistan, which appears to be getting considerably worse.

What can I do to protect my data?

There's not a lot that a person can do to stop their data from being used for reasons they are not aware of. After the revelations concerning Cambridge Analytica, a lot of people suggested leaving Facebook to save their data from being exploited, but that approach may not work too well or yield the desired results. Facebook does not completely delete your data even when you deactivate your account. It is still stored on Facebook's servers, and is retrievable if someone intends to harvest it at any point. Besides this, Facebook has been building shadow profiles of people not registered on the website, based on information collected through their friends and other data about them available elsewhere online.

Another reason why this is not a productive approach is that deleting Facebook, a process that the company has deliberately made difficult and hard to find, is a matter of privilege because, despite all of its flaws, it is still a primary means of communication for millions of people around the world.

In light of this, holding a monopoly on mainstream online social networking has served Facebook really well since it was founded 14 years ago. Deleting one's account is not feasible, nor is it going to have a significant impact, unless and until people have an equally easy-to-use social media platform with the potential to become as mainstream as Facebook has, while adhering to strong ethical practices such as protecting people's privacy, a point of criticism that Facebook has long faced, with ongoing global consequences.

Here’s what you can do instead, to ensure that your data is reasonably safe from third parties:

a) Keep a check on which apps can access your data on Facebook and delete any app that you don’t recognize

i) Log in to Facebook

ii) Click on the drop-down arrow in the top right corner

iii) Select "Apps" in the list on the left

iv) Here, you'll be able to see all the apps that are accessing your Facebook data

v) Hover your cursor over any app, and click "x" to revoke the app's access

Note: Facebook's own disclaimer on this screen states that removing an app may not remove the data the app already holds on you. Hence, it's always good to decide mindfully whether you really want to take that quiz your friend has shared on Facebook.

b) Demand transparency from social media platforms:

Social media websites such as Facebook and Twitter hoard an exorbitant amount of data on everyone who has ever signed up to their platforms. The lack of transparency, and the toothless demands for transparency at the state level (especially in the US, Canada, and parts of Asia and Africa), ensure that they are able to get away with most of the exploitation they subject their users to.

As important as strengthened privacy settings and holding lawmakers to account are, so is holding social media corporations to account for their actions and the consequences of those actions. The power of such accountability is evident in the recent Cambridge Analytica fiasco, which, at the time of writing, had cost Facebook $37 billion in a matter of hours.

c) Demand stricter data protection legislation from lawmakers:

Pakistan does not have any data protection legislation as yet. Digital rights organisations in Pakistan have been actively advocating to push the government to draft one as soon as possible, in the wake of multiple data breaches at the national and international levels that can pose serious threats to the security of the citizens of Pakistan. One of the most important and worrisome examples is the breach of the NADRA database, on numerous occasions, by the British Government Communications Headquarters (GCHQ) and the US National Security Agency (NSA).

The Government of Pakistan has periodically passed laws that infringe people's right to privacy, the most recent of these being the Prevention of Electronic Crimes Act 2016 (PECA), which aims at regulating cyberspace and grants sweeping powers to the authorities to surveil citizens on the internet. Though ostensibly passed to counter hate speech, this law has become a tool of persecution against minority groups and people who live on the margins in Pakistan.

In such instances, it becomes the responsibility of citizens to hold their lawmakers accountable for drafting problematic legislation without transparency, and to demand laws that benefit the citizens rather than serving the vested interests of those sitting in parliament.

While individual efforts may seem unproductive and exhausting, they combine to form collective movements that can strengthen democratic processes. It is the responsibility of every person to question those in power and to demand better protections for the people in order to keep democracy strong. With that in mind, it is important to stress that you should share your data with any online platform only on the basis of informed consent, and make sure strong measures are in place to protect your information, rather than simply demonising technology that merely does the work of people with ulterior motives.

 

Hija Kamran is a C4D specialist and digital rights activist. She tweets @hijakamran.


