Facebook And Data Privacy Issues


Facebook has had to deal with a lot of issues when it comes to keeping the privacy of its users intact.

In 2006, when the company was barely two years old, it faced its first user outrage after introducing the News Feed. A year later it apologized for telling users what their friends had bought, and more recently it has faced charges of breaching data privacy rules and regulations. In its 14 years of existence, Facebook has built a history of running afoul of regulators and angering its users, all while racking up over 2 billion users and collecting record profits.

The number of IoT (Internet of Things) devices grew from 500 million in 2003 to 30.73 billion in 2020 (www.statista.com, 2021). With that growth, data collection, storage, and analytics on devices, applications, systems, and social media platforms, aimed at optimizing sales, enhancing returns, and personalizing experiences, have become ubiquitous. These practices are reshaping the global economy, the flow of ideas, and access to information, which in turn drives innovation across the information marketplace.

These disruptive forces have a notable influence on the rights of citizens such as statutory rights, due process, equal representation before the law, the right to appeal, and trial by jury—and constitutional rights like freedom of expression, voting, and non-discrimination.

Facebook has already been investigated by the Federal Trade Commission (FTC) in the United States over its violation of a 2011 consent decree and fined $5 billion. More recently, it has been fined and remains under investigation in six EU jurisdictions (France, Belgium, Italy, Spain, the Netherlands, and the German city of Hamburg) over breaches of privacy laws, and it was separately fined €110 million (around $122 million) by the EU.


Facebook is well known for tracking both users and non-users through cookies, social plug-ins, and pixels, as has been widely reported. Not having a Facebook account offers little protection, since the litany of available data sources is not limited to Facebook, and the same analysis can be applied to other records of one's personal preferences. In addition, every website that embeds the Facebook logo is linked back to Facebook and enables tracking of non-members who never opted into the service.

Web beacons, most of which are tied to cookies, are among the many similar online tracking mechanisms that work across websites, and the resulting access can be sold to interested buyers. Moreover, when real news is mixed with misinformation or otherwise unconstrained content, targeted voters can find reinforcing messages on many sites without realizing they are among the few receiving those messages, and without any warning that these are political campaign messages.
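To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of how a third-party pixel links visits across unrelated sites. The class, site names, and cookie ID are all hypothetical, but the logic mirrors what a real beacon request conveys: the tracker's cookie plus the address of the embedding page.

```python
from collections import defaultdict

class PixelTracker:
    """Illustrative simulation of cross-site tracking via a web beacon.

    Each embedding site loads a 1x1 "pixel" image from the tracker's
    domain; the browser attaches the tracker's cookie to that fetch,
    so one cookie ID links visits across otherwise unrelated sites.
    """

    def __init__(self):
        self.visits = defaultdict(list)  # cookie_id -> sites seen

    def beacon_request(self, cookie_id, embedding_site):
        # In a real request these arrive as the Cookie header and the
        # Referer header of the image fetch.
        self.visits[cookie_id].append(embedding_site)

    def profile(self, cookie_id):
        return self.visits[cookie_id]

tracker = PixelTracker()
# The same browser (cookie "abc123") visits three unrelated sites,
# each of which embeds the tracker's pixel:
for site in ["news.example", "shop.example", "health.example"]:
    tracker.beacon_request("abc123", site)

print(tracker.profile("abc123"))  # all three visits linked to one ID
```

The point of the sketch is that no Facebook account is needed anywhere in this loop; the cookie alone is the identity.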

With real-time monitoring of ad responses from targeted individuals, including real-time substitution to find the "click bait" that worked, the ad campaign was able both to maximize impact and to detect trends not normally visible at the macro scale. Swinging a few states with as few as 100,000 voters using target-focused, high-impact messages is sufficient to change election results. This may not be the only reason for the specific 2016 US election outcome, but there is every indication that it was a useful, if not critical, contribution (R.M. Bond et al.).
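The "real-time substitution" described above is essentially an online experiment over ad variants. One common way such a loop is implemented is an epsilon-greedy strategy; the sketch below, with invented variant names and click-through rates, shows the idea under those assumptions and is not a description of any actual campaign system.

```python
import random

random.seed(0)

# Hypothetical click-through rates for three ad headline variants.
TRUE_CTR = {"variant_a": 0.02, "variant_b": 0.05, "variant_c": 0.03}

impressions = {v: 0 for v in TRUE_CTR}
clicks = {v: 0 for v in TRUE_CTR}

def choose(eps=0.1):
    """Epsilon-greedy: mostly serve the best variant so far, sometimes explore."""
    for v in impressions:              # serve each variant at least once
        if impressions[v] == 0:
            return v
    if random.random() < eps:          # explore with probability eps
        return random.choice(list(impressions))
    # Exploit: the variant with the highest observed click-through rate.
    return max(impressions, key=lambda v: clicks[v] / impressions[v])

for _ in range(20_000):                # simulated ad impressions
    v = choose()
    impressions[v] += 1
    clicks[v] += random.random() < TRUE_CTR[v]  # simulated click

best = max(impressions, key=lambda v: clicks[v] / impressions[v])
```

With enough impressions the loop concentrates traffic on whichever variant performs best, which is the sense in which trends invisible at the macro scale become detectable and exploitable at the individual level.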

Although the idea that psychographic analysis can have any significant impact on behaviour has been questioned, a recent paper by Stanford professor Michal Kosinski (who was part of the 2013 Cambridge University research team) and colleagues confirmed that it can have quite a significant impact, with a sample base of 3.5 million users (S.C. Matz et al.).

In 2013, researchers at the University of Cambridge's Psychometrics Centre analysed the results of volunteers who took personality tests on Facebook to assess their "OCEAN" psychological profile (openness, conscientiousness, extraversion, agreeableness, and neuroticism) and connected it with their Facebook activity (likes and shares) ("Cambridge Analytica—The Power of Big Data and Psychographics"). The research, which drew 350,000 US participants, established a clear relationship between Facebook activity (and other online indicators) and the five-factor personality profile. It showed that the OCEAN profile of any individual could be deduced reasonably accurately from these metrics alone, without a conventional psychographic instrument. However, there is no indication that this research exposed participating Facebook users or their friends to any specific privacy abuse, and there are indications that the university refused to share the information (either the individual data or the resulting criteria) with what would become Cambridge Analytica.
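A toy version of this deduction can be sketched as a linear model from binary like-vectors to a trait score. The data below is invented for illustration, and the model is far simpler than whatever the Psychometrics Centre actually used; it only shows the shape of the inference.

```python
import numpy as np

# Hypothetical toy data: rows are users, columns are pages; 1 = "liked".
likes = np.array([
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
], dtype=float)

# Hypothetical self-reported "openness" scores for the same four users.
openness = np.array([0.9, 0.2, 0.8, 0.5])

# Fit per-page weights by least squares: likes @ w ≈ openness.
w, *_ = np.linalg.lstsq(likes, openness, rcond=None)

# Predict openness for a new user from their likes alone,
# without any personality questionnaire.
new_user = np.array([1, 0, 1, 1], dtype=float)
predicted = new_user @ w
```

Once the weights are fitted on a volunteer sample, the questionnaire is no longer needed: any account's public likes can be scored directly, which is exactly the privacy concern the Cambridge work raised.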

Once it was clear that such an analysis could be conducted, a second research project was reportedly initiated by Global Science Research (GSR), which in cooperation with Cambridge Analytica set out to identify the parameters needed to develop OCEAN profiles, using a personality quiz on Amazon's Mechanical Turk platform and Qualtrics, a survey platform. The quiz required users to grant GSR access to their Facebook profile, which, through the Facebook Open API, also granted access to their friends' data until May 2015. This is how Cambridge Analytica was able to access the Facebook data under scrutiny. Note that keeping the specific individual data was not strictly necessary to accomplish the primary research goal, which was to establish a methodology for psychographic profiling of individuals based on social media and other indicators.


Cambridge Analytica realized it could integrate this data with a range of data from social media platforms, online purchases, browsers, voting records, and more to build "5,000+ data points on 230 million US adults." By adding OCEAN analysis to the other private and public data acquired, Cambridge Analytica developed the ability to "micro-target" individual consumers or voters with the messages most likely to influence their behavior (B. Anderson and B. Horvath). The OCEAN analysis was paired with many targeted messages in "Project Alamo," which was employed for the election campaign of President Trump (R. Lindholm, "Project Alamo's Data-driven Ads on Facebook Won Trump the Election"). Some of these messages were created for the Trump campaign, and some simply leveraged "news" available on the Internet (which might have included content funded through the Russian campaign to disrupt the US elections). As described by Cambridge Analytica's CEO, the key was to identify those who might be enticed to vote for the client or discouraged from voting for the opponent (Alexander Nix, CEO of Cambridge Analytica, "Online Marketing Rockstars Keynote"). Every vote added or suppressed (in the intended way) tips the election results, which parallels analysis from the 2010 US elections.
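As a purely illustrative sketch of what micro-targeting by personality might look like, consider choosing a message variant from a voter's strongest OCEAN trait. The message variants and the voter profile below are invented; a real system would combine far more than five attributes.

```python
# Hypothetical message variants keyed by dominant OCEAN trait.
MESSAGES = {
    "openness": "Imagine a bold new direction for the country.",
    "conscientiousness": "A proven plan, step by step.",
    "extraversion": "Join millions of your neighbours at the polls.",
    "agreeableness": "Protect what your community values.",
    "neuroticism": "Don't let them put your family's safety at risk.",
}

def pick_message(profile):
    """Return the variant matching the individual's strongest trait."""
    dominant = max(profile, key=profile.get)
    return MESSAGES[dominant]

# A voter whose profile is dominated by neuroticism gets the
# fear-framed variant.
voter = {"openness": 0.3, "conscientiousness": 0.2,
         "extraversion": 0.1, "agreeableness": 0.2, "neuroticism": 0.9}
print(pick_message(voter))
```

The same trait scores that were inferred from likes upstream thus become a routing key for persuasion downstream, which is why the profiling and the messaging are discussed together.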

After Facebook was found to have "intentionally and knowingly" violated U.K. privacy law by facilitating the spread of fake news and misinformation, its CEO also failed to appear before the U.K. Parliament, which was considered contempt. The U.K. committee's final report builds on preliminary findings issued in July 2018, calls for more oversight of social media companies, and adds to growing demands for increased regulation and scrutiny of big tech firms. Much of the report focuses on Facebook's data collection practices and the Cambridge Analytica scandal, which exploited millions of users' personal data.

Facebook's numerous privacy issues are now front and center. Its loose handling of how its data was acquired by app developers has plunged the company into the biggest crisis of its 15-year existence. The revelation that a data analytics company used by Donald Trump's presidential campaign was able to surreptitiously collect data on 50 million people through a seemingly innocuous quiz app has forced CEO Mark Zuckerberg to issue a public apology and promise changes.