What is QUANTech? An easy-to-digest selection of what's hot in tech and its impact on society, to help you stay ahead in this rapidly changing digital world.
We may be witnessing a watershed moment. In the aftermath of last week's QUANTech publication, The Guardian broke the news with a resounding testimony from a whistleblower. In short, we learned about Facebook's biggest ever data leak: the personal information of 50 million American Facebook users was handed over to the British firm Cambridge Analytica, which "uses data to change audience behavior", as stated on its website. The story has turned into Facebook's biggest crisis to date. A "major breach of trust", as Mark Zuckerberg himself acknowledged on CNN last Wednesday.
Most informed Internet users know that when a service is free, you are the product. Facebook's business model is advertising: 98% of its 2017 revenue came from it. At the heart of an advertising-based business model lies data, and for a platform built on social interactions, personal data. In other words, Facebook's business model is inherently at odds with the privacy the company claims to care about. The more time you spend on the platform, the better it knows you and the finer its targeting options become. The more precise the targeting, the more advertisers pay to leverage it. And there you have it: the way Facebook makes billions.
That objective is reached by collecting as much data as possible from as many sources as possible. To collect it in bulk, Facebook aims to retain users on the platform as long as possible so they generate ever more data. It is a race for attention. That is why the company consciously built a platform that exploits a vulnerability in human psychology, as acknowledged by Sean Parker, Facebook's first president. This is the so-called attention economy the Center for Humane Technology is fighting. The result (platform structure and notification system aside) is an algorithm that surfaces the content users crave, pulling them back to the platform over and over. And since fake news seems to spread better than accurate information, false information actually helps Facebook make more money. Not a flattering thing for the social network to admit, but it is likely nothing but reality.
Facebook's interest is meeting shareholders' expectations and making profits. The wholesome social ethos serves that goal, not the reverse. Users' interest is social interaction, and in this sense Facebook is an amazing product that has expanded our means of communication. We should be thankful for that. All in all, such contradictory interests were bound to become a threat to the social fabric and to democracy, which is what we have been experiencing these past months.
THE BREAKING STORY
Facebook users can give app developers access to their personal information when using "Facebook Connect" to sign in to third-party apps. Until 2014, app developers could access not only the app users' data but also the data of all of those users' friends. That is what led to last week's breaking news, after a year of investigation: an app called "thisisyourdigitallife" gained access to the data of 50 million American Facebook users even though only 270,000 people used it. Those users were paid a few dollars to take a personality test (the app's stated purpose) and had to be eligible to vote in the US.
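The scale of that fan-out follows from simple arithmetic, which a minimal Python sketch can make concrete. This is a toy model, not Facebook's actual API: the population and install counts in the simulation are illustrative assumptions; only the 270,000 and 50 million figures come from the story.

```python
import random

# Figures reported in the story
consenting_users = 270_000
profiles_exposed = 50_000_000

# Average number of friends per consenting user implied by those figures
avg_friends = profiles_exposed / consenting_users
print(f"~{avg_friends:.0f} friends per consenting user")  # ~185

# Toy simulation: each consenting user drags ~185 friends' profiles along
# with their own. Population and install counts are illustrative, not real.
random.seed(0)
population = 1_000_000   # hypothetical platform user base
sample_users = 1_000     # users who actually install the app
harvested = set()
for _ in range(sample_users):
    harvested.update(random.sample(range(population), 185))

print(f"{len(harvested):,} profiles reached from {sample_users:,} installs")
```

The point of the sketch: because one user's consent unlocked all of that user's friends, the number of profiles reached grows roughly 185-fold relative to the number of people who ever saw the app.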
Aleksandr Kogan, the Cambridge University researcher who oversaw the app, sold the data to the newly created firm Cambridge Analytica, which reportedly used it for the 2016 US presidential election. A four-month undercover investigation exposed the insidious methods used by the company: a high-quality piece of journalism, well worth watching.
In 2015, The Guardian informed Facebook about the so-called breach, which looked more like a leak, since opening data doors to advertisers and app developers was an inherent part of Facebook's business model. Facebook required Cambridge Analytica to delete the data and to confirm it had done so by signing and returning a form. No extensive audit was carried out to verify the data had actually been deleted, nor were the 50 million people whose personal information had leaked without their explicit consent ever informed.
For over 48 hours, neither CEO Mark Zuckerberg nor chief operating officer Sheryl Sandberg reacted. The chief executive eventually posted, as usual, a long-form statement on his Facebook wall to take responsibility and suggest measures. The post was followed by interviews with several media outlets (The Guardian aside), from CNN to Wired and The New York Times.
Meanwhile, the matter sparked a #DeleteFacebook campaign on Twitter, which gained momentum when WhatsApp co-founder Brian Acton tweeted "It is time. #deletefacebook". Elon Musk stepped it up a notch, replying "What's Facebook?". Challenged to delete the Facebook pages of both SpaceX and Tesla, each with millions of followers, he did so almost immediately. For his part, former Twitter CEO Dick Costolo commented that he did not see any "big movement" behind the campaign. Others dug up Zuckerberg's 2004 messages in which he called users "dumb f*cks" for foolishly sharing their data with him; actor Jim Carrey used the quote to weigh in on the fallout.
The momentum was amplified by dozens of articles in some of the biggest news outlets, explaining how to download one's personal data (some users woke up to the sheer amount of data Facebook had piled up) and how to delete a Facebook account. Notably, a new cover of The Economist sums up the situation concisely.
CAN THEY BE TRUSTED?
Following the breaking story, Mark Zuckerberg announced a set of initiatives, among them a promise that his teams will review all apps that accessed significant amounts of data before the stricter 2014 rules applied, and that people whose data leaked will be notified. However, his mea culpa and action plan do not erase the key questions. Why did Facebook wait two years before auditing thoroughly, informing the users involved and acting to prevent another leak? Amateurism, or a data far west? Why does the company say it cares about the matter when we know it tried to sue The Guardian to prevent last week's publication? Most people can reasonably conclude that Facebook did not intend to take serious steps until it was exposed and left with no other choice. This hypothesis is backed by former Facebook platform operations manager Sandy Parakilas, who told The Guardian that he had warned the company that such a lax approach to data protection risked a major breach. The problem is that the company seemed exclusively focused on growth. Is everything happening to Facebook today a logical consequence of its notorious early motto, "move fast and break things"? It is a question worth asking. TechCrunch compiled other unsettling questions.
With all the pieces put together, how could anyone seriously believe the sincerity of Mark Zuckerberg's mea culpa?
What's more, this data leak goes beyond sheer consumer-product advertising, which raises another serious question: is it healthy to use a platform prone to exposing users, through persuasion or manipulation tactics, not only to consumer advertising but also to political targeting? The story draws even more attention in the US because the Cambridge Analytica matter is tied to the highly polarising 2016 US election.
The storm Facebook is going through could help raise awareness of what living in a data-driven society means, and help avoid social cooling. With 2.2 billion users, Facebook has become an inherent part of social life and is considered by some an entity to be treated like a country. Democracy is at stake. Voices are rising for Mark Zuckerberg to testify before the US Congress, the European Parliament and the British Parliament, to tell his version of the story in front of the representatives of millions of citizens. For its part, Denmark is the very first country to have appointed an ambassador for diplomatic relations with technology companies, so-called "techplomacy". A move that may well be followed by others.
But the trickier issue is whether we should start questioning the advertising business model of the Internet itself. Tim Berners-Lee, the founder of the World Wide Web (WWW), recently wrote in The Guardian about the need to fight "the myth that advertising is the only possible business model for online companies". As we move from fairly innocent consumer advertising to mass manipulation campaigns for political purposes that potentially threaten democracy, the pressure to rethink this model grows. That said, it is important to point out that we cannot yet measure how much such political campaigns actually influenced, say, the 2016 US election.
The suggestion to rethink the business model of Facebook and other online companies was backed by Roger McNamee, an early Facebook investor and one-time mentor to Mark Zuckerberg, in a video interview with Channel 4 News.
FOOD FOR THOUGHT
At the end of the day, this whole story has the positive effect of raising important questions about data protection and privacy, and more generally about how to organise a data-driven society. Three steps seem necessary.
First off, there is an urgent need to foster digital culture at all levels of society. We have to understand that digital literacy now matters as much as literacy itself. Education systems everywhere should adapt their curricula, especially for younger generations. Too many online services have operated in a digital far west with the supposed consent of billions of people, instead of the informed and reasoned consent that using their data requires.
Secondly, the industry should challenge the status quo of the Internet's advertising-based business model. Perhaps the biggest mistake at the start was that we Internet users got used to everything online being free. We must learn, or relearn, to pay for services and/or find ways to make them function healthily. Should a Facebook be profitable, or operate like a non-profit for the public good?
Thirdly, and more generally: how can countries remain competitive in a transnational digital economy where some have strict regulation and others lax rules? We need to find the right balance between regulation and economic interests. This will be tough, as change comes fast, but we had better get together and work at it.
You can find a compilation of tweets about the Facebook-Cambridge Analytica matter here.