The bitter truth buried in recent headlines about how the political consulting company Cambridge Analytica used social media and messaging, primarily Facebook and WhatsApp, to try to sway voters in presidential elections in the US and Kenya is simply this: Facebook is the reason fake news is here to stay.
Various news outlets, and former Cambridge Analytica executives themselves, confirmed that the company used campaign speeches, surveys, and, of course, social media and social messaging to influence Kenyans in both 2013 and 2017.
The media reports also revealed that, working on behalf of US President Donald Trump’s campaign, Cambridge Analytica had obtained data from 50 million Facebook users, which it sliced and diced to build “psychometric” profiles of American voters.
The political data company’s tactics have drawn scrutiny in the past, so the surprise of these revelations came more from the “how” than the “what.” The real stunner was learning how complicit Facebook and WhatsApp, which is owned by the social media behemoth, had been in aiding Cambridge Analytica in its work.
The Cambridge Analytica scandal appears to be symptomatic of much deeper challenges that Facebook must confront if it’s to become a force for good in the global fight against false narratives.
These hard truths include the fact that Facebook’s business model is built on an inherent conflict of interest. The others are the company’s refusal to take responsibility for the power it wields and its inability to produce a coherent strategy for tackling fake news.
Facebook’s first issue is its business model. It has mushroomed into a multibillion-dollar corporation because its revenue comes from gathering and using the data shared by its audience of 2.2 billion monthly users.
Data shapes the ads that dominate our news feeds. Facebook retrieves information from what we like, comment on and share; the posts we hide and delete; the videos we watch; the ads we click on; the quizzes we take.
It was, in fact, data sifted from one of these quizzes that Cambridge Analytica bought in 2014. Facebook executives knew of this massive data breach back then but chose to handle the mess internally. They shared nothing with the public.
This makes sense if the data from that public is what fuels your company’s revenues. It doesn’t make sense, however, if your mission is to make the world a more open and connected place, one built on transparency and trust. A corporation that says it protects privacy while making billions of dollars from data sets itself up for scandal.