Facebook: The Medium Is The Message
- Facebook promised to bring people together, and it may well have done so for many.
- However, evidence is mounting that Facebook is also an ideal platform for tearing societies apart, especially for actors with bad intentions, of which there are plenty.
- While the platform's perfectly legal capabilities are already well geared to this, very lax data practices have added yet another layer to its destructive potential.
- The lax policies and years of denial show either gross incompetence, or worse, especially on the part of its CEO.
- While Facebook is only now cleaning up its act, the damage has been done and cannot be undone.
Investors in Facebook (FB) are focused on today's hearings with CEO Mark Zuckerberg, and understandably so. After all, Facebook has rewarded its shareholders immensely.
But with the Cambridge Analytica problem, the lax data protection policies and inadequate response to these problems (which were known more than two years ago), the company has come under mounting pressure.
This pressure isn't limited to the political arena; it has seeped into the stock market, where the share price has declined.
There's every chance that this pressure will subside, though. After all, Facebook's business model is incredibly profitable; together with Google (GOOG) (GOOGL), it is responsible for nearly all of the growth in the digital advertising market, where its position seems unassailable.
All it takes for Facebook to stop the rot in its share price is a combination of mea culpa and some rock-solid measures to protect the data of its users. We think that's possible. The EU is forcing companies in this space to do just that, and these practices could be applied worldwide.
So we do actually see the shares rebounding as these data policies are remedied. Is that the end of the story? Not necessarily. We think there are more important elements in the Facebook story that could set it up for a bigger backlash in the future.
In our view, the focus on data protection is way too narrow. The problem with Facebook goes way beyond the lax policies and data practices enabling the likes of Cambridge Analytica (and an unknown number of others) to appropriate its data for sinister purposes.
Facebook is a new medium, and as such it is profoundly altering the nature of societal discourse; the evidence is mounting that it's not for the better.
Media critic Neil Postman wrote a book in the 1980s called "Amusing Ourselves to Death" in which he contrasted the political discourse of what he called the typographic age in which the printed word (newspaper) was the main medium, and that of the TV age in which the image came to be dominant.
Postman agrees with Marshall McLuhan, who famously argued that the medium is the message. The typographic age of the printed word produced a rational discourse, where politicians, speaking for hours on end at rallies, spoke as if they were reading from a doctoral thesis.
In the TV age this is no longer viable because the image has reduced everything, news included, to the level of entertainment. The book comes with some strong examples, like the newscaster rattling off a series of terrible events, but viewers only responding to the stain on his tie. Stuff like that.
Now, we have little doubt Postman's views are, at times, a little exaggerated. Not all politicians of yesteryear spoke as if reading from their doctoral thesis, and there's still some serious political discourse on corners of the TV kept alive for more than just its entertainment value.
But, grosso modo, he is certainly right that new media change the nature of public discourse. Enter social media in general, and Facebook in particular.
We hope to show you in this article that Facebook, as essentially a new medium, has a rather substantial impact on the nature of political discourse. We also hope to show you that this impact isn't benign.
Much of that stems from the unique nature of the platform itself, combining an extraordinarily large audience with extraordinary means of targeting very specific subsets, crafting very specific messages for each subset, and getting instant feedback on their effectiveness.
This might sound innocuous and, for the nerd in you, a technological marvel, but the consequences for political discourse can be pretty dire.
Adding to the problems, Facebook has been extraordinarily lax with its data policies, and this has enabled certain actors to add a level of profiling based on "psychometrics," enabling sinister personalized political messaging based on personality types, exploiting hidden fears, anxieties and dispositions.
While not everybody is convinced the latter works as efficiently as its proponents claim, what's clear is that Facebook was extremely lax when faced with this problem more than two years ago. It is only now taking it seriously, because the scandal has reached a large audience and taken a toll on the share price.
Many commenters on our earlier article reacted with something like: "what's the problem? Everybody on Facebook knew they were providing personal data."
Indeed, but what probably not everybody knew was that third parties could get their hands on that data, own it (unlike advertisers), sell it, or use it for really questionable goals.
Let's start with an obvious problem, but one that's not directly attached to any misuse of Facebook by third-party app developers or their clients. While it's Facebook's mission to connect people, by nature it is adding to social and cultural (and in some countries ethnic) rifts simply because of the way the platform works.
The same artificial intelligence algorithms that power tailor-made ads power tailor-made news, shielding users from stuff they might disagree with. Now, we're aware that this criticism isn't exclusive to Facebook, or even to social media, but a feature of the splintering of the media landscape.
Nevertheless, 66% of Facebook users get news on the site; that's 44% of the population, making Facebook the largest distributor of US news. There are tens of millions of people for whom Facebook is their main, or even only, source of news, and getting only news that reinforces your existing biases has a tendency to magnify social frictions and rifts.
This is already dangerous enough. We live in a time when there's a backlash against globalization and old identities are reasserting themselves in identity politics, fraying societies into tribes in the process.
Again, this isn't new, and social media in general or Facebook in particular aren't the only force behind it. For instance, there are dozens of sectarian satellite TV channels beaming into the Middle East exploiting Sunni-Shia divisions.
"Tiger mom" author and Yale law professor Amy Chua argues in her new book Political Tribes: Group Instinct and the Fate of Nations that Americans:
We forget how unusual it is to have an extremely diverse multi-ethnic population and a strong overarching identity capable of binding people together.
Apart from explaining (according to Chua) just about every foreign policy fiasco (neo-con fantasies of Iraq becoming a model democracy after toppling Saddam and the like), that strong overarching identity is now fraying almost everywhere under the weight of splintering media, a host of opportunistic political entrepreneurs exploiting any local "us vs. them" dynamics and increasing economic inequality. And they have a whole new shiny tool in Facebook.
Digital testing ground
Digital media platforms like Facebook provide politicians with a whole new set of tools:
- They can target messages to an almost infinite combination of possible audiences with material that appeals only to them.
- They can run tests with massive numbers of variations (40,000 to 50,000 ad variants a day was normal, up to 175,000 on special days) and get instant feedback on their effectiveness.
- None of this is visible to the public at large (campaigns make use of Facebook's so-called dark posts, which are invisible to everyone but the recipient). There's little accountability, which offers endless opportunities to produce mixed or contradictory messages (saying one thing to one group, quite another to another) and outright falsehoods, and to do so very effectively. This is how Russian troll factories could deepen divisions: they simply pushed ads to both sides of divisive issues, even organizing demonstrations and counterdemonstrations in the real world.
- Facebook became the most important platform for campaign funding for one of the parties in the 2016 election.
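The testing loop described above — show many variants, observe clicks instantly, shift spend toward winners — is essentially what statisticians call a multi-armed bandit. A minimal sketch of that loop, where the ad variants and their click rates are entirely hypothetical:

```python
import random

def epsilon_greedy_ad_test(true_ctr, rounds=10000, epsilon=0.1, seed=42):
    """Toy variant testing: show ads, observe clicks, favor apparent winners.

    true_ctr: hypothetical 'real' click-through rate of each ad variant,
    unknown to the campaign, which only sees click feedback."""
    rng = random.Random(seed)
    shows = [0] * len(true_ctr)
    clicks = [0] * len(true_ctr)
    for _ in range(rounds):
        if rng.random() < epsilon:
            # Explore: occasionally try a random variant.
            i = rng.randrange(len(true_ctr))
        else:
            # Exploit: show the variant with the best observed click rate.
            i = max(range(len(true_ctr)),
                    key=lambda j: clicks[j] / shows[j] if shows[j] else 1.0)
        shows[i] += 1
        # Instant feedback: the viewer clicks or doesn't.
        clicks[i] += rng.random() < true_ctr[i]
    return shows, clicks

# Three hypothetical ad variants with different (unknown) click rates.
shows, clicks = epsilon_greedy_ad_test([0.02, 0.05, 0.11])
```

After enough rounds, spend concentrates on whichever variant the audience responds to, with no human judgment about the message's coherence or truth — which is exactly the dynamic the article describes.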
While the issue is way bigger than just the Trump 2016 presidential campaign, unfortunately most of the examples available in public sources come from that campaign. The issue isn't Trump, but how Facebook is undermining rational political discourse and fraying societies.
Here's the Motherboard article (a reprint of the Das Magazin article that was one of the first to expose this stuff) on the proliferation of versions of the same ads, a logic Trump himself seems to have internalized:
Trump's striking inconsistencies, his much-criticized fickleness, and the resulting array of contradictory messages, suddenly turned out to be his great asset: A different message for every voter. The notion that Trump acted like a perfectly opportunistic algorithm following audience reactions is something the mathematician Cathy O'Neil observed in August 2016.
This behavior looks to have been, at least in part, the result of the feedback his campaign got about the effectiveness of different messages. There's actually a good deal of method in this seeming madness (NY Books):
The Trump world was more like, “Let’s say a lot of different things, they don’t even necessarily need to be coherent, and observe, through the wonderful new platforms that allow you to observe how people respond and observe what works, and whatever squirrel everyone chases, that’s going to become our narrative, our agenda, our message.”
Trump does it in the open, but the highly targeted dark posts on Facebook allow campaigns to target different audiences with different messages, and to experiment endlessly with fine-tuning the message through instant feedback, while any inconsistencies or outright conflicts between the messages remain hidden.
Out of the window: rational political discourse. Throw stuff at the wall and see what sticks; or more precisely, throw hundreds of thousands of different messages and versions at hundreds of walls, the targeted audiences. The messages don't bite one another, because Facebook allows targeting and dark posts, and the platform provides an evolution-like sorting through instant feedback on effectiveness.
Weaponized identity politics
Facebook data in the hands of malign actors weaponizes this tendency toward tribalism:
- Fake news
- Playing on hidden fears and dispositions through psychometrics
- Creating a post-truth society
As the comment sections of many websites testify, the Internet already has a tendency to bring out the worst in us. But against the background of tribal identity politics sketched above, these tendencies have been put on steroids and exploited by forces like the operatives and clients of Cambridge Analytica.
With the splintering of the media landscape, the demise of trusted gatekeepers, and Internet economics that pay by the click and favor sensationalist headlines, there has been a proliferation of fake news.
Combine the self-sorting algorithms creating a self-reinforcing, biased news loop with fake news, and you get a man walking into a pizza parlor with a gun because he read that politicians ran a pedophilia ring there. Stuff like that.
Facebook has been acutely aware of the fake news problem but has been very reluctant to do anything about it, so as not to be seen as partisan, even while it could have, according to sources with direct knowledge of Facebook's decision making (from Gizmodo).
Needless to say, combined with the extremely effective targeting algorithms and the outsized Facebook community, fake news is another powerful tool in the hands of those with dubious intentions, arming those willing to exploit all kinds of social divisions.
You might want to read this Medium story, which demonstrates how easy it is to create a fake news website and promote it on Facebook, using nothing more than Facebook's advertising tools.
Basically anybody can do this, and as the article explains, it's really cheap. The Russians did it as well (Medium, our emphasis):
While Facebook has recently disclosed that 10 million people were exposed to Facebook ads from Russian operations promoting fake news, our analysis leads us to believe the total reach, including unpaid visibility and engagement via this snowball effect, would have likely been somewhere between 20 to 100 million people.
And, of course, these numbers have kept on rising in newer assessments. Here's a newer one from Facebook itself (from the FT, our emphasis):
The release came as Facebook revised the number of people reached by the Kremlin-connected troll farm, called the Internet Research Agency, to at least 150m people, higher than the 126m reported earlier this week, and significantly more than it initially indicated ahead of government inquiries.
Apart from that FT article, here (and here and here) are more examples of these Russian ads. There are studies concluding that fake news had a substantial impact on voter behavior in the 2016 election.
But this is still simple stuff. The essence of the Facebook data illicitly obtained by Cambridge Analytica is:
- CA owns that data (unlike advertisers).
- This enabled it to create a sophisticated psychometric model, allowing better fine-tuning and preying on people's innermost fears and dispositions.
Preying on hidden fears and dispositions
Psychometrics basically tries to assess humans on the basis of five personality traits (the "Big Five": openness, conscientiousness, extraversion, agreeableness, neuroticism). For a long time the problem was data, as collecting it depended on subjects filling out complicated, highly personal questionnaires.
Cambridge scientist Michal Kosinski (with associate David Stillwell) was the first to use Facebook for this. To his considerable surprise, millions filled out his questionnaire. They then combined these data with whatever else was known about their subjects to look for regularities, and constructed a predictive model on the basis of Facebook likes.
Here is Motherboard (our emphasis):
The strength of their modeling was illustrated by how well it could predict a subject's answers. Kosinski continued to work on the models incessantly: Before long, he was able to evaluate a person better than the average work colleague, merely on the basis of ten Facebook "likes." Seventy "likes" were enough to outdo what a person's friends knew, 150 what their parents knew, and 300 "likes" what their partner knew. More "likes" could even surpass what a person thought they knew about themselves. On the day that Kosinski published these findings, he received two phone calls. The threat of a lawsuit and a job offer. Both from Facebook.
It's crucial to understand that the model:
Not only can psychological profiles be created from your data, but your data can also be used the other way round to search for specific profiles: All anxious fathers, all angry introverts, for example - or maybe even all undecided Democrats?
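The mechanics behind such a model are not exotic. Here is a deliberately tiny sketch of both directions described in the quote — predicting a trait from likes, and searching for users who match a profile. Every page name, user and trait score below is made up for illustration; real models use millions of users and far more sophisticated statistics:

```python
# Toy training data: each user is a set of page likes plus a hypothetical
# neuroticism score (0-1) obtained from a questionnaire.
train = [
    ({"home_security", "insurance", "news"}, 0.9),
    ({"home_security", "news"},              0.8),
    ({"duck_hunting", "family_bbq"},         0.2),
    ({"duck_hunting", "family_bbq", "news"}, 0.3),
]

def like_weights(train):
    """Weight each page by the average trait score of users who liked it."""
    totals, counts = {}, {}
    for likes, score in train:
        for like in likes:
            totals[like] = totals.get(like, 0.0) + score
            counts[like] = counts.get(like, 0) + 1
    return {like: totals[like] / counts[like] for like in totals}

def predict(weights, likes):
    """Predict a trait as the mean weight of a user's known likes."""
    known = [weights[l] for l in likes if l in weights]
    return sum(known) / len(known) if known else 0.5

def find_profiles(weights, users, threshold=0.7):
    """The other direction: search for users matching a trait profile."""
    return [name for name, likes in users.items()
            if predict(weights, likes) >= threshold]

weights = like_weights(train)
anxious = find_profiles(weights, {
    "alice": {"home_security", "insurance"},
    "bob":   {"duck_hunting", "family_bbq"},
})  # finds "alice", the high-neuroticism profile
```

The point of the sketch is that once likes are linked to trait scores for even a sample of users, scoring and searching the whole user base becomes a trivial lookup, which is why the scale of the harvested data mattered so much.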
And once you can select people who share certain personality traits, you can exploit that with targeted messages preying on their most hidden fears and dispositions. Here's former (and now disgraced) CA CEO Alexander Nix with an example:
Nix shows how psychographically categorized voters can be differently addressed, based on the example of gun rights, the Second Amendment: "For a highly neurotic and conscientious audience the threat of a burglary - and the insurance policy of a gun." An image on the left shows the hand of an intruder smashing a window. The right side shows a man and a child standing in a field at sunset, both holding guns, clearly shooting ducks: "Conversely, for a closed and agreeable audience. People who care about tradition, and habits, and family."
This stuff isn't surprising. Facebook itself has been mining users' emotional states and sharing this info with advertisers.
We stress again that this article isn't about the Trump campaign, but most of the examples in publicly available material come from that campaign. Using Facebook's dark posts (only visible to recipients), the campaign (from NY Books):
“We have three major voter suppression operations under way,” a senior campaign official told Bloomberg’s Green and Issenberg. One targeted idealistic white liberals - primarily Bernie Sanders’s supporters; another was aimed at young women - hence the procession of women who claimed to have been sexually assaulted by Bill Clinton and harassed by the candidate herself; and a third went after African-Americans in urban centers where Democrats traditionally have had high voter turnout. One dark post featured a South Park–like animation narrated by Hillary Clinton, using her 1996 remarks about President Bill Clinton’s anti-crime initiative in which she called certain young black men “super predators” who had to be brought “to heel.” “We’ve modeled this,” the unnamed senior campaign official told Green and Issenberg. “It will dramatically affect her ability to turn these people out.” And it did. Democratic turnout in battleground states was weak, which was crucial to Trump’s victory.
And there are an unknown number of other "CAs" out there. How do we know? CA wasn't the only one harvesting data through third-party apps. There were other researchers, and when Facebook closed the API (the loophole through which data of friends could be harvested), quite a few of these third-party app developers disappeared or went bankrupt. New examples keep popping up (from CNBC):
Data analytics firm CubeYou used personality quizzes clearly labeled for "non-profit academic research" to help marketers find customers. One of its quizzes, "You Are What You Like" which also goes by "Apply Magic Sauce," states it is only for "non-profit academic research that has no connection whatsoever to any commercial or profit-making purpose or entity."
Combine outrage and fear, which already thrive on the Internet as the stuff that tends to generate the most clicks, with precision targeting (both the variety available to Facebook advertisers and the more sophisticated stuff based on CA's psychometric modeling), and you enable actors who understand these instruments to engage in an all-out assault on rational political discourse. The end result is actually already visible in some places, like Russia.
Post-truth world
Perhaps the most disturbing thing is that the concept of fake news itself has been weaponized, most notably in Russia, which has always excelled at psychological warfare and turning people against one another, according to Timothy Snyder, author of the new book The Road to Unfreedom.
When Putin came to power, one of the first things he did was concentrate the media (from an interview with Snyder on Vox):
He marginalized print media in favor of television, which he can more easily control. Second, he got rid of local news entirely, so that news is exclusively about larger themes of national greatness or injustices against Russia. He then unified television media so that there are five or six channels that are all peddling different stories that essentially transmit the same pro-Russia, pro-Putin message but in confusing and contradictory ways.
According to the interviewer (Sean Illing), they used the instruments and techniques of Cold War-era disinformation:
They flooded their society with misinformation, then they attacked the institutions responsible for finding the truth, and then they capitalized on the confusion that followed.
The basic message is, according to Snyder:
That you can’t really trust any of the information. That the whole world is conspiring against Putin and Russia. The real message isn’t that Russia is great or that any particular ideology is great; it’s that you can’t trust anyone or anything. There’s no reason to believe in anything. There is no truth. Your institutions are bogus... Instead, all you hear in Russia are fears about the West or fears about religious or ethnic minorities corrupting Russian society.
They have now exported these methods abroad, meddling in a host of democratic processes, and the Facebook platform and the data out there are a veritable gold mine for the Russian troll factories engaged in this stuff. Summing up:
- The advertising and newsfeed businesses have a click-based earnings model that tends to thrive on fear, outrage and hyperbole, simply because that's the stuff people are most likely to click on. Truth, facts and nuance already face an uphill struggle in such a media landscape.
- Facebook combines the biggest community of users with exceptional targeting technology as its business model.
- Self-sorting algorithms create self-reinforcing feedback loops. These were created for advertisers (to make ads increasingly relevant and less obnoxious), but applied to newsfeeds they tend to cocoon people in stuff they want to hear and that confirms their views.
- Many people get most, or even all, of their news from FB.
- Facebook, through extreme targeting, enables large-scale experimentation and instant feedback, allowing incredible, evolution-like progress in the fine-tuning of messages. The platform enables actors to send different messages to different groups, preying on different fears, hopes and dispositions, creating a world of fragmented discourse and different versions of reality. This is, after all, the core technology powering Facebook, earning tens of billions from advertising, simply repurposed for political ends.
- These instruments can easily be weaponized by malign actors.
- CA's psychometric profiling allowed campaigns to tap even more effectively into these hidden fears and dispositions, enabling another level of targeting.
- This could easily facilitate an all-out assault on the truth and the end of any rational political discourse.
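The self-reinforcing feedback loop in the bullets above can be illustrated with a toy feed ranker. The topics, click propensities and parameters below are all hypothetical; the sketch only shows the mechanism: every click pushes similar content up the ranking, so exposure narrows over time.

```python
import random

def simulate_feed(rounds=200, seed=1):
    """Toy filter bubble: rank topics by past clicks; clicks boost future rank."""
    rng = random.Random(seed)
    topics = ["partisan_left", "partisan_right", "local_news", "science"]
    clicks = {t: 1 for t in topics}   # engagement counts, smoothed to avoid zeros
    # Hypothetical user: strongly drawn to one partisan topic.
    leaning = {"partisan_left": 0.8, "partisan_right": 0.1,
               "local_news": 0.4, "science": 0.3}
    shown = {t: 0 for t in topics}
    for _ in range(rounds):
        # Rank by past engagement; show the top topic 90% of the time.
        ranked = sorted(topics, key=lambda t: clicks[t], reverse=True)
        t = ranked[0] if rng.random() < 0.9 else rng.choice(topics)
        shown[t] += 1
        if rng.random() < leaning[t]:
            clicks[t] += 1            # feedback: a click boosts future ranking
    return shown

shown = simulate_feed()
```

Run it and the favored topic quickly crowds the others out of the feed, even though the ranking rule is nothing more sinister than "show people what they engage with."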
We don't think much can actually be done about this, short of barring political ads on Facebook. The scandal now surrounding Facebook is focused on how third-party app developers could access and own data, and use it for sinister purposes.
Although not everybody agrees that the psychometric targeting is effective, it is actually only a small part of the problem, even if Facebook management was terribly lax with private data and slow to respond to a problem known more than two years ago.
That damage has been done, that data is out there, and even with Facebook's remedial actions now, and without any psychometric profiling, there are other ways in which Facebook data can be scraped.
The medium is indeed the message. Unfortunately, platforms like Facebook tend to contribute to centrifugal forces that reinforce social, cultural and ethnic divisions in society, as we have tried to show in this article. It's inherent in the business model.
This isn't just a problem for the US (or just Democrats, or Republicans). There are signs that Facebook is having a corrosive effect on many societies and has contributed to stuff like the Rohingya ethnic cleansing in Myanmar, for instance.
This article was written by
I'm a retired academic with three decades of experience in the financial markets.
Analyst’s Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.