Employees vs Customers
For companies, the very perception of “being evil” can seriously impact their reputation among a wide range of groups — employees, customers, journalists, politicians, and more.
Facebook is learning this the hard way, on several fronts at the same time.
Back in 1999, when Google’s team grew from a handful of people to a meeting-room-sized group, co-founders Larry Page and Sergey Brin launched a weekly town hall called “TGIF.” The employees would ask questions and the leaders would try to answer, as transparently as possible. During these meetings, concerns could be aired, but the company expected that all conversations would be kept “in the room.”
The tradition continued for years, but everything changed in 2016. After Donald Trump won the presidential election, many employees used the next TGIF meeting to express their outrage. Someone recorded the meeting and sent the tapes to Breitbart, the right-wing media outlet that regularly ran articles decrying Google’s anti-conservative bias. Leaks of the TGIF conversations continued; by 2019, Page and Brin had stopped attending the meetings. In November of that same year, Google CEO Sundar Pichai essentially cancelled the TGIF meetings, replacing them with less frequent sessions that focused exclusively on product and business strategy.
When Mark Zuckerberg was building Facebook, he tried to emulate Google’s culture of transparency. The social media company implemented weekly “ask us anything” sessions. Like Google’s TGIF meetings, the Facebook ones remained confidential for years before people started recording and sharing the conversations with the media. But unlike Google, Zuckerberg decided to continue with the meetings, knowing full well that some employees would periodically leak his comments.
Executives can hold a small team together, but when a company grows to thousands of employees, they can’t expect complete alignment on any issue. And Zuckerberg was right about one thing: if you host company-wide conversations, you need to accept that your words will not remain private.
Case in point: over the summer of 2020, one or more employees recorded the audio (plus additional screenshots) from FOUR MONTHS of meetings. The rogue Facebook employee(s) delivered the trove of recordings to Casey Newton, the Verge’s Silicon Valley editor.
On September 23, Newton published his summary of those meetings, “Mark in the Middle” (I highly recommend reading it). Some of the conversations focused on lighthearted topics, like Zuckerberg’s skin-care regime.
But employees also raised more serious criticisms. In particular, many employees expressed concerns about Facebook’s (perceived) refusal to address extreme right-wing content on their platform. Zuckerberg pushed back on this idea:
One of the things that we talk about a little bit less inside the company is that ... the community we serve tends to be, on average, ideologically a little bit more conservative than our employee base. Maybe ‘a little’ is an understatement. … If we want to actually do a good job of serving people, [we have to take] into account that there are different views on different things, and that if someone disagrees with a view, that doesn’t necessarily mean that they’re hateful or have bad intent… I want to make sure that people here recognize that the majority of the negative sentiment that we have faced, measured by write-ins from our community, is actually generally coming from more conservative-leaning folks who are concerned about censorship.
There is no “maybe” about Zuckerberg’s first point: Facebook users are FAR more conservative than Facebook employees.
Most Facebook users are not aware that Facebook’s algorithm estimates the political leanings of each person’s profile. According to Pew Research Center’s analysis of Facebook data, users are “roughly equally divided between those classified as liberal or very liberal (34%), conservative or very conservative (35%) and moderate (29%).”
While Facebook users span the political spectrum, the majority of Facebook employees are not only left-leaning (like most white-collar professionals), but also supportive of the most left-wing leaders in the Democratic party. A February 6 article in Vox surveyed political donations from Facebook employees (for Q4 2019). To no one’s surprise, Trump received negligible support. But Joe Biden — the eventual Democratic nominee — raised far less money from Facebook employees than candidates with more left-wing positions (Bernie Sanders and Elizabeth Warren) or more unconventional platforms (Andrew Yang).
In Marketing BS newsletters, I frequently argue that your customers don’t really care about your company’s political values, but your employees do care — a lot. As a result, many companies tend to support the causes and moral stances of their employee base, not their customers. When companies launch marketing campaigns about social issues, the primary goal is employee recruitment and retention, not customer acquisition.
But Facebook seems, at least on the surface, to be an exception to this rule.
Employees are not the only group concerned about Facebook’s (perceived) lack of responsibility for dealing with extremist political activity on the platform. From media outlets to activist organizations, everyone seems to have labelled Facebook as an “evil” company.
Zuckerberg might refute this opinion, but Facebook is facing greater pressure than ever before. Take, for instance, this summer’s widespread advertiser boycotts of Facebook. I previously wrote about the boycott’s lack of financial impact, but Facebook certainly suffered some reputational damage.
So how can Facebook address its image as an evil company?
On May 6, Zuckerberg announced the creation of the Facebook “Oversight Board” — an independent entity that will make content moderation decisions for the company. The board members, drawn from 27 countries, possess expertise in constitutional law, religious freedom, digital activism, and other relevant fields. The board includes a number of esteemed members, including a former Prime Minister, an editor of a major newspaper, and a Nobel Peace Prize winner.
Last week, a number of civil rights activists formed Citizens, a non-profit organization dedicated to monitoring the actions of major tech companies. Citizens established their own check on the social network — The Real Facebook Oversight Board.
Axios obtained a “pitch deck” for the initiative, which alleges that Facebook’s official Oversight Board is “little more than a corporate whitewashing exercise.” Axios published a summary of the document’s key ideas:
· It suggests that the new [real oversight board] will rely heavily on driving media awareness around issues like voter suppression, election security and misinformation.
· “We will use stunts, viral video, celebrity endorsement and skillful media management to throw a spotlight on the real-time threats to democracy from the misuse of social media platforms and big tech,” the document says.
· “We know how to make a noise,” it continues. “Democracy needs its own PR team and creative agency. We are it.”
The first “stunt” pulled off by Citizens was audacious. In a September 5 tweet, they quoted the Axios article that quoted them:
Let’s get this straight:
Citizens wrote an internal document criticizing Facebook’s Oversight Board.
Axios received a copy of the document and published some verbatim excerpts.
Citizens tweeted a piece of (fake) breaking news: a media outlet had obtained a document revealing that Facebook’s Oversight Board is a sham.
Citizens used Axios’s name to legitimize the news.
On Citizens’ own website, they claim that Facebook’s “tools are being used to spread lies.” And yet, Citizens employed a PR tactic that blatantly misrepresented the truth!
Facebook, more than any of the other big tech companies, has been thrown into the “bad company” box. Consequently, many activists seem to believe that the ends justify the means; any strategy is legitimate because they are fighting “evil.”
Update on the Citizens tweet: it looks like they faced some oversight on their oversight board for Facebook’s oversight board.
Everything new is bad
How did Facebook find itself in the crosshairs of so many critics?
In part, Zuckerberg’s insistence on staying “neutral” continues to generate tension. By trying not to upset anyone, Facebook has managed to upset everyone.
More importantly, though, Facebook’s actions are met with suspicion because social networks are a relatively novel concept. New things — with unknown dangers — tend to intimidate people.
Netflix has created a lot of buzz with “The Social Dilemma,” a “documentary-drama hybrid [that] explores the dangerous human impact of social networking.” One of the “tech experts” in the film, Tristan Harris from the Center for Humane Technology, looks directly at the camera and says the following:
No one got upset when bicycles showed up. As everyone was starting to go around on bicycles, no one said, “Oh my God, we just ruined society. Bicycles are affecting people, they’re pulling people away from their kids. They’re ruining the fabric of democracy, people can’t tell what’s true.” We never said any of that stuff about a bicycle.
In fact, Harris is comically incorrect — that was exactly what happened when the bicycle first hit city streets. The Pessimists Archive — a website and podcast dedicated to helping us remember our “historical pessimism” — produced an episode about the origins of the bicycle.
Pessimists Archive found some amazing newspaper headlines from the late 1800s:
“Excessive use of bicycle fatal”
“Do bicycles increase selfishness?”
“Bicycles are blamed for youth’s insanity”
“Bicycles affect church attendance”
Here’s one final (and particularly ludicrous) claim: women riders suffered from “bicycle face,” which made them too unattractive for marriage!
Quite simply, many people believed that the bicycle — a new invention — was a scourge on the nation.
Sound familiar? People might not be talking about “social media face,” but many claims are just as preposterous.
The Social Dilemma, for example, features fictional plot lines and some clear exaggerations. Casey Newton, in a different column, highlights the obvious contradictions:
It is more than a little ironic that a film that warns incessantly about platforms using misinformation to stoke fear and outrage seems to exist only to stoke fear and outrage — while promoting a distorted view of how those platforms work along the way.
The film attacks both Facebook itself and the Facebook-owned “WhatsApp.” But the film fails to mention the fundamental differences between these platforms. Facebook uses an algorithmic feed — you only see a small percentage of your friends’ posts, based on Facebook’s prediction of your likes and interests. WhatsApp, on the other hand, is a communication channel with no algorithmic feed — just private messages sent between friends. If both of these platforms are so dangerous to individuals and society, then why are we comfortable with email? Why weren’t we alarmed by the privacy invasions of the telegraph? Or the corruption of our youth caused by the novel?
Newness and nuance
When critics label Facebook an “evil” company, they ignore two important pieces of nuance:
Media channels have long been used for political messages. Think about the different ways that leaders used radio broadcasts, from Franklin Roosevelt’s fireside chats to Adolf Hitler’s propagandistic rallies.
Mark Zuckerberg (and COO Sheryl Sandberg) are human. Like most people, they need to balance their personal values and their professional decisions. And like everyone, they are fallible.
If people want to curtail the political extremism on social media platforms, they need to acknowledge that technology is not the root cause. Political manipulation is not an unprecedented phenomenon. For example, Russia has been attempting to meddle in US elections for the better part of a century.
Today, no one is concerned about the subversive dangers of radio. Forty years from now, I expect that everyone will laugh at the idea that social media was ever seen as a threat to society. Hateful messages and deceptive campaigns have always existed — they have just adapted and evolved with each technological breakthrough. Critics tend to focus on the latest tools and overlook a common human behavior: fear of the new and unknown.
If Facebook disappeared, do you really think electoral interference and misleading political ads would suddenly vanish? Of course not. Plus, parents would still find reasons to be irritated by their kids’ latest source of distraction.
But given Facebook’s intense criticism from both the media and employees, why doesn’t Zuckerberg just change the company’s policies about political commentary?
In 2019, Twitter decided to stop accepting political ads; they earned widespread praise. Facebook could have followed suit. The cynical belief that “money talks” doesn’t even hold water — Facebook generates minimal revenue from political ads. So why didn’t Facebook take a position that might have helped them shed the “evil company” label?
Despite Zuckerberg’s repeated vows of neutrality, Facebook undertakes many actions that would be generally perceived as progressive. For example, the company has endorsed a massive voter registration drive that Trump has claimed is “nothing less than an attempt to ultimately benefit Biden and the Democrat Party.”
And Sandberg has spoken about Facebook’s support for social issues:
Facebook had been used to raise billions of dollars for charity and had also been an important tool for activists around the world… We don’t get credit for any of those movements... but the brave women who spoke out on Me Too, the brave people who spoke out on Black Lives Matter, the brave people who organized the Women’s March — they needed the tool. There’s a reason this is happening now and it didn’t happen before.
Neither Sandberg nor Zuckerberg wear their political beliefs on their sleeves, but ample evidence suggests left-of-centre opinions on many issues. Take Sandberg’s description of her own leanings: “I have a very strong point of view on this president. It’s a personal point of view. It’s one I hold deeply.” You don’t need to read far between the lines to understand she was not expressing a positive view. Likewise, Zuckerberg has argued that — when it comes to public disagreements with the president — he has gone further than “pretty much any other corporate CEO.”
Clearly, though, Facebook has not taken the progressive positions that the media and employees WANT to see. Why not? Why are they not “marketing to employees” like so many other brands?
Zuckerberg seems to genuinely believe that his platform, like television, radio, billboards, print and newspapers, should aim for neutrality. In his mind, the social network was not created to censor people running for political office. Many CEOs buckle at the first signs of public outrage. But Zuckerberg maintains full control of Facebook. He can’t be fired. For better or for worse, he can stay true to his principles, even if many people disagree with his stances.
But not even Zuckerberg is immune to employee dissatisfaction. Facebook’s future depends on the dedication of its employees. In the leaked audio from Facebook’s weekly meetings, Sandberg claimed that her time spent on recruiting is “the most important thing [I] do.”
In the tech industry, there is fierce competition for top talent. With generous salaries and social cachet, Facebook has historically maintained the HIGHEST retention rate of the major tech firms. At the moment, Facebook seems to enjoy the rare luxury of not needing to market (very hard) to employees.
In the event that Zuckerberg DOES cave on his principles, I don’t expect the reason will be activist groups like the Citizens or movies like The Social Dilemma. Instead, Facebook will change course if/when the company struggles to retain top employees and recruit new ones. The audio recordings prove there’s a rising tide of resentment within Facebook’s walls.
Zuckerberg hasn’t reached his breaking point yet. But the longer he waits, the more time that Facebook will spend in the “bad company” box.
Keep it Simple,
Edward Nevraumont is a Senior Advisor with Warburg Pincus. The former CMO of General Assembly and A Place for Mom, Edward previously worked at Expedia and McKinsey & Company.