How YouTube Came to Promote Fascism
Welcome to Big, a newsletter about the politics of monopoly. If you’d like to sign up, you can do so here. Or just read on…
Today I’m going to do a short issue, a bit of the backstory on this terrifying New York Times article on how YouTube radicalized Brazil.
But first, a correction from the last issue. I wrote about how the New York Fed has a luxurious cafeteria with an entire table just for cakes. A reader sent in this note.
Having worked with the Fed regulating financial markets for 9 years, I have a correction. I believe the NY Fed has 3 cafeterias: one for regular employees, one for management level, and a third for the bigwigs.
And now, here’s a picture of the CEO of YouTube Susan Wojcicki (h/t Techcrunch).
How YouTube Promotes Fascism
I was struck quite forcefully by this story by Max Fisher and Amanda Taub in the New York Times about the impact of YouTube on Brazilian politics. Brazil, as you may or may not know, recently elected a far-right President, Jair Bolsonaro, a man with strong fascist inclinations. Bolsonaro was a fringe character, not taken seriously, until corruption among the major parties, and some Watergate-style dirty tricks, provided an on-ramp for his political success. But that’s not the only reason he was able to build a large political following. One of the theories of how he came to power is that he was enabled, in part, by Google.
Here’s how it happened:
When Matheus Dominguez was 16, YouTube recommended a video that changed his life.
He was in a band in Niterói, a beach-ringed city in Brazil, and practiced guitar by watching tutorials online.
YouTube had recently installed a powerful new artificial intelligence system that learned from user behavior and paired videos with recommendations for others. One day, it directed him to an amateur guitar teacher named Nando Moura, who had gained a wide following by posting videos about heavy metal, video games and, most of all, politics.
As his time on the site grew, YouTube recommended videos from other far-right figures. One was a lawmaker named Jair Bolsonaro, then a marginal figure in national politics — but a star in YouTube’s far-right community in Brazil, where the platform has become more widely watched than all but one TV channel.
Last year, he became President Bolsonaro.
Now, there are a lot of reasons Brazil went to the far-right, and it wouldn’t be fair to ascribe it to mass brainwashing. There are huge problems with corruption, there’s a massive crime wave, and there are good reasons to think Bolsonaro’s election was tainted with a malevolent investigation of his main opponent, Luiz Inácio Lula da Silva, as well as the impeachment on political grounds of a prior center-left President. But YouTube’s impact on Brazilians and their conception of power is also very real.
This New York Times story goes on to note a host of consequential effects of YouTube. These include young men disrupting classrooms with right-wing paranoid fantasies, and large numbers of people angry at health workers because they were taught by “Dr. YouTube” that health workers spread the Zika virus. Much of the story is familiar to those who track QAnon, anti-vaxxer messaging in the U.S., white supremacist and incel communities, anti-Rohingya arguments in Myanmar, or any deeply conspiratorial social structures.
Many people want to believe in deeper truths than they see on the surface of life, and they want to feel special. This is not new. What is new is how large information utilities profit through the rapid and automated spreading of deceptive content, supercharging cult-like and fascist movements, and drawing in potential customers to them. Civic leaders trying to hold society together are now up against Google’s supercomputers.
This attack on civil society is a function of two changes in public policy levers. First, Section 230 of the Communications Decency Act has given information utilities immunity from liability for what flows over their servers. Second, advertising money has shifted from publishers to highly centralized information intermediaries, a concentration enabled by a lack of antitrust enforcement, no regulation of personal data, and no regulation of deceptive and addictive user interfaces.
These shifts have enabled a business model like YouTube’s, which creates a perverse incentive to put addictive, conspiratorial content in front of people. YouTube’s engagement-maximization strategy is explicit. YouTube CEO Susan Wojcicki took over the division of Google in 2014, and her goal was to get users to watch a billion hours of video a day, ten times what they were watching in 2012. Hitting that target meant YouTube engineers had to find ways to hook people on YouTube content, and keep them watching with user-interface tricks.
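To see why the incentive is structural rather than a matter of anyone's intent, consider a toy sketch of a recommender whose only objective is predicted watch time. This is an illustration I made up, not YouTube's actual system; all names and numbers here are hypothetical.

```python
# Toy illustration (NOT YouTube's actual system): a recommender that
# ranks candidates purely by predicted watch time will surface whatever
# is most engaging, because nothing in the objective penalizes harm.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # what the model expects a user to watch
    factually_sound: bool           # present in the data, ignored by the objective


def rank_by_engagement(candidates: list[Video], k: int = 3) -> list[Video]:
    """Return the top-k videos by predicted watch time alone."""
    return sorted(
        candidates,
        key=lambda v: v.predicted_watch_minutes,
        reverse=True,
    )[:k]


catalog = [
    Video("Calm guitar tutorial", 4.0, True),
    Video("Outrage-bait conspiracy rant", 22.0, False),
    Video("Local news recap", 3.0, True),
]

# The conspiracy rant wins: it maximizes watch time, which is the
# only thing the function is asked to maximize.
top = rank_by_engagement(catalog, k=1)
```

The point of the sketch is that no engineer has to want to promote conspiracies; the objective function does it for them, which is exactly the "unwitting" quality of the incentive described above.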
Wojcicki’s career is a useful prism through which to see the larger political problem with an institution like Google. Wojcicki is a capable executive, but gives little consideration to the ethics of managing information. This became clear in 2013, when she fought with engineering leaders over the ethics of behavioral ad targeting. The engineers largely wanted to keep user profile data segmented; what you searched for on Google wouldn’t follow you around the web through creepy banner ads. Wojcicki wanted to use Google’s pervasive surveillance as aggressively as possible, consistent with industry trends. Wojcicki lost that battle, and was essentially removed from her position in the advertising department, floating around Google with no job for a few months.
Larry Page, who considers her a dear friend, feared she would leave. Wojcicki was in fact so close to Page that there was a slogan at Google: “What Susan wants, Susan gets.” She wanted to be a CEO, so he made her CEO of YouTube, removing well-respected head of products Shishir Mehrotra in the process.
Wojcicki brought her ethical compass to YouTube. And that meant she would follow industry trends, or do what everyone else in the tech world was doing. At that point, they were seeking to maximize engagement, no matter the consequences, because to most Silicon Valley executives, there weren’t any consequences except higher stock prices. In other words, Alphabet CEO Larry Page put a crony with a limited understanding of ethics at the head of the most important information channel in the world, simply because he liked her. There are many ways to describe a political system that allows such reckless and unaccountable uses of power, but democratic isn’t one of them.
All of this brings me back to a wonderful profile of Google in WIRED magazine by Nitasha Tiku about how Google’s culture has essentially imploded over the last three years. Tiku shows, in gruesome detail, how the internal contradictions of trying to be a ruthless monopoly, a libertarian paradise, and the socially woke savior of mankind collided.
One of the key pivot points for the company was when engineer James Damore, a conservative, wrote a memo arguing that biological differences between women and men explain why there are more male engineers at Google. This provoked outrage internally. The key moment of the debate in the C-suite came when one of the top executives rhetorically asked Google leaders to imagine Damore discussing genetic differences around race rather than gender. That sealed it; Google’s CEO fired Damore.
Who swung the debate towards firing Damore? That would be Susan Wojcicki, who was, in the meantime, unwittingly helping put a fascist in charge of Brazil. She spread fascism not because she’s a bigot or a fascist, but because she had not thought through the ethics of applying advertising to information utilities like YouTube. This doesn’t make her a bad person; she clearly thinks about social justice when it’s put in her face. She disliked sexism in the workplace, and acted on it. But she clearly cannot apply this framework to the broad use of YouTube’s *power.* Wojcicki is a businessperson, not a policymaker, so there’s at least a well-trod argument that she should overlook abstract moral concerns in favor of profit.
And ultimately, that’s why we’re going to have to break up and restructure Google, Facebook, and Amazon, and regulate our information commons more deliberately. The people at large tech platforms are, whether they know it or not, making public policy. In this case, Wojcicki was making public policy for one of the largest democracies in the world, destroying large numbers of lives and organizing dangerous political trends. It’s not that she did a good or bad job. It’s that it’s not her job to do this. And it’s not Larry Page’s job to appoint our global communications video regulator, which is what he did when he put her in that job.
On a broader conceptual level, let’s consider what has happened to the flow of information since 2000 or so. New giant information utilities - search, social networking, mapping, video sharing - have emerged, financed by advertising. Advertising has a distorting effect on the flow of information, which publishers and journalists understand. We haven’t figured out how to shield newspaper content from advertising influence, but at least we try to mitigate it, because we recognize the ethical dangers of this grand conflict of interest. Such ethical debates have not happened, except on the margins, in the information utility world. And this world is far more concentrated than publishing ever was, meaning a smaller number of people have outsized influence on the ethical choices within it.
Wojcicki was just following industry trends, and did not have the fortitude to stand up to groupthink and the goal of short-term profit maximization. But why should we expect her to have that? Why should we expect anyone to have that? We shouldn’t. That’s why it’s time to start reexamining our public policy assumptions. We must make it unprofitable to spread harmful content. Does that mean removing by law the ability of information utilities like YouTube to run ads? Maybe it does. Or maybe we can just remove the liability protections under Section 230 for any large information utility that runs ads. Banning user-interface tricks that lead to addiction or manipulation might be another path. Restoring antitrust is another.
At the very least, it’s time for a conversation about why we allowed executives in northern California to destroy Brazilian democracy, by accident.
Thanks for reading, and if you enjoy this newsletter, please share it on social media, forward it to your friends, or just sign up here.