Episode 159: Transcript

Episode 159: When Fiction Becomes a Microaggression (with Evelyn Douek)

Transcription by Alexander

 

 

Charlie Jane:             [00:00:00] Annalee, are you excited for the launch of NASA's Europa Clipper mission next month?

Annalee:                    [00:00:05] Well, you know, as we learned in the documentary film 2010, all these worlds are ours except Europa. So I'm really hoping that Clipper is approaching Jupiter in a very demure way, a very mindful way.

Charlie Jane:             [00:00:21] I'm sure it is. The Europa Clipper is a very, like, sweet vessel. It's just, I don't think it would do anything…

Annalee:                    [00:00:26] It’s very cutesy.

Charlie Jane:             [00:00:27] …to violate etiquette or to, like, upset any monolith builders or anybody who might be concerned. Yeah, I mean, I'm just super, I'm super obsessed.

Annalee:                    [00:00:36] Yeah.

Charlie Jane:             [00:00:37] The Europa Clipper is this honking great space probe that's going to go out there. It's going to launch in October. It's going to go study Jupiter's moon Europa up close and personal. And, you know, there's been a little bit of suspense about, like, whether it would launch on time because some people worry that the transistors on board weren't going to be able to deal with Jupiter's high levels of radiation. And if they don't launch in October, basically they'd have to wait a long time before they can launch again because of, like, the orbital mechanics of, like, whether you have to do one slingshot past Earth or two slingshots past Earth and, like, fly around Mars and all this other stuff.

Annalee:                    [00:01:11] Yeah, because they're getting a gravity assist from Mars, right? Yeah, super cool.

Charlie Jane:             [00:01:15] So they really have to make this October date. And it looks like, last I checked, they're on track. We're going to take off in October as planned. And we're going to get a better look at Europa.

Annalee:                    [00:01:24] I am excited. I really want to see the subsurface ocean on Europa…

Charlie Jane:             [00:01:29] Hell yeah.

Annalee:                    [00:01:29]  …because Europa is a frozen moon with a really thick layer of ice. But underneath, it seems, we hypothesize, that there is an ocean. And it could be actually kind of warm in there. And I'm pretty sure my distant relatives are living under there, too. So I'm hoping we meet them.

Charlie Jane:             [00:01:47] Frickin' sea monkeys. It's all the frickin' sea monkeys. That's where they come from. I'm telling you, you heard it here first. Now, I'm super excited. I think it's just so beautiful. And last year, our poet laureate Ada Limón, wrote a poem in honor of the Europa Clipper. And it's just, it's so lovely. Here's my favorite line.

                                    [00:02:04] ‘O second moon, we, too, are made of water, of vast and beckoning seas. We, too, are made of wonders, of great and ordinary loves, of small invisible worlds, of a need to call out through the dark.’

Annalee:                    [00:02:18] That's so amazing. I love that our poet laureate is writing poems about NASA probes.

Charlie Jane:             [00:02:24] About space probes.

Annalee:                    [00:02:26] Man. So do you think that, like, the Europa Clipper goes by Clippy for short? Because I'm just going to start calling her Clippy. She's not going to mind, right?

Charlie Jane:             [00:02:37] Man, I feel like the name Clippy has some unfortunate associations that robots might find a little bit, I don't know, it might be a little bit problematic to call any kind of robot.

Annalee:                    [00:02:47] Well, I'm just going to call her Clippy anyway. I'm going to make her like it.

Charlie Jane:             [00:02:51] So that's an interesting segue into the topic of today's episode, which is microaggressions. You know, what is, what is a microaggression? How do we find them in science fiction and fantasy? And how do our beloved genres kind of talk about them in good ways and bad? And later in the episode, we're going to be talking to the incredible Stanford Law Professor, Evelyn Douek, about content moderation and the legal debates about how to deal with misinformation and other horrible forms of speech online.

                                    [00:03:21] So you're listening to Our Opinions Are Correct, the podcast that has already visited every moon in the solar system twice. I'm Charlie Jane Anders. I write science fiction and other stuff. My latest book comes out in 2025 and it's called Lessons in Magic and Disaster.

Annalee:                    [00:03:39] Amazing. I'm Annalee Newitz. I'm a science journalist who also writes science fiction. And you can find my latest book in bookstores right now. It's called Stories Are Weapons: Psychological Warfare and the American Mind.

Charlie Jane:             [00:03:54] So in our mini episode next week for our Patreon supporters, we're going to be talking about Lord of the Rings: the Rings of Power.

Annalee:                    [00:04:01] Which I keep calling the Rings of Pu'er because I think we should all just be drinking really fancy tea.

Charlie Jane:             [00:04:06] I love Pu'er tea. It's so great. It's the best.

Annalee:                    [00:04:10] The Rings of Pu'er. So did you know that all of this goofery that we're going through right now with you is entirely independently produced? That's right. Your money is helping to fund our silliness and our correct opinions through Patreon. And if you become a Patreon supporter, you are helping make this podcast happen. You're helping to pay for the production, the hosting, our silliness. And plus you get audio extras with every episode. You get mini episodes. You get access to our Discord channel where we hang out and give book recommendations and just chat about stuff. So please think about supporting this totally independent podcast. All of these things could be yours for a few bucks a month and you'd be supporting us. And anything you give goes right back into making our opinions more correct. Find us at patreon.com/ouropinionsarecorrect. And now let us do some microaggressions.

[00:05:09] [OOAC theme plays. Science fiction synth noises over an energetic, jazzy drum line.]

Annalee:                    [00:05:42] So I was really curious when we first started talking about this episode about where the term ‘microaggression’ comes from. I feel like I started hearing it a lot just in the past 10 years.

Charlie Jane:             [00:05:53] Me too. But it turns out the term has a much longer history. It was actually coined by a really fascinating person named Chester M. Pierce. He was a professor of psychiatry at Harvard University and he founded or helped to found the Black Psychiatrists of America. And in 1975, Dr. Pierce wrote a paper in which he argued, “One must not look for the gross and obvious. The subtle, cumulative mini-assault is the substance of today's racism.”

Annalee:                    [00:06:23] Oh, that's so interesting.

Charlie Jane:             [00:06:25] Yeah. And also Dr. Pierce is probably most famous for being one of the people who helped to create the TV show Sesame Street. He was brought on board as a consultant in 1968. He got heavily involved in creating Sesame Street because he believed that a show for children could present positive images of black people and show black people being treated with respect. And the early seasons of Sesame Street went out of their way to show black people as an antidote to the marginalization of black folks on television.

Annalee:                    [00:06:55] Yeah, it's so true because I grew up with Sesame Street in the 70s. And I remember thinking, yeah, it absolutely portrays a diverse neighborhood as normal. And like black people are in positions of power and authority. And like, you know, kids have to obey all the grownups, no matter where they come from or who they are. Not obey, but you know, they treat them with respect.

Charlie Jane:             [00:07:20] But the grownups are also so friendly and kind.

Annalee:                    [00:07:22] Yeah. Yeah. I'm just saying that, you know, the grownups have like our safety in mind as kids. You know, you always like listen to what the grocer tells you to do and that kind of thing. And I love that he came up with this idea of the mini assault. You know, the early idea of the microaggression because Sesame Street is definitely full of the opposite. It's like micro progressions, you know, like there's all these like little tiny moments that cue you in to the fact that you need to be respectful toward people no matter what their racial background is or religious background or whether they're like a fuzzy guy living in a trash can.

Charlie Jane:             [00:07:59] Yeah, Oscar the Grouch.

Annalee:                    [00:08:01] A giant purple elephant. Yeah.

Charlie Jane:             [00:08:03] Yeah. And like Oscar the Grouch is treated kindly, even though he's literally a trash person, you know. Or Big Bird – nobody believes that Snuffleupagus is real, but they still treat Big Bird nicely. It's just, it's a very kind show and it's a very inclusive show.

                                    [00:08:18] So yeah, then in the 2000s, there's this psychologist named Derald Wing Sue who wrote a series of papers, which we'll link to in the show notes, about microaggressions and kind of expanded the concept. And you know, he was the one who kind of explained how psychologists could use this concept in their practice and help their patients to mitigate it.

                                    [00:08:39] And you know, what he found in his research was that over time, microaggressions against people of color could be more harmful than overt like blatant racism. They could actually do more damage. So he came up with a taxonomy of microaggressions and he wrote this super influential book called Microaggressions in Everyday Life. And so here's a clip of Dr. Sue explaining the concept and how many people may not even be aware that they've committed a microaggression.

Dr. Sue Clip:              [00:09:07] Microaggressions are the everyday slights and indignities, put-downs and invalidations that people experience – that people of color experience – in their day-to-day interactions with well-intentioned, well-meaning people who are unaware that they have delivered a put-down or invalidation. What I found really true about these forms of microaggressions is that the, I hate to say perpetrator – the person delivering the microaggression – thinks of it as a compliment. For example, “You speak excellent English,” while on the surface it appears to be a compliment, reflects a worldview that you are a perpetual foreigner in your own country. You are not a true American.

Annalee:                    [00:09:59] Wow, that is so incredible. I feel like we're constantly having to relearn this, basically, that, you know, we learned about it in the sixties and then we really learned about it in the aughts. And now finally, I feel like the idea of the microaggression has kind of gone mainstream. And like a lot of people know what a microaggression is. They may not realize when they're engaging in it, but at least if you call someone out on it, a lot of people will know what you're talking about. And they, they will understand that even though you haven't like say used the N word, you can still be engaging in really harmful racism. You know, you don't have to like do the most extreme thing in order to be damaging.

Charlie Jane:             [00:10:42] Yeah. And the thing where he says that, you know, people think they're giving you a compliment, like when they say to a Chinese American, “Oh, you speak English so well.” And it's like, they're from New Jersey. Like they think, “Oh, but I'm just giving you a compliment.”

Annalee:                    [00:10:55] It's like, no, that's, that's actually, yeah, I don't know. It's the, it is, it's a microaggression. Yeah. It's a way of telling someone that they don't belong even when they do.

Charlie Jane:             [00:11:04] Yeah.

Annalee:                    [00:11:05] So, okay. Let's think about this in the context of science fiction.

Charlie Jane:             [00:11:09] Yeah.

Annalee:                    [00:11:09] How do you see microaggressions popping up in sci-fi and fantasy?

Charlie Jane:             [00:11:14] I mean, classic sci-fi, like sci-fi from prior to like the last 20 or 30 years is full of just like playful kind of little assaults on marginalized people, like disabled people, you know, BIPOC people, queer people, you know, women being kind of treated with like a very jocular jovial kind of disrespect. That's like, “Oh, we're just joshing around.” And you know, that's just also full of stereotypes, you know, that are kind of a microaggression in story form.

Annalee:                    [00:11:45] Oh yeah.

Charlie Jane:             [00:11:46] It's full of portrayals of marginalized people that are kind of, you know, basically committing microaggressions just in how these people are portrayed and what they're allowed to be and how they're allowed to exist within the story.

Annalee:                    [00:11:58] Yeah. I always think of the kind of classic, like golden age science fiction movies where the woman's role is to scream. And no matter what, even if she's a scientist, even if she's like in charge of the mission, you know, she's always going to at some point have to be the one that's like, “Ah!” and a man will have to come in and save her. And that's just kind of the world that you're given. And of course that's an incredible microaggression. You know, it's, it's a way of completely undermining any kind of authority or competence that the character has.

                                    [00:12:28] You know, it's interesting to think about how this gets played out on kind of a metaphorical level, because in the sixties and then again, I mean, kind of going forward, we see things like how Spock is treated in original Star Trek. And he's kind of, I feel like he's often the butt of a joke. He's kind of marginalized. He's like, you know, an officer.

Charlie Jane:             [00:12:55] He's like the second in command on the ship. Kirk encourages this, I feel like. Everybody's encouraged to kind of make fun of Spock to talk about his pointy ears and his green blood and his lack of, you know, emotional responses. Like McCoy does it a lot, but other people do it too, you know?

                                    [00:13:09] And because in science fiction and fantasy, aliens and robots and like, you know, magical creatures are often stand-ins for real life groups. You can kind of see Spock being kind of used as a stand-in for various real life people on earth, ranging from like neurodivergent people to, you know, other cultures. And you see this with how Klingons are treated. You see this with how like, there is a sense of like, even though the ethos of Star Trek is celebrating diversity and celebrating inclusion, there’s also kind of flipsides sometimes of like, when we have a stand-in for like a marginalized group, we do kind of like make fun of them or stigmatize them.

Annalee:                    [00:13:48] I was gonna say that's also, at least in original Star Trek, they're on the cusp where they're kind of leaving behind the idea that it's normal to be overtly racist, violently racist, and moving into an era of, “Well, microaggressions are okay. You know, it's like a way of letting off steam. You know, if I grab a woman's butt, I mean, it's not a big thing, right? I'm not raping her.” I feel like that's often the attitude that you see, especially in the 20th century, where it's like, well, “Why are you mad? I'm not raping you. All I did was grab your bum.” You know, like that's not harmful. And it's like, actually...

Charlie Jane:             [00:14:23] “We're just joking. We're just joking around.”

Annalee:                    [00:14:25] Yeah, “We're just joking around.” And I think that we even see this in Next Generation with the episode “Measure of a Man”, where Dr. Pulaski keeps mispronouncing Data's name and calling him “Dah-ta” [Data’s name is typically pronounced Day-ta], which to me...

Charlie Jane:             [00:14:40] It happens a lot in season two, actually.

Annalee:                    [00:14:42] That's like... it's so interesting because we're seeing that now on the political stage, where Trump refuses to pronounce Kamala Harris's name correctly. And it's the same thing. It's just this microaggression. And it's a way of denying the humanity and the agency of the person. You know, you're saying like, “I can pronounce your name however I want. You know, you're not a person. You don't get to have any say over who you are.”

Charlie Jane:             [00:15:04] Yeah. And what's good is that I feel like, unlike with Spock, where we're supposed to kind of sympathize with the people who are making fun of Spock and like, “Oh, yeah, you know, Spock is such a dweeb. He should just take it or whatever.” When it comes to Data, I feel like TNG wants us to kind of empathize with Data. TNG kind of gives Data the last word a little bit where he's like, “No, ‘Dah-ta’ is not my name. Data is my name.” And it's like, yeah, this is a name. It's not just a word. And I feel like that is a perfect example of how something shifts a little bit from TOS to TNG, where we're slightly more sensitive about some of these topics, you know, a little bit. There's still some problematic stuff, obviously, in TNG.

                                    [00:15:45] So, Annalee, you went to a panel recently about like Asian stereotypes and robots, which I'd love to hear you talk about.

Annalee:                    [00:15:51] Yeah, you and I went to Worldcon in Glasgow, which is the annual awesome meeting of sci-fi writers and creators. And I went to this incredible panel about Techno-Orientalism in science fiction and fantasy. And these amazing panelists: it had Eliza Chan, Kelly KaNiahma, Mai-Anh Vu Peterson, and Rho Chung, all of whom are just incredible. And they had this great conversation about how many visions of the future, especially cyberpunk visions, are either overtly or covertly anti-Asian.

Charlie Jane:             [00:16:30] And like fetishizing, kind of.

Annalee:                    [00:16:32] Fetishizing Asians. And that's something that I think has been talked about a lot about sort of fetishizing Asian women and Asian women being kind of part of a future, almost like a background. “Like, you know, you're in the future when there's like hot Asian ladies everywhere.” But the thing that they talked about on this panel that I thought was particularly interesting and that I hadn't heard discussed as much before was the conflation of Asians with robots. And of course, there is a huge stereotype out there about Asians being robotic. And they were saying, you know, often when we look at these visions of the future, there's this notion that we're going to have robots and that's a bad thing. And that because we've already conflated Asians with robots, there's a kind of subtle message in things like, for example, Blade Runner, that this is a fear of Asia coming for us and Asia taking over. And that oftentimes when you have stories that have anxieties about like, what if we all became robots? Or like, what if we all had to be robotic workers? That it's actually really talking about what if, you know, countries in Asia became major economic powers or like Asian Americans like overran our workplaces.

But one of the things that they talked about was this idea that Asians are portrayed as being high tech and low culture and needing Westerners to kind of humanize them. And this is like a big trope in science fiction that you have these people who are, you know, brilliant at technology, but there's just something culturally lacking about them. And that's where the robot stereotype comes in, right? The idea that these are robotic creatures, not real people. And you just see it like, as soon as they said that out loud, the high tech, low culture thing, I was like, “yes, that explains so many fucked up stereotypes I've seen”, you know, ranging from, you know, Ex Machina with the silent Asian robot to all kinds of other representations of robots where they're subtly or not so subtly associated with Japan or China, depending on what era it's in, because often, you know, at a time when, for example, like now in the United States, there's a lot of anxiety about Chinese economic power and Chinese military power, you're starting to see like fears about robots from China or robots from Korea.

                                    [00:18:59] And so I think that was just a really interesting way to me of thinking about microaggressions. And that wasn't the term that they were using on the panel. They were just talking about sort of Techno-Orientalism. But as you were saying earlier, stereotypical representations can be a form of microaggression, because if you keep encountering this stereotype in your science fiction, it becomes, over time, cumulatively very traumatizing to keep seeing this image over and over again, the high tech, low culture stereotype.

Charlie Jane:             [00:19:33] Yeah, I think that, you know, this is part of how microaggressions are delivered to us is through the stereotypes that we're force fed.

Annalee:                    [00:19:41] Yeah.

Charlie Jane:             [00:19:41] Yeah, and I think that one of the ways that we see microaggressions turning up a lot in fantasy, but also in science fiction, is through stories that revolve around etiquette and proper behavior, where someone who comes from a different culture can be kind of shamed for having the wrong etiquette or not behaving in the way that the society that they're in views as proper.

Annalee:                    [00:20:07] You're eating with your hands or something.

Charlie Jane:             [00:20:08] Right. You're eating, you're using the wrong utensils. You didn't curtsy exactly the right way. You're not wearing your clothes exactly the right way. And, you know, there's this term that was invented in the 90s by Ellen Kushner called “fantasy of manners”, which refers to a whole strand of fantasy stories. And there are science fiction stories of manners, too. But the “fantasy of manners” refers to a story where etiquette and, you know, proper behavior, they're kind of part of the plot. And so here's a clip of Ellen Kushner talking about how she came up with the term “fantasy of manners.”

Clip of Ellen:             [00:20:41] So I said, what about calling it “fantasy of manners”, because there is that element to everybody's of, you know, like a Jane Austen comedy of manners or an Oscar Wilde comedy of manners, that social stuff is as important as the magic.

Charlie Jane:             [00:20:52] And she was basically referring to a whole bunch of stories that had been written in the 80s and early 90s by her and some of her friends that made etiquette part of the story. And this makes me think of, I know we talk about this book a lot, but I love it so much: To Shape a Dragon's Breath by Moniquill Blackgoose, where a lot of the story is about whether Anequs can actually fit in with the complicated social behavioral stuff that the English are imposing on her.

Annalee:                    [00:21:19] Well, also, I mean, this is a big topic in Ursula Le Guin's work, too, you know, in the Known Worlds books [i.e., Le Guin’s Hainish cycle], where, you know, characters who are anthropologists or explorers go by themselves to try to integrate into a civilization that they're unfamiliar with. And a lot of it is, it's not about swashbuckling, it's about learning how to curtsy, like you said, or learning about what is polite and impolite. And I think a lot of science fiction that is about kind of the social aspects, and I should say science fiction and fantasy, that's about kind of social worlds, that is part of it. And especially now, with the rise in cozy fantasy and science fiction, so much of it turns on who invited you to dinner. Or who's going to be allowed to be a cook in the kitchen? And, you know, what are you eating? And yeah.

Charlie Jane:             [00:22:15] Petty slights that cut really deep.

Annalee:                    [00:22:18] Yeah, and I think I love that because it does feel, you know, it's a way of bringing realism into a fantastical story. But I also think that it's a way of, again, smuggling microaggressions into a story because, you know, the manners that are represented as normal and that need to be adhered to sometimes aren't really questioned, you know, like it depends on the story.

Charlie Jane:             [00:22:46] Right. Yeah. I've definitely read cozy stories where like there's kind of a low key indication. Like I feel like a good cozy story is one where like people who act weird, quote unquote, we bring them in when we give them a big fuzzy hug.

Annalee:                    [00:22:58] Yeah.

Charlie Jane:             [00:22:58] Like I feel like, you know, I think about the rat baker in, like, Legends and Lattes, but I definitely feel like there are stories where the coziness consists of a kind of normative behavior, and people who don't adhere to it are shamed.

                                    [00:23:12] You know, two thoughts that I have really quickly. One is that Dr. Sue kind of talks a lot about how a lot of microaggressions consist of being really polite to people in a way that's actually horrible. And also I think that, you know, a lot of what this boils down to is marking who's civilized and who's uncivilized. And like microaggressions are a way of letting you know that: “Oh, you're not really one of the civilized people.”

Annalee:                    [00:23:38] Yeah, that's right. That is something that of course comes up a lot in classic science fiction. And it also leads to like on Star Trek, for example, a lot of humor around, you know, how do the Klingons eat, you know, versus how we eat. Like that's a big thing that comes up all the time, weirdly.

Charlie Jane:             [00:23:56] “Gagh me with a spoon.”

Annalee:                    [00:23:58] Yeah, exactly. Yes, that's a deep reference for Gen X. So I think as we go deeper into this era of cozy SF and cozy fantasy, this is like a thing where you should look for red flags in a story that you're reading or watching. You know, does the story question those kinds of social rules and like question the idea of, you know, which manners are the right manners or does it kind of present certain people as having the proper manners and, you know, characters who are different need to learn them in order to be good people.

Charlie Jane:             [00:24:33] Really good point.

Annalee:                    [00:24:34] And we see both. We're seeing stories that are coming at it from both sides. So good thing to be on the lookout for.

                                    [00:24:40] Okay, so after the break, we are going to change the topic from microaggressions to disinformation and misinformation online. We'll be talking to Evelyn Douek about the future of, well, dealing with trolls.

[00:24:56] OOAC session break music, a quick little synth bwoop bwoo.

Annalee:                    [00:24:59] And now I'm really excited to share this interview I did with Evelyn Douek, who is a law professor at Stanford University who focuses on a ton of First Amendment issues.

                                    [00:25:10] Previously, Douek was a senior research fellow at the Knight First Amendment Institute at Columbia University, and she earned her PhD at Harvard Law School focusing on the topic of private and public regulation of online speech. She's also, most importantly, the host of one of our favorite podcasts called Moderated Content, which is – surprise – about how every issue online basically boils down to questions about content moderation.

                                    [00:25:38] Welcome to the show, Evelyn.

Dr. Douek:                 [00:25:39] Thank you so much for having me.

Annalee:                    [00:25:41] So I have a question that I've always wondered about, which is you're from Australia.

Dr. Douek:                 [00:25:47] Well-picked, yes.

Annalee:                    [00:25:51] Yeah, exactly. Why are you from Australia? So how did you get interested in U.S. free speech laws?

Dr. Douek:                 [00:25:54] Yeah, it's a great question. And I wish I had some really interesting origin story about how, I don't know, U.S. free speech laws killed my father or something. And this is like a whole revenge arc. But I don't have a great story like that. It’s kind of a boring story. It involves, you know, as all good things, a lot of procrastinating. So, you know, essentially I did my law degree back in Australia. And it's not an unusual thing for law students in Australia to, after they're finished, go and do a master's overseas to sort of get that extra credential.

                                    [00:26:25] I came over, I did my master's at Harvard. And then I had this idea that a master's at Harvard, obviously during that year, what I wanted to do with my life would magically unfurl before me and it would be abundantly clear at the end of that period. And I was sort of getting towards the end of it and nothing was popping up. It wasn't obvious what I should do next. And someone suggested a doctorate.

                                    [00:26:49] Now that is a terrible suggestion to make to someone who's like not sure with what they want to do with their life. It's like a terrible reason to do a doctorate is that you're like procrastinating and deferring, like making important life decisions. But it is nonetheless the decision that I made and it worked out very well for me. And this was just around 2016 - 2017.

                                    [00:27:09] I had always had an interest in free speech. I'd done some free speech work back in Australia. And at that particular moment, you know, this was the “fake news” crisis, which like in retrospect seems so quaint, but, you know, the biggest threat to democracy was like Macedonian teenagers writing articles about how the Pope had endorsed Donald Trump and everyone was like, “Oh no, how will we survive?”

                                    [00:27:30] But, you know, this was the big freak out at that stage. And it just seemed to me that like the most interesting free speech questions were the online free speech questions at that point. And then sort of just as a matter of pursuing those because of where the companies are based, because they're all, you know, a lot of these tech giants are all American companies and they are so sort of both culturally and legally infused with the first amendment sort of ideology. And they're based here in this country. That sort of led me down the first amendment path.

Annalee:                    [00:28:01] Yeah. Interesting. So you kind of were bit by a radioactive free speech spider, right? Right in that moment.

Dr. Douek:                 [00:28:07] Right? That’s it exactly.

Annalee:                    [00:28:08] I think the whole nation was.

Dr. Douek:                 [00:28:09] That's how I'll tell the story from now on. That's much better. Thank you.

Annalee:                    [00:28:13] You're welcome. So in the United States, I think the common sense idea is that we have free speech and then there's the opposite of free speech, which is censorship. But in your work and in your podcast, you talk a lot about content moderation, which I think sits between those two extremes. And so I wonder if you could define “content moderation” for us and then talk a little bit about where it does fit on that slider between free speech and censorship.

Dr. Douek:                 [00:28:40] Yeah. So when I say “content moderation”, what I mean by it, generally speaking, is like platforms, systems and rules to determine how they treat user generated content on their services. And it's a pretty broad definition. So it encompasses far more than like what they choose to take down or leave up. It involves sort of all of their decisions about posts that they choose to label, how they choose to sort of enable fact checking, what they prioritize in their news feeds and all of those sorts of things, and the affordances that they give users. I think of all of that as content moderation, because all of those decisions impact what we as users of these platforms sort of can see and can say on these services.

                                    [00:29:23] And the reason why I think it's tricky to sort of conceptualize is we generally think of censorship as government, right? The government censors people. We think of, you know, when a government is passing a law or arresting people for what they say, we think of that as censorship.

                                    [00:29:41] And the thing about content moderation is it's coming from a private actor, right? It's not the government. And even though these are very, very big and powerful companies and they wield like an enormous amount of influence, they are not the government. They can't lock you up. They can't strip you of your civil rights, all of those sorts of consequences that can flow from government sanction. And so how do we think about that? And so, you know, there has been this reaction to say, “oh, no, you dummy, like content moderation isn't censorship because it's a private actor and censorship only means the government.” And I just find that a little bit too simple.

                                    [00:30:15] I think that that's a too simple way of thinking about that, because there are obvious like expressive and free speech interests at stake in content moderation decisions. Like some of these decisions are the most consequential free speech decisions that get made. And, you know, these systems of regulation, if you like, that these companies have set up are some of the largest systems of speech regulation that the world has ever seen.

                                    [00:30:40] So I think that, you know, we should be interested in and sort of cognizant of the very valuable sort of expressive interests at stake in these decisions, even if they are not coming from the government, they are not sort of constrained by the First Amendment. Nevertheless, they do really matter for the shape and quality of public discourse.

Annalee:                    [00:31:00] Yeah, absolutely. And I think that for a lot of people, that's the only access they have to public discourse, if they're on, you know, X or Facebook or Mastodon or any number of other, you know, new social spaces that we have online. And I was curious about whether we see content moderation anywhere that's not in an internet space or an app space. Like, is there kind of a precursor to content moderation from, you know, pre-internet times? Like what's going on there?

Dr. Douek:                 [00:31:32] Yeah, I mean, so I have this quip that everything is content moderation.

Annalee:                    [00:31:37] Yeah, right.

Dr. Douek:                 [00:31:40] Thank you. You know, I mean, it is a term of art. And so generally, if you're talking about it, you do generally mean social media platforms – like that's what we think of as content moderation and it's their content based decisions that they're making. But if you think of it as like the decisions that private companies are making about the dissemination of speech and then, you know, you happen to be studying it for a doctorate or something, you start seeing it everywhere, right? Like there's all of these decisions about the dissemination of speech that we could think of as content moderation.

                                    [00:32:07] You can think of like Hollywood's decisions about removing content that might offend China as a content moderation decision. You might think about like, obviously, you know, Spotify as a platform, but like what podcasts they have or what songs they allow people to upload. We might think about like payment processors like, you know, Venmo blocking some transactions that mentioned Palestine, for example, which is something that's happened in the past. That is also, in a sense, a content moderation decision.

Annalee:                    [00:32:34] You're preventing adult websites from getting credit card processing.

Dr. Douek:                 [00:32:38] Exactly. You also might think of like, you know, these huge debates about should primetime news shows show the Trump rallies or not? Should they show them in real time? Should they show them with fact checking? All of those kinds of decisions. I mean, it's funny, you hear the conversation about them and they are so similar to the kinds of conversations we have about platforms and their content based decisions and should they leave this Trump post up or should they, you know, attach a label and things like that. And we think of one as content moderation and one as, I don't know, editorial decision making, but they are in many ways very, very similar.

Annalee:                    [00:33:10] So maybe the prehistory of content moderation is editorial decision making.

                                    [00:33:15] I really want to talk to you about the SCOTUS decision, the Supreme Court decision in the jawboning case, which I know you've thought about a lot. I've thought about it a lot thanks to your work on your podcast. So tell us about that decision, if you can, in a, you know, shortest possible way. But then also, I'm really just interested in how it's going to shape the future of content moderation. Like, where is it going to take us in the next like year or two?

Dr. Douek:                 [00:33:40] Okay. So summarizing it briefly is tricky because I think the record in this case was like 14,000 pages or something. But look, let me, let me try. I mean, essentially the allegations were, there was a group of individual plaintiffs and states arguing that the Biden administration, various actors in the Biden administration, had applied informal pressure or, what is colloquially often known as “jawboning,” had jawboned social media platforms into removing content from their services.

                                    [00:34:10] And these were largely, the allegations were largely things like election disinformation, the Hunter Biden laptop story, and things around COVID, right? That like members of the FBI had been contacting the platform saying you need to take down this stuff and, you know, watch out for foreign interference and including the Hunter Biden laptop story. And then, you know, the CDC had been working with platforms to say, this is the kind of thing that's COVID misinformation. This is false about the vaccines. You need to take that down. Or members of the White House also making those kinds of statements and, you know, Joe Biden going out and saying, “The platforms are killing people by leaving this content up”. And the allegation of these plaintiffs was that this was a violation of the First Amendment, right?

                                    Like that the government could not pass a law censoring that content directly – that’s exactly the kind of decision or law that the First Amendment is supposed to protect against. And so because of that, they should not be allowed to route around the First Amendment and go and knock on these platforms’ doors and say, “Hey, we can't pass a law, but would you mind doing us a little favor and taking this down?” Or not saying it so nicely, but saying, “Hey, if you don’t take this down, we're going to, you know, do XYZ and we're going to regulate you and we're going to make your life hell and we're going to make everything really difficult.”

                                    [00:35:17] Now that is like legitimately a problem. We should absolutely not allow the government to sort of use a backdoor in the First Amendment and use these extremely powerful corporate actors to remove content from the public sphere. If we don't want them doing it directly, we don't want them doing it indirectly. So this is sort of something that I think is a really, really important issue in the public sphere. And how should we think about the government platform relationships?

                                    [00:35:40] Now the problem is that it's also really important that the government can communicate with platforms, right? Like if the government knows about a Russian influence operation on its platform and it has this intelligence, it's helpful to be able to say to the platforms, “Hey, you know, do what you want to do, but we have this intelligence. We've seen this account activity.” Or if the platforms who themselves want to take down COVID misinformation, but you know, they're not, you know, doctors or they don't have sort of this expertise, they go to the CDC who's literally making the rules around what we should do and releasing public health information. They go, “Is this true? Is this health misinformation? Is this not?” And then they get guidance from these authoritative sources to tell them what to do.

                                    [00:36:19] So drawing the line between where is it sort of coercive? Where is it problematic? Where is it trying to evade the Constitution? But where is it helpful? And where is it useful? And where is it, you know, a thing that helps shape the public sphere more productively? That's, that's really, really hard.

                                    [00:36:35] Now the problem with this case is that the sort of, this was like a throwing spaghetti at the wall kind of factual allegations that sort of, there was no sort of clear evidence that these particular plaintiffs had been censored as a result or, there you go, there's that word, censorship, moderated by the platforms as a result of government pressure. It was sort of very unclear. The timeline didn't match up in terms of like when their content was moderated versus when the platforms were talking to the government, all of these sorts of things. It was sort of a sprawling set of allegations that wasn't carefully pleaded and sort of played on conspiracy theories rather than actual evidence.

                                    [00:37:11] They got some sympathetic judges in the Fifth Circuit in the first couple of rounds. But by the time it got to the Supreme Court, the Supreme Court sort of said, “No, we're not having any of this. This is not a properly pled case. You don't have standing to bring this claim.” Now that's probably right. And it sort of says, you know, people that make these allegations, they need to make them more carefully – that hasn't shut the door to all of these kinds of claims in the future. So importantly, if there is evidence that the government is, you know, coercing platforms into taking down particular content, that could be a First Amendment violation.

                                    [00:37:44] But I think that, you know, as a result of this court case, as a result of these allegations, there has been more awareness from the government and from the platforms that like they should be careful around the ways in which they have these relationships and that the government shouldn't be trying to force platforms to do something that they don't want to do, because that's exactly the kind of thing that the First Amendment is supposed to protect against.

Annalee:                    [00:38:06] But isn't the danger that, you know, there does need to be this open dialogue between the government and these platforms? Like you were saying, like we actually would want the government to be notifying, say Facebook, if they're like, “Hey, we noticed there's some Russians tinkering with your platform and spreading misinformation.” Or if people are in fact spreading election misinformation, it would be nice if they could work with local election officials just to figure out how to get the correct information out there. Like, so it seems like a case like this could have a chilling effect on those relationships.

Dr. Douek:                 [00:38:42] Totally. And I think this is, this is really hard, right? Cause I think it's a balance. And, you know, while this case was ongoing for a while, all of those kinds of communications stopped, right? Like there was reporting that no one in the, in the federal government was talking to platforms about like information operations and things like that, for exactly the reason that you're talking about, right? Is there's this chilling effect. Like, it's not clear what the, what the rule is and so we're just going to stop saying anything whatsoever.

                                    [00:39:06] Now, those kinds of relationships have started up again. They’re communicating in the run up to the 2024 election, right? Exactly. And you know, if there were another pandemic, presumably they would also communicate around that again. But the FBI, and I'm sure other actors in the government, have developed new protocols, new policies around the kinds of communication that need to happen between those parties. And so they haven't released those protocols publicly. I don't know what they say, but you know, presumably they say things like, “Don't yell when you're talking to the platforms, like ask nicely. Reiterate that the platforms have their own First Amendment rights to decide what they want to keep up and what they want to take down. But you know, if you do want to give them a hint, you are allowed.”

                                    [00:39:50] And I think that's, that's got to be right – that these kinds of communications are really important. We don't want them to stop altogether, but it has also got to be right that, you know, that we need to be aware of the very important free expression interests here and make sure that they don't cross a constitutional line. And you know, the thing that I think like is always useful to think about in this context, as well, is this case was obviously a First Amendment case and it was about the US government and the private platforms. But this is the kind of thing that's happening all around the world with all sorts of different governments of varying levels of democraticness, if you like.

Annalee:                    [00:40:26] Yeah.

Dr. Douek:                 [00:40:26] And so, you know, when we're thinking about other governments that are coming to platforms and saying, “take down this because it's, you know, quote unquote, COVID misinformation, wink, wink, nudge, nudge, but it's actually criticism of the government or political speech of some kind.” That's the kind of thing that we should be worried about. And those are the kinds of things that, you know, free speech rules are supposed to protect against.

Annalee:                    [00:40:46] Yeah. So one of the things that's been happening, as I'm sure you've noticed, is that many of the major platforms like Facebook, certainly X, they're either getting rid of content moderation teams, they're getting rid of trust and safety teams that often handle content moderation. It's like all the platforms are abandoning content moderation. That's how it feels. Is that true?

Dr. Douek:                 [00:41:09] You know, something I think really useful to remember in this context is like how fast this is all happening. Because this is like, we think of, you know, what we had maybe a year or two ago as sort of maybe, I don't know, the natural state or the equilibrium, but it wasn't at all, right? Like something that always blows my mind to remember is that Facebook released its community standards for the first time in 2018, right? Like six years ago was the first time that we even saw the public rules that Facebook uses on its platform. That's so new, right?

Annalee:                    [00:41:44] Those would be kind of their content moderation handbook in a sense.

Dr. Douek:                 [00:41:48] Right, exactly. So before that, we knew that they applied some sort of rules, but we had no idea what they were. And people were like, well, this is pretty bad. Like how are you making all these important decisions? And we have no visibility. So Facebook released those standards. And then it was sort of only over the next few years that we got more transparency about like how often are they applying those standards? We got things like appeals. The first label on a Trump tweet was in May 2020. It's not like this has happened since the dawn of time that they were labeling misinformation, right? It's like a really recent development.

                                    [00:42:18] And that all came out of, I think, a reaction to the tech clash. Like there was the 2016, the “fake news” crisis that is part of my origin story of the spider bite. So then there was the reaction to that, which was companies like dramatically ramped up their content moderation, gave more transparency. And then we saw that really sort of in full force in the pandemic and in 2020 around the US election. And then there was the backlash to the reaction to the tech clash, right? And then it was like, “Whoa, what are you guys doing? Like, why are you policing truth?” And that got highly politicized; in particular, they came under attack from the right about quote unquote conservative censorship. And now I think what you're seeing, what you're talking about, this retrenchment, is a reaction to the backlash to the reaction to the tech clash, right? And so it's really unclear how that pans out from here.

                                    [00:43:08] I don't think we have gone anywhere way back to like sort of before all of that. I think there's still these very large infrastructures. There's lots of, you know, tens of thousands of people still working on this at the major platforms. But it's a really interesting question. I have no idea how aggressive are they going to be in the lead up to this election? You know, last election, we had all of this labeling and ultimately the de-platforming of Donald Trump. Now we see all of the major platforms have allowed Donald Trump back on their platform.

                                    [00:43:39] Are they going to take a different approach? Are they going to take the same approach if and when he breaks their content moderation rules? I genuinely don't know sort of how they are thinking about approaching it this time around.

Annalee:                    [00:43:49] Yeah. Do you think it's, it may be the case that we're going to see this kind of back and forth for a while where we get kind of strong push for content moderation, hiring tons of people for that and then getting rid of them again?

Dr. Douek:                 [00:44:03] Yeah, I don't know. Like maybe, maybe it's just going to be a cyclical, like, uh, they, they under enforce and then they get in trouble. And we go around and around. I mean, the other thing to think about, uh, to remember is that for these platforms, like a lot of content moderation is a cost center, right? Like it's expensive. They have to hire a lot of people. They don't really sort of think of themselves as, you know, the speech regulators, as much as sort of, that's the parody that we have of them or that some people try and sort of portray them as in these, you know, sort of, uh, fever dreams.

                                    [00:44:34] I think a lot of these platforms are like, you know, we want to spread, you know, dog memes and, you know, like, uh, happy things, and don't really think of themselves as like the big speech police. And they're not really interested in that. And it's all downside, right? Like, I mean, you see, for example, Meta's decision in the last couple of years to, um, de-prioritize political content. And that is like, if they were like evil masterminds that wanted to like swing the political debate, they would not be stepping away from political content. But instead they're just like, “Look, this isn't worth the headache for us. Like we would much rather be, you know, a happy, happy, joyful place than having to make all of these really difficult political decisions.”

Annalee:                    [00:45:12] Yeah. Of course, they still are making those decisions by deciding what's political and what isn't, but that's a whole other conversation.

Dr. Douek:                 [00:45:20] And you can't get away from it. Like, I think that's the thing is that there is no neutral, right? Like there's no way to not be making these decisions, a decision to leave something up or to prioritize a certain kind of content and not other kinds of content. All of that is heavily values infused, right? There is no way to sort of go back to, you know, a quote unquote neutral platform. That's just a fiction. It never existed and it never will exist.

Annalee:                    [00:45:44] Yeah.

Dr. Douek:                 [00:45:44] But it is possible that they, you know, will be less open about it, less overt. I don't know, less active in various ways, but I don't think, yeah, I don't think they can abandon it completely.

Annalee:                    [00:45:54] Well, thank you so much for joining us and for answering the hard questions with nuance and subtlety. Is there a place that people can find your work online?

Dr. Douek:                 [00:46:03] You know, I'm going to, I don't post as much as I used to. So I'll instead point listeners to the Moderated Content podcast, a stochastically released podcast about content moderation. And you'll find myself and Alex Stamos talking about all things online there.

Annalee:                    [00:46:18] Yeah. Love that podcast. All right. Thanks so much for joining us.

Dr. Douek:                 [00:46:22] Bye. This was fun. Thanks so much for having me.

[00:46:24] OOAC session break music, a quick little synth bwoop bwoo.

Charlie Jane:             [00:46:27] Thank you so much for listening. You are awesome and super and we love you.

Annalee:                    [00:46:32] We do.

Charlie Jane:             [00:46:32] If you just like randomly stumbled on this podcast, you can find this podcast wherever you find your podcasts: Apple, Spurble, Snoodle, Boople. All the places. If you like us, please leave a review. Leaving a review makes, really makes a huge difference and helps us so much. And if you want to find us on socials, we're on Mastodon, we're on Bluesky, we're on Instagram, and we have a Patreon: Patreon.com/ouropinionsarecorrect. Thanks as always to our wonderful, brilliant audio producer and engineer, Niah Harmon. Thanks a billion to Chris Palmer and Katya Lopez Nichols for our music. And thanks again for listening. If you're a patron, we’ll see you in Discord. We'll be back with a mini episode next week about Lord of the Rings. And everybody else, we’ll be back in two weeks with another new episode.

Both:                          [00:47:17] Bye!

[00:47:18] [OOAC theme plays. Science fiction synth noises over an energetic, jazzy drum line.]

Annalee Newitz