Episode 36: Transcript

Transcription by Keffy

Annalee: [00:00:00] Welcome to Our Opinions Are Correct, a podcast about science fiction and everything else. I'm Annalee Newitz. I'm a science journalist who writes science fiction. 

Charlie Jane: [00:00:08] I'm Charlie Jane Anders. I'm a science fiction writer who thinks a lot about science. 

Annalee: [00:00:13] Today we're going to be talking about the future of the law and law enforcement and we have a special guest with us to figure it all out. 

Charlie Jane: [00:00:22] Is it Judge Dredd? 

Annalee: [00:00:23] It's Judge Cyrus Farivar. 

Cyrus: [00:00:26] Oh Man. I don't think I subscribe to that podcast yet. 

Annalee: [00:00:31] That'll be pretty good. Judge Cyrus. Cyrus works at NBC News as a reporter covering tech and policy and he's also the author of an amazing book that you should check out called Habeas Data: Privacy Versus the Rise of Surveillance Tech. And he and I were colleagues at Ars Technica, where we spent a lot of time talking about mostly Star Trek, but also technology. So we're going to be talking about, today, how the law is becoming more futuristic, especially around surveillance and also some Star Trek.

[00:00:58] Segment change music plays. Drums with a bass line including bass drops.

Annalee: [00:01:26] Right now we're in a phase where the law and law enforcement are getting to be more futuristic, at least in sort of the cyberpunk sense of futurism. We're seeing a lot more surveillance being used. We're seeing a lot of new devices being deployed in the law, and there's a lot of ways that the law is trying to regulate technology. There's things like autonomous cars and robots and drug patents. There's AI. There's reproductive technology. So there's a lot of ways that we could slice this. But what we want to talk about today is specifically how surveillance technology is working its way into the law. And actually, in San Francisco, where we're recording this, we just had a victory for privacy rights. Our Board of Supervisors decided to outlaw facial recognition technologies in San Francisco. Cyrus, can you tell us more about that?

Cyrus: [00:02:17] Sure. Yeah. The bill that you're talking about passed what's called a first reading at the San Francisco Board of Supervisors, by a vote of eight to one. It says that city agencies, which most often means the police, but could be other city agencies as well, can't use facial recognition technology, and also have to be much more transparent in terms of how they obtain and describe the surveillance technology generally that they use. But one of the main provisions, as you say, was this outright ban on the use of facial recognition technology.

[00:02:46] And San Francisco is the first city in the country to do this. The city of Oakland and the city of Somerville, Massachusetts, are probably not far behind, and maybe others as well that I don't even know about. But I think we're now at a really interesting moment where facial recognition technology is getting really good. There's been a lot of coverage in the media lately that you guys probably have seen, especially in places like western China, where it's being used to oppress people. And I think there's a lot of concern that this technology, which, you know, may be flawed in how it's used, could be used in not-so-great ways. So that's the decision that the San Francisco government has made for right now.

Annalee: [00:03:20] And here's a clip of one of our Supervisors, Aaron Peskin, talking about why they made this decision.

Aaron Peskin Clip: [00:03:26] It's not ready for prime time, but even if it is ready for prime time, I think this is a genie that we want to put back in the bottle.

Annalee: [00:03:32] Why do you think that San Francisco, but also potentially other cities, are drawing the line at facial recognition? I mean, we already have a ton of other surveillance that's being used all the time by law enforcement. We have things like recognizing license plates, for example, which can kind of lead to following someone around in their car without ever getting a court order.

Cyrus: [00:03:51] Right. 

Annalee: [00:03:52] Why do you think facial recognition is kind of this flash point for people?

Cyrus: [00:03:55] I think facial recognition is one of those things that we've seen depicted in science fiction, whether literature or TV shows or movies, where being able to recognize somebody's face is something that's very personal to people. I mean, people do things to change their face in small ways, piercings, facial hair, dye their hair or whatever, but for the most part, faces are very intimate and more or less constant, right? Our faces change over time, certainly, but at a very slow rate. Obviously, as we've seen in the Nick Cage documentary Face/Off, it's very difficult to change your face.

Annalee: [00:04:27]  Uh-huh.

Cyrus: [00:04:28]  And so, you talk about, like, compared to other types of surveillance technology, it's something that I think for many people is troubling and is a way to think about, well, okay, like I know that the police officer can ask me for my ID. But you know, we know that fake IDs exist and we know that there are ways to kind of evade that.

[00:04:50] It's hard to try to figure out how you might resist or evade being recognized as a person, right? When you're out in a crowd, whether you're at a political protest, whether you're at a strip club, whether you're at a marijuana dispensary, right? Even if there's, like, surveillance cameras in that place, you probably have some reason to believe that you're probably not going to be noticed or specifically identified. But now that we're rapidly entering a world where that technology is getting really good really fast, that calculus is changing.

[00:05:21] So I think San Francisco, being a place that is mindful of those concerns, or tries to be mindful of those concerns, and also a place where, you know, this technology is being developed in our own backyard, wanted to send a message saying, hey, let's maybe put the brakes on this for a little bit.

Charlie Jane: [00:05:38] Isn't there also a concern that facial recognition technology sometimes gets it wrong, especially with certain types of faces and certain ethnicities of faces? That you could get a false identification, which could be used to arrest somebody who was actually innocent?

Cyrus: [00:05:52] Absolutely. Yeah, I think that's a really great point. And there have been many academic studies that have shown that, because of their training data in particular, many, many facial recognition systems are not as accurate, particularly on women, particularly on people of color. And that's obviously problematic. We don't have many specific instances so far of facial recognition being used at the local level, for, like, police departments. But that's coming really, really soon, right?

[00:06:17] So where I live, just across the bay in Oakland, all Oakland patrol officers, I don't know if this is the case here, have to wear body cameras. The body camera market right now is dominated by a company called Axon, which used to be called Taser, the company that makes the Taser weapon that you're familiar with. Axon has said, and they have patents that explicitly say this, that they want to build facial recognition into body cameras. So that's a technology that we know is coming. It has not yet, as far as I'm aware, been introduced on the streets anywhere in America. But it does currently exist in airports and in many other instances. There are even stadiums that are using this so that, instead of having a ticket, you can just use your face to get into the thing.

[00:07:01] This is coming right now, right? This is happening, you know, as we speak, and I think many of us don't realize how prevalent it already is. We haven't yet seen it in the context of, as you were talking about, city law, local law enforcement. But absent regulation by cities like San Francisco saying you can't have it, or you can have it but only under certain circumstances, it seems to me that it's almost inevitable.

Annalee: [00:07:23] I know for your book you've researched kind of the backstory on how surveillance devices have been regulated by the law. So what's kind of the deep backstory on this? Like, where does the idea come from that we can just say, all right, we have a technology, but you can't use it?

Cyrus: [00:07:35] There's this strange distinction between searches and non-searches in the legal parlance, right? So if you think about what the Fourth Amendment of the Constitution says, it protects us against unreasonable searches and seizures; that's the key phrase. Unreasonable basically means anything that's not reasonable. What is reasonable, then? Reasonable is, for example, a consent search. If I come to your house, if I'm a police officer, and say, hey Annalee, can I search your apartment? And you say yes, you've consented to a search. So that's a reasonable search. We're okay with that.

[00:08:05] Another reasonable search would be if a judge signs off on a warrant saying, oh, we think Cyrus is dealing drugs out of his house and we're gonna find, you know, the drugs in his house, and the judge has agreed to that. Right? We call that process a warrant.

Annalee: [00:08:19]  And there has to have been some evidence for the judge to do that.

Cyrus: [00:08:21] Yeah. And there's, like, a process that goes forward. The judge can't unilaterally say, and the police can't unilaterally say, we're going to search your house or your car or whatever. The ban on unreasonable searches is to prevent the police from doing exactly that, from just kicking down my door and saying, well, we heard… somebody told us that you were selling drugs in your house, and now we're just going to rifle through all your stuff. Right? That's what the founders were concerned about. One of the things.

[00:08:46] So then we get into questions, like, well, what does the word search mean, right? I think we understand it when, you know, a bag is opened or a door is opened, but what does it mean when it's out on the street? What does it mean when the police, if they're looking at me and I'm riding my bike down Market Street or whatever, can the police take note of me? Can they make a sketch of me? Can they take a picture of me? Can they take a video of me? Can they just mark the fact on a piece of paper or on a computer that I was there? All of those things are considered to be not searches. So the Fourth Amendment doesn't even enter into the equation.

[00:09:20] Then you kind of escalate the levels of technology, and this is kind of what my book talks about in more detail, like, what technologies are allowed or not. We're, I think, generally okay with police using their eyes to see things in the world. We're maybe okay with them using binoculars, or eyeglasses, or something like that. There was a case that I write about in my book where the police used a device called a FLIR, a forward-looking infrared. They were peering through the walls of somebody's home to detect heat, infrared heat, coming off of a guy's grow lamp; he was growing marijuana in his house.

Annalee: [00:09:52] That's very Batman, right? Like, that’s what Batman did.

Cyrus: [00:09:55]  I guess so. Yeah. I guess I never thought about that. See, you guys have all this deep knowledge. 

Annalee: [00:09:59] Oh yeah. 

Cyrus: [00:10:01] So in that case, the Supreme Court said, no, you can't peer through the walls of somebody's home without a warrant. That decision… the majority opinion was written by Justice Scalia. So that was a case called Kyllo, K-Y-L-L-O. Up in Oregon, this happened.

[00:10:14] We get into these weird situations where it's not clear whether this or that technology is a search.

[00:10:20] Another case that I wrote about in the book, one that I often describe as being basically Breaking Bad in Minnesota in the '70s, so I assume everybody has, like, awesome handlebar mustaches, is a case called United States v. Knotts, which basically involves some dudes making meth. They're in Minnesota, and they set up a meth lab in Wisconsin, and they're driving from one place to another, about a hundred miles' distance. And they have a barrel, like an oil-drum-size barrel, of chloroform that they've legally purchased. But unbeknownst to them, the police have placed on it a 1970s-era technology, this happened in the '70s, that was described as a beeper, which is essentially a low-range FM transmitter. So it's just giving off a signal of the location of the barrel. That gets challenged, and ultimately the Supreme Court says, that's fine, it's okay to do that, because there's no reasonable expectation of privacy in public.

[00:11:09] They say, look, putting a beeper on this device that gives up the location of the barrel is just like putting a hundred officers on the road. If you put a hundred officers on the road, nobody would have a problem with that. And this is basically the same thing, is what the court wrote.

[00:11:22] That court decision, which was issued in 1982, the year I was born, is what allows technology that we've had now for decades: license plate readers, which you mentioned a moment ago. Something that captures people's license plates as they drive down the street, at very high speeds. And I'm not aware of any court cases yet that have dealt with facial recognition specifically in this regard, but it seems to me that if you buy the logic that there's no reasonable expectation of privacy in public, which means that plates can be captured, then it must also be true that people's faces can be captured by a device and kept as data by a law enforcement agency for maybe months or years on end.

[00:12:03] So I think that is concerning for a lot of people, who are worried not only that it might misread you, or falsely identify you as somebody, or falsely associate you with somebody, but also that, let's say you get falsely identified, and then there's a pattern of behavior attached to you that's wrong compared to what you actually do or how you actually behave. And then, you know, there could be interesting consequences.

Annalee: [00:12:27] Presumably if you do buy the idea that you have no privacy in public, then you can basically follow somebody using surveillance. You can, you know, say, all right, well their face is here, here, here. I mean in a sort of Person of Interest, kind of way.

Cyrus: [00:12:39] Exactly.

Annalee: [00:12:40]  Which is exactly what they do all the time in that show. And it's not that far from reality.

Cyrus: [00:12:45] No, it's really not. I've heard this from police officers, who say, well, look, license plate readers, we're not tracking you. It's not surveillance, they say. Because, they're saying, look, we're taking a snapshot here, and a snapshot three days later over there, and a snapshot a week later over here. And that's not tracking. That's not like watching you all the time. But I think the counterargument to that is, well, given enough data points, it almost is.

[00:13:11] When I was at Ars Technica, you may remember, I did a story about license plate readers in particular, where I obtained, through public records requests from the city of Oakland, 4.6 million license plate reader records, collected over the course of five or six years. And I felt really creepy about it. It was kind of a creepy superpower, where I could look things up. I didn't have people's names, but I had plates, dates, times, and GPS locations.

[00:13:35] So I could see, for example, that this plate typically, in the, you know, morning commuting hours, has been scanned at this location, and in the evening hours has been scanned at this location. And occasionally it's, you know, here, there, and everywhere. I even did this with a city council member. I went to him and asked him for his plate number, and I could accurately guess where he lived, because, you know, if you drive a car, if you drive home most of the time, you probably park on or near your street. So I could say, hey, I bet you live on this particular block. And I was right.

[00:14:05] So, you know, I think that license plate readers can shed some light on what we might expect from facial recognition in the near future. 

Annalee: [00:14:15] Yeah, for sure. And it would be much more in depth of course, because it would be not just when you're in a car—

Cyrus: [00:14:19]  Yeah, exactly. 

Annalee: [00:14:19] —but wherever you're going for your rounds. I'm curious about what you're interested in, in the future of tech law. What are you kind of following? What are you watching evolve? Like, where is the law kind of struggling to keep up with the technology that's being used?

Cyrus: [00:14:37] Yeah, well, it's really interesting, because my book is focused on actions of government, and, as we were talking about, this law here in San Francisco focuses on what the government can do. But there's a whole separate realm about what companies can do or should do or actually do.

Annalee: [00:14:52]  Yes.

Cyrus: [00:14:52] So one example of that is another company here in San Francisco called Ever. Ever is a company that started out some years ago creating a very normal, fun, safe photo-sharing, photo-storage app that's called Ever. Uh, it has a very cutesy cursive logo, and it's sometimes called Ever Album.

[00:15:12] The company officially is called EverAlbum, as one word, Incorporated. EverAlbum went along and was doing their photo app for some time. And then they decided that they, in the words of their CEO in an interview with me, were not getting the returns that they wanted for a venture-backed business, and so they pivoted to having, essentially, a separate brand, almost a separate company within the same company, that they call EverAI. So if you go to Ever, E-V-E-R, dot AI, you will see a company that markets itself towards law enforcement, towards the military, towards national security, and towards other companies, saying, hey, we have this facial recognition service that we're selling to other companies.

[00:15:55] And the story that I did with my colleague Olivia Solon at NBC not too long ago talks about the many people who innocently thought that they were just uploading photos of their friends or their relatives to Ever for their own personal storage purposes. And maybe facial recognition was being used in some, again, seemingly innocent way, like show me pictures of me and my brother or whatever, which I think most people, in a limited context, don't have a problem with.

Annalee: [00:16:22] It's what Google Photos does. 

Cyrus: [00:16:23] Exactly. Lots of services do this. But we spoke with a number of people who were very concerned that the benefits from that technology and research were then being turned to an entirely different purpose. So that's a whole 'nother element of facial recognition and these types of technologies: how it's being used on the private sector side.

Annalee: [00:16:47] Does that mean that what we might see, say, in five years is a bunch of shitty companies like this Ever company, right, saying, “Hey, we can just totally do data rape on all of our photos”? And that's probably not going to be their actual sales pitch. That's just me paraphrasing. But they're saying, basically, like, we can offer you everything, right? We can offer you just a complete lack of privacy on these photos. But then it'll be up to municipalities or counties to tell their law enforcement organizations: they're offering this to you, but you in San Francisco cannot take it. You know, maybe the people over in Fresno will take it. In other words, it will be regulated at the city level, but the company won't be regulated. The company won't be told, actually, you can't do that with your data.

Cyrus: [00:17:34] Yeah, I mean, I think that's a great question. You know, it seems to me that we, in this country, are pretty much okay with how private companies act, right? Think about Google Street View, which came out more than a decade ago. What is Google Street View? It's a private company sending private cars down every single street in America, taking pictures of every single person's home and apartment and workplace and place of worship and all of those things, once or twice a year, and doing that for years and years and, you know, indefinitely, I guess, right? And we're more or less okay with that as a society.

[00:18:08] We're, generally speaking, less okay with the government doing it. If the CHP tomorrow said, okay, we're going to take pictures of everybody's house, just like Google Street View, people would lose their minds, I think. We're generally, it seems to me, more accepting in this country of the actions of private companies, and less so of what government does.

Annalee: [00:18:29] But I guess my question is, it's one thing for Google Street View to be collecting that information and making it available through Google Street View, which, I mean, I use all the time, versus Google saying, oh, well, we also have a special package that we offer to law enforcement. Or Ever doing that, or Amazon, you know, with their AI facial recognition stuff. One of their biggest customers is law enforcement.

Cyrus: [00:18:49] Yup. 

Annalee: [00:18:49] I guess that's what I'm wondering about. It sounds like what you're saying is that companies will probably continue to just offer these packages to law enforcement, and some law enforcement agencies will be permitted by their local government and other ones won't, and it'll be this kind of patchwork of, you know, you leave San Francisco and suddenly facial recognition is on the table.

Cyrus: [00:19:09] Yeah, I think you're right. I think, it seems to me, and my colleague Jon Schuppe at NBC did a great story recently about how facial recognition has become kind of the new normal or is starting to become the new normal in cities across America. As you say, they're contracting with companies to do various things, Amazon being one of them. 

[00:19:26] I do think that that seems to be where we're headed, absent some kind of major federal privacy law, which many privacy activists have wanted for some time now. We now have a law that's going to go into effect early next year called the California Consumer Privacy Act. That's specifically about online privacy and online kind of behavioral tracking and things of that nature.

Annalee: [00:19:47]  We talked about that in a previous episode.

Cyrus: [00:19:47] Okay. 

Annalee: [00:19:49] Yeah, I'm a fan of that one. 

Cyrus: [00:19:50] So the criticism from many companies, particularly many tech companies in the Bay Area, is, they say, look, we don't want, as you say, a patchwork of laws. We don't want a California website and an Illinois website or whatever. This is one of the things I, you know, I often say: like, isn't federalism fun? This is one of the things that is interesting about the system that we have in this country, where cities and counties are often trying to do their own thing.

[00:20:13] If they think the state is too slow, and the states likewise, if they think the feds are too slow or aren't going in the direction that they want to go, they have the ability to pass their own laws in that respect. There are some movements to try to do a federal privacy law. What would that look like? What would it take? Would it cover facial recognition? Would it just be about online ad tracking or whatever? We don't know yet, but you're right. It seems to me that we're rapidly moving into a world where certain rules apply in one city or county and not in another.

Annalee: [00:20:41] All right, well on that note, let's take a break and when we come back we're going to talk about the law in science fiction, and, mostly Star Trek.

[00:20:48] Segment change music plays. Drums with a bass line including bass drops.

Charlie Jane: [00:21:02] Star Trek has had a lot of episodes about the law. I feel like there are two major concerns in a lot of science fiction, both of which turn up in Star Trek when it comes to the law. One is the concern of, you know, individual liberties versus security and how much liberty are you willing to give up to be secure? And Star Trek has dealt with that a lot. And then even more so there's the question of who do we consider a person? Who gets personhood? 

[00:21:24] Just as an aside, before I throw this to you, Cyrus, I'm just going to point out that when the LAPD was militarizing back in the ‘50s and ‘60s, it was sort of becoming more of a military organization and taking on a lot of those characteristics with like high tech equipment and more heavy duty surveillance and stuff. 

[00:21:43] The person who was in charge of selling that policy to the public was a young police officer named Gene Roddenberry, who wrote speeches for the then-LAPD chief William H. Parker. It's interesting that Roddenberry then went on to create this show that asked all of these questions. What do you think is the main message that we get from Star Trek about those two questions, about liberty versus security and also about personhood?

Cyrus: [00:22:07] Wow, that's a big question.

Annalee: [00:22:09] Just sum it up for us.

Cyrus: [00:22:09] Sure. Fifty words or less, and then we get out of here. Yeah, it's interesting. I think a lot of people forget that Gene Roddenberry, as you say, started off as an LAPD officer, and actually the show that he developed prior to Star Trek: The Original Series was a show called The Lieutenant, which featured a young actor by the name of Leonard Nimoy.

[00:22:29] I think he was very cognizant of that background, of the law enforcement and, you know, rule-of-law type of things. I think in the original series there's a number of episodes. I'm less familiar with the original series, so I'm going to defer to you guys on that. But from the ones I've seen, I do know that there are a number of kind of legalistic episodes.

Charlie Jane: [00:22:46] There's a bunch. Partly because those are cheap to film.

Cyrus: [00:22:49] Sure. 

Annalee: [00:22:50] All right. We’re all going to be in a court room. 

Cyrus: [00:22:51] We're all going to hang out here for a while. Right. So I grew up on Next Generation. 

Annalee: [00:22:55]  Same.

Cyrus: [00:22:56] It's interesting because Next Generation, right, like, the opening episode, in large part, is a legal court drama.

Charlie Jane: [00:23:04] It is a court room episode, yeah.

Cyrus: [00:23:05]  Right, so, Q comes in and essentially is putting humanity on trial, which I have to say it's like a hell of a way to open a series in general.

Annalee: [00:23:13]  Especially with those outfits. 

Charlie Jane: [00:23:17] I know…

Cyrus: [00:23:16]  It’s pretty insane. 

Annalee: [00:23:16] Pretty campy. 

Cyrus: [00:23:18] Uh, right. That's kind of interesting as a show concept, right? Like, we're going to judge the actions of these people. And, to kind of distort Ali G's original postulation, right? Technology: is it good, or is it whack? Humanity: is it good, or is it whack? I think that's kind of what Q is trying to say in that context. And Picard is there to argue: yeah, we've done some bad stuff, we're hardly perfect, but we do our best to learn from our mistakes and to improve ourselves. And I think, to me, that's one of the core messages of Star Trek in general: that, yeah, it's hard, it's messy, sometimes it's imperfect, but we are trying to better ourselves and trying to learn from our past and from other civilizations and other people who do things in a different way.

[00:24:01] There's a number of really fun episodes in TOS, TNG, DS9, right? I mean, this theme of kind of legal Star Trek continues on. Um, I don't think in Discovery we've gotten a legal-procedural-type episode yet. I was trying to think. Not really.

Annalee: [00:24:17]  Not really. 

Charlie Jane: [00:24:18]  I don’t think so.

Annalee: [00:24:18] I mean, we've had discussions about kind of justice, but there hasn't been a court procedural.

Cyrus: [00:24:25] I guess there's, like, tiny elements of that. Like, I'm just thinking, wasn't Burnham court-martialed at one point?

Annalee: [00:24:31]  She was.

Charlie Jane: [00:24:31] She was.

Cyrus: [00:24:31] So there’s [crosstalk].

Annalee: [00:24:32] She was court-martialed and she's in prison.

Charlie Jane: [00:24:33] That's the very first episode.

Annalee: [00:24:35] And we have Section 31, which is explicitly extra-legal, and so we can reverse engineer what's legal.

Charlie Jane: [00:24:42] But yeah, one of the main writers of The Next Generation, the amazing Melinda Snodgrass, is actually an attorney. She has a law degree, and she wrote the episode where Data's put on trial to see if he's a person or not. She put a lot of, like, her own legal expertise into that episode, and some of the other episodes that she did.

Annalee: [00:25:00] Whether he should be called Data or Dahta.

Charlie Jane: [00:25:02]  Exactly, exactly.

Annalee: [00:25:02] I loved how that was what started it. Was somebody who refused to give— 

Charlie Jane: [00:25:07] Dr. Pulaski. 

Annalee: [00:25:07] Dr. Pulaski. Who was like, I can call you whatever I want. It's kind of like the weird precursor to, like, the pronoun debate or something.

Charlie Jane: [00:25:13] It’s like microaggressions, Dr. Pulaski is really big on microaggressions. 

Annalee: [00:25:16]  It is a microaggression. I know.

Charlie Jane: [00:25:18] I mean, seriously. 

Annalee: [00:25:19] Yeah. 

Charlie Jane: [00:25:19] Yeah. 

Annalee: [00:25:19] Also, we have this great clip from The Drumhead, which is a Next Generation episode where Picard is kind of talking about, he's in court—

Charlie Jane: [00:25:28] Liberty, yeah.

Annalee: [00:25:29] —talking about liberty and justice.

TNG Clip: [00:25:30] You know, there are some words I've known since I was a schoolboy. With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably. Those words were uttered by Judge Aaron Satie as wisdom and warning.

Charlie Jane: [00:25:57]  I love Patrick Stewart so much. 

Annalee: [00:25:58] I love that episode, and I feel like, I mean, it's interesting 'cause it sets up a lot of issues that we see unfolding in other Star Trek series, too, around: what does it mean to administer justice? When is a courtroom proceeding actually doing what it sets out to do, which is discover the truth and discover who has actually committed a crime? And in the case of The Drumhead, there's this whole thing about, basically, Star Trek's idea of what racism is, which is speciesism, right? And so, like, who is actually a Romulan and not a Vulcan.

Cyrus: [00:26:32] Right. 

Annalee: [00:26:33] And why did Picard invite a Klingon onto the ship? You know, and like there's this whole thing about how Picard is kind of a race traitor of some kind because he consorts with Klingons. And so I think, I mean, that really gets into that personhood question, too. 

Cyrus: [00:26:47] And I think that idea sort of ping-pongs back and forth. I mean, I think in the original series, obviously the Klingons are the enemy, and so it would be inconceivable that there would be a Klingon member of Starfleet, in the Federation, which obviously happens from episode one of TNG.

Annalee: [00:27:01] Yeah. 

Cyrus: [00:27:00] Again, we get that theme throughout, right? In DS9, I've been rewatching DS9 for the third or fourth time recently, and, like, the whole Nog storyline, of the first Ferengi in Starfleet, is a really interesting idea, because when we're introduced to the Ferengi as a species, they're meant to… I think we're meant to be repulsed by them. They have misogynistic views, they are driven by profit and greed and lust, and that's it. They're very kind of one-dimensional in that way. But I think one of the things that Star Trek tries to teach us, or one of the things that I've learned from watching it, is that we can't be limited in our understanding of what people are, what species are. That they're not limited to their programming, essentially.

[00:27:43] They're not limited to the, you know, six-word description that you get in the episode capsule or whatever, right?

Annalee: [00:27:48] Six-word description of Ferengi. 

Cyrus: [00:27:50] Yeah, it sounds dumb, but I think it's really true that… and one of my favorite TNG episodes, this is maybe diverting slightly from the legal framework. There's the episode I, Borg, where they capture this Borg, and the idea is that you can rehabilitate a Borg, you can introduce a dangerous idea, and we're not going to commit a war crime. We're not going to commit genocide. We're not going to turn this guy into a bomb, essentially, and instead we're going to infect him with ideas rather than destructive power.

Annalee: [00:28:19] Yeah, we’ll just use social media.

Cyrus: [00:28:22] Yeah.

Annalee: [00:28:22]  We’ll just put a meme in there, yeah.

Cyrus: [00:28:25]  That is basically what they did.

Annalee: [00:28:26]  That is basically what they did. But it's a meme that we can believe in because we all know that individualism is great. So, what could go wrong? 

Cyrus: [00:28:34] You know, and it's interesting because you were mentioning Melinda Snodgrass, who, as you say, has this background as being an attorney and applied that knowledge and context to the Star Trek episodes that she worked on. It's funny if you go out online like on Twitter and you ask attorneys who amongst attorneys are Star Trek fans, like they swarm. There’s like a whole amazing community of people on Twitter who are attorneys.

[00:28:57] I did this story for Ars where I was asking about like the present day legal framework for like the Prime Directive. And I got an email from a guy who said, I am an attorney in Florida. I think of myself first and foremost as a Starfleet officer. And I was like, that's amazing. 

Charlie Jane: [00:29:10] Oh my God, that's awesome. 

Annalee: [00:29:12] I wish that that would be true in like police departments, too. I feel like we need more Star Trek fans among police officers, although I don't know, there may be a lot.

Cyrus: [00:29:19] There may be. Yeah, it's a good question. I did a story recently about the CCPA, the California Consumer Privacy Act, and my colleague Dave Ingram and I got to interview the Attorney General of California, Xavier Becerra, and I found out that he's a big Star Trek fan. And we talked a little bit about, about TOS, and he talked about the Klingons and Scotty and all that stuff. And I know that there's a sitting California Supreme Court Justice who's also a Star Trek fan. Like, they're out there. 

Charlie Jane: [00:29:42] So this all kind of makes me wonder, because so much of our present-day earth legal framework is built around safeguarding private property and private ownership. And in Star Trek, at least in theory, you don't have those things anymore. So what kind of glimpses does Star Trek give us of a legal system that isn't centered on enshrining the right of private property?

Cyrus: [00:30:02] Yeah, it's interesting. There's a Star Trek podcast, if I may plug a different podcast on this podcast.

Charlie Jane: [00:30:08]  Noooo. Not allowed! [laughing].

Cyrus: [00:30:08] There's a podcast that I love called The Greatest Generation. Their tagline is a Star Trek podcast by two guys who are a little bit embarrassed to have a Star Trek podcast, which is a feeling that I identify with, sometimes.

Charlie Jane: [00:30:17]  I would not be embarrassed.

Cyrus: [00:30:19] You’re a better person than me, Charlie Jane. But, so one of the things that they point out, which I had never really thought about until they kind of called it out in one of their episodes, is that the doors, at least on the Enterprise-D, don't really seem to have locks. Or they have flimsy locks, in a sense. And kind of like with respect to private ownership, there's a million TNG episodes where somebody can security-override access to somebody else's quarters or whatever, or somehow trick the computer into, you know, getting into somebody else's quarters.

[00:30:50] And I wonder if in that universe, like you say, there is less of an emphasis on private property and private space. Most of the Enterprise, as we see it depicted in the show, is public space: the bridge, Ten Forward, or wherever. We don't have too many episodes that are centered on people's private quarters. We know that Riker plays trombone or whatever.

Annalee: [00:31:14] Data has Spot.

Cyrus: [00:31:15] Data has Spot, right, and for some reason Data has like his own little special computer in his quarters. That's never really explained. I think that that's a really interesting idea, where there's this world where people's private spaces may be a bit different. And part of that, I think, is determined by… I wonder, if you live in a world where people can transport in and out of spaces all the time, maybe locks don't even matter, because you can just be anywhere, right? In DS9 there's kind of a throwaway line where Sisko talks about how he used to beam home from the Academy to his dad's restaurant in New Orleans all the time. And Nog turns up at the restaurant to eat tube grubs with Joseph Sisko, and you could just do that. I don't really know. It seems like, if you live in a universe where, generally speaking, people are taken care of. I think we're sort of meant to believe that in the Federation universe, at least on earth, people are pretty okay. That they seem to be taken care of, and that if you have a profession, if you're in Starfleet, if you're running Sisko's restaurant in New Orleans, you're doing it because you want to, not because you're being paid gobs of money to do that.

Annalee: [00:32:16] Yeah. 

Cyrus: [00:32:15] If that's the universe that you live in, then maybe it doesn't really matter that you have this or that object or this or that property.

Annalee: [00:32:24] Yeah. Maybe they just have a completely different conception of privacy because we also know that they can be located at any time. 

Cyrus: [00:32:30] Right. 

Annalee: [00:32:31] You know, we constantly see that. They're just being tracked everywhere. And I think, like you said, maybe the way that works is because, at least in the places that we see in the Federation, there's not a lot of class division. And so there isn't that sense of like, I have something to hoard, or something. And because there are allegedly great rights, you know, everybody has lots of human rights or personhood, they don't need to worry about their rights being violated, their privacy being violated. Except in all of the cases we see where people's privacy is violated and their rights are violated. And we see that a lot in Deep Space Nine, I think. And, of course, in Discovery, which is sort of in the pre-history of [crosstalk].

Cyrus: [00:33:11] Right. I feel like there's a number of episodes that deal with this land is being taken away from me, right? Like the entire premise of like the Maquis, right. So like the Maquis are this group of people that feel like they got screwed by the Federation, who are far away, and their land was ceded without their consent, essentially. And there's other episodes where people are like, I showed up at this place, there was some rule that I don't agree with or something like that. Or, I was ordered to leave by some authority. I feel like there's a DS9 episode about that where Kira has to go like kick a guy off of a moon or something and has to blow up his oven. He has like an oven in his little garden or whatever. Do you not remember this? Or like…

Charlie Jane: [00:33:49]  Vaguely.

Cyrus: [00:33:50] Then she’s like, put in the position of the oppressor. 

Annalee: [00:33:52] Yeah. 

Cyrus: [00:33:52] Like going and saying, you have to.

Annalee: [00:33:53] She’s put into that position a lot. 

Cyrus: [00:33:55] Right. 

Annalee: [00:33:55] And I mean, and that is, like I said, that's a theme of Deep Space Nine, which is a postcolonial world, right? I mean Bajor was occupied by the Cardassians and not the Kardashians.

Cyrus: [00:34:07] It’d be a very different show.

Annalee: [00:34:07] Yeah, different world. That’s our world. And then they have to figure out how to reclaim that space, like you said. But that's much more about territory and homeland and identity. Going back to this question of personhood versus privacy: it's not about private property, it's about my homeland, and what land or what planet or moon do I have a right to occupy? Because it's part of my heritage, or it's where I have my oven.

Cyrus: [00:34:39] Exactly. 

Annalee: [00:34:38] Hey, having your oven in a place is pretty important. 

Cyrus: [00:34:41] Sure. Sure.

Annalee: [00:34:41] I mean, that’s.

Cyrus: [00:34:42] I get that. I make pizza. It's good. I understand. 

Charlie Jane: [00:34:45] So, I mean, the hope of Star Trek is that as our technology improves, as we get the ability to replicate whatever we want and travel vast distances really quickly and create computers that can think better than humans and so on and so forth, the hope is that we will become better people. And that we will actually kind of leave behind some of our shitty behavior that is part of why we need laws now. And so the real question that Star Trek never really answers is what do they need laws for in a future where all these limitations that we face… 

Cyrus: [00:35:15] To me it kind of goes back. There's the old line from, I believe it was James Madison who said, I think in one of the Federalist Papers, right, if men were angels, no laws would be needed. Setting aside the, you know, sexism within the line. But the sentiment is a good one, which is that humans, as we have understood them throughout recorded history, throughout time as we understand it on our planet, are flawed, right? We've done, as a civilization, bad stuff in our history. And the idea of even having laws is a somewhat recent development. You guys have studied anthropology, I'm sure, far better than I have. But it seems to me, in my tiny, superficial understanding of it, that you want to have laws to regulate disputes over property, to regulate the relationship between citizen and government. And a lot of times those laws are bad. We had a system of laws in this country that legalized a system of human bondage. For a long time we had a system of laws that allowed people to be segregated, and we're today dealing with the ramifications of those laws. So laws obviously aren't always inherently good.

[00:36:24] I think that we hopefully would need laws to try to help us, to nudge us, to get us not into the universe of we're going to know where everybody's face has been forever at all times. Whether that's the 18th century version, which is why the Fourth Amendment was written, or the 21st century version, which is we don't want the government to take pictures of our faces when we're going to buy marijuana at the marijuana dispensary or whatever. To maybe a future version where, I don't know, I imagine a not-so-distant future where a machine exists that can pick up the hair follicles or skin cells that we shed and record, with bulletproof DNA evidence, that yeah, for sure, Cyrus was at this recording studio on this day at this time. And then we know he went and got a burrito after. I don't know if I'm gonna get a burrito after this. But you know what I'm saying, right? Because we’re constantly—

Charlie Jane: [00:37:19] Or maybe the computer can predict that you're going to get a burrito.

Annalee: [00:37:22] Yeah, based on…

Charlie Jane: [00:37:22] Like in Minority Report.

Annalee: [00:37:24]  Yeah, exactly based on like blood sugar levels.

Cyrus: [00:37:26] Sure, pre-burrito.

Annalee: [00:37:28]  Because it’s sampled your blood all these other times when you wanted burritos, and it’s like, yeah. 

Cyrus: [00:37:32] Exactly. 

Charlie Jane: [00:37:32] Tom Cruise is going to be moving his hands around and be like, Cyrus is going to get a burrito.

Cyrus: [00:37:38]  Yeah.

Charlie Jane: [00:37:38]  So basically what you're saying is that, you know, in terms of Star Trek, which is mostly what we're talking about, the human behaviors that we're going to need to regulate are things like refusing to see others as people and also still reverting to this kind of desire to surveil people and restrict individual liberty in the cause of security. 

Cyrus: [00:37:59] Yeah, I think that's a good way to put it. You know, it's interesting: we have depictions in Star Trek of court proceedings, or military justice court proceedings, court-martial-type situations. But we don't have a good sense in Star Trek, as it's been depicted so far, of what sort of civilian justice looks like or what civilian law enforcement looks like. We have this movement in contemporary criminal justice of this idea of restorative justice, and you see that played out in schools and in community settings.

[00:38:27] I sort of wonder, as I rewatch some of these episodes, I sort of imagine like maybe this is kind of where restorative justice goes. Is that you have this kind of hold hands and do yoga and do therapy and understand the wrong that you've caused to somebody. And that yeah, we want you to be punished in a way, but we're not gonna send you to Rura Penthe. We're just gonna make you sit and read some books for a while and not be a danger to society. But that's one of the things where I think the judicial system as it's depicted in Star Trek is incomplete. We don't have a good sense of what that means for violent crime or something like that.

Annalee: [00:39:05] It is interesting that in the mirror universe, the agonizer is such an important piece of that, the thing that marks it as other. This is obviously not us because we would never—

Cyrus: [00:39:16]  We’d never do that.

Annalee: [00:39:16] —do that with our prisoners, you know.

Cyrus: [00:39:18] Or, we did that in the past, right? We had the stocks and the, all that—

Annalee: [00:39:20]  All kinds of stuff.

Cyrus: [00:39:22] All those kinds of torture devices, and as it's shown, particularly in Discovery, they spend a lot of time depicting that and showing how awful that is.

Annalee: [00:39:31] Yeah. All right. Thank you so much for joining us. Where can people find your work?

Cyrus: [00:39:34] People can find my work at nbcnews.com they can find my endless retweets of Riker Googling at my own Twitter account, which is @CFarivar. Uh, I'm on Twitter too much. Thank you, guys. This was really fun.

Annalee: [00:39:46] Yeah. 

Charlie Jane: [00:39:46] Yay. Thanks for coming. 

Cyrus: [00:39:47] My pleasure. 

Annalee: [00:39:47] All right, so, you have been listening to Our Opinions Are Correct. You can find us wherever fine podcasts are purveyed. Please leave a review for us on Apple Podcasts, that helps people find us. You can follow us on Twitter @OOACpod, and we have a Patreon.

Charlie Jane: [00:40:06]  Woo-hoo!

Annalee: [00:40:06] So you can give us some gold latinum, and… or Bitcoin or Dogecoin. We take it all, man.

Cyrus: [00:40:13] Isiks? What about isiks?

Annalee: [00:40:14] Isiks. We take it all, whatever it is. As long as it can be converted into US dollars, we love it. And thank you so much to Veronica Simonetti here at Women's Audio Mission. She's our producer. Women’s Audio Mission is the greatest recording studio in the world, pretty much. And I think we can probably prove that. And thanks to Chris Palmer for the music, and we will hear you, or you'll hear us, or something like that, in two weeks.

Charlie Jane: [00:40:37] Yay. Bye.

Annalee: [00:40:39] Bye!

[00:40:38] Outro music plays. Drums with a bass line including bass drops.

Annalee Newitz