NewCo Shift Forum 2018
Now that digital platforms drive physical consequences, what does “free speech” really even mean anymore?
It’s the First Amendment to the US Constitution, but it’s also deeply misunderstood. Free speech seems an inviolate right, but when practiced at public institutions like UC Berkeley, or within private corporations like CloudFlare or Starbucks, the concept of free speech is both tested and proven. Join UC Berkeley Chancellor Carol Christ, EFF Executive Director Cindy Cohn, and moderator Nellie Bowles of the New York Times for a challenging and timely conversation on the role of speech in our roiling democracy.
Nellie Bowles: Hi. We are here today to talk about free speech and the corporation, a nice, relaxing topic for a Tuesday afternoon. [laughter] I know you two both have a lot of interesting things to talk about. To just jump right in, when we talk about free speech right now, that term and that concept, it feels like it’s changing radically. I don’t really understand if the shift in how the term “free speech” is being used is political or it’s about platforms or if it’s generational. If you could both just give me a sense of what has changed in the last year with this concept?
Cindy Cohn: I’m not sure it’s just changed in the last year. Certainly, for the free speech movement at Berkeley, what students were fighting for in the 1960s was the freedom of their own speech, the freedom to politically advocate on campus.
Now free speech means something quite different. It means the ability to inhabit a particular public forum as an outside speaker with a particular political point of view.
Nellie: Chancellor Christ, you’ve been at the center of this firestorm overseeing Berkeley, which has now become another free speech hotbed for another generation. What’s happening?
Carol Christ: The way I like to describe what’s happening at Berkeley is what I call the tale of two speeches. This really gets into the digital world. We had two free speech events in the fall.
The first was a speech by the conservative pundit and thinker Ben Shapiro. In that event, the speech was the thing. Ben Shapiro came to campus. He wanted to give a speech before a live audience. The speech indeed was filmed. It’s available on the web. The speech was the thing.
Next, we had the so-called Free Speech Week, which was actually only four days, organized by Milo Yiannopoulos and a number of his colleagues. This was an event that was supposed to take place over four days: 12 events, 25 speakers. It would essentially occupy the center of campus and would have been, had it taken place, enormously disruptive.
It became clear as the event got closer that Free Speech Week was a fiction rather than an event that had actually been planned. After Milo Yiannopoulos published the list of 25 speakers on his website, we started getting communications from the speakers that they had never heard of this, that they had no intention of coming.
We realized that the point of this so-called event was to provoke us into canceling it. If you use the analogy of an object and a shadow, with Ben Shapiro, the object was the event; its shadow was digital. For Milo Yiannopoulos, the object was digital; the shadow was whatever happened in the real world.
We decided to call his bluff. He canceled the event 24 hours beforehand. It really suggests to me that the omnipresence of a digital world has changed not only the stakes but even the stage on which free speech events happen.
Cindy: I love that story because I just heard it off-stage. I think that that’s absolutely right.
I think free speech has been taken, turned into something else, and weaponized. That’s one of the early things we’ve seen with fake news and other things. The strategy of a lot of the people on the far right is to get us to attack our own institutions and stop thinking that they’re valuable.
By waving free speech around as if it was a magic shield that lets you be an asshole, they’ve convinced a significant percentage of the rest of us that the problem is free speech, as opposed to the problem is you people. I think that there’s a real risk going on here that people fall for it.
They fall for it in a way that leads them to actually harm the institutions, I think the Chancellor’s exactly right, that are aimed at trying to empower people.
I’ll tell you part of the reason I think this is that at EFF, we hear from people all over the world who are trying to get their voices out. That traditional version of free speech is alive and well if you’re Rohingya and you’re trying to talk about the genocide of your people. It’s alive and well if you’re in Syria trying to document war crimes. It’s alive and well in the rest of the world.
If the rest of us here in the West in this elite conversation that we’re having in America right now trash the idea of free speech or the idea that the Internet is a place where people who formerly didn’t have a voice or who said things that the majority doesn’t like get to still speak, we’re going to let all the rest of them down. I certainly hope we don’t do that.
Nellie: In a lot of ways, those platforms like Facebook have allowed false information that, as the Times has reported, has fueled some of the conversation around the genocide there in Myanmar. What role do corporations have?
Cindy: Let me be lawyer lady for a second. In the traditional free speech doctrinal thing, corporations don’t have an obligation to uphold free speech. The idea of the First Amendment and even freedom of expression in the international context is about your relationship with your government.
It’s not about your relationship with Facebook. Facebook is not a state actor for purposes of traditional free speech doctrine. The University of California, Berkeley is, because it’s government funded.
Corporations have rights of free speech if the government was going to come and try to tell them what they had to say. That’s not really, I think, what’s going on here. What people are saying is that the ideals of free speech are the kinds of things that people are worried about, how they’re being implemented by these companies.
As a strict legal matter, the First Amendment is largely irrelevant to the question of how Facebook decides to make its Newsfeed work, or who it kicks off or how it applies its content moderation policies. As a strict matter of law, they can make whatever decisions they want. They can let anybody on the platform. They can kick anybody off.
Carol: What’s so interesting is that the legal point that you make is absolutely right: public institutions have different responsibilities in regard to free speech than private institutions, whether they’re private universities or private companies, which aren’t obligated as public universities are to observe the First Amendment.
When you get private digital purveyors using the kinds of protections for free speech that exist in our public institutions to broadcast in various ways, to create a permanent imprint out there in the digital world of the free speech that our public institutions protect, that’s a really complicated situation I think.
Cindy: It does get pretty complicated pretty fast. The other thing that I will say, having spent the better part of the last 15 years at EFF trying to help people find their voice online and watching how platforms do that, is that there’s a lot of pressure to try to get platforms to censor more…
Nellie: There’s a movement within Silicon Valley to get the platforms to self-regulate. You have people like Tristan Harris and a lot of ex-Facebook, ex-Google employees who are saying these platforms need to be regulated.
Cindy: Regulated by who? Remember that if you’re calling for regulation, then you’re asking for (current Attorney General) Jeff Sessions to decide when you’ve violated the law. I think we all might want to stop a little bit and think about whether that’s going to head where we want it to go right now.
It’s all well and good to talk about regulation. You actually have to look at who’s the actual regulator who you’re bringing in. Are we talking about (current FCC Chair) Ajit Pai? Are we talking about Jeff Sessions? Who are we talking about here?
Carol: I think that’s exactly right. The most powerful argument, in my view, that John Stuart Mill makes for free speech is that if you restrict speech, you have to determine which government authority possesses the power of censorship. It may not be an authority that you would trust with that power.
Cindy: Back to your moral question, should we be pushing these companies to engage in more censorship, there’s plenty of things I would like to push the companies to do here. Let me make sure that I put that pin in so we can get back to it.
Companies suck at censorship. There’s a website, onlinecensorship.org, where EFF and another group called Visualizing Impact try to keep track of how well this happens. The through lines are really clear. If you’re a really powerful person, you’ll get other people kicked off the platform. If you’re a person without power, you won’t.
We’ve seen people on Twitter who are complaining about racist things being said to them, reposting that, and getting themselves kicked off of Twitter. They’re not going to get better at this in a significant way. They’re bad at it. They’re bad at it in a particular way that favors the powerful over the less powerful. It’s pretty well-documented that that’s what’s happening.
It’s because you have to try to do this at scale. You’re trying to get people without a lot of training, maybe a little bit of training, trying to decide who’s the good guy and who’s the bad guy in all of these conversations all around the world.
There’s a reason, as the Chancellor says, that we’re very hesitant to do that in the context of government. It’s not because we don’t think that people say awful things, or that it’s dangerous, or that there are scary things that get said. It’s because once you give somebody the power to be the censor, you give them tremendous power. Who are you going to put in that position?
I don’t think Mark Zuckerberg’s going to be magically better at this or get better at it. The things that we think about to try to deal with this problem are: Why is Facebook the only one that decides what comes into your feed? What are their incentives? Are there ways that we can push the company to open up APIs to allow us to take our social network somewhere else?
These are places where Harris and others and I tend to agree. I would like to foster a situation in which Facebook wasn’t the sole arbiter of what we saw, because their incentive is to keep you on the platform.
Carol: One of the really interesting questions, I think, is what becomes of the utopian vision that many people had when the Internet started to become so powerful a means of communication, the vision that it created a more democratic place for public speech.

I think now people are becoming much more cynical about that, more critical of that utopian confidence.
Nellie: Tell me about that because one thing that I am noticing is that a lot of this feels generational. It seems like, I’m almost 30 now, people younger than me, people in college and younger, have a very different attitude towards free speech. What are you noticing?
Carol: I certainly notice that. Many of our students feel that there should be restrictions on free speech.
Think about this generation. It’s a generation that grew up with anti-bullying instruction in the public schools, lots of instruction about harassment. They can’t figure out why it’s wrong for me to say something in a classroom or in a dormitory that someone like Milo Yiannopoulos can get on a public stage and say. What explains that tension?
I think we have to do a lot more educating about free speech, the history of free speech, what it is that free speech protects. I agree that it is very much generational.
I also think that the digital world changes the permanence of things. If I were to say something hateful to someone in this room at a live event, it might be hateful, but it would be over as soon as I said it. Whereas if I said it in a digital space, it’s there forever. I think that changes things. Our laws haven’t caught up with the difference in digital media.
Cindy: That’s certainly true. I’m not sure I would agree with all of the prescriptions. I think the other thing that’s happened…
Nellie: What do you disagree with?
Cindy: I knew you’d want that! One of the other things that I think is generational is that the Internet of the 1990s was an open platform with lots and lots of different places. The Internet of the last 10 years is very closed. There are three or four companies that are the places where you speak.
Again, Facebook decides what comes up in your feed. This isn’t a random event. You don’t have any power over it. They’ve got a set of decisions that are based on trying to keep you on the platform as much as possible.
Neurologically, we know that that’s about outrage. The Internet of the 1990s was about choosing your own adventure. The Internet of right now over the last 10 years is about somebody else choosing your adventure for you.
I think that’s a significant shift and one that shouldn’t be overlooked as you’re trying to figure out what you want to do from here. I certainly got into the Internet because I was excited about the ability of everybody to have a voice. I still think that that’s a good thing in the world. In the vast majority of situations, it is.
We’ve got to deal with the problems now. It’s complicated. This isn’t the Internet of the 1990s. People aren’t saying hateful things on their message boards on Usenet, where you just start another one if you want to leave.
Nellie: Do you think it’s advertising revenue, the economic model of so many online sites that are trying to maximize both the number of views and the time spent there?
Cindy: Absolutely. I call it the surveillance business model. I have lots of problems with it in other contexts, like the NSA spying on everybody, as well. There’s this idea that we all have our own individual Internet that isn’t the one that we picked; it’s the one that Facebook’s algorithms or Twitter’s algorithms or Google’s vast knowledge about all of us has decided is the one that we’re going to see.
Nellie: I’m going to have to kick it to the audience for questions.
Audience Member: Hi. Renee DiResta. I’ve been interested in particular in the EFF’s take on notions of speech, and how we seem to be at a point where the absolutist view of free speech holds that moderation is censorship, which, in the strict constructionist definition of the word, it may be. I hear a lot of “this is really hard, this is really complicated, we don’t want to stifle dialogue.”
I feel like we’re actually privileging absolutist free speech over freedom of expression and freedom of assembly, particularly in the context of harassment. I’ve been following the EFF as it’s been tweeting about the bill challenging CDA 230 today. I would like to hear what you actually suggest people do, because I hear a lot of “it’s too complicated.” I don’t ever hear any actual solutions.
Carol: I don’t quite understand what you mean about what you think people should do. Can you explain your question more?
Renee: There’s a sense that asking tech platforms to take a stance or the fact that they are in fact arbiters of content as it is every day just by virtue of the fact that the Newsfeed is not a straight timeline, they’ve already made a decision. They’ve already shaped a narrative. They’ve already pushed a view.
I’m curious why when it comes to moderation and when it comes to trying to restore a sense of civility, that’s treated somehow as a different power, as too much power for them to have.
Cindy: We’ve never said that the platforms can’t moderate. They do. They can. What I’ve said is if you look at them historically, the direct hosting platforms suck at it in both directions. They don’t catch things that are bad. They let things go on, on both sides.
The question is, is the answer to double down and tell them, “Suck at it less”? Or is the answer to look at other ways to get at this?
I’ll explain to everybody else what you’re talking about. There’s a bill that got passed out of the House today that’s called FOSTA which would create liability for any host, not just big ones…We’re talking about libraries. We’re talking about local blogs that have comments.
We’re talking about anything that would create the ability for them to get sued or criminally prosecuted in all of the 50 states if there is an ad for sex trafficking that shows up on that platform.
We oppose that bill because we worry that, A, most of the people who specialize in sex trafficking will tell us that it will not solve the problem. B, it will create barriers to entry for anybody who wants to host other people’s speech.
If you want a world in which you have something other than Facebook, creating gigantic barriers to entry to host other people’s speech is going to make sure that Facebook is all we ever get. There won’t be a next company, because they’re not going to be able to get funding to do a startup if the startup costs are that high.
What should we do? I actually think it would be great if we devoted more resources for local police protection for women. I think you should get a temporary restraining order and have it work. I think when you call the cops because somebody’s harassing you, you actually ought to get some help.
I think that by the time you’re turning to an Internet platform, you’re pretty far down the road in these kinds of things happening. There’s things that can happen. I don’t think creating liability for the platforms, especially small platforms, is going to work.
Alexandra Levy’s work at Notre Dame, and a lot of people who look at how sex trafficking works, are pretty consistently saying that this isn’t a bill that’s going to solve the problem; it’s going to cause other problems. Just because there’s a problem doesn’t mean that the thing that’s proffered as the solution is the right one.
Nellie: Renee, that was a great question. Thank you both so much for coming and for having this conversation.