One Year Later: What Have We Learned About Russia, Facebook and Our Elections?

By NewCo Shift Forum 2018

John Heilemann, Renee DiResta, Laura Rosenberger and Roger McNamee discuss the future of democracy in an era of unrestrained social media.


One year after the Shift Forum convened one of the very first conversations around the role of Russia in disrupting the US electoral process in 2016, moderator John Heilemann once again convenes a panel of experts to plumb what we know about the story. What was revealed is both fascinating and deeply disturbing.

John Heilemann: All right, guys. We have the third in our series of political and policy-related panels this morning. This is the one that I have been most looking forward to, because I think it gets to one of the issues that everybody in this room has been thinking about, that everyone in the country has been thinking about, and that, not long ago, no one in the country was thinking about.

We talked about it a little at this conference a year ago. The year before that, no one would have bothered or thought it was an issue. This is the issue of Russia and social media. We’ve had some discussion at this conference already about this, but this is a panel designed specifically to talk about it.

Roger McNamee’s over there. Everyone in this room knows who he is, the investor famed for his investments, now famed for having abandoned his investments to become an evangelical crusader on this issue.

[laughter]

John: We will talk more about that in a minute. Laura Rosenberger, who, with Jennifer Palmieri, worked at the Clinton campaign. She is now involved in a variety of things, but most interesting for us here, she is tracking the activity of Russian trolls, bots, and Russian-related social media actors who are doing bad things to our democratic discourse.

And Renee DiResta, who’s the biggest brain in the academy, thinking about these dynamics in the world of social media.

Thank you guys all for coming here. I’m going to start with you, Laura, on this question. I want to ask this. Even before the Mueller indictments came down, we said we were going to start this panel by saying, “OK, what do we know, at this moment, about what happened in 2016?”

I’ll ask in a second about what’s going on now. I want to look at 2016, because, obviously, it’s a huge issue in our democratic discourse. What did Russia do in our election? We don’t know the answer about collusion. We don’t know a lot of things.

Because of all of the discussion we’ve had over the last year or so, and particularly now that we can look at the Mueller indictment, which has a staggering degree of detail, we now know a lot more about what Russia was doing in the realm of social media to try to influence the election, both in favor of Donald Trump and in favor of discord, than we knew even two weeks ago.

I want you to talk about that. Right now, the state of our knowledge about what happened in 2016 with Russia and social media is…

Laura Rosenberger: Partial and scary, but I think still not complete. What I would say is a couple of things. One, a year ago I think there were a lot of people who suspected that there had been an enormous amount of activity on social media coming from Russia aimed at influencing the political discourse in this country.

There were people like Renee and her colleagues who had been doing a lot of work on this, actually, for quite some time and trying to bring attention to the issue.

In the past year, what we’ve seen, between the Congressional investigations and the Mueller indictments, is an enormous amount of information about the use of fake accounts, fake personas, bot networks, and political advertising to try to influence the political discourse around the election.

I would make a couple of points. I know that we’re going to focus the conversation here on the social media piece of it, but one thing I think that is important is sometimes we hear a lot of conversation right now about, “Well, did this really change the outcome of the election,” and, “How do we know if it actually influenced things?”

It’s important to bear in mind a couple things. One, the social media activity was one piece of a broader toolkit that Russia was using to try to influence our politics, sow discord, etc. The Mueller indictment actually touches on a few of the things that jumped off of the social media platforms.

Some of that’s the use of classic espionage tactics: sending agents to the United States to try to get a better understanding of our political dynamics, how they could find wedge issues, and where the right areas to target were. There were the hacks of the DNC and John Podesta’s email, which I know were discussed [laughs] a good bit last year as well.

That information was weaponized and then used and released in the social media campaign, but also elsewhere. We know that there are investigations of potential funding that may have been funneled through groups like the NRA that Mueller is looking into.

It’s important to bear in mind that this was one piece of a broader puzzle where I think we still don’t know the full scale of what we’re actually talking about.

John: Renee, when you read the Mueller indictment, your reaction to it was what?

Renee DiResta: I thought it was remarkably thorough. I was very impressed with how much they managed to meticulously document and put out there. That was something that hadn’t been conveyed to the public previously.

There has been a lot of challenge for those of us who have studied disinformation for years now, because Russia is not actually the first actor to do this. As we’ve looked at things like the Mueller indictment and the laying-out of tactics, one thing I was surprised by, that I actually didn’t know, was the extent of the in-person communications.

We knew there had been some communications on Messenger, but seeing the extent of them, the extent of the sophistication around the money laundering operation, the extent of the influence they managed to have in the real world, getting people to show up to protests. We knew about one or two of those, but seeing the extent of it was remarkable.

John: Roger, as soon as the Mueller indictment came out, there was a secondary discussion that played out in social media when one of the gentlemen at Facebook, a senior executive who deals with the ad platform, came out and said something that seemed to buttress the Trump argument.

He said, in essence, “Hey, you know, I’ve looked at all these ads that we ran on our platform. I know about this more than anybody else does. Here’s the deal. These ads, I looked at them. They weren’t meant to help Donald Trump, mostly. They were meant to sow discord and discontent.”

Essentially, implicit in that was that Trump is right. This wasn’t a pro-Trump campaign. There was stuff going on here, but it wasn’t a pro-Trump campaign. There’s been pushback on these points, a lot of it.

When you look at 2016 at what we now know happened with Facebook and social media, what do you conclude about the advertising piece of it and the non-advertising piece of it?

Roger McNamee: When I saw the indictment, I actually had one question that hasn’t come up yet, which is: to what degree did Facebook provide advertising support services to whoever was placing ads around the 2016 election? Because if you’re an advertising customer on Facebook, you’re entitled to all kinds of services.

How did they find out about the various tools? How did all of that happen? What was the level of consciousness inside Facebook for what was going on? Because it was non-zero.

When I looked at Facebook’s response to all this, for reasons I cannot explain and which really disappoint me, they have approached this with a deny, deflect, dissemble strategy, as opposed to recognizing that they could be a hero in this story by accepting that their users have been harmed and that they played an unintentional role in making this happen.

As I looked at it, relative to the question of whether all the Russians did was sow dissent in the United States and try to drive us apart, or whether they actually backed Trump: to my mind, that’s not actually relevant.

The fact that their tools were used in an espionage campaign is a horrific situation. The fact that they’re sitting there pretending like that’s OK is morally repugnant.

These are people I trusted. These are people I have supported for a very long time and whom I tried for months privately to help come to grips with what I thought was an existential challenge for the company. I still do not understand the way they’re looking at this.

To me, whether you liked who won in 2016 or not, the next time, anybody can do this. The platform is still undefended. Next time, all you have to do is sign in with a sock puppet named Igor, and they’re just going to blame it on the Russians.

Why won’t the Chinese, now the largest advertiser on Facebook in Asia, get involved in this? Why won’t the North Koreans? Why won’t the Iranians? Why won’t everybody running for school board essentially use this platform, which is designed at its base to manipulate the emotions of the people who use it?

John: You think about 2016. Now let’s go forward a little bit and think about what’s happened in the year since. One of the things we see is a lot of activity. It has not stopped. The Russians, and others, but let’s focus on Russia here, have not stopped trying to play in our social media sphere in 2017 and 2018.

Laura, at Hamilton68, you guys built a dashboard. You can go online. You can look at Hamilton68. At any given point, you can have a window into what Russian-affiliated accounts are doing online. You guys track those in a variety of ways. Hashtags are one of the ways you do it.

I want to put a couple things up here on screen from a deck of yours. Put up the geopolitics, this slide here. We’ll talk about what’s on this. I’m going to show two slides, just to give a sense of what you guys do and also a window into things that have been happening subsequent to the election. Just talk about what we see up here on the screen.

Laura: What we see up here on the screen is actually a pretty classic take of what you would see if you log onto this website, which is a public-facing dashboard that’s tracking the messages that these accounts are pushing.

That’s important to understand. It’s not saying this is the broader impact it’s having on the full Twitter conversation, or that these things wouldn’t be being discussed if not for this activity. It gives us a window into what these networks are trying to drive.

What we see here, and actually what we see an enormous amount of the time, is that the networks will use things like a trending topic or a wedge issue. In this case, what you see in the middle, in the purple, is Pocahontas. Most folks will know…It’s funny. When I do this for European audiences, it totally does not sink in…

[laughter]

Laura: …when I try to explain that this is the derogatory nickname that President Trump has used for Senator Elizabeth Warren. This was after one of his rants about Senator Warren, using the term “Pocahontas.”

What we see then, in conjunction with that, the network is pushing messages about Syria and Ukraine at the top. Donbass is, of course, the part of Eastern Ukraine that Russia has occupied.

White Helmets, at the bottom, is the humanitarian organization that has been working on the ground in Syria to try to save civilian lives, which the Russians have actually gone after and tried to smear as a sort of US, CIA kind of outfit. Conspiracy theory stuff.

Bottom line being, they’re using a wedge issue to inject themselves into the conversation, a hot topic, and then promote some Kremlin narratives and views about geopolitical issues. Again, trying to influence the way Americans think about these kinds of questions.

John: I want to get up one more slide, and then I’m going to move on to a different topic, but here’s a good example. Before Parkland, which we’ll talk about in a second, this was a fantastic example of a place where, suddenly, there was a huge amount of Russian-affiliated activity.

This is ReleaseTheMemo, in January, just a month and a half ago, when Devin Nunes, the Republican chairman of the House Intelligence Committee, put forward a memo that supposedly would undermine Bob Mueller’s investigation, the FBI, the intelligence community, and their attempt to investigate collusion between Donald Trump and Russia.

This, then, happened. This is a snapshot. I’ll just talk about this for one second. Laura, and then we’ll talk about Parkland in a minute.

Laura: What we see here, again, is this network deciding that a particular topic of conversation is in its interest to push.

Now, again, that doesn’t say that this wouldn’t have been a conversation that was happening absent this activity. What it says is this Russian-affiliated network decided that it’s in the Kremlin’s interest to try to push this particular narrative.

That, we saw, was actually part of a months-long effort by this network to discredit the Mueller investigation, discredit the FBI, discredit the Department of Justice, and undermine those law and order institutions in our society.

John: Renee, I want you to talk about Parkland, because it’s been incredible, in the moments after Parkland, when we saw the families of victims and the classmates come forward.

Suddenly, there was this moment in national media where we were confronted with this new thing — media-savvy, emotionally resonant, incredibly articulate young voices. Then there was this countermeasure that played out in social media and in the mainstream media, but in social media, in particular.

I want you to talk about that because it’s the freshest possible example of how this is happening in our discourse now.

Renee: It’s the freshest possible example of something that happens time, and time, and time again. When we first did the speaker prep call for this, the hashtag was ReleaseTheMemo. Right after that, it was the stuff about the indictments.

This happens, literally, on a weekly basis, at this point. What happened with Parkland was the…What I think is important to understand is that this is a systemic manipulation problem. It is carried out on all channels. Anyone can do it.

We talk about Russia. In 2014 and ’15, we were talking about ISIS. Prior to that, I was working on conspiracy theorists. The framework is the same. The tools are the same. The ease with which you can make something trend is the same. The ability to manufacture consensus has carried on now for three or four years.

What we saw with Parkland was the bots on Twitter, the hashtags on Facebook, the groups on Facebook. Facebook groups are incredibly culpable. There are thousands and thousands of people and groups that spring into action every time there’s a mass shooting to spread the narrative that these are crisis actors.

YouTube is, by far, the worst offender on this front. There has been a thriving underground culture of conspiracy theory videos on YouTube, some monetized, some not, that have been spreading the narrative that everyone, from the parents of Sandy Hook victims on, you name it, is a crisis actor. Every single mass shooting, this crisis actor nonsense starts up again, and the platforms do nothing.

The challenge here is that we are now at a point where it is so eminently predictable that…the same way that…there’s that funny…Well, it’s not funny at all, but the Onion headline that they just recycle each time.

John: Recycle every year.

Renee: We can, at this point, recycle the headlines about misinformation and disinformation campaigns, because they are so predictable. You can also recycle the narrative that the companies are going to come out with which is, “We don’t want to be the arbiters of truth. It’s too hard for us to detect this.”

Well, we gave them a strike, and this is where we get to the point where we’re actually stuck in a rut. The only thing that can get us out of this is the platforms taking responsibility for what they’re hosting.

John: This is great. Thank you for that transition.

[applause]

John: Roger, I want you to talk about this. That was a great transition, because it led to the question I wanted to ask, but I want you to do a little slicing and dicing here.

All of the social platforms were invited to be on this panel, by the way, and all declined, notably, when we said we wanted to talk about their response to this Russian problem and to the broader spread of fake news and conspiracy theories, etc. Some of them are represented at this conference, but not on this panel.

I’d like you to just talk with a little more granular detail about the differences among Facebook, Twitter, Google, YouTube, and Instagram. They’re broadly similar in their reaction to the scrutiny they’re under now, but there are some differences in terms of how they’re reacting to this new public pressure to deal with the problem.

Roger: My reaction to their reaction is to recognize that each one of them is culpable in a different way. They each play a different role. On the political front, Facebook, Twitter, and YouTube stand forward. Relative to children, Instagram, YouTube, and Snapchat play a much, much greater role.

They have uniformly taken the position that they are protected, either by Section 230 of the Communications Decency Act or by something larger: that they’re a platform rather than a media company, and therefore not responsible for what goes on on their platforms.

John: We’re a newsstand, not a magazine.

Roger: Well, I look at this and I think to myself, “If you were not running an uncontrolled social experiment, a psychological experiment, on 2.1 billion people, I would be more open to that notion. But 2.1 billion out of 7 billion is a social experiment that, without a control group, without any protection, has produced really horrific outcomes.”

It’s easy to see how they got there. Around 2000, we went from a world where you needed rocket scientists, because we lived in a world of limited technology resources, to a world where suddenly we had enough bandwidth, processing power, storage, and all that to do whatever you wanted. It became possible to hire companies full of people fresh out of college without any experience.

They proceeded forward, like they knew all the answers, like history didn’t matter, like nothing that anyone else had ever experienced was relevant, and that they were free to act, to disrupt, if you will, without being responsible for the consequences.

John, the answer to your question is that I believe they have all reacted horrifically, that they’ve done the bare minimum to appear to be responsive without actually doing anything substantive about the problem.

They all have all the tools they need. For most of them, it’s going to require some change in their business model, some re-prioritization. That does not need to be harmful to them economically.

Candidly, at this point, I think they’ve lost the right to assert that this is too harmful to them. The harm they’re doing, in most cases, is so great that we need, as users, to put pressure on them.

[applause]

John: I want to have enough time for some audience questions here. Very quickly, I’m going to give you each about a minute to answer the same question. I’m going to start down here. We’ll move down the line.

This is a big problem. On one end of the spectrum, there’s government regulation. On the other end of the spectrum, there is self-regulation. In the middle somewhere is collaborative work between government and industry to solve this problem.

If you had to sketch out not what you think is going to happen but what you think should happen along that spectrum, what does that look like to you as we move towards 2018, where people are all freaked out about the midterms…

We’re going to see just more of this. There’s no reason to think this is going to decelerate rather than accelerate on its own. Again, along that spectrum, where do we need to get to?

Renee: I think, pragmatically, by 2018, regulation is not going to happen. I think it’s actually a lot of social pressure and media pressure that have shifted the narrative over the last year. The fact that they did take down the most recent Parkland crisis-actor video is evidence that, with a sufficient amount of pressure, they will do something about this kind of content.

John: Forget about the time frame. I was just using that as a sense of we know that there’s going to be more that’s going to happen. We’re going to be talking about this in 6 months, 9 months, 12 months.

Do you think that the model for how to fix this is mostly a market model where public pressure on the private companies gets them to change without the government having to get involved in any significant way?

Renee: It gets the most rapid change, which means that 2018 will not be as much of a disaster as it’s shaping up to be right now.

John: Again, along the spectrum of regulation versus the market model, where do you think we should be?

Laura: I’m largely in the market model, too. I just want to give a couple of examples of where I think this can be effective. We are, as was discussed a little bit yesterday, seeing some good examples of the private sector expanding its thinking on corporate social responsibility, whether that’s on climate change or standing up to the NRA.

This is another area where, particularly because Facebook and Google are so dependent on advertising, advertisers have a huge amount of leverage. We saw Unilever, two weeks ago, I believe, make an announcement that if the platforms don’t begin to actually address the problems we are seeing, they’re going to pull their advertising.

That is actually the greatest source of leverage. Two other quick comments on this. I come out of the foreign policy world. For me, the comments that one of the panelists on the Club de Madrid panel made yesterday, about the implications of US government regulation on a worldwide scale and the way authoritarians can take advantage of it, raised an enormously important point.

I am very concerned about government overregulation. That’s why I actually think it’s so important for the companies to step up and take responsibility. Bill Gates recently made some points along this line.

We also need collaborative mechanisms between the companies and with private researchers. There is an enormous amount of good work being done, but there is not collaboration in a way that provides either transparency or accountability, or that actually lets researchers begin to help solve the problem.

The last point is I was involved in the government during the Snowden revelations and saw the enormous rupture that happened with Silicon Valley during that point in time. We have got to close the trust gap between the Valley and government, particularly the intelligence community.

This is not a challenge that can be solved by one or the other. It has to be done together in collaboration. It is time that we actually recognize that.

Roger: Two quick things. Relative to the election this year, the most important thing is that everyone vote. We need to set record turnout, because the goal of these campaigns is to energize one side and then depress turnout on the other side. If we maximize turnout, we beat this thing in the short run. It doesn’t solve the problem, but it helps with 2018.

Secondly, I’m very concerned about the market solving this. My goal is to appeal to the employees of these companies and say, “Which side of history do you want to be on?” We’re looking for Daniel Ellsberg. We’re looking for Susan Fowler. Contact me.

It is really important that we solve these problems from the inside. Do you want to be on the side of destroying democracy? Do you want to be on the side of destroying public health for our children? In which case, keep doing what you’re doing. Otherwise, come and join us.

John: When Daniel Ellsberg contacts you, call me.

[laughter]

Roger: We will create a legal defense fund. Thanks, John.

Chris: The one thing I’m shocked by is that at the last Web 2.0 Conference, we had Yuri Milner on stage. Yuri Milner put $1.2 billion privately into Facebook and Twitter.

Those funds largely went to early investors and founders: Mark Zuckerberg, Jack Dorsey. We’ve got to get off this “we’re good, they’re bad, Russia is the bad actor” framing. We’re so fucking dirty in this as a community.

I went to Twitter before they took that money and said, “That is blood money. 165 journalists have been killed, shot dead, in Moscow and other places in Russia, because Mail.ru, and all the information on it that Yuri Milner knew about, was given to the Russian government so they could figure out who was trying to undermine them.”

They killed them. We took that money. We all took that money. We’re sitting here now in 2018, saying, “How did this happen?”

John: I love you brother, but let’s have a question.

Chris: The question is this. “The Guardian” has connected all of this. Yuri Milner got money directly from the Russian state and invested it into Facebook, invested it into Twitter. That went to the founders.

You want to know why these guys aren’t doing anything? They’re dirty. Roger is just not saying it, but you know it. We’ve got to call truth. If we’re going to get truth between our folks, and you’re right, we have to have trust, we’ve got to be honest about where the money is going and why. Jared Kushner was one of those investors, by the way.

[applause]

John: Chris’s question is: Roger, would you like to call bullshit right now?

Roger: I thought that was an eloquent statement.

John: No need for a question. Let’s do these two. Let’s get them in quick.

Gordon Crovitz: Thank you, John. Gordon Crovitz, co-founder of NewsGuard, which is launching next week to be a part of the solution to this problem. You talked a little bit about pressure points on Facebook and Google to get them to do the right thing.

We heard users and advertisers. Is there any precedent of a previous problem, one that was this obvious and this hard for them to solve on their own, and understandably it is hard for them to solve on their own, where one of those interest groups was highly effective?

John: Anybody got an answer to that?

Roger: I’m not familiar with any problem that any of these companies have actually had to solve at any scale in the last five years. It’s been an expressway with no pebbles on it. I think part of the reason they’ve handled this so badly is they don’t have the muscles for dealing with conflict.

John: Is there an example of a similar problem, not one that these companies have had to solve in that time frame, but an earlier problem, either in the tech/media space or in the history of publishing, that presents this kind of…

Roger: The Tylenol situation for Johnson & Johnson actually provides an almost perfect example. When I went to Facebook, starting in October of 2016, I was pointing out to them, “You guys could be heroes. You could be heroes in your own story.”

The key thing is to recognize that you do actually have an obligation to users. They are not fuel for your profits. The reason Unilever is so angry is because they recognize that their customers are being harmed by this.

They are advertising on this platform where they’re being harmed. Eventually, all advertisers are going to look at this and go, “What the hell are we doing?”

John: Next one up. Sorry, I just want to make sure we get this done because again we have another speaker. Yes, sir.

Audience Member: Someone on the panel alluded to this earlier, but my question is about actually apportioning blame, figuring out who’s actually behind it, which to me sounds like the really scary part of this.

If someone was doing this, one way to do it would be to do it and make it look like someone else did it. That becomes self-serving, whether it’s President Trump saying, “It could be a 400-pound hacker there in the basement.”

Or Putin saying, “Well, it wasn’t me. It could be the Chinese. It could be someone else.” The intelligence agencies were telling us, “Well, these are the people who definitely did it.”

They can’t tell us exactly how they know, because it’s classified. Are we at a point already, or could we be at a point in the future, where it will become truly impossible to know who did it? How certain is the information, and will we ever reach that point? That would be truly scary.

Renee: I think attribution has been a huge challenge. It’s a huge challenge for the platforms. This actually speaks to Laura’s point that, following the Snowden revelations, there is not very much information sharing happening.

That’s very much a cultural thing. I’ve been doing some looking back on statements that people made related to ISIS…sorry. I know we’re out of time.

John: That’s all right. Go ahead.

Renee: I found in the EFF archives arguments that it’s not the job of tech to detect terrorism on their platforms.

You look at this cultural conversation that’s gone back years now. What we’re seeing is actually the carry-through of that attitude: the reluctance to take responsibility, passing the buck and pushing the blame, and the fact that the intelligence agencies have some thoughts about it.

The platforms have a phenomenal amount of information. There’s no transparency. We have no insight at all about anything that’s happening in there, because they have no outside researcher relationships. We’ve been asking for this for months now. We’re not really getting anywhere.

John: Roger, we have to get off the stage right now. You asked the question at the top about the degree to which Facebook worked with various entities. I don’t think we, at this moment, know very much about some of the interactions that they might have had with Russians.

I will say this. There was a great piece in “Wired” about this the other day. The Trump campaign was much more successful because they understood Facebook better.

Part of the reason they understood Facebook better is they did something that the Clinton campaign did not do, which is that they moved Facebook executives into the home of their digital advertising operation under the auspices of a gentleman named Brad Parscale.

Today, Donald Trump announced that his new campaign manager in 2020 will be Brad Parscale. Thank you very much guys. We’re getting off.

Roger: I just want you to know. If we get saved, John, these two are the ones who are going to do it. I’m serious.

[applause]
