Tech Refactored

Summer Staff Favorites: Joshua Tucker on Social Media and Democracy

July 29, 2022 Nebraska Governance and Technology Center

Tech Refactored is on a short summer vacation. We can't wait to bring you Season Three of our show beginning in August 2022, but as we near 100 total episodes our team needs a beat to rest and recharge. While we're away, please enjoy some summer staff favorites. The following episode was originally posted in April 2022.

On this episode Gus is joined by Joshua Tucker, Professor of Politics and co-Director of the Center for Social Media and Politics at New York University. Joshua joined our center earlier this month for a guest discussion of his recent book with our faculty fellows. That conversation is explored more broadly today as we focus on his work on the effects of social media on democracy and his recently published book, co-edited with Stanford's Nathaniel Persily, Social Media and Democracy: The State of the Field. Join us as we travel through Russia, Facebook, and the state of our democracy.

00:07 This is Tech Refactored. I'm your host, Gus Herwitz, the Menard Director of the Nebraska Governance and Technology Center at the University of Nebraska. We're joined today by Joshua Tucker, professor of politics and co-director of the Center for Social Media and Politics at New York University. Our discussion will focus on his work through the center on the effects of social media on democracy, and his recently published book, co-edited with Stanford's Nate Persily: Social Media and Democracy: The State of the Field.

00:43 Welcome to the show, Josh. Thanks Gus, it's a pleasure to be here. You're doing so much really incredible work, uh, both individually, uh, with your co-authors and through the center, I'm really looking forward to jumping into and exploring both the substance of the work and also a little bit about how you actually do it because it isn't easy what you do. 

01:01 Can you, uh, start just by telling us a little bit about, uh, the Center for Social Media and Politics? Sure. And, uh, thanks first off for the very kind words, they're, they're super appreciated. Um, the Center for Social Media and Politics in a nutshell was an attempt to bring a kind of natural-sciences-style lab to the social sciences.

01:21 Um, and so generally in the social sciences, we don't have labs, especially not in political science. Uh, and so the Center for Social Media and Politics was trying to sort of take the best that labs had to offer and see if we could import that into the social sciences. So we have actually multiple PIs; we have, uh, two of us who are from political science, but one who's a

01:39 biologist / computer scientist. We have, uh, two full-time research engineers, we have, uh, six post-doctoral fellows and, uh, we have at any given time, you know, half a dozen to a dozen, uh, PhD students who are working with us in some capacity. Uh, our work that we do at the center is done in a kind of lab based style where we have, you know, where we have dedicated resources to things like data gathering and fundraising and storing data and developing analytics.

02:05 But then we have all sorts of different teams that are set up to work on, on lots of different projects. So everything's done really collaboratively, uh, and together. And so that's the sort of- how of what we're doing at the, the center for social media and politics. And it, it, it's obviously just the way you describe it, an interdisciplinary initiative.

02:22 You, you started your career studying Russia. Uh, can you explain a little bit about your path into this area and how you ended up doing this work? Yeah, sure, absolutely. So a couple of things happened, well, actually three things happened simultaneously. The first was, as you said, my career before I came to social media was on, uh, post-communist political behavior.

02:44 So I was interested in how citizens interact with the political sphere, but in post-communist countries. And I had done a lot of work on the colored revolutions, which were a series of, um, protests against electoral fraud in Ukraine and Georgia and, uh, Serbia, and eventually in Kyrgyzstan,

03:02 and, uh, and so that got me in contact and discussion with a lot of people who were studying protest in the area. Uh, and one of the things I heard at all of these panels when they were taking place was that this was not gonna happen in Russia. Yeah. Maybe Ukraine, but the colored revolution was not coming to Russia.

03:16 And then one day in 2011, we woke up and there were a quarter of a million people on the streets in Moscow, protesting electoral fraud. Uh, it didn't end up being a successful colored revolution in the sense that the regime stayed and the election results were not overturned, uh, but I started asking people what was going on.

03:32 And the answer I kept hearing was Facebook; people were saying this had been planned on Facebook. And I began to think seriously for the first time that if I was going to try to understand what was happening in terms of people's participation in politics, and if social media was going to become an increasingly important part of that conversation, of the explanations, and an important part of what was happening, I needed to begin to educate myself about it.

03:59 Now that happened concurrently with something else, which is that I had an absolutely brilliant PhD student who came to me with a paper idea for a class he was taking with me on parties and partisanship, suggesting that we could potentially identify or get an estimate of people's partisanship or ideology based not on the text of their social media posts, but on the networks in which they had immersed themselves on social media.

04:23 And I was aware of this emerging field in political science that's known as text as data, trying to think about turning text into actual data, which if you think about it dramatically increases the amount of data that we can bring to bear on any question, cuz there are way more words than there are numbers out there about anything in which we're interested.

04:40 But at that time, the idea that you could predict someone's ideology from their (indistinguishable) was a field that was kind of in its infancy, seemed like a bit of a pipe dream, but the student showed me that there might be a way to do this with networks, and it turned out his methods were actually quite accurate.

04:56 I began to think seriously for the first time that social media data might begin to change the way that we could do social science research. So these two things kind of came together; I became interested in social media as a variable, as something that could explain political outcomes, but I also became convinced that social media data would allow us to study

05:20 politics in new and exciting ways. And that's exactly what we do. I told you how the center was set up and how it works, but that's what we do: we study social media's impact on politics, but we also do research that uses social media data to study politics in new and exciting ways. And you said there was a third thing that had come together.
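The network-based estimation Tucker describes here infers ideology from the accounts a user follows rather than from the text they write. As a rough illustration only (toy data and a plain singular value decomposition, not the Bayesian spatial model actually published), one latent dimension recovered from a user-by-elite follow matrix can separate users by which bloc of elite accounts they follow:

```python
import numpy as np

# Toy follow matrix: rows = ordinary users, columns = political elite
# accounts. A 1 means the user follows that elite. Elites 0-2 lean one
# way, elites 3-5 the other; users 0-1 follow mostly the first bloc,
# users 2-3 mostly the second, user 4 follows a mix of both.
follows = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 1],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 0],
    [1, 0, 1, 0, 1, 0],
], dtype=float)

# Center each column so the decomposition captures deviations from
# average following behavior rather than raw account popularity.
centered = follows - follows.mean(axis=0)

# The leading singular vector assigns each user a score on a single
# latent dimension; with politically structured follow data, that
# dimension tends to track left-right ideology.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U[:, 0] * s[0]

# Users who follow the same bloc of elites should land on the same
# side of the latent dimension; the two blocs should land on
# opposite sides.
same_bloc = scores[0] * scores[1] > 0 and scores[2] * scores[3] > 0
opposite_blocs = scores[0] * scores[2] < 0
print(bool(same_bloc and opposite_blocs))
```

The toy matrix just shows the mechanics; the real method works on millions of follow decisions and a curated list of elite accounts, and validates the recovered dimension against known measures of ideology.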

05:39 Oh, do you want to riff on that? Sure. The third thing was that I happened to be serving on an NSF panel in political science at the time, and the NSF program director mentioned that they had a new call for proposals, um, for a new program that was sponsoring outside-the-box, interdisciplinary research. So what went on to become that grant from the NSF INSPIRE program was actually what gave us our start.

06:07 And, you know, as an academic, you spend a lot of time saying, oh, this sounds like a cool new program, I'll put an application in. And I had gotten plenty of grants that went nowhere, but this one fortunately was funded and that gave us the beginning of what would eventually become the Center for Social Media and Politics.

06:21 So that was fortuitous, but it also shows that these moonshot-type ideas they come up with at the NSF can lead to really interesting stuff. Yeah. So we'll definitely come back to some of that; it's hard as an academic doing a lot of stuff to suddenly say, oh, there's this entire new field I need to pivot to because that's the interesting stuff, and to bring the NSF along.

06:42 We'll definitely come back to that, but we should, uh, spend a moment talking about your book. You recently published this co-edited book on social media and democracy with, uh, Stanford's Nate Persily. The book is of course exceptionally timely;

07:02 however, you've been working on it for years. So I'm just wondering if you can think back to your proposal for the book, let's say three years ago, and tell us what made you want to undertake the project at that time and how your thinking about the topic has evolved over the intervening years.

07:21 Fantastic. Yeah. Happy to do so. I do wanna say first though, in response to your other comments, if I had had any idea, when I first applied for this NSF grant, that it was gonna radically change the trajectory of my career, I probably never would've done it in the first place. So it's one of these things that was, uh, incremental.

07:36 I had a question I was interested in, but it ended up having a huge effect. Um, the book came about because I had helped the Hewlett Foundation prepare a report on social media, uh, political polarization, and misinformation and disinformation. Um, and I followed a kind of similar model, which was, I got a lot of really

07:55 smart people, parceled the topic up into a few discrete components, and invited them to write sort of short literature reviews on each. And meanwhile, Nate had, uh, a grant from, I believe, the Hewlett Foundation, and Nate wanted to sort of, uh, culminate that grant with a book on where we were going with this, a state of the field.

08:13 And so he and I started talking about this Hewlett report that I had done, or that I had coordinated for the Hewlett Foundation. And while we thought that the Hewlett report was great, those were like six-, seven-page sort of short essays, and we thought, wow, could we expand this? And of course, as you note, the problem with any lit review in this field of digital media and digital democracy, which is just

08:34 expanding exponentially and changing radically, is that as soon as you get something out, it's out of date. Um, and so, you know, we thought, okay, the Hewlett report was out, but maybe we could go back and give people a chance to sort of expand on it, and in doing so we changed around some of the chapters.

08:50 We got some different people involved, but the key thing that came out of it is that Nate was interested in getting sort of a summary from people who were doing interesting work on policy proposals for dealing with social media as a kind of industry, as it exists, and we can talk more about that later.

09:06 And the Hewlett report had been kind of this set of lit reviews on where the field was, where the literature was at this point on a number of discrete topics, which are by no means all the topics people could have asked about, but they were sort of things that the foundation had been interested in originally.

09:19 And so we had the idea to try to bring these things together. And the thinking was that these two groups of people, the people doing the research on the kind of basic science here, the data scientists and the people who were sort of pushing the envelope in that direction, didn't necessarily think that much about the policy implications of what they were doing.

09:36 And then you had this whole field of people who were thinking very seriously about policy around social media platforms, but didn't really know much of what was going on with the basic science. And the basic science here is difficult. It's complicated, it's hard. And, uh, it's a really weird field because there's a never-ending supply of pundits who are happy to step in and sort of say, oh, this is definitely what's going on on social media,

09:59 cuz I saw my 14-year-old kid on YouTube and they were looking for Marvel universe videos, and they ended up with a flat earth video. And now I know how the YouTube algorithm works. But actually interrogating the YouTube algorithm is really hard work that takes a lot of people and a lot of time.

10:13 And so we decided, wouldn't it be great if we could kind of bring these two together in one place? And hopefully the people who came for the policy would read some of the stuff that was in the basic science. And then some of the folks who were interested in the basic science, maybe they would stick around and see some of the policy stuff

10:27 if it was in one place. And then the closing chapter that we wrote was to sort of try to mesh these two together, not by summarizing what we had learned from the chapters from the amazing authors that we had in this, but to actually talk about data access, which sort of straddles

10:44 the basic scientific research and the need for policy in this area. That is such a hard issue that you've identified there, and it echoes the earlier discussion we started about pivoting and changing your work. It's hard to get, uh, academics and researchers with expertise in a research agenda in one area

11:04 to really meaningfully embrace and engage with and learn from and change what they do from their interactions with other scholars, uh, folks who come at hard questions from different directions. Um, to the extent that was one of the goals with the book, how well do you think it worked, and what did you do to facilitate that?

11:25 Um, I mean, I've had a lot of, you know, experience with interdisciplinary work and thinking in terms of interdisciplinary work, uh, through the center itself, you know, where we have PhD students from multiple different fields, where one of our PIs comes from biology and two of us come from political science,

11:40 and so it's definitely been, you know, an interesting road to sort of blend those things. In the book per se, I mean, I think we gave each of our authors a sort of broad purview to tell us what they thought the state of the field was in their area. And what's interesting in this regard is the question-

11:59 and this has been raised in particular, I think, by people at the Knight Foundation, who very generously made a large injection of funding into this field a couple of years ago with the Knight centers and the Knight network that they've helped to set up. But there's this question of, "Is there a new field emerging?"

12:15 Is there a field that, you know, do we wanna call it digital democracy? Do we wanna call it digital politics? Do we wanna call it politics in the digital information age? What exactly is this? And it draws people from lots of different areas. I mean, going back to the story of when we first got this thing going, what was so interesting, when we went to think about applying for that first grant and I was like, I need to figure out a little bit about what's happening with social media:

12:40 the papers that I found then- and this was back in like 2011, 2012- about politics were being written by computer scientists, by physicists, by mathematicians. Why was that the case? They knew how to work with this kind of data, but they didn't really know anything about politics.

12:59 And one of the things about politics is everybody thinks they know about politics, because politics surrounds them. But you can know a lot about politics and not know about political science and not know about theory. And in fact, our pitch to the NSF was, look, if we don't get some social scientists into this field,

13:13 we're gonna have this whole new area of knowledge about politics being produced by people who don't know about theory and about political science. And we were proposing that we would teach people the skills that they needed to do this kind of research, but start with some of the folks who actually came from a social science background.

13:31 Um, so I mean, I think that, to me, that was super interesting; that was my first exposure to the field. Now, today, there are many more venues for people to be talking to each other. Um, and you have, you know, we've had in our lab a postdoc with a computer science PhD.

13:48 Right. You also have lots of people from the field of communications. You have people from the field of political science. We've had people from social psychology in our lab and have co-authored with social psychologists. We have a sociology student in our lab, and we have, uh, data science PhD students.

14:01 Uh, so it is an interesting question. You know, some people have gone back and looked at the history of computer science, which itself wasn't a field at first, right? There were people who started using computers in chemistry, people who started using computers in biology,

14:19 and then eventually computer science emerges as a field. This is what's been going on with data science in particular, I think. But then the question is, will this kind of be an offshoot of that, in the sense that we will have a kind of new field here? To that extent, I think our book has more of a flavor of political science questions and law questions.

14:34 That's why those are the people who co-edited the book. And if you had two other people editing a book on social media and democracy, you might have actually seen, you know, different directions that people would've gone, and that would've been influenced by the fields they came from. We've been talking about your book,

14:48 I have to ask you a question about the book, and I'm going to ask you my most obvious question- the name of the book is Social Media and Democracy; the question is, is that the right title, or should it be Social Media or Democracy? Yeah. A good question in that regard, and I'm getting that more and more right now.

15:06 Um, for that I would push back to an article that I published a few years before this, um, with co-authors, uh, Yannis Theocharis and Molly Roberts and Pablo Barberá, in the Journal of Democracy, that was called "From Liberation to Turmoil." And what we tried to do in that article was answer exactly this question, which is:

15:24 you know, when social media first burst on the scene, when I got interested in it because of protest in authoritarian regimes, it was famously called "liberation technology" by Larry Diamond. There's actually an article in the Journal of Democracy called "Liberation Technology." This was gonna lead to authoritarian regimes crumbling around the world,

15:40 cuz finally there was a communications technology that could not be controlled by authoritarian regimes, that people would be able to use to get organized, to share information in these states. And then six years later, Nate actually published an article in, uh, the Journal of Democracy

15:57 that was called "Can Democracy Survive the Internet?" And so our piece asked: how can social media be both liberation technology and something democracy might not survive? And the argument we make in that paper- cuz we could have said, look, one of them was wrong, it's one thing and not the other- but the argument we make in the paper is that you can actually explain

16:16 how it can be both of these things by making two simple assumptions. The first is that social media gives voice to people who are excluded from access to mainstream media. And the second is despite the fact that social media democratizes access to information, and I think that's undeniable, it's still a tool that can be used for censorship.

16:36 And what we do in the article is that we sort of walk through: what does this mean in the context of authoritarian regimes, where the people who don't have access to mainstream media are pro-democracy activists? What does it mean when authoritarian regimes begin to respond to online opposition and seek to control the conversation online through things like bots and trolls?

16:56 What does it mean in democracies, where some of the people who don't have access to mainstream media are progressive voices? If you wanna take a sort of Chomskyan, media-as-an-element-of-the-ruling-class type approach, you might imagine progressive voices are excluded from mainstream media, so you can get things like

17:14 Occupy Wall Street, and Black Lives Matter, and the Me Too movement. But in the same respect, we also know that in established democracies, mainstream media does a pretty good job of excluding really anti-systemic, anti-democratic forces from access to mainstream media. So in democracies, the same tools that help pro-democracy activists in authoritarian regimes get organized.

17:34 These can help anti-democratic forces in democratic systems get organized, and they can draw on some of the tools that these authoritarian regimes have developed, like bots and trolls, to amplify their voices. So at the end of the day, we argue social media is neither pro- nor anti-democracy. It's another tool that can be used by people in their contestation for political power, but it's a very fast-moving, rapidly changing tool.

18:00 So it leads to these very much cat-and-mouse type dynamics. We are talking with Joshua Tucker, a co-editor of the book Social Media and Democracy: The State of the Field. Uh, we'll be back in a moment to continue our discussion with him.

18:18 I'm Elsbeth Magilton, the executive producer of Tech Refactored. We're so happy you're listening today. We love producing exciting and engaging content for you. The best way to help us continue making great content like this episode is word of mouth. We hope you tell all of your tech interested friends about us and encourage them to listen.

18:34 Should we sweeten the deal? Follow us on Twitter at UNL underscore NGTC and tweet about the show and we'll give you a shout out at the end or during a break on a future episode. Just like Jacob Tevez, David Thaw, Brian Min Wang, Sarah O, and Tammy Etheridge, who tweeted about us this week. Thank you all so much for tuning in now back to our host, Gus Herwitz, in this episode of Tech Refactored.

19:02 And we're back, uh, talking with, uh, Joshua Tucker from New York University. Josh, I'm gonna put you on the spot and ask you to, uh, pick among friends. Your book is an edited compilation of essays with many contributors. Are there any favorite essays in there that you'd like to, uh, highlight for us?

19:21 Yeah. And I'm gonna give the academic answer, or maybe it's the press secretary answer, and not answer the question you've asked, but take the opportunity to tout some of the great work that's in there. If I was forced to pick, I would definitely, you know, make sure to tell everyone to read Nate's and my concluding chapter

19:35 on the importance of data access and lots of different, uh, proposals for how we can do this. But I do think, you know, as someone who comes at this from the basic scientific research side of things, having all of these policy chapters and having such distinguished authors talk about it is super interesting. On the part of the book that I was more involved in editing and commissioning, I would really push, uh, Pablo Barberá's chapter

20:03 on, um, political polarization, just because I think this is one of those cases where there's a real disconnect between what the academic research is finding and what the received wisdom holds. And Pablo, by the way, was the brilliant student who came to me all those years ago, for a PhD seminar, with a way to measure partisan attachment, or political ideology.

20:28 Uh, because I think there's a real sense in the broader sphere that, uh, social media has contributed to increasing levels of political polarization. And I think those of us who work in this field- while there are disagreements, and there are different people who will push different things.

20:45 The academic research, the academic evidence on this is much, much more mixed. You know, we know that social media usage has gone up, and we know that political polarization has increased in the United States, as it has in other parts of the world, but the link between the two is not quite as clear.

20:59 And Pablo does a really nice job in the chapter of going through and sorting out studies that show evidence of echo chambers online, but then showing a bunch of studies that push back on that evidence. I'd also recommend the chapter by Chloe Wittenberg and Adam Berinsky

21:13 on correcting misinformation, just because this is one area where, you know, we are less dependent on the platforms for doing at least the kind of basic scientific research. You can run lots of lab experiments to try to figure out how to correct misinformation, and there is a lot of research in this area that shows certain things that work well and certain things that don't. There's the famous backfire effect, which was identified as: you correct misinformation,

21:38 and then people remember just the misinformation. They don't remember the correction. But then there's been a backfire on the backfire effect: there was some important research on this, but it hasn't replicated, and now there's more of a sense that correcting can actually be helpful.

21:52 Um, so I think that chapter is a great read, because again, anytime you go into something thinking you know the answer, um, getting a sense of more of the complexity that's out there, uh, can be really useful. But I'd recommend the whole book. It's a good read. So I'm hearing a desperate plea, or I'm making a desperate plea, for, uh, our friends in Congress to listen to this conversation, uh, and recognize the complexity of, uh, all this.

22:17 And this is such an important, timely conversation. We're recording on, uh, I guess, Tuesday, March 23rd, and there's a hearing, uh, with some, uh, CEOs of big tech companies on these issues; it's a weekly congressional hearing on these topics. Uh, I guess, um, these are hard issues, and, uh, researchers are struggling, and that's what research is; research doesn't produce clear-

22:42 cut, yes/no binary answers. It's a dialogue. It's a back and forth, uh, it's corrections and, uh, verifying. Um, so, uh, I guess two questions. First- Before you get into the next question, can I jump in with one comment on that? Uh, yeah. Because one thing, you know, there's lots of different debates here about, you know, whether we have echo chambers, how strong the echo chambers are, these kinds of things.

23:05 But I think there's one point, as you talk about congressional hearings and we think about congressional legislation and regulation, um, that I hope everyone would agree upon. You know, there may be people who have different ideas from the right and from the left about potential solutions, um, for mitigating potential harms being brought on society by the rise of social media, but I would hope everyone would be in agreement that whatever policy proposals you wanna make, whether you come at it from a left-wing or a right-wing perspective, those policy proposals will work best if they are based on a sound scientific

23:37 understanding of what's actually happening on the platforms. Are there echo chambers there? Are they not? You know, I jokingly referred to the last chapter that Nate and I wrote about data access as my favorite chapter. But the reason we wrote that last chapter is because when you go through each of the individual chapters, you see how many of them end with, well, we've learned this one small piece of the puzzle, but here are 16 other questions that are still out there,

24:01 and we don't know the answer because we're not able to access the data. And so my hope is that, you know, there are lots of different opinions about what should be done in terms of regulation of these platforms, but what should underlie all of it is that we want a better understanding of what's happening.

24:17 And the only way that's gonna happen is if we allow people who are outside of the platforms, who don't work for the platforms, and who are not constrained by the platforms in terms of what they can share publicly- if we allow these people to conduct rigorous scientific research, and that involves all sorts of things, right,

24:35 to make it rigorous. But if we allow the people who are going to conduct that rigorous research to have access to this data, then what is going on can be learned, and it can be used for advancing our scientific understanding of the impacts of these platforms on society. And then it can be used to inform public policy.

24:52 So that's my PSA in the middle of this, and my plea to whoever is gonna be, you know, listening to and following the debates on Capitol Hill about it. Uh, well, you answered one of my two questions, but gave me a replacement. So, uh, my two questions still. Uh, I always like to ask- especially as you're talking about a range of perspectives and scholars working in this area and the need for

25:13 considered balance and looking at all the data out there- there are always people who disagree with any perspective. Uh, are there any folks with thoughtful takes who just outright disagree with you, that you are able to engage with, or who engage with you in a scholarly, uh, debate? Yeah. So there's a couple of different ways to respond to that.

25:32 Um, one, one way is actually, you know, interestingly enough, a sort of, uh, a critique of the book came out in the international journal of press and policy a couple of maybe yesterday, maybe two days ago. Um, and you know, and, and while being saying kind things about the book and what it did accused it of being sort of too focused on the type of research that we do at the Center for Social Media and Politics, and too focused on the kind of legalistic research that-

that Nate does, you know, the kind of work that people do at law schools, and said that you need to take a broader perspective if you want to understand social media and society. So one thing I would definitely do is call that out: this book, and the type of research we do at the Center for Social Media and Politics, tends to be data heavy.

26:14 You know, we are trying to study things at scale. We do things at a more micro level too, but it tends to be surveys and experiments. But there is incredibly interesting work being done, and we benefit tremendously from it in informing the types of research questions we ask, by anthropologists, by people who are doing much more qualitative research, people who are embedding themselves in online communities. Our neighbors down the street here in New York City, Data & Society; you know, Joan Donovan, when she was at Data & Society, kind of pioneered this approach of spending six hours a day in these online alt-right community groups and things like that.

26:46 So I think I would say we present a part of the picture, and we obviously think that what we're doing is important, and there's a lot of incredibly exciting research that's being done in this particular way. But I would urge people who are interested, who are coming to the topic for the first time, right?

27:03 There's the type of research that we're largely reporting on in the book. Although some of the chapters do get into other approaches, it's not like they're exclusively talking about quantitative research, a lot of the topics we had chosen lent themselves to this type of research, at least in the review chapters, if not the policy ones.

27:24 So I think that's one thing I would definitely say. And that would be my answer to broadening the view beyond what's in this book. And you had asked, you know, what would I recommend that people read? I would dig into some of the people who've done more qualitative research and more anthropological research,

and in particular looking at the differences, you know, trying to get an understanding from marginalized communities in society about how they're impacted by all of this. We give much broader bird's-eye views: what do we learn in the lab, what do we learn at scale when we look at this? But obviously, ultimately, this stuff affects real people, and it affects different people differently.

28:04 And so I think that type of research is valuable to take a look at as well. So I apologize, I've kind of pulled a quick one on you. I said we were gonna talk about one thing after the break and we're talking about something different, but we'll come to the fun stuff in a second. But I also want to ask, building on your observation that we're limited by the data that we have access to,

otherwise the old joke about the economist looking under the streetlight for his keys, because that's where the light is, comes into play here. Uh, a little bit of a discussion about how you get access to the data. I know you've cooperated with Facebook, and there's been a lot of discussion recently about how that affects research.

Uh, and also, you've mentioned funding from the (indistinguishable) foundation, the Knight Foundation, in addition to the NSF. I should say, our center here in Nebraska, we also have funding from the Knight Foundation and from the Koch Foundation, along with several private funders, and support from the University. This looks different, and it feels different in many ways, from a lot of traditionally government-funded academic work.

And I think there are legitimate questions there that I personally think about, and I'm curious about your thoughts on the role of these nontraditional collaborations, both from a funding and a research perspective, on the work, and also more generally on how they are affecting the university.

Can the university operate in its traditional mode to address these really important issues that we're confronting today? Those are great questions, Gus, and I'll say what I've said a bunch of times on this: you should read the last chapter of the book, because we get into both of those topics in there, both the funding topic and the data access.

29:54 So let me do the latter first, and then I'll talk about the funding aspect of it a little bit. You know, we argue in the book, and it's the point I made earlier on in this podcast, right, that we've had a transformation, a sea change in our society, in the way people communicate about politics,

learn information about politics, to say nothing of everything beyond politics, but as a political scientist, that's what I tend to focus on. Right. And we as a society need to know what is going on, right? Like, what happens when you have a couple billion people who are on one social media platform?

Like, what does that do to Indian politics? What does that do around the world? It is crucially important for us to inform public policy, but it's also, in a positive sense, an opportunity for incredible scientific advancement in the social sciences. Right? This is data that we could never have dreamed about having

you know, 15 years ago. And Nate and I write in the book that it's kind of the best of times and the worst of times to be a social scientist. It's the best of times because we have granular data about a couple billion people on earth who are spending some of their time talking about politics.

I mean, it's just completely different from what we used to do, which was, we could have sort of aggregate measures of events, like how many people showed up at a protest, how many votes somebody got in an election. We could go out and do surveys, which were expensive, and we know there are all sorts of issues around surveys.

But now all of a sudden we have all this digital trace data, and in a way we're still just scratching the surface of it. There's so much; you can look at the incredible things people are doing with, like, cell phone data. So our chances for advancing... and, you know, we're scientists, we're social scientists.

We wanna advance our understanding of human behavior in the different fields that we're interested in. It's just an unparalleled opportunity, and ultimately society benefits when you have higher quality science. That's, I think, the belief that we all have in this field. But it's the worst of times because, again, if you think about it, when I was in graduate school, what were the sources of data that I could get?

Well, I could get data from governments, right? I spent parts of the nineties tripping around Eastern Europe, going to state statistical offices to get regional-level economic data in Slovakia and Poland and Hungary. Right. But that was government-produced data. So we had that kind of administrative data,

and then we had things you could run in the lab, or surveys that you ran; they were expensive, you had to raise money for them, but then you had control of the data. Well, now we're in this very weird world, which we had not been in before as social scientists, where all this data that we need to face these growing, pressing problems and questions facing society,

it's in the hands of private companies. And not lots of private companies; a few really powerful private companies. And so it leads to all sorts of challenges and conundrums in that regard. And so in the book, what we argue is that we as researchers need to simultaneously pursue three paths, knowing that they all have pros and cons, they all have problems associated with them, but the alternative is to sort of pack up and go home.

And that's problematic for all sorts of other reasons as well. So the first, we say, is you work around the platforms when you can: collect what data you can without collaborating with the platforms. That has huge advantages in terms of independence, but it has all sorts of disadvantages in that you're at the mercy and whims of what the platforms decide, and they change their minds and they change things; you could be setting up one research project and all of a sudden the data's not available anymore.

And you're at the limits of what they've come up with. The second option is you collaborate with the platforms. That has all sorts of other problems, because you have to think about independence and transparency. On the other hand, it can allow you to answer questions you might not be able to answer otherwise.

And then the third is that you try to work with government, to get government to regulate, to change the legal infrastructure, to make it so that this data is available. All of those are uphill battles, and

33:35 all of them have costs and benefits to them. And so our argument is that they are all fraught with problems, and so people should pursue all of them simultaneously in the hopes that some of them do work out. Um, so that's my answer on the data access question. And let's turn to the funding question.

Yeah. So the funding question is actually a super interesting one as well. So here is the fundamental problem. As you know, we started off this discussion earlier talking about this center that we've set up at NYU. My belief is that for this type of research, you need these larger collaborative enterprises, right?

We have gotten to the point where we're sort of more like bio labs: we need research engineers, we need large data infrastructure collections, we need to work together on these kinds of projects. We need labs, and political science is not set up to fund labs. NSF funding for political science is, you know, 10 million dollars a year.

The NIH gives 50 billion a year for research. They're not even in the same game. So how do we get to the point where we can begin to get funding at scale for laboratory-based research in the social sciences? It's a conundrum. And it means that we have to expand the pool of people who are thinking about funding this.

34:44 Now, obviously, the best thing we can do is get government funding behind this, right? And the reason we have bio and chemistry labs at universities across the country is because the NIH gives 50 billion a year for research, or whatever exactly the figure is, in that regard.

Um, so how do we do this in this case? Well, this leads to a huge conundrum, because, you know, you mentioned the economist looking under the lamp for their keys, right? Jesse James, right? Why do you rob banks? That's where the money is. Like, we know from this transformation of society that there are these corporations that have gotten phenomenally, phenomenally rich from this.

But we also know, from the history of big pharma and the funding of drug-related research, that there are all sorts of concerns about having researchers take money directly from platforms. But there's also a question about whether there's a moral obligation, some sort of obligation in the sense that, you know, if you broke it, you should at least pay to figure out how to fix it, if not fix it yourself.

Right. And so there seems to be a solution here, which is that you have these companies that have just made tons and tons of money, and you have all these people who are trying to study this change in society that's led to these companies making all this money, and what effect and impact it has on society.

It seems like there should be some deal to be worked out here, but there's this fear of the problems that arise when your funding comes from the people who you're studying. So what we propose in the book is to begin thinking about a kind of third-party institute that would be set up to be a repository, to take funding from

platforms, from foundations, from government funding, from line items in budgets and these sorts of things that you're seeing more of now, and co-mingle that funding and then distribute it to researchers. And we think that that's a way... so for example, Facebook got fined $5 billion by the FTC, right? A billion dollars of that could have gone to seed such an institution,

right? You need large amounts of money, but you also need a way to protect the integrity of the researchers. So that's one possible solution. The other thing is, as you just did when you rattled off the funders that are funding you: the next best thing we can do is total transparency,

right? And so in the collaborative work that I'm doing with Facebook now, as part of the Facebook 2020 election research project, that's one of the things that we have adopted as our mantra. Now, that's a whole other story and could take an entire podcast by itself, and

a year from now, when we have results, I would be happy to come back and talk about that in much greater detail. But one of the core underlying principles we've tried to follow there is to think about how to be transparent about what we're doing, right? That research should never be presented without people knowing that some of the researchers involved with it work for Facebook.

Now, there are lots of things that we've done to try to make sure that the research is seen as independent and has integrity, all these sorts of things, and again, that's a subject for another day. But I think anytime we do this with funding, we need to be totally transparent. Now, fortunately, there's a big open science movement that's been going on in the sciences

that's come to the social sciences. And we tend to think about open science as replication data, but it also means being open about who your funders are. And so we have some basic proposals in the book there about bare minimums that people should always be doing: to disclose when they've gotten funding from a platform, and to disclose when they've gotten special data access from a platform, right?

And again, there's a temptation to be draconian about these things and say, well, you should never get data from a platform, you should never get money from a platform. But the reality is these are all trade-offs, right? If we never allow anyone to have special arrangements with platforms to access data, there is a lot of research that won't be done, and there's a lot of information that will remain locked up inside those platforms and will not be communicated to the public.

And so all of these things are kind of trade-offs. And one last point on this: one of the other things we talk about in that chapter is really beginning to conceptualize these debates as trade-offs, right? Like, we can think about privacy as a good, right? And we can also think about public policy being informed by the type of analysis that can be done with these data.

And we have to think of both of those things as goods, but there are trade-offs between them, and we as a society need to think about where the optimal place is that we want to be. If we pushed all the data back into the platforms, that would be the best thing we could do in terms of privacy.

But then that means that the greatest data sets ever created in the history of the universe for advancing knowledge in the social sciences would be used for one thing, and that would be to maximize the profits of already huge, very, very wealthy companies in the short term; in the long term, they'd be used for whatever those companies want to do.

And it's not clear what those companies will want to do in the future. So we really do need to think about these trade-offs, and about how we can do this in ways that are going to benefit society the most. And so there are so many topics... I'm gonna have to find time post-pandemic,

we're gonna need to do one of these in a longer format, with some beers involved. But I have to add my own two cents on one of the things that you're saying. First: it's a great shame to let good data go to waste, always just a knife in the heart. But also, we need to think about the incentive structures that the academy has for academics.

And one of the things about your center, and what I'm doing here at Nebraska, is that we have the flexibility, I assume on your end as well, to create incentives for folks who are facing tenure questions to do some work that might not otherwise be compatible with their promotion and tenure requirements.

And changing those incentives is such a hard thing to do for us within the academy, and for the NSF as it thinks about how it approaches funding as well. But I'm not gonna let you respond to that, because we haven't even spoken about the fun stuff. I mean, this is all the stuff that I love to talk about, but your center's doing so much really fascinating work.

Um, we only have a couple of minutes left, but I wonder if you could highlight a couple of the really interesting projects that you and your team have been working on. Oh sure, thanks, Gus. Yeah, we should've started with this 50 minutes ago; we could have gone through all of this. So let me highlight one concept, for listeners who are not as familiar with this field, that I think permeates a lot of the stuff we're doing, which is that when we think about malevolent behavior online, one of the things we find over and over again, across the different studies and different things that we do, is that we seem to see a power law

in terms of internet behavior. And a power law is one of these curves, right, where you have large numbers of people that have really, really low values on whatever it is, the thing that you're measuring, and then really small numbers of people that have very high values. So for example, we did one piece on who had shared fake news on Facebook during the 2016 election,

and we found exactly this relationship, right? Even though everyone was talking about fake news after the 2016 election, we found that a small fraction of the survey respondents who had shared their Facebook data with us had shared quite a few links to these false news websites.

But most people hadn't shared any, right? Which is not what you would have gathered at the time from the conversation that was happening. We have another paper that we're working on right now, where we've been looking at exposure to Russian troll data. So we have Twitter, very helpfully,

speaking of data access, released a full compendium of all the tweets by all the Russian trolls. And we had a survey in the field around 2016, where we knew something about the people because they were in our survey, and we also knew their Twitter handles, so we knew who they followed on Twitter.

And we were able to see how much they were exposed to Russian trolls. And we found exactly the same thing: while there were lots of tweets from Russian trolls out there, in our sample 1% of our respondents accounted for 75% of the potential exposures to these trolls, and 10% of our respondents accounted for 99% of the potential exposures.
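To make that kind of concentration concrete, here is a small illustrative sketch. The numbers are simulated, not the study's actual data: it draws per-user activity counts from a heavy-tailed (Pareto) distribution, the textbook shape behind power-law behavior, and then computes what share of total activity the most active users account for.

```python
import random

random.seed(42)

# Simulated per-user activity counts drawn from a heavy-tailed (Pareto)
# distribution: most users sit near zero, a tiny minority have huge counts.
# These are made-up numbers for illustration, not the study's data.
counts = [int(random.paretovariate(1.2)) - 1 for _ in range(10_000)]

def top_share(values, top_fraction):
    """Fraction of total activity accounted for by the most active users."""
    ordered = sorted(values, reverse=True)
    k = max(1, int(len(ordered) * top_fraction))
    return sum(ordered[:k]) / sum(ordered)

print("users with zero activity:", sum(1 for c in counts if c == 0))
print(f"top 1% of users account for {top_share(counts, 0.01):.0%} of activity")
print(f"top 10% of users account for {top_share(counts, 0.10):.0%} of activity")
```

The exact percentages depend on the chosen distribution and seed; the qualitative pattern, a small fraction of users accounting for most of the sharing or exposure, is exactly what the power-law shape produces.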

So we keep coming back and seeing these things over and over again. I'll mention a couple of other interesting things on that first study, on who had shared fake news online: because we had demographic data about people, we were able to look at who these people were who were doing it.

And the big finding that came out of that was that the huge predictor of who was actually sharing these links online was age. It turned out that people over the age of 65 in our sample shared on average seven times as many links as the 18-to-29-year-olds. So if you remember, in the immediate aftermath of 2016, all we heard

about combating fake news was digital literacy courses in high schools, right? So that got a bit of attention, and we think it helped shift the conversation in a useful direction. One other big project we've been working on now is people's ability to identify the veracity of news

in real time. We've built some pipelines to send people popular articles that have appeared in the last 24 to 48 hours, both before COVID and during COVID. And we've learned a lot about who's able to identify the veracity of news that they encounter when it's new, before it's been fact-checked by Snopes and these kinds of things.

And here again, we find the supremacy of politics. We look at lots of different demographic covariates, and a lot of them work the way you'd think they would: people who know more about politics are a little better at it, people who follow the news regularly are a little better at it. But again, the biggest effect we found was something called partisan congruity,

which was that conservatives were much less likely to correctly identify pro-conservative fake news as fake; they were more likely to believe it. And liberals were less likely to correctly identify pro-liberal fake news as fake; they were more likely to believe it. So we continue to see, even as we venture into

these areas that are farther from the core domains of political science, just how important politics is in these regards. But there are lots more projects, so hopefully we can talk about many of them in the future. Yep. And regrettably, the little hand is telling me it's time to rock and roll.

So we have to get out of here, but this has been a wonderful conversation. Thank you for taking the time to talk with us. And for listeners, you can just Google Joshua Tucker and Social Media and Democracy and find links to his work, the center's webpage, and the book, which I believe is available through Cambridge University Press, open access.

So it's available online for free, which is a great price. Thank you, Joshua. Thanks so much, it's been a real pleasure. I've been your host, Gus Herwitz. Thank you for joining us on this episode of Tech Refactored. If you want to learn more about what we're doing here at the Nebraska Governance and Technology Center, you can go to our website at ngtc.unl.edu, or you can follow us on Twitter at UNL underscore NGTC.

You can listen to or download our podcast on our website, or find us on Apple Podcasts, Pocket Casts, and Stitcher. This podcast is part of the Menard Governance and Technology programming series, hosted by the Nebraska Governance and Technology Center. And if you want to learn more about the Menard Governance part of that, or why I am the Menard director, you can go to our website to find information about our funding.

The Nebraska Governance and Technology Center is a partnership led by the Nebraska College of Law in collaboration with the Colleges of Engineering, Business, and Journalism and Mass Communications at the University of Nebraska. Colin McCarthy produced and recorded our theme music. A.C. Richter provided technical assistance and advice.

46:30 Elsbeth Magilton is our executive producer and Lysandra Marquez is our associate producer. Until next time, keep on tweeting.