Tech Refactored

S2E47 - Free Speech in a Global Environment

July 01, 2022 Season 2 Episode 47

The episode you’re about to hear is being hosted by our student fellows. Our Student Fellows are an interdisciplinary group, representing colleges and specializations across the University of Nebraska. Jeffrey Owusu-Ansah (Law) and Mei Fong Looi (Business) interview Center faculty Professor Kyle Langvardt on how the United States can promote the principle of free speech in a global environment and how the First Amendment operates generally on social media and online platforms.

Professor Kyle Langvardt joined the University of Nebraska College of Law faculty in July 2020 as a member of the Nebraska Technology & Governance Center. He is a First Amendment scholar who focuses on the Internet’s implications for free expression, both as a matter of constitutional doctrine and as a practical reality. His written work addresses new and confounding policy issues including tech addiction, the collapse of traditional gatekeepers in online media, and 3D-printable weapons. Professor Langvardt’s most recent papers appear in the Georgetown Law Journal, the Fordham Law Review, and the George Mason Law Review.

00:00 This is Tech Refactored. I'm one of your regular guest hosts, Elsbeth Magilton, the executive director of the Nebraska Governance and Technology Center at the University of Nebraska. The episode you're about to hear is being hosted by two of our student fellows. Our student fellows are a diverse and interdisciplinary group representing colleges and specializations across the University of Nebraska.

00:35 The goal of the student fellows initiative is to familiarize students with the nuances of working with professionals from other academic backgrounds, incorporating their diverse perspectives and vocabularies in order to better inform their own work. This semester, we challenged them to produce an episode of Tech Refactored on a subject of their choosing.

00:53 We hope you enjoy this special episode of Tech Refactored hosted and produced by our student fellows, Jeff and Mei, with our very own Professor Kyle Langvardt from the College of Law. Enjoy this special episode of Tech Refactored.

01:22 Hello, my name is Jeffrey Owusu-Ansah and I'm a rising 3L at the University of Nebraska's College of Law. Uh, my name is Mei Fong, and I'm also a student at the University of Nebraska-Lincoln, uh, College of Business. And we are both student fellows here. Today we're joined by Professor Kyle Langvardt. Professor Langvardt is an assistant professor at the University of Nebraska College of Law. Here at the University,

01:47 Professor Langvardt teaches courses in constitutional law and speech and media. Welcome to the show. Yeah, I'm really glad to be here. So I guess, uh, generally, what we're going to be talking about today is the topic of how free speech and media, um, particularly in social media, um, relates to a global environment.

02:09 So I think we're all generally familiar with the situation in which a world leader of a particular country, oftentimes not in the United States, but in some limited cases in the United States, will tell a social media company that is potentially based in the United States that if they do not remove a particular type of content that is generally critical of the world leader, the social media company will no longer be able to operate within their country.

02:41 This is an issue that generally puts the principle of free speech in tension with general principles of capitalism and making money. But generally today, we just wanna talk about, um, how we can think about that issue and, um, maybe where it goes from here. So I guess first, uh, as it relates to the United States, I wanted to ask Professor Langvardt,

03:06 generally, how does the First Amendment apply to state actors versus private actors in the United States? Just as kind of an intro for listeners? Yeah. Well, I mean, in, in the United States, the First Amendment applies to public actors exclusively. Uh, it, it doesn't apply to, uh, private companies. Now there's-

03:27 there's some very, very narrow, uh, exceptions to this, mm-hmm, but for our purposes, it's not even really worth paying attention to. So the government is subject to the First Amendment, uh, any private company, including really big, uh, kind of quasi-monopolistic or oligopolistic companies like Facebook, Google, you name it.

03:50 Those are private companies that are not subject to the First Amendment. Okay. Now you mentioned, uh, a moment ago that from time to time, governmental institutions in various countries have given an ultimatum to a company and said, "If you don't stop carrying speech that criticizes the state here, then we are going to, I don't know, revoke your license to operate or something like that."

04:21 This has happened in China, for example. Yeah. If, if the American government tried to do that, uh, that would qualify as state action. So the, the United States can't just commandeer a private platform like Facebook or TikTok or, or whatever it is and use it as an instrument to censor speech. Um, but if, if the private company acts totally independently, then the First Amendment doesn't apply.

04:55 And so realistically in, in the American context, what you're gonna get, if the government wants certain types of speech taken down, or for that matter certain types of speech to be left up on a social platform, what official actors are gonna try to do is they're gonna- they're gonna try to take advantage of the, the, the slippage between these two cases.

05:19 You know, they're not going to legally require platforms to, uh, to take things down, but they might apply informal pressures. Um, you know, there could be phone calls, dinners, uh, that, that kind of thing. And, and that, that can be a way for public institutions to, um, exploit the fact that private actors are not subject to the First Amendment, they can, they can use these kind of backdoor, uh, conversations to get what they want.

05:50 Ah, so you're saying that's how the issue can kind of present itself in the United States, as opposed to maybe the more explicit version of the same general principle, um, presenting itself in other countries. Mm-hmm, yeah, that's right. No, it doesn't, it doesn't always work, but, uh, but, but that can happen. Yeah. Uh, if, if you wanna know how this kind of public/private distinction works in, in other countries, other countries aren't necessarily going to be quite so, so rigid about it.

06:23 So here in the United States, we have this really kind of stark black and white, like absolute conception that the constitution only applies to public actors and, and not private ones. Um, Facebook isn't subject to the First Amendment, uh, but in a country like, like Germany, for example, there's a doctrine called–and I don't speak German–but it's called, uh, Drittwirkung.

06:50 And the idea is that certain fundamental rights that you have against the government might apply horizontally to private parties that resemble the government in some way. So, uh, in a case involving, uh, content moderation or content governance by Facebook, for example, the court in, in Germany said that, well, Facebook has a, a speech right here to choose what kinds of content it wants to carry, uh, speakers who are using Facebook have a speech right.

07:28 And so we're going to balance them against each other. Uh, that's the kind of thing we just don't really do in the United States. Um, so in other countries, are you aware of any examples of government limiting the speech rights of, just like, private actors? Yeah. And- and- so just to kind of clarify a little bit about, about the question,

07:49 so examples of governments limiting speech rights for private actors- um, those private actors could be individuals who are using social platforms or search platforms, whatever it is mm-hmm, but private actors could also be the platforms themselves. So if, I mean, I- I don't use social media and I, especially not Facebook, but you know, if I did, uh, and I, I wanted to get on Facebook and, um, I don't know, promote my, my personal brand or something like that.

08:26 I would be, I would be speaking, you know, that that would be speech, but according to First Amendment law, Facebook would also be speaking when it decided either to leave my speech up on the platform or, or to take my speech down, you know, they would be speaking when they decided, uh, where my speech was going to appear in other people's news feeds.

08:53 And so when Facebook handles users' speech, that qualifies as a kind of private speech in itself. Now in, in other countries, there have been laws that, uh, regulate both of these types of, of private speech. In Australia, for example, after the, uh, the Christchurch massacre, the country really tried to tamp down on the distribution of, uh, the, the video of this, this mass shooting by making a law that required platforms to take down what was called abhorrent violent material.

09:35 Well, that affects the, the speech rights of two types of private actors. It affects the speech rights of platforms that might otherwise choose to carry that kind of speech. Um, and it also affects the speech rights of, uh, people who might want to spread abhorrent violent material. And, and I think one of the issues that generally revolves around, um, the topic is

10:06 the difficulty that I think we have as objective people on the outside, watching governments act in these ways: how can we decide which country is doing things in the right way, or, like, which country is limiting speech in a way that we personally believe is productive? Uh, with the Christchurch massacre, I think virtually everyone would agree that

10:32 it was correct for Australia or New Zealand to, um, restrict the distribution of that video. I think practically everybody would agree with that, but alternatively, you look at countries such as, like, India, where they will also have local law and a platform such as Facebook, which is using its speech rights to leave up, um, speech that's critical of their prime minister.

11:01 Um, we- a lot of people would look at that as something that should occur or should continue to occur, um, regardless of who it's critical of. Um, so I guess the general question that I would have is would you have any guidance or would you be familiar with any framework, um, around the world or really in the United States of finding a way to like objectively look at these things such that we can, um, I guess parse through what is good and what is bad?

11:29 You know, the, the freedom of speech... it's a- it's a really vague, vague concept. And there's lots of room for reasonable disagreement about, about how it should work. Um,

11:46 you know, if- if- if we were going to set some kind of a universal conception for what the freedom of speech is, uh, it probably wouldn't look exactly like the American conception of free speech. So one example of the difference between the American free speech tradition and the way that free speech works in other countries is that, uh, hate speech in most countries is not protected speech.

12:17 It can be regulated freely. But in the United States, uh, it actually would violate the First Amendment for a state to ban hate speech, however we wanna define it, racist speech, whatever it is. In the United States, the way that the law looks at a ban on something like racist speech is that, that is a law that discriminates against certain types of speech based on its viewpoint.

12:47 And the fact that that viewpoint is abhorrent doesn't really make a- a difference. The idea behind the First Amendment is you have to, you have to really take the, uh, the abhorrent stuff with, with the good stuff. And you just can't discriminate based on viewpoint. Um, you know, that's a, a, a really big difference between our tradition and most other countries.

13:08 Uh, I don't think it's obvious that we're wrong in our approach, but I also don't think it's obvious that we're, uh, right in our approach. Another example could be the way that we approach campaign finance regulation here, here in the United States. Uh, the, the government is increasingly hamstrung under First Amendment principles in terms of the way that it, uh, regulates political advertising and funding for political campaigns.

13:35 It- it doesn't really work that way in most other democracies, including democracies that we would consider fairly robust, uh, in, in terms of their speech tradition. Um, another example could be the way that we apply free speech principles to advertising. You know, it's not, it's not obvious that that advertising should be subject to the freedom of speech.

13:57 So i- if you're gonna come up with some kind of universal viable tradition, uh, you have to really kind of zoom out to some very basic principles. And probably the, the best thing that I can come up with is, uh, an approach that you sometimes see in, in international law. And, and under this approach, you have to have three things.

14:22 Uh, the government has to show that there was a, uh, a justification to regulate, you know, some kind of harm that the government's trying to prevent, or some kind of benefit that the government's trying to promote. This is what's called necessity: there's a need for the law. Then, um, the, the government has to show that the law was proportional, the proportionality requirement.

14:44 So you can't take a kind of, uh, a sledgehammer approach when a, a surgical approach to the problem is, is called for; you have to regulate as little speech as you can reasonably get away with in order to accomplish your goals. Uh, and then the third requirement is what's called legality, which basically means that if, if the government's going to step in and, and regulate this speech, it'd better be able to point to some kind of law that's on the books.

15:10 You, you can't just have some kind of, um, executive figure stepping in and making ad hoc determinations about what kind of speech is good and what kind of speech is bad. So necessity, proportionality, legality. There's gonna be room for reasonable disagreement about what's necessary, what's proportional, maybe less room for reasonable disagreement about what follows the legality principle, but you can at least say that governments have to follow, uh, the general contours of that approach.

15:41 And I think at a really high level of generalization, this is an approach that you could say describes First Amendment law as well. Like it fits into that, that tradition, um, and some private platforms have, have also adopted this, this approach at least, uh, as an official policy. So I guess from that, is there any way that the United States can ensure that, um, ensure that technology companies that are based in the United States, um, uphold

16:10 the United States' framework or the United States' principles of free speech? Is there any, um, precedent or is there even a possibility that that could occur? Well, I, you know, I guess, I mean the world's changing fast and lots of things are possible, but, um, under the First Amendment, as it's currently interpreted, it's not possible.

16:35 Uh, and it's not possible for the United States to do this even within the US. So, you know, recently the state of Florida enacted a statute that would require large social media companies to, um, follow certain free speech rules. So there were restrictions on the ability of a company to take down, like, a political candidate's speech,

17:07 for example. Uh, there, there were restrictions on what a platform could do to hide or shadow-ban certain types of speech. Um, you know, lots of, lots of restrictions that you could at least say were, were motivated by, uh, a sort of free speech policy. Well, that law, or, or most of that law, has been struck down in court.

17:31 Um, you know, first at the district court level, but the, uh, 11th Circuit recently weighed in too. And this goes back to the idea that when Facebook or another private platform, uh, regulates speech and, and decides what to censor, what to promote, what to demote, what to shadow-ban, all of that is Facebook's own speech and it's protected by the First Amendment.

17:58 Uh, so, you know, it's not just that Facebook isn't subject to the First Amendment. You know, Facebook doesn't have to follow the First Amendment; it's that the government, because of the First Amendment, has to keep away from Facebook's decisions. You know, Facebook is protected by the First Amendment and in some sense, empowered to, uh, regulate speech by the First Amendment.

18:21 Uh, so, you know, I think if, if the government doesn't have the power to make a platform follow those kinds of rules here, then the government probably doesn't have the power to make a platform follow those kinds of rules abroad. Um, so another question that we have for you is, like, could you explain the potential constitutionality of former President Trump's potential ban on TikTok, or just, like, what is your opinion on that?

18:50 Well, when President Trump attempted to ban TikTok in the United States and, and also, uh, WeChat- this is a, a, a Chinese platform that got less attention in the American context. But I think in many ways it's a, well, I, I wouldn't say a, a more important platform, but, but for, for people who have family or friends in, in China, it-

19:14 it's kind of a lifeline; it's sort of a super-Facebook in China. You know, when the Trump administration stepped in and, and tried to get these platforms removed, I don't know what the president's motivations were. And, and I think a lot of, a lot of commentators assumed that because it was President Trump, and based on his, his past activities,

19:39 you know, he must have been up to no good. It must have been something corrupt. But I think, you know, if, if you really think about it, I think the government had pretty good reasons to think about, um, if not banning, at least trying to clamp down on TikTok's operations in the United States. Had good reasons to think about doing that kind of thing with WeChat. I think with, with TikTok, the, the reasonable

20:09 basis to, to regulate that platform would be that it had, it had at least been a, a, a Chinese-controlled company in, in the past, you know, really under, under pressure from the Chinese state, and that the platform had censored, uh, lots of speech that was critical of the regime in China. Uh, you know, just, just a few years ago,

20:38 you could try to search for, um, Tiananmen Square on, on TikTok and you would get no results. It was as if you were in, you were in China. Well, I mean, given, given the state of relations between the United States and, and China, I mean, I think it's pretty reasonable to worry that a platform, um, under the influence of an adversary state might try to use its power to tinker with American politics or, or tinker with, with American elections.

21:08 You know, the same kind of thing that, uh, we talked about Russia doing back in, back in 2016 and that we were so worried about Russia doing in, in 2020. Uh, with- with WeChat, there were other concerns about, about privacy. There were concerns that, um, maybe the Chinese security state, um, might access certain sensitive data relating to military locations, that kind of thing.

21:34 Well, so these were both, I, I think, at least plausible and, and maybe even compelling reasons to regulate these platforms in the United States. There was a, uh, a statutory basis, uh, probably, for doing this under a law called, uh, the, um, uh, International Emergency Economic Powers Act, or, or IEEPA, which allows the, the president to make executive orders that, um, clamp down on, uh, commercial relationships with, with foreign governments or, or people, uh, operating in adversary states.

22:16 The question, though, as to whether the First Amendment allowed that, you know- this gets, I think, more difficult for the, the former Trump administration. So TikTok argued, uh, in, in court that, um, it was speaking when it operated its platform, that it was the equivalent of a, uh, a cable network or, or something like that.

22:46 And that if the government was going to tell it that it couldn't operate in the United States and couldn't transmit speech to Americans, that that was just a, an, an obvious prior restraint under the First Amendment, you know, a, a gag order. Um, with WeChat, there were users who said, you know, we use this platform to communicate with, with people in, in China.

23:14 It's like our, our one medium that really works to communicate with people who don't have access to platforms like, like Facebook, et cetera. And they said, this affects our speech too. Uh, and both of those arguments, uh, did pretty well in- in court. Now, a, a lot of this has been, has been mooted, but it, it's not at all clear that, uh, the president had constitutional power to do those things.

23:43 Now, I think I, I'm not sure our approach to the First Amendment in this area is- is correct. You know, it seems to me that if, if we had reason to believe that an adversary country was using a, a speech platform to interfere in, in American elections, for example, that to me seems like an area where the First Amendment should allow the government to- to intervene.

24:09 But it's a, it's a sensitive matter. I guess, something that is interesting about TikTok specifically. So you mentioned Tiananmen Square and it not being available for search on that, or I think, I believe that was on, uh, on WeChat. On, on WeChat. Okay. On, on WeChat, or if that topic was not available for search on that platform.

24:35 It's interesting to think about how, um, the inability of the United States to promote free speech in other countries, through the platforms that are based here, uh, like that principle doesn't really apply in other countries. So you could think of, um, China or maybe Russia or some other platform that's based in a country, uh, like that, not being able to permit certain searches in other countries, such as the United States, um, and how the United States couldn't-

25:08 almost like counteract that action, mm-hmm, such that, if currently in the technology environment in the United States, most of the companies, or most of the apps that most people my age specifically will use, will be apps that are headquartered in the United States, such as Facebook, or they may be headquartered in places like Ireland, or they may have, mm-hmm, companies there, but at least they're primarily based and have very large user bases within the United States, such as, like, Facebook or Twitter or Instagram.

25:43 Um, but you think of something like WeChat and it could be the inverse, which I guess could lead to issues, um, that way. So I guess mm-hmm, more generally, is there any way that you are aware of that the United States–and this is kind of circling around a question I asked previously, but is there any way that the United States–could push back on that?

26:05 Um, I, the only way, maybe something such as, like, trade restrictions, but I guess, maybe, like, very broadly, is there anything you're aware of? Um, the, the United States could, could push back on foreign-owned platforms that might, uh, have policies we don't agree with within the United States, that limit free speech effectively within the United States.

26:28 So, um, for, for example, that WeChat, mm-hmm, uh, example of not being able to search Tiananmen Square, and I presume that was in the United States; you couldn't search that topic within the United States if you were using an app that was owned by a country dissimilar to the United States. Yeah, I-

26:50 well, so I, I, I think, I think trade law is, is basically what you would use. So IEEPA, which I mentioned earlier, is, is, uh, part, part of that, uh, that that's a way that the president can step in and regulate, um, international commercial relationships when American security is, is at stake. Um, there are also authorities in the, the Commerce Department to review, uh, mergers and, and acquisitions, maybe by- by foreign-owned, foreign-owned companies.

27:28 It, it's not too hard to imagine, uh, regulatory tools that you might use in, you know, in, in, in trade law to force foreign-owned companies to, uh, abide by American norms. In- in a way, you could think, how does, how does China handle, uh, American tech companies that don't wanna follow, uh, Chinese speech rules?

27:53 You know, it revokes licenses. You can come up with all kinds of, all, all kinds of ways to intervene. Mm-hmm. The big question is really just what will the First Amendment, um, allow, and as we interpret it now, I would say it doesn't allow very much, but I would not be surprised at all to see that change in 10 years, 20 years.

28:19 Um, the First Amendment's kind of a plastic and evolving thing, and it's had ebbs and flows over time. Uh, we will be right back to discuss more about free speech in the global environment.

28:32 Momentum. It's building at the University of Nebraska-Lincoln with game-changing work in precision agriculture, nanoscience, and digital humanities. We're unlocking mysteries in brain research, solving the impossible with remote surgery using robots, and we're creating bold futures with world-leading research in early childhood education. We don't slow down and we are not letting up. We are Nebraska.

29:02 So, uh, and right now we are back with Professor Kyle Langvardt to talk about free speech in a global environment. Uh, so kind of going off of our questions earlier, Professor Langvardt, um, I know that with our last question, um, you were saying that, as the constitutionality–or how the United States views the constitutionality of various issues of free speech with the First Amendment as it pertains to social media companies–develops, at least in the next 10 to 15 years,

29:39 um, it is possible that we will use, uh, certain regulatory schemes to, um, possibly restrict what social media companies are permitted, uh, to post within their social media platforms. So, uh, I'm sorry. Yeah, go ahead. I was gonna say, so historically, um, uh, I guess you were saying that the First Amendment has kind of ebbed and flowed in a way. So historically, is there anything we can point to, uh, to, um, show how the United States has possibly gotten more restrictive

30:20 in a certain, um, industry or on a certain, um, platform, such that the country, or the federal government, was able to restrict what Americans are permitted to hear or permitted to say? Well, you know, I, I think the, the broad trend since, uh, you know, mid- mid-century has been for the First Amendment just to get, uh, more and, and more and more, uh, protective, or, or at least it can look that way.

31:02 If you, if you study the, the Supreme Court, um, or if you focus a lot on, on the Supreme Court. Um, but there are definitely areas where the First Amendment has become, uh, weakened. I don't wanna say restrictive, because the First Amendment doesn't restrict speech. But, um, so one example I, I might think of is, uh, schools. You know, if you go back to the early 1970s, um, students in public schools had what seemed to be very robust, uh, speech rights under the, the doctrine that was in place at the time.

31:46 Today, uh, the, the trend for a few decades has been for those speech rights to get, uh, weaker and weaker and, and weaker. Now, there was one recent case that pointed in the opposite direction, but, but, you know, generally public schools are, are, I, I would say, less democratic environments, uh, or, or they're allowed to be less democratic environments than they were, uh, in, in the, uh, late 1960s or early, early 1970s.

32:16 You see similar kinds of, uh, dynamics in, in other sorts of institutional settings. Um, you know, so, so there, there have been areas. We, we could talk about speech by, by public employees. Uh, public employees probably have narrower speech rights in, in, in certain respects than- than they used to. Uh, so, you know, those are

32:45 some areas where the First Amendment has gotten weaker. Now then there are other areas where the First Amendment has gotten much stronger. So I mentioned campaign finance earlier, you know, speech, uh, speech by, by employees, the, the speech rights of, of, of management in- in companies, um, you know, business- business interests and, and wealthy people have, it seems, more and more First Amendment protection by the year.

33:15 I think, um, I think the, now the technical- or the technically correct way of speaking of speech rights as weaker and stronger is, I think, actually informative to this area, because, at least as it pertains to social media companies, if I take Twitter as an example, mm-hmm, um, largely what individuals in the United States are concerned about, um, is divided along party lines.

33:46 Mm-hmm. Republicans and Democrats such that it seems as though people who are generally more right leaning or more conservative, um, would say that they want speech rights to be considerably stronger for individual users, uh, but possibly considerably weaker for the platforms themselves to determine what speech is going to be permitted within their platform.

34:10 Mm-hmm. On the left side of the issue, it could quite possibly be the opposite. It would be that they want the platform's rights to be stronger, such that it could regulate, um, certain sorts of hate speech. Mm-hmm, this would generally align with, like, the Australia example. Yeah. It's just kind of counterintuitive that- that that's how left and right would line up on these things. It is.

34:38 So I guess, um, and this may be too broad a question, and may be, um, asking you to somehow tell the future. But as you think about how jurisprudence in this area, at least in the United States, evolves, is it possible that, I guess, is there a way in which the situation gets better such that it can apply to other countries?

35:12 Meaning that, is there a way in which, um, maybe the Supreme Court could say, "Facebook, you are allowed to have stronger speech rights. Generally you are allowed to have a greater determination as to what goes on to your platform, but you need to apply this very consistently, almost around, mm-hmm, the world," uh, such that if they're running into issues in a different country, let's say Facebook is required to take down certain hate speech or certain, um, negative speech,

35:48 they just have to do it throughout the world on a consistent basis. Um, I guess it's a very broad question, but is there a way in which you could foresee that occurring? Uh, I mean, under the law as it is right now, the- the government cannot require platforms to do that within the US. And if they can't require it, uh, within the US, then it's hard for me to see how they could impose that kind of restriction on American-owned companies operating- operating abroad.

36:26 Uh, you know, whether it's at home or abroad, you would have the government attaching some kind of a penalty or, or at least an incentive to editorial decisions by these platforms. Now, I, I say editorial decisions- I think that's actually kind of an awkward way to describe what these platforms are doing. To me, what the platforms are doing is regulatory rather than, rather than editorial, but because they're private institutions and because in the American context we pay

36:58 so much attention to this public/private distinction. We characterize these basically regulatory activities by private institutions as editorial activities. Mm-hmm. If the government tries to regulate any of that, you know, they might as well be trying to regulate, uh, what the New York Times can, can carry.

37:13 I think that's a simplistic approach, but that's the approach we have. Now, if we're talking about foreign platforms that are trying to mess with the US in some way, you know, by using, using a platform as a, as a propaganda vehicle, basically. Mm, I could, you know, I, I, I don't know exactly what it, what it would look like, but I could imagine courts

37:39 giving the government some amount of leeway there. And here's one, one case that jumps to mind, uh, this case called Holder versus Humanitarian Law Project. Uh, and this was a case that dealt with, um, humanitarian organizations that wanted to provide, uh, training to certain organizations that were on, uh, um, a, a terrorist list.

38:22 Now you might think, okay, that sounds bad. Like they wanna provide train- training to- but what, what these organizations said was we're not, uh, providing these organizations training in how to be terrorists, we're providing them, uh, training in nonviolence. So, you know, here's, here's how to negotiate for what you want in, uh, a political forum.

38:50 Um, here's- uh, here are legal theories that you might pursue, that, that kind of thing. They said, this is, this is just peaceful instruction. Well, under previous First Amendment law, it had seemed pretty clear that, um, the First Amendment would, would protect that. This wasn't a case where these humanitarian organizations were, like, inciting these, um, these other groups to engage in violence.

39:20 They were just speaking to them, and, and not even speaking to them about violence. But the Supreme Court, uh, took a kind of a hard turn in this case and said, uh, no, this is- this, this teaching is a way for these advocacy organizations to, like, free up resources for terrorists, mm-hmm. So if these terrorists are getting a free education in nonviolent techniques, then that means they don't have to spend money on getting that education in nonviolent techniques from somebody else.

39:56 And so then they've saved all this money and they can go and, and, uh, spend that money on guns. Now I, I think a really crude way to describe this is that the Supreme Court said that, um, the ordinary First Amendment rules just don't apply in cases involving terrorism, you know, just- Mhmm. You know, which, which isn't, isn't pretty.

40:20 But, uh, I, I think that's, that's basically what was going on. Well, if that's- if that's the case, then I think you- you can definitely imagine the court doing something similar in a case where, say, uh, China was trying to, uh, push Chinese propaganda and interfere in American electoral processes or, or drive civil strife, that, that kind of thing.

40:45 Um, not sure what it would look like, but I could see a court doing it, and I, and at some level I think they probably should do it. Hmm. I guess, um, from that consideration, and I guess this is also kind of changing topics, but we're getting more broad. We've spoken about how, um, social media companies have these competing interests of, um, in some way, ensuring that they can maximize their profits and be available to the widest potential, um, group of users, while at the same time having the consideration of

41:33 how, um, can we regulate our platforms such that it's actually, like, a welcoming place and an opportunity for actual productive society? Mm-hmm, um, I guess generally from the platform's perspective, would you have any, um, ideas or maybe suggestions or maybe, uh, thoughts on how they can better gauge or weigh these two competing interests?

42:03 Well, I think the basic problem is, is these platforms' business model. And so the platforms want us to pay lots of attention to what kinds of rules they're setting for speech, what kind of speech they're gonna take down, that, that kind of thing. Mm-hmm. But what the platforms don't want us to pay attention to is the fact that they make their money

42:31 by driving people to spend as much time on these platforms as possible for advertising purposes. And the, the way that you addict somebody to one of these platforms and, and get them to, to just binge on the platform is you keep feeding them more and more stuff that they are likely to pay attention to. And, you know, it could be that the thing you're more and more likely to pay attention to is just some, some hobby you have or some, something like that.

43:01 And, and YouTube will just keep autoplaying videos that get you deeper and deeper into the hobby and, and, you know, into weirder and weirder places. But for a lot of people, what, what's gonna really cause them to pay attention is, uh, stuff that, that makes them angry. Anger, anger is addictive.

43:24 Negativity is addictive, and content that, uh, deals heavily with, with identity, you know, here's, here's who you are, you know, here's who your people are, here's who the other is, you know, that kind of stuff is also really addictive. Well, so if you've got really negative, identity-driven content, and the platforms are just trying to feed people more and more and more of it, and they're kind of

43:54 amping up the intensity, uh, so that, so that people will stay on as they develop a tolerance for this stuff, what you get is, is a dynamic where people wind up getting fed a lot of speech that is, uh, more hateful and, and ultimately more dangerous than they would have if the platforms had a different, a different business model. Mm-hmm, um, the, the platforms want to, you know, they, they, they wanna pollute and then clean up the environment afterwards.

44:32 Uh, what they don't wanna do is refrain from polluting, polluting in the first place. So I think that's what, that's what the platforms, uh, should be, should be doing. And if the platforms won't do it on their own, then I think that's what, what public institutions should be trying to make them do. I think I, I mentioned earlier that the platforms under current law have speech rights to regulate user speech

44:59 however they want. That's one reason not to regulate what platforms can take down and what they, what they have to leave up. But I think a better reason for the government not to step in is that once the government tries to set rules for what kinds of speech, a platform takes down, those platforms are going to over-comply.

45:24 They're gonna have really strong incentives to take down all the speech that's- that's offending, and they're unlikely to have similar incentives to leave up speech that might be on the, on the borderline. This is a dynamic that's called collateral censorship. I think it's also, it, it's also dangerous to have the government or for that matter, a private company, a private censor at one of these companies spending a lot of time drawing lines about what, what you can say and, and, and what you-

45:59 what you can't say. I think that's just a bad direction for, for a free society to go, whether it's public actors or whether it's private actors that are doing it. And so I think that the, the responsible approach is to adopt regulation that goes to this business model, this advertising-based business model, um, without getting too involved in the question of what people are allowed to say on the platform and what they're not allowed to say on the platform.

46:28 Yeah. So, um, as we think about this issue, is it best to think about regulating, um, the business model from the incentive- incentive point of view or from the outcome point of view, or as a starting point? Uh, yeah, I mean, I, I think that if, if you can, if you can regulate these businesses at the furthest distance from the censor, you know, and, and the decision of, of what goes up, what stays down,

47:00 that's what, that's what you wanna do. So I, I think the business model would be the ideal place to regulate. Now, the problem is that in the United States, we regard advertising as a form of First Amendment expression and- and it gets, you know, pretty, pretty strong protection, the way that, that courts currently interpret it.

47:22 So it's, it's, it would be hard to have a law that said, for example, uh, platforms can't engage in, in targeted advertising; that would be regarded as just flatly unconstitutional, mm-hmm. So then I think another approach that you might take is you could ask, well, what's another way that I could, uh, regulate a platform without saying that some content is, is better than others, a, a content-neutral approach. And one kind of measure you might look at would be, uh, like the, this feature that Twitter

47:59 rolled out a couple years ago where they said, do you wanna read this first? So you're, you're gonna, like, retweet, retweet an article or something. Uh, and it, and it's just a reflex, cuz you're just sitting there re- retweeting all day. Well, if you haven't clicked on the link, then Twitter might say, okay, you can, you can do this, but are you sure you wanna share this when you haven't actually clicked on the link?

48:22 That's a little bit of friction in there that doesn't discriminate on the basis of content, doesn't discriminate on the basis of, of viewpoint, but that might nudge people toward acting a little more, more rationally and might reduce the spread of the most, uh, the most kind of virulent content. I mean, that, that's, that's a direction that I would hope could work.

48:45 And, and one that I think we should be exploring, because, you know, it keeps the government out of the business of content discrimination, viewpoint discrimination; gives the government less power, um, while also giving the platforms less power and, and potentially cleaning up the speech environment.

49:07 It's like a soft way of correcting the issue rather than the hard way of just saying anyone who speaks about X, Y, Z issue should be virtually taken off the platform entirely. Of course, I, I expect that the, the platforms wouldn't be too excited about this either, because if you have a business model that depends on- on advertising, and you have a regulatory model that's encouraging people to think, advertising and thinking don't, don't mix real well,

49:47 uh, you know, if, if people, if people have to work a little harder, stuff's gonna go less- less viral, people are gonna spend less time on the platform, that's less money, uh, for, for the platforms. So I think, you know, what I, I'm sure Facebook would, would prefer to do rather than reducing, reducing virality is just say, we've just hired thousands and thousands of, of censors.

50:10 And they're going to be, uh, uh, working, working very diligently to take down problematic content. It kind of like pushes the issue almost like a step away by having the opportunity to just say like, oh, if something bad happens, we will try our best to correct it. Okay. Yeah. No, that, that does seem like a very, um, roundabout way of avoiding the issue in a way.

50:36 Yeah. It doesn't, doesn't threaten the business model at, at all. Yeah. And so, in a way, you know, there's all this focus on misinformation, dis- disinformation, and I'm not the first one to make this, to make this observation, but, like, all of this focus on misinformation that spreads on social media–that's, I think, exactly where these platforms want us to be looking, uh, because that's a conversation that doesn't, doesn't fundamentally threaten what they're up to.

51:11 Yeah, no, that's true, 'cuz it doesn't change the business model. Mm-hmm, the business model is the issue itself, as, as, as, um, we've spoken about in our discussion of, um, the business model. Not only does it not threaten the business model, it makes these platforms indispensable. So if, if Facebook is playing this role as, as the speech police, you know, within, within Facebook, then that means the government depends on, on Facebook, uh, to, to do that, to keep, keep the internet clean.

51:43 Um, yeah, I mean, right, right now, I mean, I mean, if you look at, at this narrowly, this makes sense, but, uh, I believe Facebook is taking down, uh, speech, you know, propaganda by Russian officials in, in Ukraine. You know, that, I haven't looked too closely into this, but it sounds like a, a good thing to do, but that's an example of this, this company, um, making itself necessary, you know, making itself, um, in- indispensable to, to the American government.

52:18 And that's a- that's a real source of, of protection. It's like one of these, like, with, with great responsibility comes great power things. Mm-hmm, yeah, that's true. That's very true. Which I guess is- I have to credit, credit this to, to Eugene Volokh, who calls this "the reverse Spider-Man principle." Okay. I've never, I've never, I had never heard of that.

52:41 Yeah. Well, he made it up. So that's clever. Thank you, Professor Langvardt, for joining us for this episode, and thank you to everyone who is listening to this. I'm your host, Mei Fong, and I'm Jeffrey Owusu-Ansah. And once again, thank you so much for listening.

52:59 Thank you for joining our student fellows on this episode of Tech Refactored. If you want to learn more about what we're doing here at NGTC or submit an idea for a future episode, you can go to our website at ngtc.unl.edu. Or you can follow us on Twitter at UNL underscore NGTC. If you enjoyed this show, don't forget to leave us a rating and review wherever you listen to podcasts.

53:24 Our show was produced by myself, Elsbeth Magilton, and Lysandra Marquez, and Colin McCarthy created and recorded our theme music. This podcast is part of the Menard Governance and Technology programming series. Until next time, hang in there and keep learning.