Tech Refactored

S2E44 - Breaking Down Section 230 and Moderating Content Online

June 09, 2022 Season 2 Episode 44

The episode you’re about to hear is being hosted by our student fellows. Our Student Fellows are a diverse and interdisciplinary group, representing colleges and specializations across the University of Nebraska. Paige Ross (Law), Ece Baskol (Business), and Alicia Christensen (Law) spent the academic year examining Section 230 of the Communications Decency Act, which generally provides immunity to website platforms for third-party posted content. They spoke with Mailyn Fidler, who works on issues at the intersection of technology and the law. Starting in the fall of 2022, she will be an Assistant Professor at the University of Nebraska College of Law. Her research focuses on constitutional rights, criminal procedure, and intellectual property. Fidler is a graduate of Yale Law School, Oxford University, and Stanford University.

Note, due to connection issues Ece Baskol's voice is not included in this episode. Ece was an integral part of this project and another student was able to ask her questions. The Center and her team thank her for her contributions to this process. 

You can find Professor Fidler's articles and work on her website: https://mailynfidler.com/research/ 

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Elsbeth Magilton: This is Tech Refactored. I'm one of your regular guest hosts, Elsbeth Magilton, the Executive Director of the Nebraska Governance and Technology Center at the University of Nebraska. The episode you're about to hear is being hosted by three of our student fellows. Our student fellows are a diverse and interdisciplinary group representing colleges and specializations across the University of Nebraska.

The goal of the Student Fellows Initiative is to familiarize students with the nuances of working with professionals from other academic backgrounds, incorporating their perspectives and vocabularies in order to better inform their work. This semester, we challenged them to produce an episode of Tech Refactored on the subject of their choosing. [00:01:00] 

This special episode of Tech Refactored was hosted and produced by our Student Fellows: Ece, Paige, and Alicia. We hope you enjoy it.

[00:01:15] Student Fellow: Today we're joined by Professor Mailyn Fidler. Professor Fidler is an expert on issues at the intersection of technology and the law. Her research focuses on constitutional rights, criminal procedure, and intellectual property. Professor Fidler is a graduate of Yale Law School, Oxford University, and Stanford University. Prior to clerking for the 10th Circuit Court of Appeals, Professor Fidler was 

[00:01:36] Student Fellow: the technology and First Amendment fellow at the Reporters Committee for Freedom of the Press. Starting in the fall of 2022, Professor Fidler will be joining the University of Nebraska College of Law as an assistant professor, where she will be teaching copyright law, criminal law, criminal procedure, and cybersecurity and sectoral data regulation.

Professor Fidler is joining us today to discuss Section 230 of the Communications Decency [00:02:00] Act of 1996, which gives online platforms immunity from claims that arise from content posted to their website by a third party. So I wanna start by rewinding the story a bit and talk about the bookstore case in the 1950s that sort of paved the way for Section 230.

Um, so some of the basic information about this case is that there was a Los Angeles ordinance that basically said if you have any kind of obscene material in your store, you can be held criminally liable for that. Um, so this, this case goes all the way up to the Supreme Court where the Supreme Court says that this kind of ordinance is unconstitutional.

Um, and that there's absolutely no way that a distributor, like a bookstore owner, could possibly review every bit of content they have and, um, know that before they sell it. So if you're a distributor, you're going to be liable only if you knew or should have known that what you're distributing is illegal. So do you see this kind of enactment of Section 230 as a continuation of the same [00:03:00] basic commitment to protecting conduits of speech, or was there something else at play behind Section 230's enactment?

[00:03:07] Mailyn Fidler: So I think Section 230 is different. Um, and before I get into why, I wanna introduce sort of the two major sides to the Section 230 debate, uh, that's going on right now. Um, there's the camp that wants harmful speech online taken down. That's the first camp, and then there's a second camp that wants platforms not to step in and take content down in ways that they perceive as biased.

So let's talk about Section 230 in the bookstore context. Um, if we, if we use the bookstore as an example, there's one camp that would want the government to mandate that a bookstore owner must not sell harmful, uh, content. And then there's the other camp that wants the government to tell the bookstore that it must carry certain content, even if that bookstore itself would rather not carry that.

So again, continuing the bookstore example, Section 230, it was intended to protect a bookstore's private decisions to carry or not carry certain books. So if a bookstore carried an obscene book, [00:04:00] it would be protected. And also if the bookstore chose not to carry the obscene book, it would also be protected.

So the, the distributor liability case you mentioned only protects the first kind of those decisions, the decision to carry. Uh, it doesn't protect the decision not to carry. And so Section 230 goes a step further than that case and allows online platforms, uh, to take down content without facing legal liability for taking down that content.

Now, I admit it's kinda silly to talk about Section 230 in, in bookstore terms, because online platforms are so much more diverse and more complex. But, uh, it's a helpful, if overly simplified, analogy. 

[00:04:37] Student Fellow: Yeah, so some of the other issues that I think we could see in that realm is just the immense responsibility that could fall on smaller platforms if they were required to kind of know exactly what third-party posters are saying.

So do you think that argument against section 230 holds any kind of water? [00:05:00] 

[00:05:00] Mailyn Fidler: So I think those concerns from, you know, from smaller platforms are certainly legitimate. A smaller internet startup that would immediately face lawsuits for content posted by its users, they're gonna face more expenses than one that's protected.

That said, I think the real question is a political one. Are we willing to impose those costs, uh, for a benefit, uh, or not? And if you're in the camp that wants harmful speech taken down, maybe the benefit of imposing those costs on smaller uh, platforms outweighs the costs. If you prioritize other things, maybe diversity of internet platforms, you might not think that.

And I actually do wanna note that we went through a similar debate about sort of small platforms, uh, with the Digital Millennium Copyright Act, where folks argue that if we impose greater online copyright, uh, liability, small internet companies will struggle. I don't actually know the empirics of how that played out, but there we decided that the benefits of those copyright protections outweighed the costs.

And so [00:06:00] this conversation about trade-offs for small platforms isn't new.

[00:06:05] Student Fellow: And so I think also we could, we could talk about sort of the, if you look at Twitter versus, like, Parler, was there some kind of analogy there in that, you know, there's too much restriction on Twitter, or we feel like we can't say X, Y, or Z about this kind of speech, so we're just gonna migrate to a smaller platform.

Um, do you see that as being a bigger issue in the coming years or not as much? 

[00:06:33] Mailyn Fidler: So I think we have yet to see sort of the, the full extent of the effects of this kind of migration. And I don't necessarily think that migration, uh, to smaller platforms like Parler makes low-quality information less dangerous.

It might cabin the effects of that, uh, low quality information to smaller groups of people, but it can also intensify the consequences. Um, That said, you can also get that same [00:07:00] intensification on mainstream platforms through algorithmic targeting, things like that. Think about the kind of YouTube rabbit holes that, uh, have been documented as contributing to extremism.

Um, so that targeting can also funnel people off of mainstream platforms onto the marginal platforms. So I think both, both hold risks. 

[00:07:17] Student Fellow: Yeah, and I think something that I'm sure our listeners are probably more familiar with is, um, the recent bid to take over Twitter by Elon. And his promise is to make this, you know, a more free speech friendly platform.

What kind of, um, methods do you think he will try to implement there and do you think it will be useful? How do you think people would respond? 

[00:07:41] Mailyn Fidler: So, uh, I don't know what Elon Musk will do. If I did, uh, that would be great. Um, but I do think that the, this kind of takeover is highlighting the stakes of having private decision makers make these kinds of content regulation decisions.

So far, um, Twitter has [00:08:00] largely played by the rules. It's, you know, the rules, uh, conceived of as sort of middle of the road. Certainly there's people who wanted them to do more or less content moderation. Uh, if Musk's takeover succeeds, I think we're gonna see a change to that, and that might, um, highlight both risks and disadvantages of having private stakeholders make these decisions.

But that's, um, that is one of the risks of having private decision makers make these decisions: you're not guaranteed sort of middle-of-the-road Twitter. 

[00:08:33] Student Fellow: Yeah. It's been very interesting to me to see kind of both camps, you know, one camp being really welcoming of this kind of transition of saying, Oh, they're outside of the corporate governance.

This is a single individual, so perhaps they'll have, you know, a more open mind about what they'll permit on the platform, whereas other people are kind of very suspicious of this because of the risks of handing all this control over to one individual and what that will [00:09:00] lead to. Mm-hmm.

[00:09:03] Student Fellow: I had a quick follow up question to what you were saying about the migration to smaller platforms.

If you don't mind rewinding a little bit, um, because I was thinking about the, my understanding of 230 sort of puts the, the intent of lawmakers in that camp of, we'll let the market decide. So if you want more hate speech, then you can go here, um, mm-hmm, you know, um, that's being facetious. But would you tell us a little bit about the cases that kind of made that intersection that created

230, or that sort of prompted 230? There were a couple of cases that had conflicting sort of, um, instructions for websites, basically, on how much to intervene in those [00:10:00] decisions. And then I, I just, the follow-up is kind of based on that: do you get the sense that it was the lawmakers' original intent to sort of let that market sort itself out, where people wanting certain levels of content moderation or certain types of discussions would have more choices than we really have been presented with at this point?

They've really consolidated. 

[00:10:23] Mailyn Fidler: Mm-hmm. Yeah, you're absolutely right. So, uh, an important piece of the Section 230 debate is to talk about this, the unusual legal circumstances that, that led to its creation. So there were, there were two cases that were decided in quick succession back in the early days of the internet.

The first involved a platform that did very little content moderation, and, uh, they were sued for defamation and the court, uh, dismissed the suit because they said, You're acting like a distributor, like the bookstore, and, uh, because you're not doing a lot of content [00:11:00] moderation, you didn't know, nor should you have known that this content was defamatory.

Um, so therefore, case dismissed. The second case, which came out a different way, involved a platform that actually did do more content moderation. Um, in the context it was sort of to make, uh, a more family-friendly platform, with concerns about explicit content. Um, the court there let the suit go forward, um, because,

since they were involved in content moderation, they, they knew or should have known about the stuff that was going on. And so Section 230 was enacted to sort of resolve this, this split in these court decisions, to allow platforms to do more content moderation, um, without that moderation leading to legal liability.

Um, with respect to sort of smaller platforms, I don't know that there's a direct sort of correlation between those, um, wanting to give platforms the freedom to moderate, and the consolidation, [00:12:00] excuse me, of platforms today. Um, you know, really it was, it was enacted to enable more moderation, not less moderation.

So what we're seeing with these smaller platforms that are now springing up to offer less moderation, that's kind of the opposite of the original intent. But, you know, that's what happens when you, when you give, uh, platforms the freedom to go either way. 

[00:12:20] Student Fellow: I was thinking of it more like, um, that they sort of took that to say,

we'll have all of these choices. So if you wanted to go shop at this store, this store, this store to get your online interactions, this store is really curated, you get three options, mm-hmm, and then this store is just everything and you have to sort through junk and all that, you know?

Mm-hmm. And so it was sort of the online version of that, where, I, my sense is that lawmakers at the time obviously could not know how this was gonna play out, but that they were not envisioning this consolidated market, that they thought it would be more like having all of [00:13:00] these, um, different shops, so to speak. So do you think, um, then, in some ways, just outside of 230, other forces have kind of, are to blame for some of this, right?

But some of it is these other sort of things that are happening that kind of maybe derailed what the original intent was. And, um, do you see newer sites like Parler or other things like that, um, as an effective way to maybe recapture, I don't know, I don't wanna glorify the original intent or whatever it's worth, but, um, does that present some maybe solution to some of the issues? 

[00:13:52] Mailyn Fidler: Yeah, I definitely don't think that the drafters of Section 230 anticipated the kinds of platform consolidation that we've seen. Um, [00:14:00] those were forces outside of the Section 230 context that contributed to that. Um, you know, I don't wanna glorify sites like Parler, but yes, in, in some ways diversification of platforms is a good thing, right?

You, you get different options for discourse. So, um, in some ways that does track more with the vision of the, uh, Section 230 drafters. Does that answer your question, Alicia? 

[00:14:25] Student Fellow 2: Yes. Sorry, I, and I did not mean to derail the whole line of questioning, but I just thought I would circle back because I, I thought it might be a good opportunity to sort of introduce the gen, the genesis of, um, Section 230, um, which kind of informs how we got here.

So, sorry. Um, Paige. 

[00:14:45] Student Fellow: Yeah. So, um, one of the other things that I've heard in terms of, like, Parler discussions, um, as to whether or not this is a good development or a bad development, is that the migration of such [00:15:00] non-mainstream speech, for lack of a better term, is actually dangerous in its own right, because the mainstream media, um, consumers who use Twitter or other analogous platforms aren't seeing that that speech is there.

Does that hold any water? 

[00:15:18] Mailyn Fidler: Yeah, so that's kind of what I was talking about with the intensification of the effects of this kind of speech. When you take it off of, uh, mainstream platforms where you might get sort of counter voices coming in, if you take that off into sort of echo chambers, um, it's not getting to as many people, but the, uh, you know, the effects can intensify.

That said, I don't wanna overstate, um, The, the role that smaller platforms have in this, because that can happen on mainstream platforms too. And that was the, the, the Twitter sort of rabbit holes, um, algorithmic targeting, that kinda amplification can happen even on mainstream platforms.

[00:15:58] Student Fellow: So one other [00:16:00] development they're talking about, especially with, uh, the proposed Twitter takeover, for lack of a better term, is kind of walking back protections. Uh, they've proposed, I, I think Elon Musk publicly said that he wants to reinstate Trump on Twitter. So I'm wondering what potential effects you may see coming out of things like that.

Um, I know there's a whole camp of people who are really excited about that, because a lot of people found that ban unjust. I think if you look at the circumstances surrounding that ban, um, it may hold more water. But, um, I was wondering what your thoughts might be on that.

[00:16:42] Mailyn Fidler: You know, like I said before, I think if this goes forward, and if they do, uh, if Twitter does walk back some of the content moderation decisions it's made,

we're going to be in a different ball game about, uh, private content moderation, and there might be a lot of people who are unhappy about the sort of [00:17:00] walking back of content moderation. I think that move was probably about certain things that President Trump was saying, and it saved them from, uh, regulation being imposed on them potentially.

Now, I don't necessarily think that that regulation would fly under the First Amendment, but, uh, you know, they avoided that sort of confrontation, um, and that decision will...

[00:17:28] Student Fellow: I think, I think a lot of people, if you start bringing in the First Amendment, it's kind of its own little rabbit hole there. Uh, because I think a lot of people kind of consider platforms like Twitter to be, you know, quote unquote, the, the modern public square. So should we do any kind of filtering? You know, people who strongly adhere to that idea of the First Amendment would probably be very opposed to any kind of content moderation, within, you know, certain limits.

[00:17:57] Mailyn Fidler: Sure. Um, so [00:18:00] yeah, I think there's a, there's a big misconception in a lot of discussions about online content moderation. Um, you know, that the, the First Amendment is a big issue. It is, but not in the way that a lot of people think it is. So the First Amendment only restricts the government from regulating private speech.

So the First Amendment doesn't, um, give you protections against Twitter taking down your speech. That said, if you, you know, perceive it as sort of a, a public sphere, um, those arguments, uh, do change. But so far that's not where we are. And as, um, I, I've written in my paper, I think a lot of content moderation can be, uh, considered editorial speech.

And so that, that gets us into another realm of First Amendment speech, which I think we'll talk about later in the podcast. 

[00:18:48] Student Fellow: So we'll be right back to discuss some of the legal perspectives and implications of Section 230 and the legal shield for online platforms.[00:19:00] 

[00:19:04] Student Fellow 2: If you find this conversation interesting and would like to learn more about contemporary legal issues such as these, as well as access updated news articles, podcast episodes, resources, and essays, be sure to check out the Lawfare blog at www.lawfareblog.com. Lawfare is a great way to stay connected with current legal events as well as to learn about new and emergent

issues. Again, you can check out the blog at www.lawfareblog.com.

[00:19:49] Student Fellow: Okay. So kind of circling back to the idea of alternative platforms, to what extent, if at all, do you think these platforms have been successful in [00:20:00] marketing themselves? And this is specifically as places where content that has been banned on other platforms such as Twitter or Facebook can be found.

[00:20:09] Mailyn Fidler: Yeah, I think they've been pretty successful. And as I mentioned, the internet is different from the bookstore context in that you can start on a mainstream platform and get funneled to one of these, um, more marginal platforms, um, through marketing, through advertising, through algorithmic targeting.

That's not gonna be the case if you walk into, you know, a major big-box bookstore; they're not gonna send you down the street to an extremist bookstore. So they have had success in that. 

[00:20:39] Student Fellow: Why do you think we have ongoing discussions about Section 230 for online platforms, but not for bookstores, considering that both can be distributors rather than content creators?

[00:20:54] Mailyn Fidler: Yes. So the internet is frankly just more important than bookstores, and I say that [00:21:00] as a devoted bookstore fan, it pains me to say that. Um, so that's the first thing. The second thing is also just that the legal questions for sort of offline distributor liability are settled. Um, that's why we're not having those conversations.

The debate about Section 230 shows that the legal questions for online platforms are far from settled. Some of this has to do with the fact that Section 230 is legislative, not constitutional, and so it can be changed, whereas the dis, excuse me, the distributor liability question, um, was settled by the court sort of as a, a definitive question of constitutional law.

[00:21:35] Student Fellow: As an avid consumer of books yourself, self-proclaimed, um, I am as well. Do you think, um, that there are negative effects of the legal shield for content distributors in both, uh, the context of social media platforms and that of bookstores?

[00:21:59] Mailyn Fidler: [00:22:00] Um, so I don't think you can have a conversation about Section 230 and content moderation without acknowledging the fact that there are real harms of online speech. And I do think that those harms are much more harmful online than they are in the, in the bookstore context, in part because it is like having a national bookstore that anyone from any town can walk into; that's just not the case with bookstores.

Um, so the reach, the, the impact of these kinds of decisions online is much bigger. 

[00:22:32] Student Fellow 2: So the courts have been grappling with what kind of rules the government can make about speech, which essentially kind of boils down to expressive content, including everything from parade floats to newspaper articles. Um, and it's been about a century.

So your article called, uh, The New Editors: Refining First Amendment Protections for Internet Platforms, kind of gives a broad overview of First Amendment doctrine. But it's about online communication. [00:23:00] So what, what's the connection here? 

[00:23:03] Mailyn Fidler: Yeah. So often we talk about the First Amendment as, uh, a protection against government regulation of private speech.

And sort of the classic case of that is political speech, you know, protest speech, that kind of stuff. Um, the context of editorial protections broadens that a little bit. And so editorial protections, uh, is a sort of subset of First Amendment doctrine that protects selection over speech. So you don't have to be the creator of the speech, uh, to, to access editorial protections.

You just have to be sort of curating and selecting speech. And so that's why it's relevant to the online context. 'Cause what platforms are doing is, they're not the speakers, you know, the users post themselves, but they are doing extensive curation and selection of others' speech. 

[00:23:52] Student Fellow 2: Um Oh, oh, sorry. 

[00:23:55] Mailyn Fidler: I was gonna say, I can go through more details of, of editorial protections.

Uh, well, and I was [00:24:00] gonna, I will 

[00:24:00] Student Fellow 2: definitely ask you about that. Um, I was, um, so your article's making the connection that if, that if reforms to 230 are enacted, platforms will fall back to this body of case law that is sometimes kind of muddy and sometimes kind of contested. Um, but there's a good body of case law, whereas with Section 230, since it provides an immunity, the courts really haven't heard any of these cases on the merits.

They just, they grant motions for summary dismissal based on this Section 230 immunity, and the, the intricacies of the case are not discussed. So, correct me if I'm wrong, but you're, you're sort of mapping this out as, this is what it might look like, or these are the tools that we could use, should 230 be tinkered with?

Um, could you go into those sort of [00:25:00] offline editorial protections that could be used as tools in that scenario?

[00:25:05] Mailyn Fidler: Yeah, absolutely. So, like you said, there's a, there's a body of offline case law about editorial protections, and we haven't really seen internet platforms reach for those, um, so far, because they have, they have easy recourse to Section 230.

So I think if Section 230 does get tinkered with, you're gonna start seeing platforms raising these kinds of arguments, uh, to an even greater degree than, you know, you have so far. And so, um, the sort of primary offline case about editorial protections is the, the Tornillo case. And so this involved a statute that required a newspaper to print, or to allow, uh, candidates a right of reply.

So if there was a, an article in the newspaper or an opinion piece criticizing a candidate, you would have the option to reply. Let's say that Section 230 gets tinkered with in a way that, um, you have to allow [00:26:00] both sort of pro-certain-candidate speech and anti-certain-candidate speech. You could see internet platforms invoking the same kind of right-of-reply, um, defense to that, um, on a constitutional basis, um, even in the absence of 230.

[00:26:17] Student Fellow 2: So, um, so it's kind of this prerogative to make editorial choices. So the, the law was basically saying, if you print this, you have to print 

[00:26:28] Mailyn Fidler: mm-hmm, the other. Right. You have to at least offer, if somebody wants to say that, you have to let them say it. And so, um, 

[00:26:39] Student Fellow 2: you, and I think you referred to this earlier in our discussion, but there's, there's the two camps that you kind of discussed,

mm-hmm, like the hands-off camp and the get-in-here-and-meddle-with-stuff camp. Um, do you come down on either side, you know, if, if you had to, [00:27:00] or what, what do you see as the, the main merits or consequences worth considering in each camp? 

[00:27:07] Mailyn Fidler: Yeah. Um, so, the keeping-speech-up camp versus the taking-speech-down camp, 

[00:27:15] Student Fellow 2: Is that right?

Yeah, there was the, I, I believe that there's sort of the, we want it all, I don't want you to meddle and moderate sort of at all, because that's, that's 

[00:27:26] Mailyn Fidler: inhibiting mm-hmm. speech. 

[00:27:28] Student Fellow 2: And then there's the, the more, um, sort 

[00:27:32] Student Fellow 2: of status quos ish, um, camp that seems to be interested in, in keeping things. You know, tweaking things more, um, narrowly and to allow more of the regulation that we're, we're sort of used to right now.

Mm-hmm. But maybe do a little, a few tweaks as far as, um, allowing more user, uh, maybe some transparency, [00:28:00] maybe user control over some of the algorithmic, algorithmic, mm-hmm, things that, uh, inform their feeds or what have you.

[00:28:08] Mailyn Fidler: Yeah, sure. So I'll start with, with sort of my take and then go into the, the advantages and disadvantages of both camps.

I tend to be a little bit more, um, I guess at this point, traditionalist about this, that really the government should not be involved in private editorial decisions, should not make rules that say you have to put stuff up or take stuff down. Um, in terms of the, uh, leaving-stuff-up arguments, that's a line

of debate. I think there's, there's pretty good defenses there, um, to be drawn from existing case law, and it's just not [00:29:00] something that we as a country have been okay with in any other kind of media. Um, I think if you're in this camp, you might say that, you know, the internet is different. Um, I don't think those arguments win the day,

um, but you might think that. On the other side, um, on the taking-down-speech side, I think the strongest argument you could make on that is that our conceptions of what is harmful speech, what is hate speech, are too narrow, both offline and online. So I think the strongest argument you could make there is that we need to reform our conceptions of harmful speech.

And then from there, because we already have carve-outs, um, for low-value speech, if we expand what is sort of considered low-value speech, platforms would, just like any other speaker, uh, have to respond to a sort of government regulation about low-value speech. 

[00:29:53] Student Fellow: And I also think it's important to note that in the historical development of this case law, a lot of proponents of kind of the First [00:30:00] Amendment, this-is-the-new-public-square view sort of said the solution to bad speech is more speech, not less.

Mm-hmm. And so they're kind of optimistic that better speech, for lack of a better term, or like the more mainstream opinion, will kind of rise to the top and it will take out some of these more fringe ideas. 

[00:30:19] Mailyn Fidler: Yeah. And you know, a lot of people challenge that in the online context with, uh, algorithms that target things that are maybe viral rather than true.

Um, but at the same time, even in our discussion here, we've talked about the dangers of, um, moving discussions off of mainstream platforms to smaller platforms and sort of developing echo chambers. So even though there's a lot of problems with, uh, the sort of speech-beats-out-speech idea there, there is still some truth to that even online.

I wanted to ask a 

[00:30:52] Student Fellow 2: quick question before I turn it over to Paige, and, um, the 11th Circuit just ruled yesterday, um, [00:31:00] that the, uh, Florida social media law was, uh, likely unconstitutional. And so they upheld, uh, the ban, the, the injunction that was put in place, um, by the district court, and I, I found it

striking a bit. I've only skimmed the case, but I was interested in your take on that and its, its sort of contrast to the Fifth Circuit opinion, um, on sort of the same issue but involving the similar Texas law. And then as, as a second follow-up, I, I thought it was interesting that it's, it's almost entirely framed as a First Amendment issue and it doesn't really

engage. It almost flips what we were talking about earlier, as far as the, you get to 230 and we're done, mm-hmm, as opposed to this, which engages on the First Amendment issues instead of saying, this is preempted, there's a federal law, 

[00:31:54] Mailyn Fidler: um, 

[00:31:56] Student Fellow 2: there's immunity for these, uh, these [00:32:00] platforms. Don't bring this here, blah.

Um, blah is official, like, it's a legal term, you know, a term of art, mm-hmm. Anyway, uh, sorry. So I was wondering what your thoughts were on that sort of turn, because that's unusual, and I know you talk about the, the Florida case in, in your article, which hadn't, hadn't been decided 

[00:32:22] Mailyn Fidler: when you had written it, but, so could 

[00:32:24] Student Fellow 2: you, um, give us your thoughts on.

Scattershot questions. Yeah, sure. 

[00:32:30] Mailyn Fidler: So, uh, I should at least mention, um, I was in, involved in some amicus efforts in, in these cases. Um, so that informs sort of where I'm coming from in my reaction to the decisions. Um, so the, the first thing I wanna say is, you know, you said this is unusual, that we're seeing this in First Amendment terms.

Yes and no. So, uh, most of the suits up to this point about online content moderation have been civil suits brought by private citizens against the platforms. That's sort [00:33:00] of the, the canonical Section 230 context. Um, with the 11th Circuit case and the Fifth Circuit case, we're seeing kind of for the first time actual government regulations of internet platform speech.

And so that also shifts back into sort of normal First Amendment land of, um, government regulation of, of speech. Again, that depends on whether you view what the platforms are doing as editorializing or not. And it seems like, I haven't read the 11th Circuit injunction decision, but it seems like that's what they're doing, um, which aligns with my view.

So I'm happy with that. Um, the second point I wanted to make is that these really are, at least in my view, the online equivalent of the Tornillo case. So they're, they're must-carry, they're right-to-reply cases, um, you know, to invoke the Texas one. The Texas law says you can't, um, you know, take speech off the platform of a user on the basis of viewpoint, and you [00:34:00] also can't prevent a user from seeing other users' posts on the basis of viewpoint, which goes a step further

[00:34:12] Student Fellow 2: and creates a private, the Texas law, I believe, has it that individual users can sue for that. So it's, it's creating a right of action for that too, which seems 

[00:34:25] Mailyn Fidler: beyond even. Yeah. So 

[00:34:28] Student Fellow 2: Well, I think, and, and you're right, I, I, um, glossed over that important distinction, as far as that this is, instead of pursuing the private company, it's suing the government for, 

[00:34:39] Mailyn Fidler: um, um, infringing on the First Amendment.

So obviously that, 

[00:34:43] Student Fellow 2: um, would take priority, but it still seemed that, I don't, I don't know, 230 hardly figures at all there, mm-hmm. I, I just found it kind of surprising in that capacity. So thanks for breaking that down for us too. Yeah. [00:35:00] 

[00:35:00] Student Fellow: Professor Fidler talked about talking briefly about maybe the Buffalo shooting's online connections or, like, the future of abortion-related speech on platforms.

So I don't know if we have time to get into that, but I think that could be interesting. Hm. 

[00:35:15] Mailyn Fidler: I can make a quick comment on the, um, on both of those things. On the Buffalo shooting side, um, there is some, there is a good argument, um, that the Texas social media law would prevent online platforms from taking down both the racist manifesto that motivated the, the Buffalo shooting, and, um, actual depictions of the Buffalo shooting.

Again, viewpoint is not defined in the, in the Texas social media law. So depending on how that is construed, efforts to prevent bias online could end up preventing, uh, online platforms from taking down obviously harmful speech. So that's, that's something to think about as we, we see more of these bills.

[00:36:00] Um, on the abortion-related question, I think this is something that I will be watching very closely over the coming months, is the reaction of internet platforms to potential criminalization of, of abortion. So one place that we have seen Section 230 successfully chipped away at is in the context of, uh, speech about crimes or criminal speech.

Um, both. So the SESTA-FOSTA bill took away sort of liability protections for sex-trafficking-related content for internet platforms. I really do think that we're gonna see states pursuing similar removal of liability for abortion-related speech online. That's not even necessarily pro-abortion speech or anti-abortion speech.

I think you're gonna get either side's content flagged as violating, um, potential down-the-road state restrictions on, on this kind of, what might become criminal speech online. So, um, [00:37:00] I think a whole lot of important debate is going to disappear, uh, from the internet in ways that can be weaponized against, against either side of the abortion debate.

Can you explain how that's, how that 

[00:37:12] Student Fellow 2: would be different from currently? 230 already has a carve-out, you can't, um, you can't say illegal stuff on there, mm-hmm, under 230. Um, and so is that because you're envisioning that these would be state laws and that it would, it doesn't apply to that?

Or could you explain how that works? Because Section 230 doesn't, I think it's a common misperception that 230 allows you to say whatever the hell you want. Right. Sorry. And, um, and obviously you can't say illegal stuff, that's in the law. But, uh, so how does, can you just explain a little bit, clarify a little bit more about, if things become illegal, is that because different states would've different [00:38:00] laws, or, um, 

[00:38:02] Mailyn Fidler: Yeah, so you're absolutely right.

I skipped over a lot of sort of legal technicalities in, in that quick overview. I'm still probably not gonna go in depth, um, into them, but to give a little bit more detail, um, so what we've seen in the SESTA-FOSTA context, the sex trafficking context, is obviously you can't, you know, uh, affirmatively do illegal things online and be protected by Section 230, but what, uh, SESTA-FOSTA did is it sort of, it, it widened

the range of ways you can go after companies for facilitating illegal content. So it's not just the illegal content itself, it's sort of illegal-adjacent, if that makes sense. Um, and so we would see similar things for, um, abortion. So hosting, uh, a page on a social media platform that gives information about how to

circumnavigate abortion restrictions, that could [00:39:00] be something that, um, under 

[00:39:02] Student Fellow: these, these new laws would be, 

[00:39:04] Mailyn Fidler: that platforms would be liable for that, as well as for, um, the sort of actual core illegal content. Thank you. Thank you all. Thank 

[00:39:16] Elsbeth Magilton: you for joining our Student fellows on this episode of Tech Refactored.

If you want to learn more about what we're doing here at NGTC or submit an idea for a future episode, you can go to our website at NGTC.UNL.edu, or you can follow us on Twitter at unl_ngtc. If you enjoyed the show, don't forget to leave us a rating and review wherever you listen to podcasts.

Our show was produced by myself, Elsbeth Magilton, and Lysandra Marquez, and Colin McCarthy created and recorded our theme music. This podcast is part of the Menard Governance and Technology Programming Series. Until next time, hang in there and keep learning. [00:40:00]