Tech Refactored

S2E18 - Can I Sue Facebook for That Thing My Uncle Posted? The Legal History of Section 230

December 02, 2021 Nebraska Governance and Technology Center Season 2 Episode 18

On this episode we welcome Brent Skorup, senior research fellow at the Mercatus Center at George Mason University, to explore his work on the legal history of the often politicized Section 230. Section 230 is a section of the United States Communications Decency Act that provides immunity to websites and social media platforms for content posted by users and other third parties, and it is often used as political fodder. Brent helps us break down the highly politicized, famous (and infamous) Section 230.

See more of Brent’s work on tech issues here:
The Erosion of Publisher Liability in American Law, Section 230, and the Future of Online Curation

Thoughts on Content Moderation Online

Can Social Media Companies Censor Lawmakers’ Accounts?

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Gus Hurwitz: This is Tech Refactored. I'm your host, Gus Hurwitz, the Menard Director of the Nebraska Governance and Technology Center at the University of Nebraska. Today we're joined by Brent Skorup, senior research fellow at the Mercatus Center at George Mason University. Brent's research areas include transportation technology, telecommunications, aviation, and wireless policy.

Today we're going to be looking into Brent's work on the legal history of the often pretty controversial Section 230. For a brief bit of context: Section 230 is part of the Communications Decency Act, which is generally understood to provide pretty broad immunity for websites and other online platforms when it comes to content that is posted by their users or other third parties.

And we're going to put a lot more meat on those bones over the course of this discussion, I am sure. Brent, welcome to the show.

[00:01:14] Brent Skorup: Thank you for having me, Gus.

[00:01:16] Gus Hurwitz: Okay, let's just jump straight in. I am sure this isn't going to be at all a controversial conversation, because nothing about Section 230 is controversial.

We'll be using this term "Section 230" quite a bit over the course of this conversation. So can we just start with you explaining what the heck Section 230 is?

[00:01:38] Brent Skorup: Sure. And this is a very hot topic. I've given some talks at law schools and universities recently, and I've heard from professors that a lot of students are writing about Section 230.

It's a really hot topic, not just in the news but also in the legal community. And I was lucky enough to have written pretty extensively about this topic a few years ago, and now it's a very popular topic, so I'm happy to discuss. Section 230 today is often denigrated. It's often mischaracterized, and probably many people today are not too familiar with the history and the background of it. But it's a law, as you said, that provides a pretty broad liability shield to internet-based companies of all kinds. It's pretty broadly worded, and it was part of the 1996 Telecom Act, as you said.

In 1996 it didn't get much attention. The internet, frankly, was a bit of an afterthought in the 1996 Telecom Act, which was really focused on reforming telephone laws and competition. But it was part of that act, and it was just a coincidence in timing: the web was starting to commercialize, people were starting to hop online, and in the nineties you had these early services, I guess you could call them social networks, called bulletin boards, where people would post pseudonymous or anonymous comments. These bulletin boards were virtual communities.

[00:03:19] Gus Hurwitz: And back in this timeframe, some of those were internet-style services. There was an internet at the time, but it was nothing like the modern internet. A lot of these were smaller communities that you would use modems to connect to: online services run by individuals, with tens or hundreds of users. And then there were things like Prodigy and CompuServe, which were kind of the big early online communities that you could connect to. Right?

[00:03:52] Brent Skorup: That's right. And I think it was 1991 when the Cubby v. CompuServe case came down. This was litigation over one of these bulletin boards. There was a defamatory comment posted, and the person who was defamed, or believed he was defamed, sued CompuServe, the bulletin board operator, for distributing this content. The law was somewhat shifting at the time, but there was a legal theory that you would hold distributors of tortious content, of defamatory content, liable. So this person, who believed he had been defamed on this bulletin board, sued the bulletin board operator. But a court took a look at it and, on practical and, I think, free speech grounds, said we can't really expect these early technology companies to be reviewing all these comments. The internet was not huge at the time, but there was a lot of growth. And the court said, we're not going to hold this distributor of content liable.

You had almost the exact same fact pattern a few years later in the Stratton Oakmont case: a bulletin board operator, someone posted a defamatory comment, and someone sued Prodigy, the bulletin board host and operator. Although this time the court went the opposite direction. It was a different court, this was a New York state court, and it said that the bulletin board operator can be liable, and is liable in this case, for the defamation posted by its users. And about that time the Telecom Act was coming, and some early tech companies went to Congress and said, this is crazy, that we're going to be liable for everything our users post.

And they were faced with a problem. You could either review all types of content and take down what you believe might be libelous or tortious or copyrighted material, or you could just leave everything up and not take anything down. And that was what web companies, at least in the state of New York, believed they were facing with this Stratton Oakmont case.

So Congress stepped in. 

[00:06:19] Gus Hurwitz: So real quick, just to make sure we're on the same page, let's paint a word picture. In the real world, offline, not online: I say something that is defamatory, something that is harmful to your reputation. You suffer injury as a result. Your reputation is injured, you lose your job, whatever it requires for it to be defamatory. You can sue me.

So instead of running into the office cafeteria and shouting this thing about you so that everyone hears it, I post it online. There's no question that I could still be held liable for defamation if what I had said about you was in fact defamatory and I said it online. The question is whether the platform on which I am posting this material could also be held liable for having distributed my words to the people receiving them, even though it's kind of the same thing as if I ran into the cafeteria and shouted it. Because the platform is playing some active role. They're not like a cafeteria; they're more like a newspaper reprinting or distributing what I've said.

[00:07:35] Brent Skorup: Yeah, that's right. I should say that the case law around defamation and distributor liability is somewhat confused. It depends on the jurisdiction. My co-author Jennifer Huddleston and I, in our piece about Section 230 and distributor liability, note that courts are kind of all over the place. Sometimes they hold someone liable, sometimes they don't. There was a case, I believe in the sixties, where someone wrote something defamatory, something embarrassing, on bathroom stalls in a tavern.

And the court in that case held the tavern operator liable for not removing it. I don't know the exact facts; perhaps they removed some notes on these bathroom stall doors but not others. But yeah, in some cases you can be held liable if you merely distribute.

[00:08:35] Gus Hurwitz: So what you're highlighting there is that there's a role for changing technology, and for different technologies. You've got: I tell one other person this thing about you. I go into the auditorium and announce it using an amplification system. I graffiti the wall of a building. I graffiti a stall in a public restroom, where there might be someone who's able to clean it. I post it in a newspaper. I post it online. These are all different ways of my defaming you, but those different ways each have some intermediary that exists and has some more or less ability to control what's going on.

[00:09:15] Brent Skorup: That's right. And say 120 years ago, defamation and distribution liability was much like strict liability. Over the decades, courts have gotten more and more permissive, have protected intermediaries of all kinds more and more, and have said that there must be some fault before they'll hold a distributor liable; we're not going to hold you strictly liable. Generally speaking, there's got to be some fault before you can hold a distributor liable.

Anyways, this is the context in which Congress was acting in the mid-nineties, where you had these two court cases against these early internet companies, one saying they're not liable, one saying they are. So Congress stepped in to clarify the law and passed Section 230, which again is this broad liability shield for internet-based companies, and said: look, we don't want you to leave everything up, because all kinds of content would be left up. It would pollute the internet.

But we also don't want to hold every company liable, particularly, you know, in the mid-nineties, with this nascent industry of small companies. We don't want to force them to review all comments and make on-the-spot decisions about whether something is defamation or libel or a copyright violation and so forth.

So: you can take down content if you wish, or you can leave it up, and you won't be liable one way or the other. You can kind of create the community that you want to create. And you won't be treated as a publisher; you won't be liable as a publisher for what you choose to leave up.

[00:10:59] Gus Hurwitz: And there's a detail in there I think we should pull out. You are free to moderate the content. The key there is that some courts previously had said: look, if you aren't moderating content, that's fine. You're not actively making decisions, so you're not subject to liability.

But once you decide to moderate some content, you've demonstrated that you have an interest in, and an ability to do, this. So now, if you fail to moderate content, you've created a duty to continue doing that. And one of the things Section 230 does is it says: no, you can selectively moderate. You can moderate some content, and that isn't going to create an ongoing duty to do so perfectly in all cases.

[00:11:46] Brent Skorup: Yeah, that's right. You can choose to moderate or choose not to, and you will still not be held liable as a publisher.

[00:11:53] Gus Hurwitz: We're still getting what Section 230 is on the table. I'd like to get two things into the discussion. First, simply: why are we talking about Section 230?

As you mentioned, it kind of was just a thing that happened in the Telecommunications Act of 1996. It wasn't viewed as a big deal, but today it's kind of a really big deal. So I'd like to get your take on why it is such a big deal today, but then also to go back to your research on how these areas of law were developing prior to Section 230, to get your take on: is it actually that big a deal?

[00:12:37] Brent Skorup: Yeah. Why is it a big deal? Section 230 has been injected into US politics in a big way. This began a few years ago, but I think President Trump tweeting about it, like a lot of things, really made it a front-page news story, this kind of obscure provision in a 25-year-old statute.

President Trump has tweeted, and many of his followers and supporters have tweeted, to repeal Section 230. But I should add that in a New York Times piece in the lead-up to the last election, Joe Biden said that one of his first priorities as president would be revoking Section 230 for tech companies.

So you had this unique circumstance where both presidential candidates had urged repeal of Section 230, and presumably for

[00:13:41] Gus Hurwitz: The same reason?

[00:13:42] Brent Skorup: As you can imagine, there's not too much overlap in their reasons. Many Republicans have taken up this charge, and many Democrats have taken up this charge, of revoking or modifying Section 230. As someone who's studied the law, and media law, I would say Democrats have a better understanding of what would happen if you repealed or modified Section 230, although I think both sides misunderstand the history of publisher liability and constitutional law.

And part of what's underlying a lot of this, I mean, political parties are big; there are a lot of factions, right? I think part of it, and perhaps the most generous interpretation of the repeal-or-modify-Section-230 movement, is this view that, okay, this liability protection was made in 1996 to protect this brand-new industry of web companies.

Today it's protecting Facebook, Google, Apple, Twitter, and others, some of the biggest companies in the world, and this liability protection doesn't make sense anymore. That's a generous interpretation. I think a lot of it's much more political: tech companies love this law, we dislike tech companies, therefore we dislike this law. But there are various factions, and there are various nuanced views on this. I think that's probably the most generous view, though: it made sense in the 1990s, when you had this uncertainty and you had this new industry, but it no longer makes sense today.

[00:15:32] Gus Hurwitz: And are either of those takes correct? Did it really make sense in the 1990s, and does it really no longer make sense? And I can flesh that out if you'd like. In the 1990s, could the common law have continued to develop, and legal institutions have continued to develop, to address these concerns without congressional intervention?

And today, if we were to revoke, undo, or modify Section 230, what would the effects be? And obviously that depends on how we modify it.

[00:16:09] Brent Skorup: Yeah. As I said, I think Democrats have a more accurate view, at least in the short term, of what would happen if you repealed Section 230.

So let's imagine for a moment, and I should say I don't view this as an imminent possibility, frankly, partly because the parties, even though they might have misgivings about or dislike Section 230, dislike it for very different reasons. But suppose Congress did get rid of this liability shield for tech companies.

In the short term, you would see a lot more takedowns of content, of all kinds. Anything that a tech company believed it might be liable for, it would just err on the side of taking down and be done with it. So we don't get sued; so we're not held liable.

And usually when Democrats speak of it, this is kind of what they intend. They want tech companies, from their point of view, to be more responsible and to take down more disinformation and misinformation and other types of harmful content. So I think Democrats have a clear vision of what would happen in the short term. In the long term, though,

I think what would happen, and we make this case, Jennifer Huddleston and I, in our law review article on this topic, is that courts would develop standards and legal rules that would look a lot like Section 230. In particular, it would look a lot like conduit liability, which is the type of liability you have for phone companies and telegraph companies, where generally they're not liable for what's distributed, even if they know that what's being distributed might be defamatory or otherwise illegal.

And we traced the history. Again, I want to emphasize that courts were not all marching in step on this; the decisions, quite frankly, are all over the place. But there is a trend toward requiring more fault of distributors than previously, and there is a trend of opening up this conduit liability beyond just telegraph and phone companies: opening it up to wire services, and then eventually to traditional publishers. There was a case, I believe in the eighties, a defamation case against the AP and Newsweek and some others, where a court held that these publishers, including Newsweek, were not liable for the alleged defamation in the news story.

The interesting thing about this case is that it was not simply republishing a wire service; it was original reporting based on a wire service. And still, Newsweek was not liable for that.

[00:19:08] Gus Hurwitz: Let's take a pause for a moment. For our listeners: we're about to go through a tunnel, which is kind of like a conduit, and we will see you on the other side for some more discussion of conduits, social media liability, and evolving First Amendment standards.

And oddly enough, we will add some discussion of censorship into our discussion. Usually when we talk censorship, we're taking things off the table for discussion. So see you on the other side of this tunnel.

[00:19:40] Morgan Armstrong: I'm Morgan Armstrong, a student fellow at the Nebraska Governance and Technology Center and part of the Space, Cyber, and Telecommunications Law Program at the University of Nebraska.

Did you know the University of Nebraska College of Law has had a Space, Cyber, and Telecommunications Law Program since 2008? The program features tracks for law students and advanced degrees for established attorneys interested in satellites, international law, radio spectrum, or just about anything in the great expanse of space.

Check them out on Twitter at Space Cyber Law. Now back to this episode of Tech Refactored.

[00:20:19] Gus Hurwitz: As we come back to our discussion, I want to take a brief moment to remind our listeners that we enjoy hearing from you. We would love to hear your ideas for topics for future discussions. Please go to our website, or tweet us at UNL_NGTC, or directly at me, at Gus Hurwitz. We look forward to incorporating some of those into future episode topics.

We are back now, talking with Brent Skorup about Section 230 of the Communications Decency Act. And I am just being entirely selfish today, because there is nothing I enjoy talking about more than the evolution of First Amendment law in light of changing technology. So we're just going whole hog into this legal history discussion, and you listeners are just along for the ride today.

So Brent, we were talking a bit about some of the evolution and development in this area of the law before the break. How do you think the law would have continued to change if Congress hadn't stepped in?

[00:21:27] Brent Skorup: I think it would have developed, in time, to resemble conduit protection, or Section 230; I mean, there's a lot of overlap there. As I said, courts were starting to protect traditional publishers in this way, and they were not liable unless they directly made the defamatory or illegal statement.

[00:21:51] Gus Hurwitz: What did the standard that the courts were developing look like? It's not no liability; it's not strict liability. Do you have a sense of what courts were starting to look at?

[00:22:02] Brent Skorup: Again, courts were all over the place. In some cases they just looked for some element of fault. In others, like you mentioned earlier, if you were capable of curating or editing content and had done so in the past, you were expected to always do that.

And there were these different protections. There was the wire service defense, which was exclusively for wire service providers like the AP, the Associated Press, but it got broadened to include other kinds of companies, not just wire services, and to include publications.

And there was a case that expressly expanded conduit liability: the Auvil case, I believe in the 1990s, in district court in Washington. This involved, I believe, a 60 Minutes episode that alleged that apple growers in Washington were using pesticides on their apples. And, you know, anything touching apple growing in Washington State is a big deal. They sued their local TV station for this defamation. The court in the Auvil case, this Washington apple case, had an analysis, and this is pre-Section 230, that looked exactly like what you find in the motivation for Section 230. They said: yes, TV broadcasters transmit all types of content, and yes, they do kill stories. But we can't expect them to review, or preview, every single statement of every program that they contract for. It would just be a huge practical nightmare, and a constitutional nightmare, to have to hire lawyers to review all this content or be subject to massive litigation and defamation cases. And they said: we're expanding conduit liability to include broadcasters. And you had a few other broadcast cases. And mind you, this is all pre-Section 230. And there were some other early internet cases expanding conduit liability.

So, in short, courts were generally moving in the direction of expanding conduit liability. And they were doing this for two reasons. First, practical reasons: with new technology, whether broadcast radio or the internet, there's the impractical burden of having to review all types of content. And a related factor was constitutional arguments: you'd be in constant litigation over the speech that you're transmitting, and it would lead distributors to preemptively pull down all kinds of content. And for courts, that's a constitutional problem, if you have laws that are causing news distributors to preemptively remove all kinds of speech for fear of litigation.

[00:25:20] Gus Hurwitz: So there's something fascinating in here that you've touched on a couple of times, and I just want to be sure that we draw it out. You've touched on the concern that the defamation liability, or whatever liability, that platforms, publishers, and distributors could face for speech that they're hosting or distributing from third parties might have the effect of stifling speech, and that courts might find that problematic. And you've invoked constitutional concerns at least two times in mentioning this: one, the idea that if platforms need to engage with lawyers at a crushing level, to the point that they're just going to stop hosting speech, that's constitutionally problematic; and then again in the last point that you made.

I think there's a really nice First Amendment point here that folks tend not to focus on in these discussions. So I'll just ask: what is the First Amendment balance of Section 230 and these speech concerns more generally?

[00:26:30] Brent Skorup: Section 230 in a lot of ways codifies this legal movement of expanding conduit liability, and expanding liability protection to traditional publishers. I think a lot of what's driving this is the Smith case, a Supreme Court case from 1959, where the Supreme Court said you cannot hold a distributor strictly liable for illegal content. I believe a city was making bookstores strictly liable if they had obscene content in their books. And the court said: you can't hold them strictly liable. And again, a very similar justification for what the court said, which was: you can't expect bookstore owners to review every single page of every single book.

Holding them strictly liable is unconstitutional; there's got to be some kind of fault. And it's that mandate from the Supreme Court, that there must be some kind of fault before you can hold them liable, that is driving this move toward conduit liability and Section 230. Now, I've heard some people take my argument and my paper, people who might adopt it, and say: well, that means we don't need Section 230; it's redundant anyway. I'm not sure that's quite true. For one, I think there was a benefit to passing Section 230 in the 1990s, when you did have this uncertainty and you had new companies. And I think there's a benefit today; again, certainty. But also, even if I persuade you that there might be a common law solution to this, the common law takes decades, even centuries, to get to a fixed standard. So it probably doesn't make sense to repeal Section 230 with the hope that there will be some judge-created policy that has the same effect.

[00:28:32] Gus Hurwitz: I should just explain, for listeners who might not be familiar with the concept of the common law: this is the idea that the law forms over a period of time as individual courts and judges make a series of decisions. Some of the decisions judges make are going to lead to good results, and over time those will tend to stand and be improved upon and modified and incorporated into the law. And over time judges will also make bad decisions, and those will be reversed by legislatures or by other judges who recognize, hey, this idea had a problem.

So over a period of hundreds or thousands of individual decisions, in tens or hundreds of jurisdictions, we start to get a coherent body of law, and that's how much of our law has formed. It takes a long time sometimes, though not always. And it really stands in contrast to the idea that a legislature might just say, "This is what the law is." They'll have some hearings and maybe investigate for a couple of years, sometimes a couple of hours, and say, this is what the new law is. And that's very sticky and very hard to change. It's a very different approach to developing new laws and legal norms.

[00:29:53] Brent Skorup: Yeah. And just to wrap this topic up: I think there's a lot of politics around it, and you don't always get clarity and clear motivations when it comes to politics. But I think both sides misunderstand the history of Section 230 and the relevant law and the trends in the law. Because if they did, I think it would really depressurize these discussions.

So I guess I'm a faint-hearted defender of Section 230. I believe it serves a good purpose, generally speaking. But it's also not the word of God handed down to man. There are areas of it that could use inquiry and study by academics and judges. But the focus it's taken on from politicians, I think, is really undeserved.

[00:31:00] Gus Hurwitz: So the last thing that we should touch on, for better or worse, is the topic of censorship. And I'll just ask it this way: is Section 230 pro-censorship or anti-censorship, or both?

[00:31:18] Brent Skorup: Section 230 gives companies, internet companies and their communities, the freedom to decide whether they want to be pro-censorship or pro-free speech. And online, of course, you see that: there are all kinds of communities that choose all different types of things to censor. On the more laissez-faire end, you have a website like 4chan, which really doesn't take down too much content, and there's a lot of disgusting content on there.

On the other side, you've got Facebook, which is intentionally trying to create a safe social network where children and grandparents and parents can enjoy being. And that means removing a lot of content: pornography and animal abuse and racism and hate speech.

So Section 230 protects both ends of the spectrum. You can take a hands-off approach, or you can take a fairly intrusive approach to what you'll allow, not to mention, you know, openly partisan sites that won't tolerate other viewpoints. And so I think there's a good balance struck with that.

[00:32:44] Gus Hurwitz: Yeah. So the distinction, or the question, of censorship and Section 230 is a hot topic and something that we debate and discuss a great deal. But it's really the wrong question, because Section 230 is about empowering private platforms to do stuff, and we call that stuff moderation.

What's the difference between censorship and moderation? Well, censorship tends to be government-compelled moderation: it's the government deciding what content can or can't be posted or hosted. So, as hard as it is at times to believe, both legitimately and illegitimately, and I say this somewhat facetiously, Facebook is not the government. They are really big. But when they're making moderation decisions, they are making moderation decisions: they're taking corporate risk and they're instantiating their own corporate values. And if you believe, for instance, that we should be able to have platforms that support socially valuable causes, then we need to allow them to make content moderation decisions in order to do that. And you can't allow that for some but not for others without risking both.

Okay, well. I'm confident that the discussion about Section 230 is not going away. I almost said it's going nowhere; I think that might have been a bit of a Freudian slip, but I'm sure that it's not going away. Any last thoughts for our listeners to take with them as they go out into the world and encounter Section 230?

[00:34:34] Brent Skorup: Well, I encourage you, if you'd like to learn more, if something I said intrigued you: you can read my law review article in the Oklahoma Law Review in 2020, with Jennifer Huddleston, about the erosion of publisher liability in American law.

[00:34:48] Gus Hurwitz: Okay. Well, thank you, Brent Skorup, for joining us today. I have been your host, Gus Hurwitz. Thank you to our listeners for joining us on this episode of Tech Refactored. If you want to learn more about what we're doing here at the Nebraska Governance and Technology Center, or submit an idea for a future episode, you can go to our website at ngtc.unl.edu, or you can follow us on Twitter at UNL underscore NGTC.

If you enjoy the show, please don't forget to leave us a rating and review wherever you listen to your podcasts. Our show is produced by Elsbeth Magilton and Lysandra Marquez, and Colin McCarthy created and recorded our theme music. This podcast is part of the Menard Governance and Technology Programming Series. Until next time, keep moderating well.