Tech Refactored

Social and Hard Technologies

September 16, 2022 Nebraska Governance and Technology Center Season 3 Episode 4

Author and Law Professor at Washington and Lee University, Josh Fairfield, joins Gus to discuss social and hard technologies, and topics from Fairfield's books: "Runaway Technology: Can Law Keep Up?" (2021) and "Owned: Property, Privacy, and the New Digital Serfdom" (2017).

Follow Josh Fairfield on Twitter @JoshFairfield
Follow Gus Hurwitz on Twitter @GusHurwitz
Follow NGTC on Twitter @UNL_NGTC

Links
Nebraska Governance and Technology Center

00:00 Welcome to Tech Refactored, a podcast in which we explore the ever-changing relationship between technology, society, and the law. I'm your host, Gus Hurwitz, the Menard Director of the Nebraska Governance and Technology Center.

00:25 On this episode, I'm speaking with Josh Fairfield, the William Donald Bain Family Professor of Law at Washington and Lee University School of Law. Josh is a leading law and technology scholar; his articles on protecting consumer interests in an age of mass-market consumer contracting regularly appear in top law and law and technology journals,

00:44 and his work has also appeared in outlets like the New York Times, Forbes, and the Financial Times. He has also written several books on these topics. In 2017, Josh released his first book, titled "Owned: Property, Privacy, and the New Digital Serfdom," about how we don't really own anything that we buy — our fully bought and paid-for movies, books, and music online —

01:04 we don't really own it. And in 2021, he released "Runaway Technology: Can Law Keep Up?" with the surprising answer: yes — law in fact can evolve faster, and often does evolve faster, than technology. It's just social technology, a technology of a different kind. And once we know how it moves, how it works, we can evolve our language of cooperation — our law — faster than technology.

01:27 Before he became a lawyer, Josh was a technology entrepreneur, serving as the director of research and development for the language-learning software company Rosetta Stone.

01:40 You know, Josh, you're one of my favorite people in the world to talk to, because you always, uh, blow my mind a little bit. And — I think you'll appreciate this characterization — you also rewire my brain a little bit and change how I think about things. I want to start with something that you just said as we were chatting: what is technology, and what is social technology?

01:59 Yeah, sure. All right. So my favorite definition of technology is, uh, by a Polish science fiction author, Stanislaw Lem. What he said is that technology is "the domain of problems and their solutions." That really resonated with me, because I began to notice something — and that's actually what this new book is all about.

02:21 I began to notice something a while back, which is that we often use social technology to handle the same problems that we use hard technology for, and the two often have a kind of handoff. So you and I, for example, right now are looking at each other over Zoom, and we're recording this. But we could convey information like this in a classroom.

02:41 The social technology of a classroom has certain features: everybody's looking at one speaker, and one speaker is conveying information to many people. We can replace that with one person speaking on a podcast and being listened to by what we hope is many people. We've swapped the social technology of the classroom for the hard tech of the podcast.

03:02 Or you could think about, you know, the mail carrier and email, or you could think about money transfer networks that were run through informal connections versus banking systems. Um, the point is that we sometimes do things socially, and we sometimes do things technologically.

03:20 And when I started looking at it through the lens of Stanislaw Lem's definition, I realized that law is technology. It's a problem set — the domain of problems, specifically the problem of "how do we live together without violence?", and their solutions. And we in fact evolve our social technology, we evolve our law, just like you see evolutions of hard technology.

03:45 And that was the basis for me to begin to say that the idea that law always lags technology is necessarily nonsense, because it's like saying technology lags technology. The real question is: where do we put our efforts? Where do we try to develop things? That's what determines what stays ahead of what.

04:03 So to spin out an example — and part of my question with this example is, is this a good example? Sure. We could think about ways to avoid water damage or flood damage to houses, and there are lots of ways. We could just not build houses near riverbanks — uh, that's a social technology. Or we could have building codes for how we're going to do it —

04:24 that's more of a legal technology. Or we could build houses out of materials that are more water resistant, and we could develop better sump pump technologies, or we could develop, uh, dams and levees to control the water. Right. And that's more of a technological — and perhaps a civil engineering — solution.

04:41 So you're saying these are all ways of solving a specific problem; they're all technology in some sense? You got it. And in fact, there are loops, right? Nothing is either one or the other. I'll give you an example from, you know, the pandemic — I guess it's not over yet.

05:00 There were these COVID tracing apps. Now, some of those COVID tracing apps were actually quite good technology. They used the GAEN framework, they were built purely out of tokens, they really protected privacy. But because people in Virginia and California, where these were first floated out, so distrusted them — there was such a social lack of trust —

05:19 they didn't use them. The application failed technologically to do its job, but the failure was a social failure. In New Zealand, they floated apps that were terrible in terms of privacy — they actually had government officials following up directly in people's homes, saying, "Were you there?

05:39 What were you doing there?" And yet, because people trusted that the government would do what it said it would do with the data, those COVID tracing apps worked and had massive uptake, and massive function. That's an example of how an app worked or didn't work — the functionality that worked or didn't work was a failure or a success of social technology.
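
A minimal sketch of the token-based design described here (a hypothetical simplification of my own; the real GAEN framework uses rotating cryptographic keys broadcast over Bluetooth): phones trade random tokens, and exposure matching happens entirely on the device, so no central party ever learns who met whom.

    import secrets

    class Phone:
        """Toy model of a privacy-preserving, GAEN-style exposure-notification client."""
        def __init__(self):
            self.my_tokens = []      # random tokens this phone has broadcast
            self.heard_tokens = []   # tokens overheard from nearby phones

        def broadcast(self):
            token = secrets.token_hex(16)  # random, unlinkable to identity
            self.my_tokens.append(token)
            return token

        def hear(self, token):
            self.heard_tokens.append(token)

        def check_exposure(self, published_positive_tokens):
            # Matching happens locally; the server never learns who met whom.
            return any(t in published_positive_tokens for t in self.heard_tokens)

    alice, bob = Phone(), Phone()
    bob.hear(alice.broadcast())           # the two phones were near each other
    published = set(alice.my_tokens)      # Alice tests positive; her tokens are published
    print(bob.check_exposure(published))  # True: Bob is notified, privately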

06:00 So these things loop back into themselves: hard tech can work or not work because of a failure of social technology, and social technology can work or not work depending on the hard tech affordances that we've got available to us. So is that an example of equilibrium or disequilibrium — or perhaps of bounded or unbounded constraints?

06:19 So what I'm thinking is, there's an irony in what you just said: the more concerned we are, the less we socially trust the technologies, and the higher the expectations and demands we're going to have for how well they work. So when we've got greater concerns, we may design better technologies that we might not use, because we don't trust them.

06:37 If we have less concern, we're going to have lower expectations of the technology. Right. So we're gonna design poorer technologies, but we're going to use them. Right, but we're gonna use them. And in a sense, that points to why we keep trying to design our way out of problems. I mean, if I had to say what the new book is about, it's that we keep trying

06:55 to hard-tech design our way out of problems that are essentially social problems — problems of society. That's why it keeps failing, and our attempts to keep putting in more and more features to lock this up tighter and tighter don't resolve the fundamental disconnects of a society that hasn't really worked out social ways of using this stuff in life-giving ways.

07:15 I want to get to your book, but I have one more question or topic to put on the table. Yeah, sure. It'll bring us there. We talk a lot about privacy. Do we have any idea what the heck we're talking about when we talk about privacy? Okay, so obviously you're just baiting me here; this is great.

07:31 Um, because I've got this new article out on it, called "You Keep Using That Word: Why Privacy Doesn't Mean What Lawyers Think It Does." So on the one hand, let me give you the flip answer first. You said the word privacy, I heard the word privacy, and we both know perfectly well, broadly speaking, what we're talking about.

07:53 Now you and I can get down into the weeds and we can disagree about a whole bunch of stuff. And we can actually talk ourselves to a point of fundamental disagreement about what privacy is, because when you take the lid off of the word privacy, there's a bunch of different contexts in which it's used by a bunch of different people for a bunch of profoundly different reasons.

08:13 So we can look at that and we can say, oh, there's no essence to privacy, there's no core to it. But there's no essence to the word liberty either; there's no essence to the word law. There's no essence to the word run. There are 645 completely different contexts in which we use the word run: to run a computer program, a run in your pantyhose, a salmon run, to run a bath.

08:42 Look, what's happened with the word privacy is that corporate surveillance wants to grab so much of this, and they don't wanna be regulated. And so what they consistently say is: we can't have effective regulation around privacy because nobody knows what it is. That's a nonsense sentence. If you applied the same requirement to liberty, if you applied the same requirement to democracy, if you applied the same requirement to the word "is" — they all fall apart

09:11 if you try to drill down for some essence. So what is privacy? I'm gonna give you the linguistic answer. Privacy is the set of contexts in which humans make reasonable bids for privacy. And you say, is that circular? And I say, yeah, language is circular. What's justice? It has to do with fairness. What's fairness?

09:33 It has to do with justice. It's not like there's some core definition that you can drill all the way down to; when you start looking at words, it's turtles all the way down. What we need to do is look at the set of contexts in which people make claims on privacy and ask ourselves: as a society, is that good social technology? Are humans thriving when we vindicate those interests, when they make those claims for privacy? Or do we fail as a society when we vindicate those interests?

10:05 And frankly, I think that pendulum has swung pretty far in one direction. I think right now we would be much happier, and people would be thriving far better, if we vindicated a lot more of the contexts in which they say the word. Notice I've avoided defining it, because I think definitions do damage. Define liberty for me.

10:23 Oh, you can't define liberty? We can't have any liberty. That can't be how this works, right? Define democracy. You can't define democracy? Oh, we can't have any democracy. Can't be how this works. You said two possibly competing things in there, and I want to, uh, flag both of them and see how you resolve them.

10:40 You started by saying lawyers define privacy and lawyers talk about privacy — you used the word lawyers in identifying the problem — Sure. — but then you went on to say that the corporations say we can't regulate privacy for all these reasons, or whatnot. Um, so is this a law problem, a lawyer problem?

11:01 Is it corporations running circles around us? And fundamentally, is this an issue with privacy, or with the nature of law and regulation? Right. We can go back to the early 20th century — that's when we start having rights debates in this country, and start really talking about how we have rights, and suddenly, well, the Supreme Court needs to define what this right means, and strict scrutiny versus intermediate scrutiny.

11:23 And come on, guys — I'm gonna say something that'll get me in trouble here — let's go back to Lochner and rational basis, and just have judges saying, look, is there some reason that the legislature's doing this thing, instead of trying to define things into buckets of rights and concepts like privacy?

11:41 Well, I do think that rights talk has real value. I think privacy talk has real value. I think humans value it and thrive on it. I've lived in countries where people have more rights; I've lived in countries, the United States included, where people have fewer rights. I do think that rights talk is valuable. And I think that the tension you describe can be unpacked quite neatly.

12:02 The idea that we can't have privacy regulation because nobody can define privacy is a lobbyist move. I watched this unfold in the Do Not Track debates, where the companies said, oh, we've committed publicly to enabling a technological Do Not Track framework — but then they scuttled the whole process by saying, yeah, but nobody can really tell us what "do not track" means. "Do not track" —

12:25 and I know this is crazy — means do not track. So it's a lobbying move, and it's a familiar lobbying move. "What does blank even mean?" is not a real request for an honest discussion about what blank means; it's an attempt to get out from under the discussion. Now, lawyers as a guild are susceptible to that attack, because we are really into definitions.

12:50 Because our superpower, when we write contracts, when we write constitutions, is defining words in order to constrain their multiple meanings. And so when somebody asks us for a definition, we provide one, and when they say, "oh, but that doesn't capture these cases," we begin to wonder if there's a problem with our tool set.

13:08 There isn't. It's just that we need to understand that when we define "good faith" for purposes of a contract, it has nothing to do with the word good or the word faith or both. It just means what it's been defined to mean in that specific context. And that's okay, as long as we don't drag that back out and say, oh, good faith must only mean what I defined it in this contract to mean.

13:29 So I think lawyers — we're enamored as a guild with definitions. And so I think we're a bit susceptible to this claim that we can't have law if we don't have a precise definition, even though it's nonsense for every basic term in the English language on which law rests. Justice — define it for me. How does this play out in public policy?

13:53 And I guess perhaps now we're going to another of your papers, which you queued up, on whether the law can keep up with technology. Um, so take the "do not track" example. We say we're going to pass a law that says "do not track," and there's an intuitive meaning to what that means. At the same time, every company with a website needs to do some

14:12 logging of access to the website for billing purposes, and for development and debugging purposes. Yep. And all this stuff. So they need to do something that looks like tracking. So we immediately have a disconnect between at least three different parties: the lawyers, who are going to be defining these terms to mean something;

14:32 the companies, who are trying to comply with them under the threat of potentially serious legal sanctions; and the public — the consumers — who have at least one understanding, probably more than one understanding, of what this means. So let's take your proposition as correct, that we can pass a law that has content, that says "do not track."

14:55 Yeah. We have a lot of questions about what that means. So how does the law actually implement that in a way that ultimately gives meaning to it? The same way we interpret, read, and use any statute. Right. The problem you've described there is the problem of statutes — the problem of law generally — which is that we use words, and every last one of those words is susceptible of multiple meanings.

15:20 And what we're really trying to do, though, is not have fights about meanings. We're trying to say which version of the rubber hitting the road works out the best for this society, helps us live together well. What do you do in the normal run of things? You ask a court; the court brings its wisdom to bear on the two competing interpretations.

15:42 I love Bob [indistinguishable]'s narrative article on this, where he says essentially what happens is that law bubbles up from below. You get these two competing interpretations of what "not tracking" means. The judge doesn't actually invent law, but simply kills off one of those stories and lets the other story grow and flourish.

16:01 That precedent comes up against other precedents, and the one that seems to fit more naturally with our lived experience again flourishes — this is within a healthy system; we do not presently have one — and bubbles up to the next level, and so on and so on. And by the way, an interpretation that might have been helping humans thrive under a prior system can begin to break over time.

16:24 And so the judges revisit it, and the people bring new narratives to play. And those new narratives compete again, in an organic, iterative, humble, story-based, analogy-based process. I mean, my answer in short is: that's what the common law does. It makes sure that we don't get hung up on definitions —

16:44 and, as Holmes said, "experience is the life of the law" — and instead bring our experiences, our stories, our narratives. And we say: look, in this context, this interpretation of tracking played out and it really damaged me; my kid got stalked and defrauded. And in this context, um, we can't run our business without it. You bring those arguments, and you bring that lived experience, and you pack it into the series of narrative balances that courts look at.

17:11 That's what we do. It's why, you know, the European Union, which has quite elaborate privacy regulation by this point — notice that they run most of their technology law, most of their privacy law discussions, through common-law-style courts, because they need this process of story gathering and comparing to work out where the rubber hits the road,

17:32 when words meet reality. My answer's the common law, man. Yeah. So that raises a bifurcated question. Everything's bifurcated. Some things are bifurcated. Um, a bifurcated question — okay, I've got a trifurcation. First, you said our system is just broken; we'll need to ask why that is. But then two questions about ways that it might be broken:

17:53 we have a very statutorily defined approach to developing these new legal norms. So is the problem that we're not in a common law system anymore — that we're trying to codify and use statutes to address these issues? Or is the problem that we're impatient, spoiled lawyers with clients who want answers now,

18:15 and the reality is that these are hard questions, and it's going to take a generation or two to sort them out through the common law process of interpreting statutes, and we just need to wait and allow that process to play out? All right. Well, so, your trifurcation — I guess I'll go for the last one first;

18:34 we can do this really fast. I think, again, the idea that it takes time — it doesn't. A functioning system moves really quickly. I'll give some examples. You know, every time there's a new technology that comes out that has anything to do with intellectual property or whatever, people say, "oh, this is absolutely gonna destroy the industry," blah, blah, blah,

18:53 and it just doesn't, at all. An example: when people first started using cryptocurrency, their view was that law couldn't really touch it. And a lot of people went to jail. Like, law was already there; law didn't have any problem dealing with this bleeding-edge technological issue. I've never seen law develop so fast in my life, because this touches on the banking system. So law can move,

19:14 and should, when it's healthy, move at a pretty fair clip. What's wrong with the system — to move to the first part of your trifurcation — is that we have divergent languages, that there are divergent informational spheres in this country, and they touch less and less. The Venn diagram now does not overlap.

19:41 I do think that one of those informational spheres is more connected to science and more connected to being corrected for errors — and that reveals my own affiliation. But the core problem is clear from anybody's perspective, which is that we are not speaking the same language at all anymore. And therefore there is no basis for using our superpower. Language is the human superpower.

20:07 We use it to evolve our social software in historical time, rather than waiting for, like, geological time to evolve. You know, ants and bees, they're like us: they cooperate in huge numbers. But they have to evolve in geological time. We can just declare — you know, we can just depose the king and declare a republic. Ants and bees

20:27 can't do that. Once our superpower to evolve our social technology breaks down, though, we get stuck. And that's been more and more the case. Law can keep up — but not if we can't talk, not if we don't have a common language. You and I have previously, uh, spoken about the role of language in society and cognition.

20:46 Sure. And we're not gonna quickly jump into just a 30-second sidebar on all those issues. But at one level, we can think of language as the human operating system? Yeah, exactly — massively parallel. And in order for us to interoperate together, we have to be running an interoperable operating system — have this shared language for how we process reality, how we understand our social construct, our social contract. And with different languages come different conceptions of the meanings of words.

21:17 It doesn't matter — left or right, or right or wrong, or however you're gonna think about this. It's just social: you have to have shared social facts. And I used the phrase social contract there, which is a subtle way to move on to talking about your book. Sure. So what are you trying to do in The Social Con?

21:38 Okay, so The Social Con is exactly what we've been talking about all along. You keep saying we're gonna get to the book; we actually never left the book. It says this: look, we have this problem of language, and our problem is that there are too many nodes. If you think of language as an operating system that runs on nodes made up of communities of people, those communities of people generate language by how they talk to each other.

22:04 They generate all kinds of language. They generate language like, you know, "fake news"; they generate language like "Black Lives Matter." They give meaning to those words by how they use them. And that process of language generation is supposed to be how we progress towards the future. My personal theory is that something's gone wrong with that process.

22:27 We can't speak to each other anymore, and a large part of it has to do with the fact that we've changed the ways in which we're speaking. Just to give a real concrete example — 'cause that was abstract as hell — it's hard to hate up close. It just is. Anybody who's sat across the table from somebody of a, you know, different political persuasion has felt that: you still feel that sense of friendship.

22:49 It's just hard to hate up close. It is not hard to hate over the internet. And yet the internet is making the claim that it — or platforms on it; Facebook, Zuckerberg, I'm looking at you — is going to replace kitchen tables; that Facebook chat rooms will replace kitchen tables, will replace diner tables, as the relevant central location, the context, for our language.

23:15 So the first part of The Social Con is saying: wait, whoa, hang on. We are in the process — especially during the pandemic, we're in the process — of a mass transfer of ordinary social interaction, of that language-generating activity that we rely on. We're transferring it to a different context. And we have paid no attention —

23:37 I mean, none — to the social technology that we have left behind in that shift: the affordances of a table at a diner, across which you're sitting from somebody who's of a different political party. There are so many affordances, if you think about what's going on there: everything from the bandwidth of the interaction carrying tone of voice, to your being able to pick up the bill in order to adjust your friendship, to the smell of the food, to the hubbub in the background that causes you to lean slightly closer together.

24:08 We haven't even begun to think about why some contexts create life-giving language and some contexts generate hate and division. And that's what the book is about: the book is tracing this untold story of when we moved — or it's supposed to; you know, I'm writing it, and I'm behind on it, and I wanna catch up on it.

24:29 Um, but that's the idea: it's tracing the affordances, the social technology, that we left behind when we moved from kitchen tables to Facebook chat groups. And we felt that loss a little bit. Like, think about what was lost when we moved from the classroom to teaching over Zoom. Everybody felt this sort of subtle, undefinable loss — the information was still being conveyed,

24:52 you could still see people's faces, even see their reactions; in some ways it was easier! In some ways, some of those teaching techniques are gonna stick. But there was no question that there was an undefinable loss — an energy, a sense of connection, a sense of parallel connection between the students looking at each other to the side. There were just all kinds of things that were lost in that transition.

25:16 That's what the book's looking for. It's saying we need to pay attention to the slowly evolved and almost invisible social affordances that actually form the warp and weft, the loom on which we weave human connection and language. And so much of that has simply dropped out, because we dumped ourselves into a different context and paid no attention to that whatsoever.

25:37 So I want to probe the strength of the claim, or perhaps the extent of the claim, a bit. Sure. You've given two poles, or two positions — I don't know if they're polar opposites — the kitchen table setting, or the across-the-diner-table setting, in person; and online, not at all in person. I'd start by positing:

25:57 I think over the course of the 20th century, the development — earlier than that, actually — of the telegraph and the telephone made the world smaller, in some negative ways, but also in ways that made us share a greater common humanity across cultures. And obviously we had two world wars over the 20th century, but it also made the world much more peaceful than it has been at any point in human history.

26:25 And that was largely facilitated by a new mode of communication that lacks all those affordances. Perhaps this is a story of commonality at the social level that was facilitated by these new forms of communication, while we lost something at the individual level — or not. Um, so that's one example I want to throw into the mix. And the other example, which I can

26:47 already see both of us perhaps rolling our eyes at: I think a lot of people, probably most people, would recognize some truth in what you're saying. So the question then is, how do we get those affordances back? Can we, without throwing out all of the wonderful parts of the internet and modern communications? And I'll just ask: will the metaverse bring back those affordances?

27:12 Oh, man. All right. It's funny that you ask about the metaverse. First of all, if you're talking about the metaverse in terms of, like, Facebook's horrible experiment — no. That thing, just as a pure graphic design question, is so profoundly uncool that it has no chance. But it is a legit question, 'cause, as you know, I cut my teeth in

27:35 the 2000-to-2010, um, explosion of online communities and the addition of graphics to online communities — virtual worlds and virtualization technologies — which did add affordances: the sense that somebody else was there with you in the room. And you're right that a lot of my intuitions are hopeful.

27:54 I've just given a very negative account of moving from the kitchen table to Facebook. But in the end, I'm also seeking to account for — you know, for a while my very closest friends were people that I met in online games. My closest friends still often are people that I've met online and have used these technologies to grow closer to. Here's the point: it's clear we can do even better with these technologies than we do without them.

28:22 The issue is, can we remember — can we find — what we needed from real space? Can we solve the issues that are causing our languages to just break up and our cultures to come apart? And then can we get the benefit of the affordances that we didn't have before, that can draw us all closer? I think that information technologies can do what we do around the kitchen table even better — but boy, they're not doing that right now.

28:51 And so step one is identifying what we lost and what went wrong. Getting back to baseline would be a really good place — getting back to any kind of shared language, any kind of common humanity, any kind of shared culture would be a great one. But then, moving forward... yeah, I do wanna see these affordances make it even better.

29:10 And so, in that sense — not Zuckerberg's crazy crap metaverse, but a metaverse which is a loose association of these technologies that bring us really close together and form and weave close human connections. We are so desperate, so hungry, for community right now. If we can use these technologies to actually feed that need for community in ways that cause people to thrive — instead of people going into crazy conspiracy theories because they need the community — like, that's the way we thrive. But step one is undoing the damage we caused in this mass transition where we didn't think about it.

29:46 But yeah, at the end of the day, I do think — you know, not Facebook's metaverse, but in general — this aggregation of technologies can do better than baseline. We're just nowhere near it. So, Josh, I invited you to come on the podcast to talk about your book, and I'm just throwing hard questions at you.

30:01 I'm doing the, uh, dissertation-defense "I'm grilling you" sort of thing. Go for it. That's fine. That's nice, man. I've got another one for you. Are you sure that modern communications has been making things worse? Or has it possibly been revealing tensions and problems that we didn't recognize were already there?

30:22 And that we're able to connect with people who are similar to us today and form new communities that we didn't recognize before. I just thought, oh, this was my proclivity — I was weird, I was wrong to think whatever I think or feel whatever I feel. But now I'm realizing that there are actually a lot of other people like me, and that majoritarian politics was actually excluding a lot of other views,

30:49 and now we're able to form new politics and new coalitions. And it creates more tension, more hostility, than we had before, just because we didn't recognize that these were questions to be asked and issues that were being ignored by our society. Yeah. And I think that there are two sides to that coin.

31:08 On the one hand, there's no question that LGBT kids, um, have needed — for their own safety and sanity — to form communities; that they have in fact seen that society has been casting them under poisonous, toxic norms that were killing them — literally killing them, when you look at the studies — and that they've done that in self-defense.

31:29 It's also true that racism and hate are back, and they are fueled by extremist communities and amplified voices from highly motivated extremist groups. So that doesn't cut one way or the other. I do think that what we've seen, especially around disinformation and misinformation, is something new. It's a function of the old quote —

31:57 I think it was Winston Churchill, right? — that a "falsehood could make it around the planet before the truth even gets its pants on." If there's a law out there — I forget the name of the law — it's simply that the cost of generating falsehood and spreading falsehood is an order of magnitude less than the cost of debunking falsehood.

32:16 And that, plus near-zero-cost communication, plus the profound and planned negligence of the companies that profited from outrage and spreading outrage in order to foster engagement — again, Facebook, you know, they come in for it — that combination has created... So, you know, I could reference

32:36 Yochai Benkler's work, where he just tracks tweets and connections between different people within different spheres. And you can watch the different communities — specifically, you can watch the center-right of the political spectrum vanish, and you can watch the creation of a separate informational sphere that is distinct and cut off from other connections, from the center-right

32:59 kind of all the way through to the left. You can watch that kind of separation — that kind of informational separation. We've done it before, and it can be done by government propaganda; we've certainly seen that, and we're seeing it right now, um, with respect to Russia, where we can see that government propaganda still works.

33:20 But that's not really what's tearing us apart in the United States. What we have is a separation of informational spheres that is a real change from the seventies — a real change over the past 40 to 50 years. Um, and I think everybody has marked that progression. So is the concern there — is your concern — more about the loss of positive affordances, or about the creation of new negative affordances: the ability of those who would harm our society for their own benefit to do so using these tools?

33:56 And I will accept an answer of "yes, both" only if you explain them equally — how they fit together. Sure, I love it. Um, yes, both. But I do think there's a strange relationship. First of all, yeah, the idea that extremists can get out there and be heavily amplified — I do blame the extremists, but in some sense that's letting the platforms off the hook.

34:21 It was the platforms who amplified the voices in order to foster engagement. So the thing is, it wasn't just a neutral shift; we didn't just move from the kitchen table to a virtual kitchen table. For anybody who was around in, like, the early-to-mid nineties — you know, chat rooms and whatever, right?

34:37 Like, there was sort of a glorious, anarchistic kind of feel to the whole thing. You didn't have the sense that there was anybody behind the scenes using this entire system, kind of pushing everybody quietly in a given direction for gain. That's not what's going on in your Facebook feed. And, you know, again, I'll use Facebook, but it's a proxy for all of them.

34:59 That's not what's going on. There is a profit motive — there's nothing wrong with a profit motive by itself — but that profit motive has needed to stay hidden in order to extract the maximum amount of data and use it without the knowledge of the people it's being extracted from, and engagement is fostered through outrage.

35:16 And given the rest of the things that we now know about what the algorithms select for, that undercurrent — building the platform in order to extract data and drive engagement — has caused our experience of moving from the kitchen table to the chat room to be subtly twisted. That's the problem. The problem is the amplification,

35:38 largely because the algorithms have decided that people who are really mad, or really into it, are really invested — the algorithms discovered that and drive it. And that's why that's our experience: being really mad. As the comedian says, I don't know if you've been paying attention recently, but everybody's really mad all the time.

35:57 That's why that's our experience. I wonder how much of this has to do with our shared language, and technologies generally, being built on some understanding or conception of trust. Yeah. So you spoke about the internet in the 1990s, and I show my students ads for the early internet from MCI talking about how this is a pure place of mind — there are no races or genders; everyone's going to be equal.

36:24 And John Perry Barlow and the Declaration of the Independence of Cyberspace, right? Sure. It was such a utopian model, and it was beautiful — it was wrong, but it was beautiful. When we think about our communities, part of being a community is having an immune system. Part of a language — and I'm just asserting this; I don't actually have any basis for

36:40 asserting it — is having some sense of skepticism, some ability to reject noise. And the foundation of the internet was, in many ways, legitimately utopian and idealistic — absolutely naively so — but it didn't have a sense of security and threat built in. Right. In the early two thousands, when people started coming to the internet en masse, it wasn't nearly so commercial, and it was just, "Great, I can talk to my friends,

37:06 this is wonderful." And now we've got Russian misinformation campaigns and attention-economy concerns — and we could have an entire discussion about the pros and cons of the commercialization of different aspects of the internet. But there certainly are problems there, and manipulations there, that aren't in the best interest of the users.

37:27 How much of this is, uh, a problem of our language not having an immune system, or inherent skepticism, here? And to what extent do you think this makes the problem more difficult than other legal problems? Going back to the example I started with at the beginning — how do we build houses that aren't going to get flooded?

37:47 Well, in that case, you're not worried that the river is going to deliberately try to flood the house and respond to whatever rules you put in place. But that might not be the case here. Right. Um, yeah, that's a multi-layered and deeply difficult question. Let's focus on the analogy of the immune system, because everything that I've been saying is about the value of finding the right analogy and building it into a way to talk coherently about our problems,

38:14 right? So let's do it. Is there an immune system in language? Yeah, there is. And it's strange: words have so many meanings, and yet we can use them because they don't have all the meanings, right? It's Orwellian to say "war is peace," right, or "ignorance is strength" — to assert something and its opposite, when we've constructed those dyads as opposites to each other. But we can correct things.

38:45 This goes into a little bit of philosophy of language. It is important to be able to say: no, that's not how we use that word. I'll give you an example that goes back to our privacy discussion — privacy policies. They're not. They're data use policies. This use of the word privacy, in a policy that says we're gonna take all of your data and sell it to whoever the hell we want to, is frankly Orwellian. The immune system is statements like that — statements rejecting the use of the language, saying: no, you used words wrong,

39:23 that's not what that means. That democracy doesn't mean, for example, silencing people. There are a number of ways in which we can reject that. The problem — in your analogy, the river that responds and begins to invade — is that when we've got divergent languages, when we've got basically completely divergent cultures, that immune system fails,

39:43 because one language isn't going to accept the corrections issued in another. This is highly politicized, but that's maybe one way of looking at the debates around what's called "cancel culture." Do you see it as an attempt to really press on people who are not paying attention to the damage their words cause, or do you see it as a politicized attempt to silence?

40:11 The differing interpretations stop the immune functioning from happening, because they're so widely divergent. I think that's our big issue: without common language, we're left either to create it or to watch the social fabric continue to unravel. I said at the start of our conversation that I always love talking to you, because you change how I think about things and you reprogram my brain a little bit.

40:36 I've been working on a paper that'll be coming out soon in a book that one of my colleagues is editing, and also in the Journal of Free Speech Law, that uses Claude Shannon's information theory and the idea of signal-to-noise to think about online ecosystems. And one of the things that you just said made me think that one of the problems with many of these platforms is they're amplifying the noise. Right.

41:00 Absolutely. And we generally want to filter the noise and amplify the signal, but there actually are mechanisms that might be amplifying the noise, which just destroys the capacity of any information ecosystem. So I'll ask you a question that I'm sure you have an easy answer to: what do we do about it?
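
A toy numerical sketch of that signal-to-noise point (illustrative numbers and weighting of my own construction, not drawn from the paper or the episode): if a feed's ranking weight grows with a post's outrage rather than its informational value, amplification raises the noise faster than the signal, and the feed's effective signal-to-noise ratio drops.

    # Each post carries an informational "signal" value and an outrage "noise" value.
    posts = [
        {"signal": 0.9, "noise": 0.1},  # informative, calm
        {"signal": 0.2, "noise": 0.8},  # outrage bait
        {"signal": 0.7, "noise": 0.3},
    ]

    def snr(weighted_posts):
        """Ratio of amplified signal to amplified noise across the feed."""
        total_signal = sum(w * p["signal"] for p, w in weighted_posts)
        total_noise = sum(w * p["noise"] for p, w in weighted_posts)
        return total_signal / total_noise

    flat = [(p, 1.0) for p in posts]                       # neutral amplification
    outraged = [(p, 1.0 + 3 * p["noise"]) for p in posts]  # engagement-weighted

    print(f"flat feed SNR:    {snr(flat):.2f}")      # 1.50
    print(f"outrage feed SNR: {snr(outraged):.2f}")  # 0.93: the noise got amplified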

41:20 What do we do about it? Okay. The answer is actually easy: we gotta see it, and then we gotta do it. And that turns out to be really hard. If communities surrounding terrible ideas — bigotry, hatred — generating toxic language are the problem, then communities generating language that helps humans thrive are the solution.

41:45 What I tried to do was set out the kinds of communities that seem to generate language that leaves humans doing better after they're done with it than when they started — as a counterpoint to the kinds of communities we've been looking at and talking about here, which leave pretty much everybody, the hearers and the speakers, worse off after they're done speaking.

42:05 And the communities — and here, if anybody hears this and wants to drop me an email at fairfieldj@wlu.edu, please do; these are my ideas, but man, with one or two more great ideas from anybody else, it'd make writing the book so much easier — the kinds of communities that seem to work, to me, are ones that are non-hierarchical; that is, they don't use coercion or force as the basis for determining what rules the group is gonna propose.

42:36 It's the reason, for example, why we immunize legislators from prosecution while they're legislating. You don't want force to be the idea, because what you want is a competition of ideas, and if somebody's holding a gun to your head, that's all the discussion there is. The bad idea wins because of coercion.

42:56 It wins because of force. So: non-hierarchical, non-coercion-based. The second characteristic that I've identified — or that I think I've identified — is that they have to bring profoundly different tool sets to the table. If you've got a team made up of brilliant people who all have the same tool set, they're gonna keep coming at the problem with a limited capacity,

43:19 whereas if you bring in people who've got completely different tool sets — if your team is not just the three best computer scientists you've ever met, or the three best lawyers you've ever met, or the three best privacy lawyers you've ever met, but instead has a sociologist, a doctor, and a teacher on it, or, for example, a person who's been through the experience of having their data stolen by identity thieves or their data leaked by a major online platform —

43:43 you bring those sets of experiences together. And there's research out there showing that even if each one of those committee members, those group members, is not your best available athlete — they don't have as many tools as, like, the best person you could bring to it — together they come at it from completely different directions.

44:01 And so they end up generating a language that helps humans thrive better. And the third characteristic — and, you know, again, I'm pulling these out of a hat, and if anybody's got better ones, just let me know; I'll steal it and I'll thank you in the book — is that there have to be shared values. Because, by definition, coming at things from different experiences and different tool sets means that often people have quite different purposes.

44:24 And it doesn't work if a group is working at completely cross purposes. There has to be a community of wanting everybody to do better, of wanting everybody to thrive. There has to be a shared goal. An example of that would be, like, a corporate research team: they have a shared goal of building a product that gets to market, that earns money, that satisfies people's preferences.

44:46 That's a shared goal. They can come at it from a wild range of different directions, but the shared goal helps them function. So: non-hierarchical; profoundly differing tool sets and experiences; shared goals. If you create groups like that — and, you know, we find groups like this as health-generating nodes all throughout our language, all throughout our culture: juries, research teams, academic faculties, right?

45:11 There are a million and one places where we see this pattern replicated. And what I think we need to do is find whatever else we might need to make communities that generate life-giving language, and foster that — build those communities intentionally, um, so that they then start generating the kind of language that leaves humans better after they use it, rather than the kind of language that leaves humans worse after they use it.

45:36 Well, Josh, as I would've predicted, this conversation has been a hoot and a holler. It's always so much fun, man; I thought a lot and really enjoyed it. Any last thoughts? No — thanks so much for having me on, and I always have so much fun.

45:50 Tech Refactored is part of the Menard Governance and Technology Programming Series, hosted by the Nebraska Governance and Technology Center. The NGTC is a partnership led by the College of Law in collaboration with the Colleges of Engineering, Business, and Journalism and Mass Communications at the University of Nebraska–Lincoln. Tech Refactored is hosted and executive produced by Gus Hurwitz.

46:14 James Fleege is our producer. Additional production assistance is provided by the NGTC staff. You can find supplemental information for this episode at the links provided in the show notes. To stay up to date on what's happening with the Nebraska Governance and Technology Center, visit our website at ngtc.unl.edu.

46:32 You can also follow us on Twitter and Instagram at UNL underscore NGTC.