Tech Refactored

What's Happening With Twitter and FTX?

November 18, 2022 Season 3 Episode 13

On this episode of Tech Refactored, Gus is joined by Nebraska College of Law Professors Kyle Langvardt and James Tierney. Together they discuss the latest confusing and controversial happenings with Twitter – the blue check mark, content moderation, and Elon Musk’s acquisition of the platform from a corporate governance perspective. Later in the episode, they shift the conversation to the FTX debacle and cryptocurrency, and discuss what parallels may exist between Twitter and FTX.

Follow James Tierney on Twitter @JamesFTierney
Follow Gus Hurwitz on Twitter @GusHurwitz

Links
Nebraska Governance and Technology Center

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:11] Gus Hurwitz: Welcome to Tech Refactored, a podcast in which we explore the ever changing relationship between technology, society, and the law. I'm your host, Gus Hurwitz, the Menard Director of the Nebraska Governance and Technology Center.

[00:00:26] James Fleege: Hi, this is James Fleege, producer of Tech Refactored. Gus is feeling a bit under the weather today, so I'll be introducing our guests for this episode. Gus will be speaking with Kyle Langvardt. 

[00:00:35] Kyle Langvardt: I'm a professor here at Nebraska College of Law and a senior fellow at the Nebraska Governance and Technology Center.

[00:00:42] James Fleege: Kyle was our guest for the first episode of season three, in which he joined Gus to discuss platforms from the First Amendment perspective. Also joining Gus today is James Tierney.

[00:00:51] James Tierney: I'm James Tierney. I'm also a professor here at Nebraska College of Law. I teach the corporate and securities law classes and research how investors [00:01:00] and markets work.

[00:01:01] James Fleege: Towards the beginning of this episode, we did have some technical difficulties and briefly lost connection with Kyle, so our apologies for that. But without further ado, let's get into the episode.

[00:01:17] Gus Hurwitz: So Kyle Langvardt and James Tierney joining me today to talk about all things tech, or at least some things tech, or some things particularly [indistinguishable], because we have some really weird stuff going on in the tech world with Elon Musk's acquisition of Twitter and the FTX cryptocurrency meltdown. It may or may not be a crypto meltdown, or just a corporate fraud meltdown story, but my colleagues here are experts in both and eager to talk about both. So Kyle, I'm gonna just start with you. Can you tell us a bit about what we have seen over the last week or so with Twitter?

[00:01:57] Kyle Langvardt: Well, so maybe go back just [00:02:00] a little longer than a week, but Elon Musk completed his acquisition of Twitter, set about firing about half of the staff, and has kind of set out on a really erratic series of changes to the company's content moderation policy that have led to diverse fiascos, and along the way is likely in trouble with the FTC.

[00:02:25] Gus Hurwitz: Yeah, so you highlight in there that there are content moderation issues here. There are broader corporate issues. The Federal Trade Commission has had Twitter under a consent decree for some time, and it's debatable whether Musk, or Twitter under Musk, is still in compliance with that. He, or his lawyers, have apparently told employees that they're not subject to the decree.

He's declared that the decree is invalid because it was signed under duress, and apparently the European Union is getting ready to levy probably on the order of 400 to 500 billion worth of [00:03:00] fines. Yeah, maybe a slight exaggeration. On, uh, Twitter, your main area of focus, Kyle, is on the content moderation side.

So let's get your thoughts on that. And James, then I'll pull you in on the corporate side, uh, a bit on what's going on with Twitter.

[00:03:15] Kyle Langvardt: Yeah. Well, so on content moderation, I guess the first thing I would say is I never really understood the value proposition for Elon Musk in purchasing the platform. It didn't seem to be a particularly lucrative property. But, um, one thing that he did say was that he wanted to promote Twitter as, uh, this forum for free speech. And he said that he was concerned that the way Twitter was being run was overly censorial or biased or whatever. He also said that he wanted to get rid of a bunch of bots that were on the platform.

That the platform had a lot of problems with bots. Well, I think one lesson maybe that Elon Musk is learning the [00:04:00] hard way is that it's really complicated to figure out how to translate the concept of free speech to an online social media platform. Uh, because, you know, one employee at Facebook once put it this way to me, he said, you're dealing with this constant beer bong of posts.

Um, and, you know, just content that needs to be sifted through for various reasons. You know, some of that content is unlawful or dangerous. Some of it's just irrelevant or uninteresting, and it needs to be deprioritized if the platform's going to be a usable service. And so when you think of free speech, usually we think of this kind of hands-off approach to speech, but on an online platform, there's just no such thing. There's no way to run a usable or safe online platform without engaging in a lot of censorship.

[00:04:57] Gus Hurwitz: So we seem to be having a bit of technical difficulty [00:05:00] with Kyle. Uh, I'm not quite sure where we're going to be able to pick up what he's been saying, but James, I'm going to jump over to you now with a question, and we'll loop Kyle back in when he rejoins us.

So James, when Musk announced he was going to buy Twitter, it was a big deal, and then a short time thereafter, he announced he wasn't going to buy Twitter. And from a corporate law perspective, it became an even bigger deal. Uh, lots of discussion about how the courts were going to address this. What's your take on what's going on with Twitter, both from the corporate law perspective, the M&A perspective perhaps, but also from the corporate governance perspective?

[00:05:38] James Tierney: Sure. So just to get my cards out on the table here, you know, after the announcement that he was going to try and take Twitter private, I bought shares in Twitter, uh, in part as a demonstrative for, you know, my introductory business associations class, because I kind of wanted to make concrete some of these concepts of how to structure [00:06:00] transactions, and the way that we structure transactions is gonna affect both shareholders' willingness to approve a deal as well as what they can and can't approve.

So I think in that sense it was a really great learning opportunity from my perspective, you know, as a one-time shareholder who got the control premium as part of the takeover. That's ultimately part of the demonstration, right? When I asked students how I should vote the proxy on the merger vote, most of them said, yeah, take the money and run, right?

But there is this tension that goes to, uh, the points Kyle's making about how hard it is to run a robust platform with the kinds of network effects that come from mass adoption. And if there's one thing we've seen in this whole saga, it's that mass adoption kind of relies on user trust. And trust

both, you know, in the kind of quality and truthfulness or [00:07:00] veracity of what you're seeing, that the people, the users on the platform, are who they say they are, things like that. But then also just kind of, you know, the general user experience: is this something where it's really kind of unpleasant and I no longer wanna be involved?

And I think one of the things that this whole saga has shown is that there is sometimes a tension between wanting to optimize for different goals. And it may well be that one of the reasons why Twitter was kind of trading significantly below this takeover price, which of course, you know, $54.20 is a joke price, it's based on 420, the marijuana number, but had nothing to do, at least so the theory went, with the kind of fundamental value of the company. And so that, you know, that puts boards at a disadvantage from a corporate law perspective, because, at least the way that Delaware law is structured, in the circumstance when it's clear that a company [00:08:00] is gonna be sold,

the board has heightened duties to make sure that it's gonna get the best price possible for the shareholders. And that's just one of the few areas of law where you really have to look out primarily for shareholder wealth maximization. And if it turned out that selling to this group of perhaps overconfident and, uh, you know, highly capitalized prospective investors made outside investors in Twitter better off, then, you know, the board's gonna serve that up to them on a platter.

Um, whether we think that that's kind of good in equilibrium kind of depends on how much we trust shareholders to mediate for themselves in doing that kind of shareholder voting, you know, what sort of values they're gonna look for. But at least under existing corporate law, boards don't have a whole lot of discretion, at least in this context, to say, actually, you know what,

Elon would be a pretty terrible manager, and so we'd actually just kind of [00:09:00] trust, you know, the wisdom of a more distributed shareholder base, or maybe public shareholders who are willing to take a lower return on their investment, who have perhaps smaller cost of equity capital requirements, over switching from the one pool of investors to the other.

There might have been, you know, some folks who would've preferred the ancien régime Twitter to what we see now. But Delaware corporate law isn't set up to do that. So I will say, you know, you asked about how this kind of plays into corporate law; that's kind of the strategic question that I have wrestled with, um, you know, over time.

But then there's also just this really important illustration of the certainty that Delaware corporate law in particular brings to investors and to different counterparties, right? I mean, the way that the contract was written, it didn't seem like he had a whole lot of outs, at least under the facts as we kind of understood them.

[00:10:00] He was probably not going to prevail on, you know, kind of a material adverse event or change kind of escape hatch out of this merger agreement. And the general orientation of the Delaware courts toward corporate law stuff like this is to ensure, or at least promote, the certainty of private ordering in contractual arrangements.

And, you know, Delaware courts don't very well like to let, you know, the most sophisticated parties to contracts, the richest man in the world, with as many lawyers as they could possibly paper the deal with, get out of their contractual arrangements (mm-hmm) on, you know, kind of textual grounds.

So if you followed the dispute over whether, you know, is he or is he not gonna close on this acquisition, over time there was a lot of, from my world, kind of peanut gallery speculation about, like, is he gonna be able to get out of it? And, you know, I think probably the best commentator on that [00:11:00] trial is the Delaware Chancery Daily; they kind of are specialists in, you know, what's happening in the

business courts. And I think they said the whole time, like, look, he's not gonna get off the hook. Right. And that ended up being true. And ultimately, you know, he ended up kind of withdrawing from his plans to try and get out of the acquisition as soon as it turned out that maybe he was gonna have to sit for a deposition.

Right. And that's not to say that that's the actual cause, right? Um, there may well have been other kind of intervening events that, you know, in his mind made him decide to go forward. But it does show that Delaware corporate law actually has a very important mediating role in this area where we have essentially capital markets as aggregators of a certain kind of governance power, through, you know, money as kind of economic claims on being able to control how policy and stuff happens, right?

Delaware courts aren't afraid to go right up against, you know, the world's [00:12:00] richest man. They'll say, you're gonna sit for this deposition, whether you like it or not. I guess he doesn't like it, right? So...

[00:12:06] Gus Hurwitz: I guess one question that I'll ask both of you, from your respective perspectives. If you look at what's been going on with Twitter for the last week or so: the deal closes,

Elon comes in, starts firing people left and right, axes the entire content moderation team, the trust and safety team, depending on, uh, what you hear. Maybe not all of them; maybe there are one or two people still on staff. All the compliance teams, all of that. Starts losing advertisers left and right. Can he just do this?

I mean, he owns the company now. Can he run it into the ground if he wants? Assuming that that is what he wants; I expect that he's trying to do other stuff. But I think the interesting starting question, and I'll start with you, James: are there any debtors or any lenders or any remaining shareholders or anything out there that has recourse [00:13:00] if he pushes the company into insolvency, into bankruptcy, into no longer being a going concern? And then Kyle, from the perspective of the harms that Twitter can cause

to the users who rely on it, rightly or wrongly, from the free speech perspective, the Section 230 perspective: are there any consequences that he faces if he just turns off the content moderation aspects of the business?

[00:13:26] James Tierney: I'll try and speak first to the question of what he can do in terms of running the company into the ground.

And then I'll speak a bit more to, I think, the extension of the point about insolvency and ultimate governance control over rump Twitter. Right. So, you know, when you think about what he can do, could he possibly run it into the ground, I mean, you kind of have to think about, when push comes to shove, what can any of us do?

It's the things that someone else, with some kind of power or ability to kind of [00:14:00] coerce how we order our affairs and conduct ourselves, is able to force us to do. And that could be, you know, kind of economic inducements. It also could be through legal institutions; you know, when push comes to shove, can you get a

court, like the Delaware Court of Chancery, to issue an injunction or to say, actually, you know, we're ordering a transfer of shares or of money within the jurisdiction of the court. So, you know, if you think about it in that sense, what can he do? Can he destroy the company? At a certain point we're talking about insolvency, and so I'll turn to that in a second.

But in terms of, you know, how the negotiation is gonna work out between different claimants who are senior on the proceeds of liquidation, if it turns out not to be a going concern, or, you know, in a reorganization. Absent that possibility, what can he do and how can he be brought to bear?

He can lose his equity stake, right? That's kind of one of the worst things that can happen to him, certainly as the CEO, [00:15:00] and possibly as a controlling shareholder, uh, you know, depending on how the Delaware Chancery would come out on that important topic with respect to some of his other ventures in Delaware law recently.

But, you know, to the extent he sits in one of these seats and has fiduciary duties to the company, you may have duties not to run it into the ground, but when are those duties actionable? The background, you know, default orientation of the courts, the business judgment rule, is to not second-guess decisions that have,

you know, rational business purposes, and that's maybe papering over some of the doctrinal nuances, but that's good enough for our purposes. And so only in certain circumstances are you gonna be able to prevail and hold him liable for running Twitter into the ground. That could be any number of things; I won't turn this into, you know, a class on corporate fiduciary duty.

It could be any sorts of things, but it might not just be triggered by a series of bad ideas, just the fact that he's a bad manager, suppose. Right. You know, [00:16:00] if one were to have the opinion that he's a bad manager, that would not necessarily give rise to a fiduciary duty breach.

Right. There'd have to be something quite a bit more. So probably the stronger inducement on his part, rather than the threat of liability for running it into the ground, is: look, you know, you structured this acquisition in a way that possibly was from the outset going to be kind of insolvent, and a series of steps that you took in your brief tenure at the helm of Twitter has kind of made things worse.

And so I think you saw, uh, suggestions yesterday that he's gonna step down as CEO, have someone else come in. You know, ultimately the problem is that if you have a company with a fairly large group of creditors, with, you know, fairly high cash flow requirements for servicing the debt,

in order to make your equity stake worth it, you kind of have to strip down the company to its [00:17:00] barest parts, mm-hmm, to reduce your costs. Right. And this is, you know, this is like a classic response to a highly leveraged acquisition. And, you know, we see this story play out time and time again.

Um, I shouldn't laugh, right, because in many cases it can be a real destruction of the ongoing value of an enterprise. But, you know, there's the possibility that that process goes wrong, and the attempts to strip down and make it more of a nimble operation end up alienating users or doing something else.

And ultimately, if he gets that gamble wrong, and if we end up kind of in a bankruptcy situation where he gets a fairly low kind of negotiated equity stake as a result, and, you know, more of the existing creditors get to share in the rest of the proceeds of the reorganization, or, you know, however else it works,

I mean, I think ultimately he's gonna come out of this with, uh, a real hit to his equity here as well as to some of his other affiliated ventures. [00:18:00] From that perspective, you know, that might be the most important economic inducement, rather than the threat of liability. But what can he do, and, when push comes to shove, what's he gonna be forced to do, is a story yet to be written.

[00:18:12] Gus Hurwitz: So Kyle, what about on the user side and the speech side?

[00:18:16] Kyle Langvardt: Yeah. Well, so I mean, maybe to just kind of transition from what James was saying, and I think the way you put it, Gus, you asked, you know, if he just turns off content moderation. Now, I mean, he's not gonna completely turn off content moderation, obviously, but, uh, the reason that you can't do something like that...

[00:18:34] Gus Hurwitz: I like how confidently you say he's obviously not going to do that. That's a very bold and confident prediction.

[00:18:39] Kyle Langvardt: I mean, I guess nothing's all that obvious when you're speculating on the behavior of, you know, a megalomaniac, but, um-

[00:18:48] Gus Hurwitz: An eccentric billionaire, please.

[00:18:50] Kyle Langvardt: Eccentric billionaire. I don't know if megalomaniac is a clinical term.

I don't want to, you know, cross lines there. But, you know, the reason that it's [00:19:00] glib to talk about just going on a platform and having a policy in favor of free speech is that the platform is financed by ads, and advertisers wanna protect what's called brand safety. So the idea is you don't want your Procter & Gamble product being advertised next to a beheading video; that's bad for the brand.

Um, you don't want your Procter & Gamble logo being used by an imposter. I'll talk about this blue check mark thing in a little bit. Um, if a platform is not well groomed, which involves a lot of, you know, intensive content control on what third parties can post, then brands lose a lot of value and the company tanks.

This is a big part of the reason why running a social platform is not fun. It's boring, it's depressing, it's really expensive. Now, a specific problem that Musk has [00:20:00] already run into, and I think a lot of our listeners probably know about this, is this, uh, Twitter Blue program that he rolled out. So this poses some pretty specific brand safety issues.

Twitter had had a problem with impersonation in the past, and so its solution, to ensure that politicians wouldn't be impersonated and that brands wouldn't have their identity stolen, that kind of thing, was to issue these blue check marks to authentic speakers. So @realDonaldTrump, when he was on the platform, there was a blue check mark there.

So you knew you were getting it from Trump and not an impersonator. Well, you know, I think for people who felt aggrieved about the kind of pecking order on Twitter, and maybe, uh, Twitter's policies governing different types of content, what it was gonna promote, what it wasn't gonna promote,

there was a lot of heartburn around this. And I think that Musk himself referred to [00:21:00] the blue check mark program as like a lords and serfs kind of dynamic, where if Twitter bestowed this blue check mark on you, you were one of the lords and you could get your content amplified. Otherwise, you were just nobody.

Well, so, you know, Musk's idea was, we're gonna, I guess, democratize this, and we're gonna have an even, content-neutral approach to this policy. And what we'll do is, if you want to be one of the lords rather than the serfs, pay, uh, $20 and we'll give you a blue check mark. And then later on it got talked down to $8, or I guess $7.99.

Well, you know, that's not very well thought out, because once it costs eight bucks to get a blue check mark, now you can pay eight bucks and now you're Eli Lilly on Twitter saying that insulin is free. Or now you're Elon Musk saying things that I guess even Elon Musk wouldn't say. Um, or maybe you're impersonating, uh, the Lockheed Martin company, something like that.

Now, that undermines the [00:22:00] integrity of the platform experience in a lot of ways, because you can't tell who's who. But as far as the people being impersonated are concerned, they could suffer reputational damage at a minimum. The brands that were advertising on Twitter are gonna be hesitant to advertise there again because of the exposure they face, and potentially they may even want to sue on some sort of tort theory.

Now here, this is something that's really interesting, and Gus, you know, you're the one who teaches torts, so you'll have to put some kind of label on this theory. But I think I could see it: a brand saying, Twitter had let the public know that if you saw a blue check mark, you were hearing from the real thing.

You know, you were hearing from this brand, whatever it is. Twitter created that impression, and then it switched up the rules on the blue check mark so that the blue check mark just meant, uh, you'd paid eight bucks. The problem is now you've still got this user population on [00:23:00] Twitter who thinks, when they see a blue check mark next to Lockheed Martin or Eli Lilly, that they're hearing from Eli Lilly or Lockheed Martin.

Basically, this course of conduct by Twitter, both before and after the Elon acquisition, winds up misleading users in a kind of negligent way. You know, I'm thinking that's pretty much what the claim would look like. This is just kind of a negligence soup that I'm putting together.

[00:23:22] Gus Hurwitz: Well, the challenge of course is still going to be Section 230, and the issue, uh, there that immediately comes to mind is: are the blue check marks in some way Twitter putting the speech out on its own, or amplifying it on its own, so that it's no longer the users generating the speech (mm-hmm)

but Twitter generating the authoritativeness of the speech? Could that be a circumvention of the Section 230 liability shield? Um, yeah. And that's a tough argument to make.

[00:23:55] Kyle Langvardt: Well, I think there's some vulnerability. I don't think I would [00:24:00] feel, you know, if I were representing Twitter, I don't think I would feel

as confident in this dispute as, uh, platforms often feel when they have a Section 230 defense. And here's why: I mean, I think you already alluded to this, but just for the listeners, you know, Section 230 isn't just a general liability shield for everything platforms do. What it really is, is basically a rule that says that platforms are not going to be exposed to liability in connection with the way that they handle third-party content.

Now, that's not the statutory language, but I summarize it that way. The contemporary interpretation of Section 230 has drifted pretty far from the statutory language now. So here's what I would say. I mean, I think initially what Twitter would say is: we're protected under Section 230 because all these blue check imposters, those are just third parties posting third-party content to our platform.

[00:25:00] Uh, this is just kind of a classic Section 230 case; we're not responsible for hosting their content. And then I think the reply to that, like you said, would be to say: hey, this is not a case just about the third-party imposter. It's a case about your content, Twitter, and your content is that blue check mark

that's become misleading. Now here too, I think Twitter has a counter to that, and the counter is something that's called the neutral tools argument. So there have been cases where someone says, for example, the Facebook algorithm played matchmaker to terrorists who congregated online, and then they hatched plans to go and kill people.

In a case like that, you know, Facebook has said: look, we don't have a pro-terrorism algorithm. We have a stuff-you-might-like and people-you-might-like algorithm that matches people to stuff they might like and people they might like. It's a neutral tool, and we're not [00:26:00] responsible for, uh, the content that people feed into it.

The courts have been pretty receptive to that. Now, I think what Twitter would say here is: okay, yeah, the blue check mark is our content, but it's a neutral tool, and what's really at issue in this case is how third parties use our neutral tool. There's nothing wrong with us putting out a neutral tool that can be used for good or evil.

They'd say, well, you know, in this case the blue check mark costs eight bucks. That's the criterion. That's as neutral as you can possibly get. The difficulty with that argument, though, is that what's really at issue is the nature of the blue check mark prior to Elon Musk's arrival, and at that time the blue check mark, you know, before Elon got there,

was the opposite of a neutral tool. You know, Twitter was making case-by-case determinations about who deserved a blue check mark and who didn't. So I think the question here is: when Twitter produced its own [00:27:00] content before, did that lead to obligations going forward in how it handled that blue check mark?

I think there could be a route around, uh, Section 230 here, and blood's kind of in the water on Section 230 right now. Yeah.

[00:27:16] Gus Hurwitz: And it's, uh, important to note that the Supreme Court is getting ready to hear a pair of Section 230 cases (mm-hmm) that go to some of these issues. I'll throw out, uh, one of my own theories about why Musk bought Twitter.

He's been pretty outspoken against anonymous speech online, and I think that he views Twitter, or viewed Twitter, as potentially being a platform for authenticated speech online. And that's part of why he made this blue check mark move so early on. So I wasn't entirely surprised. Of course, this is my own pet, borderline conspiracy theory;

who knows what goes on in Musk's mind. So I want to turn [00:28:00] to the second topic I want to, uh, engage you both on, which is FTX and cryptocurrency. And the loss, the actual loss, of, uh, 36 or 38 billion, as opposed to the right now only conceptual potential loss of, uh, 44 billion through Twitter. Um, the two of you have recently co-authored an article about cryptocurrency,

corporate law, and related stuff. We should only talk for 15 minutes or so about this, but James, can you read us in on what is going on with FTX and Binance and Alameda and SPF?

[00:28:41] James Tierney: Sure. So, you know, the background here is, as you mentioned, Kyle and I did this project that looked a little bit at how we might regulate the kind of, you know, exuberant speculation, uh, that we see in lots of [00:29:00]

risky asset markets. And in particular, one of the themes that we've seen over the last couple years has been trading platforms, either kind of on the exchange side or on, uh, the broker-dealer side, just thinking about two different categories of market participants. You know, the way that these intermediaries relate to customers can be relevant to the sort of demand that we end up seeing from retail investors, which is to say ordinary people, like perhaps many of us, who have a little bit of money that they are able to use

to invest or save for some purpose. And, you know, what we've seen is an incentive on the part of these intermediaries to try and attract people through different means. You know, sometimes through user experience design, and sometimes through just offering them a highly volatile, uh, lottery-like asset that may be

really, uh, desirable to [00:30:00] folks with aspirations-for-riches kinds of preferences. You know, I'm really looking forward to kind of moving up in my, uh, wealth percentile. You know, if you have preferences for skewness, uh, you know, for kind of a large magnitude, low probability, positive outcome, then you might be really attracted to trading in these kinds of assets, and different participants kind of push in one way or another.

And all of that is to say, we see in these risky speculative markets a greater desire for trying to get some return on investment, and that ultimately seems to have been the downfall for FTX. There are lots of ways that people can kind of get into engaging with risky, speculative, you know, markets or practices, right?

Some people, you know, they turn 18, they go to a casino for the first time, they lose it all, and so they decide, you know, I'm never gonna do this kind of risk consumption in the future. [00:31:00] Other people really like it, right? And so they're gonna look out for those opportunities. And cryptocurrency markets are, I think, uh, you know, one of the emerging areas, or I should say, you know, very mature in some sense, areas of speculation here.

The issue is, you know, if you make a lot of good choices, uh, early on, or if you just kind of, uh, are born on third base and think you've hit a triple, right, um, then you might not have had some of the foundational experiences to think, like, actually, I can lose really big. And so I think, you know, part of the story with the implosion of FTX is you see lots of people potentially making the right bets, right?

They're able to, uh, attract outside investment. Maybe they're able to turn some of that investment capital into kind of higher winnings that then they can share. And if you imagine that you are someone who's like a super sophisticated market participant, [00:32:00] uh, advised by all the best bankers and lawyers, you could imagine how you might be able to try and inject yourself into an intermediation position in this new kind of speculative market.

And in fact, you might be able to build up a whole asset class and market infrastructure around this, and that seems to be what has happened, um, you know, in the crypto exchange space over the last few years. There's just been a ton of demand and a ton of capital moving into this area. But then the flip side to it is, if you haven't had big failures before, or if you kind of like to see the profits roll in and don't like to see the profits stop, then you might be really concerned about robust regulatory responses.

And as we've seen, you know, the crypto, um, community has really resisted, uh, the sort of robust either, um, financial sector safety-and-soundness kind of regulation approach that we've seen, or alternatively, you know, kind of more of a securities-based approach. The main reason, you know, my cynical take for that, the main [00:33:00] reason, is that, you know, if cryptocurrencies or crypto assets are securities, then, you know, you can't lie about them, right?

There are, like, special antifraud rules that come into play. You know, the registration stuff, the public offerings, I think that's all a sideshow. But, you know, the key problem is, if you don't have that backbone of regulatory apparatus to kind of help make sure that there are other people who know how things can go wrong, you know, if you don't provide that stuff voluntarily through private ordering, and if you're outside the kind of regulatory scope, because, you know, a crypto exchange like FTX is gonna be, and was, regulated, at least as to the relevant US entity, um, as a money transmission, uh, you know, company, with just kind of a very

light-touch, uh, sort of regulatory background, that's going to lead to maybe some laxity in your approach to how you're running your business. And so, earlier today we got, in the bankruptcy filing in FTX, you know, the first-day declaration that talks about, you know, once the adults came into the room, what [00:34:00] does this actually look like?

And it sounds, you know, uh, as I was remarking to a colleague earlier today, it sounds like there were a bunch of people who had the same sort of thought that I did when I was 10 years old: I'm gonna be the world's richest man so I can give money away and help poor people. And then they didn't think about how they were actually gonna implement that in practice.

Right? And so, you know, you have all these investors come in, and then it turns out that, you know, you're sharing all of your passwords to all of your bank accounts on, like, a shared, you know, Google document, or, you know, whatever; none of the sorts of internal controls that actual, you know, sophisticated, experienced investment professionals or investors would expect.

And so I think, you know, FTX is ultimately, uh, a story of the confluence of this mass speculation and the thought that if only we can kind of create more wealth out of this new speculative asset class, then, you know, we'll be able to kind of achieve whatever other ends we're trying to do, combined with kind of a deep skepticism, [00:35:00] if not maybe a naivete, about why we have all of these background rules in the first place.

You know, fraudsters love it when the money is going up. The problem is when the money runs out.

[00:35:11] Kyle Langvardt: So, one question, a really basic question I have about all this, is why SBF, you know, Sam Bankman-Fried, had this kind of, like...

[00:35:21] Gus Hurwitz: Wait, wait. That's a nice correction. Yeah, SPF, as I've been referring to him, and as he's known. He's a person; he is the founder and the creator and the effective altruist behind the creation of FTX, this, uh, crypto company.

[00:35:36] James Tierney: Yeah, and I mean, 

[00:35:36] Kyle Langvardt: My sense, at least, of the way people would talk about Sam Bankman-Fried until very recently was: okay, this crypto sector may be really shady, and there may be all kinds of fraud going on and unreasonable speculative risk. It's a terrible investment in all kinds of ways. But this guy, Sam Bankman-Fried, [00:36:00] this is like the wise man.

Uh, this is the most reasonable person in the room, and he's the person who has, uh, the vision for what a responsible crypto sector might look like. Now, obviously that's not the case, but do you have any sense of how he was able to cultivate this halo?

[00:36:26] Gus Hurwitz: Yeah. So I am going to say something that could get me in a lot of trouble.

Okay. Um, and with apologies to those in the law professor community, we're all law professors, and who knows who's going to listen to this, but I'll actually ask this question in a peculiar way that's actually particularly relevant, uh, today and tomorrow. What happened to FTX really isn't about crypto.

It's not about cryptocurrencies. Mm-hmm. It's about, uh, leveraged lending, and not keeping assets on the books to cover loans, and improprieties with how a company was using its assets and [00:37:00] managing its assets and lending, SPF lending money to himself and his other company. And one question in the background here is: is he a criminal, or is he just stupid?

And I think that's a very fair question. I don't know. I think it's entirely possible that he just lacked the sophistication to realize that what he was doing was problematic, or why it was problematic. Now, he might still be technically a criminal for that alone, but it's a different sort of problem.

But the question I wonder about: is he just Elizabeth Holmes? And I say this in a particularly interesting way. Uh, she was a Stanford student. She got her start, her support, with folks in the Stanford Bay Area community, high-profile individuals backing her, both reputationally and financially, to build up what became one of [00:38:00] the greatest corporate frauds in American history. And

SPF, his parents are Stanford professors. He comes from that same community: reputation, lots of money, lots of speculation, lots of innovation. I wonder if what got him going was both a legitimate drive to do great things, and a push from where he was socialized to do great things and think big things, and a community that wanted to support that sort of attitude and ambition, and a willingness within that community to take big risks and to support folks without necessarily doing the due diligence, because that's how you get on the next big thing train.

That's what I wonder, and I will throw that out to both of you to see if either of you wants to, uh, touch that lightning rod.

[00:38:55] James Tierney: Well, uh, certainly, I don't think the Stanford campus is all [00:39:00] that pretty, so it won't be too bad if I get banned there for life, right? Yeah. I think that does connect both of your comments.

The idea of how could we possibly get enough people to support this project. Well, part of it, as in lots of cases, is just to use your existing social capital, right? And so I am not going to go successfully raise, you know, uh, a billion dollars, or, you know, whatever it is, uh, in private placements for my, uh, unregistered crypto, you know, uh, hedge fund, because no one knows me and cares about me or what I'm gonna do.

The more that I'm plugged into those circles, the more I'm gonna get those introductions. And, you know, if there's a natural affinity on other dimensions, you like risk-takers, you think big thoughts, you're disruptive, then, you know, maybe the people who otherwise would get red flags, uh, you know, red-flag vibes, uh, from hearing this stuff might actually find it [00:40:00] attractive.

Right. So, you know, I mean, I don't wanna speculate exactly about how he was able to, you know, get so much support. I will say, uh, Gus, you know, you said it's not about crypto. I mean, I think that's true in a sense, but in another sense, it sounds like one of the, uh, ultimate problems here might have related to crypto, in that the way the contagion spread across these different affiliated entities might have related to, you know, the taking of its own

tokens, the, like, FTT tokens, which were kind of claims on its capital structure (mm-hmm), as, uh, as collateral. Right. And so, uh, that is something that is incredibly risky to your balance sheet. And so if that's the case, then, you know, from a risk management perspective, the fact that they were able to kind of mint, uh, you know, what

has the economic characteristics of a security, and then kind of take that as collateral, if that's actually kind of how, uh, the emerging thought is, you know, maybe, uh, the conflagration might have, uh, you know, started, then maybe that does mean that there's like a crypto element [00:41:00] to it. But I take your point that more broadly it's, you know, not so much about crypto as much as the crypto bro

who doesn't care all that much about regulation or doing things by the book, and says, like, we're actually just gonna make a lot of money, and it's a lot easier to, um, you know, ask for forgiveness than permission. And to the extent that that kind of worldview enables this sort of thing, then I'm not sure that it's a, you know, particularly useful worldview.

[00:41:26] Kyle Langvardt: Oh, I mean, you know, the other thing that I kind of wonder about is how big a role just kind of obscurantism plays in this. So crypto instruments are extremely hard to understand. I don't know, I mean, maybe I'm not very smart, but it seems to be very, very hard to follow all the various permutations of all these different types of assets and exchange products and, you know, yield farming opportunities.

And, you know, I wonder whether there are not that many [00:42:00] people out there who would feel confident enough to tell a guy like this that they understood that what he was doing was wrong. And, you know, to me, I kind of suspect that the crypto industry cultivates this. There's this kind of, you know, hyper-technical, uh, mathematical aspect to it.

There are these, like, kind of infinite variations on the types of instruments that are involved. You know, I'm sure that crypto investors get just completely overwhelmed and dazzled by the details, and that can't be so great for their ability to gauge what they're looking at.

[00:42:38] Gus Hurwitz: Well, I don't know. I'm actually gonna push back a little bit on that, Kyle, because Andreessen Horowitz and Sequoia, they're sophisticated, they understand this stuff, and

I'm, I'm going. I'm actually gonna push back a little bit on that, Kyle, because, and recent Horowitz and Sequoia, they're sophisticated, they understand this stuff, and. All discussions that we're seeing now. The moment that, uh, sophisticated folks who understand this stuff looked at Ft X's books, they said, [00:43:00] wait a second.

Th this is not kosher. So it, it doesn't sound like you needed to have a, a detailed understanding of, of cryptocurrency to do basic due diligence on where are your assets. It, it's taken literal. A couple of days, um, for folks to recognize, uh, that there, there's something foel, something Rodin going on here.

But I, I actually want to pivot back to and start wrapping up the discussion because Ja, James, you, you characterized, uh, uh, spf. I'm, I'm going to, uh, mangle exactly what you said, but. Something along the lines as the, the crypto bro who doesn't care about, uh, regulation or playing by the rules because he's there to do great stuff.

And, uh, you could have been talking about Elon Musk and Dogecoin with that. So, uh, I'll just throw this to each of you again as a closing question: are what we are seeing with Twitter and FTX similar? [00:44:00] Is there something that ties them together? Uh, or are they fundamentally different phenomena?

[00:44:04] James Tierney: Well, just to pick up on the idea of, you know, the crypto bro as archetype, you know, it's maybe unfair to limit this to just the bro, or just the crypto bro, right? What, at least, I discern to be the connection between the two cases is a kind of overconfidence that comes from having been successful in a domain of your life, right?

And thinking that because you are very successful at making a particular kind of money, you'll be successful at transferring that skill, uh, or that knowledge, to a new domain. And that seems to be something that, uh, you know, certainly fits the Musk case. You know, uh, he seems to be very good at doing, um, financial machinations to get himself rich, and, you know, that is what it is.

But to the extent that you then think you're gonna be [00:45:00] great at everything, that can transfer falsely into thinking that you're gonna be the greatest, you know, setter of bespoke, uh, content moderation policy. And if that's not true, then, you know, it might give us pause before thinking of the kind of example of the entrepreneur as someone who is always going to be right in their business decision making.

It might make us a little bit more skeptical of that.

[00:45:29] Kyle Langvardt: I think both of these illustrate, in different ways, the hazards of investing too much social power in some kind of charismatic leader. And, you know, the thing that I find, I guess, the most distasteful about the Elon Musk fiasco at Twitter is that we have to psychoanalyze this person in order to understand how the public sphere operates.

To me, that's a bad [00:46:00] way to run a democracy. And, you know, I think both of these are areas where the public needs to find ways, even if it's difficult, to protect what's important without just entrusting these huge questions to privatization.

[00:46:19] Gus Hurwitz: Kyle, James, uh, thank you both. This has been, uh, a fascinating discussion.

Frankly, I didn't have any idea where it was going to go, and I'll actually add a special thanks. Uh, you both know that I'm under the weather right now, uh, and we had to schedule this, uh, discussion somewhat at the last minute. So I appreciate you both jumping in and getting on the call with me to talk about these things, and I look forward to continuing to talk about them, as I am sure both of these issues are not going anywhere, and future similar issues, uh, we will continue to see.

[00:46:52] Kyle Langvardt: Yeah. Thanks for having us. Thanks very much.

[00:46:58] James Fleege: Tech Refactored is [00:47:00] part of the Menard Governance and Technology Programming series hosted by the Nebraska Governance and Technology Center. The NGTC is a partnership led by the College of Law in collaboration with the colleges of engineering, business and journalism and mass communications at the University of Nebraska Lincoln. 

Tech Refactored is hosted and executive produced by Gus Hurwitz. James Fleege is our producer. Additional production assistance is provided by the NGTC staff. You can find supplemental information for this episode at the links provided in the show notes to stay up to date on the latest happenings within the Nebraska Governance and Technology Center, visit our website at ngtc.unl.edu. You can also follow us on Twitter and Instagram at UNL_NGTC.