Tech Refactored

Open Source Software

December 02, 2022 Nebraska Governance and Technology Center Season 3 Episode 14

Open source is everywhere – over 97% of software uses it in some capacity. It's free to use, but who creates it and why? Who maintains it? What are the security concerns? Kyle Langvardt fills in as host to discuss open source software with Chinmayi Sharma, Scholar in Residence at the Strauss Center, and Lecturer at the University of Texas. Together they examine many of the topics addressed in Sharma's paper, "Tragedy of the Digital Commons".

Follow Chinmayi Sharma on Twitter @ChinmayiSharma
Follow NGTC on Twitter @UNL_NGTC

"Tragedy of the Digital Commons" by Chinmayi Sharma
Nebraska Governance and Technology Center


Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Gus Hurwitz: Welcome to Tech Refactored, a podcast in which we explore the ever-changing relationship between technology, society, and the law. I'm your host, Gus Hurwitz, the Menard Director of the Nebraska Governance and Technology Center.

[00:00:25] Kyle Langvardt: Hi, I'm Kyle Langvardt. I'm filling in for Gus Hurwitz this week. I'm a professor here at the College of Law, and I'm also a senior fellow at the Nebraska Governance and Technology Center. And I'm here today with Chinmayi Sharma.

[00:00:39] Chinmayi Sharma: My name is Chinmayi Sharma. I come from the Strauss Center at the University of Texas, where I'm a scholar in residence and a lecturer at the law school.

And my research focuses primarily on cybersecurity and internet policies. 

[00:00:51] Kyle Langvardt: She's gonna be talking about her recent paper, Tragedy of the Digital Commons. This is a paper about open source software, which most of you probably have some [00:01:00] familiarity with, if you think about something like a browser, like Firefox or something of that kind.

But as Chinmayi's gonna discuss with us, open source software is much more ubiquitous and pervasive than I think most of us appreciate. And on top of that, she's gonna talk about a number of security issues that arise from open source software and some of its unique vulnerabilities. So with that, let's talk to Chinmayi Sharma.

You have this paper, Tragedy of the Digital Commons, where you are arguing that open source provides a lot of social value, but that it also exposes us to a lot of social risk, and you provide some prescriptions for how to deal with this. But I wanna start, because I have no experience in software development

and I'm not a cybersecurity expert, so I just wanna start with some very basic questions as a starting point. I mean, what is open source [00:02:00] software? Could you explain that to the audience?

[00:02:01] Chinmayi Sharma: Yeah, absolutely. It is a topic very near and dear to my heart, because I used to be a programmer before I became a lawyer.

And open source is essentially like sanctioned plagiarism, where people write code and they make the intentional decision to put it out there into the world. Anyone can look at it, anyone can use it, anyone can modify it, and in many cases, anyone can profit off of it. So a lot of what program development is, is having a problem, and instead of being like, I should solve this myself, what you immediately do is you go to the internet, Stack Overflow, GitHub, wherever you get your source of development knowledge, and you see whether or not somebody else has already solved it.

And you click a button and you import that, and you use that in your project, and it saves you time and money, and also might give you access to things that you wouldn't have been able to build yourself, or wouldn't have built as well yourself.

[00:02:59] Kyle Langvardt: [00:03:00] And could you give us an example of what types of problems we're talking about that open source might solve, or that a piece of open source software or code might solve?

[00:03:08] Chinmayi Sharma: Yeah, it spans so much. Open source can be something as simple as, my first development project was to build a simulation for a magic eight ball. And so that was useless. And so I took that and I put it online, and I'm sure no one used it, because it's silly. But you can also have very useful programs that do kind of the more tedious parts of development.

So for example, error logging. Or you can just have, for example, the Android platform is open source. Okay? And so what that means is it's built a toolkit to build applications for phones, and you can use that toolkit and modify that toolkit to build applications as you like. And the difference between that and closed source is you have [00:04:00] access to the source code there. But even beyond that, open source, for example, the Linux kernel is an operating system.

Mm-hmm. And it is one of the most widely used open source projects internationally. It's massive. It's used on servers, it's used by the government, it's used by the private sector, and so that's like a full stack operating system. Artificial intelligence, there's also a lot of open source artificial intelligence projects.

Um, and when people are beginning to dabble in that, they often use those projects as baselines off of which to build. Okay. Yeah. 
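An aside for readers of the transcript: the "tedious parts of development" Sharma mentions, like error logging, are exactly the plumbing open source components typically provide so you don't write it yourself. A minimal sketch of the idea, using Python's (itself open source) standard library logging module; the function and logger names here are invented for illustration:

```python
import logging

# Configure logging once; the library handles formatting, levels,
# and output routing so application code doesn't have to.
logging.basicConfig(level=logging.INFO, format="%(levelname)s:%(name)s: %(message)s")
logger = logging.getLogger("demo-app")

def safe_divide(a, b):
    """Return a / b, logging the error instead of crashing on bad input."""
    try:
        return a / b
    except ZeroDivisionError:
        logger.error("division by zero: a=%r, b=%r", a, b)
        return None

print(safe_divide(10, 2))  # 5.0
print(safe_divide(1, 0))   # None (an ERROR line is also logged)
```

The point is not the arithmetic; it's that the logging machinery, like most such utility code, is something developers pull off the shelf rather than rebuild.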

[00:04:31] Kyle Langvardt: So I guess maybe one thing to take away from this is open source is really pervasive then. Because, you know, when I was in law school with Gus Hurwitz, who's the director here, I used to use Linux, partially because it was free, but mainly I just kinda wanted to look cool, like I felt like it made me look smart or something.

[00:04:50] Chinmayi Sharma: And it's a hipster operating system?

[00:04:51] Kyle Langvardt: Yeah, I guess so. Yeah. I mean, I was trying to impress Gus, you know, whatever that is. But you're saying normal [00:05:00] people are using open source software all the time. Who creates this stuff if it's free?

[00:05:07] Chinmayi Sharma: So the arc of open source is a fun and interesting history, where at the beginning a lot of the people who were building and open sourcing things were from the academic community, in large part.

Mm-hmm. Because that is kind of the general ethos of academia, that you want to put the knowledge you've discovered out there and allow others to build off of it, check it, et cetera. And so it was a natural fit as an inception point. But then after that, it was driven largely by volunteers. So for example, Linus Torvalds is considered like the father of open source.

He's extremely famous. He runs the Linux project top down; like, nothing happens in terms of how it is built that doesn't go by him first. And he started off as a volunteer. There were companies that donated to an independent foundation to help support him, but whatever he was building, he said, I'm building it for free.

And so he built [00:06:00] Linux, he open sourced it, and he invited as many other people as wanted to to contribute to it. And so a lot of these were developers who you imagine typing into the command prompt in their basements, kind of an anarchist vibe to them, of like self-regulation, like John Perry Barlow, like the government has no role to play in the internet.

Mm-hmm. And at the time, companies were very averse to open source. They saw it probably as a competitive threat. I think Microsoft might have called it the scourge of the internet. Yeah, but very quickly the quality, sophistication, and quantity of open source that was being produced just couldn't be ignored.

And so there was a big shift, um, in the early two thousands, and increasing every year, where companies and governments started to play more in the open source space, not just as consumers, but as contributors. And so now today, the mix is you have your hobbyists that come back from a full day's work and start building [00:07:00] on their, like, Debian project for fun.

Mm-hmm. And for fulfillment. But you also have paid Google developers. Um, Google actually has several full-time developers whose only job is to contribute to open source. So these are paid corporate developers, and then you also have government entities. So the NSA contributes regularly to the Linux kernel, and other agencies do as well. DOD actually has a complete policy around the use of and contribution to open source projects.

[00:07:28] Kyle Langvardt: I can see why the academics would do this. Why the anarchists? I can see why the government would do this. Why are private companies spending money on this? Is this just some public service thing, or is there some element of self-interest here as well?

[00:07:43] Chinmayi Sharma: Yeah, it's fun when you start to research open source. The vast majority of papers that have been written have been trying to solve the riddle of, like, who is doing this and why. Yeah. And those are still great questions. But for companies specifically, I'm sure that there is some level of benevolence [00:08:00] there.

For example, Google and Microsoft are taking huge stances to promote open source and doing more than their share. Mm-hmm. To support it. That aside, it makes sense for a company to contribute to open source because it also benefits their use of it. Mm-hmm. So for example, if I'm Intel and I have a project that's really important to me, but this new regulation came down, which means I can only build my products in this particular way, and that project is not quite compatible with it.

Then I don't have to start entirely from scratch. I can fork that project, I can make the changes that I need to make to it, and I can recommend it back to the original maintainer. Which would be ideal if they're like, oh, awesome, wholesale, take it, adopt it, because that means that you've built what you needed, but now the maintenance responsibility just goes back to the team, right?

That was originally taking care of the project. Or what you can do is make your own project because you're like, oh, this team is not gonna take my suggestion. And then you maintain it yourself. But [00:09:00] even that's better than closed source code because you're taking that project, you're putting it out there.

There are other companies that are also gonna be subject to the same regulation, that likely need the same kind of modification to this project, and those are entities that can contribute to your project. So you kind of built a new community. And so for companies, using open source is a no-brainer, because it's free, right?

But contributing is self-beneficial, because no open source project is gonna be perfect for them out of the box. And so they can make improvements to it that benefit themselves. And because companies are companies, mm-hmm, it probably benefits the market. These are not the secret sauces of companies.

[00:09:36] Kyle Langvardt: Yeah. And I guess if you are Google, or I guess Alphabet, or, you know, a big company like that, maybe there's a kind of BlackRock sort of thing going on, where you're investing in the entire market because you're kind of siphoning off a little bit of value from the entire market all the time.

[00:09:51] Chinmayi Sharma: Yeah.

There are definitely people who have very cynical views on why corporations contribute to open source, and there's a lot of concern around [00:10:00] corporate capture. So for example, Microsoft is a massive contributor to the Linux Foundation, and people are concerned that the Linux Foundation kind of sits in Microsoft's lap and just does what Microsoft wants 'em to do, so that it's no longer an open source project for the public good, mm-hmm, but an open source project for Microsoft. Right, right. Um, but to the point of kind of having your hands in a lot of different markets: Google's actually profiting off of the open source community, in that they're providing a new service, Google's Assured Open Source Software.

Mm. Where they take open source projects and they say, we are going to vet them for how secure they are. If they're not secure, we are going to clone and maintain internally our own copy of this and make sure that that copy is secure, and then we will sell you access to that secured copy. And so basically they're not selling software, they're selling assurances on software, open source software.
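A concrete aside on what "selling assurances on software" can mean at the lowest level: a consumer of a vetted artifact typically checks that the bytes they received match a digest published by whoever did the vetting. A minimal sketch in Python; the file name, contents, and workflow here are made up for illustration, and real assurance services layer signing and provenance metadata on top of this:

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_digest):
    """Return True if the file's digest matches the published one."""
    return sha256_of(path) == expected_digest

# Hypothetical usage: in practice the expected digest would come from
# the assurer's signed metadata, not be computed locally like this.
with open("artifact.bin", "wb") as f:
    f.write(b"example open source artifact")

digest = sha256_of("artifact.bin")
print(verify("artifact.bin", digest))  # True
```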

[00:10:57] Kyle Langvardt: Interesting. Okay. So it seems [00:11:00] pretty clear where the value is, I mean, both to individual developers, but just to the whole market. Let's talk some about the risks that you identify in the paper. So just at the outset, I could imagine somebody thinking, okay, if this source code is out there, everybody can kick the tires.

It's kinda like we have a lot of eyes on the street or whatever, and bugs or exploits are gonna be less likely to occur, because so many people can see what's going on. Now, I think in your paper you point out pretty persuasively that that's really far from the truth. But could you tell me why that's wrong?

[00:11:41] Chinmayi Sharma: Yeah, no, I mean, what you described is kind of something every open source developer has tattooed somewhere on their body: many eyes make all bugs shallow. Which is the same idea of, if you have everyone in the open source community able to look at your code, surely we will find anything that's wrong with it faster. That [00:12:00] holds true in theory, and the reason it does not hold true in practice is because not every project has

enough eyes on it. So Linux is robust. I keep coming back to it because it really is like the bastion of open source. Mm-hmm. It's built such an incredible product. It has great infrastructure around it. There's a foundation supporting it, thousands of developers contributing to it. I mean, it's basically functioning like a company, even though it's still providing an open source project.

They have eyes on it. Yeah. You know, people are constantly reviewing because they're using that project. Other projects that might be equally important are smaller and don't have the same eyes on them. And I think the perfect example of that is the Heartbleed incident, which happened in 2014. So, to not get into too much boring tech stuff.

OpenSSL is a cryptographic library, open source, that was essentially the best of its kind at the time. And what it essentially does is it takes your, like, HTTP [00:13:00] website to HTTPS, makes sure that the data that you're sending via your browser is secure. Okay? That project had a huge vulnerability

that was identified to affect about two thirds of websites worldwide. Wow. And the project was being developed by two volunteers named Steve. So there were two guys who were not paid that were maintaining this incredibly important and popular project with very little funding. And in the aftermath of Heartbleed, people said this is a resource issue.

They should have had at least six to seven people. Mm-hmm. On staff. They should have had at least $50,000. Instead, at the time, they had a budget of $2,000, and now I think they have a budget of $10,000. Wow. And so despite the, like, incredible impact that Heartbleed had, we have not been able to allocate resources optimally to the projects that need them.[00:14:00]

If it had had all of the same resources that Linux had, would Heartbleed have happened? Likely not.

[00:14:06] Kyle Langvardt: Earlier, you said that Microsoft used to call open source the scourge of the internet. I mean, is it the scourge of the internet, if these kinds of problems are common? You know, no one's really getting compensated to maintain the software.

Predictably enough, you don't have enough people on the project. Somebody could die, and now nobody's maintaining it. It seems like a recipe for disaster. I mean, does this mean that we should move toward obscurity in software development, secrecy in software development, kind of proprietary software?

[00:14:37] Chinmayi Sharma: Uh, I'm really glad you asked that question, because I think that that is a big tension in the community, and it's hard, when you write a paper about open source security, to not seem like you're suggesting that it is somehow less secure than closed source code.

But I wanna go on the record: I am in no way saying that. I think that open source code actually, in many instances, is more secure than closed source code. I think the whole impetus for me writing [00:15:00] this paper is that its problems are unique. But proprietary code has massive vulnerabilities too. I mean, we saw SolarWinds, and that was not an open source attack; Colonial Pipeline. There are many companies out there that are failing to implement reasonable security measures. And the reason I think that that's even worse than open source is that if they're not doing it, no one else can do it for them. Whereas if I'm a maintainer and I put my project out on the internet and I'm not maintaining that project, you can come in and say, "Hey, I found a bug. All you need to do is press a button and incorporate my solution and it'll all be fixed," and I can choose to do that. Or, like you said, they call it the bus factor: how many people need to be hit by a bus for a project to die. If I get hit by a bus, you can fork that project and just take it over.

And so I think the resiliency of a project is so much stronger when it is [00:16:00] open source. I think the problem is not that the code is insecure; it's that the institutions that exist around it are just not as strong as those that exist around proprietary software. So for example, the transparency into who is buying what and where.

Mm-hmm. The threat of liability when something with your name on it is defective. The same influences and incentives don't apply to open source.

[00:16:24] Kyle Langvardt: So, I mean, to back way up here, your paper is called Tragedy of the Digital Commons. You know, I imagine a lot of listeners here are familiar with the concept of the tragedy of the commons, but

the idea is that if everybody tries to graze in this common field at once, that nobody owns, then there's not gonna be any more grass left for the cattle. The whole thing's gonna be used up. So I suppose maybe there are some people out there who might take the tragedy of the commons as some kind of proof that commons are bad.

Mm-hmm. But generally that's not most people's reaction, right? It's more just that we have to deal with commons through some sort of [00:17:00] specialized technique, other than just giving people lots and breaking it up. So I want to talk about the kinds of techniques that you would have for dealing with this particular commons problem while still preserving this value.

One key recommendation that you have here is you say that open source should be identified as critical infrastructure. And so here I want to just start by asking what that actually means. I mean, we hear this phrase critical infrastructure all the time. I kind of get it, but what happens if a sector is designated as critical, and who does that?

What happens legally? 

[00:17:36] Chinmayi Sharma: Yeah, great question. As I was researching, this was one of the more surprising findings that I had, which is that not a lot happens when you're designated as critical infrastructure. More, the signal is that more might happen in the future. But up till now, very little has; much of critical infrastructure regulation is done through voluntary public-private [00:18:00] partnerships and encouragement.

To be a critical infrastructure sector, you have to be designated as one, and CISA, the Cybersecurity and Infrastructure Security Agency, is right now kind of the body that we go to for this. And so they are right now kind of in charge of managing critical infrastructure as a regime. Okay? And so they have identified 16 existing sectors. On top of those 16 existing sectors,

they have critical national functions, so like, types of service delivery that are considered especially critical. And then you also have Section 9 entities, that are entities that are within those sectors, providing those critical functions, that are especially important. So you have a lot of different ways where you could kind of be critical infrastructure, but the overarching umbrella is you're within one of these 16 sectors.

And it's not unprecedented that new sectors or sub-sectors get added to this list. So recently, after 2016, election [00:19:00] infrastructure was added as a critical infrastructure sub-sector. Mm-hmm. And that's kind of the case that I point to for the potential value of elevating the status of open source as critical infrastructure.

So there are individuals who believe that everything should be privatized, and I'll never get them on my side. Mm-hmm. And there are individuals that think critical infrastructure regulation is stupid and it should be left to the private sector, and I'll never get them on my side. But there are some that believe that there's a role for government to play in securing critical infrastructure.

And to them, I say: open source is critical infrastructure. Of those 16 sectors I mentioned, every single one of them uses open source.

[00:19:40] Kyle Langvardt: And I wrote down, and I got this from you, but I just wrote down a few of these as examples. So I guess the chemical industry, communications industry, dams, energy.

I mean, you can tell I was going in alphabetical order when I wrote these down, and I ran out at some point. Yeah. Transportation. Um, so these are sectors that have been identified so [00:20:00] far. You would put open source alongside those. So what is it that actually kicks in once CISA identifies a sector as critical?

Does it mean there are some studies by the government? Is the government giving out some money? Or what happens?

[00:20:18] Chinmayi Sharma: happens? So since I think a lot of people are opposed to bloating, government beyond where it needs to be, and I think that this is fair. In this case, I would say it should be a sub-sector underneath the information technology.

Okay. Existing sector. Cause I think it's a natural fit there, but it is not. Right. considered a part of that sector. And to put a fine point on that information technology, commercial software includes open source. So when that is falls in the critical infrastructure sector, technically the open source components they're using does as well.

Mm-hmm. But critical infrastructure is not as much about the asset itself as it is the entity, and so the people who are being brought to the table are the end [00:21:00] users, the consumers of open source, and not the open source community at its source. And so what kicks in when you are theoretically elevated in status is you have access to grants.

You have, um, the potential to be included in an ISAC, an information sharing and analysis center, for information technology. Yes. I mean, first and foremost, it means that you're important. Yeah. It means that when people are thinking about things, they're thinking about you. Right now, all of the cybersecurity regulation we've been seeing coming down

focuses either on federal government systems or critical infrastructure. And so you get brought into the loop of things the government cares about. Oh, okay. That's interesting. Yeah, and so that's big. You have access to grants that DHS has. DHS regularly gets appropriated grants to dedicate to bolstering the cybersecurity of critical infrastructure.

But I think most [00:22:00] important, you get access to kind of the coordination power and information sharing powers of the government. And so I think when we talk about a tragedy of the commons problem, it is a coordination failure. It means that entities are not coordinating with each other to act in the collective best interest.

So they act in silos, in their own perceived self-interest, which leads to an adverse outcome for everyone.


[00:22:23] Kyle Langvardt: Yeah. And could you elaborate on that a little more? This is kind of a prisoner's dilemma sort of problem.

[00:22:28] Chinmayi Sharma: Yeah. In this situation, I guess, to keep it tied to open source: it's possible that we can convince everyone that, rationally speaking, it is in their best interest for all of them to shoulder a portion of the cost of securing open source.

But for that, you would require everyone being willing to do that and trusting that others will do the same, because if 90% of us contribute our portion but the remaining 10% don't, we don't hit the goal. When there's a coordination failure, in that you can't get all of the relevant people to the table, you [00:23:00] can't decide how to distribute it fairly, and you can't hold people accountable for it, then you have people that are like, well, I don't think anyone else is doing this.

Mm-hmm. So I'm not gonna do it. I'm gonna act in my own self-interest, which is to continue to use open source without contributing back, because I feel like I'd be throwing money or resources into a void. And what this results in is a depletion of secure open source.

[00:23:22] Kyle Langvardt: Okay, so it's not in any one entity's power to create a better environment to operate in.

So from the individual perspective, it's like, okay, I'll just accept that I'm in a bad environment; what's the best course of action in this bad environment? Right? So you said that getting access to these government resources helps resolve the coordination problem. So then how does that play out? What does that look like?

[00:23:47] Chinmayi Sharma: So the single largest recommendation I have, and I'm not equivocal about this, I genuinely think this absolutely needs to happen, is there needs to be a thorough [00:24:00] census done of open source projects in critical infrastructure. And what this means is cooperation from every single entity

that is at all involved in the building and use of software in critical infrastructure sectors. And that is a massive coordination problem. We're talking about inventorying which projects are where. Mm-hmm. And so that's not just as simple as providing someone with a tool and saying, put this USB hard drive into your computer and we'll just see what's on there.

Some of these systems are so old, and some critical infrastructure entities have maybe one, if any, IT specialists on staff, so they're not going to be able to do this on their own. So there needs to be some sort of, quote unquote, cost distribution of who's gonna take on the burden. Mm-hmm. To go around and see which open source is where.

And for that, I think you do need a central governing body. And I think it has to be the government, because there are profit motivations for private sector companies to do this, and I think that they, [00:25:00] whether intentionally or unintentionally, can be blinded by what is in the best interest of that company, or maybe even the best interest of that industry.

But the projects that Google is using are not necessarily the same projects that a wastewater management company is using. Like, if we're talking about old critical infrastructure companies, these are entities that have probably bought kind of the cheapest software on the market, had a contract 20 years ago, and have just regularly renewed it year over year because they didn't wanna shift over to something else.

Mm-hmm. Because that would be really costly and expensive. I mean, we've heard about the federal government revamping its IT for many years now. Yeah. And so coordination would be the government, I think, mandating, funding, and carrying out this initiative, with concrete deadlines and reporting requirements.

But the second thing is, open source wants to secure itself, but it has a hard time doing that without information on who is using its software and what they're using it for. Mm-hmm. And the problem is they don't have a direct line of access to all their consumers, because anyone can go and pull an open source

Component, you're [00:26:00] not, you don't have to ask permission. You don't have to sign any paperwork. And so it's a very opaque ecosystem. So with coordination and for example, CISA has like industry specific groups that it brings together for threat landscape, information sharing. And so if you brought, for example, representative from Linux, a representative from Apache, representative from Debbie and what have you, you would have open source people at the table under.

okay, which threats which industries are facing, and which of our projects are implicated by them. And then in terms of patching, one of the big problems is there might be a fix available for a vulnerability, but people are not implementing it. And sometimes it's because they're irresponsible, but other times it's because, if I implemented it, it would break everything else in my system.

But for that, Linux wants people to patch, mm-hmm, and they wanna develop a patch that's good for their users. But without knowing that, oh, I didn't patch because of this issue, they can't actually serve the [00:27:00] community the best they can. So there needs to be that constant dialogue, I guess, is a more succinct way of saying it.

[00:27:05] Kyle Langvardt: Okay. And then the government could help developers kind of centralize information about who's using their project, how some end user's use of the project is interacting with other projects, all that kind of stuff. Would the government be setting new requirements in any way for developers, or for end users of open source software?

[00:27:33] Chinmayi Sharma: Um, so this is the part of my paper that I think gets the most pushback. And I understand, because anytime you recommend imposing liability on someone new, especially the tech sector, there's gonna be pushback. To do the census building that I talked about, I don't think you need to put new requirements on companies

outside of cooperating with this public-private partnership that's attempting to collect information. And I don't think that's particularly [00:28:00] onerous, because, mm-hmm, for example, right now the federal government is already requiring federal contractors to come up with their own inventory list of: I'm selling you this piece of software, and it includes all of these

It includes all of these. Different sub software components, so they're already having to do that internally, and this will just be cooperating with this larger holistic effort for the landscape. Okay. Going beyond that though, I am pro liability regime for open source, but not on the open source developers.

If I am building a project and I'm putting it out there into the world for. , I'm not telling you to use it. Mm-hmm. , I'm not. Mm-hmm. selling it to you. I'm not profiting off of it. If you decide to use it and you don't look through it to see that it's secure and good, and you put it in a product and you sell that product to someone else and they get hurt by it because it was vulnerable and got exploited.

That's on you. That shouldn't be on me. Yeah. The dumb analogy I always give is, if I picked strawberries from the side of the road and made a strawberry [00:29:00] shortcake and sold it at a farmer's market, and I gave people food poisoning, it's not the fault of the person who grew the strawberries. Yeah. You know, you can't pick things up from random places and not do your due diligence.

And so my, like, proposal for a liability requirement is you should be taking reasonable security precautions in your use of open source, and you should be doing this for all software. I think another thing I hear when I talk about this is like, yeah, well, that's no different from commercial software. I'm like, you're right. It is no different. Yeah. People should be doing secure development practices with open source as well. But the reason I think it is a little bit more urgent with open source is, when I don't take reasonable security practices, I threaten every other entity that is using the same project that I'm using, because that vulnerability could impact numerous other people.

So for example, say I'm a vendor that sells, uh, RapidBooks accounting software. Okay. Yeah. Uh, and I use open source code in it. And I have a pretty big market, and I don't vet that [00:30:00] code, and there are no security checks done to make sure that whatever patches have been released, I've implemented, et cetera, et cetera. And there's a vulnerability found. Yes, all of my downstream customers are going to be impacted by this. But once an open source vulnerability is found, every hacker out there knows that it's not just my direct customers that are impacted. Oh, I see. It's potentially every other company that could have used that project. And it's not hard to guess which companies are using which projects. You go to the project, you see how many downloads there were, how recent the commits were. There are ways to assess which project is likely to be pervasive enough that you should start attacking everyone, everywhere. And we saw that with Log4j, and so it's a different scope of impact.
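The triage Sharma describes here, judging a project's blast radius from public signals like downloads and commit recency, can be sketched as a toy scoring function. The field names and weights below are invented for illustration only; they are not any real registry's API.

```python
from dataclasses import dataclass

@dataclass
class ProjectStats:
    """Public metadata anyone can read off a package registry page."""
    name: str
    weekly_downloads: int
    days_since_last_commit: int
    dependent_packages: int  # projects that declare this one as a dependency

def exposure_score(p: ProjectStats) -> float:
    """Toy 'how attractive is this project to attack' score.

    The weights are invented. The real point is that every input is
    public, so attackers can compute this ranking just as defenders can.
    """
    reach = p.weekly_downloads + 50 * p.dependent_packages
    # Stale projects stay vulnerable longer once a flaw is known.
    staleness = 1 + p.days_since_last_commit / 365
    return reach * staleness

projects = [
    ProjectStats("tiny-util", weekly_downloads=900,
                 days_since_last_commit=1200, dependent_packages=3),
    ProjectStats("big-logging-lib", weekly_downloads=4_000_000,
                 days_since_last_commit=14, dependent_packages=18_000),
]
ranked = sorted(projects, key=exposure_score, reverse=True)
print(ranked[0].name)  # the widely embedded logging library ranks first
```

The uncomfortable symmetry is that everything the score consumes is public: attackers can rank targets exactly the way a funder would rank projects worth supporting.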

And so I think that it should be treated as seriously as the threat it is. And also because, just as open source has network effects, reasonable security practices also have network effects. So, for example, me being a very diligent, security-minded software vendor [00:31:00] can actually improve security for numerous other companies as well.

Because I'm taking such care that I am fixing things, I am telling the maintainer that there was a problem. Mm-hmm. I'm telling my downstream customers that there's an issue. I'm putting an advisory out there, like, hey, competitors, I know you might be using this too, but just so you know. If I start doing that, then you can have cascading effects on other entities.

[00:31:25] Kyle Langvardt: Yeah, that's really interesting. So this actually winds up being kind of an indirect subsidy, at least potentially, to the more popular open source projects. Is that right?

[00:31:37] Chinmayi Sharma: It's an interesting way to put it, and I'd have to think about it more, but I don't think that's wrong, because part of what I'm asking for is not just for companies to be reasonable users. Contributing to open source is not just giving them money. Yeah. It is also kind of providing them with the help and information that you would be doing internally anyway. Mm-hmm. So if I [00:32:00] wrote this code from scratch, whatever I am doing internally in terms of revising it, testing it, documenting the issue, telling my customers because I'm contractually obliged, you should have to do that with open source, even if you're not contractually obliged to. Yeah. And that's what I think would be a part of these reasonable measures. And in terms of subsidies, I think if you put out these requirements, what this ends up being is, I think, an indirect subsidy of security services for small to medium-size businesses, right?

So if you're a smaller company, you don't have the same resources that Intel does to do security checks. But if Intel is doing those security checks and has this kind of, mm-hmm, open source mindset of responsibly sharing this with the rest of the ecosystem, then the least that every small business is capable of doing is looking out for those notices, yeah, getting that information, and making it actionable. Yeah. Um, without having to use all the tech to scan all of their software and see what's going [00:33:00] on. And so I think, again, back to economic theory: put it on the least cost avoider.

[00:33:04] Kyle Langvardt: Yeah. Yeah. That's really interesting. Beyond putting liability on vendors, is there anything else that the government can do to try to avoid this commons problem that you describe? 

[00:33:16] Chinmayi Sharma: I think that putting liability on vendors is a start. Beyond that, this kind of quote-unquote census review I keep talking about, the reason it's important is because even if all of the existing major software companies become overnight compliant and responsible with open source, we still have open source libraries embedded deep in really important systems that we don't know are there. And they could be orphan projects. Like you said, the maintainer might have died or moved on, no one's touched it in years, and so you can't just rely on that maintainer anymore. Yeah, there needs to be a contingency plan. Do you switch to a different project? Do you find a maintainer and staff them on that old project? [00:34:00] It becomes a very complicated issue, because software is not something you can kinda just lift and shift. It's very hard to extract software from a network, and so that's just going to take a lot of coordinated effort on the government's part. Right. I think it's gonna be very expensive too.

[00:34:14] Kyle Langvardt: And does the government have a lot of flexibility in how it responds to these types of problems?

[00:34:18] Chinmayi Sharma: I mean, given it's the government, in some ways, yes, the most flexibility, and in other ways, no, because federal procurement is notoriously arduous, with numerous requirements, and very slow. And the reason open source has been such a boon, I think, to many federal agencies or software vendors is that you can pull that piece in without having to go through, right, all of the rest of these processes. So for, like, replacing that, I don't know how that would interplay, mm-hmm, with, like, the federal acquisition regulations and things like that.

But I think it's also a case-by-case basis. Like, for example, if something is so deeply embedded [00:35:00] that it would break things for us to remove it, then you need to find someone to take over that project. Right. And then, is that open source still, or is that just a government-owned open source project? And so I think there's gonna be a lot of those decisions made. And for example, if you do the census and you realize this wastewater management company is using this open source project, and there's no one running it, so I'm just gonna take it, I'm gonna clone it, I'm gonna run it in the government, and we're gonna try to make sure, mm-hmm, everything continues to work. There might be other entities out there that were still reliant on the original project that will now no longer get the benefit of, like, your new maintained project. And that could be okay. But that's kind of an equities-balancing thing that you'll have to do, yeah, when you decide how to resolve these things.

[00:35:48] Kyle Langvardt: Well, and so, as with so many things, just getting detailed information about where the threats are is the most important step, it seems.

[00:35:56] Chinmayi Sharma: Yeah. And money. So much money. We've already [00:36:00] talked about how wastewater has had so many ransomware attacks recently. Yeah. Attacks that have potentially changed the chemical levels of water and impacted the public, and they have no money. Yeah. They have no IT budgets, and so it's a problem. There's a big gap.

[00:36:16] Kyle Langvardt: But my sense in reading this was, for some of these problems, I mean, the actual dollar figures involved to correct them would've been really small, you know, six figures or less. Now, when you add up the cost at a systematic level, does it turn out to be a lot of money, or is it still surprisingly low?

[00:36:37] Chinmayi Sharma: Um, so that's why I think the census is so important. It's gonna depend on how many projects we find we need. We don't even know. We don't even know, and that's the problem. Yeah. Like, Log4j is a perfect example of a project that did not have adequate resources. So the Cyber Safety Review Board, which was set up by the Biden [00:37:00] administration, just did a review of the Log4Shell incident, and they found that if there had been enough resources and a security researcher on the Apache volunteer team that was maintaining Log4j, that vulnerability might have been found earlier and this issue could have been avoided.

[00:37:17] Kyle Langvardt: Why isn't this being done already? At least the census? I mean, it's surprising to me that the government doesn't already have this information.

[00:37:24] Chinmayi Sharma: To their credit, they are taking steps to get that information.

It's funny, between when I started my project and when I had to rewrite a draft, a lot has happened, and I think that's very good. Mm-hmm. But still a lot more needs to happen. But why haven't they done it to date? Because open source is the thing that was running everything behind the scenes that nobody talked about, mm-hmm, nobody cared about, nobody knew about. Why is it happening now? There is a man named Allan Friedman, who works at CISA, who basically single-handedly spearheaded this initiative for [00:38:00] software bills of materials, or SBOMs. Mm-hmm. And these would be the ingredient lists that would tell the government, when a contractor is selling to them, what is in that software.

What that solves is, when there are contract renewals or new software being sold, you'll know what's there. What that doesn't solve is for systems throughout the federal government. And I'm not just talking about the federal government, I'm talking about all the critical infrastructure sectors. Yeah. We don't know what's there already.
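The "ingredient list" an SBOM captures has to include transitive dependencies, not just the ones a project imports directly. A toy dependency walk (all package names invented for illustration) shows why a surface-level scan misses components nested several levels down:

```python
# A toy dependency graph: each key lists the packages it imports directly.
# The shape mirrors how a Log4j-style component hides several levels
# below the software you think you installed.
DEPS = {
    "invoicing-app": ["web-framework", "pdf-export"],
    "web-framework": ["http-server", "logging-lib"],
    "pdf-export": ["font-tools"],
    "http-server": ["logging-lib"],
    "logging-lib": [],
    "font-tools": [],
}

def transitive_deps(package: str, graph: dict[str, list[str]]) -> set[str]:
    """Walk the graph depth-first and collect every package reachable
    from `package`: the full ingredient list an SBOM tries to capture."""
    seen: set[str] = set()
    stack = list(graph.get(package, []))
    while stack:
        dep = stack.pop()
        if dep not in seen:
            seen.add(dep)
            stack.extend(graph.get(dep, []))
    return seen

# A surface scan sees only the two direct dependencies...
print(DEPS["invoicing-app"])  # ['web-framework', 'pdf-export']
# ...but the full walk also surfaces the logging library two levels down.
print(sorted(transitive_deps("invoicing-app", DEPS)))
```

A scan that only reads the top-level manifest reports two dependencies; the walk finds five, including the logging library that two different packages quietly pull in.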

And that's a highly manual task to do because, for example, Log4j was so many dependencies deep that even if you scanned for software at the surface, you wouldn't have detected Log4j, because software's a Russian doll: to use this software, I need to import this software, right, which also relies on these other two projects. And so it gets you some of the way, but I have not heard anything about it being retroactive, mm-hmm, to all [00:39:00] preexisting sales, and it is only for federal contractors. A large, large portion of critical infrastructure is privately owned. And so if they source software from non-federal contractors, they will not get SBOMs. And even if they source it from federal contractors, they might not get SBOMs, because those contractors are not required to give them to the non-federal government.

[00:39:20] Kyle Langvardt: Is there any just kind of, like, cultural resistance to governmental involvement in open source? A kind of, like, freak flag factor, where they just don't want the feds in? You mentioned that anarchists have been involved with open source?

[00:39:36] Chinmayi Sharma: Uh, I don't know that open source wants to claim that, but the mentality is not far off. Okay. Um, yeah, there has been a lot of evidence of pushback. I think there are right now a lot of policy initiatives that have tried to explore everything that could possibly be done without involving the government. There have, in the very, very recent period, been some initiatives that are [00:40:00] trying to explore the potential role for government. I've been in a lot of these conversations, and across the board, most people are scared to even bring up the term liability.

Hmm. They see the government as a nudger or a potential source of funding and nothing more. Beyond that, mm-hmm, they think that this is an open source problem and should be fixed by open source alone. And the reason I am skeptical of that (another example of what you were saying in terms of pushback to any kind of oversight) is, for example, PyPI. PyPI is a very popular open source registry, which means, basically, a lot of these Python projects will sit in that registry, mm-hmm, which organizes and supports them. They are now gonna require authors or developers on critical projects to implement two-factor authentication. Same thing most universities require students and faculty to do.

All it entails is, once you log on, you have to push a button somewhere else.

[00:40:57] Kyle Langvardt: Check your phone, and there's a code.

[00:40:58] Chinmayi Sharma: Yeah, [00:41:00] exactly. Yeah. The reason they did this is because the single biggest threat on the developer side is account takeovers, where people will try to take over an open source project to become the new, like, head honcho, mm-hmm, and then insert malicious code so that it flows down to everyone else that's using that project. Wow. They get affected by it. And studies have shown that implementing two-factor authentication reduces this by, like, 95% or something absurd. And still the open source community, not all of it, a minority, but a very loud minority, rebelled hard against this and said, we're just gonna take our projects offline.

But what happens when you take your projects offline? It's like you're standing on a road and all of the other roads get cut off, and nothing works anymore. Yeah. Because so much software might have depended on that one project. Yeah. And so I think that there is definitely pushback to this. And this might be the lawyer in me, but I think when you [00:42:00] see that, you know that nudges and incentives will only go so far. Mm-hmm. I think that that's when mandates come in.

[00:42:08] Kyle Langvardt: Yeah. Has anybody raised any First Amendment arguments?

[00:42:12] Chinmayi Sharma: Oh, there's always a First Amendment argument, and I don't wanna get out ahead of the horse in terms of that being brought, but, um...

[00:42:24] Kyle Langvardt: Okay. Maybe I'll have a paper to write on that. I was gonna say, you know, I take the First Amendment seriously. I don't take software development as speech particularly seriously, but...

[00:42:36] Chinmayi Sharma: Yep. I don't know. Agreed. It's hard to know what you do if you work back from the result: if implementing minimum security measures is a First Amendment violation, then nothing that touches tech can have standards put on it. Mm-hmm. And so that's a hard place to be.

[00:42:57] Kyle Langvardt: Yeah. That's exactly where it leads. Then you have a First Amendment right, basically, to manufacture a product...

[00:43:02] Chinmayi Sharma: Whatever you want. Yeah. However you want. Yeah. Yeah.

[00:43:04] Kyle Langvardt: That's insane. But I guess maybe just one final question. One big theme in your paper is that people enjoy developing software, they find it fulfilling, and people don't enjoy doing security work. Why is that? Why is security work hard, and why is it boring? Like, why does nobody wanna do this?

[00:43:29] Chinmayi Sharma: It's like, you might like cooking, but it's hard to find the person that's like, I like doing dishes as much as I like cooking. Um, security work at its core is tedious.

When I'm building a new feature, it's exciting. Mm-hmm. Like, there wasn't a button before, and now I wrote some stuff, and all of a sudden there's a button. Yeah. And then now I wrote some more stuff, and I click the button, and it does something cool. Instant gratification for a coder. That's what keeps you addicted.

Security is, you prove the [00:44:00] success of your work in the absence of something. So it's not that you get to see something happen; it's that, cool, a lot of time has passed and nothing's happened, and so I must be doing a good job.

[00:44:11] Kyle Langvardt: So, just to clarify, it's formally impossible to prove that a piece of software is safe. Is that correct?

[00:44:19] Chinmayi Sharma: I mean, no. That would be like saying it's formally impossible to prove any kind of product is a hundred percent safe. There is always gonna be a vulnerability in software, but I think you can get to the point where it's concretely safe enough.

[00:44:33] Kyle Langvardt: Oh, I see. Okay. But you can never get to 100%. And so I guess what that means is there isn't going to be some moment where you can say, okay, the job is done.

[00:44:42] Chinmayi Sharma: The job is done, that's exactly it. Okay. Security... So what's fun for me is, I build the project, I put it up, people are like, oh my gosh, this is the coolest project. And I'm like, awesome, moving on to the next thing. Yeah. It's boring to stick around. Um, I wanna get another new project where I get a lot of affirmation.

Security means [00:45:00] tying myself to that project for eternity. Reading the bug reports. Reading, you know, people are like, oh, I think this code will make it better, and I need to review that code, mm-hmm, before I incorporate it. People have feature requests, like, hey, could you add this thing, it'll make it more useful. And all of that takes so much time.

[00:45:16] Kyle Langvardt: And are bug reports pretty interesting, or are they...

[00:45:19] Chinmayi Sharma: Most of them are garbage. Okay. Yeah, most of them are pretty useless. But even then, you have to read the useless ones, and you have to prioritize them. You're like, maybe I'll get to this one eventually, but the fact that the color green on this screen is not the same as the color green on the other screen is not a vulnerability.

Yeah. Um, and so that's hard. And the thing is, especially when you're a volunteer, you didn't sign up for that. Mm-hmm. That's the kind of labor that you tried to get away from by doing this fun thing in your free time. Right. The second thing is, developers aren't trained on security. This is a big workforce issue.

Just because I'm a good developer doesn't mean I know how to do security well. They're trying to fix that with better curricula and [00:46:00] security trainings, but developers are not trained security professionals, so they can only do so much. And then the third thing is, on the corporate side, you might think that a corporate open source developer would be more security-minded, because they have more to lose if something goes wrong. But even on the corporate side (and this is broader than open source; this is for all software), feature development is always rewarded more than security work. So if you have limited time and limited funds, what you want your developers to be doing is building out new features. There's very little reward in it for a developer to do any security work.

And there's the totem pole: you have a boss, and the boss has a boss, and the highest levels care less about security than they do about building the new cool thing that they can sell. And this is a common theme in a lot of interviews I've done with corporate open source developers, which is, even if I wanted to, I'm not supposed to.

[00:46:58] Kyle Langvardt: Why is that? Is it just companies not being sensitive enough to risk?

[00:47:02] Chinmayi Sharma: I think the cost calculus just doesn't weigh in favor of investing in security. Yeah. It just doesn't hurt enough for them to have bad security. And we've seen this: Equifax was an unpatched open source security vulnerability. Mm-hmm. The fix was out there. They just chose not to implement it, and it caused a massive breach. And even after that, Equifax is back, and it's still around. It might have a consent decree in place, and it might need to have minimum security measures, but until that point, they were like, the probability of this happening is low. The fines are, some have criticized, not even that high. Why would I do all of this for something that has such a low chance of happening, and might not be that big of a deal even if it does happen?

[00:47:45] Kyle Langvardt: So, I mean, security itself is subject to kind of a commons problem. And then all of that just kind of explodes when it's open source software.

[00:47:55] Chinmayi Sharma: Absolutely. Yes. Yeah, that's exactly right. I think cybersecurity is a public good. You can't [00:48:00] exclude anyone from having it, and me using it does not prevent you from being secure in cyberspace. Yeah. But no one company feels the onus of securing the public. So for them, it's just, how much do I stand to lose as a company versus how much do I stand to gain?

And the opportunity cost of money spent on security is money spent on something that can make them more money.

[00:48:26] Kyle Langvardt: Thanks to Chinmayi Sharma. I know I've learned a lot in this interview, and also from reading the paper. For those of you who might be interested, the paper is forthcoming in the North Carolina Law Review, and as in-depth as Chinmayi was in this interview, the paper goes even deeper into this area, and you will walk away even more concerned about the many points of failure that open source software presents us with. And next week, Gus Hurwitz is gonna be back hosting the show, as he usually does. He's gonna be talking with Eric Goldman of Santa Clara Law School about Section 230. So if you are into Section [00:49:00] 230, be sure to listen to that, because Eric Goldman is Mr. 230.

[00:49:07] James Fleege: Tech Refactored is part of the Menard Governance and Technology Programming Series, hosted by the Nebraska Governance and Technology Center. The NGTC is a partnership led by the College of Law in collaboration with the Colleges of Engineering, Business, and Journalism and Mass Communications at the University of Nebraska-Lincoln. Tech Refactored is hosted and executive produced by Gus Hurwitz. James Fleege is our producer. Additional production assistance is provided by the NGTC staff. You can find supplemental information for this episode at the links provided in the show notes. To stay up to date on the latest happenings within the Nebraska Governance and Technology Center, visit our website at

You can also follow us on Twitter and Instagram @UNL_NGTC. [00:50:00]