Tech Refactored

Cybersecurity and Interdisciplinary Work

October 14, 2022 | Nebraska Governance and Technology Center | Season 3, Episode 8

Peter Swire, Law Professor at Georgia Tech, joins Gus to discuss a range of topics centered around cybersecurity, the benefits of an interdisciplinary approach, and the recently announced Privacy Shield Agreement.

Follow Peter Swire on Twitter @peterswire
Follow Gus Hurwitz on Twitter @GusHurwitz
Follow Nebraska Governance and Technology Center on Twitter @UNL_NGTC

Links
The Effects of Data Localization on Cybersecurity by Peter Swire and DeBrae Kennedy-Mayo
NGTC Website

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Gus Hurwitz: Welcome to Tech Refactored, a podcast in which we explore the ever-changing relationship between technology, society, and the law. I'm your host, Gus Hurwitz, the Menard Director of the Nebraska Governance and Technology Center. My guest today is Peter Swire.

[00:00:27] Peter Swire: My name's Peter Swire. I'm a professor at Georgia Tech in Atlanta, Georgia. 

[00:00:32] Gus Hurwitz: Swire has been a leading privacy and cyber law scholar, government leader, and practitioner since the rise of the internet in the early 1990s.

[00:00:40] Peter Swire: These days, my majority appointment is in the School of Cybersecurity and Privacy, which is part of the College of Computing.

[00:00:45] Gus Hurwitz: He also has an extensive background serving in government, having worked with both the Clinton and Obama White Houses in various capacities. And he also has experience working with industry, including having worked with organizations such as the World Wide Web Consortium on the [00:01:00] development of the global Do Not Track process.

Needless to say, he has a true interdisciplinary background. His work has been influential around the world, including having helped to shape global privacy agreements such as the Safe Harbor and Privacy Shield agreements, which help to facilitate how global companies can do business in both the United States and the European Union.

My conversation with Peter covers a wide range of materials, from teaching cybersecurity and privacy topics to interdisciplinary students, to the nature of interdisciplinary work. We also spend some time towards the end of our conversation discussing the recently announced executive order on the Privacy Shield agreement, which is the latest effort to coordinate privacy law and values in the United States and European Union to allow American companies to do business with European citizens.

Let's start with cybersecurity and this interdisciplinary approach that you are taking to it. And I, I'm a cybersecurity person. [00:02:00] I, I think most people talking about cybersecurity policy today recognize that this field sits somewhere between technology, computer science, business, and the law. Uh, I'll just ask, what is the field of cybersecurity?

[00:02:13] Peter Swire: Well, I've tried to write about that some. I've been in the field long enough, I've tried to write about a lot of things. But, um, there's a, uh, set of computer science things that's sometimes called the OSI stack, which goes from level one, which is the physical layer, to layer seven, which is the applications.

And computer scientists are pretty familiar with the idea that there's these different layers and each layer has characteristic risks and attacks associated with it. So in the Communications of the ACM a few years ago, which is a computer science magazine, I proposed a pedagogic cybersecurity framework that proposed layer eight, which would be the organization: What does the CISO do? Layer nine is the state or government: What law and policy is done at the national level? And layer 10 is [00:03:00] international relations.

You know, US and Russia and China and allies. And so I think this framework helps us realize that you have business schools because layer eight's about organizations and management; you have policy and law schools because layer nine is about the state; and you have international relations and the law that goes with that, that's layer 10, international.

So all of those fields have an obvious role in handling some of the cybersecurity risks that organizations face. They often fail, and I think that's why, characteristically, when you face hard problems, you need the computer scientists and the organizational folks and the law folks, the policy folks, the international relations folks.
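To make the framework concrete, here is a minimal sketch of the ten-layer mapping Swire describes. The labels for layers one through seven are the standard OSI names; the notes for layers eight through ten paraphrase the discussion above, and the structure itself is purely an illustration, not something from the interview.

```python
# A sketch of the extended "pedagogic cybersecurity framework": the usual
# OSI layers 1-7, plus layers 8-10 for the organization, the state, and
# international relations, each with a characteristic concern.
CYBERSECURITY_LAYERS = {
    1: ("Physical", "cables, radio, hardware"),
    2: ("Data link", "switches, local framing"),
    3: ("Network", "routing, IP"),
    4: ("Transport", "end-to-end delivery, TCP/UDP"),
    5: ("Session", "managing connections"),
    6: ("Presentation", "encoding and encryption formats"),
    7: ("Application", "the software users interact with"),
    8: ("Organization", "what the CISO and management do"),
    9: ("Government", "national law and policy"),
    10: ("International", "relations among states and allies"),
}

for number, (name, concern) in CYBERSECURITY_LAYERS.items():
    print(f"Layer {number:>2}: {name:<13} - {concern}")
```

The point of spelling it out this way is the one made in the conversation: each layer has its own characteristic risks, and each of the added layers corresponds to a discipline that has to be at the table.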

[00:03:37] Gus Hurwitz: So I, I love that framing, in part because, putting my engineering, computer science, networking hat on, most folks know that the OSI model is a reference model that no one actually uses in practice. Um, that, uh, in practice when you're implementing a network, you use a simplified model or you have things that cross between the layers.

So I, [00:04:00] I would ask, what is the interaction between the layers? The formalization helps us understand who the actors are, but not how they interact with each other.

[00:04:09] Peter Swire: Well, I, I mean, one way to describe what I was saying is, here's reasons why all these folks need to be at the table when you're facing overall problems here, or why, in our School of Cybersecurity and Privacy at Georgia Tech, my view is we should be having international relations and management and law and policy.

You know, as part of what it takes to address cybersecurity and privacy. In terms of crossing the layers, these abstractions can't ever capture the reality, but the abstractions help us simplify, to remember some key facts. And so that's what I was trying to do. Uh, when you actually build a system, you have to be aware that when you're building the network, you're gonna need the network people,

and you're gonna need the encryption people, and you're gonna need lots of applications people. So it's, it's not a surprise that these different people with different sets of skills are gonna be needed to face the overall challenge.

[00:04:57] Gus Hurwitz: So I'm going to ask an [00:05:00] impossible question, and since it's impossible, I don't care that I'm going to ask it badly.

I'm just interested in using it to get you to talk about the general idea: why is cybersecurity so difficult? Is it a technical problem? Is it, uh, we don't have people who are trained to think about this? Is it a, uh, we don't have the right set of laws in place? Why is this a difficult field, and what's the direction to go in order to make it more tractable?

[00:05:25] Peter Swire: One of the things I say to my students early in the semester when I teach cybersecurity is that on the internet, we all live in a bad neighborhood. We're milliseconds away from an attack from people who might be physically thousands of miles away from us, living in a different culture, having different motives than we're familiar with.

So in the physical world, we learn how to build walls that are pretty good. People spent centuries building castles that were pretty hard to get into. But on the internet we expect there to be interpenetration and the ability to interact; just going to a webpage is interaction. And because of that interaction [00:06:00] and because of

speed-of-light transmission and the global scale, we are all on the defense against a large, complex set of threats all the time. And there's a well-known, you know, cliche for, for armies: you can't defend everywhere at the same time, especially if they can concentrate their forces on the attack in one place.

And so, if you defend everywhere, you're defending nowhere, is one of the cliches on the military side. So that's enough to generate why it's hard. And then the, the next part is that there's a, a fundamental trade-off between security and usefulness that every organization faces. You know, there's videos about how you can lock down a computer so much that it doesn't connect to the internet, and you bury it under the sea, and people can still dig it out from under the sea and, and get into it eventually.

There's no perfect way to have a useful computing device that is fully secure. And so companies under budget constraints, governments under their constraints, individuals with [00:07:00] limited cognition, not knowing how to do everything. All of these are limitations that mean we have to use simplifications, and that means a determined attacker, in many settings, can break through.

[00:07:12] Gus Hurwitz: And that's fundamentally different than in the real world. Or I guess it's, it's not fundamentally different. A determined attacker can get through the front door of my house easily, but it's very unlikely that a determined hacker is going to try and do that. My default posture in real life is not defensive: when I wake up in the morning, when I go outside, I don't think I'm likely to be attacked.

Online, your default posture needs to be different. Are we suited to the online environment? Does the online environment need to adapt, or be reconstituted, restructured in a way that has built into it an assumption that the, the users online can't always be in a defensive posture?

[00:07:58] Peter Swire: There's a sort of [00:08:00] well-known move in cybersecurity from an earlier castle model, or M&M model. The M&M model is a hard exterior and a soft, chewy middle. So once you break through the shell of the M&M, you know, you get everything sweet. That kind of model, where you could put a real firewall around the organization, doesn't match the kind of interpenetration of an organization and the outside that we live with today.

So the, the cliche, I think there's two key cliches that you have. One is zero trust. So Gus said he was Gus, but I have to authenticate him again every time, because maybe he's "Gus" the man in the middle. So that's zero trust, which has gotten much more attention in the last few years, including from the Biden administration. And then resiliency.

So here's a great thing about credit card numbers. If somebody steals my credit card physically or steals my credit card number virtually, I can call American Express and in no time it's canceled and I get a new one. So the loss of a credit card number is not bad. And even [00:09:00] better, I can have insurance policies, like the bank pays for my losses, and LifeLock might pay for any identity theft that comes along with it.

So the insurance is, is making me have some protection, to rebound from the attack, to be resilient. So those are good features. By contrast, biometrics are not resilient. Somebody steals a really good image of my fingerprint; it's very hard to get a new finger. Right? So this is one reason to be really cautious about believing that static biometrics are gonna be a good way to handle attacks, when we know databases get corrupted.

That's the point I've, I've often made. So, zero trust and resiliency are ways that we don't make the assumptions of 20 years ago. For cybersecurity, we have to be able to bounce back from the known attacks we're gonna live through.
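To illustrate the zero-trust idea in code, here is a minimal sketch of a service that re-authenticates the caller on every single request rather than trusting a prior login or the network the request came from. The shared key, the token scheme, and the function names are illustrative assumptions, not anything described in the interview.

```python
import hmac
import hashlib

SECRET_KEY = b"example-shared-secret"  # hypothetical key, for illustration only

def sign_request(user: str, payload: str) -> str:
    """Caller attaches a fresh signature to every request."""
    msg = f"{user}:{payload}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

def handle_request(user: str, payload: str, signature: str) -> str:
    # Zero trust: verify identity on every request, never because of a
    # previous session or because the request came from "inside" the network.
    expected = sign_request(user, payload)
    if not hmac.compare_digest(expected, signature):
        raise PermissionError("authentication failed; request rejected")
    return f"processed request from {user}"

# Usage: even a caller who authenticated a moment ago must prove identity again.
sig = sign_request("gus", "read:records")
print(handle_request("gus", "read:records", sig))
```

The design choice mirrors the conversation: the service never relies on who the caller claimed to be a moment ago; identity is re-verified each time, which is what defends against the man-in-the-middle scenario Peter describes.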

[00:09:45] Gus Hurwitz: It's a nice demonstration of the trade-offs. My fingerprint, or my biometric markers, are much more usable.

They're always with me. I don't need to worry about remembering them or remembering what they are or forgetting them at home. At the same [00:10:00] time, they have a different threat model, a different attack vector. They have a different negative use case associated with them. Passwords are like credit card numbers.

Yes. They can be changed. They can be thrown out. Yep. So I want to go back to the interdisciplinarity of the field. Folks in business think about these issues differently than folks in engineering and computer science, who think about them differently than folks in domestic law, who think about them differently than folks in international relations or government,

who think about them differently than folks in the military. The, the answer has to be that these are constructive views, that they can work together, but in practice they oftentimes seem like they are possibly even antagonistic views; at least they are veto gates, or they're challenges to each other. How do we get these different voices thinking constructively together, if they are in fact able to do so?

[00:10:56] Peter Swire: Well, so there's, there's two basic ways [00:11:00] to handle these multiple disciplines. One of them is for an individual to become multilingual, to speak engineering and to speak law and to speak military. And the other is to have a team that's multilingual, that you have six people, or up to six people, to represent the six fields you're worried about.

And I think that early on, when the military people have never talked to the corporate people, have never talked to the engineering people, you need people from different silos to begin the conversation, to begin to find the common vocabulary, to begin to find where are the places of similarity and difference, because you don't have somebody who speaks the six languages.

I think what does happen is, you know, where cybersecurity started: like, I, I taught it, as I said to you earlier, in 2004, the law of cybersecurity, so that's 18 years ago. No one had taught, as far as I can tell, the law of cybersecurity as a semester course before that. So we're 18 years in, so there's more people who've had an opportunity to get trained in at least two or three or [00:12:00] four of the disciplines.

And also we have institutional learning. So we have procedures in bureaucracies or companies now, which is, you've gotta check with the product person, you gotta check with the marketing person, you gotta check with the different functions for a data breach. You know, the California data breach laws were roughly 20 years ago.

Nobody had a data breach plan, while any responsible organization today has a data breach plan. They say: call number one, call the CEO; call number two, call the CSO; call number three, call the lawyer. So it has how to get the relevant expertise in the room, and it codes it in a corporate policy that says who you need to call.

So we build up, and we're doing this with supply chains and SolarWinds. So we're building up layers of institutional experience and policies that instantiate these conflicts, so that we know what to do when the next one arises. And time helps with that. It isn't gonna be the first rodeo anymore. It's not the first data breach even for the company.

It's not the first data breach. So, there's procedures in [00:13:00] place; there's people who've been through this before.
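As a small illustration of the kind of codified breach-response plan Peter is describing, here is a hypothetical sketch that writes "who to call, in what order" down as data. The roles and their ordering are assumptions for the example, not an actual plan from the interview.

```python
# Hypothetical illustration: a data-breach response plan encoded as data,
# so the call order exists in a corporate policy before any incident happens.
BREACH_RESPONSE_PLAN = [
    {"step": 1, "call": "CEO", "why": "owns the overall business response"},
    {"step": 2, "call": "CSO", "why": "leads technical containment and forensics"},
    {"step": 3, "call": "Legal counsel", "why": "handles breach-notification obligations"},
    {"step": 4, "call": "Communications", "why": "manages customer and press messaging"},
]

def run_playbook(plan):
    # Walk the plan in order, announcing who gets called and why.
    for item in plan:
        print(f"Step {item['step']}: call the {item['call']} ({item['why']})")

run_playbook(BREACH_RESPONSE_PLAN)
```

The value is exactly the institutional learning Peter points to: because the plan is written down in advance, the response to the next breach is not anyone's first rodeo.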

[00:13:02] Gus Hurwitz: I wonder, are those procedures themselves going to be the eventual attack vector? And the idea that I have behind that, uh, isn't just, we can talk about, or, uh, lawyers frequently talk about, roadmaps to evasion, and

when you have rules and laws in place that tell attackers what the possible vulnerabilities are, that, that's one way that we could take the discussion. But I'm actually thinking something that I think is a bit more, uh, nefarious or troubling as a thought. You taught, uh, a cybersecurity class in 2004. You are on the cutting edge of these issues.

You have been working in this area the, uh, majority of the time that this field has been maturing, that the technologies have been maturing. And I started working on these sorts of issues back in the early days of the internet, and I started as a computer programmer working on, uh, kernel programming and deep in the network stack and all of this [00:14:00] stuff.

And back then, everyone working on these issues had some facility with the technology, because they were the only people interested in these issues. But nowadays, we talk about digital natives, we talk about everyone is using the technology. It's no longer just the folks who understand the technology using the technology.

So I wonder, as we reach maturity with these technologies, and as you say, we experience an incident and we figure out how to respond to it, and we experience another incident and we figure out how to respond to that, and we have some evolutionary path dependency in what we're doing, we're going to be looking under the street lamp,

constantly looking under the street lamp, to figure out how do we respond to these dangers. The world is much larger than under the street lamp, so are we defending the right place when we're just defending under the street lamp?

[00:14:53] Peter Swire: Yeah. But the, the advantage we have is we're subject to so many attacks. So if you were subject to one attack [00:15:00] and you learned how to defend against it, that's the street lamp; that's being in one place out of the whole block or outta the whole city.

But Microsoft, I used to be on Microsoft's advisory board for privacy and security. They have the advantage of getting attacked, you know, I don't know how many zeros to put on it, many, many, many times a day. And that's been going on for 20 years, since they did their big security shutdown 20 years ago. So now they're in the millions or billions or trillions of experiences.

At that point, you have a map of threats that's really pretty big and elaborate. You can do lots of big data on it. You can do machine learning on it. So I think one of the features of cybersecurity is the attackers get to try at scale, millions of times, and the defenders eventually get to try at scale millions of times.

And in the physical world, if you're physically going in against a machine gun, it's your first time and you might get shot. But if you had a million tries to look at the paths to get to the machine gun nest, you might find one that's safe to get there. So [00:16:00] both the attackers and the defenders scale up and have a lot of learning.

One of the things I think SolarWinds taught people is that the attackers weren't getting good stuff anymore out of the easy attacks. The Russians, to get in, had to do this super elaborate attack on this super complicated and defended software by SolarWinds, and once they got in, then that supply chain

vector would get them into the places they really wanted to find out about, and maybe they could escalate privileges. That's super complicated. You don't do that kind of super complicated attack if the easy attacks still work, so hallelujah, a lot of the easy attacks don't work. Script kiddies, idiot attacks, those are much less effective than they were 20 years ago.

[00:16:48] Gus Hurwitz: Mm-hmm. So it's a really nice counterpoint to the idea that I don't worry about my front door being knocked down in my real life. And if we were [00:17:00] in a world where my front door was constantly getting attacked, well, I would be very concerned just about waking up every morning. Well, it turns out it's great that everyone's front door is constantly being attacked,

not because I'm making a better front door, but because there's a market for it. And the front door manufacturers, yeah, are also getting attacked, so they're making better front doors.

[00:17:21] Peter Swire: Right. The SolarWinds example is one of many I've heard from people in the field, which is the low-sophistication attacks are failing a lot and are not being that remunerative going forward, which is another way of saying that the relatively fancy defenses

are pretty good against a lot of the normal attacks, and this is one reason we're seeing more attention to advanced persistent threats, because they have the creativity and the resources to get past today's defenses. But there's a sort of happy story, which is the sort of fairly dumb criminal doesn't win anymore.

[00:17:57] Gus Hurwitz: I want to, uh, pivot to one of your other [00:18:00] hats, or the intersection between two of your hats. You, you mentioned that, uh, you've done work, you were on the, uh, Microsoft, uh, privacy and security advisory board. You've done a lot of work in industry in that sort of role in addition to being a professor, and you've also served, uh, multiple times in pretty impressive government positions

working on these areas. I wonder if you could reflect on the relationship between your academic work in these areas and your government work, and how they relate to each other, how they benefit each other.

[00:18:32] Peter Swire: One thing you've just pointed out is how old I am, but, you know, that's okay. I'm, I'm used to that.

I'm ready for Medicare next year. But, um, what comes to my mind is, if you're talking to business and you're talking to government and you're talking to big-theory academics, that you need to be very aware of your audience, and that's true for the multidisciplinary things we were just talking about. I teach at Georgia Tech.

Half my students are computer scientists, so I better say it in a way that an [00:19:00] undergrad or graduate computer scientist can get it. And some of my students are public policy students or business students or whatever. And I used to teach law students. So then how can I stand up in a room of a hundred people in my privacy or cybersecurity class and say things so that the whole audience comes along with me?

And that's partly practice. I'm better at it than I was as a junior professor. Partly it's going into business settings and seeing how to talk to business people, who've got the stock price and the bottom line on their mind every moment; going to government people, where it's the next election or it's the bureaucratic imperatives of the agency.

And what do they care about? So you try to think about, for the audience, what do they worry about when they wake up in the morning, and you try to figure out what are the words they use. So I use less and less jargon when I can, and I also try to find ways to tell stories that make the point without having to do the theory.

So I've used a story with you in a previous setting: they were gonna put bankruptcy records online, back about [00:20:00] 20 years ago. That would turn out to put bank account numbers, social security numbers, and the balances of the bank accounts online as well, providing a marvelous target for the bad guys. And as soon as you tell that story, the people who are gonna put it all up online have to stop and say, oops, I really didn't think I was facilitating massive theft.

Now we gotta figure out some way to deal with it. Now, you asked about the academic side and the practical side. So to talk to government or business or my undergrads, I have to say it in words they'll understand, and I tell a story that makes the point. Now, for the theory folks, for the law professors, it's possible that I'm not a great fit, because I'm so weird in what I'm interested in.

So if I was a brand new professor, I would have a hard time writing for the Harvard Law Review right now, cuz I don't put it into the set of footnotes that, you know, show the right deference to the right theories that would make the second year law students at Harvard pick my piece. And I've mostly over time found publication outlets that don't require [00:21:00] that.

Early in my career, I published in Duke and Virginia and Texas and, you know, all those kinds of places, and, and I had to try to figure out the game of the second-year law students' law review articles. But now I have enough outlets that usually I can find some way to publish it. And I don't care in an online world so much where it's published, and I don't need the credentials cuz I have other credentials.

But for junior academics, you have to persuade your colleagues that you're a real academic. And by the way, when you go into government for a rotation, if you do that, you have to persuade the government people that you're not some airy-fairy academic. You have to persuade them that you are a government person doing serious government business.

Mm-hmm. And so how do you talk to that audience? What do they care about? What's the job? When I walk into the meeting, a lot of law professors, in my experience, having been in faculty meetings, don't use every moment in a meeting to achieve their goals for the meeting. They sort of like to talk. And that's sort of like partly why they're law professors.

I've seen law professors [00:22:00] argue the room out of the position because they talk for too long. So when I go into a meeting, I think about my goals for the meeting. This would be in government especially. I think about, what do we have to achieve? We have, you know, 45 minutes to do it. How are we gonna use the time to get to that goal and then get it done?

And when I'm, you know, giving an academic talk, then I have to figure out how to say it to that audience. But it's one more audience, instead of being some difference in kind, when I think about it.

[00:22:24] Gus Hurwitz: Which raises a, a related issue: the role of the professor, law professor, or any type of professor. We have multiple audiences in everything that we do.

Mm-hmm. Um, we are doing research, talking to our colleagues in our field. Some of us are trying to influence policy, talk to folks in government or industry. Some of us are trying to get consulting gigs, so we need to be relevant to, uh, full-on, really applied industry. And of course we're talking to our students, and all of our students have different goals as well.

Um, some of them want to be researchers, some of [00:23:00] them just want a job. Some of them don't know why they're there, and they're hoping that we're going to entertain them. So lots of different purposes, um, in whatever it is that we're trying to do. And, and I'll go back to your unique position in such an interdisciplinary role, and, uh, you, you mentioned this before: how do you speak to such a diverse range of students, with professional interests, academic interests, and experience from dis- different disciplines?

[00:23:30] Peter Swire: When I got to Georgia Tech, I left law school teaching and I went to Georgia Tech with engineers and, and computer scientists and stuff. I didn't know how to teach them. Turns out business students wanna know how this class is gonna help their business plan, and computer scientists like code to run.

And, you know, engineers like to build things. And I had to go through a period of listening to try to understand who these people were and why were they in the class and what was it that I could do that would work for them. My wife's a requirements engineer, a software engineer; requirements engineers go out and interview all the stakeholders for a project and then figure out what the constraints are given all those stakeholders, and then try to come up with a path to get the project done.

And I tried to do that by listening: reading the student comments, you know, listening to what they said, you know, when they came into office hours, talking to them. What did you like about this, or didn't like about it? I think what I saw was, especially in cybersecurity, which is more technical, my evaluations went up as I listened longer and as I practiced saying things that seemed to work for them.

But that's feedback, that's learning from experience. That's not thinking you're the pro, but instead trying to learn from the people you're trying to communicate with.

[00:24:41] Gus Hurwitz: So I want to transition to a completely different topic right now. We, we've kind of buried the lead, um, with, uh, our, our conversation. Um, the big news, or some of the big news in both of our worlds right now, is President Biden's recent executive order

about, uh, the new proposed Privacy Shield agreement, [00:25:00] which your own work has heavily influenced. I will just invite you to tell us a bit about what the setting is here and what the executive order does. Then we can, uh, talk a bit about that.

[00:25:13] Peter Swire: Okay. I could bore all the non-specialists really quickly, and I'm gonna try not to do that.

So, Europe has stricter privacy laws than the US. And they have a Court of Justice, their Supreme Court, that's notably strict and has been very worried about the US government doing too much surveillance since Snowden. And so there's been a series of cases brought by an Austrian lawyer named Max Schrems that have ended up saying you can't transfer data to the US because

there's too much NSA and other government surveillance. And there have been increasing enforcement actions in the last two years, where with Facebook it was serious enough that they might have to close down in Europe, that they told their investors that this is a material risk, that they'd have to close all European operations. And Google [00:26:00] Analytics has been found to be illegal by at least four data protection agencies in Europe, in different countries.

Okay, so this is pretty serious stuff. And the fines can get big; the fines can be 4% of your global revenue if you're a big company. So in the last round of this case, the Court of Justice said there were two problems, and the one I focused on is what's called redress, which is, a person in Europe has the right to make sure that nobody's processing their data incorrectly.

They can go to their local government or their local company and say, hey, are you following your rules? And under the Court of Justice's view, they can also come to the NSA and say, is the NSA following all the applicable law and not looking at it too much? Whether or not you think that's a good idea,

that's the doctrine from their highest court. And, um, in the privacy field in 2016, Europe and the United States came up with this kind of half-baked idea of what they called an ombudsperson, which was like 20% of the independence that a European person would have in the same job. It was [00:27:00] badly done. I never said it was worth it.

I never approved it. Not that anybody cares, but I just thought it was badly done from the day I saw it. Mm-hmm. And guess what? When you do a half-baked thing and it goes up to the Supreme Court, they say it's half-baked, and they say, go back and do it better. And so when they, when it came out, and I, I run a think tank called the Cross-Border Data Forum.

And I've had the pleasure, or not, of having worked on these issues with Europe for many years. Wrote a book about it in 1998, was in the US government during Safe Harbor in 2000. So I've been around these issues for years, and my view was redress was hard. It was actually a difficult problem. There wasn't any answer that anybody had

that would meet all the criteria for European law, the six or seven things the court said you must do, all six, and would fit the different and somewhat weird US constitutional structure, with the role of the executive and stuff.

[00:27:53] Gus Hurwitz: The basic idea here is, I'm a European citizen and I have concerns about how my data has [00:28:00] been used, accessed, shared by the United States government. There needs to be some mechanism by which I can complain and get redress. I have to have, uh, something that can be done about it.

[00:28:11] Peter Swire: Right. That's a good description of the rule. And so, with other academic experts and a former State Department lawyer and a French law professor, we started working on this, and we gave it a try in 2020.

We gave it another try late in 2020. We spent a year trying to spec out all the aspects of the problem, and by last January, February, we published a detailed roadmap for how you might do this. And my view is I have not seen any other proposal that actually meets the criteria or comes close. So it's not that I love this sort of crazy-quilt, weird thing; it's independent, but it's part of the executive branch, as an executive order.

But there's a regulation. It's not like I love this structure, but I say it wasn't a compromise. I say it's like a Rubik's Cube: there is [00:29:00] one solution that does appear to meet all the criteria, and that's the one that was announced on October 7th by President Biden and the Attorney General. It's pretty much the thing that we wrote in January and February.

That's not totally an accident. We had been talking to the government people, and we had been academics giving our ideas, and they said what they thought things were. And so when we published it in February, you could see it as a trial balloon, right? Here's this complicated, weird thing that nobody started with, and it went out into the world and nobody shot it down.

There hasn't been any substantive takedown of the piece, in my view, that really engages with what it says that it does. So we've had, what is that, eight months since we published it. No one's taken it down. The governments have kept negotiating the fine points, and so now it was announced on Friday. I'm pretty optimistic the European courts will see that this answers what they said the six criteria are.

There's other parts of this new deal on Friday about, is the US doing disproportionate and unnecessary [00:30:00] surveillance. I think that's a vaguer standard and is more in the eyes of the judge. But on redress, the US flunked with Privacy Shield in 2016. But I think the governments have done their homework this time.

So I think we have an answer on that particular requirement. 

[00:30:15] Gus Hurwitz: So the, uh, 2016 Privacy Shield, as you noted, had this ombudsperson approach, and what you had proposed is some sort of independent commission. Can you explain what that independent commission is?

[00:30:29] Peter Swire: Well, so on Tuesday, October 11th, when we're taping this, with my co-authors, we published in the IAPP, the International Association of Privacy Professionals,

a fairly detailed article explaining this, and I'm not gonna read it, but the basic idea is the attorney general, by regulation, can limit his or her own discretion when they set up this independent court kind of thing that would look at the complaints. The people are not government employees; they're independent, and they can't be [00:31:00] fired unless they break the law, and they get to look at all the data to make sure they're doing the investigation right.

So basically, the central part of this proposal, or not proposal, now it's the law of the United States. The central part of the new structure is that, uh, the individual in Europe can have these independent adjudicators who will scrutinize the record. And if they say, you gotta delete the data, NSA, the NSA is told by the president, you gotta delete the data.

And so now we have a system. It has to be implemented; they have to build it over the next 45 days or whatever. But basically we have a system to meet, in my view, the European requirements.

[00:31:38] Gus Hurwitz: And the core idea there is that the attorney general can effectively tie himself, herself to the mast, exactly, issue a regulation that says, we cannot, the president cannot order me to, uh, share this information

without going through a significant amount of procedure, which is going to make this public and give affected [00:32:00] individuals the opportunity to, uh, to object. Yeah.

[00:32:02] Peter Swire: I'm gonna, I'm gonna have you say this the next time cuz I'm, I'm getting tired of saying it. 

[00:32:07] Gus Hurwitz: Uh, so, uh, I, I guess the question is, is it gonna work?

Um, we, we know that Max Schrems, the, uh, Austrian activist who has brought two previous pieces of litigation that, uh, led to the rejection of the, the Safe Harbor that had been the effective law of the land for 15 years or so, in the early 2010s, and then the Privacy Shield in 2016. We just know he's probably already written.

He said he will. Yeah. Yep. So he, he's going to challenge this. Is it going to survive? Are we in a perpetual state of European courts rejecting whatever the agreement is, and then the US and European governments that want an agreement to exist negotiating something else?

[00:32:52] Peter Swire: So I've really put my energy into working on the redress part.

And my view is that this [00:33:00] is a good-faith answer to each requirement of European law. Now, can a judge who wants to strike something down find something they don't like? Yes. But I think a fair-minded judge would look at redress and would say, given the US constitutional system, the US has really, really tried here to meet all the requirements, and it actually formally has met all the requirements.

So I think on the redress, um, I'm reasonably optimistic. Of course, I've worked on it, so then maybe I'm a bad witness, and, you know, maybe I'm unfairly biased in favor of the only current proposal to meet all the requirements of European law. On the necessary and proportionate part, I think American listeners would, you know, think of, what's a reasonable search under the Fourth Amendment?

And we know that judges over time, and as technology has changed, have given really different answers to things. Under Olmstead, a wiretap was on a public street, so it was okay for the government to wiretap; after Katz and other cases, you need, uh, Fourth Amendment protection before you can wiretap somebody.

So we've [00:34:00] seen courts in the United States fight over what counts as reasonable, and the courts in Europe to date, the Court of Justice in Luxembourg, has been strict about government surveillance. They're in a big fight right now with all their member states about whether the member states can keep police records about who's been using the internet.

It's called data retention, and the police have gone back to the court, I think four times now, and the court keeps saying, no, we really mean it, you can't do this thing. So the member states are under that pain of their own criminal systems not being able to operate the way the governments in Europe think they should, and they're feeling the pain because the highest court has a strict view of the subject.

I don't have enough knowledge on necessary and proportionate to have a view on it. There's been important steps taken this time that the US wouldn't take last time in Privacy Shield. The United States has said, as a matter of law, in the executive order, they will only do intelligence [00:35:00] surveillance that is necessary and proportionate.

That's the European legal standard. So the US government, while saying that we interpret it under US law, has taken the European legal formulation and said that is now the law of the land in the US when it comes to intelligence surveillance. The US was unwilling to do that in Privacy Shield in 2016; so hurray, Schrems has gotten the US to adopt the European legal standard.

Is that enough? Are the other safeguards enough? Is a wiretap unreasonable or not? You know, we, we have these problems on that issue. One of the things I would say is that reality intrudes on judges in different ways, at different moments. I think after Snowden,

My own view is if we look at the Ukraine invasion in February of 2022 by Russia, the US intelligence services are look pretty darn good. They were saying before the attack, the attack was gonna happen. You have to take it seriously. It could be a broad attack. Uh, Ukraine took various measures [00:36:00] including moving, backing up their records out of country so that they wouldn't be bombed out of existence.

So the US intelligence services not only helped Ukraine, but the frontline states, like the Baltic states and Poland and others, people in those countries look at what the US intelligence has done, and they think that the US intelligence has helped save their countries from a very grave existential risk.

Maybe some judges from Estonia will look differently when the court case comes this time. That's a realist, you know, what-did-the-judge-have-for-breakfast kind of story. But historical facts matter. Snowden has receded. The US has done a lot of what Europe has asked for, and the US is working its tail off to try to protect Eastern Europe right now.

So maybe the US gets a little bit more of a benefit of a doubt. 

[00:36:43] Gus Hurwitz: Yeah, and the situation that you discuss as well, internally to the European Union, is it possible for the Union itself to change these legal standards?

[00:36:55] Peter Swire: Well, as we discussed in, in an earlier conversation, Gus, the Treaty of Lisbon is what has [00:37:00] established the Charter of Fundamental Rights, which is the legal basis for the strict court.

The Treaty of Lisbon does not have an amendment process. The Europeans I've talked to say they'd have to renegotiate the entire Treaty of Lisbon, which is like going to a constitutional convention again for the United States. Well, that ain't gonna happen; nobody in Europe thinks that's gonna happen. And, and so if there's a really dysfunctional Supreme Court there, the Court of Justice of the European Union, they didn't design their instrument very well to find ways to put limits on it.

And you get constitutional crises at moments like that. You know, there was a conservative Supreme Court in the 1850s, and, and they came out with Dred Scott, and they had their view of things, and that became a rallying call for massive change from the North. Now, obviously the cases are entirely different, but I'm just pointing out that if judges go in a direction that's different from what the society can withstand, at some point the society pushes back on the judges. And, [00:38:00] you know, to say that the US is the big risk in the world to Europe,

we have consensual commercial activity, and, you know, at some point I just don't think the US is the big risk. Mm-hmm. And so I don't know how the Europeans will go through their own process to govern themselves here, but I do think the court is at risk of really being outta step. For the police stuff, like data retention,

the member states are increasingly frustrated there, and they're, they're basically, some of them are not following the court's decrees anymore on data retention. So when you get that kind of lack of rule of law, because the court's rules are unworkable, at least in the eyes of the government, there's a constitutional crisis brewing in Europe that's, uh, that's nontrivial.

[00:38:40] Gus Hurwitz: Well, I, I think it's time to start wrapping up this discussion, but I, I just want to end on a, a note of thanks and appreciation. You've actually changed my own thinking about, uh, some of these, uh, issues. You really believe in good-willed people working hard to come to practical solutions, and so much of the [00:39:00] discussion

in the, the privacy space in particular, and Privacy Shield, is about, we have incompatible views on privacy and it's just impossible to actually make anything productive happen unless the Europeans or the Americans fundamentally change their privacy views. And yeah, I, I, I think that there, there is a more optimistic place, and that you have been

trying to bring us there and push us in that direction for a long time. Yeah. Yeah. Well, it takes a long time, but it's good to know that good-willed people, uh, working for practical solutions might be able to bring about results and change.

[00:39:33] Peter Swire: With a lot of listening to what other people really need to do, what they have to do. And part of why I work now with a co-author who's a French law professor: I don't say anything about European law if I can help it unless my French colleague has cleared it first. I don't say anything about the technology on cybersecurity a lot of times without talking to a technical person that I trust. So that, that's part of your answer to multidisciplinary: you, you need a team with [00:40:00] authentic expertise to really get, sometimes, to the best possible result.

[00:40:03] Gus Hurwitz: Well, thank you again for being part of our team today, uh, a team of two on this discussion, and I, I look forward to the next time we get to chat.

[00:40:12] Peter Swire: Great. Thanks so much, Gus.

[00:40:17] James Fleege: Tech Refactored is part of the Menard Governance and Technology Programming Series by the Nebraska Governance and Technology Center. The NGTC is a partnership led by the College of Law in collaboration with the Colleges of Engineering, Business, and Journalism and Mass Communications at the University of Nebraska-Lincoln.

Tech Refactored is hosted and executive produced by Gus Hurwitz. James Fleege is our producer. Additional production assistance is provided by the NGTC staff. You can find supplemental information for this episode at the links provided in the show notes. To stay up to date on the latest happenings within the Nebraska Governance and Technology Center,

visit our website at ngtc.unl.edu. You can also follow us on Twitter and Instagram at UNL [00:41:00] underscore NGTC.