Tech Refactored

S2E39 - Dismantling the Black Opticon with Anita L. Allen

May 06, 2022 Nebraska Governance and Technology Center Season 2 Episode 39

Penn Law Professor Anita L. Allen joins us on the podcast to talk to Gus about race and privacy in the past, present and future. Anita also discusses her latest paper, Dismantling the “Black Opticon”: Privacy, Race, Equity, and Online Data-Protection Reform, which touches on the ongoing bias and discrimination African-Americans have faced, and continue to face, as it relates to discriminatory oversurveillance, discriminatory exclusion, and discriminatory predation.

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Gus Herwitz: This is Tech Refactored. I'm your host, Gus Herwitz, the Menard Director of the Nebraska Governance and Technology Center at the University of Nebraska. Privacy is one of the more pressing and controversial issues of the day. Online and offline, in the United States and around the world, citizens, advocacy groups, business interests, and legislators are all working, oftentimes fighting, to get privacy rules right.

Today, we're going to make that discussion even more difficult by adding in consideration of race. We're joined today by Anita Allen, professor of Law and Philosophy at the University of Pennsylvania, and author of a recently published article in the [00:01:00] Yale Law Journal Forum, Dismantling the "Black Opticon": Privacy, Race, Equity, and Online Data-Protection Reform, to talk to us about race and privacy. Professor Allen, welcome to Tech Refactored.

[00:01:13] Anita Allen: Thank you. It's great to be with you. 

[00:01:15] Gus Herwitz: So I want to start by asking a bit about the background of privacy. It has a very modern feel in the internet age, but it's not at all a new idea. And you, in fact, have been studying it since before we started talking about it in the context of the internet, or before it was cool, as I say.

So I'd like to start with a bit of context and history about what privacy is. Most people, I think, believe at least that they have some sense of what privacy is. So rather than just asking you for some history, I want to ask you this in a slightly more loaded way. In the American tradition, we usually trace the concept of privacy to a famous article by Samuel Warren

and Louis Brandeis, an 1890 article, The Right to [00:02:00] Privacy. And perhaps unsurprisingly, Warren and Brandeis were well-known political figures at the time. Warren in particular was known as a bit of a New York socialite, and they were upset that the press was snooping on lavish parties that Warren hosted, using this new technology at the time: the camera.

In other words, these wealthy, white, male New York socialites literally created what we think of today as the American right of privacy to protect their ability to throw big parties. And this largely tracks with European history. In a 2004 article, James Whitman discusses how European understandings of privacy originated in concepts of individual dignity.

Most notably, the dignity of the wealthy, young, white male aristocrat, so that he could assume a prominent role in society free from the youthful indiscretions that invariably were found in his past. So with that as some preface, I want to ask: is privacy about our right to be let alone,

[00:03:00] as Warren and Brandeis say? Is it about dignity? And if so, whose right to be let alone, and whose dignity?

[00:03:09] Anita Allen: Well, thanks for that. So, both Warren and Brandeis were well known in Boston. I'm not so sure about New York, but in Boston. Samuel Warren was part of one of those old Boston Brahmin families.

Louis Brandeis was a Jewish man who had not been born in Boston; he was from the Midwest, actually, and immigrated to Boston for law school, I believe, and stayed there. But they were, as you say, concerned about the upper classes, and in that famous article in the Harvard Law Review in 1890, they addressed what they described as effrontery, the affront of press gossip.

That is sometimes the way I describe what their main concern was, but they also elevated it to the protection of [00:04:00] mankind's spiritual nature. Right? So we have this right to be let alone, to be free from the affront of press gossip that demeans us and diminishes us and, in effect, assaults our dignity.

Right? So the idea of the right to privacy in that article is part of the Gilded Age, part of the era of celebration of wealth and its privileges. And the article is written that way; it's very beautifully, elegantly written. But the article has nothing to say about the privacy of ordinary people.

In the article, ordinary people are seen as part of the problem, because they're the ones who are the consumers of gossip. They're the ones who want salacious stories about fallen women and fallen rich people. They're the ones who are part of the problem. They're the enemy of the private man.

So I have looked at that article. In fact, one of my first privacy articles was something called How Privacy Got Its Gender, and I looked very [00:05:00] closely at the Warren and Brandeis article from the point of view of gender, not race yet, but gender. And it struck me that they trade upon the idea that women are delicate flowers.

They need privacy from having photographs used without their consent. They need privacy for having their babies in their homes without strange people being around. And they trade on that idea to support their theory. But there's really nothing in the article that's for the advancement of women, just as there is nothing in it for the advancement of African Americans

recently freed from slavery, or Native Americans, who were still being fought in the Great West. The West officially closed in 1890, the same year as the Warren and Brandeis article, but there were still, you know, disputes and dangers between white people and Indians and others out west.

So that's kind of my first response to your questions. But the article was definitely, as you suggest, a reflection of class privilege and of male privilege.

[00:05:59] Gus Herwitz: [00:06:00] Then what is the role of privacy? I'll just follow up directly: what was the role of privacy for anyone other than wealthy white males?

[00:06:13] Anita Allen: So I like to tell a different origin story of the right to privacy than the one that most people tell. For me, American privacy law does not begin with the Warren and Brandeis article. It begins several decades earlier, with a case called State v. Mann, a North Carolina case, with an opinion written by Judge Ruffin, who owned slaves.

But this is a case in which a woman who was enslaved, named Lydia, was shot by the man who hired her out from her owner, and the man who shot her did so because she was trying to escape one of his beatings. Interestingly, the white community rallied around the enslaved woman and had him prosecuted for assault and battery. But he appealed.

And on appeal, Judge Ruffin said that, oh, because of the privacy of this master-slave [00:07:00] relationship, we can't interfere with this man, and so we can't uphold this conviction; it would undermine the whole institution of slavery upon which, you know, we are all dependent and implicated. So he used a privacy concept as a legal concept to stand in the way of actually allowing punishment of somebody who committed a heinous crime against an enslaved person.

And that same idea, that privacy protects the home, the family, our businesses, our master-servant relationships, our master-slave relationships, came up in a case a couple of decades later called State v. Rhodes, in which a man was charged with beating his wife, but that conviction was overturned on grounds of the privacy of the family.

So, despite the history I just described, I think privacy is indeed a positive value that has positive functions in our society, despite its roots in the protection of slavery and wife-beating and white male, upper-class privilege. We all need a right to privacy that is based on our [00:08:00] equal dignity, our equal freedom, and that's what I'm in favor of.

That's what my work is mostly about. 

[00:08:06] Gus Herwitz: There was, I believe, a transition at some point in how we thought about and used privacy over the course of the 20th century. Certainly Griswold v. Connecticut, Roe v. Wade, Loving v. Virginia: these are all cases that are rooted at one level or another in concepts of privacy and individual autonomy and dignity.

Was there some transition in how we think and talk about privacy that led to these cases? And how are those transitions either reflected or in tension in our discussions today?

[00:08:47] Anita Allen: So the first positive-law privacy rights we had in the United States as such were tort law rights, because Warren and Brandeis defended the idea of a new privacy tort, and we got that tort, [00:09:00] and that tort ended up having four dimensions, which William Prosser recognized in a 1960 article, and which then found its way into the Restatement of Torts in the 1960s.

But the idea there is that privacy is all about intrusion upon seclusion; we have a right not to have our seclusion intruded upon. It's about having rights against publication of private facts. It's a right against having our name and likeness appropriated. And it's a right against having our reputations, our identities, ourselves placed in a false light.

So that's what privacy started off meaning in American law in the early 20th century, starting with the Pavesich v. New England Life Insurance case, where Georgia's supreme court became the first state supreme court to adopt the right to privacy. But quickly it became a right throughout the country, and most states now have, you know, a common-law right to privacy or something similar by statute.

Then in the 1950s, we began to see the privacy concept used in constitutional law, and [00:10:00] a shining moment is the case of NAACP v. Alabama, in which African Americans sought protection from the state of Alabama's ruling that they had to turn over their membership list to the state as part of its corporate law approval processes.

So Alabama had a law requiring that corporations register with the state. They decided that the NAACP was a corporation that had not registered with the state, they went after them for that, and they ended up charging them something like a quarter million dollars in fines, but also saying, give us your membership list.

The NAACP says no, on the basis of the privacy of the members of the NAACP and having their identities not disclosed to the state and not publicized. And our Supreme Court said, absolutely, there's a First Amendment right to associational privacy. But then we moved beyond associational privacy to the mid-1960s, when suddenly the court is recognizing something called [00:11:00] expectations of privacy through Fourth Amendment law.

We got that. And then also in the middle of the century, we got the first efforts to protect us from having the state regulate birth control and abortion, in the course of the 1970s; but in the sixties, it was all about contraception. And there our Supreme Court developed the concept of privacy. They said, you know, well, it's sort of in the First Amendment.

Look at NAACP v. Alabama. It's sort of in the 14th Amendment; look at all these cases involving due process and liberty around families and religion. It's sort of in the Fourth Amendment. And so it's there; we're going to recognize it as a right of its own that would stand against state laws

criminalizing contraception. And then that idea was moved on to abortion later. But before that, we got another ruling, which you mentioned: Loving v. Virginia came after the Griswold case. And in that case, the court used the idea of privacy to help justify the idea that the equal protection of the laws prohibited states from criminalizing [00:12:00]

mixed marriages, mixed-race marriages between blacks and whites. So that's kind of the evolution: you sort of start with tort law; it came into constitutional law through the First Amendment, then through the Fourth and Fifth Amendments, and then eventually to the 14th Amendment.

And by the time we get to Roe v. Wade, we have the court saying there is a right to privacy against the state criminalizing abortion. It's a fundamental right of privacy; it requires strict scrutiny. And that was, I think, maybe the apex of the right to privacy in American law, when the Supreme Court found it to be a fundamental right

requiring strict scrutiny, and saying that, you know, unless we can show that the regulation is consistent with ordered liberty and our nation's history and tradition, the statute has to fall. And that's a very, very important concept, that of a fundamental right, that we've seen kind of fading away over time, in the context of abortion law but also in other places.

So that's my brief understanding of how things evolved, you know, [00:13:00] from this idea of freedom from publicity, to protections against having our intimate lives, our private lives, interfered with by government. And then the idea of data privacy also erupted in the 1960s and seventies, initially in scholarship by people like Alan Westin, who wrote Privacy and Freedom, published in the 1960s, and Arthur Miller, whose book on privacy was published in the early 1970s.

These early sociologists and legal scholars got us going with the idea that American law does and should include a right to privacy, and that technology, as it was in the day of Warren and Brandeis, is an increasing threat to our privacy. And both of those scholars did talk a lot about the threat of computer technology and the ways in which it was going to change the information balance in our country.

Yeah. 

[00:13:54] Gus Herwitz: And that starts to bring us to the modern setting. The [00:14:00] mid-century cases that you were just discussing, those, as you note, are very much founded in the right of privacy against the government: the government can't do these things restricting or encroaching upon this right. And Warren and Brandeis,

they were much more focused on rights of individuals against encroachment by other individuals, frequently when there were some tangible commercial interests involved, but not always. How does that translate, or what are the gaps there? Are there gaps, as that translates to the modern setting today? Certainly Roe v. Wade and its rights of privacy against the government are

very important today, perhaps more important to be discussing today than in a generation. But much of the discussion that we have today about privacy is about the rise of big tech and these technology platforms and the panopticon of omnipresent surveillance. Are the mid-century [00:15:00] rights and the Warren and Brandeis rights sufficient? Are there gaps? How do they translate to and apply in the modern setting?

[00:15:09] Anita Allen: I think that in the United States it became pretty clear, even in the early 1970s, that there were huge gaps left by tort law privacy protections and by the emergent constitutional protections.

And that's why we see the age of privacy statutes beginning in the 1970s, with the Privacy Act being the first. It and its sister act, the Freedom of Information Act, came about in the early 1970s, as did the Fair Credit Reporting Act. These statutes were designed to function in many different ways, but one of the things they did was

to trade on the notion that the information that we give to government, or that we give to private businesses for the purposes of government functions or economic transactions, needs to be kept private. And these laws required privacy of the government and required [00:16:00] privacy of the credit sector.

We also got, in the 1970s, the FERPA law, the Family Educational Rights and Privacy Act, which means that school records were protected by privacy as a matter of federal law. And in addition to FERPA, we had several other federal statutes come along, including the Electronic Communications Privacy Act in the 1980s, which protected us against wiretapping and gave us rights regarding information in electronic storage.

And it gave us rights against things like tapping and tracing.

So that act was very important. So there was a gap, there is a gap, and federal statutes sought to fill in those gaps. Plus, and I think this is an underemphasized point, the states filled in the gaps with their own statutes. In Pennsylvania, we have a statute which prohibits lie detection without consent; employers can't [00:17:00] even ask their job applicants

to undergo electronic or any other kind of lie detection. And then there are state laws against things like revealing HIV/AIDS information or other healthcare information. And there are state statutes involving library records; in my state, library records are protected by a special privacy statute.

So the states began filling in the gaps years ago. The feds began filling in the gaps years ago. But Gus, I think this is where you want me to go: we still have gaps, right? Because none of these laws are post-social media, post-internet, post-big tech. And we need, I would argue, some new policies and laws and practices to deal with the fact that big tech has completely changed the game.

The internet and social media have completely changed the game of privacy. They've shifted the balance. They've made huge changes in the way that we live and put our data at risk of [00:18:00] exposure, not only through traditional means, like, you know, person A picks up the telephone and tells person B, which can be done with data collected by electronic means, but more importantly through electronic means themselves.

I mean, data breaches, sort of accidental and nefarious breaches of data, are a concern. But there's also just the fact that companies have our data. They can analyze it, they can crunch it, they can massage it and find out things about us that we may or may not want them to know, to try to sell us things that we may or may not want, or do things we may or may not want to have done.

And that's the problem today: we don't have laws that deal very effectively with all of that.

[00:18:37] Gus Herwitz: So we'll be turning to a brief break in a moment, and then, when we come back, I'd like to turn to the discussion about your recent work on race and privacy. But I wonder, and I have no idea what the answer to this question is,

I don't know if you might have looked into this or not. I wonder about all those 1960s- and 1970s-era laws that were so [00:19:00] foundational to the modern statutory privacy framework. If we were to look at them, would there be a similar story to what we see with Warren and Brandeis, an interest-convergence story, where they were intended to protect certain interests and really excluded or ignored, or perhaps were actively harmful to, other interests, in particular minority interests?

[00:19:26] Anita Allen: Interesting question. I mean, I've never even thought, for example, about whether a state law that creates a right of privacy around library records is a law that is for the rich, the poor, or the in-between. But if you think about who uses libraries, say, for research: if you've got a fancy home computer, you don't need to go to the public library to do

research on a computer, right? So it may well be, it turns out, that library record privacy [00:20:00] is mostly of value to lower-income people, or even homeless people, right? Who need the public library as the place they go to get books or to use computers for research and entertainment. So,

great question, we could look at a lot of state laws that were enacted before or around the same time as this sweep of federal laws, you know, to see whether or not they protect groups other than the rich. Oh, here's another example. So we have federal laws that protect the privacy of driver's license records, right?

Well, who drives cars? Almost everybody, of all economic classes. So I wouldn't say that the Driver's Privacy Protection Act is an act aimed at rich people; it's an act aimed at anybody who has a driver's license, whose data might be used by other people. So that's sort of an egalitarian one.

The Bork [00:21:00] law is an interesting example at the federal level, right? So we have this federal statute that protects us from videotape rental information being disclosed without our consent, and that statute was passed very quickly through Congress because a famous, powerful, DC-oriented former Yale law professor had his video records disclosed from a video store in DC. So that was like a law tailor-made because of something that happened to a rich and powerful white man.

And does it also help the rest of us? Sure. It helps anybody who used video rental stores back in the day, you know, the Blockbusters, and we don't do that anymore, but it used to protect anybody who used a Blockbuster. And today, you know, we haven't quite seen the scope of the old law in the new economy, but it's still protecting rich people whose video records might be compromised, might be of interest to the [00:22:00] country. But it also protects ordinary people

who rent videos via Netflix or some other such facility. So yeah, it's a great question. We do have, I think, to ask about all of our laws: who are the winners, who are the losers, and who's left out? Absolutely.

[00:22:17] Gus Herwitz: We are speaking with Professor Anita Allen from the University of Pennsylvania.

We'll be back in a moment to continue our discussion and turn a bit to her recent work on race and privacy, along with other privacy-related topics.

[00:22:36] Lysandra Marquez: Hi listeners. I'm Lysandra Marquez and I'm one of the producers of Tech Refactored. I hope you're enjoying this episode of our show. One of my favorite things about being one of the producers of Tech Refactored is coming up with episode ideas and meeting all of our amazing guests. We especially love it when we get audience suggestions.

Do you have an idea for Tech Refactored? Is there some thorny tech issue you'd love to [00:23:00] hear us break down? Visit our website or tweet us at UNL underscore NGTC to submit your ideas to the show. And don't forget, the best way to help us continue making content like this episode is word of mouth. So ask your friends if they have an idea too.

Now, back to this episode of Tech Refactored.

[00:23:28] Gus Herwitz: We are talking with Professor Anita Allen from the University of Pennsylvania. And I'd like to start by asking directly about your recent work on race and privacy: what are some of the surprising things that you think folks don't know, but should know, about the role of race in privacy law?

[00:23:51] Anita Allen: Well, I think one surprise might be how little the connection between race and privacy has been [00:24:00] directly studied. Privacy scholars know names like Khiara Bridges, who wrote about the poverty of privacy, and people like Oscar Gandy, who wrote the book The Panoptic Sort. Scholars know about more recent work by people like Simone Browne and Charlton McIlwain.

But one surprise is how little work is known and how little work there is; people have only very recently begun to study race in connection with privacy overtly, even though it's been recognized for many decades that African Americans face special privacy problems. I'm very interested in keeping track of the original books and articles written about privacy.

I try to go back to the 1950s, and just to keep up, I'm trying to find all examples of people talking about race, African Americans, and privacy going way, way back. And I discovered the first three or four books about privacy, you know, the [00:25:00] Vance Packard book, the Alan Westin book, the Arthur Miller book, and a book by a lawyer named Mayer.

All these books mention the same kinds of things. They mention things like psychological testing for employers, and how testing can discriminate against African Americans; how, because of protest, African Americans are subjected to wiretapping and observation, because of fear of radicalism and so forth. And so African Americans as a watched people

was recognized by the early white privacy scholars who wrote about privacy, yet they didn't, I'd say, focus on it. It's mentioned sort of in passing. They particularly mention Martin Luther King being wiretapped, protestors being arrested and monitored, and this problem about

psychological testing, the kinds of testing that disadvantage people of color because they probe into their ideas and their experiences and find them to be of concern, because, you know, racism causes our behaviors to be [00:26:00] skewed in a certain way. And so people were denied jobs because of unfair testing practices.

So all of that, you know, is part of the history. But I do think that that's where we have to begin. And it's surprising how little people seemed to care about black people's privacy in the beginning, and how little scholarship there is, and how much, though, has come about just in the last five to ten years. It's exciting. It's terribly exciting.

[00:26:24] Gus Herwitz: And probably the most famous example, at least that comes to my mind in this area, is Henrietta Lacks, who of course was a black woman who was so instrumental, having had her genetic material effectively stolen for scientific research. Are there any other examples that you would give to listeners, just to stick in their minds, that if they remember later this afternoon, or a couple of hours from now, would be useful for them to look up?

[00:26:54] Anita Allen: Well, I love the example of Henrietta Lacks, who was an African American woman who [00:27:00] was treated for a very vicious kind of ovarian cancer, from which she eventually died. And unbeknownst to her, the doctors at Johns Hopkins University saved some of her tumor cells, and they proved to be something that some researchers were looking for, which is to say, a cell that could be turned into a perpetual cell line.

And so we still today have cell lines based on Henrietta Lacks' tumor that are now decades and decades old. But her family was quite outraged when they learned that, unbeknownst to her and to them, her cells had been commercialized. And it is a privacy issue, a privacy issue

that is of concern because it goes against the idea of informed consent in the medical research area. But another example that people may not know about, that I think is worth looking at, if you want to Google a story about privacy invasion that may be shocking, is Dr. [00:28:00] Nikita Levy. I'm going to dump a little bit on Johns Hopkins today.

He was also a Johns Hopkins University doctor, an OB/GYN, who for decades treated hundreds, if not over a thousand, black and brown women living in the Baltimore area, and he was beloved. But one of his assistants became concerned about some instruments she saw lying around his office.

And she took one home and discovered it was a surveillance device that was taking video of his patients. So she called the authorities, and he was arrested the next day. It turned out he had been using spy devices in his office, including an ink pen in his pocket that was actually a camera, to take video of his patients.

That's a story worth looking into, because it turned out that he broke the hearts and ruined the lives of many people. And he was so upset, actually, when he was arrested that he committed suicide. But when the police went to his [00:29:00] home, they found a bank of computers that were serving up internet porn, essentially the bodies of black women, to the public.

And then the women also were concerned because the police then had access to all these intimate images of their genitalia and their breasts. And even though it was hard to identify a person based on just their genitalia and their breasts, it's still a privacy violation,

right, to know that one's images have been collected and then shared, first with the general public over the internet, and then with the police in Baltimore, who are not necessarily well trusted by the people of Baltimore. I actually was asked by a law firm to go down to Baltimore and meet with a couple hundred women who were victims of Dr. Nikita Levy, who himself, by the way, was an African American and an immigrant.

He was beloved by these women. And I have never been among so many truly heartbroken and shattered women as I experienced there. They simply could not believe that they, and the [00:30:00] daughters he told them to bring to him for care, were turned into pornography. And I was there to talk about this as a privacy issue.

And then the good thing is that this story has a happy ending, in one sort of perverse sense, which is that there was a legal settlement. Hopkins ended up paying a lot of money to the women who were part of the class action that was brought on their behalf as a result of this. So that's an example.

I told the story in maybe more detail than you wanted, but it's a great example. And just Google the name: Dr. Nikita Levy, spelled L-E-V-Y, Johns Hopkins Hospital, OB/GYN, spy pen. And you'll find this exciting and troubling story.

[00:30:42] Gus Herwitz: Yeah. And it's such a powerful, and in many ways unfortunate, reminder of the importance of privacy. We, or at least I, talk about privacy as so much a dignitary right, and that means it's part of a person, part of you, that's [00:31:00] being violated. And our modern discussions about privacy so frequently are about, well, what webpages did you visit? Or who are your friends on Twitter?

Or something like that. It's a curious continuum. And it's notable — I didn't even think twice before asking you, can you suggest someone for listeners to Google? I kind of just asked you to suggest that listeners go violate someone's privacy by googling them, without thinking twice.

And now I feel kind of terrible about that.

[00:31:35] Anita Allen: But isn't it a wonderful thing at the same time that people who wanna understand the privacy threats existing in our society can use a tool like Google, right, to uncover knowledge that they need to protect themselves moving forward? I've learned a lot by Googling, you know — geofence warrants,

let's Google that and let's learn about the ways in which these new warrants [00:32:00] are giving Google and, uh, police wide access to people's geolocation information. But, you know, I appreciate your point — we're so comfortable with search engines that we recommend search engines

even as we talk about the privacy of people that is sometimes compromised by search engines. It's the predicament of our times. Right. But I also wanna say that when I was dealing with the women in Baltimore, it was a great example for me of something else. See, I believe that privacy laws are differentially enforced and differentially obeyed.

I think that a urologist — a doctor who treated men, white men — would be far less likely to try to pull off a scheme like the one I described with Dr. Levy, because they are more respectful, I believe, of the rights of men, of white men. I think people — even other black people — are not necessarily respectful of the privacy rights and interests of [00:33:00] poor black women.

And so they're more easily victimized, more likely victimized. I truly believe that there's a sort of disparate impact, and differences in compliance, when it comes to privacy — some people's privacy is more protected than others by laws that look facially neutral. Everybody has a right not to have their seclusion intruded upon, but some people are more likely targets of intrusion than others. So thank you for that question. I think that we've kind of shifted from privacy as protection of bodily images — our facial features and our physical bodies — to data.

And that shift, though, as you say — it's not like we moved from one thing to another. I think some people who do technology studies don't like to talk about the old privacy, right? They only wanna talk about data. They only wanna talk about data processing, about algorithms. But I think it's a mistake not to see the continuum between [00:34:00] worrying about having your body on a camera — an offline camera — and having your body on an online camera, and having your data, or information about you, or bits of information obtained from you, being out of your control, under the control of others for their use and their exploitation.

[00:34:18] Gus Herwitz: So I'd like to follow up on your point about differential treatment of different communities. And I'm curious about your take on one of the common arguments that I hear in discussions about ad-supported business models in particular — and this is starting to center the discussion on the social media and surveillance capitalism, as it's sometimes referred to, era that we're living in.

I guess I'll just generally ask: are the interests of disadvantaged, marginalized communities being represented in our discussions of [00:35:00] ad-supported business models? The argument that I frequently hear is, if we were to do away with ad-supported business models, companies would still need revenue somehow, and we're going to be transitioning to a subscription-based model.

For most services, you're not gonna be able to get free articles from newspapers, or Facebook and Twitter won't be free — which may or may not be better for the world, I won't comment on that. But in a subscription-based world, who can afford to use these services? Well, affluent people, people with money. In which case those who are less affluent, poor, generally from marginalized communities — they're just not gonna have access to these services online.

Is that a fair argument? What's your response to that argument, I guess, is the question.

[00:35:53] Anita Allen: So from my point of view, African Americans are vulnerable to three buckets of discrimination. [00:36:00] One of them is discriminatory exclusion through, say, targeted advertising. But the other two are discriminatory oversurveillance — targeted surveillance — and discriminatory predation: con jobs, scams, exploitation online.

And all three of them are serious problems that we need to address. It is sometimes said that if we're gonna address the problem of discriminatory exclusion by changing the revenue model of online businesses to eliminate targeted advertising, or advertising altogether, we're gonna disadvantage the public, including minority group members who depend upon free services from your Facebooks and Twitters and so forth, and who, like other people, benefit from these services. So that's an interesting point. Here's a shocking analogy. During the days of slavery, there were lots of people — abolitionists — who said, let's get rid of slavery. It's a terrible thing.

It's not good for black [00:37:00] people. Let's get rid of slavery. And then there were people who said, but no, slavery's good for black people, because if we don't have slavery, black people will be living in poverty. They'll be uncivilized, they won't have religion, they won't have anyone to discipline them or anyone to help them out of their problems,

'cause they're not very intelligent and they're lazy and blah, blah, blah. Let's keep slavery for the sake of African Americans. Whenever I hear the argument which you recited, I think about this analogy to the defense of slavery, because there are going to be some practices in the world

that have advantages for folks — even the people who you're out to protect — and that helps to justify continuing the invasion, the intrusion, the thing which would otherwise be seen as a wrong. And I just think we have to be very self-critical about allowing ourselves to accept arguments with that structure — "they themselves will suffer if we take this yoke off their shoulders."

We have to be very careful about arguments with that structure. And so I'm not convinced that the world would be so bad if we forced the platform economy to change some of its funding models — to keep more of our information secure, rather than requiring us to give up information, and therefore, as Shoshana Zuboff says, give up ourselves, right,

in exchange for using communication resources, commercial resources, transaction resources, health resources, et cetera. So that's where I am right now. I would be willing to take the risk of stopping it. So, for example, there's not very much harm in stopping targeted advertising [00:39:00] altogether. We've seen that advertising that is not targeted can help to support our platform economy quite well. What if we gave up all advertising as a way to fund these companies? I don't know how they would respond, but I would like to see how they might respond.

And I don't think that advertisement per se is a bad thing, but I certainly do say that there are people who will watch a lot fewer movies on Netflix or Hulu if they have to sit through too many ads. We gave up television in order to escape ads. If we have to endure ads on Facebook and Instagram and TikTok and Amazon and everywhere —

I mean, everywhere we go — I think people are gonna get sick of it, and it's gonna diminish the attractiveness of those facilities to such an extent that their businesses are harmed that way as well. And I think some of these companies have maybe already gone too far in intrusive advertising and targeted [00:40:00] advertising, both of which are problems.

I don't think the intrusiveness is particularly a black problem. I do think that targeting is a problem that African Americans face, because it might mean they're gonna be disadvantaged because of racist stereotypes or bad data and algorithms — they're gonna be left out, excluded, and denied unfairly.

[00:40:18] Gus Herwitz: So I had wanted to ask you a series of questions about things that states, the federal government, and other countries are doing right or wrong in their policies and regulations, but I've allowed myself to delight too much in asking you other questions. Um, and I've lost track of those questions, so instead of asking a series of questions,

I'll throw them all together in a single omnibus question. From a legislative and regulatory perspective, how should we be thinking about privacy regulation, in particular keeping race in mind?

[00:40:52] Anita Allen: So, number one, I believe we should keep race in mind as we move forward toward new [00:41:00] state and federal legislation, inspired in some cases by the GDPR or by the example of California. We should keep race in mind.

And that means that every policymaker, every lobbyist, every governor signing a bill, every president signing an act of Congress should be mindful of the ways in which the law will or will not help to move us beyond the black opticon — the black opticon of oversurveillance, exclusion, and predation: the panopticon, the ban-opticon, the con-opticon, as I call them. And that means that people like me, people like you — everybody who's involved in privacy studies — needs to be helping to educate lawmakers about the ways in which people of color, including African Americans, are disadvantaged by some of the legal provisions and models that we've lived by up till now. And we could do something proactively.

We could [00:42:00] design privacy laws to be more racially equitable by naming and addressing specifically the issues of disparate impact and racism — and the panopticon — in our attempts to create a legal framework that is fair, equitable, and effective against the evils of the data economy.

So we have to educate ourselves about those things in order to then create a legal regime that is better than the ones that came before. I don't see a lot of focus on race in most of the bills I've seen introduced in Congress or in the states. But there are now — and this is a good thing — anti-discrimination provisions in most of the proposed state and federal laws, or provisions that say that — how can I put this? — nothing that violates [00:43:00] existing civil rights laws will be permitted. So you have both a referral to existing civil rights laws and embedded antidiscrimination provisions in these laws. And that's a good thing. My concern is that just the word anti-discrimination, or civil rights, in a statute does not guarantee

that there are measures within the law to help to dismantle the black opticon. But again, it's a good thing that these statutes mostly have some kind of anti-discrimination or civil rights provision. And then another thing I would say is that we don't have a lot of laws that specifically mention African Americans or Hispanics or Asians.

But I do think that we need to have a consciousness about the ways in which our laws do have these impacts on particular groups, and be willing to talk about those things openly. If we don't talk about them openly, they're gonna be overlooked, they're gonna be pushed aside.

And I really, really resist the notion that somehow we're gonna be told: okay, African Americans, wait — we're gonna get a [00:44:00] privacy law on the books, it may not be what you want, we're gonna get one on the books and then we'll get to you. Putting people of color off for the future in order to get something through a Congress or get something through a state legislature —

that is a bad move. We need to, right now, figure out how the protection of people of color fits into the scheme of privacy regulation on the state and federal level, and be explicit about it.

[00:44:27] Gus Herwitz: So it seems that one of the necessary inputs to a lot of legislation that would be more conscious of these issues is research — academic research and empirical research on the potential disparate effects.

So I'd like to just end our discussion by asking, from your perspective as an academic and a researcher: what are some of the important questions that you think people should be studying? And if there are any doctoral students or [00:45:00] aspiring academics out there, any nudges that you would give them toward interesting questions that you would like them to try and investigate?

[00:45:07] Anita Allen: Let me just give a shout out to some of my actual graduate students. So I'm mentoring a graduate student working on an SJD thesis about privacy harms. This man of color, who happens to be an African immigrant, is very interested in the ways in which we are defining privacy harms and the ways in which we are finding remedies for privacy harms.

So that's a thing to be pursuing — what counts as a privacy harm, and how can we address and remedy those harms? I have another graduate student, an African American man of color, in this case a philosophy PhD student, who's studying algorithmic fairness and is using applied mathematical and empirical skills to look very closely at the ways in which algorithms introduce unfairness and injustice into our society.[00:46:00]

So I just gave you two examples of some of the things that I think are worth exploring: the issue of harm, the issue of algorithmic fairness. Those are two very important areas. And it seems like a lot of people of color are interested in facial recognition. And I do think that we need to get our heads together, through research, around ways to exploit the good aspects of facial recognition — and maybe also gait recognition and other kinds of biometric identification, right —

but don't so blatantly harm black people by discriminating against them because the technology can't deal with black and brown or female faces. So that's another thing I think is worth a lot of attention and study. And I am myself very interested in a sort of excavation process.

I'd like to excavate more examples of what I'm calling the black opticon — examples of ways in which people are being taken advantage of by the con-opticon, people being excluded and denied, people being oversurveilled. We have a [00:47:00] lot of information about oversurveillance, but I think we have less about the other two categories of the black opticon.

And I know we're running out of time, but I just wanna say that you mentioned earlier, you know, what examples might one give of people who are African American and facing problems with privacy and so forth. One of the things I'd like to see more attention to — and it's kind of a research thing — is the heroes and heroines of privacy.

'Cause it turns out that African Americans have been phenomenal in advancing our privacy rights, and it's not a story often told. I'm trying to write about it a little bit now. But think about it — African Americans were behind the Supreme Court case NAACP v. Alabama, which established the right of associational privacy.

You have black petitioners to the Supreme Court behind most of the high-profile criminal procedure cases that define what it means to have expectations of privacy. Some of those cases [00:48:00] were lost by the black petitioners, but some were won. But the point is that black people and their experiences are shaping the way we think about Fourth Amendment privacy considerably.

Loving v. Virginia — we have the right to marry the person of our choice because a black and Native American woman had the courage to find a lawyer to defend her right to be married to her white husband and not be put in jail in Virginia. Faye Wattleton, a black woman, helped for decades to maintain the right to abortion choice through her advocacy and her presidency at the Planned Parenthood Federation of America. I could go on and on and on. Right now, organizations like Color of Change are having their impact. And Latanya Sweeney, the wonderful scholar at Harvard, is doing such great work on the technical side with engineering and privacy — issues around algorithmic fairness and pseudonymity and so forth.

So there are always black people doing amazing work that benefits [00:49:00] everybody's privacy. And that's, I think, something we should celebrate. Maybe that's a good place to end.

[00:49:05] Gus Herwitz: Yeah. Um, I think that is a wonderful place to end, both echoing the amount of work to be done and that is being done, but also the great people who have been doing it and are doing it. And — not to be perhaps overly effusive about the opportunity to speak with you — you are one of those people.

Uh, and it has been a real pleasure and privilege to take this time to talk with you. Today we've been speaking with Professor Anita Allen. Her recent article, published in the Yale Law Journal Forum, is Dismantling the "Black Opticon": Privacy, Race, Equity, and Online Data-Protection Reform. You can find that by googling, which will not violate the article's privacy rights. And thank you, as always, to our listeners for joining us. I have been your host, Gus Herwitz. This has been Tech Refactored. [00:50:00] If you want to learn more about what we're doing here at the Nebraska Governance and Technology Center, or to submit an idea for a future episode, you can go to our website at ngtc.unl.edu, or you can follow us on Twitter at UNL underscore NGTC.

If you enjoyed the show, please don't forget to leave us a rating and review wherever you listen to your podcasts. Our show is produced by Elizabeth Magilton and Lysandra Marquez, and Colin McCarthy created and recorded our theme. This podcast is part of the Menard Governance and Technology Programming Series.

Until next time. Keep asking those hard questions.

[00:50:59] Anita Allen: Wow.[00:51:00]