Tech Refactored

S2E37 - Sue Glueck on Life and Technology

April 20, 2022 Season 2 Episode 37

Most professionals will tell you that career paths are rarely a straight line. On this episode, guest co-hosts Elana Zeide and Elsbeth Magilton welcome our advisory board member, Sue Glueck. Glueck is the Senior Director of Academic Relations for Microsoft Corporation’s legal department, where she works on AI and ethics, the future of work, privacy, and a myriad of other topics. In March 2022 she met with a group of law and graduate students at Nebraska to talk about figuring out “Life After Graduation.” She spoke with us about her journey and her current interests at Microsoft - including quantum computing!

Sue provided some additional links for listeners! Check them out here:

Disclaimer: This transcript is auto-generated and has not been thoroughly reviewed for completeness or accuracy.

[00:00:00] Elsbeth Magilton: This is Tech Refactored. I'm one of your co-hosts today, Elsbeth Magilton, the Executive Director of the Nebraska Governance and Technology Center at the University of Nebraska. 

[00:00:21] Elana Zeide: I'm your other co-host, Elana Zeide, a professor at the College of Law and with the center. Today we are joined by my friend and our advisory board member, Sue Glueck. Sue is the Senior Director of Academic Relations in Microsoft Corporation's legal department. There, she works on AI and ethics, the future of work, privacy, and a myriad of other truly fascinating topics. Prior to that role, she was an assistant general counsel in Microsoft's regulatory affairs team, where she led a legal team that provided privacy advice to engineering groups that create software products and [00:01:00] online services.

[00:01:01] Elana Zeide: She's authored privacy guidelines and international standards, and has even had privacy policies she wrote praised by the mainstream press. 

[00:01:10] Sue Glueck: Hi, I'm so delighted, uh, to be with you guys today, although I should make a confession. This is my first podcast. 

[00:01:21] Elsbeth Magilton: Oh that's so exciting, that we get to be the first. I love it.

[00:01:25] Sue Glueck: Yeah. It, it, it's, it's exciting and, um, yeah, we'll see how this goes. 

[00:01:31] Elsbeth Magilton: Well, you just met with a group of law and graduate students from the university, talking about life after graduation, which is the one thing on all of their minds all the time: what am I gonna do when I leave here? Why am I doing what I'm doing here? And all of those sorts of existential crises waiting to happen right as they march through the academic process. Something that probably all of us experienced. Uh, so today we're looking forward to talking to you a little bit more about your life after graduation and your [00:02:00] career journey and the advice you have for students.

Uh, and then in our second half we'll talk a little bit more about your work and the things that excite you and your position at Microsoft. 

[00:02:09] Sue Glueck: That, that sounds great. Um, and what a great group today. They were lovely. I'm always, um, sort of touched when I throw it back to the room to say, Okay, what do you think?

What, is there something you'd like to share? Do you have any questions? And then there's that pause, and you have to let that pause just simmer, and then some brave soul, you know, throws their hat in the ring and goes ahead and talks about what they're thinking about. And you can kind of tell how nervous people are; sometimes they can't say, Well, I'm doing X.

It's more like, Well, what if you blah, blah, blah? You know, it's a way of distancing oneself from it. But, um, I just thought this was [00:03:00] such a lovely group of students, and I just wish we could have done it fully in person. Yeah, because I would've loved to see their smiling faces for sure.

[00:03:10] Elana Zeide: I think you would've been trapped for several hours giving additional advice if you were in person, though.

[00:03:15] Sue Glueck: I love it because, um, gosh, when I was in college for sure, and also when I was in law school, I didn't take advantage of all the, you know, quote unquote adults who had jobs that I could have talked to, because I was too shy.

I didn't wanna, you know, interrupt... oh, I didn't wanna impose on their time. And, um, I just, you know, I wish I could go back in time and tell myself, It's okay, you can ask people. And that's why I do like to do, and I did at the end, a little how-to-network, how-to-think-about-networking [00:04:00] session, because people like to talk about themselves.

You'll never get someone who says no when you say, you know, Oh, you have such an illustrious career, I'd love to chat with you about how you got there as I reflect on what I would like to do. Um, never once will someone say no to that kind of invitation. Not even once. So, I mean, other than, Hey, I'm really busy now.

Can we talk in three weeks? That's a thing. But no flat no. But I think, yeah, I just feel so much empathy for them. It's tough when you're trying to imagine your future and really go for it. Mm-hmm.

[00:04:42] Elsbeth Magilton: Well, on that note, let's, let's turn this on you. I do wanna hear a little bit more about your early career.

Um, can you tell us about, you know, what was your undergraduate degree, what took you to law school and sort of that educational part of your journey? 

[00:04:56] Sue Glueck: Sure. So, um, [00:05:00] I went to college, uh, a little longer than most people do because I had a really hard time figuring out what I liked. Um, and that's probably why I like doing this kind of thing so much now that I do know what, uh, what I wanna be when I grow up.

I started out as a math major at Stanford and quickly realized that this would not be a good path for me. Um, you know, the emphasis was not on teaching; for the professors, it was all on research. And so that can be a really bad experience. And I was, um, 16 when I started college, and then turned 17,

cause I, wow, I skipped the eighth grade. Um, yeah. And I was maybe a little young to be all the way across the country from my parents. I got lost, frankly. I got lost. Um, which is how I wound up initially majoring in [00:06:00] petroleum engineering when I wasn't particularly interested in it.

But it paid well, and I have an engineering brain, so that seemed like a thing until, um, as I mentioned in the talk, uh, I was a summer roustabout for an oil company in Central California. The other summer hire and I compared pay stubs once, cuz he thought they were taking too much tax out, and it turns out they were paying him a hundred dollars more a month than they were paying me.

And that was the moment when I started to think maybe my grandmother was right. She had always wanted me to go to law school. I thought, if I go to law school, no one will ever do this particular bad thing to me ever again, cuz it'll shift the balance of power, you know? Um, so I wound up changing my major to industrial engineering, um, and

sort of accidentally got another [00:07:00] undergrad degree in psychology. And, um, you know, I took a lot of psych classes for summer jobs, so I did sleep research in college, but you had to have fairly significant training to be able to do that. Um, and I've always been a night owl, so it was great to be able to have a part-time job during the school year and in the summer that called on me to be a night owl.

Um, and yeah, I also, um, worked in a preschool for a while and had to take classes for that. So I was like, I might as well get a bachelor's degree in this since I have so many classes. Um, so after, uh, graduating, um, I went to work for IBM as a programmer, and gosh, parts of it I loved: solving technical puzzles that no one had ever [00:08:00] solved.

Um, that was an incredible feeling. But you also spend so much time in a room by yourself that I found that oppressive. And I loved the parts where I would hire a consultant. It was software for, um, high school and college math classes, and I'd hire a math teacher; basically the head of the math department at

Palo Alto High School, um, was my consultant. We became really good friends, and we'd spend evenings working on a part of the product that really called for a teacher's assistance, and it was glorious. But there weren't enough moments like that, so I was like, I'm going to law school.

Cause then surely I will be collaborating. And I watched LA Law, like, you know, we'd be in court every week and there'd be, you know, yeah, tomfoolery with boxes of documents, and okay, [00:09:00] so then it turns out that no, you're in a room by yourself when you're working at a law firm. And, you know, I had all of this completely, completely wrong.

Um, and, you know, it was hard to leave IBM to go to law school because nobody was paying me anymore. You know? That was the thing. But also, oh my goodness, so much reading, so much reading and writing, these things that as an engineer I hadn't done that much of. Um, but you know, I got the rhythm of it and I liked it, but I did second-guess myself quite a bit.

So after leaving the big firm after a few years, I went to work for a startup. Um, and I got to do technical work part of the time and legal work part of the time. And I thought, here I am, living the dream. Yeah. Except... [00:10:00] of course.

[00:10:02] Elsbeth Magilton: You know, that whole story, I do wanna backtrack, cuz I have a couple, or just a question.

First of all, this is so wonderful. So I was a web developer and then left being a web developer to go to law school, and so had that similar experience of, like, okay, I like tech, but I don't like sitting in this room by myself. And then I went to law school, so I did the same fundamental miscalculation that you did, but it's such a wonderful path.

I think a lot of people can probably relate as well. 

[00:10:29] Sue Glueck: Journalist. I was a journalist. Oh, journalist. You spend a lot of time by yourself doing stuff. Go to law school.

[00:10:40] Elsbeth Magilton: I love it. Is the moral of the story that, like, all adult professions just require you to be alone a lot of the time? I'm starting to be worried that's what we're coming to, or something. Maybe a teacher, right? Or it could just be that, you know, there's a lot of myths surrounding what a law career is actually like.

[00:10:57] Sue Glueck: Yes. It would've helped if I'd [00:11:00] talked to an actual lawyer before embarking on all of this, but how could I? You know, they're busy, they're important. And I didn't recognize that that could have been a great way to step into a summer job if I had. I was living in Palo Alto, California, and you know, on Page Mill Road there were office complexes; there were so many people I could've introduced myself to, um, but I was too shy to impose on their time.

Mm-hmm. And I think I would've saved myself some heartache, because the better informed you are about... well, hopefully I still would've gone, cuz this turned out to be a good gig. You know, it's turned out well. But, uh, along the way, so many missteps. 

[00:11:51] Elsbeth Magilton: I wanna ask you about, you mentioned at the beginning that, you know, comparing, which, I love wage transparency.

Love that. Love that. You compared paycheck stubs with a [00:12:00] colleague and saw that wage disparity. But the life of a programmer and the life of a woman in technology, even in the legal field, is that something, are some of those challenges something you started experiencing early on in your career?

[00:12:16] Sue Glueck: So, IBM was glorious, because we had, I don't know if they still have this, but they would offer early retirement to people and then let them come back as sort of consultants for, like, six months of the year, and then go back to not working at IBM, and so on. So we had a bunch of retirees who kind of looked at me as a daughter and were full of friendly advice.

Not all of them thought law school was a great idea. Some of them were my biggest cheerleaders for that, but some were like, Hey, you're talented technically, are you sure this is the thing you should be doing? So there was, like, zero sexism in that job. Um, [00:13:00] and as it turns out, I'm tone deaf to sexism as applied to

me. Other people, I can see that; like, I'm the first person to see it and do try to do something about it. But I've come out of meetings where someone has said to me, Oh, I can't believe so-and-so was such a jerk, blah, blah, blah. And I'm like, I literally have no idea what you're talking about, cuz I didn't hear anything wrong.

I just think it's helpful to be a little tone deaf, quite honestly. But there was one spectacular thing in my first year at Microsoft. Um, a very senior patent lawyer and I were meeting with our shared clients to talk about something, and I had not met him before, but he had a reputation for being kind of misogynistic.

So I was, I guess I was primed, and this is how I could actually see it, you know? Um, he [00:14:00] comes in, he leans back in his chair, he puts his feet up on the conference room table, and I'm like, Oh my God, people eat lunch there. Like, what are you doing? That's so gross. And he doesn't have a laptop, paper, pen, nothing.

So instantly I knew he expected that I would take notes and share my notes with him. I always take notes, and I'm a great verbatim note taker, so I took notes and bided my time. And at the end of the meeting, I didn't think he would do it in front of the clients, but he did. He was like, Hey Sue, uh, email me your notes, would ya?

And I said, No. And he was like, What? No, come on, you were taking notes, weren't you? I'm like, Yeah. And he's like, Email me your notes. And I said, Oh, Neil, just because there's a woman in a room taking notes does not make her your secretary. [00:15:00] And the clients all laughed. But I had, like, the whole hour to come up with what I was gonna say if he asked me for my notes.

But I tried to make it sound spontaneous, you know? He was so nice to me from that moment forward, he was so respectful and wonderful. I'm not gonna say he was a bully, he wasn't bullying me, but it's kinda that rule with a bully, that if you stand up to the bully, then the problem usually goes away.

[00:15:36] Elsbeth Magilton: Yeah. Wow. That's one of those moments where you, like, wanna put on your sunglasses walking out of the building, like the explosion behind you in the action film. It just feels so satisfying. 

[00:15:51] Sue Glueck: It's possibly the greatest moment in my career. 

[00:15:56] Elsbeth Magilton: Thinking more about your current position, can you tell us a little bit more about what [00:16:00] it is that you do at Microsoft and sort of what your favorite part of that is, um, as it relates to your journey throughout your career?

[00:16:08] Sue Glueck: So I have the best job. I'm not sure I should tell you guys about my job. Well, Elana already knows, but I'm not totally sure I should tell you, because do I really wanna set up this kind of competition for myself, that I have the best job at the company? And, um, I'm not afraid to say it. Um, I think I've even said it to our president, Brad Smith, where I'm like, Oh, I have the best job.

And I'm like, I know you like your job and your job's very cool, but really, I have the best job. So, um, we have a number of tech policy topics that we focus on, because Microsoft, I honestly think because of our antitrust woes, um, in the late nineties, it really shaped the [00:17:00] culture of the company. Um, we're a company that wants to comply with laws, in fact, wants to have laws.

Back in my privacy days, in, um, 2007-ish, I think, a colleague drafted a US privacy law that he thought could be acceptable to the European Union for adequacy, for data transfers, and, um, we started trying to drum up support for it. Everyone in industry was completely opposed to it. Um, and, you know, Congress maybe hasn't been so great at passing that kind of law, so it didn't really go anywhere despite,

despite our efforts. But that's a place you wanna work, you know, where they actually care about protecting the privacy of customers. It was a joy to be a privacy lawyer, quite frankly, [00:18:00] cuz I was always looking out for consumers and enterprise customers and their privacy. So, um, I work at a company that wants the world to be a better place.

We also wanna make money, it's a business, but, um, we live by values and principles that people actually live up to. We work on tech policy topics to try to find a good balance between being able to have innovation and so on, but also being responsible to the people who are impacted by our technology.

So AI policy, the ethics of artificial intelligence, the future of work, um, those are things I think about quite a bit. Uh, platform regulation, uh, quantum policy. You know, they'll be obvious when quantum computers are a thing. Yeah, so the cool thing about [00:19:00] my job is, like, this is not a big topic for most people today, although Professor Chris Hoofnagle at Berkeley has,

uh, co-authored a book about it. Um, but the stuff that's really gonna be coming around the corner, that's what I'm always looking for. I got really interested in AI policy back in 2015, um, and, you know, then, if you build it, they will come. And when the corporation is ready to talk about it, and I've been planting seeds and gathering people together, then we know really smart people in academia, um, like Professor Zeide, to turn to and say, Gosh, here's this really hard problem.

What do you think? So we can learn from them. And, um, you know, that door goes both directions, where folks in academia are like, Hey, check [00:20:00] out my paper, would this actually work in practice? And we can give feedback about, you know, what it's like to be in industry. And I'm kind of the hostess with the mostest for that stuff.

And I will say, 

[00:20:15] Elana Zeide: Sue's being a little modest here. It's, if Sue builds it, they will come. She does set the agenda sometimes on what people think about. Um, so it's 

[00:20:28] Elsbeth Magilton: great that she's so open-minded about it. I love that. But first of all, dear listeners, we have a Google doc open with all of our topics for both segments of the show.

And I just muted myself so I could type quantum policy, I feel like. So if you just saw that appear, it's cuz I was like, What, I wanna talk about this. But before we move to our second segment of the show, um, when we talk about career journeys and we are counseling students and talking [00:21:00] to new or younger professionals,

uh, our society really values success, and I think a lot of them are really focused on being successful each step of the way, which of course is a wonderful goal. But we all know that failure is part of the journey, right? That part of success is sometimes feeling what it feels like to not be successful.

So I always love to ask people, and only if you're comfortable sharing, about a big fail. What was a setback or a failure that you experienced that has been a part of your journey in a meaningful 

[00:21:30] Sue Glueck: way? Ooh, that is a great question. There are so many. How do I choose, really? But I think, um, you know, the one for me that is the most poignant is the story I told the students about coming back from, uh, an extended medical leave only to find that my job was completely changing in a matter of days, and not in a way [00:22:00] that I liked.

Um, I felt so disrespected by that, but the truth is that I hadn't... Over most of the time in that job, leading a privacy legal team, my skills had grown, my knowledge had grown, and the job had grown, and then the job wasn't growing. So that was the moment, and it was very stressful.

Um, which may in turn have led to the medical issue, although there's no real way to trace cause and effect there. But I think enormous stress is not good for you under any circumstances. Um, and I wasn't able to see for myself that the job was no longer growing in a way that I could grow with it. [00:23:00] 

That, in fact, it was shrinking, and then rapidly shrinking, um, when I came back to work after the medical leave. So I think that if I had been smart enough to recognize the signs before we got to this very, you gotta be kidding me, this is my first real day back from medical leave and you're destroying everything I loved about my job,

I don't think it would've come to that. But I was so immersed in it and so in love with the parts that I loved that I didn't notice the parts that I loved were shrinking over time. So having this more blatant thing happen, where, Oh yeah, we're gonna transform your job into something you will loathe, and I'm, Oh, okay. So let me tell you about inorganic chemistry.

I tried to take inorganic chemistry multiple times, but it was so boring and I was so not interested that every time [00:24:00] I would open the book I would fall asleep. It was like I had narcolepsy, like I would, I would wake up with my face in the book, and that's never happened to me in any other scenario. If I'm not interested, I can't do it.

This is a thing. I may not have had perfect self-knowledge in guiding my career, but this is the one thing that I've known about myself since inorganic chemistry. I dropped the class twice, um, because I couldn't make myself do any of it, you know? I mean, I'd go to the lectures and I'd kind of doze off there as well.

So, um, back to, suddenly my job is transformed, and I knew that I had to get a different job, because if I know nothing else about myself, I know that if I don't like it, I can't do it. In college, I got A's and A-pluses and B-minuses, and the B-minuses were the things like [00:25:00] inorganic chemistry. I'm still the same person.

[00:25:03] Elsbeth Magilton: That is such excellent advice. Know what you love and know thyself, right? And that's a great takeaway, uh, when we've experienced those moments that require us to use our resiliency. Well, we will be right back to discuss a little bit more about Sue's current work and modern issues in AI, privacy, and her other focus areas at Microsoft.

[00:25:32] Paige Ross: I'm Paige Ross, a student fellow at the Nebraska Governance and Technology Center. The student fellows at the center are drawn from across the University of Nebraska, including the colleges of law, business, engineering, and journalism and mass communications. In the program, we develop research projects focused on the intersection of society and technology, and, working in multidisciplinary teams,

think about how to communicate our work to the public. Some of this year's subjects [00:26:00] include designing autonomous vehicles with drivers in mind, satellite congestion in low earth orbit, and taking the politics out of online content moderation. We have some fun and network with fellow students and faculty too.

The program is open to graduate or law students at the University of Nebraska and welcomes students from all departments. Now back to this episode of Tech Refactored.

[00:26:33] Elsbeth Magilton: Welcome back. Elana and I are here with Sue Glueck, the Senior Director of Academic Relations in Microsoft Corporation's legal department, where she works on AI ethics, the future of work, and privacy, among other issues. Okay, let's dive into some of the issues and areas you're working on day to day. 

[00:26:50] Sue Glueck: That sounds great.

[00:26:53] Elana Zeide: So what does the academic relations team at Microsoft actually do?

[00:26:55] Sue Glueck: It's, I mean, the overall [00:27:00] goal is to identify the best and the brightest, um, whether they're up and coming and sort of junior in their academic career or they've been at it for quite a long time, on the tech policy topics we care about the most. Because sometimes we need help, sometimes we wanna be helpful, um, and

help scholars understand things that they don't necessarily have deep insight into because they work for a university and not a corporation. Sometimes there are topics that nobody seems to be working on, and we can kinda help with that. Uh, we can, um, you know, we can try to get people interested. And I set up meetings between, uh, it feels funny to tell you this, Elana, because you've experienced it, [00:28:00] but I set up meetings between subject matter experts in the relevant fields and professors when they come to visit, or, um, during the pandemic, you know, just meeting online, um, so that we can, you know, maybe get you intrigued by a topic that you haven't thought about before, or potentially an angle you haven't thought about before.

Uh, I, I think Elana, you should answer the question actually. 

[00:28:30] Elana Zeide: Well, so this one I can speak on personally, which is, um, so, as some of you out there know, I do a lot with student privacy, and I have to say, uh, Microsoft is very much responsible for some of that, based on some connections and issues I talked about, uh, early on in my academic career, so, uh...

[00:28:52] Sue Glueck: Well, it was fun to watch. So when I first met you, um, you were, uh, collaborating with, [00:29:00] or at least brainstorming with, uh, one of my colleagues, and, um, watching the two of you kind of riff, and, you know, well, what if we put the peanut butter with the chocolate?

And with each of you, like, you saying, Huh, perhaps I'll write, you know, a paper that delves into that, and my colleague saying, Huh, maybe I'll organize an event so people can talk about this. And, um, at the time I don't think there was a big focus on student privacy at all, but there were companies,

and Microsoft among them, providing technology to schools, but, you know, uh, perhaps not really focusing on FERPA and not looking at, um, gosh, in student privacy, there [00:30:00] should be so many more laws than exist. So what do you do about that, and what's the right thing to do?

And Elana, your work in contemplating the meaning of your permanent record, that for kids it should not be an albatross, a digital albatross that they drag through their school years and off into their work years, um, is incredibly inspiring.

[00:30:30] Elana Zeide: To me, that may be the title of my next paper, The Digital Albatross.

That's a good one. So I'm a sucker for a good title, and it pains me sometimes. So there are conferences, like the Privacy Law Scholars Conference, where people bring draft papers and you get to give feedback on their draft paper. And so it's fun. It's like going back to school. No [00:31:00] consequences, right?

No one's grading you on your feedback. Um, but you can sometimes help people with these things. But the thing is, like, in the beginning I was very shy about saying, Wow, your draft, in the original, in the abstract, like, you had this great title, and now you've ruined it by making it boring.

But now, eight years into this job, I routinely say that to people and try to help them come up with something, uh, something better. I tried to get Woody Hartzog to rename, uh, I don't remember the name of the paper, it's something about privacy, about having a right to privacy even when you're out in public.

And I wanted him to call it something-or-other, colon, Hiding in Plain Sight. Um, but he wouldn't. He gave a talk about it at Microsoft; that's what I named [00:32:00] the talk, so the talk did not match up to his slides. Um, you know, I tried everything. You 

[00:32:07] Elsbeth Magilton: know, Woody is a semi-regular guest on the show, so, oh, he'll

[00:32:14] Sue Glueck: I can even, I can look it up later in my calendar, what, uh, my title for his paper was and what his title turned out to be. And mine was just way better. So you have a really unique 

[00:32:28] Elana Zeide: bird's-eye perspective on what's happening around the world in privacy and technology law. Uh, so what do you think some of the biggest regulatory challenges are that society's facing right now?

[00:32:43] Sue Glueck: Boy, so many. So many. The first one that comes to mind is racial equity and technology. If you think about machine learning, you train machine learning on data sets. So if, [00:33:00] for example, um, you train facial recognition technology on the most popular, uh, corpus of data, of faces, out there, they're all white men who are, like, CEOs or something.

A few white women, mostly white men. Um, if you train your facial recognition on that kind of a data set, one that is not diverse, then your technology is, first of all, not gonna be very good. Um, and you could unintentionally send a ripple effect through society. If police forces are using facial recognition technology that misidentifies people of color, and so they get arrested, that is a huge problem.
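The kind of disparity Sue describes is typically surfaced by evaluating a model's error rate separately for each demographic group rather than only in aggregate. The sketch below is a minimal, hypothetical illustration of that idea; the group labels and results are invented, and it is not a description of Microsoft's or any vendor's actual evaluation process.

```python
# A minimal, hypothetical sketch of disaggregated evaluation: a model trained
# on a non-diverse face dataset can look acceptable on average while failing
# far more often for under-represented groups. All numbers here are invented.
from collections import defaultdict

# (demographic group, was the model's match correct?) from a labeled test set
hypothetical_results = [
    ("white_men", True), ("white_men", True), ("white_men", True), ("white_men", False),
    ("white_women", True), ("white_women", True), ("white_women", False),
    ("people_of_color", True), ("people_of_color", False), ("people_of_color", False),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, correct in hypothetical_results:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# The aggregate number hides the disparity that the per-group numbers reveal.
print(f"overall error rate: {sum(errors.values()) / sum(totals.values()):.0%}")
for group, n in totals.items():
    print(f"{group}: error rate {errors[group] / n:.0%}")
```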

Um, so I think racial equity and technology. And, you know, there is always bias in the [00:34:00] human heart. It's part of, I believe, um, you know, in some long-ago psychology class, they talked about just the ability to distinguish friend from foe, uh, blue from green; when it comes down to that, we are always, quote unquote, discriminating, because we're distinguishing between one thing and the next.

And unfortunately, what comes along with that is bias. Everyone has it. So what do we do in technology, not only not to make that worse, but to make things better? Like, technology's not a panacea, it's not gonna fix all the problems, but how can we do better? And so what should the law be, what should regulations be,

what should the policy be to encourage that kind of thing? In today's heavily politicized environment, that's tough. Um, then there's techlash, um, the feeling that too much power is in the hands of big tech. And, um, how do you regulate that without entirely stamping out innovation?

How do you allow room for innovation while at the same time requiring responsible conduct? What do you do about content moderation? You know, what about misinformation and disinformation that we know can potentially sway elections? It can have people decide whether or not they think getting vaccinated is a good thing or a bad thing.

How do you have a tech company be the arbiter of what's true versus what's false? Because sometimes, you know, people just believe different things. Um, I think that AI policy is [00:36:00] at the heart of a lot of what I was just talking about, because as we automate more and more, then we really need to be careful about the impacts, especially on the most vulnerable members of society, who usually don't get a say in how the software is designed.

Um, so I think these are really, really big challenges to solve, and, I don't know, I like to think about it the way I do when I have a big problem, or let's say I have a big project to do: I break it down into teeny, teeny, tiny, bite-sized pieces. And you can go too small with that. I think the US approach to privacy, which is sectoral

outside of California, um, is not the best approach; that was taking bite-sized pieces. But [00:37:00] at some point you also have to elevate from the bite-sized pieces to, let's look at the whole cake, and what should the whole cake be. But you gotta start somewhere. And if you start small, you're more likely not to make a mistake with, you know, this.

I don't know. Everything in life is about unintended consequences. Can I jump 

[00:37:23] Elsbeth Magilton: in? I would love to get your take on this. So I was having a conversation with somebody about misinformation this weekend, uh, a different generation, quite a bit older than I am. And he was commenting, he's like, you know, I don't know why everyone's talking about this so much.

Didn't everyone else also read Julius Caesar? Lying to be successful in politics or to influence elections in other countries is something that humans have been doing since we've existed, which is a fair point, right? Um, and the way that he was framing it, which I thought was not wrong but interesting, was that the [00:38:00] internet actually provides us, yes, this opportunity for rampant mis- and disinformation, but it also provides us probably the most effective remedy we've ever had as humans, um, to share correct information, which was

a simultaneously optimistic and pessimistic view. But from your vantage point, and what you look at every day, and the work you're putting out, and the academics you talk to, what would be your kind of reaction to that? Um, where we're at both the worst place and the best place to solve the problem.

[00:38:29] Sue Glueck: Well, I always, when I'm thinking about this topic, I think about our reptile brains. So our reptile brains are all about lack and attack. Um, fight, flight, freeze; at a basic level, to survive, we still have these mechanisms that perhaps don't serve us in 2022 quite so much, right? No cyber-tooth, saber-tooth,

cyber-tooth, ha, uh, [00:39:00] tigers out there. But, um, oh no, never laugh at your own jokes. 

[00:39:06] Elsbeth Magilton: That's terrible. It was good. I loved it. It wasn't an intentional joke either. I don't think so. Another amazing title. 

[00:39:15] Sue Glueck: Cyber-tooth tigers. Um, so the thing is, the folks who've been putting out misinformation have been doing it in a fantastic manner, and some of this is modeled off of, um, during the Cold War, the disinformation wars that went on.

But of course, back then we were talking pamphlets dropped from an airplane, maybe, or Radio Free Europe. Or, you know, how you package up the information: if you go for things that frighten people, that really gets their attention. Whereas things that soothe and calm [00:40:00] people, well, for the parasympathetic nervous system to kick in, you have to do that deep breathing,

you know, or maybe smell something that reminds you of the cherry pie your grandmother used to make, just things like that. But it takes a few minutes to get there, whereas the reptile brain, um, the brain stem, just gets you right there into fear. Um, and it's hard to counteract that, because I guess you

fight fire with fire. But mostly it's, like, you know, nerdy scientists saying, um, that is actually not true, here is what is true. And it's typically not packaged as a compelling message, and it's not going for fear. Um, it's trying to do the opposite, and that's not how the human brain works.

So I think [00:41:00] it's a difficult problem, but your friend has, uh, a really beautiful gem of an idea in there, and that is media literacy. That if you are better at evaluating the claims you see on the internet and not just going with what your reptile brain says, if you've read the classics, if you have some understanding of the human heart, that's the thing. But it's, um, you know, how do you get through to all these people?

The University of Washington Center for an Informed Public distributes a class that, uh, two of the professors involved developed. Um, and forgive me for saying the actual title, but it's something like Calling Bullshit. And it teaches students how to, um, figure out what's true and what's not on the internet.

And, um, they offer, like, a PG title or a G-rated title, um, that does not have BS in the title. Um, but it's been kind of gratifying to me that the first school to take up this curriculum and teach it was a Catholic girls' school, and they kept the bullshit in the title. So, got that.

Yeah, for sure. So, 

[00:42:27] Elana Zeide: So we've talked about AI, we've talked about privacy, we've talked about content moderation, and then you mentioned something in your talk, which now I'm gonna get you on, which is quantum policy. What should we be looking for next, if it's not complicated enough?

[00:42:46] Sue Glueck: So, um, quantum computers, um, have not been fully realized yet. Uh, and it's an incredible [00:43:00] undertaking to have a general quantum computer; um, you know, keeping it cool, and there are all kinds of technical issues, I think including having enough helium, uh, for them. And, uh, people have talked about mining helium from the moon, because we have a finite supply here on earth.

Um, it really sounds like something from the Jetsons or, you know, something very futuristic, but it's right around the corner. Um, but even though general quantum computing isn't available, there are quantum computing techniques that, um, so, you remember back from, I don't know, physics class in high school maybe,

um, that light can be both a wave and a particle, and the act of observing, to try to capture whether it's a wave or a particle, can change a wave into a particle [00:44:00] or a particle into a wave; that's the basis of the Heisenberg uncertainty principle. Um, imagine capturing that for quantum technology that lets you communicate almost instantaneously with someone in another part of the world.

Um, then imagine you get a warrant for a wiretap of this perfectly encrypted, simultaneous communication. Uh, if you're not thinking about quantum policy when you design the quantum technology, then, you know, that can be a very expensive mistake, uh, one that doesn't allow you to comply with the law.

And so you're in a very difficult position if that happens. The obvious quantum policy thing that a number of people have talked about is around cybersecurity. So if the bad guy has a [00:45:00] quantum computer and you do not, when you buy something on the internet, the bad guys will be able to decrypt your credit card and get your credit card and other information, potentially steal your identity and the identity of

hundreds of thousands, millions of people very quickly and easily. Okay, that's bad. But it's so obvious. It's so obvious that, once you, like, learn what this technology is, it's like, ooh, yes, the people who don't have quantum computers are really gonna have a hard time with security and maintaining their privacy.
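For a rough sense of the stakes Sue is describing: on a large, fault-tolerant quantum computer, Shor's algorithm would break the RSA and elliptic-curve key exchange that protects today's online transactions outright, while Grover's algorithm roughly halves the effective strength of symmetric keys. The back-of-the-envelope sketch below is purely illustrative and not from the episode.

```python
# Back-of-the-envelope illustration of the quantum threat to encryption.
# Grover's algorithm searches an n-bit key space in roughly 2**(n/2) steps
# instead of 2**n; Shor's algorithm (not modeled here) breaks RSA/ECC key
# exchange outright on a sufficiently large quantum computer.

def classical_search_steps(key_bits: int) -> int:
    """Worst-case exhaustive search over an n-bit symmetric key space."""
    return 2 ** key_bits

def grover_search_steps(key_bits: int) -> int:
    """Approximate cost under Grover's quadratic speedup."""
    return 2 ** (key_bits // 2)

for bits in (128, 256):
    print(
        f"AES-{bits}: classical ~{classical_search_steps(bits):.1e} steps, "
        f"Grover ~{grover_search_steps(bits):.1e} steps"
    )
# AES-128's effective strength drops to about 2**64 quantum steps, which is
# one reason post-quantum guidance favors 256-bit symmetric keys and new
# public-key algorithms.
```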

That part of it doesn't sound that interesting to me, cuz it's obvious. It's the, well, what about the law enforcement stuff, or what about other issues, and how is it that this is a brand new technology that most people, unless you're a physicist, like, you don't understand? That's why I was so delighted when Chris Hoofnagle [00:46:00] at Berkeley, uh, coauthored a book about it, so that ordinary humans, uh, could understand it.

And, um, strangely, at Microsoft I didn't realize that a good friend of mine manages the lawyer who supports the quantum team, and he bought the book, and he bought a copy for my friend as well. And they both read it, and, you know, I was like, Oh, I'm sorry, like, if I had known, you know, I could have shown you the article that led to the book.

I could have, um, asked permission for you to read a pre-publication copy before the book even came out. Um, and I can certainly let you... Um, now Chris has a fanboy and a fangirl.

[00:46:52] Elsbeth Magilton: Well, this was so much fun, Sue. I really appreciate you spending time with our students today and spending time with us this afternoon, um, to share a little [00:47:00] bit more about your career and what it is that you do, uh, with all of our 

[00:47:03] Sue Glueck: listeners. Well, thank you for having me. I was not sure how this was gonna go, cuz it's my first podcast, and, you know, I didn't feel a thing. It was painless and actually fun. So thank you so much for that. 

[00:47:21] Elsbeth Magilton: Wonderful. Thank you so much for being here. And thank you, listeners, for joining us on this episode of Tech Refactored. If you want to learn more about what we're doing here at NGTC or submit an idea for a future episode, you can go to our website at ngtc.unl.edu, or you can follow us on Twitter at UNL underscore NGTC.

If you enjoyed this show, don't forget to leave us a rating and review wherever you listen to podcasts. This podcast is part of the Menard Governance and Technology Programming series. [00:48:00]