
Charles Isbell and Michael Littman: Machine Learning and Education | Lex Fridman Podcast #148

1 hour 57 minutes 46 seconds


S1

Speaker 1

00:00

The following is a conversation with Charles Isbell and Michael Littman. Charles is the dean of the College of Computing at Georgia Tech, and Michael is a computer science professor at Brown University. I've spoken with each of them individually on this podcast, and since they are good friends in real life, we all thought it would be fun to have a conversation together. Quick mention of each sponsor, followed by some thoughts related to the episode.

S1

Speaker 1

00:26

Thank you to Athletic Greens, the all-in-one drink that I start every day with to cover all my nutritional bases, Eight Sleep, a mattress that cools itself and gives me yet another reason to enjoy sleep, Masterclass, online courses from some of the most amazing humans in history, and Cash App, the app I use to send money to friends. Please check out the sponsors in the description to get a discount and to support this podcast. As a side note, let me say that having two guests on the podcast is an experiment that I've been meaning to do for a while. In particular, because down the road, I would like to occasionally be a kind of moderator for debates between people that may disagree in some interesting ways.

S1

Speaker 1

01:10

If you have suggestions for who you would like to see debate on this podcast, let me know. As with all experiments of this kind, it is a learning process. Both the video and the audio might need improvement. I realized I think I should probably do three or more cameras next time as opposed to just two, and also try different ways to mount the microphone for the third person.

S1

Speaker 1

01:34

Also, after recording this intro, I'm going to have to go figure out the thumbnail for the video version of the podcast, since I usually put the guest's head on the thumbnail, and now there are two heads and two names to try to fit into the thumbnail. It's kind of a bin packing problem, which in theoretical computer science happens to be an NP-hard problem. Whatever I come up with, if you have better ideas for the thumbnail, let me know as well. And in general, I always welcome ideas for how this thing can be improved.
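As an aside, the bin packing problem mentioned here is the task of fitting items of given sizes into as few fixed-capacity bins as possible; finding the optimal packing is NP-hard, so heuristics are used in practice. Below is a minimal Python sketch of the classic first-fit decreasing heuristic; the item sizes and capacity are invented for illustration.

```python
# First-fit decreasing: a common bin packing heuristic (not optimal).
# Item sizes and bin capacity below are made-up illustration values.

def first_fit_decreasing(sizes, capacity):
    """Pack items into bins greedily, largest items first."""
    bins = []  # each bin is a list of item sizes
    for size in sorted(sizes, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:  # fits into an existing bin
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits; open a new one
    return bins

if __name__ == "__main__":
    print(first_fit_decreasing([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.1], 1.0))
```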

S1

Speaker 1

02:10

If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman. And now, here's my conversation with Charles Isbell and Michael Littman.

S2

Speaker 2

02:27

You'll probably disagree about this question, but what is your biggest, would you say, disagreement about either something profound and very important or something completely not important at all?

S3

Speaker 3

02:39

I don't think we have any disagreements at all.

S4

Speaker 4

02:41

Ah, I'm not sure that's true.

S3

Speaker 3

02:44

We walked into that one, didn't we?

S4

Speaker 4

02:45

Yeah, that's pretty good. So one thing that you sometimes mention, and we did this one on air too, as it were, is whether or not machine learning is computational statistics.

S3

Speaker 3

02:55

It's not.

S4

Speaker 4

02:56

But it is.

S3

Speaker 3

02:57

Well, it's not. And in particular, and more importantly, it is not just computational statistics.

S2

Speaker 2

03:02

So what's missing in the picture?

S3

Speaker 3

03:04

What's- All the rest of it.

S4

Speaker 4

03:06

What's missing? That which is missing. Oh, because it's- Well, you can't be wrong now.

S3

Speaker 3

03:10

Well, it's not just the statistics. He doesn't even believe this. We've had this conversation before.

S3

Speaker 3

03:14

If it were just the statistics, then we would be happy with where we are. But it's not just the statistics.

S4

Speaker 4

03:19

That's why it's computational statistics. Or if

S3

Speaker 3

03:21

it were just the computational.

S4

Speaker 4

03:22

I agree that machine learning is not just statistics.

S3

Speaker 3

03:24

It is not just statistics. We

S4

Speaker 4

03:25

can agree on that.

S3

Speaker 3

03:26

Nor is it just computational statistics.

S4

Speaker 4

03:27

It's computational statistics.

S3

Speaker 3

03:29

It is computational.

S2

Speaker 2

03:30

What is the computational in computational statistics? Does this take us into the realm of computing?

S3

Speaker 3

03:35

It does, but I think perhaps the way I can get him to admit that he's wrong is that it's about rules.

S4

Speaker 4

03:43

It's about rules.

S3

Speaker 3

03:44

It's about symbols, it's about all these other things.

S2

Speaker 2

03:45

But statistics

S4

Speaker 4

03:46

is not about rules? I'm gonna say statistics is about rules.

S3

Speaker 3

03:48

But it's not just the statistics, right? It's not just a random variable that you choose and you have a probability distribution.

S4

Speaker 4

03:52

I think you have a narrow view of statistics.

S3

Speaker 3

03:54

Okay, well then, what would be the broad view of statistics that would still allow it to be statistics, and not, say, history, that would make computational statistics okay?

S4

Speaker 4

04:02

Well, okay, so my first sort of research mentor, a guy named Tom Landauer, taught me to do some statistics. Right? And I was annoyed all the time because the statistics would say that what I was doing was not statistically significant.

S4

Speaker 4

04:19

And I was like, but, but, but, and basically what he said to me is, statistics is how you're gonna keep from lying to yourself. Which I thought was really deep.

S3

Speaker 3

04:29

It is a way to keep yourself honest in a particular way. I agree with that.

S4

Speaker 4

04:34

Yeah. And so you're trying to find rules. I'm just going to bring it back to rules.

S2

Speaker 2

04:38

Wait, wait, wait. Could you possibly try to define rules?

S4

Speaker 4

04:43

Even regular statisticians, non-computational statisticians, do spend some of their time evaluating rules, right? Applying statistics to try to understand, does this rule capture this? Does this not capture that?

S2

Speaker 2

04:55

I mean like hypothesis testing kind of thing? Sure. Or like confidence intervals?

S2

Speaker 2

04:59

Like have like-

S4

Speaker 4

05:00

More like hypothesis. Like I feel like the word statistic literally means a summary, like a number that summarizes other numbers. But I think the field of statistics actually applies that idea to things like rules, to understand whether or not a rule is valid.
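An aside to make this concrete: the simplest version of using statistics to keep from lying to yourself about a rule is to ask how likely the rule's success rate would be under pure chance. Here is a minimal Python sketch with invented numbers (a hypothetical rule that is right on 60 of 100 held-out cases).

```python
# A one-sided binomial test: how likely is it that a rule gets at least
# k of n predictions right if it is really just guessing (p = 0.5)?
from math import comb

def binomial_p_value(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

if __name__ == "__main__":
    # Hypothetical rule: right on 60 of 100 held-out cases.
    print(f"p = {binomial_p_value(60, 100):.3f}")  # ~0.028, unlikely by chance
```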

S4

Speaker 4

05:15

Software engineering

S3

Speaker 3

05:16

statistics? No. Programming languages statistics? No.

S3

Speaker 3

05:21

Because I think it's useful to think about a lot of what AI and machine learning is or certainly should be as software engineering, as programming languages. To put it in language that you might understand, the hyperparameters beyond the problem itself.

S4

Speaker 4

05:35

The hyperparameters is too many syllables for me to understand.

S3

Speaker 3

05:38

The hyperparameters.

S4

Speaker 4

05:39

That's better.

S3

Speaker 3

05:40

That goes around it, right? It's the decisions you choose to make. It's the metrics you choose to use.

S3

Speaker 3

05:44

It's the loss

S4

Speaker 4

05:45

function you choose

S3

Speaker 3

05:46

to focus on. So you wanna say the

S4

Speaker 4

05:46

practice of machine learning is different than the practice of statistics. Like the things you have to worry about and how you worry about them are different, therefore they're different.
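A minimal sketch of these "decisions around the problem," assuming a scikit-learn setting: the algorithm, the hyperparameter grid, the scoring metric, and the evaluation protocol are all choices the practitioner makes beyond the data itself. The dataset and grid values below are arbitrary placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# A synthetic stand-in dataset; the real decision is which data to use.
X, y = make_classification(n_samples=300, random_state=0)

search = GridSearchCV(
    SVC(),                                         # choice: the algorithm
    {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},  # choice: the grid
    scoring="f1",                                  # choice: the metric
    cv=5,                                          # choice: the protocol
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```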

S3

Speaker 3

05:54

Right, at the very least. That much is true. It doesn't mean that statistics, computational or otherwise, aren't important.

S3

Speaker 3

06:02

I think they are. I mean, I do a lot of that, for example. But I think it goes beyond that. I think that we could think about game theory in terms of statistics, but I don't think it's as useful to do.

S3

Speaker 3

06:12

I mean, the way I would think about it, or a way I would think about it is this way. Chemistry is just physics. But I don't think it's as useful to think about chemistry as being just physics. It's useful to think about it as chemistry.

S3

Speaker 3

06:25

The level of abstraction really matters here.

S4

Speaker 4

06:27

So I think it is, there are contexts in which it is useful.

S3

Speaker 3

06:30

Yes. To think of

S4

Speaker 4

06:30

it that way, right? And so finding that connection is actually helpful. And I think that's why I emphasize the computational statistics thing.

S4

Speaker 4

06:37

I think I want to befriend statistics and not absorb them.

S3

Speaker 3

06:42

Here's a way to think about it beyond what I just said, right? So what would you say, and I want you to think back to a conversation we had a very long time ago, what would you say is the difference between, say, the early 2000s ICML and what we used to call NIPS, now NeurIPS? Is there a difference, particularly in the machine learning that was done there?

S2

Speaker 2

07:01

ICML was around that long? Oh yeah. So ICML is the newer conference, newish.

S2

Speaker 2

07:06

Yeah, I guess so.

S3

Speaker 3

07:08

And ICML was

S2

Speaker 2

07:08

around in 2000. Oh, ICML predates that.

S4

Speaker 4

07:12

I think my most cited ICML paper is from

S1

Speaker 1

07:14

94.

S3

Speaker 3

07:15

Michael knows this better than me because of course he's significantly older than I am. But the point is, what is the difference between ICML and NeurIPS in the late 90s, early 2000s?

S4

Speaker 4

07:24

I don't know what everyone else's perspective would be, but I had a particular perspective at that time, which is I felt like ICML was more of a computer science place, and NIPS was more of an engineering place, like the kind of math that happened at the two places. As a computer scientist, I felt more comfortable with the ICML math, and the NIPS people would say that that's because I'm dumb.

S4

Speaker 4

07:48

And that's such an engineering thing to say.

S3

Speaker 3

07:51

I agree with that part of it, but I'd put it a little differently. Actually, I had a nice conversation with Tom Dietterich about this. On Twitter.

S3

Speaker 3

07:57

On Twitter, just a couple days ago. I put it a little differently, which is that ICML was machine learning done by computer scientists, and NeurIPS was machine learning done by computer scientists trying to impress statisticians. Which was weird because it was the same people, at least by the time I started paying attention, but it just felt very, very different.

S3

Speaker 3

08:18

And I think that that perspective of whether you're trying to impress the statisticians or you're trying to impress the programmers is actually very different and has real impact on what you choose to worry about and what kind of outcomes you come to. So I think it really matters. I think computational statistics is a means to an end. It is not an end in some sense.

S3

Speaker 3

08:36

And I think that really matters here in the same way that I don't think computer science is just engineering or just science or just math or whatever.

S4

Speaker 4

08:43

Okay, so I'd have to now agree that... now we agree on everything.

S3

Speaker 3

08:46

Yes, yes. The important thing here is that, you know, my opinions may have changed, but not the fact that I'm right, I think is what we just came to.

S4

Speaker 4

08:54

Right, and my opinions may have changed and not the fact that I'm wrong.

S2

Speaker 2

08:57

That's right. You lost me. I'm not even.

S2

Speaker 2

09:00

I think

S3

Speaker 3

09:00

I lost myself there too. But anyway, we're back.

S4

Speaker 4

09:04

This happens to us sometimes, we're sorry.

S2

Speaker 2

09:06

How do neural networks change this, just to even linger on this topic, change this idea of statistics, how big of a pie statistics is within the machine learning thing? Like, because it sounds like hyperparameters and also just the role of data. You know, people are starting to use this terminology of software 2.0, which is like the act of programming as, like, you're a designer in the hyperparameter space of neural networks, and you're also the collector and the organizer and the cleaner of the data, and that's part of the programming.

S2

Speaker 2

09:46

So how did, on the NeurIPS versus ICML topic, what's the role of neural networks in redefining the size and the role of machine learning?

S3

Speaker 3

09:57

I can't wait to hear what Michael thinks about this, but I would add one thing.

S4

Speaker 4

10:01

But you will.

S3

Speaker 3

10:01

But that's true. I will. I'll force myself to.

S3

Speaker 3

10:04

I think there's one other thing I would add to your description, which is the kind of software engineering part of what it means to debug, for example. But this is a difference between the kind of computational statistics view of machine learning and the computational view of machine learning, which is, I think, one is worried about the equation, as it were. And by the way, this is not a value judgment. I just think it's about perspective.

S3

Speaker 3

10:24

But the kind of questions you would ask, you start asking yourself, well, what does it mean to program and develop and build the system? It's a very computer science-y view of the problem. I mean, if you get on data science Twitter and econ Twitter, you actually hear this a lot, with the economists and the data scientists complaining about the machine learning people: well, it's just statistics, and I don't know why they don't see this. But they're not even asking the same questions.

S3

Speaker 3

10:49

They're not thinking about it as a programming problem. I think that that really matters, just asking this question. I actually think it's a little different from programming in hyperparameter space and collecting the data. But I do think that that immersion really matters.

S3

Speaker 3

11:05

So I'll give you a quick example of the way I think about this. So I teach machine learning. Michael and I have co-taught a machine learning class, which has now reached, I don't know, 10,000 people at least over the last several years or somewhere thereabouts. And my machine learning assignments are of this form.

S3

Speaker 3

11:21

So the first one is something like: implement these five algorithms, KNN and SVMs and boosting and decision trees and neural networks, and maybe that's it, I can't remember. When I say implement, I mean steal the code. I'm completely uninterested. You get zero points for getting the thing to work.

S3

Speaker 3

11:38

I don't want you spending your time worrying about getting the corner case right of what happens when you are trying to normalize distances and the points on the thing, and so you divide by zero. I'm not interested in that, right? Steal the code. However, you're going to run those algorithms on two data sets.

S3

Speaker 3

11:55

The data sets have to be interesting. What does it mean to be interesting? Well, a data set's interesting if it reveals differences between algorithms, which presumably are all the same, because they can represent whatever they can represent. And two data sets are interesting together if they show different differences, as it were.

S3

Speaker 3

12:10

And you have to analyze them. You have to justify their interestingness, and you have to analyze them in a whole bunch of ways. But all I care about is the data in your analysis, not the programming. And I occasionally end up in these long discussions with students.

S3

Speaker 3

12:21

Well, I don't really. I copy and paste the things that I've said the other 15,000 times it's come up, which is, they go, but the only way to really understand is to code them up, which is a very programmer, software-engineering view of the world. If you don't program it, you don't understand it, which, by the way, I think is wrong in a very specific way, but it is a way that you come to understand, because then you have to wrestle with the algorithm. But the thing about machine learning is it's not just sorting numbers, where in some sense the data doesn't matter.

S3

Speaker 3

12:49

What matters is, well, does the algorithm work on these abstract things, is one less than the other? In machine learning, the data matters. It matters more than almost anything. Not everything, but almost anything.

S3

Speaker 3

12:59

And so, as a result, you have to live with the data and don't get distracted by the algorithm per se. And I think that that focus on the data and what it can tell you and what question it's actually answering for you, as opposed to the question you thought you were asking, is a key and important thing about machine learning and is a way that computationalists, as opposed to statisticians, bring a particular view about how to think about the process. The statisticians, by contrast, bring, I think I'd be willing to say, a better view about the kind of formal math that's behind it and what an actual number ultimately is saying about the data. And those are both important, but they're also different.
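A rough sketch of what such an assignment might look like in practice, assuming scikit-learn as the source of the "stolen" code: run the five learners on two datasets and look for datasets that reveal differences between the algorithms. The two synthetic datasets here are stand-ins; the actual assignment asks students to find and justify genuinely interesting ones.

```python
from sklearn.datasets import make_classification, make_moons
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Two placeholder datasets meant to stress the learners differently:
# low-dimensional nonlinear structure vs. many mostly-irrelevant features.
datasets = {
    "moons": make_moons(n_samples=400, noise=0.3, random_state=0),
    "high_dim": make_classification(n_samples=400, n_features=50,
                                    n_informative=5, random_state=0),
}
learners = {
    "knn": KNeighborsClassifier(),
    "tree": DecisionTreeClassifier(random_state=0),
    "boosting": AdaBoostClassifier(random_state=0),
    "svm": SVC(),
    "neural net": MLPClassifier(max_iter=2000, random_state=0),
}
for dname, (X, y) in datasets.items():
    for lname, model in learners.items():
        acc = cross_val_score(model, X, y, cv=5).mean()  # 5-fold accuracy
        print(f"{dname:9s} {lname:10s} {acc:.2f}")
```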

S2

Speaker 2

13:38

I didn't really think of it this way: to build intuition about the role of data, the different characteristics of data, by having two data sets that are different, and they reveal the differences and the different differences. That's a really fascinating, that's a really interesting educational approach.

S4

Speaker 4

13:54

The students love it, but not right away.

S3

Speaker 3

13:57

No, they love it at the end. They love

S4

Speaker 4

13:58

it later.

S3

Speaker 3

13:58

They love it at the end, Not at the beginning.

S4

Speaker 4

14:02

Not even immediately after.

S2

Speaker 2

14:04

I feel like there's a deep, profound lesson about education there. Yeah. That you can't listen to students about whether what you're doing is the right or the wrong thing.

S3

Speaker 3

14:16

Well, as a wise Michael Littman once said to me about children, which I think applies to teaching: you have to give them what they need without bending to their will. And students are like that. You have to figure out what they need.

S3

Speaker 3

14:29

You're a curator. Your whole job is to curate and to present. Because on their own, they're not gonna necessarily know where to search, so you're providing pushes in some direction in the learning space. And you have to give them what they need in a way that keeps them engaged enough so that they eventually discover what they want, and they get the tools they need to go and learn other things.

S2

Speaker 2

14:50

What's your view, let me put on my Russian hat, which believes that life is suffering.

S4

Speaker 4

14:55

I like Russian hats, by the way. If you have one, I would

S3

Speaker 3

14:57

like this.

S2

Speaker 2

14:58

Those are ridiculous, yes.

S4

Speaker 4

15:00

But in a delightful way, but sure.

S2

Speaker 2

15:03

What do you think is the role of, we talked about balance a little bit, what do you think is the role of hardship in education? Like, I think the biggest things I've learned, like what made me fall in love with math, for example, is by being bad at it until I got good at it. So like struggling with a problem, which increased the level of joy I felt when I finally figured it out.

S2

Speaker 2

15:33

And it always felt to me, with teachers, especially modern discussions of education: how can we make education more fun, more engaging, more all those things? Well, from my perspective, it's like you're maybe missing the point that education, that life, is suffering. Education is supposed to be hard, and that's actually what increases the joy you feel when you actually learn something. Is that ridiculous?

S2

Speaker 2

16:02

Do you like to see your students suffer?

S4

Speaker 4

16:04

Okay, so this may be a point where we differ.

S3

Speaker 3

16:07

I suspect not. But do go on.

S4

Speaker 4

16:09

Well, what would your answer be? I wanna hear you first. Okay, well, I was gonna not answer the question.

S3

Speaker 3

16:15

So you don't want the students

S4

Speaker 4

16:16

to know you

S3

Speaker 3

16:17

enjoy them suffering?

S4

Speaker 4

16:18

No, no, no, no, no, no. I was gonna say that there's, I think there's a distinction that you can make in the kind of suffering, right? So I think you can be in a mode where you're suffering in a hopeless way versus you're suffering in a hopeful way, right?

S4

Speaker 4

16:33

Where you're like, you can see that if you, that you still have, you can still imagine getting to the end, right? And as long as people are in that mindset where they're struggling, but it's not a hopeless kind of struggling, That's productive. I think that's really helpful. But if struggling, like if you break their will, if you leave them hopeless, no, that don't, sure, some people are gonna, whatever, lift themselves up by their bootstraps.

S4

Speaker 4

17:00

But like mostly you give up and certainly it takes the joy out of it and you're not gonna spend a lot of time on something that brings you no joy. So it is a bit of a delicate balance, right? You have to thwart people in a way that they still believe that there's a way through.

S3

Speaker 3

17:16

Right, so that's a, we strongly agree actually. So I think, well first off, struggling and suffering aren't the same thing, right?

S2

Speaker 2

17:24

Yeah, just being poetic.

S3

Speaker 3

17:25

Oh no, no, I actually appreciate the poetry, and one of the reasons I appreciate it is that they are often the same thing and often quite different. So you can struggle without suffering. You can certainly suffer pretty easily.

S3

Speaker 3

17:37

You don't necessarily have to struggle to suffer. So I think that you want people to struggle, but that hope matters. They have to understand that they're gonna get through it on the other side. And it's very easy to confuse the two.

S3

Speaker 3

17:49

I actually think Brown University, just philosophically, has a very different take on the relationship with their students, particularly undergrads, from, say, a place like Georgia Tech. Which university is better? Well, I have my opinions on that.

S4

Speaker 4

18:03

I mean, remember, Charles said, it doesn't matter what the facts are, I'm always right.

S3

Speaker 3

18:07

The correct

S4

Speaker 4

18:07

answer

S3

Speaker 3

18:08

is that it doesn't matter. They're different. But there's clearly answers to that.

S4

Speaker 4

18:14

Well, he went to a school, as an undergrad, like the school where he is now. Specifically the same school, though it changed a bit in the intervening years.

S3

Speaker 3

18:23

Brown or Georgia Tech? No, I

S4

Speaker 4

18:24

was talking about Georgia Tech. And I went to an undergrad place that's a lot like the place where I work now, and so it does seem like we're more familiar with these models.

S2

Speaker 2

18:33

There's a similarity between Brown and Yale?

S4

Speaker 4

18:35

Yeah, I think they're quite similar, yeah.

S3

Speaker 3

18:38

And Duke.

S4

Speaker 4

18:39

Duke has some similarities too, but it's got a little Southern drawl.

S3

Speaker 3

18:42

You've kind of worked, you've sort of worked at universities that are like the places where you learned. And the same would be true for me.

S2

Speaker 2

18:52

Are you uncomfortable venturing outside the box? Is that what you're saying?

S3

Speaker 3

18:57

Journeying out? Not what I'm saying.

S4

Speaker 4

18:58

Yeah, Charles definitely is. He only goes to places that have Institute in the name.

S3

Speaker 3

19:02

It has worked out that way. Well, academic places anyway. Well, no, I was a visiting scientist at UPenn, or visiting something at UPenn.

S2

Speaker 2

19:11

Oh, wow, I just understood your joke.

S4

Speaker 4

19:14

Which one? I like to set these sorts of time bombs.

S2

Speaker 2

19:20

The Institute joke, that Charles only goes to places that have Institute in the name. I guess I forget that Georgia Tech is Georgia Institute of Technology.

S3

Speaker 3

19:30

The number of people who refer to it as Georgia Tech University is large and incredibly irritating. It's one of the few things that genuinely gets under my skin.

S4

Speaker 4

19:39

But schools like Georgia Tech and MIT have, as part of the ethos, like, I want to say there's an abbreviation that someone taught me, IHTFP, something like that. There's an expression which is basically, I hate being here, which they say so proudly. And that is definitely not the ethos at Brown.

S4

Speaker 4

19:57

Like Brown is, there's a little more pampering and empowerment and stuff. And it's not like we're gonna crush you and you're gonna love it. So yeah, I think the ethoses are different.

S2

Speaker 2

20:09

That's interesting, yeah. We had drown proofing. What's that?

S3

Speaker 3

20:12

Drown proofing. In order to graduate from Georgia Tech, this is a true thing, feel free to look it up. If you-

S4

Speaker 4

20:17

A lot of schools have this, by the way.

S3

Speaker 3

20:19

No, actually Georgia Tech was apparently the first.

S4

Speaker 4

20:20

Brandeis has it. Had it.

S2

Speaker 2

20:23

I feel like Georgia Tech was the first in

S3

Speaker 3

20:25

a lot of ways. It was the first in a lot of things. Had the first master's degree in-

S4

Speaker 4

20:29

First Bumblebee mascot.

S3

Speaker 3

20:30

Stop that. First master's in computer science, actually.

S4

Speaker 4

20:34

Right, online master's.

S3

Speaker 3

20:35

Well, that too, but way back in the 60s. Oh, really? Yeah, yeah.

S3

Speaker 3

20:38

It was the first information and computer science master's degree in the country. But at Georgia Tech, it used to be the case that in order to graduate, you had to take a drown-proofing class, where effectively, they threw you in the water tied up, and if you didn't drown, you got to graduate.

S4

Speaker 4

20:54

Tied you up?

S3

Speaker 3

20:55

I believe so. No. There were certainly versions of it, but I mean, luckily, they ended it just before I had to graduate because otherwise I would have never graduated.

S3

Speaker 3

21:03

It wasn't gonna happen. I wanna say '84, '83, somewhere around then, they ended it. But yeah, you used to have to prove you could tread water for some ridiculous amount of time.

S4

Speaker 4

21:13

Or you

S3

Speaker 3

21:13

couldn't graduate. No, it was more than two minutes. I bet

S4

Speaker 4

21:15

it was two minutes.

S3

Speaker 3

21:16

Okay, well, we'll

S1

Speaker 1

21:16

look it up. And it was in a bathtub.

S3

Speaker 3

21:20

It was in a pool, but it was a real thing. But that idea that, you know, push you- Fully clothed. Yeah, fully clothed.

S4

Speaker 4

21:25

I bet it was that and not tied up, because, like, who needs to learn how to swim when you're tied up? Nobody. But who needs to learn to swim when you're actually falling into the water dressed? That's a real thing.

S2

Speaker 2

21:34

I think your facts are getting in the way of a good story.

S4

Speaker 4

21:37

Oh, that's fair, that's fair. I didn't mean to. All right,

S3

Speaker 3

21:39

so they tie you up. The narrative matters. But whatever it was, you had to, it was called drown-proofing for a reason.

S3

Speaker 3

21:44

The point of the story, Michael, is

S2

Speaker 2

21:48

that- Struggle.

S3

Speaker 3

21:49

Well, no, but that's good. It does bring it back to struggle. That's a part of what Georgia Tech has always been, and we struggle with that, by the way, about what we want to be, particularly as things go.

S3

Speaker 3

21:59

But you sort of, how much can you be pushed without breaking? And you come out of the other end stronger, right? There's a saying we used to have when I was an undergrad at Georgia Tech: building tomorrow the night before. Right?

S3

Speaker 3

22:15

And it was just kind of idea that, give me something impossible to do and I'll do it in a couple of days because that's what I just spent the last 4 or 5 or 6 years doing.

S4

Speaker 4

22:24

That ethos definitely stuck to you. Having now done a number of projects with you, you definitely will do it the night before. That's

S3

Speaker 3

22:30

not entirely true. There's nothing wrong with waiting until the last minute. The secret is knowing when the last minute is.

S2

Speaker 2

22:35

Right, that's brilliantly put.

S4

Speaker 4

22:38

Yeah, that is a definite Charles statement that I am trying not to embrace.

S3

Speaker 3

22:44

Well, I appreciate that because you helped move my last minute.

S2

Speaker 2

22:47

That's a social construct where you converge together on what the definition of last minute is. We figure that out all together. In fact, MIT, I'm sure a lot of universities have this, but MIT has MIT time: everyone has always agreed together that there is such a concept, and everyone just keeps showing up like 10 to 15 to 20 minutes, depending on the department, late to everything.

S2

Speaker 2

23:11

So there's like a weird drift that happens. It's kind of fascinating.

S3

Speaker 3

23:14

Yeah, we're 5 minutes. We're

S2

Speaker 2

23:16

5 minutes.

S3

Speaker 3

23:16

In fact, the classes will say, well, this is no longer true, actually, but it used to be that a class would start at 8, but actually it started at 8:05; it ends at 9, but actually it ends at 8:55. Everything's 5 minutes off, and nobody expects anything to start until 5 minutes after the hour or the half hour, whatever it is. It still exists.

S4

Speaker 4

23:32

It hurts my head.

S2

Speaker 2

23:33

Well, let's rewind the clock back to the 50s and 60s when you guys met. How did you, I'm just kidding, I don't know. But can you tell the story of how you met?

S2

Speaker 2

23:43

So you've, like the internet and the world kind of knows you as connected in some ways in terms of education, of teaching the world. That's like the public facing thing, but how did you as human beings and as collaborators

S3

Speaker 3

23:59

meet? I think there's two stories. One is how we met, and the other is how we got to know each other. I'm not gonna say fell in love.

S3

Speaker 3

24:08

I'm gonna say that we came to understand that we

S4

Speaker 4

24:11

had some common something. Yeah, it's funny, because on the surface, I think we're different in a lot of ways, but there's something that's just consonant. There you go.

S4

Speaker 4

24:22

Afternoon.

S3

Speaker 3

24:23

So I will tell the story of how we met, and I'll let Michael tell the story of how we got to know each other.

S4

Speaker 4

24:27

Okay, all right.

S3

Speaker 3

24:28

Okay, so here's how we met. I was already at that point, it was AT&T Labs. There's a long, interesting story there.

S3

Speaker 3

24:34

But anyway, I was there, and Michael was coming to interview. He was a professor at Duke at the time, but decided for reasons that he wanted to be in New Jersey. And so that would mean Bell Labs slash AT&T Labs. And we were doing interviews, interviews were very much like academic interviews.

S3

Speaker 3

24:51

And so I had to be there. We all had to meet with him afterwards and so on, one-on-one. But it was obvious to me that he was gonna be hired. Like no matter what, because everyone loved him.

S3

Speaker 3

25:00

They were just talking about all the great stuff he did. And, oh, he did this great thing. And you had just won something at AAAI, I think. Or maybe you got 18 papers in AAAI that year.

S4

Speaker 4

25:08

I got the best paper award at AAAI for the crossword stuff.

S3

Speaker 3

25:11

Right, exactly. So that had all happened. And everyone was going on and on and on about it.

S3

Speaker 3

25:14

Actually, Satinder was saying incredibly nice things about

S4

Speaker 4

25:16

you. Really?

S3

Speaker 3

25:17

Yes. So

S4

Speaker 4

25:17

he can be very grumpy.

S3

Speaker 3

25:19

Yes.

S4

Speaker 4

25:19

That's very, that's nice to hear.

S3

Speaker 3

25:21

He was grumpily saying very nice things.

S4

Speaker 4

25:22

Oh, that's that makes

S3

Speaker 3

25:23

sense. And it does make sense. So, you know, it was gonna happen. So why was I meeting him?

S3

Speaker 3

25:27

I had something else I had to do. I can't remember what it was. Probably involved comic books.

S4

Speaker 4

25:31

So he remembers meeting me as inconveniencing his afternoon.

S3

Speaker 3

25:34

So he came, so I eventually came to my office. I was in the middle of trying to do something, I can't remember what, and he came and he sat down. And for reasons that are purely accidental, despite what Michael thinks, my desk at the time was set up in such a way that it had sort of an L shape, and the chair on the outside was always lower than the chair that I was in and you know the kind of point was to-

S4

Speaker 4

25:52

The only reason I think that it was on purpose is because you told me it was on purpose.

S3

Speaker 3

25:56

I don't remember that. Anyway, the thing is that, you know, it kind of-

S4

Speaker 4

25:58

His guest chair was really low so that he could look down at everybody.

S3

Speaker 3

26:02

The idea was just to simply create a nice environment where you were asking for a mortgage, and I was gonna say no.

S4

Speaker 4

26:07

That was the point. It's a very

S3

Speaker 3

26:08

simple idea here. Anyway, so we sat there, and we just talked for a little while, and I think he got the impression that I didn't like him.

S4

Speaker 4

26:14

It wasn't true. I strongly got that impression.

S3

Speaker 3

26:15

The talk was really good. The talk,

S4

Speaker 4

26:17

by the way, was terrible. And right after the talk, I said to my host, Michael Kearns, who ultimately was my boss.

S2

Speaker 2

26:23

I'm a friend and a huge fan of Michael, yeah.

S4

Speaker 4

26:25

Yeah, he is a remarkable person. After my talk, I

S3

Speaker 3

26:30

went into the- Irritatingly good at basketball.

S2

Speaker 2

26:31

Racquetball, he's good at everything. No, basketball. No, but basketball, racquetball too.

S2

Speaker 2

26:36

Squash. Squash, squash, not racquetball.

S3

Speaker 3

26:38

Yeah, squash, which is not. Racquetball, yes. Squash, no.

S3

Speaker 3

26:42

And I hope you hear that, Michael.

S4

Speaker 4

26:45

Oh, Michael Kearns.

S2

Speaker 2

26:46

As a game, not his skill level, because I'm pretty sure he's-

S4

Speaker 4

26:49

All right, there's some competitiveness there. But the point is that it was like the middle of the day; I had a full day of interviews. Like I met with people, but then in the middle of the day, I gave a job talk.

S4

Speaker 4

26:59

And then there was gonna be more interviews, but I pulled Michael aside and I said, I think it's in both of our best interest if I just leave now. Because that was so bad that it'd just be embarrassing if I have to talk to any more people. You look bad for having invited me. Let's just forget this ever happened.

S4

Speaker 4

27:19

So I don't think the talk went well.

S3

Speaker 3

27:21

That's one of the most Michael Littman sets of sentences I think I've ever heard. He did great, or at least everyone knew he was great, so maybe it didn't matter. I was there, I remember the talk, and I remember him being very much the way I remember him now on any given week.

S3

Speaker 3

27:33

So it was good. And we met and we talked about stuff. He thinks I didn't like him, but-

S4

Speaker 4

27:37

Because he was so grumpy.

S2

Speaker 2

27:39

Must've been the chair thing.

S4

Speaker 4

27:40

The chair thing and the low voice, I think. He obviously

S2

Speaker 2

27:43

wasn't happy with me. And that slight skeptical look. Yes.

S4

Speaker 4

27:48

I have no idea what you're talking about.

S3

Speaker 3

27:50

Well, I probably didn't have any idea what you were talking about. Anyway, I liked him.

S4

Speaker 4

27:54

He asked me questions, I answered questions. I felt bad about myself. It was a normal day.

S3

Speaker 3

27:59

It was a normal day. And then he left.

S2

Speaker 2

28:01

And then he left. And that's how we met. Can we take it?

S4

Speaker 4

28:03

And then I got hired and I was in the group.

S2

Speaker 2

28:05

Can we take a slight tangent on this topic of, it sounds like, maybe you could speak to the bigger picture, it sounds like you're quite self-critical.

S4

Speaker 4

28:15

Who, Charles?

S2

Speaker 2

28:15

No, you.

S4

Speaker 4

28:16

Oh, I think I can do better. I can do better. Try me again.

S4

Speaker 4

28:19

I'll do better. I'll be so self-critical. I

S2

Speaker 2

28:22

won't, I won't. Yeah, that was like a 3 out of

S1

Speaker 1

28:25

10

S2

Speaker 2

28:25

response. So let's try to work it up to a 5 or 6. Yeah, I remember Marvin Minsky said in a video interview that the key to success in academic research is to hate everything you do, for some reason.

S3

Speaker 3

28:44

I think

S4

Speaker 4

28:44

I followed that because I hate everything he's done.

S2

Speaker 2

28:48

Ah, that's a good line. That's a sick line.

S4

Speaker 4

28:52

Maybe that's a keeper.

S2

Speaker 2

28:54

But do you find that resonates with you at all in how you think about talks and so on?

S3

Speaker 3

28:59

I would say a different line. It's not

S4

Speaker 4

29:01

that. No, not really.

S3

Speaker 3

29:02

That's such an MIT view of the world though. So I remember talking about this when, as a student, you were basically told, I will clean it up for the purposes of the podcast. My work is crap, my work is crap, my work is crap, my work is crap, then you go to a conference or something, you're like, everybody else's work is crap, everybody else's work is crap, and you feel better and better about

S4

Speaker 4

29:21

it,

S3

Speaker 3

29:22

relatively speaking, and then you sort of keep working on it. I don't hate my work.

S4

Speaker 4

29:26

That resonates with me.

S3

Speaker 3

29:27

Yes, I've never hated my work, but I have been dissatisfied with it. And I think being dissatisfied, being okay with the fact that you've taken a positive step, the derivative's positive, maybe even the second derivative's positive. That's important because that's a part of the hope, right?

S3

Speaker 3

29:45

But you have to have that "but I haven't gotten there yet." If that's not there, that I haven't gotten there yet, then it's hard to move forward, I think. So I buy that, which is a little different from hating everything that you do.

S4

Speaker 4

29:56

Yeah, I mean, there's things that I've done that I like better than I like myself. So it's separating me from the work, essentially. So I think I am very critical of myself, but sometimes the work I'm really excited about, and sometimes I think it's kind of good.

S3

Speaker 3

30:10

Does that happen right away? So the work that I've liked, that I've done, most of it I liked more in retrospect, when I was far away from it in time.

S4

Speaker 4

30:21

I have to be fairly excited about it to get it done.

S3

Speaker 3

30:24

No, excited at the time, but then happy with the result. But years later, or even I might go back, you know what, that actually turned out to matter. That turned out to matter.

S3

Speaker 3

30:31

Or, oh gosh, it turns out I've been thinking about that. It's actually influenced all the work that I've done since without realizing it.

S4

Speaker 4

30:37

But that guy was smart.

S3

Speaker 3

30:39

Yeah, that guy had a future. Yeah, I, yeah.

S4

Speaker 4

30:43

He's going places.

S3

Speaker 3

30:44

So yeah, I think there's something to it. I think there's something to the idea that you've got to hate what you do, but it's not quite hate. It's just being unsatisfied.

S3

Speaker 3

30:52

And different people motivate themselves differently. I don't happen to motivate myself with self-loathing. I happen to motivate myself with something else.

S2

Speaker 2

30:58

So you're able to sit back and be proud, in retrospect, of the work you've done.

S3

Speaker 3

31:04

Well, and it's easier when you can connect it with other people, because then you can be proud of them.

S2

Speaker 2

31:08

Proud of the people, yeah.

S3

Speaker 3

31:10

And then the question is- And then you

S2

Speaker 2

31:10

can still safely hate yourself privately.

S3

Speaker 3

31:12

Yeah, that's right. It's win-win, Michael. Or at least win-lose, which is what you're looking for.

S2

Speaker 2

31:18

Oh, wow. There's so many brilliant

S4

Speaker 4

31:20

lines in this. There's levels.

S2

Speaker 2

31:23

So how did you actually get to know each other?

S4

Speaker 4

31:25

Yeah, Mike. So the way I think about it is, because we didn't do much research together

S3

Speaker 3

31:31

at

S4

Speaker 4

31:31

AT&T. But then we all got laid off. So that was, that sucked.

S3

Speaker 3

31:36

By the

S2

Speaker 2

31:36

way, sorry to interrupt, but that was like one of the most magical places, historically speaking.

S3

Speaker 3

31:42

They did not appreciate what they had.

S2

Speaker 2

31:45

And how do we, I feel like there's a profound lesson in there too, how do we get that back? Like, what was it, why was it so magical? Is it just a coincidence of history, or is there something special about it?

S3

Speaker 3

31:55

There were

S4

Speaker 4

31:56

some really good managers and people who really believed in machine learning, as in, this is gonna be important. Let's get the people who are thinking about this in creative and insightful ways, put them in one place, and stir.

S3

Speaker 3

32:10

Yeah, but even beyond that, right, it was Bell Labs at its heyday. And even when we were there, which I think was past its heyday.

S4

Speaker 4

32:17

And to be clear, he got to be at Bell Labs. I never got to be at Bell Labs. I joined after that.

S3

Speaker 3

32:22

Yeah, I showed up in '91 as a grad student. So I was there for a long time, every summer except for two.

S4

Speaker 4

32:28

So twice I worked for companies that had just stopped being Bell Labs.

S3

Speaker 3

32:31

Right, Bellcore.

S4

Speaker 4

32:31

Bellcore and then AT&T Labs.

S2

Speaker 2

32:33

So Bell Labs was several locations for the researchers, or is it one? Like, is that because

S4

Speaker 4

32:38

they're all in Jersey?

S3

Speaker 3

32:41

Yeah, they're all over the place.

S4

Speaker 4

32:42

But they were in a couple places in Jersey.

S3

Speaker 3

32:44

Murray Hill was the Bell Labs place.

S4

Speaker 4

32:48

So you had an office in Murray Hill at one point in your career.

S3

Speaker 3

32:51

Yeah, I played ultimate frisbee on the cricket pitch at Bell Labs at Murray Hill. And then it became AT&T Labs when it split off with Lucent during what we called trivestiture.

S2

Speaker 2

33:01

Are you better than Michael Kearns at ultimate frisbee?

S4

Speaker 4

33:03

Yeah.

S3

Speaker 3

33:03

Oh, yeah. Okay.

S4

Speaker 4

33:04

But I think that was not boasting. I think Charles plays a lot of ultimate and I don't think

S3

Speaker 3

33:09

Michael does. Yes, but that wasn't the point. The point is, yes.

S4

Speaker 4

33:13

I'm sorry. Oh, yes, yes, sorry, sorry. Okay, I have played on a championship-winning ultimate frisbee team or whatever, ultimate team with Charles.

S4

Speaker 4

33:20

So I know how good he is. He's really good.

S3

Speaker 3

33:23

How good I was anyway when I was younger. But the thing is.

S4

Speaker 4

33:25

I know how young he was when he was younger.

S3

Speaker 3

33:27

That's true.

S4

Speaker 4

33:27

So much younger than now. He's

S2

Speaker 2

33:29

older now.

S3

Speaker 3

33:30

Yeah, I'm older. Michael was a much better basketball player than I was.

S4

Speaker 4

33:33

Michael Kearns.

S3

Speaker 3

33:34

Yes. No, not Michael. To be

S4

Speaker 4

33:36

clear, I've not played basketball with you, so you don't know how terrible I am, but you have a probably pretty good guess.

S3

Speaker 3

33:42

That you're not as good as Michael Kearns.

S4

Speaker 4

33:44

He's tall and athletic.

S3

Speaker 3

33:45

He cared about it.

S4

Speaker 4

33:46

He's very

S3

Speaker 3

33:46

athletic, he's very good.

S4

Speaker 4

33:48

I

S3

Speaker 3

33:48

love hanging out with Michael. Anyway, but we were talking about something else, although I no longer remember what it was. What were we talking about?

S2

Speaker 2

33:54

Oh, Bell Labs.

S3

Speaker 3

33:55

Bell Labs, yes. So this is what was kind of cool, what was magical about it. The first thing you have to know is that Bell Labs was an arm of the government, right?

S3

Speaker 3

34:03

Because AT&T was an arm of the government. It was a monopoly. And, you know, every month you paid a little thing on your phone bill, which it turned out was a tax for, like, all the research that Bell Labs was doing. And, you know, they invented the transistor and the laser and whatever else, and that Big Bang

S4

Speaker 4

34:17

or whatever, the cosmic background radiation.

S3

Speaker 3

34:20

Yeah, they did all that stuff. They did some amazing stuff with directional microphones, by the way. I got to go in this room where they had all these panels and everything, and we would talk to one another. He moved some panels around, and then he'd have me step two steps to the left, and I couldn't hear a thing he was saying, because nothing was bouncing off the walls. And then he would shut it all down, and you could hear your heartbeat.

S3

Speaker 3

34:40

Which is deeply disturbing, to hear your heartbeat. You can feel it. I mean, you can feel it now. There's so much of this sort of noise around.

S3

Speaker 3

34:46

Anyway, Bell Labs was about pure research. It was a university, in some sense the purest sense of a university, but without students. So it was all the faculty working with one another, and students would come in to learn. They would come in for three or four months during the summer, and they would go away.

S3

Speaker 3

35:00

But it was just this kind of wonderful experience. I could walk out my door. In fact, I would often have to walk out my door and deal with Rich Sutton and Michael Kearns yelling at each other about whatever it is they were yelling about, the proper way to prove something or another. And I could just do that.

S3

Speaker 3

35:15

And David McAllester and Peter Stone and all of these other people, including Satinder, and then eventually Michael. And it was just a place where you could think thoughts, and it was okay, because so long as once every 25 years or so somebody invented a transistor, it paid for everything else. You could afford to take the risk. And then when that all went away, it became harder and harder and harder to justify it as far as the folks who were very far away were concerned.

S3

Speaker 3

35:41

And there was such a fast turnaround among middle management on the AT&T side that you never had a chance to really build a relationship. At least people like us didn't have a chance to build a relationship. So when the diaspora happened, it was amazing, right? Everybody left, and I think everybody ended up at a great place and continued to do really good work in machine learning.

S3

Speaker 3

36:02

But it was a wonderful place. And people will ask me, what's the best job you've ever had? And as a professor, anyway, the answer that I would give is, well, probably Bell Labs in some very real sense. And I would never have a job like that again because Bell Labs doesn't exist anymore.

S3

Speaker 3

36:19

And you know, Microsoft Research is great, and Google does good stuff, and you can pick IBM if you want to, but Bell Labs was magical. It was around at an important time, and it represents a high-water mark of basic research in the US.

S2

Speaker 2

36:32

Is there something you could say about the physical proximity and the chance collisions? Like, we live in this time of the pandemic where everyone is maybe trying to see the silver lining and accepting the remote nature of things. One of the things that people, like the faculty that I talk to, miss is the procrastination.

S2

Speaker 2

36:56

Like the chance to, like, everything is about meetings that are supposed to happen; there's not a chance to just talk about comic books or whatever, or go into a discussion that's totally pointless. So it's funny you say this, because that's how we met. It was exactly that.

S3

Speaker 3

37:10

So I'll let Michael say that, but I'll just add one thing, which is just that research is a social process. And it helps to have random social interactions, even if they don't feel social at the time; that's how you get things done. One of the great things about the AI lab when I was there, I don't quite know what it looks like now once they moved buildings, but we had entire walls that were whiteboards, and people would just get up there and they would just write, and people would walk up, and you'd have arguments, and you'd explain things to one another. You got so much out of the freedom to do that. You had to be okay with people challenging every fricking word you said, which I would sometimes find deeply irritating, but most of the time it was quite useful.

S3

Speaker 3

37:49

But the sort of pointlessness and the interaction was in some sense the point, at least for me.

S2

Speaker 2

37:54

Yeah, I mean, I think offline yesterday I mentioned Josh Tenenbaum, and he's very much, he's such an inspiration in the childlike way that he pulls you in on any topic. It doesn't even have to be about machine learning or the brain. He'll just pull you to the closest writable surface, and you can still find whiteboards at MIT everywhere.

S2

Speaker 2

38:20

And just like basically cancel all meetings and talk for a couple hours about some aimless thing and it feels like the whole world, the time-space continuum kind of warps and that becomes the most important thing. And then it's just, it's definitely something worth missing in this world where everything's remote. There's some magic to the physical presence.

S3

Speaker 3

38:42

Whenever I wonder whether MIT really is as great as I remember it, I just go talk to Josh.

S2

Speaker 2

38:48

Yeah, you know, that's funny. There's a few people in this world that carry the best of what particular institutions stand for, right? And there's-

S3

Speaker 3

38:57

There's Josh. I mean, I don't, my guess is he's unaware of this.

S2

Speaker 2

39:00

That's the point. That the masters are not aware of their mastery. So, how did you meet?

S2

Speaker 2

39:09

Yes, but first a tangent, no. How did you get to know each other?

S4

Speaker 4

39:14

So I'm not sure what you were thinking of, but when it started to dawn on me that maybe we had a longer-term bond was after we all got laid off. You had decided at that point that, well, we were still paid, we were given an opportunity to do a job search and kind of make a transition, but it was clear that we were done.

S4

Speaker 4

39:35

And I would go to my office to work and you would go to my office to keep me from working. That was my recollection of it. You had decided that there was really no point in working for the company because our relationship with the company was done. Yeah, but

S3

Speaker 3

39:49

remember, I felt that way beforehand. It wasn't about the company; it was about the set of people there doing really cool things, and it had always been that way. But we were working on something together.

S3

Speaker 3

39:57

Oh, yeah,

S4

Speaker 4

39:57

yeah, yeah, that's right. So at the very end, we all got laid off, but then our boss came to, our boss's boss came to us, because our boss was Michael Kearns, and he had jumped ship brilliantly, like perfect timing, like things, like right before the ship was about to sink, he was like, gotta go, and landed perfectly, because Michael Kearns. Because Michael Kearns.

S4

Speaker 4

40:20

And leaving the rest of us to go like, this is fine. And then it was clear that it wasn't fine and we were all toast. So we had this sort of long period of time. But then our boss figured out, okay, wait, maybe we can save a couple of these people if we can have them do something really useful.

S4

Speaker 4

40:37

And the useful thing was we were gonna make basically an automated assistant that could help you with your calendar, you could like tell it things and it would respond appropriately, it would just kind of integrate across all sorts of your personal information. And so me and Charles and Peter Stone were set up as the crack team to actually solve this problem. Other people maybe were too theoretical, they thought, but we could actually get something done. So we sat down to get something done, and there wasn't time, and it wouldn't have saved us anyway.

S3

Speaker 3

41:09

And

S4

Speaker 4

41:10

so it all kind of went downhill. But the interesting, I think, coda to that is that our boss's boss was a guy named Ron Brachman. And when he left AT&T, because we were all laid off, he went to DARPA and started up a program there that became CALO, which is the program from which Siri sprung, which is a digital assistant that helps you with your calendar and a bunch of other things.

S4

Speaker 4

41:37

It really, in some ways, got its start with me and Charles and Peter trying to implement this vision that Ron Brachman had, that he ultimately got implemented through his role at DARPA. So when I'm trying to feel less bad about having been laid off from what is possibly the greatest job of all time, I think about, well, we kind of helped birth Siri. That's something.

S3

Speaker 3

42:01

And then he did other things too. But we got to spend a lot of time in his office and talk about a lot of things.

S4

Speaker 4

42:07

We got to spend a lot of time in my office. Yeah. Yeah, yeah.

S4

Speaker 4

42:10

And so then we went on our merry way. Everyone went to different places. Charles landed at Georgia Tech, which was what he always dreamed he would do. And so that worked out well.

S4

Speaker 4

42:23

I came up with a saying at the time, which is luck favors the Charles. It's kind of like luck favors the prepared, but Charles, like he wished something and then it would basically happen just the way he wanted. It was inspirational to see things go that way.

S3

Speaker 3

42:38

Things worked out.

S4

Speaker 4

42:39

And we stayed in touch. And then I think it really helped when you were working on, I mean, you'd kept me in the loop for things like Threads and the work that you were doing at Georgia Tech. But then when they were starting their online master's program, he knew that I was really excited about MOOCs and online teaching, and he's like, I have a plan. And I'm like, tell me your plan.

S4

Speaker 4

42:58

He's like, I can't tell you the plan yet, because they were deep in negotiations between Georgia Tech and Udacity to make this happen, and they didn't want it to leak. So Charles would keep teasing me about it but wouldn't tell me what was actually going on. And eventually it was announced, and he said, I would like you to teach the machine learning course with me. I'm like, that can't possibly work.

S4

Speaker 4

43:18

But it was a great idea and it was super fun. It was a lot of work to put together, but it was really great.

S2

Speaker 2

43:23

Was that the first time you thought about, first of all, was it the first time you got seriously into teaching? I

S4

Speaker 4

43:30

mean, I was a professor. I'm trying

S2

Speaker 2

43:31

to get

S4

Speaker 4

43:31

the timing right.

S2

Speaker 2

43:33

Also, this was already after you jumped to, so like, there's a little bit of jumping around in time.

S4

Speaker 4

43:38

Yeah, sorry

S3

Speaker 3

43:39

about that. There's a pretty big

S2

Speaker 2

43:39

jump in time. So like the MOOCs thing.

S4

Speaker 4

43:42

So Charles got to Georgia Tech, and he, I mean, maybe this is a Charles story.

S3

Speaker 3

43:45

I think this was like 2002. He

S4

Speaker 4

43:46

got to Georgia Tech in

S1

Speaker 1

43:47

2002.

S4

Speaker 4

43:50

And worked on things like revamping the curriculum, the undergraduate curriculum, so that it had some kind of semblance of modular structure, because computer science was, at the time, moving from a fairly narrow, specific set of topics to touching a lot of other parts of intellectual life, and the curriculum was supposed to reflect that. And so Charles played a big role in kind of redesigning that. And then the-

S3

Speaker 3

44:16

And for my labors, I ended up as associate dean.

S4

Speaker 4

44:20

Right, he got to become associate dean in charge of educational stuff.

S2

Speaker 2

44:24

This should be a valuable lesson. If you're good at something, they will give you responsibility to do more of that thing until you-

S4

Speaker 4

44:34

Don't show competence.

S2

Speaker 2

44:35

Don't show competence if you don't want responsibility.

S3

Speaker 3

44:38

Here's what they say. The reward for good work is more work. The reward for bad work is less work.

S3

Speaker 3

44:46

Which, I don't know, depending on what you're trying to do that week, 1 of those is better than the other.

S2

Speaker 2

44:51

Well, 1 of the problems with the word work, sorry to interrupt, is that in this particular language it seems to be treated as the antonym of happiness. That's 1 of, you know, we talked about balance. It's always work-life balance, and it always rubbed me the wrong way as terminology.

S2

Speaker 2

45:12

I know it's just words.

S4

Speaker 4

45:13

Right, the opposite of work is play, but ideally work is play.

S3

Speaker 3

45:17

Oh, I can't tell you how much time I'd spend, certainly when I was at Bell Labs, except for a few very key moments. As a professor, I would do this too. I would just say, I cannot believe they're paying me to do this.

S3

Speaker 3

45:28

Because it's fun, it's something that I would do for a hobby if I could anyway. So that's how it worked out.

S2

Speaker 2

45:35

Are you sure you wanna be saying that when this is being recorded?

S3

Speaker 3

45:38

Oh no, as a dean, that is not true at all. I need a raise. Yeah.

S3

Speaker 3

45:41

But I think the thing here is that even though a lot of time passed, Michael and I talked almost every, well, we texted almost every day during that period.

S4

Speaker 4

45:49

Charles at 1 point took me, there was the ICML conference, the machine learning conference was in Atlanta. I was the chair, the general chair of the conference. Charles was my publicity chair or something like that, or fundraising chair?

S4

Speaker 4

46:05

Yeah, but he decided it'd be really funny if he didn't actually show up for the conference in his own home city, so he didn't. But he did at 1 point pick me up at the conference in his Tesla and drive me to the Atlanta mall, where he forced me to buy an iPhone, because he didn't like what texting with me was like and thought it would be better for him if I had an iPhone; the texting would be somehow smoother.

S3

Speaker 3

46:30

And it was.

S4

Speaker 4

46:30

And it was.

S3

Speaker 3

46:31

And it is, and his life is better.

S4

Speaker 4

46:32

And my life is better. And so, yeah. But it was, yeah, Charles forced me to get an iPhone so that he could text me more efficiently.

S4

Speaker 4

46:40

I thought that was an interesting moment.

S3

Speaker 3

46:41

It works for me. Anyway, so we kept talking the whole time, and then eventually we did the teaching thing. And it was great.

S3

Speaker 3

46:46

And there's a couple of reasons for that, by the way. 1 is I really wanted to do something different. Like, you've got this medium here. People claim it can change things.

S3

Speaker 3

46:54

What's a thing that you could do in this medium that you could not do otherwise? Besides edit, right? I mean, what could you do? And being able to do something with another person was that kind of thing.

S3

Speaker 3

47:04

It's very hard. I mean, you can take turns, but teaching together, having conversations, is very hard. So that was a cool thing. The second thing, it gave me an excuse to do more stuff with him.

S4

Speaker 4

47:12

Yeah, I always thought, he makes it sound brilliant. And it is, I guess. But at the time, it really felt like Charles was saying, I've got a lot to do, and it would be great if Michael could teach the course and I could just

S3

Speaker 3

47:27

hang out.

S4

Speaker 4

47:27

Yeah, just kind of coast on that.

S3

Speaker 3

47:29

Well, that's what the second class was more like. Because the second class was-

S4

Speaker 4

47:32

The second class was explicit like that.

S3

Speaker 3

47:33

But the-

S4

Speaker 4

47:33

The first class, it was at least half. So the structure that we came up-

S2

Speaker 2

47:37

I think you're once again letting the facts get in the way of a good story.

S4

Speaker 4

47:42

I should just let Charles talk.

S3

Speaker 3

47:44

But those are the facts that he saw. So that was kind of true. Those are your facts.

S3

Speaker 3

47:48

Yeah, that was sort of true for 7642, which is the reinforcement learning class, because that was really his class.

S2

Speaker 2

47:52

You started with reinforcement learning?

S3

Speaker 3

47:53

No, we started with, I did the intro machine learning, 7641, which is supervised learning, unsupervised learning, and reinforcement learning and decision making, cram all that in there, the kind of assignments that we talked about earlier. And then eventually, about a year later, we did a follow-on, 7642, which is reinforcement learning and decision making. The first class was based on something I'd been teaching at that point for well over a decade.

S3

Speaker 3

48:14

And the second class was based on something Michael had been teaching. Actually, I learned quite a bit teaching that class with him. He drove most of that. But the first 1, which I drove most of, was all my material.

S3

Speaker 3

48:23

Although I had stolen that material originally from slides I found online from Michael, who had originally stolen that material from, I guess, slides he found online, probably from Andrew Moore. Because the jokes were the same anyway. At least when I found the slides, some of the same stuff was there. Yes, every machine learning class taught in the early 2000s stole from Andrew Moore.

S4

Speaker 4

48:41

A particular joke or 2.

S3

Speaker 3

48:43

At least the structure. Now I did, and he did actually a lot more with reinforcement learning and such, and game theory and those kinds of things. But we all sort of built-

S2

Speaker 2

48:51

You mean in the research world?

S3

Speaker 3

48:52

No, no, no, in that class. No, I mean in teaching that class.

S4

Speaker 4

48:54

The coverage was different than what we started with.

S3

Speaker 3

48:57

Most people were just doing supervised learning and maybe a little bit of clustering and whatnot. But we took it all the way to the...

S4

Speaker 4

49:02

A lot of it just comes from Tom Mitchell's book.

S3

Speaker 3

49:04

Oh no. Yeah, except, well, half of it comes from Tom Mitchell's book, right? But the other half doesn't.

S3

Speaker 3

49:10

This is why it's all readings, right? Because certain things weren't invented when Tom wrote this book.

S4

Speaker 4

49:13

Yeah, okay, that's true.

S3

Speaker 3

49:14

Right? But it was quite good. But there's a reason for that besides, you know, just that I wanted to do it. I wanted to do something new and I wanted to do something with him. Which came from a realization: despite what you might believe, he's an introvert and I'm an introvert, or I'm on the edge of being an introvert anyway.

S3

Speaker 3

49:32

But both of us, I think, enjoy the energy of the crowd. There's something about talking to people and bringing them in to whatever we find interesting that is empowering, energizing, or whatever. And I found the idea of staring alone at a computer screen and then talking off of materials less inspiring than I wanted it to be.

S4

Speaker 4

49:55

And I had in fact done a MOOC for Udacity on algorithms. And it was a week in a dark room talking at the screen, writing on the little pad. And I didn't know this was happening, but the crew had watched some of the videos while I was in the middle of this, and they're like, something's wrong.

S4

Speaker 4

50:15

You're sort of shutting down. And I think a lot of it was, I'd make jokes and no 1 would laugh. And I felt like the crowd hated me. Now of course there was no crowd, so it wasn't rational.

S4

Speaker 4

50:30

But each time I tried it and got no reaction, it just took the energy out of my performance, out of my presentation.

S3

Speaker 3

50:38

This is such a fantastic metaphor for grad school. Anyway, by working together, we could play off each other and have a-

S4

Speaker 4

50:44

And keep the energy up, because you can't let your guard down for a moment with Charles; he'll just overpower you.

S3

Speaker 3

50:50

I have no idea what you're talking about. But we would work really well together, I thought, and we knew each other, so I knew that we could sort of make it work. Plus I was the associate dean, so they had to do what I told them to do, so we had to make it work. And so it worked out very well, I thought.

S3

Speaker 3

51:03

Well enough that we-

S4

Speaker 4

51:04

With great power comes great power.

S3

Speaker 3

51:06

That's right. And we became Smoov and Curly, and that's when we did the overfitting Thriller video.

S4

Speaker 4

51:15

Yeah, yeah, that's a thing.

S2

Speaker 2

51:17

So can we just, like, Smoov and Curly, where did that come from?

S4

Speaker 4

51:21

So okay, so it happened, it was completely spontaneous.

S2

Speaker 2

51:24

These are nicknames you go by.

S4

Speaker 4

51:25

Yeah, so.

S3

Speaker 3

51:27

It's what the students call us.

S4

Speaker 4

51:28

He was lecturing. So the way that we structure the lectures is 1 of us is the lecturer and 1 of us is basically the student. And so he was lecturing on-

S3

Speaker 3

51:37

The lecturer prepares all the materials, comes up with the quizzes, and then the student comes in not knowing anything. So it was just like being on campus. And I was doing game theory in particular, the Prisoner's Dilemma.

S4

Speaker 4

51:48

Prisoner's Dilemma. And so he needed to set up a little Prisoner's Dilemma grid. So he drew it and I could see what he was drawing. And the Prisoner's Dilemma consists of 2 players, 2 parties, so he decided he would make little cartoons of the 2 of us.

S4

Speaker 4

52:01

And so there were 2 criminals, right, that were deciding whether or not to rat each other out. 1 of them he drew as a circle with a smiley face and a kind of goatee thing, smooth head. And the other 1 with all sorts of curly hair. And he said, this is Smoov and Curly.

S4

Speaker 4

52:18

I said, Smooth and Curly? He said, no, no, Smoov, with a V. It's very important that it have a V. And then the students really took to that.

S4

Speaker 4

52:27

Like they found that relatable.
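
For readers who have not seen 1, the grid being described is the standard 2-player payoff table for the Prisoner's Dilemma. Here is a minimal sketch in Python; the choice labels and payoff values are illustrative, not the ones from the actual lecture.

```python
# Minimal sketch of the 2-player Prisoner's Dilemma grid described above.
# Payoffs are illustrative years in prison (negated, so higher is better).
# The row player is Smoov, the column player is Curly.

payoffs = {
    # (Smoov's choice, Curly's choice): (Smoov's payoff, Curly's payoff)
    ("stay quiet", "stay quiet"): (-1, -1),  # both cooperate: light sentences
    ("stay quiet", "rat out"):    (-9,  0),  # Smoov cooperates, Curly defects
    ("rat out",    "stay quiet"): ( 0, -9),  # Smoov defects, Curly cooperates
    ("rat out",    "rat out"):    (-6, -6),  # both defect: the dilemma
}

for (smoov, curly), (p1, p2) in payoffs.items():
    print(f"Smoov: {smoov:10} | Curly: {curly:10} -> {p1}, {p2}")
```

The dilemma is visible right in the table: ratting out is the better choice for each player no matter what the other does, even though both staying quiet leaves the pair better off.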

S3

Speaker 3

52:28

He started singing Smooth Criminal by Michael Jackson.

S2

Speaker 2

52:31

Yeah, yeah, yeah. And those names stuck.

S4

Speaker 4

52:33

So we now have a video series, an episode, our kind of first actual episode should be coming out today, Smoov and Curly on video, where the 2 of us discuss episodes of Westworld. We watch Westworld and we're like, huh, what does this say about computer science and AI?

S3

Speaker 3

52:51

And we'd never, we did not watch it. I mean, I know it's on season 3 or whatever now. As of this recording, it's on season 3.

S4

Speaker 4

52:57

And I've now watched 2 episodes total.

S3

Speaker 3

52:59

Yeah, I think I watched 3.

S2

Speaker 2

53:01

What do you think about Westworld?

S4

Speaker 4

53:02

2 episodes in, so I can tell you so far, I'm just guessing what's gonna happen next. It seems like bad things are gonna happen with the robot uprising. It's a lot of- Spoiler alert.

S3

Speaker 3

53:12

So I have not, I mean, you know, I vaguely remember a movie existing, so I assume it's related to that.

S4

Speaker 4

53:18

That was more my time than your time, Charles.

S3

Speaker 3

53:20

That's right, because you're much older than I am. I think the important thing here is that it's narrative. It's all about telling a story.

S3

Speaker 3

53:26

That's the whole driving thing. But the idea that they would give these reveries, that they would make people, let them remember the awful things that happened. Who could possibly think that was going to happen? I mean, I don't know.

S3

Speaker 3

53:38

I've only seen the first 2 episodes, or maybe the third 1. I think I've only seen the third 1.

S4

Speaker 4

53:40

You know what it was? You know what the problem is?

S3

Speaker 3

53:42

What?

S4

Speaker 4

53:42

That the robots were actually designed by Hannibal Lecter.

S3

Speaker 3

53:45

That's true.

S4

Speaker 4

53:46

They weren't. So what do you think is going to happen?

S3

Speaker 3

53:50

Bad things. It's clear that things are happening and characters are being introduced and we don't yet know anything. But still, I was just struck by how it's all driven by narrative and story. And there's all these implied things like programming.

S3

Speaker 3

54:01

The programming interface is talking to them about what's going on in their heads, which is, I mean, artistically, it's probably useful to film it that way, but think about how it would work in real life. It just seems very crazy. But there was, we saw in the second episode, there's a screen, you could see things that sort of stayed in the world. It was quite interesting to just kind of ask this question so far. I mean, I assume it veers off into never-never land at some point.

S3

Speaker 3

54:23

But, what is this?

S4

Speaker 4

54:24

So we don't know, we can't answer that question.

S2

Speaker 2

54:25

I'm also a fan of a guy named Alex Garland. He's the director of Ex Machina. And he is the first, I wonder if Kubrick was like this actually, he, like, studies what it would take to program an AI system.

S2

Speaker 2

54:41

Like he's curious enough to go in that direction. On the Westworld side, I felt there was more emphasis on the narratives than on actually asking computer science questions. How would you build this? And how would you debug it?

S3

Speaker 3

54:58

To me, that's the key issue. They were terrible debuggers.

S4

Speaker 4

55:02

Yeah. Well, they said specifically, we make a change and we put it out in the world, and that's bad because something terrible could happen. If you're putting things out in the world and you're not sure whether something terrible is going to happen, your process is probably flawed.

S3

Speaker 3

55:13

I just feel like there should have been someone whose sole job it was to walk around and poke his head in and say, what could possibly go wrong? Just over and over again.

S2

Speaker 2

55:20

I would have loved, and I did watch a lot more, and I'm not giving anything away, I would have loved it if there was an episode where the new intern is debugging a new model or something, and it just keeps failing. And they're like, all right.

S2

Speaker 2

55:34

And then it turns more into an episode of Silicon Valley or something like that, versus these ominous AI systems that are constantly threatening the fabric of this world that's been created. Yeah.

S3

Speaker 3

55:47

Yeah, and you know, this reminds me of something that, so I agree with that, that actually would be very cool, at least, well, for the small percentage of people who care about debugging systems. But the other thing is-

S4

Speaker 4

55:57

Right, Debugging: The Series.

S3

Speaker 3

55:59

It falls into, think of the sequels: Fear of the Debugging. Oh my gosh. Anyway.

S4

Speaker 4

56:04

It's a nightmare show. It's a horror movie.

S3

Speaker 3

56:07

I think that's where we lose people, by the way, early on: the people who either figure out debugging or think debugging is terrible.

S4

Speaker 4

56:13

Where we lose people in computer science.

S3

Speaker 3

56:14

This is part of the struggle versus suffering. You get through it and you kind of get the skills of it or you're just like, this is dumb and this is a dumb way to do anything. And I think that's when we lose people.

S3

Speaker 3

56:23

But, well, I'll leave it at that. But I think that there's something really, really neat about framing it that way. But what I don't like about all of these things, and I love Ex Machina, by the way, although the ending was very depressing.

S2

Speaker 2

56:42

1 of the things I have to talk to Alex about, he says that the thing that nobody noticed he put in is at the end, spoiler alert, the robot turns and looks at the camera and smiles briefly. And to him, his definition of passing the general version of the Turing test, or the consciousness test, is smiling for no 1. Like, not, oh, you know, it's like the Chinese room kind of experiment.

S2

Speaker 2

57:20

It's not always trying to act for others, but just on your own, being able to have a relationship with the actual experience and just take it in. I don't know, he said nobody noticed the magic of it.

S3

Speaker 3

57:32

I have this vague feeling that I remember the smile, but now you've just put the memory in my head, so probably not. But I do think that that's interesting. Although by looking at the camera, you are smiling for the audience, right?

S3

Speaker 3

57:43

You're breaking the fourth wall. It seems, I mean, well, that's a limitation of the medium, but I like that idea. But here's the problem I have with all of those movies, all of them, and I know why it's this way, and I enjoy those movies, and Westworld: they set up the problem of AI as succeeding and then being something we cannot control. But that's not the bad part of AI.

S3

Speaker 3

58:08

The bad part of AI is the stuff we're living through now. It's using the data to make decisions that are terrible. It's not the intelligence that's gonna go out there and surpass us and take over the world or lock us into a room to starve to death slowly over multiple days. It's instead the tools that we're building that are allowing us to make the terrible decisions we would have less efficiently made before, right?

S3

Speaker 3

58:32

You know, computers are very good at making us more efficient, including being more efficient at doing terrible things. And that's the part of the AI we have to worry about. It's not the, you know, true intelligence that we're going to build sometime in the future, probably long after we're around. But, you know, I just, I think that whole framing of it sort of misses the point, even though it is inspiring.

S3

Speaker 3

58:55

And I was inspired by those ideas, right? That I got into this in part because I wanted to build something like that. Philosophical questions are interesting to me, but that's not where the terror comes from, the terror comes from the everyday.

S2

Speaker 2

59:05

And you can construct, it's in the subtlety of the interaction between AI and the human, like with social networks, all the stuff you're doing with interactive artificial intelligence. But I feel like HAL 9000 came a little bit closer to that in 2001: A Space Odyssey, because it felt like a personal assistant. It felt closer to the AI systems we have today and the real things we might actually encounter, which is over-relying in some fundamental way on our, like, home assistants or on social networks, offloading too much of ourselves onto things that require internet and power and so on, and thereby becoming powerless as standalone entities.