Rajat Monga: TensorFlow | Lex Fridman Podcast #22

1 hour 10 minutes 57 seconds

Speaker 1

01:00:00

Or is there still a balance to strike? I mean, there was the deadline: you had the Dev Summit, and it came together incredibly. It looked like there were a lot of moving pieces and so on. So did that deadline make people rise to the occasion, releasing TensorFlow 2.0 Alpha?

Speaker 1

01:00:18

I'm sure that was done last minute as well. I mean, like up to the last point.

Speaker 2

01:00:25

Again, it's one of those things where you need to strike a good balance. There's some value that deadlines bring; they do bring a sense of urgency to get the right things together. Instead of getting the perfect thing out, you need something that's good and works well.

Speaker 2

01:00:41

And the team definitely did a great job in putting that together. So I was very amazed and excited by everything, how that came together. That said, across the year, we try not to put out official deadlines. We focus on key things that are important, figure out how much of it's important.

Speaker 2

01:01:01

And we are developing in the open; internally and externally, everything's available to everybody, so you can pick and look at where things are. We do releases at a regular cadence, so it's fine if something doesn't necessarily make it this month; it'll end up in the next release, in a month or two. And that's okay, but we want to keep moving as fast as we can in these different areas. Because we can iterate and improve on things, sometimes it's okay to put things out that aren't fully ready.

Speaker 2

01:01:32

We'll make sure it's clear that, okay, this is experimental, but it's out there if you want to try it and give feedback. That's very, very useful. I think that quick cycle and quick iteration is important. That's what we often focus on, rather than a hard deadline by which you get everything in.
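
(An illustration, not from the episode: in TensorFlow, the "out there but experimental" convention typically shows up as an experimental namespace, so you opt in knowingly. A minimal sketch, assuming a TensorFlow 2.x-era install; the specific option is only an example and was later renamed.)

    import tensorflow as tf

    # Stable API surface: covered by the 2.x compatibility promise.
    dataset = tf.data.Dataset.range(10).batch(2)

    # Experimental API surface: shipped early for feedback and may change
    # between releases; the "experimental" prefix makes that explicit.
    options = tf.data.Options()
    options.experimental_deterministic = False  # not-yet-stable knob (illustrative)
    dataset = dataset.with_options(options)

    for batch in dataset:
        print(batch.numpy())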

Speaker 1

01:01:49

It's 2.0, is there pressure to make that stable? Or, for example, WordPress 5.0 just came out, and there was no pressure to; it was a lot of big updates that were delivered way too late. But they said, okay, well, we're gonna release a lot of updates really quickly to improve it.

Speaker 1

01:02:09

Do you see TensorFlow 2.0 in that same kind of way? Or is there this pressure, once it hits 2.0, once you get to the release candidate and then you get to the final, that that's going to be the stable thing?

Speaker 2

01:02:32

So it's going to be stable, just like 1.x was, where every API that's there is going to remain and work. It doesn't mean we can't change things under the covers. It doesn't mean we can't add things. So there's still a lot more for us to do, and we'll continue to have more releases.

Speaker 2

01:02:41

So in that sense, there's still, I don't think we'll be done in like 2 months when we release this.
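
(An illustration of what that stability promise looks like in practice, not something walked through in the episode: TensorFlow 2.x keeps the 1.x-style API available under tf.compat.v1, so older graph code keeps running while the internals can change. A minimal sketch, assuming a TensorFlow 2.x install.)

    import tensorflow as tf

    # 2.x default: eager execution, the API surface 2.0 commits to keeping.
    print(tf.matmul([[1.0, 2.0]], [[3.0], [4.0]]).numpy())   # [[11.]]

    # 1.x-style graph code still runs through the compat.v1 module.
    g = tf.Graph()
    with g.as_default():
        a = tf.compat.v1.placeholder(tf.float32, shape=(None, 2))
        b = tf.matmul(a, [[3.0], [4.0]])
    with tf.compat.v1.Session(graph=g) as sess:
        print(sess.run(b, feed_dict={a: [[1.0, 2.0]]}))      # [[11.]]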

Speaker 1

01:02:46

I don't know if you can say, but, you know, there are no external deadlines for TensorFlow 2.0, but are there internal deadlines, artificial or otherwise, that you try and just set for yourself? Or is it whenever it's ready?

Speaker 2

01:02:58

So we want it to be a great product, and that's a big, important piece for us.

Speaker 2

01:03:09

TensorFlow's already out there. We have 41 million downloads for 1.x, so it's not like we have to have this... Yeah, exactly. So it's not like...

Speaker 2

01:03:18

A lot of the features that we've, you know, really been polishing and putting together are already there. We don't have to rush it out just because. So in that sense, we want to get it right and really focus on that. That said, we have said that we are looking to get this out in the next few months, in the next quarter, and as far as possible, we'll definitely try to make that happen.

Speaker 1

01:03:39

Yeah, my favorite line was, spring is a relative concept, and I love it. Yes. Spoken like a true developer.

Speaker 1

01:03:47

So, something I'm really interested in, from your previous line of work: before TensorFlow, you led a team at Google on search ads. I think this is a very interesting topic on every level, on a technical level, because at their best, ads connect people to the things they want and need. And at their worst, they're just these things that annoy the heck out of you to the point of ruining the entire user experience of whatever you're actually doing. So they have a bad rep, I guess.

Speaker 1

01:04:23

And on the other end, this connecting of users to the things they need and want is a beautiful opportunity for machine learning to shine: huge amounts of personalized data that you can map to the thing they actually want, so they won't get annoyed. So what have you learned from this, at Google, which is leading the world in this aspect? What have you learned from that experience?

Speaker 1

01:04:47

And what do you think is the future of ads, to take you back to that?

Speaker 2

01:04:52

Yes, it's been a while, but I totally agree with what you said. I think search ads, the way it was always looked at, and I believe it still is, is as an extension of what search is trying to do.

Speaker 2

01:05:08

The goal is to make the world's information accessible. With ads, it's not just information; it may be products or other things that people care about. And so it's really important for them to align with what the users need. And in search ads, there's a minimum quality level before that ad would be shown.

Speaker 2

01:05:32

If we don't have an ad that hits that quality bar, it will not be shown, even if we have one; okay, maybe we lose some money there. That's fine. That is really, really important. And I think that is something I really liked about being there.

Speaker 2

01:05:45

Advertising is a key part. I mean, as a model, it's been around for ages, right? It's not a new model. It's been adapted to the web and became a core part of search and many other search engines across the world.

Speaker 2

01:06:02

Like I said, there are aspects of ads that are annoying; if I go to a website and it just keeps popping an ad in my face, not letting me read, that's clearly going to be annoying. So I do hope we can strike that balance between showing a good ad, where it's valuable to the user and provides the monetization to the service. This might be search, this might be a website; all of these do need the monetization to provide that service. But it has to be done with that good balance, rather than showing just some random stuff that's distracting versus showing something that's actually valuable.

Speaker 1

01:06:50

So do you see it, moving forward, continuing to be a model that funds businesses like Google, as a significant revenue stream? Because one of the most exciting, but also limiting, things about the internet is that nobody wants to pay for anything. And advertisements, again, at their best, are actually really useful and not annoying.

Speaker 1

01:07:18

Do you see that continuing and growing and improving? Or do you see more Netflix type models where you have to start to pay for content?

Speaker 2

01:07:29

I think it's a mix. I think it's going to take a long while for everything to be paid on the internet, if at all. Probably not.

Speaker 2

01:07:36

I mean, I think there's always going to be things that are sort of monetized with things like ads. But over the last few years, I would say we've definitely seen that transition towards more paid services across the web, and people are willing to pay for them because they do see the value. I mean, Netflix is a great example. I mean, we have YouTube doing things.

Speaker 2

01:07:56

People pay for the apps they buy. More people, I find, are willing to pay for newspaper content, for the good news websites across the web. That wasn't the case even a few years ago, I would say. And I just see that change in myself as well, and just lots of people around me.

Speaker 2

01:08:15

So definitely hopeful that we'll transition to that mixed model where maybe you get to try something out for free, maybe with ads. But then there's a more clear revenue model that sort of helps go beyond that.

Speaker 1

01:08:30

So, speaking of revenue, how is it that a person can use a TPU in Google Colab for free? So what's the... I guess the question is, what's the future of TensorFlow in terms of empowering, say, a class of 300 students?

Speaker 1

01:08:52

And I'm asked by MIT, what is going to be the future of them being able to do their homework in TensorFlow? Like, where are they going to train these networks, right? What's that future look like with TPUs, with cloud services and so on?

Speaker 2

01:09:08

I think there are a number of things there. I mean, TensorFlow being open source, you can run it wherever. You can run it on your desktop, and desktops always keep getting more powerful, so maybe you can do more.

Speaker 2

01:09:19

My phone is like, I don't know how many times more powerful than my first desktop.

Speaker 1

01:09:23

You'll probably train it on your phone, though.

Speaker 2

01:09:26

Right, so in that sense, the power you have in your hand is a lot more. Clouds are actually very interesting from, say, a student's or a course's perspective, because they make it very easy to get started. I mean, Colab, the great thing about it is you go to a website, and it just works.

Speaker 2

01:09:45

No installation needed, nothing to, you know, you're just there, and things are working. That's really the power of cloud as well. And so I do expect that to grow. Again, Colab is a free service.

Speaker 2

01:09:57

It's great to get started, to play with things, to explore things. That said, with free you can only get so much. So just like we were talking about, free versus paid. Yeah, there are services you can pay for and get a lot more.
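
(A sketch of what this looks like from the student side, assuming a recent TensorFlow 2.x in a Colab notebook with a TPU runtime selected; the exact setup calls have shifted across versions, so treat this as illustrative.)

    import tensorflow as tf

    # With Runtime > Change runtime type > TPU selected in Colab,
    # the TPU can be discovered and initialized like this.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)

    # Build the model under the strategy scope so it trains on the TPU cores.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])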

Speaker 1

01:10:15

Great. So if I'm a complete beginner interested in machine learning and TensorFlow, what should I do?

Speaker 2

01:10:22

Probably start with going to our website and playing there.

Speaker 1

01:10:24

So just go to TensorFlow.org and start clicking on things?

Speaker 2

01:10:26

Yep. Check our tutorials and guides. There's stuff you can just click there and go to a Colab and do things. No installation needed.

Speaker 2

01:10:32

You can get started right there.
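
(For a sense of what "get started right there" amounts to: the beginner tutorials are roughly of this shape, a few lines of Keras in a Colab cell training a small classifier on MNIST. This sketch assumes a TensorFlow 2.x runtime.)

    import tensorflow as tf

    # A small standard dataset that ships with Keras.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    # A tiny fully connected classifier.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.fit(x_train, y_train, epochs=3)
    model.evaluate(x_test, y_test)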

Speaker 1

01:10:34

Okay, awesome. Rajat, thank you.

Speaker 2

01:10:45

Thank you.