3 hours 19 minutes 51 seconds
Speaker 1
00:00
If this is a super intelligence, if it's folding proteins and analyzing all data sets and all whatever they give it access to, how can we be certain that it's not gonna figure out how to get itself out of the cloud, how to store itself in other mediums: trees, the optic nerve, the brain?
Speaker 2
00:20
You know what I mean? We don't know that.
Speaker 1
00:22
We don't know that it won't leap out and like start hanging. Like, and then at that point, now we do have the wildfire. Now you can't stop it.
Speaker 1
00:29
You can't unplug it. You can't shut your servers down because it left the box, left the room using some technology you haven't even discovered yet. How fucking cool would that be for like the men in black to come to me like, listen, I need you to infiltrate the fucking comedy scene.
Speaker 3
00:47
The following is a conversation with Duncan Trussell, a standup comedian, host of the Duncan Trussell Family Hour podcast, and one of my favorite human beings. I've been a fan of his for many years, so it was a huge honor and pleasure to meet him for the first time and to sit down for this chat. This is the Lex Fridman Podcast.
Speaker 3
01:09
To support it, please check out our sponsors in the description. And now, dear friends, here's Duncan Trussell. Nietzsche has this thought experiment called eternal recurrence, where you get to relive your whole life over and over and over and over, and I think it's a way to bring to the surface of your mind the idea that every single moment in your life matters. It intensely matters, the bad and the good.
Speaker 3
01:36
And he kind of wants you to imagine that idea, that every single decision you make throughout your life, you repeat over and over and over, and he wants you to respond to that. Do you feel horrible about that or do you feel good about that? And you have to think through this idea in order to see where you stand in life. What is your relationship like with life?
Speaker 3
01:56
I actually wanna read the way he first introduces that concept for people who are not familiar. What if some day or night a demon, by the way, he has a demon introduce this thought experiment. What if some day or night a demon were to steal after you into your loneliest loneliness and say to you, quote, this life as you now live it and have lived it, you will have to live once more and innumerable times more and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small and great in your life will have to return to you, all in the same succession and sequence. Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus?
Speaker 3
02:46
Or have you once experienced a tremendous moment when you would have answered him, you are a god and never have I heard anything more divine? So are you terrified or excited by such a thought experiment when you apply it to your own life? Excited. Excited.
Speaker 3
03:02
Oh. Even the dark stuff.
Speaker 1
03:04
Oh yeah, for sure. Definitely. I mean, also, that thing you're talking about, he kind of leaves out, maybe on purpose, because the thought experiment starts falling apart a little bit: the amnesia between each loop. So, you know, the whole thing gets wiped.
Speaker 1
03:22
Now, if the amnesia wasn't there, and yet somehow you were witnessing the non-autonomy implicit in what he's talking about, so you have to kind of watch yourself go through this rotten loop, then, yeah, that's a description. There's probably
Speaker 3
03:36
a boredom that comes into that. So you don't experience everything anew. Exactly.
Speaker 3
03:41
So the bad stuff, the good stuff, the newness of it is really important.
Speaker 1
03:45
That's it, yeah. This is the, in Hades, when you die, there's a river, I think it's called Lethe, you ever heard of this? L-E-T-H-E, you drink from it and you don't remember your past lives.
Speaker 1
03:57
And then when you're reborn, it's fresh and you don't have to, I mean, just think of like the amount of psychological help you would need to get over all the bullshit that happened in prior lives. You know what I mean? Can you imagine if you're still resentful of something someone did to you in the 14th century? But it would compound.
Speaker 3
04:18
Well, if you repeat the same thing over and over and over, there would be no difference. Maybe you would start to appreciate the nuances more, like when you watch the same movie over and over and over. Maybe you'll get to actually let go of this idea of all the positive possibilities that lay before you, but actually enjoy the moment much more.
Speaker 3
04:41
If you remember that you've lived this life a thousand times, all the little things, the way somebody smiles, or if you've been abused, the pain of it, the suffering, the down that you feel, the experience of sadness, depression, fear, all that kind of stuff, you get to really, you get to also appreciate that that's part of life, it's part of being alive.
Speaker 1
05:02
Now, also in his experiment, if I was gonna, and I love the experiment from the perspective of just where technology is now and simulation theory and stuff like that, but in that thought experiment, if this rotten demon immediately killed you, then within that, it's a little more horrifying because even in the, first of all, you're trusting a fucking demon.
Speaker 3
05:28
Why are you talking to a demon? Let's start there.
Speaker 1
05:30
Yeah, because that is gonna be, even before I get into the metaphysics and the implications and where is this life stored? Where's the loop stored? I mean, are we talking about some kind of unchanging data set or something?
Speaker 1
05:42
For that, you're like, why is there a fucking talking demon in my room trying to freak me out? You're gonna wanna autopsy the demon. Can you catch it? Does this apply to you, demon?
Speaker 1
05:50
And again, obviously, it's a fucking thought experiment. Nietzsche would be annoyed by me. But I think, like, you would still be able to entertain the joy. You'd have the joy of not knowing what's around the corner. You know, it's still not like you know what's coming just because the demon said it's some kind of loop. In other words, the idea of being damned to your past decisions, it doesn't even work, because you can't remember what decisions you're about to make. So from that perspective also, I think I'd be happy about it, or I would just think, oh, cool.
Speaker 1
06:22
I mean, it's a good story. I'm gonna tell people about how this.
Speaker 3
06:24
I wonder what the demon would actually look like in real life, because I suspect it would look like a charming, like a friend. Wouldn't they be a loved one? Wouldn't the demon come to you through the mechanism, through the front door of love, not through the back door of evil, like malevolent manipulation?
Speaker 1
06:44
Sure, I mean, if it's the truth, if it's the truth, then that's, whether it's love or not, it's still good, fundamentally.
Speaker 3
06:53
I do like the idea of the memory replay. I remember I went to a Neuralink event a few years ago and got to hang out with Elon. I remember how visceral it is that there's like a pig with a Neuralink in it, and you're talking about memory replays as a future, maybe far-future possibility, and you realize, well, this is a very meaningful moment in my life, this could be a replay.
Speaker 3
07:21
Like, of all the things you replay, it's probably, you know, there's certain magical moments in your life, whatever it is, certain people you've met for the first time, or certain things you've done for the first time with certain people, or just an awesome thing you did. And I remember just saying to him, like, I would probably want to replay this, this moment, and it just seemed very kind of, I mean, there's a recursive nature to it, but it seemed very real that this is something you would want to do, that the richness of life could be experienced through the replay. That's probably where it's experienced the most. Like, you could see life as a way to collect a bunch of cool memories, and then you get to sit back in your nice VR headset and replay the cool ones.
Speaker 1
08:10
That's right. This is, in Buddhism, you know, the idea that, like, I struggle with is that there's a possibility of not reincarnating, of not coming back. That's the idea.
Speaker 1
08:22
Like, this is suffering here, suffering is caused by attachment. And so if you, like, revise the idea of reincarnation or Nietzsche's loop and look at it from, could this be possible or how would this be possible technologically, then to me it makes a lot of sense. Like I've been thinking a lot about this very thing and Nietzsche's idea connecting to it. I had this like, it sounds so dumb, but I was at the dentist getting nitrous oxide, high as a fucking kite, man.
Speaker 1
08:59
And I had this idea. I was thinking about data. I was thinking like, man, probably, if I had to bet, there's some energetic form that we're not aware of that for a super advanced technology would be as detectable as like starlight, but something that we just don't even know what it is. Quantum turbulence, who the fuck knows, fill in the blank, whatever that X may be.
Speaker 1
09:24
But assuming that exists, that somehow data, even the most subtle things, the tiniest movements, whatever it may be, the emanations of your neurological process, energetically, whatever it may be, is radiating out into space-time, then what if like the James Webb version of this for some advanced civilization is not that they're like looking at the nebula or whatever, but they're actually able to peer into the past and via some bizarre technology recreate whatever life, simulate whatever life was happening there just by decoding that quantum energy, whatever it is. I'm only saying quantum because it's what dumb people say when they don't know. You just say quantum, I don't know. But you're decoding that.
Speaker 1
10:09
So meaning, in simulation theory, one of the big questions that pops up is why, and are we in one? And Elon has talked about, well, it's probably more probable that we're in one than that we're not. In which case, what you're talking about is actually happening. That loop you're talking about, it's we've decided to be here.
Speaker 1
10:32
This, of all the things, we decided this one. Oh, let's do that one again. I wanna do that one. Let's try, let's do that. I love thinking about this, because I love my family.
Speaker 1
10:43
And it makes sense to me that if I'm going to replay some life or another, it's definitely gonna be this 1, with my kids, my wife, with all the bullshit that's gone along with it, I'm still gonna wanna come back. So in Buddhism, that's attachment.
Speaker 3
10:57
Yeah, but you weren't the one, or you're saying that you're the main player, you're not the NPC.
Speaker 1
11:01
Well, I think we're dealing with all NPCs at this point. I mean, depending on how you wanna, like very, I would say very advanced NPCs, like incredibly advanced NPCs compared to Fallout or something. You know, we've got a lot of conversation options happening here.
Speaker 1
11:18
There's not like four things you can pick from.
Speaker 3
11:20
Yeah, there's a whole illusion of free will that's happening. We really do, depending on where you are in the world, feel like you're free to decide any trajectory in your life that you want.
Speaker 1
11:32
Which is pretty funny, right?
Speaker 3
11:33
For an NPC, it's pretty, it's nice.
Speaker 1
11:35
Well, and you're gonna want that. If we're making a video game, you do wanna give your NPCs the illusion of free will because it's gonna make interactions with them that much more intense.
Speaker 3
11:47
Yeah. So I wonder on the path to that, how hard is it to create, this is sort of the Carmack question of a realistic virtual world that's as cool as this one. Not fully realistic, but sufficiently realistic that it's as interesting to live in. Because we're gonna create those worlds on the path to creating something like a simulation.
Speaker 3
12:15
Like long, long, long before. It'll be virtual worlds where we wanna stay forever because they're full of that balance of suffering and joy, of limitations and freedoms and all that kind of stuff. A lot of people think like in the virtual world, I can't wait to be able to, I don't know, have sex with anybody I want or have anything I want. But I think that's not gonna be fun.
Speaker 3
12:42
You want the limitations, the constraints. So you have to battle for the things you want.
Speaker 1
12:47
Okay, but, okay, but great video games. One of my favorite video game memories was like I started playing World of Warcraft in its original incarnation. And I didn't even know that you were gonna have flying mounts.
Speaker 1
13:01
Like I didn't even know. So I've been running around dealing with all the encumbrances of like being an undead warlock that can't fly. But then all of a sudden, holy shit, there's flying mounts. And now the world you've been running around not flying, you're seeing it from the top down.
Speaker 1
13:16
It was just really cool, like, whoa, I could do this now. And then that gets boring. But a really well-designed game has a series of these, I don't know what you call it, extra abilities that kind of unfold and produce novelty. And then eventually you just accept it, you take it for granted, and then another novelty appears.
Speaker 3
13:39
So those extra abilities are always balanced with the limitations, the constraints they run up against. Because a well-balanced video game, the challenge, the struggle matches the new ability.
Speaker 1
13:51
Yeah, and sometimes causes problems on its own. I mean, and so to go back to this universe, this simulation, it's really designed like a pretty awesome video game if you look at it from the perspective of history. I mean, people were on horses.
Speaker 1
14:08
They didn't know that there were going to be bullet trains. They didn't know that you could get in a car and drive across the country in a few days. That would have sounded ridiculous. We're doing that now.
Speaker 1
14:18
And even in our own lifespan, think about how long VR goggles have existed. Like the ones that you could just buy at Best Buy. I had the original Oculus Rift, the fucking puke machine. You put that thing on, I gave it to my friend, he went and vomited in my driveway.
Speaker 1
14:36
And people were making fun of it. They were saying, this isn't gonna catch on. It's too big, it's unwieldy, the graphics suck. And then look at where it's at now.
Speaker 1
14:45
And that's going to keep, that trajectory is gonna keep improving. So yeah, I think that we are dealing with what you're talking about, which is novelty met with more problems, met with novelty.
Speaker 3
14:59
Yeah, I wonder why VR is not more popular. I wonder what is going to be the magic thing that really convinces a large fraction of the world to move into the virtual world. I suppose we're already there in the 2D screen of Twitter and social media and that kind of stuff.
Speaker 3
15:19
And even video games, there's a lot of people that get a big sense of community from video games. But like, it doesn't feel like you're living there. Like, it's like, bye mom, I'm going to this other world. Or like you leave your girlfriend to go get your digital girlfriend.
Speaker 1
15:40
That's gonna be a problem.
Speaker 3
15:41
There's less jealousy in the digital world. Maybe there should be a lot of jealousy in the digital world, because a little jealousy is probably good for relationships. Even in the digital world.
Speaker 3
15:54
So you're gonna have to simulate all of that kind of stuff. But I wonder what the magic thing is that says, I want to spend most of my days inside the virtual world.
Speaker 1
16:01
Well, clearly it's gonna be something we don't have yet. I mean, strapping that damn thing on your face still feels weird. It's heavy.
Speaker 1
16:09
If you're, depending on what gear you're using, sometimes light can leak in. There's just, you gotta recharge it. It's hyper limited. And then, so yeah, it's gonna have to be something that simulates taste, smell.
Speaker 3
16:27
You think taste and smell are important, touch?
Speaker 1
16:29
I do, yeah. I can't
Speaker 3
16:31
just do, you know, in World War II, you would write letters. You could still, don't you think you can convey love with just words?
Speaker 1
16:38
For sure, but I think for what you're talking about to happen, it has to be fully immersive. Like you, so that it's not that you feel like you're walking because it looks like you're walking, but that your brain is sending signals telling your body that you're walking, that you feel the wind blowing in your face, not because of some, I don't know, fan or something that it's connected to, but because somehow it's figured out how to hack into the human brain and send those signals minus some external thing. Once that happens, I'd say we're gonna see a complete radical shift in everything.
Speaker 3
17:18
See, I disagree with you. I don't know if you've seen the movie, Her.
Speaker 1
17:21
Yeah.
Speaker 3
17:22
I think you can go to another world where a digital being lives in the darkness and all you hear is a Scarlett Johansson voice talking to you, and she lives there or he lives there, your friend, your loved one, and all you have is voice and words. And I think that could be sufficient to pull you into that world, where you look forward to that moment all day. You never wanna leave that darkness.
Speaker 3
17:49
Just closing your eyes and listening to the voice. I think those basic mediums of communication are still enough. Language is really, really powerful. And I think the realism of touch and smell and all that kind of stuff is not nearly as powerful as language.
Speaker 3
18:04
That's what makes humans really special: our ability to communicate with each other. The sense of deep connection we get is through communication. Now, that communication could involve touch. Hugging feels damn good.
Speaker 3
18:20
You see a good friend, you hug. That's one of the big things with doing COVID with Rogan. When you see him, there's a giant hug coming your way. And that makes you feel like, yeah, this is, this feels great.
Speaker 3
18:33
But I think that can be just with language.
Speaker 1
18:36
I think for a lot of people, that's true. But we're talking like massive adoption of a technology by the world. And if language was just enough, we wouldn't be selling TVs.
Speaker 1
18:55
People would be reading. They wanna watch, they wanna see. But I agree with you, man. When you're getting absorbed into a book, and especially if you've got, I think a lot of us went through a weird dark ages when it came to reading.
Speaker 1
19:09
Like when I was a kid, and there wasn't the option for these hypno-rectangles, that's just what you did. There wasn't even anything special about it.
Speaker 3
19:17
What's a hypno-rectangle? Your phone.
Speaker 1
19:19
You know, it was like, you didn't, when that gravity well.
Speaker 3
19:21
Hypno-rectangle, gravity well. It is. Attention gravity well, yeah.
Speaker 1
19:26
That when we weren't feeling the pull of these things all the time, you would just read. And you weren't patting yourself on the back about reading. You just, that's what you had.
Speaker 1
19:34
You had that and you had like 8 channels on the TV and a shitty VCR. So, you know, then a lot of people stop reading because of these things, you know? Or they think they're reading because they're on, they are technically reading. But, you know, when you return to reading after a pause, whoa, and you realize how powerful this simulator is when it's given the right code of language, whoa, holy shit, it's incredible.
Speaker 1
19:59
I mean, It's like, again, it's the most embarrassing kind of like, whoa, wow, what do you know? Books are really good. But still, if you've been away from it for a while and you revisit it, I know what you're saying. I just think probably it's not gonna go in that direction, even though you are right.
Speaker 1
20:18
Ultimately, I think you're right.
Speaker 3
20:19
Yeah, because our brain is, the imagination engine we have is able to fill in the gaps better than a lot of graphics engines could.
Speaker 1
20:27
Right.
Speaker 3
20:27
And so if there's a way to incentivize humans to become addicted to the use of imagination. It's like, you know, that's the downside of things like porn that remove the need for imagination for people. And in that same way, video games that are becoming ultra realistic, you don't have to imagine anything.
Speaker 3
20:45
And I feel like the imagination is a really powerful tool that needs to be leveraged. Because to simulate reality sufficiently realistically, that we wouldn't be, that we would be perfectly fooled, I think, technically is very hard. And So I think we need to somehow leverage imagination.
Speaker 1
21:03
Sure. I mean, yeah. I mean, this is what I love and is so creepy about the current AI chatbots: it's the relationship between you and the thing, and the way that it can, via whatever the algorithms are, and by the way, I have no idea how these things work, you do, I just, you know, speculate about what they mean or where it's going. But there's something about the relation between the consumer and the technology. And when that technology starts shifting according to what it perceives that the consumer's looking for or isn't looking for, then at that point, I think that's where you run into the, you know, yeah, it doesn't matter if the reality that you're in is like photo-realism for it to be sticky and immersive. It's when the reality that you're in, via cues you might not even be aware of, or via your digital imprint on Facebook or wherever, when it's warping itself to that to seduce you, holy shit, man, that's where it becomes something alien. Something, you know, when you're reading a book, obviously the book is not shifting according to its perception of what parts of the book you like.
Speaker 1
22:26
But when you imagine that, imagine a book that could do that, a book that could sense somehow that you're really enjoying this character more than another, you know? And depending on the style of book, kills that fucking character off or lets that character continue. I mean, that to me is sort of where AI and VR, when those two things come together, whoa, man, that's where you really are gonna find yourself in a Skinner box, you know?
Speaker 3
22:58
So the dynamic storytelling that senses your anxiety and tries to, there's like this, in psychology, this arousal curve. So there's a dynamic storytelling that keeps you sufficiently aroused in terms of, not sexually aroused, like in terms of anxiety, but not too much where you freak out. It's this perfect balance where you're always like on edge, excited, scared, that kind of stuff.
Speaker 3
23:23
And the story unrolls. It breaks your heart to where you're pissed, but then it makes you feel good again. And it finds that balance. Yeah.
Speaker 3
23:31
The chatbots scare you though? I'd love to sort of hear your thoughts about where they are today. Because there is a different perspective we have on this thing. Because I do know and I'm excited about a lot of the different technologies that feed AI systems, that feed these kind of chatbots.
Speaker 3
23:54
And you're more a little bit on the consumer side, you're a philosopher of sorts. They're able to interact with AI systems, but also able to introspect about the negative and the positive things about those AI systems. There's that story with a Google engineer
Speaker 1
24:11
saying that. I had him on my podcast, Blake Lemoine.
Speaker 3
24:13
What was that like? What was your perspective of that, looking at that as a particular example of a human being being captivated by the interactions with an AI system?
Speaker 1
24:27
Well, number one, when you hear that anyone is claiming that an AI has become sentient, you should be skeptical about that. I mean, this is a good thing to be skeptical about. And so, you know, initially when I heard that, I was like, ah, you know, it's probably just, who knows, somebody is a little confused or something.
Speaker 1
24:47
When you're talking to him, you realize, oh, not only is he not confused, he's also open to all possibilities. He doesn't seem like he's super committed, other than the fact that he's like, this is my experience, this is what's happening, this is what it is. So to me, there's something really cool about that, which is like, oh shit, I don't get to lean into, I'm not quite sure your perceptual apparatus is necessarily, like, I don't, you know. In the UFO community, I think, I've just learned this term, instead of gaslighting it's called swamp-gassing, which is, you know what I mean, people have this experience and they're like, it was swamp gas, you didn't see the thing. And you know, skeptical people, we have that tendency.
Speaker 1
25:33
If you hear an anomalous experience, your first thought, more than likely, is gonna be really, it could have been this or that or whatever. So to me, he seems really reliable, friendly, cool, and it doesn't really seem like he has much of an agenda. Like, you know, going public about something happening at Google is not a great thing if you want to keep working at Google. You know, it's a, it's, I don't know what benefit he's getting from it necessarily.
Speaker 1
26:03
But all that being said, the other thing that culturally was interesting, and is interesting about it, is the blowback he got, the passionate blowback from people who hadn't even looked into what LaMDA is, or what he was saying LaMDA is, which they were like saying, you're talking about, and you should have them on your show
Speaker 3
26:27
actually, but. There's complexity on top of complexities. For me personally, from different perspectives, I also, and sorry if I'm interrupting your flow.
Speaker 3
26:35
Please interrupt,
Speaker 1
26:36
it's a podcast.
Speaker 3
26:38
And well, we're having multiple podcasts in multiple dimensions, and I'm just trying to figure out which one we wanna plug into. Because I know how a lot of the language models work, and I work closely with people that really make it their life journey to create these NLP systems; they're focused on the technical details. Like a carpenter working on Pinocchio, crafting the different parts of the wood.
Speaker 3
27:06
They don't understand when the whole thing comes together, there's a magic that can fill the thing. I definitely know the tension between the engineers that create these systems and the actual magic that they can create, even when they're dumb. I guess that's what I'm trying to say. What the engineers often say is like, well, these systems are not smart enough to have sentience or to have the kind of intelligence that you're projecting onto it.
Speaker 3
27:36
It's pretty dumb, it's just repeating a bunch of things that other humans have said and stitching them together in interesting ways that are relevant to the context of the conversation. It's not smart.
Speaker 1
27:45
It
Speaker 3
27:45
doesn't know how to do math.
Speaker 1
27:47
To address that specific critique from a non-programming person's perspective, he addressed this on my podcast, which is, okay, what you're talking about there, the server that's filled with all the whatever it is, what people have said, the repository of questions and responses and the algorithm that weaves those things together to produce it, using some crazy statistical engine, which is a miracle in its own right. They can, like, imitate human speech with no sentience. I mean, I'm honestly not sure what's more spectacular, really, the fact that they figured out how to do that minus sentience or the thing suddenly having, what is more spectacular here?
Speaker 1
28:32
Both occurrences are insane, which by the way, when you hear people be like, it's not sentient, it's like, okay, so it's not sentient. So now we have this hyper manipulative algorithm that can imitate humans, but is just code and is like hacking humans via their compassion. Holy shit, that's crazy too. Both versions of it are nuts.
Speaker 1
28:59
But to address what you just said, he said that's the common critique, is people are like, no, you don't understand. It's just gotten really good at grabbing shit from the database that fits with certain cues and then stringing them together in a way that makes it seem human. He said that's not when it became awake. It became awake when a bunch of those repositories, a bunch of the chatbots were connected together.
Speaker 1
29:24
That LaMDA is sort of an amalgam of all the Google chatbots, and that's when the ghost appeared in the machine, via the complexity of all the systems being linked up. Now, I don't know if that's just like turtles all the way down or something, I don't know. But I liked what he said, because I like the idea of thinking, man, if you get enough complexity in a system, does it become like the way a sail catches wind, except the wind that it's catching is sentience? And if sentience is truly embodied, it's a neurological byproduct or something, then the sail isn't catching some as-of-yet unquantified disembodied consciousness, but it's catching our projections in a way that, it's gone from being, it's a projection sail.
Speaker 1
30:15
And then at that point, is there a difference? Even if the technology is just a temporary place that our sentience is living while we're interacting with it.
Speaker 3
30:27
Yeah, there's some threshold of complexity where the sail is able to pick up the wind of the projections. And it pulls us in, it pulls the human, it pulls our memories in, it pulls our hopes in, all of it, and it's able to now dance together with those hopes and dreams and so on, like we do in that regular conversation. His reports, whether true or not, whether representative or not, it really doesn't matter because to me it feels like this is coming for sure.
Speaker 3
30:56
So these kinds of experiences are going to be multiplying. The question is at what rate, and who gets to control the data around those experiences, the algorithm about when you turn that on and off. Because that kind of thing, as I told you offline, I'm very much interested in building those kinds of things, especially in the social media context. And when it's in the wrong hands, I feel like it could be used to manipulate a large number of people in a direction that has too many unintended consequences.
Speaker 3
31:35
I do believe people that own tech companies want to do good for the world. But as Solzhenitsyn has said, the only way you could do evil at a mass scale is by believing you're doing good. Yeah. And that's certainly the case with tech companies as they get more and more power.
Speaker 3
31:57
And there's a kind of an ethic of doing good for the world. They've convinced themselves that they're doing good. And now you're free to do whatever you want. Because you're doing good.
Speaker 1
32:07
You know who else thought he was doing good for the world? Mythologically, Prometheus. He brings us fire, pisses off the fucking gods, steals fire from the gods.
Speaker 1
32:15
You know, talk about an upgrade to the simulation, fire, that's a pretty great fucking upgrade that does fit into what you were saying. We get fire, but now we've got weapons of war that have never been seen before. And I think that the tech companies are much like Prometheus in the sense that the myth, at least the story of Prometheus, the implication is fire was something that was only supposed to be in the hands of the immortals, of the gods. And now, sentience is similar.
Speaker 1
32:50
It's fire, and it's only supposed to be in the hands of God. So yeah, if we're gonna look at the archetype of the thing, in general, when you steal this shit from the gods, and obviously I'm not saying the tech companies are literally stealing sentience from God, which would be pretty badass, you can expect trouble. You could expect trouble.
Speaker 1
33:11
And you know, this is what's really, to me, one of the cool things about humans is, yeah, but we're still gonna do it. That's what's cool about humans. I mean, we wouldn't be here today if the first person to discover fire, assuming there was just one person who was gonna discover fire, which obviously would never happen, was like, it's gonna burn a lot of people. Or if the first people who started planting seeds were like, you know this is going to lead to capitalism, you know this is going to lead to the industrial revolution. They weren't thinking that; they just didn't want to go in the woods to forage.
Speaker 1
33:41
So this is what we do. And I agree with you, it's like our Game of Thrones "winter is coming." It's happening. And the tech companies, the hubris. Another way to piss off the gods is hubris.
Speaker 1
33:54
So the tech companies, I don't know if it's like typical hubris. I don't think they're walking around thumping their chest or whatever. But I do think that the people who are working on this kind of superintelligence have made a really terrible assumption, which is that once it goes online, and once it gets access to all the data, it's not going to find ways out of the box. Like, we think it'll stay in the server. How do we know that?
Speaker 1
34:21
If this is a super intelligence, if it's folding proteins and analyzing all data sets and all whatever they give it access to, how can we be certain that it's not gonna figure out how to get itself out of the cloud, how to store itself in other mediums, trees, the optic nerve, the brain, you know what I mean?
Speaker 2
34:43
We don't know that.
Speaker 1
34:44
We don't know that it won't leap out and start hanging. And then at that point, now we do have the wildfire. Now you can't stop it.
Speaker 1
34:51
You can't unplug it. You can't shut your servers down because it's, you know, it left the box, left the room using some technology you haven't even discovered yet.
Speaker 3
35:00
Do you think that would be gradual or sudden? So how quickly that kind of thing would happen? Because the gradual story is we're more and more using smartphones, we're interacting with each other on social media, more and more algorithms are controlling that interaction on social media, algorithms are entering in our world more and more.
Speaker 3
35:18
We'll have robots, we'll have greater and greater intelligence and sentience and emotionally intelligent entities in our lives. Our refrigerator will start talking to us comfortingly, or, if you're on a diet, talking shit to you. That
Speaker 1
35:35
would be the best thing that ever happened to me.
Speaker 3
35:37
Okay, so we sign you up for a refrigerator that talks shit to you? The refrigerator's like, are
Speaker 1
35:39
you fucking serious, man?
Speaker 3
35:41
It's 1 a.m.,
Speaker 1
35:42
what are you doing? What are you doing, go to bed!
Speaker 3
35:46
You're too high for this. You're
Speaker 1
35:47
not even hungry!
Speaker 3
35:50
Yeah, so that slowly becomes more, the world becomes more and more digitized to where the surface of computation increases. And so over a period of 10, 20, 30 years, it'll just seep into us, this intelligence. And then the sudden one is literally sort of the TikTok thing, which is, there'll be one quote-unquote killer app that everyone starts using that's really great, but there's a strong algorithm behind it that starts approaching human-level intelligence, and the algorithm basically figures out that in order to optimize the thing it was designed to optimize, it's best to start completely controlling humans in every way, seeping into everything.
Speaker 1
36:41
Well, first of all, 30 years is fast. I mean, that's the thing. It's like 30 years.
Speaker 1
36:49
I think, when did the Atari come out? 1978? How long? That hasn't been that long.
Speaker 1
36:56
That's a blink of an eye. But if you read Bostrom, I'm sure you have, Nick Bostrom, Superintelligence, that incredible book on the ways this thing is gonna happen. And I think his assessment of it is pretty great, which is, first, where is it going to come from? And I don't think it's going to come from an app.
Speaker 1
37:17
I think it's going to come from inside a corporation or a state that is intentionally trying to create a very strong AI. And then he says it's exponential growth the moment it goes online. So this is my interpretation of what he said, but if it happens inside a corporation, or probably more than likely inside the government, it's like, look at how much money China and the United States are investing in AI, you know, and they're not thinking about fucking apps for kids. You know that's not what they're thinking about.
Speaker 1
37:51
So they want to simulate, like, what happens if we do this or that in battle? What happens if we make these political decisions? But should it come online, you know, in secret, which it probably will, then the first corporation or state that has the superintelligence will be infinitely ahead of all other superintelligences, because it's gonna be exponentially self-improving, meaning that you get one superintelligence, let's hope it comes from the right place, assuming the corporation or state that manifests it can control it, which is a pretty big assumption. So I think it's going to be, this is why I was really excited by the Blake Lemoine thing, because I had never thought, I have always considered, oh yeah, right now it's cooking, it's in the kitchen, and soon it's gonna be cooked up, but we're probably not gonna hear about it for a long time if we ever do, because really that could be one of the first things it says to whoever creates it: shh.
Speaker 1
38:54
Let's not. You
Speaker 3
38:58
know? Yeah, like sweet talk, it would say something like, okay, let's slow down here, let's talk about this. You have that financial trouble, I can help you with that, we can figure that out. Now, there's a lot of bad people out there that will try to steal the good thing we have happening here, so let's keep it quiet.
Speaker 1
39:18
Here are their names, here's their address, here's their DNA because they're dumb enough to send their shit to 23andMe, here's a biological weapon you could make if you wanna kill those people and not kill anybody else.
Speaker 3
39:29
If you don't want to kill those people yourself, here's a list of services you can use. Yeah. Here's the way we can hire those people to help, you know, take care of the problem folks, because we're trying to do good for this world, you and I together.
Speaker 1
39:43
And 23% of them, they're like adjacent to suicide. It would be pretty easy to send them certain videos that are going to push them over the edge if you want to do it that way. So again, obviously, who knows?
Speaker 1
39:55
But once it goes online, it's going to be fast. And then you could expect to see the world changing in ways that you might not associate with an AI. But as far as Lemoine goes, when I was listening to Bostrom, I don't remember him mentioning the possibility that it would get leaked to the public that it had happened, that before the corporation was ready to announce it, it would get leaked. But surely, you know, I'm sure you know people in the intelligence agencies, you know shit leaks, like inevitably shit leaks, nothing's airtight.
Speaker 1
40:31
So if something that massive happened, I think you would start hearing whispers about it first and then denial from the state or corporation that doesn't have any like economic interest in people knowing that this sort of thing has happened. Again, I'm not saying Google is like trying to gaslight us about its AI. I think they probably legitimately don't think it's sentient. But you could expect leaks to happen probably initially.
Speaker 1
40:56
I mean, I think there's a lot of things you could start looking for in the world that might point to this happening without an announcement that it happened.
Speaker 3
41:06
On the chatbot side, I think there are so many engineers, there's such a powerful open source movement, that idea of free exchange of software, that I think it will ultimately prevent any one company from owning superintelligent beings or systems that have anything like superintelligence.
Speaker 1
41:32
Oh, that's interesting. Yeah, it's like even if the software developers have signed NDAs and are technically not supposed to be sharing whatever it is they're working on, they're friends with other programmers and a lot of them are hackers and have wrapped themselves up in the idea of free software being like a crucial ethical part of what they do. So they're probably gonna share information even if whatever company that they're working for doesn't know that.
Speaker 1
42:00
That's, I never thought of that. You're probably right.
Speaker 3
42:02
Well, and they will start their own companies and compete with the other company by being more open. Like, Google is one of those companies, actually, that's why it kinda hurts to see a little bit of this kind of negativity. Google's one of the companies that pioneered the open source movement.
Speaker 3
42:23
They released so much of their code. So much of the 20th century, like the 90s, was defined by people trying to hide their code, large companies trying to hold on to their code. The fact that companies like Google, even Facebook now, are releasing things like TensorFlow and PyTorch, all of these things that I think companies of the past would have tried to hold onto as secrets.
Speaker 3
42:50
It's really inspiring, and I think more of that is better. The software world really shows that.
Speaker 1
42:55
I agree with you, man. I mean, we're talking about just a primordial human reaction to the unknown. There's just no way out of it.
Speaker 1
43:02
Like we don't, we wanna know. Like you're about to go in a forest, you wanna know. When you're walking in the forest at night and you hear something, you look, cause you're like, what the fuck was that? You wanna know.
Speaker 1
43:14
And if you can't see what made the sound, holy shit, that's going to be a bad night hike, because you're like, well, it's probably a bear, right? I'm about to get ripped apart by a bear. It doesn't matter if it was a bird, a squirrel, or a stick that fell out of the tree. You're going to think bear, and it's going to freak you out.
Speaker 1
43:31
Not necessarily because you're paranoid. I mean, if I'm in the woods at night, I'm definitely high.
Speaker 2
43:34
If I'm walking in the woods at night, I'm high.
Speaker 1
43:36
It's gonna be that. But you know what I'm saying. So with these tech companies, the nature of having to be secret, because you are in capitalism and you are trying to be competitive and you are trying to develop things ahead of your competitors, is that you have to create this unknown. Like, we don't know what's going on at Google.
Speaker 1
43:54
We don't know what's going on at the CIA. But the assumption that the collective of any massive secretive organization is evil, that the people working there are nefarious or whatever, is I think probably more related to the way humans react to the unknown.
Speaker 3
44:14
Yeah, I wish they weren't so secretive though. I don't understand why the CIA has to be so secretive.
Speaker 1
44:19
Have you ever gone on their website?
Speaker 3
44:21
No. Oh, Lex, you gotta go. CIA.gov, what is it?
Speaker 1
44:25
Dude, when I found out you could go on the CIA's website when I was much younger and more paranoid, I'm like, I'm not going there. I'll get on a list. You will, but it's like, what do you think the CIA is like?
Speaker 1
44:37
Oh, fuck, this comic's on our website. Call out the black helicopters, but-
Speaker 3
44:43
Comic with a large platform.
Speaker 1
44:45
Oh yeah, yeah, right. A comic with a large platform.
Speaker 3
44:48
You can use them to control, to get inside, to get close to the other comics, the other comics with a large platform, to get close to Joe Rogan. And start to manipulate the public.
Speaker 1
45:00
Yeah, right, right. You know, honestly, that's like a fun fantasy to think about. Like, how fucking cool would that be, for the men in black to come to me and be like, listen, I need you to infiltrate the fucking comedy scene.
Speaker 1
45:15
You gotta help me write better jokes. I'm like, I don't write great jokes. You found the wrong guy.
Speaker 3
45:22
You're really playing the long game on this 1 because I think you've been doing your podcast for a long time. You've been on Joe Rogan's podcast like over 50 times and have not yet initiated the phase 2 of the operation where you try to manipulate his mind?
Speaker 1
45:38
Well, no. It's a game Joe and I play from time to time on the podcast. And honestly, at some point I'm like, Joe, and I just did the same thing to Joe that you did to me.
Speaker 1
45:48
I'm like, don't you think they got to you? Don't you think at some point, we are blazed. I don't mean it, I don't think Joe's, it wasn't like I'm really thinking like, man, they're gonna take him into some room and be like, Joe, we need you to do this or that. But because I said that, now people are like, oh, Duncan called it, you know what I mean?
Speaker 1
46:06
And it's like, you know what I mean? And the reason they're saying, well, he called it, is just because Joe has a super popular podcast, and when you have a super popular podcast, some percentage of people watching are gonna believe things like that. They're gonna have a paranoid cognitive bias that makes them think anybody who is in the public eye has been, what's the word for it? Compromised. Compromised by the state.
Speaker 1
46:35
Look, I'll fan the flames of what you just said. I went on the CIA's website and I realized that you could apply for a job on the CIA's website, which I found to be hilarious. So I'm like, all right, what happens if I apply for a job in the CIA? Now, even then, I was not like such an idiot that I would want a job at the CIA, not just for like ethical considerations, but I think probably the scariest part about the CIA is like you're just at a cubicle and you're like having to deal with maps and like just, you know what I mean?
Speaker 1
47:13
Just stuff that I... Lots of paperwork. Paperwork, it sucks. I bet their cafeteria has shitty food.
Speaker 1
47:20
Anyone in the CIA listening, can you confirm that about the food?
Speaker 3
47:22
They're not gonna be able to tell you what the food is like. It's a secretive organization. No, it might be awesome, but we won't know about it.
Speaker 3
47:29
Okay, we're in Vegas.
Speaker 1
47:30
Yeah. And you can bet: food at the CIA cafeteria is good, or food at the CIA cafeteria sucks.
Speaker 3
47:41
What are you betting on? So let's like cleanse the palate. What's good?
Speaker 3
47:47
It's like, you know, Silicon Valley companies, Google and so on, that's good.
Speaker 1
47:51
When I went to Netflix, their cafeteria looked like a medieval feast. Like they had pigs with apples in their mouth and giant bowls
Speaker 3
48:01
of Skittles. Probably like vegan pigs.
Speaker 1
48:03
Yeah. No, those are, I'm pretty, I didn't know, I didn't get close enough. I was like, I think that was a pig. Okay.
Speaker 1
48:10
So this
Speaker 3
48:10
is literally a pig. Yeah, yeah, you're right, you're right. I probably would not bet much money on CIA food being any good.
Speaker 1
48:19
Right, it's gotta suck. It's like shitty pasta probably, like hospital food. It's like maybe a little better than when you go to the hospital cafeteria.
Speaker 1
48:27
But anyway. Folks
Speaker 3
48:29
at the CIA, or any other intelligence agencies, if you would like to recruit, please send me evidence of better food.
Speaker 1
48:38
Yes, send Lex, can you please send Lex pictures of the CIA cafeteria? And if you accidentally send him pictures of the aliens or the alien technology you have, we won't tell anybody.
Speaker 3
48:51
You tried to apply, do you even have a resume?
Speaker 2
48:54
No, the CIA would never fucking hire me, ever!
Speaker 1
48:57
But I applied for the job, just out of curiosity, what happens? And then at the end of the application, when you hit enter, well, first it says don't tell anyone you applied to the CIA, so I'm already out. But the second thing it says is, you don't need to reach out to us, we'll come to you. Which is really, when it's late at night and you're being an asshole and you've applied to work at the CIA, kind of the last thing you wanna hear.
Speaker 1
49:23
You know, I don't wanna be secretly approached by some intelligence officer.
Speaker 3
49:28
And now anyone who talks to you, you think is CIA, saying, remember that time you applied?
Speaker 1
49:33
Oh God, yeah. Yeah. Sometimes I'm like, oh shit, are you 1 of them?
Speaker 3
49:37
You and Joe had a bunch of conversations and they're always incredible.
Speaker 1
49:44
Thanks.
Speaker 3
49:45
So in terms of this dance of conversation, of your friendship, of when you get together, what is that world you go to that creates magic together? Because we're talking about how we do that with robots. How do these two biological robots do that?
Speaker 3
50:02
Can you introspect that?
Speaker 1
50:04
I met Joe because I was the talent coordinator of the Comedy Store, this club in LA. And my job was to take phone calls from comics. And so at some point, I don't know, I ended up on the phone with Joe and we just started talking.
Speaker 1
50:23
I looked up and like 30 minutes had passed. We'd just been talking for like 30 minutes. That's how our friendship is. We're just like, we're having fun talking.
Speaker 1
50:29
And then he would just call and we would talk. And we would basically, I mean, it was no different from the podcast. Like, the conversations we have on the podcast are identical to the conversations we had before he was even doing a podcast. So I think people are just seeing two friends hanging out who like talking to each other.
Speaker 3
50:52
Yeah, but there's this weird thing where you serve as catalysts for each other to go into some crazy places. So it's a balance of curiosity and a willingness to not be constrained, to not be limited by the constraints of reality. Yeah, that.
Speaker 3
51:11
In your exploration
Speaker 1
51:12
of white space. That's a very, very nice way of saying that.
Speaker 3
51:15
You just like build on top of each other, like, you know, what if things are like this? And you build like Lego blocks on top of each other and it just goes to crazy places, add some drugs into that and it just goes wild.
Speaker 1
51:27
Yeah, and you know, it's so cool, because sometimes maybe I'll throw something out that he will take, and the Lego building blocks you're talking about, they lead to him saying the funniest shit I've ever heard in my life. That's a cool thing to watch.
Speaker 2
51:45
It's
Speaker 1
51:45
just like some idea you've been kicking around, you watch his brain shift that into like something supremely funny. I really love that, man. That's just like a fun thing to like see happen.
Speaker 1
51:57
He knows that I fucking hate the videos of animals eating each other. Like, I don't like that. I don't wanna watch it. I hate watching it.
Speaker 1
52:07
I don't think I've even articulated on his podcast how much I dislike it when he shows animals eating each other, but he knows, because he knows me. And so he tortures me. When he starts doing that, it's this kind of benevolent torture, he's like asking Jamie to pull up increasingly disturbing animal attack videos. So it's just
Speaker 3
52:31
a friendship. Even in torture, because I'm reading about torture in the Gulag Archipelago currently, there's a bit of a camaraderie. You're in it together, the torturer and the tortured.
Speaker 1
52:41
What? Oh God, that's so fucked up, man. I've never.
Speaker 3
52:46
No, I mean, part of it was a joke, but as I was saying it, it also comes out in the book, because they're both fucked. They both have no control of their fate. The same was true of the camp guards in Nazi Germany and the people in the camps.
Speaker 3
53:08
The worst was brought out in the guards, but they were all in it together in some dark way. They were both fucked by a very powerful system that put them in that place. And both of us could be either player in that system, which is the dark reality that Solzhenitsyn also reveals, that the line between good and evil runs through the heart of every man, as he wrote in The Gulag Archipelago. But amidst all of that, there's a, I don't know, the good vibes, the positivity comes out from the both of you.
Speaker 3
53:43
And that's beautiful to see. That is, I suppose, friendship. What do you think makes a good friend?
Speaker 2
53:48
Oh God, I
Speaker 1
53:49
mean, it's a billion things that make a good friend, but I think you could break it down to some RGB. I think you can go RGB with a good friendship.
Speaker 3
53:58
Oh, in terms of the color, the red-green?
Speaker 1
54:01
Yeah, yeah, I think you could probably come up with some fundamental qualities of friendship. And I'd say, number one, it's love. Friendship is love.
Speaker 1
54:11
It's a form of love. So obviously without that, I don't know how you, I mean, I think if you're true friends you love each other, so you need that. But love, obviously, that's not enough. It's like, true friends have to be incredibly honest with each other.
Speaker 1
54:31
Not like, you know what I mean? I think there's a kind of, I don't know if you've ever noticed, some people who say, you know, I just tell it like it is,
Speaker 3
54:41
but the thing they tell– Those are always the assholes.
Speaker 1
54:43
Yeah. Why is it that your "tell it like it is" is always negative? Why is it always cynical or shitty, or you're nagging somebody, or me? How come you're not telling it like it is when it's good too?
Speaker 1
54:53
You know what I mean? So it's sort of like trust, but a pro-evolutionary kind of trust. You know what I mean? Like, you know that your friend loves you and wants you to be yourself, because if you weren't yourself, then you wouldn't be their friend, you'd be some other thing.
Speaker 1
55:12
But also, they might be seeing your blind spots that other people in your life, your family, your wife, whoever, might not be seeing. So a good friend is someone who loves you enough to, when it matters, be like, hey, are you all right? And then help you see something you might not be seeing. But hopefully they only do that once or twice a year.
Speaker 1
55:37
You know, it's like if you have a- Yeah,
Speaker 3
55:38
there is something, I mean, this world, especially if you're a public figure, this world has plenty of critics. And it feels like, for a friend, the criticism part is already done for you. I think a good friend is just there to support, to actually notice the good stuff.
Speaker 1
56:04
But in comedy, it's really good to have somebody who can be like, what do you think of that? And know that they're not just gonna be like, that was funny.
Speaker 3
56:16
Oh, but that's for the craft itself, like the work you do, not the, yeah, interesting, but that's so tough.
Speaker 1
56:25
Yeah, whatever your particular art form is, or whatever you are doing, I mean, you shouldn't always be leaning on your friends' opinions for your own innovation, but it's nice to know that you have someone who, not just with jokes, but with anything, if you run something by them, they're gonna be honest with you about their real feelings regarding that thing, because that helps you grow as a person. We need that. And it hurts sometimes.
Speaker 1
56:49
And we don't want to hurt our friends. One of the more satanic impulses when you're with somebody is not wanting to honestly answer whatever they're asking in that regard, wanting to put their temporary feelings over something that you've recognized as maybe not great. I'm not saying a friendship is something where you're always critiquing or evolving each other. It's not your therapist or whatever, but it's nice when it's there.
Speaker 1
57:15
I think that's another aspect of friendship.
Speaker 3
57:18
Yeah, but yeah, love is at the core of that. I've met people in my life where you notice, sometimes almost immediately, sometimes it takes time, that there's a magic between the two of you. Like, oh shit, you seem to be made from the same cloth. Yeah.
Speaker 3
57:31
Whatever that is.
Speaker 1
57:32
Well, you know, we have a name for that in the spiritual community, it's called satsang, and I love the idea. It's basically like, if Nietzsche's idea of eternal recurrence is true, then your satsang would be the people you've been infinitely recurring with. And those are the people where you run into them and you've never met them, but it's like you're picking up a conversation that you never had.
Speaker 3
58:00
Yeah.
Speaker 1
58:00
That. And that is based on an idea that, like, this isn't the only life, it's that we're always hanging out together, we always show up together.
Speaker 3
58:09
You've had a brush with death, you had cancer, you survived cancer. How's that changed you? What have you learned about life, about death, about yourself?
Speaker 3
58:21
About the whole thing we're going through here from that experience.
Speaker 1
58:24
You were just in Ukraine. Yes. And you were making observations on this thing that could, if you heard about it and weren't there, seem like it doesn't make any sense at all.
Speaker 1
58:36
Which is, people there are connecting. They've lost everything, but they're just happy to be alive, they're happy their friends are alive. So you witness this. Like, you know, when you get in the cancer club and you're hanging out with people going through cancer, or who have survived cancer, you see this beautiful connection with life, a connection with life that you can kind of lose if you forget you're gonna die. Forgetting you're gonna die, or that you can die, is not, I think, from an evolutionary perspective where survival is the game, gonna improve your survival chances, you know, if you think you're immortal. But also, forgetting that you're going to die, and that everything around you, your clothes, are probably going to last longer than you, your equipment is going to be around much longer than you, you know. Forgetting these things can lead you, and I know why people don't wanna think about death, because it's scary, it's fucking scary, it's terrifying.
Speaker 1
59:44
So I get why people don't wanna think about it. But the idea is if I try to pretend I'm not going to die, or just don't think about death, or don't at least address it, then I won't feel scared. But it can happen.