Speaker 1
00:00
The following is a conversation with Tim Dillon, a standup comedian who is fearless in challenging the norms of modern day social and political discourse. Quick mention of our sponsors, NetSuite Business Management Software, Athletic Greens All-in-One Nutrition Drink, Magic Spoon Low Carb cereal, BetterHelp online therapy, and Rev speech-to-text service. So the choice is business, health, sanity, or transcripts. Choose wisely, my friends.
Speaker 1
00:30
And if you wish, click the sponsor links below to get a discount and to support this podcast. As a side note, let me say that I will continue talking to scientists, engineers, historians, mathematicians, and so on. But I will also talk to the people who Jack Kerouac called the mad ones in his book On the Road, which is one of my favorite books. He wrote, the only people for me are the mad ones, the ones who are mad to live, mad to talk, mad to be saved, desirous of everything at the same time, the ones who never yawn or say a commonplace thing, but burn, burn, burn like fabulous yellow Roman candles exploding like spiders across the stars.
Speaker 1
01:14
And in the middle, you see the blue center light pop and everybody goes, ah. Some of these conversations will be a bit of a gamble in that I have no idea how they will turn out. But I'm willing to risk it for a chance at a bit of an adventure. And I'm happy and honored that Tim, this time, wanted to take a chance as well.
Speaker 1
01:36
If you enjoy this thing, subscribe on YouTube, review it on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman. And now, here's my conversation with Tim Dillon.
Speaker 2
01:51
What would you like your tombstone to read?
Speaker 1
01:54
It's a good way to summarize the essence of a human being.
Speaker 3
01:58
I would like it to say, this has not been paid for. And I want my living relatives to struggle to pay for it. And I think I would like them to be hounded every day.
Speaker 3
02:11
I would like people to call and go, listen, we don't wanna ever excavate a body, but we will, because this has not been paid for. I love the idea of leaving the world in debt, leaving the world in lots of debt that other people have to deal with. And I know people that have done that. I know people that have been in families where that's happened, where someone has to sit and just curse the sky because they don't have a physical person anymore to be angry at, but they still have to deal with the decisions that person made, and that's deeply tragic, but that's always struck me as very funny.
Speaker 2
02:47
Well, it's a kind of an immortality, the debt. Because you can, if the debt lasts for a long time, the anger lasts for a long time, and then you're now immortal in the minds of many. You arouse emotion in the minds of many.
Speaker 3
02:59
My mother's best friend in the town I grew up in, her husband shot himself in the driveway. Yeah. And my mother's friend never got a chance to just grieve because he owed so much money, she would come over and go, I hate him.
Speaker 3
03:15
I fucking hate him. And it was just such an interesting thing to see somebody who, and her kids ended up getting angry at her for that because they didn't understand why she would hate a guy who was clearly suffering. But she goes, he took the selfish way out. He fucked us.
Speaker 3
03:31
And it was always interesting for me to just remember that you can leave Earth and still be a problem. That's kind of a special person. So that's, I think, what I'd like my tombstone to read.
Speaker 2
03:46
Yeah, there's a show called Louie, with Louis C.K. I don't
Speaker 3
03:48
know if you watched it. I'm aware of it.
Speaker 2
03:50
There's this moment, I think, where an old guy's talking to Louis about the best part about love is after you break up and it's remembering that, like remembering the good times and feeling that loss, the pain of that loss. The worst part about love is when you no longer feel that pain. So the pain of losing somebody lasts longer, is more intense and lasts longer than the actual love.
Speaker 2
04:18
So his argument was like the pain is what love really is. In the same way, the anger your tombstone arouses will last longer. And that's deeply like a human thing. Like why do we attach happiness to the way we should remember others?
Speaker 2
04:38
It could be just anger.
Speaker 3
04:40
I know so many people who will have deeply complicated feelings. I did drugs for many years, and I spent time with some wild people, and their parents were also wild people, and some of their parents have done crazy things to them and have created situations that were not productive for child rearing. And so I know that when those people die, it's going to be a very mixed bag. Like there's going to be a lot of complex emotions, like, hey, we loved that guy, but also when we look back, he was a horrible father, a horrible husband, but he was fun.
Speaker 3
05:25
And we don't put enough stock in that, but there will be a push and pull. And I'll be the one kind of bringing up like, hey, he was a lot of fun. Remember when he stuck us? One of the things, with this particular person I'm talking about: we were at a bar, me and my friend, we're having dinner, and his father, who was an alcoholic, a guy that would go out every night and didn't work, refused to work, would lie and say he was going to work and then go to a bar. I mean, just a fun person.
Speaker 3
05:55
And we were sitting at this bar restaurant, and we see his father walk up to the bartender, say something, point at us, point at our table, and put the thumbs up. And the bartender nodded. And then the father walked over to our table and he said, listen, I just want to let you know, I just bought you dinner. And I looked at his son, I said, he's a pretty good guy.
Speaker 3
06:17
And then he climbed over the little fence down to the water and got in his little boat, it was a little cigarette boat, and he just drove away. And then about an hour later, we went and we said, I think that guy took care of the bill. But she said, well, go talk to the bartender. So we talked to the bartender, and he handed us a bill, and the bill was for like $1,000.
Speaker 3
06:36
And we said, wait a minute, what the hell's going on? And he goes, the guy that left an hour ago said you were gonna take care of his bill, he's been drinking here all week. And we go, what are you talking about? And he goes, remember he pointed at you, he put the thumbs up and you guys waved.
Speaker 3
06:50
You remember that? And we went, yeah. And I just looked at my friend and I went, you know, your dad is just, we're gonna remember him for all kinds of reasons. But to you, he was fun. He was a lot of fun.
Speaker 3
07:02
He wasn't my dad, but I spent a lot of time with him. I was in two boating accidents with him. You know, two boating accidents.
Speaker 2
07:08
Did the boating accidents involve drugs?
Speaker 3
07:10
Yes, usually alcohol was involved when he left his house, and when he was at home as well. But I was in two boating accidents. And do you know how fun someone has to be to get in a second boating accident?
Speaker 3
07:22
Do you know what a good time someone has to be to get in a boat with them after you've already gotten in one wreck?
Speaker 2
07:30
Never get fooled. What was that line? George Bush never get fooled again.
Speaker 3
07:34
Right? Yes. If you're getting fooled again, you know, there's a reason for it. But he was a fun guy.
Speaker 3
07:39
He did have a death wish. The second boating accident, he grabbed me and said, you can't hang out with me anymore. And I said, why? He goes, I'm trying to kill myself.
Speaker 3
07:45
And I was like, oh. And then I understood that under all of the fun lived a very destructive person, who not only was destructive but wanted to die.
Speaker 2
07:57
So speaking of fun people that wanna die, I don't know if you're, we can go Hunter S. Thompson, but Charles Bukowski, I don't know if you're aware of the guy.
Speaker 3
08:06
I'm aware of him, sure, I've read some of his stuff.
Speaker 2
08:08
So his tombstone says, I just wanted to ask you a question about it. His tombstone says, don't try.
Speaker 3
08:15
Interesting.
Speaker 2
08:16
What do you think about that advice as a way to approach life?
Speaker 3
08:20
I think for many people it's good advice. Because the people that are gonna try will do it anyway. And for the people that need to be told, there's a whole cottage industry now of motivational speakers and life coaches and gurus that tell people that they all have to own their own business and be their own boss and be a disruptor and get into industries.
Speaker 3
08:45
That's incredibly unrealistic for most people. Most people are not suited for that. And the Gary Vees of the world that tell everybody that they should just hustle and grind and hustle and grind, they're very light on the specifics of what they should actually do. Yeah, I think that's not horrible advice to give to a lot of people.
Speaker 3
09:02
I think my generation got horrible advice from our parents, from our teachers, and that advice was follow your dreams. And that was it, by the way. There was no like, what are your dreams? Are they realistic?
Speaker 3
09:15
What happens when they don't work out? Will your dreams make you happy? Are your dreams real? Do they exist on earth?
Speaker 3
09:22
Can you follow them? Anybody who says follow your dreams, you can be anything you wanna be: horrible advice, horrible advice. Worst advice you could ever give a generation of people. Really, truly, I mean, think about it.
Speaker 3
09:34
If you were talking to somebody and you were trying to make them succeed, are there any two worse pieces of advice to give them than follow your dreams and you can be anything you want to be? Those to me are the two most destructive pieces of information I've ever heard.
Speaker 2
09:53
So let me push back because-
Speaker 3
09:54
Okay, that's fair. This is- Many people do.
Speaker 2
09:58
So yeah, this is like a rigorous journalistic interview. Larry King, by the way, passed away today. So I'm taking over the-
Speaker 3
10:05
It's very sad.
Speaker 2
10:06
I'm carrying the-
Speaker 3
10:07
Very sad. RIP King.
Speaker 2
10:09
Yeah, what was I even gonna say? Oh, let me push back on the follow your dream thing. I come from an immigrant family where I was always working extremely hard at stuff, like in a stupid way.
Speaker 2
10:25
There's something about me that loves hitting my head against the wall over and over and over until either my head breaks or the wall breaks. Just like, I love that dedication for no purpose whatsoever. It's like the mouse that's stuck in a cage or whatever. And everybody always told me, my family, the people around me, that the epitome of what I could achieve is to have kind of a stable job, you know, the old like lawyer, doctor, in my case it's like scientist and so on.
Speaker 2
10:57
But I had these dreams, I had this fire, you know, about, I love robots. And nobody ever gave me permission to pursue those dreams. I know you're supposed to grab it yourself. Nobody's supposed to give you permission, but there's something about just people saying, you know, fuck what everyone else thinks.
Speaker 2
11:18
Like giving you permission, a parent or somebody like that saying, do your own thing, go become an actor, go become like, do the crazy thing you're not supposed to do, an artist, go build a company, quit school, all that kind of stuff.
Speaker 3
11:31
Yes. Sure.
Speaker 2
11:33
That's the pushback against the follow your dreams as
Speaker 3
11:37
bad advice en masse. If you were to look at it en masse, if you were to look statistically at how few people that works out for, I'm just, no. Let's be very honest.
Speaker 2
11:46
It's very true.
Speaker 3
11:47
Yeah, very honest. So I mean, like, yeah, if you're gonna go be an actor, hey, I was broke for 10 years before I became a, before I was making money as a comedian. I get it.
Speaker 3
11:55
I didn't need Gary Vaynerchuk to tell me to follow my thing, right? And here's the other thing. I was kind of funny and like, I was kind of, a lot of things were in my favor of being a comedian, right? I had this kind of crazy fucked up life.
Speaker 3
12:09
I had a lot of stories. I had exhausted, I was willing to fail. I had failed before. I was broke.
Speaker 3
12:15
I didn't care about being broke. I knew how to be broke. I was shameless to a degree. I would get on a stage night after night and be laughed at.
Speaker 3
12:25
I had a high threshold for being embarrassed. I had a high threshold for people thinking that I was a scumbag, right? And showing up at family parties and being like, yeah, I still really don't have a job. And I'm just, I work at comedy clubs kind of, and I get booked when I can.
Speaker 3
12:41
And I was suited for it. There's this idea that people can just roam around the world injecting themselves into things they have no aptitude for at all, and will that to happen. A small percentage of people might be able to do that, but the vast majority of people have something they might key into that they're meant to do. Like you loved robots, you love technology, and you found a place in that world where you thrive.
Speaker 3
13:12
But I think many people, a lot of people love robots, right? So a lot of people think everything you do is interesting. I think your shit is fascinating. I watch your podcast and I think it's very interesting.
Speaker 3
13:23
I have no place in your world. You know what I mean? I have no place in that world. I don't like remedial math.
Speaker 3
13:33
I don't like community college math. I think it's a waste of my time.
Speaker 2
13:37
What do you think about a robot? Would you ever buy a robot for your home?
Speaker 3
13:40
Yes. What will it do?
Speaker 2
13:43
It'll be a companion, a friend.
Speaker 3
13:45
Oh yeah, I mean, I would like to start replacing friends and family with robots immediately. I mean, truly, truly. I mean, I'm not even kidding.
Speaker 3
13:52
Like, I would like to have a Thanksgiving with four robots. I'm dead serious. Are they into QAnon? Like, are the robots, when do the robots start going crazy?
Speaker 3
14:05
That's my question is like, how long do the robots live with me before they are also a problem? And I gotta replace them, you know what I mean? You're gonna indoctrinate the robots. The robot's gonna call me like my aunt does and talk about coronavirus for an hour every morning and tell me everyone in America who's died of coronavirus?
Speaker 2
14:22
One of the things I enjoy in life is how terrified people like you, I'm a huge fan by the way, get in front of robots.
Speaker 3
14:32
Well, I'm concerned about AI like completely getting rid of the need for human beings because human beings, I mean, you go out in the street and you go, so few of these people are necessary. Even now, even now you look at people and you go, they're hanging on by a thread, right? And you can just imagine how many jobs are gonna get replaced, how many industries are going to be completely remade with AI, and the pace of change worries me a little bit because we do a very bad job in this country of mitigation when we have problems.
Speaker 3
15:07
We don't do a great job. We did a not great job with COVID, right? We don't do a good job. It's just something we don't do well.
Speaker 3
15:13
We're good in booms and busts. We're good when it's good. And we're actually, we kind of know how to kind of like, hey, we're bottomed out. We're like a gambling addict in this country.
Speaker 3
15:23
We know what it feels like to be outside of an OTB at 9am drinking coffee and smoking cigarettes going, I'm going to build it back. And we know what it's like to win. But anything in between, it seems not that great. So to me, it feels like, are we gonna be able to like, help people that are displaced and that have their jobs taken by, I mean, do you not fear sort of a world where you have a lot of, you know, artificial intelligence replacing workers and then what happens?
Speaker 2
15:53
There's a lot of fears around artificial intelligence. One of them is, yes, displacement of jobs, workers. That's technology in general.
Speaker 2
16:01
That's just any kind of new innovations displace jobs. I'm less worried about that. I'm more worried about other impacts of artificial intelligence. For example, the nature of our discourse, like social, the effects of algorithms on the way we communicate with each other, the spread of information, what that information looks like, the creation of silos, all that kind of stuff.
Speaker 2
16:24
I think that would just make worse the effects that the displacement of jobs has. I think ultimately, I have a hope that technology creates more opportunities than it destroys.
Speaker 3
16:38
I hope so too.
Speaker 2
16:39
And so in that sense, AI to me is an exciting possibility. But the challenges this world presents will create divisions, will create chaos and so on. So I'm more focused on the way we deal as a society with that chaos, the way we talk to each other.
Speaker 2
16:58
That's huge. Creating the platform that's healthy for that.
Speaker 3
17:01
Now, as a comedian, creator, whatever you want to call it, people that put out content, the gatekeepers are now algorithmic, right? So they are kind of almost AI-ready. So if you are a person that puts out YouTube videos, podcasts, whatever you're doing.
Speaker 3
17:21
It used to be a guy in the back of the room with a cigar saying, I like you, or get him out of here. Now it's an algorithm you barely understand. Like I've talked to people at YouTube, but I don't know if they understand the algorithm. They don't.
Speaker 3
17:36
They don't. And that's crazy. Yeah, it's fascinating. Cause I speak to people at YouTube and I go, hey man, what's going on here?
Speaker 3
17:42
One of the episode titles of my podcast was Knife Fight in Malibu. It was about real estate. And it was because a realtor in Malibu, I was trying to get a summer rental, which I can't really afford, but I don't think that's a huge problem. I follow my dreams.
Speaker 3
17:57
So I called a realtor, and she goes, I don't know what the government's saying, but it's a real knife fight out here. You know, an old grizzled woman, real realtor, tan skin, cig out the mouth, driving a Porsche, you know, it's a real knife fight out here. You know, her entire life had become real estate. Her soul had been hollowed out, her kids hate her.
Speaker 3
18:15
You know, no one's made her come in years, but it's just, she just loves heated kitchen floors and views, fun. She's a demon from hell and we need them, truly. We're getting rid of them, it's not good. And she goes, it's a real knife fight out here.
Speaker 3
18:27
So we put that in the episode title. And of course, I guess some algorithm thought that we were showing like people stabbing each other in Wendy's. And we got like demonetized. Did we get demonetized?
Speaker 3
18:40
We lost a lot of views because we were kicked out of whatever, like we were just kicked out. And I was asking YouTube about it. They were kind of understanding it. But even the people that work there didn't truly seem to understand the algorithm.
Speaker 3
18:51
So can you explain to me how that works, where they barely know what's going on?
Speaker 2
18:55
No, they do not understand the full dynamics of the monster, or the amazing thing, that they've created. The amount of content that's being created is larger than anyone understands. Like this is huge, they can't deal with it.
Speaker 2
19:08
The teams aren't large enough to deal with it. There's like special cases. So if you fall into the category of special cases, we can maybe talk about that like a Donald Trump, where you actually have meetings about what to do with this particular account. But everything outside of that is all algorithms.
Speaker 2
19:24
They get reported by people, and they get, if enough people report a particular video, a particular tweet, it rises up to where humans look over it. But the initial step of the reporting and the rising up to the human supervision is done by algorithm, and they don't understand the dynamics of that. Because we're talking about billions of tweets, we're talking about hundreds of thousands of hours of video uploaded every day. Now, the hilarity of it is that most of the YouTube algorithm is based on the title.
Speaker 2
20:08
That's crazy. And the description is a small contribution in terms of filtering, in terms of the knife fight situation. And that's all they can do. They don't have algorithms at all that are able to process the content of the video.
Speaker 2
20:22
So they try to also infer information based on if you're watching all of these QAnon videos or something like that, or Flat Earth videos, and you also watch, are really excitedly watching the whole knife fight in Malibu video, that says, that increases the chance that the knife fight is a dangerous video for society or something like that.
Speaker 3
20:47
Interesting, wow.
Speaker 2
20:48
Based on their contribution. So if
Speaker 3
20:49
people are watching something, because I watch QAnon and Flat Earth videos to ridicule them. Right. You know what I mean?
Speaker 3
20:57
I watch these videos and I make fun of them on my show. But what's interesting is if I then go watch something else, I'm increasing the likelihood that that video is gonna get looked at as potentially subversive or dangerous. That's why. So they
Speaker 2
21:09
make decisions about who you are as a human being, as a watcher, the visual user, based on the clusters of videos you're in, but those clusters are not manually determined, they're automatically clustered.
Speaker 3
21:23
It's so weird. We have titles that they got upset about, and I don't even understand why. Like we had a title that was so innocuous in my opinion, and the title of the episode was called Bomb Disney World.
Speaker 3
21:36
And I was asking people to consider bombing Disney World, and YouTube got angry at that. So you don't know why, You can never understand why. You could
Speaker 2
21:47
have said Disney World is the bombs. Right, right. It's just rearranging.
Speaker 2
21:51
That's what you
Speaker 3
21:52
probably meant.
Speaker 2
21:52
I wasn't saying
Speaker 3
21:52
you do it, but I was saying let's start thinking about plans to do, like not let's do it, but let's get in the mind. Let's change the conversation. I think it's very interesting because as a comedian, you don't wanna live in that world of worrying about algorithms.
Speaker 3
22:07
You don't wanna worry about deplatforming and shadowbanning. I mean, all these conversations that I've had with other comedians about shadowbanning, I mean, it's hilarious. We all call each other, I think I'm being shadowbanned. Are you being shadowbanned?
Speaker 3
22:18
And nobody knew what that word was a month ago, I mean a year ago, but everyone now is convinced that everything they do that isn't succeeding is being shadowbanned. So it's this new paranoia, this algorithmic paranoia now that we all kind of have, because there are genuine instances of people being taken out of an algorithm, you know, rightly or wrongly, however you want to believe. But then there are also things that just don't perform as well for a myriad of reasons. And then we're all saying like, well, they're against me.
Speaker 3
22:50
They're shutting me down. And you don't know if that's true or not.
Speaker 2
22:55
What do you think about this moment in history, which was really troubling to me? We could talk about several troubling aspects, but one is Amazon removing Parler from AWS. To me, that was the most clearly troubling.
Speaker 2
23:13
It felt like it created a more dangerous world when the infrastructure on which you have competing mediums of communication now puts its finger on the scale, now influences who wins and who loses.
Speaker 3
23:30
Absolutely, you're right. And what you're always told is like, if you don't like Twitter, create your own service. Or if you don't like something, you can do your own thing.
Speaker 3
23:40
Or if you are, and basically because, you know, in tech, you have to be in business with one of five companies. I think it's like Amazon, Facebook, Google, YouTube, and Twitter, whatever. I mean, Amazon puts everything on the cloud. Google and YouTube, it's all basically the SEO and the advertising, and you got to get your name out there.
Speaker 3
23:58
You don't want to be buried. Because you have to do business with those, it's a cartel of these companies, you understand it better than anybody, that you are prevented, truly. And I think, whatever you think about Parler, whatever you think about what people are saying on Parler, whatever you think about Alex Jones, whatever you thought about Milo Yiannopoulos, the state has an interest in, and has always had an interest in, crushing dissent. This is what the state has done.
Speaker 3
24:27
This is how they retain the power they have, by eliminating dissent where they can. Now, because you don't have three broadcast networks anymore and a handful of newspapers that were all run, by the way, by people that had been either compromised or happily going with the program, and you have this wild west of the internet, people like me, people that make, I make funny content that I hope is funny, but a lot of it is wild and crazy. I say a lot of wild and crazy things, they're very funny.
Speaker 3
25:02
I say a lot of wild and crazy things, they're very funny. I say a lot of wild and crazy things about powerful people.
Speaker 2
25:05
Yeah, you mock the powerful, thereby bringing them down a notch. We'll probably talk about it, but humor is one of the tools to balance the powers in society.
Speaker 3
25:17
Well, sure, and to make people feel better about things and to, you know, whatever the case may be, right? That's my goal is to kind of like, hey, people have had a shitty day. If this video or podcast makes you laugh, that's great.
Speaker 3
25:29
I think that it won't ever, it was never gonna stop at Alex Jones. Not that I think he should have been taken off everything the way he was, but this keeps going until we have sanitized all of social media. And what they really want it to be is what Instagram's kind of becoming, which is a marketplace of, you could just go and buy sneakers, go buy a sweatshirt, go buy jeans, go buy this, go buy that. And the idea of the free exchange of information seems to be the old internet, and the new internet seems to be hyper, and I'm a capitalist, but this seems to be like hyper-capitalist in the sense of like, they only want you consuming things and they don't want you thinking too much.
Speaker 3
26:11
And that seems to be where it's heading. I've even seen that with Instagram where It's like everything on Instagram is like, buy a sweatshirt. You know? And I'm like, all right, man.
Speaker 3
26:20
Hey man, if I want a sweatshirt, I'll get it. Like, relax. You know, just every ad seems to be encouraging consumption, but very few things seem geared towards, hey, let's have a dialogue, and not that Instagram was ever great for that, but very few things are geared now towards content on Instagram. A lot of it seems geared towards shopping.
Speaker 2
26:44
See, I don't know. That's an interesting point. I don't know if the consumerism that capitalism leads to necessarily gets in the way of nuanced conversation.
Speaker 2
26:53
I feel like you could still sell Tim Dillon sweatshirts and have a difficult nuanced conversation or mock the current president, the previous president, mock the powerful, all that kind of stuff.
Speaker 3
27:05
Yeah, we try, we try to balance that. I mean, it's-
Speaker 2
27:07
Do you have sweatshirts?
Speaker 3
27:08
We do, we do, they're not, are they on sale now, fake business? We do, fake business sweatshirt with the Enron logo, fake business, because I do fake business all the time. It would
Speaker 2
27:17
be nice if you talk about Alex Jones, if you plug the sweatshirt during that conversation.
Speaker 3
27:21
Yeah, we'll do that, absolutely. But what I tend to worry about with, I see social media and technology existing to flatten society. It makes people very boring.
Speaker 3
27:34
All of the experiences kids have right now are online. Many of their closest friendships are online. Their first relationships are online. The culture is very homogenous.
Speaker 3
27:45
And I think it's eliminating characters. It's eliminating interesting people. It's making people into AI, all of their tastes.
Speaker 2
27:53
Whoa, whoa, whoa,
Speaker 3
27:53
whoa, whoa. Yeah, yeah, yeah, yeah, yeah. That's right.
Speaker 2
27:55
AI could be Charles Bukowski as well. Let's not get crazy.
Speaker 3
27:57
It's not there yet, right? I mean, The $75,000 dog is not doing anything. So we're not there yet.
Speaker 3
28:06
Listen, I get why you like AI so much. I hate people too, and I'm very amenable to AI. And I agree with you. Listen, I think the future, we gotta get everyone out of here.
Speaker 3
28:15
I'm with you on that,
Speaker 2
28:16
so don't think I'm. I love people, he's manipulating my mind and my. That's why
Speaker 3
28:21
the flash of light in your eyes when you talked about that dog was so much more than any person, and I get it,
Speaker 2
28:27
by the way. You're right, I love people, but if we could just.
Speaker 3
28:29
They're not exciting.
Speaker 2
28:30
If we could just use robots to kill most of them, I think that would be good for society.
Speaker 3
28:35
I'm with that too. But I think that social media flattens people.
Speaker 2
28:40
Flattening the personalities of characters.
Speaker 3
28:42
Flattening the personalities of people, man. And it's just, you know, when's the last time, like I like the idea of like, you know, and I'm, you know, somebody showing up to, you know, high school with like a backpack and taking out an old CD and be like, Hey man, here's this band you've never heard of that I love or whatever. You got to get into this.
Speaker 3
28:58
And I'm like, you know, when I talk to young, you know, I have friends that have younger brothers and everything. And I know that the dominant culture was always dominant. I'm not an idiot, but I feel like it's harder to be unique and original now because so much of what's promoted is just this way to kind of corral people into believing and thinking a certain set of ideals that's constantly shifting and evolving, and people are just caught up in that, and to me, it gets very boring very quickly. I hate being bored, and that's what it is.
Speaker 3
29:29
I don't
Speaker 2
29:29
know what to do with that, because at the same time, podcasts are really popular, long form podcasts are really popular and people are hungry for those kinds of conversations. There's a lot of dangerous ideas, quote unquote, flowing, being spread around through podcasts, meaning just like debates. Correct.
Speaker 2
29:47
So that's still popular. So I don't know what to.
Speaker 3
29:50
I agree with you.
Speaker 2
29:51
That gives me hope, I guess.
Speaker 3
29:52
I hope so too. And like I said, I look at the negative a lot because that's what I usually make fun of, but there's a lot of positive stuff happening too.
Speaker 2
30:00
Let's talk a bit about Alex Jones. So you've gotten a chance to talk to him while you were on the Joe Rogan Experience.
Speaker 3
30:09
I've been on Alex's show. I've talked, I've had Alex on my show. I've talked to Alex for 3 hours in front of, I guess it was maybe like 15 million people, right, on Joe's show.
Speaker 3
30:18
It was a really wild conversation. I think it was one of the coolest moments in broadcasting, clearly, that I've ever been a part of, but I think it goes in the lexicon of like, these are big podcasts. Like, I think it's one of the biggest podcasts. A week before the election, Alex Jones.
Speaker 3
30:35
I'm really grateful that Joe gave me the opportunity to be there, and it was just an amazing conversation to watch.
Speaker 2
30:41
What was the shirt you wore? Ghislaine Maxwell? Free Ghislaine.
Speaker 3
30:44
It was a fun joke that no one in tech got, because we all know how funny they are. But the tech writers, which is mainly blue-haired.
Speaker 2
30:50
I do not agree with these statements.
Speaker 3
30:51
They're mainly blue haired people whose goal in life is to find things to give them orgasms with, you know. If you want
Speaker 2
30:58
to dye your hair blue, it's your choice. I respect it.
Speaker 3
31:00
Yeah, but is it your choice? But at the end of the day, it's like, you know, all the tech writers, like a lot of people just, and I'm not, maybe I'm maligning tech unfairly, but a lot of people with no sense of humor were like, he's advocating for human trafficking. I'm like, it's clearly a joke, because we're coming off the Believe All Women.
Speaker 2
31:16
Yes.
Speaker 3
31:16
We're coming off that, and it's very funny to just say, Free Ghislaine, hey man, believe all women. Like, our politics and our public sphere are so schizophrenic right now that when you point that out, people are going to be angry with you. But that was a fun shirt to wear.
Speaker 2
31:31
But on Alex, I was one of the people that found him really entertaining. The same kind of thing as with Bukowski, these kinds of personalities that are wild, crazy, full of ideas. They don't have to be grounded in truth at all, or they can be grounded in truth a little bit.
Speaker 2
31:51
He's just playing with ideas like a jazz musician, screaming sometimes. Obviously he has some demons. Sometimes he's super angry for no reason whatsoever. It's some weird thing that he's constructed in his own head.
Speaker 2
32:04
Sometimes he's super loving and peaceful, especially lately that I've heard him. I don't know if you've seen him with Michael Malice, where Malice was telling Alex Jones, I love you, Alex, just this loving kind of softness and kindness underneath it all. I don't know what to make of any of it. And then there's this huge number of people that tell me that Alex Jones is dangerous for society.
Speaker 2
32:28
And so what do you do with that? Do you think he's dangerous for society? Do you think he is one of the sort of entertaining personalities of our time that shouldn't be suppressed, or somewhere in between?
Speaker 3
32:39
I don't think that Alex per se is dangerous for society. I think the greater danger for society comes again from stifling all dissent, right? All like anybody with a voice that uses it, that critiques the government and putting all of those people in a category and getting rid of them is incredibly dangerous.
Speaker 3
33:00
To me more so, I think the biggest problem that Alex has ever had was when he questioned the Sandy Hook shooting. And that really was, because it really is this identifiable incident that you can look at where it did get away from him and a lot of his fans who, the people that are attracted to conspiracy stuff, and I have some of those fans, some of them are really smart people, some of them are mentally unwell. A lot of them happen to be mentally unwell. So when you have a fan base of people where some of them are mentally unwell, and you are questioning, you know, tragic events, okay?
Speaker 3
33:33
And Alex was right about Epstein. He was right about a lot of things, and he's got no credit for that. And I understand that piece: sometimes when you're right about 10 things and you're wrong about something, and the thing you're wrong about is so offensive to people, you're never gonna get any credit for being right, even though you were right more than you were wrong. The problem was a lot of his fans who were crazy stalked and harassed these families and accused them of being actors and accused them of faking their children's deaths.
Speaker 3
34:02
It was just a horrific experience. And Alex is tied to that. And how much he inspired that by what he did on his show, I don't know, because I haven't watched hours and hours of that particular thing, like the whole Sandy Hook thing. If you listen to him, he says, I really covered it.
Speaker 3
34:24
I kind of covered it and moved on. Other people go, no, he spent a long time on it. But that's the real danger of going into that territory over and over again, going everything is a false flag or everything's fake. I think Alex has actually been kind of reasonable.
Speaker 3
34:39
Like he's resisted a lot of the politics of like racial resentment on like the alt-right, for example, He's resisted that. He's resisted the anti-Semitic currents of a lot of that politics, right? He's resisted a lot of the virulently anti-trans or anti-gay stuff. Now he does dip his toe into the water of like the culture wars, of course he does.
Speaker 3
35:00
But I've never really seen him, well, I could be wrong about this, embrace white nationalism or identitarianism. I've never seen him really go anti-Semitic. I've never seen him take that route. When I grew up, and I would turn him on every now and then, he was talking about NAFTA, the WTO.
Speaker 3
35:17
He was talking about 9-11. He was talking about the World Trade Organization and a lot of these big conferences, whether it was the Bilderberg Group, whether it was Bohemian Grove, which he infiltrated. And he was talking about, hey, here are the most powerful people in the world, here's what they're doing, and here's how it affects you. And that was interesting to me, because no one else was really talking about it, except Alex Jones, occasionally Art Bell on WABC, you'd listen to him at night, right?
Speaker 3
35:45
I think Alex became very controversial when he decided to back Donald Trump. And then he has a considerable following and a considerable audience that he was then able to marshal in the direction of supporting Donald Trump. That was when the spotlight, because then he was talking to Trump, Trump did his show, Alex Jones just got bigger, right? I mean, he blew up,
Speaker 2
36:08
that's the
Speaker 3
36:08
term, right? He blew up, like he had the good, he put out the good HBO special, whatever you wanna call it, he has a hit song. He blew up, And then people started looking at the things that he was associated with.
Speaker 3
36:20
The Sandy Hook thing is a blemish on his record. I do believe he regrets it. But again, I do see the point of the families who are like, dude, fuck this guy forever. This is the worst thing I ever went through.
Speaker 3
36:29
It's very tough. I understand the people that say that. And I understand the people that go, when you have tech companies that act in a coordinated manner to just get rid of someone, and they don't have any way to defend themselves, it's a little terrifying when you think about that power being abused, and how wouldn't it be? Do
Speaker 2
36:53
you think he should not have been banned from all these platforms?
Speaker 3
36:57
I don't think, I do think that if you are a private company, right, I do think, and this is where you run into this problem, like, I don't know if these tech companies were government utilities, would that decrease people's likelihood of being banned? I don't know, right? So I understand the benefit of them being treated like public utilities and people thinking they have the right to a Twitter.
Speaker 3
37:21
I've never, I don't know, I have very little confidence. I mean, the government's trying to roll out a vaccine in California and we vaccinated like five people, I mean, in terms of what we need to do in the state. Right? So maybe if it was a government utility, I do think someone like Alex, like there should be some process.
Speaker 3
37:40
So if you're going to get rid of someone, they should have a way to defend themselves. There should be a more democratic, yeah, process that you can go through than just being unilaterally taken off something.
Speaker 3
37:54
But like, then you run into the, you're like, am I gonna say that everyone deserves, no, if you're threatening or harassing people or threatening to kill them, publishing their private information. If you're committing crimes on these platforms, obviously the people that own these platforms are gonna be like, we're not gonna allow this to happen. So I understand that there is a line, right? There is some, like people that say there's no line aren't really thinking.
Speaker 3
38:17
Like there is a line. I just don't, that line seems to be moving all the time and it seems to be a very hard thing to police. But I don't think you can remove a guy off everything and then also bank accounts won't give him debit cards or credit cards. I don't know if
Speaker 2
38:31
you talked to him about that. But like, you know, there were financial institutions that were refusing to let him, you know, park his money. So I mean, it really does get pretty terrifying pretty quickly.
Speaker 2
38:43
Probably without any transparency from those companies. So you're right, it feels like there should be a process of just having, for him to defend himself.
Speaker 3
38:54
I think there needs to be a process for people to defend themselves. Every day I wake up and I go, is something I said in a video going to get taken out of context? Is somebody going to get angry?
Speaker 3
39:04
Is somebody going to be, you know, I say wild stuff because that's what makes me laugh. That's what makes my friends laugh. And that's what makes my audience laugh. So I, I never, ever people, you know, whatever, whatever political side you come down on.
Speaker 3
39:18
I think if you make your living speaking, it's always interesting to me if you are pro deplatforming. That's odd.
Speaker 2
39:27
It's interesting to consider kind of a jury context, where, you know, there's transparency about why your video about bombing Disney World might be taken down. It gets taken down, and then there is, it's almost like creating a little court case, a mini court case, not in a legal sense, but in the public sphere. And then people should be able to have, you know, you pick representatives of our current society and have a discussion about that and make a real vote.
Speaker 2
40:00
You know, just have, like, a jury lock itself up in a discussion. That kind of process might be necessary. Right now what happens is, Twitter is completely, first of all, they're just mostly not aware of everything they're doing. There's too much stuff. But the stuff they are aware of, they make the decision in closed-door meetings, without any transparency to the rest of the company actually, but also without transparency to the rest of the world.
Speaker 2
40:29
And then all they say is, we're making decisions because, the people, they use words like violence. So violence equals bad, and if this person is quote unquote inciting violence, that gives us enough reason to ban them without any kind of process. I mean, it's interesting. I'm torn on the whole thing.
Speaker 2
40:51
If it was indeed, there's no transparency about it, but if Parler was indeed inciting violence, like if there was brewing of violence, potential violence where thousands of people might die because of some kind of riot. Like, this is the scary thing about a mob, about when a lot of people get together, who are good people, like legitimately good people that love this country, that don't see enemies yet around them, but if they get excited together and there's guns involved, and then some cop gets nervous and shoots one person, another person shoots the cop, and then there's a lot of shooting involved, and then it goes from five people dying in the Capitol to thousands of people dying in the Capitol.
Speaker 3
41:40
Well, in fairness to defend the people of the Capitol, they didn't shoot the cop, they bludgeoned him to death with a fire extinguisher.
Speaker 2
41:49
Yes.
Speaker 3
41:49
So I do wanna just kind of put that out as a defense of them. Listen, I'm sure there was some wild shit going on on Parler, and I think the problem, here's the problem, right? There's a lot of people that just wanna go on these sites and say they wanna kill everyone.
Speaker 3
42:06
And the problem is, at what point do you shut them all down? Like I think a lot of people are just living in a world where they're powerless, they don't have any political power, They don't have any economic power, right? They can't throw their money around. They don't have healthcare.
Speaker 3
42:23
Their job security isn't great. They might be living in a community that doesn't have the resources they would like it to have. They're not happy and thrilled. And then they have these sites where they can go on and just say, man, I'd like to fucking burn it all down.
Speaker 3
42:39
And distinguishing a guy blowing off steam and saying wild stuff from a genuine threat is a very hard thing to do. Like I've threatened to kill, I got banned from Airbnb, I threatened to kill the people that banned me comedically. Comedically, this is a joke, I'm not going to kill you.
Speaker 2
43:01
This
Speaker 3
43:02
is a joke because I'm blowing off steam
Speaker 2
43:03
and
Speaker 3
43:04
I'm angry. Do you know how many times my parents, like my dad's like, I'm gonna fucking kill this guy, my mom's like, I'm gonna fucking kill.
Speaker 3
43:11
They were talking about each other, but none of it ever happened, but we should be, I think you have to create a space for people to threaten to overthrow the government. Yeah. As long as they don't violently do it.
Speaker 2
43:26
Yeah.
Speaker 3
43:26
I mean, does that make any sense? Like, I mean, as long as they're not going to go hurt innocent people, what are you gonna do? Like there's so many people out there that, that's why a lot of these things like 4chan, these sites, a lot of people going on there, they just wanna say the most fucked up shit because it's the thing that gives them, they can laugh or they can release steam.
Speaker 3
43:46
And it is immature. It is stupid. It's not productive. It's not, you know.
Speaker 3
43:51
But at the end of the day, if you're not going to give people health insurance, you've got to give them something. It's like when someone in this country dies that everyone disagrees with, right? Political figure, media figure. A lot of people dance on their grave online.
Speaker 3
44:05
And the other side will always do it. Like if a conservative dies and everyone goes, great, conservatives go, this is grotesque. And then when RBG dies, they all have parties and the conservatives go, great.
Speaker 3
44:20
You have to let people in this country enjoy the deaths of their enemies. You do, because they don't have much else. Again, if you gave them other things, you might say, guy, you can go get a knee operation. Why don't you stop?
Speaker 3
44:36
But if they're working for shit wages and you haven't figured out a way to treat them, treat their cancer diagnosis, and they don't, I mean, life, you know, you gotta derive pleasure from something, right?
Speaker 2
44:54
It's an interesting point that anger is a good valve, if your life is suffering, that there's something very powerful about anger, but I still have hope that it doesn't have to be. I mean, that kind of channeling into anger that then becomes hate led us into a lot of troubles in human history.
Speaker 3
45:15
So you
Speaker 2
45:15
have to be careful empowering people too much in that anger, especially, I think my, I think I understand why people were nervous about Parler, about Twitter and so on. Because all that shit talking about violence was now paired with let's get together at this location. This was a new thing.
Speaker 2
45:41
Like it's not just being on whatever platform talking shit, it's saying we're going to, in physical space, meet. And then everybody got, all these platforms got nervous. Well, what happens when all these shit talkers, all these angry people that are just letting off steam meet in the physical space? And there was probably overreach, almost definitely overreach, but I can understand why they were nervous about it.
Speaker 3
46:07
I agree. There doesn't seem to be, and this is when Trump got elected, and when you have, whatever you have, right? Whether you have riots in Portland and Seattle, where you have the Antifa people doing crazy things, or you have, you know, the people storming the Capitol, there never seems to be a ton of examination of why these ideas are becoming popular.
Speaker 3
46:24
Why are people so angry? What is leading people to this? Why are we here? What about their lives has gotten to the point where they need to show up at these places?
Speaker 3
46:34
And like, and obviously there's always gonna be people on the fringe, there'll always be the mentally unwell, there'll always be people that wanna destroy society. But when you look at how popular, you know, long-discredited things are, whether it's fascism, you know, totalitarian communism, all of these things, it's like, why are they back? Why are they back in a big way? And why are people so fed up with the status quo that they're finding solace in the most extreme discredited theories of how to run and operate societies, theories that have led to the deaths of a lot of people?
Speaker 3
47:11
So to me, I'm like, if those people at the Capitol, yes, if they were going to work, if they were able to go out and drink at Chili's, if they were able to get a fucking checkup, right? Like if their job paid a little bit better. And I'm not saying that this is all the reason, right? I'm sure that there's a lot of people there that are doing quite well and they're still nuts.
Speaker 3
47:38
But like, the anger and the rage that's boiling to the surface of this society, does it come from the fact that across the board, people in very different areas and with very different political beliefs feel like they are being fucked over and there's nothing they can do about it? That's the baseline to me. They look at the people that run the country and run the world, whether they're tech titans, the guys that you talked to, or whether they're people that run the government, whether they're people that run large banks, large media companies, the people that have created this kind of infrastructure that everyone lives in, and next to them, these people are incredibly powerless. And when you push people to that point, logically, sadly, and unfortunately, the next thing does seem to be violence.
Speaker 2
48:33
Yeah, the thing that troubles me a lot is, you said nobody's asking why these beliefs are out there, but sometimes it's not even acknowledged that people are hurting, that people are angry. Just even acknowledging all the conspiracy theories that are out there, acknowledging that they're out there, and that people are thinking about them and talking about them. Because otherwise it's not acknowledged in this nuanced way. What happens is you say, okay, 70 million people are white supremacists.
Speaker 2
49:07
It's just throwing a kind of blanket statement. And of course that gets them angrier and makes them feel more powerless. And that ultimately, that's what's been painful for me to see is that there's not an acknowledgement that most people are good. And there's circumstances where it's just, you're pissed off.
Speaker 2
49:36
Right. Because you were powerless.
Speaker 3
49:38
I mean, most of us are powerless. You could fall in with a bad crowd. That's the thing, you can just fall in.
Speaker 3
49:43
Yeah. And it doesn't mean that there's not blame. Obviously, you have agency, you're a person. But the idea that you could be rehabilitated, you could do something stupid or you could fall into a group of people that are, and then in a few years you could go, what the fuck was I doing?
Speaker 2
50:01
You know,
Speaker 3
50:01
I'm an ex-drug addict. I know what it's like to go from being 1 thing to being another thing, right? I'm still a drug addict.
Speaker 3
50:07
If I were to use drugs right now or drink, I would still be addicted to them, right? I mean, it's not something that I can ever change about myself, but I know what it's like to go from 1 thing to another thing. So when you look at racism or whatever, ism, homophobia, misogyny, whatever you're looking at, anti-Semitism, and you go, that's a fixed condition where nobody's ever going to be able to change. Nobody's ever gonna be able to be rehabilitated.
Speaker 3
50:30
Nobody's ever going to be able to reimagine themselves in a different way. To me, you're just, you're throwing away someone and you're making them feel helpless and worthless. And that's gonna lead to antisocial behavior that spills out into violence. We don't have a very redemptive society.
Speaker 3
50:48
That's a huge factor. We don't have a redemptive society. That's why I like OJ Simpson, because OJ Simpson, yes, he did a bad thing, supposedly. Allegedly, yeah.
Speaker 3
50:58
But he's very kind now on Twitter, and he makes very nice points about how we all have to get involved in the political process and he's on golf courses and I like watching people golf. I don't do it, but I like watching him do it. And he's like an elder statesman because I remember him from the Naked Gun and I choose to forgive him for whatever happened there, which I don't know. But I choose to forgive him really for, I mean, obviously, what they say is he cut his wife's head off.
Speaker 3
51:26
But I can look past that and redeem him because he's very stable on Twitter and he's a good, I see all these people going crazy on Twitter and I'm like, there's maybe, OJ's lived a full life. I think there's a benefit to that. There's a benefit to kind of living a full life. Yeah,
Speaker 2
51:45
how many of us have not at least tried to murder somebody in the
Speaker 3
51:48
hundred percent. Listen, OJ's had the highs and the lows, but he did it on his terms, and there's a real
Speaker 2
51:57
like a Frank Sinatra song. Yeah, he
Speaker 3
51:59
did it my way. I mean, like, there's a benefit to that. And he seems like a very well-adjusted person now. So, I mean, I don't know how that is a fact.
Speaker 3
52:07
But it is a fact, and that's an uncomfortable fact.
Speaker 2
52:09
Well, that's a strong case for forgiveness in one of the more extreme cases, I suppose. But yeah, there's not a process of forgiveness. It seems like people just take a single event, or sometimes a single statement, from your past and use that as a categorical capture of the essence of this particular human being.
Speaker 2
52:31
So murder might be a thing that you should get a timeout for a little while.
Speaker 3
52:38
Murder's bad. Murder, and let's just say that. Murder is not good.
Speaker 2
52:43
I'm glad you make this definitive statement.
Speaker 3
52:46
It's controversial. OJ's an interesting cat, because you're like, he's very stable on Twitter. He's very like, he's like, let's take a look at it, guys.
Speaker 3
52:55
Like, we need more of his energy. That's what I'm trying to say. Yeah, yeah. I know, like, yes, it was bad.
Speaker 3
53:00
He killed the woman and the waiter. I was not for that. I wish he didn't do that, but the OJ Simpson trial was such a fun
Speaker 2
53:09
thing. Yeah, and like you said, we need more fun people in society. Speaking of fun people, your politics have been all over the place.
Speaker 3
53:17
I hope so, I hope so.
Speaker 2
53:19
I mean, that's-
Speaker 3
53:20
Imagine not, imagine someone whose politics weren't all over the place. It would seem odd. Right, like, in the 10 years that I've been politically conscious, just because I'm 35, and, no, I've probably been conscious for over two decades, but like Democrats have become Republicans, Republicans have become Democrats. I remember when Ann Coulter said we needed to defend George W. Bush when he said, we need to go out and Christianize or modernize the Arab world, we need to democratize the Arab world.
Speaker 3
53:45
And then Ann Coulter backed Donald Trump. And all the right wing in America believed in nation building. They believed in going out and democratizing areas that might breed radical terrorists, whether it was Iraq or wherever you were going, toppling regimes and instituting new democratic norms in those countries. That was the right-wing point of view when I grew up.
Speaker 3
54:07
Then the right-wing switched to, we are going to be isolationist, we're gonna take care of America first and foremost, we're not gonna go into other countries. And then the Democrats, who when I grew up were doves, and the right-wing people were more hawkish, and the Democrats were like, the military solutions aren't the way. We need to have multilateral diplomatic coalitions to solve all the problems. Now, you know, Rachel Maddow's like, let's nuke Russia every night on MSNBC.
Speaker 3
54:37
The Democrats are like, we need strong presence in Syria. We need a strong presence. We need to counter Putin all over the globe. We need to get, so they're more hawkish on things.
Speaker 3
54:47
So literally I have watched two political parties literally flip, and it's crazy to watch.
Speaker 2
54:54
And in some sense I've watched that as well, because when I first saw Barack Obama, I admired that he was against the war. This is, whatever, maybe before he was a senator, he spoke out against the Iraq war. And then, you know, it doesn't feel like, it feels like his administration was more hawkish than dovish in a sense, with all the drone attacks, with the sort of inability to pull back, or at least en masse, efficiently pull back from all the military involvement that we have all over the world.
Speaker 2
55:34
Right. And just the language. What I think is interesting
Speaker 3
55:37
about that, what's interesting about Obama, because it's a very interesting study, is that presidents are controlled in very different ways, right? You know, presidents can be controlled by different factors, power factions within Washington. You know, I think one of the reasons that Obama was, maybe, you know, he had a very close relationship with John Brennan, who was the CIA director.
Speaker 3
55:57
And Obama was very close with John Brennan, and Obama was very, you know, I think, malleable to the extent that the CIA, and I've had CIA agents on my show, John Kiriakou, a guy who went to jail for exposing torture, was saying that, you get into the Oval Office, all of a sudden you're having that presidential daily briefing every day, and the intelligence people come in and they go, listen, man, I mean, there's gonna be a terrorist attack on your watch if you don't do X, Y, and Z. They go, we have, they call it blue book information, which is five levels above top secret, and they go, hey man, a guy in Iran at a cafe said he's blowing everything up next week. And you know, I mean, it's the same thing as Parler. You don't know if it's true or not.
Speaker 3
56:39
But now the president's making decisions on, usually, a lot of uncorroborated intelligence that goes into a presentation for the president, where you're just terrified every day and you don't want a terrorist attack on your watch. Now, so why are they getting all this information? Because a lot of the people in Washington have an interest in perpetual, constant, ongoing warfare. And there's a lot of financial gain to be had from that.
Speaker 3
57:02
So they're sneaking their information into the presentations that are going to the president, and then the president is now behaving and going, fuck, I don't want a bomb going off. We gotta do what we gotta do. And whatever version of that happens, that is really kind of what is happening, where the presidents are being controlled by forces that are outside of the political sphere, but very much still in it, and they have a lot of power.
Speaker 3
57:27
That's what the deep state is. You know, Trump, there's a lot of ridiculing Trump of going, the deep state doesn't exist. It absolutely exists. There's been books about it written by liberal journalists.
Speaker 3
57:36
The deep state is only a term for largely unelected power factions in Washington, DC, that outlive any presidential administration. These are people that might work at the State Department, they might work at the Defense Department. These are people that are not always working officially in any government capacity. They might be private companies, they might be military contractors, they might be people at Boeing or Raytheon or General Dynamics.
Speaker 3
58:06
And they constitute a group of people that Trump kind of called the swamp, but Trump had really no interest in draining the swamp. But he articulated these things, and this is what it is. You have a lot of interested parties that have budgets that they want, big budgets. Everybody wants a budget in Washington, whatever it is, they want money.
Speaker 3
58:29
And these are the people who really control. So this idea that the president is the be all end all has gotta be smashed, which is why the horse race model of politics and being like, is it right wing? Is it left wing? Is it, what team am I on?
Speaker 3
58:41
And what color am I wearing? It's very simplistic, but the reality is this is an empire. It's past its peak.
Speaker 2
58:48
We're in trouble. The United States is an empire past its peak.
Speaker 3
58:51
Yeah, I mean, that's just, you could prove that case in court.
Speaker 2
58:55
Well, let's go to court right now, but I do love the more complex idea that there's just human beings who crave power and seek ways to attain that power through different ways. If you have Barack Obama or George Bush or Donald Trump, there's different attack vectors.
Speaker 3
59:15
Correct.
Speaker 2
59:15
Different ways to attain that power and then you can use that to leverage. And it probably doesn't have to be just in Washington, D.C. There's people who crave power all over the world.
Speaker 3
59:26
Of course, not in, but where we are now in Los Angeles, these people are all good. L.A. Studio executives, people that I, from what I understand, they treat everyone fairly and they're nice.
Speaker 2
59:38
But he
Speaker 3
59:39
sees the bad guys, but out here in LA. West Coast. Everyone's lovely.
Speaker 2
59:43
So amidst this fun exploration in your mind through the political landscape that you've done over the past couple of decades that you've been politically conscious: where does Donald Trump fit into this picture for you?