Speaker 1
00:00
The Joe Rogan Experience. The U.S. Government produces more classified information than non-classified information.
Speaker 2
00:12
So even if like there's an audit, they could redact everything.
Speaker 1
00:16
Yeah, and it's just all these different divisions and departments, and they all have their own protocols, so just getting a handle on it, I mean, that's the first thing that has to get done. But not that we're even going to get the real information from there.
Speaker 2
00:32
Right. But then there's also the national security aspect of it. It's like, you know, you have to have some things redacted because, you know, of China and Russia. Like, they could just say that and then...
Speaker 1
00:44
Yeah, that is the phrase that gets used. Sure. It's so sad.
Speaker 2
00:47
Because they have a full, like, clampdown on their population. I mean, they limit access to the internet. Their internet is essentially, like, China-based.
Speaker 2
00:57
Like, VPNs are illegal. And they're trying to do that here in America.
Speaker 1
01:02
It's all backwards. Yeah. With the RESTRICT Act.
Speaker 2
01:05
That is wild. It's getting,
Speaker 1
01:07
it's getting nasty.
Speaker 2
01:07
Twenty years if you use a VPN, which is hilarious.
Speaker 1
01:09
And it's managed by the Commerce Department. Unelected bureaucrats are the ones... see, TikTok is actually not even named in that act. They're just letting the Secretary of Commerce decide which apps.
Speaker 1
01:25
That's insane. It's insane.
Speaker 2
01:28
So, yeah. Dan Crenshaw posted about it. He thinks it's not that big a deal, because, you know, there's a lot of acts that get pushed and then they never get passed through. But what's disturbing is just the idea, the desire to do this, and the fact that... imagine if it did get passed. I mean, it's just a fucking full-on assault on free speech.
Speaker 1
01:56
Yeah, I mean, it seems to be getting a toxic stigma connected to it. Did you see Jesse Watters grill Lindsey Graham about it?
Speaker 2
02:05
No, I didn't.
Speaker 1
02:06
Oh, he didn't read it, but he endorsed it.
Speaker 2
02:10
Oh, Jesus.
Speaker 1
02:11
And he just got completely called out. It was really funny. Like, that should be illegal.
Speaker 1
02:15
You should not even be able to sign something that you haven't read. And they can't read it. It's too long, right? There's not
Speaker 2
02:22
enough time. There's not enough time. That's a lot of these acts, right? And they slip a bunch of shit in there.
Speaker 2
02:27
It's like, wait a minute, what about page 485? Like, what the fuck is going on there? And then, like, oh...
Speaker 1
02:36
Yeah, I mean, meanwhile it's just, like... because
Speaker 2
02:38
it changed discourse in this country. It's gonna change what people have, you know, the access that people have to free speech and communication.
Speaker 1
02:46
And I mean, I think a lot of people endorsed it, righteously, being concerned about TikTok. You know, that's what was so sneaky, is that, you know, they enrage you to then support this disaster. Yeah.
Speaker 1
03:02
And it's just, like, we can all agree that there's a problem with TikTok, and that the Chinese government having access to all of this data is problematic. But there should be an encrypt act, like, encrypt everything. But you can't go around banning apps. It just doesn't work. It's irrelevant.
Speaker 1
03:22
People are going to use VPNs. I think this act... I don't think it's gonna make it.
Speaker 2
03:29
I hope I- I hope you're right. Yeah. Because more people are talking about it.
Speaker 2
03:32
Tulsi Gabbard posted a big thing about it. There's a lot of people that are up in arms. But my concern is if it wasn't for social media, that act, which was kind of ironic, right? If it wasn't for social media and people sharing this and becoming outraged and people discussing this, it would have slipped right through, like the Patriot Act did.
Speaker 2
03:51
The Patriot Act existed in a time when there wasn't social media, and people weren't really aware of what they were pushing through until it was too late.
Speaker 1
04:00
Yeah, I think there are much better solutions. I mean, did you watch any of the TikTok CEO getting grilled?
Speaker 2
04:06
Yes, I did.
Speaker 1
04:06
Okay, so, you know, that was interesting because, you know, he seemed like a pretty sober guy. His point was, well, you have to have consistent standards for other social media companies too. I mean, how do we know about Facebook and Google? Just because they're US-based doesn't mean that they're not giving data to China. We have no idea.
Speaker 1
04:29
We have no idea. So that's really the issue. We need to understand what specifically are all of these apps doing. They should be labeled very specifically.
Speaker 1
04:39
And we're starting to see some of that happen, but the thing is you can't know with these proprietary apps because they're just not sharing anything.
Speaker 2
04:48
I think one of the problems that people have with any kind of decentralized app, like yours or any other decentralized social media network, is that people immediately go, oh, what do I have to do to do this? Like Mastodon. When people start using Mastodon, and you get on it, you're like, what is this? There are so many servers, and how do I know what to join, and what's going on here?
Speaker 1
05:16
Yeah, so, well, Minds is different. Minds is actually not fully decentralized. We're a hybrid.
Speaker 1
05:21
So we run a centralized infrastructure, but we interface through delegation, delegated cryptographic event signing. That's happening in the background, but our app feels like a normal social media app. It's different. Mastodon, the way that works is federated instances.
Speaker 1
05:40
So there are all of these different instances with different URLs, and there's like 20 people on each one. But there is some interoperability between the instances, because you can subscribe to somebody on another instance from your instance. But it's not fully decentralized, it's federated. The problem is that you don't own your identity. So if one of those instances goes down, you're screwed.
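(For reference: the federation being described works roughly like this. A handle such as user@example.social is resolved by asking that user's home instance over WebFinger, the discovery protocol Mastodon uses for cross-instance subscriptions. The sketch below is illustrative only, not Minds or Mastodon source code; "user" and "example.social" are placeholder names, and it assumes the third-party requests package.)

```python
import requests

def resolve_handle(user: str, instance: str) -> dict:
    """Look up an account on its home instance via WebFinger (RFC 7033)."""
    resp = requests.get(
        f"https://{instance}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{instance}"},
        timeout=10,
    )
    resp.raise_for_status()
    # The JSON response links to the user's ActivityPub actor document --
    # what another instance fetches when you subscribe from your instance.
    return resp.json()

# The catch described above: the identity lives entirely on example.social.
# If that one instance goes down, this lookup -- and the account -- goes
# with it.
print(resolve_handle("user", "example.social"))
```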
Speaker 1
06:07
Your stuff is gone. In Nostr, which is an architecturally different setup, and there are other protocols similar to Nostr, it doesn't matter if the website goes down. You just pop over to another one, upload your key, and all your stuff is there. And that's why we like it, because it keeps us in check.
Speaker 1
06:27
Because our users can now, basically, if we fuck around, they'll bounce, and they can take their stuff. The social graph specifically is the key, because you spend a decade getting all these followers. It's your life. People spend their lives doing this, and then to be able to just get taken out by YouTube is so devastating and unethical.
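(For reference: the key-portability idea described here can be sketched as follows. Real Nostr, per its NIP-01 spec, uses Schnorr signatures over secp256k1; this illustration substitutes Ed25519 keys from the third-party cryptography package purely to keep the sketch short. The point is the same: the keypair is the account, any server can verify a post, and none of them owns the identity.)

```python
import json
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The keypair IS the account: no server issued it, so no server can take it.
signing_key = Ed25519PrivateKey.generate()
public_key = signing_key.public_key()

# A post is just a signed, self-contained event.
event = {
    "pubkey": public_key.public_bytes_raw().hex(),  # your portable identity
    "created_at": int(time.time()),
    "content": "hello from whichever relay happens to be up",
}
payload = json.dumps(event, sort_keys=True).encode()
signature = signing_key.sign(payload)  # signed client-side, never by a server

# Any relay -- including one you have never used before -- can verify the
# event came from that key. If one relay disappears, you republish the same
# signed events elsewhere: the "pop over to another one, upload your key"
# behavior described above.
public_key.verify(signature, payload)  # raises InvalidSignature if forged
print("verified post from", event["pubkey"][:16] + "...")
```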
Speaker 2
06:53
It's ridiculous. Well, it's really creepy, too, because many of the things they took people out for have turned out to be true. Like, there were a lot of things that they were labeling as disinformation or misinformation which are 100% proven fact now, and people lost their accounts, and there's no recourse. They're not gonna reinstate you. And that was a problem also with Twitter, that for the longest time, if you said anything that was contrary to whatever the narrative was.
Speaker 2
07:21
Whether the government was pushing it or the CDC was pushing it, like anything contrary to that narrative you would get fucked.
Speaker 1
07:29
Yeah, and those people are not back though. I think Twitter's making way more progress than everyone else. And look, I'm ultimately an Elon fan.
Speaker 1
07:37
I'm rooting for him. I think it's vastly improved. But there's chaos currently underway at Twitter. Oh, sure.
Speaker 1
07:45
And those people have not all been let back on. And I don't really understand why.
Speaker 2
07:49
So who hasn't been let back on?
Speaker 1
07:49
The people that we don't know. Like, some random Joe Schmo posting a COVID study, has he been let back on? All the thousands of people that got banned.
Speaker 2
07:57
Well, I think he essentially let back on everyone who didn't do anything illegal.
Speaker 1
08:04
Not Alex.
Speaker 2
08:05
Not Alex, yeah that's true. Why? Yeah, well that's a personal opinion of Elon's, which I don't agree with at all.
Speaker 2
08:13
Yeah. Because they let Andrew Tate on. Right. You know, it's like- Yeah,
Speaker 1
08:17
it doesn't mean that he's endorsing Alex to let him back on
Speaker 2
08:20
Right? It doesn't. I mean, because there's a lot of people that are back on that, you know, didn't make that one specific mistake that Alex made, but they've said some...
Speaker 1
08:29
But the mistake, the reason Alex was banned, was because he confronted... it was actually for something he did off Twitter. He confronted this journalist, Oliver Darcy, in a line at some event. And he was, you know, being Alex Jones, sort of ranting at him.
Speaker 1
08:47
And then Twitter said, oh, you're bullying this guy, and this is not acceptable behavior, so you're gonna leave. But then, when I remember the exchange with Elon and whoever it was that was asking, it was that he hadn't been let back on because of the Sandy Hook stuff. Which is not the same, that's not even why he was banned. So, you know, it's not easy.
Speaker 1
09:08
I understand the politics of it, and he probably has Tim Cook being like, you know, we're not going to advertise if you have Alex Jones. I don't know what's going on, but it doesn't seem to me... because he could win the argument if he would just let him back on. And did you see this crazy clip of Elon and the BBC guy?
Speaker 2
09:31
I did, I posted it today on Twitter.
Speaker 1
09:32
Oh, you did?
Speaker 2
09:33
It was amazing.
Speaker 1
09:33
Amazing.
Speaker 2
09:34
It was amazing. It was amazing, yeah. Because that guy kept trying to change subjects, like, let's move on. No, no, no, no, no.
Speaker 2
09:41
What the fuck are you talking about? Because that guy thought he could just say the narrative without specific examples, like give me an example. And the guy had no examples.
Speaker 1
09:52
That's most people who are concerned about this
Speaker 2
09:54
Well, this is like a lot of people that I know that are famous that, like, publicly announced they were leaving Twitter. And, you know, one of them I really love, and I was like, why are you doing this? I didn't even say anything to her, but I'm like, why are you doing this? This is so dumb. You're just doing this because this is the thing that everyone feels like they're supposed to do. Hey, well, Twitter's kind of fucked now. No, it was fucked before, it's less fucked now. Yeah, there are people that are gonna say things, like what I showed you earlier today, which is hilarious, that someone posted to Kamala Harris. Oh yeah, she said something about the assault weapons ban. That shit's important.
Speaker 2
10:28
It's important to have people mock people. Like, I'm sorry if it hurts someone's feelings, but that shit's important.
Speaker 1
10:35
Yeah. And I think the way that you handled that was great, because obviously you need a specific example to back up an argument. However, I sort of think the whole premise of the conversation is wrong. This idea of this war that Twitter is in with all the think tanks. I think it was the Institute for Strategic Dialogue that had actually compiled the information that the BBC guy was talking about.
Speaker 1
10:56
And there is information, there is data showing, you know, hate speech XYZ has increased. However, this is the wrong conversation. It's not about the existence or even the rise of hate, the presence of that content on an app. You're not just trying to ban hate.
Speaker 1
11:18
Banning hate does not stop hate. And this is what the peer-reviewed research shows. So trying to bully Elon and Twitter for... look, even if there was a bump in hate speech since it became a little bit more free, I mean, it seems like that's a potentially understandable intermediary effect while things reorient.
Speaker 1
11:45
Like, we open up free speech, we open up the valve a little bit, okay, because we think that this is gonna be healthy for society long-term. So let it bump a little bit. We need that. We need to see what we hate, or what other people hate. You need to, like... what is it?
Speaker 1
12:00
Free speech. It lets us know who the idiots are. Like, you need to identify them.
Speaker 2
12:04
Yes. Yeah, the best response to whatever it is, bad speech, is better speech. There's better arguments. And you literally have a debate platform, which is what Twitter essentially is.
Speaker 1
12:19
Yeah, that is the purpose.
Speaker 2
12:21
Yeah, it's the purpose.
Speaker 1
12:23
Yeah, and not to mention that the hate isn't defined. So it's only one type of hate that these people are typically referring to. Right-wing hate.
Speaker 1
12:32
Right-wing hate. Yeah, not left-wing. Right, that's okay. And so, actually, we're suing California. We just filed this complaint, because they passed this social media law called AB 587, which requires, it's a censorship law, they require these policies on disinformation, misinformation, hate speech, and then they use, undefined, the words extremism and radicalization. There's no definitions.
Speaker 1
13:04
They don't require you to have a child exploitation material policy, but they do require you to have a policy on hate, which isn't defined. And so we're suing them with the Babylon Bee and Tim Pool.
Speaker 2
13:19
When did they start this, when did they start trying to pass this?
Speaker 1
13:24
It just went into effect in January.
Speaker 2
13:26
So it's now?
Speaker 1
13:27
It's now, it's in.
Speaker 2
13:28
So if you live in California, what are the repercussions?
Speaker 1
13:33
So it's targeted at social media companies, basically mandating that social media companies submit these policies. So we would have to, they would force us to write a policy on hate speech and submit it to them.
Speaker 1
13:51
And then additionally, we would have to, on like a biannual basis, submit analytics about all of our moderation data, which honestly, we're already transparent about our moderation data, so that's largely public anyway. We have a jury system and we have in-house moderators. But it's just, it's a huge burden. Like it's crazy that they would expect companies to submit all that and then have these arbitrarily, well actually not arbitrarily, specifically chosen categories for policies that are clearly politically charged.
Speaker 1
14:24
And Newsom, when he came out and announced this law, it was very, you know, we have to stop hate on social media and misinformation and disinformation. Protect society. Protect democracy. No, you know, you're not protecting democracy by stopping free speech.
Speaker 1
14:42
That is...
Speaker 2
14:43
There's no checks and balances in place if something turns out to be accurate.