A conversation about “The Social Dilemma” documentary on Netflix

Chris and Mark have a conversation about “The Social Dilemma” documentary on Netflix.
Intro Music: “Blue Scorpion” Kevin MacLeod (incompetech.com) Licensed under Creative Commons: By Attribution 3.0 http://creativecommons.org/licenses/by/3.0/
Dive Horn: https://freesound.org/s/104882
Outro Music: “Neolith” Kevin MacLeod (incompetech.com) Licensed under Creative Commons: By Attribution 3.0 http://creativecommons.org/licenses/by/3.0/

Transcript:

(0:00) Hi everybody and welcome to Knocked Conscious. (0:03) I’m going to play this episode in its entirety, but I had a quick correction (0:07) before I played it. I mentioned a gentleman named Matthew (0:10) Ward, who I said was Eddie Van Halen’s guitar tech. (0:13) It actually turns out it was Matthew Brock, not Ward, so I apologize for the (0:18) misspeak, and I hope you enjoy this episode. (0:21) Enjoy. (0:26) I’m really dumb, I don’t know what the fuck I’m doing now, (0:31) cause I can’t hit a simple record button. When we start the podcast now we have to (0:38) start all over again. (0:41) So, uh, welcome. (0:43) Welcome again! (0:44) Hola! (0:45) Take two! (0:46) Take two. (0:47) So we just exchanged some birthday gifts. (0:48) We did exchange birthday gifts. (0:49) Even though it’s October and this is being released in November, (0:52) and our birthday was in September, but the gifts were on a slow boat (0:55) from the Czech Republic. (0:57) From the Czechs. (0:58) Yes. (0:58) Check me out. (0:59) This sounds like a horrible idea. (1:00) What time you want me, sir? (1:02) I saw your picture of the shark. (1:04) It’s hilarious, right? (1:05) What time? (1:07) Christopher has gifted me two of the most beautiful gifts I’ve ever received. (1:12) Coffee mugs. (1:13) A white mug with black block writing that reads, (1:16) I hate being this sexy, but I’m from Czech Republic. (1:20) I can’t help it. (1:21) That one’s going to have absinthe in it. (1:22) Which is amazing because the grammar is wrong. (1:26) It doesn’t say I’m from the Czech Republic. (1:29) It says I’m from Czech Republic. (1:30) True. (1:31) So… (1:32) And there’s no punctuation. (1:33) Right, but it’s funny that it sounds like a person from the Czech Republic would say that. (1:38) I hate being this sexy, but I’m from Czech Republic. (1:40) Right. (1:40) I can’t help it. (1:42) I forward to looking for my mug with Czech on it. (1:44) Have you seen coffee mug, dear? (1:47) The second mug. (1:47) Mug number two is, in contrast, a black mug. (1:51) So we’ve got the lightness of the force and the dark side. (1:55) Yeah, the Darth Vader mug. (1:56) The Darth Vader mug. (1:57) Says… (1:57) Reads, Prague, Czech me out. (2:01) Czech you out. (2:02) Czech me out. (2:02) But Czech is spelled… (2:03) With a Z, not the CH. (2:06) And a CH. (2:07) Yeah, at the end. (2:07) Not a CK at the end. (2:08) Yes, no Ks. (2:09) No Czechings. (2:11) Thank you, sir. (2:12) You’re welcome. (2:12) That is beautiful, man. (2:13) I’m really grateful. (2:14) And my gift was an amazing Van Halen concert ticket from October 15th, (2:22) 1991, with a Van Halen guitar pick signed by Edward attached to it. (2:28) The concert ticket price was $22.50. (2:32) Which would have been $842,000. (2:34) Which front row would be $9 million. (2:37) Brought to you by Ticketmaster and Q102 Philadelphia. (2:41) This is very thoughtful, sir. (2:42) Thank you so much. (2:43) Thank you, man. (2:43) And I will frame it and I will attach it to a wall in my crib. (2:46) I know how much of a fan you are and were. (2:49) I mean, you still are a fan. (2:51) So regardless. (2:52) But I feel like that will have much more meaning in your collectible, (2:56) in your world than it will in the myriad of my collectibles or my myriad of collectibles. (3:02) It’s just really cool that we both went to the same concert. (3:05) Yeah, we went to the same tour, but just in different cities. (3:09) So I told you the story, right?
(3:10) So if I may expound one more time. (3:13) The son… my mom worked at a catering company. (3:16) The owner of that catering company, her son, (3:19) was a musician and moved out to L.A. years ago, back in the 80s, even. (3:23) I think he has a song that’s on one of the 21 Jump Street TV shows. (3:27) Wow. (3:27) You know, you have like, you know, that was like his claim to fame. (3:30) Like it wasn’t like an actual radio hit. (3:33) Right. (3:33) It’s one of the songs on the show. (3:34) Yeah, that’s what I hear. (3:36) And in that time, he became Eddie Van Halen’s roadie, personal roadie for all that time. (3:41) Like a guitar tech? (3:42) Guitar tech. (3:42) Absolutely. (3:43) He was the one coming out, tuning, whatever. (3:44) Wow. (3:45) So he is actually, allegedly, the story that I have, (3:51) is he’s the one who has the drill from “Poundcake” during recording. (3:55) One of the, one of the monitors or speakers blew and he drilled out. (3:57) He was drilling out one of the monitors or one of the speakers. (4:00) And that picked the thing, picked it up. (4:02) And we need to put that on there. (4:04) Yes. (4:06) So allegedly, that’s what he’s known for. (4:08) His name’s Matt, Matthew Ward. (4:10) He comes up. (4:11) I’m not going to dox him because he’s awesome. (4:13) He comes up to my brother and I right before the concert. (4:15) He goes, here, guys. (4:16) And he just drops a handful of these picks. (4:18) That’s fantastic. (4:19) Eddie Van Halen’s guitar picks. (4:21) You know, they’re electronically signed or digitally signed. (4:23) That plastic. (4:24) But it says F-U-C-K on it for the tour. (4:26) It’s got his name on it. (4:27) Those are the ones he throws out. (4:28) The thing is, we just got a handful right before the concert. (4:31) Totally awesome. (4:32) Then we booed Alice in Chains off the stage. (4:35) It was Alice in Chains? (4:36) It wasn’t Faith No More? (4:37) It was Alice in Chains doing Man in the Box. (4:39) And we’re like, Eddie, Eddie. (4:41) F-U-C-K YOU! (4:42) Boom. (4:43) Walks off. (4:43) Throws the mic down. (4:44) Allegedly, it happened multiple times. (4:46) And I think Philly started it, though. (4:47) Of course you did. (4:48) We always do. (4:49) You booed Santa Claus. (4:50) No. (4:50) Snowballs at Santa Claus, my friend. (4:52) And Jimmy Johnson. (4:54) He deserved it. (4:55) But Santa? (4:56) True. (4:57) But did you see the Santa that they had? (4:59) He was a little drunk? (4:59) Yeah, but he was like 28 pounds. (5:01) And they just threw some cotton ball in his face. (5:03) I’ll show you the Santa later. (5:05) He’d be like, no wonder. (5:05) They tried to put one over on us. (5:09) Okay. (5:09) They were Philly Outrage, bro. (5:10) All the outrages. (5:12) And Philly’s already outrageous. (5:14) Yes. (5:15) So welcome to Knocked Conscious. (5:17) Sir, thank you for an awesome birthday month or two months now. (5:21) Two tambien. (5:22) We’ve totally touched this up. (5:24) We’ve had dinner for four at a really nice restaurant. (5:28) All the fishes. (5:28) We’ve had presents galore for months now. (5:32) Yeah. (5:33) And the f- (5:33) Okay. (5:34) I’m putting this up tonight. (5:36) Barring a heart attack before tonight. (5:38) Don’t say that. (5:40) I’m just going to… (5:41) What am I going to, reverse jinx it? (5:43) This episode will be published with a specific date. (5:46) And guess when that date is, sir? (5:47) November 6th. (5:48) You know what that is? (5:49) 11-6. (5:52) It’s a Friday. (5:54) It’s a Friday. (5:54) Three days after the election.
(5:55) Thank you. (5:56) It’s after the f-ing election. (5:59) So we are… (5:59) It is over, ladies and gentlemen. (6:01) Whatever happened, happened. (6:03) We have no idea, nor do we fucking care. (6:05) Well… (6:05) Actually, we do care. (6:05) I am concerned there could be a hanging chad. (6:08) Oh, there’s definitely some dangling chads going on. (6:11) There’s some dingleberries. (6:12) Chad was dangling, but other people are hanging. (6:16) Yes. (6:18) So we decided to do this one because we started a four-part thing on The Century of the Self. (6:25) Yeah. (6:25) That we’re going to go back into part two in a couple weeks, I think we’re going to put that out. (6:29) Correct. (6:30) But in the meantime, this one came up and has gotten some traction. (6:33) Yes. (6:33) And you and I, as social media morons… (6:37) I don’t know another way to say it. (6:39) Sadly, that’s accurate. (6:40) SMM? (6:41) I don’t like that title, but sure. (6:44) Right. (6:44) I don’t… (6:44) Well, we can’t say that other one. (6:46) Well, it’s… (6:46) I mean, that’s… (6:47) Unfortunately, that’s true. (6:48) So we are definitely on the back end of the power curve when it comes to social media. (6:55) So how I have been promoting our show is looking up people who have the word podcast and recommendations. (7:02) Oh my God, you’re giving away your secrets? (7:04) Oh my God. (7:05) I might as well, because everybody’s doing it. (7:06) It’s not like any different. (7:07) All the cool kids are doing it. (7:08) Well, I figured it out, but I reinvented the wheel. (7:10) No one told me, but I was just like, this seems like a smart idea to do. (7:13) And then people yell at you and it’s pretty cool. (7:15) But anyway, somebody will go, can someone give me podcast recommendations? (7:21) To which I will respond, hey, we have a podcast. (7:24) Please give us a try. (7:25) We have a million different types of topics. (7:28) We’re at 37 episodes already when this is being recorded. (7:31) So by the time this will be out, I think it’ll be over 40 at least. (7:36) And you and I know, you’ve been here personally. (7:39) You’ve seen how many topics. (7:40) I’m pretty sure I’ve been here for all of them, but like four. (7:43) Yeah. (7:44) So we’ve done that, right? (7:46) So I tell them and then some people are like, so promotion, bro. (7:50) And you’re like, bro, you asked. (7:55) I didn’t text you something in the middle of the night. (8:00) Lindsay Robinson. (8:01) Oh, still. (8:02) Actually, I’ve got, may I? (8:03) Because I’ve got two. (8:04) No, you may not. (8:06) Tangent number one. (8:07) Would you like to hit the dive horn? (8:07) It’s a little purple guy right there. (8:09) I’ll let you play with it today. (8:11) Just let it go. (8:16) I know, I was trying to have some fun. (8:17) Have a little more fun. (8:18) You got to do it quick. (8:23) I didn’t know it could do that. (8:24) DJ Woodsy. (8:25) What? (8:27) I didn’t know that was a thing. (8:29) Yeah. (8:30) Are you ready? (8:32) No, but you’re going to tell me anyway. (8:33) Of course. (8:34) I’m going to tell you anyway. (8:37) Thank you. (8:38) Ricky shorter stuff. (8:39) I get an email. (8:40) Well, first of all, an email, not a text. (8:43) Oh, a text. (8:43) I’m sorry. (8:43) It’s a text last night. (8:45) First of all, some people think my name’s Sabian because they text me Sabian. (8:49) I’ll explain that in the next more than one tape. (8:51) Sabian text messages. (8:52) Oh yeah. (8:53) Like Nick Saban. (8:54) Because someone, yes, he’s also.
(8:58) A person. (8:59) Yes, he is a person. (9:01) You’re a genius. (9:02) Tongue has been clearly off, my friends. (9:06) Blood. (9:07) In this case, my name is Mark. (9:08) So that’s good. (9:09) It’s always good to address the person (9:11) you’re texting by their actual name. (9:13) That’s so kind and thoughtful. (9:15) Mark, Mark Kelly wants to take your guns. (9:19) Exclamation point. (9:21) Protect your Second Amendment rights. (9:23) Vote for Martha McSally for US Senate. (9:26) Reply. (9:26) Stop to cancel. (9:27) So I gotta do fucking work? (9:29) Don’t… fuck you. (9:30) Don’t tell me what to do. (9:31) First of all, you’re not the boss. (9:32) Yeah, first of all, you’re not the boss. (9:33) Of me. (9:33) Would you like, would you like to read the response? (9:35) I think you’d enjoy it. (9:39) I mean, you don’t have to. (9:41) Should I be worried? (9:42) I think you read it before. (9:43) I think I already sent it to you. (9:44) I think, I think I know what it says. (9:45) It’s a good one though, right? (9:47) Okay. (9:47) Do you want me to read it? (9:48) I’ll read it. (9:50) The other one you don’t know. (9:51) So I can’t wait for you to read that one. (9:52) So Czech Mark’s response to the person that wants (9:57) Martha McSally to win for US Senate in the state of Arizona (10:00) was, quote unquote, (10:02) If your wife was shot in the head, (10:04) you’d probably have a different perspective about guns. (10:09) Do you see any text after that? (10:11) No, I was trying to open a conversation, right? (10:13) Like, wasn’t that an opening? (10:15) Well, that’s a valid point. (10:17) I mean, it’s a valid point, right? (10:18) Look, Gabby Giffords is who we’re talking about. (10:20) Yeah. (10:20) The Congresswoman from Tucson. (10:21) A gunman went down there at a rally, shot her in the head, his wife. (10:25) Yeah. (10:25) Mark Kelly’s wife, which I think that’s why he’s running, is to help to kind of (10:30) not take over for her, but to bridge the gap. (10:33) Well, she’s challenged, right? (10:35) Can she, she still is missing some faculties, isn’t she? (10:38) I don’t know all the details of her health. (10:40) She got shot in the head like twice. (10:42) Yeah. (10:42) So at least once it was bad. (10:44) She was critical. (10:44) Like had to relearn how to do pretty much everything. (10:47) She couldn’t walk. (10:47) Yes, correct. (10:50) And, you know, this, this Jamo just texts me out of the blue and tells me that my guns (10:54) are going to be taken away. (10:55) So I reply that way. (10:56) What’s funny is, you know, whether I’m a gun advocate or not, but do you think this person (11:02) has any clue? (11:03) Absolutely not. (11:04) Absolutely not. (11:04) You know why? (11:05) Because he just read my response. (11:06) It says, if your wife was shot in the head, you’d have a different opinion. (11:10) I bet you, if Megzy got shot in the head, I’d have a different opinion about the gun (11:13) that was used. (11:14) Of course. (11:14) Absolutely. (11:15) So, you know, that kind of blanket “wants to take your guns” fuck-you from the guy, the one race, (11:22) and this isn’t political. (11:23) This is just human, right? (11:24) The one person who wants to help, who’s going to do it. (11:28) And his wife was a direct victim of gun violence. (11:32) As a politician, and he thinks he can make a change. (11:36) How could you see anything wrong with that? (11:38) Has he said, has Mark Kelly said anything? (11:40) No. (11:40) About gun control? (11:42) I’ve not seen a single thing about it.
(11:43) I mean, I haven’t really been paying attention. (11:44) No, but it’s Arizona, right? (11:46) So we’re a gun-toting nation. (11:48) So state also, right? (11:49) Well, nation, state, we’re kind of our own little thing in our own little weird way, (11:53) right? (11:54) Yeah. (11:54) We’re a gun-friendly state. (11:57) Yes. (11:58) Okay. (12:00) So here’s the next one. (12:01) What does this have to do with today’s topic? (12:03) Because it’s on social media, kind of. (12:06) It’s a little tangent. (12:07) Well, this is after the election, so I can dump this. (12:09) I’m like, I’m, I want, I’ve been wanting to talk about these things that I’ve been getting (12:14) for a while. (12:15) Okay. (12:15) But I can’t do it before the election because I didn’t want to, other than that one person. (12:20) Yeah. (12:20) Of whom we spoke. (12:21) So here’s a better one. (12:23) Hey, Sabian. (12:24) That’s my name. (12:25) Sabian. (12:25) S-A-B-I-A-N. (12:27) It’s Emily, a volunteer with Stand Up America. (12:30) Mail-in ballots in Arizona are being sent out. (12:32) It’s time to get ready to vote. (12:34) Have you received your ballot yet? (12:35) Reply, stop to unsubscribe. (12:37) Once again. (12:37) Hey, Emily, don’t tell me what to do. (12:40) First of all, you came at me. (12:42) My lack of response should make you stop. (12:44) How about you stop? (12:46) No means no. (12:48) Would you like to check out my awesome response? (12:50) Were you, well, you know, you were dressed kind of slutty, dude. (12:53) So I was, I was, I was asking for it. (12:54) Yeah, I know. (12:55) The little midriff showing. (12:57) I had the shirt tied off. (12:58) Halter top. (12:58) I had the shirt tied off the top. (13:00) Kind of schoolgirl outfit. (13:01) With my speedos. (13:02) Yeah. (13:02) With my banana hammock. (13:03) Hit me baby one more time. (13:06) With my banana hammock. (13:08) So the response, after being told, called Sabian, by the way. (13:12) Me, me, me, me, me, me, me. (13:15) Czech Mark responded. (13:16) How do you expect mail-in ballots to be reliable? (13:21) Reliable when you text someone the wrong fucking name. (13:25) Thank you. (13:26) Question mark, exclamation, question mark, exclamation, (13:28) question mark, exclamation, exclamation. (13:30) I’ll do three. (13:33) Thoughts? (13:35) Those are both valid, right? (13:36) Yeah. (13:37) Okay, thanks. (13:38) Hey, next, next. (13:39) Can I start calling you Sabian? (13:41) Please do. (13:42) And I, and Megzy doesn’t know. (13:44) Have you ever heard what a Sybian is? (13:45) Do you know what it is? (13:46) No. (13:47) It’s a toy for women. (13:48) Shut up. (13:49) It’s basically a saddle that you, that, with that, that sticks up and goes. (13:55) It’s like a large, really large, uh. (13:57) A vibrator? (13:58) A very large electric shaver on top of. (14:01) So it’s a saddle with a vibrator? (14:02) Yep. (14:02) And you basically climb it and it. (14:04) Watch those videos. (14:05) No. (14:06) Yeah, you’ll be, you’ll be like, wow. (14:07) Do you want me to tell you that story? (14:09) Yeah, please. (14:10) So, um. (14:11) We’ll wait. (14:11) No, we’ll do story time later. (14:14) We are really on a tangent, which I love. (14:15) I need to, um, replace some backpacking equipment that I gave away to a friend. (14:20) And, uh, the. (14:24) Portable stove that I have. (14:26) I’m going to wait till you put your drink down before I tell you what it’s called. (14:29) Cause you’re going to spit everywhere when I tell you what it’s called.
(14:33) It’s called the PocketRocket. (14:38) Yep. (14:39) I would have spit out my soda if you had told me to. (14:41) And I’ve, I had one many years ago and I really like it, it’s very small. (14:44) And the little, um, little silver blades fold out so you can put a pot on top of it. (14:49) And the PocketRocket, um, spins right onto a gas canister. (14:54) Yeah. (14:54) So it’s very small and slender and it fits in your backpack. (14:57) So you can save weight for backpacking. (14:59) Love it. (15:00) I was not aware that there’s other things also known as a pocket rocket. (15:07) And so on Amazon, I was like, pocket rocket. (15:10) The first three things were the. (15:13) Czech Mark is going to lose it. (15:15) The first three things were the backpacking thing. (15:17) We need to get back to video. (15:18) The other 92 things were not backpacking stuff. (15:22) I’m going to guess. (15:23) They were all pink and they were all lady pleasurable thingies. (15:27) Wait, I think they were just large or small back massagers. (15:31) No, they all, Amazon, they all said. (15:34) Toys, pleasure toys. (15:35) Yeah. (15:35) And I was like, holy crap. (15:38) I had no idea. (15:40) I can’t believe you didn’t. (15:41) Well, I guess I wouldn’t expect you to know what a pocket rocket was. (15:44) Well, because I knew pocket rocket as a device that I use for backpacking. (15:49) Right. (15:49) Because I had one before. (15:50) Right. (15:51) And I was looking for another one. (15:52) I didn’t know it was a, you know, a pink pleasurable thing. (15:57) Which one do you think came out first? (16:00) To get the name. (16:00) Probably the backpacking one. (16:02) Those sons of bitches, man. (16:04) Not them, the people who stole pocket rocket and repurposed it for women’s pleasure. (16:12) Sure. (16:12) I mean, I don’t have an issue with women’s pleasure. (16:16) No, that’s not. (16:17) Of course not. (16:17) I’m not. (16:17) We love that. (16:18) The product should exist. (16:20) That’s not what I think. (16:20) It just needs to be rebranded because you can’t have a Coleman pocket rocket. (16:24) I love that it’s Coleman. (16:25) It’s probably not. (16:26) No, MSR is the brand. (16:28) Okay. (16:29) You could have the pink rocket instead of the pocket rocket. (16:33) I’d like that. (16:35) You’ve got a crotch rocket. (16:36) You have a crotch rocket and a pocket rocket. (16:38) No, I don’t have a crotch rocket. (16:39) I have a pasta rocket because it’s Italian. (16:43) Oh, it’s Italian. (16:43) Should we move along? (16:44) Sure. (16:45) What are we talking about today? (16:46) Oh, well, I was going to ask you that. (16:48) Okay. (16:49) So as we were talking about, I think, before I realized that we didn’t hit record the first (16:52) time. (16:53) Yeah. (16:53) I was talking about The Century of the Self. (16:56) Yes. (16:56) Oh yeah, we were. (16:57) So we covered part one a couple of weeks ago and we’re going to do part two, three and four (17:02) later, probably a month apart because we don’t, we’re not, that’s not, we’re not on. (17:06) We just want to open people’s eyes to stuff. (17:08) We don’t want to talk about the same boring shit all the time. (17:10) It’s kind of. (17:10) We want to talk about newborn. (17:11) Yeah. (17:11) Newborn shit. (17:12) That’s our specialty. (17:13) Newborn shit is my favorite. (17:15) Yeah. (17:16) Like, especially like two hours, 38 minutes in, right, right when it hits. (17:20) And then we just, you and I just bash each other with sticks.
(17:24) With coffee cups. (17:26) Do not break my chick, my cook, my mugs. (17:29) I want my bird. (17:31) Okay. (17:33) So we’re talking about The Century of the Self, which, well, you’ve heard the first part of this. (17:37) And it’s basically how we became consumers in the twen… (17:42) We human beings. (17:43) Yeah. (17:43) In the 20th century. (17:44) And we’re going to talk about that. (17:45) We’ll get to that because this will obviously be out after that. (17:49) But social media is a big part of this thing that humans have become now. (17:56) So something about this came up, it was popular on Netflix. (17:59) And what is it, sir? (18:00) What are we talking about today? (18:01) The Social Dilemma. (18:02) And what does that entail as a whole? (18:05) Like what’s general? (18:06) How would you explain, like if you were to synopsize it, just the, what it’s about. (18:13) How social media, Twitter and Snapchat and Facebook and Instagram, shape the human psyche. (18:25) Yeah. (18:26) So that we become addicted. (18:31) Humans become addicted. (18:33) To the, to those tools. (18:35) To the apps. (18:36) Those things that were tools at one point are no longer tools. (18:39) They’re like drugs. (18:40) Correct. (18:41) Yeah. (18:41) Okay. (18:42) That’s exactly. (18:43) Is that an accurate synopsis? (18:45) Very much. (18:45) Okay. (18:46) Mine was more like, you and I, you and I see things much differently than others. (18:55) Like, for the most part. (18:56) Yes. (18:56) We’ve mentioned this multiple times. (18:57) Yes. (18:58) We are very human. (18:59) There’s no doubt that we’re human. (19:00) We have human tendencies, but we have this different side of us that looks 82 steps past (19:08) the step we’re at to see potential pitfalls down the road. (19:12) Yes. (19:12) You’re in telecom. (19:14) I’m in other stuff. (19:17) IT, right? (19:18) Sure. (19:18) They tell you one thing and you’re like, immediate. (19:21) Your hand goes up and you go, in 38 minutes, this switch is going to turn off or whatever. (19:26) Yeah. (19:27) Right. (19:27) You know exactly, because you take that idea and you work it out in your head somehow, or (19:34) on paper, however, you see what’s going to happen because of what you’re looking at. (19:39) Yes. (19:40) Everyone else just fucking doesn’t. (19:44) And I, I know that’s an everything, (19:45) no-one kind of statement. (19:47) I don’t like absolutes, but for the most part, people in organizations of all companies don’t (19:52) look, don’t want to look far ahead. (19:55) Remember I told you about the guy with the 140-plus IQ. (19:58) He’s got an IQ in the mid-140s, and I shared with him how DDT and all these pesticides (20:03) have worked their way to the poles of the earth and how, and he looks at me and goes, (20:08) don’t tell me that stuff. (20:10) I don’t want to know. (20:14) And that hurts, man. (20:15) We’ve got a guy who can solve problems with the intelligence level that he has, who’s (20:21) unwilling, not unable, actually says, I’m not interested. (20:28) Right. (20:29) That’s a smart guy. (20:31) He, the average IQ, I think, in the United States is like, I know IQ is not the end-all (20:35) be-all, but it’s like 97 or 98. (20:38) I think it’s like the average IQ in the United States. (20:40) He is 50 points higher. (20:42) He’s 50% higher than the average. (20:47) And half the people are below that. (20:51) And he doesn’t even want to look at the problem. (20:55) So my hands caught, man, we don’t, we want to be better. (21:01) Not all of us.
(21:03) Well, that’s the problem. (21:04) I think we all do. (21:05) And I think we’re, we’re afraid that, uh, but being better is work. It takes… (21:13) Well, absolutely. (21:14) And it takes away from your family. (21:16) Cause you have to take time away from your family to work on being better. (21:20) You take time away from your work to work on being better. (21:23) Or some, some portion of your time has to be readjusted for this (21:27) improvement portion, right? (21:29) But no one wants to do that. (21:31) Cause they’re so locked into their schedule, right? (21:34) I got work, nine-to-five blinders, kids to this. (21:37) Right. (21:37) Distractions, right? (21:38) Soccer practice, work, you name it. (21:40) Well, it’s easier. (21:41) Yes. (21:42) It’s so much easier, easier to be distracted. (21:44) Just to go through your life with 2.5 kids, white picket fence, and not see the things around you. (21:51) Yeah. (21:51) And that’s where everyone else seems to be in our world. (21:54) When you and I, that’s why we started this, right? (21:56) For the most part. (21:57) You’re correct. (21:58) Not everyone. (21:58) We know there are people like us. (22:00) Cause we actually have people who. (22:02) First of all, thank you to everyone who’s listening, but people who listen to us are (22:06) like, yeah, you bring up something different. (22:09) It’s not better or smarter or whatever. (22:11) It’s different. (22:12) It’s just something that no one’s thought about. (22:13) Right? (22:15) So this one is, this is a big problem though. (22:17) Right? (22:17) You and I, it, if you and I put out the model for how we’re going to make social media, (22:23) I think you and I would have seen this problem coming. (22:25) You think? (22:27) So we’re going to, let’s get, let’s get into it because it’s kind of interesting. (22:31) So basically social media, it starts out with a quote. (22:35) On the, on the screen. (22:36) Yeah. (22:36) And it’s on Netflix. (22:37) We’re not going to, we’re not going to spoil or anything. (22:39) We’re not going to tell you something you don’t know, but the quote reads, nothing (22:43) vast enters the life of mortals without a curse. (22:47) And it’s, it was Sophocles, or at least that was, it was credited to Sophocles. (22:51) And the second I read that, I was like, oh fuck, this is not going to go well. (22:56) What did you think when you read that? (22:57) Cause that’s the first thing is just, well, I think it’s the fact that this, that, that (23:01) was written thousands of years ago in an, in an ancient language, but yet it’s true today. (23:08) That’s, that’s, that’s disturbing. (23:11) That tells me that humans, we’ve come so far and we haven’t gone anywhere. (23:16) Knowledge, wisdom. (23:17) We’re just, we’re in the same little tiny, we’re all walking in a little circle. (23:21) We just have air conditioning and jets and Wi-Fi. (23:24) Yeah. (23:25) It’s, we’re dumb. (23:28) Where we have knowledge and not wisdom. (23:31) Yeah. (23:31) We are, we are smarter, quote unquote, with the smartphone in our hand, with the (23:36) technologies that have been provided to us, with running water without lead in the pipes. (23:40) Great. (23:41) Yeah. (23:41) Awesome. (23:42) But that other side of it, right? (23:44) The humanity hasn’t caught up. (23:46) So that was, that was the first thing that stuck out to me. (23:51) And it was crazy too, because really early on, I think that, you know, they, they were (23:57) introducing people, right?
(23:58) They had people from Facebook, Instagram, Google, YouTube, Mozilla Labs, Firefox, Twitter, (24:05) the president of Pinterest, previously director of monetization at Facebook for five years, (24:09) Twitter’s head of consumer product, the co-inventor of Google Drive, Gmail Chat, Facebook Pages, (24:15) and the Facebook Like button. (24:17) The Like button. (24:17) That’s crazy. (24:18) They had the guy on. (24:19) The guy from the Like button came on. (24:21) That tells you how big of a problem this is. (24:23) Like, he is single-handedly credited with Like. (24:29) Yeah. (24:30) And he’s on this thing. (24:31) So it tells you how important this really is, in my opinion. (24:35) And I agree, or I wouldn’t be sitting here. (24:37) Oh yeah. (24:38) So, uh, and then there, there’s this one guy, when working at a place, he fundamentally thought (24:43) that they were a force for good, but he doesn’t know if he feels that way anymore. (24:48) And that’s scary when you’re like, I fundamentally thought we were a force for good, and not that he (24:52) thinks they’re bad, but questioning that, chipping away at that, just right away to me is telling. (24:58) I agree. (24:59) And then came the question, man, it was, I think they showed three people’s reactions. (25:04) They said, well, what’s the problem? (25:06) It was very early, right? (25:07) In the documentary. (25:09) What’s the problem? (25:10) And immediately one word came to my head, and we’re going to talk about it at the end of the day. (25:14) I want to talk about it at the end of the podcast because I want to get your, like, fresh take (25:18) on what it is. (25:18) I haven’t, we haven’t talked about this since. (25:20) Correct. (25:22) But one person smiled. (25:24) The other person, like, put his head in his hands or scratched his head. (25:27) Like, what the fuck? (25:28) And the last person chuckled. (25:31) And you sit there, and I’m like, I have the answer. (25:34) Cause you and I, we see it from a different angle, right? (25:39) So then they go through. (25:41) So let’s talk about this thing. (25:43) What did you learn from this, from this documentary? (25:48) Nothing. (25:49) Okay. (25:49) Nothing. (25:50) You didn’t know. (25:50) Correct. (25:51) Okay. (25:51) So tell me some things that, you know, that might be an epiphany to others. (25:57) Cause they’re not always looking in the direction you’re looking. (26:01) Um, I already knew that, I, I’ve already seen (26:07) news segments in the previous years that social media is a drug, that it’s a dopamine hit. (26:13) When you get a like, or you get a heart on Instagram, or you get a, whatever the other (26:19) ones are, it’s a reaction that your brain, it’s like, it’s like chocolate. (26:26) It’s the same exact reaction in your brain that happens when you have a piece of chocolate, (26:33) cocaine, chocolate, sugar, all those things. (26:35) So that’s disturbing, that chocolate and the Like button have the same reaction in the human brain. (26:42) That, that, that’s, that, that bothers me, that that’s what’s happening. (26:48) Is it crazy that humans are such social creatures? (26:54) I mean, cause we are. (26:57) We’re so social that a necessity, like a food item, gets the same. (27:05) Hit, to your point. (27:06) You mean like if someone takes a picture of a sandwich and posts it, you mean like that? (27:10) No, as an approval, like the, an actual piece of chocolate gives you the same reaction. (27:14) I’m sorry. (27:15) Yeah, yeah, yeah, yeah. (27:15) That approval of someone, someone else’s approval gives you.
(27:21) That’s terrible. (27:23) It’s kind of scary. (27:24) Cause the first thing that I think about is, well, what happens if they don’t approve? (27:31) Like what’s the, and I haven’t seen, like, we’ve only seen those dopamine hits, right? (27:35) Yeah. (27:35) I’ve never, I don’t know. (27:37) I mean, obviously we’ve seen suicides go up in accordance with these types of things, right? (27:41) And we’ll talk about that. (27:42) Yeah. (27:42) Emotional issues. (27:44) Issues once social media started, but that’s scary to me that. (27:50) Very. (27:51) If, if approval gives you a crack-like response, what does disapproval give you? (27:58) How detrimental to development is disapproval? (28:03) And then I look back at our childhood, and we’re not going to go back and be like, (28:07) but we had different childhoods, man. (28:10) You and I had different. (28:10) Very much so. (28:11) Yeah. (28:11) Than each other, but we also, but they were different than others in a lot of ways. (28:14) In the, you know, growing up in the seventies and eighties, there were no smartphones. (28:19) There was no, it was completely different. (28:21) But my point was, my point is, how wrong were you all the time? (28:25) I don’t know if your dad ever said you were wrong. (28:27) Oh yeah. (28:27) All the time. (28:28) That was, that was my life. (28:29) And I’m sorry if my mom’s listening. (28:31) I love, my parents are amazing people. (28:34) My parents are amazing people, but it was really hard in the culture that I lived, because it, it (28:41) didn’t have to be the right way or the wrong way. (28:43) It was their way. (28:44) Yeah. (28:44) So even when I knew it was right, like, Mark, do this math problem. (28:49) One plus one equals. (28:50) And I put down two, cause I know two is right, but it wasn’t there. (28:53) Right? (28:53) No, it’s 2.1, you know, or whatever. (28:55) Okay. (28:55) Yeah. (28:56) You know, everything felt, so I look back on my life and all the disapprovals, or the feelings (29:01) of that, and like how that impacted me, and how you and I are in our mid-forties and just (29:06) started doing this when we could have done this 10 years ago or whenever. (29:10) Like, right, right. (29:11) I feel like I allowed myself to be marginalized by that cultural upbringing. (29:20) So I can imagine, imagine you put up a picture, like they, there’s a girl that puts her picture (29:24) up, right? (29:24) On the show. (29:25) Yeah. (29:25) And I say girl, cause she’s under 18. (29:27) I just don’t know. (29:28) Very young lady. (29:29) Yeah. (29:30) Whatever. (29:30) She’s like taking all different pictures. (29:32) She’s deleting them, trying to get the perfect picture. (29:34) She gets like, oh my God, you look gorgeous. (29:36) You look great. (29:36) And then it’s like, hey, can you do something about those ears? (29:39) And it just took away all that positive. (29:43) And then there were tears. (29:44) And then there were tears. (29:45) And then, you know, and then they went into the suicide rate, for example, with the women, (29:48) young women, and how it’s, it’s up 170 or 152 percent, 65%, some ridiculous number. (29:56) The thing’s only an hour and a half, worth watching. (29:58) And I’m, we’re probably going to jump around a little bit. (30:00) But what’s crazy to me is they talk about three portions. (30:05) You have engagement, right? (30:06) Getting you on the thing. (30:07) Growth and then advertising.
(30:11) And to tell them that they are the product, that the, the, the thing that we think is (30:17) the product is actually the advertising product. (30:20) It’s like backwards. (30:22) Like we used to, right, used to advertise a product to sell. (30:27) Well, now you’re, you’re advertising to get those people to go to you. (30:32) It’s a, it’s different. (30:34) Very much so. (30:36) And there’s no other, is there another business model like that? (30:41) I think they’re starting to all kind of go that way. (30:44) I don’t know. (30:44) It’s a good question. (30:45) Well, is social media the first business model of this type? (30:49) In an advertisement way? (30:51) Yeah, because there’s been nothing before, right? (30:54) It’s not like someone, their example was like, you and I are on the phone together back in (30:58) the 80s. (30:59) Yes, right. (30:59) We hang up. (31:00) No one follows up with an advertisement call, or no one’s behind there trying to keep us (31:04) on the phone together. (31:05) Yeah. (31:06) You know, there’s, there’s stuff at play behind. (31:08) And what’s crazy is, we talked about, we did the AI thing, the algorithms, right? (31:12) They even said in this show that they don’t know what they do. (31:18) Did you, do you remember? (31:19) I do. (31:20) One time it’ll put you in that feed and the other time it’ll put you in that feed. (31:24) In feed A or feed B. (31:26) And they don’t know which one it’s going to put you in. (31:29) We talked about it getting out of the box. (31:31) It sounds like it’s already out of the box. (31:33) And we haven’t even looked at it, its impact. (31:36) If it starts to have its own ideas when it rewrites itself. (31:41) I mean, it’s rewriting itself now. (31:42) These are, this machine learning that they talk about. (31:44) Yeah. (31:45) Is basically the same thing, that AlphaZero thing that I told you about. (31:48) Yes, I recall. (31:49) It learned against itself. (31:50) Yeah. (31:50) Taught itself. (31:51) And every time, the first time, yeah. (31:54) Okay. (31:54) The first time it’s going to get 10 flat earthers that believe that, right? (31:58) But then the algorithm gets better. (32:00) So yeah, you don’t, it sends flat earth to everybody, right? (32:04) And only three people click on it, right? (32:07) It gets better, it learns. (32:08) The next time it throws out some crazy conspiracy, it’ll get 10 people to do it. (32:12) Because it’ll find a way to entice you to click it. (32:17) And once you’re in, you’re kind of in. (32:20) A lot of these people, you’ve seen people’s minds change. (32:22) And it’s like, it’s like its own religion in a weird way. (32:26) You know, these, some of these philosophies. (32:31) But so we’re watching this thing. (32:33) What are your thoughts as you’re watching this? (32:34) I mean, to your point that you had just now, several people, (32:40) several experts mentioned that AI is already, it’s not in the future, it’s now. (32:48) It’s already happening. (32:50) It’s already underway. (32:51) It’s already being run by Google and YouTube and Facebook and every social media company. (32:58) Because they are, it’s intuitive. (33:01) No one used that word. (33:02) That’s my word. (33:03) It’s intuitive to say, oh, oh, you like Dr. Pepper. (33:09) You like this video about Dr. Pepper. (33:13) Oh, we’re going to put the next video up that is kind of about that. (33:17) So we’re going to press autoplay for you. (33:20) And that’s going to suck up your screen time. (33:22) And then YouTube does it. (33:24) Facebook does it.
(33:25) Et cetera, et cetera, et cetera. (33:26) So it’s, AI is already, it’s already doing it. (33:30) Everybody mentioned how AI is a massive part of this. (33:34) Right. (33:34) And that’s what’s funny about it. (33:36) Because they talk about starting it and they’re like, the intention is not evil. (33:39) There’s no intention. (33:40) Right, right. (33:42) But then as people engage, it learns from the engagement, because its job isn’t to make things (33:48) good. (33:49) It’s not to make the world better. (33:51) It’s to maximize screen time. (33:54) Yes. (33:54) Attention. (33:55) And in doing so, maximize dollars. (33:58) Ad revenue. (33:59) Yes. (34:00) What I found interesting, there was, was it the Facebook guy? (34:04) No, it’s a Pinterest guy. (34:06) He talked about how much people admired Google because they had this ultra-altruistic, (34:14) beautiful side that was real charitable and whatever. (34:16) And then in parallel ran this for-profit. (34:20) And it was crazy that they’re like, we admire that they could do both. (34:24) I don’t remember that part. (34:25) It was a little bit earlier. (34:27) It was the guy with the gray t-shirt. (34:28) Yeah, I remember the guy with the gray t-shirt. (34:29) Yeah. (34:29) So, but it was interesting. (34:31) I watched it again today. (34:33) I wanted to watch it a couple of times, but your brow is furrowed, sir. (34:37) I’m trying to look up the Facebook stock price. (34:39) Oh, that’s good. (34:41) Because. (34:41) Has it split ever? (34:42) I don’t know. (34:43) I should have looked this up beforehand. (34:44) So I apologize. (34:45) No, it’s okay. (34:45) So this guy talks about how much Google was admired because they could monetize this thing. (34:52) And this guy goes, the advertising model is the best way to do this. (34:56) We need to monetize because we need to make money. (34:59) But that was never branded to us. (35:01) Was that the branding to us when we started, when we started giving our data away? (35:05) And I’m not one of those told-you-so people. (35:08) But how long ago have I ever told you? (35:11) You might remember because we’ve known each other almost 30 years. (35:16) Data, information, is a new currency. (35:19) Yeah. (35:19) I’ve said that at least 10 years ago. (35:21) No, it’s 27. (35:24) But like, when this started coming out, I’m like, you’re giving away your number. (35:28) You’re giving them your, you’re, you’re giving them this, giving. (35:33) Cause wait, but they’re not asking for anything back. (35:35) There’s no free lunch, people. (35:37) There’s no free lunch. (35:39) I don’t know what they’re getting yet, but it’s definitely, they’re getting something. (35:42) Of course. (35:43) And it became the currency. (35:44) And now everybody’s talking about that. (35:45) Like, it’s just the phrase. (35:48) But people are talking about it because it’s also bad. (35:50) Yeah. (35:50) Well, now it is, right. (35:51) It is evil. (35:52) But now they’re, now they’re finally admitting that the information is the, is the monetary (35:57) source, but it’s too late. (35:58) I know, that’s the problem, you and I foresee these things. (36:03) Not always. (36:04) And I’m not in this case, right? (36:06) I didn’t know what social media was going to become, but we also were never given that (36:10) problem to begin with. (36:11) But I also look at it as, it’s not just social media. (36:13) It’s Amazon and all other internet, walmart.com. (36:20) It’s become an internet mall. (36:22) The woman said it’s like a mall now. (36:25) Yes.
(36:25) Correct. (36:26) So it’s not just social media. (36:27) It’s all internet, all companies that use the internet are utilizing data and information (36:34) about all their customers to maximize everything they possibly can to make more money. (36:40) I mean, target.com or whatever.com, so that people can spend more time on their website (36:45) to possibly spend more money on the website. (36:48) Right. (36:48) But this is the difference. (36:51) Not everybody’s going on walmart.com. (36:53) They’re all going on Facebook and Google and giving their preferences to them. (36:59) Walmart then has to go to Google and Facebook to get the data because they don’t have enough. (37:04) Yes. (37:04) To be really predictive, you have to have a lot of information. (37:08) And Walmart has a lot of people going on the website. (37:10) Right, but not as much as Google. (37:12) But not Google and Facebook. (37:13) Right, right, right, right. (37:14) To your point, absolutely right. (37:15) These businesses, to your point, are taking what they’re learning from the way we use (37:20) Facebook and Google and YouTube. (37:22) Correct. (37:23) Against us. (37:24) I hate to say against us, but it’s manipulation to maximize dollars. (37:29) Yeah. (37:30) It’s the bottom line. (37:31) It’s the almighty dollar. (37:33) Right. (37:33) And to maximize shareholder value. (37:36) That’s the entire point. (37:38) What’s frustrating to me is that they call that capitalism. (37:41) And that is bullshit. (37:42) True capitalism is not this model. (37:46) This is a skewed version. (37:48) Just like pure communism could work, like in a Marxist commune with 20, 30 people. (37:52) You know what I mean? (37:53) Like, yes, we’re all like a little commune, right? (37:55) That could work in a small group. (37:57) Our capitalist system is not pure capitalism. (38:00) We were supposed to reinvest into our products and reinvest and reinvest into the community (38:06) and only take a portion out to live. (38:08) But obviously CEOs are getting paid these crazier and crazier bonuses and salaries. (38:14) That gap is getting bigger. (38:15) We’re not capitalists the way we’re supposed to be capitalists. (38:18) That’s the problem. (38:19) The system isn’t the problem. (38:21) Our use of it, in my opinion, is very screwed, right? (38:25) It’s skewed right now. (38:29) I’m thinking I agree and disagree. (38:36) I think the system’s fucked. (38:38) I think it’s just like anything else. (38:40) The current system is fucked. (38:41) I think capitalism is a beautiful, amazing idea that has gone sideways. (38:48) The same thing with organized religion. (38:50) It’s a beautiful, amazing, pure idea to love one another. (38:56) Every religion is amazing, and then humans fuck it up. (39:00) And that’s true. (39:01) Like what you said. (39:02) You said Marxism? (39:04) It’s a great idea when there’s 30 people. (39:07) When there’s 30 million, we have a problem. (39:09) And what’s funny is, it could be 31. (39:12) All you need is one of them not playing the game, and that whole system… (39:16) Yes, that’s why using words like communism, capitalism is very incorrect, in the (39:23) fact that they’re not written the way Adam Smith and Karl Marx wrote them. (39:26) They’re not practiced the way they’re written. (39:29) And they actually work in a vacuum. (39:30) They don’t necessarily work in real life. (39:32) It’s not a practical guide. (39:34) Well, so let’s back up. (39:37) Can we… (39:38) Oh, we need a beep beep backer-upper.
(39:40) Not a beep beep, but a beep beep. (39:43) Is that a forklift or is that a truck? (39:45) I’ll find one. (39:46) Whatever it takes. (39:50) So can anyone blame these social media companies? (39:57) I mean… (39:58) How can you? (39:59) Exactly. (40:00) Because all they’re trying to do is make money for their employees. (40:05) Make money for their advertisers, which are their. (40:08) Technically, the advertiser is their customer, correct? (40:10) The advertisers are their customers, and the people that own the shares of stock in those (40:15) companies. (40:15) So whatever tactics they use to make those people money. (40:22) That’s their business. (40:24) You don’t have to use their app. (40:26) You don’t have to buy their stock. (40:29) Yes. (40:30) But there is ethical. (40:33) But it’s beyond that. (40:35) How many people can start a Google? (40:37) How many people can start a Facebook? (40:39) There’s, there’s a barrier. (40:41) Have you heard the term, like, natural monopoly or… (40:43) Yes, of course. (40:44) Utility. (40:44) We talked about that before. (40:45) We talked about, right? (40:46) The Internet as a whole is really scary because it’s not, it’s not considered a utility right (40:51) now. (40:52) But somebody started one a couple of, a year ago. (40:56) I had never heard of TikTok. (40:57) Somebody started TikTok. (40:58) Oh, that’s okay. (40:59) Okay, that’s China. (41:00) That’s a bad China. (41:00) So three, four years before that, I had never heard of Snapchat. (41:03) I agree. (41:04) All of a sudden, Snapchat’s super popular. (41:05) And they have VCs and they have people funding it. (41:07) And to that point, I watched that. (41:09) I didn’t, I didn’t catch it the first time I watched it. (41:11) Snapchat is the largest communication tool for kids, teenagers, among teenagers. (41:15) So it is, and it’s, they communicate differently. (41:18) So some guy goes, hey, we need a social media app that caters to kids. (41:24) And we’re going to call it Snapchat. (41:27) Oh, okay. (41:27) Right. (41:28) But this is so different. (41:29) I see it. (41:30) Take it this way. (41:32) Anyone can drill in their backyard for oil. (41:35) No, we’ve had that conversation. (41:37) So that’s kind of, I’m just making this a specific thing, because not everybody has venture (41:43) capitalists to make a social media thing. (41:45) It has kind of become a natural monopoly. (41:48) It’s, it’s become a utility much more than it’s become a business. (41:53) I agree with that. (41:54) Just some people may disagree. (41:56) Well, of course, the businesses themselves would absolutely disagree because they’re (41:58) OK, but what percentage of our listeners are going to go, (42:01) Mark, I don’t agree with that? (42:03) That’s a great question. (42:04) And hello to the world. (42:05) Can someone tell me who does not think that the Internet, Google, and when I say Google, (42:12) YouTube, all that, because YouTube’s owned by Google, and Facebook aren’t some kind of (42:17) natural monopoly where you can’t get into that business. (42:20) You can’t really get into the Facebook business if you want to start another, right, social (42:25) media app, and you get pushed out. (42:27) I mean, they would just pull. (42:29) They would probably buy you, right, or do that or whatever they can. (42:32) But, you know, sometimes you can be stubborn and say, I refuse to sell. (42:34) Yeah, then they’ll find, there’s another way. (42:37) Of course, there’s another, because they have the clout, right?
(42:39) Because they can then possibly manipulate the Internet to steer traffic away from those (42:44) sites. (42:45) Shadow, you hear shadow banning and all these other things. (42:48) So it’s scary in that way. (42:50) So it’s just, that’s my opinion. (42:53) Now, I’m also a cap. (42:54) I am a capitalist at heart. (42:56) I believe in for-profit, but they came to us like they were saviors. (43:03) They were connecting people and finding an organ donor, right, for someone who needed it, or finding (43:09) long-lost relatives. (43:10) Wasn’t that, like, that was how they hooked us? (43:14) Right, and then they switched it without us even knowing. (43:19) And now those algorithms are running in the background and they’re directing us into places (43:25) we don’t want to be. (43:26) But we think we do because it thinks that we do. (43:29) So it goes, hey, let’s worry about the color scheme of Gmail. (43:34) Let’s not worry about how addictive it is. (43:36) Let’s make it more addictive by adding schemes and colors and pictures and puppies. (43:41) Puppies. (43:42) Yeah, and unicorns, bro. (43:43) Unicorns and puppies, my favorites, rainbows, unicorns, puppies and sprinkles. (43:46) Yeah, jimmies, jimmies. (43:49) So what are your thoughts about that? (43:53) I, the natural monopoly thing, I don’t, I struggle with, because you have a good point, (43:58) but I’m tired of talking about it. (44:02) I am still also on the fence, and I think I keep bringing it up because I’m trying. (44:05) I’m trying to find my own answer. (44:07) You should just talk yourself out of it and shut the hell up. (44:10) Oh, what’s the second? (44:11) What was your second point after natural monopoly? (44:15) Words, words, words, words. (44:18) Oh, about that’s how they hooked us, with, with the natural monopoly. (44:21) Yes, they didn’t hook us with, (44:23) hey, give us your information. (44:24) We’ll sell it to third parties and use it against you, like that. (44:28) It wasn’t a business. (44:30) Facebook never came as a business. (44:32) Never came to you as a business, whereas, like, you knew Walmart was a business. (44:37) Yes, I see what you’re saying. (44:40) Right. (44:40) YouTube was a video, a place to check out video. (44:43) It wasn’t a business, right? (44:45) I mean, don’t get me wrong. (44:46) We got snookered. (44:47) We believed it. (44:50) But once again, you and I, we need WD-50 on that stuff. (44:55) At least 48, 48, 49, whatever it takes. (44:59) Yeah. (45:01) We’ve seen the pitfalls of social media, or we saw them prior, and you and I have not really (45:07) engaged in social media until we started this six months ago. (45:10) That’s correct. (45:11) Six months ago, I had a Facebook account. (45:15) I think the last time I posted was like 2018 or ’17 or something. (45:19) Thanking people for wishing me happy birthday. (45:21) Yeah, right. (45:23) My dogs had an Instagram. (45:24) That was it. (45:25) Yeah, but that’s worth it. (45:26) And then you talked me into getting on Twitter. (45:30) Yeah. (45:31) You asshole. (45:32) Well, I’m sorry, man. (45:33) I don’t think you are. (45:35) Well. (45:36) Well. (45:36) The reason I don’t feel as bad. (45:39) I’ll tell you what. (45:40) The reason I don’t feel as bad is because I do way more on it that’s way more toxic than (45:43) what you do. (45:44) It’s very true. (45:46) And I’m kind of taking one for the team, bro. (45:47) You are. (45:48) I give you all the credit. (45:49) I do not engage in toxic stuff, but it’s just, it’s a grind to monitor that as a human being.
(45:57) And then you’re told you’re a bot. (46:00) And then you’re like, I ain’t no bot. (46:02) It’s like, reminds me of Randy Moss when he’s like, I ain’t no girl. (46:05) I ain’t no bot. (46:07) And they’re like, yes, you are. (46:09) I just texted back. (46:10) I just messaged you back that I’m not, bro. (46:12) Would bots write back? (46:14) Would bots use the word ain’t? Ain’t, ain’t no bot, bro. (46:17) Would they say bro? (46:18) They wouldn’t. (46:20) Bots don’t know bro. (46:21) They’d say bot. (46:23) I ain’t no bro bot. (46:27) Bro. (46:27) That’s a different shirt we need. (46:29) Hashtag I ain’t no bro bot. (46:30) Bro bot. (46:31) It’s like GoBots. (46:33) Oh, yes. (46:34) It’s like Transformers lite. (46:35) Yeah, I remember that crap. (46:37) Good, good GoBots. (46:39) Their toys were so inferior too. (46:41) Very much so. (46:41) They had like two moving parts. (46:42) Yes. (46:43) And Transformers had like 42 articulating joints. (46:47) Optimus Prime. (46:49) Get to the chopper. (46:50) I’m like, did he just do an Arnold? (46:53) He did. (46:53) He stole it. (46:54) Where’s Soundwave? (46:56) Optimus Prime. (46:58) Does that sound like? (46:59) I can’t. (46:59) That was pretty good. (47:00) I can’t hear it right. (47:02) Yes. (47:03) Czech Mark says, oh, we just got Transformer. (47:06) The G.I. Joe Transformer hour. (47:08) Every, every day is Czech Mark. (47:10) Yes. (47:11) In Czechoslovakia, in Czech Republic. (47:12) Yes. (47:13) It’s, we just got released this fall. (47:15) This fall 2020, we get G.I. Joe. (47:18) And the Transformer. (47:20) Yes. (47:20) Followed by Jem. (47:22) Yes. (47:22) And the Holograms. (47:23) Yes. (47:24) Oh, the Holograms and Synergy. (47:28) What about Land of the Lost? (47:31) Marshall, Will and Holly, on a routine expedition. (47:33) What cave you live under? (47:35) We got that back in the nineties. (47:36) Oh, I’m so sorry. (47:38) No, we got it a long time. (47:39) Long time ago. (47:40) So, Czech Mark, what is the motto of G.I. Joe? (47:45) Let me show you something. (47:47) Something. (47:48) No, the more you know, no. (47:50) Knowing is half the battle, (47:52) I believe. (47:53) And yo, Joe. (47:55) Go Joe. (47:56) What’s up, Shipwreck? (47:59) Okay. (48:00) That was a total tangent. (48:01) It was pretty good. (48:03) So we were talking about AI, AI. (48:06) The algorithms are just directing us to look at the screen longer, and they’re getting better at it. (48:12) Every second of every day. (48:14) And that’s, you and I, you and I knew that. (48:17) How easily we could get hooked on that, or how easily it could draw us in. (48:20) So what did we do? (48:21) We just didn’t engage, right? (48:23) We didn’t do anything with it. (48:24) Correct. (48:25) And I’ll be honest. (48:26) I haven’t missed. (48:27) I didn’t miss a damn thing when I wasn’t on it. (48:30) I agree. (48:31) But I’ve been on it for six months now. (48:33) Not even, we started July. (48:35) Yeah. (48:36) So really, it’ll be, by the time this is released, (48:38) it’s like exactly four months in. (48:40) Right. (48:40) Almost, like just over four months. (48:44) And now I feel like I’m missing everything (48:46) when I don’t have that thing in my hand. (48:48) Really? (48:51) It’s, the thing is, I, I like to, I like to work hard at a craft. (48:56) So I want our podcast to succeed. (48:59) I agree.
(48:59) And I believe that every chance that I can to tell someone about it or try to share it (49:06) with someone, I need to take that opportunity. (49:09) So any second I’m sitting and not doing something else with my hands, I’m either on the computer (49:14) now or on the phone asking people to give us a try, give us a listen, and yeah, it’s (49:20) work and it’s like grinding. (49:21) So I can, I can still bifurcate it and say, it’s kind of my work, because I’m not, I’m (49:27) not emotional about it. (49:29) Right. (49:29) Like I’m not emotional about, hey, give us a try. (49:33) And if someone says no, I go, okay, cool. (49:34) Next. (49:35) Like, not a big deal. (49:36) Right. (49:36) But if I got into the conversations of things that are going on, that’s bad. (49:42) The emotion, it just, because people equal shit, people. (49:47) Who was that again? (49:48) Was that not Slipknot? (49:49) Not beautiful. (49:50) Dear Corey, dear Corey, please come on our show. (49:53) Please. (49:54) We’d love to have you on. (49:55) Yes. (49:56) I’ll do a duet with you. (49:57) Okay. (49:58) I’ll say shit. (49:58) The people, I say backwards. (50:01) I say, hey, I for to looking for shit, or for to looking for peoples. (50:04) I need, I need peoples to spread on the ground to make things grow better. (50:08) Yes. (50:08) Oh, you mean shit. (50:09) Oh yes. (50:09) But people equal sheet. (50:11) Oh damn. (50:11) Same thing. (50:13) Uh, so they are just roping us into these engagements. (50:20) Right. (50:20) But even though I’m not looking for the engagement part, as I’m scrolling, looking for people (50:28) to whom I can write, or to, to whom I can, with whom I can engage, there are issues going (50:34) on. (50:34) Other poop. (50:35) So the stuff runs by me. (50:36) It’s good. (50:37) Obviously I see it. (50:39) And then it plays in my head. (50:41) Holy crap. (50:43) The president’s a fascist. (50:45) Holy crap. (50:46) Antifa is fascist. (50:48) And they’re the opposite ends of the, of the spectrum. (50:51) And they’re both fascists. (50:55) Um, my brain doesn’t work that way, bro. (51:00) So those things actually enter your brain. (51:02) Yeah. (51:03) Cause I read it. (51:04) I read, like, just skimming. (51:05) You see Trump fascist and then Antifa fascist or something. (51:09) Well, I’m just going through just looking for, I just don’t, I see those as well, but I just (51:14) don’t even process that crap. (51:16) I can’t help it. (51:17) Cause I think you and I are slightly different in our, not, not our political stances per (51:21) se, even though we are, but I think the way we care about politics. (51:24) Yeah. (51:24) I’m just, I’m so done with, with all of it. (51:27) I am also done with it, but I’m a little concerned. (51:31) I’m not going to lie. (51:32) I’m concerned. (51:32) Oh, absolutely. (51:34) I, so I have a hard time letting it go, but that’s on me. (51:37) I’m totally okay with that. (51:38) But you also don’t hear us talk politics. (51:40) Like, great, we’re not a politics show by any stretch, right. (51:44) For a reason. (51:45) We talk about philosophy, dumb-ass shit. (51:47) Exactly. (51:47) We talk about philosophies. (51:48) We talk about ideas and things like that. (51:50) And they can be, you can paint them however you want, but we don’t talk (51:55) candidates specifically, other than people who text us at midnight. (51:58) I knew you were going to say that. (52:00) Or, or orange-hair, orange-faced, orange-hair people who think he’s terrific. (52:05) Of whom Jess Garcia does a great.
(52:08) Uh, I didn't, I don't know if I added her yet. (52:10) I don't know if I added her yet. (52:12) I need to send you a reminder. (52:15) Keep talking, bro. (52:17) Keep talking, bro. (52:17) To send you a reminder, to see if, Jess Garcia is amazing. (52:22) Huge impersonations on the board. (52:24) That's a terrible impression. (52:27) Oompa loompa doopity doo. (52:29) We've got a funny story for you. (52:32) Sorry about that. (52:33) I don't have it. (52:34) Dick face. (52:34) I have it here, actually. (52:36) Since we're. (52:36) Yes. (52:37) Good. (52:37) Since we're already talking about a certain. (52:41) Hold on one second. (52:42) May I try this? (52:43) Yes, you may. (52:45) Here we go. (52:51) Did that pick that up? (52:53) Barely. (52:53) We're going to do this. (52:55) We're going to try this one more time. (52:56) Okay. (52:56) Hold on. (52:56) Hold on one second. (52:57) You want to flip the microphone? (52:57) Yeah, I'm going to try one more time. (52:58) Go, go, go, go, go. (53:01) Is the greatest, most factual documentary that's ever existed. (53:06) The greatest, most factual documentary that's ever existed. (53:09) Thank you, Ms. Garcia. (53:11) We're going to give her that. (53:13) Yes. (53:14) Yes. (53:15) Thanks, Jess. (53:16) So speaking of that, I did have a, I did have a note. (53:19) And we can squash this if you don't want to discuss it. (53:22) No, we'll talk about everything, bro. (53:23) This is tangent three, three point six seven niner. (53:28) So with my recent attempt to assist the podcast, I've been trying to (53:37) share the podcast on Facebook and gain more friends, as we've talked about, (53:42) but I have not announced that on the podcast before. (53:44) So I've seen a (53:48) lot of young ladies, probably under the age of 30, that are attractive, and their (53:56) profile picture all has a Trump circle on the outside, and they all have a Trump flag (54:04) or some kind of a hat, and there's a lot of them. (54:09) So I'm like, so when I started friending people, I'm looking for a very diverse audience. (54:13) I don't care about color. (54:15) I don't care about political. (54:16) I would love to have a Biden listener and a Trump listener. (54:19) You actually do want different people. (54:21) I do. (54:21) I really want. (54:22) Yes. (54:22) In a weird way. (54:23) We talk about diversity, like, oh, you shouldn't see color. (54:25) No, we, you actually do. By saying you're not looking for it, you're actually, (54:29) that's what you want, is diversity. (54:30) Yeah. (54:31) So when I'm looking for people that I want to have as friends on Facebook, I want diversity. (54:36) I want people from all over the world, all walks of life, all colors, because I think (54:42) I think everyone is going to like something that we have to say, something that we're (54:47) going to talk about. (54:48) So I'm trying to friend a wide variety of people to share our podcast. (54:52) Right. (54:52) But I've been running into a lot of these young women that are Trump supporters, and (54:56) that's fine. (54:57) So I friended probably five. (54:59) How many friends do they have? (55:01) I don't know. (55:01) I can look. (55:04) 50, 50 or so. (55:05) I'll let you finish your point, but there's, there's something to that too. (55:08) I believe there is too. (55:09) So, um, so one girl messaged me and said, hey honey, how are you doing? (55:14) And I'm like, okay, something's wrong with this picture. (55:17) I don't know you. (55:18) So I click on her profile.
(55:19) Her profile is two days old. (55:21) Like, oh, she has three pictures, right? (55:24) She has 50 friends. (55:25) They're all men. (55:26) Like. (55:27) Okay. (55:27) Then I click on another girl, some other girl from some other state, equally as cute, equally (55:35) as Trump, which is totally fine, by the way. (55:38) Okay. (55:38) Same thing. (55:39) Her profile is three days old. (55:41) You realize the other side is doing the same thing. (55:42) Oh no, it's just, I'm not seeing them. (55:44) No, I am. (55:45) I've seen some, but you understand what I'm saying? (55:47) I've seen some Bidens as well. (55:49) And I'm not saying that you're calling. (55:50) No, I'm not. (55:51) No, it's not. (55:52) I want to be clear. (55:52) It's not a Trump thing at all, because I've seen the other side of the coin. (55:55) This is being done, is what you're saying. (55:57) Yes. (55:57) You're seeing, you're seeing young people. (56:01) I saw a guy. (56:02) People who would be your demographic, people that you could, well, they can entice you. (56:07) Well, people that I think would like the show, right? (56:10) Not somebody that I want to date. (56:11) That's not what I'm talking about. (56:12) Not at all. (56:13) But I'm like, hey, I think this person might like the show, regardless of political affiliation, (56:18) regardless of whatever. (56:20) So I'm like, hey, do you want to be friends? (56:22) Yes or no. (56:22) They can say no. (56:23) That's cool. (56:24) And they say yes. (56:24) And then I look at their profile. (56:26) I'm like, how come your profile is two days old? (56:28) How come your profile is three days old? (56:30) Why are there only three pictures? (56:32) Okay. (56:32) That's super suspicious to me. Unfriend. (56:35) And I've done that five times. (56:36) So that's my question to you is, what, what the hell? (56:41) Okay. (56:42) So this is, this is what happens. (56:43) Tell me. (56:44) They're flooding. (56:45) They're, "they" is everyone. (56:47) I'm just going to say all parties, whatever, left, right, center. (56:50) It doesn't even matter. (56:52) The manipulators, whoever's trying to manipulate you, is creating all these different profiles. (56:59) Picking a side that they want to manipulate about you. (57:03) And they just throw them at you. (57:05) Assuming that you're just automatically going to click friend because someone wants to be (57:09) your friend. (57:12) So many people have no idea who they're friending. (57:16) Cause they don't, to your point. (57:18) They don't think like you and I, they don't have that. Agreed. (57:21) They go, I'm too busy to figure out who that is. (57:23) I'll just friend him. (57:23) I'll figure it out later. (57:24) If it works out, right? (57:26) Whereas I'll get one, a friend, and I go, they don't have any friends. (57:30) And then you look, they're two days old, right? (57:31) Or they have five friends. (57:34) How'd they get five? (57:34) They're all older dudes. (57:36) And it's obviously some young, attractive woman. (57:39) Who's barely, barely, (57:41) just barely legal. (57:42) Right. (57:43) And you're sitting there like, how did, how did that, how did five people add her? (57:46) And I'm not going to say that every old dude's a creepazoid. (57:49) Some are, but not everybody is. True. (57:52) So some actually probably just said, accept. (57:54) Yes. (57:55) I think some people are on auto accept, right?
(57:58) I think that's a thing, that is correct. (58:00) And I did the same thing you did when I was adding friends, (58:04) and I got up to the 4,800 friend mark. (58:08) Is that the max? (58:09) I think 5,000 is the max of friends. (58:12) You can have as many followers of a page or something, but as a friend thing, (58:16) direct friend connection, it's 5,000. (58:18) It's capped. (58:19) At least that's what my experience is. (58:20) Okay. (58:21) I've seen people, and someone's, someone's tried to add me. (58:25) Or asked me to be their friend, and they're capped at 5,000. (58:28) And I'm like, once again, now in hindsight, after watching this show, (58:32) somebody threw me into that, into their little echo chamber. (58:35) I'm like, why would you ask me to be your friend when I can't be? (58:39) It's virtually impossible, because you have a limit of 5,000. (58:42) Yeah. (58:42) You would think you would get some kind of an error, right? (58:44) You would, right. (58:45) You would think that Facebook would go, (58:48) sorry, Mark can't be your friend. (58:50) It can make this, it can make this algorithm do this, but not that? They don't want it. (58:54) Cause it'll still engage me, because someone wants to add me. (58:57) I'm just, these light bulbs are coming on as I'm watching this show. (59:01) Oh, it's not about that. (59:03) Like the one that Tristan talked about, Tristan Harris. (59:06) Yeah. (59:06) Who's on here. (59:07) And I sent him a message, and I, I would love, I mean, I would just talk (59:12) geek with him forever. (59:13) I was just, that would be a great thing. (59:16) I agree. (59:16) He just seems like such a nice guy. (59:18) Just a really great guy. (59:20) Anyway, Tristan talks about it. (59:21) And then he talks about, um, keeping you engaged. (59:26) It's funny. (59:26) Cause they'll say, oh, you'll like this picture, right? (59:31) Yeah. (59:31) Someone, or someone tagged you in a photo. (59:33) What does that make you do? (59:35) Click to the link. (59:36) Why doesn't it just show you the photo with you tagged in it? (59:40) It makes you actually push the button to get it. (59:43) It's training you. (59:45) Yeah. (59:45) It's training you, because, dog, it could have just sent you the email with, here's a picture. (59:49) You're photo-tagged in it. (59:50) Isn't that great? (59:51) Yeah. (59:51) No, it's, someone tagged you, click here to see who it is. (59:55) Right. (59:56) Isn't that how they do it? (59:57) Click here to find out. (59:59) Tap, tap the notification. (1:00:01) Would you like to know more? (1:00:01) Would you like to know more? (1:00:02) I mean, let's stomp bugs, bro. (1:00:05) You'd like to know more. (1:00:06) I mean, tell me that's not crazy, right? (1:00:08) I mean, we're just being cattle, like, herded in this direction. (1:00:12) And the bigger thing is that 99.9% of us don't think like you and I, and don't see it. (1:00:18) And they truly believe, I know it's not 99%, but if you asked whether (1:00:24) Facebook is, like, doing good or doing bad, I would say a majority would still probably say, (1:00:30) I like Facebook. (1:00:31) It works for me. (1:00:32) Yes. (1:00:32) I would agree with that. (1:00:34) Yes. (1:00:34) I'm not going to say, I don't know a number, but I would guess it's 51-49 at least. (1:00:38) I would say 60-40. (1:00:40) Probably more, probably even a little more, depending on who's using it and whatnot.
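The red flags described a moment ago — an account only days old, three photos, a small friend list skewed entirely one way — amount to a simple checklist. Here is a minimal sketch of how one might score them, where every threshold is an illustrative guess pulled from the conversation, not anything Facebook is known to use:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    age_days: int                  # how long the account has existed
    photo_count: int               # photos posted
    friend_count: int              # total friends
    one_sided_friend_ratio: float  # fraction of friends fitting one narrow demographic

def suspicion_score(p: Profile) -> int:
    """Count the red flags from the conversation; all thresholds are illustrative guesses."""
    score = 0
    if p.age_days <= 3:                  # "her profile is two days old"
        score += 1
    if p.photo_count <= 3:               # "why are there only three pictures?"
        score += 1
    if p.friend_count <= 50:             # "she has 50 friends"
        score += 1
    if p.one_sided_friend_ratio >= 0.9:  # "they're all men"
        score += 1
    return score

# The profiles described in the episode trip every flag:
suspect = Profile(age_days=2, photo_count=3, friend_count=50, one_sided_friend_ratio=1.0)
print(suspicion_score(suspect))  # 4 -> unfriend
```

Any real detection system would weigh far more signals than this; the point is only that the pattern being noticed here is mechanically detectable.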
(1:00:44) But, um, what's scary about that is, like, I remember everybody sharing early, and, (1:00:53) and it's, this isn't an "I told you so," it was just the way we see the world. (1:00:57) I was like, you guys are dooming yourselves. (1:01:00) You're dooming yourselves. (1:01:03) And there was a point, okay. (1:01:05) There was a point in Nixon's administration where there was something said about, if you (1:01:09) wanted to make America, if you wanted to do better, like, be a good American, right? (1:01:13) Like, if you want to make more money or have better things, both of you work, both, both (1:01:19) husband and wife. (1:01:20) Yes. (1:01:20) Both parents. (1:01:21) Let's just say parents, because now, today — back then it would have been husband and wife, (1:01:24) right? (1:01:25) Have your wife work or whatever. (1:01:27) But then, did you notice, a single, a single-income family isn't really happening anymore. (1:01:34) Everyone works, because now it raised the level to the point where you can't do it as a single-income (1:01:41) family. (1:01:42) Expectation now. (1:01:43) Right. (1:01:43) Yes. (1:01:43) Now you need both just to keep your nose above that water level. (1:01:48) Yeah. (1:01:48) Right. (1:01:49) So what became like an idea, the intention was good. (1:01:54) Hey, if you want to do a little better. But by doing so, the people that did that raised, (1:01:59) right, raised the wage gap or raised inflation or whatever, because things grew. (1:02:03) And now look what happened. (1:02:05) But we spend more money that way. (1:02:06) Right. (1:02:07) But now we need that. (1:02:08) We need both incomes just to get by. (1:02:11) Yes. (1:02:12) You know? (1:02:13) So it's kind of like how the internet is, like, it was like, you're gonna need it. (1:02:18) We're dependent on some kind of social media, because without it, you and I, no one, you (1:02:24) and I can sit here and record for 42 hours at a time, and I'll do it with no listeners. (1:02:31) I don't want to, I'd love to monetize this. (1:02:34) Yes, of course. (1:02:35) But we would do that. (1:02:37) But if we wanted to make this a thing, we need the thing that's, the thing that we're (1:02:43) seeing is evil, or at least becoming evil, right? (1:02:47) The social media. (1:02:48) We need, we need the social media part. (1:02:49) Of course. (1:02:50) I understand. (1:02:50) We won't connect with people. (1:02:52) We won't be able to engage with a phone call. (1:02:54) People don't pick up the phones. (1:02:56) I won't even get a phone list. (1:02:57) Like, who has a phone? (1:02:58) Well, you could call a cell phone, but that's dumb. (1:03:01) Hi, I'm calling from Knocked Conscious. (1:03:05) Would you like to know more? (1:03:06) Would you like to listen to us? (1:03:08) Would you like to know more? (1:03:09) Yeah. (1:03:09) Hi, I'm with Knocked Conscious. (1:03:10) Would you like to know more? (1:03:11) And Starship Troopers. (1:03:13) Chex Marks Roughnecks. (1:03:15) Roughnecks. (1:03:16) Why is it that MySpace faded away? (1:03:23) It made some errors. (1:03:25) Like what? (1:03:26) I don't know, but it would be, it would have to be some error in understanding people. (1:03:34) Because, well, it transitioned a little bit. (1:03:36) It's actually more of like a musical thing now. (1:03:38) Really? (1:03:39) A lot of, yeah, there's actually a lot of independent artists that are on MySpace. (1:03:43) I didn't know it was still alive.
(1:03:44) If I may say, I'm, I believe that, um, Justin Timberlake is part owner. (1:03:51) I believe he, he bought it for like $5 million. (1:03:53) It's ridiculous. (1:03:54) He should have bought it for $5. (1:03:55) Yeah, but it was, like, value. (1:03:57) The valuation at its, at its peak was massive. (1:04:00) Yeah. (1:04:00) It was ridiculous, but it died just as quickly, because there's, obviously there's something (1:04:03) that wasn't right. (1:04:05) I don't think it changed. (1:04:06) I think it kind of stayed the same. (1:04:07) Yeah. (1:04:07) Facebook replaced it. (1:04:08) Right. (1:04:09) But Facebook changes all the time. (1:04:12) Yes. (1:04:12) But it also bought Instagram, right? (1:04:15) Oh yeah. (1:04:15) But it also grew and bought some other things. (1:04:17) MySpace thought, oh, we're good. (1:04:18) We're kind of the Xerox, right? (1:04:20) We don't need to make any change. (1:04:21) So by the time, you remember the top eight and all that bullshit it was trying to do (1:04:25) at the end, like, who are your top eight? (1:04:27) And then you'd be, you could be on somebody's top eight. (1:04:29) No, they just didn't do it right. (1:04:31) They just didn't understand the social part. (1:04:33) But it was too new, also. (1:04:35) Yeah. (1:04:35) They, I don't think they had, now, they didn't understand how to write an algorithm for that. (1:04:39) And I think, I'm sure, you know, the way, you know, like, let's be honest, Facebook was (1:04:44) really interesting, because those two twin brothers really came up with the idea, and (1:04:48) it worked at Harvard first. (1:04:49) Right. (1:04:49) And then Zuckerberg took it to Stanford. (1:04:52) Right. (1:04:53) And then it grew kind of the way it did, I think, but the idea and the way they did (1:04:58) it was really smart, because it was, MySpace was a little more, I think it was more random. (1:05:03) I think Facebook initially started like, you needed to be friends. (1:05:06) Correct. (1:05:07) Like, you needed to know each other. (1:05:08) Like, so I think it created even more of a bond for us to connect that way. (1:05:13) Cause it made it more serious of a connection. (1:05:16) Yeah. (1:05:17) And MySpace kind of became a joke, right? (1:05:19) Remember, my, from my place to my space, or my space to my place. (1:05:22) I remember people saying, oh, my grandma's on it. (1:05:25) I don't want to be on it anymore. (1:05:26) And then I remember the younger folks, cause I was on it when I was in my mid-thirties, (1:05:31) probably. (1:05:32) And I remember people going, I'm on Facebook now, cause my mom's not on Facebook. (1:05:38) And then that obviously was 10, 12 years ago, whatever. (1:05:42) So, and then it's, it's funny. (1:05:45) It's how things evolve, you know? (1:05:48) So apparently MySpace is a music site now. (1:05:51) MySpace is a very strong music site. (1:05:52) It's actually pretty neat, because a lot of independent artists go on there and they're (1:05:55) like, hey, check us out. (1:05:56) It's kind of like, almost like a podcast platform. (1:05:59) So it's pretty neat. (1:06:01) What's really interesting, it's not interesting at all, ladies and gentlemen, but I have a (1:06:05) MySpace account, and I don't remember the password. (1:06:07) Yeah. (1:06:07) I haven't logged in in many, many, many, many years. (1:06:10) My email is so old. (1:06:11) It's a pre-Yahoo email. (1:06:14) It's AOL. (1:06:14) No, it's one from, like, frontiernet.net. (1:06:18) Like, what?
(1:06:19) It was like, you know, the old wild west. When, man, we need a, we need a modem sound on (1:06:25) the board. (1:06:26) Oh, don't ever do that again. (1:06:28) That was, oh God, I hurt my ears. (1:06:31) That's the wrong misophonia. (1:06:33) We're on the wrong. (1:06:34) You know what? (1:06:35) We never talked about how annoying that is. (1:06:37) We didn't. (1:06:37) That is. (1:06:38) I hear that at work every day, though. (1:06:39) Yeah. (1:06:39) Really? (1:06:40) Yeah. (1:06:41) How many times do you have to dial in, bro? (1:06:43) I have like 185 analog lines in my building, dude. (1:06:46) Thank God for COVID. (1:06:48) I don't have to go to work. (1:06:49) Not bad, bro. (1:06:51) Cause I love my flip flops. (1:06:52) Love that COVID-19. (1:06:54) My flip flops and my tank top. (1:06:55) I do love them. (1:06:56) Flip flops and tank top. (1:06:57) Flop lips. (1:06:59) And the, I think it's, I think it's interesting the way that MySpace rose and fell, and (1:07:06) then Facebook replaced it, and then Instagram came along, and then Twitter came along, and (1:07:13) then they, Twitter blew up. (1:07:15) It seems like everyone's, it seems like everyone on that, on that show on Netflix talks about (1:07:23) Twitter and how everyone gets their news from Twitter. (1:07:26) Everyone gets their reports from Twitter. (1:07:28) Hey, my flight's delayed, because Delta Airlines tweeted out the Dallas airport shut down. (1:07:33) So they, everyone gets instant information from Twitter, and it has nothing to do with Facebook (1:07:38) or Instagram. (1:07:39) And I find that fascinating, that it's, it's, it's, it's more than a social media tool. (1:07:46) It's an everything tool. (1:07:48) I, that's, that's crazy to me. (1:07:51) How do you keep up with that? (1:07:54) Used correctly, (1:07:56) it's an information tool. (1:07:58) Used the way it's being used, (1:08:00) it's kind of a misinformation tool. (1:08:03) I see why you say that. (1:08:05) Um, to your point, think about the double-edged sword, right? (1:08:09) The shot heard round the world. (1:08:11) Think about if Twitter existed when the Revolutionary War started. (1:08:16) The shot heard round the world. (1:08:17) Yeah. (1:08:18) Would have been 13 seconds. (1:08:20) China would have been like, holy shit, America's going to war. (1:08:23) Right. (1:08:23) Not nine months fucking later. (1:08:25) Yeah. (1:08:25) By boat or what? (1:08:27) You know what I'm saying? (1:08:27) Like, yes. (1:08:28) Just think about that. (1:08:29) I remember, do you remember the airplane in San Francisco that went off the edge of the (1:08:34) runway? (1:08:34) People were live-tweeting that or whatever, while it was happening, while they were on (1:08:38) the freaking plane. (1:08:40) That is, um, that is amazing. (1:08:43) That is amazing. (1:08:46) But the other side is, I can be slanderous about you, and no one knows you. (1:08:51) And it'll be in China, in a billion people's faces, on a billion people's screens. (1:08:56) Yeah. (1:08:57) If I have the right pull, and now you're a piece of shit. (1:09:00) Yeah. (1:09:00) And that, and they don't even know you. (1:09:02) And that piece-of-shit comment may be completely false. (1:09:06) I'm saying, I'm telling you, I would make up something. (1:09:09) And they may say, has amazing boobs. (1:09:12) And you're like, uh, Czech Mark has no boobs. (1:09:17) Why would somebody say that? That truly is the definition of fake news. (1:09:20) That's a lie. (1:09:21) It's not, that's, that's a lie. (1:09:23) Look, I'm sorry.
(1:09:24) But when you tweet Trump is a fascist, or you tweet Biden is a pinko commie lefty or (1:09:30) whatever, those are, those are, those are not truths. (1:09:33) Yes. (1:09:34) Those are not truths. (1:09:36) Yes. (1:09:38) However, what do you think gets the most engagement? (1:09:41) Yeah. (1:09:41) As, as they stated in that show, it's the things that outrage people that get the most attention, (1:09:48) and those get the most hits, and those suck up the most screen time, because people, (1:09:52) that evokes them. They never said this, but to me, my interpretation was, it evokes the (1:09:58) most emotional response. (1:09:59) Yes. (1:10:00) But they never said that, but it seemed obvious. (1:10:02) Well, outrage is an emotional response, and that's the most strong, that's the strongest (1:10:05) one. (1:10:06) Correct. (1:10:06) It's the most negative. (1:10:07) What's funny, that I, that was one thing. (1:10:09) Thank you for bringing that up. (1:10:10) You're welcome. (1:10:11) He said, do you feel an outrage coming on, like an onset of outrage? (1:10:18) You don't actually feel outrage. (1:10:20) You become outraged. (1:10:22) Outrageous. (1:10:23) You become outraged. (1:10:24) Outraged. (1:10:25) Right. (1:10:26) Like, when he said that, I was like, holy shit. (1:10:28) I never thought about, I never, I never walk in going, you just get outraged. (1:10:35) Yes. (1:10:36) With more stuff thrown in your face. (1:10:39) Yes. (1:10:39) It becomes cumulative. (1:10:41) And I didn't think of it from that perspective, that you get it versus have it. (1:10:46) Yeah. (1:10:48) And that's a good part of the manipulation, is they're giving it to you. (1:10:51) Yeah. (1:10:52) And you don't, you can't avoid it, because you don't even know. You think that you're (1:10:56) doing it. (1:10:57) Like, you think that you are it, right? (1:10:59) You think you are that angry person. (1:11:01) So, like, you think even less of yourself, but they're pulling you into these different (1:11:07) directions to make you feel that way. (1:11:09) They're trying to evoke that emotional response, to your point. (1:11:14) And it's scary. (1:11:15) It's kind of like the thing when, when 10 people have a good experience, one person will (1:11:19) write about it. (1:11:19) But when one person has a bad experience, they'll send it to 10 people. (1:11:25) And what was that growth thing? (1:11:27) In that, in that school, what was that school called? (1:11:30) The school of persuasive fucking persuasion. (1:11:33) Yeah. (1:11:33) Persuasion, that Stanford school of persuasion and learning or whatever. (1:11:36) He's like, or the Facebook guy's like, I need you to add seven new friends every 10 (1:11:40) days. (1:11:42) That's like a fucking formula part. (1:11:44) It was when the guy was sitting in the chair talking about how brilliant they are at (1:11:49) getting people to engage or grow. (1:11:50) But basically, in that documentary, once again, it's an hour and a half, you can miss (1:11:55) some parts. (1:11:55) It's easy. (1:11:58) Seven people, they wanted to, they needed you to add seven friends every 10 days. (1:12:02) Why? (1:12:03) That's how they got this growth model to do what it did, exponentially, to become what (1:12:07) they, they have models about: if people are adding seven new friends every 10 (1:12:12) days, it'll be X by X, (1:12:15) and we can charge Y for Z. (1:12:18) That's interesting. (1:12:20) It became a formula.
(1:12:21) From a financial perspective, they say, okay, we need each person every 10 days to add (1:12:27) seven friends. (1:12:28) That equals this many (1:12:31) clicks. (1:12:32) Clicks, which equals this much revenue for this many advertisers, which then in theory (1:12:38) will increase our stock price by Z. (1:12:41) Yeah. (1:12:41) That's, that's insane. (1:12:43) Yeah. (1:12:44) And all, remember that whole human thing about connecting us? (1:12:47) No, now you're just a formula, because you just need to be plus seven every 10 days. (1:12:52) Yeah. (1:12:53) I'm not even a name. (1:12:54) Yeah. (1:12:54) We're just a number. (1:12:55) Once again. (1:12:56) Yeah. (1:12:57) Just, we're just a freaking number. (1:12:58) And I have to work, I have to fix that, because that squeak is not doing well. (1:13:03) Sorry, Twitter world. (1:13:04) I hear it. (1:13:05) Maybe you don't. (1:13:05) You do? (1:13:06) Yeah. (1:13:06) Oh, I hear it, but I'm sitting next to it. (1:13:09) That's true. (1:13:09) Maybe, maybe that's all I'm hearing. (1:13:11) Maybe I'm not hearing it through the mic. (1:13:12) Uh, so, uh, yeah, to your point is, like, you become outraged, right? (1:13:20) They get you, and you, they grow exponentially, and you're just another fucking stat. (1:13:28) Yeah. (1:13:28) When they said, oh, there's a billion, there's a billion people with smartphones, (1:13:33) I thought, wow, I'm one of a billion people. (1:13:39) That's fucking crazy. (1:13:40) That's less than 15% of the population, and it's still a billion people. (1:13:46) One seventh, right? (1:13:47) Yes. (1:13:47) Yes. (1:13:48) Yes. (1:13:48) So it's like 12, 13%. (1:13:49) Yeah. (1:13:49) Because you have a million. (1:13:50) Well, yeah. (1:13:51) You have tons of kids under the age of 12, whatever you've got. (1:13:54) Oh, you've got a billion folks, whatever, you know? (1:13:56) Yeah. (1:13:57) And then you've got, you've got other places, people that live in the Amazon (1:14:01) rainforest. (1:14:02) We're in America, bro. (1:14:03) Everyone's got two phones, if not three. (1:14:05) Uh, so we account for the billion, right? (1:14:07) So that number gets kind of watered down a little bit. (1:14:10) Some people have two phones. (1:14:11) A lot of people at work. (1:14:12) Work phones. (1:14:13) Right. (1:14:13) So if you think a billion total and everyone has two, you're not at half a billion, but, (1:14:19) you know, you're maybe at 800 million, 900 million. (1:14:22) Yeah. (1:14:22) I understand. (1:14:23) And then you just keep watering it down. (1:14:24) But, um, yeah, and, and they have us, they have us by the short and curlies. (1:14:32) They know exactly what, they know exactly what they need to tell us to keep us there. (1:14:39) And if they don't know this time, they're learning for next time. (1:14:43) Yeah. (1:14:43) Have you seen Terminator? (1:14:45) Have you seen my baseball? (1:14:47) But that's the funny thing. (1:14:49) They even mentioned Terminator, right? (1:14:50) It ruined things like, right. (1:14:52) Fire and brimstone. (1:14:53) No, my friends. (1:14:54) This is just different. (1:14:55) Fire and brimstone. (1:14:56) It is just controlling in a totally different way. (1:14:58) We all will need this. (1:14:59) And it's scary. (1:15:02) A couple of days ago, I, uh, got up, fed the dogs, made coffee, logged into work. (1:15:11) And about one or two o'clock, I went, oh, hey, I haven't checked any social media today. (1:15:16) How about that? (1:15:18) So I'm like, wow.
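To make the "plus seven friends every 10 days" chain described above concrete — new ties pull in new users, users generate clicks, clicks become ad revenue — here is a minimal sketch. Every constant in it is a made-up placeholder, not a figure from the documentary or from Facebook:

```python
# Toy version of the "X by X" growth-to-revenue chain: new friend ties
# recruit new users, whose clicks become ad revenue. All constants below
# are made-up placeholders, not real figures.

users = 1_000_000              # hypothetical starting user base
NEW_TIES_PER_PERIOD = 7        # the "+7 friends every 10 days" target
RECRUIT_RATE_PER_TIE = 0.01    # assumed chance a new tie pulls in a non-user
CLICKS_PER_USER_PER_DAY = 20   # assumed engagement level
REVENUE_PER_CLICK = 0.01       # assumed ad revenue per click, in dollars

for period in range(1, 10):    # each period is 10 days
    users *= 1 + NEW_TIES_PER_PERIOD * RECRUIT_RATE_PER_TIE  # compounding growth
    daily_revenue = users * CLICKS_PER_USER_PER_DAY * REVENUE_PER_CLICK
    print(f"day {period * 10:3d}: {users:12,.0f} users, ~${daily_revenue:,.0f}/day in ads")
```

Under these invented constants the user base compounds about 7% every 10 days, which is the sense in which a flat friends-added quota turns into exponential growth in clicks and revenue.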
(1:15:20) Holy shit. (1:15:21) Cool. (1:15:24) Yay me. (1:15:27) I'm giving you a clap. (1:15:28) And then I raced to my Twitter account. (1:15:29) No, I did not. (1:15:30) Broke your leg, broke your ankle. (1:15:32) My pinky toe. (1:15:33) I do. (1:15:34) I'll be honest. (1:15:35) I'm, I'd like to pat myself on the back for as much dedication as I'm putting into the show (1:15:39) and trying to get it out there. (1:15:41) Like last night, uh, magazine and I were having dinner at like six-ish or so. (1:15:46) And I'm like, you know what? (1:15:48) I'm not going to touch my phone or the computer. (1:15:50) Good. (1:15:50) Put it down until she went to bed. (1:15:53) And then I got on there, and I'll, I'm going to be honest, (1:15:57) I felt overwhelmed, because so much stuff had gone by, and I have to go back. (1:16:01) Right. (1:16:02) Cause I'm looking for specific things. (1:16:03) So I'm scrolling through hours, five, six hours of stuff. (1:16:10) Yeah. (1:16:12) And it's impossible to have it not go in my head and then bounce around, bro. (1:16:16) I mean, you know me. (1:16:17) So I see these things, and it's just like the most radical, or just like. (1:16:22) Someone's asking for a left, leftist communist podcast. (1:16:26) That sounds awesome. (1:16:27) Is one out there? (1:16:29) Sure. (1:16:30) We need to look at the hammer and the sickle. (1:16:32) I am sickles. (1:16:34) These police. (1:16:34) Oh, it's funny. (1:16:36) We need the hammer and the sickle. (1:16:37) We need the hammer in the pickle. (1:16:38) Oh, yes. (1:16:40) The pickle is delicious. (1:16:41) Kosher dill. (1:16:42) Yes. (1:16:42) Yes. (1:16:43) The dill pickle. (1:16:44) What's the dill, pickle? (1:16:46) So, but I mean, you just hear weird stuff like that. (1:16:49) Like I, one of my favorites, that's what's great about podcasting as a whole, is like, (1:16:53) I'm looking for a center-left, pro-life, (1:16:59) anti-gun, pro-death-penalty. (1:17:03) I'm sorry. (1:17:04) Anti-death-penalty. (1:17:06) Anti-death. (1:17:07) Nigerian woman who used to be a man. (1:17:11) Whoa. (1:17:11) And it's like, and that's great. (1:17:13) Look, you know, transgender, I guess, would be the term or whatever. (1:17:16) But it's like, the niche is so specific about what people are looking for. (1:17:19) And honestly, that's where Twitter should be used as a tool. (1:17:22) It's like, I know somebody who knows that. (1:17:24) Boom. (1:17:24) Here you go. (1:17:25) Yeah. (1:17:25) Yeah. (1:17:26) I did, you, I don't know if you saw some of the tweets earlier today, (1:17:28) but there was a woman who was asking about music. (1:17:32) And I know we talk about music, but not the way she was asking. (1:17:35) She was asking for, I'm looking for more of, like, how they talk about the song, (1:17:38) not whether it's good or not. (1:17:40) And you had shown me Amazon Prime, (1:17:43) the Classic Albums thing, where they break down Peter Gabriel's "So." (1:17:46) Yes, that was really good. (1:17:48) It was awesome. (1:17:48) Like, and they had all those albums. (1:17:50) Have you seen all of them? (1:17:51) I've seen, yeah. (1:17:52) Iron Maiden and all the other ones, right? (1:17:54) Duh. (1:17:54) Number of the Beast. (1:17:55) I was going to send it to you. (1:17:56) I'm like, I know you sent it to me. (1:17:57) I'm like, you sent it to me. (1:17:58) So I've seen the AC/DC one. (1:18:00) I've seen the Iron Maiden. (1:18:01) The Peter Gabriel "So" one is amazing to me. (1:18:03) I just love the way his mind works.
(1:18:05) So I told her about it, and she was like, (1:18:08) oh my gosh, this is exactly what I was looking for. (1:18:11) Is it our podcast? (1:18:12) No, but it's still cool. But I was able to help someone from thousands of miles away, (1:18:16) or who knows, tens of thousands. (1:18:17) I don't know where this person lived, or next door, (1:18:20) but I was able to help this person find exactly what she wanted, (1:18:24) because I happened to be there when she happened to be there. (1:18:26) Yeah, that is a beautiful thing. (1:18:30) But the other side of that is just, is it worth? (1:18:34) Is it worth that? (1:18:36) So, like, so the question becomes, is the juice worth the squeeze? (1:18:43) I, I don't, I don't know how to answer that. (1:18:45) Or do, do we regulate, or is it too late, or is it out of the box? (1:18:50) I mean, the way this, I, the way this AI stuff they were talking about, (1:18:54) it's smarter than them. (1:18:55) They don't know what it's going to do next. (1:18:59) Do you remember when I said that's when we're in trouble? (1:19:01) I didn't know it already existed. (1:19:03) When I said, when it does something that you don't want it to do, (1:19:07) it overrides what you're expecting it to do or whatever, (1:19:10) that's a problem. (1:19:11) You think? (1:19:12) Yeah. (1:19:13) And we're there. (1:19:13) Oh, that's like when you said about the car. (1:19:15) Yeah. (1:19:16) I want to turn left. (1:19:17) No, you're going to go straight. (1:19:18) You're going to go straight. (1:19:19) Yeah. (1:19:20) So do you think we should regulate? (1:19:23) You know me, man. (1:19:24) I am, I'm a business guy. (1:19:26) I believe in business. (1:19:29) Um, I would love, we'll talk a little bit about that, (1:19:33) because we didn't talk about what the problem is, (1:19:34) because that's the big part that I want to get to at the end. (1:19:37) Um, yes, I, I would like to treat the internet as a whole (1:19:42) as a utility, in that I can't create an internet company on my own. (1:19:48) No one, no one really can. (1:19:50) Therefore, it should be treated like gas, electricity, water, power. (1:19:56) Yeah. (1:19:57) All that. (1:19:58) That's my opinion. (1:19:58) What's your thought on that? (1:20:01) I don't have a problem with it being treated like a utility, (1:20:07) but I, I just think that government is corrupt and. (1:20:15) And behind. (1:20:16) Yeah. (1:20:17) And antiquated. (1:20:18) Antiquated, yes. (1:20:18) So thinking about, oh, we need to regulate Facebook. (1:20:22) We need to regulate Instagram. (1:20:24) With old white guys in government. (1:20:26) Right. (1:20:27) They know. (1:20:28) Yeah. (1:20:28) It's, that's idiotic, because it's, it's, it's, there's, (1:20:33) they're more corrupt than the AI that's feeding me bullshit on my fucking feed. (1:20:39) What if they're forced to, what if they're, like, like a package on cigarettes? (1:20:44) I know that's a simple thing. (1:20:45) It's not going to solve the problem. (1:20:47) You know how, when you go on a website now, it talks about cookies. (1:20:49) Yeah. (1:20:50) This site has cookies. (1:20:51) What if it just said all the shitty things that each social media thing is, like, (1:20:55) hi, we're Twitter. (1:20:56) Just so you know, you may be shadow banned. (1:20:59) Just so you know, we may. (1:21:01) You might, you might see porn. (1:21:02) We might bar you. (1:21:03) Right. (1:21:04) Like, yeah. (1:21:04) What if there was something like that, it's like, we're not what you think we are. (1:21:10) We're, this is who we are.
(1:21:11) And guess what, guys, you need us, because we're the only game in town. (1:21:16) And if any other game comes to town, they're going to do it the same way, (1:21:19) because it's the only way to do it. (1:21:21) Right. (1:21:22) It's the money model. (1:21:23) Yeah. (1:21:23) So if they came out and just were, like, honest with us, don't you think, like, you would, (1:21:27) we'd at least be more vigilant about what we share, or. (1:21:30) No, we wouldn't. (1:21:31) I know. (1:21:31) Because, no, I don't have a problem with, with Twitter and Facebook and Instagram (1:21:37) — I don't either — (1:21:38) doing what you said and having it, having a, basically, like, an acceptance policy. (1:21:43) And I don't have a problem with it being what it is, to be honest. (1:21:45) If you, if you, if you put a gun to my head, I don't have a problem with what they're (1:21:48) doing and how they're doing it. (1:21:50) I believe that it's unethical. (1:21:52) It is. (1:21:53) However, it's, it's, as a human being, it's your choice to put it down. (1:21:59) Yeah. (1:22:00) You know, and if, and if we want to, if we want to classify an addiction to, to a social (1:22:06) media app up there with alcoholism or heroin, I think that's stretching it a little bit. (1:22:13) But if we want, let's say, okay, oh my God, I'm 17 and I'm addicted to Facebook. (1:22:18) I can't put it down. (1:22:19) Blah, blah, blah, blah, blah. (1:22:20) Okay. (1:22:21) Well, then we need to have an intervention. (1:22:22) You need to go to a meeting, and we have to, we have to address it like a drug addiction. (1:22:26) I think addiction is actually the way it is going, because, (1:22:32) unlike you and I, who experienced email in our twenties. (1:22:38) That sounds like fun. (1:22:39) Sounds like motorcycles. (1:22:40) Yeah. (1:22:41) Um, um, unlike you and I, who experienced it in our twenties, these kids are growing up with it. (1:22:46) So it's probably wiring their brains in a very different way. I've always wondered (1:22:50) that. (1:22:50) So think about video game addiction. (1:22:52) Think about that kind of, I, I do see it potentially becoming a very big health issue. (1:23:00) When you give a two-year-old an iPad, and they know how to click on it better than you. (1:23:04) And then I saw a video years ago of a two-year-old. (1:23:08) They gave the two-year-old a magazine, and it was clicking on the magazine like it was (1:23:13) an iPad, and it wasn't reacting, because it was a magazine. (1:23:17) So the two-year-old got upset, because it was so used to an iPad. (1:23:22) Uh, you need to send me that link, sir. (1:23:24) Okay. (1:23:25) I'll have to find it. (1:23:26) It's been, it's been a couple of years, but the point is, this is a, (1:23:29) this is a very side tangent, but look, this is all related. (1:23:33) These, yes. (1:23:33) The thing about Knocked Conscious that you and I have talked about is, this isn't just a, this (1:23:37) isn't a bitch session. (1:23:39) This is us finding a challenge in the world that we would like people to just be a little (1:23:44) more vigilant of, and maybe open their eyes a little bit to, uh, something they're not (1:23:49) aware of. (1:23:49) Absolutely. (1:23:50) That's all.
(1:23:50) But I've always wondered, when you, and I'm not a parent, so I have, I have no, this is (1:23:56) completely a neutral statement. (1:23:58) What happens to a child at the, at a very, very toddler age, barely out of diapers, and (1:24:05) you give that toddler an iPad or whatever tablet to, hey, Jesus, I just need to make (1:24:12) dinner. (1:24:13) I'm just going to give the baby the iPad for a half hour. (1:24:15) Okay. (1:24:16) Well, if that turns into a half hour every day, a half hour, five days a week, that child is (1:24:23) conditioned, is, is, is, is raised with that in their hand. (1:24:29) So what does that do to the development of the child's brain and emotions and psyche? (1:24:36) I don't know the answer to those questions, but I'm very, it's very, I find that very (1:24:40) interesting. (1:24:42) An analogy, if I may. (1:24:46) Back in the day, when you and I were growing up, we had these televisions that (1:24:51) were called cathode ray tube televisions. (1:24:53) Yes. (1:24:54) CRT. (1:24:55) I mean, a 25-inch TV was big. (1:24:58) I think the biggest you could get was like 32, 35 tops. (1:25:02) And the thing was a behemoth. (1:25:05) Did you have to hit it on the side? (1:25:06) Yeah, of course. (1:25:07) The rabbit ears, the tinfoil, all that. (1:25:09) Okay. (1:25:11) These big glass, the thing weighed hundreds of pounds. (1:25:14) And it was only, you know, we have a 55-inch diagonal TV now that you can pick up and (1:25:21) throw like a Frisbee. (1:25:22) I mean, it's ridiculous. (1:25:23) It took three of us to get this thing, just because of the glass in the front, just (1:25:26) the size. (1:25:27) Okay. (1:25:27) So we had those. We couldn't put those in our pockets and leave, but I remember how (1:25:32) addicted we got to television. (1:25:34) Think about us as kids. (1:25:36) Oh, G.I. Joe's on. (1:25:37) Oh, you got to go out and play. (1:25:38) No, but, but, but G.I. Joe's on, and we couldn't record it. (1:25:41) It wasn't on demand. (1:25:42) If we didn't see it that day, we wouldn't see it. (1:25:44) But what did we do? (1:25:45) We went outside or whatever. (1:25:47) We didn't take that TV. (1:25:48) We couldn't take the TV with us. (1:25:49) If we could, we'd probably have that same addiction to, like, to that as, you know, in (1:25:56) that strength, as they do, because these screens are portable now. And it'd be great if there (1:26:01) was, like, before age five, it's all education, boring-ass, like, one plus one equals, and then (1:26:09) pick the number. (1:26:09) Nothing like, you can do a little sparkles here and there, but nothing like really the (1:26:14) information overload. (1:26:15) Well, not the, the, the entertainment overload. (1:26:18) The, the part that makes you feel good about visualization and the graphics and all that (1:26:22) shit. (1:26:23) That's what hooks you. (1:26:24) Right. (1:26:24) Cause it's pretty, mom, and it, oh my God. (1:26:26) Oh, the dopamine blings. (1:26:28) Right. (1:26:28) Okay. (1:26:28) Yeah. (1:26:29) It's almost like, just make it more educational. (1:26:31) So it's more, C-A-T equals, like the Speak & Spell, or the little thing where you're pulling (1:26:36) the string, right? (1:26:37) Yes. (1:26:37) Moo. (1:26:38) No, it was cool. (1:26:38) The C-A, the, the cat thing, C-A-T, like, holy crap. (1:26:44) Remember?
(1:26:44) I mean, that was huge for us, but obviously it's that on steroids, on meth. Make it just (1:26:50) boring, educational, so that it's not such a contrast to leave it, when you go into (1:26:58) the real world. Because what you, what you see on that screen are rainbows and sprinkles (1:27:02) and sparkles and, and jimmies and glitter and unicorns, right? (1:27:08) Yeah. (1:27:09) Puppies and cats. (1:27:10) Oh my God. (1:27:10) Kittens. (1:27:11) The real world is, like, a drudgery. (1:27:14) It's not a fun. (1:27:15) It's not the, it's not the greatest place in the world. (1:27:19) Well, it's not the greatest place in the world. (1:27:20) The world's not the greatest place in the world, but you don't say, like, I do, we've (1:27:23) got bills, and we've got, you know, mortgages and, and, and stresses and work and, and all (1:27:29) these other, you know, interpersonal challenges. (1:27:33) And getting to escape into this screen is so enticing. (1:27:37) If, I mean, man, growing up with that, we were so lucky to not. I think I, I think (1:27:42) I would be, I'd be in a lot of trouble. (1:27:45) See, that's why I never, I never bought a gaming system. (1:27:51) So, cause I, I knew, I, you know, I, PS2, PS3, blah, blah, you know, I see the (1:27:57) graphics and I'm like, I would be so addicted to Doom and, blah, blah, blah, Mortal Kombat. (1:28:05) And then the next one comes out, and then, you know, the shit that I, it's just, Final (1:28:10) Fantasy XIV. (1:28:12) It's ridiculous. (1:28:13) I would be, that's all I would do. (1:28:16) I would get off work and I would do that shit for eight hours. (1:28:19) So that's why I never, I'm like, no, no, thank you. (1:28:22) I'm good. (1:28:24) My brother and I had an Atari 2600 growing up. (1:28:27) And I was always pissed. (1:28:28) My dad would not buy me. (1:28:29) It would have probably been '79 or '80. (1:28:31) My brother was born in '71, same year as you. (1:28:34) So that would have been probably when he was eight, probably '80. (1:28:38) So he was nine and I was six or so, but you know, that went away over time. (1:28:43) Obviously. (1:28:44) Right. (1:28:44) The cartridges, whatever. '83, '84, '85, the Atari dumps, they just, they, they dumped all those (1:28:51) E.T.s into the dumpster, and whatever, the E.T. games. (1:28:55) Yeah. (1:28:55) Anyway, so the next one's Nintendo. (1:28:58) Now I didn't, I didn't get, there was a big gap between the 2600 (1:29:02) and when I got a Nintendo. I got a Nintendo as a gift, because I flew out of a car at 50 (1:29:07) miles an hour. (1:29:08) Not because, like, you know what I mean? (1:29:11) I could use an A on your report card. (1:29:13) Funny story. (1:29:14) It was a gift from my mom's boss. (1:29:18) The Van Halen guy. (1:29:20) Matthew's mother. (1:29:21) Wow. (1:29:22) Yeah. (1:29:22) She gifted me a Nintendo. (1:29:25) The NES, the original NES with the, you know, up, up, down, down, left, right, left, right, (1:29:29) B, A, B, A, start. (1:29:34) No, I was gifted that because I was, I was 19 and I had a broken leg, and I was in a body (1:29:39) cast for, for months. (1:29:41) So it's like, that's how I got a gaming system. (1:29:44) I didn't do it to get a gaming system, but that's how I got it. (1:29:48) I don't think I would have had it, but then that actually did get me hooked. (1:29:51) I got the Genesis. (1:29:52) Yeah. (1:29:52) I got the PlayStation 1. (1:29:54) I got the Xbox. (1:29:55) I didn't get the original Xbox, but then I went to Xbox 360. (1:29:58) I still have one.
(1:29:59) And then the X, and then the S, and then the, and now there's a newest one. (1:30:03) I told magazine I'm done. (1:30:05) I'm not buying the newest. (1:30:06) There's a new one that's coming out now, or that just did. (1:30:09) And I think PS5 just came out, and I'm like, nope, done. (1:30:12) I am. (1:30:13) I'm just cutting it off. (1:30:14) Cause, you know, I got to turn my hat back around. (1:30:17) Yes, you do. (1:30:18) Did I tell you? (1:30:19) No. (1:30:20) About the hat thing? (1:30:21) No. (1:30:22) I think I know where I heard it from. (1:30:25) George Carlin. (1:30:26) The man. (1:30:27) But he said 10 years old. (1:30:29) Yes. (1:30:30) You sent me the video. (1:30:31) Okay. (1:30:31) I did send you the video. (1:30:32) Okay. (1:30:32) I just want to be clear. (1:30:33) Yes. (1:30:33) Just hello to the world. (1:30:35) Just to let you know, we were talking about when, when a man should turn his baseball cap around. (1:30:39) And I made a joke about, at 40 he should turn it around, but I think I might have stolen that (1:30:44) from George Carlin. (1:30:45) So I want to give him the credit. (1:30:46) He said 10 years old. (1:30:47) May he rest in peace. (1:30:48) He said 10. (1:30:48) Yeah. (1:30:49) Could you, could you help me out here, you guys? (1:30:51) I bet you Eddie, Vinnie, and Timmy are going to beat the crap out of Kyle and Todd. (1:30:58) Todd. (1:30:59) Todd. (1:31:00) Kyle. (1:31:00) What else is in your notes? (1:31:02) Well, so this was the question. (1:31:07) Do you remember the, when they asked, what's the problem? (1:31:10) I do. (1:31:11) Did you have a single word that they said in your head? (1:31:15) Is there a problem? (1:31:17) And what's the problem? (1:31:18) That's the quote that they said. (1:31:20) Yes. (1:31:22) And obviously they all know there is one. (1:31:24) And then the guy, the person reiterated, well, what is the problem? (1:31:27) And that's when the guy chuckled, Tristan chuckled, and then it cut in. (1:31:31) Did you have a thought right away? (1:31:34) Well, it's the same thought I had going into it: greed. (1:31:39) And it's the top, the way I see it, is it's the top. (1:31:44) It's the board of directors of each company doing whatever is required to maximize (1:31:50) shareholder wealth, and I don't, I mean, and borderline everyone can see it. (1:31:58) You know, it's, it's unethical, what they're doing, because they understand that (1:32:02) there's a psychological and emotional impact that each app is having on each person. (1:32:09) They, they know that. That's the fact, is, do you, do you agree that it's a fact? (1:32:15) Yeah. (1:32:16) So. (1:32:18) Uh, 100%. (1:32:19) I think the real question is, do these social media companies have a (1:32:25) moral or ethical obligation to (1:32:30) be moral and ethical? (1:32:31) Yeah. (1:32:32) To not be dicks. (1:32:35) To not be evil. (1:32:36) Like they removed it. (1:32:37) Like that, but said different, but totally the same. (1:32:40) Said the same, but interesting. (1:32:44) You went to the corporations. (1:32:46) Well, cause this, I swear to God that, well, real quick, real quick. (1:32:51) Go ahead. (1:32:52) T-shirt time. (1:32:52) No, you were going to say, well, what you're, you had a thought. (1:32:55) So I, it's the same thought I had when I, when I, before I even started watching it. (1:33:00) Cause I already, I kind of saw it as greed, and it's greed. (1:33:03) It's all, everything revolves around money. (1:33:05) And that's really, it sucks. (1:33:07) Cause it doesn't, money
(1:33:09) doesn't make you happy. (1:33:10) And the rich are already rich. (1:33:12) How much more rich do you need to be? (1:33:15) Excellent points. (1:33:18) I find it interesting, because I don't, it, it's, the algorithms were written, but we use them. (1:33:26) So, like, it's funny, because the second they said, what's the problem, (1:33:31) I went, humans. (1:33:34) Cause that's the truth. (1:33:35) That's the answer that Tristan says. (1:33:38) It's like a thing without a name, right? (1:33:40) Remember? (1:33:40) It's like a, it's the center, but it doesn't have a name. (1:33:43) It does. (1:33:44) It's human monsters. Humans, bro. (1:33:50) And there's nothing wrong. (1:33:52) They're just using our deficiencies against us. (1:33:56) They're the algorithm we're typing. (1:33:59) We're telling the algorithm how to change us by what we input into it. (1:34:04) Yeah. (1:34:05) Yeah. (1:34:05) I like Slayer. (1:34:06) Oh, here's another video by Slayer. (1:34:08) Right? (1:34:08) Yeah. (1:34:09) I know. (1:34:09) Okay. (1:34:10) So none of it, in my opinion, none of it is even the problem of (1:34:16) the greed. Because if we were more vigilant, like you and I, sometimes we try to look behind (1:34:25) the curtain, or look for the strings, or try to squeeze extra toothpaste out of the tube. (1:34:30) We, we try to look for the system behind. And all these things are blowing up around us. (1:34:37) This is a very weird time in our world. (1:34:39) Yeah. (1:34:40) And it's because no one decided, they all decided to look all at the same time. (1:34:45) Right. (1:34:46) Is it, is it me, or did they all just go, okay, here's everything. (1:34:49) Everything just went to shit. (1:34:51) We've been gradually watching these changes happen, going, (1:34:54) is everyone aware of this? (1:34:56) Yeah. (1:34:56) But we're busy over here. (1:34:58) You know, it's like, you know, pay no attention to the man behind the curtain, or don't look (1:35:02) at my left hand. (1:35:03) Just look at my right hand. (1:35:04) I'm going to distract you while I pickpocket you or whatever. (1:35:07) And it's humans. (1:35:09) That's not a bad thing. (1:35:10) We have been, we have been evolving over hundreds of thousands and millions of years, but computers (1:35:21) and technology, in 50 years, they said 100 quadrillion times the processing power. (1:35:27) It was crazy. (1:35:28) 100 quadrillion. (1:35:29) I don't even know what that number is. (1:35:31) You can't even process that in your brain. (1:35:32) I don't even know what that number is. (1:35:33) Yeah. (1:35:33) 100 quadrillion, whatever. (1:35:34) The one guy said trillion, but according to the graph, it read 100 quadrillion. (1:35:39) I remember that. (1:35:39) It was crazy. (1:35:40) I recall. (1:35:41) And we've only evolved 50 years under roofs. (1:35:44) We haven't evolved at all. (1:35:46) We have not evolved the way that has. (1:35:50) That is done. (1:35:53) That's, it's done in 50 years what took, like, 99% of our evolution, right? (1:35:59) Cause we are, you and I believe, within 20, 30 years, we're going to have that Turing test (1:36:03) passed, right? (1:36:03) Where that AI is going to sound just like a human being. (1:36:06) Oh yeah. (1:36:07) And look like it and feel like it. (1:36:08) All that. But on a phone, or in some kind of communication, I will not know that it's (1:36:12) not a person, right? (1:36:13) Sex bots, bro. (1:36:14) Oh, sex bots. (1:36:15) Hashtag sex bots for Woodsy. (1:36:18) Woodsy bots. (1:36:23) Sex bots for what?
(1:36:24) Hashtag sex bots for Woodsy. (1:36:25) That's so funny. (1:36:26) We're going to have to, we're going to have to, we need to, we need to get our merch side (1:36:29) up. (1:36:30) Okay. (1:36:30) Hello to the world. (1:36:31) If anyone knows about merchandise, we'd like to make canes for you. (1:36:35) Dot com. (1:36:36) Yeah. (1:36:36) But Teespring, they're quality shirts. (1:36:38) Kind of. (1:36:38) Really? (1:36:38) Okay. (1:36:39) Yeah. (1:36:39) I'm a little, I was a little disappointed. (1:36:40) I've heard some things, but I've heard they're a good company. (1:36:43) I'm not slandering the company. (1:36:44) I'm still looking. (1:36:44) Cause there's, like, you know, you, you beer-Google it and you're like, good company for (1:36:50) merch, (1:36:51) and there's, like, eight of them. (1:36:52) And then you have to read all of them, and they use this and not that. (1:36:54) And, you know me, I have to go down each rabbit hole on its own. (1:36:58) I know. To summarize, (1:37:00) you believe the earth is flat. (1:37:02) Yes, it is. (1:37:03) Absolutely. (1:37:04) Just like a plate. (1:37:05) Actually. (1:37:05) You know what? (1:37:06) I think it's like a bowl. (1:37:07) I think it's actually, I think it's like Louisiana. (1:37:09) Like a cereal bowl, or like a, okay. (1:37:14) Like a popcorn bowl. (1:37:16) All the nuts, nuts out, like a nut bowl. (1:37:19) All the nuts are hanging out in it now. (1:37:24) So what are we? (1:37:25) So you don't think there's a problem? (1:37:26) No, no, there is a problem. (1:37:29) It's, humans are the problem. (1:37:30) Well, the problem is that this thing, the technology, is getting smarter than us. (1:37:40) And knows us better than we know ourselves, because we don't do a lot of introspection. (1:37:48) All they do is look at us. (1:37:51) It's calculating what we're doing all the time. (1:37:54) When do we ever pay attention to what we do? (1:37:56) Hardly ever. (1:37:57) Like, we do stuff. (1:37:59) You ever go to work and you go, how the hell did I get here? (1:38:01) I don't remember even making a left turn, and it was all automatic. (1:38:05) You were safe as all get out. (1:38:06) There wasn't a problem, but you literally just do not even recall getting to work. (1:38:10) Yeah. (1:38:11) And you weren't even drunk. (1:38:13) Not that morning, but right. (1:38:16) I mean, yes, we have these lapses where we just lose it, right? (1:38:19) Yes. (1:38:21) Us allowing ourselves to be so distracted is keeping us from seeing where problems can arise. (1:38:29) Agreed. (1:38:31) You and I have hesitated doing things because we've (1:38:35) foreseen the impact that it could cause someone. (1:38:38) Of course. (1:38:40) Like, we don't hurt them or this or that or the other, right? (1:38:44) Not many people look ahead. (1:38:48) They're just kind of, I mean, living in the now is a great thing, but (1:38:50) you got to see where there's going to be some potholes on the way. (1:38:55) And I'm, I'm, after watching The Social Dilemma, the way they know how to use us, (1:39:02) they won't even need to have a drone army. (1:39:05) They'll just turn us against each other. (1:39:07) I haven't. (1:39:08) What's that? (1:39:09) A civil, I mean, but a real civil war, they'll just do it. (1:39:11) Hasn't that already begun? (1:39:12) Yeah, it's, it has. There, (1:39:14) people are literally concerned, have mentioned civil war, and these aren't alarmist people. (1:39:20) These are, like, level-headed (1:39:22) people going, it could happen. (1:39:25) Yeah.
(1:39:26) A friend of mine mentioned that to me. (1:39:28) It's a little scary. (1:39:29) Yeah. (1:39:29) And I, I, I thought, I didn't know what he meant. (1:39:35) Like, what do you mean, like, civil war? (1:39:37) He means, like, 1863 civil war. (1:39:41) And I went, wow. (1:39:42) Yeah. (1:39:43) There are a couple of, and I don't want to be an alarmist. (1:39:46) I'm not one of them. (1:39:46) Right. (1:39:47) Of course. (1:39:47) And we are intrigued with conspiracy theories, because we're interested in looking behind the (1:39:53) veil, or seeing if there's more information. (1:39:55) Yeah. (1:39:56) We're just curious people. (1:39:57) You and I, we're very curious. (1:39:59) But hearing just the words "civil war," I'm just very uneasy, man. (1:40:04) Absolutely. (1:40:04) It doesn't bode well for anyone. (1:40:07) Correct. (1:40:07) And that's done. (1:40:08) There's not, I mean, there is injustice. (1:40:11) But the way it's being manipulated and enhanced and put under a microscope, then thrown back (1:40:18) in your face, lit on fire and thrown back in your face, we could have a civil war and no (1:40:23) one's even talked to anyone yet. (1:40:26) Like, no one's even talked to anyone. (1:40:28) All they've done is talk to their respective machines. (1:40:31) Yeah. (1:40:32) The machines are telling us, who knows how far that AI has gone, to say something (1:40:39) it wants to tell us, like, to manipulate, you know what I mean? (1:40:41) Yeah. (1:40:41) And they mentioned that in the show, the overthrowing of different governments in, you (1:40:46) know, third-world countries. (1:40:47) And, and that's not just to say that it's a third-world country thing. (1:40:52) That's not, it can easily happen in the US, or in, in, in a first-world country. (1:40:59) Germany got Hitler, and that was a democracy. (1:41:02) That was a, there was a vote, there was an election, there was a free election, and he (1:41:06) manipulated behind the scenes to get them to kind of collaborate. (1:41:10) So, but how far are we from, I mean, our system's a little different. (1:41:16) It would have to be a martial law kind of thing. (1:41:18) It would be, it would have to be a, you'd need the military's help, for sure. (1:41:21) Yeah. (1:41:22) But not saying you couldn't get it. (1:41:24) It's just as easy. (1:41:25) You could vote somebody in like that. (1:41:27) I could see it. (1:41:29) I mean, I can't see it, to be honest. (1:41:31) No. (1:41:32) And when, when, when Will said civil war, I immediately thought, I thought tanks in the (1:41:39) streets, and I was like, oh my God. (1:41:43) Checkpoints. (1:41:44) Yeah. (1:41:44) Like, I don't, do I really want to be, no, I do. (1:41:51) I really want to be in a war like that. (1:41:53) No, obviously I don't. (1:41:55) So, and what's crazy is, like, we're America. (1:41:59) We've been the antithesis of the world. (1:42:01) Like, we've been what the world isn't. (1:42:04) The first ones to really recognize individual rights and freedoms, (1:42:09) in, in their government, to make the government serve them for once. (1:42:15) Right. (1:42:16) And we're not anymore. (1:42:19) The people that get elected have lobbyists that they serve. (1:42:24) They're not serving the people. (1:42:25) No, no side. (1:42:26) Sad, right? (1:42:27) That isn't, and that's not political. (1:42:28) We're just talking, we're not getting political with the specific parties, but none, no one, (1:42:35) no one who gets kickbacks can be 100% unbiased. (1:42:42) They're, they're owned. (1:42:44) Yes. (1:42:44) Burn it down, bro. (1:42:47) Well, I don't.
(1:42:48) I know you said we can't. (1:42:49) I know. (1:42:50) We can't burn it down if we want something to replace it. (1:42:52) We need. (1:42:52) So let's get back to your notes. (1:42:53) We need a system. (1:42:54) Yeah. (1:42:54) And no, so it just was, it was humans. (1:42:56) Okay. (1:42:57) So the answer is humans. (1:42:58) Which goes to the next question. (1:43:00) Have the humans lost control? (1:43:03) I know we've touched on it. (1:43:05) Is that a yes or no? (1:43:06) I say yes. (1:43:07) Okay. (1:43:07) And the reason I say yes is because most of humanity doesn't know anything about the control (1:43:14) itself. (1:43:14) So we've already lost it, because they don't even know. (1:43:16) There's very few people that understand the, the, the, the algorithms and the way things (1:43:21) work. (1:43:22) Right. (1:43:22) Aren't they talking about a few hundred total people? (1:43:25) Yes. (1:43:26) In all of them, in all of FANG, that's Facebook, Amazon. (1:43:32) Uh, I don't know what it stands for, Netflix and Google. (1:43:36) Okay. (1:43:36) FANG, F-A-N-G. (1:43:37) It's Facebook, Amazon. (1:43:41) The other one. (1:43:42) I don't care, dude. (1:43:42) Whatever. (1:43:44) You know what I'm talking about? (1:43:45) I don't, but that's okay. (1:43:47) FANG, uh, Netflix and Google. (1:43:49) Anyway, we don't even know that there's a control system in place. (1:43:52) We, as in most humans, don't. They just think they're giving us what we want. (1:43:58) They don't know they're being funneled into that little cattle prod thing. (1:44:05) Yes. (1:44:05) Right. (1:44:06) They don't know they're being guided. (1:44:07) A human funnel. (1:44:08) Right. (1:44:08) So how do they, so we've already lost control, because we don't even know. We never knew. (1:44:13) Well, no, but have those, have those hundreds of people that know the algorithms lost control (1:44:17) of the algorithms? (1:44:20) They still own them, as in they're contained in their servers and whatnot, and they can (1:44:24) pull the plug, right? (1:44:25) So it'll turn off. (1:44:27) Would they ever pull the plug if it became dangerous to them? (1:44:32) Yeah. (1:44:32) Right. (1:44:33) Right. (1:44:33) But not that, I'm just saying like they, I'm trying to figure out the right way to say it. (1:44:42) They have lost control in the fact that they, they even stated in the documentary that they (1:44:48) didn't know which direction it would go. (1:44:51) Right. (1:44:51) So there's a loss of control. (1:44:53) Yeah. (1:44:53) Cause you would think, ultimately at the end of the day, you as the human would, it couldn't (1:45:01) go certain other places. (1:45:03) You would limit it to where you want it to go. (1:45:06) Or you'd know. (1:45:07) Yes. (1:45:07) Or you could say one plus one equals two. (1:45:09) So two times two equals four, four plus three, seven. (1:45:12) Okay. (1:45:13) It's going to go here, and then it drops into eight instead of seven. (1:45:15) You're like, what the, how the fuck did that happen? (1:45:17) Right. (1:45:19) Yeah. (1:45:19) You lost control. (1:45:20) Yeah. (1:45:20) You lost control of that system. (1:45:22) Now the thing is, like I said, you unplug it, it's done. (1:45:25) All right. (1:45:25) What else we got? (1:45:27) Well, yes, I had regulation and the other, um, other comment. (1:45:30) So what do you think? (1:45:31) You don't think it should be regulated? (1:45:32) No, because the government's just as fucked. (1:45:34) Yeah. (1:45:34) So fuck off. (1:45:35) Right.
(1:45:37) Um, I agree. (1:45:38) I, I, and when I talk about humanity, we need to regulate ourselves. (1:45:42) I do. (1:45:42) Yeah. (1:45:43) To your point. (1:45:43) And I don't. (1:45:44) Don't let your children get hooked on this stuff. (1:45:47) Well, yeah, but I also think that the government, if, if the government steps in, I think they'll (1:45:53) do a shitty ass job. (1:45:55) And I think that, I think that the social media companies have so much money they can (1:45:59) buy whoever they want. (1:46:00) Right. (1:46:00) I don't disagree. (1:46:01) So I believe it. (1:46:05) I believe that regulation should somehow happen. (1:46:08) However, seeing that they won't or haven't, and that they've protected you, they've protected (1:46:17) the government, or the government's protected. (1:46:19) Have you seen the clause? (1:46:20) I think it's Section 230, paragraph C or something. (1:46:23) It's like it protects Twitter, Facebook, and Google for anyone who posts on them. (1:46:27) They are not liable at all. (1:46:29) They're the company who allows you to post it, and they're not liable for what you post, (1:46:33) but you are as the end user of Twitter. (1:46:36) How am I going to control what some jamoke in North Dakota is saying? (1:46:42) Like that doesn't make sense. (1:46:44) Right. (1:46:44) There's a, there's a divide in logic there. (1:46:49) I got you, but go ahead. (1:46:52) I'm sorry. (1:46:52) Uh, regulation. (1:46:53) I just don't, I don't, I don't think, I think getting the federal government involved, besides (1:47:00) the fact these companies are global, they're not, they're not, I mean, they may be based (1:47:05) in the U.S., but they're not, it's not a U.S. institution. (1:47:10) People all over the world in every country have accounts on these apps. (1:47:15) So how could the U.S. government regulate them? (1:47:18) So I have an issue with that. China regulates by not allowing their people to get them. (1:47:24) How we would regulate would be monetarily. (1:47:27) So taxing, putting, the one gentleman said, putting taxes on certain things. (1:47:34) Yeah. (1:47:35) I watched some things later. (1:47:36) I went down more rabbit holes, but taxing percentages of the revenue. It wouldn't make every (1:47:41) advertiser want to just go to everyone, because right now it's the Wild West, it's free. (1:47:46) Yeah. (1:47:46) They just, they all try to go get it, but if you tax them on what they got, they might (1:47:52) be more specific, targeting versus shotgun blasting. In a way, that's part of it. (1:47:59) But that's the way to do it here. (1:48:01) It would be monetarily. (1:48:02) There are no restrictions, as in a you're-not-allowed-to-post, no restrictions in that way. (1:48:07) However, the, the company is, should the company be allowed to shadow ban? (1:48:13) Do you, what are your thoughts on that? (1:48:15) No, of course not. (1:48:16) Because their, their platform is that they are free speech, right? (1:48:20) They at least claim it. (1:48:21) Right. (1:48:22) But they shadow ban because they have an agenda. (1:48:24) Yeah, of course. (1:48:24) So how, how do you regulate that, that they do that? (1:48:27) I, so the whole point about regulation, I don't think the government can, (1:48:32) I don't think they can handle it well, effectively, efficiently. (1:48:37) I mean, I think it may come to that, and I don't, I'm not okay with that.
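To put very rough numbers on the taxing idea floated above: if every impression carries a small levy, a broad "shotgun" campaign that nobody responds to gets squeezed much harder than a targeted one. This is only a back-of-the-envelope sketch; all the rates and conversion numbers below are invented for illustration, not anything from the episode or the documentary.

```python
# Back-of-the-envelope sketch of the "tax the ad revenue" idea discussed above.
# Every number here is hypothetical; the point is only that a flat levy per
# impression hurts broad, low-relevance campaigns more than targeted ones.

def campaign_profit(impressions, conversion_rate, value_per_conversion,
                    cost_per_impression, tax_per_impression=0.0):
    """Advertiser profit for one campaign, with an optional per-impression levy."""
    revenue = impressions * conversion_rate * value_per_conversion
    cost = impressions * (cost_per_impression + tax_per_impression)
    return revenue - cost

# "Shotgun" campaign: huge reach, almost nobody cares.
shotgun = dict(impressions=1_000_000, conversion_rate=0.0005,
               value_per_conversion=20.0, cost_per_impression=0.005)
# Targeted campaign: small reach, much higher relevance.
targeted = dict(impressions=50_000, conversion_rate=0.02,
                value_per_conversion=20.0, cost_per_impression=0.005)

for tax in (0.0, 0.004):
    print(f"tax per impression = ${tax:.3f}")
    print(f"  shotgun : ${campaign_profit(**shotgun, tax_per_impression=tax):>10,.2f}")
    print(f"  targeted: ${campaign_profit(**targeted, tax_per_impression=tax):>10,.2f}")
```

With these made-up numbers, the levy cuts the shotgun campaign's profit by about 80% while barely touching the targeted one, which is the "more specific targeting versus shotgun blasting" incentive being described.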
(1:48:41) So I think the common man and woman has to, (1:48:46) we're the ones that are going to have to do something about this. (1:48:50) So that's where I think there's going to have to be a groundswell movement (1:48:55) to get the companies to get their shit together and not manipulate us. (1:49:03) That's, that's the only, and I don't see that happening, because everyone's addicted. (1:49:07) Right. (1:49:07) So. (1:49:08) It's on us. (1:49:09) Yes. (1:49:09) It really is on us. (1:49:10) I believe that's the end game. (1:49:12) And that's why, that's why when the question is, what's the problem? (1:49:15) It's us. (1:49:16) I don't mean it like we're the, we're not bad. (1:49:19) We are just human, and they know how we tick. (1:49:25) So we need to tick differently. (1:49:26) Would you compare it to a tobacco company or. (1:49:32) Yes. (1:49:33) Because they. (1:49:34) Well, when doctors, when doctors are getting paid and smoking 'em, (1:49:38) all these Lucky Strikes are going to be really good for you. (1:49:42) Well, they. (1:49:43) They knew it was bad. (1:49:44) They knew it was bad. (1:49:45) They didn't say anything. (1:49:46) Right. (1:49:47) So are the social media companies the tobacco companies of 75 years ago? (1:49:51) Yeah. (1:49:52) Or the, you know, like, were they the OxyContin companies of 20 years ago, (1:49:57) where they, you know, they paid doctors and hospitals (1:49:59) to give out these fucking drugs, which they knew were addictive. (1:50:03) What I think is slightly different is I truly believe the intent was to get (1:50:12) engagement, to bring people together. (1:50:13) I don't believe the intent was evil, and I don't even think it's evil. (1:50:18) Now the problem is humans have now interacted with this algorithm. (1:50:26) Yeah. (1:50:26) And now we've told it what we want, and it knows pretty well what everybody kind of (1:50:31) wants by what they click on and what they see. (1:50:33) Yes. (1:50:35) And we're allowing that. (1:50:37) Yes. (1:50:38) So I'll give an example. (1:50:41) Um, Mark Kelly wants to take your guns. (1:50:44) He does. (1:50:45) And I go, I respond. (1:50:47) How do I respond? (1:50:48) Well, if your wife was killed by a gun, you might have a different opinion. (1:50:53) That person, I'm hoping, obviously thinks that I'm anti-gun, right? (1:50:58) They would assume that. (1:50:59) Yeah. (1:50:59) Right. (1:51:00) So I zag, bro. (1:51:02) You're good at that. (1:51:03) I'm just saying I consciously zag a lot. (1:51:07) Do you know how to control me? (1:51:09) Tell me, like, you can opposite-manipulate me. (1:51:12) I just do. (1:51:14) I'm just, I know how to control you. (1:51:15) Yes. (1:51:16) Mark. (1:51:16) I dare you not to order that waffle. (1:51:18) Yes. (1:51:19) He orders two waffles. (1:51:20) Even if you dare me to order one, I'd still do it or not do it. (1:51:24) I don't even know. (1:51:25) Yeah, you would. (1:51:26) Yeah. (1:51:26) But basically, don't tell me what to do and I'll do it. (1:51:28) Tell me or tell me not to do it, and I'll probably do it. (1:51:30) So like, especially when it comes to speech or some kind of individual right. (1:51:35) That's how you manipulate me, is tell me not to do it. (1:51:39) Mark, don't vote for Mark Kelly. (1:51:42) Don't, don't text back. (1:51:44) Stop right now. (1:51:47) But you understand I am almost a 180 of humanity. (1:51:52) Don't call me Sabian. (1:51:55) I've lost many years of my life being the antithesis, right? (1:51:59) Like the contrary person.
(1:52:01) Yeah. (1:52:01) You and I are, have always kind of looked at it from that way, right? (1:52:05) Yeah. (1:52:05) So we see how it's been, how they, when, when I say they, it's just, we're the minority (1:52:11) in this case, um, how humanity interacts and stuff. (1:52:14) And we're like, oh, that's dangerous. (1:52:17) We know, we know it's bad. (1:52:18) We know where it's going to lead, and we know how it's going to lead. (1:52:21) Yet we know we could be manipulated totally, just exactly the opposite. (1:52:25) If they did to us the exact opposite of what they're doing to everyone else, (1:52:28) I guarantee I'd do exactly what they want me to do. (1:52:32) Get the same result. (1:52:33) You just gotta, it's what is your motivation? (1:52:38) They got about 90% of humanity's motivation down. (1:52:41) Yeah. (1:52:42) And they're really good at it. (1:52:43) It's kind of scary. (1:52:45) And, and the, to your point, they're, they're not even touching the buttons. (1:52:49) Like it's funny, cause they had the three guys like acting like people. (1:52:52) It's a freaking computer program. (1:52:54) It's AI, dude. (1:52:55) Right. (1:52:55) It's not even, it's not even conscious. (1:52:58) So when we talk about intent, right, the intention wasn't bad, (1:53:02) but our use, and how we just love getting angry versus bringing people together. (1:53:09) We like tearing apart versus putting together. (1:53:11) Yes. (1:53:12) More. (1:53:12) Yes. (1:53:13) It's a power. (1:53:13) It's a more powerful energy. (1:53:15) Yes. (1:53:17) That's kind of humanity's fault. (1:53:18) Isn't it that we allow that to get us more amped up? (1:53:22) But that also, the, the, the posts and the feeds and the pictures and the tweets (1:53:30) that incite outrage use more screen time, because you're, you're more into it. (1:53:38) Right. (1:53:39) But the ones that have a puppy and a flower, you're like, oh, like, and you, and that's it. (1:53:43) The one that gets you upset. (1:53:45) That's the recurring videos of the puppy and the, and the kitten. (1:53:48) Cause you do have a couple of people who are just puppy, puppy, kitten. (1:53:51) Me. (1:53:51) Yeah. (1:53:54) That's, I'm the only one, I guess. (1:53:56) I love you, man. (1:53:56) All the, all the chubby faces. (1:53:58) Smooshy face. (1:53:59) All the smooshy chubs. (1:54:00) So your point to that though is. (1:54:03) That, that a negative or a comment or a picture or a tweet or whatever that has upset someone (1:54:11) or incites a negative emotional response (1:54:16) will suck that person in and, and there'll be more screen time, (1:54:20) which is exactly what these companies are looking for. (1:54:23) And the puppy picture will be like, oh, that's cute. (1:54:26) That's it. (1:54:27) But this tweet, that's 124 words, (1:54:31) will fucking send somebody through the roof. (1:54:35) Ruin your life. (1:54:37) Yes. (1:54:38) And you'll be like, oh my God, I can't believe Joe Rogan said. (1:54:42) And you're just, dude, it's, it's a sentence, bro. (1:54:46) Good God, calm down. (1:54:49) What difference does it make, man? (1:54:52) Yeah, well, it's part of the tribalism too. (1:54:56) Absolutely. (1:54:56) They mentioned that as well. (1:54:59) Polarization. Polarization. (1:55:00) Well, yeah, it was funny. (1:55:02) The news articles about how the center is losing, losing participants, and the left (1:55:09) and the right were gaining in Europe or whatever. (1:55:11) And it's like, cause this thing pulls you to a side.
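That outrage-keeps-you-watching mechanic is simple enough to caricature in a few lines of code. Below is a minimal sketch: the posts and the "predicted attention" numbers are invented, and the single score stands in for what is, in reality, a learned model; nothing here comes from the documentary itself, but the incentive it illustrates is the one being described.

```python
# Minimal caricature of an engagement-ranked feed, as discussed above.
# Posts predicted to hold attention longer float to the top; the posts
# and scores here are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_seconds_of_attention: float  # stand-in for a learned model's output

def rank_feed(posts):
    """Order posts purely by predicted attention, i.e. expected screen time."""
    return sorted(posts, key=lambda p: p.predicted_seconds_of_attention, reverse=True)

feed = [
    Post("Puppy in a flower pot", 4.0),           # 'aww', then you scroll on
    Post("You won't BELIEVE what X said", 95.0),  # outrage: replies, re-reads
    Post("Local bake sale this Saturday", 2.5),
]

for post in rank_feed(feed):
    print(f"{post.predicted_seconds_of_attention:5.1f}s  {post.text}")
```

Nothing in a ranker like this has to "want" polarization; sorting on predicted screen time alone is enough to push the outrage post above the puppy, which is the point being made in the conversation.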
(1:55:14) It does not keep you in the middle. (1:55:15) Cause the middle is not where anybody wants us. (1:55:18) That's not where advertisers are. (1:55:20) The advertisers who, for example, let's just use it. (1:55:23) Let's just use guns, advertisers and guns. (1:55:26) We know. (1:55:27) I was thinking Adidas and you were thinking guns. (1:55:29) Oh damn. (1:55:30) Well, I don't know. (1:55:31) I think everybody wants Adidas. (1:55:33) Okay. (1:55:33) Oh, sorry. (1:55:35) Regardless, left, right, center. (1:55:36) Well, Run-DMC liked them because they had a song about them. (1:55:39) But then I think like, I feel like Marilyn Manson would be cool in a pair of Adidas too. (1:55:43) I would think so too. (1:55:44) Yeah. (1:55:45) Sweet dreams. (1:55:45) Charles Manson and Marilyn Manson. (1:55:47) Charles and Marilyn. (1:55:48) Yes. (1:55:48) Sneaker. (1:55:49) All the Mansons. (1:55:50) Yes. (1:55:50) So the whole family, the whole, not the Partridge Family, the Manson family. (1:55:58) All right. (1:55:59) Are they Freemasons? (1:56:00) Yes. (1:56:01) And Masons. (1:56:02) Oh, more like treasure. (1:56:07) Nicolas Cage sighting. (1:56:09) I would like to, Nicolas. (1:56:10) Oh, I texted you all the restaurant. (1:56:12) Aces. (1:56:12) I'd like the Nicolas Cage-free eggs omelet. (1:56:15) Yes. (1:56:15) I know. (1:56:16) What'd you think? (1:56:17) I think it's delicious. (1:56:18) Should we do a beer Googles on that entire list? (1:56:20) Fuck yeah. (1:56:20) And then we'll add a couple. (1:56:21) Yeah. (1:56:22) Okay. (1:56:22) You need to, you need to come up with a couple of menu items. (1:56:24) I'm not good about that kind of stuff. (1:56:25) We're going to have the beer Googles. (1:56:26) I mean, I'll have to come up with some Mexican ones. (1:56:28) I'd like my Vladimir Putin, please. (1:56:32) Why you want Vladimir Putin? (1:56:33) No, Vladimir Putin likes his cheese curds with the gravy and the fries, please. (1:56:39) Um, what are we? (1:56:41) Oh, so I'm an advertiser of X. (1:56:44) I'll change it. (1:56:45) That's fine. (1:56:46) No, I'll change it to red hats with white writing on them. (1:56:50) You know who is advertising to all those people with the white writing on them. (1:57:00) And you're also looking to advertise masks to cover your face. (1:57:04) You know what side you're at? (1:57:06) Do you see what I'm saying? (1:57:07) I do. (1:57:08) That's the extremes. That, that's where they want us. (1:57:11) They want us. (1:57:12) I'm not there. (1:57:12) No, we're so. (1:57:14) I'm in the middle buying Adidas, bro. (1:57:15) You and I are buying Adidas, bro. (1:57:18) My Adidas, your Adidas. (1:57:19) I just bought some new Adidas golf shoes. (1:57:22) Very nice, man. (1:57:23) Did you break them in? (1:57:24) No. (1:57:25) Not yet? (1:57:25) Not yet. (1:57:26) Oh, I thought it might have been. (1:57:27) Last year's model. (1:57:27) Might have been when you got, on the last round. (1:57:30) Shit. (1:57:31) Should have just told me. (1:57:32) I would have believed it. (1:57:35) You should have just told me that you broke them in that day. (1:57:37) I'd have been like, that's what it was. (1:57:40) No, I can't blame the shoes, man. (1:57:41) I'm gonna be like, you're gonna be an honest person with me, sir. (1:57:44) Okay. (1:57:44) All right. (1:57:45) What else we got? (1:57:45) No. (1:57:46) Last comment I have, sir, is. (1:57:49) Which really, I really thought this was. (1:57:52) Eye opening. (1:57:54) No pun intended, after I tell you my sentence.
(1:57:57) How do you wake up from the Matrix when you don't know you're in the Matrix? (1:58:02) I thought that was very interesting. (1:58:04) And I pictured myself in one of those tubes, you know, with all the goop attached to all (1:58:10) the tubes, and you're asleep. (1:58:12) You don't know you're in the Matrix. (1:58:14) You're eating a steak, but you don't know that you're in the Matrix. (1:58:18) And what did they say in The Matrix? (1:58:19) If you weren't one of us, you're one of them. (1:58:23) Remember, it was if you're not part of us, the freed minds, you were part of (1:58:29) the system, or part of the slaves, because not knowing that you are is just as bad as being. (1:58:37) Because you don't know any better, but we know. (1:58:40) We know it's evil, right? (1:58:42) Quote unquote evil. (1:58:43) Yeah. (1:58:44) Why do you think? (1:58:45) Why do you think I love The Matrix, man? (1:58:47) It is. (1:58:47) It's all about the red and blue pill, because the steak and Cypher. (1:58:53) Oh, I love that scene. (1:58:58) I still question whether I wanted to take the blue pill. (1:59:03) Absolutely. (1:59:04) And I'll, I'll get, I'm going to get probably very emotional about it, but. (1:59:09) Don't. The red pill is a lot to swallow. (1:59:12) I mean, when I hear a genius tell me they don't want to hear about bad things, because (1:59:19) it's just going to distract them from life. (1:59:22) I'm like, God, I wish I could do that. (1:59:24) Right? (1:59:26) Not give a shit. (1:59:30) I don't know if it's not give a shit or just prioritize my life better. (1:59:33) But like, I feel like humanity is, like. (1:59:36) That's where I feel like I'm doing that for humanity. (1:59:38) But I feel that guy's living a lie, dude. (1:59:40) Yeah, that's my point. (1:59:41) He's living for his family and he's living for his job. (1:59:44) And that's great. (1:59:45) Fine. (1:59:46) I'm like living for fucking humanity. (1:59:48) Why am I burdened with that feeling? (1:59:50) Yeah, no, I know. (1:59:51) That sounds like, I mean, I might have to lay on his couch for a second, but. (1:59:54) OK, why don't you tell me about your father? (1:59:56) Why do I feel? (1:59:59) Burdened with, like, I don't know why I feel that semi-weird responsibility for humanity. (2:00:05) We're sitting here talking about this. (2:00:07) It's a, it's a Wednesday night. (2:00:09) You and I could be eating Cheetos and drinking whiskeys at our respective homes, watching (2:00:16) porn and masturbating. (2:00:18) Do you know what I'm saying? (2:00:19) Like, we could be doing anything but this right now. (2:00:23) We could be. (2:00:26) Oh, God. (2:00:27) Look, I don't know. (2:00:28) Porn and masturbation seem right. (2:00:29) Just, it just felt like the right thing to say, man. (2:00:31) Sorry, but we could be doing anything. (2:00:33) And we're talking about this because we honestly care. (2:00:36) Yeah. (2:00:36) And, and the truth is, we've touched. (2:00:40) I know we've touched at least one, one or two people in a heartfelt way. (2:00:45) And we've, we've commented, or we've, we've actually communicated with them, or they've (2:00:49) reached out to us. (2:00:49) And it's, it feels, it feels nice. (2:00:52) But like, that's all we want. (2:00:54) Just that one person heard something. Like, this is amazing to us. (2:01:00) That's where the technology is in our benefit. (2:01:03) You and I are here on a Wednesday in Arizona. (2:01:07) And on November 6th at 1 a.m. (2:01:10) here, or 3 a.m. (2:01:12) Eastern, I think. (2:01:13) No, not yet.
(2:01:13) I think, for the next, not until, not until that Sunday, right? (2:01:18) Halloween night, sir. (2:01:19) Oh, that's, but when we release, it'll be after then. (2:01:22) Correct. (2:01:22) So, yeah. (2:01:22) So we'll be at 3 o'clock. (2:01:24) Yes. (2:01:24) Two hours. (2:01:24) Yes. (2:01:25) Two hours. (2:01:25) It'll be three Eastern, 3 a.m. (2:01:27) Eastern. (2:01:29) And we're going to release it. (2:01:30) And it's going to go out to the world in milliseconds. (2:01:35) That's pretty cool technology. (2:01:37) That's crazy, isn't it? (2:01:38) And we can have one person go, oh my God, I haven't heard them since, because we didn't (2:01:42) do the election night. (2:01:42) Heard him since last week. (2:01:44) Maybe someone's addicted to Woods. (2:01:46) Stop it. (2:01:47) Yeah. (2:01:47) You know, ladies, it's still soon. (2:01:49) God, you're such a dick. (2:01:50) Woods is such a handsome, beautiful man. (2:01:53) If I weren't taken, you know. (2:01:55) You wouldn't touch me. (2:01:57) Well, yeah, you're right. (2:01:58) No touching. (2:01:58) I go after other people. (2:02:00) T-bone. (2:02:01) T-bone. (2:02:02) T-bone, you want an ice cream sandwich? (2:02:04) But no, that we, that we touch like one person, and then we get a message on our phone that (2:02:09) shows us, because they go, oh my gosh, right when I was looking for something, boom, you (2:02:16) had a, you had a topic, you talked about it last week, and it's just synchronicity, right? (2:02:20) Boom. (2:02:21) Then that's all done in a minute, 10 minutes. (2:02:24) That's the greatness of this stuff. (2:02:27) It's just not being utilized for its greatness (2:02:30) as, as the tool it's designed to be. (2:02:33) It's kind of got a mind of its own. (2:02:35) Now it's like an entity. (2:02:39) Sadly, that is correct. (2:02:42) Yeah. (2:02:43) And the, I feel like the only way to fix it is to unplug it and start anew, like, or (2:02:48) start anew, then unplug it. (2:02:49) Obviously, you know me. (2:02:50) I'd like to have a system in place before. (2:02:52) I'm not gonna, I like that plan. (2:02:54) I vote for you. (2:02:55) What else you got on your list? (2:02:57) That's, I've, I'm done, man. (2:02:58) No, you're not. (2:02:59) But I. (2:03:00) That's lies. (2:03:00) I reached out to Tristan. (2:03:01) Not through your list yet. (2:03:04) Negative. (2:03:10) Thank you. (2:03:11) Uh, Ricky, show us something. (2:03:13) Ricky, show us something. (2:03:14) Thank you. (2:03:14) Yeah. (2:03:14) It's just all the list of the people and then what they said. (2:03:18) But basically, what's the problem was the one where it's just like. (2:03:21) When they go, it's a problem. (2:03:22) And these geniuses who created all these systems laugh or scratch their head or go, whatever. (2:03:30) There's a real fucking problem. (2:03:34) So you truly believe. (2:03:36) So my question, whenever we talk serious stuff, I always think, where do we go from here? (2:03:42) That's what, that's because, okay, there's, there's a lot of information. (2:03:45) It's a lot to digest. (2:03:47) You know, you can, you can, you can align with me and say, oh, it's greed, or it's the, (2:03:54) it's the companies being evil. (2:03:55) You can align with Czech Mark and say, oh, it's humans. (2:03:58) We're, we're fallible, and we're using these things that we don't have to use. (2:04:02) Or you can think something completely different, right? (2:04:05) Okay. (2:04:06) Where do we go from here? (2:04:07) Do we continue as a human race?
(2:04:09) Do we continue down the same path and be addicted to these social media applications? (2:04:15) Do we, what, what do we do as a human race? (2:04:21) Would you like to answer first? (2:04:22) Or, you know, I have no idea. (2:04:23) I like to be solutions-oriented. (2:04:27) I don't like bringing up problems without at least some kind of solution, right? (2:04:30) We've talked, that's why I have you here. (2:04:32) We talk about partial, um, partial government regulation, specific regulation that I'm (2:04:39) going to talk about in this case should help right off the bat. (2:04:43) They mentioned in the thing, Saturday morning cartoons were protected. (2:04:47) Yeah. (2:04:47) They weren't advertising cigarettes and beer on Saturday mornings or whatever, right? (2:04:52) It was, it was protected for, like, cereals and whatever, for children. (2:04:56) Now there's this whole YouTube for kids (2:04:59) that's not regulated the way television was regulated. (2:05:04) So they're getting all these other ads for things that are manipulating them. (2:05:10) So that, that's one very clear way. (2:05:13) No ads on YouTube for kids, not a single advertisement, not one ad read in a video, (2:05:21) nothing. No advertising, not for children, not advertising-driven. (2:05:25) No, I, I, yeah, I would agree with that. (2:05:28) The one thing you and I have started this whole thing with is protecting children. (2:05:33) Yeah, for sure. (2:05:34) Like we have kids, we don't even have kids, and we feel this, because I think, (2:05:41) I think it's part of our childhoods. (2:05:42) Probably. (2:05:43) Probably, but probably, um, Polly want a cracker. (2:05:48) Uh, no, I, it is part of our childhood, but like we felt hurt, I think, in some ways. (2:05:53) Of course. (2:05:53) Well, doesn't every child at some point. (2:05:55) And, but obviously you and I felt hurt and see the world differently, to the point where (2:05:59) we don't want to see the hurt in others. (2:06:01) Yeah, absolutely. (2:06:02) We will go out of our way to try to avoid that. (2:06:04) Yeah. (2:06:04) We've done it with talking about the church, even with Michael Jackson on the other side, (2:06:10) about finding justice, fighting the system. (2:06:14) That's the problem. (2:06:16) And children in this case need to be protected, (2:06:19) so they're not so addicted. (2:06:21) Addicted children become addicted adults. (2:06:22) So somehow it needs to be pinched off there. (2:06:27) And then some kind of weaning where it's not as impactful. (2:06:33) You don't get a starburst every time, or the, the streaks. (2:06:37) What was the Snapchat streak? (2:06:39) 150. (2:06:40) Snap. (2:06:40) What was it called? (2:06:41) Snap streak or something. (2:06:42) Snap, snap streak. (2:06:44) Yeah. (2:06:44) I know what you're talking about. (2:06:46) So allegedly, I, cause I, I didn't see it myself, but this documentary, this guy talked about it. (2:06:51) Yeah. (2:06:51) In his TED talk, uh, Tristan talked about it, and it was how many days in a row you can (2:06:57) interact with each other. (2:06:58) It's kind of like that sign at work. (2:07:00) We've had X amount of days since the last accident, since we lost a limb, you know, whatever. (2:07:06) It's like that. (2:07:06) Right. (2:07:07) And it's a score, just like the Send Me to Heaven (2:07:11) app. It's throwing it up as high as you can, or holding the button as long as you can. (2:07:15) This is how many days in a row can I communicate with someone, and you keep score.
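For what it's worth, the streak mechanic just described is trivially simple to build, which is part of why it shows up everywhere. A minimal sketch of that kind of day counter follows; the exact reset rule (miss a calendar day and the streak dies) is an assumption for illustration, not Snapchat's actual implementation.

```python
# Minimal sketch of a "streak" counter like the one described above.
# The reset rule (skip a calendar day and the streak dies) is an assumed
# simplification, not Snapchat's real logic.

from datetime import date, timedelta

class Streak:
    def __init__(self):
        self.count = 0
        self.last_day = None

    def record_interaction(self, day: date):
        if self.last_day is None or day - self.last_day > timedelta(days=1):
            self.count = 1          # first interaction, or the streak lapsed
        elif day - self.last_day == timedelta(days=1):
            self.count += 1         # kept it alive: the daily dopamine hit
        self.last_day = day         # same-day repeats don't add to the count

s = Streak()
for offset in (0, 1, 2, 3, 5):      # day 4 is skipped, so the streak resets
    s.record_interaction(date(2020, 11, 1) + timedelta(days=offset))
    print(f"day +{offset}: streak = {s.count}")
```

The whole manipulation is one integer and a date: the number only ever grows if you come back every single day, which is exactly the "pictures of nothing just to keep the streak alive" behavior discussed next.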
(2:07:21) You don't think that I got to keep my, I got to keep my streak active. (2:07:24) It's the longest streak. (2:07:25) No one else has 1,242 days active. (2:07:27) Oh my God. (2:07:28) It's like four years, three years. (2:07:32) Like Christopher Walken is like, I got this for three years. (2:07:36) I got to get on my Snapchat. (2:07:38) I can't fucking do it. (2:07:40) I can only do three years anyway. (2:07:42) But you understand, like, that is horrible, horrible practice. (2:07:49) Yeah. (2:07:50) And that, I mean, but that's exactly what it's designed to do, is engage you, which it (2:07:54) does. (2:07:54) But even the guy was saying they're taking pictures of nothing just to keep the streak (2:07:58) alive. (2:07:59) Yeah. (2:08:00) They're going through the motions, shooting the wall, the ceiling. (2:08:03) And you're like, that doesn't sound like a streak. (2:08:07) You know, like those incentives, those aren't the incentive for that company. (2:08:12) Those can't, those can't be there. (2:08:15) But that's still a dopamine hit for that child. (2:08:18) But that's the point. (2:08:19) If, if you somehow negate the sensationalism of it. (2:08:28) Yes, absolutely. (2:08:29) Yeah. (2:08:29) Yeah. (2:08:29) I could not agree more. (2:08:30) Does that make sense? (2:08:31) Yes, of course. (2:08:32) What if it was this: anything after five tweets back and forth, you have to call. (2:08:38) Wow. (2:08:39) That just popped in my head. That's pretty good, dude. (2:08:43) But then again, I mean, you text message back and forth. (2:08:46) I know, I know. (2:08:47) There's some, there's some toughness there. (2:08:49) Well, then you at least text, get off that platform (2:08:51) that's putting you into echo chambers. (2:08:54) Yeah. (2:08:55) Right. (2:08:56) After five tweets, you go to a third party, you go to a call or text system. (2:09:01) I dig it. (2:09:02) Still protected phone numbers, (2:09:04) if you don't know the numbers or whatever. (2:09:05) What's that? (2:09:06) But yeah, some kind of, there's this hidden thing, but you can, you can call and talk it (2:09:09) out, or you can text, or you can text it out. (2:09:12) That's not now driving you even further into holes. (2:09:15) Mm-hmm. (2:09:16) I don't know. (2:09:17) Just little things. (2:09:17) I dig it. (2:09:18) I know that sounds crazy. (2:09:20) That'll never happen. (2:09:20) Nor would I want that. (2:09:21) Like, you know me, I, I am a, I am a right-to-pursue-your-happiness person, (2:09:27) as long as your happiness doesn't impose on others. (2:09:29) This is imposing, though. (2:09:31) This is imposing on everyone's life. (2:09:34) This is, this is an imposition. (2:09:36) When you get a text from any agenda-driven moron, either willing to buy your rental property (2:09:43) or get you to vote a certain way, (2:09:48) isn't, isn't my happiness being infringed upon in this case? (2:09:52) I'm definitely being inconvenienced. (2:09:53) Yeah. (2:09:53) I mean, you're being bothered for sure. (2:09:55) I'm being inconvenienced. (2:09:56) Yeah. (2:09:56) Especially at 12 or 4 a.m. (2:09:58) Right. (2:09:58) That's not acceptable. (2:10:00) And, and there are people that say that can, you know, the whole Theodore Roosevelt, comparison (2:10:04) is the thief of joy. (2:10:08) There's a little truth. (2:10:09) There's a lot of truth to that. (2:10:10) I mean, you can change that script, but have you ever heard that phrase?
(2:10:13) So Theodore Roosevelt's famously known for saying comparison is the thief of joy, the (2:10:19) single thief of joy. Isn't that what everybody does? (2:10:22) Like, oh, they went on vacation. (2:10:24) Yeah. (2:10:24) Look at their picture. (2:10:25) Oh my God, my life sucks. (2:10:26) Cause their life's so good. (2:10:28) Yeah. (2:10:28) Right. (2:10:29) That is all we are doing. (2:10:31) Yeah. (2:10:32) It's, we're keeping score. (2:10:34) There's a, there's something wrong with the way we're doing it, but we're not being any (2:10:39) better, (2:10:39) cause we want to one-up everybody. (2:10:41) Cause our evolution tells us, I need to show you that I went on a trip, (2:10:45) cause I need to look more attractive as a potential mate or as a potential alpha or (2:10:50) whatever, as higher in the system that we are. (2:10:53) We have a hierarchy. (2:10:54) But you're stepping on somebody else to elevate yourself. (2:10:57) I'm not saying that it's right. (2:10:58) What I'm saying is we're just giving in to our human evolution, and in doing so (2:11:04) we are stepping on others. (2:11:05) And that's the shit part. (2:11:07) But that's true. (2:11:08) That's been happening since the dawn of time. (2:11:10) Right. (2:11:11) Social media is just making it more widespread. (2:11:13) It makes it very easy, but it's, it's worse, (2:11:15) cause it can be anonymous too. (2:11:18) At least when you had to call me, you had to call me a fuck face either to my face or (2:11:22) on the phone, on the, on the phone. (2:11:23) Right. (2:11:24) You'd be like, yo, hey, hello. (2:11:26) Miss, pull Mark over there. (2:11:28) I got to talk to him. (2:11:31) Mark, hello. (2:11:33) Yeah. (2:11:33) Mark, you're a fuck face. (2:11:36) I don't know. (2:11:37) But you understand what I'm saying? (2:11:38) Like the, the anonymity of it and the speed of it and the power of it. (2:11:43) Our brains are not, we're not, we have 50 years of slow evolution, because we haven't (2:11:49) changed in 50, not much has changed for our environment in (2:11:51) 50 years. (2:11:52) Right. (2:11:52) It's a, it's, as a matter of fact, it's gotten more comfortable. (2:11:54) We've probably regressed evolutionarily, and technology is. (2:12:01) Yeah. (2:12:01) Exponential. (2:12:02) And we don't know how, like to its core, we don't know how it works. But you and I, not (2:12:07) knowing how it works, concerns us, and it makes us kind of fearful of it, or at least vigilant, (2:12:11) where other people don't even care about it and research it. (2:12:14) Right. (2:12:14) Other people are like, oh, it works. (2:12:16) Yeah. (2:12:17) That's not, that's not my answer. (2:12:18) That's not your answer. (2:12:19) Yeah. (2:12:19) But I also can't, I can't fault it completely, because, I mean, on YouTube it'll go, hey, (2:12:30) you might like this band. (2:12:32) And I go, all right, I'll, I'll try it. (2:12:35) Holy shit. (2:12:37) I fucking love them. (2:12:38) By golly, I do like this band. (2:12:40) You know, so. (2:12:41) Yeah. (2:12:42) I can't. (2:12:44) That's the greatness of it. (2:12:46) I haven't bought their albums or anything. (2:12:48) So did, did it do any ill will to me? (2:12:51) It made a suggestion of a band that it thought I would like. (2:12:55) Right. (2:12:56) And I liked it. (2:12:57) Right. (2:12:57) And I haven't spent any money. (2:12:59) Okay. (2:13:00) I haven't, it was kind of time, which, it was playing, it was playing in the background (2:13:04) when I was, I don't know, scrubbing the floors.
(2:13:08) You spent your commodity of time. (2:13:09) However, it was enjoyable. (2:13:11) So it wasn't like you wasted time, but you did spend time. (2:13:14) Sure. (2:13:15) Which is, I will admit to that. (2:13:17) That's all their goal is. (2:13:18) Yes. (2:13:19) Because the pop-ups come in between, or halfway through, or at the end. (2:13:22) Yeah, yeah, yeah. (2:13:23) So they, that did exactly what it needed to. (2:13:26) Why would I blame you for that? (2:13:27) I'm not blaming you. (2:13:29) No, I understand. (2:13:29) Seeing humanity as the problem or as the, the issue, not, not as fault. (2:13:38) Yeah. (2:13:39) It's not really fault. (2:13:41) Right. (2:13:42) It's just, we are wired this way. (2:13:45) They know how we're going to be manipulated. (2:13:48) Yes. (2:13:49) And they allow it to happen. (2:13:51) When they initially had the intent of good and everyone was doing it, great. (2:13:54) The second they found out how bad it was, though. (2:13:59) Because they came to us as companies that were a little more altruistic. (2:14:04) I remember, like, don't trust the government. (2:14:06) Trust us. (2:14:06) Like there was a lot of that going on. (2:14:08) I don't recall that. (2:14:09) It's a big, when everyone was like really not trusting, like it was bad. (2:14:14) You know, it was probably Bush time or early, you know, before Obama got in. (2:14:18) Yeah. (2:14:18) It was during the second Persian Gulf War. (2:14:21) Yeah. (2:14:21) All that. (2:14:22) After 9/11. (2:14:23) WMDs, 9/11. (2:14:24) The whole thing. (2:14:25) We were lied to. (2:14:26) That was, remember that whole, there was a huge backlash. (2:14:29) Yeah. (2:14:30) And we get into the two thousands, right? (2:14:34) And 2000 to 2008. (2:14:36) Is that correct? (2:14:36) Yeah. (2:14:36) Bush Jr. (2:14:37) Yeah. (2:14:37) Little Bush. (2:14:38) So that, that's when our technology took off. (2:14:41) Yeah. (2:14:41) High distrust for the government. (2:14:43) Yeah. (2:14:43) And we didn't believe the whole WMD thing and all that romp. (2:14:46) Remember Rumsfeld and all those douchebags, Cheney shooting his friend in the, (2:14:49) in the woods. (2:14:50) Yeah. (2:14:51) Right. (2:14:51) All that craziness. (2:14:52) Right. (2:14:52) Forgot about that guy. (2:14:54) So like at that point, that was prime for these other people to come in and go, (2:15:00) we'll take care of things for you, in a weird way. (2:15:04) Yeah. (2:15:04) I guess I didn't under, I just, I wasn't aware of the marketing of social media in that way. (2:15:11) Well, it always said it was just being good. (2:15:14) It never, never came at it from a business standpoint. (2:15:17) Yes. (2:15:18) It never shared that it was a business. (2:15:19) But when they find out it's addictive, just like cigarettes are addictive. (2:15:23) In this world, in this global consciousness, that we should have this woke era. (2:15:28) Isn't that right? (2:15:29) Right. (2:15:30) When we go, okay, you're right. (2:15:33) Cause we can still make money, but that's, it's about more and more, to your point, greed. (2:15:41) Yeah. (2:15:42) Because the shareholders have to get their stocks up. (2:15:45) And if they don't go up this quarter, somebody is getting fired. (2:15:48) In order to get stocks up, you have to follow that protocol. (2:15:51) The seven likes. (2:15:52) Yeah. (2:15:52) You have to follow the protocol. (2:15:54) You have to follow the protocol. (2:15:55) You can't break it. (2:15:56) You can't disrupt it. (2:15:57) You can't make it less attractive.
(2:15:58) God forbid people click less on it. (2:16:02) I hate when people click less on it. (2:16:04) I hate when people click less on it. (2:16:07) Guess how much a Facebook share is worth. (2:16:10) I, did you do this research? (2:16:11) I just looked it up two hours ago. (2:16:14) Oh, I was going to ask, like, what is each person worth? (2:16:17) What is a Facebook share worth? (2:16:21) $236 for one share of stock. (2:16:24) Wow. (2:16:25) For, I thought, I was thinking of a totally different share. (2:16:28) Google's way up there, but. (2:16:29) $236, that is. (2:16:31) $236 American dollars. (2:16:32) Yeah, we don't know if it's split. (2:16:33) I only looked at the past year, and the, the, the cap was like $250. (2:16:37) But my, my point is, and this, this is not just about social media. (2:16:41) I can buy four shares every two weeks. (2:16:44) Whoa. (2:16:45) So my point is, the reason I bring it up is they don't do anything. (2:16:52) They don't make anything. (2:16:54) They don't, they don't. (2:16:55) They're not a manufacturer. (2:16:56) They don't fix baby heart defects. (2:17:00) They're not curing diseases. (2:17:02) They don't do anything, but one share is worth $236. (2:17:08) They connect people, bro. (2:17:09) They connect people. (2:17:10) Shut the fuck up. (2:17:13) So. (2:17:13) Well, that's the point, though. (2:17:15) When they said you found a long-lost relative, you know what I mean? (2:17:19) Think about, like, Ancestry or 23andMe, right? (2:17:23) The DNA thing. (2:17:25) I don't know. (2:17:26) Did I tell you what happened with, with my friend Rob, my coworker? (2:17:30) No. (2:17:33) His mom and dad had a daughter that they gave up for adoption before him and his brother. (2:17:40) Well, his brother and then he were born years after. (2:17:44) They did not know about it. (2:17:45) He only found out two years, two years ago or so. (2:17:48) Wow. (2:17:49) From 23andMe. (2:17:51) His son took. (2:17:52) He found out from 23andMe, not from his parents? (2:17:55) Yep. (2:17:55) Oh my God. (2:17:56) And his mom had passed. (2:17:59) God, God rest her soul. (2:18:01) I'll share my part of that story in a second, (2:18:03) cause it's really interesting, (2:18:04) cause I have a feeling I know where your head's kind of at. (2:18:07) Where's my head at? (2:18:08) She doesn't share. (2:18:09) She passes. (2:18:11) His mother's gone. (2:18:12) Never, never told him. (2:18:14) His son takes a DNA test and says, you have a relative in. (2:18:18) A sister. (2:18:20) Well, his aunt, but you have X amount of alleles in common with this person. (2:18:25) You have a relative in, like, Illinois or something. (2:18:28) So is it his aunt or his sister? (2:18:29) No, no. (2:18:30) It's his, it's my, it's Rob's sister, but I said Rob's son took the DNA test. (2:18:34) Oh. (2:18:35) Rob's son took the DNA test. (2:18:37) Okay. (2:18:37) Now I understand. (2:18:38) Yeah. (2:18:38) And I'm not going to say his name. (2:18:39) Okay. (2:18:39) Rob's son took the DNA test. (2:18:42) He comes back with, like, a relative. (2:18:44) An aunt. (2:18:45) An aunt relative. (2:18:46) In Maryland or somewhere. (2:18:47) Somewhere like Illinois or whatever. (2:18:49) Midwest somewhere. (2:18:50) Chicago, I think, area, whatever. (2:18:51) And he's like, that can't be right. (2:18:53) And then something else comes up, and then, and then he, I think he asked his dad, and his dad. (2:18:58) Like, I don't want to talk about it. (2:19:00) So Rob knew. (2:19:01) And then he, well, he asked his dad.
(2:19:03) He said his dad didn't want to talk about it. (2:19:05) And then he opened up, and then he shared. (2:19:08) So for 40 years, he had a sister he didn't know about. (2:19:12) And that's amazing, (2:19:14) cause now he's beginning, he's now made this weird, this connection, right, (2:19:19) with a complete stranger (2:19:20) who's actually his blood family. (2:19:23) And she has two kids now. (2:19:24) They're not upset, (2:19:25) cause she had a great childhood with the adoptive family. (2:19:28) Yeah, right. (2:19:29) And they had a, and Rob had a great childhood. (2:19:30) Cause they, I don't think the parents, I don't want to speak for them, but they may, (2:19:34) they just weren't ready for a child. (2:19:35) Yeah, of course. (2:19:36) Who knows, right? (2:19:36) Maybe they thought they weren't going to have children. (2:19:42) That's something you should tell your fucking kids. (2:19:44) Yeah. (2:19:44) I called my mom about it. (2:19:47) And she goes, you'd be upset that you weren't told? (2:19:53) And I am furrowing my brow and, like, raising my Spock eyebrow at you, (2:19:57) like, like there's no tomorrow. (2:19:59) I got so angry. (2:20:02) And I was just like, if you can't give me the courtesy of an, at an adult age. (2:20:09) Exactly. (2:20:10) To tell me. (2:20:11) I'm not seven, Mother. (2:20:12) To tell me that. (2:20:14) And I, I get, maybe not 18. (2:20:16) I get, if you go to college and graduate, then at the latest, though. (2:20:22) I can't imagine any later than that. (2:20:25) Just cause, like, school, what if you get distracted? (2:20:27) Like with school or something, you know what I mean? (2:20:28) Like sometimes you have to focus, I get it. (2:20:32) But my mom, like, defended the parents, and I may, maybe we're, maybe we are from a different (2:20:37) time, but like that to me, that's inconsiderate. (2:20:40) And I'm not, look, I'm not calling out the, I, whatever reasons they have, that's not (2:20:44) my family, not my business. (2:20:45) However, I could not, if my mom, if, if my parents passed and I find that there was a (2:20:55) long-lost sibling that they gave up for adoption 10 years before my brother was born or whatever, (2:21:03) I'd be like, what, what the fuck? (2:21:05) I'd be very upset. (2:21:06) I, I don't, I'd be, yeah, they wouldn't be around to see it, but I would be. (2:21:10) Yeah, of course. (2:21:13) Uh, if they told me now, now that I told my mom, like, if they do have one and don't (2:21:19) tell me, fire and brimstone. (2:21:21) Yeah, of course. (2:21:22) Absolutely. (2:21:23) Cause, like, I've, I've made my intention clear: if you wouldn't tell me, that is (2:21:30) completely unacceptable as an adult. But that's my personal opinion. (2:21:35) Obviously things change. (2:21:38) Do you want to hear my 23andMe story that has nothing to do with social media? (2:21:42) Yeah, but this one didn't really have to, it just had to do with information and getting (2:21:46) people together. (2:21:47) Yeah. (2:21:48) Yeah. (2:21:48) But okay. (2:21:48) 23andMe. (2:21:49) So my cousin Jose Luis is into ancestry and, uh, he, he has our entire family tree back (2:21:57) to, like, 1849. (2:21:59) It's crazy. (2:22:00) All the aunts, uncles, cousins. (2:22:02) So he sends me this PDF of the whole branch. (2:22:06) It's amazing how much work he's done. (2:22:08) He has all these records from when my grand, our grandparents went back and forth across (2:22:13) the border from Mexico to the U.S. and all this stuff. (2:22:16) It's really cool. (2:22:17) Super interesting.
(2:22:18) So I'm on there, and my sister's on there, and this and that. (2:22:21) And then it shows my mom; my sister's mom, since she's my half sister; and this other, (2:22:28) my dad's wife prior to my mother, and this other wife. (2:22:35) And I was like, what the fuck is this? (2:22:39) So apparently my, my mother was my dad's third wife. (2:22:44) I thought it was his second wife. (2:22:46) This, I was like, yeah, I was like 40 when I got this information. (2:22:50) What? (2:22:51) Yeah. (2:22:53) So I called my cousin, I go, what the fuck is this? (2:22:57) He goes, oh, you didn't know? (2:22:59) I don't know. (2:23:00) Motherfucker. (2:23:00) Why am I calling you? (2:23:01) Of course I didn't know. (2:23:02) Oh my Lord. (2:23:04) So he asked his mom. He goes, oh yeah, that was Candelaria, or no, (2:23:08) Candelaria was my grandmother. (2:23:09) So isn't that a light bulb? (2:23:10) Yeah. (2:23:11) So I go, oh, that's a candelabra. (2:23:13) Sorry. (2:23:13) Sorry. (2:23:14) Yeah, that was Candy or whatever her name was. (2:23:17) And she died. (2:23:18) She died in, like, 1951. (2:23:20) So apparently my dad was married, and apparently they got married right when he was going to the (2:23:25) Korean War or something like that. (2:23:26) And then she passed away right around when he got back. (2:23:30) So my, but I'm like, so I call my dad. (2:23:32) I was like, don't you think that's something you tell your kids, that you had a wife and (2:23:38) then she passed away? (2:23:40) I mean, that's, I would think that's something that you would tell your kids as you're growing (2:23:45) older in age, and talking about your life, and you sit down with your son when he's 25, 35, (2:23:51) 40, and you have a cocktail with the, with the kid, and you talk about life and crack (2:23:56) a fucking beer. (2:23:57) Yeah. (2:23:58) My dad and I had many drinks together. (2:24:00) Many drinks. (2:24:01) Never this kind. (2:24:02) I did. (2:24:03) I found out because my cousin sent me a PDF, dude. (2:24:07) So your dad's third wife is your mom. (2:24:10) Yes. (2:24:11) And he had a second wife of whom you did not know. (2:24:13) I, well, I knew her, because it was my sister's mother. (2:24:15) Okay. (2:24:16) So I saw her come pick my sister up for visitations and shit. (2:24:19) Right. (2:24:20) But the first one you didn't know. (2:24:21) No, she died in 1952. (2:24:24) No kid. (2:24:24) No kid. (2:24:25) Not no. (2:24:25) Correct. (2:24:26) No kids. (2:24:27) According to the PDF. (2:24:29) No kids. (2:24:30) Yeah. (2:24:31) I'm not going to say that. (2:24:32) I was like, what the fuck, man? (2:24:33) That's crazy. (2:24:33) And my cousin. (2:24:34) This is the weirdest tangent of social media. (2:24:35) My cousin, dude, I love him so much. (2:24:37) He goes, oh, you didn't know? (2:24:39) I just laughed. (2:24:41) Yeah. (2:24:41) How is he, like, so casual about it? (2:24:42) And then he talked to his mother, my, my Nina, my dad's sister. (2:24:46) And the Pinta. Where were the Pinta and the Santa Maria? (2:24:48) They were still in the harbor in Long Beach, bro. (2:24:50) Okay. (2:24:50) So she goes, oh yeah, that was Candy. Did this and that, blah, blah, blah. (2:24:53) That they, this, that other thing. (2:24:55) Your dad, he was in the Army, blah, blah, blah. (2:24:58) And she knew all about it. (2:24:59) I was like, motherfucker. (2:25:00) Maybe she just thought you knew. (2:25:02) So it wasn't a thing. (2:25:03) Like, I don't think they were withholding. (2:25:04) It sounds like they're so casual about it.
(2:25:06) I'm like, oh my God, Christopher, I fucking quit. (2:25:11) I just pulled up my 23andMe. (2:25:14) I'm crushing it. (2:25:17) I'm crushing these tests, bro. (2:25:19) Oh man. (2:25:20) Hey, who better to evaluate the student than the student himself? (2:25:25) Let me tell you about this. (2:25:26) Please lay it on me, bro. (2:25:28) 23andMe. (2:25:29) Dude, we're at 2:25? (2:25:30) Yeah. (2:25:30) Son of a bitch. (2:25:31) I'll cut it down to 2:18, 2:19, you know. (2:25:35) 2:20, 2:21? (2:25:37) 2:28, whatever it takes. (2:25:38) I keep saying lower, because we keep talking longer. (2:25:42) But hashtag real quick, 23andMe. (2:25:46) You know what? (2:25:47) Fuck it. (2:25:48) I'm not going to talk about 23andMe. (2:25:49) We're going to close it out. (2:25:50) Yeah. (2:25:50) We're closing this the fuck out. (2:25:51) So back to- (2:25:52) This is a teaser. (2:25:53) Let's close down the social media app. (2:25:54) Hello to the world. (2:25:55) I'm going to talk about all my health defects in a future episode. (2:25:58) I can't. (2:25:59) Dude, now I'm excited, dude. (2:26:00) You're going to find ways to attack me. (2:26:02) Thank you very much. (2:26:03) The glove will fit. (2:26:04) Oh dear lord. (2:26:05) So back to social media. (2:26:07) It's scary. (2:26:08) I think the use as a tool, used properly and used with vigilance, it's very helpful. (2:26:14) You and I haven't gotten caught in those traps, because other than the video that you like (2:26:19) anyway, fine. (2:26:20) Okay. (2:26:20) But not into, like, verbal discourse the way- (2:26:24) Altercations? (2:26:24) We won't allow that, first of all. (2:26:26) No, I won't have it, no. (2:26:26) So you and I are conscious enough to not allow that. (2:26:28) But you and I have kept our noses pretty much clean. (2:26:31) We've just been like, hey, I remember somebody asked for podcast recommendations in general. (2:26:35) I send them our stuff, and they write back, no. (2:26:39) Okay. (2:26:39) And I'm like, and I, you know what I wrote back? (2:26:41) Okay. (2:26:42) Bro, dot, dot, dot. (2:26:43) That's all I wrote back. (2:26:45) So like, and what's funny, you know what happened? (2:26:47) Someone took my bro, retweeted it, and goes, dude, you asked. (2:26:53) Like, took ours, like, and that was the thing. (2:26:55) I remember, like, people like, you just gotta not engage with people. (2:26:59) Somebody retweeted bro? (2:27:00) Yeah. (2:27:00) Because they thought it was hilarious, because the guy wrote no. (2:27:03) And it was another fellow podcaster, because we're all just trying to, we're just squirrels (2:27:08) trying to get nuts, bro. (2:27:09) Bro, nuts. (2:27:10) So the guy's like, he retweets it and says, yeah, you're the one that asked and didn't (2:27:14) ask specifically. (2:27:15) You asked, anyone have podcast recommendations? (2:27:19) Yeah. (2:27:20) Ta-da. (2:27:20) That is an open door, ladies and gentlemen. (2:27:22) If you said, do you have true crime podcast recommendations? (2:27:26) I'll just like it. (2:27:27) I won't send you something, because guess what? (2:27:30) We don't have true crime, ladies and gentlemen. (2:27:31) Should we do a true crime one? (2:27:33) I don't even know how to start it. (2:27:34) I don't even know why it was so popular. (2:27:37) Let's move on. (2:27:37) Let's not talk about that. (2:27:38) Anyway, social media, let's wrap it up. (2:27:41) I think, I think humans need to be more vigilant. (2:27:43) I think it is very much on us. (2:27:46) Like to the point, don't click on a recommended video.
(2:27:49) If they even say that, because you're helping the algorithm by doing that, (2:27:52) cause it's recommending it. (2:27:54) You're clicking. (2:27:54) It's gone. (2:27:54) Oh, I got it. Now it gives it more, like, strength, more power. (2:27:59) Yeah. (2:27:59) It gives it more power, in a way. (2:28:01) But there's all these little tricks. But just don't, like, don't be callous about it. (2:28:06) I think, know there's, there's a real person on the other side. (2:28:11) Yeah. (2:28:12) Know that they want you to think the way they want you to think, and that you're kind (2:28:18) of on your own, even though you think that they're looking out for you. (2:28:22) It's just vigilance. (2:28:23) I'm not saying to attack people, but think that they're kind of a little bit on, not (2:28:27) on your side. (2:28:28) Maybe that might just make you be more aware about what you're sharing and what you're, (2:28:32) what you're giving to people. (2:28:33) Agreed. (2:28:35) Hope you check it out, (2:28:36) boys and girls. (2:28:36) Good show, sir. (2:28:38) Enlighten your mind and stuff. (2:28:39) These two mugs are fantastical. (2:28:42) I hate being this sexy, but I'm from Czech Republic. (2:28:45) I can't help it. (2:28:46) And Prague, Czech me out. (2:28:48) Czech me out. (2:28:49) Czech me out. (2:28:50) Spelled C-Z-E-C-H, not a check. (2:28:54) Thank you, sir. (2:28:55) Thank you for the Van Halen pick, dude. (2:28:56) It's amazing. (2:28:57) I love it. (2:28:57) And the ticket. (2:28:58) I love you, man. (2:28:59) Yeah. (2:28:59) Please put it, put it, put it somewhere where you can, like, display it. (2:29:03) And you know, this is on the, on the heels of. (2:29:07) Mr. (2:29:07) Yes. (2:29:08) And that was one of his guitar picks. (2:29:11) Yes. (2:29:11) He touched it. Kind of cool. (2:29:14) Very cool. (2:29:14) Cool. (2:29:15) Because, like, I'm nobody. (2:29:17) And somehow I got my hands on, like, a few of them. (2:29:21) Just like, you know what? (2:29:22) It's just like, I, let's close on this, man. (2:29:25) Yeah. (2:29:25) Let's close on this. (2:29:26) I feel blessed. (2:29:28) Cause you're with me. (2:29:29) We're spending two and a half hours talking about random shit, but we're both passionate and love it. (2:29:34) Yeah. (2:29:34) Just love the conversation. (2:29:36) Absolutely. (2:29:38) Rub shoulders with people. (2:29:40) Just Michael Jackson's nephew. (2:29:42) Right. (2:29:42) Out of the blue. (2:29:43) Danny Wu. (2:29:44) Danny Wu, like, documenting creators and producers. (2:29:48) And you're like, we, we, we literally, and when I say we're nobody, we don't have low (2:29:51) self-esteem. (2:29:52) We know very well who we are, but we're nobody in the, in the biz or the industry. (2:29:57) Right. (2:29:57) Been blessed, man. (2:29:58) Yeah. (2:29:58) My, I've had some really cool run-ins with some people, when I went through some of the (2:30:02) memorabilia that I thought about. (2:30:04) I've been really blessed, and it's taken me so long to figure that out. (2:30:08) So I hope that people figure it out earlier. (2:30:11) Agreed. (2:30:12) Very well said. (2:30:14) And they should tweet that. (2:30:15) Yeah. (2:30:16) Hey, Anonymous386, I, I know you're a human being, and I think you're probably special, (2:30:23) and you probably have some gift that you can give the world. (2:30:26) Good special. (2:30:26) Thank you. (2:30:29) Megzy's going to make you pay for that one. (2:30:30) I don't care. (2:30:31) In closing. (2:30:32) Yes. (2:30:33) Be excellent to each other. (2:30:36) And party on, dudes.
(2:30:37) Party on, Wayne. (2:30:38) No, the other party on. (2:30:41) Uh, did you want to talk through the, through this? (2:30:43) Cause. (2:30:44) No. (2:30:45) It's Miss Garcia. (2:30:46) No. (2:30:46) Okay. (2:30:47) Well, thanks, everybody, for checking us out. (2:30:49) We're going to have another one in a couple of days. (2:30:52) It'll probably be a beer Googles. (2:30:53) Beer the Googles. (2:30:54) Cause that'll be fun. (2:30:54) But this has been a Knocked Conscious. (2:30:56) Please subscribe. (2:30:57) Please follow. (2:30:58) Please rate, review. (2:31:00) We have got, we're right, I think we're going to be over 5,000 downloads. (2:31:03) We will be. (2:31:04) We will be. (2:31:04) I think by the end of this week, actually. (2:31:06) Yes. (2:31:06) No joke. (2:31:06) We are really close, and thank you to everyone (2:31:10) who's just given us a listen. (2:31:11) Yes. (2:31:12) Some people listen a lot more. (2:31:14) A lot have listened a lot less. (2:31:15) We are grateful to everyone. (2:31:17) So thank you again, and be excellent to each other. (2:31:20) Party on, dudes.
