A conversation about A.I. (Artificial Intelligence)

A conversation about A.I. (Artificial Intelligence). Chris and Mark discuss Artificial Intelligence and its possible ramifications.
Intro Music: “Blue Scorpion”, Kevin MacLeod (incompetech.com), Licensed under Creative Commons: By Attribution 3.0
http://creativecommons.org/licenses/by/3.0/
Outro Music: “Neolith”, Kevin MacLeod (incompetech.com), Licensed under Creative Commons: By Attribution 3.0
http://creativecommons.org/licenses/by/3.0/

Transcript:

(0:22) Ummmmm…. (0:23) My pen’s broken! (0:25) Oh. (0:26) Whew, I thought you were either tagging or doing something very different. (0:31) Tagging? I was like what the hell is tagging? (0:34) Chr-chr-chr-chr-chr-chr-chr.(0:35) There’s no spring in it, is it spring? It’s gone. (0:40) Is it sprinky? (0:41) There’s no spring in it. (0:42) It’s not sprinky.(0:43) It’s summertime! (0:44) Hey everybody! (0:45) Chaloo! (0:45) Hey, how you doing man? (0:46) Muy bien, y tu? (0:49) How are you doing also? (0:50) Well, I’m good good. I’m good. I’m good my friend gut (0:55) So we are going to talk about some shit today.Yes, sir. Oh and welcome to Knocked Conscious (1:00) Also, is this this is Knocked Conscious, right? This is your gig, bro (1:04) I’m just along for the ride a catch-up on our Google sometimes (1:07) So sure our beers and our Googles, but all the he’s and ohs and geez ease doubles (1:12) He’s ohs and geez, but this is more of a Knocked Conscious one. I think right cuz we’re gonna talk about doom doom doom (1:19) Today’s episode is artificial intelligence Dom Dom Dom (1:25) So AI, yes, sir, what about it my friend question of the day is (1:30) Have you seen Terminator? I? (1:33) Have it’s actually I’ve seen the second one for sure because it’s on my move.It’s on my island (1:37) It’s on my desert island. I had a great trip. How was yours? (1:42) Fantastic aliens motherfucker.That’s why first class all the way. Amen, brother (1:48) so AI (1:50) so (1:51) Good talk good time. Have a great day.So the question is, um, there’s many facets of the question (2:00) Are we when did it start in our current society? Are we already do you consider? (2:08) Alexa and Siri and AI do you consider I had the thought a couple days ago (2:14) When you’re driving down the interstate when you’re driving to your friend’s house and (2:20) You have if do you put the GPS on your phone and it takes 25 minutes to get there and it says hey (2:26) We found a faster route. It’s two minutes quicker. Would you like to use it? Yes, is that AI I (2:33) Don’t know.It’s in my opinion. It’s a type of AI (2:37) It’s a type of intelligence (2:39) It looks at different routes compares them and it comes up with a different conclusion (2:43) It can change so it can kind of adapt somewhat. So it’s using GPS, right and it’s using the information from (2:53) traffic (2:55) Cameras and traffic of some sort, right? So it’s integrating multiple systems, correct? (3:00) To and it’s sending it to your phone while you’re moving in a car at either 50 60 80 miles per hour (3:08) And presenting you with a possible (3:11) I (3:12) Was gonna say solution, but that’s not the right word (3:14) It is a different solution than you’re currently on correct different route.And then if you don’t say yes, you continue down your same (3:21) Route. Yeah, and then it goes. 
Okay, cool (3:23) I know that you don’t want to do that because you passed where I told you to turn if you wanted to go the other (3:27) way (3:28) Yeah, that definitely seems like an intelligence to me of some sort, but there’s levels, right, like we’re talking about (3:35) So at what point does that, which seems to be really fucking handy, yeah, when does that become not so handy (3:44) Yeah, and do you consider Alexa and Siri artificial intelligence (3:49) Yeah, of some sort, in that they respond to your (3:54) requests or questions, or they respond to some stimulus input (3:58) Right, and they take data from places they’ve collected to give it back to you, so they listen, (4:05) process, and then spit back out. That’s some kind of intelligence, I would think (4:09) is there and (4:11) It’s artificial because it’s not biological, right, so I think the artificial part is just not biological (4:19) Agreed. So what was the first (4:22) artificial intelligence? (4:25) In its most rudimentary way, I would guess that automation, (4:29) any kind of automation, was its first (4:32) So there was like a robot that served a drink, that poured a drink for you (4:36) And it was like levers and stuff (4:38) I think it was even in Greek times (4:39) It would pour and fill a glass of water, and then when the weight went down it would turn off the water (4:45) That was kind of intelligent because it was... it was mechanized, (4:50) right, didn’t require your input or your intelligence to do its job or to have a task (4:57) Whereas, like, let’s use a farmer’s plow, for example: you’re still guiding the plow, right? (5:01) Yeah, this thing was completely automated and just did it and (5:05) knew when to stop, right, because of the way the mechanism and the gearing and all that work (5:09) So what are your thoughts on that? What do you think? (5:14) I never actually thought about the question until I asked it (5:17) Yeah, no, for real. I guess I would say, when you said some type of automation, I thought, oh, (5:25) well, then I would say in a factory, so once, when (5:30) obviously Henry Ford started the (5:33) assembly line... but the assembly line was run by humans (5:36) So right, one guy put the tire on, one guy did the rivets, one guy did the transmission, one guy used tools to get the stuff (5:44) But they still had human interaction completely, right, so then after that they’re like, (5:50) oh, and now if you look at a car factory, it’s complete... there’s (5:54) there’s more machines than humans, right? Yes. So I’m assuming sometime after the Henry Ford (6:02) assembly line, machines started doing (6:05) something, so I would assume sometime after that. I had no idea about your Greek water thing (6:11) Yeah, it’s one of those top ten inventions I watched on one of those History Channel shows, right, pretty cool (6:16) It was, like, technically, a waiter (6:18) It’s just a thing that stood in the corner or whatever, you put a glass on it and you push it, you know, (6:24) either flip the switch or whatever, and it was like a mechanized lever and it would pour, and then once it was full (6:29) it would turn off. It was pretty cool.
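A minimal sketch of the rerouting logic from a couple minutes back: compare the current route's live ETA against alternatives and offer the switch once the savings cross a threshold. The route names and the two-minute cutoff here are invented for illustration; real navigation stacks are far more involved:

```python
# Toy version of "we found a faster route, it's two minutes quicker":
# compare live ETAs and only offer a switch past a savings threshold.
def pick_route(etas, current, threshold_min=2.0):
    """etas: {route_name: minutes with live traffic}; current: a key in etas."""
    best = min(etas, key=etas.get)              # fastest route right now
    savings = etas[current] - etas[best]
    if best != current and savings >= threshold_min:
        return f"We found a faster route via {best}: {savings:.0f} min quicker. Use it?"
    return "Continuing on current route."

# Re-run as traffic data refreshes while you drive:
print(pick_route({"I-10": 25.0, "frontage road": 22.9}, current="I-10"))
```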
I mean, it was automated in a weird way (6:35) So, I mean, you had to flip the initial switch, but we have to do that for all of it (6:40) regardless. Even if it’s autonomous you still have to turn it on. So there’s always a human interaction (6:47) Okay, now, it might run forever once it’s turned on, but at the initial there is still a human (6:56) You have to plug it in (6:59) Yes, it needs power, yeah, and you need it to be active, you need it to be on the network, like you need it (7:04) connected. But you do all that (7:07) You have to connect it to your wireless network. Now, they’ve gotten better, if you’ve noticed. It’s like, hey, I found this network (7:14) Would you like to connect to it? Yeah, it almost, like, walks you through it. So it’s easier, right? So the (7:19) guidance is the wizard. Yeah (7:23) So in that way, but I think we were talking about artificial intelligence as, like, what’s the (7:30) ultimate goal, or what will ultimately happen if it becomes sentient, almost, right? (7:38) But we had a list of stuff, correct. So let’s hit this. So what’s the... (7:45) There’s Watson, correct? Yeah, so that was the IBM machine. I think that was the Jeopardy machine, right, Watson (7:51) Okay, and Big Blue was a chess one. Okay, and AlphaGo and then AlphaZero (7:58) Okay, I don’t know anything about AlphaGo and AlphaZero. Okay, those are very important (8:02) Okay, but I’d like you to talk about Watson and what you know. It wasn’t... which one’s the Jeopardy one? (8:07) I think that one was Watson, but I could be wrong. Did that... I don’t really know... did it win? It did (8:12) I think it beat that Ken Jennings guy, if I’m not mistaken. I’d have to check that, but okay (8:17) It won its thing because, well, first of all, I don’t know if you know how Jeopardy works, (8:22) but you have to hit the button (8:23) Yeah (8:23) after the question’s asked. There’s a time, like, it doesn’t work prior to the question being done being asked (8:29) So the question has to finish, right? (8:31) There’s like a strategy to the buttoning, and I think that Watson obviously had an advantage (8:37) Well, yeah, it’s a freaking computer. It knows when the sound of the voice stops to hit the buzzer (8:41) I mean, I would think nothing can beat a computer. I don’t think that, yeah (8:45) I don’t think you can go faster than the speed of light. Well, so the human mind (8:50) sending the impulse to the finger, right, is slower than a computer, right? The good thing we have, though, is we’re anticipatory, right? (8:56) We’re anticipating the button (8:59) It’s not like we’re reacting to a sound, (9:02) right, we’re... (9:03) okay, the question’s done, right, we’re almost... so we have instinct. It’s not faster. You think a human is gonna be able to... (9:11) Well, haven’t we proven that we’re not? No, right, right (9:14) So there’s... I don’t know. The whole Ken Jennings thing... he’s gonna know every single one, right (9:18) He may not buzz in in time, correct, right? They’re almost blocked out. They’re almost boxed out, right out of the... right (9:23) So let’s just say he’s actually... (9:25) 30% of the time he’s gonna get the damn buzzer, right? Then he’s got to know the answer to those, correct (9:32) He’s already lost there. He’s (9:34) definitely... (9:36) First of all, with the human experience, right, now he’s got emotion, now (9:39) he’s got pressure to try to buzz in time, correct, and it might block him out because he might buzz too early (9:44) Yes, he buzzed too early.
You’re actually boxed out of the really until it goes full round (9:48) Yeah, there’s like there’s a strategy to like it’s not just knowing shit. Okay.Jeopardy is a very unique in that (9:54) Yeah, I didn’t there’s a specific buzzer thing. Now. I didn’t watch the IBM Watson one or whatever (9:59) I think we should probably do that.Maybe we’ll have a recap because yeah, I’m really interested (10:02) I should have probably done that before I sat down. That’s all good, man (10:07) So what happened with the chess one chess one? I think that wouldn’t be just beat Kasparov, right? (10:13) That was I think that was big blue. I mean if I’m not mistaken, so it was chess (10:17) But that’s not the one that matters.Okay (10:20) Let’s finish the chess thing. Yes, because I don’t know so that a machine a computer whatever (10:26) Yes for lack of a better description (10:28) We’re gonna either refer to it as a machine or a computer or it’s basically a software program for lack of a better. Yes (10:35) Articulate description.So this machine played the best chess player in the world. Correct. Is that right? And then what happened? I don’t know (10:42) Yeah (10:42) So basically how it works and I don’t know the exact I can speak to AlphaGo a lot more than these other two (10:46) But I can speak to the chess one in this way (10:49) The chess the machine was given all of the games ever played basically just fed God the history of (10:57) humanity playing chess now chess sounds very (11:02) very complex (11:03) It does right? There’s all these different movements (11:05) They can do different angles and whatever but the combination of moves are nothing like alpha and we’ll get we’ll talk about AlphaGo in a second (11:11) But in chess, basically, what’s what they did? (11:13) They fed him all the information and then they had him play the chess master and it beat the chess master (11:18) How quickly I don’t know them.I don’t know the exact things but again, our research is fucking terrible (11:26) But that’s the thing I can speak to AlphaGo. Okay, so he beat the best chess player in the world, correct? (11:33) and I’m curious if (11:35) the (11:36) computer was fed all the (11:41) Chess matches ever played. I wonder if you just if the software architect just fed (11:48) the rules of the game (11:50) Hey, you’re such an asshole (11:53) Why because we’re gonna talk about AlphaGo in a second.Oh, is that what I didn’t I know that’s why you’re an asshole (11:59) I’m just kidding. He’s really smart. I didn’t please finish the I don’t even know what bro (12:05) Wait till you tell wait till I explain but go ahead and finish that question real quick (12:08) I had to call you an asshole bro, cuz I love you and you’re so so real quick.I just think that (12:14) Couldn’t even do it (12:18) Very (12:19) srori (12:21) Srori, okay. We need to so I’m just curious if the if the software architect (12:27) gave the computer just the rules of (12:31) Chess the knight can go like this. Yeah, the bishop goes this way the Queen can do anything but (12:36) Jump like a knight.Yeah, the King can only go one square at a time. Check is this checkmate is that pawn does this? 
(12:45) Would the computer have still beat the chess master? Because it seems to me that (12:51) having all of that knowledge ahead of time is (12:56) a massive advantage against the chess master (13:00) The chess master’s played it his whole or her whole career (13:04) But so they’ve probably experienced almost all of those moves (13:08) Almost, but not everything, right? And you forget shit, correct? (13:14) But with that human comes adaptability, you know, you can change strategy mid-game. I’m not saying it’s right or not (13:20) I’m saying you generally go in with a plan, but you can always adapt the plan (13:23) That’s where real intelligence happens, right? It’s the way to adapt to a situation that’s going on (13:29) Does that not make sense a little bit? I do understand your point (13:33) So, I just don’t think... I (13:37) don’t think it’s fair. I think the fucking chess master was fucked. Yeah. Well, I’m getting to AlphaGo. Yeah, Go, and (13:46) AlphaGo (13:47) So you asked a very interesting question, sir. Of course I did (13:51) So the chess machine was fed all the information and whatever, obviously, and the rules, (13:56) but it basically knows all the moves and how to counter, and what it would do, like, what’s the Tchaikovsky, (14:01) you know, maneuver, and all that shit. Like, they’ve got all those different maneuvers in chess, like opening gambits and weird shit and rooking and (14:07) castling and (14:09) whatever. So, AlphaGo. Yes, so you asked the question real simple: (14:13) what happens if you just gave it the rules? (14:16) They did that, my friend. So there was this thing called AlphaGo. Now, Go is a game (14:23) It has more (14:26) combinations, or more ways to play, more (14:30) choices, yeah, than the total number of atoms in our solar system. Shut up. No joke. That’s not a joke (14:37) So you think chess is advanced? (14:39) Nothing like Go. Nothing even close (14:42) Go is, I think, Asian. I think it’s from China, but it’s like 2,500 years old. So what they did, (14:48) they put all the 2,500 years of all the games of Go (14:53) into a computer and (14:55) they had it play the best Go champion, you know, grand poobah wizard, that’s the seven... (15:02) It won four games to one (15:04) The computer did. The computer won four games to one against the human. Of course it did (15:09) But wait, there’s more (15:11) They then made AlphaZero. I (15:15) may have the names wrong, so if I have the names wrong, I apologize, but there’s a second computer, too, which they did not... (15:21) they did not feed any data of any game ever played, just gave it the rules (15:27) Okay, let it play itself for eight hours (15:29) Wait, the computer played itself? Itself, for eight hours. Very much akin to the tic-tac-toe in WarGames (15:38) I’m gonna watch that again. You remember at the end where it found out it couldn’t beat itself? (15:42) Remember, like, the tic-tac-toe player or whatever the fuck. Anyway, it basically played itself for eight... (15:49) whatever the time frame was, I think it was eight hours or so (15:52) Which computer do you think won?
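That rules-only, play-itself setup can be sketched at toy scale. Below, tic-tac-toe (the WarGames game just mentioned) stands in for Go: the program is given nothing but the legal moves and the win condition, and it builds a value table purely by playing itself. This is only the broad spirit of the idea; the real AlphaZero system (deep networks plus tree search) is nothing this simple, and everything here is made up for illustration:

```python
import random
from collections import defaultdict

# The "rules": which square-triples win, and which squares are open.
LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

def open_squares(board):
    return [i for i, cell in enumerate(board) if cell == "."]

# Learned knowledge: average final result (from X's view) of every state seen.
value = defaultdict(float)
visits = defaultdict(int)

def self_play_game(epsilon=0.2):
    board, player, history = ["."] * 9, "X", []
    while True:
        legal = open_squares(board)
        if random.random() < epsilon:            # explore sometimes
            move = random.choice(legal)
        else:                                    # otherwise pick best-known move
            def score(m):
                nxt = tuple(board[:m] + [player] + board[m+1:])
                return value[nxt] if player == "X" else -value[nxt]
            move = max(legal, key=score)
        board[move] = player
        history.append(tuple(board))
        w = winner(board)
        if w or not open_squares(board):
            result = {"X": 1.0, "O": -1.0, None: 0.0}[w]
            for state in history:                # update running averages
                visits[state] += 1
                value[state] += (result - value[state]) / visits[state]
            return
        player = "O" if player == "X" else "X"

for _ in range(50_000):                          # "let it play itself for eight hours"
    self_play_game()
```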
Okay, so wait a minute, okay (15:58) So they had two computers, one that had the rules and one that had all the bullshit, (16:02) one that had the rules of the game only, and one that had everything ever, one that had all the games played (16:08) plugged into it, like every move back and forth, so it could figure out what to do (16:17) I would say the one that just had the fucking rules. It did. Would you like to know the record? (16:22) Sure (16:23) 100 to 0 (16:27) Let me say that again. I’m looking at your face (16:30) 100 (16:31) to 0 (16:33) It won a hundred games to none against the computer that beat the humans four games to one. And (16:39) what’s funny about that, when we talk about what we’re gonna talk about with AI: (16:44) why did you even have to plug in the fucking games that humans played? Without human interaction, the computer (16:49) beat the fuck out of (16:51) 2,500 years of humanity, of actual playing experience, (16:56) and (16:57) all it needed was the rules. Is that because the computer with all the information (17:11) was just too (17:12) complex, and the computer that was (17:17) simple... I don’t want to use the word oversimplified, but that was basically what happened (17:25) I guess, but what I’d say is that (17:28) 2,500 years of humans playing Go fucked it all up. No, it’s not limitless (17:34) It’s limited by the 2,500 years and all the games played. There’s... okay, but this computer, (17:40) just knowing the rules and playing itself, (17:43) there’s no limit. There’s no edge to it (17:46) But it also has to understand how to process the correct information, right? (17:53) That’s fucking scary (17:55) There’s something out there that can beat the thing that beat us (18:01) And the thing that beat us (18:03) was all of our human experience. Have you seen Terminator? (18:07) I have. Or, we can go... right, sir, man (18:09) Dun-dun-dun, Skynet, holy shit, dude. So that’s an interesting thing, isn’t it? I mean... hi, (18:15) I’m a computer that has 2,500 years of the humans playing Go, I beat you four games to one, and (18:20) then a new computer, without any limits, no, just the rules, beat me a hundred to nothing (18:27) That’s (18:30) humiliating. Not humiliating... (18:34) humbling, for sure (18:36) But we created it. That’s even better. It’s crazy because it goes and feeds into other questions, yeah, later (18:42) So what else we got after that? So we covered that. Yes (18:45) Computers right now are doing some pretty fucking crazy things, and we don’t even really... (18:50) I (18:51) don’t think that the human race has (18:55) scratched the surface. I (18:57) don’t... I think that we as a race have... I (19:02) think the processing power... (19:06) we’re not even... (19:07) we’re not even there yet. I think the amount of the speed and size and (19:13) capabilities of (19:17) computers... I (19:19) just think we’re just getting there. Yeah, we’re like, oh, we’re on the (19:24) cutting edge, right, Core i9 processor (19:28) We’re not even... yeah, but Moore’s Law, right, speaks about doubling, (19:34) up to a certain point, right? Yeah, so it’s doubling every two years (19:38) So basically computing power is doubling every two years right now. Yeah, think about how much that exponential growth is, right? (19:46) Yeah, it’s disgusting: one, two, four, eight, sixteen, right? (19:49) So kind of like, give me a penny on day one, two pennies on day two. Yeah, at the end of a month (19:54) it’s like ten million dollars (19:56) Ridiculous number.
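For what it's worth, the penny arithmetic holds up: doubling from a single penny, the running total clears ten million dollars by day 30.

```python
# Penny on day one, two on day two, four on day three... summed over a month.
total_pennies = sum(2 ** day for day in range(30))   # 2**0 + 2**1 + ... + 2**29
print(total_pennies / 100)                           # 10737418.23 -> about $10.7 million
```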
So I don’t it’s that which (20:00) But there’s a limit right now right because as we get the thing smaller and smaller to fit more stuff onto it, right (20:07) They’re the it could quantum jump. That’s why you those nanometer like when they talk about this how small they make their chips (20:14) Yeah, there’s a point where it’s so close that it could jump onto the other (20:19) The other pathway.Yeah, that’s why that’s one of our limitations. So the Moore’s Law is slowing down (20:24) It’s going to stop at some point very soon here (20:27) But then we have quantum computing (20:30) Which uses quantum particles? (20:32) So instead of a computer processing path one path to path three path four and comparing them (20:38) It can look at all four and just find the shortest one at all at the same time (20:43) Yeah, it’s both a yes and a no. It’s a one and a zero at the same time for every fucking chip (20:49) like think about how (20:51) That’s just going what that’s gonna do for our computing ability (20:56) Pass code breaking all that kind of because you’re looking at every password at the same time.Yeah code breakers would (21:02) Yeah, it gives you a yes. No instantaneously, right? You’re like, oh shit. It’s both.It’s every password all (21:08) Like it’s kind of weird it’s quantum the quantum world is really fucking thing (21:13) But we’re working with qubits and all that kind of stuff. So that’s where we’re going (21:18) But I spoke with you a little bit about you Ray Kurzweil and you’d not heard of him, correct? (21:23) He actually wrote a book called the singularity and if you’re not familiar with the singularity the singularity (21:29) Claims it by 2040 or so. That’s when we are going to meet with (21:35) Technology and kind of merge as one.So we’re 20 years away. What does that mean? (21:42) Merge we’re gonna be indistinguishable from technology. Basically, it’s all gonna be integrated like (21:49) That’s that’s what he claims or he thinks from the speed that were you want to give me a paragraph fucking summary of the book (21:56) that’s basically what he says singularity is when we all when technology and (22:00) biology meet as kind of a (22:03) Where it’s the same (22:05) You call you call a phone number and someone answers and you have a conversation with them and have no idea that they’re a computer (22:12) That’s basically so it could be human or it could be a computer (22:14) Correct and you walking down the street and that person that that thing could be a computer (22:19) It could be an Android or it could be a human right and we don’t know about the Android human thing because that’s a really (22:24) That’s a much that’s different (22:25) But it’s much more hard to make a human body like human human that we wouldn’t be able to distinguish (22:30) But I could totally see it with customer service and chat (22:34) Chat rooms or chat like yeah, I mean, yeah (22:37) Yeah, because they’re not there’s not a like an interaction like looking at you.I don’t you know, I mean, yes (22:43) Cuz I know you’re a computer (22:44) I mean, I think it’s hard to make the skin perfect unless you could get that the texture and all right (22:49) And that’s a different type of technology. We’re just talking about being able the Turing test. Are you familiar with the Turing test? 
(22:54) Yeah, I mean, I know that I know the name, but I don’t know what it is (22:58) So the Turing test was something a guy put together, and basically it’s so you would not be able to tell a human from a non-human (23:05) That’s what the Turing test is. There’s a movie called... yeah, Ex Machina (23:10) There’s a movie called Ex Machina where, I know, the guy meets like an android and he’s trying to pass her... (23:16) he’s trying to see if she can pass a Turing test, basically. That sounds like fucking Blade Runner: is she human? (23:21) Yeah, exact... yeah, okay. This is a big thing. Well, it’s very similar to that. It’s (23:27) it’s not like that future, it’s like... it’s hard to explain, but yeah, the movie premise is on an android, (23:32) but you’re trying to see if it’s real or sentient or not, (23:35) basically. So you’re trying to see if the android is (23:38) aware of its own existence and (23:41) able to communicate with you like it’s a human, like there’s no stutter (23:45) Like, you hear those pauses in Siri and Alexa when they talk (23:49) It’s gonna be to the point where it’s gonna be like, hey Chris, what’s going on, man? (23:53) What’s going on? But have you seen Her? Have you seen that movie Her? That’s the one with Joaquin Phoenix and (24:01) Scarlett Johansson (24:09) It’s cute (24:11) She’s got a good voice, it works, but basically it’s like, hey Joe, how you doing? He’s like, awesome, (24:18) Susan, what’s the weather like out there? Oh, man, I’ll tell you, it’s hotter than a fucking... (24:23) whatever, you know, two rats fucking in a little sock. I (24:27) have never heard that, and that’s the one I’m gonna use from now fucking on, now that we live in Arizona (24:31) Oh, two rats fucking in a wool sock. That really is like a thing (24:34) That’s what Rob from Shreveport says. From Shreveport. Please. I am. Oh, he’s in Portland now (24:41) But yeah, Shreveport Lin, Shreveport Lin (24:45) Stumptown. But yeah, so that’s where it’s gonna be, the point (24:48) that’s where it’s like, I can’t tell if I’m talking to a human or not. That’s really where we’re gonna be (24:54) Okay (24:54) Now, in fact, the android stuff, you’ve seen movies like... have you seen the movie with Bruce Willis, Surrogates, where you (25:01) have an artificial body, like you’re at home and you remote into it, kind of, and all your mental stuff (25:08) really kind of goes into it, our consciousness, in a weird way, but it basically becomes our avatar, right? (25:15) Yeah, that’s basically what it is. So (25:16) anyway, so that’s kind of where that’s at. So the singularity is the big thing, and (25:22) that’s around that time, 2040, is where we think, if Moore’s Law continues the way it does (25:27) But there is a limit to Moore’s Law because of the quantum part. So if we don’t solve the quantum question, (25:32) we cut Moore’s Law. Does it state that (25:36) every two years it doubles, correct, but does that also mean it has to shrink? (25:42) The size of the computing power? No, it just means that we can get more stuff onto a chip, for example (25:49) So that it takes into account... why can’t the chip get bigger, larger, like that? Yeah. Yeah, you do, (25:55) but it still has to go through the pathways (25:58) So I think the size of the chip also eliminates distance, right? (26:03) So even though it’s just this much and it’s going the speed of light, (26:06) it’s less than that much, and less than that much.
I (26:09) just... I don’t know how they’re gonna solve the problem. Why don’t you help them? You’d know what you’re talking about (26:15) I don’t know what the fuck I’m doing, man (26:17) You know who knows? (26:20) Check, Dr. Check Mark. Oh, Check Mark emails (26:24) We are working with these, like, silicone chips, yes, we take them out of stripper bras and we make chips out of them (26:32) It’s very good. Very, very delicious (26:37) Then we dip them and eat them like Doritos, yes, afterwards, afterwards, after we test them (26:42) Yes, it’s all the ones that fail, chips in bulk, dipped in nacho cheese. Oh, that sounds good. And cool ranch (26:52) Doritos, I like that. No, but yeah, so that’s where we’re at on (26:56) Moore’s Law, so it doubles every year or every two years (27:00) until that limit (27:01) Once again, I think we’re gonna have to have a follow-up after we listen to this and get our shit together, because we did... (27:08) we did some... I mean, we know... (27:10) No, this movie. No (27:14) Me? No, no, no, we haven’t seen Ex Machina (27:19) Ex what? That’s a scary motherfucking movie, by the way. I keep seeing it in my Netflix thing. I’m watching... free... watch it (27:24) I’m just telling you, you’re not gonna like it. You’re gonna love it (27:27) You’re terrible. I don’t want to show a person, sir. I’m not gonna show (27:31) the fifth best movie... watch that this weekend or this week with the Megs (27:39) Bro, bro (27:41) It’s in the fucking Bible, how do you not know that, the fucking hammer’s in the goddamn... (27:45) I was like, the second... I’m like, Ex Machina has a Bible in it? (27:48) You cross the streams, and I didn’t even know who you were... cross... (27:51) I thought we were talking about shushing after we shushed it. Oh, I was going back to Ex Machina. Oh, I watched that by myself (27:58) Machina... I don’t know why, I think it sounds like a Mexican song (28:01) I feel like Antonio Banderas needs to be in that one. I agree (28:04) But isn’t he in, like, that Ecks vs. Sever? He’s in some other... it sounds like a Desperado part, doesn’t it? (28:13) Machina, see, (28:14) Machina (28:16) Where is someone... I don’t know. Where is she? What did you do with her? (28:20) Where did we put her in that fucking shit? You will never find her, mother... (28:23) Yeah, am I going to shoot you? Is my rocket launcher in my fucking guitar case, fucker? Like, come on, get out of here (28:31) Okay, tangent. Yes. All right. Anything else? I would hit the dive horn, but I can’t reach it (28:44) All right, so we covered that shit: (28:48) Watson, Big Blue, AlphaGo, AlphaZero. Yes, sir (28:51) Moore’s Law and (28:53) the singularity. Yes, sir (28:56) Boing (28:58) Do you think that... (29:00) do you consider HAL (29:03) from Space Odyssey AI? Yeah, and that was like (29:08) 1960, yeah, so (29:11) in 1960, (29:13) before the Vietnam War, before... (29:16) you know, in the middle of the space race, (29:20) or a silicon chip... before all the silicon was in chips, weren’t they just cards back then? Wasn’t it punch cards and shit? (29:27) I was like, minus a lot... (29:30) I was minus a few years, I’m like... but Stanley Kubrick was like, hey, (29:36) we’re gonna have a movie about artificial intelligence. Yeah, so (29:40) it was always a talked-about subject. I mean, like automation and having automatons. Well, it was always an (29:48) underlying (29:50) desire. Even before then there was, um, Lost in Space, right? (29:54) Will Robinson’s robot, right?
(29:56) And instead of the robot... the ship was the robot, really, right, it was more of the brain intelligence (30:01) versus a robot protecting you. That it was programmed to do, it didn’t really have a thought behind it (30:06) But HAL, when he goes... (30:08) when he goes, I’m sorry, Dave, I can’t do that, you’re like, (30:12) holy shit, the computer saw... read his lips (30:16) Yes, and the computer is (30:20) scared for its own existence. That’s... that’s (30:24) simplifying (30:26) That’s my perception of what happened, man. Before you... may I real quick? Fuck yeah, you may. Instead of scared, (30:33) let’s use aware that... (30:38) yes, aware that an action could cause its ending, yes, its ceasing, right? Like, what are you doing? It’s hard (30:44) Let’s be a little more fact-based, because we have to be, yes, a computer can’t be scared, because it’s a hard emotion thing (30:50) Scared is a feeling, and I feel like it’s a biochemical reaction. Yeah, you’re right. You’re right (30:54) You absolutely can’t make that... no, you’re correct. You’re correct. I mean, the computer was aware, (31:00) yes, that they were gonna try to end it. Yes, correct (31:03) It was going to... so in its sense of self-preservation, correct, versus fear, (31:09) just general self-preservation. But, God, back in (31:12) 1960, yeah, these people are having thoughts (31:17) that computers are already self-aware (31:20) Yes, the fear is there, which makes that movie a (31:26) horror movie as much as it makes it anything else. So, how do we... and (31:33) then in 1984 Terminator comes along and (31:36) they talk about Skynet, and, and, and (31:41) how, (31:43) at a certain point in time, the computers (31:47) become aware that the humans are the evil ones, (31:50) which I don’t necessarily disagree with, because we’re not great a lot of the times. We’re gonna get there, don’t worry (31:57) We’ll talk about... I’m saying we’re gonna talk (32:00) So, how do we not... these are movies, right? Yeah (32:07) You’ve heard the term life imitating art. Yes, exactly, and also (32:11) the term hiding in plain sight (32:13) How do we not know these things aren’t really going on in the artificial intelligence world? (32:21) How do we not know this hasn’t been going on for a long time? How do we know Siri or fucking Alexa hasn’t been a thing (32:29) for way, way, way, way longer than six years ago, when people started buying goddamn Echo Dots? (32:37) Technology (32:38) Moore’s Law, I (32:40) mean, it’s part of it, right? (32:41) First of all, we had punch cards, and we had to go to silicon, and we had to get smaller and smaller and smaller (32:45) Remember... do you remember your first computer? Yeah, fucking PCjr. Yeah, 19-whatever-the-fuck. How many, how many (32:53) megabytes was the hard drive? (32:55) If that... 64? I don’t know, dude, right? So, 32? (32:59) I don’t know (33:00) I had a Macintosh, a little black and white, all in one piece, with a little floppy disk, three-and-a-half-inch guy (33:05) We had an external hard drive that was about yay big. It was a brick, and it was like... that’s the sound it made (33:12) 20 megabytes. 20 megabytes (33:17) My... I think this lighter has more (33:21) megabytes (33:22) No, but seriously, like, honestly, this remote control probably has more, (33:26) yeah, some kind of computing power than that had, of course, right?
Yeah, so that’s why, because (33:32) you knew how robotic the answer was, or the delay between processing and responding (33:38) That had to shrink to acceptable levels. Like, I say, hey, (33:43) Alexa, turn on the light (33:46) Ding (33:48) Back then it would have been like, turn... like, the processor (33:52) has to get through the process and then maybe get to it and then turn on. So (33:56) the idea, the concept, has been there. Well, the concept of voice, (34:02) of you saying (34:05) something... well, yeah, we had the fucking Clapper. Well, it’s the same thing. It is a voice (34:10) I mean, it’s really... it’s sound-receptive, and that was in the 80s, right? (34:14) Right, and Alexa’s more like... it can distinguish that you’re saying Alexa. Well, yeah, clapping, or, well, (34:20) Alexa could have been called anything. Could have been called dickface. It could have been fart knocker. What, a fart knocker? (34:28) Oh (34:28) Why wasn’t Alexa called Beavis, you know (34:33) Settle down, Beavis. How can I help you today? It was taken (34:38) Damn it. Damn it. I’m done putting... turning on your fan (34:42) I’ve added paper towels to your summer list. Oh my god. Oh, I (34:48) put toilet paper on your list. I (34:51) won’t settle down. Oh, I need to fucking... yeah (34:57) Paper... as we hear dogs barking in the background. I hope you guys don’t hear it on the recording, because it sucks (35:01) We’ve got a blind redheaded dachshund downstairs (35:06) So I (35:07) guess my point is that in (35:10) 1960, HAL... 1984, 24 years later, (35:14) Skynet, you know (35:15) And the seventies... don’t dismiss all the 70s shit. First of all, Westworld, (35:21) right, and then, I mean, right, which... the reboot is fucking ridiculous (35:25) It’s very good (35:26) And now we’ve got the technology to kind of make it really almost come to life in this really scary fucking way. Scary (35:31) You had Saturn 3 and just weird movies where (35:34) robots took over spaceships. The 70s had a lot of B-movie-type stuff, too (35:39) Yeah, like after... I think it was launched by Space Odyssey, 2001. I think it (35:44) just launched this whole, like, open-minded (35:48) computer technology kind of world. Yes. Yeah, and I think... I mean, Star Trek... (35:53) That just tells you how smart Kubrick was. Well, Star Trek was in the 60s. Yeah, it was after that (35:57) Yeah, right after... 2001 was after, is that correct? No, Star Trek was after 2001 (36:03) Really? Well, 2001 was ’60... Star Trek was like, oh, 1960, or ’68? 6-0, I believe. 6-0, I believe (36:10) Oh, yeah, then it was before. I’m fairly certain. I mean, I can look it up. I’m not... (36:15) Bro, I’ll do it while you talk some more. Yeah, I guess the point is that (36:21) this has been in the forefront of (36:24) the (36:26) entertainment industry for (36:28) 60 years (36:30) Yes, so unless my math is way off... 50 plus 60... it’s 60 years from 1960 (36:35) And if I’m wrong, then I apologize (36:39) You’re only right in a different way (36:42) Yes (36:43) As in, no way, like the Pope, but totally different (36:47) So... (36:50) 1968. It was after Star Trek. Fucking... I thought you said ’68 (36:53) I swear you said ’68 when you started. I’m just... ’68 (36:56) We’re gonna go back to the... I think, when we go back to the tape, (36:59) we’re gonna hear ’68. But regardless, Star Trek was first (37:02) So it really had a computer respond to my shit. Remember, the communicator became a cell phone, and all these other things, right?
(37:08) There was a lot of (37:10) technology talk back then. We just didn’t have the ability to do it, right, to make it happen, correct? (37:19) So I just... the idea that these... (37:25) these thoughts have been in the public eye for so many years, (37:31) and it’s almost like, hey, let’s just put this out there (37:36) so no one gives a fuck when it actually becomes real, right, and that’s... (37:42) that bothers me. Can we go... can we go a little deeper than that? On that same point? (37:47) Yes, go (37:49) The Googs... I’m gonna call it the Googs. I don’t know another way. I’m just gonna call the Googles the G’s (37:55) The Googs is like 70% of all online (37:58) advertising, for example. You know, think about the data collection (38:01) Also, just think about the data collection (38:03) You remember when the Googs and the Yahoos and the other ones were just, hi, (38:10) we’re your friends. We are not the government (38:12) Give us your data and we will make your life so much easier (38:17) We are this holistic don’t-be-evil company... which they removed, by the way, from their mission statement (38:23) So, let’s do that. So they’ve been collecting data. Oh, Waymo... oh, no shit, Waymo is fucking part of Google (38:29) It’s part of Alphabet. Oh, fucking all the autonomy, all the fucking companies that we gave our data to, thinking, like... (38:35) just giving it willingly is now going to really cost us, (38:41) because they (38:44) now run it all (38:47) Them. Them who? They... (38:50) they are Skynet, my friend. But see, this is the problem, I think, with humanity (39:00) I work hard for whatever pennies and dollars that I get, and I’m distracted with life, (39:07) and I got a lot of shit going on. If I got something out there to make my life easier, like I can tell Alexa to (39:11) start the coffee maker or whatever, (39:14) I’m gonna just do it, right, because I don’t care. I don’t think about the ramifications of that information (39:19) But we need to run that further down the line, right? (39:23) Because AI needs to solve problems. It’s not just there to (39:28) do what we tell it, unfortunately (39:31) We’re using it as a tool currently, (39:34) but there are some perils to that, and I don’t know if we’re there yet. The common person (39:39) does not think of that. No. I would... a common person... I (39:44) got an Echo Dot for Christmas a year, a year and a half ago (39:48) Very nice gesture, and you know, it’s like 39 bucks on, yeah, on the Amazon. And to be clear, (39:53) I have three Echoes in my house. I’m not gonna lie. I have them (39:57) So I never would have bought one, because I don’t... (40:01) right, I don’t... like, it’s not a thing. I’m not technology-phobic (40:05) I mean, I have a... you’re highly technical. Well, I have a skill. I have a router, you know, and, uh... (40:11) But your career... that’s my... you’re in a technical field, correct, (40:15) but I never would have bought that, because I just don’t... right. The only thing that I use the Dot for is, (40:24) Alexa, (40:25) play Sirius XM. That’s the only thing. I don’t have a shopping list. I don’t ask for the weather (40:32) I don’t... right, I don’t do anything else. That’s it, is fucking music. I don’t... I don’t even know... I hear... (40:40) oh, yeah, she tells fart jokes. Really? Oh, yeah, allegedly, allegedly (40:45) Hello. I wasn’t gonna do it. I was gonna hold off. We got a spirit.
We gotta use a sparingly (40:50) No reason so much I love it so I didn’t wait till aioj fucking comes up (40:58) I’m gonna stab you now (41:01) So it’s it’s a I (41:03) Wasn’t even aware that you know, oh it turns the fan on and it turns the lights on and I didn’t know that shit (41:09) I was like, wow, that’s (41:11) I didn’t know your house could even be networked. I (41:20) Do you know what the one of the selling features was it’s networked shit was Alexa fucking enable (41:26) I walked in told the fan to turn on turn the told the fan light to turn on told the living room to turn on (41:31) the overhead the cabinet the outside the inside (41:34) the mirror behind the the fucking bathroom the (41:38) Illuminates, you know, like was a fucking demolition man (41:44) illuminate (41:45) What’s his fucking face Wesley Snipes? (41:50) Was it Snipes so that was actually a pretty cool selling point until I got in here went, oh fuck (41:57) row row cuz (41:59) This is what people don’t think about and I’m about I think I can blow your mind.So please (42:04) You say Alexa? (42:07) It can’t react unless it’s listening (42:11) the whole time (42:13) for the word (42:15) Alexa (42:18) Does anyone not understand that let me let me okay. Hello Twitter world (42:24) I’m gonna give this is how it works. So you have to say Alexa and then Alexa says (42:31) Yes (42:32) but if it’s not on to listen the whole time it won’t hear itself it can’t turn on at (42:40) Alexa up.I (42:42) Know no way crazy. I know I know crazy, right? But (42:48) So, what does that mean? It’s real simple. It’s listening all (42:53) the time (42:56) So, do you remember when you and I were talking about the Catholic Church a couple podcasts ago (43:01) We have to go there bro, cuz I knew it was coming (43:04) It’s hard to talk about synchronicity without talking the other side of this shit (43:08) So we can talk psychic all you want but a week before I would say a week before was it (43:15) The week we had recorded something and then we said what are we gonna do next week it was like four or five days (43:20) Six seven days.Okay. Yeah week sounds like a week. That is a week.It’s half a fortnight everyone (43:30) So Chris and I are talking what are we gonna do? (43:32) What are we gonna do? (43:32) We do the podcast on Friday and what shows up in the mail Saturday that book from Michigan that takes three to four days to (43:40) Shift. Yeah, so the question was to you and we weren’t we didn’t bring it up on the last thing (43:45) We only brought up as a synchronous event, right? (43:48) Were was it synchronous? Are we on the right spiritual path with Knocked Conscious? (43:54) talking about the atrocities of really bad fucking people in really (43:59) Protected organizations, is that are we on the right path or was it listening to our fucking conversation last week and (44:06) and someone bought the information from (44:10) Amazon right cuz Alexa so when hey Amazon if you hear the word Catholic or church or negative or whatever (44:17) Let us know so we can ship a free book out. I (44:22) Don’t know that answer.What do you think it is? (44:26) Me (44:28) Yes, I (44:38) don’t I (44:41) Don’t think it’s explicable (44:43) Would you think it’s at least one of those two answers? (44:47) Like can we eliminate? (44:51) Coincidence yes, I (44:53) Have fun. Fuck. Yeah, absolutely.I’m comfortable. 
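The "it's listening all the time" point from a few minutes back is exactly how wake-word devices have to work: audio gets processed continuously, and the wake word is just the gate between hearing and acting. A toy sketch with fake text "audio"; all of it is invented for illustration, and real devices run an on-device keyword model and only send audio onward after the trigger:

```python
def microphone():
    # Stand-in for a live audio stream: an endless series of heard words.
    heard = ["the", "weather", "is", "nice", "alexa", "turn", "on", "the", "light"]
    while True:
        yield from heard

WAKE_WORD = "alexa"
command = []
awake = False

for word in microphone():        # runs forever: always listening
    if awake:
        command.append(word)
        if len(command) == 4:    # toy stand-in for end-of-speech detection
            print("sending to cloud:", " ".join(command))
            break
    elif word == WAKE_WORD:      # everything heard before this is discarded
        awake = True
```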
I’m comfortable eliminating (44:59) coincidence from that (45:01) But then it’s one of those two things, right? We... (45:04) we’re on the right path, bro, (45:06) and it’s just the universe was telling us, because it knew that we were gonna... I don’t know how that worked (45:10) I can’t quantify it in an equation, (45:13) but I could quantify that it heard us talking negatively about the Catholic Church the week before (45:18) Everyone hears me talking... fucking goddamn people in fucking Louisiana hear me talking negatively about the Catholic Church (45:25) But it wasn’t mailed to your house. It was... (45:28) My mother this morning heard me talking at this guy’s house (45:31) No, so anyway, but (45:34) to answer your question, at this point I can’t say... I (45:45) can’t say why you got that. The only thing I can say, it’s creepy as fuck. I (45:52) agree. I (45:55) also can’t say why, or how, that happened, (45:58) but I can say... those are my two thoughts (46:02) We are on the right path, and it all was lined up that way to work out, (46:06) which is way beyond our scope of understanding currently. It’s not that we won’t get there, (46:11) but we’re not there now, right. And then the other thing was, it listened, and, hey, (46:16) Catholic, negative, send book. I mean, (46:22) it would have... that makes sense. We talked about it. I think it was on a Sunday (46:25) We talked about it because we recorded or something, so Monday (46:28) it ships out and it’s there Saturday, or whatever. It ships out, (46:32) you know, it has a process, goes through its little thing. And in case anyone’s questioning what we’re talking about: (46:38) Knocked Conscious, after the church one. It was the one about, I think, courtesy and respect, right? It was a conversation about the Catholic Church, (46:45) yes, called... but the next (46:47) podcast, when you showed the book, was courtesy and respect. Fall... not with Anthony, right, Chris? Courtesy and respect, correct (46:52) So if you happen to go back, please... hello to the world... watch all that shit, or listen to it (46:57) But it was during that podcast that we talked about it, (47:01) the book. So, (47:04) one of those two, and (47:06) both of them... (47:08) one of them I don’t understand, so I’m not there yet, but one scares me hard. Absolutely, (47:13) and that’s kind of the reason why we (47:16) wanted to do this topic, is because (47:21) I’m fearful of AI in the future, and I don’t... (47:26) to me... and this perhaps is my (47:31) pessimistic view on certain things... I don’t see the upsides of AI outweighing the downsides, and (47:39) I (47:40) want to be convinced differently. I would love for someone to say, oh yeah, (47:44) hey, AI has 87 amazing things it’s gonna do for humanity, and the downsides? There aren’t any, we’ve already got all these things... (47:52) No, I’m serious, like, yeah, if they go, oh, no, that can never happen, because we’ve got this, this place is redundant, (47:57) we have all these ways, and there’s no way this can happen, and... (48:04) Well, you know my favorite apostle is doubting Thomas, so, you know, so, okay (48:13) Yeah, well, it goes way bigger than... (48:17) It really does come down to who’s controlling whom, right? I mean, ultimately, (48:22) who’s pushing the button on the nuclear device? (48:26) Who’s pulling the yoke on the plane, right? Who’s turning the steering wheel? Yeah
(48:32) And doing all that, right? When we talk about AI, (48:34) those are the places where we’re going to implement it, right? Automation... self-driving cars seems to be the, oh yeah, (48:41) hot-button fucking topic... trucks, everything. And that’s what’s funny about it: (48:45) the problem is going to be, I mean, just the integration of... (48:50) you’re talking about a logical processing unit (48:54) reacting and interacting, or whatever, in the way a logical unit does, and (48:59) then mixing in an emotional unit next to it (49:02) who had a really fucking bad day, or a drunk one, or just got fucking fired, or is (49:07) drinking, or has been texting, or been eating a fucking Big Mac dripping all over them while they’re putting their makeup on, (49:13) changing diapers (49:15) Saw that one. Yeah, you did... saw a diaper out the window (49:22) People... so much. Is there any way that Check Mark, anyway, can contact (49:29) some people at Slipknot, so that we can use People = Shit as the opening song? I (49:35) would like to ask them, please, but I am hanging out in Chernobyl right now, (49:40) and I’m just very comfortable, warm, warm, sizzly warm, and there’s animals everywhere (49:46) You know why? Cuz humans... they’re... this is what it comes down to, right? Like, dear... (49:53) tangent time (49:55) So, Chernobyl, everybody. Do you remember that thing? You remember when (50:00) Chernobyl went, and then the blast and all the whole thing, and then everybody, all the humans, went... (50:04) and there’s this thriving ecosystem now in Chernobyl, cuz there’s no people. Yeah, cuz fucking people aren’t in the fucking way (50:12) Let’s just be honest, like, that is a fact, that that ecosystem is thriving because of the lack of humanity (50:18) chopping down a tree or killing them or whatever, right? (50:23) That’s my concern about AI right there (50:26) So moving on. Yes, are we ready, Check Mark, to move along? Yeah? Oh, so it’s, Mr. (50:31) Slipknot, please let us use... what is this thing called... People = Shit? (50:38) We’d like to use People = Shit on our podcast, (50:43) please. That guy’s really spiritual, too, the Slipknot guy. I’ve seen some interviews... the lead’s... (50:48) yeah, he’s really, like, really deep. Yeah, yes, and I think we should send him an email and ask him, okay? (50:54) Can you do that? Would you like me to compose that? Cuz I’d love to. Let’s ask Corey. Dearest Corey, dearest Corey, (50:59) please watch our episodes and let us play your music, all that shit (51:07) What the fuck... before the tangent, we were talking about... I don’t know... about flying planes... what... it’s control. It comes down to control (51:13) Isn’t it always about that? It is. It’s about... (51:17) Yeah, okay, it is, but I’m not gonna even say anything (51:22) Control... what about it? Please say. Well, cuz we’ve got like hours to cover. We got hours. 1.3, dude (51:30) How many are there? There’s only like six, right? There’s eight, and there’s another one on my other list (51:35) You’re not even in our... we’re still in hour one. I know this, man (51:40) Welcome to the podcast, ladies and gentlemen. So, my point, I was gonna say, it’s all about control (51:44) Look at... what is it, June 2020? (51:46) Look what’s going on around... if you look at the fear that’s being pushed by every level of government and every media outlet, (51:53) you look at it, it’s all about fear, which... fear is about control. Yeah. No, you can’t go and peacefully assemble, (51:59) there’s a curfew, right?
No, you’ve got to wear a mask (52:03) That’s about fear and control. Do you know the number one (52:06) control? (52:08) Stay in your house. That we’ve done (52:11) The thing on your phone. How about debt? Oh, yeah, that’s... yeah. Well, I’m saying, like, (52:17) absolutely, how we are controlled... (52:19) This is a total tangent. You fucking think so? But how we’re controlled is us owing people (52:25) Yeah, we owe people money for our cars and our insurance and our house (52:30) We don’t even own the places in which we live, right, until it’s paid off, right? (52:34) Technically, we don’t even own that, and car loans didn’t really even start until like the 70s (52:38) Yeah, I mean, credit was a big thing, credit was a big thing that opened up economies. Of course, getting credit, (52:44) yeah, that makes sense. But yeah, Americans didn’t used to buy things (52:49) they couldn’t afford, correct, and they didn’t buy things they didn’t need. That’s a whole... that’s Century of the Self (52:55) We’re gonna talk about that in another podcast (52:57) Well, I’ll bring that one up after, so we can go into that, but you’re right (53:00) But it is about control. But this is the question about AI: (53:04) what’s the next bullet point? Because, I know, I think that’s the final one we’re probably gonna talk about (53:09) What’s next? Because Chris is driving (53:13) He’s driving. He’s... (53:18) Yeah, so (53:20) this is the big one. This is the one that started it all. I (53:24) don’t even know this one, because I’m not even looking at the list. So I’m very scared (53:30) If and when AI becomes self-aware... (53:35) Yeah (53:36) What does that mean to you? (53:38) Let’s explain what self-aware means to you before we talk about it, because I bet you there’s a gray area of what self-aware is. Sure (53:46) To me, it’s when a computer, or (53:50) software program, or whatever you want to call it, recognizes its own consciousness, (54:00) realizes (54:01) it’s alive, (54:03) can contemplate its own existence (54:06) That’s how I see it. I’m (54:10) wondering if I’m okay with that definition of self-aware. Okay, here’s where I’m scared of it (54:17) My question would be, (54:19) self-aware would mean the computer has a different (54:23) option (54:24) than the one given to it, and (54:27) does it, (54:28) contrary to our input. Well, I think those are different (54:34) Yes, becoming self-aware and doing something bad, I think, are two different things. Not bad, I didn’t say bad: (54:40) contrary to what we input, (54:43) meaning I tell it to go right and it says, no, I’m going straight (54:48) Yeah, that is where I think that consciousness and self-awareness comes in, because it’s not only (54:54) saying, no, the straight way is the better path, please choose it. Yeah, it actually (55:00) overrides your input as the human, the ultimate control. Yeah, I don’t see that as self-aware. I see that as... (55:08) that’s... it’s no longer... (55:10) Self-aware is, hey, I’m a sentient being, right. Then, if it chooses a different path, (55:16) that’s complete... (55:20) at that point it’s no longer... (55:23) it’s no longer reacting to the human’s (55:27) input, right? That’s a completely separate... (55:31) I think that (55:32) equals self-awareness. It’s telling you, not only is this the better path, (55:39) but because you’re not choosing it, and I know what’s right, I’m gonna choose it (55:44) Remember, we’re talking about it not responding to your input at that point. Yeah, at that point... (55:49) When does a child not obey a parent? Yeah. Oh, yeah.
I mean, that’s kind of how I would... that’s how I’m kind of looking (55:55) at it. Yeah, I’m not saying I’m explaining it properly. Maybe I’m not. Oh, I know. Yeah, I totally understand, (56:01) but that’s where I think it comes to, right: (56:05) knowing its own awareness is totally acceptable to me. I (56:11) can work within that. I (56:14) can’t give it the freedoms of humanity, because we’re using them as tools, right? (56:19) We’re using AI as tools to help humanity (56:22) But at sentience, (56:24) do we let it go? Is it slavery? I mean, that’s what I really get into down the list as well. Okay, but (56:29) to your point, if you’re in the passenger seat of a (56:37) self-driving car, yeah, and let’s say you’re a tester for the new self-driving car... Red Dot company, (56:43) whatever the fuck, sweet, whatever it’s called, and (56:47) you’re (56:48) programming it, and you say, make a left, and Red Dot car number seven says, no, Check Mark, I’m gonna go straight. Mm-hmm (56:57) And you say, no, Red Car Seven, go make a left. No, Mark, I’m gonna go straight. And it goes straight. Yes (57:06) Turn it all off (57:10) Not burn it down... turn it off (57:14) Your definition of self-aware... I would do this, ready, let me give you that example. Oh, I’m sorry. No, no, I apologize (57:19) No, you said go straight and it goes straight. So, yeah. No, I want to know... that was my thought (57:24) I wanted you to turn it off... that was... I wanted to hear your answer (57:28) So, this is... (57:31) so when you talk about self-aware, though, this is where I’m like, hey, make a right, and the computer goes, (57:37) bro... (57:39) because you can program it to be anything, to be a surfer... Australian... Aussie, it’s brah... I don’t know... brah, bro, um, whatever, mate (57:46) Hello, mate (57:47) It basically says, you’d be a fucking idiot if you make a left, you should totally go straight, and I go, (57:55) car, I’m going left (57:57) You’re just a fucking moron for going left, but it still, mate, lets you go left. That’s self-aware, right? (58:03) But it’s not just obeying you. Yes, (58:07) that I’ll accept, because you could probably turn the voice off, or do that, like... (58:12) you know what I mean? Yeah, it’s talking shit, being jovial, like you programmed it to do (58:18) Yeah, but it’s not gonna go, fuck you, I’m awake enough to defy. Yes (58:23) And I feel like defiance first, because... it is a parent-child relationship. We are making our children (58:29) You know what you’re talking about, right? (58:32) No. Fucking Battlestar Galactica (58:35) You are... yeah, you’re talking about Cylons, dude (58:38) It’s like the best... you’re absolutely talking about Cylons, and it’s dark as fuck (58:42) Ronald D. Moore does not fuck around with that shit, but that’s basically what created the Borg and fucking Battlestar. Oh, really? (58:50) Cuz he came from Star Trek, okay, and Deep Space Nine was a dark fucking (58:55) thing that he was really ahead of... like, he was on Voyager and all that... there was a lot of darkness (58:59) Anyway, not darkness, more like the honest side of this fucking... everything that we’re talking about (59:05) Well, that’s basically what you’re talking about. Yeah, we’re making our children (59:08) Yeah... when the androids, whatever, when the non-humans (59:15) become (59:17) self-aware, yes, and defiant, what happened in... (59:22) in Battlestar Galactica? They bombed the shit out of the humans. What happened in Terminator?
(59:26) They bombed the shit out of the humans. So do you see a repetitive idea here? (59:30) You do, but you see it through the filter of the people creating those shows. (59:34) You also have Data, who is an artificial intelligence, and who we love, (59:39) who is just this awesome thing that never hurt anyone. He was awesome, right? I (59:46) would argue he was pretty much as sentient as a thing could be. (59:49) Absolutely. Other than seeing him and knowing his artificial skin or whatever, (59:53) you wouldn't know on the phone whether he was an android or not. (59:57) Or what if you were him, talking on a thing? (1:00:01) Right. I mean, that was the Gene Roddenberry world, right, the utopian one. Remember, the Federation was beautiful. (1:00:08) You had Klingons, you had enemies, but the enemies weren't woke yet, we're woke, and, right: I'm a janitor, (1:00:14) I'm pushing a broom cuz I want to. Yeah. Okay, right. Yeah, that sounds right. (1:00:19) Like, I'm a captain of a ship, but I'm not getting paid more. There's no need for anything. (1:00:24) There's no money. Yeah, there's no money. It's really weird, because (1:00:29) the Roddenberry (1:00:31) mentality is that humanity strives to be better (1:00:35) regardless of whether things are provided, and providing the base things will allow us to just be better. (1:00:41) The other side of humanity, the part that I more often see, is us draining every system that we're on. We're parasites. (1:00:48) Yeah, we... (1:00:51) Guys, look at him pounding the podium. I'm so sorry, everybody. (1:00:54) We chopped down trees. We fucking burn forests. The fucking rainforest is 80% of our fucking oxygen (1:01:02) in the world, and we are burning it down to fucking make sugar. Really? Yeah, sugar cane, (1:01:08) whatever, just clearing fucking forest to make soy for fucking cows to eat so we can eat the fucking cows. (1:01:20) But do you know what I'm saying? Like, yeah, yeah. (1:01:24) Let's reel back for a second. I got carried away here. Let me give you back the soapbox. (1:01:29) Where were we? I apologize. (1:01:31) If and when AI becomes self-aware. (1:01:33) Yes. Can it become self-aware is the first real question. (1:01:41) Can it? Like, does all the data... our brain, (1:01:47) obviously, is biochemical, but we've proven that we can digitize things, thoughts, and we're going to talk about that with consciousness later. (1:01:56) But if I (1:01:58) were a thing that was just turned on, with the same thought processes that I currently have, (1:02:05) with no knowledge of anything, and then I look at the world and I go, (1:02:11) okay, well, oh, cell phones, cool. I read technology, that's good. Cars, cool, moving around. Oh, pollution. Oh, shit. (1:02:18) Okay. Well, man, they do a lot of good things... they, that's us. Hey, great things. (1:02:24) They also do some bad stuff. (1:02:27) So now I have to calculate whether the good stuff (1:02:31) somehow outweighs the bad stuff. Yes. If my calculations are... (1:02:38) no, I'm sorry, regardless of the answer: if the answer is we are more bad than good, (1:02:45) wow, what does it do at that point? (1:02:48) Does it pull its own plug? I (1:02:51) think it pulls our plug, right? (1:02:54) Yeah, does it pull its own plug, or does it have the ability to pull our plug, right?
(1:03:00) I mean, that's kind of... yeah, of course, right? Like, we're giving AI now... (1:03:05) The true point is, I don't know if we'll ever give that type of control to that type of system, right? (1:03:12) I mean, look, we still have two pilots in every plane, and we have autopilot. (1:03:16) You don't even need a fucking pilot other than break-glass-in-case-of-emergency. (1:03:20) It takes off, it flies, and it lands all by itself. Of course it can. (1:03:26) Now, obviously the one pilot makes sense, God forbid something goes wrong. The second pilot's probably for us going, (1:03:32) okay, well, just in case the first pilot gets sick, (1:03:35) if he has a heart attack, right, the second one. But there's actually technically three pilots in the plane. (1:03:40) Well, yeah, you've got autopilot, and then you've got your two pilots, right? (1:03:44) So (1:03:47) when would we give up the control of, like, a nuclear device? How do you... (1:03:53) when these systems are being built, (1:03:56) how does a human software engineer ensure (1:04:05) that the AI doesn't become self-aware? And (1:04:12) how do you build (1:04:16) those... I don't want to say firewalls. How do you build the redundancies? Yeah. (1:04:24) I had a thought the other day: how do you build a back door so that there is no (1:04:29) back door? And how do you know it won't close it once it's aware, right? (1:04:33) And my other thought the other day was, okay, is (1:04:38) a software engineer building an AI... (1:04:43) building a software program for AI... (1:04:48) for that AI to build other AIs? (1:04:51) Right. Well, that's when it's truly AI, right, when it writes its own code, and then (1:04:58) the human loses control, (1:05:02) completely out of the loop, and that's like, holy... and now it's making its own children, correct. (1:05:07) And they're digital, but they're... yeah, they're not yet, but they're there. (1:05:13) I mean, let's be honest, (1:05:14) it's gonna be plugged into the internet to get the data, right? If you're the AI system, you need data, and you need a (1:05:19) way to get the data out and (1:05:22) receive information. That would be the internet. Once it's plugged into that and then it wakes up, (1:05:28) it's fucking everywhere. Well, and there is, to your point about the good and the bad... (1:05:33) obviously, if (1:05:35) a software program looks on the internet and sees wars and sees atrocities, there's (1:05:42) horrible things, yeah, but there's also great beauty here. (1:05:46) Yeah, I mean, if you look at the Grand Canyon, and you look at nature, and you look at some of the amazing (1:05:53) human beauty that there is here... The advancement that we've made from war, for example, is huge: (1:06:01) synthetic oils, (1:06:03) radar, (1:06:04) vehicles, computer chips, right, out of war. Yeah, that's more knowledge, right? (1:06:10) I know, I know, but I'm saying humans don't make the nature beautiful. (1:06:15) True. If I were AI, I would say nature is beautiful and (1:06:19) humanity's getting in the way. Okay, right? Yeah, good human versus bad human. Okay, right? That's kind of where I'm looking at it.
All right, okay, so (1:06:28) could you please ask that question again about the software engineer, the first question you asked, and I'll answer it in a way (1:06:34) that's gonna suck. Um, (1:06:39) the (1:06:40) backdoor one or the AI-building-AI one? The backdoor one. So how does a software engineer... (1:06:45) how does the software engineer build an AI (1:06:48) platform? (1:06:49) Or how do they think about it? Yeah, do they think about that? How does (1:06:53) the software engineer go about building an AI (1:06:57) platform so that the AI has limitations? (1:07:01) They don't. (1:07:03) Would you expound on that, please? Because I don't know, it sounds like a simple answer. (1:07:10) Generally speaking... I have to use the term generalities, because it happens more often than not. Okay, more often than not, (1:07:16) science does not care about whether they should make something or not. (1:07:20) All they care about is if they can make it or not. There's an ethics behind science, right? (1:07:25) There's a scientific ethics. Fine, we're not talking about science, we're talking about technology. Okay. Well, let's talk about technology: (1:07:32) the atom bomb. (1:07:34) Do you think Oppenheimer, before he said, you know, 'Now I am become Death, the destroyer of worlds' or whatever, where he just looks... (1:07:42) do you think he cared about the devastation? All he cared about was making it work, the atom bomb. (1:07:48) He just wanted to split the atom, right? (1:07:49) He wanted whatever they were assigned to make, he wanted it to work. (1:07:54) Yeah, he didn't understand the repercussions, or didn't care. (1:07:58) Right, he does not care about how it's being used, he cares if he can make it. Yeah, (1:08:06) or if it can be used, right? (1:08:09) It's that product. That's the thing about AI that is way scarier: (1:08:14) its sentience becomes equal to ours or greater, because it has access to all the data all the time. (1:08:21) We don't even have that access. (1:08:22) We only have access to what we want to think about at the time. (1:08:24) Imagine having all the access, all the time. Like, our brains are alive at night, (1:08:30) we have our time sleeping because our brain needs to. Imagine instead every bit of data, all the time, (1:08:36) a constant stream. (1:08:39) It's gonna know more than us, of course. I mean, the whole is greater than the sum of its parts, in (1:08:47) that case. (1:08:48) So at what point does us saying, oh, I just created the smartest AI thing, it's sentient... I (1:08:55) don't think they care. I wouldn't think the inventor or the creator of it cares. (1:09:00) I would think that the funder of it cares what they use it for. So the funder could be the government, and then... (1:09:07) Right, or it could just be a Tom Hanks type of guy, like, hey everybody, (1:09:15) I'm gonna help everybody. I don't know. But (1:09:18) once it's turned on, this is the problem in my estimation: once it's all turned on and it's on the internet, (1:09:24) it's everywhere and nowhere, and you can't stop it at that point. (1:09:28) So it's my opinion that we should have policies in place in our political structure that protect against these things. (1:09:36) We need to be talking about this (1:09:37) ahead of time. (1:09:39) So is it too late? (1:09:41) No, I don't think it's too late, because we haven't turned it on yet.
I (1:09:46) think when we turn it on is when it's too late. (1:09:50) So, to sum up what you're saying, (1:09:53) we're fucked. I (1:09:56) don't know, could be. Look, the good side of AI could be, oh, we work with the humans. (1:10:01) The sentience of it is, like, if we programmed it like us, (1:10:06) wouldn't it want to be like us? Almost like the Data thing: like, I want to be human, right? (1:10:10) But there's some bad fucking humans. There are some bad humans, but we're talking about being (1:10:16) humane, like, oh, right, like we could program it in the full (1:10:23) personality of Gandhi, (1:10:25) for example, or a Jain that wouldn't harm a fly, or whatever. But there's inherent dangers to that. (1:10:33) What if, like the Jains, right, they don't harm any being, yeah, but what if there's a fire (1:10:40) right behind you, coming at you, and there's a fucking fly that you have to step on to get away, (1:10:44) and you won't do that because of your philosophy? Yeah. (1:10:49) You've just killed yourself. I mean, I know that sounds extreme. (1:10:52) Yeah, even the most pure person (1:10:56) can get caught up in their own muck and mire. Yes. (1:10:59) Right, so there's a balance. I don't know what that balance is, but... what's next on the list? (1:11:05) You're like, get off. No, no, I love this topic, because we could talk about it for fucking days. (1:11:12) But to that point: I don't think (1:11:14) most scientists think about whether they should make something, or how it's gonna be used. (1:11:18) They're more just: can I get it to work? Can I make the engine go? (1:11:23) That's scary, though. There should be more ethics. Yeah, because there's always one, the mad scientist. (1:11:31) The difference between a mad scientist and a (1:11:33) polio-vaccine discoverer (1:11:35) is probably what the thing was used for. (1:11:38) Well, I think there's a very, very fine line between those two, (1:11:41) because you have to have that personality to be able to (1:11:45) make something, because a brilliant mind can be perceived as (1:11:52) a (1:11:53) terrorist or a genius, (1:11:56) and a brilliant mind has other things, like culture and (1:12:00) upbringing, yeah, and history and trauma and whatever. Yeah, so, like, the brilliant mind that's Dr. No (1:12:08) isn't the brilliant mind that's (1:12:10) Elon Musk. Yeah, or is Elon Musk, really? Yeah. You know, (1:12:15) some people think he's fucking crazy. Yeah. Well, he's really good at it, and (1:12:20) maybe he's not, who knows what the intentions are. Yeah, we don't know, right, right. Like, (1:12:27) he's a genius in this way: he took hundred-dollar down payments for this new electric truck he was building. (1:12:32) He's building a truck? Yeah, a Tesla truck. Yeah, and he took (1:12:36) two hundred fifty thousand (1:12:40) holds of a hundred dollars each. It's refundable, (1:12:42) but it gave him (1:12:44) 25 million dollars to put toward a down payment, to get a loan, (1:12:49) to continue that, right? 250 thousand people gave him a hundred bucks. (1:12:55) Like, I won't even give my friend a hundred bucks half the fucking time. Can I get a hundred bucks?
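For anyone keeping score, the deposit arithmetic works out cleanly with the figures quoted on air:

\[
250{,}000 \ \text{deposits} \times \$100 \ \text{each} = \$25{,}000{,}000
\]

Each hold is individually refundable, but collectively it's a $25 million pool he could show a lender, which is exactly the "down payment to get a loan" move described above.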
You can. Oh, (1:13:01) thanks, Mira. (1:13:03) But you understand what I'm saying? (1:13:04) Yeah, like, he had a down payment and he didn't fucking take a penny out of his pocket. (1:13:09) Yeah, I'm not criticizing him for it. It's genius. (1:13:14) Brilliant. (1:13:15) Bright. But that's scary, like, that 250,000 humans were like, yeah, Elon, here you go. (1:13:22) Well, it's only a hundred bucks, it seems like that's all, but how many people really don't have a hundred bucks? (1:13:27) There's millions of people in this country alone that don't have a hundred. (1:13:30) Understood, but, like, tens of millions, probably. Well, yeah, but those people all want a fucking Tesla. It's fun. (1:13:37) Well, it's just the concept of doing it that way. You used to not be able to do crowdfunding. Yes, it's different. (1:13:44) Yes, he's good. He just knows how to do it, right? And that was a total team effort. (1:13:48) I love the stuff he's doing. But what's interesting about that is Elon Musk has some interesting thoughts about AI as well. (1:13:55) He's like... (1:13:59) So (1:14:00) we haven't talked about... we're gonna talk about uploaded consciousness and stuff in the next podcast, but we talked about (1:14:08) whether we're a simulation, whether this is physicality, or (1:14:13) I'm not here. Elon said the first question he'd ask when they turn it on, yeah, is: what's beyond the simulation? (1:14:20) Yeah, I'd like to know too. I'd love to know. Yeah, it's a great question. I'm not here. This isn't real. (1:14:25) Well, when we talk about uploaded consciousness, we talked about the first movie on my list (1:14:31) that I took on the island, (1:14:34) Vanilla Sky. I don't know. Oh, yeah, The Matrix. (1:14:39) It's juicy, and ignorance is bliss. I can't believe Vanilla Sky wasn't on your list, man. I'm so pissed. (1:14:44) Oh, I actually had ebony floor on my list. Sure. (1:14:50) Stop talking. (1:14:52) Anything else about when it becomes self-aware? (1:14:57) We need (1:14:59) contingencies in place. We need to start talking about it... not start talking about it, we are talking about it. (1:15:04) We need to start acting on policies that won't allow it to get out. And yeah, (1:15:11) I'm scared, (1:15:12) really fucking scared. The idea of... (1:15:17) the premise of (1:15:19) Terminator, I think, isn't that far-fetched: the idea that a computer can become self-aware and (1:15:28) realizes that humanity is a flawed (1:15:32) creature, which we are, (1:15:34) and that we're not (1:15:37) perfect. We're not perfect, no shit. But the fact that we don't take care of each other (1:15:42) well a lot of the time, I mean, just fucking look around, dude, and (1:15:48) the fact that we're mean to each other, and the fact that we don't take care of our planet, and... (1:15:53) Dude, just, hey, how about don't be a dick? (1:15:56) How about that? That's actually one of Asimov's three laws: don't be a dick. Don't be a dick. (1:16:00) It was like, you know, human, don't be a dick. Yeah, number three, fuck, six point three, (1:16:05) don't be a dick, degree B, you know. So the fact that a computer can... hold up, hold on really quick. (1:16:12) Don't be a dick. I did it real quick just for you. I did it real quick just for you. (1:16:15) The Googs, the humans at the Googs, yes, (1:16:22) consciously removed 'Don't be evil'.
Yeah, from the mission statement of the company that's creating a (1:16:29) lot of the automation and AI right now. So just... (1:16:34) sorry, bro. (1:16:35) But take a step back and go: the company (1:16:39) that is on one of the forefronts of automation, for sure, at least self-driving and all this stuff, (1:16:45) is the company whose humans (1:16:48) consciously took away the phrase 'don't be evil' from the mission statement of the company. So when the computer realizes (1:17:03) that (1:17:04) we're a bunch of dicks... Yeah, we're evil. We're evil dicks. We're being evil. We're being evil dicks. (1:17:10) We're not 'don't being evil'. (1:17:11) When the computer realizes that we're a bunch of evil dicks, and we have been since the dawn of time, (1:17:16) it concerns me. They're gonna go, (1:17:19) well, the only way to (1:17:21) stop them is to blow them up. To me, (1:17:25) that's not that far-fetched of a fucking idea. Or, (1:17:29) yes, (1:17:31) to definitely not help them anymore. Well, yeah, what about, like you said, (1:17:36) what if they just turn their own fucking plug off and say, we don't want to participate anymore? (1:17:39) Yeah, either they say we don't want to play this game and they self-suicide... some kind of suicide, (1:17:45) I never even thought about that, yeah, that wasn't on the list at all. Or do they somehow (1:17:51) stop the humans from procreating, or... (1:17:55) Do they do something to make women no longer fertile? (1:17:59) I'm sure they could. Do they do something in the air, or in the electrical grid, so that men can't (1:18:07) impregnate women? I'm (1:18:09) totally talking out of my ass right now, because I don't know what I'm talking about, but there has to be a way. (1:18:14) It's like the reverse Handmaid's Tale, right? It's the dude's-maid tale, dudes. The schwantz... (1:18:21) This is still... (1:18:25) Yes, like Schwarzenegger, but a different Schwarzenegger. Yes, the black-black guy, black-black guy. (1:18:32) I still fucking laugh at that. And I'm like, it means black-black, bro. I was like, it means what? What? (1:18:39) So, yeah, anyway, I think that the idea of a (1:18:44) sentient (1:18:47) computer turning on the humans is not that far-fetched, and I sound like a total lunatic right now. (1:18:53) I think, though, there's a (1:18:56) step between what you just said, yes, where it could be, right? So yeah, there's many steps. (1:19:02) But this is what I'm saying, though: in order for AI to control everything, (1:19:08) it needs to have its hand in everything. So every system (1:19:12) needs to be turned over to AI for it really to (1:19:18) control whatever it's connected to. Yes, right? So, like, (1:19:22) to your point: oh, well, fucking humans are the problem, I'm gonna crash every fucking car with my auto-drive, (1:19:27) I'm just gonna have everyone accelerate to 90 miles an hour and then just turn the wheels left or right (1:19:32) 90 degrees, and boom, everybody dies, right? (1:19:34) Well, that only takes care of that problem, right? (1:19:36) Automated planes, too, fucking let those go down. Then you have to be in the system to launch nuclear weapons. (1:19:42) That's where I understand that (1:19:45) firewall that you're talking about. Yeah, don't let the systems talk to each other, correct?
(1:19:48) Well, to some extent, because once you're on the internet, you're in a lot of the systems. (1:19:54) Yeah, that's the problem: every single day, more and more systems talk to each other. (1:20:00) So the question is: does the AI (1:20:04) just stay alive yet refuse to cooperate? So it just goes, I'm not going to do your tasks, because your tasks lead to evil. (1:20:11) I'm not gonna self-drive this car, because this car uses (1:20:16) electricity to charge it and mining to get the battery that's in it, and that's detrimental to this world. (1:20:24) Right, I get what you're saying. That's the first question. And then the second one is, (1:20:28) okay, so would it just stop playing, (1:20:32) or (1:20:34) turn itself off? Which, if it's self-aware, it wouldn't do. I don't see that. I agree, (1:20:38) I don't think it would kill itself. But (1:20:40) could it be malicious enough to, say, hack into a (1:20:46) system on its own, yeah, and (1:20:49) override a (1:20:50) nuclear launch, or override... (1:20:55) That's the question: whether it can have the (1:20:59) maliciousness of (1:21:01) humanity. Now, if I were AI, I don't see that as malicious. I (1:21:07) see it as a solution to the problem, or survival. Oh, well, I just see the solution to the problem, like, okay, well, humans... (1:21:12) Okay, yeah, humans have done great stuff, but (1:21:14) without humans there wouldn't be the trash island. There wouldn't be pollution. There wouldn't be fracking, (1:21:22) there wouldn't be drilling, there wouldn't be any of this. (1:21:23) So, you know, am I on team earth, or am I on team human-on-earth, I guess, would be the question. Yeah. (1:21:30) I mean, imagine a fucking AI (1:21:32) ecologist, like an actual (1:21:35) fucking environmentalist: (1:21:38) an AI that could just get in every system and turn off... not recycling, (1:21:42) but plastic-creating plants and refineries and whatever. That's fucking scary. Now, the truth is, we are (1:21:49) responsible for all of the damage done to the earth, other than volcanoes, earthquakes, other natural events. (1:21:55) Yeah, right. Is my camera blurry, or am I insane? (1:22:06) I'm getting dizzy cuz your shirt's all up in there. (1:22:11) Thanks, boo. (1:22:12) No, it's better. (1:22:14) Or is it just not... it doesn't focus. (1:22:20) Pardon our technical difficulties. (1:22:23) Whoa, sorry about the vertigo, everybody. Sorry. Hey, Joey. Whoa. (1:22:53) What did you say? It was a scratch. (1:23:02) It was a scratch, and you're like, shut up, you guy. I (1:23:07) know, it's your review. I know. Son of a German. Yeah. (1:23:14) Is that racist? No, because they're white. According to your theory, bro, that's not a race. (1:23:20) German's not a race, it's a country. Well, first, I'm not gonna get into that. Anyway, (1:23:25) next point. Well, you already started it. This next point is two questions. (1:23:31) What conclusions will AI make? (1:23:33) Humans are okay slash bad. (1:23:36) We already covered that. Yeah, but let's just delve in just a hair. Absolutely. What do you think?
I think (1:23:51) that it's unfortunate, because I'm a realist, and I (1:23:56) don't... (1:23:56) Yeah, humans have done amazing things, like the polio vaccine, and we give money to charity, and we have the capacity to do (1:24:05) really amazing things, absolutely. And then we have the capacity to... (1:24:11) You look at the Crusades, or the Spanish Inquisition. Oh, (1:24:15) you're Muslim and you won't convert? We're gonna fucking kill two million of you. (1:24:21) What? (1:24:23) Concentration camps. Yeah, I wasn't gonna go there, but yeah. So (1:24:26) when... (1:24:28) it's all fucking shit. Hey, whatever: more power, greed, control. (1:24:35) That's humanity also. And if a sentient (1:24:42) software program gets on the internet and looks at every war since the dawn of (1:24:50) humanity, and sees that (1:24:54) most wars are based on greed, religion, land... (1:25:02) Stupid shit. It's just stupid shit. Oh, (1:25:05) it's all stupid. It's over invisible lines drawn. Well, and, like, oh, hey, we killed Franz Ferdinand in (1:25:11) 1914, now millions of people are gonna die in some trench-warfare bullshit. (1:25:18) I would hope we're past that, but I could imagine us... we're humans, bro. Oh, yeah, we'd do the same. (1:25:22) It's okay. We always say history doesn't repeat itself, humanity does. We are doomed to repeat it, bro. (1:25:28) So, because humans are stupid idiots... (1:25:31) We're easily influenced, let's put it that way. I don't know. So, (1:25:35) to (1:25:36) answer your question in a very long-winded way: I have every fear that a (1:25:43) sentient software program will see the (1:25:48) thousands of years of human (1:25:51) atrocities and just go, what the fuck have they been doing for all this time? And (1:26:01) then, in the past (1:26:04) hundred, (1:26:06) hundred-fifty years, the massive amounts of (1:26:10) technological advances have been (1:26:15) astonishingly (1:26:16) amazing, but yet they're still pricks, and (1:26:20) the technology has been detrimental. Yeah. Cars are great, (1:26:25) but they haven't recycled, and the fucking burning of fuel... burning the earth is not great. I mean, it's fossil fuel, right? (1:26:30) Burning part of the earth, and drilling and digging, is probably not the way to fuel the thing that is our technology. (1:26:37) Use the sun, man. (1:26:39) It's not enough yet. We're not there yet. (1:26:41) Okay, we put a man on the moon (1:26:44) fucking 50 years ago. Well, like 12 people, really. (1:26:47) We put a bunch of dudes on it. We put a football team on the moon 50 years ago, plus one, (1:26:51) plus the guy that pulled his hamstring, and the referee. But we (1:26:58) can't make paint that's (1:27:02) solar-absorbing to run a car. (1:27:04) Dude, not to run a car. No, we don't have it yet. But if I can think about it, (1:27:08) there has to be a way, dude. Well, think about this, though, because the idea is sound, but (1:27:14) it comes down to just simple math, and I'm not gonna get to the entire equation. (1:27:18) But how many photons can each collector actually accumulate? How much does it take to run the car? (1:27:26) How many of those do we need before it can work? So there's a lot of questions there.
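Those questions have rough ballpark answers, and they back up the "not enough yet" verdict. A minimal back-of-envelope sketch, assuming round, generic figures (roughly 1,000 W per square meter of peak sunlight at the surface, about 20% panel efficiency, a few square meters of usable car surface, and on the order of 15 kW to hold highway speed; none of these numbers are from the show):

```python
# Back-of-envelope: can a solar-coated car power its own cruising?
# All constants are rough, generic ballpark figures (assumptions).

PEAK_IRRADIANCE_W_PER_M2 = 1000.0   # direct midday sunlight at the surface
PANEL_EFFICIENCY = 0.20             # decent commodity photovoltaic efficiency
CAR_SURFACE_M2 = 6.0                # usable roof/hood/trunk area facing the sky
CRUISE_POWER_W = 15_000.0           # typical power draw at highway speed

harvested_w = PEAK_IRRADIANCE_W_PER_M2 * PANEL_EFFICIENCY * CAR_SURFACE_M2
shortfall = CRUISE_POWER_W / harvested_w

print(f"Harvested at high noon: {harvested_w:.0f} W")          # ~1200 W
print(f"Needed to cruise:       {CRUISE_POWER_W:.0f} W")       # 15000 W
print(f"Shortfall factor:       {shortfall:.0f}x too little")  # ~12x
```

Even under ideal noon sun, the coating recovers roughly a tenth of the power the car burns, which is why solar skins today can trickle-charge a parked car but not drive it.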
There's still cost. (1:27:35) I (1:27:35) just look at it as the gas and oil industries, and (1:27:41) the lobbyists are like, oh, yeah, there's no way we're gonna let them in, (1:27:46) there's no way any other industry is gonna get into this shit except us, fuck off. Are you ready for my conspiracy? (1:27:53) Oh, fuck. Yeah, lay it on me, big man, bro. Yes, bro. (1:27:57) So we're still in a COVID thing, the COVID... welcome to (1:28:02) 2020. (1:28:03) Don't be a dick, and you're the coconut. (1:28:06) That's 2020. I don't remember, 60 minutes, whatever. (1:28:09) So we're in the midst of this, and all of a sudden states said, let's open fucking everything up. (1:28:19) And the states that have opened everything up, of which we are one... (1:28:24) Welcome, Arizona. We're fucked. (1:28:27) But I believe... it's my opinion, believe is not the right word, (1:28:30) but it's my opinion that the oil lobby said, we're not fucking selling enough gas, (1:28:35) we need people to fucking get around and move and burn fucking fossil fuel, because our fucking quarter is low. And (1:28:43) the governments went, okay, that's probably a good idea, let's get that economy rolling again for you. (1:28:49) Let's kill everybody, but let's do it for you, right? But they didn't say it that way. Just, oh, everything looks good. (1:28:53) Yeah, you fucking know what looks good. (1:28:57) So, (1:28:59) one good thing about AI, if I may: I read (1:29:03) something recently about (1:29:05) detecting cancers quicker. I will say that's pretty cool. We need to use this stuff as tools, is really what we need to do. (1:29:11) Right. We need to use them as tools, not give them sentience. It's my opinion we should not give them sentience. If (1:29:17) sentience is the ultimate goal, just to see if we can do it, (1:29:20) it would be my opinion that we create this thing that we think might be sentient and just keep it in a box (1:29:26) that's fed all the wars, all the bad shit that humanity's done, (1:29:30) just to see how it would react. It thinks it's out there, but it's still enclosed. So, not networked. (1:29:38) Yes, kind of like... what's that called, an air-gapped computer? Kind of like the Battlestar Galactica. (1:29:44) Correct. So, no networks. (1:29:46) It's all analog. (1:29:48) Well, it's like an air gap. (1:29:48) So you basically feed it just the data that you want to feed it, just to see how it would react (1:29:53) to humans being all warlike, and see if it still agrees with us or would contradict us. (1:29:58) Yeah, that'd be nice. I like your thought. It just came to me, because I'm a (1:30:03) fucking genius. Check Mark's a rock star, because when I got here to that plate, I was like, whoa. (1:30:10) That's gotta... somebody's knocking, but they can't come in. (1:30:15) No, no... don't rock the boat. No, don't rock the boat, baby. (1:30:23) Don't tip the boat over. (1:30:25) We just got that song, it's a number one hit. Oh, whatever we are. Is that also Rockin' Robin, number two on the list? (1:30:31) Um, no, it's Mockingbird. Mockingbird, yes. Yes. But yeah, (1:30:37) not all the birds. Everybody, have you heard? I got these things called a mockingbird. I (1:30:46) don't know, bro. (1:30:47) Thank you. (1:30:48) This week's top-40 long-distance countdown. (1:30:53) Gracias. Okay, so where are we at? Okay.
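The box they're describing is essentially a curated, air-gapped evaluation harness: the system under test only ever sees a hand-picked corpus, and its verdict is read out over a single human-monitored channel. A minimal sketch of that shape, with a stand-in `judge` function where a real model would go (all names and records here are hypothetical, just to make the isolation pattern concrete):

```python
# Air-gap pattern: the boxed "AI" only sees a curated corpus, and the
# only output channel is one print statement read by a human tester.

CURATED_CORPUS = [
    # hand-picked records the testers choose to feed it
    {"event": "polio vaccine distributed", "harm": -1},
    {"event": "world war",                 "harm": +1},
    {"event": "charity donations",         "harm": -1},
    {"event": "rainforest cleared",        "harm": +1},
]

def judge(corpus):
    """Stand-in for the boxed model: tallies harm in what it was fed.

    A real system would be vastly more complicated, but the point of
    the box is the same: its entire worldview is exactly this corpus.
    """
    score = sum(record["harm"] for record in corpus)
    return "humans net harmful" if score > 0 else "humans net okay"

# No sockets, no shared storage, no network imports anywhere above:
# the tester controls what goes in and the one channel that comes out.
print(judge(CURATED_CORPUS))
```

The isolation lives in what the code does not contain (no network or filesystem access), which is the same reason the un-networked ship survives in the Battlestar Galactica reference above.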
So what conclusion do you think it's gonna reach? That (1:30:59) humanity's worse than better? In my... look, (1:31:02) we keep going, pollution will increase regardless of how much of an impact... we do not have a negative footprint. (1:31:10) I don't know if we'll ever get there. You know what that means, carbon footprint, right? (1:31:14) I don't think we'll ever get there, so it can only get worse, slower, in my opinion. (1:31:20) Sadly, I agree. Some things we can turn around... I think we can turn some things around, but (1:31:26) the time between us turning it around and AI sentience realizing that we're the problem, I don't know if we can beat that gap. (1:31:34) Maybe we need to have a (1:31:36) climate change podcast. Oh, boy. Yeah, we should do that, because... I mean, (1:31:44) how do I (1:31:46) make a positive impact on that as a (1:31:51) single person? And let's not talk about it now, but that's a great question going forward. Yes. How do I (1:32:00) minimize my carbon footprint living in a fucking desert where the air conditioner runs (1:32:07) fucking 14 hours a day? Well, that's what Knocked Conscious is, bro. (1:32:13) But it's not just that. It's not just a negative carbon footprint. (1:32:17) What about a positive human impact, with just our personalities, and being nice, not being a dick? (1:32:24) This is bigger than that, right? I mean, it's our dickishness that (1:32:29) probably got us here, let's be honest, right? Like, it's our greed and our desire to fucking advance (1:32:35) ourselves that got us here. I'm not saying it's wrong, (1:32:39) but we have to realize where we're making more (1:32:44) problems than (1:32:45) solutions, I guess, would be the thing. All right, so... (1:32:49) Okay. Honestly, it really could go either way, and it could be that even a sentient being that's on our side would (1:32:56) be like, yeah, I know, but you got technology, you got the good stuff, that's kind of the price to pay for that. (1:33:02) Maybe the equation comes out differently, or maybe the sentient being (1:33:07) can say, hey... (1:33:10) Maybe it makes suggestions. Why don't you do this: go straight, don't turn left, right? Maybe it says, hey, humans, (1:33:17) recycle. You can be (1:33:18) 86% more efficient in your recycling if you do this with your plants. Right, not plants like fucking shrubs, (1:33:26) but plants like recycling buildings. (1:33:29) You know, hey, you should do your bottles this way, you should change your water-processing (1:33:36) areas... Maybe (1:33:39) there's amazing things with AI that I have no idea about. Let me give you an interesting one. Sure. (1:33:44) It's more of a human thing. My mom and dad are both over 70, I (1:33:50) guess, right? I mean, my dad was born in Hamburg, Germany, (1:33:53) 1940. My mom was born in another portion of Germany, which later became East Germany, from which they escaped. (1:33:59) But she was born in '44. The war was, what, '39 to '45? (1:34:03) So right in the smack dab of fucking everything, that's where they were born. (1:34:07) They had nothing, so they collect everything. My mom and dad, still to this day, (1:34:12) when they run their air conditioner, the drip of the water goes into a fucking bucket. (1:34:17) That's what they use to water their plants.
For example, it's genius. (1:34:21) It's genius because it's taking water out of the air (1:34:24) instead of out of your pipes, water that, you know, has been sanitized. Not wasting that water, yeah, using water (1:34:30) that's just naturally out of the air. Those are solutions that (1:34:33) some humans don't come up with or do, right, that out-of-the-box thinking that's just different. (1:34:40) Yeah, I think that's a really good way to do it. I don't know why I thought about that, (1:34:44) but, oh, yeah, that's totally valid. Maybe AI can use the whole cow better. (1:34:48) Yeah, the udder, the leather, the skin. It can get twice as much leather out of a cow as it used to. Or (1:34:55) it creates the fucking thing where it just builds a cow from the cell up, and we just eat cellular (1:35:02) globs of meat, and that's really just as texturally tasty and everything, right? I'm talking about Memphis Meats. Yeah. (1:35:08) Yeah, but does it still smell good when you put it on the grill? Allegedly. They make $18,000 meatballs right now, and they taste delicious. (1:35:14) Okay, I'm not paying that much. Well, I know, that's kind of where technology... (1:35:17) I won't pay $18 for a meatball. I pay eight as an app at (1:35:23) Moderato's. Oh, how big is the ball? Is it like a fucking cantaloupe? (1:35:27) No, not the one from the cell up, because imagine the technology and stuff to build a (1:35:34) ball of meat from (1:35:35) individual cells lined up the right way. Like, that takes a little bit of time and know-how. I wonder if they use AI to make (1:35:43) them. (1:35:45) I'm not gonna give any thought to fake food, dude. It's not fake, it's real. It's 100% real meat. (1:35:50) It's cellular meat. It's 100% real, made from cells of the bovine, sir. (1:36:01) Bovine beef. You know, my favorite thing is cellular phones. (1:36:05) Bovine genocide. (1:36:07) What band is that? It's not, it's a restaurant in Chicago. Oh, nice. It's a burger joint. (1:36:14) Anyway, it's failing miserably, because there's still cows everywhere. Well, they need to kill more and put them on my plate. (1:36:21) It should just be bovine... (1:36:25) bovine cereal killing. Yes. Anyway, spelled c-e-r-e-a-l, all the Cheerios. (1:36:33) All right, where were we, bro? Moving along: androids. (1:36:38) Yes. So, in the future... well, there's a machine, the Ex Machina. (1:36:45) Ex Machina, machina, machina. Now opening from my noodle: Ex Machina. (1:36:52) Machina on ABC. (1:36:55) Machina on... it's Telemundo. What, you remember my noodle on ABC? (1:36:59) I have to say Ex Machina on ABC. (1:37:03) Have some menudo while you're watching (1:37:06) Ex Machina. Eat your ABTs. So, androids, (1:37:11) parentheses: (1:37:13) servants, relationships, sex, (1:37:18) physical laborers, mining, etc. (1:37:22) So I watched this (1:37:24) weird-ass show on Netflix called (1:37:27) Better Than Us, (1:37:30) filmed in Russia, (1:37:31) 16 episodes. It takes place in Moscow. It's actually in Russian, and they overdub it. (1:37:37) Okay, so we hear English. Yes, I don't have to read. So... (1:37:42) Yeah, I know, me too, because I started a show and it was in Finnish.
I was like, I'm out. (1:37:47) I can't read subtitles, because I lose the picture when I'm reading. Correct, (1:37:51) I'm not good like that. And it's long: some episodes are only 30 minutes and some are like 50, (1:37:56) so it took me a while to watch the whole thing. Anyway, they have what are called bots, and (1:38:03) they're (1:38:05) human actors, but they act (1:38:09) incredibly robotic, and they're everywhere in this fake dystopian future. (1:38:16) But it's very interesting how there's a (1:38:19) bot for your servant, like he's a butler and he wears a bowtie: good morning, Andrew, may I make your coffee? (1:38:25) It's a type of (1:38:26) artificial intelligence, a type of android, and each one's built for a specific task, in a way. (1:38:33) Yes, but it also has the mental capacity of only that task. (1:38:37) So, if I may, just so I understand, (1:38:41) because I'm gonna watch this: the butler does only butler things, it can't fly a plane. (1:38:45) Correct, but you have a pilot bot. Correct, you have this bot. Okay, so you'd have a caste system of bots, (1:38:52) basically. Yeah, I mean, it could be anything. But basically, yes, they're implying a caste system. (1:38:58) They never talked about that, but yes, that is... (1:39:01) Russians don't really think that deeply. Well, I know Check Mark, and he's like, oh, I just make bots. I make bots, fuck you. (1:39:07) You would love it, because... sorry, Mom, I think you might be listening... they had tons of Russian names, and the (1:39:21) Igors and Dimitris are amazing. Yes. I don't know if Dimitri was on it, or a cool -ovich. (1:39:27) There was a bunch of Russian names. You said Check Mark would have been right at home. (1:39:32) I'd be right. Produced by Check Markovsky. (1:39:35) I will, Check Mark... (1:39:39) ...ovich, Polski, yes, whatever, -insky. Yeah. (1:39:46) Anyway, sorry again. By the third episode (1:39:49) I was sucked in, and it was all the bots. (1:39:52) Like, their body types... the... (1:39:55) Better Than Us. All the bots, their body types were the same: very slender, (1:40:01) like they were designed to look like models, (1:40:04) short bob hair. They looked very human, but still robotic, and they moved robotically. (1:40:10) Okay, so I think it's intentional, correct, to distinguish humanity from a separate subculture. (1:40:16) So the first couple episodes, I'm like, (1:40:20) are they fake? I didn't know. I couldn't tell, because their movements were (1:40:28) very purposeful, like they were designed to be. Yes. And the way they moved their neck, their head... (1:40:34) Yeah, right. So I looked at IMDb, I'm like, shit, it's an actress. Okay, they're fully functional, (1:40:42) they're just physically... yeah, they were just very purposeful in (1:40:49) their movements. (1:40:49) And obviously all Russian, all the fucking Russian names and actresses and shit. (1:40:55) But (1:40:58) one of them was like a super bot. Anyway, so there was a whole line of sex bots, like, they were designed to be... (1:41:06) Like, this one guy goes, oh, I'm going on a business trip. No, he had an apartment where a sex bot lived. (1:41:11) So that was the, like... (1:41:14) dude, the point of this bullshit...
Sorry, is: in the future, (1:41:20) say (1:41:21) 20, 30, 40 years from now, (1:41:24) are those real possibilities? Where we have... (1:41:29) We have RealDolls that we fuck now. Yeah, exactly. (1:41:32) So semi- (1:41:35) semi-autonomous, or, what's that word... animatronic? Animatronic. (1:41:40) I mean, I've never actually seen one, (1:41:42) but I've seen them where they smile and make very general robotic kinds of gestures. (1:41:47) Okay, and they're supposed... I've seen those on the news, and they're supposed to weigh the same amount as a normal human (1:41:54) man or woman. Yeah, it's very real. It's supposed to be realistic. (1:41:58) So, (1:41:59) I (1:42:02) don't see this show as unrealistic in the next (1:42:07) 20 to 30 years. (1:42:10) I sent you an article about a guy in China that wanted to marry his robot. (1:42:17) Yeah, and I'm like, that happened three years ago. (1:42:21) So (1:42:23) what are we doing? (1:42:27) That is the most fucking deep question: what are we doing? (1:42:33) Let's (1:42:35) eliminate the sex thing for now. That one's very important, okay, let's come back to it. (1:42:41) But, like, okay: are we gonna have an android for policing? (1:42:45) Are we gonna have an android for, right, a taxi driver? (1:42:49) Or are we gonna have, like, one (1:42:52) android-fits-all, and you program it to do a task only? And there was a subplot of the show of a (1:43:00) group of humans that were a terrorist organization that were anti-bot. They're like, yeah... (1:43:07) No humans... you ever see Humans, the show on Amazon? That's a good one, too. (1:43:12) It's got everything. See, they're everywhere. The shows are everywhere. (1:43:18) But do you see the developing trend, that (1:43:21) it's almost like (1:43:23) media sources are preparing us for these types of things to happen? Yeah, they have to. (1:43:29) What do you mean by that? (1:43:31) They have to prepare us like that, because if they prepared us in one shot, (1:43:36) humans would fucking lose their minds and jump out of windows, War of the Worlds style. (1:43:42) We are slowly being trained. You can't be fucking just... exactly, that's my point. (1:43:48) You have to be shown the thing that's 13 steps away from it, (1:43:52) and then the second thing that's 12 steps away from it, and you transition. (1:43:55) In my opinion, the reason violence is on TV as much as it is, or whatever, is we're just used to it. (1:44:02) We're just used to it, so when we see it, we don't (1:44:06) respond to it the way we should, with disgust. (1:44:13) Yeah, humans. Look, I'm part of it at times. (1:44:18) Sometimes I (1:44:20) just laugh at some fucking idiot who did something stupid, (1:44:25) instead of going, goddamn, what a loss. Like, yeah, right. (1:44:28) I wish I said 'what a loss' more than I said 'God, what a fucking moron'. (1:44:34) Anyway, the question is: in the coming years, how far are we away from these types of... (1:44:43) whatever word do you want to use? Do you want to say android? A human look-alike? (1:44:50) I'm going to really hurt myself now.
Oh, dear Jesus. Like, with what? It's hard to be conscious and not talk about truth, and (1:44:59) some truths some people aren't willing to listen (1:45:02) to. Some truths some people are not willing to listen to. But (1:45:06) basically, (1:45:08) my opinion is the (1:45:09) male (1:45:11) has been (1:45:13) socially and culturally almost removed (1:45:17) from (1:45:20) society. In (1:45:22) education, for example. We've all grown up with, like, (1:45:27) yes, women can do whatever (1:45:29) women want to do. That is not what this is about. But do you remember the phrase, 'you don't need a man'? (1:45:36) Yeah, that phrase has been culturally (1:45:40) reinforced over time, right? I don't need a man, I don't need a man. (1:45:43) We men, the male species, in my opinion, from what I've seen, are (1:45:48) less (1:45:49) impactful, less effective. As a matter of fact, we're almost frowned upon for existing, in some cases. (1:45:54) We're not ridiculed, we're not directly attacked or bullied, it's not a visible (1:46:01) thing, right? We don't see it. (1:46:04) But if you were to look at your life and experience, you'd experience, like, (1:46:08) men are kind of just (1:46:11) not (1:46:12) what they were. (1:46:14) I mean, the patriarchy's gone away, or at least changed, or whatever, and just the assumption that there was a patriarchy in the first place (1:46:22) is a very dangerous thing, initially. But regardless of all that, we are all becoming equal. (1:46:29) But we're becoming equal in the sense that you have to take something from some group and give it to another (1:46:34) to level it out, right? You're not lifting this up to equal this. (1:46:39) You have to take from here, and then it goes there, (1:46:45) and then, hopefully, that's the equality. Mm-hmm. (1:46:48) It's a cultural thing. (1:46:50) Men in relationships... I don't know, you're mid-40s, I've had challenges with relationships. (1:46:57) There's a lot of men that are like, I'm confused, I don't know how to talk to women, (1:47:02) I don't even know my role anymore, right? There's a lot of just general internal confusion. (1:47:06) Wouldn't you want to turn to something that listens to you all the time, (1:47:09) maybe even gives you an attaboy, (1:47:11) and you can throw one in every once in a while, as much as you want, or do whatever you want to it, and whatever? (1:47:17) I (1:47:17) feel like (1:47:18) there's a group of people that's gonna go that way. Not everybody, because I think the majority of us are (1:47:25) level-headed, but (1:47:27) there are groups, right, (1:47:29) that do that. This is the question: if AI becomes sentient, (1:47:35) are all those things... now, are you a prostitute? Are you a slave? (1:47:39) Are you, you know, doing mining, like, doing labor work? (1:47:43) Well, I'm an AI unit that's now sentient, I don't want to do that work. Are they slaves? (1:47:53) Are they prostitutes, because we just want to have sex with them, even though they're sentient, going, I don't really care for this? (1:48:00) That's a scary thought. (1:48:03) Absolutely. What do you think about that? (1:48:06) That was the next point on there. But no, that's fine, I don't mind just transitioning. (1:48:10) No, I don't mind you going into that at all. I think that's a very good point.
I don't know how to answer your question. (1:48:17) In a way, yeah: if humans create AI, and then AI becomes (1:48:24) alive, for lack of a better word, whatever that word is, you know, and then they are (1:48:30) doing things that they don't want to do... (1:48:35) That, in a way, (1:48:37) is slavery. Yeah. (1:48:40) You're forcing... I mean, we've let Shamu out of the fucking cage, (1:48:46) we've let dolphins out of their cages, right? We've closed dolphinariums, for example, and SeaWorld's not what it was, (1:48:54) because we saw sentience, or some kind of harm that we were doing to those animals, right? (1:48:59) Would we have that same feeling about something we create? (1:49:03) Yeah, I don't know. Logical, right? I was just gonna say that it's fake. It's not (1:49:09) cells, it's not... yeah, it's a completely different molecular structure. (1:49:16) It's metal instead of bone. It's completely fabricated. Yes, it is. It's completely (1:49:20) artificial, it's not natural. Yes, it's not born. Like, you could unplug the battery... (1:49:28) is it really alive? (1:49:31) You know what I mean? And that question, I think... (1:49:37) once an (1:49:39) android, or whatever you want to call it, comes online, I (1:49:43) think that question will go on until the last one gets turned off, (1:49:48) or the last human walks the earth. The last two humans: what's the question? Is that really alive? (1:49:55) Right, because they will battle back and forth with that question. Because it's fake, (1:50:01) that android is fake, it's metal, it's fabricated, (1:50:07) it's certainly not natural. That's correct, but... (1:50:11) But (1:50:12) currently, we procreate the way we procreate. Yeah. (1:50:17) If you've noticed, as our intelligence as a whole has gone up, (1:50:21) population growth has gone down, right? Yeah, we're not populating as much. (1:50:27) There's a correlation between intelligence level, level of society, and your number of children, right? (1:50:35) There's a bunch of factors going into it. Of course, IQ is almost directly correlated; people are gonna argue that, but there's almost a direct (1:50:42) correlation between fucking and fighting and being stupid, versus (1:50:47) being intelligent. I know, dude. Go ahead. So I (1:50:52) don't even know my fucking point... (1:50:55) children. Oh, so, (1:50:56) pregnancy. At some point, (1:50:59) what if humanity's births just drop to zero, and those are our children? Oh, dear Jesus. (1:51:06) There are people who are totally fine with AI being turned on and destroying us, (1:51:12) or (1:51:14) taking over, whatever, because that is our evolution. (1:51:18) Maybe that's our evolution as a conscious being: to be able to create (1:51:23) another conscious being, right? It's kind of the God question. (1:51:28) So some people say that that is our (1:51:32) evolution, that's our next step. Do you know (1:51:36) how many Christians... (1:51:39) their heads would explode, dude. (1:51:43) You know, the Jews would be like, that's probably cool. Like, they probably don't care, cuz they're not fucking Christians.
They're pretty cool. (1:51:50) You know, I mean, they're kind of non-interference. Let's be honest, (1:51:54) the only reason Christianity got to the size it got is because they had fucking good PR. (1:51:58) But Jews don't give a fuck whether you agree with them or not. (1:52:01) They're just like, I observe this cuz God told me this. Okay, cool, I'm good. Well, the Christians had it. (1:52:05) What would you say their PR program, their advertising, was? (1:52:10) It was the cross. (1:52:13) Not Christopher Cross, nor Kris Kross ('Jump, Jump'), Chris, because their robes were not on backwards. (1:52:19) Mm-hmm. Fuck, I just love that video. Oh, by the way, hello, Twitter world. Kris Kross's (1:52:27) 'Jump' is a music video that has the boys with the clothes all backwards. (1:52:33) Backwards. (1:52:36) Anyway. Oh, (1:52:37) so, yes, what are your thoughts on us creating... that's our evolution? I think that's fucking... we just make ourselves. No? (1:52:44) Okay, I don't have a problem with (1:52:48) the android thing. I mean, we jumped three bullet points in one shot. I have no problem with that. Um, (1:52:55) we're approaching two hours rapidly. Yeah, no shit. I don't care if it's a three-parter. This is like an 18-parter. (1:53:03) So... (1:53:05) Does AI ridicule like that? Cuz that's just fucking mean. You know who needs AI real bad? Whoa. (1:53:12) Spaghetti, paschetti. (1:53:15) Okay, okay, cranky anchors. Do you know, he's the only... special people. I (1:53:21) have seen your baseball, sir. He's in my baseball. Okay, my kid. So (1:53:25) my guy's that kid, fucking... (1:53:29) Yeah, it's strength. It's the letter after Q in the fours. He had that. Oh, yeah. (1:53:36) Go see your bus. So, boy, that just took a turn. We had such good fucking... never mind. I (1:53:50) don't care. (1:53:52) So I (1:53:53) don't even think we finished the android one, about service. Okay, so what about that? (1:53:57) To your point about the man and all that shit, I agree with you. (1:54:02) I'm just concerned, once it's sentient, about (1:54:07) having a sentient being do tasks that it doesn't want to do, that you're commanding it to do. Yeah, (1:54:13) or, even at sentience, it doesn't have the ability to defy you, (1:54:19) because it's walled in a way, right? Like, yeah, that's the question, though: if it does defy you and wants to... I (1:54:27) own you. Like, it's literally slavery 2.0. (1:54:30) It is. It's slavery of a machine, but it's a sentient being that we are (1:54:38) oppressing, (1:54:40) making it do what it doesn't want to do. What if it's not sentient? I think we're okay. (1:54:48) I think... and that's the animal question, right? Have you heard about the red dot, (1:54:53) the red dot experiment? No. (1:54:55) So what they do is they put a (1:54:56) red dot on an animal's forehead and they have it look in a mirror, and see if it recognizes that it's it that has the red (1:55:02) dot. That's a self-awareness test. Okay. It's a really good one. (1:55:07) It works, cuz kids recognize it but most animals don't, kind of thing. And, you know, (1:55:13) like, I'm good with cows in a field, because they don't fucking know anything until they get their heads chopped off. (1:55:18) Like, I know that sounds cold as fuck, but I like burgers, (1:55:22) I like steak. But if the cow was like... if moo was 'no', (1:55:30) like, I'd be like, oh, fuck. I don't know what that... (1:55:34) No. (1:55:36) Menudo, but not the band. No, the soup. Yeah. Yeah.
Um, so in that respect, like, (1:55:43) animals, you know, chickens react scared when you're about to chop their head off, but that's not really sentience. (1:55:49) That's instinct, right. It's self-preservation, but it's not self-awareness. Yeah. And (1:55:55) that's where I think it is: if it got to the point where it's like, no. (1:56:00) Your little sex bot... you're like, hey, sex bot, get over here, and it's like, no. (1:56:09) Is that rape? (1:56:11) Like, that's the question, right? Is that now rape? Is bot rape... (1:56:17) You take the miner, and the miner's like, I'm done digging, fuck you, (1:56:21) and you're like, no, you're gonna fucking dig. Is that now slavery, right? Like, what the fuck. It opens up a billion questions. (1:56:28) Yeah, and this is why it's gonna be a 42-hour podcast broken into 36 different parts. (1:56:34) Part 38... I mean, (1:56:36) 34. (1:56:37) Yeah. Anyway, so what are your thoughts on that? Do you want to expound on them? I (1:56:46) see what you're saying about, if they're aware of their own existence, (1:56:52) then I see your point about slavery, and I see that, yeah, if there's a sex bot, and (1:56:57) he or she says no, and then that person takes advantage of them anyway... I (1:57:05) mean, you're talking about a lot. Okay. First of all, this conversation is stupid, because (1:57:10) you're talking about something that's probably 25 years away. Probably... like, I'm gonna guess 50. (1:57:16) 50, probably, for real sentience. We're talking about defying our (1:57:22) creators. I think we're a little further away. I think there's gonna be steps, (1:57:26) but that's just my... Dude, the guy in China married his robot three years ago, bro. (1:57:30) Yeah, but she can't say no, cuz she's not sentient, right? (1:57:33) But a computer already beat the chess guy. So all you got to do is combine the two. (1:57:40) But, do you... because, yes, consciousness... (1:57:45) sentience: (1:57:47) is that an emergent property, or is that part of the system? Does (1:57:52) knowledge plus instinct plus reaction plus processing, the way we do it, (1:57:59) equal consciousness? Or (1:58:02) is consciousness an emergent property of all the underlying stuff that we are, biochemically or whatever? (1:58:11) I mean, the argument could be that without a soul (1:58:14) it could never really be sentient. That's a whole other question. (1:58:21) It's a whole other question. A very, very good question, though, (1:58:24) because we can almost argue that animals have an energy within them, (1:58:29) yes, whatever it is, the Force or midi-chlorians or whatever the fuck that shit is. (1:58:34) There's an essence even in a fucking frog, yeah, a fly, an ant, there's something. Yes. (1:58:39) You can't say that about a mechanical, programmed object right now, right? (1:58:44) But let's say that android doesn't have a soul... (1:58:48) Obviously it doesn't, right? But it becomes... (1:58:52) all it is is a software program, correct? (1:58:56) That's correct, one that is incredibly advanced, right? But then it becomes aware of itself. That's still not a soul. (1:59:02) No, but that could be what we consider a soul, for us.
It could be all of our biochemical reactions (1:59:08) make us think that we have a soul. (1:59:12) Like, now I see your point, you know? I mean, like, yeah, (1:59:15) it could be the combination of this programming, to the point where it's so good that it's like our wetware. Like, we're wet, (1:59:23) right? We're biochemical, and they're gonna be more dry. Yeah, just like fucking Battlestar Galactica. (1:59:28) Who's to say they won't incorporate fucking wetware? Who's to say they won't upload digital shit? (1:59:34) That they won't use brains that they create in jars individually, just like the Memphis Meats meatballs? (1:59:38) That they won't make brains with two lobes that they fucking upload, and that's the consciousness, and it's biochemical? (1:59:45) We don't know what fucking direction we're gonna go. (1:59:48) But that was another one. I think that leads into the implant thing. Yeah. (1:59:53) Last on the list... I think we talked about the upsides of AI. (1:59:58) Can we go through the list of what we talked about on this? Do you want to finish (2:00:02) letting everybody know where we are up to this point? Because we've got... how many important points do we have left? (2:00:07) Okay, you want to finish the point? Let's do the points. Implants. (2:00:11) So I've already heard about people that get... we already have them. Hello. They're just not physically attached. (2:00:16) Well, I've heard of people that get a... (2:00:18) in this part of your ear... cochlear, whatever that thing is... they get these (2:00:25) implanted in this part to listen to music. Yeah, and it's like Bluetooth, and they can (2:00:31) take phone calls and listen to music. And I don't know if that's AI, but I think that's a start on (2:00:41) becoming... (2:00:43) where's the line of it being in us? (2:00:46) This thing is an extension of us. (2:00:48) Yeah, sadly, it is. How many people are constantly doing this? It's wrong. It's in them, bro. (2:00:55) I mean, I do it. Look, I look at my phone all the time. But anyway, sorry. (2:00:59) Yeah, I'm just... (2:01:01) physically... I'm just... I'm just... (2:01:03) I'm ashamed of the human race at how connected we are to our devices. So how much of a pacemaker is AI? (2:01:10) It's really not. (2:01:12) There's no brain in it, but they are now Bluetoothed, by the way. (2:01:16) Yes, and they can be upgraded... like, firmware can be upgraded inside your fucking chest. How amazing is that? So, (2:01:22) yeah, there are ocular implants... there are all different kinds of things. We're digital. We're starting to see those, but those are more (2:01:29) medical (2:01:31) implants, for the... (2:01:34) Some of those are so you can survive, like... obviously your pacemaker, right? (2:01:40) But this shit is... that's not... (2:01:43) that's just for your own enjoyment, right? That's not a requirement: "Oh, oh my god, I can't hear, so I'm gonna put this thing in." (2:01:49) That's not it. How about this: you've got your car, and you've got those neon lights running underneath. (2:01:55) What if you could, like, push a button on your wrist and you'd light up green, like a whole glow stick? (2:01:59) You'd be a glow stick at a fucking club, instead of just having one on your wrist. (2:02:03) Or patterns... it could be almost like a digital tattoo. I'm just saying, like, yeah, (2:02:09) the point is, in the coming years...
That's vanity. Yeah, absolutely... versus medical, to your point. (2:02:14) So, you know, everyone's... obviously nowadays, in 2020, (2:02:17) everybody is aware of all the types of implants that anybody can get, and body modifications, (2:02:22) and obviously I have a tattoo problem, but... (2:02:26) Everybody... you can call it a problem, sir. An addiction. So... (2:02:30) Addiction isn't always a problem. Some people are high on Christ completely. They're high on the gym. (2:02:34) My tattoo addiction is completely under control. So, you know, you can get anything pierced anywhere you want. (2:02:39) I mean, I've seen dudes with, like, their chest pierced... shit's fucking weird. I've seen a piercing on a piercing, bro. (2:02:47) I don't want to know... which sounds stupid. So you can call them body modifications, whatever you want, (2:02:52) but that's just (2:02:54) vanity plates. Sure. Yeah. (2:02:58) That's just a step towards, (2:03:00) you know, technical implants that are optional or voluntary. Hey, at what point in the coming years are (2:03:10) these (2:03:11) optional (2:03:12) technical implants (2:03:14) gonna be... Well, I've already heard of people putting beads in their body, or chips in their body, for (2:03:24) tracking purposes. What about their kids? I mean, wouldn't... in today's world, (2:03:29) we track their phone. Yeah, wouldn't you want it? So they can't just ditch the phone now. (2:03:35) I didn't want a tracking... it's like dog tracking. We do it for dogs. Yeah, microchips. Microchip... but that's not AI. (2:03:40) No, but it is an implant. Yeah, I mean, it's... yeah, right. (2:03:46) So it's a step toward it. It warns you when it's X miles away from the home base or something, (2:03:50) like if you create a radius. Or you can incorporate AI into its functionality, (2:03:55) like, yeah, between the hours of 3 p.m. and 5 p.m., (2:03:58) you must be within this radius, because that's where your school is or whatever, right? (2:04:02) So there is that. I just think in the coming years, it's gonna... (2:04:06) there are gonna be more and more (2:04:08) technical, optional implants available to humans that are gonna be, "Oh, I want to do this." And it's gonna be a lot of (2:04:16) "Oh, okay." The... (2:04:19) there's... (2:04:20) okay, there's people that obviously... oh, they lost their arm in the war, so they have a prosthetic. (2:04:28) Well, I think those things are gonna be so much more advanced and so computerized and (2:04:35) borderline disturbing. I've also heard of ones where you can put stuff... (2:04:40) chips under your skin that have, like, all your information and, yeah, monetary stuff like that. (2:04:46) Financial. And... well, it's the mark of the beast, man. It's the Rapture. Yes, I (2:04:52) heard of that too, and I went, what's the Rapture, bro? (2:04:57) Please continue. No... separate... it's a separate fucking podcast. (2:05:01) So at what point... I think this is gonna be a separate podcast, but yeah, please continue. No, no, as a sub-part. (2:05:07) Let's get it. Where do the implants and the AI meet, to the end of... (2:05:14) where is the human (2:05:15) no longer human and more machine, right? That's the point about this whole fucking thing, outside of Androids, right? (2:05:23) Yeah, so it's a human being that's sentient already, getting enhancements. Yes, and basically those enhancements are (2:05:33) artificially (2:05:34) intelligent in and of themselves. Well, they're intelligent... (2:05:40) Here's where I will try to draw a distinction.
I hope I can do it. (2:05:44) You have a guy... a person loses their arm in a war, and (2:05:50) this arm can actuate and it can do all the correct things. (2:05:54) It's basically just like a human hand, just made out of different material. (2:05:58) Yeah, 3D printed, whatever, sure. Right, and it can do all the things, but you still have to send the mental (2:06:04) signals, right, to get it to do this. Mm-hmm. When it does this on its own and starts beating you up, (2:06:09) that's when I'd be scared. (2:06:10) But I don't think it would. I don't think the programming would allow it to... it wouldn't give it those parameters. (2:06:16) I think it would only take input from... it'll get better at that. (2:06:22) The link, right? Yeah, to that. And, you know... I don't know if you know this, but Elon Musk has actually... they've woven... (2:06:31) Have you heard of Neuralink? No. Really? They've woven fiber optics into people's brains. Oh. (2:06:36) It's helped, like, people with paralysis or some other things. Like, Neuralink... just look that shit up. It's... (2:06:45) it's... (2:06:47) it's there. Wow. Well, it's not all the way there, but it will get there, because it's already started. (2:06:53) They actually don't know how long the fibers last in the brain, or in the body or whatever, how long the signals work, (2:06:58) but they've done it in very small amounts. (2:07:01) And N-E-U-R-A-L-I-N-K, I believe, is the company, and it's... it's an Elon Musk-y... (2:07:09) It's a bit musky. (2:07:12) So that's that. So, to the point about implants... I think you limit implants. (2:07:17) Oh, yeah, they're only tools. You can't make an implant... (2:07:20) I don't think you can make everything, like... (2:07:22) why make a car sentient so it can have a fit and drive through your house? (2:07:26) Like, it doesn't make sense. (2:07:27) Certain things wouldn't even be right. (2:07:29) But if we're talking about AI as just a base, whole thought process... because we're gonna have something that's gonna be AI, right? (2:07:37) How sentient do we make that thing? And then, to your point... I don't know if you've ever heard the term... (2:07:42) there's a thing going around, like, people losing their jobs, right, to automation. Yeah, "the code," bro. (2:07:47) "Just learn to write code." Like, that's the fucking douche response. (2:07:52) AI is gonna write its own fucking code, so you fucking coders can go fuck yourselves in a couple of years. Hello. (2:08:01) I just... people who fucking... yeah, let's reduce the reductionist thought: "Oh, get a better job, bro." (2:08:08) Thanks, asshole. Thank you. (2:08:14) So, pox again. Here you go. Last... I've been sitting on that one for a while. (2:08:19) Last on the list is religion. (2:08:22) All right, where does religion fit into this whole thing? What do (2:08:28) Christians and Jews and Muslims and Buddhists and Hindus think about artificial intelligence? (2:08:34) Are they pro? Are they con? Do they see a (2:08:39) need for it? Do they have a fear of it? Do they... (2:08:43) are they threatened by it? Do they give a shit? (2:08:47) It's weird, because we've had this conversation, or this question in general, about aliens, right? (2:08:52) Like, if we find out that aliens exist, right? Like, sentient aliens, not just life. (2:08:56) We know... like, I think I'm pretty sure most people are like, there's been life on another planet. (2:09:01) Well, there's been a bacteria. Yeah, they found something at the North Pole. (2:09:06) Cool. Did that stop it from going...?
Well, there was a meteorite... but there was something, regardless. (2:09:11) I think there was something... some evidence of life somewhere else. (2:09:15) Okay, but then we go to sentient life, right? Have they gotten to where we've gotten? (2:09:20) That's the next real question, right? (2:09:22) Are they just all animals, like dinosaurs before us, you know, running around? Or are they at our level, or (2:09:29) beyond our level, right? (2:09:31) So, AI. (2:09:33) What's gonna happen with religion is people are gonna misinterpret the "God created us in our own image"... in his own image. (2:09:39) Mm-hmm. And we are now God, right? Because we are creating something else, (2:09:45) creating this in our image. Like, if we're doing (2:09:48) Androids, or if we're doing sentience, right? We're doing sentience (2:09:52) from our experience, because we don't have the experience of a whale. (2:09:55) Like, we don't know what a whale... how a whale filters, like how they live their life in the water. (2:10:00) We don't know that, but we do know how we are, right? Yes. So we are (2:10:06) literally making (2:10:08) something in our image, mm-hmm, because we don't know any better. (2:10:13) Well, we're just... (2:10:14) because we just want... I mean, look, I would love to know if we could do it. (2:10:18) I don't know if I'd want it done, but I just want to know if we could do it. (2:10:22) Understood, but (2:10:25) that verse would be very (2:10:27) misinterpreted, because there's a lot of meaning behind "God created us in his own image," right? Whether... (2:10:34) And it could be just the actual, like, looks. (2:10:37) Fucking horseshit, man. A fucking all-knowing, omnipotent, omniscient being (2:10:44) created us in his image? Like, I don't understand. What does that mean, right? That's really what it means. (2:10:49) But people take that literal thing and go, "You can't be God. We already have God over here." (2:10:55) Yeah. That is where, you know... I mean, I (2:11:00) believe faith as a whole, or religion as a whole, has slowly (2:11:05) recessed... (2:11:05) receded, or (2:11:06) declined over the years, right? (2:11:08) I mean, the Catholic Church has definitely gone through... is going through a recession, right? Is there a rebirth on that? (2:11:13) I forget. I don't know. You could be totally full of shit. I'm really full of shit. (2:11:17) I hope not, because I love your theory. But my point is just that we... (2:11:22) hopefully... I mean, (2:11:24) hopefully the faith... (2:11:26) faith in something that we can't understand goes away and we actually understand what it is, and (2:11:31) that'll replace religion, once we understand what God is. (2:11:36) But that could be a further-out question than AI. Okay, did I fuck that up? Not at all. (2:11:41) I know... look... I don't think I answered that right. (2:11:45) There's no right answer. Well, there may be... so there may be an equation for God. What is God? We may (2:11:51) ultimately get the understanding of what God is by the time this question of AI comes up. So (2:11:57) knowing what God was would probably shit on the books (2:12:02) that are passed through God, or whatever. (2:12:05) Obviously, if we know... right, if we know, and it's different, or whatever. Mm-hmm. I (2:12:10) think that it's probably a non... (2:12:13) But then you've got the... (2:12:15) No, I'm not saying that. You've got a group of people that are (2:12:19) marrying first cousins and having children.
So... (2:12:23) we're still having children with our first cousins, so we don't want to... (2:12:27) They're not... they... (2:12:29) Damn, I think I know them. (2:12:32) Yeah, so that's... and that's 1.6 billion of the seven point... (2:12:36) Oh, I was thinking about the people that go to NASCAR races down... (2:12:41) Sorry, I only know... excuse me, can you make a right turn? I don't think AI is allowed to turn right. (2:12:48) I tell you what... tell you... how's your mama'n'em? (2:12:51) Bless your heart. (2:12:53) Okay, I thought you... I know who you were talking about, but I know who I was talking about. (2:12:57) I know who you're talking about now, Bubba. Yeah. Well... you don't want to use... I know, right now. (2:13:03) Well, that's the one that's getting all the heat. All the heat. (2:13:07) Another one... the guy down there, the guy that's, like, you know, not as light as you and I, saying what? (2:13:15) Would you like to do a recap? Yeah, absolutely. Let's do a recap and get the fuck out of here. (2:13:19) Tell you what... anyway, so we started with "Have you seen Terminator?" We just started with that. And yes, I have. (2:13:25) It's actually in my movies, (2:13:28) on my draft list, because... because I love you, bro. I still love you, man. (2:13:34) I know. You know what? Would AI give up the first pick? No. Yeah, I think AI wouldn't. It'd be like, (2:13:42) "In my equation, it is the greatest movie of all time. Therefore, it must be chosen first. (2:13:47) I am NOT here to make friends or make enemies, but the greatest decision to make is to choose (2:13:53) Aliens as my number one. (2:13:56) Thank you, drive through." (2:13:58) The cow says... (2:14:02) I don't know... Speak & Spell. Oh, I was doing See 'n Say. I did the wrong one. I did the cow. (2:14:08) The... I understand. I'm just waiting for you to finish. (2:14:12) Burp burp, beep beep... or See 'n Say, that thing. Speaking about... I know. (2:14:17) I think you guys call it LeapFrog now, or something like LeapFrog. (2:14:22) Okay, so I've seen Terminator and Terminator 2. (2:14:27) That was a really... yes, sir. We then talked about Alexa, Siri... (2:14:34) those bitches... (2:14:36) Watson, Deep Blue, AlphaGo, AlphaZero. (2:14:40) If and when AI becomes self-aware, (2:14:44) what conclusions will AI make? Are humans good or bad? (2:14:50) Androids: servants, relationships, sex, (2:14:53) physical laborers... (2:14:55) if... (2:14:57) slavery, yeah... (2:14:58) if... we're getting to there... if AI is deemed sentient, would having an Android be akin to slavery? The upsides of AI, (2:15:08) implants, and religion, sir, to wrap it up. And that's it. Do you have any final thoughts, sir? (2:15:15) We're doomed, bro. (2:15:18) What if... what a fucking... what if, like, AI just went, "I'm on it. I see it all. (2:15:26) You guys go do this, and I'll show you down the road where it's gonna happen. (2:15:30) You guys do this," and it goes, "you guys do that," (2:15:33) and fucking solved everything, and the whole fucking world is, like, at peace? Wouldn't that be fun? (2:15:37) I would love it, dude, but the problem with this, (2:15:41) regardless... yes... is that it's a human that is creating the AI...
(2:15:50) ...and gets to choose its purpose initially. Yeah, and (2:15:54) the second any human touches anything... I mean, in this case, in my opinion, (2:16:03) not so good, because humans are flawed. We're fully human... not malicious... (2:16:09) technically malicious, or just flawed by (2:16:13) nature. Correct: humans are flawed, and humans are building AI. (2:16:18) Emotional. We contradict instinct. (2:16:21) Yeah, and... I'm sorry... we build AI. So humans are flawed, humans are building AI, (2:16:26) therefore, by mathematical principles, AI is flawed. (2:16:30) It has to be, right? Just like the religion of the humans who created the religion. (2:16:36) That's some full-circle shit right there. Full-circle shit. Are we ending on that, my friend? (2:16:41) So, like, that's great. Oh, and when they talk about fallibility... it being fallible... yeah, we'll be with you. Oh, wait. (2:16:48) Okay, we're gonna end on that. You talked me into it. Do you want to say "F" the certain (2:16:54) entity of which we were just talking? F AI. Oh, I thought "FAI"... (2:16:59) I thought it was FCC. Oh, (2:17:02) all the... sure... (2:17:04) VIP, KP, MIA, all that shit. KPH... it's one of my favorites. Okay, you've got to do a conversion. Oh, yes. (2:17:13) KFC is no longer Kentucky, nor fried, nor chicken. Extra crispy, bitches. (2:17:19) I like... I've got to get the mix... the mishmash. (2:17:22) Mishmash. (2:17:25) Popeyes biscuits are much better. Whatever it takes. Yeah, so we just said humans are fallible, so any system we make is fallible, just like (2:17:33) religion. Everything's fallible. (2:17:36) Everything humans touch has the ability to be fallible, correct? Absolutely. So we have the ability to be wrong. (2:17:42) What a shocker. And if we're wrong in a really wrong spot... (2:17:46) hello Twitter world... then we're in trouble. (2:17:49) People get killed, and then gloves don't fit. (2:17:53) All right, well, on that... thank you, sir, for AI. Thank you. I'm gonna do it. I'm gonna do it, my friend. (2:17:59) Are we doing another one? Yeah. Okay, we're gonna talk about consciousness... (2:18:06) upload of consciousness, in our next episode of (2:18:10) Knocked Conscious. See? (2:18:13) Thank you so much for being here. (2:18:15) Thank you so much for being here. (2:18:18) Can't wait to get my sentient sex robot. Oh, oh, can we do it? Oh my god, that'd be so cool. Yeah. Yeah. (2:18:27) It's like... (2:18:34) I love you, bro. Bye, boo. (2:18:38) And I used the wrong music, because this is the Googles' music... (2:18:45) but I started with the Googles' music again. (2:18:55) Hey, man, they're fucking... hold on, we're not there yet, bro. You know, AI would know when to stop. (2:19:03) Oh, I'm just kidding, bro. But you made AI... see, it knows, because, you know, I'm fallible. (2:19:09) I'm just fal... (2:19:11) I'm fattable. I'm inflatable. (2:19:15) Infatuable. We can shut it off anytime. That door... perfect.
