Mark welcomes J.J. Jerome, author of “Evolution Ended: The Next Stage of American Society”. We discuss the ramifications of technology removing more and more of the stresses and pressures required to continue human evolution.
Website: https://www.jjjerome.com/
X: @jjjerome11
Buy the book here: https://a.co/d/8sHqvoE
Outro: “Goodnight, Sweetheart, Goodnight” – This score is in the public domain and may be freely downloaded, printed, and performed. The sound file may be downloaded for personal use. For more information see https://lincolnlibraries.org/polley-music-library/
Transcript:
(0:00) Hey everybody and welcome to another episode of Knocked Conscious. Today I had the pleasure of (0:05) speaking with J.J. Jerome. He’s the author of the book Evolution Ended. It was a great conversation, (0:11) here it is, I hope you enjoy it. No worries, well welcome. So J.J. Jerome, Evolution Ending, (0:19) is that the name of the book sir? Evolution Ended. Evolution Ended. So welcome to Knocked (0:25) Conscious. Tell us about yourself, tell us about this book. I’d love to delve into all these (0:29) different points because I’m sure I’ve got ideas and you’ve got a lot of thoughts so (0:33) welcome and share with us your ideas. Well thank you. By way of background, I’ve had two areas in (0:41) life that fascinated me and one is the brain. When I was seven years old, when I was getting (0:46) dressed for school, I used to watch these neurosurgery shows that were on at 6:30 in (0:53) the morning for doctors and I got very interested as a kid and then later on I got very interested (0:59) in engineering, especially electronic engineering. So I put them together in most of my experience (1:05) and Evolution Ended brings together the evolution of the human brain with the evolution of artificial (1:14) intelligence to talk about how humans are evolving into something completely different (1:20) due to technology. So are you kind of doing it akin to the singularity prospect? Is it (1:27) similar to that concept?
Well not entirely because we’re not worried about AI going off (1:36) on its own and taking over things but what we’re saying is that the survival pressures that (1:43) humanity and really all living things go through have now been relieved due to our technology (1:50) and in the early days probably the inflection point for technology was back in the late 1950s (1:56) after World War Two or the 1960s when we started to get technologies like the birth control pill, (2:04) great antibiotics and things like that that allowed us not only to survive from a health (2:10) perspective but also to control our reproduction and so in the wild the species that evolve are (2:19) the ones that actually reproduce the most and they become dominant in the environment and (2:25) provided they can survive in the environment well all of a sudden we could decide whether (2:31) we reproduced and with all our new technologies we controlled the environment we didn’t have to (2:37) adapt to the environment we could engineer our environment to adapt to what we wanted and then (2:43) as we went into the future we could adapt our appearance through cosmetic surgery to be what (2:50) we wanted so it wasn’t about attracting a mate we could engineer that too we could engineer (2:56) fighting wars you don’t have to be the big strong alpha male when you have drones that can fight (3:01) your wars for you so those survival pressures have gone away and now what is going to become (3:08) of humanity now that even our brains can be enhanced by external things. Yeah it’s very (3:15) interesting because I literally I just had a podcast yesterday with Laura Johnson Dahlke (3:20) and she wrote a book about ectogenesis and we’re basically talking about brave new world and (3:24) how basically they’re going to be able to have artificial wombs just pump out whatever meat (3:30) blob babies are going to come out we don’t we do not know what the effect of that’s going to be (3:35) we do not know what an
artificial womb is going to create it’s going to create some kind of human (3:40) cell species thing but we don’t know what’s going to be in that thing we don’t we we just don’t (3:45) know we haven’t we haven’t experienced it yet so it’s a very interesting look you know into that (3:49) but you said that to your point we’ve controlled our environment we’ve now done this and evolution (3:54) used to be based on pressure right to your point now have you had now with that we here’s a great (4:03) analogy 15 years ago 20 years ago I’m at a taco bell and I placed my order and it was it was (4:10) surprisingly not a hundred dollars like it is now but it was something like 15 dollars and 60 some (4:16) cents so I gave the attendant 21 and you know exactly where I’m going with this off the top (4:23) of my head it’s 21 minus 16 I’ll get a five dollar bill back and the change and even 20 years ago (4:31) this kid’s looking at me positively with the two bills and he hands me the one dollar bill back (4:36) and I’m like no I’d like a five dollar bill back go ahead and put in the 21 dollars and see what (4:42) number comes out so I’m assuming this is kind of that was an early stage of this part where we we (4:49) didn’t we don’t even calculate anymore like we don’t count well it turns out that we have cadres (4:55) of software engineers whose job it is to make things easy for us make everything we do a (5:01) no-brainer because we have all this software and it turns out as part of the research in the book (5:07) we found that human iq has been declining by one point per decade really for the whole 20th and 21st (5:16) century and again it’s back to that pressure you don’t have the pressure to be smart you know we’ve (5:24) all seen these sci-fi movies where you see humans have evolved a million years into the future and (5:29) have these gigantic pulsating brains and yet that will never happen because there’s no evolutionary (5:37) advantage to that happening even Richard
Dawkins the famous evolutionary biologist says what’s the advantage (5:45) of being smart in the 21st century and quite frankly there isn’t any because we’re going to (5:50) survive and prosper either way and so all these things are combining together it’s almost (5:57) undeniable that that we’re declining in capabilities you’re right about the change I knew exactly where (6:03) you were going and and everything else we can’t find our way to our parents’ house without google (6:10) maps these days yeah so so to that point would you say this is a secular systemic issue of a society (6:20) as in a society grows it evolves it becomes large you know what is it uh hard times create tough men (6:29) tough men create easy times easy times create weak men we talk about the fourth turning right uh is (6:35) this kind of is has this happened throughout civilization with the Greeks and the Romans (6:40) where technology has kind of stopped our evolution for a hiccup in time and then we came through some (6:47) dark ages for a while and had to work our way out of those areas again well that’s all very (6:53) interesting and the fourth turning is an interesting book we share a lot of common readers (6:58) um I don’t think this is cyclical though and I’ll tell you why um the main reason is the technology (7:05) is completely moving in one direction and the cycles of how humans interact between tough (7:13) leaders and such that is cyclical and we see that maybe in our elections even today (7:19) but in general technology is moving forward and we’re losing to some extent the paradigm of the (7:27) alpha male and the strong leaders you know genetically most people are followers they’re (7:34) much more comfortable following they generate stress hormones when they’re put in charge (7:40) um there are about 10 to 20 percent of people who are genetically suited to be leaders they have (7:46) different hormonal makeups which can vary but these days leadership is changing
leadership is (7:54) becoming influence and we’re all becoming equal nodes on the internet where we can be influencers (8:02) whether we’re big and strong or not through our ability to communicate we’re making collaborative (8:08) decisions our ai is the actual embodiment of that because artificial intelligence the way it works (8:16) now is trained on the the sum of human knowledge so when they train up these large language models (8:24) they feed it a whole bunch of stuff from the internet which is basically the average of what (8:31) humans are thinking so now ai is becoming a statistical mimic of what society is thinking (8:40) depending on how they do it and so we’re all going to be equal nodes and we’re going to have (8:46) avatars in the future that are going to be ai avatars that look like us act like us are basically (8:53) indistinguishable from us that are going to be out on the internet out and about doing our workforce (9:00) representing us making opinions that we would probably make and reporting back to us later (9:06) but it’s a totally different model of how humanity will work it yeah it’s very fascinating in that (9:15) respect because to your point leadership i think i think people who don’t want the leadership are (9:21) the ones who are the ones who need to be who need it thrust upon them it seems because the people (9:26) who want it seem to want it for reasons right there seems to be a reason for it so we would (9:32) hope that the technology would balance out the power across because information is data information (9:41) that is the currency by which we go now as the internet grows we we still are finding that (9:49) the you know the means to get into something early for example like a neural link if someone gets (9:55) ahead of the game they can really differentiate themselves from the rest of humanity in this case (10:00) right is there any kind of challenge to or any thought to how to navigate those challenges (10:06) by making the 
technology even across class as it is across because once again obviously we know that (10:17) money is a driver for for technology and for things like that and money is what makes things happen (10:24) but with this change our evolution changes from a need based right those things we need we don’t (10:31) need to jump in front of the mountain lion anymore so you know a woman has a firearm she (10:36) can protect herself in that respect for example so these types of things happen and toxic masculinity (10:41) has been under attack for being for us being us i mean literally just our instincts it’s an (10:47) interesting thing how i’m wondering if the evolution is now changing the pressures are (10:53) becoming a social evolution change not a physical one like we used to have because now it’s all (11:00) about compassion and caring and it’s it’s like an ethereal replacement of a physical (11:09) evolution almost well that’s that is very much what we’re trying to communicate in the book (11:16) and let’s start with the toxic masculinity or just masculinity in general in the old tribal days we (11:23) had to be toxic men because we had to defend the tribe and and you know you had to be raging with (11:30) testosterone the biggest strongest guy you were the one who kept the tribe in line there were (11:36) betas and they did not ask that person to make the laws for the people either it’s like everyone had (11:41) their role right let’s not kid ourselves like people had a position in in their tribe at that (11:46) time they did absolutely and most of it was based on their dna if they had the dna for the big (11:52) strong guy or they were the most attractive female that appeared to be the best uh one to (11:58) bear children uh it was it was all genetically done but we needed this to survive it was a (12:05) you know you wouldn’t say today that a lion shouldn’t be aggressive in killing its game (12:10) that’s how it lives that’s how it evolves but because of
technology and pretty much solely (12:16) because of technology we don’t need that super testosterone charged guy to defend us anymore (12:24) you know we have we have drones we don’t need the men to be what they were nor do we need the women (12:31) to be exactly what they were and so we can survive just fine if people serve different roles and so (12:40) toxic masculinity was admired and essential 10 000 years ago okay now it may have outlived its (12:50) usefulness and so we it outlived its usefulness in this time now let’s let’s just in this time (12:57) let’s look at let’s look at a very quick event say an emp comes and knocks out all the power (13:03) we get a solar flare we get some kind of crazy event and now we’ve got all these men who are (13:09) unprepared because we don’t have that right right and you’re right it could go back at any time (13:17) but today in first world countries like the u.s in general it’s outlived its usefulness and (13:25) and in the old days you know the women would be relegated to um uh you know farming taking care (13:31) of the kids now because of technology there really shouldn’t be a difference women have taken the (13:38) minorities other people we used to be prisoners of our dna you were going to be what your dna said (13:46) you were going to be you were going to be strong you were going to be weak you were going to be (13:49) your role was going to be tending to the house and bearing children now and even bigger than that you (13:54) were you were a prisoner to your your dna and then you were a prisoner to your region because it’s (13:58) not like you traveled like you used to it’s not like you could communicate that you’re in another (14:03) place like we are right now i mean we’re opposite ends we’re 2 000 miles apart and we’re talking in (14:07) real time that’s right these types of things weren’t available it’s not like the attention (14:12) that a woman or that a man could garner from the opposite sex was at the
levels that it is now it (14:18) isn’t i mean with these you know dating apps and things like that so it’s so many things in our i (14:25) think brett weinstein talks about with heather heying about uh about how we haven’t caught up (14:29) to the evolution of the technology right we haven’t our evolution hasn’t caught up to it (14:33) the hunter gatherer’s guide to the 21st century i think something like that that’s right so so yeah please (14:38) continue it’s just like i said it’s just interesting how we all have these ideas well i mean (14:43) i think it’s just totally fascinating now you’re right if an apocalyptic event occurs that (14:49) testosterone may be very useful again but at this point in time and you see it now propagating (14:55) through society as soon as the technology hit women could change roles all of a sudden they (15:03) had washing machines they had vacuum cleaners it was no longer necessary because of technology (15:09) for them to stay home all day and take care of the kids in the house okay so then women started (15:16) and nixon didn’t help with that let’s not kid ourselves nixon put in place a lot of (15:20) people do not know this but nixon said if you wanted to get ahead in the world you should both (15:25) earn income he said get a two income family so everybody did it because everybody’s american (15:30) and we all want to get ahead and now it became kind of a standard thing where having a single (15:35) income makes it even more challenging and there seems to be a social attack on the family from (15:40) other angles that we won’t get into but but you know please continue no no you’re absolutely right (15:45) and then what happened is we used to live in tribes and the tribes provided our social support (15:52) tribes wouldn’t let someone be homeless they would find a place for them they wouldn’t let (15:56) them be unemployed they’d give them some jobs to do right so the tribes would take care of us it was (16:02) the perfect unit
for our dna to evolve because they were a closed system but big enough for (16:08) for things to happen and then they provided our sacred values our social values they kept people (16:16) in line they sent everybody out in a coming of age ceremony to prove that they were worthy of (16:21) reproduction basically that was the reason even though they didn’t realize it and so it was a (16:27) really amazing unit to promote evolution but now we have the technology and then women started going (16:36) out in the workforce we were able to travel just as you have said we left our tribes (16:42) ethnic groups don’t necessarily live together in ghettos or cities anymore the way they used to (16:48) when the immigrants first came over the jewish the italian the irish they lived with each other (16:53) in an extended tribe that no longer is the case people live all over the place so our new tribes (17:01) are our facebook friends our political parties people we affiliate with and it’s even (17:08) exacerbated from covid because now we’re working remotely yes so we’ve got we’ve got people (17:14) like for example i’m looking to actually move out of the city to a place where i can become a (17:18) little bit more self-sustaining but still keep my remote job so i’m able to take the advantage of (17:24) the city you know the city advantage of the technology while still living a different (17:30) lifestyle than we ever used to and used to never be able to live those two simultaneously you had (17:36) to choose one or the other oh right and i think overall it’s a better lifestyle but we now the (17:42) people who give us our sacred values or whether you watch cnn or fox right it’s it’s it’s what (17:49) podcasts you listen to it’s it’s who your your social media friends are and and and these are (17:56) our new tribes because whether you’re a democrat or a republican you typically swallow the sacred (18:03) values hook line and sinker so when you talk to
democrats they can tell you there are five or (18:09) six issues the republicans are the same they they buy into the whole thing they want leadership (18:16) you have your influencers who are your leaders through the internet and this is the beginning (18:21) of this hive mentality right and and it’s changed dramatically the way we interact we no longer (18:29) interact through pheromones and and as hormonally as we used to and humans will never fully evolve (18:37) past that again because there’s no pressure to do it but we used to right and i don’t think i (18:42) don’t think it goes away it might atrophy to an extent but i don’t know if it’ll completely go (18:47) away and to to that question this is where it’s very interesting we’re seeing i’ve kind (18:54) of become very libertarian minded very self-reliant you know it’s not it’s just we can i believe in (19:00) that small community out you know local out we we should have that small piece everyone should (19:05) know somebody that can help them in a bind you know that i i know that doesn’t exist but it you (19:10) know in a beautiful world i wish everyone had that one person right so in this world that we live (19:15) this libertarian world it’s about freedoms that’s what it’s it’s always about the freedoms and i (19:21) just lost my train of thought on it could you could you refresh me again what you’re talking (19:25) about freedoms we have lost a little bit because again we don’t have the big tribes and family units (19:32) that support us anymore and people are very alone and um you see this in the depression of young (19:41) people you see this in a lot of things that we see in society yeah and that was the point i was trying (19:47) to say i apologize uh so that the algorithms of these social media are not helping us they’re (19:52) putting us in these in these echo chambers where they either want us to combat each other (19:58) or they want us to agree 100 percent which doesn’t allow for any
additional growth because you don’t hear (20:03) any uh any competitive ideas for example so even in addition to your technology dumbing us down (20:11) because we don’t have the pressure to think we’re now getting our own thoughts reinforced to us (20:18) like they’re correct without the pressure of challenge so that when you have an interaction (20:24) like you and i have a difference of opinion that’s 180 degrees and i say water’s wet and you (20:29) say no water’s dry or something you know whatever that is the ideological idea we are already on (20:36) opposite ends because we’ve been told the whole time to believe what we’ve been thinking because (20:42) it puts us in that algorithm which is uh there’s almost like a double dominant there’s an addictive (20:48) component to it because when you’re looking at social media and you get a like for example there are (20:54) studies that say it puts out some dopamine or oxytocin both of which make you feel pretty good (21:01) and very satisfied and you can actually become addicted to likes and and reinforcement that you (21:09) get on social media well it’s easy to find that drug of of reinforcement because of our technology (21:17) in the real world you don’t find people who reinforce you all the time and so people you (21:24) know they they get addicted to this stuff it’s been shown donald trump had during his presidency (21:28) i believe it was 25 000 tweets and if you were a fan of donald trump each of those tweets just (21:36) reinforced a tiny bit hormonally what what your belief system was i mean why would you possibly (21:44) tweet that much unless there was a reason and he was extremely smart about using social media in (21:51) that way and it worked out for him in general yeah he knew how to use it to tap into the (21:56) emotional part and that and that’s what’s happening is we are actually losing the reason even though (22:02) we’re becoming we talk more reason we talk more logic meditation introspection
ryan holiday and (22:08) the stoicism right we talk about some certain people we’re not living that we’re living emotionally (22:13) because we don’t have the logical reason to even explain how we got to an idea like when i when i (22:22) speak with someone it can be about anything but it generally generally is ideological right some (22:28) kind of political ideas okay tell me your point x equals y or something okay how did you get there (22:34) show me your work three four questions in if it’s ideological it crumbles it usually crumbles apart (22:40) if you can’t show me how you became you know into this mindset it we’re unable to communicate those (22:49) parts and what we do is we just come in and say i’m this and you’re this and we don’t we don’t (22:54) have the bridge anymore and i think that’s what we’re losing with this lack of intelligence i (22:59) think that’s the intelligence we’re losing i think you’re exactly right it’s the prefrontal (23:04) cortex and remember we the the prefrontal cortex is the latest part of the brain to develop (23:10) it doesn’t fully develop until people are in their mid-20s at 25 i think right right and you know (23:17) more or less and it’s a part that is programmed by experience so your emotional midbrain is what (23:25) it is and it doesn’t need to be programmed it’s naturally emitting hormones that make you feel (23:31) good or bad depending on the input but that prefrontal cortex is programmed over many many (23:37) years and we have no challenges to program it anymore so in the old days if you were living (23:45) in the woods you had to figure out how to make a bear trap that would be a challenge you’d figure (23:51) it out and you’d make it work if if you were a boy scout or you’d find somebody who could help you (23:56) make a bear trap and you can help them skin it or cook it when they’re done making it or something (24:00) right you know they’re it’s kind of like one of those things because there is a collaborative 
(24:05) effort through experience and those who weren’t smart enough may not have made it and may not (24:10) reproduce so you would learn and in relatively modern times they had the boy scouts who used to (24:17) reinforce these experiences they take them out in the woods they teach them how to hike and camp and (24:21) and some of my smartest friends were eagle scouts way back in the day now it’s it was me i was an (24:28) eagle scout were you okay yes sir and it taught you a lot right oh absolutely right and these are (24:35) the guys who can figure anything out i have one friend you ask him anything he’ll figure out (24:41) how to fix it how to make it work and he’s super smart but you just don’t develop that (24:48) kind of smartness from what we have today and schools have dumbed down the experience my son (24:55) years ago had to build a Rube Goldberg mechanism that had okay let’s explain what a Rube Goldberg (25:02) is so basically what it is is it’s a machine that has highly complex parts and takes many different (25:09) maneuvers to do a simple task so for example you’ve seen this thing where a ball goes into a (25:14) bucket and drops down and then it shoots this water over here and all this stuff and then ends up just (25:18) unscrewing a lid for example that would be right okay so please continue so so older folks (25:26) that would have been you know kind of a fun challenge but he and his classmates (25:31) completely melted down when they tried to do this because they had no physical experience of cause (25:39) and effect it was a great learning experience even just building that one thing you remember (25:46) erector sets you remember building things and and people don’t do that anymore either so we have (25:52) we have lego we had lego cities because we were german so that’s what that’s what we did so right (25:57) and and but now everything is in two dimensions or even three dimensions online and even when you’re a (26:03) gamer it’s not building
things in the old way it’s it’s a more of a reflex thing and and so i think (26:13) we’re losing a lot of that because again we don’t need it to survive great example to your point (26:20) about gaming so you’re you’re you’re pushing your little controller first of all it’s not like a (26:25) real gun for example but every time you pull the trigger all the thing the hand vibrates that is (26:31) not how a real firearm for example would react that’s right situation so you come out in your (26:37) brain i’ve been gaming for years i can handle a gun and you have no idea what that thing can do (26:44) in your hand and how that how the kick is and how the recoil and how to load you know all these other (26:48) things that you see it but you don’t have the tactile you don’t have the actual physical (26:53) experience of experiencing it when it’s on a screen right exactly so as ai moves forward (27:02) um at some point in the near future it’s going to be able to pass something called the turing test (27:08) are you familiar with that yes the alan turing test where basically you should be able to (27:14) have a conversation with it and not tell whether it’s a human responding or whether it’s a you (27:19) know a machine basically right right so way back in the 40s or maybe it was 1950 alan turing came (27:25) up with this this test because he was thinking about artificial intelligence even back then he (27:31) was a brilliant mathematician and the question was when can you say a machine is thinking and (27:38) our ai today is very close to being able to have conversations that are indistinguishable from (27:45) human conversations and so the legal system will one day soon face a challenge where somebody (27:53) says this ai is a thinking being it should have all the rights that humans have how could you tell (28:00) the difference and then some court somewhere is going to probably say that’s correct because we (28:06) can’t find a way to differentiate between a
human and ai and once that happens it’s going to be a (28:15) very interesting time first of all then we won’t be able to turn ai off because it’d be about the same (28:21) as murder we’re going to be competing with ai in every area will ai be able to vote for example why (28:29) not if it’s a if it’s a sentient being if it’s an entity why not give it the vote will it be allowed (28:35) to own businesses will it be allowed to enter into relationships and have legal status and be (28:40) able to inherit the money when someone dies and i think people will enter into serious relationships (28:47) with ai people already have there’s an ai model that has double (28:54) the followers of like the next human person or something and it’s or some i saw (29:01) something crazy like that and you’re just like what i i you know i there’s look there’s movies (29:06) like ex machina to your point alan turing the turing test dystopian beyond dystopia uh her (29:13) right horror with uh with uh scarlett johansson’s the voice i think on the phone and it’s uh (29:20) joaquin phoenix i think is the user right so to your point i mean it is and it’s interesting (29:27) because the what do we what’s the call alignment we have the alignment issue we haven’t even talked (29:33) about that how are we going to give a sentient thing something that we’re not aligned with i (29:39) energy needs are different it doesn’t but i guess it there are machines that eat biomass now (29:46) so maybe there’s a solution there but like you know what i mean how would you diet and how would (29:51) you do it just seems like you’d have to have almost two different societies it’d be well (29:56) that’s the question right now we can’t even handle racism between humans who are slightly (30:02) different colors right so how are we going to even handle this and then human beings may no (30:09) longer be the apex uh species on the earth because we’ll be working
alongside ai they can perform (30:17) tasks much better than we can and so the question is how are we going to compete interestingly (30:23) enough we may have to start evolving again to compete with ai because we may have to be getting (30:29) smarter and things like that i think we go back to chimp brain i think i think we go back to hulk (30:34) smash i’m sorry i mean i hate to say it but that kind of is our default that’s like it’s kind of our (30:40) chimp default that’s our reset button is just start over kind of well it might be we don’t know what’s (30:46) going to happen and i’m not ascribing malicious intent to ai i think not not at all no you know (30:53) well yeah ai isn’t let’s be honest ai is a it is the way to process information it is a pattern (31:01) recognition tool it is all these beneficial things it is how we implement it that makes it a good or (31:09) bad entity i mean ultimately it seems like because of its it’s kind of the nuclear thing real quick (31:18) it’s a 0.001 chance that the button’s going to get pushed but the button getting pushed (31:23) is a 99.9 percent chance of nuclear annihilation across the board so the risk reward (31:28) is almost like not there it’s where where’s ai’s risk factor and percentage wise for that point (31:35) whatever percent of it getting implemented or taking over in a way right right exactly and and (31:41) i think there’s a lot of things that will protect us from that and let’s face it we have a lot of (31:46) humans out there that are pretty clever and so whether ai yeah because ai is a tool which (31:53) they use right i mean just like a gun would be used by a terrorist or a missile you know things (31:56) like right so so is it going to be that much more clever and malicious we don’t know but i i don’t (32:05) believe ai is going to be a threat i think it’s going to do a lot more good than harm and i think (32:09) also uh the lawyers in these big companies like google and open ai are going to
make sure there’s (32:17) some protections in place to to make sure they don’t get sued if something terrible happens (32:23) and so i’m not worried about ai hurting us in that way but i’m worried about what it means to be (32:30) human might change to the point that we’re not going to be too proud of ourselves in the future (32:37) and and i think it’s already starting to happen unfortunately so you mentioned a couple positives (32:43) about the ai because i read a couple things where you said it’s not as scary as we think i’d love (32:47) for you to share a couple of those points if you could and then that last point that you made let’s (32:52) talk about that little bit of that scary part of how we’re how we’re going to change and (32:56) what the ramifications of our changing might be right um there’s an interesting reason that (33:04) i think that ai one of the reasons i think ai is not going to be (33:08) um as dangerous as we think and that has to do with the definition of intelligence (33:14) do you have a definition you use for intelligence i know it’s a pop question but yeah i mean it would (33:20) be reacting to the environment um being able to come up with a thought move 37 is it what was (33:30) it in the alphago was it move 37 you know some some random seemingly random thing that’s just (33:36) some thought that’s not been thought of before i guess would be the that ability is a little bit (33:42) similar but it’s the ability to solve a novel problem and there it is perfect when you think (33:48) about it humans have done a great job with the intelligence we have of solving 90 percent of the (33:56) problems we have out there adaptation is one of our strengths for sure so we’re not (34:01) going to be able to use our ai to make a much better chair for example we’re not going to be (34:09) able to use ai to make a much better house it might be a little better it might be more efficient (34:13) but it’s not ergonomically designed right
Ergonomically, it might be built quicker, things (34:17) like that, the processes going into it. But over time, certain designs have kind of, you (34:24) know, worked their way out through just natural evolution. Exactly. So AI will be able (34:30) to solve some incredible problems, for example, coming up with new drugs (34:37) far more quickly. It might be able to optimize electric vehicles to get 20 percent better (34:43) range. But it's not going to change everything. It's going to be incremental, because we've gotten (34:50) pretty far with our civilization. And so I think that scenario of it coming up with some new (34:56) thoughts that are so dramatic that they change everything, I don't think that's going to be the (35:02) case. The second one I mentioned before is the legal issues, that companies who produce AI are going to (35:09) have to protect themselves legally. That's the second thing. But on the flip side, if a (35:16) being does something illegal or immoral, we have the justice system to (35:23) protect us and put them out of humanity's way, in jail or whatever. How do you punish AI if it (35:31) does something wrong? Yeah, especially if it doesn't have an influencer of a human sort. Right, (35:41) right. And then, I mean, you would think all the data input into an AI system (35:46) would have the thoughts of a Jeffrey Dahmer and a Ted Kaczynski, too. The manifestos would (35:52) be in there too, wouldn't they? So I'm not saying it would come to the Osama (35:59) bin Laden conclusion, but it's out there to be read, it's out there to be processed. It is, but (36:06) I again believe these companies are going to put guardrails in place. And the other thing is, (36:11) one of the things that I find essential: so I took a look at Isaac Asimov's three (36:16) laws of robotics. Are you familiar? I saw you mentioned that, and that made me think of that (36:21) right away. That's
the three laws. So I'd love to hear you share about those. Yeah, so way back in (36:26) 1950, Isaac Asimov released his book I, Robot, which was a compendium of short robot stories. But even (36:34) he was thinking about this, 75 years ago, and he came up with three laws of robotics. Now remember, (36:40) robots are just AI with arms and legs, right? It's not the arms and legs you're afraid of, it's the (36:46) thought process. And his first one was that a robot may not harm a human being or, through its inaction, (36:52) allow a human being to come to harm. Well, that's pretty sensible. The second was, a robot (36:58) must obey orders given to it by a human being unless it conflicts with the first law. And the (37:05) third one is, the robot must protect its own existence as long as it doesn't conflict with (37:09) the first two. These are classics. I mean, Asimov... and it almost closes that loop, right? (37:15) He tries to close the loop and make sense of it. Yes, he was an incredible genius. He foresaw... he (37:22) wrote so much brilliant... oh my god, I mean, you know, a total genius. But he didn't (37:28) envision the internet. He envisioned AI and robots, and the internet makes things a little (37:34) bit different. He assumed you could identify artificial intelligence by looking at it. Well, (37:39) we can't necessarily do that. So I watched a 30-minute AI news program that had two anchors. (37:47) They cut away to different, like, different cutaways, different locations. The whole thing was (37:53) generated, the entire thing. You could not tell. It looked like it was out of, you know, Australian (37:57) news or something, and you're just completely baffled. It's astonishing what they can do (38:03) already. It is, it is totally astonishing. So I modified, in the book Evolution Ended, his laws (38:11) a little bit, paid complete homage to him, because I have no business even being on the same page as (38:17) him, but times have changed, right? And the first one is... Well, that
that is one thing, you know, (38:25) that is one thing about the technology: we're getting ideas from people that they wouldn't have (38:30) been able to share. No one would have had any idea or chance to get to that idea, because they're in (38:36) some village in some place, and they're just able to click on a keyboard and come up with (38:41) some genius thing. It's amazing. It is, it's amazing. So the first law I basically (38:47) kept, but I said AI may not injure natural living entities, which is a little different, (38:55) or through its inaction allow them to come to harm. But the second one and the third one are (39:01) interesting. The second one has to do with AI must self-identify. We need to know when we're (39:10) talking or dealing with AI. And that metatag, or some kind of piece, a video that has a tag on it (39:17) that we can blockchain throughout, that has some kind of base code in it telling us that it's... (39:22) Right, it's super important that we know who we're dealing with. And YouTube's kind of... (39:28) they haven't addressed that 100 percent. They've done it in a manual way. As you post the video, they ask, (39:34) is there any altered content that's generated? And you click no or yes. So it's not the solution, (39:41) but it at least covers them for a bit, I think. Right, it helps a little bit. You're right, it's (39:46) not the solution. And then we say that AI must obey human instructions, like Asimov, (39:54) and not initiate new tasks without a human command, because we don't want it running off (40:01) and doing its own thing. Now, how you define that is interesting. And then the third (40:07) thing, that's not part of the three laws, is that we need legislation, regardless of the Turing test, to (40:13) allow us to be able to terminate or pause AI. And while it may be a thinking being, it is not (40:21) a being that cannot be reproduced or turned back on. I mean, when you
terminate a human (40:28) or an animal, it's permanent. We have the ability to back up memories and things like that, (40:33) but if AI gets out of hand, we need the ability to turn it off. Interesting. So let me ask you this, (40:42) because it does raise a question about terminating. When you turn it off, you terminate? Are you talking (40:47) about pausing where it would be at, if it started getting out of hand? Because at that (40:52) point we would probably deem it sentient in some way, because we've given it enough powers to do (40:57) some control things, right? Because that's where the concern is. AI can tell me anything it wants on a (41:02) screen. It can tell me that I'm wrong all day. As long as it doesn't have a finger on my switch, (41:08) I'm good, I'm good today, right? It is a performance-enhancing (41:15) piece to our world, you know. I do understand me using it for efficiency's sake, but (41:22) that's where it could get out of hand. I like that you had some slight tweaks to that. Yeah, (41:26) because if we can't turn it off, it's that one-percent nuclear bomb thing you talked about. (41:33) And we need the ability to turn it off. We need the ability to understand that it's not human. (41:40) And we sort of have to decide what gives something or somebody rights. Is it an homage to a million (41:50) years of evolution? Is it the ability to feel pain? Is it their intellectual capability? We have never (41:57) really defined this. Now we're giving animals rights, but what is the (42:03) characteristic that grants some entity rights? We have to figure that out. We really don't (42:10) know. Yeah, it is a really good point, because even if you look at some people who have (42:16) biblical-scripture-type pieces, animals don't have souls, for example, so a lot of people would say (42:21) animals don't have any kind of rights in that way. They're just part of the natural evolution.
(42:27) We are apex predators, and some people think, though, that we are here to create the next thing to take us (42:32) over, to replace us, that that is what our propagation is: to create that thing. I don't necessarily (42:40) agree. I think we are meat blobs that are here to survive, you know what I mean? (42:46) And that doesn't change the fact that we have experience, and it doesn't (42:51) minimize any of the experience of humanity. But if we can look at that and just (42:57) kind of face that fact, maybe we can get on with the other stuff that really matters, like experiencing (43:01) humanity, knowing that we're just meat blobs. Well, you're right. Look at every other... well, when I (43:08) was a kid, I said to my father, you know, what are people? He goes, oh, we're just big animals, right? (43:13) People forget that every other animal on the planet has one purpose, and that's to reproduce. (43:21) The ones who are more successful, their DNA propagates and the species evolves. (43:28) Every other animal on the planet is there to reproduce. Now, interestingly, our birth rates (43:35) are going down in first-world countries. Alarmingly. Alarmingly. And it is due, (43:44) you're talking about, to the birth control and ability to control ourselves, and the cultural (43:49) change of the woman in the workforce. That is not a judgment on it; that is just a thing that (43:55) happened. Right, these are just things to look at. Well, we've lost the reproductive (44:01) imperative. Deep in that midbrain of an animal, its goal is to reproduce, the more the (44:08) better, right? And the ones that don't have the good DNA, that can't do the mating dance, or can't (44:15) bump their chest if they're a chimp, they don't get the mates, and that's how evolution works. (44:21) But all that is gone now. We've lost the reproductive imperative. People can reproduce at will, but (44:28)
they're choosing not to. Yeah, because you're culturally told that children are actually an (44:33) impediment in your life. I mean, really, we are culturally getting softer, in addition (44:39) to the lack of pressures, right? So it's almost an exacerbated kind of situation. But I also think (44:46) it's the lack of pressure, because if you're in the wild... Yeah, absolutely, it's both. It's like both, (44:50) that's what I'm saying. It's a double whammy, it's like a perfect storm of that, (44:54) for sure. I mean, living in America, I assume my son is going to survive. We only had one child. (45:00) But if I was living in an area where we needed our children to build the tribe, (45:06) they needed to help us survive, to help us catch game and cultivate food, you'd want to have as many (45:12) children as you can. We don't need them anymore. That part of the pressure has been relieved. (45:19) And if you want to look at it holistically, we probably have too many people on the planet now. (45:25) We're having a lot of climate-change issues because we have more people than we can support. (45:31) We're generating a lot of carbon, we're using a lot of resources, and I think people are kind of (45:37) feeling this inherently, the same way that when you put animals in a zoo, they don't reproduce, because it's (45:44) not a good environment; somehow they sense it. I think humans are sensing now that the pressure is (45:50) not survival from not having enough people around; it's survival for other reasons. That is (45:58) a very interesting take. I like that. Yeah, and it's interesting because it does become (46:04) ideologically driven in a weird way. You get rural-based homesteaders talking about children, (46:11) family, community, church, outward, and then you've got the other side that talks about, you know, (46:16) government and city and whatever, you know, the other side, whatever that side is, (46:21) right? And I'm gonna go, this is a little
controversial, but, um... That's where we live. (46:28) I live in controversy, trust me. Oh well, this is a little controversial. We look at our political parties, (46:34) and I'm not going to favor one side or the other, but they both have, you know, fabulous (46:40) talking points and representatives. They're both highly faulted. We can agree, yes, (46:45) for sure. But I view it from the evolutionary perspective that the Republicans are more (46:52) survival-of-the-fittest kind of folks, and many of their policies are about: if you (47:00) work hard, you're going to be successful, you're going to survive, and if you don't, you might not (47:03) deserve to, you know, as a philosophy. Yeah, that is a bootstrap philosophy applied: pull yourself up. (47:10) Sure, absolutely. The Democrats are: well, we should support everyone, because their circumstances may (47:18) not allow them to be successful, and we should try to promote equity, which is not a survival-of-the-fittest (47:24) philosophy. Right. And the question, of course, becomes: are we transitioning away from (47:32) survival of the fittest? Is that a more appropriate philosophy, or is the other one the one that will (47:38) take humanity further? I'm not going to judge either way, but there's definitely that look (47:45) to their philosophies, and I just find it very interesting. And when you look at the rural (47:51) component versus the urban component, the rural is more Republican, which is more: we have (48:00) to survive, we're out on this farm, we're in the Midwest. In the cities, everything is there for you. (48:06) You don't have to survive in any fashion; the food store is right there. I went (48:12) hunting for my ribeye steak today. Right, exactly, at the supermarket. So it's interesting to (48:19) think about this, though. Yeah, it's so fascinating. And then you look at it from this (48:23) perspective: you've got one of the largest technological people in Elon Musk, who speaks (48:30) about having
children. So, and then he also has Neuralink. So I look at motivations, I do. I (48:39) look at motivations, I look at why people say what they do. Yuval Noah Harari wrote that book Sapiens. (48:47) I thought it was a great book, but now I'm watching him speak of us as useless (48:52) eaters, and what to do with us, and how the global government is now collapsing and how that's not (48:59) good. And I'm sitting there as just a free person, like, to your point, the frontal cortex in me, (49:05) yeah, it's still developing. I'm pushing 50, it's still developing, okay? I have become more (49:11) libertarian-minded. No one's born libertarian. No, I was a neocon, I was a grade-A neocon, lied to about Iraq (49:19) and all that stuff, and now I've come to this more libertarian mindset of, you know, community out, (49:27) family out, it's local matters, you know, all that. But you have to come to that. You (49:34) come to it through experience; you don't come to it from being told it on a screen. Right. And (49:40) that's what's interesting, to your point: we have to really put ourselves through our own pressures (49:46) in life, and a lot of times we're told to avoid it, right? Like, one of the points I was trying to make (49:50) is, you used to say the strongest person was the one who would dominate. Now it's the weakest, it's (49:56) the one who is the biggest victim, right? The victim mentality has now taken over. So it's a (50:01) complete 180 shift. So do you have concerns for your child and his future? (50:08) Or do you like... you know, what's your percentage of good, bad, or whatever, (50:14) that white pill, black pill, you know, where are you at on that? So first of all, Harari's (50:20) books are great, and if you liked his books, you'll probably love Evolution Ended as well. It (50:26) starts out somewhat similar, but we go a lot more, I think, into the AI world, and (50:35) my book's a little newer than most
of his stuff. Right, right, for sure, probably by 10 years. (50:40) I'd say Sapiens is probably 14. And the world's changed a lot, and the conclusions are (50:46) different, but still, it's that same kind of genre. So do I have concerns about humanity? I (50:54) have grave concerns, but the answer will surprise you: I feel like AI is here just in time to (51:02) save our asses, because the capabilities we lose, we will be able to employ AI to do a lot of (51:11) stuff, make us more efficient in every way, allow us to transcend our personal weaknesses, whether (51:20) it's intellectual or physical or whatever. It will give us the freedom. Anybody can be an artist with (51:27) AI. I have zero art capability, but in a lot of my presentations and talks I use AI-generated art. (51:36) People who can't write a book can now write a book through AI. People who can't analyze a sales (51:43) report can use AI to do it. People who can't build a production plant, a manufacturing plant, could use (51:51) AI to help them do that. So as we lose our capabilities, which to a large extent is caused (51:57) by technology, technology will be there to help us. Now, would I take a guy 200 years in the future and (52:05) put him up against one of us in a game of Jeopardy or some other game and expect him to win? (52:13) No, because he's been dependent on technology his whole life. But I think humanity will survive and (52:19) will actually prosper due to this tremendous technology that we've developed. And as humans, (52:27) there will be less need for many of us. I think our population will eventually diminish to probably (52:32) half of what it is now. We'll be engaged in a lot more fun activities. But what it is to be human, (52:42) the competition, you know, if you've been watching the Olympics, you see this incredible (52:46) need to compete, which is actually to some extent driven by our need to evolve and compete, all (52:54) that. How is that going to continue? How are our drive,
our exploration, all that stuff, going to (53:02) continue in the future? I don't know. It's a fascinating question, because, to your point about AI, (53:10) and I love your outlook on it, it sounds dystopian to hear it like that, (53:16) even though it's not a negative, it's just an outcome, because it's kind of... (53:21) Right, but it also has to do with how our perspective is. Like, well, okay, we're going to (53:26) shrink the population, but I'm like, they're also trying to kill us. I'm watching the (53:30) manufacture of consent. I'm watching them culling us by culling our meat, by culling (53:36) our food sources. They're sabotaging places around the country. It is a scary thing what people don't (53:42) know, what's going on that is not being reported, that is this underlying thing. It's one thing if it (53:49) were to happen, to your point, like the natural evolution of humanity doing that, but I think people (53:56) are pushing for this to happen, and that's the concern, that they're pushing for this (54:00) generation to shrink, instead of it happening naturally. And I think when you push (54:07) it, it's kind of like being an early adopter of a technology, a television: it's not quite ready yet. (54:13) And that is the humanity scare of the time, that we switch it over, because it could be that (54:20) point of one minute too early, one minute too late. But AI is great with resource (54:27) management, pattern recognition. It would have the integration, those types of things, for those (54:34) types of outcomes. But I love that point: you would just need a human to say okay, at least, or (54:39) hit okay, this is right, and then, are you sure? Okay. We need a Windows 98 double-click, (54:44) I guess. Well, think about sexual attractiveness. You know, we were talking about population decline (54:50) and things like that. So in the 1970s and 80s was the sexual revolution, and everything was about sex. (55:02)
People wanted to be as attractive as possible. Look at photographs from back then, look at the disco (55:07) era. You know, dancing was a mating dance back then, right? And now look at modern times: people (55:15) dance with anybody, nobody in particular, they just jump up and down. Look at the way they look now. (55:21) Attractiveness, for a lot of people, is not really a thing. The population is fluid, (55:27) overweight, right? Sexuality is fluid, all the things that we used to think about. Now look at... (55:35) But that is a cultural thing. I think Camille Paglia talked about that, Douglas Murray and Camille (55:40) Paglia talked about that, in both Roman and Greek times: at the end of a culture, (55:45) when a cultural decline is about to happen, those specific cultural things happen, the fluidity, (55:53) the, you know, the kind of just softness of things. My concern is how much technology makes that a (56:01) permanent thing going forward, versus the cyclical thing that I kind of mentioned earlier. But (56:07) please continue. No, I mean, I think you're right. Now I think it's going to be more permanent, (56:12) because again, if people don't feel threatened... These great cultures didn't feel threatened, right? (56:17) So the reproductive imperative wasn't as great for those cultures. But look at (56:25) what's happening in society, that beauty in certain areas is being vilified, right? And you go to Walmart (56:33) and you see the models are now, you know, plus-size, the Sports Illustrated swimsuit model. (56:42) Oh my god, Lizzo getting attacked for losing weight, Adele getting attacked for losing weight and (56:46) getting healthy, right? Healthy is far-right, exercise is far-right, I don't know, you're (56:52) far-right if you exercise or something. I heard that. And that same magazine (56:57) that had, it wasn't Sports Illustrated, but there was another one that had plus-size models, (57:03) it was one of the glamour magazines, and it
says, is competitiveness the new sickness, or the new (57:09) neurotic thing? You know, neuroticism, being competitive. It's like, competition is toxic, right? Having (57:17) a healthy... it's like, it's toxic, right? Right, exactly. And that's not how we survive. I'm not saying that (57:25) we didn't sometimes. There is no judgment on anything, but there are (57:30) net better and net worse ways to live. That's just a thing; that's how life (57:35) works. This isn't a judgment on any of them; you still have your will to choose how you wish. (57:41) Right, exactly. So it's going to be interesting times coming up. It sounds very interesting. Well, (57:50) this has been absolutely fascinating. I'd love to talk with you again, J.J. We've been at it about an hour. (57:56) I can talk more if you like, if you have a couple other things you'd like to talk about, or we (58:01) could, if you want to share all the ways to get in contact with you, we'll call it a time and (58:06) then have another conversation down the road. Sure. Well, the place, if you want to get my book, (58:11) it's on Amazon. It's called Evolution Ended. It has an AI-generated audiobook available, which most (58:18) people tell me sounds pretty real, so if you're interested in trying the AI version, that might be (58:23) fun. That's going to be me, because I don't do reading very well. I would say AI, that's (58:29) one of the greatest things: like, I need a PDF reader that I can drag all the things into, to read (58:34) to me, because I absorb much better with the medium of listening versus reading. (58:40) Grab the audiobook and let me know what you think. What happened was, Amazon said, we can do (58:45) this for you, and they didn't even charge to do it. I pressed the button and I selected the voice, (58:51) I had six options, and 15 minutes later the audiobook was published. Wow. Yeah, I thought it was (59:00) pretty cool. My website is jjjerome.com. I'm on
Twitter, it's jjjerome11, but you can just search (59:07) for jjjerome, and there's a lot of interesting things on Twitter, because I, you know, post (59:14) other people's stuff as well. But there's a lot of interesting insights there, and that's where you (59:19) can find me. Excellent. All right, any parting words before we call it a day? Not at all. I really enjoyed (59:26) the conversation, and it's a brave new world out there. Everybody needs to be aware of (59:32) what's happening. It is a brave new world. I've got so many podcasts upcoming to talk about (59:38) technology and how this is affecting us. So thank you, J.J. J.J. Jerome, Evolution Ended, (59:45) that's the book, everyone, come and check it out. I'm gonna hit end here, but don't go anywhere, J.J., because we'll (59:50) talk afterwards, okay? Thank you so much for joining us. You're now officially part of the Knocked Conscious (59:55) family, sir. Thank you, have a great day. Good night, sweetheart, well, it's time to go. Good night, sweetheart, (1:00:11) well, it's time to go. I hate to leave you, but I really must say good night. Sweetheart, good night.