Join us in the first segment of a captivating three-part series with Brian Roemmele, one of the world’s leading futurists and thought leaders on voice technology.
In this episode, we delve deep into the riveting realms of intelligence and consciousness. We challenge the boundaries of your perception, engaging in a thought-provoking discussion about the implications of AI for our understanding of intelligence and the potential intersections with quantum mechanics. This conversation promises to leave you questioning what you thought you knew. And don't miss Brian's intriguing thoughts on the 'cure for outrage' - a timely exploration of how technology might just hold the key to bridging societal divisions.
Stay tuned for the next episodes where we continue our voyage into the unknown with Brian, extending the conversation into the farthest reaches of future technology and its potential impact on humanity.
-----------
Want to write and publish a book in 30 days? Go to JamesAltucherShow.com/writing to join James' writing intensive!
What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!
------------
Visit Notepd.com to read our idea lists & sign up to create your own!
My new book Skip the Line is out! Make sure you get a copy wherever books are sold!
Join the You Should Run for President 2.0 Facebook Group, where we discuss why you should run for President.
I write about all my podcasts! Check out the full post and learn what I learned at jamesaltucher.com/podcast.
------------
Thank you so much for listening! If you like this episode, please rate, review, and subscribe to “The James Altucher Show” wherever you get your podcasts:
Follow me on Social Media:
------------
[00:00:08] I don't even know what to say. This is just like the smartest podcast episode I've ever
[00:00:16] listened to, and I was the one doing it. But all credit goes to my guest, Brian Roemmele, expert on
[00:00:23] ... He's an expert on everything, but we talk consciousness, intelligence, quantum mechanics,
[00:00:30] AI, outrage, the fact that everyone is outraged and what the cure for that might be.
[00:00:41] This isn't your average business podcast and he's not your average host. This is the James
[00:00:46] Altucher Show. Brian, I've been following you for years, of course, on Twitter. I started
[00:01:01] following you because you would tweet this incredibly crazy stuff. All these things
[00:01:07] from around the world that were just fascinating. I was trying to find ... Some of them were
[00:01:12] years old because I've been following you for years. What were some of the things
[00:01:15] you tweeted that were just ... It was like you searched the world for incredibly interesting
[00:01:20] things that I'd never seen before. They were interesting intellectually and visually and
[00:01:26] then you would tweet them. What's an example? Do you remember?
[00:01:28] Oh my gosh. I would go into a coronary fit right now trying to remember some of them.
[00:01:35] I got to say, first off James, thank you. Such an honor to be here. Deep fan of your
[00:01:42] work. My pleasure. Basically all your books. Big subscriber on
[00:01:46] Paradigm so thank you for that and you've made me some money in stocks so I appreciate
[00:01:52] that. Excellent.
[00:01:54] So there's a method to my madness. I came to Twitter because I had come off of Quora.
[00:02:01] I'd been one of Quora's top writers for a couple of years and I was a bit of a
[00:02:06] Twitter snob. I thought that it was hot takes, it was drive by commenting and it was very
[00:02:12] little commitment intellectually to what people were saying. That's fine. I'm not
[00:02:17] judging that. I just didn't want to participate. So I started creating what I
[00:02:22] feel ultimately now is a human Rorschach test across social media. I just
[00:02:29] throw out ideas and concepts that are intriguing and they're designed to make
[00:02:36] you think. I think what I really try to do most of the time is I'm trying to
[00:02:41] elicit thought. What do you see here? And what's your first reaction and was
[00:02:47] your first reaction right? What's your second reaction? And I just kind of went
[00:02:53] with it. So yeah, there would be all sorts of things. I really like showing
[00:02:58] animal intelligence because I've always been involved in machine
[00:03:02] intelligence. And I think if we don't understand animal intelligence to some
[00:03:06] degree, even primates, right? How are we going to really understand machine
[00:03:11] intelligence and what is intelligence and things of that nature? So Twitter
[00:03:16] kind of became an open market of my ideas. And yeah, I get very eclectic
[00:03:21] at times. Yeah, like what were some of the... I remember some of the animal
[00:03:26] ones. Like what were some things that intrigued you on animal intelligence?
[00:03:29] Yeah. Intrigued you so much that you tweeted about it.
[00:03:31] Oh, okay. So one of the ones that I thought was really quite profound, it's
[00:03:36] sort of my concept of what I call the Kim Peek proof. Kim Peek was Rain Man
[00:03:41] in the Dustin Hoffman movie.
[00:03:44] Oh, that was based on a real person?
[00:03:45] Absolutely. Kim Peek. I actually got to meet him briefly. Incredibly
[00:03:50] beautiful individual. And what we learned about Kim Peek is the limits
[00:03:54] of human capability mentally. He actually was able to recite to you exactly
[00:04:01] what the weather was like the day you were born, what day of the week it
[00:04:05] was, all sorts of trivial things about who was the best baseball player at
[00:04:10] that particular moment in time. What game was being played that day?
[00:04:15] I mean just incredible things. So my concept about any scientific theory
[00:04:22] is that if you cannot account for the outlier in the theory, you really
[00:04:26] don't have a theory. You have sort of a postulation.
[00:04:30] What do you mean? Like I don't understand. So let's say what's an
[00:04:35] example of scientific theory where there's an outlier, like the theory
[00:04:39] of relativity?
[00:04:41] All right. Well, let's keep it with Kim for a second. So how do you explain...
[00:04:45] So Kim had encephalitis, the swelling of the brain, which essentially
[00:04:50] broke down his corpus callosum, that sort of piece that connects your
[00:04:55] right and left hemisphere. It's not widely known exactly why he
[00:05:00] developed savantism, right? A sort of savant syndrome. Kim
[00:05:06] in one of the clips that I put up there, and it wasn't really about Kim.
[00:05:10] It was about primates. It was about this particular primate
[00:05:14] research center in Japan that was showing that a chimpanzee can be
[00:03:18] trained to remember a flash of 45 numbers on a grid of about 100
[00:03:26] cells, and remember the sequence after it flashes on. Now humans
[00:05:31] can't do that. It's like how did that primate know how to do that?
[00:05:35] Why?
[00:05:35] How fast was the flash? How many seconds were you allowed to look at?
[00:05:38] Like half a second, maybe a second at maximum. And some of these guys
[00:05:43] and gals were trained, and some did better than others. But the
[00:05:47] fact that one was able to do it, 100 out of 100 monkeys, the fact
[00:05:52] that one was able to do it fascinated the heck out of me. So I
[00:05:56] started studying the brain more and I said, okay, what did humans
[00:05:59] give up that chimpanzees had that we don't have? Well, it turns
[00:06:04] out it's Broca's and Wernicke's areas, or the phonological loop. So we
[00:06:09] gave up short-term memory to be able to do what we're doing
[00:06:12] right now is having a conversation. So I started really diving
[00:06:17] into and again, this is all based on computer science, ultimately.
[00:06:22] But I started diving in what is human creativity and human
[00:06:25] memory? Well, you have the right hemisphere technically. And
[00:06:28] again, some of this is symbolic talk, I'm not trying to
[00:06:31] talk in, you know, the sense of where neurobiology and
[00:06:35] neuropsychology is at this moment. But you have the right
[00:06:39] hemisphere, which is your creative thought process. And
[00:06:42] let's call that a bunch of nebulous clouds that you have to
[00:06:45] assemble to the left hemisphere, which is a serial chain of
[00:06:50] thought that is only so long. Once you lose that chain of
[00:06:54] thought, the idea sort of dissipates and you get
[00:06:57] frustrated. And I see this with creative people, song-
[00:07:01] writers, writers, you know, book writers, you know, people
[00:07:05] like yourself that are just trying to come up with that
[00:07:07] idea, and it just dissipates. Where did it come from? Where
[00:07:10] did it go? So it came from potentially the right
[00:07:13] hemisphere. And then the process of mechanicalizing it. Now
[00:07:18] mechanicalizing it to speech is easier because it flows out
[00:07:21] of us, we're not translating the inner monologue,
[00:07:25] that voice we hear when we read, or when we type sometimes.
[00:07:30] But when we have to mechanicalize it to a keyboard
[00:07:33] or writing it on a piece of paper, there's a mechanical
[00:07:37] slowdown in that process. So the flow is broken. And
[00:07:41] sometimes ideas and thoughts dissipate. So we went from
[00:07:46] 10 fingers to thumb clawing on a glass screen. And so what
[00:07:51] happens when you do that? Well, the flow of ideas slows
[00:07:55] down significantly. So much so that I believe that it is
[00:08:00] one of the underlying premises for hostility that
[00:08:03] you see in social media. I think the inability to emote
[00:08:07] correctly by using all of your functionality, your
[00:08:11] micro movements and your face, my hand movements, all of
[00:08:15] this frustrates people they can't communicate correctly
[00:08:18] because we're communicating through an archival system.
[00:08:21] Right? So think about this James typing was designed for
[00:08:25] archival purposes more than it was for communication
[00:08:28] purposes.
[00:08:29] Right. And by archival, you mean I would type
[00:08:32] something so I could store it so other people could read it later?
[00:08:34] Exactly. And right now we're kind of archiving
[00:08:38] in real time to each other. We're like archiving directly
[00:08:41] to each other.
[00:08:42] And the problem is, about 80% of what humans use to
[00:08:49] communicate is being in a physical presence or doing what
[00:08:52] we're doing with the video here video definitely helps.
[00:08:56] But it doesn't complete everything because there's
[00:08:58] nuances you pick up when you're physically present
[00:09:00] with somebody that you don't get otherwise. And I think
[00:09:03] those nuances are totally gone when we're throwing
[00:09:07] emojis at each other because emojis are completely
[00:09:10] open to interpretation. There are at least two
[00:09:13] interpretations to them. One is the direct and one is
[00:09:15] the ironic funny version of that emoji. And none
[00:09:20] of it is really useful. So we've kind of
[00:09:23] de-evolved in our communication. So that
[00:09:26] gets back to the monkey scenario right? Okay.
[00:09:29] And we'll kind of go back to Kim Peek now. So if the
[00:09:32] brain in fact can store that much information... if
[00:09:36] our listeners remember Rain Man, Kim Peek was
[00:09:39] better than Dustin Hoffman's presentation of Rain
[00:09:43] Man. He was absolutely phenomenal. And his social
[00:09:46] skills were actually quite a few degrees higher.
[00:09:50] He did have some of the communication issues
[00:09:53] that savants have. So what we have now is we
[00:09:57] have an outlier. We have somebody that can hold
[00:10:00] almost everything they've ever seen and heard. So
[00:10:03] it's in the brain. And that's a question I've
[00:10:06] had ever since I was a child: how much is
[00:10:09] really stored in the brain? And how does it stay
[00:10:14] there? And where in the brain is it? Because we know
[00:10:16] holographically, we can't pull out a single
[00:10:19] memory out of somebody's brain. We wish we could
[00:10:22] right? You can't laser cut. Let's take that
[00:10:25] little unpleasant memory out of your brain. It's
[00:10:27] holographically stored. In fact, I later found out
[00:10:29] through Candace Pert's work, Molecules of Emotion,
[00:10:33] is that your memories are stored throughout your
[00:10:36] body through a neuropeptide memory system, a
[00:10:39] chemical memory system. And that's why when we
[00:10:42] get heartbroken, right? Think about that hurt
[00:10:45] that love that this tore you apart. Do you
[00:10:49] feel it here? You probably feel it here or
[00:10:52] your gut. Right? So your heart, yeah, right?
[00:10:56] There's more neurochemicals, more
[00:10:59] serotonin in the gut than in the brain.
[00:11:01] And that makes evolutionary sense. That was for us
[00:11:03] to remember this food good, this food bad,
[00:11:07] right? Because we had to have a way. Humans are
[00:11:11] born naked, right? We have to be
[00:11:13] taught for the first eight years how to get
[00:11:16] through this world or we're going to die.
[00:11:18] Right? That's just the realities of it. And
[00:11:20] our brain is as big as it's going to get,
[00:11:23] you know, without killing moms at a high
[00:11:27] rate. Right? Essentially, the containment
[00:11:30] of the human skull is based upon how wide
[00:11:34] the hips can be displaced on a female before
[00:11:38] it's impossible for them to walk. I mean,
[00:11:40] that's really the bottom line. And yet there's
[00:11:42] trillions of possible bytes that could be
[00:11:44] stored in the brain, right? Because every
[00:11:46] possible way to connect the neurons in
[00:11:48] your brain and in your body could be
[00:11:51] another memory, another thought,
[00:11:53] another way of awakening a memory.
[00:11:55] Exactly. So if you're dealing with somebody
[00:11:57] like for a stroke patient or somebody with
[00:11:59] dementia or Alzheimer's, you can usually
[00:12:01] bring out more of a response
[00:12:05] if you give them something to smell
[00:12:07] that brings back a memory. I like to use
[00:12:10] the apple pie concept. Go back and think
[00:12:13] about the first time you've ever smelled
[00:12:15] apple pie. Hopefully your mom or
[00:12:17] your caregiver baked it, and the first time it
[00:12:19] was in front of you, hot apple pie, fresh baked.
[00:12:22] What does that smell like? And already a
[00:12:24] lot of things start happening. For most of
[00:12:26] us, the appetite starts elevating.
[00:12:29] All right. So what's going on there?
[00:12:30] It's setting off a whole series of memories.
[00:12:34] Now, if you've never tasted apple pie or
[00:12:36] anything that might feel like that,
[00:12:39] you're not going to have that response.
[00:12:41] So the memory is encoded all over the
[00:12:44] place. And to me, that's fascinating
[00:12:48] because we're trying to create artificial
[00:12:50] intelligence, but we're not fully
[00:12:52] understanding human intelligence.
[00:12:54] We're not understanding human consciousness
[00:12:57] because really it's not intelligence as
[00:12:58] much as consciousness that we really
[00:13:01] need to kind of get to the bottom of
[00:13:03] what is consciousness? Where is it?
[00:13:06] Where does it come from? Where does it go?
[00:13:08] Tell me more about, like, how is Kim
[00:13:10] Peek the outlier of a theory?
[00:13:12] So it's a great question, James.
[00:13:14] So basically if you essentially
[00:13:16] say that the human brain can only
[00:13:19] hold so much memory and then
[00:13:22] it stops or you don't remember
[00:13:26] everything that has ever been done to
[00:13:28] you because you can't recall it.
[00:13:31] I call bunk on that.
[00:13:33] I believe that the human sense organs
[00:13:37] record basically everything.
[00:13:39] Now, I'm a really big fan of
[00:13:42] Tor Nørretranders' The User Illusion.
[00:13:45] And a lot of my work is based upon
[00:13:47] not just his body of work.
[00:13:49] I tell everybody to read this book.
[00:13:52] It will absolutely profoundly change
[00:13:54] you on everything you think.
[00:13:56] So say the book again.
[00:13:57] Tor Nørretranders' The User Illusion.
[00:14:00] So James, I was such a nerd.
[00:14:02] I followed this guy during his book
[00:14:04] tour. I went to a couple of his
[00:14:05] signings because I knew some of the
[00:14:07] work by Lashley and others that
[00:14:10] were trying to understand where
[00:14:12] consciousness was in the brain.
[00:14:14] And that came from a whole
[00:14:17] series of weird
[00:14:19] coincidences in my life growing up
[00:14:21] near Princeton and spending a lot of
[00:14:23] time at Princeton University as a
[00:14:25] child. Where did you grow up near
[00:14:27] Princeton? Well, I was born in
[00:14:29] Newark. I spent my young
[00:14:33] life in Carteret, the armpit of
[00:14:34] New Jersey. And then I wound up
[00:14:37] in Flemington. And then from
[00:14:39] Flemington, I spent a lot of time
[00:14:40] in Princeton. I had friends whose
[00:14:42] parents worked at the university,
[00:14:44] the Institute for Advanced Study,
[00:14:46] David Sarnoff Research Center and
[00:14:48] Bell Laboratories. So, you know,
[00:14:51] I'm talking about 60s and 70s
[00:14:54] because I'm an old man. And that
[00:14:55] was the prime time of Bell
[00:14:58] Laboratories. Bell Laboratories
[00:15:00] had so many incredible thinkers who
[00:15:03] were paid just to think.
[00:15:05] They'd walk around barefoot,
[00:15:07] in their pajamas way before it was
[00:15:09] cool. No shaving for two or three
[00:15:11] weeks. You would essentially think
[00:15:13] they're homeless and nobody knew
[00:15:15] what they were thinking about, but
[00:15:16] they were paid to be there. Same
[00:15:17] thing with the ins...
[00:15:18] People think I'm homeless even
[00:15:20] though I try not to be.
[00:15:22] Don't change, man. Yeah, this is
[00:15:25] not my normal look either. But so
[00:15:27] anyway, in that environment, I
[00:15:30] started to really feel that we
[00:15:32] need to understand consciousness
[00:15:35] at a very young age. And I was a
[00:15:36] rationalist. I thought I was going
[00:15:37] to be a physicist growing up.
[00:15:39] That's why I love Princeton so much
[00:15:41] and mostly quantum physics is
[00:15:44] where I wanted to be. But also
[00:15:46] planetary science had a lot of
[00:15:48] influence on me, from Carl Sagan, obviously,
[00:15:50] and people of that epoch.
[00:15:54] But I started realizing that
[00:15:56] quantum physics didn't give us
[00:15:58] answers. It gave us actually a lot
[00:15:59] more questions. And the thing
[00:16:02] that really made me very angry
[00:16:04] about quantum physics is the
[00:16:06] observer aspect of it. I thought
[00:16:09] that that had to be the most
[00:16:11] astrological explanation of
[00:16:14] reality that could ever happen.
[00:16:15] So I rejected it. So I got
[00:16:17] involved in business and
[00:16:20] in computers and stuff like that
[00:16:22] because I can control the
[00:16:23] rationality within computer
[00:16:26] language. That was way
[00:16:27] irrational, but it did not
[00:16:29] stop me from continuing my
[00:16:32] curiosity. So quantum physics
[00:16:35] led me into trying to understand
[00:16:36] the brain and to understand the
[00:16:38] observer. Why the heck does an
[00:16:41] observation change a scientific
[00:16:44] test? Now, some people say, well,
[00:16:46] it's not really like that, Brian,
[00:16:47] you don't know. But ultimately,
[00:16:49] it is, whether it's photons or
[00:16:51] something. The test is changing
[00:16:54] via the observation. And then
[00:16:57] entanglement was the next
[00:16:58] thing that completely freaked me
[00:17:00] out. How could matter
[00:17:02] be entangled? And how could that
[00:17:04] matter communicate,
[00:17:06] especially in the
[00:17:08] 70s, apparently across
[00:17:11] light years of distance
[00:17:13] instantly? How is that possible?
[00:17:15] Do we have a correct theory to
[00:17:16] explain that? No, we don't. We
[00:17:18] didn't then and we don't now.
[00:17:20] So that puts you to the edge.
[00:17:23] If you are an empirical
[00:17:24] scientist, you got to go to
[00:17:26] the edge and look over and say,
[00:17:28] you guys, gals, we don't really
[00:17:30] have a theory. We have
[00:17:32] the best possible prediction.
[00:17:34] And that's kind of the thing.
[00:17:52] If you think about it, we don't
[00:17:53] really have a theory about
[00:17:54] anything. Exactly.
[00:17:55] So, you know, you
[00:17:56] think of, like, Stephen Hawking's
[00:17:58] theory of everything, you
[00:17:59] know, A Brief History of
[00:18:00] Time. We don't really know
[00:18:02] anything, because quantum
[00:18:04] mechanics is the basic building
[00:18:06] block of all physics, all
[00:18:07] reality. And we just don't
[00:18:09] understand how it works.
[00:18:10] I mean, there are
[00:18:12] particles even called strange
[00:18:13] quarks because it's just so
[00:18:14] strange. We don't even have
[00:18:16] a clue how it works.
[00:18:17] And then the other big important
[00:18:18] thing consciousness, which you
[00:18:20] brought up, we don't have a real
[00:18:21] theory of that. We have
[00:18:23] no better understanding of that
[00:18:24] than we had 5000 years ago.
[00:18:26] For all we know, they had
[00:18:27] a better understanding 5000
[00:18:28] years ago than we have now.
[00:18:30] And so now we're trying to
[00:18:31] break it down using science
[00:18:33] that I'm sure is primitive
[00:18:35] compared to what's actually
[00:18:36] needed to understand these
[00:18:37] things. And we're trying to
[00:18:38] make some sense of it. You
[00:18:40] know, theoreticians have their
[00:18:41] ideas and experimentalists have
[00:18:42] their ideas. But none of it
[00:18:44] really works. We haven't
[00:18:46] concluded anything. You know,
[00:18:48] yes, we can make, as you
[00:18:51] know, the movie this weekend
[00:18:52] was Oppenheimer, we can make a
[00:18:54] nuclear bomb. We can make a
[00:18:55] rocket ship. We can make a
[00:18:56] computer. But these are
[00:18:58] all mechanical things, like
[00:19:01] you put these tools together
[00:19:02] and it works according to
[00:19:05] Newton's laws of physics. But
[00:19:07] you know, like, we're
[00:19:09] decades away, probably, from real
[00:19:10] quantum computing. And again,
[00:19:11] I'd like your
[00:19:13] opinion on this, we're probably
[00:19:14] decades away from any real
[00:19:16] intelligence in artificial
[00:19:18] intelligence. Right now it's
[00:19:19] like these massive statistical
[00:19:20] models. But we don't have
[00:19:23] like you're beginning to
[00:19:24] allude to, with the brain
[00:19:26] being different from...
[00:19:27] Exactly, James, brilliant. You
[00:19:29] know, so let's look at it
[00:19:31] from a perspective of, you
[00:19:34] know, scientific explanation
[00:19:36] from a chemical and
[00:19:37] electronic and electrical
[00:19:39] standpoint within
[00:19:40] the brain. We are not even
[00:19:42] sure what particular chemicals
[00:19:45] do. And again, we can go to the
[00:19:46] very basics. You know, a lot
[00:19:48] of people believe that the
[00:19:51] science of chemistry is fully
[00:19:52] understood. And certainly it
[00:19:55] must be fully understood
[00:19:56] within a human body. But you
[00:19:58] change just a couple of
[00:19:59] different chemicals in
[00:20:00] somebody, they're going to
[00:20:02] think dramatically different.
[00:20:04] And we're seeing it play
[00:20:05] out as a real time
[00:20:06] experiment in our
[00:20:08] environment. We're throwing
[00:20:10] chemicals out there. We don't
[00:20:12] know what the neurotropic
[00:20:13] effects are on people. We do
[00:20:15] know that there are
[00:20:17] neurotropic effects, but we
[00:20:18] don't know what the
[00:20:19] grand neurotropic effects
[00:20:20] are until we come back in two or
[00:20:22] three hundred years and say those
[00:20:23] crazy people I can't believe
[00:20:25] they did that. And that's
[00:20:26] coming. If we live
[00:20:28] long enough, you and I, to be
[00:20:29] like four or five hundred years, I
[00:20:30] don't know. We're going to
[00:20:31] look back and say, Oh, yeah,
[00:20:33] that was a big debate, but
[00:20:34] it was a debate about
[00:20:35] politics and philosophy that had
[00:20:37] nothing to do with
[00:20:38] chemicals. You know, and
[00:20:40] that's kind of the problem is
[00:20:42] chemicals define
[00:20:45] your feeling and your
[00:20:47] construct of the world. Right?
[00:20:50] So if you're chemically off
[00:20:51] just a little bit, your view
[00:20:54] of the world is
[00:20:55] dramatically changed. It's
[00:20:56] a very, very fine tuning.
[00:20:58] And, you know, ancients
[00:21:00] understood the pineal gland
[00:21:01] as being vitally important,
[00:21:04] so important that the
[00:21:05] pinecone became such a big
[00:21:08] thing all the way to Sumerian
[00:21:10] times, all the way to the
[00:21:11] Pope's staff as a pinecone
[00:21:14] under Jesus' feet, all the
[00:21:17] way down to Sumerians, you
[00:21:19] know, they're holding
[00:21:19] pine cones. The pine cone
[00:21:21] represents the pineal gland
[00:21:23] that symbolism. Why does it
[00:21:25] represent that? How did they even
[00:21:26] know about that gland in
[00:21:28] Sumerian times? Unfortunately, when
[00:21:30] you brutally break somebody's
[00:21:32] brain open, the pineal gland
[00:21:34] looks like a pinecone
[00:21:36] almost precisely. Really? Yeah.
[00:21:38] Yeah. And it's not
[00:21:40] a guess. This is not us
[00:21:42] philosophizing. They actually
[00:21:44] pointed it out. The
[00:21:45] Egyptians actually talked
[00:21:47] about, you know, the seat
[00:21:49] of consciousness,
[00:21:50] essentially, was within
[00:21:52] the pine cone or the
[00:21:54] pineal gland.
[00:21:55] And why did they think
[00:21:57] that? Like, why did they
[00:21:58] have a sense that that was
[00:21:59] the seat of consciousness?
[00:22:00] Well, they poked
[00:22:03] things into it
[00:22:04] through the eye sockets
[00:22:06] while people were alive.
[00:22:07] They cut open skulls.
[00:22:09] In South America, we can see
[00:22:10] a lot of people that had
[00:22:12] their skulls cut open and
[00:22:13] lived, you know, in
[00:22:15] ancient times. And they
[00:22:18] believe that.
[00:22:21] We don't know why.
[00:22:22] We know that maybe,
[00:22:24] at the very basic level,
[00:22:26] we can speculate that
[00:22:28] proto-doctors were
[00:22:30] experimenting with people's
[00:22:31] brains. I don't know how
[00:22:32] they got people to agree with
[00:22:33] it. Hey, buddy, let me cut you
[00:22:35] open, you know. Well, they
[00:22:36] could have been tortured or
[00:22:37] enslaved or whatever.
[00:22:38] It could have been but why
[00:22:39] they why they fixated on
[00:22:41] that gland is always
[00:22:44] interesting. We can go to
[00:22:46] certainly the Indus Valley,
[00:22:48] you know, India and
[00:22:50] things of that nature, you know,
[00:22:51] the bindi is a
[00:22:52] representation of the third
[00:22:54] eye. We can go to, you
[00:22:56] know, parts of Asia.
[00:22:58] Certainly the Greeks talked
[00:23:00] about it.
[00:23:01] All of this stuff plays out
[00:23:03] to the pineal gland.
[00:23:05] We don't fully understand it.
[00:23:07] Why? Well, all right, we
[00:23:08] call it a third eye.
[00:23:09] Here's what's interesting.
[00:23:11] There's retinal tissue within
[00:23:12] the pineal gland.
[00:23:14] Now, we only found out
[00:23:16] about 100 years ago
[00:23:18] that there's full-on retinal
[00:23:20] tissue in the pineal gland.
[00:23:21] So.
[00:23:23] Hold it, what's going on?
[00:23:24] Thousands of years ago,
[00:23:26] they called it a third eye,
[00:23:27] but it is in fact an eye
[00:23:29] embedded essentially
[00:23:30] within the center of your brain.
[00:23:31] And it's not even a part of your
[00:23:32] brain, it's more of
[00:23:34] an organ of its own.
[00:23:35] What does that mean, though,
[00:23:36] that there's retinal tissue?
[00:23:37] Like does it transmit
[00:23:40] information to the eyes?
[00:23:41] No, we see things with it.
[00:23:43] It is an eye in and of itself.
[00:23:47] What's it looking at?
[00:23:49] Photochromatic
[00:23:51] discharges of light within
[00:23:53] the fluid of the pineal gland
[00:23:55] is a best guess.
[00:23:55] I can tell you there are
[00:23:57] photochromatic and pressure-
[00:24:00] chromatic types of crystalline
[00:24:02] structures that are in the fluid
[00:24:05] within. But what does that mean?
[00:24:06] I don't know.
[00:24:07] For an evolutionary reason,
[00:24:08] like why do we need to look at
[00:24:10] that? No idea other than to
[00:24:11] look at ancient history.
[00:24:13] This is what fascinates me
[00:24:15] about the frontier of science.
[00:24:17] James is like,
[00:23:19] why aren't we looking into that
[00:23:20] more, you know,
[00:23:22] ironically here?
[00:23:24] What does it represent?
[00:23:26] So if you were to ask most
[00:23:29] brain researchers, they'd say that
[00:23:31] the pineal gland is sort of a
[00:23:32] vestigial organ:
[00:23:35] when we were lowly reptiles,
[00:23:37] it may have served some function
[00:23:39] mediating your day-
[00:23:41] night cycle.
[00:24:42] We do know melatonin is released
[00:24:44] when the pineal gland
[00:24:46] is entering into a sort
[00:24:48] of darker maybe twilight
[00:24:51] type of stage and that people
[00:24:53] with certain melatonin
[00:24:55] imbalances tend to have a problem
[00:24:57] with pineal gland calcification,
[00:25:00] things of that nature.
[00:25:01] I don't really talk about this much
[00:25:03] because it can really go down
[00:25:04] a deep rabbit hole because we're
[00:25:06] looking at ancient
[00:25:09] principles
[00:25:10] and they're using the crude
[00:25:12] language of their epoch
[00:25:14] and we're trying to compare it
[00:25:15] to our scientific language
[00:25:17] of this epoch.
[00:25:18] So whenever we're looking at
[00:25:20] explanations in antiquity,
[00:25:23] they can only explain
[00:25:24] relative to what they knew
[00:25:26] at that period in that epoch.
[00:25:28] Right.
[00:25:28] And so to us, it looks crude.
[00:25:30] It looks childlike.
[00:25:31] It looks uninformed,
[00:25:33] but it may be very well
[00:25:35] informed, and it may not be crude at all.
[00:25:38] It's just that the wording,
[00:25:39] to us, seems to be very
[00:25:42] nonscientific.
[00:25:43] They certainly knew that the pineal
[00:25:45] gland meant something.
[00:25:47] Going back, winding it back
[00:25:49] to Kim Peek.
[00:25:50] Are you suggesting that
[00:25:52] as humans evolved to have things
[00:25:55] like speech and longer term memory,
[00:25:57] we gave up short term memory?
[00:26:00] So when Kim Peek suffered from
[00:26:02] encephalitis, for instance,
[00:26:04] he lost some abilities to socialize
[00:26:07] and communicate and perhaps
[00:26:08] have long term memory.
[00:26:10] But in return, the savantism aspect:
[00:26:12] he gained this enormous,
[00:26:14] enormous short term memory.
[00:26:15] Exactly. Or long term, rather:
[00:26:20] he got long term memory,
[00:26:20] because he could remember facts from, like,
[00:26:22] one hundred years ago.
[00:26:23] Facts that we would never even digest.
[00:26:25] We would not remember the weather
[00:26:28] on the day that somebody was born.
[00:26:31] I mean, and we're talking anybody:
[00:26:33] you just tell him, you know,
[00:26:35] your year and day,
[00:26:37] and he'll tell you the day of the week
[00:26:39] and what the weather was like
[00:26:40] in your part of the country.
[00:26:41] That's how finite it was,
[00:26:42] how he knew that.
[00:26:44] He read a lot.
[00:26:45] He certainly read some books more
[00:26:47] than others: trivia books about sports.
[00:26:49] He was incredible about biblical things.
[00:26:52] He made some insights that are absolutely,
[00:26:55] I think, prophetic in my view.
[00:26:57] You know, he well, he was born
[00:26:59] a Mormon in Utah and
[00:27:02] he saw the Bible and
[00:27:04] other Mormon scriptures
[00:27:06] as being very profound.
[00:27:08] He saw the
[00:27:10] holy books
[00:27:12] very much as a method
[00:27:15] for aligning your life
[00:27:17] with the tragedies of life.
[00:27:19] I'm subtexting a lot of this here.
[00:27:23] He said that they were about much more
[00:27:25] than God and religion.
[00:27:27] And he said that they certainly were.
[00:27:29] He absolutely believed that there was a God.
[00:27:32] There was no question.
[00:27:33] He thought it was ridiculous
[00:27:34] that we were even questioning it.
[00:27:37] You know, he said, how could you?
[00:27:40] He wouldn't even have a talk about it.
[00:27:41] And it wasn't programming.
[00:27:42] It was just his insight
[00:27:44] from, you know, enveloping so much knowledge.
[00:27:47] And I find the same is true with AI.
[00:27:50] It's very interesting.
[00:27:52] And AI is living on our language,
[00:27:54] a human construct, right?
[00:27:55] And we can jump into that,
[00:27:57] getting out of this rabbit hole.
[00:27:58] I took you into it, so sorry.
[00:28:00] But no, so Kim
[00:28:03] had some insights about that.
[00:28:04] And he said these books were guides
[00:28:08] so that you could live a better life.
[00:28:10] And he talked about the life
[00:28:11] that's going to take place
[00:28:13] after he leaves.
[00:28:14] And he knew he was not going to be around long.
[00:28:16] And this was, you know,
[00:28:19] 15 years before he passed away.
[00:28:22] He just knew, and he knew
[00:28:24] he was on borrowed time.
[00:28:25] The doctor wanted to put him away
[00:28:27] in an institution and lobotomize him
[00:28:30] for his own safety.
[00:28:32] And his parents said no.
[00:28:34] And, you know, you can look up Kim Peek
[00:28:37] making a comment about the doctor
[00:28:39] wanted to go out and play golf.
[00:28:41] So he gave the parents three minutes:
[00:28:43] lobotomize him and institutionalize him,
[00:28:45] I've got a game to go play.
[00:28:47] And Kim just jokes about it.
[00:28:48] That's how funny he was.
[00:28:50] He had enough knowledge
[00:28:53] and humanity connection
[00:28:55] that he could make a joke out of it.
[00:28:57] Ah, go play golf, doc, you know.
[00:28:59] And and he laughs, you know.
[00:29:01] So he had an incredible sense of humor,
[00:29:03] much more than
[00:29:04] the humor that Dustin Hoffman had.
[00:29:07] And Dustin played an interesting character,
[00:29:10] but it wasn't really Kim,
[00:29:12] but it was based on him.
[00:29:13] So what he basically said
[00:29:15] is that humanity is going to
[00:29:19] regress into sort of a cesspool.
[00:29:22] He used the word cesspool
[00:29:24] because there were no guides to keep you
[00:29:27] doing the right things, you know.
[00:29:30] If you're only answering to yourself,
[00:29:32] then essentially you're going to keep
[00:29:34] compromising yourself until you pull yourself
[00:29:37] down to the lowest possible element.
[00:29:39] And that was his insight.
[00:29:57] This is interesting because you look at it
[00:29:58] from what's going on in society
[00:30:00] and it is almost like a contradiction
[00:30:02] because right now our innovations
[00:30:06] per decade are amazing.
[00:30:08] Like in just in the past decade,
[00:30:10] we have so much innovation
[00:30:12] in genomics, which is actually editing DNA
[00:30:15] in innovation and AI, innovation in
[00:30:20] 3D printing, automation, space travel, computers.
[00:30:24] It's amazing. And yet at the same time,
[00:30:27] I'm willing to bet that the actual
[00:30:29] IQs of people five thousand years ago
[00:30:32] were higher than they are now,
[00:30:34] just because we get to just sit in front
[00:30:36] of a TV and zone out for,
[00:30:39] you know, on average six hours a day,
[00:30:41] you know, at least in the U.S.
[00:30:43] And they didn't do that.
[00:30:44] They had to be aware of every single
[00:30:47] plant, animal, tribe, worm, mosquito,
[00:30:53] hut, whatever, in a five mile radius
[00:30:55] of wherever they were.
[00:30:56] And they were constantly moving around.
[00:30:58] They had to constantly think
[00:30:59] because their lives depended on it
[00:31:02] and our lives just don't.
[00:31:03] And James, incredible insight
[00:31:05] because basically by reducing us
[00:31:08] to essentially being the ultimate machine,
[00:31:12] right? Because let's look at humanity.
[00:31:15] Humanity is basically tool builder
[00:31:17] and storyteller.
[00:31:18] That's our existence.
[00:31:19] We're two things: tool builders and storytellers.
[00:31:22] If we took that away,
[00:31:22] we basically wouldn't exist.
[00:31:25] And everybody has to become a storyteller.
[00:31:28] That is the reality of life.
[00:31:30] We have to we have to become a salesperson, right?
[00:31:33] We have to convince people.
[00:31:34] And this is what's really funny
[00:31:36] is that we deluded ourselves
[00:31:37] into this sort of STEM thing
[00:31:39] that we can kind of go off and be so abstract
[00:31:42] that we don't need to be a storyteller
[00:31:44] or even tool builder.
[00:31:45] We can just sort of intellectualize it
[00:31:47] and the Greeks failed at that.
[00:31:49] And by the way, I just want to,
[00:31:51] and I'm sorry to interrupt,
[00:31:52] and probably listeners will criticize me,
[00:31:54] but I just want to explain like we had to evolve
[00:31:57] extreme storytelling ability
[00:31:58] because we're relatively weak animals
[00:32:01] compared to even chimpanzees.
[00:32:03] And so, in order to survive,
[00:32:04] we had to communicate to our kids,
[00:32:06] hey, don't go over the mountain
[00:32:09] because we knew that's where the lions were.
[00:32:10] We had to tell them
[00:32:12] that's where Satan is.
[00:32:14] And we have to tell a whole story
[00:32:15] about Satan and monsters
[00:32:17] and we had to develop storytelling
[00:32:18] ability to basically help the next generation live longer.
[00:32:21] Absolutely.
[00:32:22] Because they wouldn't live as long without stories.
[00:32:24] And then, of course,
[00:32:26] we could build tools to help us eat
[00:32:28] because again, we're relatively weak.
[00:32:30] We can't just like kill something and eat it.
[00:32:33] It's hard for us to kill.
[00:32:35] It's hard for us to cook.
[00:32:36] Like, we had to develop the ability
[00:32:38] to build tools so we could plant things
[00:32:41] and do other things. Absolutely, James.
[00:32:43] So the storytelling is usually
[00:32:45] through allegorical stories, right?
[00:32:48] Let's imagine we didn't have a writing system.
[00:32:51] We only had a communication system,
[00:32:53] which is like 99% of human existence.
[00:32:56] We had no writing system.
[00:32:57] We had a communication system.
[00:32:59] And so the way you communicate
[00:33:01] is through bold allegorical stories,
[00:33:03] but we needed to record it.
[00:33:05] So how do we first record our allegorical stories?
[00:33:08] We looked up.
[00:33:09] We looked up at the unchanging sky
[00:33:12] and these constellations became allegorical stories
[00:33:17] to our youth
[00:33:18] and our younger folks that would pass it on.
[00:33:21] So our first books were the stars.
[00:33:26] So when we looked up, we would say,
[00:33:27] oh, that's the lion and the story of the great lion
[00:33:30] that overcame, you know, and these stories.
[00:33:33] Joseph Campbell, The Hero with a Thousand Faces,
[00:33:36] and the hero's journey and all of that.
[00:33:39] Joseph Campbell cataloged all of these allegorical stories
[00:33:44] and he came to the same conclusion,
[00:33:46] and I'm certain, I'm sure you do too,
[00:33:48] which is that all of these allegorical stories
[00:33:50] are almost precisely the same.
[00:33:52] There's about 20 of them
[00:33:54] and they're all about the hero's journey,
[00:33:56] the monomyth of what we're all going through.
[00:33:59] And we go through many of them.
[00:34:00] We go through the macro hero's journey
[00:34:02] from life to death.
[00:34:05] And then we go to these sort of micro cycles
[00:34:09] within a day or a week, you know,
[00:34:12] that we're going on the, you know,
[00:34:14] the abyss and coming back from the abyss
[00:34:17] and things of that nature.
[00:34:18] But all of these stories had that pattern to them
[00:34:21] and they're Jungian patterns, right?
[00:34:23] From Carl Jung, essentially.
[00:34:25] Jung took the inner space of the brain
[00:34:29] and the mind, which is what AI does, right?
[00:34:33] Does the same thing.
[00:34:34] So AI is going to see the same Jungian patterns
[00:34:37] and it says, wow, every single movie,
[00:34:40] every single song, every single book,
[00:34:43] ultimately everything that we're attracted to
[00:34:46] is on a monomyth, is on a hero's journey.
[00:34:49] There is a bold allegorical story that's a part of it.
[00:34:52] And we know that all movies are formulas.
[00:34:55] If we break that formula big enough,
[00:34:57] it won't get funded because producers
[00:34:59] won't even put it out there.
[00:35:00] And this is why some really esoteric movies
[00:35:03] just don't resonate because we're looking for the pattern
[00:35:07] because part of human intelligence
[00:35:10] is pattern recognition and pattern matching.
[00:35:13] Did I see that before?
[00:35:15] Has anybody seen that before?
[00:35:17] Can somebody see that before?
[00:35:19] Please help us because we're looking for that pattern.
[00:35:22] We're desperate for it because our very
[00:35:25] existence depends on trying to grok and decode
[00:35:29] the unknown that's in front of us.
[00:35:31] We're the only animals that we know of on this planet,
[00:35:34] maybe off-world after yesterday,
[00:35:38] that are in constant search of decoding the pattern.
[00:35:42] And a lot of times if you're fast at decoding pattern,
[00:35:46] you're considered intelligent.
[00:35:48] So if I'm fast at adding numbers together,
[00:35:51] is that intelligence?
[00:35:53] Well, computers can do numbers added together
[00:35:56] even better than Kim Peek.
[00:35:57] Kim was great at adding numbers together.
[00:35:59] 15 digit numbers.
[00:36:01] I think the largest one I heard him do is 45 digits.
[00:36:04] Added together, multiplied, boom, done.
[00:36:07] But a computer can do it better.
[00:36:09] Does that make us inept?
[00:36:11] Does that make us cower?
[00:36:13] No, the computer is better at adding those numbers together.
[00:36:16] Now, can computers do other things faster
[00:36:19] and we can dive into that what AI is doing?
[00:36:21] But essentially, yes, fastness to me
[00:36:26] doesn't equate with intelligence.
[00:36:28] It's just fastness.
[00:36:30] But we have deluded ourselves.
[00:36:32] That person is fast.
[00:36:34] They're smarter than I am.
[00:36:36] That equated very well when we were out on the savannah.
[00:36:40] The person who could think fast
[00:36:42] under a pressure situation tended to live.
[00:36:46] So we are the generations of the fast thinkers.
[00:36:50] There's no doubt about it.
[00:36:52] Every single one of us are from those
[00:36:54] because the ones that didn't think fast, they didn't make it.
[00:36:58] So we are the sum total of that.
[00:37:00] So it's built into our DNA.
[00:37:03] Fast thinking means better survival ability.
[00:37:07] But you know, and you've written about this,
[00:37:08] you've talked about it, fast thinking isn't necessarily
[00:37:12] always the best result.
[00:37:14] It's a best result under the time compression
[00:37:16] of that circumstance.
[00:37:18] But retrospect allows a much more nuanced view
[00:37:22] of that scenario.
[00:37:23] Social media makes fast response down into microseconds.
[00:37:28] The hot take.
[00:37:29] Something happens politically.
[00:37:31] You've already aligned to a specific team.
[00:37:34] This is my team.
[00:37:36] I don't care if the entire team changes.
[00:37:38] I like their flags, I like their uniforms.
[00:37:42] I'm sticking with my team.
[00:37:44] When I was a kid, I grew up back east, Mets, Yankees, right?
[00:37:48] And I was getting old enough
[00:37:49] that the entire team, plus the coach,
[00:37:52] was gone, was replaced.
[00:37:54] But I had friends who were Yankees fans or Mets fans.
[00:37:56] I'm like, dude, everybody in that team is gone.
[00:38:00] The coach is gone.
[00:38:01] There's nothing other than the uniform and the name.
[00:38:05] I'm a Yankees fan.
[00:38:07] We have tribalized ourselves into these categories.
[00:38:10] And it's not a modern phenomenon.
[00:38:13] It's an ancient phenomenon.
[00:38:14] And it all wraps up into this fast thinking
[00:38:18] who is a winner?
[00:38:21] Winning in an ancestral sense is literally living.
[00:38:26] All right, you survived being attacked
[00:38:29] by another clan, which was very unlikely,
[00:38:32] because we really didn't battle as much
[00:38:34] as the movies would have us believe.
[00:38:36] That was very unlikely.
[00:38:38] And the other thing is how do we survive food-wise?
[00:38:43] Who was able to ascertain the right food to eat?
[00:38:47] Turns out the wise old men and the wise old women
[00:38:52] of that tribe were the most valuable aspects of any tribe.
[00:38:57] They were also the first thing that was taken out
[00:39:00] when you conquered another tribe:
[00:39:02] you would take out the wise people.
[00:39:03] Well, it's interesting because as you age cognitively,
[00:39:09] the brain gets weaker at fastness.
[00:39:13] Like I can no longer add numbers as fast
[00:39:17] as when I was in my 20s.
[00:39:18] I can no longer remember things as quickly
[00:39:20] as when I was in my 20s.
[00:39:22] But pattern recognition increases
[00:39:25] which is why the peak age
[00:39:28] of a mathematician might be 25 years old,
[00:39:32] but the peak age of a historian,
[00:39:34] it turns out is in their 60s.
[00:39:36] So, because they can pattern recognize:
[00:39:38] oh, the invention of AI is like
[00:39:42] when we invented air conditioning in 1900,
[00:39:46] in some weird way that they think of,
[00:39:48] and they write a book about it.
[00:39:50] So the elders kind of have that wisdom.
[00:39:52] And I think James, we're lacking,
[00:39:54] we are floating and drowning in a pool
[00:39:57] of information and data,
[00:40:00] but we're starving for wisdom in this epoch,
[00:40:03] more so than, I think, any time historically
[00:40:05] that I can account for. Wisdom is turned upside down
[00:40:09] because we've given the highest attribute
[00:40:12] to the latest tech.
[00:40:14] You know, I'm from the tech world
[00:40:15] and we're riding this wave of AI.
[00:40:17] I mean, literally just this morning,
[00:40:20] 27 new models were invented in AI
[00:40:24] in the open source community.
[00:40:25] And I'm trying to grok what the power of every model is
[00:40:29] and trying to help clients
[00:40:31] and the world that follows me,
[00:40:35] which model is better to use
[00:40:36] under any particular circumstance?
[00:40:38] But on the aspect of wisdom,
[00:40:42] we just totally discount it.
[00:40:43] In fact, we're the only epoch that I'm aware of
[00:40:48] that have reduced the elderly
[00:40:50] to an almost useless functionality.
[00:40:54] So much so that I worry for the people
[00:40:57] that are gonna enter their 60s in 20 years.
[00:41:01] Because as they enter their 60s, 20 years from now,
[00:41:05] society is gonna organize if it continues on
[00:41:08] in the same mental gymnastics,
[00:41:11] they're gonna see somebody in the 60s
[00:41:13] as being almost useless.
[00:41:16] Okay, they can still do a job,
[00:41:18] but they're not up to the changes.
[00:41:22] They're not up to the rate of change.
[00:41:24] And of course, rate of change
[00:41:26] has been increasing multifold.
[00:41:28] I think we can really say the start date
[00:41:31] was probably the invention of the vacuum tube;
[00:41:34] that was where the acceleration of the rate of change began.
[00:41:38] And throughout the 70s and 80s,
[00:41:40] personal computers and things of that nature.
[00:41:42] And now AI, the rate of change is quite dramatic.
[00:41:46] I don't think very many experts in the field
[00:41:49] understand what this really means
[00:41:51] as far as the acceleration
[00:41:53] and the ability to understand it.
[00:41:55] So as you get older,
[00:41:57] the wisdom is a grounding effect.
[00:42:00] It lets you understand where you need
[00:42:03] to roll with punches, right?
[00:42:05] Hyper reactivity is probably one of the biggest problems
[00:42:09] that we have in society today.
[00:42:10] People are overreactive to something that they see.
[00:42:14] An older person might look at it and say,
[00:42:16] you know what, let's see what happens in a couple of days.
[00:42:21] Yeah, keep your mouth shut
[00:42:23] and let's just see what happens.
[00:42:24] You're totally right.
[00:42:26] Like when I was in my 20s,
[00:42:27] if I got a letter from the IRS,
[00:42:29] I would consider jumping off the roof right then.
[00:42:31] Now it usually takes me about a month
[00:42:33] before I consider it.
[00:42:34] Yeah, we're still considering it, right?
[00:42:36] Yeah, so that heightened sensitivity
[00:42:40] is being fed by certain AI algorithms within social media.
[00:42:46] We can say that there's one particular algorithm
[00:42:49] that's designed to pull you into hyper reactivity
[00:42:54] much more than people realize.
[00:42:56] And that's TikTok and that's done through video,
[00:43:00] almost again a Rorschach type of conditioning.
[00:43:05] If you know certain biological responses to imagery,
[00:43:11] sounds, even flash, even subliminal things
[00:43:16] that are implanted in some of this media,
[00:43:20] you know that you can start building a certain level
[00:43:23] of disposition within a person.
[00:43:26] And about 80% of the world is hypersensitive
[00:43:31] to being hypnotized and they may not understand that.
[00:43:35] And I think the other 20% are not hypersensitive
[00:43:39] but still capable of it.
[00:43:40] And hypnotized not necessarily in the kind of classic,
[00:43:44] like look at this watch and fall asleep.
[00:43:45] Exactly.
[00:43:46] But just like the words we hear, the words we see,
[00:43:50] what you're saying basically is both TikTok
[00:43:52] and AI to an extent, they learn us,
[00:43:56] they learn from us by our responses.
[00:43:57] So like TikTok learns how many seconds
[00:43:59] you stayed on a video, exactly what categories,
[00:44:02] which of a thousand overlapping categories this video is about.
[00:44:07] And because it's able to fine-tune
[00:44:09] those categories so much, just the way AI does
[00:44:11] with a piece of text.
[00:44:13] And then, oh, James likes these thousand different contexts
[00:44:18] that overlap, we're gonna just keep feeding him
[00:44:20] more and more videos like this
[00:44:22] and maybe even guide him towards-
[00:44:24] With some novelty.
[00:44:25] Yeah.
[00:44:26] The idea is over a cross section of people
[00:44:30] that have the same predispositions that you might have,
[00:44:33] there's a model built.
[00:44:34] Not like there is a James model,
[00:44:37] but in a sense there is within the grand AI
[00:44:40] that say TikTok is using.
[00:44:42] And so getting outside the nefarious aspect
[00:44:46] of a state actor and all that, we can leave that aside,
[00:44:50] just looking at time consumption.
[00:44:52] Everybody is battling for your attention and your time.
[00:44:55] And the longer you spend in a TikTok
[00:44:59] where you don't feel time is moving, right?
[00:45:02] If you're engaged and there's just enough novelty
[00:45:05] to tip you over the edge to say, hold it, what was that?
[00:45:09] So it's novelty and familiarity mixed together
[00:45:14] in this very seductive dance with your brain.
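[Editor's note: the feedback loop described here, learning per-category affinities from watch time and then serving mostly familiar videos with an occasional novel one, can be sketched in a few lines of Python. This is a toy illustration under assumed names (EngagementRecommender, observe, recommend), not TikTok's actual algorithm, whose internals are not public.]

```python
import random

# Illustrative sketch only: learn per-category affinities from how long
# a viewer stays on each video, then rank candidates by familiarity
# while occasionally injecting a novel pick.

class EngagementRecommender:
    def __init__(self, novelty_rate=0.2, seed=0):
        self.affinity = {}                # category -> learned preference score
        self.novelty_rate = novelty_rate  # fraction of picks that are random/novel
        self.rng = random.Random(seed)

    def observe(self, categories, seconds_watched, video_length):
        """Update category affinities from watch time on one video."""
        completion = seconds_watched / video_length  # 0.0 (scrolled past) .. 1.0 (watched it all)
        for cat in categories:
            old = self.affinity.get(cat, 0.0)
            # exponential moving average toward the observed engagement
            self.affinity[cat] = 0.8 * old + 0.2 * completion

    def familiarity(self, categories):
        """How well a video's categories match the learned profile."""
        return sum(self.affinity.get(cat, 0.0) for cat in categories)

    def recommend(self, candidates):
        """Usually pick the most familiar video; occasionally inject novelty."""
        if self.rng.random() < self.novelty_rate:
            return self.rng.choice(candidates)  # the "hold it, what was that?" moment
        return max(candidates, key=lambda v: self.familiarity(v["categories"]))
```

With novelty_rate set to zero the feed collapses into pure familiarity; raising it controls how often the viewer gets the surprise that keeps the dance seductive.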
[00:45:17] And then along the way you can implant
[00:45:20] particular directionals philosophically and emotionally.
[00:45:24] It's not very hard.
[00:45:26] We've been doing it with movies and TV.
[00:45:29] Anybody battling me?
[00:45:31] Have you ever seen a commercial that was a tear jerker?
[00:45:33] Have you ever cried at the end of the movie?
[00:45:35] Have you ever gotten the, let's go after that person?
[00:45:37] You've been programmed.
[00:45:38] That's part of an emotional programming.
[00:45:42] Well think about like you grew up in the 60s and 70s.
[00:45:44] Think about the television shows that you watched.
[00:45:48] Like everything was to sort of make,
[00:45:52] take things that were considered harsh in society.
[00:45:54] Like the powers that be for instance,
[00:45:57] considered hippies and drugs harsh.
[00:46:01] Vietnam War was harsh.
[00:46:02] Civil rights was a harsh topic.
[00:46:05] And so then think of the shows you watched.
[00:46:07] The Partridge family sort of normalized the hippie.
[00:46:11] The monkeys?
[00:46:12] I mean it was all psychedelic imagery.
[00:46:14] Yeah, it's all psychedelic imagery
[00:46:16] but without mentioning drugs.
[00:46:18] And then there was like the Mod Squad.
[00:46:20] So there's like, it's an FBI team that had afros
[00:46:24] and dressed like hippies.
[00:46:26] And then you had the Brady Bunch.
[00:46:28] Somehow, were they divorced?
[00:46:30] Were they widowed?
[00:46:31] But it just sort of normalized the idea
[00:46:32] that families might be,
[00:46:35] it's not like the normal family unit anymore.
[00:46:37] Things are a little different and it's okay.
[00:46:39] It's like everything we watched,
[00:46:42] think of all the war shows.
[00:46:44] I Dream of Jeannie was a war show.
[00:46:45] Major Nelson was going to outer space to shoot people.
[00:46:49] F-Troop.
[00:46:51] Hogan's Heroes.
[00:46:52] Hogan's Heroes.
[00:46:53] Hogan's Heroes, Baa Baa Black Sheep, Star Trek.
[00:46:57] These are all like war shows basically.
[00:46:59] Yeah, so basically that's why when some people
[00:47:03] are very, very smart, they intellectualize this,
[00:47:06] saying I'm invulnerable to hypnotism.
[00:47:09] It's like, no, accept that fact of your humanity.
[00:47:12] We are emotional creatures.
[00:47:15] We are driven.
[00:47:15] Every single decision we make is ultimately
[00:47:18] a neuropeptide release of chemicals,
[00:47:20] which are by definition emotions.
[00:47:23] So once we understand that we're emotional creatures,
[00:47:26] we accept that reality and then we build from there.
[00:47:29] The problem in society right now
[00:47:30] is that we think we're making rational decisions.
[00:47:34] We think that there's a logical decision that took place
[00:47:37] and The User Illusion will prove it to you scientifically.
[00:47:41] Because Tor Nørretranders
[00:47:43] is not a scientist per se, he's a science writer.
[00:47:46] He wrote it in the late 90s; if you were to write the book today,
[00:47:49] there's much more data and information
[00:47:52] getting back to what reality is.
[00:47:57] There's a half second delay that takes place
[00:48:00] from you doing something and you realizing that you did it.
[00:48:03] Your brain is making you think you did it
[00:48:07] but the super person inside of you
[00:48:10] that we can call it subconscious
[00:48:13] or you can call it the editor of reality for you
[00:48:16] because reality is being edited to you.
[00:48:18] We don't have real time screens on our eyeballs.
[00:48:21] Reality is being represented in the brain
[00:48:25] at a very low bit rate.
[00:48:26] Consciousness is 31 bits per second.
[00:48:30] That's it, and anything beyond 31 bits per second,
[00:48:34] we are just getting background noise and information.
[00:48:37] We're not really dealing with what's going on.
[00:48:41] So we have a throughput bandwidth issue.
[00:48:44] I call it, well, call it the bandwidth issue.
[00:48:49] And then we have the half second delay problem.
[00:48:52] Those are the two fundamental things that The User Illusion,
[00:48:55] the book Tor Nørretranders wrote, presents,
[00:48:57] but it presents them in very dense material.
[00:48:59] I read it once a year.
[00:49:00] What's the half second delay?
[00:49:01] The half second delay is what a lot of people refer to,
[00:49:04] in street nomenclature, as muscle memory.
[00:49:09] It's like, oh, I hit that ball because of muscle memory.
[00:49:11] Well, there is no muscle memory.
[00:49:13] It's not. It is, in a sense, your subconscious
[00:49:17] doing it, and you're letting it do it; rather,
[00:49:21] you're giving up the notion
[00:49:25] that you're in control of your reality
[00:49:28] and you're just letting it happen.
[00:49:30] And I'm sure you know this being very creative
[00:49:33] is that when you get into the flow of things
[00:49:34] you're kind of taking a step back.
[00:49:37] You're just letting things happen.
[00:49:39] And I see this, I've studied creative people
[00:49:41] my entire life; I find it fascinating.
[00:49:44] And the hardest thing is, most people
[00:49:47] who are highly creative have imposter syndrome.
[00:49:50] They don't know where it came from
[00:49:51] and they don't know where it went when it's not there.
[00:49:54] Great songwriters, great novelists
[00:49:56] they are literally in fear
[00:49:59] that it may not come back one day.
[00:50:01] Now you can intellectualize it ex post facto.
[00:50:04] After the fact, you can say, oh yes, of course
[00:50:06] I studied this, I studied that
[00:50:08] and that's how I came up with that.
[00:50:10] But the spark of insight, that creative spark
[00:50:13] that comes into you, nobody's been able
[00:50:16] to fully define it.
[00:50:17] It's a collection of all of these different pieces
[00:50:21] that, if you take a step back,
[00:50:23] combine in a way that's magic.
[00:50:26] But if you try to force it,
[00:50:27] if you try to overthink it,
[00:50:29] and you try to capture a cloud in your hand
[00:50:32] or get a cup of water by grabbing as much as you can,
[00:50:36] it dissipates.
[00:50:48] Great episode, great conversation.
[00:50:50] We're gonna have part two all about consciousness
[00:50:55] and then part three, the most intense conversation
[00:50:58] I've ever had about AI in my life
[00:51:02] with Brian Roemmele, such a smart guy
[00:51:06] such a great conversation
[00:51:07] I really love talking to Brian.
[00:51:09] Stay tuned for not only discussions
[00:51:13] about consciousness, AI, quantum mechanics
[00:51:15] but also how knowing all this stuff can help your life.




