0:00
World events remind us that there is actually evil out there.
0:02
Just horrendous barbarism is still possible.
0:05
When a country goes to war, it's not enough to just have the Department of War
0:09
fight these wars.
0:10
It is actually the whole country.
0:12
The idea that somehow the American people are not capable of this beggars belief.
0:15
I think our biggest risk as a country is suicide, not homicide.
0:18
How do we win the AI race, particularly as it moves towards more physical AI
0:23
and robotics, etc?
0:24
The things that we did to win in the past, we accidentally turned our back on.
0:28
And there's an opportunity to reclaim that with vigor.
0:31
In the moment right now, you could say, we need to build more weapons.
0:34
We need to do this. We need to do this.
0:34
Yes.
0:35
But the most important thing we need to do is-
0:37
So, Catherine, when we were talking about guests that we had to add on,
0:48
Shyam was at the top of your list.
0:50
What was that?
0:52
Yeah, well, you know, it's, I think people are, you know,
0:57
after the Jeremy Stern profile in Colossus and a lot of, I think, you know, stories or podcasts that have come out recently about Shyam.
1:06
He's one of these people that if you were in the know several years ago, you
1:11
knew.
1:11
He was the OG, like, fixer for everyone.
1:14
You know, I think Trae Stephens, who's, you know, the co-founder of Anduril, came out on Twitter and said, like, he's single-handedly responsible for my career.
1:22
And I, you know, John Doyle, so many of our founders, have pointed to Shyam as the person
1:27
who made their career and introduced them to Palantir, supported them in Palantir,
1:31
but also sort of, you know, gave them wings to fly away from Palantir and to
1:35
start something new.
1:36
And you hear that story time and time and time again.
1:39
But it wasn't until, and Eric, you and I were talking about this.
1:41
It wasn't until a couple years ago that I think Shyam actually became more of a public figure.
1:45
He was sort of the behind-the-scenes guy, the behind-the-scenes fixer.
1:49
And I think the thing that really changed it was, Shyam, and I'd love to talk to you about kind of what was the inspiration for this.
1:55
You sort of wrote this seminal piece about First Breakfast, about defense reformation, and were the first person to really start talking about it.
2:04
But again, this was like 17 years into the journey of Palantir that you decided
2:08
I'm
2:08
going to be, you know, a strident voice for what needs to happen in America.
2:11
So I'd love to talk to you about this, you know, going from the behind-the-scenes person,
2:17
the guy behind the guy, and so many of our companies, to saying I need to come
2:21
out and
2:22
be a voice for this movement. What was the kind of impetus for that?
2:25
It was kind of equal parts an act of desperation and an act of optimism.
2:30
You know, I felt like after years of just seeing
2:34
the building, the Pentagon from the inside, seeing how defense was operating,
2:40
I felt this frog boil that continued to happen set in a historical context.
2:46
But the reason to say something is actually I thought this was the moment that
2:50
it could
2:50
all be fixed, that alongside of that happening, seeing what was happening
2:54
outside of the building,
2:55
that the founders were re-emerging. There was a huge amount of energy.
2:58
People wanted to build in the national interest, and it was a moment to kind of
3:02
crystallize, or at least put forth, what I thought the fundamental diagnosis was: that really the
3:08
things that we did
3:09
to win in the past, we accidentally turned our back on, and there's an
3:13
opportunity to reclaim that
3:14
with vigor. And we needed to do so quickly. That time was running out, there
3:18
was a shot clock here
3:19
that we have frog-boiled our way to a place where we've lost deterrence.
3:23
You know, any one of these items in isolation, you can write off. You could say, okay,
3:28
the Russians annexed Crimea in 2014. That's just one thing.
3:32
Then you have the militarization of the Spratly Islands in '15. You have the
3:36
failure of JCPOA to
3:38
keep the Iranians from getting a bomb. You have a pogrom in Israel, and
3:42
certainly after October 7,
3:43
it was kind of a radicalizing moment that like, what is going on here? We need
3:49
to act,
3:50
and I think we've only had more things since then. Now, I think the good news
3:54
is, in the last year,
3:55
more has changed in the department than I've seen change in the prior 19 years,
3:59
and people are
4:00
seizing that moment for reformation. And it's been rewarding to kind of get it
4:08
out there,
4:09
get people to rally behind it, and all of us building in the national interest.
4:13
Yeah. I mean, what is it? Because I think you get this question probably all
4:17
the time.
4:18
What is it about the last? I mean, as you said, it wasn't a single moment,
4:21
but there is some change in the culture and the zeitgeist. And I think you have
4:25
a unique
4:26
understanding too of culture and kind of how these memetic shifts happen. But
4:30
what was it about,
4:31
you know, 18 months ago where it's like everyone seems to agree on the thing
4:36
that was so contrarian
4:37
for many years where you were sort of banging your head against a wall saying
4:41
like, this is,
4:42
you know, this needs to happen. Well, maybe unsurprisingly for my worldview,
4:46
it all comes down to leadership. You know, I, we call them the founding fathers
4:51
for a reason.
4:52
There's something special about the American spirit that is, you know, every
4:57
founding story is equal parts heresy and heroism. And we had the right people who kind of saw
5:04
like, hey,
5:04
this is not working. And the shot clock is running out and we have to do
5:09
something.
5:09
And we had those people both inside the building and outside the building. So
5:13
it's kind of a
5:14
conspiracy, a coalition of the willing, a coalition of the capable, to go do that.
5:20
So it's hard to point to any one single moment. I think the election
5:24
is a big part
5:25
of it, not to make it political, but just being able to get in a
5:29
leadership that viewed it
5:30
with clarity and set the conditions to make this change happen. Yeah. So you've
5:35
been busy. You,
5:37
you know, you've joined the army. You've written a book that's coming out. I'd
5:42
love to get into Mobilize. But first, why did you decide now's the time to write the canonical book of what needs to happen in America? And then maybe we can
5:51
talk about the fundamental thesis with you. But why now? Why write a book, join
5:56
the army and also
5:57
be leading Palantir? Yeah. Well, there's a common thread that runs through all of that,
6:03
which is like, how do we mobilize to prevent a bigger conflict? And if you're
6:07
really paying
6:08
attention, it's hard not to think that we're kind of in the late thirties here,
6:11
that things
6:12
are brewing. They've been brewing for a while. People talk about great power
6:17
competition.
6:18
And I think we're kind of coming out of the malaise of having won the Cold War,
6:22
or the Soviets having lost it, perhaps more accurately. And that kind of led to
6:26
a lot of
6:27
bad behaviors that allowed us to believe a lot of lies about the future that we're kind of now
6:32
marking to market. And so we will not have the luxury that we really had in
6:41
World War II of
6:42
letting the adversary attack us first and then deciding to mobilize. And I
6:46
think a more
6:47
clear-eyed view of what actually happened in World War II is that's not it. It's not this facile thing that we just flipped a switch, and the automotive industry decided, okay,
6:55
after Pearl Harbor, we're going to make all this war material. What really
6:58
happened is that
6:59
leadership from FDR realized in the thirties, in the late thirties, that we
7:03
needed to mobilize,
7:04
but there was not yet a national will or popular mandate to do so. And Lend-Lease provided the
7:10
mechanism to do that. It took us 18 months to build factories and re-tool them,
7:14
and we were able to
7:15
create capability and deterrence that we sold to the Brits and to the Soviets, such
7:21
that
7:21
when World War II really kicked off for us, when Pearl Harbor happened, we were
7:24
at full rate production.
7:26
And the way that we mobilize: when a country goes to war, it's the whole country that goes to war. It's not enough. I think part of the legacy of having won the Cold War is
7:32
thinking it's enough
7:34
to just have a defense industrial base. It's enough to just have the Department
7:37
of War fight
7:38
these wars. It is actually the whole country. And I think that's the most stark
7:42
thing. It's like
7:43
we all, as American citizens, need to be invested in both the prosperity the
7:47
country gives us,
7:48
but also the freedom that underwrites that prosperity. And we have come very far away
7:52
from that world. In 1989, only 6% of spending on major weapon systems went to
7:58
defense specialists,
7:59
i.e. companies that were exclusively in the business of defense. That's not
8:02
that long ago.
8:02
That's when the Berlin Wall still stood. Now that number is 86%. So really,
8:08
what we think of as
8:09
normal is an aberration from the past. And Mobilize seeks to set that story in
8:13
context of one,
8:14
hey, this is the industrial base, what I like to call the American industrial
8:18
base,
8:18
that won World War II. Chrysler built Minuteman missiles and minivans. And
8:23
every camera, car,
8:25
cereal box that an American consumer bought was actually also subsidizing our
8:29
national security.
8:30
And that's really important. We see this with the hyperscalers. We see this
8:32
with technology.
8:33
The amount that our private sector spends on R&D dwarfs what the government is
8:37
capable of spending.
8:38
And you want to get on that price performance curve as a way of delivering
8:41
capabilities to
8:42
our brave men and women in uniform. Yeah.
8:45
So how do we get back to that? And oh, so the second part of it. Yeah, so we
8:53
had the American
8:54
industrial base. But who was the American industrial base? Today we think of it
8:58
as Northrop Grumman. We think of it as Lockheed Martin. But actually it was Glenn Martin. It was Jack
9:03
Northrop. It was Leroy Grumman. They were people. They were founders. They were
9:08
kind of not thinking
9:10
about, hey, what's the performance going to be next quarter? They were building
9:13
something way
9:14
bigger than themselves, way bigger than their companies. And those founders
9:17
weren't just outside
9:18
of government. They were inside of government. It was the Hyman Rickovers
9:21
who, against the will of the Navy, built the nuclear Navy. And I love that story, too, because it
9:26
takes a lot of
9:26
chutzpah. Oppenheimer, the father of the bomb, said the nuclear Navy wasn't going to work. He told Hyman he was going to fail. And he still proceeded. And
9:36
that's something I
9:37
think in the Valley we recognize as the classic founder personality. And a big
9:41
part of what happened
9:43
after the end of the Cold War, we wanted a peace dividend. We started spending
9:49
less on defense.
9:50
We had this famous dinner, the Last Supper. We went from 51 prime contractors
9:53
down to five.
9:54
I think the conventional explanation of what happened is wrong. People think,
9:59
hey, we had
10:00
consolidation. Consolidation means we lost competition. That's not, yeah, okay,
10:06
maybe at the margin.
10:06
But first of all, it's always been a monopsony. The nature of the competition
10:10
is not what people
10:11
think. It's not these companies competing against each other. The competition
10:14
has always been the
10:15
services competing. It's been competition inside of government that drove
10:18
innovation,
10:19
not competition from industry. What really happened from the Last Supper is
10:23
that consolidation
10:24
bred conformity. It was the beginning of true financialization of defense.
10:27
These companies
10:28
really could no longer think about growth. They thought about financial metrics, dividends, buybacks,
10:32
dividends, buybacks,
10:33
cash flow. And it kind of became very narrow. And that conformity is not an
10:39
environment that
10:40
founders can thrive in. The heretics were expunged. They left. They went to
10:45
other parts of the American
10:46
economy, like tech. But those heretics are required. In fact, if you look at
10:51
part of the book, we
10:52
catalog all these amazing defense innovations, almost to a T. Every single one
10:57
of them was a
10:58
heretical idea. The institution was against it. The bureaucracy was against it.
11:02
The process tried to
11:03
kill it. And these determinative outcomes, you can think about the Higgins boat, the boat that won
11:07
World War II. The Navy didn't want to buy the boat. The Navy tried to steal the
11:10
designs for the boat.
11:11
In the end, 92% of all boats in World War II were Higgins boats. Think about
11:15
where we'd be if this
11:16
Scots-Irishman wasn't willing to just almost pathologically commit
11:21
himself to making
11:22
this happen. The boys would not have landed at Normandy. Yeah. No, I mean,
11:26
and that's super
11:26
interesting. Yeah, because I think that as you said, it's always been, you know
11:30
, this sort of force
11:31
consolidation that we've picked the winners there, you know, we're post history
11:36
now, like we don't
11:37
actually need to build for wartime. So it's your view that that really just expunged
11:43
all of the talent that used to go to defense. And it's interesting because it's
11:45
also I mean,
11:46
it's in some ways it's sort of serendipitous that that's exactly when the
11:49
internet's rising.
11:50
So it's like, if you're this kind of weird personality, you're going to go work
11:54
on this new thing
11:55
that's so exciting. You know, I think of, like, you know, Marc Andreessen, right? Like he could have
11:59
been in the defense industry maybe 20 years earlier, but it's like, you know,
12:03
the people who were
12:03
building the internet in the 90s, they wanted to build the new thing. So I
12:07
guess like, is that
12:09
sort of the fundamental problem? Is that like, it just became a place where
12:12
anyone who was
12:13
interesting or anyone who had a different view just could not thrive? Yeah,
12:18
exactly. And then
12:19
we compounded on that problem by, you know, the nature of the monopsony. So
12:24
unlike a monopoly where
12:25
you have one seller of a thing, a monopsony is where you have one buyer of a thing.
12:28
The Department of
12:29
War is a monopsony. The nature of the monopsony is it forgets, it starts
12:35
imposing all sorts of
12:37
constraints on its suppliers and how they behave and what they need to look
12:40
like. And that led to this: we put all of these companies on the Galapagos Islands. They're not on the mainland anymore. And so what do you get on the Galapagos? You get these exquisite
12:49
giant tortoises.
12:50
They're really amazing. It's like alien life. It's very cool. Except when you
12:55
take the tortoise
12:55
back to the mainland, they're not competitive. They're going to get eaten alive
12:58
by the wolves.
12:59
And so we started creating a huge number of barriers for these people: even when they had ideas, could those ideas come back into defense? I like to say, like,
13:10
when we started
13:11
Palantir, there was no front door in the Department of Defense. The only front door, there was exactly one, was in the intelligence community. It was In-Q-Tel.
13:19
If you were an outsider,
13:20
there was no other way. And the only people that worked there were insiders. Of
13:24
course,
13:25
now that is, that's part of the sea change that happened starting really in
13:28
2015. But that means
13:30
putting forth the 18 theses in the Defense Reformation in '24 makes a lot of
13:35
sense because
13:35
we have 10 years of heretics who have been knocking at the gate ready to come
13:39
help the department.
13:40
Yeah. And so much of the book, I mean, you talk about these sort of heretical
13:43
heroes,
13:44
you know, and doing the research. And I mean, you're now sort of like the
13:47
walking encyclopedia for
13:48
the defense industry and tech. Who are the most exciting heroes? Like who's the
13:53
person you look
13:53
at and you say, like, gosh, that is the person that people don't know about, that they
13:57
need to know? Well, I think it's hard to pick only one. But there's a new profile that we put out there. Some of these figures are historical figures: Schriever, Edward Hall, Hyman Rickover, John Boyd. But the one that's new is Colonel Drew Cukor.
14:13
And Cukor is the father of Maven. And here you have this Marine colonel, you know, born and raised by a single mother in Southern California, very modest background. His
14:26
only way of going
14:27
to college was ROTC; he joined the Marines. Lots of incredible experiences there,
14:32
but he had a
14:34
seminal experience where he was trying to evacuate Yazidi refugees who were
14:39
fleeing ISIS and a young
14:42
Marine looking at ISR made a call that he thought he saw RPGs. And that would
14:47
have made it unsafe
14:48
for the Marines to land and exfil these people. And there actually weren't any.
14:53
As a consequence,
14:54
you have on the order of thousands of people who were tortured and enslaved and raped
14:59
because of this
14:59
failure of the operation. And it just changed this man. And so when he had an
15:04
opportunity in the
15:06
basement of the Pentagon on a project with no resources to go after bringing AI
15:10
to the department,
15:11
he leaned all the way in. And you can see the journey of the heretic here where
15:16
everyone hated
15:18
him. Everyone tried to kill him. Every service thought they were doing AI.
15:21
People tried to throw
15:22
IG investigations at him. And it's like one of the details we document is that someone
15:26
said that
15:26
Colonel Cukor is housing Iranians in his basement. So they actually sent out
15:32
criminal investigators
15:34
to his house to look at this. And here's a devout Mormon with four daughters and a 1,400-square-foot home that doesn't have a basement, by the way. And you know,
15:44
the investigators were
15:45
just completely dumbfounded. But it shows you sometimes, you know: are you willing to put it all on the line? Are you so committed, so incorruptible in
15:53
what you're trying to
15:54
deliver? Or is this just a career? And that's one of the things I think I draw
15:58
a lot of inspiration
15:59
from these folks is just seeing, you know, if they can do this, we can too,
16:03
seeing that these
16:04
people exist inside of government, they exist outside of government. And I
16:08
think that's it. In the moment right now, you could say, we need to build more weapons. We need
16:12
to do this. We need to do this. Yes. But the most important thing we need to do is inspire the latent
16:17
heretics
16:18
to actually step up. This is the moment your country really needs you. What's
16:22
been exciting
16:23
over the last year or so is I'm seeing those people. I'm seeing them inside the
16:26
building. I'm
16:26
seeing them outside the building. And that is, you know, what is driving the change: the
16:30
leadership is setting the conditions to empower the heretics. They're
16:33
protecting them. John Boyd,
16:35
who was a famously difficult fighter pilot, his own service, the Air Force
16:38
hated him,
16:39
but the Marines learned everything they could from him. And all of his heresy,
16:42
he was really the
16:43
father of the F-16. All of his heresy was proven correct in Gulf War I, where
16:47
we destroyed the
16:48
fourth largest army in the world in days. It's just everything came to bear his
16:51
high-low mix,
16:52
his OODA loop. But, you know, John Boyd said, to be or to do: you can be
16:59
somebody or you can do
17:00
something. But you can't have both. And, you know, how committed are you to
17:04
this? But he was so difficult.
17:06
He gets a lot of credit. I think the other person who deserves credit, who I don't
17:10
even know the name
17:11
of, is a three-star Air Force general who protected him because people like
17:15
that do not survive in
17:16
these bureaucracies on their own. It's not like, hey, he's difficult and we
17:18
somehow tolerated him.
17:20
It's no. Someone realized there's something special here. And despite that, we
17:24
tolerate him.
17:24
You see that with the dynamic between Bernard Schriever, who built our intercontinental ballistic missiles, and Edward Hall, who specifically built the Minuteman. Schriever fired
17:35
Hall once and then
17:37
hired him back, realizing there was no way we were going to get to solid-fueled
17:40
intercontinental
17:41
ballistic missiles without this notoriously difficult human. And I think that's
17:47
great leadership.
17:48
Yeah. Yeah. And you just said something interesting, which is that it really
17:52
takes
17:52
leaders inside the building to model the change, to encourage it, to protect
17:58
the people who are
17:59
those heretics. And you've joined the Army. And it's a program from General
18:04
George and Secretary
18:05
Driscoll. Maybe tell us a little bit about the origin of that and then what you
18:09
're doing specifically
18:10
inside the Army now to support that change. Well, the origin story of this is
18:17
really,
18:17
so I've worked with the Israelis in some capacities since roughly 2014.
18:22
This is a very technical country. And they're proud of how technical they are.
18:28
After October 7th, you know, October 8th, they mobilized roughly 360,000 reservists.
18:33
By definition, all these reservists are prior service through national conscription.
18:38
And most of them now had 20 years of experience in industry.
18:42
And when they got back to the IDF, they were horrified at the state of
18:46
technology in the IDF,
18:47
which is actually an implicit self-critique, which is, hey, when I was 20, I
18:51
was really good at coding,
18:53
but I didn't know what I was doing. Now I have 20 years of experience building
18:57
internet-scaled
18:58
things. And I actually know how to do these things correctly. So I saw them
19:03
modernize more
19:03
in the four months after October 7th than I did in the prior 10 years of
19:07
working with them.
19:08
And I just couldn't unsee that. So of all countries in the
19:14
world,
19:14
we are drowning in that talent. You know, the skills we have at building
19:18
things
19:18
in the valley, the companies that A16Z backs, like we know how to do this as a
19:23
nation.
19:24
The 20-year-old version of our green-suiters maybe did it. You know, there's
19:29
the will,
19:30
there's the intelligence, the capability, but there's also the tradecraft and know-how, the experience, all the dead ends that I've run into in my career. The
19:36
mistakes I've made,
19:37
if you're going to make mistakes, please make new ones. Don't make the same
19:40
ones I already
19:40
have made. How can you stand on the shoulders of American industry to go faster
19:44
and do this?
19:45
And so I'm not sure I really had a lot to give the army at 24, but I think at
19:50
44,
19:51
there's a lot I can do to accelerate certain things. And that's not just a
19:55
narrow statement
19:55
about me. I think that's a whole statement about the Valley, a whole statement about American manufacturing, everyone in El Segundo. How do we make sure, if the
20:03
Chinese make civil-military fusion compulsory, why do we make voluntary civil-military fusion
20:09
impossible?
20:10
And when I look back at history, we didn't use to make it impossible. In World
20:14
War II, we
20:14
direct commissioned 100,000 people that look like what we now consider a
20:18
Detachment 201
20:19
into the military. And we should be doing that again. The authorities exist.
20:23
They're just lying there dormant and basically underutilized. So I was proud to join the
20:29
army with three other
20:30
colleagues. We have Bob McGrew, former chief research officer at OpenAI, Boz,
20:35
the CTO of Meta,
20:37
and Kevin Weil, the former chief product officer of OpenAI, head of science
20:40
now.
20:41
And I think we've been able to work on different projects that really, we act
20:45
as senior advisors
20:46
to army senior leaders. And there's different projects that we kind of get our
20:49
hands dirty and help. And I've learned a lot by doing it.
20:54
Hopefully the army's
20:54
benefiting from it. But I think more broadly, we would like to catalyze this
20:58
across all of the
20:58
services, and a broader call for folks who are listening to this now in industry.
21:03
What's been the biggest surprise? I mean, you've obviously worked with the
21:06
department for years,
21:08
but being on the inside now, what surprised you?
21:11
So my focus is really two things. I'm helping them think through how to plan force structure over long periods of time. So how do I generate the
21:22
force I want for
21:23
all the different military occupational specialties? So that's been one. But the
21:27
second part of it is
21:27
thinking through how do we want to employ software as almost like a malleable
21:33
weapon system as
21:34
something that our commanders can wield to drive advancement? And they call
21:38
these the operational
21:39
data teams. What's been hugely impressive to me is the quality of talent in our
21:45
green-suiters.
21:46
People who are not formally trained computer scientists, people who have just
21:49
learned these
21:50
things, the most compelling AI applications I'm seeing across commercial or,
21:55
you know, so private
21:56
sector or public sector are being built by these green-suiters. And I
22:01
think there's
22:01
something about the existential stakes. You know, you're not doing this for fun. You're not doing this for 10% efficiency. It's a binary outcome: you win or you lose. The other
22:08
thing about this
22:09
moment that I think is really interesting with AI is it's massively empowering
22:12
to people with
22:13
specific skills. So, you know, it is the intel warrant officer who really knows
22:17
their domain.
22:18
And I was wondering, you know, as someone who's been doing this for 20 years,
22:21
like, where was this
22:22
person 10 years ago? And the conclusion I came to is they were always there 10
22:27
years ago. What would
22:27
they have done though with their idea? Make a PowerPoint slide, brief some
22:31
program bureaucrat,
22:32
who would tell them how bad their idea is? No, because they're smarter than
22:35
that. They wouldn't
22:35
have wasted their time. Now they spend two weeks, they build it themselves. Now
22:39
they're having an
22:40
empirical conversation about how what they've built actually drives the army
22:44
forward. And everyone
22:45
is quick to adopt it because everyone wants to win. So it's been really
22:48
exciting to see that.
22:51
The other part of it, I think, is that big institutions, and this is conserved across the private sector as well, struggle with zero to one. Everyone wants, if
23:03
you have some sort
23:03
of innovation, they almost want to rush to get to N as quickly as possible. How
23:08
do I scale this
23:08
across the formation? Well, the army is a very big place. And thinking very
23:12
critically about the
23:13
pathing: what is the journey and cycle of getting an innovative idea to scale? That's literally what
23:20
we do as an industry all day long, right? And how do they take and imbibe those lessons rather than cargo-culting their way there, which is frankly what I think the
23:28
private sector,
23:29
you know, large Fortune 100s do as well. So they have more to learn from
23:33
startups in this capacity
23:34
than they do from, you know, big Fortune 100 companies. Totally. And that's so
23:39
interesting too,
23:39
about just, you know, that is something I hear time and time again, like the
23:43
level of technical
23:44
ability of someone very junior in the Army or the Navy today. It's like they
23:48
came up tinkering,
23:50
and yes, as you said, like now the tools are there, they can just build
23:52
something. And you're not only learning from startups, you're literally learning from
23:55
individuals who are
23:56
enlisted, who have a great idea. It does feel like, as you said,
24:00
like this is a
24:01
revolutionary time for the military where they can actually learn from their,
24:04
you know, junior
24:05
people who have a great idea and that can be deployed very quickly. Which plays
24:10
to the American
24:11
military strengths of bottoms-up innovation, mission-command-type control. It's
24:15
really something
24:17
that our military can uniquely do that no one else can. Let's get into the SaaS
24:21
apocalypse.
24:22
You know, there's a line of thinking that says, hey, now that AI is here, the switching costs are very low, you know, there's no
24:30
code moat,
24:31
there's no data moat, there's no UI moat, and there's a set of SaaS
24:34
companies that are
24:35
on the conveyor belt on the way to the guillotine. And you know, maybe it's monday.com first,
24:39
and maybe it's Atlassian, and, you know, companies that aren't systems of record, and then maybe it's coming for them, too. And people say, hey, you're not going to vibe code,
24:48
you know, Atlassian,
24:49
you're not going to vibe code, these, these, you know, incredible products with
24:51
all these
24:52
integrations and all this, you know, distribution, et cetera. Where would you say you land on the SaaS apocalypse? How do you make sense of it? I think both things are true. So I would
25:01
give you a different
25:01
rubric to think about it, which is: which software is really fundamentally about beta, and which software is about alpha. And I think that the software that's about beta is going to
25:12
really struggle. You know, this is software that made you more similar to everyone else. And this has been
25:17
my historical critique of the software industrial complex, which is that the
25:21
feedback loop for the
25:22
people building the software is, can I sell it? Not did it, did it add value,
25:26
which is downstream of
25:28
can you sell it? And so you can think of, almost, vibe coding, the advent of AI:
25:35
it allows you to make software that's specific to you. It's inherently alpha
25:38
focused,
25:40
if you do it right. But I, so I think that the platforms that are already
25:44
focused on alpha are
25:45
going to continue to have an advantage. It's actually going to be a wind that fills their sails on the other side. And the stuff that's all beta is really going to
25:55
struggle. And you can
25:56
almost argue like maybe the beta wasn't that valuable to begin with. But one of
25:59
the jarring
26:00
moments for me was in COVID, if you really look back at what were CEOs talking
26:05
about in their earnings
26:06
calls about software, they, no one talked about the $5 billion ERP
26:11
implementation they did that
26:13
saved their supply chain because all of them fell over like paper tigers in two
26:16
weeks.
26:16
And what they were talking about was zoom and teams and how that enabled them
26:21
to go remote.
26:22
And you think, that's crazy. That should have been a
26:26
Sputnik moment for the software industry to say, wow, we haven't built shit
26:30
that's valuable.
26:32
How depressing. And on the flip side for us, at least COVID was a huge tailwind
26:39
because it's
26:40
specifically because we were able to help our customers adapt to this reality
26:44
at the speed
26:45
of the disruption. I think it kind of separated the wheat from the chaff. And I
26:49
think we'll kind of
26:50
see that, you know, maybe there's a lot of things we've been spending on almost
26:53
mimetically like,
26:53
well, other people use it for this. This is a standard industry solution for X.
26:58
Those things
26:58
are going to feel a lot of pressure. And on the flip side, there's going
27:02
to be software
27:03
that's almost like a toolkit, an approach that allows you to express how
27:07
you're
27:08
different from other companies. It almost becomes software that allows you to
27:11
express your
27:11
competitive advantage, your strategy. There's going to be a premium on the data
27:16
stuff that vibe
27:17
coding can't do. I think that's actually true. Like, I think it's
27:19
actually true that data
27:20
is much harder, a lot of it's unsolved, and you're gonna have to figure that
27:24
out. But I don't
27:25
think that's going to preclude the pressure on the beta software.
27:28
Yeah. In terms of accruing value, right now it seems like the hardware layer
27:36
has the highest
27:37
margins, whereas in the internet economy, the applications had the highest
27:41
margins. I'm curious
27:42
if you think AI will be like the internet, where sort of the entities that
27:47
control the
27:48
end-user relationships accrue the most value, or if you think it'll be more like
27:52
the cloud,
27:53
where the infrastructure layer accrues the most value or has the highest margins
27:58
. How do you
27:59
think it'll play out? If you thought about the stack as chips, models,
28:05
AI infrastructure,
28:07
AI applications, what I see happening empirically is the models are being
28:12
commoditized and always
28:14
under pressure. The model companies are expanding up. Sometimes they almost
28:18
call it in a diminutive
28:19
way a harness, but it's actually they're building software around it that is AI
28:23
infrastructure
28:23
to do something like code. And then the people who started as narrow vertical
28:30
AI solutions are
28:31
kind of earning their way down this stack to realize like, oh, I need this
28:34
actual AI infrastructure
28:35
to be able to scale to my customer base and handle more use cases. So our
28:39
theory has always
28:40
been the value is going to accrue in two places at the chips layer and at the
28:43
AI infrastructure
28:44
layer, what we would call ontology. But those two layers, I think, are going to
28:48
be pretty defensible.
28:49
There's this funny chart in The Economist the other day about what's going to
28:53
happen to the
28:53
economy. And it gives three predictions: either everything goes, you
28:58
know, vertical with
28:59
AGI, or, you know, we're all dead or economically dead
29:05
, everything
29:06
collapses, or, you know, 2% growth. And so, you know, The Economist is
29:13
hedging just like many
29:14
others. I'm curious what your sort of mental model is for what AI is
29:19
going to do to the
29:19
economy in terms of, you know, the productivity stats and GDP growth, but then
29:24
also the the job
29:26
market. And I mean AI as it achieves its goals over the medium term, as we
29:30
sort of, you know,
29:32
start to reach the potential that people have been talking about. You know,
29:34
people say AI 2027;
29:36
even if it's 2030, how do you think it's going to impact the economy?
29:39
I have a lot of thoughts here. So hopefully we'll hit them all and I won't
29:41
forget them as we go
29:42
through this. So the first bit is what always irks me about how we talk about
29:46
AI is as if somehow
29:48
we have no human agency. AI is going to do X. No, that's not right. Humans are
29:53
going to use AI
29:54
to do X. There's a choice here. Do we want to invest in AI slop and essentially
29:59
AI slots, to borrow
30:00
an expression from John Collison, Patrick Collison? No, I think that these
30:05
things are,
30:06
I don't want to invest in that at least. So what is our normative view of why
30:10
AI is valuable?
30:11
How does it result in American prosperity? How does it make our society better,
30:16
not worse,
30:17
and recognizing the fact that we have agency and therefore an obligation to steer
30:22
this
30:22
in a specific way? So that's the first part of it. Then if, okay, if we have
30:27
agency,
30:27
what is that? You know, my view of this is we have a historic opportunity to
30:32
fix the fundamental
30:34
breakdown that happened in the 70s between wage growth and GDP growth. Look
30:39
at just
30:40
the example we have about the intel warrant officer who's
30:44
suddenly able to do
30:45
so much. Well, I see that playing out on the ICU floor. I see that playing out
30:48
on the factory floor.
30:49
There is an opportunity to give the American worker superpowers with AI. It's
30:53
David's slingshot
30:54
in a world where the Chinese Goliath has been this giant sucking sound of
30:58
American prosperity.
30:59
If we do that, it's a basis for underwriting the re-industrialization of the
31:05
country.
31:06
And we're not going to do this symmetrically. That's why it's a slingshot.
31:10
It's not like
31:10
hey, this is how they do it there. We're going to do this here. It's actually
31:13
we're going to do
31:13
this in entirely new ways like Hadrian is a perfect example of that, right? We
31:17
are re-industrializing
31:19
using technology, making these people 50, 100 times more productive than they
31:23
could be otherwise,
31:25
and it's going to lead to all sorts of new possibilities in particular because
31:29
I think the great
31:30
lie of globalization is that we can do the innovation over here and we're going
31:34
to have the
31:34
production go over there. But guess what? Innovation is a consequence of
31:38
productivity. If you don't
31:39
make the thing, you can't innovate on how you make the thing and what the thing
31:43
is. You see that
31:44
with SpaceX, there's a reason the R&D engineers are co-located on the
31:47
production floor. What is the
31:48
feedback loop and cycle time they expect to come out of that? And you see that
31:51
in the negative where
31:53
we used to think WuXi was just some cheap set of pipetting arms for contract
31:57
pharmaceutical
31:58
research and now 50% of all clinical trials are being done in China. And so I
32:02
think we should
32:03
view this as a national emergency and a national opportunity around AI. And
32:08
what concerns me
32:09
a bit: these technology revolutions are, in the vast majority,
32:17
tool revolutions, not concept revolutions. It was not Galileo who invented the
32:22
telescope.
32:23
He used it to discover planetary motion. The future of these
32:27
technologies,
32:28
the microscope, the power loom, the telescope, the personal computer. They are
32:32
determined
32:33
not by the inventor of the technology, but by the people who wield the
32:36
technology. Today,
32:38
when we listen to the AI doomerism, we're listening to the inventors who are
32:41
incredibly smart,
32:42
but just like their creations, they have their own jagged intelligence. Just
32:46
because
32:46
they were smart at building the model doesn't mean they're going to be right
32:49
about the implications
32:50
of it. And then we are implicitly giving up our own human agency and how to
32:55
steer it. It is us
32:57
as the wielders of it that are going to determine the future course of this
33:00
technology. And I see,
33:01
you know, maybe the most authentic thing about that Economist graph is that
33:05
range of outcomes
33:06
is exactly what's possible. And it's up to us to pick which one we want to be
33:10
on. It's a choice
33:12
we're making. It's not something that's being done to us. Yeah. No, you just
33:15
said something so
33:16
interesting that I think is underexplored, which is the co-location of R&D
33:21
and production,
33:22
which is something we very much understood, that was sort of, you know,
33:25
the Henry Ford
33:27
style, right? Like that's how we used to build things in the physical world.
33:30
And then, of course,
33:31
globalization led to this sort of separation of them. And even you still see it
33:34
in companies,
33:35
right? It's like the engineering team in many companies is not the same as the
33:38
production
33:39
team. You can be a production company or an engineering company. We see this a
33:42
lot in our
33:42
American Dynamism portfolio. What was the impetus for sort of that
33:47
philosophical division?
33:49
You know, I think a lot of people point to policy changes in the 90s. But what
33:53
was like the real
33:54
impetus from your research that sort of led to this sort of divorce between
33:58
production
33:58
and engineering? And how are you seeing it come back together again in
34:01
companies today?
34:03
Europe has created exactly zero companies from scratch in the last 50 years
34:07
worth more than 100
34:08
billion euro. We have created all of our trillion dollar companies from scratch
34:12
in America in the
34:13
last 50 years. The difference is founders. You know, you have really good
34:18
companies over there,
34:19
but they're like 300 years old, 100 years old, whatever it is. We kind of had
34:26
the
34:26
Europeanization of our mega cap companies until recently, you know, Intel at
34:32
some point. There
34:33
was this fork in the road where they could have promoted their CFO
34:37
to be the CEO, or
34:39
Pat Gelsinger, the CTO. Back then, this is before he came back later to be CEO.
34:45
Who did they pick?
34:45
They picked the CFO, the person that Wall Street would understand. Not the
34:50
person who could actually
34:51
determine the future. And by the way, it really looked like it was working for
34:54
10 years until it
34:55
fell off a cliff. But that was all financial engineering, not real engineering.
34:58
You know, when was the last Boeing CEO to be an engineer? I think it was 2004.
35:04
You know, there was a while there, a period of time in
35:07
our economy,
35:07
where we understood that the engineering was leaving these things. Elon says
35:10
the pathway to
35:11
the CEO is through the CTO, which sounds like a crazy, heretical statement
35:15
because like certainly
35:16
from my generation, the way we grew up, that's not true. That's not how that's
35:20
not what we were
35:20
taught. Now I'm not saying that because I'm the CTO here. Don't read anything
35:25
into that. It just
35:26
means that, like, if you want to infer that, you're breaking news today. Andy
35:30
Grove, who was the
35:33
president of Intel, used to start his annual sales and marketing kickoff
35:36
meeting by reminding
35:38
all the salespeople, just remember, it's the engineers who create all the value
35:41
. You guys just
35:42
move it around. You know, it doesn't mean the salespeople aren't important or
35:46
aren't necessary,
35:47
but there is kind of a sequencing here. And I think we kind of got very
35:50
confused about that,
35:51
that we became very good at financial engineering and forgot about engineering.
35:54
One thing one of our portfolio CEOs said is that maybe salespeople are the least
36:00
AI-able, in terms of being able to be automated or replaced. I'm curious how
36:05
you think about
36:06
sort of the jobs at tech companies, you know, leveraged with AI, how you're
36:12
using it in your own
36:12
company, you know, in Palantir. How do you think about that? Yeah. So one part
36:18
I was going to say
36:19
from earlier that I think is relevant to this is, you know, Pascal said every
36:24
human has a God-shaped
36:26
hole in their heart. And part of the potential pathology from the labs is that
36:31
they have filled
36:32
that hole with AGI. And so there are things that they assert as empirical that
36:37
are actually
36:37
articles of faith. They may be true. They may not be true. I don't know, but
36:41
they get confused
36:42
between what's an article of faith and what is actually an empirical reality.
36:45
And so if you view this through a very pragmatic, clear-eyed lens, take the sales
36:49
people,
36:50
I'm not sure why the goal is replacing people to begin with. Like, isn't the
36:53
goal to win?
36:54
Isn't the goal to be dominant in your industry? You want to be better. And so
37:00
maybe being better
37:01
is about human-AI teaming. It's, you know, how do I build the Iron Man
37:05
suit for the sales
37:06
people I do have? How do I make the best salespeople more productive, and
37:09
systematize what it is that
37:10
makes them good, for everyone else? Like, there's all sorts of other ways of
37:14
thinking about the
37:14
problem if your goal is winning. But if your goal is AGI, it's the aesthetic:
37:19
the fact that you
37:20
couldn't replace the person with this model is offensive. And you're just going
37:24
to, you know,
37:25
you're just going to keep driving at that. And I think it could be a
37:28
distraction. This is one
37:30
way in which I think the Chinese do have a little bit of an advantage, which is
37:32
, first of all,
37:34
just to be clear, I'd bet on us 100 times out of 100. But they have a pragmatic
37:39
approach like
37:40
the whole point is to win. It's not AGI; it's, how do I improve my productive
37:47
forces, as they would
37:48
call it. Now, I think that the good news is if you look at the people who wield
37:52
the technology in
37:53
America, that's what they're focused on. You know, the CEOs I deal with, none
37:57
of them have asked me,
37:59
"Hey, I want to fire a bunch of people." Or, "How do I get rid of these people
38:02
?" Maybe because we're
38:03
too expensive for that sort of bullshit use case. But they come to me and say,
38:06
"I want to dominate
38:07
my industry. I want to destroy my competition." Okay, great. So the ambition is
38:12
there. And maybe
38:13
you get more efficient by doing it. But actually, the whole point is to grow
38:16
massively. So that
38:18
ambition sets the frame of how you're going to apply the technology and what
38:20
sort of solutions
38:22
you find valuable. Speaking of China, how do we win the AI race, particularly
38:28
as it moves towards
38:29
more physical AI and robotics, et cetera? What are the things to make sure we
38:33
get right or the
38:34
things we need to fix? I think our biggest risk as a country is
38:38
suicide, not homicide.
38:40
You know, anyone who knows me knows I'm a big China hawk. And I think
38:44
part of the challenge
38:45
with China is it's not enough for the CCP to prosper. America must also fall.
38:50
Like, look, if you want
38:51
to buy our soybeans or not, I don't begrudge you. That's a business decision.
38:54
That's free trade.
38:55
Great. But when you're trying to smuggle in agricultural fungi so that we
38:58
can't grow
38:58
soybeans, that's a different ball game altogether. And that offends my
39:01
kind of
39:02
American Calvinist sensibilities of fair play. But so all that said, that would
39:07
make it seem
39:08
like I care a lot about homicide. But I think our problem is actually one of
39:11
national will and
39:12
focus. And like, are we actually addressing the problems that we face here? Are
39:16
we encouraging
39:17
the agency in our people to believe the world can be better? You know, this
39:21
manifests in a sense
39:22
of just kind of like nihilism and polarization, where we forget what makes us,
39:26
what unites us,
39:28
and we focus on what divides us. And there's this kind of sense like, hey,
39:32
nothing really works and
39:34
nothing really matters. So let's just burn it all down. And a big part of like
39:39
how I think of what
39:40
Palantir does in the world is, it is about the legitimacy of our institutions,
39:45
like whether it's
39:46
doors falling off planes or basic government services working, these
39:48
institutions should all
39:50
work excellently. In the absence of them working, it breeds this nihilism. And
39:55
then you get the wrong
39:57
reaction to it. So that's what that's what I think we should focus on
39:59
addressing. So now,
40:01
to physical AI, the point of having an ambition, like let's re-industrialize,
40:04
let's be maximalist
40:06
about this, not some sort of half measure that's like a little bit of
40:08
friend-shoring here or there,
40:09
whatever. It's like, no, we invented all of these technologies. We invented
40:13
mass production. We
40:15
invented nuclear power, thing after thing after thing. It's like the idea that
40:18
somehow the American
40:19
people are not capable of this beggars belief. Right? So I
40:23
think it's actually
40:23
about will and motivation and leadership. Yeah. No, I think that that is such a
40:28
good segue into
40:30
what you're doing in terms of building culture. Because I think this is
40:33
something that
40:34
is overlooked. A lot of people think it's a technical problem or production
40:37
problem. And I
40:38
agree with you that I think it's a seriousness and a will problem. And you have
40:43
now been investing
40:44
in film, which is totally different than what you do at Palantir. So I would
40:48
love to understand,
40:50
you know, why did you start a film production company? And how do you think
40:54
that's ultimately
40:55
going to change the culture around having more will about doing these hard
40:58
things?
40:59
Well, it really starts with my own assimilation journey. I came to the U.S. as
41:04
a young child. I
41:05
settled in Orlando. And my assimilation journey, as a four-year-old, five-year-
41:10
old,
41:10
was watching movies with Dad on the couch. And what were the movies of the '80s
41:14
and '90s? It was
41:15
Hunt for Red October and Red Dawn and Rambo 2 and 3. And you know, I like to
41:21
say, as a five-year-old,
41:22
I knew what it felt like to be an American before I knew civics. That was way,
41:25
way down the line.
41:26
And I think a lot of people experience that again after a long period of time
41:31
when they
41:31
watched Top Gun Maverick. And so, you know, we sometimes over-intellectualize
41:35
these things,
41:36
like there's a feeling to it. Even subtle things like I heard from the guy who
41:40
made the movie 300,
41:42
that after 300 came out, Navy SEAL recruitment went through the roof. And he
41:46
was kind of perplexed.
41:48
Obviously, it's a movie about Spartans. What does this have to do with Navy
41:52
SEALs? But it
41:53
clearly inspired so many people to be like, "I want to look like that. I want
41:56
to be that strong.
41:56
I want to be that heroic." You know? And so, the virtue of entertainment is,
42:01
first of all,
42:01
it's got to be entertaining. It's not agitprop. But then it lets us
42:04
reflect on ourselves.
42:06
And who do we want to be and what do we want to be like? And if our
42:09
entertainment is all
42:11
Terminator, AI ruins the world, technology is a force of evil. It's all dystop
42:16
ic future scenarios.
42:17
That sets a sort of conditioning, which I would juxtapose to my youth in Orlando
42:21
growing up in
42:22
the shadow of the Space Coast, which was just like science and technology is
42:25
amazing. And we're
42:26
going to be living on other planets. And, you know, as a sixth grader, you'd write a
42:29
report on how
42:30
we're going to get to Mars. And it just inculcates a fundamental belief that
42:34
the future will be better,
42:36
and that science, technology, the will to invest in these hard problems is
42:40
worth it.
42:40
It's worth it. And so, I think we have a moment to reclaim storytelling in a
42:45
way that's both
42:46
entertaining and inspiring. Yeah. No, I love that you point out that you grew
42:49
up in Orlando. It's funny.
42:51
I also grew up in Florida in the '80s and '90s. And it was like, you know,
42:55
people made fun of
42:56
Florida, right? Like, they didn't understand if there was anything good there.
42:58
But Orlando,
42:59
I mean, you say it's the shadow of the Space Coast. It's also Disney World,
43:02
right? It's the best
43:04
stories of a century. It's American culture. And so, I'd love to, I mean, were
43:10
you a Disney
43:10
kid? I mean, like, was that something that, like, also spoke to you of, like,
43:14
these stories of
43:15
good and evil that are passed down through cartoons? Absolutely.
43:17
How did that help you? So, also, I mean, there's a part of the story, which is
43:23
the business. Like,
43:24
why did we end up in Orlando? You know, after we fled violence in Nigeria, my
43:29
dad had a childhood
43:30
friend who was living in LA who sold knickknacks at theme parks. And he's like,
43:34
"Hey, look, I know
43:34
this horrible thing just happened to you. There's this up-and-coming place with
43:38
theme parks. I don't
43:38
live there. I need someone I trust there. Why don't you go to Orlando?"
43:42
So, literally, it's not just that I was a Disney kid; my parents' job was to
43:48
provide knickknacks
43:49
in the theme park stores. So, like, after school, they would take me to Sea
43:53
World and I would pet
43:54
the stingrays while they restocked the shelves. You know? And so, I grew up very
43:59
much imbibing
43:59
the storytelling aspect. I mean, Epcot. Epcot was all about painting an
44:04
optimistic vision
44:05
of the future and what technology was going to be like and the stories of
44:08
heroes and, you know,
44:09
that there was both good and evil in the world and there were
44:13
clearly heroic
44:14
actions that you could take and it was all super inspiring.
44:17
And I mean, that's what's so interesting too is like, I feel like the height of
44:23
the sort of
44:24
good and evil battle inside of Disney film was sort of the Lion King 90s, right
44:29
? Like, I mean,
44:29
they were different films then. How do we get back? I mean, maybe it's not back
44:33
, maybe it's
44:34
forward, but how do we get back to those stories for children, for people to
44:38
feel optimistic again?
44:40
I mean, you know, in some ways you don't hear, oh, I grew up in this city and
44:44
it's the height of
44:44
optimism. You don't hear that about California anymore. What will it take for
44:49
movies to transform
44:50
that? Well, you know, this is kind of a personal opinion, but I'm really
44:55
excited that
44:56
David Ellison is going to have Warner Brothers because, you know, if you think
45:00
about Hollywood,
45:01
the original studio heads, they were founders. Like Jack Warner, in the 30s,
45:05
Germany was the
45:06
third largest export market for American entertainment and the Nazis actually
45:11
deployed censors into
45:12
Hollywood to control what was being made and every studio capitulated except
45:17
for Warner Brothers.
45:18
Jack Warner was the only person willing to stand up and speak truth and only a
45:23
founder can do that
45:23
because if you're a professional CEO who's employed, you can't survive that.
45:29
And so I think in some ways there's a mirror between present-day Hollywood and
45:34
the
45:34
defense industrial base. It's conformity. It's kind of lack of opinion. It's
45:39
lack of a normative
45:39
view of what it is trying to communicate. Then if you go a little bit further
45:44
down the line,
45:44
you look at the Vietnam era, like we had very cynical Hollywood content in
45:50
Vietnam as a reflection
45:51
of how we felt about ourselves. In '73, George Lucas made American Graffiti
45:56
because he was tired
45:57
of it. He's like, I'm tired of it. I just want to make a movie about boys
46:00
driving cars chasing girls.
46:01
And it was a palate cleanser that kind of like, yeah, the American people
46:05
remembered like,
46:06
okay, we went through our period, our cycle of grief and cynicism and we're
46:10
ready and it set
46:11
the conditions for the movies of the 80s and 90s that we all love. I think we
46:15
're also kind of tired
46:17
of it right now. We're tired of the cynicism that everything's going to be
46:21
worse. And you see
46:22
that in the performance of stories. Top Gun Maverick is the easy one to point to
46:26
, but the content that's
46:28
doing well right now is American-oriented. There are inspirational figures. The
46:33
heroes aren't anti-heroes
46:35
that are drug addicts that you wouldn't want your kids to grow up to be. There
46:37
's actually some sense
46:38
of inspiration in it and some pride in terms of who we are and how that
46:43
reflects in the entertainment
46:45
itself. So I think we're, if you think about the next two to 10 years, we're
46:50
going to see a lot
46:52
of content like that. That's what I see in the development pipeline from these
46:54
studios themselves.
46:55
That's exciting. So maybe talk to us about some of your projects or things you
46:59
're working on and
47:00
what you're most excited about. But then also, yeah, that's so interesting. We
47:02
've had these
47:03
conversations, Eric and I with Mark and others about how the pipeline for the
47:07
last 10 years has been
47:08
dour. We're sort of getting to the end of the pipeline. You're seeing it in sort
47:11
of the Oscar
47:12
nominees this year. It's like they're not optimistic. But as you said, like
47:16
maybe 10 years from now,
47:17
we're going to see this pipeline of just like pro America exciting, optimistic,
47:21
enthusiastic,
47:22
golden age sort of content. What are you seeing and what most excites you?
47:26
Well, you know, I don't want to give away too much of my own development
47:30
pipeline here.
47:31
But I would say, like, you see Call of Duty is being made right now by
47:36
Pete Berg
47:36
and Taylor Sheridan. You see the entire Taylor Sheridan universe. I mean, and
47:40
talk about
47:40
Sicario. It was like 2014. Sicario has basically come to life with the
47:46
Jalisco New Generation
47:47
Cartel, right? Like, there's a sense in which the storytellers have
47:54
exactly the frame
47:56
that we're kind of excited about. I think recent events are very
47:59
interesting to tell stories
48:00
about right now. Even a movie like War Machine, which just came out. From
48:06
watching the trailers,
48:08
it was not yet clear. You can imagine, like, five years ago, the
48:12
storyline would have
48:12
been something more like the US government builds evil robots that the human
48:16
soldiers had to defeat.
48:17
In this case, it was more like aliens, basically. Sorry to give away the plot if
48:23
you haven't seen
48:23
it. But you know, you have an alien robot, and brave American Rangers have to
48:27
defeat it, and
48:28
do it through their own ingenuity. That itself, I think, shows you the shift
48:34
in the narrative and storytelling that's happening. I think world events remind
48:38
us that there
48:38
is actually evil out there. Russian tanks can just roll across
48:43
the border. You know,
48:45
October 7th, just horrendous barbarism is still possible that these things don
48:50
't maintain themselves.
48:51
So, putting this back in a geopolitical lens, you
48:59
know,
48:59
as much as I've been saying we shouldn't call China a near-peer, we should call
49:02
them a peer, because
49:03
calling them a near-peer lets us off the hook. When you look
49:08
at
49:08
operations like Maduro or Midnight Hammer, it's hard to think of things that have
49:14
done more to restore
49:15
deterrence in the world. A reminder that we do have the will. Maybe because we
49:19
didn't have the
49:20
will, we forgot that we had the capability. But we have both the capability and
49:23
the will to do
49:25
things that are quite amazing. At the same time, it signals a very obvious
49:29
truth, which is somehow
49:30
none of the Russian and Chinese shit worked. So if you're a third party country
49:34
and you're thinking
49:34
about what the future of the world is and how you want to be allied, and
49:38
maybe you've been
49:40
hedging because you've been seeing America in retreat, it's also a reminder
49:44
that the Chinese
49:44
did not come to save Maduro and none of the equipment they provided actually
49:48
seemed to do
49:49
anything. So is that really an option for you? So I gave you the geopolitical
49:54
answer, but
49:56
okay, here are some projects I think I can share. So Oppenheimer was
50:00
hugely successful and
50:01
complicated, right, where it's three-dimensional to the point of
50:05
entertainment. It's not
50:06
agitprop. I think there's a very powerful story in Hyman Rickover and the birth
50:12
of the nuclear
50:13
navy. We talked a little bit about him, but you know what I love about Rickover
50:16
is he was born
50:16
in a shtetl in Poland, came over at the age of six. One of these
50:21
almost near-miss
50:22
stories: they were on Ellis Island, and when you get to Ellis Island, you have
50:25
10 days for someone
50:26
to come pick you up. And so his mother gave someone money
50:30
to send a telegram
50:31
to the father, who was already here, to come get them. The guy pocketed the money
50:34
for the telegram.
50:36
On day 10, someone happens to arrive that they know from the old world who then
50:41
runs out, gets
50:42
the father to claim them, buys them one extra day. So on day 11, they get
50:46
picked up.
50:46
But you know, this near miss where we almost didn't have Rickover. And Rickover
50:50
was a notoriously
50:52
difficult personality. Five foot two, a short guy. In World War II, he
50:56
drove a coal ship;
50:57
it was not a prestigious post. But after World War II, he was
51:00
sent to Oak Ridge
51:01
at the vestiges of the Manhattan Project. And he was inspired to get this
51:05
idea of putting
51:06
nuclear power inside of the submarines. Because before then, submarines sucked.
51:11
They could
51:11
go underwater for like an hour. They were diesel-powered. They were loud. They
51:15
were basically
51:16
surface ships that could occasionally submerge. And after that, they became
51:20
really exquisite.
51:21
And I think he built the first one in like five years, six years, the Nautilus.
51:25
But the Navy didn't want him to succeed. Not only did Oppenheimer think it
51:30
was a stupid idea,
51:31
the Navy did too. His first office was a women's restroom. I kid you not, you
51:35
know, it's like,
51:36
how can we humiliate this guy to quit? And he just kept going. And what I think
51:40
is interesting when
51:40
you look at his memoirs, like, it's not that he was immune to the humiliation.
51:45
He felt every
51:46
slight and insult, he documented them. But somehow he was able to channel that
51:50
into something he was
51:52
going to push through and still succeed despite that. Zumwalt, who was the
51:56
chief of naval operations,
51:57
the senior-most uniformed person in the Navy, said the Navy has three enemies:
52:03
the Soviet Union,
52:04
the Air Force, and Hyman Rickover. The other thing that I
52:08
think fits very closely
52:09
with the 18 theses: Hyman Rickover was a four-star Admiral for 30 years. That
52:14
is something we
52:14
can't even contemplate today. We almost view our officers as cogs to keep
52:21
moving around.
52:22
Every two to three years you have to keep moving. As the first director of the
52:26
nuclear Navy, of
52:27
Naval Reactors, you know, he was in that role for a very long time. But that
52:32
role, even today,
52:33
is an eight-year stint, which shows you the primacy of people. We understand
52:37
that with something this
52:38
exquisite, something where this much knowledge and continuity matters, you
52:42
don't just keep pulling
52:43
people out every two or three years. And our ships and subs are the safest in
52:50
the world by a long shot.
52:51
So every six months, the Soviet submariners would get six months of respite at
52:57
Sochi to recover
52:58
their white blood cell count because they were getting irradiated. We've had no
53:03
deaths due to
53:04
radiation. You know, he built it with the specification, this has to be safe
53:08
enough for my son. He built
53:09
it to a specification that is 100 times safer than the minimum safety standard.
53:14
And that is this
53:15
sort of aspiration only a founder could have. Yeah, this is going to be a great
53:18
movie.
53:18
What else is there to do? And you're doing everything. So what else is on the
53:25
radar for you?
53:25
Well, you know, in some sense, maybe it's enough, but all of these things
53:31
have a through line where
53:32
it's really about American greatness and inspiring the next generation. And
53:35
it's driven home to me
53:37
when I think about my kids and recognizing that the America I grew up in
53:41
is something that every generation has had to fight for. And I'm in that phase
53:45
now where I'm
53:46
fighting for the prosperity that the next generation ought to have. And so
53:50
whether it's
53:51
soft power and inspiration and movies or hard power and deterrence of
53:55
adversaries and preventing
53:56
World War III, it's all about American greatness and the prosperity of the
53:59
American people.
54:07
Thank you.