Will AI Destroy Our Financial System and Economy? - With Tom Lee

Channel: Alex Kantrowitz

Published at: 2025-06-27

YouTube video id: m2Y-FaFHnfg

Source: https://www.youtube.com/watch?v=m2Y-FaFHnfg

I can picture some black swans driven by AI.

Yeah, talk about that.

So AI could create a black swan if it's too successful, because it's going to create PhD-level workers at a cost that breaks all economic models. What is the value of our work if something that is not us can do it better? And if it's combined with a robot, it can complete all tasks, never tires, never needs vacation; it'll outperform every human. And of course, if it becomes sentient, then it really is a threat to modern civilization. It could even make the definition of money unimportant, because robots don't care about money. So one black swan outcome is that it's terrible. But what's the percentage? Guys like Elon Musk, and a lot of the books written about this, like The Coming Wave, put the odds low, because humanity hopefully intervenes.
The second way, in the setup you've described, is that the bubble bursts in AI, and I think that's going to happen for sure. Wireless and the internet already give us the template, because wireless was an exponential growth industry from 1990, and the growth didn't slow until, let's say, 2015. So it was one generation: 25 years of compounding growth at 40%.
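To put that compounding in perspective, here is a quick sketch. The 40% rate and 25-year span are the figures quoted above; the resulting multiple is just arithmetic, not a sourced statistic:

```python
# 25 years of 40% compounded annual growth, as described above.
# The starting base is normalized to 1.0; only the multiple matters.
rate = 0.40
years = 25
multiple = (1 + rate) ** years
print(f"{multiple:,.0f}x")  # roughly a 4,500x expansion of the base
```

That is the scale of growth being called "one generation" of a technology cycle.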
The wireless ecosystem, from infrastructure, handsets, and software to carriers and the towers, all peaked one-third into the cycle relative to the S&P. They all became market performers; they peaked at the same time, and nothing broke away. But then, 10 years into the cycle, two groups broke out of wireless and captured the value: the tower industry, which was like a 10-bagger relative to everything else, and Apple, which was late. Apple was only a second-chapter wireless story.
So to me, the AI story is that everything's going to peak at the same time, probably one-third into the cycle, but then in that period of consolidation and shakeout, one or two industries truly pull away and capture the value again.

And so what does that shakeout look like? Seems like it could be ugly.
Oh, yeah. We know that you have to create capital loss for investors, like the internet bubble bursting. When the internet bubble burst in 2000, it only triggered a mild recession, because the loss was concentrated mainly in tech and some telecom, and it hit some geographic regions very specifically. The reason we had a bigger recession after that was 9/11. It really would have been a mild recession; that's why the GDP data was fine, and roughly 90% of stocks were doing okay. In fact, small and mid caps actually gained during that period, because the internet bubble bursting didn't take down the economy.

I think if the AI bubble bursts, you're not winding the clock back to zero, but it may be bursting because someone decides to do containment, to pull the brakes on this, saying we're too close to generative AI, or too close to sentience.

You mean artificial general intelligence?

Yes, sorry, AGI.

Do you think this industry is even capable of pulling the brakes? I don't.

I think people are going to have to make some decision, because you're right. If we look at employment in AI safety, it's less than 1% of all AI jobs filled.
If you look at the financial industry and classify jobs as safety, more than half of the jobs are safety. So the AI industry has to invest in safety. But you're right, there's zero incentive for safety right now, partly because the AI industry, unlike finance, faces open-source rivals trying to build the same thing, give it away, and keep pace with its innovations. If we did a simple thought exercise and said you wanted to train the morality of an AI using the internet, it's going to be the most amoral entity ever, because it sees that gaining and winning have nothing to do with integrity. If you trained AI on the Bible, for instance, you would raise a highly ethical entity. So I think that's the fork we face as a society: how much sentience do we want something to have that actually has no moral guardrails?
Right. So the other side of it, like I mentioned, is that maybe the technology doesn't work as planned; it doesn't get to this point. I think it could mirror the same thing you mentioned with wireless, where there were expectations of the technology that weren't going to come to fruition until a decade later, but where everything peaks when you're one-third into the cycle. How do we know when we're one-third into the cycle with AI?

Yeah, well, I can give you some guidelines that I saw in the late '90s that maybe we can roughly use again today. In 1997, I wrote this report called the Mobile Data Report, which was actually the first report Salomon Brothers ever produced about how the wireless industry could replace computers, what you could be doing with mobile data. Companies like WorldCom used this report to do their wireless strategy. We thought mobile data could be like a $40 billion business by, I forget, 15 years out, so 2010 or whatever.
And it turns out that mobile data became vastly bigger, but the stocks didn't do that well. Actually, the companies that captured mobile data, like Meta, didn't even exist in the '90s. In the '90s it was OmniSky and Palm Pilot, and they ceased to exist. Palm Pilot would have been the first iteration of an iPhone. So back then, what I noticed was that people had to play with their models to justify valuations. The cost of money had to go to like 5%, and the terminal P/E, what you'd call the terminal multiple, was higher than where the best stocks trade today. You had to rerate the entire industry to justify the valuations. So you knew someone was going to take a loss, because these models were built on unrealistic assumptions.
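The "playing with models" point can be made concrete with a toy discounted-cash-flow sketch. All of the numbers below are hypothetical assumptions for illustration, not figures from the conversation; the only point is that lowering the discount rate and raising the terminal multiple makes the very same business look several times more valuable.

```python
def dcf_value(earnings, growth, years, discount, terminal_multiple):
    """Toy DCF: a discounted earnings stream plus a discounted terminal value."""
    value = 0.0
    e = earnings
    for t in range(1, years + 1):
        e *= 1 + growth                   # grow earnings each year
        value += e / (1 + discount) ** t  # discount each year back to today
    # Terminal value: final-year earnings times an exit multiple
    value += e * terminal_multiple / (1 + discount) ** years
    return value

# The same business under two sets of assumptions (both hypothetical):
sober = dcf_value(100, 0.20, 10, 0.09, 25)      # 9% money, 25x exit multiple
rerated = dcf_value(100, 0.20, 10, 0.05, 60)    # 5% money, 60x exit multiple
print(rerated / sober)  # roughly 3x: the rerating alone triples the "value"
```

When analysts have to reach for assumptions like the rerated case to defend a price, that is the signal being described.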
So Nvidia is not crazy today, because it's 30 times earnings, which is not a premium. Toyota traded at 40 times earnings for years in the '90s just making cars, and Nvidia is not making a car; they're making a really difficult-to-replicate chip. So I guess we're not there yet, but you'll know, because everyone will be having to fake their model to explain why they're still buying the stock.

But let's talk about the private companies. I know everything is different in the private market, but OpenAI is in the middle of this. We know they're raising at least $10 billion, maybe 20, maybe 30, maybe 40. They lost $6 billion last year. They're probably going to lose money this year, and according to their projections they're not going to make money till 2029. Now, if they work and they reach AGI, great. If they don't,
what happens?

Yeah. Well, fortunately, OpenAI and the peer group collectively aren't worth multiple trillions, though it is nearly a trillion. Ultimately, when we get to the peak of valuation for all these things, it's not that different from what happened when the internet bubble burst. The fiber industry consumed so much capital. I don't know if you followed the CLECs back then, but they were digging up rail lines, digging up cities to lay fiber. And after the internet bubble burst, people said there's so much fiber, we're never going to use any of it; we have so much excess capacity. But after the bubble burst and fiber prices collapsed, a couple of things happened. The second owner of a hotel made money, so the people who ended up owning these assets did well. And because the price came down, there was a lot of innovation. It created travel companies: Expedia wouldn't exist without it, and Netflix couldn't exist without collapsing fiber prices. Although Netflix actually never paid for carriage, internet streaming became profitable. So I think that will happen with a lot of code: it may be rerated, as you said, because it's so open-sourced. I'm not making a prediction; I'm just saying that's possible to look for.

So let's talk briefly about one more thing, this potential black swan event with AI, going to your first point, that it becomes too successful. You mentioned that if AI can do PhD-level work, then basically people won't be able to make money working, and society could fall apart. The story the AI companies tell is that we'll have abundance, everybody will have exactly what they need, and one person can do whatever they want because they'll have these data warehouses of geniuses behind them. When you went through the black swan possibilities, you didn't take that side; you took almost the other position. Why is that?

Well,
I think it's possible that it's exactly what you described: all of our needs are met without needing to work. Housing and food and, I don't know, a lot of recreational activities. It means the monetary system probably ceases to exist. Because then, for instance, do you need to get an Ivy League education, or do you need to be the best student in your class, when your robot's always going to be smarter than the smartest human in the class? It's going to change what we define as achievement.
Why do we work hard? Some people might consider it nirvana, because let's say 10% of people live their lives aspirationally; when we grew up, not everybody wanted to be the best. But when you look at societal impact, or at a company like my former employer, which had 200,000 employees, the adage was always that 20% did 80% of the work, or really 8% did 90% of the work, right?

Yeah.

Well, there's no incentive system for that anymore in a world of abundance. So I do think the consequence is money may stop mattering.
And then if we're able to do whatever we want, why wouldn't there be a situation where everybody gets everything they need, if money doesn't matter?

Sure. But then stocks may not matter.

Yep. Or what is a company anymore? It's not a group of highly skilled people, and if it's a group of highly skilled robots, well, anyone can copy the code. So there's no advantage for a company. Some people might say that's a good thing, but I think that would be a kind of very dangerous outcome.