Qualcomm CEO Cristiano Amon: Future Of AI Devices, AI Fashion, Blending Reality and Computing

Channel: Alex Kantrowitz

Published at: 2026-01-20

YouTube video id: nk4X-iD8HP0

Source: https://www.youtube.com/watch?v=nk4X-iD8HP0

What does the AI device of the future look like? Let's ask the CEO building the chips that will power it. That's coming up with Cristiano Amon right after this. Welcome to Big Technology Podcast, a show for cool-headed and nuanced conversation about the tech world and beyond. We are here at Davos at the Qualcomm space, and we have a great show for you today. We're going to talk about the future of the AI device. We're going to talk about what an AI PC is and whether anybody's going to want it. We're going to talk about the data center buildout, robotics, and industrial AI. And here to do it with us is the perfect guest, Qualcomm CEO Cristiano Amon. Cristiano, great to see you.
>> Great to see you, too. Very happy to be having this conversation with you.
>> Definitely. It's a perfect time for us to have this conversation, because talk of an AI device is going from theoretical to concrete, and Qualcomm might be at the center of it. So let me give our audience, if you're new to Qualcomm, a little bit of an introduction to the company. It's a $170 billion company, so it's very big. It's the designer of the Snapdragon chip, which is in mobile phones, notably high-end Androids, and also PCs, autos, and increasingly wearables. There's also the Dragonwing chip, which we're going to talk about, which is in industrial use cases like robotics. And you just got into the AI data center, building servers for AI inference. So: a chip designer really at the center of the AI story, whether it comes to wearables or the data center.
>> I like that.
>> Okay.
>> Very good. I think that's a great introduction to Qualcomm. Maybe I'll just add one thing to it. Qualcomm is a very unique semiconductor company, especially in today's environment, when connectivity is important, computing is important, and AI processing is important. We're one of the few companies that has all of it under a single roof. And we're probably one of the few semiconductor companies that go from five watts in your earbud to 500 watts when you think about a data center. It's an exciting time for the company, and also an exciting time for technology, since AI is going into everything.
>> And designing the chip for the smartphone has put you in a very interesting position, because as we all start to imagine what an AI device is going to look like, obviously when it comes to AI the compute underneath is really important, and you're in a position to provide it. Recently you've talked about your belief that the market opportunity for the AI device, and we're going to get into what the form factor is going to look like, is 10 billion devices, which would make it bigger than the smartphone market. How do you get to that number?
>> So, it's interesting, and to get at that number, it's actually important to see how the smartphone evolved over different generations. You have a couple of things: the evolution of phones, the evolution of compute, and then how AI changes that going forward. Maybe I'll take us a little bit into that journey. One of the biggest changes, and I won't go all the way back to 2G, one of the biggest changes in the phone industry happened when we developed broadband into cellular and said we can have broadband speeds. We realized that on the other side of that broadband you need a computer. So your phone needed to become a computer, and you needed to develop a computer that would fit in the palm of your hand. That's the smartphone, and it changed computing forever. It's our inseparable device. We carry it with us all the time, and it's been at the center of our digital life.
Now, as you keep advancing: in smartphones right now, every single year 1.2 billion phones are purchased. It's the number one consumer electronics product, and everybody has one. But when you start thinking about what's happening with AI, especially as computers using AI now understand us, then you start to go into not only the computer that you carry, but also the computer that you wear, especially because if agents are going to be useful for you, they're going to be with you all the time. Then you go from carrying a phone to also having glasses, or a ring, or a bracelet, or a watch. And those things change the nature of what a wearable used to be.
A wearable, when you talked about wearables in technology, was designed to just extend your phone's functionality. For example, yes, a smartwatch will tell you the time, but it also sends your sensor data back to the phone and gives you notifications from the phone. That's all going to change. It's all about connecting to a model, connecting to an agent. As those things change and we all start wearing these things, then you start to think about big numbers. If everybody ends up getting a watch, a ring, or glasses that are connected to an agent, then you're talking about an order of magnitude as big as the phone. I think that's exciting. That's how we think about the future of the mobile industry.
>> But here's the question: why does it need to be a wearable? I was speaking with Sam Altman right before the end of the year. OpenAI is going to build a family of devices, and the rumor had been that it's going to be a smartphone-sized device, no screen, that just listens to you and then pushes you notifications about your life. And I was like, well, why can't it just be an app on the phone? Why does it have to be a wearable?
>> Okay, it doesn't have to. Look, we're working with them. Unfortunately, I cannot tell you what it is. You will see, and it's going to be exciting. But we need to think about this a little bit differently. A wearable is one of the things, but it's going to be more. So I'll start by answering your question about this whole category of personal AI devices. Humans already decided what they're going to wear a long time ago. I don't think you and I are going to be wearing a big helmet. We can wear glasses; we can wear jewelry. So humans have kind of decided what they're going to wear, and it's our job to make electronics very dense, with a lot of computing power in small form factors, which comes from our phone DNA. You can put electronics in all of this, plus connectivity, and connecting it to an agent is going to be very useful. But you could also have something on your desk, or something next to your bed. You can connect to agents on different devices, and I think what we'll see is everything becoming smart in one way, because the biggest fundamental thing is that computers now understand what we see, what we say, what we write. That changes the human-computer interface, and with that, the whole definition of what the computer is. So a wearable is the most logical thing to us, because we're thinking about mobility and the things you carry with you, but you could have things on your desk.
The way to think about this is to think about devices that get caught in the transition of technology. For example, you have a laptop right in front of you, and I see it's Qualcomm-powered. I can bet you that the laptop has the ability for you to touch the screen, but you probably don't touch it that often. You use the keyboard. That's what it was designed for; the user interface was designed for that. You touch your phone. When you pull your phone out of your pocket, you're going to be touching it, going to apps. It's not very natural for you to point the phone like this to try to record images. With glasses, your head moves and the camera moves with you. Maybe you can talk to the phone; maybe the phone is sitting there and you talk to it before you pick it up. So there are going to be other things on your desk that you're going to talk to. We don't know how those things are going to pan out. But going back to your question: wearables are logical, because wearables are going to be things that we wear and carry around.
>> But help us flesh out a little bit what this experience will be like.
>> Yes.
>> I mean, obviously we're not there yet, and we've had many stops and starts. Google Glass was an example; people were wearing computing on their heads a long time ago. Now it seems like the technology is actually getting to the point where maybe it will be useful, maybe it can make sense of our context. So, Cristiano, when you think, all right, I'm going to put chips in glasses and maybe some other formats, and people will use them and have some experience: what is that experience?
>> Yes, let's talk about the experience. I'll break this into two parts: the experience, and then the technology that goes behind it. Think about how glasses are performing today. You have, for example, the Meta Ray-Ban glasses, and I think there are going to be other glasses coming within the Google ecosystem this year. What are the glasses doing today? You have cameras, so they see what you see and can understand the image. They can annotate the image. You have a microphone and a speaker, and it may or may not have a display. There are use cases even without the display, like the Meta Ray-Ban glasses.
What does the experience look like? First of all, for those things to get scale, they have to have very low friction, and the experience has to be useful. Otherwise, it's a gimmick and you're not going to use it. So the experience is going to be like this: I'm talking to you, and let's say I see somebody in the audience and I just ask, who is this person? And the glasses will tell me: let me check, I checked on the web, here is this person's name. Oh, okay. You met her before; there was an email that was sent to you from this person. It has to be like having a friend with you all the time. You're walking on the street and you ask, what is this? This is what it is. Or even something like this: you go into your day and your agent comes to you and says, I noticed that right now you seem to be free. Can I talk about your agenda? There's a conflict we need to resolve. Those are examples of how this experience is going to be. It's going to be this agent that has the ability to understand your context, understand what is around you, what you see, what you say, and react in real time.
What's interesting is that we're not there yet, but you see the beginnings of the change, and I like to draw parallels, so I'm going to give you the parallel with the smartphone. When the smartphone first arrived, when you saw the iPhone or the Android, maybe, and I may get this number wrong, there were 10 apps, and you said, okay, those are the 10 new apps. You couldn't imagine at the time that you were going to have hundreds of thousands of apps. If you look at your phone right now, you have a ton of apps. Your phone got better over time because new apps became available in the app store, and I think that's how it's going to be with those agents. Eventually the agent gets integrated with some other service, and you're starting to see it. For example, we have a customer of ours in India doing smart glasses. They integrated with the digital payment system, so now you can look at a QR code and say, pay this, and it will pay. You go from translate this, and explain this to me, to pay this. You can get a bill and say, I got this bill. Please pay this bill out of my checking account, and notify me when it's done. And you may take a picture and email it to me, because I want to keep a copy of it. That's how you're going to interact with those computers, and that's what the experience is going to look like.
>> Is there a world where we get too close to computers? Think about it: sometimes that free time is really nice, and now the agent is going, aha, he has a moment, I'm going to go help him resolve a conflict, or I'll help him understand who this person is, as opposed to him going up and asking, have we met before? Does there eventually come a point where humanity and computers come too close together?
>> That's a good question, and I don't know the answer, but I think, like everything, it's going to be for you to decide. Look, some of us, not all of us, sometimes just put the phone down, and it's going to be like that. You're just going to have to decide when it's time to disconnect. But I feel it's going to be a little bit different, because now it's going to be easier for us to work with computers, and computers are going to be easier to work with us. And I'm going to use this question you asked me to tell you something funny. I was at CES, and I was having a conversation with a customer of Qualcomm about exactly this: the smart glasses and the camera, and the fact that the camera now sees what you see and can annotate the image. And somebody said, you know, what if sometimes there are things you want to forget? And the answer was: well, you may, but the AI won't forget.
>> [laughter]
>> Those are going to be interesting things. Like with any technology, how humans are going to use it, and how it's going to be developed, we're going to see.
>> The natural extension of this conversation is: as AI becomes more powerful and humanity comes closer to AI, there are going to be people who say, let's just bring them together. Elon Musk has talked about how the reason for building Neuralink, his brain-computer interface company, is that eventually AI is going to get more powerful than humans, and we'd better merge with it or it's going to destroy us. So I want to ask you: would you merge with AI?
>> No. But look, in the conversation we just had, we were talking very consumer-centric when you said too much technology. It's easier to understand when you move from the consumer to the enterprise, if you think about the ability to learn everything in real time. We're actually seeing some use cases right now, especially in industrial, where somebody is an operator of a piece of equipment, or of a refinery, and all of a sudden you have this agent with you. You get to a particular piece of equipment and you ask, how do I operate this? And it will say: here's how you're going to operate it, you do this, you do that. So the ability to have access to knowledge in real time, I think there's an incredible opportunity there to actually democratize knowledge and learning. That's another thing about the connection between AI and augmenting human capabilities. We can say that because we saw it with phones.
>> Right.
>> With phones, think of how many nations got access to the internet, got access to digital, through the phone. It wasn't through a computer, and it was incredibly empowering for people to be connected with access to the internet. Maybe it's going to be the same thing with those personal AI devices.
>> Okay, I'm going to move off this in a second, but I asked if you'd merge with AI, and you said no very quickly.
>> Yes.
>> Why the reflexive no?
>> Oh, because, look, it's different. I think it's fun; people like those stories about science fiction. But I have a very clear belief. There's humans, there's humanity. AI is our creation; it's trained on the stuff that we do, if you look at a lot of those models. So it's really a tool, designed to augment, but it won't take away our humanity.
>> Okay, very quickly on form factor. You've mentioned glasses a number of times; you didn't mention earbuds. When you think about the way this competition is shaping up, you have different companies making different bets on different form factors, especially when you look at the tech giants, big technology, as we like to cover here on the Big Technology Podcast. You have Meta making a big bet on AI-powered glasses. Google, as you mentioned, I think we're going to see a very big bet from them, Google Glass part two, although maybe they'll have a new name. Apple, it might be 2027 until we see a pair of glasses from them. Maybe their big bet is going to be the AirPods, and how AI is already delivered in the AirPods with things like translation, and Siri is in there, but it still has some work to do, and maybe they'll do it with their Google partnership. Why do you think glasses over earbuds?
>> Look, I won't say one over the other. We have the benefit of working with, I would assume, the majority of the companies that are actually building personal AI devices, so we have pretty broad visibility. I'll give you an example: there are some companies right now designing an earbud with a camera.
>> An earbud with a camera?
>> With a camera, because if you put it in your ear and you have a camera, it can see in front of you. So it can provide some context in addition to having just a speaker and a microphone. It goes back to the earlier conversation: what are the things that humans are going to wear, and wear most of the time? Glasses. I am a believer that glasses are the most natural, and maybe that's because I've worn glasses since I was 13, so I'm used to them. But when you turn your head, your camera goes with you; it's close to your eyes. This is actually how I should have answered your question about wearables, because it's the simplest way to answer it: if the AI understands what we see, what we say, where we are, it's going to be closer to our senses, and glasses capture everything. They're closer to your mouth, closer to your ears. An earbud is the same thing, just missing the vision, and that's why some people are putting a camera on an earbud. But if you just have an earbud connected to an IP address, you can connect to an agent and have a conversation with the agent.
>> What about a pin?
>> The same thing. It's another way to put a camera on you. There are pendants, there's jewelry. We'll see. I think you're going to see people experimenting with form factors, but I think glasses are likely going to be the primary way those devices are built.
>> So, let's say glasses are the winner. Do you think that style matters? Let me give you a binary here: the more stylish glasses with the worse assistant, or the less stylish glasses with an amazing assistant. Which wins?
>> This is a great question, because we're going to see another thing happening in the industry, which is that when you start thinking about wearables, you're going to have the mix of fashion and technology. And I'm actually going to make a prediction here. I don't want to be offensive to any other company, but I think this is where the horizontal model is going to win versus the vertical model. The reason I say that is because it's very unlikely that everybody on earth is going to use the same exact glasses. People want different form factors. They want different colors. It's different, especially with things that you wear. As a result, I think you're going to have different brands. It will be a little bit of an interesting dynamic: is that a Ray-Ban you're wearing, or is it a Meta? If it's a Ray-Ban made by a consumer electronics company, is it the consumer electronics brand or is it Ray-Ban? We'll see. But I think you're going to have the combination of fashion and technology, and there are going to be choices, different brands for different people, different age groups, and so on. So I think we're going to see a lot of diversity, very unlike the phone space, where most people carry a similar phone. I think that's going to be different.
>> I'm going to answer my own question. I'll take the better assistant and the ugly glasses over the nice glasses and the bad assistant.
>> Yeah. Maybe the most successful one will also be nice.
>> Maybe the most successful glasses will pair with the best assistant.
>> Eventually. You would think we get there.
>> Yeah, I think so.
>> Handicap the AI device race for us. We have many companies running at this. We have Meta, which has been making this multi-year metaverse bet that has really transformed into the smart glasses bet. We have Google, which, all indications are, I mean, if you look at their recent Thinking Game documentary, they're just pointing their phone at things and saying, what is this? It's like, you need glasses. You have OpenAI; you're working with OpenAI on this project, a family of devices that are going to be in a bunch of different places. And Apple obviously has to be considered a power player here as well. Who wins?
>> Look, I'll answer this question by going back to the beginning of the internet. Orkut wasn't the social media that won; it ended up being Facebook, and then later Instagram. MapQuest wasn't the main map; eventually it was Google Maps. So it's early to call. You see all those companies, they have big ecosystems, they're investing in their ecosystems. We'll see what happens. However, I'm going to try to give you a little bit of an answer. I have this view, and this is maybe a longer conversation than we're going to have time for, but I think at the end of the day, the winner of the edge is going to be the winner of the AI race. The reason I say that is because, especially for everything that is personal, the edge has real context.
>> Meaning your phone, your device, the devices that you use.
>> That's where the humans are. The humans don't knock on the data center and say, give me some AI; they experience it through some device. And if you look at how models got trained, they got trained on the information available on the internet. But when you fast forward to a model with physical AI, understanding our world, understanding your context, understanding you, that's going to be a lot more useful for you than a generic model trained on the data available on the internet. So whoever has access to that data is in a very, very strong position. Companies that already have presence in all of those different devices, I think they have an advantage. I will not bet against them.
>> All right. But then let me take this a level deeper with you, because we have seen those companies, I'll just name them: Amazon, Apple, Google, Meta. They've all tried to build this contextually aware personal assistant. We've heard presentations about Alexa Plus and Apple Intelligence, all the different buddies you can have in the Meta properties, and Google obviously with Gemini. But even though they have all this data, we still don't really have an assistant that's capable of doing what they've promised. I mean, Apple might be the most notable in promising this contextually aware assistant that will help you figure out when your flight is and tell you, all right, time to get to the airport. They haven't done that yet. What is holding these companies back? Is it a hardware problem, an AI problem? Where is the bottleneck?
>> I think it's a combination of things, but I am more optimistic than you describe. I think we're starting to see the beginning of some real experiences. You have to get the maturity: first of all, the AI models need to get more mature and more capable. You've had a lot of changes even within AI. You started to see mixture of experts; you started to see chain-of-thought reasoning; you have different things that specialize in specific tasks. And I think we're just at the beginning of physical AI, which is really important for you to have context. So I think this is going to happen. The other part of it is compute. You need to have a lot of high-performance computing, and this is where we come into the picture, because you cannot do everything in the cloud, also because of latency. It's not going to be useful if, and I go back to when you asked me to describe the experience, you and I are walking together in the street, I say, hey, who's this person? and the answer is: hold on, let me think, let's keep walking, I've been thinking about it. By then the person went by and you missed the point. There are certain things you need to do on the device; it needs to be fast. All companies right now are starting to do voice-to-text locally, because you won't tolerate any delay. So, we're going to get there.
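The latency argument can be put in rough numbers. The sketch below is a back-of-the-envelope budget; every figure in it (round-trip time, queueing delay, token rates) is an illustrative assumption, not anything stated in the conversation:

```python
# Rough latency budget for an agent answering "who is this person?"
# All numbers are illustrative assumptions, not measured figures.

def response_latency_ms(network_rtt_ms: float, queue_ms: float,
                        tokens: int, tokens_per_sec: float) -> float:
    """Total response time: transport + server queueing + token generation."""
    generation_ms = tokens / tokens_per_sec * 1000
    return network_rtt_ms + queue_ms + generation_ms

# Cloud path: cellular round trip plus shared-server queueing, fast generation.
cloud = response_latency_ms(network_rtt_ms=120, queue_ms=300,
                            tokens=20, tokens_per_sec=50)

# On-device path: no network hop and no queue; the local NPU generates slower.
on_device = response_latency_ms(network_rtt_ms=0, queue_ms=0,
                                tokens=20, tokens_per_sec=40)

print(f"cloud:     {cloud:.0f} ms")      # 820 ms under these assumptions
print(f"on-device: {on_device:.0f} ms")  # 500 ms under these assumptions
```

The point of the toy model: for short, interactive answers, the fixed network and queueing overhead of the cloud path can dominate, so a slower local chip can still respond sooner.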
>> Yeah, we were just talking earlier in the room here about potentially being on the ski hill and having the glasses point you down the run that suits your skill set. But if you have to wait two minutes, you might be a bunny-hill skier sent down the black diamond. So you really want it to work fast when you do that. Your glasses would be the first casualty.
>> Yes.
>> All right. We're here with Cristiano Amon, the CEO of Qualcomm, at the Qualcomm space at Davos. We're going to be doing four conversations through the week, and we're thrilled to be here. On the other side of this break, we're going to talk about AI PCs, the AI data center, the constraints on the AI buildout, and robotics if we have time. We'll be back right after this.
And we're back on this Big Technology Podcast special edition here at Davos, talking together on a Monday and going live across our channels on Tuesday. Let's keep going on how AI will transform devices. The AI PC is a subject that has been interesting to me. There's been a lot of noise about how, if you have AI baked into your computer, you'll be able to be more productive and it can really transform the way that you work. That's the marketing. In reality, that rollout, that promise, has been slow to meet expectations. This is from the head of product at Dell, speaking to The Verge: "What we've learned over the course of the year, especially from a consumer's perspective, is they're not buying based on AI. And in fact, I think AI probably confuses them more than it helps them understand a specific outcome." Obviously, Qualcomm has a stake in the success of AI PCs. What is happening today, and where is it going?
>> It's a great topic of conversation. Look, first of all, as we entered the PC space, I would argue that a lot of what's driving the sale of Snapdragon-powered PCs is the fact that we deliver multi-day battery life and a lot of performance in a very exciting thin-and-light form factor. We just built a better PC. On the consumer side, I would agree that you don't yet see a lot of agents, and I know people want to see this right away; I wish it were here right away. I don't necessarily disagree on the consumer front, because Microsoft just launched agents for Windows. It just launched. I think people are going to use it more and more as they start to rely on agents, and you're going to see things running on your device. But I think that's not the story for the AI PC. The story is a little bit different.
What we're seeing happening with the AI PC, given that we actually have the ability to run significant high-performance inference on a laptop, is something else. Right now you have many, many applications and services on your PC that are doing a lot of cloud computation. If you could rely on the computing that is available on the PC, not only is it going to be faster, but it has completely different economics. I'll give an example. If you're a SaaS company, and all the SaaS companies right now are being threatened by AI, and you say, I'm going to have an agent within my application, and every time I get this data I'm going to run it, you're paying for compute in the cloud. Your economics change dramatically if you actually use the compute in the device. I'll give you a practical example. There are many things now where you just have a button; you see it in Microsoft Copilot and across a number of different applications. Summarize this: you have a bunch of data, several pages of a document, summarize this. You can go all the way to the cloud and pay the cost of cloud compute to run the model, or you can run the model that summarizes your text on the computer, and that's free, because it's a computer you already have. So we're starting to see a lot of interest from enterprises, and even from applications, in running a portion of the application on the AI engine on the device, and that's starting right now.
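A rough sketch of why those economics flip for a software vendor. Everything here (token price, query volume, document size) is a hypothetical assumption for illustration, not a figure from the conversation:

```python
# Back-of-the-envelope: cloud inference cost per user for a hypothetical
# "summarize this document" button. All prices and volumes are assumptions.

def monthly_cloud_cost(queries_per_day: int, tokens_per_query: int,
                       usd_per_million_tokens: float) -> float:
    """Marginal cloud-inference cost per user over a 30-day month."""
    tokens = queries_per_day * 30 * tokens_per_query
    return tokens / 1_000_000 * usd_per_million_tokens

# Assume 20 summaries a day, ~3,000 tokens each, $1 per million tokens.
cloud = monthly_cloud_cost(queries_per_day=20, tokens_per_query=3000,
                           usd_per_million_tokens=1.0)
print(f"cloud cost per user per month: ${cloud:.2f}")

# On-device: the same queries run on the NPU the user already owns,
# so the vendor's marginal cost per query is roughly zero.
on_device = 0.0
print(f"on-device marginal cost:       ${on_device:.2f}")
```

Small per query, but multiplied by millions of users it becomes a real line item, which is the incentive he describes for SaaS vendors to push inference onto the device.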
>> So the reason to buy AI PC hardware, as opposed to, let's say, letting a cloud agent take over your computer, is mostly cost?
>> Well, I just gave you one example; there's more, like gaming. A lot of the gaming engines right now are thinking about using AI on the PC. In an RPG game, you can have a dialogue with a character, like with a model, and the gameplay changes. So there are examples of cost, examples of new use cases, examples of agents. I think the answer to your question is: first of all, why should you buy a Snapdragon-powered PC? Because, by definition, even if you're not using AI, it's going to be faster, with multi-day battery life, and it's going to feel like your phone. You can use your laptop all day; you can go places and not take the charger with you. The second part: why should you buy an AI PC as a consumer? As a consumer, I think over time you're going to see more and more apps having an AI front end, and they're going to leverage the capabilities on the PC, but it's going to be transparent to you. On the enterprise side, I think the economics are going to change, because a lot of the ISVs and SaaS applications are going to require the onboard computing, and I think that's going to make a difference.
>> Very interesting. So that'll be a requirement from software companies. Qualcomm has also gotten into the data center world. Obviously you have the chips in the devices, like we talked about, but now you're also building chips for AI inference in the data center. So let's talk a little bit about that. Actually, why don't you first tell us why this is a move Qualcomm is making.
>> Yes. Look, we always believed in what's going to happen with AI in the data center. You started to see all this buildout for training, but eventually, and now it's well understood, though when we started developing our solutions this is what we thought, inference is going to take over training. Just think about that for a second. If you're a company spending billions of dollars building a data center for training, you expect to get a return on that investment. So when you start putting AI into production, you're doing inference. And we always believed that when you go to inference, there's going to be a lot of competition between the different AI players. Then the total cost of ownership matters, how much power you consume matters, and the architecture matters. So the first answer to your question is: we realized that when data centers start to transition to inference, we have an opportunity to leverage our assets and build a very power-efficient inference solution for the data center, scaling the technology we developed for the edge.
>> Because power is so efficient in the phone, and power is such a bottleneck in AI, you can take that advantage and put it in a data center. That's the logic.
>> If you just look at today, you have this very aggressive ramp in the growth of AI, and you don't have the same ramp in energy. There's already a gap between available energy and AI. So I think energy is going to be a scarce resource for operating an inference data center; it's one of the biggest items in operating expenses. And then I think people wanted a different architecture, which is the second part of my answer: we believe the data center is going through another process of disaggregation. Let me explain what I mean by that.
One of the key things that happened in the mobile industry: if you look at your smartphone today, it's a very difficult engineering challenge from a semiconductor standpoint, because I have to pack a lot of computing into it. It has to fit in your pocket. It cannot get hot, because you're going to touch the screen and hold it to your face. I cannot have fans. I cannot do liquid cooling on a smartphone. And your battery has to last all day. Otherwise it's not useful.
>> It's worthless.
>> Yes. So, in order to do that, we had to perfect the disaggregation of compute, for lack of a better way to describe it. I'll give you an example. In the PC, everything was CPU-centric. If you're going to decode music or decode a video, you go and load up the CPU. You can't do that on the phone; it burns too much power. So you create dedicated hardware just for music decode, dedicated hardware just for JPEG encode when you take a picture, dedicated hardware for video decode, and everything is disaggregated. You do that because you want to maximize the use of the available energy in the battery.
>> And this all exists in the phone.
>> It exists in the phone. It's what we call heterogeneous compute. If you look at a Snapdragon today, it has several engines for different things. We don't run everything on the CPU, or for that matter on the GPU.
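The engine-per-task idea can be sketched as a dispatcher that sends each workload to whichever block handles it for the least energy. The engine names and millijoule figures below are invented for illustration; they are not Snapdragon specifications.

```python
# Toy model of heterogeneous compute: rather than running everything on the
# CPU, pick the dedicated engine with the lowest energy cost for each task.
ENERGY_MJ = {  # hypothetical energy per unit of work, keyed by (task, engine)
    ("video_decode", "cpu"): 40.0,
    ("video_decode", "gpu"): 12.0,
    ("video_decode", "video_engine"): 2.0,
    ("jpeg_encode", "cpu"): 15.0,
    ("jpeg_encode", "isp"): 1.5,
    ("matrix_multiply", "cpu"): 25.0,
    ("matrix_multiply", "gpu"): 6.0,
    ("matrix_multiply", "npu"): 1.0,
}

def pick_engine(task: str) -> str:
    """Choose the engine that can run `task` with the lowest energy cost."""
    candidates = {eng: mj for (t, eng), mj in ENERGY_MJ.items() if t == task}
    if not candidates:
        raise ValueError(f"no engine supports {task!r}")
    return min(candidates, key=candidates.get)

print(pick_engine("video_decode"))     # the dedicated video block wins
print(pick_engine("matrix_multiply"))  # AI work lands on the NPU
```

The design point is that each specialized block is an order of magnitude cheaper than the general-purpose CPU for its one job, which is what stretches a phone battery across the day.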
Data centers are going in that direction, and we're starting to see disaggregation there too. There's an architecture they use for prefill; there's an architecture they use for decode. So we're building what we believe is the post-GPU architecture: once you start to do inference, you need dedicated engines, and we're building that. I actually believe that the Nvidia acquisition of Groq validates the idea that you need different engines for different things. That's what we're doing, and that's our focus in the data center.
>> Okay, let's talk about robotics. Are you
buying the hype on humanoid robots?
>> I like this whole conversation with you: I've been doing comparisons, and I'm going to do a comparison with automotive to outline our strategy. But let me give you the answer first. I buy the opportunity of the humanoid robot. However, the opportunity is going to manifest itself differently, and some of those things are going to take time. For example, to get straight to your question: a robot that is going to be with you in your house and do everything you ask it to do is going to take time to train. It's very difficult.
>> Teleoperators.
>> It's difficult. Every house is not going to be the same. Every task is not going to be the same. There's going to be a lot of training required. Having said that, a robot that can do certain tasks, and do that task over and over, is actually not a hard problem to solve. So with that, I'm going to give you my comparison, my metaphor. When we started building platforms for automotive, and we're very proud of our automotive business right now, we also got into a stack for autonomous driving. When you think about autonomous driving, about a robotaxi, like level five, no steering wheel, you go to the back seat and take a nap, that requires a lot of training. You can get from 0 to 95%, but to get to 99.999% of the corner cases you have to do a lot of training. However, if you do assisted driving, with the human still responsible for taking the steering wheel if something happens, then you have the ability to put this in every car, from level two to level two-plus to level three, and then all the way to level four. That's a massive market opportunity, and that's what we're doing right now: you can bring some form of assisted driving to every single model. I feel the same way about robotics. If you do a humanoid robot or a humanoid arm, anything that can leverage the world that's been designed for us, and you train the robot on a particular task, that's already happening, and I believe the opportunity from a business standpoint is massive. That's why we're really focused on industrial robots. You can train a robot, for example, whose task is to go to the supermarket at night and put the stuff back on the shelves. That's a self-contained problem. You're not training a robot to do everything. I think the robot that will do everything is going to take a little bit of time until we get there.
>> There was a half marathon of humanoid robots in China, and the highlights looked really funny: robots falling on their faces at the starting line, and robots dragging their whole support teams, who were holding on to ropes, into the side of the course. But some of those robots finished pretty fast. I won't say they beat my half-marathon time, though eventually they will, but they were respectable in their finish, and that included time for battery changes. And the argument has been that China is so close to the production process. Think about their cars: they have this electric car boom because they've been building things with batteries and electronics for so long. Demis Hassabis, the CEO of Google DeepMind, recently said that China is only a couple of months behind the state-of-the-art Western models, but it seems like they're ahead on robotics. Do you agree with that argument?
>> Look, there are many things China is doing that are remarkable. Everybody talks about China speed, and we know it from having a number of different partners in China using our technology, not only in cars but also in phones, and now in robots and industrial. I think there is some merit in the argument that when you're closer to a very large industrial base, you can prototype fast, you can build things fast, you can fail fast, and those things are helpful in developing the technology. But the technology required for robotics is very, very broad. It goes from advanced semiconductors, which is one area where the Chinese companies are partnering with companies like Qualcomm and others, to a whole ecosystem that is going to be important for training, and a lot of software. But yes, this is fascinating. Everybody is in a race, and things are moving fast.
>> Lastly, I want to talk about industrial AI, which I think gets probably the least ink in the AI conversation but is some of the most interesting stuff happening today. Even here at this space, we're looking at a robot that was built in just a couple of weeks with a $50 Qualcomm chip, and it's moving pretty well. Talk a little bit about the applications of AI in the industrial space, and maybe why you think people aren't paying much attention to it. Is it just not sexy enough for the headlines?
>> You know, I'll say it's probably that there's so much attention on the data center right now that it takes all the air out of the conversation. I'll even point to the fact that just saying we're building something for the data center got a lot of attention. But the reality is, the industrial opportunity for AI is massive. It's massive because you can put AI processing on pretty much everything, and you find that every single industry, every single vertical, has a massive number of use cases. It's true in retail. It's true in warehousing. It's true in healthcare. It's true in manufacturing, in energy. And we're actually seeing an incredible amount of demand, especially because you have the ability to process in real time the things that come from physical AI: motors, machines, all the things you can put a sensor on. Just to give you an example, without getting too fancy with different machines, take computer vision alone. You can put a camera on a manufacturing line and train a model just to see whether what's coming down the conveyor belt matches a template, whether it's what you expected. You do quality control with just a camera. You point the camera, for example, at a shelf in a supermarket, and now you have the ability to check inventory in real time. You can actually sell online what's in the store with real-time inventory management. You can put the same camera in a smart city and read license plates. It's a massive, massive opportunity. Some of the many meetings I'm having here at Davos are with industrial companies. They're super interested in industrial AI, and I think that's happening right now.
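The conveyor-belt example can be sketched in a few lines. This is a toy version of the idea, assuming plain pixel differencing against a golden template; a production line would use a trained vision model, and the threshold and image sizes here are invented.

```python
import numpy as np

# Camera-based quality control: compare each frame from the conveyor-belt
# camera against a "golden" template image and flag parts that deviate
# too much from what's expected.
def passes_inspection(frame: np.ndarray, template: np.ndarray,
                      max_mean_abs_diff: float = 10.0) -> bool:
    """True if the frame matches the template closely enough.
    Both images are 2-D grayscale arrays of the same shape, values 0-255."""
    diff = np.abs(frame.astype(np.float64) - template.astype(np.float64))
    return float(diff.mean()) <= max_mean_abs_diff

template = np.full((64, 64), 128.0)   # the expected appearance of the part
good_part = template + 2.0            # slight lighting variation: still fine
bad_part = template.copy()
bad_part[20:40, 20:40] = 255.0        # a visible defect region

print(passes_inspection(good_part, template))  # True
print(passes_inspection(bad_part, template))   # False
```

The same structure, a fixed camera plus a small model running on a local chip, is what the shelf-inventory and license-plate examples build on.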
>> Okay, five minutes left. Two questions for you. One of the reasons I'm so happy to be speaking with you is that, in a sense, you can see the future. When something is going to be mass-produced, you get the first call, let's say from someone building an AI wearable. You're working closely with Meta, so you have a pretty good understanding of the demand they're anticipating, because they need your chips to build things out. Thinking about the AI buildout, and maybe also the AI device buildout, and looking into that crystal ball: are things going to continue apace? Can they possibly keep moving as fast as they have been?
>> Look, I think that question is really directed at what's happening in the data center, because on the device side we're just at the beginning. We're seeing a big trajectory; glasses, for example, continue to increase quarter over quarter. But the broader question is about the speed of the data center buildout, and here's my answer. Go back to the year 2000, when the dot-com crash happened and you had that correction, and think about what we thought back then the internet would be. I will tell you that today, 25 years later, 26 now, it is actually way bigger than people thought it would be. Whatever they thought in 2000 the internet would become, it's way bigger than that right now.
>> And you can still buy pet food on this one.
>> Yes. However, it didn't all happen in 2000. So I think the same is true of AI: in the long run, it's going to be bigger than people think. It's probably under-hyped for the long run. Now, how fast it gets deployed and how pervasive it becomes, we'll see. Could we continue to build at this pace? It's possible. Could there be a slowdown? That's also possible. We're excited about it. And, and this is more about Qualcomm, finally people just woke up to the fact that the edge opportunity is massive, and some of the air that was all about the data center is starting to go toward the edge. I think we're just at the beginning of that curve.
>> Okay. Finally, I've got to ask you a Davos question. We're here at Davos; we have the slopes behind us. This is real. For those wondering: corporations have been through a really interesting journey. There have been moments when they were into what's called stakeholder capitalism, where they think about the group of people beyond the shareholder. And I think we're in a moment now where there's more of a naked pursuit of the bottom line. I'm not speaking about Qualcomm; I'm just saying, broadly, it seems like corporations have sort of put away the illusion that they care about much else than the bottom line. And I wonder: we're right outside the World Economic Forum, and there are 48 conversations that will happen at this event that will be about AI. People will be talking about how AI will be able to cure cancer, or be our best chance at curing cancer, and empower the disempowered. So I'm curious, from your perspective, do you think AI is going to be the new altruism, the new corporate altruism, and is that a good or a bad thing?
>> It's a complicated question. Look, I think technology is a tool. Like computers did, and will continue to do, it will help accelerate many things: it will help accelerate drug discovery, for example; it will increase productivity, as I said before; it's probably going to democratize education and change how we think about education, which is something that keeps changing. It's going to be a tool. I don't think it's going to be a change-society kind of thing. I'll give you a very personal answer. This is going to be terrible because it's going to show my age, but when I got out of college, it was just the beginning of the internet. I remember going to my first job, and there was a fax machine. You had to go to the fax machine, pick up the faxes that came in overnight, and put the outgoing faxes in. There was still somebody typing up intercompany memos. We don't talk about any of this anymore. When the internet and email arrived, it was a revolution, and I think AI is going to be that kind of revolution. Almost like computers, but it's going to be us doing things with computers, just more. That's how I feel about it.
>> All right. Well, it's been amazing following this space, because every time I think I'm caught up, there's something new, and I think you're going to be right at the center of it with all the devices that are going to come out. And maybe when OpenAI does release its family of devices, we can talk again about the state of the competition. By the way, we have a great live audience with us. Guys, make some noise so people can hear that. [applause]
To Cristiano and the Qualcomm team, thank you for having me here at your space at Davos. I'm very excited to be engaging in a number of really great conversations about the state of AI, and I'm sure that by the end of them our audience will have a really good understanding of where things are going. This was a great way to kick it off. So, Cristiano, thank you so much for coming on the show.
>> No, thank you. Thank you. I really had
fun having this conversation with you.
Thank you.
>> You too. All right, everybody. Thanks
for listening and we'll see you next
time on Big Technology Podcast. Thank
you. [applause]
>> Thank you very much.