Intel unveils latest AI chip as Nvidia competition heats up

Pat Gelsinger, CEO of Intel, speaking on CNBC's Squawk Box at the World Economic Forum Annual Meeting in Davos, Switzerland, on Jan. 16, 2024.

Adam Galici | CNBC

Intel on Tuesday unveiled its latest artificial intelligence chip, called Gaudi 3, as chipmakers rush to produce semiconductors that can train and deploy big AI models, such as the one underpinning OpenAI’s ChatGPT.

Intel says the new Gaudi 3 chip is more than twice as power-efficient as Nvidia’s H100 GPU and can run AI models one-and-a-half times faster. It also comes in different configurations, such as a bundle of eight Gaudi 3 chips on one motherboard or a card that can slot into existing systems.

Intel tested the chip on models such as Meta’s open-source Llama and the Abu Dhabi-backed Falcon. It said Gaudi 3 can help train or deploy models, including Stable Diffusion and OpenAI’s Whisper model for speech recognition.

Nvidia has an estimated 80% of the AI chip market with its graphics processors, known as GPUs, which have been the high-end chip of choice for AI builders over the past year.

Intel said that the new Gaudi 3 chips would be available to customers in the third quarter, and companies including Dell, Hewlett Packard Enterprise, and Supermicro will build systems with the chips. Intel didn’t provide a price range for Gaudi 3.

“We do expect it to be highly competitive” with Nvidia’s latest chips, said Das Kamhout, vice president of Xeon software at Intel, on a call with reporters. “From our competitive pricing, our distinctive open integrated network on chip, we’re using industry-standard Ethernet. We believe it’s a strong offering.”

The data center AI market is also expected to grow as cloud providers and businesses build infrastructure to deploy AI software, suggesting there is room for other competitors even if Nvidia continues to make the vast majority of AI chips.

Running generative AI and buying Nvidia GPUs can be expensive, and companies are looking for additional suppliers to help bring costs down.

The AI boom has more than tripled Nvidia’s stock over the past year. Intel’s stock is up only 18% over the same period.

AMD is also looking to expand and sell more AI chips for servers. Last year, it introduced a new data center GPU called the MI300X, which already counts Meta and Microsoft as customers.

Earlier this year, Nvidia revealed its B100 and B200 GPUs, which are the successors to the H100 and also promise performance gains. Those chips are expected to start shipping later this year.

Nvidia has been so successful thanks to a powerful suite of proprietary software called CUDA that enables AI scientists to access all the hardware features in a GPU. Intel is teaming up with other chip and software giants, including Google, Qualcomm and Arm, to build open software that isn’t proprietary and could enable software companies to easily switch chip providers.

“We are working with the software ecosystem to build open reference software, as well as building blocks that allow you to stitch together a solution that you need, rather than be forced into buying a solution,” Sachin Katti, senior vice president of Intel’s networking group, said on a call with reporters.

Gaudi 3 is built on a five-nanometer process, a relatively recent manufacturing technique, suggesting that the company is using an outside foundry to manufacture the chips. In addition to designing Gaudi 3, Intel also plans to manufacture AI chips, potentially for outside companies, at a new Ohio factory expected to open in 2027 or 2028, CEO Patrick Gelsinger told reporters last month.
