Is Competition Heating Up in the AI Chip Market?

Danny Vena, The Motley Fool

While the concept of artificial intelligence (AI) has been around for decades, it was the technological innovations of the past several years that brought the concept to reality. The combination of big data, faster processors, and more sophisticated algorithms moved AI from the drawing board into the boardroom, with companies big and small assessing the best way to use this nascent technology to gain a competitive advantage.

NVIDIA (NASDAQ: NVDA) pioneered the graphics processing unit (GPU) and has reaped the rewards of the recent emphasis on AI as much as any other company. NVIDIA was instrumental in providing the processors necessary to achieve the AI advances that are taking the world by storm. Researchers found that the GPU's ability to handle the computationally intensive tasks of image rendering also provided the processing power necessary to advance AI systems.

The company's sheer dominance of the field set off an arms race among technology companies to find the next generation of AI processor, in essence to "build a better mousetrap." Those that succeed could be the next big winners in a field that's only just getting started.

NVIDIA DGX-2 AI supercomputer in a box. Image source: NVIDIA.

In "search" of a better solution

Companies have taken different approaches in the race to find a better solution than the GPU. One of the biggest potential competitors has been Google, a subsidiary of Alphabet (NASDAQ: GOOGL) (NASDAQ: GOOG). The search giant was among the pioneers in AI research and one of the first to introduce a competing solution.

In mid-2016, the company revealed the first generation of the tensor processing unit (TPU), which the company described as "a custom ASIC we built specifically for machine learning." Google claimed that the TPU was 15 to 30 times faster and 30 to 80 times more power efficient compared to "contemporary CPUs and GPUs." 

Google just introduced the third generation of its TPU, which has eight times the computing power of its second-generation predecessor. Until recently, Google had used the TPU only internally, but it announced in February that the chips would be available in limited quantities on Google Cloud for developers and researchers, as well as for rent to its cloud customers.

At this point, the TPU is not available for sale, so it doesn't represent a direct threat, though it could slow the pace at which Google buys NVIDIA GPUs for its data centers.

Google's Tensor Processing Unit AI chip. Image source: Google.

Monkey see, monkey do

Microsoft (NASDAQ: MSFT) decided long ago to focus on a customizable processor for its Azure cloud known as the field-programmable gate array (FPGA), a chip that can be programmed by the customer after manufacturing. Recent reports, however, indicate that Microsoft has been hiring chip designers who specialize in AI.

A Microsoft spokesperson told CNBC that the jobs would be part of the company's efforts to design cloud hardware in its Project Olympus initiative. "That group has been working on server design, silicon and AI to enable cloud workloads for some time," the representative said. 

Microsoft has been chasing Amazon.com (NASDAQ: AMZN) in cloud computing, coming in a distant second, and this effort seems designed to increase its competitiveness in the space. The recent hires suggest that the development process is still on the drawing board, posing no immediate challenge to NVIDIA's supremacy.

IT technicians walking in a data center between rows of rack servers. Image source: Getty Images.

The cloud pioneer

Amazon pioneered the concept of modern cloud computing and has raced to stay ahead of competition from Google and Microsoft. Several years ago, the company acquired Annapurna Labs, which developed networking chips for smart home products, routers, and streaming devices.

More recent reports suggest that Amazon is targeting two areas close to home. The company wants to design processors that would accelerate responses from its Alexa-powered Echo smart speakers, which currently rely on chips from Intel. The company also has been interested in building custom chips for Amazon Web Services (AWS) for use in its data centers, though at this point, there's no indication that those ambitions have been realized.

NVIDIA is still the king -- for now

While the race is on across the tech industry to develop a "GPU killer," none has emerged to challenge NVIDIA's brainchild. The company isn't resting on its laurels waiting to be dethroned and has continued the breakneck pace of innovation that's led to its current fortunes.

NVIDIA's data center revenue, which includes chips used in AI, has produced eight successive quarters of year-over-year growth exceeding 70%, though it has recently slowed from the triple-digit pace. Thus far, no single competitor has emerged to challenge NVIDIA's dominance in the AI market, and for now, competitors seem content to "chip" away at its lead.

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Danny Vena owns shares of Alphabet (A shares), Amazon, and Nvidia. The Motley Fool owns shares of and recommends Alphabet (A shares), Alphabet (C shares), Amazon, and Nvidia. The Motley Fool has a disclosure policy.