Is Nvidia Moving Toward AI Chipset World Domination?

Nvidia, by all accounts, is the de facto standard in AI chipsets today. In addition to being a chipset manufacturer, Nvidia has been able to offer additional value to its customers by creating derivative products and solutions based on its chipsets. This has helped Nvidia not only understand customer pain points but also become Wall Street's darling by creating additional value.

Tractica has been following the AI chipset market for some time. We believe that by the end of 2025 the global market for deep learning chipsets will reach $72.6 billion; the AI chipset market, by our estimates, will be worth roughly $71 billion by 2025. The deep learning chipset market report segments the market in multiple ways, including by architecture, power, and performance. Another way of segmenting the market is by enterprise vs. edge and by training vs. inference.

As of 2019, Nvidia is well on its way to selling into the majority of these market segments. Nvidia's data center business generated $2.9 billion in FY2019, which ended in January 2019, and it continues to grow. Nvidia's V100 is the de facto standard for training today; it is in production and available to OEMs. AMD is the only other company shipping, but in our discussions with users, nobody has brought it up as a viable option. For inference, Nvidia has developed the T4, which was adopted for 57 separate server designs by the world's leading OEMs within 60 days of introduction, a record in itself.

Nvidia is not far behind in the automotive market. Its automotive business generated $633 million in FY2019. Although this is a small number in comparison with the data center market, it only appears small because few OEMs are shipping AI-based products just yet; when L4 and L5 cars start appearing in the market, this number will go up drastically. The only other real competitor in this market is Mobileye, but Nvidia is way ahead of its competitors in terms of automotive AI products: its Xavier platform is the most advanced in the segment and delivers 30 TOPS.

These three chipsets (the V100, T4, and Xavier) fall into the 250 W, 75 W, and 30 W power envelopes respectively and offer mid to high performance. Together they encompass the majority of the AI chipset market and its segments, as shown below.

Market                 | Nvidia product | Segment characteristics
Enterprise training    | V100           | High power, high performance, training
Enterprise inference   | T4             | Mid power, high performance, inference
Edge – mid power       | Xavier         | Mid power, mid performance, inference
Edge – low power       | Jetson Nano    | Low power, low performance, inference

However, Nvidia has no intention of stopping there. With the introduction of the Jetson Nano, Nvidia now has a product in the 5 W edge segment. The Jetson Nano, priced as low as $99, is targeted at hobbyists and universities that are trying to learn AI. This has opened up another potential market for Nvidia products in the low-power category.

That leaves only one category Nvidia is not pursuing – the sub-5 W category targeted at mobile phones and tablets.

The strength of Nvidia's AI product line is that the software stack is uniform across all of these products. If an application runs on a Jetson Nano, it can just as easily run on a Xavier or a T4 (only much faster). The brilliance of this strategy is that once a developer is trained on the Nano, they can potentially build applications for cloud, on-premise, edge, or client devices. So once students are trained at university, they can carry that knowledge to their employers and recommend GPUs for their production products.
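As a minimal sketch of what that uniformity means in practice, the short CUDA program below (our own illustration, not Nvidia sample code) compiles and runs unchanged on a Jetson Nano, a Xavier, or a T4; only the reported device name and the measured speed differ.

// Illustrative sketch only: a simple SAXPY (y = a*x + y) that runs
// unchanged on a Jetson Nano, Xavier, or T4.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    // Report which GPU we are actually running on (Nano, Xavier, T4, ...).
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    printf("Running on %s (compute capability %d.%d)\n",
           prop.name, prop.major, prop.minor);

    const int n = 1 << 20;
    float *x, *y;
    // Unified memory is supported on all of these parts.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // 256 threads per block; enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %.1f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The only build-time difference is the target architecture passed to nvcc (for example, sm_53 for the Nano, sm_72 for Xavier, sm_75 for the T4), or a single fat binary that covers all three.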

AI chipsets have generated a lot of excitement, and a wide range of companies have jumped into the market, including semiconductor vendors, cloud providers, hyperscalers, and start-ups. As of 2019, of all the companies in the race, Nvidia is the only one that has been able to generate the billions of dollars the market has promised. We are still at the beginning of the AI revolution and the race is still wide open, but Nvidia is making all the right moves to become the AI chipset king of the future.