101 Jefferson Dr, 1st Floor
Menlo Park CA 94025

(408) 623 9165

info at jpdata dot co
sales at jpdata dot co


Memory innovations must keep pace with compute to achieve the best performance

Media coverage of AI chips and hardware accelerators often focuses on compute and TOPS (tera-operations per second). Memory is mentioned in such articles but rarely makes the headlines. In reality, memory is just as important a part of overall acceleration and deserves more attention. As compute performance increases, it has become increasingly important to have the right data available at the right time to ensure optimal performance. Not having the data available means the…

Read More

Takeaways from 2022 AI Hardware and Edge Summit

I was recently at the AI Hardware Summit in Santa Clara. The event made a comeback after a two-year hiatus and was packed with 700+ attendees. The conference was co-located with the AI Edge Summit, with one overlapping day. Like any other conference, there were tracks, keynotes and of course exhibits. To my surprise, the event was packed: all exhibit booths were busy, and people were shaking hands and eating lunch together at a table. Companies were…

Read More

Which AI chip companies have generated over a billion dollars in revenue?

AI chips were not among headline generators until Nvidia dedicated its GTC 2016 to ‘Deep Learning’. The industry then realized the potential, and at least 70 startups have been born since, based on JP Data’s tracker. Some of them have gone to market and started generating revenue, whereas others are still trying to get their products to market. My report was the only one available in 2016 that focused on both the qualitative and quantitative aspects of AI chips. The first version…

Read More

Data center AI chip market continues to move forward: Expected to reach $21 billion by 2027

MENLO PARK, CA, USA, June 1, 2022 — According to a new research report by market intelligence firm JP Data, the market for data center AI chips is expected to reach $21 billion by 2027. New R&D activity around AI is driving the need for training chips, whereas AI proofs of concept going into production are driving the need for inference chips. All vendors offering CPUs, GPUs, FPGAs and ASICs have made significant progress in the past two years with new…

Read More

Competition between low power and ultra-low power AI chip vendors

According to a new report that we published recently, the ultra-low power AI chip market is expected to reach $3.2 billion by 2027. The ultra-low power segment is defined as chips that consume less than 100 mW. These chips are targeted at always-on applications, such as wake-word recognition, that run continuously in the background. The ASP of such chips is expected to be ~$1. In essence, the ultra-low power chip market is currently going after what is…

Read More

Yet another AI chip market segment has emerged

The AI chip market continues to grow. Driven by always-on applications such as keyword detection, a whole new market segment for AI chips that consume less than 100 milliwatts of power has emerged in the past two years. These chips have already shipped in multi-million-unit volumes, and the market shows no sign of slowing down. Top AI chip companies such as Nvidia, Intel and Google are absent from this market segment, and start-ups have capitalized on the opportunity. Several AI chips…

Read More

Always-on applications are expected to drive ultra-low power AI chip market to $3.3 billion by 2027

MENLO PARK, CA, USA, April 18, 2022 — Driven by always-on applications, a new market for AI chips has emerged in the past two years, according to a new research report by market intelligence firm JP Data. This new segment of AI chips consuming less than 100 milliwatts (mW), referred to as ultra-low power AI chips, is expected to reach $3.3 billion by 2027 at a CAGR of 65.5%. Always-on AI applications will drive the demand for…
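As a back-of-envelope check on the figures above, a 65.5% CAGR ending at $3.3 billion in 2027 implies a starting market of roughly $270 million if 2022 is taken as the base year. This is a sketch only; the report's actual base year and starting value are not stated in the excerpt, so the 2022 baseline here is an assumption.

```python
# Back-of-envelope CAGR check. The 2022 base year is assumed for
# illustration; the report's actual baseline is not given above.
target_2027 = 3.3e9   # projected market size in USD
cagr = 0.655          # 65.5% compound annual growth rate
years = 2027 - 2022   # five compounding periods

# Implied base-year market size: target / (1 + CAGR)^years
implied_base = target_2027 / (1 + cagr) ** years
print(f"Implied 2022 market size: ${implied_base / 1e6:.0f}M")
```

Running the check backs out an implied starting market on the order of a few hundred million dollars, which is consistent with a segment described as newly emerged.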

Read More

Ampere will merge inference and training workloads in data center and cloud

Note: This was published via Tractica in 2020. At the 2020 GTC keynote, Nvidia announced its latest compute platform for enterprises, Ampere. Ampere is a giant step up from the previous generation, and the white paper gives quite a few details. While the chipset specs include many impressive feats, a few jump out with long-term potential to disrupt the AI chipset market. The number that changed most drastically is the increase in inference performance in comparison to the V100…

Read More

Is neuromorphic technology ready to take the next step?

Note: This blog was published via Tractica in 2019. Neuromorphic technology has been around since the 1990s in some form or another. Early versions of neuromorphic chipsets include SpiNNaker from the University of Manchester and IBM’s TrueNorth. Both won several accolades from the university community and were primarily R&D projects. A recent chip from Intel called Loihi was yet another attempt by a prominent chipmaker to revive interest in neuromorphic technology. With some of the recent developments…

Read More

The exciting world of AI

AI keeps on evolving… on a weekly basis! Ever since I started following this industry, not a single week has passed when I haven’t heard something interesting. While this is fantastic for the world and for consumers, sometimes I worry that things are getting unnecessarily hyped (just as they always have been; remember the dot-com boom?). I mean, what is the big deal if someone was able to generate an artsy image using an autoencoder in a research paper? It is…

Read More