JP Data LLC
101 Jefferson Dr, 1st Floor
Menlo Park CA 94025

Phone
(408) 623 9165

Email
info at jpdata dot co
sales at jpdata dot co

Blog

Yet another AI chip market segment has emerged

The AI chip market continues to grow. Driven by always-on applications such as keyword detection, a whole new market segment for AI chips that consume less than 100 milliwatts of power has emerged in the past two years. These chips have already shipped in multi-million unit volumes, and the market shows no sign of slowing down. Top AI chip companies such as Nvidia, Intel and Google are absent from this market segment, and start-ups have capitalized on this opportunity. Several AI chips…

Read More

Ampere will merge inference and training workloads in data center and cloud

Note: This was published via Tractica in 2020. At the 2020 GTC keynote, Nvidia announced its latest compute platform for enterprises, Ampere. Ampere is a giant step up from the previous generation, and the white paper gives quite a few details. While there are many impressive feats achieved in the chipset specs, a few jump out and have long-term potential to disrupt the AI chipset market. The number that has changed most drastically is the increase in inference performance in comparison to the V100…

Read More

Is Neuromorphic technology ready to take next steps?

Note: This blog was published via Tractica in 2019. Neuromorphic technology has been around since the 1990s in some form or another. Early versions of neuromorphic chipsets include SpiNNaker from the University of Manchester and IBM’s TrueNorth. Both of them won several accolades from the research community and were primarily R&D projects. A recent chip from Intel called Loihi was yet another attempt by a prominent chipmaker to revive interest in neuromorphic technology. With some of the recent development…

Read More

The exciting world of AI

AI keeps on evolving… on a weekly basis! Ever since I started following this industry, not a single week has passed by when I haven’t heard something interesting. While this is fantastic for the world and for consumers, sometimes I worry that things are getting unnecessarily hyped (just as they’ve always been. Remember the dot-com boom?). I mean, what is the big deal if someone was able to generate an artsy image using an autoencoder in a research paper? It is…

Read More

Who will go into mass production first for AI ASICs?

Artificial intelligence (AI) and deep learning have generated a lot of excitement over the past few years. Many semiconductor startups have emerged to build chipsets optimized for AI. They are tackling compute, communication, and memory-related problems specific to AI algorithm acceleration and building highly optimized architectures that promise low power and high performance. Nervana, which got started in 2014, was perhaps the first company to build a chipset specifically for AI. Nervana wanted to sell cloud services based on its chipsets and…

Read More

Is AI the best news for semiconductors since microprocessor days?

The microprocessor chip acts as the brain of any computer system, whether the device is used for the Internet of Things (IoT), consumers, or the cloud. In fact, the semiconductor industry owes its success to Intel’s introduction of its 4004 microprocessor back in the 1970s. That later evolved into the 808X series, which led to the widespread adoption of chips. Even today, microprocessors receive more media coverage than any other type of chipset. Microprocessors are still the single most expensive semiconductor component in terms…

Read More

ASICs for deep learning are not exactly ASICs

The term “application specific integrated circuit” (ASIC) became popular in the 1990s when such chips promised to bring down the cost per chip for a given application, such as mobile phones or Ethernet cards. An ASIC, by definition, meant developing hardware to solve a problem by building gates to emulate the logic. These chips offered little programmability, but provided maximum performance at a given power and cost budget. It is hard to trace the origins of the word ASIC and…

Read More

Will video analytics save the retail market?

The world of computer vision has continued to evolve in recent years, with computers surpassing the accuracy of human beings. Computer vision has enabled many applications that were only part of science fiction a short time ago. The rise of AI has been instrumental in reaching the level of accuracy that the market demands. A new application enabled by computer vision, called video analytics, has emerged in recent years. Video analytics deals with the extraction of data from incoming video and…

Read More

Is Nvidia moving towards AI chipset world domination?

Nvidia, by all accounts, is the de facto standard in AI chipsets today. In addition to being a chipset manufacturer, Nvidia has been able to offer additional value to its customers by creating derivative products and solutions based on its chipsets. This has helped Nvidia not only understand customer pain points but also become Wall Street’s darling by creating additional value. Tractica has been following the AI chipset market for some time. We believe that by the end of 2025, the…

Read More

Intel is going ‘all-in’ to dominate AI chipset world

The AI revolution started in 2012, when the neural network AlexNet surpassed the accuracy of all previous classic computer vision techniques, and it has not looked back since. AI algorithms are compute-intensive by nature, and the need for accelerating them in hardware has long been recognized, with over 100 companies jumping in with their chipsets. Intel is currently locked in a battle with Nvidia to become the world’s dominant AI chipset company. While Nvidia’s focus is on discrete GPUs for AI acceleration, Intel…

Read More