101 Jefferson Dr, 1st Floor
Menlo Park CA 94025

(408) 623 9165

info at jpdata dot co
sales at jpdata dot co


Power limitations will force AI chip makers to look into alternative architectures for AI acceleration

The increasing complexity of AI algorithms has given rise to a new industry of AI chipsets. In the past two years, all the top semiconductor companies, along with cloud companies and startups, have jumped in to build chipsets. By Tractica’s own estimates, there are over 100 design starts, with more being announced and new companies emerging every day. The use cases for AI applications are quite widespread, varying from ultra low power IoT devices…

Read More

Nvidia is moving faster than the competition in the AI chipset industry

Since 2015, more than seventy companies have entered the AI chipset market, with more than 100 chip starts announced. All of them are trying to tackle the AI algorithm acceleration problem using different techniques. These companies range from cloud companies to top semiconductor companies to start-ups. Intel, for instance, has poured billions of dollars into AI via its acquisitions of Altera and Mobileye. Many start-ups have raised capital exceeding $100 million. For instance, Wave Computing, a Campbell-based company, recently announced…

Read More

AI chip companies continue to raise more capital

Note: This was published on the Tractica web site in 2018. High mask costs made it hard for chipset start-ups to raise capital during the early 2000s. Today mask costs can run up to $25 million, and by some estimates the design cost for a chip at the 12nm node can run as high as $174 million. This has made it hard for investors to justify the ROI, as most of them demand a 10X return. Only a few markets offered high enough volume for chipsets…

Read More

New architectures emerge in AI chipset race

As the AI chipset market gets crowded, many AI companies have started creating solutions that cater to niche markets. The needs for chipset power, performance, software, etc. vary greatly depending on the nature of the application. For instance, the IoT edge market needs ultra low power (in milliwatts), mobile phones can work well with power consumption of up to 1W, drones can consume a bit more, automotive can go from 10-30W, and so on. Today the two most prominent architectures are CPU…

Read More

Who will go into volume production first for AI ASICs?

AI and deep learning have generated a lot of excitement over the past few years. Many semiconductor start-ups have since emerged to build chipsets optimized for AI. They tackle compute, communication, and memory problems specific to AI algorithm acceleration and build highly optimized architectures that promise low power and high performance. Nervana, which got started around 2014, was perhaps the first company to build a chipset specifically for AI. Nervana wanted to sell cloud services based on its chipsets and…

Read More

More and more places and things are now starting to recognize you

Facial recognition is one of the most popular applications of computer vision technology, dealing with recognizing the identity of a person. The technology has been in use for several years with varying degrees of accuracy. Classic facial recognition techniques, such as SIFT (Scale Invariant Feature Transform) and SURF (Speeded Up Robust Features), relied on extracting unique features of a face. These techniques compared feature values from the incoming picture against those of a reference picture to generate a match. While this worked fairly…
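The matching step these classic techniques relied on can be sketched in a few lines. The following is a minimal, illustrative example (not any specific library’s implementation): it matches two sets of feature descriptors, represented here as NumPy vectors, using brute-force nearest-neighbour distance with Lowe’s ratio test to discard ambiguous matches.

```python
import numpy as np

def match_descriptors(query, reference, ratio=0.75):
    """Match feature descriptors (e.g. SIFT/SURF vectors) from an incoming
    image against those of a reference image.

    A match is kept only when the closest reference descriptor is clearly
    closer than the second-closest one (Lowe's ratio test)."""
    matches = []
    for i, q in enumerate(query):
        # Euclidean distance from this query descriptor to every reference descriptor
        dists = np.linalg.norm(reference - q, axis=1)
        best, second = np.argsort(dists)[:2]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy 2-D "descriptors" for illustration; real SIFT descriptors are 128-dimensional.
incoming = np.array([[1.0, 0.0], [0.0, 1.0]])
reference = np.array([[1.0, 0.1], [5.0, 5.0], [0.1, 1.0]])
print(match_descriptors(incoming, reference))  # [(0, 0), (1, 2)]
```

In practice, libraries such as OpenCV both extract the descriptors from images and perform this matching with optimized matchers; a ratio threshold in the 0.7–0.8 range is commonly used.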

Read More

Is AI the best news for semiconductor industry since microprocessors?

Note: This blog was published via Tractica in 2017. Microprocessors form the basis of any computer system. Whether the device is used for IoT, consumer, or cloud applications, a microprocessor chip acts as its brain. In fact, the semiconductor industry owes its success to the microprocessor, starting when Intel introduced the 4004 back in the 70s. That chip later evolved into the 808X series, leading to the widespread adoption of chips. Even today, microprocessors get more media coverage than other types of chipsets.…

Read More

(More and more) Deep Learning Chipset Companies Are Coming Out of Stealth Mode

Artificial intelligence (AI) applications are hot and everyone is trying to capitalize on the buzz. It is not surprising that everyone wants better, cheaper hardware that gives them the best performance for their AI application. Ultimately, this boils down to the chipset that runs the underlying algorithms, which has sparked an intense race to win the hardware battle. Note: This was published via Tractica in 2017. Deep learning technology is, by far, the most popular type of neural network used…

Read More

Machine Learning coming to a cloud near you

Note: This blog was published via Tractica in 2017. The popularity of artificial intelligence (AI) technology has led application developers to demand more compute power. AI applications come in different shapes and sizes, so the need for compute performance varies quite a bit. Training and inference also have different performance requirements: the training phase requires higher compute capacity, while lower compute performance suffices for inference. Compute performance needs can range from a personal computer (PC) with a graphics card,…

Read More