AI Chip Startup Makes Training-to-Edge-Inference Transition

Wave Computing was one of the earliest AI chip startups to show significant promise, particularly with its initial pitch of a single architecture to handle both training and inference.

The problem, however, was that the pace of innovation in frameworks, software, and data types quickly outpaced what many of the first AI chip startups were building. And as the startups raced to get their hardware to market, large companies swept in with existing devices, quickly cobbled together robust software stacks, and stole the market. This certainly happened in the AI training systems market (Nvidia being the prime example), but all is not lost.

Some of the early startups rerouted their strategies to focus exclusively on training anyway, pushing claims of 10X, and in some cases even 1000X, improvements over GPUs. Others, like Wave, decided that tackling a dual workload (training and inference) with one architecture might not be the best approach, given the dramatic differences in requirements between training chips (which need a lot of memory and compute) and inference chips (which need low latency, low power, and low cost). Others still pulled back from the datacenter entirely and are regrouping, keeping the business afloat by tackling the crowded market at the edge.

Wave Computing Chief Data Science Officer Jin Kim appeared at the sold-out Next AI Platform event in May to talk about lessons learned in this space from a datacenter perspective, and how the company's reroute took it in a new direction that might be even more promising than the path it expected.

The company is now focused on training with a discrete device optimized for compute and for memory bandwidth and capacity, plus an entirely separate chip for inference acceleration in the datacenter. Their MIPS technology acquisition will help them carve out a niche at the edge, which might be where the real revenue resides as datacenter shops figure out just how they plan to implement AI (and whether they will go beyond the trusty, tested GPU/CPU).

With that said, this change of plan is a bit disappointing, because our initial description of their AI architecture got us thinking about the efficiencies and performance possible outside of general purpose devices. Still, the dataflow architecture as first described in 2017 has not been abandoned: Kim says they carried over many of the same elements, albeit with more optimizations for the workload and for whether it is running at the edge or in the datacenter.

Kim says that what these early end users of AI want is a coherent strategy that stretches from datacenter to edge under a single system architecture and software platform. “We came to the conclusion for our business model not to support both training and inference as a single architecture. There are too many differences in data types, latency, and power dissipation between the two. We instead created a common architecture but with different implementations for training and inference in the datacenter and at the edge.”
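To make the data-type point concrete, here is a minimal, generic sketch of that split, written in PyTorch purely as an illustration (this is not Wave's software stack, and the model, sizes, and data below are placeholders): a small network is trained in float32, then its weights are quantized to int8 to produce a cheaper, lower-latency inference artifact.

```python
# Generic illustration of the training/inference data-type split (not Wave's stack).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))

# Training runs in float32: wide dynamic range, more memory and compute.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
for _ in range(10):
    x = torch.randn(32, 64)            # stand-in batch
    y = torch.randint(0, 10, (32,))    # stand-in labels
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Inference gets a separately optimized artifact: Linear weights quantized to int8,
# trading a little accuracy for lower latency, power, and memory footprint.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
with torch.no_grad():
    print(quantized(torch.randn(1, 64)))
```

The sketch captures the divergence Kim describes: training leans on wide data types, memory capacity, and bandwidth, while deployed inference is tuned for latency, power, and cost, which is what pushed Wave toward a common architecture with different implementations for each.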

The MIPS acquisition is key to Wave sustaining an edge business to fund its push into the lucrative datacenter market. As we noted last year, MIPS has an established toolchain and customer base, including in high-value areas like automotive, surveillance, and defense. If Wave can prove itself hyper-efficient at the edge and carry that architecture into server systems for these workloads, there could be a wealth of opportunity the company never could have predicted when it first entered the market.

Hear more from Jin Kim in the recording from the event below. He describes other trends, including where precision is headed for future architectures and where Wave sees its architecture fitting in for the long term.
